Following the README to configure local models, `--configure` does not give an option to enter a local LLM endpoint (e.g. Ollama):

<img width="469" alt="image" src="https://github.com/user-attachments/assets/ee03d193-209d-474c-864c-80ad9f7d86b5"> <img width="528" alt="image" src="https://github.com/user-attachments/assets/6c8e2494-d17e-4d75-a7f3-2623f75e1ca5">

README:

<img width="476" alt="image" src="https://github.com/user-attachments/assets/8c015db8-bdf6-461f-a4d8-41d0f8fc3a88">

I also tried the fancy new UI (very cool idea), but the only options there are also Claude and "Open"AI.
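
In case it helps with triage or as a stopgap for others hitting this: Ollama exposes an OpenAI-compatible API at `http://localhost:11434/v1`, so if the tool's OpenAI provider accepts a custom base URL, it may be possible to point it at a local model that way. Below is a minimal sketch of the general pattern using the OpenAI Python SDK directly (this is not this project's API; the model name `llama3` is just whatever you have pulled locally):

```python
# Sketch: talking to a local Ollama instance through its
# OpenAI-compatible endpoint. Assumes `pip install openai` and
# that Ollama is running with a model pulled (e.g. `ollama pull llama3`).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",  # Ollama ignores the key, but the SDK requires a value
)

response = client.chat.completions.create(
    model="llama3",  # any locally pulled Ollama model
    messages=[{"role": "user", "content": "Hello from a local model"}],
)
print(response.choices[0].message.content)
```

So a `--configure` option that just lets the user override the OpenAI base URL (or adds an explicit Ollama/local provider) would likely be enough to match what the README describes.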