Hi,
Thanks for this promising tool.
In the README, you mention that pantheon-cli can be used with local LLMs such as gpt-oss.
Could you give some insight into how to change the default ollama/llama3.2 model to gpt-oss or another open-source model?
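For context, here is roughly what I imagine the workflow might look like. The `ollama pull` step is standard Ollama usage; the `pantheon-cli --model` flag is only a guess on my part (I did not find it in the docs), and that configuration step is exactly what I am asking about:

```bash
# Pull the open-source model locally (gpt-oss is available from the Ollama registry)
ollama pull gpt-oss:20b

# Hypothetical: is there a flag or config entry along these lines to point pantheon-cli at it?
pantheon-cli --model ollama/gpt-oss:20b
```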
Best regards,