[Bug]: GenAI Ollama and OpenAI don't support private servers with private certificate authorities (self-signed root) #20003
Describe the problem you are having

When trying to connect to an Ollama or OpenAI-compatible API that is hosted with a private certificate authority, the TLS negotiation fails because the self-signed root certificate is not a public root. I am using the same intermediate and root CA that is in the fullchain.pem used for hosting the Frigate UI. I checked the OpenAI and Ollama client implementations and they both use httpx, so you should be able to leverage a similar code path for both clients when constructing them (see the sketch after the issue details below).

Steps to reproduce
Version

0.16.1-e664cb2

In which browser(s) are you experiencing the issue with?

N/A

Frigate config file

```yaml
genai:
  enabled: True
  provider: openai
  api_key: "{FRIGATE_OPENAI_API_KEY}"
  model: "llava:7b"
```

docker-compose file or Docker CLI command

Environment variable from docker-compose.yaml:

```yaml
OPENAI_BASE_URL: https://llm.mynetwork.local/api
```

Relevant Frigate log output

OpenAI:

```
2025-09-09 16:50:49.004495426 [2025-09-09 16:50:49] openai._base_client INFO : Retrying request to /chat/completions in 0.453879 seconds
2025-09-09 16:50:49.515989294 [2025-09-09 16:50:49] openai._base_client INFO : Retrying request to /chat/completions in 0.950329 seconds
2025-09-09 16:50:50.540518458 [2025-09-09 16:50:50] frigate.genai.openai WARNING : OpenAI returned an error: Connection error.
```

Ollama:

```
2025-09-09 16:22:24.541917284 [2025-09-09 16:22:24] frigate.genai.ollama WARNING : Error initializing Ollama: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:992)
```

Relevant go2rtc log output

N/A

Operating system

Debian

Install method

Docker Compose

Network connection

Wired

Camera make and model

N/A

Screenshots of the Frigate UI's System metrics pages

N/A

Any other information that may be helpful

No response
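As a hedged illustration of the approach hinted at above: both the openai and ollama Python packages are built on httpx, which accepts a `verify` argument pointing at a custom CA bundle. The sketch below is not Frigate code; the CA bundle path is a placeholder, and it assumes the standard constructors of the two client libraries (OpenAI's `http_client` parameter, and the Ollama client forwarding extra keyword arguments to its underlying httpx client).

```python
# Illustrative sketch only, not Frigate's implementation: trust a private root CA
# by handing each client an httpx-style `verify` option with the CA bundle path.
import httpx
from openai import OpenAI   # openai-python
from ollama import Client   # ollama-python

CA_BUNDLE = "/config/certs/private-root-ca.pem"  # hypothetical path to the private root/intermediate chain

# OpenAI-compatible endpoint: pass a pre-built httpx.Client with the custom CA.
openai_client = OpenAI(
    base_url="https://llm.mynetwork.local/api",
    api_key="sk-placeholder",
    http_client=httpx.Client(verify=CA_BUNDLE),
)

# Ollama: extra keyword arguments are forwarded to the underlying httpx client,
# so `verify` can be passed straight through.
ollama_client = Client(
    host="https://llm.mynetwork.local",
    verify=CA_BUNDLE,
)
```

This is the sort of code path the feature request linked in the replies below asks Frigate to expose through its configuration.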
Replies: 2 comments
This is a feature request rather than a bug, so I'll close this and open it in the GitHub issues queue (which is what we use for feature requests).
Feature request: #20004