Describe the bug
The AI Chat feature is not working on a self-hosted Docker Compose instance. When trying to access "Ask AppFlowy AI", the UI shows an error toast that, paradoxically, reads "Operation completed successfully" next to a red 'X' icon.
The root cause appears to be that the `ai` service fails to initialize its API routes. Nginx access logs show that a frontend request to `GET /api/ai/.../model/list` receives a `404 Not Found` response from the `ai` service. This failure happens silently: the `ai` service logs show no errors, even with `RUST_LOG=debug` enabled.
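The 404s described above can be confirmed from the nginx access log alone. A minimal sketch, assuming the default "combined" log format; the sample line below is illustrative, not taken from the real deployment:

```shell
# Count 404 responses on AI routes in an nginx access log line.
# Assumes the default "combined" log format: the request path is
# awk field 7 and the status code is field 9.
LOG_LINE='10.0.0.5 - - [05/Sep/2025:12:00:00 +0000] "GET /api/ai/abc123/model/list HTTP/1.1" 404 19 "-" "Mozilla/5.0"'
MATCHES=$(echo "$LOG_LINE" | awk '$7 ~ /^\/api\/ai\// && $9 == 404 { n++ } END { print n+0 }')
echo "$MATCHES"   # prints 1 for the sample line above
```

Against a live deployment, the same `awk` filter can be fed the real `access.log` instead of the sample line to count how often the failure occurs.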
To Reproduce
- Deploy AppFlowy-Cloud using the latest official `docker-compose.yml` and a standard `.env` configuration on a clean server.
- Provide a valid `AI_OPENAI_API_KEY` in the `.env` file.
- Start the stack with `docker compose up -d`.
- Log in to the web application and open or create a new document.
- Attempt to use the "Ask AppFlowy AI" feature.
- See the "Operation completed successfully" error toast.
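The failing call can also be reproduced outside the UI. A sketch of the request the frontend issues — `BASE_URL` and `WORKSPACE_ID` below are placeholders, not values from this deployment:

```shell
# Build the request URL the frontend issues for the model list.
BASE_URL="http://localhost"          # your nginx entry point (assumption)
WORKSPACE_ID="00000000-0000-0000-0000-000000000000"  # placeholder UUID
URL="${BASE_URL}/api/ai/${WORKSPACE_ID}/model/list"
echo "$URL"
# Once deployed, check the status code with:
#   curl -s -o /dev/null -w '%{http_code}\n' "$URL"
# On the affected stack this returns 404; a healthy stack should return 200.
```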
Expected behavior
The AI chat interface should initialize correctly, allowing me to send prompts and receive responses from the language model. The initial call to `/api/ai/model/list` should return `200 OK` with a list of available models.
Screenshots

Additional context
We investigated this issue extensively; we can confirm that the problem is not related to user configuration.
Environment:
- Self-hosted on Ubuntu 22.04 using the official Docker Compose setup.
- All images use the `:latest` tag as of Sep 5, 2025.
- Final working `nginx.conf`, `docker-compose.yml`, and `.env` files can be provided if necessary; they are based on the official repository files.
Key Diagnostic Findings:
- ✅ Valid API Key: The OpenAI API key is valid. A direct `curl` test from the host server to `https://api.openai.com/v1/chat/completions` using the key and the `gpt-4o-mini` model was successful.
- ✅ Correct Environment Variable: The `ai` container correctly receives the API key. The output of `docker compose exec ai printenv | grep OPENAI` shows the `OPENAI_API_KEY` variable is set to the correct, valid key.
- ✅ Correct Nginx Proxying: The Nginx configuration correctly proxies requests matching `^~ /api/ai/` to the `http://ai:5001/` upstream.
- ❌ The Core Problem (`404 Not Found`): Nginx access logs confirm that when the frontend requests `GET /api/ai/{workspace_id}/model/list`, the upstream `ai` service responds with `404 Not Found`.
- 🤫 Silent Failure: The `ai` service logs show no errors, only a successful startup sequence (`start server on 0.0.0.0:5001`). Even with `RUST_LOG=debug` set in the `.env` file, no additional error or debug information is logged when the `404` occurs.
- 🤔 Misleading Health Check: The `/health` endpoint of the `ai` service returns `200 OK`, causing Docker to report the container as healthy even though its primary functionality is broken.
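For reference, the proxy rule described above corresponds to an nginx location block roughly like the following. This is a sketch, not a copy of the official `nginx.conf`; the exact directives in the repository may differ:

```nginx
# Sketch of the AI proxy rule (assumed shape, not the verbatim official file).
location ^~ /api/ai/ {
    # The trailing slash on proxy_pass makes nginx strip the matched
    # /api/ai/ prefix before forwarding to the upstream, so the ai
    # service receives /{workspace_id}/model/list.
    proxy_pass http://ai:5001/;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
}
```

If the proxying is correct as confirmed above, the `404` is produced by the service's own router, which is consistent with a failure to register routes at startup.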
Conclusion: The `ai` service is receiving a valid API key but is silently failing to initialize its API routes (like `/model/list`). This appears to be an application-level bug within the `appflowy_ai` service.
This issue seems related to, but different from, #1352: the error here is a `404` from a silent application failure, not a `502` from a proxy misconfiguration.