
[Bug] AI service returns 404 for /model/list with valid API key #1558

@srilankabankg-oss

Description

Describe the bug

The AI Chat feature is not working on a self-hosted Docker Compose instance. When trying to use "Ask AppFlowy AI", the UI shows an error toast that paradoxically reads "Operation completed successfully" alongside a red 'X' icon.

The root cause appears to be that the ai service fails to initialize its API routes. Nginx access logs show that a frontend request to GET /api/ai/.../model/list receives a 404 Not Found response from the ai service. This failure happens silently, as the ai service logs show no errors, even with RUST_LOG=debug enabled.

To Reproduce

  1. Deploy AppFlowy-Cloud using the latest official docker-compose.yml and a standard .env configuration on a clean server.
  2. Provide a valid AI_OPENAI_API_KEY in the .env file.
  3. Start the stack with docker compose up -d.
  4. Log in to the web application and open or create a new document.
  5. Attempt to use the "Ask AppFlowy AI" feature.
  6. See the "Operation completed successfully" error toast.
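For reference, the command-line portion of the steps above looks roughly like this; the repository URL and the name of the template env file are assumptions based on the official self-hosting setup and may differ between releases:

```shell
# Sketch of the deployment, assuming the official AppFlowy-Cloud repo layout
git clone https://github.com/AppFlowy-IO/AppFlowy-Cloud.git
cd AppFlowy-Cloud
cp deploy.env .env                 # template env file name is an assumption
# edit .env and set a valid key:
#   AI_OPENAI_API_KEY=sk-...
docker compose up -d               # as in step 3 above
```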

Expected behavior

The AI chat interface should initialize correctly, allowing me to send prompts and receive responses from the language model. The initial call to /api/ai/{workspace_id}/model/list should return 200 OK with a list of available models.
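The failing call can also be observed directly with curl; the host name, workspace ID, and bearer-token auth header below are placeholders, not values from this report:

```shell
# Expected: 200 with a JSON list of models; observed per the Nginx logs: 404.
curl -s -o /dev/null -w '%{http_code}\n' \
  -H "Authorization: Bearer <access-token>" \
  "https://<your-host>/api/ai/<workspace_id>/model/list"
```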

Screenshots

[Screenshot of the error toast]

Additional context

This issue has been troubleshot extensively. The diagnostics below indicate that the problem is not a user-configuration error.

Environment:

  • Self-hosted on Ubuntu 22.04 using the official Docker Compose setup.
  • All images are using the :latest tag as of Sep 5, 2025.
  • The final nginx.conf, docker-compose.yml, and .env files can be provided if necessary; they are based on the official repository files.

Key Diagnostic Findings:

  • Valid API Key: The OpenAI API key is valid. A direct curl test from the host server to https://api.openai.com/v1/chat/completions using the key and the gpt-4o-mini model was successful.
  • Correct Environment Variable: The ai container correctly receives the API key. The output of docker compose exec ai printenv | grep OPENAI shows the OPENAI_API_KEY variable is set to the correct, valid key.
  • Correct Nginx Proxying: The Nginx configuration correctly proxies requests for ^~ /api/ai/ to the http://ai:5001/ upstream.
  • The Core Problem (404 Not Found): Nginx access logs confirm that when the frontend makes a request to GET /api/ai/{workspace_id}/model/list, the upstream ai service responds with 404 Not Found.
  • 🤫 Silent Failure: The ai service logs show no errors. They only show a successful startup sequence (start server on 0.0.0.0:5001). Even with RUST_LOG=debug set in the .env file, no additional error or debug information is logged when the 404 occurs.
  • 🤔 Misleading Health Check: The /health endpoint of the ai service returns 200 OK, causing Docker to report the container as healthy, even though its primary functionality is broken.
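To rule Nginx out completely, the same route can be probed from inside the Docker network, bypassing the proxy. The path below assumes that the trailing-slash `proxy_pass http://ai:5001/` rewrite strips the `/api/ai/` prefix, and that curl is available inside the nginx container; both are assumptions:

```shell
# If this also returns 404, the missing route is in the ai service itself,
# not in the proxy configuration.
docker compose exec nginx curl -s -o /dev/null -w '%{http_code}\n' \
  "http://ai:5001/<workspace_id>/model/list"
```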

Conclusion: The ai service is receiving a valid API key but is silently failing to initialize its API routes (like /model/list). This appears to be an application-level bug within the appflowy_ai service.

This issue seems related to, but different from, #1352, as the error is a 404 from a silent application failure, not a 502 from a proxy misconfiguration.
