Describe the bug
The `letta server` command exits with an error when attempting to run the self-hosted server (installed with uv).
Screenshots of the error(s), plus the local/dev environment and the PostgreSQL 18 setup with pgvector 0.8.1 installed, are attached.
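
For context, a minimal reproduction sketch of the steps taken (assumed commands, PowerShell on Windows; adjust the package name and connection string to your environment):

```powershell
# Reproduction sketch (assumed): install Letta into the project venv via uv,
# point it at the local PostgreSQL 18 + pgvector database, then start the server.
uv add letta
$env:LETTA_PG_URI = "postgresql+asyncpg://letta_selfhosted:letta2025@localhost:5432/letta_selfhosted"
letta server   # exits with the error shown in the screenshots below
```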
Please describe your setup
- How are you running Letta?
  - pip (legacy) (via `uv add`)
  - Desktop (ADE installed)
- Describe your setup (version checks sketched below)
  - What's your OS (Windows/MacOS/Linux)? Win10 x86_64
  - What is your `docker run ...` command (if applicable)? N/A
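
To pin down the environment, the installed versions can be confirmed from inside the project `.venv` (assumed commands; `uv pip show` is uv's pip-compatible interface):

```powershell
# Assumed version checks run from the activated .venv on Windows
python --version      # interpreter used by the venv
uv pip show letta     # installed Letta package version
```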
Screenshots
If applicable, add screenshots to help explain your problem.
`LETTA_PG_URI=postgresql+asyncpg://letta_selfhosted:letta2025@localhost:5432/letta_selfhosted`

Dev (`.venv`) environment and error (3 screenshots attached)
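
To confirm the database side of the setup, the installed pgvector version can be checked against the database from the URI above (assumed `psql` invocation):

```powershell
# Assumed check: confirm the pgvector extension is installed and reports 0.8.1
psql -U letta_selfhosted -d letta_selfhosted -h localhost -p 5432 -c "SELECT extversion FROM pg_extension WHERE extname = 'vector';"
```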

Additional context
Add any other context about the problem here.
- What model are you using?
Agent File (optional)
Please attach your .af file, as this helps with reproducing issues.
If you're not using OpenAI, please provide additional information on your local LLM setup:
Local LLM details
If you are trying to run Letta with local LLMs, please provide the following information:
- The exact model you're trying to use (e.g. `dolphin-2.1-mistral-7b.Q6_K.gguf`)
- The local LLM backend you are using (web UI? LM Studio?)
- Your hardware for the local LLM backend (local computer? operating system? remote RunPod?)
