The ingester requires a configuration file in the `packages/agents` directory. Create `packages/agents/config.toml` with the following content:

```toml
[API_KEYS]
OPENAI = "your-openai-api-key-here"

[VECTOR_DB]
POSTGRES_USER = "cairocoder"
POSTGRES_HOST = "postgres"
POSTGRES_DB = "cairocoder"
POSTGRES_PASSWORD = "cairocoder"
POSTGRES_PORT = "5432"
```

Replace `"your-openai-api-key-here"` with your actual OpenAI API key. The database credentials should match those configured in your `.env` file.
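
If you want to sanity-check the configuration before moving on, a small script like the one below can catch TOML syntax errors or placeholder values early. This is an optional sketch, not part of the project's tooling; the path and key names simply mirror the snippet above, and it requires Python 3.11+ for the standard-library `tomllib` module.

```python
# Optional sketch: sanity-check packages/agents/config.toml before running the ingester.
# Uses only the standard library (tomllib requires Python 3.11+).
import tomllib
from pathlib import Path

config = tomllib.loads(Path("packages/agents/config.toml").read_text())

assert config["API_KEYS"]["OPENAI"] != "your-openai-api-key-here", "Set a real OpenAI key"
for key in ("POSTGRES_USER", "POSTGRES_HOST", "POSTGRES_DB", "POSTGRES_PASSWORD", "POSTGRES_PORT"):
    assert key in config["VECTOR_DB"], f"[VECTOR_DB] is missing {key}"
print("config.toml looks complete")
```
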
4. **Configure LangSmith (Optional but Recommended)**

To monitor and debug LLM calls, configure LangSmith.

- Create an account at [LangSmith](https://smith.langchain.com/) and create a project.
- Set your LangSmith API key in your environment:

```
LANGSMITH_API_KEY="lsv2..."
```
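
To confirm the key works before relying on tracing, you can hit the LangSmith API directly. The snippet below is an optional sketch and assumes the `langsmith` Python package is installed; the `Client` picks up `LANGSMITH_API_KEY` from the environment.

```python
# Optional sketch: verify LANGSMITH_API_KEY is valid (assumes `pip install langsmith`).
from langsmith import Client

client = Client()  # reads LANGSMITH_API_KEY from the environment
projects = list(client.list_projects())  # raises if the key is rejected
print(f"LangSmith reachable; {len(projects)} project(s) visible")
```
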

5. **Add your API keys to `python/.env` (mandatory)**

```yaml
OPENAI_API_KEY="sk-..."
```

Add the API keys required for the LLMs you want to use.
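
To double-check which keys the backend will actually see, something like the following can help. It is an optional sketch that assumes `python-dotenv` is installed; extend the key list for whichever providers you configure.

```python
# Optional sketch: report which API keys in python/.env are set.
# Assumes `pip install python-dotenv`; the key list below is illustrative, not exhaustive.
import os
from dotenv import load_dotenv

load_dotenv("python/.env")
for key in ("OPENAI_API_KEY",):
    print(f"{key}: {'set' if os.getenv(key) else 'MISSING'}")
```
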

6. **Run the ingesters (mandatory)**

The ingesters are responsible for populating the vector database with the documentation sources. They need to be run a first time, in isolation, so that the database is created.

Once the ingester completes, the database will be populated with embeddings from all supported documentation sources, making them available for the RAG pipeline. Stop the database when you no longer need it.
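
If you want to confirm that ingestion actually wrote data, one rough check is to connect to the database and list its tables. The sketch below is assumption-heavy: it uses `psycopg`, reuses the credentials from `config.toml`, and connects to `localhost:5432`, which presumes the Postgres port is published to your host; the table names the ingester creates are not guaranteed by anything above.

```python
# Optional sketch: list public tables in the cairocoder database after ingestion.
# Assumes `pip install "psycopg[binary]"` and Postgres reachable on localhost:5432.
import psycopg

with psycopg.connect(
    host="localhost", port=5432, user="cairocoder",
    password="cairocoder", dbname="cairocoder",
) as conn:
    rows = conn.execute(
        "SELECT table_name FROM information_schema.tables WHERE table_schema = 'public'"
    ).fetchall()
    print("tables:", [r[0] for r in rows])
```
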

7. **Run the Application**

Once the ingesters are done, start the database and the Python backend service using Docker Compose: