Cairo Coder is an intelligent code generation service that makes writing Cairo smart contracts easier.

- **Multiple LLM Support**: Works with OpenAI, Anthropic, and Google models
- **Source-Informed Generation**: Code is generated based on Cairo documentation, ensuring correctness

## Installation

There are two main ways of installing Cairo Coder: with Docker, or without Docker. Using Docker is highly recommended.
```bash
pnpm install
```

5. Inside the `packages/agents` package, copy the `sample.config.toml` file to a `config.toml`. For development setups, you need only fill in the following fields:

- `OPENAI`: Your OpenAI API key. **You only need to fill this if you wish to use OpenAI's models**.
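As a rough illustration, the entry might look like this in `config.toml` — treat this as a hedged sketch, since the exact layout comes from `sample.config.toml`, not from this snippet:

```toml
# Hypothetical sketch — copy sample.config.toml and mirror its actual layout.
# Only the key name OPENAI comes from the text above; its placement is assumed.
OPENAI = "sk-..."
```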
This configuration is used by the backend and ingester services to connect to the database. Note that `POSTGRES_HOST` is set to `"postgres"` and `POSTGRES_PORT` to `"5432"`, which are the container's name and port in docker-compose.yml.

**Important:** Make sure to use the same password, username, and database name in both files. The first file initializes the database, while the second is used by your application to connect to it.
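For reference, the connection settings described above could be sketched as follows. The host and port values come from the text; the username, password, and database name are placeholders — keep them identical to the values in the initialization file:

```
POSTGRES_USER="myuser"          # placeholder — match the init file
POSTGRES_PASSWORD="mypassword"  # placeholder — match the init file
POSTGRES_DB="cairocoder"        # placeholder — match the init file
POSTGRES_HOST="postgres"        # the container's name in docker-compose.yml
POSTGRES_PORT="5432"            # the container's port in docker-compose.yml
```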
7. **Configure LangSmith (Optional)**

Cairo Coder can use LangSmith to record and monitor LLM calls. This step is optional but recommended for development and debugging.

- Create an account at [LangSmith](https://smith.langchain.com/)
- Create a new project in the LangSmith dashboard
- Retrieve your API credentials
- Create a `.env` file in the `packages/backend` directory with the following variables:
- Add the `packages/backend/.env` in an env_file section in the backend service of the docker-compose.yml

With this configuration, all LLM calls and chain executions will be logged to your LangSmith project, allowing you to debug, analyze, and improve the system's performance.
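The env_file wiring from the last bullet can be sketched in docker-compose terms as below. Only the `env_file` entry itself comes from the text; the service name placement and elided settings are assumptions about your existing docker-compose.yml:

```yaml
services:
  backend:
    # ...existing image/build configuration unchanged...
    env_file:
      - packages/backend/.env
```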
8. Run the application using one of the following methods:

```bash
docker compose up postgres backend
```

9. The API will be available at http://localhost:3001/v1/chat/completions
## Running the Ingester

After you have the main application running, you might need to run the ingester to process and embed documentation from various sources. The ingester is configured as a separate profile in the docker-compose file and can be executed as follows:

```bash
docker compose up ingester
```

Once the ingester completes its task, the vector database will be populated with embeddings from all the supported documentation sources, making them available for RAG-based code generation requests to the API.
The API accepts all standard OpenAI Chat Completions parameters.

**Supported Parameters:**

- `model`: Model identifier (string)
- `messages`: Array of message objects with `role` and `content`