Commit 5e6f598 (1 parent: 89539b7)

fix(ingester, doc): small fix for OZ ingester and LangSmith support doc (#19)

File tree: 3 files changed (+20, -21 lines)

README.md

Lines changed: 14 additions & 16 deletions
@@ -44,7 +44,6 @@ Cairo Coder is an intelligent code generation service that makes writing Cairo s
 - **Multiple LLM Support**: Works with OpenAI, Anthropic, and Google models
 - **Source-Informed Generation**: Code is generated based on Cairo documentation, ensuring correctness
 
-
 ## Installation
 
 There are mainly 2 ways of installing Cairo Coder - With Docker, Without Docker. Using Docker is highly recommended.
@@ -68,7 +67,6 @@ There are mainly 2 ways of installing Cairo Coder - With Docker, Without Docker.
 pnpm install
 ```
 
-
 5. Inside the packages/agents package, copy the `sample.config.toml` file to a `config.toml`. For development setups, you need only fill in the following fields:
 
 - `OPENAI`: Your OpenAI API key. **You only need to fill this if you wish to use OpenAI's models**.
@@ -115,45 +113,45 @@ There are mainly 2 ways of installing Cairo Coder - With Docker, Without Docker.
 ```
 
 This configuration is used by the backend and ingester services to connect to the database.
-Note that `POSTGRES_HOST` is set to ```"postgres"``` and `POSTGRES_PORT` to ```"5432"```, which are the container's name and port in docker-compose.yml.
+Note that `POSTGRES_HOST` is set to `"postgres"` and `POSTGRES_PORT` to `"5432"`, which are the container's name and port in docker-compose.yml.
 
 **Important:** Make sure to use the same password, username and db's name in both files. The first file initializes the database, while the second is used by your application to connect to it.
 
-
 7. **Configure LangSmith (Optional)**
 
 Cairo Coder can use LangSmith to record and monitor LLM calls. This step is optional but recommended for development and debugging.
-
+
 - Create an account at [LangSmith](https://smith.langchain.com/)
 - Create a new project in the LangSmith dashboard
 - Retrieve your API credentials
 - Create a `.env` file in the `packages/backend` directory with the following variables:
+
 ```
-LANGSMITH_TRACING=true
-LANGSMITH_ENDPOINT="https://api.smith.langchain.com"
-LANGSMITH_API_KEY="<your-api-key>"
+LANGCHAIN_TRACING=true
+LANGCHAIN_ENDPOINT="https://api.smith.langchain.com"
+LANGCHAIN_API_KEY="<your-api-key>"
 LANGCHAIN_PROJECT="<your-project-name>"
 ```
-- Add the `.env` in an env_file section in the backend service of the docker-compose.yml
 
-With this configuration, all LLM calls and chain executions will be logged to your LangSmith project, allowing you to debug, analyze, and improve the system's performance.
+- Add the `packages/backend/.env` in an env_file section in the backend service of the docker-compose.yml
 
+With this configuration, all LLM calls and chain executions will be logged to your LangSmith project, allowing you to debug, analyze, and improve the system's performance.
 
-9. Run the application using one of the following methods:
+8. Run the application using one of the following methods:
 
 ```bash
 docker compose up postgres backend
 ```
 
-8. The API will be available at http://localhost:3001/v1/chat/completions
+9. The API will be available at http://localhost:3001/v1/chat/completions
 
 ## Running the Ingester
 
 After you have the main application running, you might need to run the ingester to process and embed documentation from various sources. The ingester is configured as a separate profile in the docker-compose file and can be executed as follows:
 
-```bash
-docker compose up ingester
-```
+```bash
+docker compose up ingester
+```
 
 Once the ingester completes its task, the vector database will be populated with embeddings from all the supported documentation sources, making them available for RAG-based code generation requests to the API.
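The env_file wiring that the diff above asks for could be sketched as follows. This is only a sketch: the service name `backend` and the surrounding keys are assumptions, not taken from the repository's actual docker-compose.yml.

```yaml
# Hypothetical fragment of docker-compose.yml; adapt names to the real file.
services:
  backend:
    env_file:
      - packages/backend/.env
```

Docker Compose reads each `KEY=VALUE` line of the listed file into the container's environment, which is how the LangSmith variables reach the backend process.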

@@ -188,6 +186,7 @@ curl -X POST http://localhost:3001/v1/chat/completions \
 The API accepts all standard OpenAI Chat Completions parameters.
 
 **Supported Parameters:**
+
 - `model`: Model identifier (string)
 - `messages`: Array of message objects with `role` and `content`
 - `temperature`: Controls randomness (0-2, default: 0.7)
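A request body exercising the parameters named in this hunk (`model`, `messages`, `temperature`) can be sketched as below. The model name `cairo-coder` is a placeholder assumption, not taken from the docs; the snippet only validates the payload locally, since POSTing it requires the server from the earlier steps to be running.

```shell
# Build a Chat Completions payload; "cairo-coder" is an assumed model name.
BODY='{"model": "cairo-coder", "messages": [{"role": "user", "content": "Write a Cairo function that adds two felts."}], "temperature": 0.7}'

# Sanity-check the JSON before sending it with, e.g.:
#   curl -X POST http://localhost:3001/v1/chat/completions \
#        -H "Content-Type: application/json" -d "$BODY"
echo "$BODY" | python3 -m json.tool > /dev/null && echo "valid json"
```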
@@ -202,7 +201,6 @@ The API accepts all standard OpenAI Chat Completions parameters.
 - `user`: User identifier
 - `response_format`: Response format specification
 
-
 ### Response Format
 
 #### Standard Mode Response

packages/backend/.env.example

Lines changed: 4 additions & 0 deletions

@@ -0,0 +1,4 @@
+LANGCHAIN_TRACING_V2=true
+LANGCHAIN_ENDPOINT="https://api.smith.langchain.com"
+LANGCHAIN_API_KEY=API_KEY
+LANGCHAIN_PROJECT="cairocoder"
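As a quick sanity check, the new `.env.example` contents can be sourced the way a POSIX shell would load them; a temp path is used here instead of `packages/backend/.env.example` so the sketch is self-contained.

```shell
# Recreate the .env.example contents from the diff above.
cat > /tmp/cairo-coder.env <<'EOF'
LANGCHAIN_TRACING_V2=true
LANGCHAIN_ENDPOINT="https://api.smith.langchain.com"
LANGCHAIN_API_KEY=API_KEY
LANGCHAIN_PROJECT="cairocoder"
EOF

# Export every KEY=VALUE pair, as `set -a` + sourcing does.
set -a; . /tmp/cairo-coder.env; set +a
echo "$LANGCHAIN_PROJECT"   # prints: cairocoder
```

Note that a shell strips the surrounding quotes when sourcing, whereas Docker Compose's env_file loader treats the line after `=` literally, so quoting behavior is worth double-checking in whichever loader you use.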

packages/ingester/src/ingesters/OpenZeppelinDocsIngester.ts

Lines changed: 2 additions & 5 deletions
@@ -2,10 +2,7 @@ import * as fs from 'fs';
 import * as fsPromises from 'fs/promises';
 import * as path from 'path';
 import { Document } from '@langchain/core/documents';
-import {
-  BookChunk,
-  DocumentSource,
-} from '@cairo-coder/agents/types/index';
+import { BookChunk, DocumentSource } from '@cairo-coder/agents/types/index';
 import { BookConfig, BookPageDto } from '../utils/types';
 import { logger } from '@cairo-coder/agents/utils/index';
 import { AsciiDocIngesterConfig } from './AsciiDocIngester';

@@ -125,6 +122,6 @@ export class OpenZeppelinDocsIngester extends AsciiDocIngester {
   protected async createChunks(
     pages: BookPageDto[],
   ): Promise<Document<BookChunk>[]> {
-    return super.createChunks(pages, false);
+    return super.createChunks(pages, true);
   }
 }
