ai-chat-backend

This repo contains the code for setting up and running the backend for the AI chat feature. It contains two pieces:

  • scraper - Fetches and vectorizes documentation pages and other content.
  • server - Server process that accepts users' queries and has an LLM answer them.

Please see their respective READMEs for the specifics of setting up and running them. Both services require a connection to a PostgreSQL database with pgvector: scraper needs write access to an embeddings table, while server needs read access to that table and write access to an analytics table. server also requires an OpenAI API key. Both projects share an .env file at the root of the repo.

Both services can read the constants they need from an .env file, located either in their own folder or at the root of the repo.
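
For reference, a minimal .env might look like the sketch below. Only OPENAI_API_KEY is named in this README; the database variable is a placeholder, so check docker/.env.docker and the service READMEs for the actual names.

# Placeholder values; see docker/.env.docker for the real variable names
OPENAI_API_KEY=<your OpenAI API key>
DATABASE_URL=postgres://user:password@localhost:5432/postgres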

Getting Started

Create the env file:

cp docker/.env.docker .env

Then open the .env file and fill in the OPENAI_API_KEY value.

Afterwards, start the app with:

docker compose up

The web server is then available at http://localhost:4567/. See the server README for information on available API endpoints.
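
As a quick sanity check that the server is responding (the exact response depends on the endpoints documented in the server README):

curl http://localhost:4567/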

To populate the document store (or update it to fetch new docs), run:

docker compose run scraper
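
To confirm the scrape populated the vector store, you can count rows in the embeddings table. The table name and connection variable below are assumptions based on the description above; adjust them to match the scraper's actual schema:

psql "$DATABASE_URL" -c "SELECT count(*) FROM embeddings;"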
