The purpose of the `data-agents-service` is to implement LangGraph workflow graphs that receive business questions and answer them with a Large Language Model (LLM), using data from the `analytics-service`. Documentation data is sourced from the Bringg Help pages via Retrieval-Augmented Generation (RAG). The `analytics-service` provides connections to BI dashboards and the reports builder. The graph also supports a human-in-the-loop mechanism and stores conversation threads in Redis so chats can be continued.
- LangGraph Workflow Graphs: Implement workflow graphs to handle business questions.
- LLM Integration: Respond to queries using a Large Language Model.
- Data Source: Utilize documentation from Bringg Help via RAG.
- Analytics Service: Connect to BI dashboards and reports builder.
- Human-in-the-Loop: Support for human intervention in the workflow.
- Redis Thread Storage: Store conversation threads in Redis so interactions can be continued (see the illustrative sketch below).
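
For orientation, the snippet below is a minimal, hypothetical sketch of how a LangGraph workflow with per-thread state can be wired in TypeScript. It is not this service's actual graph: the real graph is more involved (RAG over Bringg Help, analytics-service tools, human-in-the-loop) and persists threads in Redis, whereas this sketch uses the in-memory `MemorySaver` checkpointer. The model name and thread id are placeholders.

```typescript
import { StateGraph, MessagesAnnotation, MemorySaver, START, END } from "@langchain/langgraph";
import { ChatOpenAI } from "@langchain/openai";

// Placeholder model; the real service may use a different provider/model.
const model = new ChatOpenAI({ model: "gpt-4o-mini" });

// Single node that sends the conversation so far to the LLM.
const callModel = async (state: typeof MessagesAnnotation.State) => {
  const response = await model.invoke(state.messages);
  return { messages: [response] };
};

// Compiling with a checkpointer keys conversation state by thread_id.
// The actual service stores threads in Redis; MemorySaver is in-memory only.
const graph = new StateGraph(MessagesAnnotation)
  .addNode("agent", callModel)
  .addEdge(START, "agent")
  .addEdge("agent", END)
  .compile({ checkpointer: new MemorySaver() });

// Invoking again with the same thread_id continues the stored conversation.
await graph.invoke(
  { messages: [{ role: "user", content: "How many drivers do I have?" }] },
  { configurable: { thread_id: "example-thread" } }
);
```
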
To get started with the `data-agents-service`, follow these steps:
1. Clone the repository:

   ```bash
   git clone [email protected]:bringg/data-agents-service.git
   ```

2. Install dependencies:

   ```bash
   cd data-agents-service
   npm install
   ```

3. Run analytics-service and dongosolo locally: see Setup Analytics-Service locally.

4. Start the service:

   ```bash
   npm run start-dev
   ```
Once the service is running, you can interact with it by sending business questions to the endpoints provided below. The service will process the questions using the LangGraph workflow and respond with answers based on the data from the analytics-service.
- POST `http://localhost:3010/chat`

  Body:

  ```json
  { "initialMessage": "what is the driver with the biggest amount of completed orders according to the reports?" }
  ```

- POST `http://localhost:3010/chat/{thread_id}`

  Body:

  ```json
  { "message": "I also need you to check how many drivers do I have." }
  ```

- GET `http://localhost:3010/chat/{thread_id}`
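
As an illustration, the sketch below calls these endpoints from Node 18+ using the built-in `fetch`. The response shape is an assumption here; in particular, the `thread_id` field name and the GET endpoint returning JSON are placeholders, so check the actual payloads when wiring this up.

```typescript
// Hypothetical client sketch for the endpoints above (Node 18+, built-in fetch).
const BASE_URL = "http://localhost:3010";

// Start a new chat with an initial business question.
const startRes = await fetch(`${BASE_URL}/chat`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    initialMessage:
      "what is the driver with the biggest amount of completed orders according to the reports?",
  }),
});
const started = await startRes.json();

// Assumed field name; the real response may expose the thread id differently.
const threadId = started.thread_id ?? "<thread_id>";

// Continue the same conversation on the existing thread.
await fetch(`${BASE_URL}/chat/${threadId}`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ message: "I also need you to check how many drivers do I have." }),
});

// Read back the thread (assuming the GET endpoint returns JSON).
const history = await (await fetch(`${BASE_URL}/chat/${threadId}`)).json();
console.log(history);
```
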
We welcome contributions to improve the `data-agents-service`. Please fork the repository and submit pull requests.