Using OpenInference and OpenTelemetry to send traces from your LangChain MCP agent app to SigNoz
First, install all the necessary dependencies for the backend:
Optional: create a Python virtual environment:

```bash
python -m venv myenv && \
source myenv/bin/activate
```

Then:

```bash
pip install -r requirements.txt
```

Install all the necessary dependencies for the frontend:

```bash
cd frontend && \
npm install
```

Next, create a `.env` file in the root directory with the following:
```bash
OPENAI_API_KEY=<your-openai-api-key>
SIGNOZ_INGESTION_KEY=<your-signoz-ingestion-key>
```

Run the FastAPI backend:

```bash
uvicorn main:app --reload --port 8001
```

Run the frontend:
```bash
cd frontend && \
npm start
```

Set up the MCP server: follow the README in the following repo to set up and deploy the SigNoz MCP server locally (it listens on localhost port 8000 by default).
Open http://localhost:3000 with your browser to see the result and interact with the application.
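The piece that actually sends traces to SigNoz is the OpenInference/OpenTelemetry wiring inside the backend. A minimal sketch of that setup is shown below; it assumes SigNoz Cloud's OTLP-over-HTTP traces endpoint (the `<region>` placeholder and the `service.name` value are assumptions you should adjust) and reads the ingestion key from the `SIGNOZ_INGESTION_KEY` variable defined in `.env`:

```python
# Sketch: instrument LangChain with OpenInference and export spans to SigNoz via OTLP/HTTP.
# Assumes `pip install openinference-instrumentation-langchain opentelemetry-sdk \
#   opentelemetry-exporter-otlp-proto-http` and a SIGNOZ_INGESTION_KEY in the environment.
import os

from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from openinference.instrumentation.langchain import LangChainInstrumentor

# Name the service so spans are grouped sensibly in SigNoz (name is an assumption).
resource = Resource.create({"service.name": "langchain-mcp-agent"})
provider = TracerProvider(resource=resource)

# SigNoz Cloud OTLP/HTTP endpoint; replace <region> with your SigNoz region.
exporter = OTLPSpanExporter(
    endpoint="https://ingest.<region>.signoz.cloud:443/v1/traces",
    headers={"signoz-ingestion-key": os.environ["SIGNOZ_INGESTION_KEY"]},
)
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Auto-instrument all LangChain runs (chains, LLM calls, tool/MCP calls) as spans.
LangChainInstrumentor().instrument(tracer_provider=provider)
```

Run this once at backend startup (before any LangChain objects are created) so every agent invocation is traced.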
Go to the Dashboards tab in SigNoz.
Click on + New Dashboard
Import the langchain-mcp-dashboard.json file from the repo.
Your dashboard should now be imported and look something like this:

