Calcu-LLM-Agent is a locally run mathematical reasoning agent that leverages the power of local LLMs through open-source frameworks (Ollama, LangChain, SmolAgents, and LlamaIndex) to answer a wide range of math questions. Why all of these frameworks? They are included so the same tasks can be solved with each one and their performance compared. The agent combines tool calling (math functions) with locally running Ollama models to provide accurate, step-by-step solutions.
- Local LLMs: Uses Ollama to run large language models (e.g., Mistral, Llama) entirely on your machine, with no cloud required!
- Framework Synergy: Integrates LangChain, SmolAgents, and LlamaIndex for robust agent orchestration and retrieval-augmented generation.
- Math Tool Calling: Automatically invokes math functions (symbolic, numeric, plotting, etc.) to solve questions; see the sketch after this list.
- Extensible: Easily add new tools or math capabilities.
- Multiple Interfaces: Use via CLI or Python API.
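As a rough illustration of how tool calling works against a local model, here is a minimal sketch that binds a SymPy-backed derivative tool to an Ollama-served model through LangChain. The tool name, the `mistral` model choice, and the prompt are illustrative assumptions rather than the project's actual code, and it assumes `ollama serve` is running with the model already pulled.

```python
# Minimal tool-calling sketch (illustrative only, not the project's actual code).
# Assumes Ollama is running locally and the `mistral` model has been pulled.
import sympy as sp
from langchain_core.tools import tool
from langchain_ollama import ChatOllama


@tool
def differentiate(expression: str, variable: str = "x") -> str:
    """Return the symbolic derivative of `expression` with respect to `variable`."""
    symbol = sp.Symbol(variable)
    return str(sp.diff(sp.sympify(expression), symbol))


llm = ChatOllama(model="mistral")                # any locally pulled Ollama model
llm_with_tools = llm.bind_tools([differentiate])

response = llm_with_tools.invoke("What is the derivative of x**3 + 2*x?")
for call in response.tool_calls:                 # the model decides which tool to call
    print(call["name"], call["args"])
```

In a full agent loop, the requested tool call is executed and its result is fed back to the model so it can compose the final step-by-step answer.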
- Ollama for local LLMs
- LangChain
- SmolAgents
- LlamaIndex
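For the framework comparison mentioned above, the same kind of tool can also be exposed through SmolAgents. The sketch below is an assumption-laden example that reaches a local Ollama endpoint via SmolAgents' `LiteLLMModel`; the model id, endpoint, and tool are hypothetical placeholders, not the project's code.

```python
# Hedged sketch: a derivative tool exposed through SmolAgents instead of LangChain
# (illustrative only; model id, endpoint, and tool name are assumptions).
import sympy as sp
from smolagents import CodeAgent, LiteLLMModel, tool


@tool
def differentiate(expression: str, variable: str) -> str:
    """Differentiate a symbolic expression.

    Args:
        expression: the expression to differentiate, e.g. "x**3 + 2*x".
        variable: the variable to differentiate with respect to, e.g. "x".
    """
    return str(sp.diff(sp.sympify(expression), sp.Symbol(variable)))


# LiteLLMModel routes the request to the local Ollama server through LiteLLM.
model = LiteLLMModel(model_id="ollama_chat/mistral", api_base="http://localhost:11434")
agent = CodeAgent(tools=[differentiate], model=model)
print(agent.run("Find the derivative of x**3 + 2*x and explain each step."))
```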
Happy Calculating!