A Python package containing easy-to-use tools for working with various language models and AI services. AIMU is designed primarily for running models locally via Ollama or Hugging Face Transformers, but it can also be used with cloud models (OpenAI, Anthropic, Google, etc.) through experimental aisuite support.
- Model Clients: Support for multiple AI model providers, including:
  - Ollama (local models)
  - Hugging Face Transformers (local models)
  - Cloud providers such as OpenAI, Anthropic, Google, and AWS (via aisuite)
- MCP Tools: Model Context Protocol (MCP) client for enhancing AI capabilities
- Chat Conversation (Memory) Storage/Management: TinyDB-based memory for chat conversations
- Prompt Storage/Management: SQLAlchemy-based prompt catalog for storing and versioning prompts
AIMU can be installed with Ollama support, Hugging Face (Transformers) support, or both! For both, simply install the full package.
```shell
pip install aimu
```
Alternatively, for Ollama-only support:
```shell
pip install 'aimu[ollama]'
```
Or for Hugging Face:
```shell
pip install 'aimu[hf]'
```
Once you've cloned the repository, run:
```shell
pip install -e .
```
For developer tools (tests, linting/formatting):
```shell
pip install -e '.[dev]'
```
```python
from aimu.models import OllamaClient as ModelClient  # or HuggingFaceClient

model_client = ModelClient(ModelClient.MODEL_LLAMA_3_1_8B)
response = model_client.generate("What is the capital of France?", {"temperature": 0.7})
```
```python
from aimu.models import OllamaClient as ModelClient

model_client = ModelClient(ModelClient.MODEL_LLAMA_3_1_8B)
response = model_client.chat("What is the capital of France?")
```
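Chat clients typically keep the running conversation as a list of role/content dictionaries; the exact shape of `model_client.messages` is an assumption here, modeled on the common chat-message convention:

```python
# Conceptual sketch of a chat message history (assumed shape, not AIMU's
# documented internal format): each turn is a dict with "role" and "content".
messages = [
    {"role": "system", "content": "You are a helpful assistant"},
    {"role": "user", "content": "What is the capital of France?"},
]

# The model's reply is appended with the "assistant" role, so the whole
# exchange can be replayed or persisted later.
messages.append({"role": "assistant", "content": "The capital of France is Paris."})

roles = [m["role"] for m in messages]
```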
```python
from aimu.tools import MCPClient

mcp_client = MCPClient({
    "mcpServers": {
        "mytools": {"command": "python", "args": ["tools.py"]},
    }
})

mcp_client.call_tool("mytool", {"input": "hello world!"})
```
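For intuition, the `mcpServers` config maps a server name to the command that launches it, and `call_tool` routes a request to a tool exposed by one of those servers. A toy, dependency-free sketch of that dispatch (class and method names here are hypothetical, not AIMU's implementation):

```python
# Toy sketch of MCP-style tool dispatch (illustrative only): a registry of
# named callables stands in for the subprocess-backed servers a real MCP
# client would launch from the "mcpServers" config.
class ToyMCPClient:
    def __init__(self, config):
        self.config = config  # e.g. {"mcpServers": {...}}
        self.tools = {}       # tool name -> callable

    def register_tool(self, name, fn):
        self.tools[name] = fn

    def call_tool(self, name, arguments):
        # A real client would serialize this call over stdio/JSON-RPC to the
        # server process; here we just invoke the registered callable.
        return self.tools[name](**arguments)

client = ToyMCPClient({"mcpServers": {"mytools": {"command": "python", "args": ["tools.py"]}}})
client.register_tool("mytool", lambda input: input.upper())
result = client.call_tool("mytool", {"input": "hello world!"})
```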
```python
from aimu.models import OllamaClient as ModelClient
from aimu.tools import MCPClient

mcp_client = MCPClient({
    "mcpServers": {
        "mytools": {"command": "python", "args": ["tools.py"]},
    }
})

model_client = ModelClient(ModelClient.MODEL_LLAMA_3_1_8B)
model_client.mcp_client = mcp_client

model_client.chat(
    "use my tool please",
    tools=mcp_client.get_tools()
)
```
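When tools are passed to `chat`, the usual flow is: the model responds with a tool-call request, the client executes it, and the result is fed back to the model as a tool message. AIMU handles this internally; the sketch below only illustrates the loop, and its field names are assumptions modeled on common chat APIs:

```python
# Conceptual tool-calling round trip (illustrative; "name"/"arguments" and
# the "tool" role are assumed conventions, not AIMU's actual wire format).
def run_tool_calls(tool_calls, call_tool):
    """Execute each requested tool call and collect results to send back."""
    results = []
    for call in tool_calls:
        output = call_tool(call["name"], call["arguments"])
        results.append({"role": "tool", "name": call["name"], "content": output})
    return results

# Pretend the model asked for one tool invocation:
requested = [{"name": "mytool", "arguments": {"input": "hello world!"}}]
tool_messages = run_tool_calls(requested, lambda name, args: args["input"].upper())
```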
```python
from aimu.models import OllamaClient as ModelClient
from aimu.memory import ConversationManager

chat_manager = ConversationManager("conversations.json", use_last_conversation=True)  # loads the last saved conversation

model_client = ModelClient(ModelClient.MODEL_LLAMA_3_1_8B)
model_client.messages = chat_manager.messages

model_client.chat("What is the capital of France?")

chat_manager.update_conversation(model_client.messages)  # store the updated conversation
```
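The manager persists message history to a JSON file via TinyDB. The round trip can be pictured with the standard-library `json` module (a simplified stand-in for the actual TinyDB-backed storage):

```python
import json
import os
import tempfile

# Simplified stand-in for TinyDB-backed conversation storage: save the
# message list to a JSON file, then load the last conversation back.
messages = [
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "Paris."},
]

path = os.path.join(tempfile.mkdtemp(), "conversations.json")
with open(path, "w") as f:
    json.dump({"conversations": [messages]}, f)

with open(path) as f:
    restored = json.load(f)["conversations"][-1]  # the "last saved conversation"
```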
```python
from aimu.prompts import PromptCatalog, Prompt

catalog = PromptCatalog("prompts.db")
prompt = Prompt("You are a helpful assistant", model_id="llama3.1:8b", version=1)
catalog.store_prompt(prompt)
```
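The catalog versions prompts per model, so revisions can be kept side by side. A minimal in-memory sketch of that versioning idea (the real catalog persists via SQLAlchemy; the class and methods below are illustrative, not AIMU's API):

```python
# In-memory sketch of per-model prompt versioning (illustrative only).
# Each model_id keeps an ordered list of prompt texts; index = version - 1.
class ToyPromptCatalog:
    def __init__(self):
        self.prompts = {}  # model_id -> list of prompt texts

    def store_prompt(self, model_id, text):
        versions = self.prompts.setdefault(model_id, [])
        versions.append(text)
        return len(versions)  # the new version number

    def get_prompt(self, model_id, version=None):
        versions = self.prompts[model_id]
        return versions[(version or len(versions)) - 1]  # default: latest

catalog = ToyPromptCatalog()
v1 = catalog.store_prompt("llama3.1:8b", "You are a helpful assistant")
v2 = catalog.store_prompt("llama3.1:8b", "You are a concise assistant")
latest = catalog.get_prompt("llama3.1:8b")
```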
This project is licensed under the Apache 2.0 license.