AgentMake AI: a software development kit for developing agentic AI applications that supports 15 AI backends and integrates tools and agents. (Developer: Eliran Wong)
Supported backends: anthropic, azure, cohere, custom, deepseek, genai, github, googleai, groq, llamacpp, mistral, ollama, openai, vertexai, xai
This SDK incorporates the best aspects of our favorite projects, LetMeDoIt AI, ToolMate AI, and TeamGen AI, to create a library aimed at further advancing the development of agentic AI applications.
- `anthropic` - Anthropic API
- `azure` - Azure OpenAI API
- `cohere` - Cohere API
- `custom` - any OpenAI-compatible backends that support function calling
- `deepseek` - DeepSeek API
- `genai` - Vertex AI or Google AI
- `github` - GitHub API
- `googleai` - Google AI
- `groq` - Groq Cloud API
- `llamacpp` - Llama.cpp Server - local setup required
- `mistral` - Mistral API
- `ollama` - Ollama - local setup required
- `openai` - OpenAI API
- `vertexai` - Vertex AI
- `xai` - xAI API
For simplicity, agentmake uses `ollama` as the default backend if the parameter `backend` is not specified. Ollama models are downloaded automatically if they have not already been downloaded. Users can change the default backend by modifying the environment variable `DEFAULT_AI_BACKEND`.
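For example, the variable can be set in the process environment before agentmake is imported. A minimal sketch (`openai` is used here only as an example; any supported backend name listed above works):

```python
import os

# Override the default backend (ollama) before importing agentmake,
# so that generate() defaults to the chosen backend instead.
os.environ["DEFAULT_AI_BACKEND"] = "openai"

# from agentmake import generate  # generate() would now default to openai
```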
Basic:
pip install --upgrade agentmake
Basic installation supports all AI backends mentioned above, except for `vertexai`.
Extras:
We support Vertex AI via the Google GenAI SDK. As this package supports most platforms, except Android Termux, we separate the package `google-genai` as an extra. To support Vertex AI with agentmake, install by running:
pip install --upgrade agentmake[genai]
This SDK is designed to offer a single function, `generate`, for interacting with all AI backends, delivering a unified experience for generating AI responses. The main APIs are provided via the `generate` function located in this file.
Find documentation at https://github.com/eliranwong/agentmake/blob/main/docs/README.md
The following examples assume Ollama is installed as the default backend.
To import:
from agentmake import generate
To generate, e.g.:
generate("What is AI?")
To work with the parameter `tool`, e.g.:
generate("What is ToolMate AI?", tool="search_google")
generate("How many 'r's are there in the word 'strawberry'?", tool="task")
generate("What time is it right now?", tool="task")
generate("Open github.com in a web browser.", tool="task")
generate("Convert file 'music.wav' into mp3 format.", tool="task")
generate("Send an email to Eliran Wong at [email protected] to express my gratitude for his work.", tool="send_gmail")
To work with the parameters `input_content_plugin` and `output_content_plugin`, e.g.:
generate("what AI model best", input_content_plugin="improve_writing", output_content_plugin="translate_into_chinese", stream=True)
To work with the parameters `system`, `context`, and `follow_up_prompt`, e.g.:
generate("Is it better to drink wine in the morning, afternoon, or evening?", context="reflect", stream=True)
generate("Is it better to drink wine in the morning, afternoon, or evening?", context="think", follow_up_prompt=["review", "refine"], stream=True)
generate("Provide a detailed introduction to generative AI.", system=["create_agents", "assign_agents"], follow_up_prompt="Who is the best agent to contribute next?", stream=True, model="llama3.3:70b")
To work with the parameter `agent`, e.g.:
generate("Write detailed comments about the works of William Shakespeare, focusing on his literary contributions, dramatic techniques, and the profound impact he has had on the world of literature and theatre.", agent="teamgenai", stream=True, model="llama3.3:70b")
To work collaboratively with different backends, e.g.:
messages = generate("What is the most effective method for training AI models?", backend="openai")
messages = generate(messages, backend="googleai", follow_up_prompt="Can you give me some different options?")
messages = generate(messages, backend="xai", follow_up_prompt="What are the limitations or potential biases in this information?")
generate(messages, backend="mistral", follow_up_prompt="Please provide a summary of the discussion so far.")
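For illustration, the `messages` value threaded through these calls can be pictured as an OpenAI-style list of role/content dictionaries. This is a sketch of the shape only; agentmake builds and extends the list for you, and the helper below is not part of the SDK:

```python
# Hypothetical snapshot of a messages list after one round trip; the exact
# contents come from the backend, but the role/content shape is the standard
# chat format.
messages = [
    {"role": "system", "content": "You are an AI assistant."},
    {"role": "user", "content": "What is the most effective method for training AI models?"},
    {"role": "assistant", "content": "...the model's answer..."},
]

def last_assistant_reply(messages):
    """Helper (not part of agentmake): pull the latest assistant turn."""
    for message in reversed(messages):
        if message["role"] == "assistant":
            return message["content"]
    return None
```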
As you may see, the `generate` function returns the `messages` list, which is passed to the next `generate` function in turn. Therefore, it is very simple to create a chatbot application; you can do it in as few as five lines, e.g.:
messages = [{"role": "system", "content": "You are an AI assistant."}]
user_input = "Hello!"
while user_input:
    messages = generate(messages, follow_up_prompt=user_input, stream=True)
    user_input = input("Enter your query:\n(enter a blank entry to exit)\n>>> ")
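The same loop can also be restructured into a function. To keep this sketch runnable without any backend installed, a stub stands in for agentmake's `generate` below; the stub is an assumption for illustration only, not agentmake code:

```python
def chat_loop(generate, prompts):
    """Feed each prompt to generate() in turn, threading the messages list."""
    messages = [{"role": "system", "content": "You are an AI assistant."}]
    for user_input in prompts:
        messages = generate(messages, follow_up_prompt=user_input, stream=True)
    return messages

def fake_generate(messages, follow_up_prompt=None, stream=False):
    # Minimal stand-in for agentmake's generate: record the user turn
    # and append a canned assistant reply.
    return messages + [
        {"role": "user", "content": follow_up_prompt},
        {"role": "assistant", "content": f"Echo: {follow_up_prompt}"},
    ]

history = chat_loop(fake_generate, ["Hello!", "What is AI?"])
```

In a real application, pass agentmake's own `generate` (from `agentmake import generate`) instead of the stub.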
These are just a few simple and straightforward examples. You may find more examples at:
https://github.com/eliranwong/agentmake/tree/main/agentmake/examples
Pending:
- add documentation about tool creation
- add examples
- convert available ToolMate AI tools into tools that are runnable with this SDK
- add built-in system messages
- add built-in predefined contexts
- add built-in prompts
- add cli options for running simple inference, tools or testing
- improve code generation handling
- add backend support of Cohere API