Declarative AI components - make AI component initialization easy with auto-imports and prebuilt YAML configs.
### [LangChain](https://www.langchain.com/)

```bash
pip install aini[lang]
```
List all available YAML config files for LangChain components:
```python
In [1]: from aini import alist

In [2]: alist(key='lang')
╭─────────────────────────────────────────────────────────────────────────────╮
│ Found 9 YAML file(s)                                                        │
│ └── aini / Site-Packages: C:/Python3/Lib/site-packages/aini/                │
│     ├── lang/                                                               │
│     │   ├── config.yml: config                                              │
│     │   ├── graph.yml: state_graph                                          │
│     │   ├── llm.yml: ds, r1, sf-qwen, sf-qwen-14b, sf-qwen-30b, sf-qwen-32b │
│     │   ├── memory.yml: instore, saver                                      │
│     │   ├── msg.yml: msg_state, sys, human, user, ai, invoke, prompt        │
│     │   ├── react.yml: agent                                                │
│     │   ├── supervisor.yml: supervisor                                      │
│     │   └── tools.yml: tavily                                               │
│     └── lang_book/                                                          │
│         └── idea_validator.yml: clarifier, researcher, competitor, report   │
╰─────────────────────────────────────────────────────────────────────────────╯
```
See the exact configuration for a component by passing `araw=True`:
```python
In [3]: from aini import aini

In [4]: aini('lang/llm:ds', araw=True)
Out[4]:
{
  'class': 'langchain_deepseek.ChatDeepSeek',
  'params': {'model': 'deepseek-chat'}
}
```
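Under the hood, `aini` auto-imports the class and calls it with the resolved params; written out by hand, this is roughly (assuming `langchain-deepseek` is installed):

```python
from langchain_deepseek import ChatDeepSeek

# What aini('lang/llm:ds') constructs, written out manually
ds = ChatDeepSeek(model='deepseek-chat')
```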
Initialize and use the component directly:
```python
# Instantiate the DeepSeek LLM (make sure DEEPSEEK_API_KEY is set)
In [5]: ds = aini('lang/llm:ds')

# Use the model (example: send a message)
In [6]: ds.invoke('hi').pretty_print()
======================== Ai Message ========================

Hello! 😊 How can I assist you today?
```
```python
In [7]: from aini import aview

In [8]: aview(ds.invoke('hi'))
<langchain_core.messages.ai.AIMessage>
{
  'content': 'Hello! 😊 How can I assist you today?',
  'response_metadata': {
    'token_usage': {'completion_tokens': 11, 'prompt_tokens': 4, 'total_tokens': 15, 'prompt_cache_miss_tokens': 4},
    'model_name': 'deepseek-chat',
    'system_fingerprint': 'fp_8802369eaa_prod0425fp8',
    'id': '2be77461-5d07-4f95-8976-c3a782e1799b',
    'finish_reason': 'stop'
  },
  'type': 'ai',
  'id': 'run--5cdbede5-9545-441e-a137-ebe25699bf36-0',
  'usage_metadata': {'input_tokens': 4, 'output_tokens': 11, 'total_tokens': 15}
}
```
```python
In [9]: from aini import ameth

In [10]: ameth(ds)
Out[10]:
['invoke', 'predict', 'stream', ...]
```
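`stream` is part of the standard LangChain Runnable API, so you can, for instance, stream the reply token by token (continuing the session above):

```python
# Stream the response chunk by chunk instead of waiting for the full message
for chunk in ds.stream('Tell me a joke'):
    print(chunk.content, end='', flush=True)
```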
Use environment variables, input variables, or defaults in your YAML:
```yaml
llm:
  class: "langchain_deepseek.ChatDeepSeek"
  params:
    api_key: ${DEEPSEEK_API_KEY}
    model: ${model|'deepseek-chat'}
    temperature: ${temp|0.7}
```
Resolution priority:

1. Input variables (kwargs to `aini()`)
2. Environment variables
3. `defaults` section in YAML
4. Fallback after `|`
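A sketch of that order for the `${model|'deepseek-chat'}` placeholder above (assuming the config lives in `lang/llm.yml` and that environment lookup uses the same variable name):

```python
import os
from aini import aini

llm = aini('lang/llm', model='deepseek-reasoner')  # 1. explicit kwarg wins

os.environ['model'] = 'deepseek-chat'              # 2. else an environment
llm = aini('lang/llm')                             #    variable with that name

# 3. else a `defaults` entry in the YAML; 4. else the fallback after `|`
```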
You can pass additional parameters when initializing a component (this works only when the key resolves to a single component):
```python
In [11]: ds = aini('lang/llm:ds', max_tokens=100)
```
Specify custom init methods:
```yaml
model_client:
  class: autogen_core.models.ChatCompletionClient
  init: load_component
  params:
    model: ${model}
    expected: ${expected}
```
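When `init` is set, the named classmethod is called instead of the constructor, with `params` passed as keyword arguments; conceptually this expands to (a sketch, assuming `autogen-core` is installed):

```python
from autogen_core.models import ChatCompletionClient

# `model_cfg` and `expected_type` stand in for the resolved ${model}
# and ${expected} values from the config above
model_client = ChatCompletionClient.load_component(model=model_cfg, expected=expected_type)
```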
```python
In [13]: import operator

In [14]: from functools import reduce

# Get the agents defined in idea_validator.yml
In [15]: agents = aini('lang_book/idea_validator')

# Chain them together
In [16]: workflow = reduce(operator.or_, agents.values())

# Get a report from the workflow
In [17]: ans = workflow.invoke({'messages': 'Consistency check for AI agents'})

In [18]: ans['messages'][-1].pretty_print()
```
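Assuming `agents` is a dict keyed by component name and preserving the YAML order, the `reduce` call is just the chained pipe operator, equivalent to:

```python
# Equivalent explicit chain (keys as listed in idea_validator.yml)
workflow = (agents['clarifier'] | agents['researcher']
            | agents['competitor'] | agents['report'])
```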
### [AutoGen](https://github.com/microsoft/autogen)

```bash
pip install aini[autogen]
```
```python
In [17]: client = aini('autogen/client', model=aini('autogen/llm:ds'))

In [18]: agent = aini('autogen/assistant', name='deepseek', model_client=client)

In [19]: ans = await agent.run(task='What is your name')

In [20]: aview(ans)
```
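`agent.run` is a coroutine; the bare `await` above works because IPython supports top-level await. In a plain script, drive it with `asyncio.run` (a minimal sketch):

```python
import asyncio
from aini import aini, aview

async def main():
    client = aini('autogen/client', model=aini('autogen/llm:ds'))
    agent = aini('autogen/assistant', name='deepseek', model_client=client)
    ans = await agent.run(task='What is your name')
    aview(ans)

asyncio.run(main())
```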
### [Agno](https://github.com/agno-agi/agno)

```bash
pip install aini[agno]
```
```python
In [21]: agent = aini('agno/agent', tools=[aini('agno/tools:google')])

In [22]: ans = agent.run('Compare MCP and A2A')

# exc_keys excludes the listed keys (here 'metrics') from the rendered view
In [23]: aview(ans, exc_keys=['metrics'])
```
### [Mem0](https://mem0.ai/)
```bash
pip install aini[mem0]
```

```python
In [24]: memory = aini('mem0/memory:mem0')
```
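Assuming the config wraps mem0's `Memory` class, the returned object exposes the usual mem0 methods, e.g.:

```python
# Store a fact, then search it back (standard mem0 Memory API)
memory.add('I prefer dark roast coffee', user_id='alice')
hits = memory.search('What coffee does Alice like?', user_id='alice')
```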
Components are defined in YAML or JSON, with support for defaults, variable substitution, and nested components:
```yaml
defaults:
  api_key: "default-key-value"
  temperature: 0.7

assistant:
  class: autogen_agentchat.agents.AssistantAgent
  params:
    name: ${name}
    model_client: ${model_client|client}
    tools: ${tools}
```
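With a file like this (say `assistant.yml`, a hypothetical name), kwargs fill the `${...}` placeholders, and anything left unset resolves via the `defaults` section or the fallback after `|`:

```python
from aini import aini

# `name`, `model_client`, and `tools` fill the ${...} placeholders;
# `client` is a model client created earlier, e.g. via aini('autogen/client')
agent = aini('assistant', name='helper', model_client=client, tools=[])
```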
Install the core package (framework extras like `aini[lang]` are optional):

```bash
pip install aini
```