AINI


Declarative AI components - make AI component initialization easy with auto-imports and prebuilt YAML configs.


🚀 Quick Start: Discover, Inspect, Use

0. Install (with LangChain support)

pip install aini[lang]

1. Discover Available LangChain Configurations

List all available YAML config files for LangChain components:

In [1]: from aini import alist

In [2]: alist(key='lang')
Found 9 YAML file(s)
└── aini / Site-Packages: C:/Python3/Lib/site-packages/aini/
    ├── lang/
    │   ├── config.yml: config
    │   ├── graph.yml: state_graph
    │   ├── llm.yml: ds, r1, sf-qwen, sf-qwen-14b, sf-qwen-30b, sf-qwen-32b
    │   ├── memory.yml: instore, saver
    │   ├── msg.yml: msg_state, sys, human, user, ai, invoke, prompt
    │   ├── react.yml: agent
    │   ├── supervisor.yml: supervisor
    │   └── tools.yml: tavily
    └── lang_book/
        └── idea_validator.yml: clarifier, researcher, competitor, report

2. Inspect a Component's Configuration

See the exact configuration behind a component key by passing araw=True:

In [3]: from aini import aini

In [4]: aini('lang/llm:ds', araw=True)
Out [4]:
{
  'class': 'langchain_deepseek.ChatDeepSeek',
  'params': {'model': 'deepseek-chat'}
}

3. Instantiate and Use the Component

Initialize and use the component directly:

# Instantiate the DeepSeek LLM (make sure DEEPSEEK_API_KEY is set)
In [5]: ds = aini('lang/llm:ds')

# Use the model (example: send a message)
In [6]: ds.invoke('hi').pretty_print()
======================== Ai Message ========================
Hello! 😊 How can I assist you today?
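
If DEEPSEEK_API_KEY is not already exported in your shell, one way to provide it from Python before creating the model (the key value below is only a placeholder):

import os

# Placeholder key; replace with your real DeepSeek API key,
# or export DEEPSEEK_API_KEY in the shell instead.
os.environ.setdefault('DEEPSEEK_API_KEY', 'sk-...')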

🧑‍💻 Extended Usage

Visualize and Debug

In [7]: from aini import aview

In [8]: aview(ds.invoke('hi'))
<langchain_core.messages.ai.AIMessage>
{
  'content': 'Hello! 😊 How can I assist you today?',
  'response_metadata': {
    'token_usage': {'completion_tokens': 11, 'prompt_tokens': 4, 'total_tokens': 15, 'prompt_cache_miss_tokens': 4},
    'model_name': 'deepseek-chat',
    'system_fingerprint': 'fp_8802369eaa_prod0425fp8',
    'id': '2be77461-5d07-4f95-8976-c3a782e1799b',
    'finish_reason': 'stop'
  },
  'type': 'ai',
  'id': 'run--5cdbede5-9545-441e-a137-ebe25699bf36-0',
  'usage_metadata': {'input_tokens': 4, 'output_tokens': 11, 'total_tokens': 15}
}

List Methods

In [9]: from aini import ameth

In [10]: ameth(ds)
Out [10]:
['invoke', 'predict', 'stream', ...]

🛠️ Advanced Features

Variable Substitution

Use environment variables, input variables, or defaults in your YAML:

llm:
  class: "langchain_deepseek.ChatDeepSeek"
  params:
    api_key: ${DEEPSEEK_API_KEY}
    model: ${model|'deepseek-chat'}
    temperature: ${temp|0.7}

Resolution priority:

  1. Input variables (kwargs to aini())
  2. Environment variables
  3. Defaults section in YAML
  4. Fallback after |
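
For example, assuming the llm config above is saved in a local file that aini can resolve (the file name my_llm.yml and override values here are illustrative):

from aini import aini

# Hypothetical local config file 'my_llm.yml' holding the 'llm' entry above.
# 'temp' is supplied as an input variable (highest priority); 'model' is not
# passed, so it falls through the environment, the defaults section, and
# finally the literal fallback after '|' ('deepseek-chat').
llm = aini('my_llm:llm', temp=0.2)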

Additional Parameters

You can pass additional parameters as keyword arguments when initializing a component (this only applies when the key resolves to a single component):

In [11]: ds = aini('lang/llm:ds', max_tokens=100)

Custom Initialization

Specify a custom init method to be called instead of the class constructor:

model_client:
  class: autogen_core.models.ChatCompletionClient
  init: load_component
  params:
    model: ${model}
    expected: ${expected}
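
Conceptually, the init entry tells aini to call the named method rather than the constructor. A minimal sketch of that behaviour (illustrative only, not aini's actual source):

import importlib
from typing import Optional

def build(class_path: str, params: dict, init: Optional[str] = None):
    # Resolve the class from its dotted path,
    # e.g. 'autogen_core.models.ChatCompletionClient'
    module_name, cls_name = class_path.rsplit('.', 1)
    cls = getattr(importlib.import_module(module_name), cls_name)
    # Use the named method (e.g. load_component) as the factory when 'init' is set
    factory = getattr(cls, init) if init else cls
    return factory(**params)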

📚 More Examples

LangGraph Workflow

In [13]: import operator
In [14]: from functools import reduce

# Load the idea-validator agents from the lang_book configs
In [15]: agents = aini('lang_book/idea_validator')

# Chain them together with the | operator
In [16]: workflow = reduce(operator.or_, agents.values())

# Get a report from the workflow
In [17]: ans = workflow.invoke({'messages': 'Consistency check for AI agents'})
In [18]: ans['messages'][-1].pretty_print()

AutoGen

pip install aini[autogen]

In [19]: client = aini('autogen/client', model=aini('autogen/llm:ds'))
In [20]: agent = aini('autogen/assistant', name='deepseek', model_client=client)
In [21]: ans = await agent.run(task='What is your name')
In [22]: aview(ans)

Agno

pip install aini[agno]

In [23]: agent = aini('agno/agent', tools=[aini('agno/tools:google')])
In [24]: ans = agent.run('Compare MCP and A2A')
In [25]: aview(ans, exc_keys=['metrics'])

Mem0 (https://mem0.ai/)

pip install aini[mem0]

In [26]: memory = aini('mem0/memory:mem0')

📝 Configuration File Format

YAML or JSON, with support for defaults, variable substitution, and nested components.

defaults:
  api_key: "default-key-value"
  temperature: 0.7

assistant:
  class: autogen_agentchat.agents.AssistantAgent
  params:
    name: ${name}
    model_client: ${model_client|client}
    tools: ${tools}
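
A hedged usage sketch for a config like the one above (the file name my_agents.yml is hypothetical; name, model_client, and tools are supplied as input variables, mirroring the AutoGen example earlier):

from aini import aini

# Hypothetical local file 'my_agents.yml' containing the 'assistant' entry above
client = aini('autogen/client', model=aini('autogen/llm:ds'))
assistant = aini('my_agents:assistant', name='helper',
                 model_client=client, tools=[])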

📦 Installation

pip install aini

Optional extras used in this README: aini[lang], aini[autogen], aini[agno], aini[mem0].
