Overview
Introduce a fully modular execution mode in avante.nvim, where users can orchestrate a pipeline using:
.avanterules: reusable prompt templates for consistent behavior and persona.
.avantechain: declarative chain execution logic (e.g. run this step → conditionally continue).
Lua glue: small, user-written utility functions for custom steps.
This mode would allow anyone to define powerful, project-specific assistants using just simple configuration and Lua glue — no hardcoding of prompts, agents, models, or flows.
Design Philosophy
Not all nodes are agents: some steps are just simple inference, while others are minimal Lua functions or wrappers around model calls.
No built-in agents required: users define their own context-fetchers, editors, or tools based on their needs.
Minimal Lua exposure: users only write small utility functions when needed — all orchestration handled via .avantechain.
Composable, rule-driven: workflows follow explicit rules defined in .avanterules — fully deterministic, no surprises.
Chain branching: next steps can depend on model output or Lua function results, allowing conditional logic in workflows (see the sketch after this list).
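To make the branching rule concrete, here is a minimal sketch of the kind of Lua predicate a router-style step could call to decide whether the chain continues. The proposal does not fix an API, so the module layout and the `result.output` field are assumptions for illustration only.

```lua
-- Hypothetical branching predicate for a router-style step. The shape of the
-- previous step's result is not defined by the proposal; `result.output` is
-- assumed here purely for illustration.
local M = {}

-- Return true when the previous step's output suggests a refactor, so the
-- chain can continue to a cheaper editing step; return false to stop.
function M.should_refactor(result)
  local text = (result and result.output) or ""
  return text:lower():find("refactor", 1, true) ~= nil
end

return M
```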
Example Use Case
A user wants to:
Extend behavior from shared .avanterules templates.
Run a custom Lua function that fetches internal docs based on the prompt.
Feed the result + prompt into an expensive reasoning model.
If the model suggests a refactor, apply the code changes using a cheaper model. Otherwise, do nothing.
This use case:
Has no built-in “agent”
Uses minimal Lua glue (a sketch of the glue function follows this list)
Is fully declarative
Can be reused or shared across teams by changing .avantechain + .avanterules
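The only Lua glue this use case needs is the doc-fetcher from step 2. A minimal sketch, assuming the glue receives a context table with a `prompt` field and that internal docs live under a `docs/` directory — both assumptions, since the proposal leaves the real interface open:

```lua
-- Hypothetical Lua glue: fetch internal docs relevant to the current prompt.
-- The function name, the `ctx.prompt` field, and the docs/ directory are all
-- illustrative assumptions.
local M = {}

function M.fetch_internal_docs(ctx)
  local prompt = ((ctx and ctx.prompt) or ""):lower()
  local matches = {}
  -- Naive lookup: include every markdown doc whose file name occurs in the prompt.
  for _, path in ipairs(vim.fn.globpath("docs", "**/*.md", false, true)) do
    local name = vim.fn.fnamemodify(path, ":t:r"):lower()
    if name ~= "" and prompt:find(name, 1, true) then
      table.insert(matches, table.concat(vim.fn.readfile(path), "\n"))
    end
  end
  return table.concat(matches, "\n\n")
end

return M
```

The chain would then pass this function's return value, together with the original prompt, to the expensive reasoning step.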
Benefits
Zero hardcoded flows: Everything is opt-in and user-defined.
Fits large projects: Teams can tailor .avantechain to their internal tooling, docs, and coding style.
Fast iteration: Swapping chains or templates doesn’t require plugin modification.
Token-efficient: Prompt steps are reused across chains, minimizing API calls.
Supports parallel or conditional chains: e.g. run doc and code search in parallel, then route based on combined output.
Key Concepts
.avanterules: prompt templates for consistent behavior and persona.
.avantechain: declarative chain definition; references the .avanterules files used by prompt and agent steps.
Supported Step Types
prompt: renders .avanterules blocks and sends them to the LLM.
agent: agent-style step driven by .avanterules.
code, builtin-tool, router, noop, group: additional step types.
Example: docs-and-code-lookup.avantechain
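The original chain file is not reproduced here, so the following is only a sketch of what docs-and-code-lookup.avantechain could look like, written as a Lua table for illustration. The concrete syntax and every field name (steps, type, run, when, next, model) are hypothetical, not something the proposal has pinned down.

```lua
-- Hypothetical sketch of docs-and-code-lookup.avantechain expressed as a Lua
-- table. The real format and step semantics are not defined by the proposal;
-- everything below is illustrative only.
return {
  name = "docs-and-code-lookup",
  rules = ".avanterules",            -- shared persona / behavior templates
  steps = {
    {
      id = "gather",
      type = "group",                -- run the two lookups in parallel
      steps = {
        { id = "docs", type = "code", run = "fetch_internal_docs" },
        { id = "code_search", type = "builtin-tool", tool = "grep" },
      },
    },
    {
      id = "reason",
      type = "prompt",
      model = "expensive-reasoning-model",
      input = { "gather", "user_prompt" },
    },
    {
      id = "route",
      type = "router",
      when = "should_refactor",      -- Lua predicate from the glue module
      next = { [true] = "apply", [false] = "skip" },
    },
    { id = "apply", type = "agent", model = "cheap-edit-model" },
    { id = "skip", type = "noop" },
  },
}
```

Read top to bottom: the group step runs the doc and code lookups in parallel, the prompt step feeds both results plus the user prompt to the expensive model, and the router step uses the should_refactor predicate to decide whether the cheaper editing agent runs at all.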
Key points in this example:
Shared prompts are reused via run.chain instead of retyping prompts.

Thanks for reading my idea. Let me know if you are interested in building this out. I think it would be awesome.