Lola lets you package AI contexts (personas, workflows, scripts, templates) into reusable modules that can be shared and installed across projects. Instead of writing monolithic prompts, you build modular, efficient AI programs.
Think of it this way:
- The LLM is a non-deterministic CPU
- Your prompts are the assembly language
- Lola modules are libraries that get loaded only when needed
- Context injection is lazy loading for AI instructions
To manage AI context as installable modules for LLM assistants, Lola implements lazy context loading: a meta-programming technique that optimizes LLM workflows through modular, on-demand context injection.
Traditional approach: Load everything into a massive prompt
You are a chef and developer and writer and...
[Thousands of tokens of instructions]
Lazy context loading approach: Load only what you need, when you need it
1. Main context monitors triggers
2. User says "chocolate cake"
3. Load chef persona + baking workflow
4. Execute step-by-step instructions
5. Unload when done
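The steps above can be sketched as a tiny dispatcher: match a trigger phrase and announce which context file would be injected, staying silent otherwise. The trigger table and file names here are illustrative, not Lola's actual module layout.

```shell
#!/bin/sh
# Hypothetical sketch of lazy context loading: only the context matching
# the user's trigger phrase is selected; everything else stays unloaded.
load_context() {
  case "$1" in
    *"chocolate cake"*) echo "inject: contexts/baking-workflow.md" ;;
    *"new blog post"*)  echo "inject: contexts/blog-workflow.md" ;;
    *)                  echo "inject: nothing" ;;  # no trigger, prompt stays small
  esac
}

load_context "I want a chocolate cake recipe"
# -> inject: contexts/baking-workflow.md
```

Because nothing is injected until a trigger fires, the main prompt stays minimal instead of carrying every persona and workflow at once.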
# Clone the repository
git clone https://github.com/mrbrandao/lola
cd lola
# Install with uv
uv pip install -e .
# Or with pip
pip install -e .

# List available modules
lola mod ls

# Install the module chef-buddy to a test directory
lola mod install chef-buddy -d /tmp/test

This creates:
- `.lolas/chef-buddy/` - Module assets (contexts, scripts, templates)
- `AGENTS.md` - Main context file (for Cursor)
cd /tmp/test
cursor .

When you start your AI assistant:
- Acts as an enthusiastic baking chef (persona loaded) - say "hello" and start interacting with the Chef Baking Buddy
- Provides step-by-step recipes when you say "chocolate cake" (context injected)
- Creates blog posts when you say "new blog post" (workflow executed)
We need a way to share all those cool contexts, and that's why Lola Modules were created. Modules are called LoLas, short for Load Lazy.
Now that you know how Lola works, why not help us extend Lola with your own modules?
Here's a quick overview of how to create your own module.
- Create the module structure in `modules/your-module/`
- Define the module in `modules/lolamod.yml`
- Create a main context file (`AGENTS.md`, `CLAUDE.md`, or `GEMINI.md`)
- Add persona and workflow contexts
- Add helper scripts and templates
- Test and share
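As a starting point, the steps above can be bootstrapped with a few shell commands. The subdirectory names (`contexts/`, `scripts/`, `templates/`) mirror the asset kinds mentioned earlier, but the exact layout and the `lolamod.yml` schema Lola expects are defined in the Creating Lola Modules guide; treat this as an illustrative skeleton only.

```shell
#!/bin/sh
# Hypothetical module skeleton -- names are illustrative.
mkdir -p modules/your-module/contexts
mkdir -p modules/your-module/scripts
mkdir -p modules/your-module/templates

# Main context file (pick the one your assistant reads, e.g. AGENTS.md)
touch modules/your-module/AGENTS.md

# Persona and workflow contexts (file names are up to you)
touch modules/your-module/contexts/persona.md
touch modules/your-module/contexts/workflow.md
```

From there, fill in the persona and workflow contexts, register the module in `modules/lolamod.yml`, and test it with a local install before sharing.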
See the complete guide on Creating Lola Modules
Also check the Chef Baking Buddy Module, an example module that turns your LLM into a cooking chef.
- Creating Lola Modules - Complete guide to building modules
- Chef Buddy Example - Working example module
- Vision and Roadmap - Future plans for Lola
Lola is an experimental project exploring lazy context loading patterns. Contributions welcome:
- Create new context modules
- Improve existing modules
- Add features to the CLI
- Share your use cases
Igor Brandao
The AI is a CPU. Prompts are the assembly. Lazy context loading is your build system.