
Commit e2f1445

committed
add further explanation
1 parent 3fecebc commit e2f1445

1 file changed (+40, -0 lines)

README.md

@@ -139,6 +139,46 @@ In addition, Acorn.js provides the following modules for native integrations wit
- [Anthropic](src/modules/anthropic)
- [LangChain](src/modules/langchain)

## Why Acorn.js?
Many GenAI applications require a combination of probabilistic reasoning (what LLMs are great at) and deterministic data access (what APIs are built for). LLMs are excellent at interpreting natural language, inferring user intent, and reasoning in ambiguous contexts. But when it comes to executing precise, safe, and reliable actions—like retrieving a customer record or aggregating transactions—they need deterministic tools.
This is where Acorn.js comes in: it bridges the probabilistic world of LLMs with the deterministic world of GraphQL APIs.
Why this combination matters:
🔍 **Precision**
LLMs are trained to predict the next best word—not necessarily the right one. When dealing with structured data, precision matters. You can’t “mostly” retrieve the right invoice or “approximately” update a user profile. Acorn.js ensures that when an LLM decides what to do, the how is delegated to a schema-validated, deterministic API call.
🛡️ **Safety**
Uncontrolled LLM outputs can lead to unsafe behavior—calling the wrong function, passing in unvalidated arguments, or leaking sensitive data. By constraining LLMs to tools derived from a GraphQL schema, Acorn.js acts as a contract enforcer. It provides validation, guards against misuse, and helps sandbox sensitive operations that should never be exposed to the model.
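The "contract enforcer" idea can be sketched in a few lines. This is a simplified, hypothetical illustration—not Acorn.js's actual API: a tool contract that could be derived from a GraphQL field such as `customer(id: ID!)`, and a check that rejects an LLM's proposed tool call before anything is executed:

```typescript
// Hypothetical, simplified contract for a GraphQL field `customer(id: ID!)`.
type ArgSpec = { type: "string" | "number"; required: boolean };
type ToolContract = { name: string; args: Record<string, ArgSpec> };

const customerTool: ToolContract = {
  name: "customer",
  args: { id: { type: "string", required: true } },
};

// Validate an LLM-proposed call against the contract; return all violations.
function validateCall(
  contract: ToolContract,
  call: { name: string; args: Record<string, unknown> },
): string[] {
  if (call.name !== contract.name) return [`unknown tool: ${call.name}`];
  const errors: string[] = [];
  for (const [arg, spec] of Object.entries(contract.args)) {
    const value = call.args[arg];
    if (value === undefined) {
      if (spec.required) errors.push(`missing required argument: ${arg}`);
    } else if (typeof value !== spec.type) {
      errors.push(`argument ${arg} must be a ${spec.type}`);
    }
  }
  for (const arg of Object.keys(call.args)) {
    if (!(arg in contract.args)) errors.push(`unexpected argument: ${arg}`);
  }
  return errors;
}

// A well-formed call passes; a hallucinated argument is rejected up front.
console.log(validateCall(customerTool, { name: "customer", args: { id: "42" } })); // no errors
console.log(validateCall(customerTool, { name: "customer", args: { ssn: "000-00" } })); // missing id, unexpected ssn
```

In practice the contract comes from the GraphQL schema itself, so the model can only ever reach operations you chose to expose.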
💸 **Cost**
LLMs are expensive—especially if you rely on them to generate long, complex outputs like tabular data or query results. Offloading data-heavy tasks to structured APIs reduces token usage, speeds up response times, and improves overall efficiency. You let the LLM do what it’s good at (language) and let the API do what it’s built for (data).
⚡ **Speed**
Structured APIs return results in milliseconds. LLMs, on the other hand, have to stream outputs and sometimes retry when things go wrong. By using Acorn.js to define clear tool boundaries, you minimize guesswork and retries, speeding up the overall response cycle of your chatbot or agent.
### Why GraphQL APIs?
In the context of building GenAI applications, GraphQL is a great choice because it provides the right balance of flexibility, ease of use, semantics, widespread support, and schema validation.
**🧠 Semantic Clarity and Validation**
GraphQL defines a standard schema language that maps cleanly to LLM tool definitions and supports the semantic annotations LLMs need to make intelligent decisions and invoke the correct tools/APIs.
The schema also enables validation of tool calls and provides a level of safeguarding against hallucinated tool invocations.
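As an illustrative sketch of that mapping (the names and shapes here are hypothetical, not the actual Acorn.js output format), a documented GraphQL field can be restated as a function-calling tool definition:

```typescript
// A GraphQL field with a docstring (the semantic annotation the LLM relies on):
const sdl = `
  type Query {
    """Look up a customer by their unique ID."""
    customer(id: ID!): Customer
  }
`;

// The same contract, expressed as a function-calling tool definition:
const customerTool = {
  name: "customer",
  description: "Look up a customer by their unique ID.", // taken from the docstring
  parameters: {
    type: "object",
    properties: {
      id: { type: "string", description: "The customer's unique ID." },
    },
    required: ["id"], // the non-null `ID!` becomes a required parameter
  },
};

// The docstring travels from the schema into the tool description:
console.assert(sdl.includes(customerTool.description));
```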
**🔧 Flexibility and Control**
GraphQL gives the LLM significant query flexibility, so it can request exactly the data it needs in a given context. That flexibility lets the LLM produce more precise and intelligent answers.
At the same time, GraphQL provides enough control over the interface to ensure safe and compliant access to data.
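That trade-off can be shown directly in GraphQL. Both operations below are valid against the same hypothetical schema (`customer`, `transactions`, and `last` are assumptions for illustration); the model selects only the fields a given conversation needs, while anything outside the schema stays unreachable:

```graphql
# A support chatbot that only needs contact details
query CustomerEmail {
  customer(id: "42") {
    email
  }
}

# An agent that needs more context from the same entry point
query CustomerContext {
  customer(id: "42") {
    name
    email
    transactions(last: 10) {
      amount
      date
    }
  }
}
```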
## Contributing

We love contributions. Open an issue if you encounter a bug or have a feature request. See [CONTRIBUTING.md](./CONTRIBUTING.md) for more details.
