In addition, Acorn.js provides the following modules for native integrations with:

- [Anthropic](src/modules/anthropic)
- [LangChain](src/modules/langchain)

## Why Acorn.js?

Many GenAI applications require a combination of probabilistic reasoning (what LLMs are great at) and deterministic data access (what APIs are built for). LLMs are excellent at interpreting natural language, inferring user intent, and reasoning in ambiguous contexts. But when it comes to executing precise, safe, and reliable actions, like retrieving a customer's information or aggregating transactions, they need deterministic tools.

This is where Acorn.js comes in: it bridges the probabilistic world of LLMs with the deterministic world of GraphQL APIs.

Why this combination matters:

🔍 **Precision**

LLMs are trained to predict the next most likely word, not necessarily the right one. When dealing with structured data, precision matters. You can’t “mostly” retrieve the right invoice or “approximately” update a user profile. Acorn.js ensures that when an LLM decides *what* to do, the *how* is delegated to a schema-validated, deterministic API call.

🛡️ **Safety**

Uncontrolled LLM outputs can lead to unsafe behavior: calling the wrong function, passing in unvalidated arguments, or leaking sensitive data. By constraining LLMs to tools derived from a GraphQL schema, Acorn.js acts as a contract enforcer. It provides validation, guards against misuse, and helps sandbox sensitive operations that should never be exposed to the model.

💸 **Cost**

LLMs are expensive, especially if you rely on them to generate long, complex outputs like tabular data or query results. Offloading data-heavy tasks to structured APIs reduces token usage, speeds up response times, and improves overall efficiency. You let the LLM do what it’s good at (language) and let the API do what it’s built for (data).

⚡ **Speed**

Structured APIs return results in milliseconds. LLMs, on the other hand, have to stream outputs and sometimes retry when things go wrong. By using Acorn.js to define clear tool boundaries, you minimize guesswork and retries, speeding up the overall response cycle of your chatbot or agent.

### Why GraphQL APIs?

In the context of building GenAI applications, GraphQL is a great choice because it strikes the right balance of flexibility, ease of use, semantic clarity, widespread support, and schema validation.

**🧠 Semantic Clarity and Validation**

GraphQL defines a standard schema language that maps cleanly to LLM tool definitions and supports the semantic annotations LLMs need to make intelligent decisions and invoke the correct tools/APIs.

The schema also enables validation of tool calls and provides a level of safeguarding against hallucinated tool invocations.
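As a rough illustration of this mapping, consider how a single GraphQL query field could surface to a model as a tool definition. This is a hedged sketch, not Acorn.js's actual API: the `ToolDefinition` shape below mirrors the common JSON-Schema tool format used by major LLM providers, and the `customer` field is hypothetical.

```typescript
// Hypothetical sketch (not Acorn.js's actual API): how a GraphQL query field
// such as
//
//   type Query {
//     """Look up a customer by ID."""
//     customer(id: ID!): Customer
//   }
//
// might be exposed to an LLM as a JSON-Schema tool definition.

interface ToolDefinition {
  name: string;
  description: string;
  parameters: {
    type: "object";
    properties: Record<string, { type: string }>;
    required: string[];
  };
}

const customerTool: ToolDefinition = {
  name: "customer",
  description: "Look up a customer by ID.", // taken from the field's docstring
  parameters: {
    type: "object",
    properties: { id: { type: "string" } }, // GraphQL ID serializes as a string
    required: ["id"], // ID! is non-null, so the argument is mandatory
  },
};

// Arguments produced by the model can be checked against `parameters`
// before any call is executed: this is the validation step described above.
console.log(JSON.stringify(customerTool));
```

Because both sides of the mapping are schemas, a hallucinated tool name or a missing required argument can be rejected before it ever reaches your data layer.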

**🔧 Flexibility and Control**

GraphQL provides a significant amount of query flexibility so the LLM can query for the data it needs given a particular context. Query flexibility allows LLMs to produce more precise and intelligent answers.

At the same time, GraphQL provides enough control over the interface to ensure safe and compliant access to data.

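For example, a query like the following (the `customer` field and its subfields are illustrative, not part of any specific schema) lets the model request exactly the fields a given conversation needs, and nothing more:

```graphql
# Illustrative only: the model selects just the fields relevant to the
# current question, keeping payloads small and token usage low.
query CustomerForSupportReply {
  customer(id: "42") {
    name
    email   # needed to reply to the user
    # billingInfo is omitted: not relevant here, and possibly restricted
  }
}
```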
## Contributing

We love contributions. Open an issue if you encounter a bug or have a feature request. See [CONTRIBUTING.md](./CONTRIBUTING.md) for more details.