
Commit 5b1816b

Merge pull request #130 from SasinduDilshara/update-natural-functions
Update direct LLM call documentation
2 parents 3c36579 + 4dadf2e commit 5b1816b

2 files changed: +4 −4 lines changed


en/docs/integration-guides/ai/direct-llm-call/direct-llm-call.md renamed to en/docs/integration-guides/ai/direct-llm-call/direct-llm-invocation-with-ballerina-model-providers.md

Lines changed: 3 additions & 3 deletions
@@ -1,4 +1,4 @@
-# Direct LLM Invocation with Ballerina Model Providers
+# Direct LLM invocation with Ballerina model providers
 
 In this tutorial, you will create an integration that makes a direct call to a Large Language Model (LLM) using Ballerina’s model providers. Direct LLM calls are designed for simple, stateless interactions where conversational history is not required, giving you fine-grained control over each request.
 With Ballerina, you can send a prompt along with a type descriptor, instructing the LLM to generate a response that automatically conforms to your desired type-safe format (e.g., JSON, Ballerina records, integers). This eliminates manual parsing and ensures structured, predictable outputs.
@@ -17,7 +17,7 @@ Follow the steps below to implement the integration.
 4. Select Project Directory and click on the **`Select Location`** button.
 5. Click on the **`Create New Integration`** button to create the integration project.
 
-### Step 2: Define Types
+### Step 2: Define types
 
 1. Click on the **`Add Artifacts`** button and select **`Type`** in the **`Other Artifacts`** section.
 2. Click on **`+ Add Type`** to add a new type.
@@ -107,7 +107,7 @@ Follow the steps below to implement the integration.
 
 ### Step 5: Configure default WSO2 model provider
 
-1. As the workflow uses the `Default Model Provider (WSO2)`, you need to configure its settings:
+1. Ballerina supports direct calls to Large Language Models (LLMs) with various providers, such as OpenAI, Azure OpenAI, and Anthropic. This demonstration focuses on using the Default Model Provider (WSO2). To begin, you need to configure its settings:
     - Press `Ctrl/Cmd + Shift + P` to open the VS Code command palette.
     - Run the command: `Ballerina: Configure default WSO2 model provider`.
     This will automatically generate the required configuration entries.
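For orientation, the flow the renamed tutorial walks through (a type from Step 2, the configured default WSO2 provider from Step 5, and a typed direct call) can be sketched in a few lines of Ballerina. This is a minimal sketch, not the tutorial's actual code: the `BlogSummary` record fields are invented for illustration, and the `ai:getDefaultModelProvider()` helper and `generate()` remote method are assumed shapes of the `ballerina/ai` module's model-provider API.

```ballerina
import ballerina/ai;
import ballerina/io;

// Illustrative output type; the tutorial's real types (Step 2) are not
// shown in this diff.
type BlogSummary record {|
    string title;
    string summary;
    string[] keywords;
|};

// Assumes the default WSO2 model provider has already been configured via
// the `Ballerina: Configure default WSO2 model provider` command (Step 5),
// and that `ballerina/ai` exposes it through `getDefaultModelProvider()`.
final ai:ModelProvider model = check ai:getDefaultModelProvider();

public function main() returns error? {
    string blog = "Ballerina is a cloud-native programming language for integration.";
    // Binding the result to `BlogSummary` passes a type descriptor along
    // with the prompt, so the LLM response is parsed straight into the
    // record instead of coming back as raw text.
    BlogSummary summary = check model->generate(`Summarize this blog post: ${blog}`);
    io:println(summary);
}
```

Because the call is bound to a type descriptor rather than returning raw text, a response that does not conform to the expected shape surfaces as an `error` instead of silently producing unparsed output, which is the "structured, predictable outputs" guarantee the tutorial describes.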

en/mkdocs.yml

Lines changed: 1 addition & 1 deletion
@@ -85,7 +85,7 @@ nav:
   - AI Agents and Other Gen AI Integrations:
     - Direct LLM Invocation:
       - Natural functions(Beta): integration-guides/ai/natural-functions/natural-functions.md
-      - Direct LLM Invocation using Model providers: integration-guides/ai/direct-llm-call/direct-llm-call.md
+      - Model providers: integration-guides/ai/direct-llm-call/direct-llm-invocation-with-ballerina-model-providers.md
     - RAG:
       - Rag Ingestion: integration-guides/ai/rag/rag-ingestion.md
       - Rag Query: integration-guides/ai/rag/rag-query.md
