`en/docs/integration-guides/ai/direct-llm-call/direct-llm-invocation-with-ballerina-model-providers.md` (3 additions & 3 deletions)
```diff
@@ -1,4 +1,4 @@
-# Direct LLM Invocation with Ballerina Model Providers
+# Direct LLM invocation with Ballerina model providers
 
 In this tutorial, you will create an integration that makes a direct call to a Large Language Model (LLM) using Ballerina’s model providers. Direct LLM calls are designed for simple, stateless interactions where conversational history is not required, giving you fine-grained control over each request.
 With Ballerina, you can send a prompt along with a type descriptor, instructing the LLM to generate a response that automatically conforms to your desired type-safe format (e.g., JSON, Ballerina records, integers). This eliminates manual parsing and ensures structured, predictable outputs.
```
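The context paragraph above describes typed responses from a direct LLM call. As a rough sketch of what that looks like in code, assuming the `ballerina/ai` module's `getDefaultModelProvider()` helper and its `generate` remote method (the `MovieSuggestion` record and the prompt are made up for illustration):

```ballerina
import ballerina/ai;
import ballerina/io;

// Hypothetical record type the LLM response should conform to.
type MovieSuggestion record {|
    string title;
    int year;
|};

// Assumes the default WSO2 model provider has already been configured
// (see Step 5 of the tutorial).
final ai:ModelProvider model = check ai:getDefaultModelProvider();

public function main() returns error? {
    string genre = "science fiction";
    // The contextually expected type (MovieSuggestion[]) serves as the
    // type descriptor: the response is bound to typed values instead of
    // raw text, so no manual parsing is needed.
    MovieSuggestion[] suggestions =
        check model->generate(`Suggest three classic ${genre} movies.`);
    foreach MovieSuggestion s in suggestions {
        io:println(string `${s.title} (${s.year})`);
    }
}
```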
```diff
@@ -17,7 +17,7 @@ Follow the steps below to implement the integration.
 4. Select Project Directory and click on the **`Select Location`** button.
 5. Click on the **`Create New Integration`** button to create the integration project.
 
-### Step 2: Define Types
+### Step 2: Define types
 
 1. Click on the **`Add Artifacts`** button and select **`Type`** in the **`Other Artifacts`** section.
 2. Click on **`+ Add Type`** to add a new type.
```
```diff
@@ -107,7 +107,7 @@ Follow the steps below to implement the integration.
 
 ### Step 5: Configure default WSO2 model provider
 
-1. As the workflow uses the `Default Model Provider (WSO2)`, you need to configure its settings:
+1. Ballerina supports direct calls to Large Language Models (LLMs) with various providers, such as OpenAI, Azure OpenAI, and Anthropic. This demonstration focuses on using the Default Model Provider (WSO2). To begin, you need to configure its settings:
 - Press `Ctrl/Cmd + Shift + P` to open the VS Code command palette.
 - Run the command: `Ballerina: Configure default WSO2 model provider`.
 This will automatically generate the required configuration entries.
```
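The generated entries land in the project's `Config.toml`. As a purely illustrative sketch (the actual table name, keys, and values are produced by the command and will differ):

```toml
# Hypothetical placeholders; run the VS Code command above to get the
# real entries for your project.
[ballerina.ai.wso2Provider]
serviceUrl = "https://example.wso2.dev/llm/v1"
accessToken = "<token issued during configuration>"
```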