# Direct LLM invocation with Ballerina model providers
In this tutorial, you will create an integration that makes a direct call to a Large Language Model (LLM) using Ballerina’s model providers. Direct LLM calls are designed for simple, stateless interactions where conversational history is not required, giving you fine-grained control over each request. With Ballerina, you can send a prompt along with a type descriptor, instructing the LLM to generate a response that automatically conforms to your desired type-safe format (e.g., JSON, Ballerina records, integers). This eliminates manual parsing and ensures structured, predictable outputs.
In this tutorial, you’ll leverage this capability to analyze blog content—prompting the LLM to return a structured review, including a suggested category and a rating, using the default WSO2 model provider.
## Step 1: Create a new integration project

- Click on the **WSO2 Integrator: BI** icon on the sidebar.
- Click on the **Create New Integration** button.
- Enter `BlogReviewer` as the project name.
- Click the **Select Path** button to set the integration path.
- Click on the **Create New Integration** button to create the integration project.
## Step 2: Define types

- Click on the **Add Artifact** button and select **Type** in the **Other Artifacts** section.
- Click **+ Add Type** to add a new type.
- Click the **Import** button in the top right corner of the type editor.
- Use `Blog` as the **Name**, select **JSON** from the dropdown, and paste the following JSON payload. Then click the **Import** button.

  ```json
  {
    "title": "Tips for Growing a Beautiful Garden",
    "content": "Spring is the perfect time to start your garden. Begin by preparing your soil with organic compost and ensure proper drainage. Choose plants suitable for your climate zone, and remember to water them regularly. Don't forget to mulch to retain moisture and prevent weeds."
  }
  ```

- Add another type with `Review` as the **Name** and paste the following JSON payload.

  ```json
  {
    "suggestedCategory": "Gardening",
    "rating": 5
  }
  ```

- The types are now available in the project. `Blog` and `Review` represent the blog content and its review, respectively.
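Behind the scenes, the imported JSON payloads map to Ballerina record types. A sketch of what the type editor generates is shown below (field names come from the payloads above; `suggestedCategory` is made optional-valued here on the assumption that the LLM may return null when no category matches, and the generated code may differ):

```ballerina
// Represents the incoming blog post payload.
type Blog record {|
    string title;
    string content;
|};

// Represents the structured review produced by the LLM.
type Review record {|
    // One of the predefined categories, or () when none matches.
    string? suggestedCategory;
    int rating;
|};
```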
## Step 3: Create an HTTP service

- In the design view, click the **Add Artifact** button.
- Select **HTTP Service** under the **Integration as API** category.
- Select the **Design from Scratch** option as the **Service Contract** and use `/blogs` as the **Service base path**.
- Click the **Create** button to create the new service with the specified configurations.
- From the HTTP Service view, click **+ Add Resource** and select the **POST** method.
- Set the resource path to `review`.
- Click **Define Payload**, go to the third tab (**Browse Existing Types**), search for the type `Blog`, select it, and click **Save**.
- Under **Responses**, edit the **201** response and change its response body schema to `Review` using the **Advanced Configurations** section.
- Click **Save** to create the resource with the specified configurations.
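The steps above produce a service skeleton roughly like the following (a sketch assuming a `Blog` input type and a `Review` response type as defined in Step 2; the designer-generated code may differ slightly):

```ballerina
import ballerina/http;

service /blogs on new http:Listener(9090) {

    // POST /blogs/review accepts a Blog payload and responds with a
    // Review body (201 Created by default for a POST resource).
    resource function post review(Blog payload) returns Review|error {
        // The resource logic is implemented in Step 4.
        return error("not implemented");
    }
}
```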
## Step 4: Implement the resource logic

- Once redirected to the `review` resource implementation designer view, hover over the arrow after the **Start** node and click the ➕ button to add a new action to the resource.
- Select **Model Provider** from the node panel.
- Click **+ Add Model Provider**.
- Click **Default Model Provider (WSO2)**.
- Enter `model` as the name of the model provider and `ai:Wso2ModelProvider` as the result type, then click **Save**.
- Click the `model` variable under the **Model Providers** node. It shows the list of available APIs from the model provider. Select the `generate` API from the list.
- Use the following prompt as the **Prompt** for the blog review use case. Set the name of the result variable to `review`, use `Review` as the return type, and convert it to a nilable type using type operators. Then click **Save**.

  ```
  You are an expert content reviewer for a blog site that categorizes posts under the following categories: "Gardening", "Sports", "Health", "Technology", "Travel"

  Your tasks are:
  1. Suggest a suitable category for the blog from exactly the specified categories. If there is no match, use null.
  2. Rate the blog post on a scale of 1 to 10 based on the following criteria:
     - **Relevance**: How well the content aligns with the chosen category.
     - **Depth**: The level of detail and insight in the content.
     - **Clarity**: How easy it is to read and understand.
     - **Originality**: Whether the content introduces fresh perspectives or ideas.
     - **Language Quality**: Grammar, spelling, and overall writing quality.

  Here is the blog post content:
  Title: ${payload.title}
  Content: ${payload.content}
  ```

- Add a new node after the `generate` API call and select **Return** from the node panel.
- Select the `review` variable from the dropdown and click **Save**.
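The flow built above corresponds to Ballerina code along these lines (a hedged sketch: the `ballerina/ai` import, `ai:getDefaultModelProvider()` helper, and exact `generate` signature are assumptions based on the steps above, and the designer-generated code may differ):

```ballerina
import ballerina/ai;
import ballerina/http;

// Default WSO2 model provider configured in Step 5.
final ai:Wso2ModelProvider model = check ai:getDefaultModelProvider();

service /blogs on new http:Listener(9090) {

    resource function post review(Blog payload) returns Review?|error {
        // The nilable expected type (Review?) instructs the LLM to
        // return a value conforming to the Review record, or nil.
        Review? review = check model->generate(
            `You are an expert content reviewer for a blog site ...

            Here is the blog post content:
            Title: ${payload.title}
            Content: ${payload.content}`);
        return review;
    }
}
```

The key idea is that the contextually expected type (`Review?`) drives structured output: the prompt text and the type descriptor are sent together, and the response is bound to the record without manual JSON parsing.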
## Step 5: Configure the default WSO2 model provider

Ballerina supports direct calls to Large Language Models (LLMs) with various providers, such as OpenAI, Azure OpenAI, and Anthropic. This tutorial uses the Default Model Provider (WSO2). To configure it:

- Press `Ctrl/Cmd + Shift + P` to open the VS Code command palette.
- Run the command `Ballerina: Configure default WSO2 model provider`. This automatically generates the required configuration entries.
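The command writes the provider settings into the project's `Config.toml`. The entries look roughly like the following (the table and key names here are illustrative, not authoritative; use exactly what the command generates, and keep the token out of version control):

```toml
# Illustrative Config.toml fragment for the default WSO2 model provider.
[ballerina.ai.wso2]
serviceUrl = "<generated-service-url>"
accessToken = "<generated-access-token>"
```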
## Step 6: Run the integration

> **Response may vary**
>
> Since this integration involves an LLM (Large Language Model) call, the response values may not always be identical across different executions.

- Click on the **Run** button in the top-right corner to run the integration.
- The integration starts, and the service becomes available at `http://localhost:9090/blogs`.
- Click on the **Try it** button to open the embedded HTTP client.
- Enter the blog content in the request body and click on the ▶️ button to send the request.

  ```json
  {
    "title": "The Healthy Maven",
    "content": "For those who want a 360-degree approach to self-care, with advice for betterment in the workplace, home, gym, and on the go, look no further. The Healthy Maven offers recipes for every type of meal under the sun (salads, sides, soups, and more), DIY tips (you’ll learn how to make your own yoga mat spray), and quick workouts. If you like where all this is going, there’s a supplementary podcast run by blogger Davida with guest wellness experts."
  }
  ```

- The LLM analyzes the blog content, suggests a category, and rates the post based on the predefined criteria.




