This module provides a generic API for connecting with OpenAI's LLM chat completion models.
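The snippet below is a minimal usage sketch of the connector. The client and record names shown here (`chat:Client`, `chat:CreateChatCompletionRequest`) and the `/chat/completions` resource path are assumptions based on how generated Ballerina connectors are typically structured; refer to the module's API documentation for the exact names.

```ballerina
import ballerina/io;
import ballerinax/openai.chat;

// The API key is read from configuration (e.g., Config.toml).
configurable string openAIKey = ?;

public function main() returns error? {
    // Initialize the connector client with the OpenAI API key (client name assumed).
    chat:Client openAIChat = check new ({auth: {token: openAIKey}});

    // Build a chat completion request with a single user message (record name assumed).
    chat:CreateChatCompletionRequest request = {
        model: "gpt-4o",
        messages: [{role: "user", content: "Say hello in one sentence."}]
    };

    // Invoke the Chat Completions API and print the response (resource path assumed).
    chat:CreateChatCompletionResponse response = check openAIChat->/chat/completions.post(request);
    io:println(response);
}
```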
Issues and Projects tabs are disabled for this repository as this is part of the Ballerina Library. To report bugs, request new features, start new discussions, view project boards, etc., go to the Ballerina Library parent repository. This repository only contains the source code for the module.
- Download and install Java SE Development Kit (JDK) version 21.
- Generate a GitHub access token with read package permissions, then set the following environment variables:
  export packageUser=<Your GitHub Username>
  export packagePAT=<GitHub Personal Access Token>
Execute the commands below to build from the source.
- To build the package:
  ./gradlew clean build
- To run the tests:
  ./gradlew clean test
- To run a group of tests:
  ./gradlew clean test -Pgroups=<test_group_names>
- To build the package without the tests:
  ./gradlew clean build -x test
- To debug the package with a remote debugger:
  ./gradlew clean build -Pdebug=<port>
- To debug with the Ballerina language:
  ./gradlew clean build -PbalJavaDebug=<port>
- To publish the generated artifacts to the local Ballerina Central repository:
  ./gradlew clean build -PpublishToLocalCentral=true
- To publish the generated artifacts to the Ballerina Central repository:
  ./gradlew clean build -PpublishToCentral=true
As an open-source project, Ballerina welcomes contributions from the community.
For more information, go to the contribution guidelines.
All contributors are encouraged to read the Ballerina Code of Conduct.