New Feature: Adding watsonx.ai LLM Platform support #1888

Open · ticlazau wants to merge 29 commits into master
Conversation

ticlazau

Pull Request Type

  • ✨ feat
  • 🐛 fix
  • ♻️ refactor
  • 💄 style
  • 🔨 chore
  • 📝 docs

What is in this change?

Adding watsonx.ai support as an LLM platform.

Additional Information

A new LLM backend supporting various LLMs has been added. Running watsonx.ai requires the following environment variables (see the configuration sketch after the list):

  • WATSONX_AI_ENDPOINT
  • WATSONX_AI_APIKEY
  • WATSONX_AI_PROJECT_ID
  • WATSONX_AI_MODEL (e.g. meta-llama/llama-2-70b-chat)
  • WATSONX_EMBEDDING_MODEL_PREF (e.g. baai/bge-large-en-v1)

In addition, we are introducing AI Guardrails for input and LLM output.
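
A minimal sketch of how a backend might consume these settings (illustrative only; the variable names come from this PR, but the loader shape, validation, and defaults are assumptions, not the PR's actual code):

```typescript
// Hypothetical config loader for the watsonx.ai backend (sketch, not PR code).
const REQUIRED_KEYS = [
  "WATSONX_AI_ENDPOINT",
  "WATSONX_AI_APIKEY",
  "WATSONX_AI_PROJECT_ID",
  "WATSONX_AI_MODEL",
] as const;

// Fail fast if any required setting is absent.
const missing = REQUIRED_KEYS.filter((key) => !process.env[key]);
if (missing.length > 0) {
  throw new Error(`Missing watsonx.ai settings: ${missing.join(", ")}`);
}

export const watsonxConfig = {
  endpoint: process.env.WATSONX_AI_ENDPOINT!, // e.g. https://us-south.ml.cloud.ibm.com
  apiKey: process.env.WATSONX_AI_APIKEY!,
  projectId: process.env.WATSONX_AI_PROJECT_ID!,
  model: process.env.WATSONX_AI_MODEL!, // e.g. meta-llama/llama-2-70b-chat
  embeddingModel: process.env.WATSONX_EMBEDDING_MODEL_PREF, // optional embedder preference
};
```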

Developer Validations

  • [x] I ran yarn lint from the root of the repo & committed changes
  • [x] Relevant documentation has been updated
  • [x] I have tested my code functionality
  • [x] Docker build succeeds locally

@timothycarambat
Member

I cannot for the life of me figure out where to get the proper credentials in IBM Cloud to test this integration. Is there any documentation on where to get the above ENVs?

@ticlazau
Author

ticlazau commented Aug 1, 2024

@timothycarambat here is the link: https://cloud.ibm.com/docs/account?topic=account-userapikey&interface=ui
The API key is obtained via https://cloud.ibm.com/ and the Project ID is obtained via https://dataplatform.cloud.ibm.com after project creation.
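
A quick way to sanity-check the API key is to exchange it for an IAM bearer token (the standard IBM Cloud IAM flow). A rough sketch using Node 18+'s built-in fetch; the endpoint and grant type are the documented IBM Cloud defaults, while the error handling is illustrative and not the integration's actual code:

```typescript
// Sketch only: exchange WATSONX_AI_APIKEY for an IAM bearer token.
async function getIamToken(apiKey: string): Promise<string> {
  const res = await fetch("https://iam.cloud.ibm.com/identity/token", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({
      grant_type: "urn:ibm:params:oauth:grant-type:apikey",
      apikey: apiKey,
    }),
  });
  if (!res.ok) throw new Error(`IAM token request failed: ${res.status}`);
  const { access_token } = (await res.json()) as { access_token: string };
  return access_token;
}

// Prints a success message if the key can be exchanged for a token.
getIamToken(process.env.WATSONX_AI_APIKEY ?? "")
  .then(() => console.log("WATSONX_AI_APIKEY is valid"))
  .catch((err) => console.error("Could not authenticate:", err));
```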
