
Added Groq Support #1238

Open · wants to merge 4 commits into main
Conversation

fire17 commented Apr 26, 2024

Describe the changes you have made:

Groq's official Python API now fits well into the OI flow, no errors.
Final answers are hallucinated rather than taken from actual output, though.
It seems to plan and write code, but not execute it yet.
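
For reference, Groq's official Python client mirrors OpenAI's chat-completions interface; a minimal standalone sketch (the prompt is illustrative, not from this PR):

import os
from groq import Groq

# Groq's client follows the OpenAI chat-completions shape.
client = Groq(api_key=os.environ["GROQ_API_KEY"])
response = client.chat.completions.create(
    model="mixtral-8x7b-32768",  # the model this PR wires in
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)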

Reference any relevant issues:

Trying to get groq/mixtral to work #1237
(aka: Groq is not working with LiteLLM out of the box; --model groq/mixtral-8x7b-32768 throws errors)

Pre-Submission Checklist (optional but appreciated):

  • I have included relevant documentation updates (stored in /docs)
  • I have read docs/CONTRIBUTING.md
  • [-] I have read docs/ROADMAP.md (not fully yet, but there is no mention of Groq)

OS Tests (optional but appreciated):

  • Tested through GitHub Codespaces

fire17 added 3 commits April 26, 2024 01:34
… no errors. Though final answers are hallucinated rather than actual output. Seems to plan, write code, but not execute.
fire17 (Author) commented Apr 26, 2024

@KillianLucas please open a groq branch so I can make a PR to it instead of main, at least until tests are added and it is fully production-ready.

As it is now, the errors are coming from LiteLLM's side.
Groq is too good not to include, hope you feel the same.
I can see it being the default for a lot of people; it is already the default for me in all my other pipelines.

Thanks a lot and all the best! 😇

fire17 changed the title from "Added Groq Support - groq api integration now fits well into oi flow, no errors. Though final answers are hallucinated rather than actual output. Seems to plan, write code, but not execute." to "Added Groq Support" on Apr 26, 2024
fire17 (Author) commented Apr 26, 2024

Current state of PR:

  • Groq API integration now fits well into the OI flow, no errors.
    • Final answers are still hallucinated rather than using actual output.
    • Seems to plan and write code, but not execute yet. Probably just needs to match OI's expected JSON output.
  • groq.mdx docs added

Todos:

  • Match Mixtral output to OI's expected output
    • Do the same for llama3 (8b/70b)
  • Add tests, make production-ready

@@ -26,6 +31,7 @@ def __init__(self, interpreter):

# Settings
self.model = "gpt-4-turbo"
self.model = "groq/mixtral-8x7b-32768" # can now use models from groq. `export GROQ_API_KEY="your-key-here"` or use --model
fire17 (Author) commented:

This line should be deleted before merging to the main branch.

Cobular (Contributor) commented Apr 26, 2024

Damn I came to add this fork, beat me by an hour! Nice going!

fire17 (Author) commented Apr 26, 2024

Haha thanks @Cobular, as stated this PR doesn't make code execute yet, it just swaps the completion APIs correctly,
so if yours does, you might as well.

One important thing... there is a way to make it work now, just not with --model yet.

With techfren's help ❤️
The key is to use the api_base URL: --api_base "https://api.groq.com/openai/v1" --api_key $GROQ_API_KEY --model "mixtral-8x7b-32768"

This works:

export GROQ_API_KEY='<your-key-here>'
poetry run interpreter --api_base "https://api.groq.com/openai/v1" --api_key $GROQ_API_KEY --model "mixtral-8x7b-32768" --context_window 32000

This does NOT work:

export GROQ_API_KEY='<your-key-here>'
poetry run interpreter --model "groq/mixtral-8x7b-32768" --context_window 32000
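
The same workaround can also be applied from OI's Python API; a sketch, assuming the interpreter.llm settings exposed by OI at the time (untested here, and some OI versions may want an openai/ prefix on the model name when a custom api_base is set):

import os
from interpreter import interpreter

# Mirror the working CLI invocation above: point OI at Groq's
# OpenAI-compatible endpoint instead of LiteLLM's groq/ route.
interpreter.llm.api_base = "https://api.groq.com/openai/v1"
interpreter.llm.api_key = os.environ["GROQ_API_KEY"]
interpreter.llm.model = "mixtral-8x7b-32768"
interpreter.llm.context_window = 32000

interpreter.chat("Print hello world")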

CyanideByte (Contributor) commented:

BerriAI/litellm#3176
Potentially already being handled by LiteLLM.

KillianLucas (Collaborator) commented:
NICE. Love Groq, great work on this @fire17. As @CyanideByte mentioned, I think we should push this into LiteLLM (they abstract away the Groq interaction so it's equivalent to an OpenAI client).

And it looks like it works with the latest LiteLLM! interpreter --model groq/llama3-70b-8192 runs OI with Groq, and can execute code, if I also pass in my api_key.

In that case, it would be great if we could merge this PR with just the documentation. I'll make that change and then merge, if that's okay with you. If there's anything else to include from the PR, let me know and we can reopen.
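
For anyone who wants to verify the LiteLLM route directly, a minimal sketch against LiteLLM's completion API (model name from the comment above; the prompt is illustrative):

import os
from litellm import completion

# The groq/ prefix routes the request through LiteLLM's native Groq provider.
response = completion(
    model="groq/llama3-70b-8192",
    api_key=os.environ["GROQ_API_KEY"],
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)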

fire17 (Author) commented May 7, 2024

> I'll make that change and then merge, if that's okay with you. If there's anything else to include from the PR, let me know and we can reopen.

For sure! @KillianLucas
I've just added the --api_base workaround to the docs in case anyone still runs into issues.
You're welcome to take the doc :)

All the best!

P.S. Check out my new PR #1259

MikeBirdTech (Contributor) commented:
Hey @fire17

I didn't see this PR!

Would you mind updating the docs to take into account this recently merged PR?

#1376

Thanks and great job!

OriginalSimon commented:
When will the merge happen?

MikeBirdTech (Contributor) commented:
@fire17 Would you like to update this PR, or would you prefer that I close it?
