
Is there any plan to use assistant api in LLM to get response? #761

Open
xiaoxin01 opened this issue Nov 29, 2024 · 2 comments
Labels
help wanted Looking for someone to take on this issue

Comments

@xiaoxin01

Description

Currently, the OpenAI LLM service is implemented with the Chat Completions API. Are there plans to also support the Assistants API in the future?
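For context, the two APIs differ mainly in where conversation state lives: with Chat Completions the caller resends the full message history on every request, while the Assistants API keeps history server-side in a thread that the client appends to and runs. A minimal sketch of the Assistants flow is below, assuming the `openai` Python SDK (v1.x beta endpoints); this is an illustration of the request sequence, not pipecat's implementation, and `ask_via_assistants` is a hypothetical helper name.

```python
def ask_via_assistants(client, assistant_id: str, user_text: str) -> str:
    """Send one user message through the Assistants API thread/run flow.

    `client` is expected to be an `openai.OpenAI()` instance and
    `assistant_id` a pre-created assistant (hypothetical wiring; an
    actual call requires an API key and network access).
    """
    # 1. Create a conversation thread. The Assistants API stores history
    #    server-side, unlike Chat Completions where the caller resends
    #    the whole context each turn.
    thread = client.beta.threads.create()

    # 2. Append the user's message to the thread.
    client.beta.threads.messages.create(
        thread_id=thread.id, role="user", content=user_text
    )

    # 3. Run the assistant on the thread and block until it finishes.
    client.beta.threads.runs.create_and_poll(
        thread_id=thread.id, assistant_id=assistant_id
    )

    # 4. Read back the reply; messages are listed newest-first by default.
    messages = client.beta.threads.messages.list(thread_id=thread.id)
    return messages.data[0].content[0].text.value
```

Note the extra round-trips (thread create, message create, run, list) compared to a single Chat Completions call, which is part of why a streaming voice pipeline would need dedicated support for this API rather than a drop-in swap.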

@genchilu

I have the same question. Does pipecat have plans to support the Assistants API?

@chadbailey59 chadbailey59 added enhancement help wanted Looking for someone to take on this issue labels Jan 10, 2025
@chadbailey59
Contributor

We don't have any current plans, but we'd welcome a PR to add the functionality!

3 participants