LLM user frame processor with tests #703
base: main
Conversation
Hey @aconchillo, how is it going? Any updates on when you think this PR will be merged?
Hello team! @aconchillo @marcus-daily, any ETA on when you think this PR could be merged? I've been using a forked repo with this fix; it works super well. Would love to go back to the main repo!
There are a couple of use cases (the ones where we interrupt) that I'm not sure about and have never actually run into. One of the cases is:
In this case we get 2 transcriptions after
In our experience this does happen quite a lot. If you do not want to send interruptions from this processor (I agree it is not very clean), you can implement similar logic in the input processor directly.
Interesting. Well, I guess we can give this a try 😃 . I have some local changes (e.g. pushing
@EdgarMMProsper @skandere In the case discussed above:
what is an example of the contents of
I was actually reading the original commit and
T1: start of the sentence. T2: end of the sentence. Both were said between S and E but received a bit late because of STT latency. If you do not interrupt, the bot will generate two very similar outputs, "repeating" itself.
It seems the foundational examples series 22 would solve this issue without needing to interrupt. I was assuming T1 and T2 would be two different sentences, in which case the interruption didn't make sense. But if it's the same sentence, then those examples might work better without needing to interrupt. What if T1 has more context than T2?
I've decided I will make this a constructor argument. So you will have the option to interrupt the bot if you want to. But I have the feeling the series 22 examples (or any other approach) would work better. |
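To make the trade-off concrete, here is a minimal sketch of the constructor-argument idea discussed above. This is not Pipecat's actual API; the class name, the `interrupt_on_late_transcription` flag, and the string return values are all hypothetical, purely to illustrate the decision between interrupting and just pushing context when a transcription arrives after the user has stopped speaking.

```python
class UserAggregatorSketch:
    """Hypothetical aggregator illustrating the opt-in interruption flag.

    Not Pipecat's real implementation: names and return values are
    invented for illustration only.
    """

    def __init__(self, interrupt_on_late_transcription=False):
        # Opt-in, as decided in the thread: interruption is off by default.
        self.interrupt_on_late_transcription = interrupt_on_late_transcription
        self.user_speaking = False
        self.aggregation = []

    def on_user_started_speaking(self):  # S in the timeline above
        self.user_speaking = True

    def on_user_stopped_speaking(self):  # E in the timeline above
        self.user_speaking = False

    def on_transcription(self, text):
        """Return the action taken for a transcription frame (T1, T2, ...)."""
        self.aggregation.append(text)
        if self.user_speaking:
            # Normal case: transcription arrives while the user speaks.
            return "aggregate"
        if self.interrupt_on_late_transcription:
            # Late transcription after E: cancel the in-flight bot response
            # so the bot does not answer twice with similar outputs.
            return "interrupt"
        # Default: just push the extra context; the bot may repeat itself.
        return "push"
```

With the flag disabled, a late T2 is simply pushed; with it enabled, the same T2 triggers an interruption.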
Force-pushed from bf78cee to 569d10a
This seems necessary because of how pytest works. If a task is cancelled, pytest will know the task has been cancelled even if `asyncio.CancelledError` is handled internally in the task.
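The asyncio behavior behind this commit can be shown in a small standalone sketch (independent of Pipecat or pytest): when a task catches `asyncio.CancelledError` internally and returns normally, the task does not count as cancelled from the outside, and awaiting it raises nothing.

```python
import asyncio


async def worker():
    try:
        await asyncio.sleep(10)
    except asyncio.CancelledError:
        # Cancellation handled internally: the coroutine returns normally
        # instead of letting CancelledError propagate.
        pass


async def main():
    task = asyncio.create_task(worker())
    await asyncio.sleep(0)  # let the worker reach its await point
    task.cancel()
    await task  # no CancelledError propagates here
    # Because the error was swallowed, the task finished "normally".
    return task.cancelled()


swallowed_cancellation = asyncio.run(main())
```

Here `swallowed_cancellation` is `False`: test code (such as pytest-based tests) cannot rely on `Task.cancelled()` alone to detect that a cancel was requested, which is why the test setup has to account for it explicitly.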
Force-pushed from 569d10a to eef3f32
lgtm
Alright, I have been playing with this a bit. I wanted to add it to 0.0.52 but it's not going to happen. We will rethink LLM aggregators in January, considering also how multi-modal models work. For now, I'll park this.
This PR is based on #450 and fixes VAD and user-transcription ordering issues by improving the LLM context aggregators.