When specifying --timestamp word, the error "WhisperFlashAttention2 attention does not support output_attentions" is raised. insanely-fast-whisper==0.0.15.
#231 (comment). The timestamp of each word cannot be output. Is this a problem with this version, or is the feature simply not supported?
Ok
I have the same problem. I even tried modifying the source code of transformers' Whisper to set output_attentions=False, and that still didn't work.
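For what it's worth, Whisper's word-level timestamps are computed from the cross-attention weights, so forcing output_attentions=False can't help: the feature needs those weights, and FlashAttention2 never materializes them. A possible workaround is to run the transformers ASR pipeline directly with a non-flash attention implementation. A minimal sketch (model name and audio path are placeholders, not from this issue):

```python
import torch
from transformers import pipeline

# Use "eager" attention so the cross-attention weights needed for
# word-level timestamps are actually returned (FlashAttention2 cannot do this).
pipe = pipeline(
    "automatic-speech-recognition",
    model="openai/whisper-large-v3",      # placeholder model
    torch_dtype=torch.float16,
    device="cuda:0",
    model_kwargs={"attn_implementation": "eager"},  # instead of "flash_attention_2"
)

out = pipe(
    "audio.wav",                          # placeholder audio file
    chunk_length_s=30,
    batch_size=8,
    return_timestamps="word",             # word timestamps require attention weights
)
print(out["chunks"])  # per-word {"text": ..., "timestamp": (start, end)} entries
```

This trades the FlashAttention2 speedup for working word-level timestamps; transcription without --timestamp word can still use flash attention.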