
TTS Interruption and Pipeline Termination Issue #902

Open
golbin opened this issue Dec 21, 2024 · 6 comments

Comments


golbin commented Dec 21, 2024

Description

Is this reporting a bug or feature request?

Bug Report

Environment

  • pipecat-ai version: 0.0.49–0.0.51
  • Python version: 3.12
  • OS: Docker image: python:3.12-slim-bookworm

Issue description

The TTS (Cartesia) service stops producing audio at some point during the session. Once this happens, sending an EndFrame does not terminate the process. The logs also show repeated reconnection attempts and cancelled-task errors.

Repro steps

  • The issue occurs intermittently, making it difficult to reproduce consistently.
  • It has, however, recurred repeatedly across sessions.

Expected behavior

  • The TTS service should consistently generate TTS responses throughout the session.
  • On sending EndFrame, the session and associated processes should terminate correctly.

Actual behavior

  • TTS service stops generating responses after a certain point, without any apparent error in the immediate logs.
  • Ending the session does not terminate the process.
  • Logs show repeated reconnection attempts and errors such as "tasks cancelled" and "no close frame received or sent."
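The "process never exits" symptom is consistent with a receive loop awaiting a websocket that was dropped without a close handshake. The following is a minimal stdlib asyncio sketch (not pipecat's actual code — `hung_receive` and `shutdown` are hypothetical names) showing why such a task can block shutdown forever, and how bounding the wait forces a cancel instead:

```python
import asyncio

async def hung_receive():
    # Stands in for a websocket recv() that never returns because the
    # peer dropped without sending a close frame.
    await asyncio.Event().wait()

async def shutdown(task: asyncio.Task, timeout: float) -> str:
    # asyncio.wait_for cancels the hung task once the timeout expires,
    # instead of waiting indefinitely for a close frame.
    try:
        await asyncio.wait_for(task, timeout)
        return "clean exit"
    except asyncio.TimeoutError:
        return "forced cancel"

async def main() -> str:
    task = asyncio.create_task(hung_receive())
    return await shutdown(task, timeout=0.1)

print(asyncio.run(main()))
```

Without the timeout, `await task` in the sketch would hang exactly the way the session here fails to terminate after EndFrame.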

Logs

Logs when TTS is working:

2024-12-19 00:52:17.333 | DEBUG    | pipecat.processors.metrics.frame_processor_metrics:stop_ttfb_metrics:44 - OpenAILLMService#0 TTFB: 0.36797094345092773

2024-12-19 00:52:17.456 | DEBUG    | pipecat.services.cartesia:run_tts:317 - Generating TTS: [Oops, didn’t quite catch that!]

2024-12-19 00:52:17.457 | DEBUG    | pipecat.processors.metrics.frame_processor_metrics:start_tts_usage_metrics:79 - CartesiaTTSService#0 usage characters: 30

2024-12-19 00:52:17.457 | DEBUG    | pipecat.processors.metrics.frame_processor_metrics:stop_processing_metrics:59 - CartesiaTTSService#0 processing time: 0.0012862682342529297

2024-12-19 00:52:17.507 | DEBUG    | pipecat.services.cartesia:run_tts:317 - Generating TTS: [ Could you repeat that part?]

2024-12-19 00:52:17.508 | DEBUG    | pipecat.processors.metrics.frame_processor_metrics:start_tts_usage_metrics:79 - CartesiaTTSService#0 usage characters: 28

2024-12-19 00:52:17.508 | DEBUG    | pipecat.processors.metrics.frame_processor_metrics:stop_processing_metrics:59 - CartesiaTTSService#0 processing time: 0.0013811588287353516

2024-12-19 00:52:17.513 | DEBUG    | pipecat.processors.metrics.frame_processor_metrics:start_llm_usage_metrics:67 - OpenAILLMService#0 prompt tokens: 2710, completion tokens: 14

2024-12-19 00:52:17.516 | DEBUG    | pipecat.processors.metrics.frame_processor_metrics:stop_processing_metrics:59 - OpenAILLMService#0 processing time: 0.5504031181335449

2024-12-19 00:52:17.587 | DEBUG    | pipecat.transports.base_input:_handle_interruptions:124 - User started speaking

2024-12-19 00:52:17.588 | DEBUG    | pipecat.processors.metrics.frame_processor_metrics:stop_ttfb_metrics:44 - CartesiaTTSService#0 TTFB: 0.13139104843139648

2024-12-19 00:52:22.337 | DEBUG    | pipecat.transports.base_input:_handle_interruptions:131 - User stopped speaking

2024-12-19 00:52:22.339 | DEBUG    | pipecat.services.openai:_stream_chat_completions:174 - Generating chat:

Logs when TTS stops working:

2024-12-19 00:52:31.767 | DEBUG    | pipecat.processors.metrics.frame_processor_metrics:stop_ttfb_metrics:44 - OpenAILLMService#0 TTFB: 0.32897281646728516

2024-12-19 00:52:32.660 | DEBUG    | pipecat.processors.metrics.frame_processor_metrics:start_llm_usage_metrics:67 - OpenAILLMService#0 prompt tokens: 2729, completion tokens: 68

2024-12-19 00:52:32.661 | DEBUG    | pipecat.processors.metrics.frame_processor_metrics:stop_processing_metrics:59 - OpenAILLMService#0 processing time: 1.2226619720458984

2024-12-19 00:52:34.717 | DEBUG    | pipecat.transports.base_input:_handle_interruptions:124 - User started speaking

2024-12-19 00:52:35.977 | DEBUG    | pipecat.transports.base_input:_handle_interruptions:131 - User stopped speaking

2024-12-19 00:52:35.979 | DEBUG    | pipecat.services.openai:_stream_chat_completions:174 - Generating chat:

Logs showing reconnection attempts and lingering-process errors after attempting to terminate the pipeline with EndFrame:

2024-12-19 00:52:43.991 | DEBUG    | pipecat.processors.metrics.frame_processor_metrics:stop_ttfb_metrics:44 - OpenAILLMService#0 TTFB: 0.37228965759277344

2024-12-19 00:52:44.074 | INFO     | pipecat.transports.services.daily:on_participant_left:605 - Participant left 230983e3-699a-4dde-b00e-30ab2c57f87b

2024-12-19 00:52:44.076 | DEBUG    | pipecat.services.deepgram:_disconnect:201 - Disconnecting from Deepgram

tasks cancelled error: 

ERROR:deepgram.clients.common.v1.abstract_async_websocket:tasks cancelled error: 

tasks cancelled error: 

ERROR:deepgram.clients.common.v1.abstract_async_websocket:tasks cancelled error: 

2024-12-19 00:52:44.810 | DEBUG    | pipecat.processors.metrics.frame_processor_metrics:start_llm_usage_metrics:67 - OpenAILLMService#0 prompt tokens: 2743, completion tokens: 63

2024-12-19 00:52:44.811 | DEBUG    | pipecat.processors.metrics.frame_processor_metrics:stop_processing_metrics:59 - OpenAILLMService#0 processing time: 1.1924996376037598

2024-12-19 00:53:28.486 | INFO     | modules.pipeline:check_participant_count:160 - Number of participants is less than 2. Ending the session.

2024-12-19 00:57:19.132 | WARNING  | pipecat.services.cartesia:_reconnect_websocket:280 - CartesiaTTSService#0 reconnecting (attempt: 1)

2024-12-19 00:57:19.133 | DEBUG    | pipecat.services.cartesia:_disconnect_websocket:216 - Disconnecting from Cartesia

2024-12-19 00:57:19.133 | DEBUG    | pipecat.services.cartesia:_connect_websocket:203 - Connecting to Cartesia

2024-12-19 01:02:19.376 | WARNING  | pipecat.services.cartesia:_reconnect_websocket:280 - CartesiaTTSService#0 reconnecting (attempt: 2)

2024-12-19 01:02:19.376 | DEBUG    | pipecat.services.cartesia:_disconnect_websocket:216 - Disconnecting from Cartesia

2024-12-19 01:02:19.376 | DEBUG    | pipecat.services.cartesia:_connect_websocket:203 - Connecting to Cartesia

2024-12-19 01:07:19.616 | ERROR    | pipecat.services.cartesia:_receive_task_handler:299 - CartesiaTTSService#0 error receiving messages: no close frame received or sent

2024-12-19 01:07:19.617 | ERROR    | pipecat.pipeline.task:_handle_upstream_frame:64 - Error running app: ErrorFrame#0(error: CartesiaTTSService#0 error receiving messages: no close frame received or sent, fatal: True)

2024-12-19 01:07:19.617 | INFO     | pipecat.transports.services.daily:leave:435 - Leaving

tasks cancelled error: 

2024-12-19 01:07:19.625 | INFO     | pipecat.transports.services.daily:leave:443 - Left

ERROR:deepgram.clients.common.v1.abstract_async_websocket:tasks cancelled error: 

2024-12-19 01:07:19.625 | DEBUG    | pipecat.services.deepgram:_disconnect:201 - Disconnecting from Deepgram

tasks cancelled error: 

ERROR:deepgram.clients.common.v1.abstract_async_websocket:tasks cancelled error: 

2024-12-19 01:07:19.626 | DEBUG    | pipecat.services.cartesia:_disconnect_websocket:216 - Disconnecting from Cartesia
@balalofernandez

Can you be more specific about the version? There was a bug in version 0.0.50; just update to the latest release.


golbin commented Dec 22, 2024

I tested on 0.0.51 for a few days, and the issue is still there.

@markbackman

Someone else reported this. @aconchillo investigated and pushed a fix, which is on main: https://github.com/pipecat-ai/pipecat/pull/898/files.


golbin commented Dec 22, 2024

Great! I think #898 is the same problem.

@aconchillo

@golbin Would it be possible for you to retest with main? The problem was that frames were getting stuck inside a queue in CartesiaTTSService. The commit that @markbackman mentioned should fix the issue.
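The failure mode described above — frames getting stuck in an internal queue — can be illustrated with a minimal stdlib asyncio sketch (hypothetical code, not CartesiaTTSService's actual implementation): if the task draining a service's queue dies on an error, later frames (including a terminating frame) accumulate silently and output simply stops.

```python
import asyncio

async def demo() -> int:
    queue: asyncio.Queue = asyncio.Queue()

    async def consumer():
        while True:
            frame = await queue.get()
            if frame == "boom":
                # An unhandled error kills the consumer task; nothing
                # drains the queue afterwards, so output simply stops.
                raise RuntimeError("receive error")
            queue.task_done()

    task = asyncio.create_task(consumer())
    for frame in ["a", "boom", "b", "c", "EndFrame"]:
        await queue.put(frame)
    await asyncio.sleep(0.1)  # let the consumer run until it crashes
    assert task.done()        # consumer died on "boom"
    task.exception()          # retrieve the error so asyncio doesn't warn
    return queue.qsize()      # frames still stuck in the queue

stuck = asyncio.run(demo())
print(stuck)  # 3 frames, including "EndFrame", were never processed
```

This matches both symptoms in the report: TTS stops mid-session (frames after the error are never consumed) and EndFrame cannot shut the pipeline down because it never reaches the service.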

@golbin

golbin commented Dec 23, 2024

@aconchillo This issue does not occur consistently, and it’s difficult to reproduce it exactly. If you release the patch, I will apply it to the server, monitor the situation, and report back whether the issue recurs or not.
