
feat: add BaileysMessageProcessor for improved message handling #1726


Merged: 3 commits into EvolutionAPI:develop on Jul 16, 2025

Conversation

Santosl2

@Santosl2 Santosl2 commented Jul 16, 2025

This implementation introduces a reactive queue for processing incoming Baileys messages using RxJS. The goal is to ensure that message batches are processed sequentially, avoiding concurrency issues and race conditions that could occur under high message volume.

By leveraging a Subject combined with concatMap, we ensure that each batch is only processed after the previous one completes. This is critical in scenarios where onMessageReceive performs asynchronous operations that must not run in parallel (e.g., database writes, external API calls).

Error handling is also included to prevent a single failure from breaking the entire message stream.

Overall, this approach improves the reliability, safety, and predictability of message processing, especially under heavy load.
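
The sequential guarantee described above (Subject plus concatMap) can be pictured with a dependency-free sketch that gives the same behavior using a promise chain. The class and payload names here are illustrative, not the merged code, which uses RxJS:

```typescript
// Minimal sketch of the sequential-processing guarantee described above.
// The real implementation uses an RxJS Subject piped through concatMap;
// chaining onto a single promise tail has the same "one batch at a time" effect.
type MessageUpsertPayload = { messages: string[]; type: string; requestId?: string };

class SequentialProcessor {
  // Each new batch is chained onto the previous one, so handlers never overlap.
  private tail: Promise<void> = Promise.resolve();

  constructor(private onMessageReceive: (p: MessageUpsertPayload) => Promise<void>) {}

  processMessage(payload: MessageUpsertPayload): void {
    // Swallow individual failures so one bad batch doesn't break the whole
    // stream, mirroring the error handling mentioned in the description.
    this.tail = this.tail
      .then(() => this.onMessageReceive(payload))
      .catch((err) => console.error('batch failed:', err));
  }
}
```

Even if the second batch's handler finishes faster, it only starts after the first one completes, which is the property concatMap provides.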

Summary by Sourcery

Introduce a BaileysMessageProcessor that uses RxJS to asynchronously batch and process incoming WhatsApp messages, and integrate it into BaileysStartupService for message handling and cleanup.

New Features:

  • Introduce BaileysMessageProcessor class to asynchronously process incoming WhatsApp messages using RxJS streams
  • Mount BaileysMessageProcessor in BaileysStartupService to handle messages and teardown on logout

Enhancements:

  • Refactor message upsert handling in BaileysStartupService to route through the new processor instead of direct invocation

Build:

  • Add rxjs as a dependency in package.json

Contributor

sourcery-ai bot commented Jul 16, 2025

Reviewer's Guide

This PR introduces a BaileysMessageProcessor leveraging RxJS to asynchronously process incoming message batches, integrates it into the BaileysStartupService for decoupled handling, and adds the rxjs dependency.

Sequence diagram for asynchronous message processing with BaileysMessageProcessor

sequenceDiagram
    participant Service as BaileysStartupService
    participant Processor as BaileysMessageProcessor
    participant Handler as onMessageReceive

    Service->>Processor: processMessage(payload, settings)
    Processor->>Processor: messageSubject.next({messages, type, requestId, settings})
    Processor-->>Handler: onMessageReceive({messages, type, requestId}, settings) via RxJS pipeline
    Handler-->>Processor: async processing complete

Class diagram for MountProps type used in BaileysMessageProcessor

classDiagram
    class MountProps {
        +onMessageReceive(payload: MessageUpsertPayload, settings: any): Promise<void>
    }
    BaileysMessageProcessor ..> MountProps : mount({ onMessageReceive })

File-Level Changes

1. Introduce BaileysMessageProcessor using RxJS to handle message batches asynchronously (src/api/integrations/channel/whatsapp/baileysMessage.processor.ts)
     • Created the new BaileysMessageProcessor class with a Subject-based pipeline
     • Implemented mount, processMessage, and onDestroy methods with logging and error handling
     • Configured concatMap to map batches to the onMessageReceive handler
2. Integrate messageProcessor into BaileysStartupService for decoupled handling (src/api/integrations/channel/whatsapp/whatsapp.baileys.service.ts)
     • Imported and instantiated BaileysMessageProcessor
     • Mounted the processor in the constructor with the onMessageReceive callback bound
     • Replaced the direct messages.upsert call with processor.processMessage
     • Added a processor.onDestroy call in logoutInstance
3. Add the RxJS dependency for asynchronous streams (package.json)
     • Added rxjs to project dependencies
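
The wiring described in the change list can be sketched with a small stub. The processor internals are simplified here (the real class uses an RxJS Subject), and the service member names other than mount, processMessage, and onDestroy are hypothetical:

```typescript
// Sketch of the integration described above; not the merged implementation.
type MessageUpsertPayload = { messages: unknown[]; type: string; requestId?: string };

class BaileysMessageProcessor {
  private handler?: (payload: MessageUpsertPayload, settings: unknown) => Promise<void>;

  mount(opts: { onMessageReceive: (p: MessageUpsertPayload, s: unknown) => Promise<void> }) {
    this.handler = opts.onMessageReceive;
  }

  processMessage(payload: MessageUpsertPayload, settings: unknown) {
    // In the real class this emits into the Subject/concatMap pipeline.
    void this.handler?.(payload, settings);
  }

  onDestroy() {
    this.handler = undefined;
  }
}

class BaileysStartupService {
  private messageProcessor = new BaileysMessageProcessor();
  received: string[] = [];

  constructor() {
    // Mounted in the constructor with the callback bound, per the change list.
    this.messageProcessor.mount({
      onMessageReceive: async (payload) => {
        this.received.push(payload.type);
      },
    });
  }

  // Replaces the former direct messages.upsert handling.
  onMessagesUpsert(payload: MessageUpsertPayload, settings: unknown) {
    this.messageProcessor.processMessage(payload, settings);
  }

  logoutInstance() {
    // Tears the pipeline down on logout, as described above.
    this.messageProcessor.onDestroy();
  }
}
```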


@Santosl2 Santosl2 changed the base branch from main to develop July 16, 2025 00:35
Copy link
Contributor

@sourcery-ai sourcery-ai bot left a comment


Hey @Santosl2 - I've reviewed your changes - here's some feedback:

  • Consider guarding in processMessage to log or throw if mount wasn’t called first, so you don’t emit into an uninitialized Subject.
  • To prevent unbounded queue growth under load, think about adding a concurrency/backpressure strategy (e.g. mergeMap with a concurrency limit) instead of using an unbounded Subject with concatMap.
  • Once BaileysMessageProcessor is fully in place, remove the commented-out direct call to messageHandle to keep the event loop code clean.
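
As a dependency-free illustration of the second point, the effect of RxJS's mergeMap(project, concurrency) can be emulated with a small bounded runner. The names here are illustrative and not part of the PR:

```typescript
// Illustrative only: emulates mergeMap's `concurrency` parameter without RxJS.
// At most `limit` handlers run at once; further batches wait in a FIFO queue,
// which bounds the amount of in-flight work under load.
class BoundedRunner {
  private active = 0;
  private waiting: Array<() => void> = [];

  constructor(private limit: number) {}

  async run(task: () => Promise<void>): Promise<void> {
    if (this.active >= this.limit) {
      // Park until a slot frees up instead of starting unbounded work.
      await new Promise<void>((resolve) => this.waiting.push(resolve));
    }
    this.active++;
    try {
      await task();
    } finally {
      this.active--;
      // Wake exactly one parked task per freed slot.
      this.waiting.shift()?.();
    }
  }
}
```

With a limit of 1 this degenerates to the concatMap behavior; higher limits trade strict ordering for throughput.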

## Individual Comments

### Comment 1
<location> `src/api/integrations/channel/whatsapp/baileysMessage.processor.ts:44` </location>
<code_context>
+
+  processMessage(payload: MessageUpsertPayload, settings: any) {
+    const { messages, type, requestId } = payload;
+    this.messageSubject.next({ messages, type, requestId, settings });
+  }
+
</code_context>

<issue_to_address>
No backpressure or queue size limit is enforced on messageSubject.

This could cause memory issues if many messages are processed rapidly. Consider implementing a queue size limit or backpressure mechanism.

Suggested implementation:

```typescript
  // Limit the queue to 100 messages (adjust as needed)
  protected readonly MESSAGE_QUEUE_LIMIT = 100;
  protected messageQueue: Array<{ messages: any; type: any; requestId: any; settings: any }> = [];
  protected messageSubject = new Subject<{

```

```typescript
  processMessage(payload: MessageUpsertPayload, settings: any) {
    const { messages, type, requestId } = payload;
    if (this.messageQueue.length >= this.MESSAGE_QUEUE_LIMIT) {
      // Drop the oldest message to make room for the new one (or log/drop new, as preferred)
      this.processorLogs.warn(
        `Message queue full (${this.MESSAGE_QUEUE_LIMIT}). Dropping oldest message.`
      );
      this.messageQueue.shift();
    }
    const message = { messages, type, requestId, settings };
    this.messageQueue.push(message);
    this.messageSubject.next(message);
  }

```
</issue_to_address>

### Comment 2
<location> `src/api/integrations/channel/whatsapp/baileysMessage.processor.ts:48` </location>
<code_context>
+  }
+
+  onDestroy() {
+    this.subscription?.unsubscribe();
+    this.messageSubject.complete();
+  }
+}
</code_context>

<issue_to_address>
Unsubscribing and completing the Subject may cause issues if processMessage is called after onDestroy.

Calling processMessage after onDestroy will throw an error due to the completed Subject. Please guard processMessage or document its usage constraints.
</issue_to_address>
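
One possible guard, sketched here rather than taken from the merged code, combines a destroyed flag with the bounded queue from Comment 1:

```typescript
// Sketch of a guard against emitting after teardown, per Comment 2.
// Names mirror the diff; the queue-limit handling is illustrative only.
type Batch = { messages: unknown[]; type: string };

class GuardedProcessor {
  private destroyed = false;
  private queue: Batch[] = [];
  private readonly QUEUE_LIMIT = 100;

  processMessage(batch: Batch): boolean {
    if (this.destroyed) {
      // Dropping (or throwing) here avoids calling next() on a completed Subject.
      console.warn('processMessage called after onDestroy; dropping batch');
      return false;
    }
    if (this.queue.length >= this.QUEUE_LIMIT) {
      this.queue.shift(); // drop the oldest batch to bound memory under load
    }
    this.queue.push(batch);
    return true;
  }

  onDestroy() {
    this.destroyed = true;
    this.queue = [];
  }
}
```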



@Santosl2 Santosl2 changed the title feat: add BaileysMessageProcessor for improved message handling and i… feat: add BaileysMessageProcessor for improved message handling Jul 16, 2025
@DavidsonGomes
Collaborator

Please resolve the reported conflicts!

@DavidsonGomes DavidsonGomes merged commit 9fd40a4 into EvolutionAPI:develop Jul 16, 2025
1 check passed