Add support for arbitrary SQS message #69

@januszm

Description

Hi. I came across this project while looking for a solution that lets me run a pool of workers able to process tasks from any AWS SQS queue with little effort. By that I mean queues that were not created by the application but already exist in AWS, carrying e.g. various notifications from AWS services, such as the creation of an object in S3.
I tried Celery, but its SQS support comes down to publishing and processing only its own messages.
I was hoping that pyqs would be able to process any message, but the situation looks similar to Celery's. We have:

import json

def decode_message(message):
    message_body = message['Body']
    json_body = json.loads(message_body)
    if 'task' in message_body:
        return json_body
    # elif ... <<< here, or extract this part into its own method, like 'detect_message_type(message)'
    else:
        # Fallback to processing celery messages
        return decode_celery_message(json_body['body'])

Have you considered this possibility? It could work via a configuration option, or by detecting a part of the message that uniquely identifies Celery (just as, I believe, 'task' identifies pyqs messages in the code above).
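For what it's worth, here is a rough sketch of how that extraction could look. `detect_message_type`, the `'raw'` kind, and the stubbed `decode_celery_message` are all my own illustrative names and assumptions, not actual pyqs API:

```python
import base64
import json

# Illustrative sketch only: detect_message_type, the 'raw' kind, and this
# stubbed decode_celery_message are assumptions, not the real pyqs API.

def decode_celery_message(encoded_body):
    # Stand-in for pyqs's Celery decoding (base64-wrapped JSON).
    return json.loads(base64.b64decode(encoded_body))

def detect_message_type(json_body):
    """Classify an already-JSON-decoded SQS message body."""
    if 'task' in json_body:
        return 'pyqs'
    if 'body' in json_body:
        return 'celery'
    return 'raw'  # anything else, e.g. an S3 event notification

def decode_message(message):
    json_body = json.loads(message['Body'])
    kind = detect_message_type(json_body)
    if kind == 'pyqs':
        return json_body
    if kind == 'celery':
        return decode_celery_message(json_body['body'])
    # 'raw': hand the decoded body straight to a user-configured handler
    return json_body
```

With something like this, registering a handler for the `'raw'` kind would let workers consume S3 event notifications or any other pre-existing queue traffic without pyqs rejecting the message.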
