Parallelising sequence tasks #309
Replies: 1 comment 1 reply
-
This feature has been on my TODO list since version 0.1.0 😆 The reason it's not implemented yet is just the complexity inherent in elegantly handling output from multiple tasks in one console. I agree this would be a valuable feature, but it would require some refactoring of how task execution is managed. My last idea was to make the executor use asyncio.create_subprocess, and then offer a few options for what to do with the output (the default being something like what docker compose does to multiplex streams with prefixed lines). See #24 for more discussion.

In the meantime you can get a version of this behavior with background jobs in a shell task. The downside is that task outputs are randomly interleaved.

```toml
check.shell = """
poe -C ./libraries/core check &
poe -C ./services/api check &
poe -C ./services/consumer check &
poe -C ./services/worker check &
wait
"""
```
-
Hey,
We use poe pretty extensively in a small monorepo project. I recently set up a sequence command at the root that runs commands in the individual projects within the repo, roughly like the sketch below.
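(The task names and paths here are just placeholders, not our actual config — a minimal sketch of what I mean by a root-level sequence task:)

```toml
# Hypothetical pyproject.toml at the monorepo root
[tool.poe.tasks]
check = [
  { cmd = "poe -C ./libraries/core check" },
  { cmd = "poe -C ./services/api check" },
  { cmd = "poe -C ./services/worker check" },
]
```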
I was wondering if there was any reason to not support parallelisation of these tasks?