Replies: 2 comments 2 replies
@tsegismont @vietj might give more details (it's a Vert.x question). The main problem is indeed the lack of back-pressure for streaming use-cases at the event-bus level.
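To make the back-pressure gap concrete: the event bus just queues whatever you send, so a fast producer can bury a slow consumer. A common manual workaround (a convention, not a Vert.x feature) is a one-credit window: send a chunk, wait for the consumer's acknowledging reply, then send the next. A minimal sketch, assuming Vert.x 4 and that the consumer replies to each chunk:

```java
import java.util.Iterator;

import io.vertx.core.buffer.Buffer;
import io.vertx.core.eventbus.EventBus;

public class CreditSender {

  // Send chunks over the event bus with a manual one-credit window:
  // each chunk is sent via request(), and the next chunk is only sent
  // once the consumer has replied (acknowledged) the previous one.
  static void sendChunks(EventBus bus, String addr, Iterator<Buffer> chunks) {
    if (!chunks.hasNext()) {
      bus.send(addr, null); // end-of-stream marker (a convention, not an API)
      return;
    }
    bus.request(addr, chunks.next())                       // wait for the consumer's ack...
       .onSuccess(ack -> sendChunks(bus, addr, chunks));   // ...before sending the next chunk
  }
}
```

This trades throughput for bounded memory; a larger window (N outstanding chunks) is a straightforward extension of the same idea.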
I think the best option would be to pause the HttpClientResponse and pass it over to the httpListener, which would unpause it and stream it in all cases.
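A rough sketch of that idea, assuming Vert.x 4 and that the Verticle and the httpListener run in the same JVM (an HttpClientResponse cannot be serialized across a clustered event bus). `handOffToListener` is a hypothetical helper, not a Vert.x API:

```java
import io.vertx.core.http.HttpClient;
import io.vertx.core.http.HttpClientRequest;
import io.vertx.core.http.HttpClientResponse;
import io.vertx.core.http.HttpServerResponse;
import io.vertx.core.http.RequestOptions;

public class PauseAndHandOff {

  // Verticle side: fetch the URL, then pause the response before any data
  // handler is set, so no chunks are read until someone resumes the stream.
  void fetchAndHandOff(HttpClient client, String url) {
    client.request(new RequestOptions().setAbsoluteURI(url))
        .compose(HttpClientRequest::send)
        .onSuccess(resp -> {
          resp.pause();             // stop reading; TCP back-pressure applies upstream
          handOffToListener(resp);  // hypothetical in-JVM hand-off
        });
  }

  // Hypothetical hand-off, e.g. putting the paused response into a shared
  // map keyed by request id, which the httpListener looks up.
  void handOffToListener(HttpClientResponse resp) {
    // left as an exercise; depends on how the two sides share state
  }

  // httpListener side: pipe the paused response into the server response.
  // pipeTo() resumes the stream and propagates back-pressure both ways.
  void stream(HttpClientResponse resp, HttpServerResponse out) {
    out.setChunked(true);
    resp.pipeTo(out);
  }
}
```

The key point is that `pause()` stops chunk delivery without losing data, and `pipeTo()` takes care of resuming and of back-pressure in both directions.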
I have a kind of "proxy" application where POSTing to an endpoint triggers an EventBus message, which in turn triggers an HTTP client to fetch some data and return it via msg.reply. Roughly like this:

```mermaid
sequenceDiagram
    autonumber
    actor client
    participant h as httpListener
    participant e as EventBus
    participant v as Verticle
    participant c as HttpClient
    participant d as Target
    v->>e: listen for processurl
    client->>h: POST {url: ..}
    h->>e: request processurl
    e->>v: request processurl
    v->>c: request processurl
    c->>d: GET url
    d->>c: return result
    c->>v: return result
    v->>e: package for eventbus
    e->>h: return result
    h->>client: return result
```

I created a sample where I stripped out all the pre- and post-processing to focus on the flow (there is plenty in the production app). So far the approach has worked - until it doesn't :-(
I need to add support for URLs that return chunked responses. I don't want to load all the chunks at once, eventually blowing up my memory. In principle I know how to implement that with the HTTP client. My stumbling block now is: how do I reply to one eventBus.request(...) multiple times, once for each chunk? Or what would be a good approach for such a task?
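For context, the flow in the diagram reduces to something like the following minimal sketch (Vert.x 4 APIs; the address "processurl" is from the diagram, while ports and error codes are illustrative). It also shows exactly where the single-reply limitation bites: the Verticle buffers the whole body because msg.reply can only be called once per request.

```java
import io.vertx.core.Vertx;
import io.vertx.core.buffer.Buffer;
import io.vertx.core.http.HttpMethod;
import io.vertx.core.http.RequestOptions;

public class ProxyFlowSketch {
  public static void main(String[] args) {
    Vertx vertx = Vertx.vertx();

    // Verticle side: consume "processurl", fetch the posted URL, reply once
    // with the fully buffered body - the part that breaks for large
    // chunked responses.
    vertx.eventBus().<String>consumer("processurl", msg ->
        vertx.createHttpClient()
            .request(new RequestOptions()
                .setMethod(HttpMethod.GET)
                .setAbsoluteURI(msg.body()))
            .compose(req -> req.send())
            .compose(resp -> resp.body())           // buffers the whole response
            .onSuccess(msg::reply)                  // exactly one reply per request
            .onFailure(err -> msg.fail(500, err.getMessage())));

    // httpListener side: the POST body carries the URL; one request/reply pair.
    vertx.createHttpServer()
        .requestHandler(req -> req.body()
            .compose(post -> vertx.eventBus().<Buffer>request("processurl", post.toString()))
            .onSuccess(reply -> req.response().end(reply.body()))
            .onFailure(err -> req.response().setStatusCode(502).end()))
        .listen(8080);
  }
}
```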