Live data feed query results in pipeline runner #30
Replies: 4 comments
-
Ideally, in order to implement streaming market data (running throughout the trading session), you'd want to implement an adaptor against the chosen broker API that will expose the generic `Source`. The pipeline itself cares only for the `Source` class, meaning that it will probably be up to the adaptor implementation to keep the 3rd-party connection open and receiving data, transform it, and pass it to a `Source`. We could wrap it in some kind of "adaptor runner" that will bridge the broker to the `Source` object. But that means creating an interface that will bind the different adaptors' implementations. Because there is no implementation for a streaming provider at the moment, I didn't see a reason to create it yet. After the first implementation, we'll get a sense of how the interface should look and maybe go in that direction.
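For illustration, a minimal sketch of that "adaptor runner" idea, assuming a bare-bones `Candle` dataclass and that the pipeline only needs to pull an `Iterator[Candle]` out of the `Source`. `StreamingSourceAdaptor`, `on_broker_message`, and the payload keys are made up for the example; the real project classes may look quite different:

```python
import queue
from dataclasses import dataclass
from typing import Iterator, Optional


@dataclass
class Candle:
    symbol: str
    open: float
    high: float
    low: float
    close: float


class StreamingSourceAdaptor:
    """Bridges a broker's push-style API to a pull-style Iterator[Candle]."""

    def __init__(self) -> None:
        self._queue: "queue.Queue[Optional[Candle]]" = queue.Queue()

    def on_broker_message(self, raw: dict) -> None:
        # Called by the broker client (e.g. from its websocket thread).
        self._queue.put(self._transform(raw))

    def close(self) -> None:
        # Sentinel tells the iterator that the session is over.
        self._queue.put(None)

    def read(self) -> Iterator[Candle]:
        # The pipeline consumes this; it blocks until new data arrives.
        while (candle := self._queue.get()) is not None:
            yield candle

    @staticmethod
    def _transform(raw: dict) -> Candle:
        # Map the broker's payload onto the pipeline's candle shape.
        return Candle(raw["symbol"], raw["o"], raw["h"], raw["l"], raw["c"])
```

The adaptor owns the broker connection and the queue; the pipeline just iterates over `read()`, so it never needs to know which broker is behind it.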
-
I think, given the current state of my additional providers, the best way to get this feature into the main project would be to create a dummy streaming provider which returns some fixed bars over time and completes the interface, for technical reference.
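Something like the following, as a rough sketch; plain dicts and arbitrary bar values stand in for the real candle type, and `DummyStreamingProvider` and its methods are placeholder names:

```python
import itertools
import time
from typing import Iterator

# Fixed bars the dummy provider loops over; the values are arbitrary placeholders.
FIXED_BARS = [
    {"o": 100.0, "h": 101.0, "l": 99.5, "c": 100.5},
    {"o": 100.5, "h": 102.0, "l": 100.0, "c": 101.5},
    {"o": 101.5, "h": 101.8, "l": 100.8, "c": 101.0},
]


class DummyStreamingProvider:
    """Emits the fixed bars one by one over time, mimicking a live feed."""

    def __init__(self, interval_seconds: float = 1.0, total_bars: int = 10) -> None:
        self.interval_seconds = interval_seconds
        self.total_bars = total_bars

    def stream(self, symbol: str) -> Iterator[dict]:
        for bar in itertools.islice(itertools.cycle(FIXED_BARS), self.total_bars):
            time.sleep(self.interval_seconds)  # simulate data arriving over time
            yield {"symbol": symbol, **bar}
```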
-
I'm not familiar with many brokers' streaming APIs, so I don't know how the interface should be designed. There are APIs that are symbol-based, where you have to request symbols one by one. There are firehoses that send you everything. There are pull-based ones... I'd really like to have several providers before committing to a unified binding interface. It is possible to go and create this dummy adaptor that will provide a long-running `Source`.
-
The current provider that I am working on implementing is TradingView Pro's unofficial websocket feed.
-
What is the proposed architecture for a long-running live session of market data?

Currently the `PipelineRunner` will expect a `Source` to return an `Iterator[Candle]`. The `AsyncQueryResult` is returned by `AsyncMarketProvider.request_symbol_history`, and our `QuerySubscription` attached to the results will `wait` for the `done_event` before returning. Once the market provider returns the symbol history and our subscriptions complete, the pipeline ends.

Ideally, the architecture should be able to listen for incoming candles from a source and continuously process new incoming data through the pipeline.
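Roughly, the gap looks like this to me (an illustrative sketch only; none of these names are the actual project classes):

```python
import time
from datetime import datetime, timezone
from typing import Callable, Iterator


def run_pipeline(candles: Iterator[dict], process: Callable[[dict], None]) -> None:
    # Today's behaviour in spirit: the run ends as soon as the iterator does.
    for candle in candles:
        process(candle)


def live_candles(fetch_latest: Callable[[], dict],
                 session_end: datetime,
                 poll_seconds: float = 1.0) -> Iterator[dict]:
    # A long-running iterator that keeps yielding until the session closes,
    # so the same pipeline loop keeps processing new data as it arrives.
    while datetime.now(timezone.utc) < session_end:
        yield fetch_latest()
        time.sleep(poll_seconds)
```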
Any thoughts on the implementation of this would be greatly appreciated.
This might be related to #2 section 2.