Streaming, which processes individual increments, is innately slower than processing a static batch in a single pass. Streaming carries roughly 10x overhead to manage the quote and indicator history caches needed for handling each increment. For example, both of these are true:
- processing a static batch of 500 quotes into indicators takes 25 µs
- processing and caching 500 quotes and indicators takes 250 µs
- adding one quote and recalculating the static batch takes 25 µs
- adding one quote and calculating the stream increment takes 5 µs
In other words, streaming has an 80% more efficient steady state for handling new quote arrivals (5 µs vs. 25 µs), but is 10x slower when processing batches of quotes. As a result, the more efficient way to initialize an indicator data series is with a batch method, then turn on the streaming quote handler (i.e. do not emulate the stream to catch up the initial load).
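To illustrate the recommended pattern, here is a minimal Python sketch (the library itself is .NET, so this is illustrative only): initialize the series from a static batch, then hand new arrivals to an O(1) incremental update instead of recalculating the whole series. `StreamingSma` and its method names are hypothetical, not part of any library API.

```python
from collections import deque


class StreamingSma:
    """Hypothetical moving average supporting batch initialization
    plus O(1) incremental updates for streaming quote arrivals."""

    def __init__(self, period: int):
        self.period = period
        self.window = deque(maxlen=period)  # rolling cache of recent quotes
        self.total = 0.0
        self.results: list[float | None] = []

    def add_batch(self, prices: list[float]) -> None:
        # Initialize from a static batch of historical quotes.
        for price in prices:
            self.add(price)

    def add(self, price: float) -> None:
        # Steady-state increment: adjust the rolling sum rather than
        # recalculating the entire batch.
        if len(self.window) == self.period:
            self.total -= self.window[0]  # drop oldest before deque evicts it
        self.window.append(price)
        self.total += price
        self.results.append(
            self.total / self.period if len(self.window) == self.period else None
        )


# Batch-initialize with history, then stream new quotes as they arrive.
sma = StreamingSma(period=3)
sma.add_batch([1.0, 2.0, 3.0, 4.0])  # fast batch catch-up
sma.add(5.0)                          # cheap incremental update
```

Each `add` call touches only the rolling window, which is why the streaming steady state is cheap; the caching that enables it is the overhead that makes full-batch streaming slower than a one-shot batch calculation.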
Refactor the chainable streaming mechanism to support more than the current use of `Quote` or `Use` bases. The current form of observable management won't work.