Currently, net flows can record wildly incorrect values (e.g. $58T in flows) due to bad token-level data in either the USD price or the token count.
Build an intermediate model with logic along the lines of: "if the lead and lag values (up to 2 rows away) are more than 20x different from the current value, null this row as bad data."
In the example below, we know the middle row is incorrect:
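A minimal sketch of the filtering rule in pandas (column name `net_flow_usd`, the window of 2 rows, and the 20x threshold are all assumptions, not confirmed names from the model):

```python
import numpy as np
import pandas as pd

def null_spikes(df: pd.DataFrame, col: str = "net_flow_usd",
                window: int = 2, threshold: float = 20.0) -> pd.DataFrame:
    """Null rows whose value is >threshold-x different from every
    available lead/lag value within `window` rows (hypothetical helper)."""
    out = df.copy()
    v = out[col].astype(float)
    bad = pd.Series(True, index=out.index)
    any_neighbor = pd.Series(False, index=out.index)
    for k in [i for i in range(-window, window + 1) if i != 0]:
        nb = v.shift(k)  # lag (k>0) or lead (k<0) value
        has = nb.notna() & (nb != 0)
        ratio = (v / nb).abs()
        far = has & ((ratio > threshold) | (ratio < 1 / threshold))
        # a row stays flagged only if every comparable neighbor is far away
        bad &= (~has) | far
        any_neighbor |= has
    bad &= any_neighbor & v.notna()
    out.loc[bad, col] = np.nan
    return out
```

Requiring *all* neighbors to disagree (rather than any one) avoids nulling good rows that happen to sit next to a single bad one; rows with only NaN or zero neighbors are left untouched. Sign flips of similar magnitude pass through since the comparison is on absolute ratio.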