Process large result sets in batches to minimize the risk of data going stale #394

@paterczm

Description

The consistency checker processes data in time frames. The number of rows/docs it fetches for a given time frame can be very large, and processing can take a significant amount of time. During that time, the data in the source may change, which can result in the migrator overwriting good data with a previous state. To minimize this risk, the migrator should process large data sets in smaller batches. The downside of this approach is more load, but with a configurable batch size it should not be a problem. A rough sketch of the idea is shown below.
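The following is a minimal sketch of what batched processing of a single time frame could look like. All class and method names (SourceStore, TargetStore, fetch, write, etc.) are hypothetical placeholders, not the migrator's actual API; the point is only that each fetch pulls at most batchSize records, so the window in which source data can change between read and write is bounded by one batch rather than the whole time frame.

```java
import java.util.List;

// Hypothetical sketch; names and types are illustrative, not the real migrator API.
public class BatchedTimeFrameProcessor {

    private final SourceStore source;  // hypothetical source accessor
    private final TargetStore target;  // hypothetical target accessor
    private final int batchSize;       // configurable batch size

    public BatchedTimeFrameProcessor(SourceStore source, TargetStore target, int batchSize) {
        this.source = source;
        this.target = target;
        this.batchSize = batchSize;
    }

    // Processes one time frame in fixed-size batches instead of loading everything at once.
    public void process(TimeFrame frame) {
        long offset = 0;
        while (true) {
            // Fetch only the next batch; each fetch reflects the source's current state.
            List<Record> batch = source.fetch(frame, offset, batchSize);
            if (batch.isEmpty()) {
                break;
            }
            target.write(batch);
            offset += batch.size();
        }
    }

    // Minimal placeholder types so the sketch is self-contained.
    interface SourceStore { List<Record> fetch(TimeFrame frame, long offset, int limit); }
    interface TargetStore { void write(List<Record> batch); }
    record TimeFrame(long start, long end) {}
    record Record(String id, String payload) {}
}
```

A smaller batch size shortens the staleness window at the cost of more round trips to the source, which is why making it configurable matters.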
