MLOps Monitoring & Alerts


Drift

  1. Data drift & concept drift
  2. Arize.ai
    1. Data, concept, and feature drift - comparisons across train/production/validation time windows, across different models, A/B tests, etc., and how to measure each kind of drift
    2. Model store, feature store, evaluation store
    3. Monitoring model performance in production - with real-time, biased, delayed, or missing ground truth.
  3. Some advice on Medium: relabel using the latest model (can we even trust its labels?), then retrain.
  4. Adversarial Validation Approach to Concept Drift Problem in User Targeting Automation Systems at Uber - Previous research on concept drift mostly proposed model retraining after observing performance decreases. However, this approach is suboptimal because the system fixes the problem only after suffering from poor performance on new data. Here, we introduce an adversarial validation approach to concept drift problems in user targeting automation systems. With our approach, the system detects concept drift in new data before making inference, trains a model, and produces predictions adapted to the new data.
  5. Drift estimation between data sets using a random forest - the formula is in the Medium article above, with code in MLBox.
  6. Alibi Detect - an open-source Python library by Seldon focused on outlier, adversarial, and drift detection.
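The adversarial validation idea from items 4-5 above can be sketched in a few lines: train a classifier to distinguish training rows from production rows, and read its cross-validated AUC as a drift score (an AUC near 0.5 means the two sets are indistinguishable; near 1.0 means strong drift). This is a minimal illustration with scikit-learn, not Uber's or MLBox's actual implementation; the function name and thresholds are my own.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def adversarial_drift_score(train, prod, n_splits=5, seed=0):
    """Drift score via adversarial validation.

    Labels train rows 0 and prod rows 1, then cross-validates a
    random forest that tries to tell them apart. Returns mean ROC AUC:
    ~0.5 -> no detectable drift, -> 1.0 -> strong drift.
    """
    X = np.vstack([train, prod])
    y = np.concatenate([np.zeros(len(train)), np.ones(len(prod))])
    clf = RandomForestClassifier(n_estimators=100, random_state=seed)
    aucs = cross_val_score(clf, X, y, cv=n_splits, scoring="roc_auc")
    return aucs.mean()

rng = np.random.default_rng(0)
# Same distribution: classifier should not beat chance.
same = adversarial_drift_score(rng.normal(size=(500, 5)),
                               rng.normal(size=(500, 5)))
# Shifted distribution: classifier separates the sets easily.
shifted = adversarial_drift_score(rng.normal(size=(500, 5)),
                                  rng.normal(loc=2.0, size=(500, 5)))
```

A practical upside of this approach, noted in the Uber paper above, is that it flags drift from the features alone, before any labels arrive and before performance visibly degrades.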

(Figure: Alibi Detect drift detection features)

Tool Comparisons

  1. State of MLOps (by me) - Medium article and open-source AirTable.
  2. Neptune.ai's MLOps tools landscape
  3. TWIML AI ML/AI solutions guide
  4. Ambiata - how to choose the best MLOps tools
  5. LakeFS on the state of data engineering - includes monitoring and observability tooling
  6. The NLP Pandect - MLOps for NLP
  7. ml-ops.org
  8. Awesome production ML
