- Monitor! Stop being a blind DS
- Monitor your dependencies! Stop being a blind DS
- Data science observability for executives
- Production Machine Learning Monitoring: Outliers, Drift, Explainers & Statistical Performance, YouTube talk; uses alibi-explain (see compendium) and alibi-detect (see compendium).
- MLflow, HyperparameterHunter, Hyperopt, concept drift, unit tests.
- Meta-anomaly detection: aggregating anomaly signals over multiple models.
- Analytics Vidhya on monitoring data / models
- Data & concept drifts (two articles)
- Arize.ai
- Data, concept, and feature drift - comparing train/production/validation time windows, different models, A/B testing, etc., and how to measure drift (a minimal PSI sketch follows this bullet).
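To make "how to measure drift" concrete, here is a minimal sketch of the Population Stability Index (PSI) for a single numeric feature, comparing a training window against a production window. PSI, the quantile bucketing, and the 0.1/0.25 rule-of-thumb thresholds are common conventions and an assumption here, not something taken from the article above.

```python
import numpy as np

def psi(expected, actual, n_bins=10, eps=1e-6):
    """Population Stability Index of one numeric feature between a
    reference sample (e.g. training window) and a comparison sample
    (e.g. production window)."""
    # Bin edges come from the reference distribution (quantiles).
    edges = np.quantile(expected, np.linspace(0.0, 1.0, n_bins + 1))
    # Clip production values into the reference range so outliers land in the end bins.
    actual = np.clip(actual, edges[0], edges[-1])

    expected_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    actual_pct = np.histogram(actual, bins=edges)[0] / len(actual)

    # eps guards against log(0) for empty buckets.
    expected_pct = np.clip(expected_pct, eps, None)
    actual_pct = np.clip(actual_pct, eps, None)
    return float(np.sum((actual_pct - expected_pct) * np.log(actual_pct / expected_pct)))

# Example: one feature, training window vs. a shifted production window.
rng = np.random.default_rng(0)
train_feature = rng.normal(0.0, 1.0, 10_000)
prod_feature = rng.normal(0.3, 1.1, 10_000)
print(f"PSI = {psi(train_feature, prod_feature):.3f}")
# Common rule of thumb: < 0.1 stable, 0.1-0.25 moderate shift, > 0.25 major shift.
```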
- Model store, Feature store, evaluation store
- Monitoring model performance in production - real-time, biased, delayed, and missing ground truth (the sketch after this bullet shows one pattern for the delayed-label case).
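When ground truth arrives late, one common pattern (an assumption here, not taken from any specific link above) is to log every prediction with a key and a timestamp, join labels as they land, and compute metrics per time window, while watching the score distribution as a proxy in the meantime. A minimal pandas sketch with hypothetical column names:

```python
import pandas as pd

# Prediction log written at serving time (request_id, ts, score are hypothetical names).
preds = pd.DataFrame({
    "request_id": [1, 2, 3, 4],
    "ts": pd.to_datetime(["2024-01-01", "2024-01-01", "2024-01-08", "2024-01-08"]),
    "score": [0.9, 0.2, 0.7, 0.4],
})

# Labels arrive later, keyed by the same request_id; some may never arrive.
labels = pd.DataFrame({"request_id": [1, 2, 3], "label": [1, 0, 0]})

joined = preds.merge(labels, on="request_id", how="left")

# Delayed, per-week accuracy on the rows whose labels have landed.
labeled = joined.dropna(subset=["label"])
weekly_acc = (
    labeled.assign(correct=(labeled["score"] > 0.5) == (labeled["label"] == 1))
    .groupby(pd.Grouper(key="ts", freq="W"))["correct"]
    .mean()
)
print(weekly_acc)

# Proxy signal while labels are still missing: track the score distribution per week.
weekly_score = joined.groupby(pd.Grouper(key="ts", freq="W"))["score"].mean()
print(weekly_score)
```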
- Some advice on Medium: relabel using the latest model (can we even trust it?), then retrain.
- Adversarial Validation Approach to Concept Drift Problem in User Targeting Automation Systems at Uber - Previous research on concept drift mostly proposed model retraining after observing performance decreases. However, this approach is suboptimal because the system fixes the problem only after suffering from poor performance on new data. Here, we introduce an adversarial validation approach to concept drift problems in user targeting automation systems. With our approach, the system detects concept drift in new data before making inference, trains a model, and produces predictions adapted to the new data.
- Drift estimator between datasets using a random forest; the formula is in the Medium article above and the code is in MLBox (a sketch of the idea follows this bullet).
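A minimal sketch of the adversarial-validation idea behind the two bullets above: label reference rows 0 and new/production rows 1, train a classifier (a random forest here) to separate them, and read the cross-validated AUC as the drift signal. AUC near 0.5 means the sets are indistinguishable; the 2 * (AUC - 0.5) normalization is a common convention and an assumption here, not necessarily the exact formula used in the article or in MLBox.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def drift_score(reference: np.ndarray, production: np.ndarray, seed: int = 0) -> float:
    """Adversarial validation: how well can a classifier tell the two samples apart?"""
    X = np.vstack([reference, production])
    y = np.concatenate([np.zeros(len(reference)), np.ones(len(production))])

    clf = RandomForestClassifier(n_estimators=100, random_state=seed)
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()

    # AUC ~ 0.5 -> no detectable drift; AUC ~ 1.0 -> the sets are trivially separable.
    return max(0.0, 2.0 * (auc - 0.5))

rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, size=(2000, 5))
production = reference.copy()
production[:, 0] += 0.5          # shift one feature to simulate drift
print(drift_score(reference, production))
```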
- Alibi-detect - an open-source Python library focused on outlier, adversarial, and drift detection, by Seldon (usage sketch below).
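And a minimal Alibi-detect sketch, assuming its Kolmogorov-Smirnov drift detector (`KSDrift`) and that the API still looks like this; check the current docs before relying on it.

```python
import numpy as np
from alibi_detect.cd import KSDrift

rng = np.random.default_rng(0)
x_ref = rng.normal(0.0, 1.0, size=(1000, 5)).astype(np.float32)   # reference (training) window
x_prod = rng.normal(0.3, 1.0, size=(1000, 5)).astype(np.float32)  # production window with a shift

# Feature-wise Kolmogorov-Smirnov tests with a multiple-testing-corrected p-value threshold.
detector = KSDrift(x_ref, p_val=0.05)
result = detector.predict(x_prod)

print(result["data"]["is_drift"])  # 1 if drift was detected, 0 otherwise
print(result["data"]["p_val"])     # per-feature p-values
```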
- State of MLOps (by me), Medium article, open-source Airtable.
- Neptune.ai MLOps tools landscape
- Twimlai ML/AI solutions
- Ambiata on how to choose the best MLOps tools
- LakeFS on the state of data engineering - includes monitoring and observability
- The NLP Pandect - MLOps for NLP
- ml-ops.org
- Awesome production ML