Commit a7f7404

Merge branch 'master' into dev
2 parents 81fe63a + 4d6992c

1 file changed: +7 -7 lines changed


README.md

Lines changed: 7 additions & 7 deletions
@@ -1,4 +1,4 @@
-# δMG: Generic Differentiable Modeling Framework
+# 𝛿MG: Generic Differentiable Modeling Framework
 
 [![Python](https://img.shields.io/badge/python-3.12%20%7C%203.13-blue)](https://www.python.org/downloads/)
 [![PyTorch](https://img.shields.io/badge/PyTorch-2.7.0-EE4C2C?logo=pytorch)](https://pytorch.org/)

@@ -10,19 +10,19 @@
 
 ---
 
-A generic framework for building [differentiable models](https://www.nature.com/articles/s43017-023-00450-9). dMG enables seamless coupling of neural networks with differentiable process-based equations, leveraging PyTorch's auto-differentiation for efficient, large-scale optimization on GPU. The spiritual successor to [HydroDL](https://github.com/zhonghu17/HydroDL), dMG generalizes differentiable modeling for cross-domain application while also imposing basic standardizations for research-to-operations pipelines.
+A generic framework for building [differentiable models](https://www.nature.com/articles/s43017-023-00450-9). 𝛿MG enables seamless coupling of neural networks with differentiable process-based equations, leveraging PyTorch's auto-differentiation for efficient, large-scale optimization on GPU. The spiritual successor to [HydroDL](https://github.com/zhonghu17/HydroDL), 𝛿MG generalizes differentiable modeling for cross-domain application while also imposing basic standardizations for research-to-operations pipelines.
 
 ### Key Features
 
 - 🤝 **Hybrid Modeling**: Combine neural networks with process-based equations for enhanced interpretability and generalizability. Instead of manual model parameter calibration, for instance, use neural networks to directly learn robust and interpretable parameters ([Tsai et al., 2021](https://doi.org/10.1038/s41467-021-26107-z)).
 - 🔁 **PyTorch Integration**: Scale with PyTorch for efficient training and compatibility with modern ML tools and numerical solvers.
 - 🧩 **Modular Plugin Architecture**: Swap in domain-specific components and configurations with ease.
--**Benchmarking**: All in one place. dMG + hydrodl2 will enable rapid deployment and replication of key published MHPI results.
-- 🌊 **NextGen-ready**: Designed for [CSDMS BMI](https://csdms.colorado.edu/wiki/BMI) compliance to support differentiable hydrological models in [NOAA-OWP](https://water.noaa.gov/about/owp)'s [NextGen National Water Modeling Framework](https://github.com/NOAA-OWP/ngen). (See the NextGen-ready [𝛿HBV2.0](https://github.com/mhpi/dHBV2.0) for an example with a dMG-supported BMI).
+-**Benchmarking**: All in one place. 𝛿MG + hydroDL2 will enable rapid deployment and replication of key published MHPI results.
+- 🌊 **NextGen-ready**: Designed for [CSDMS BMI](https://csdms.colorado.edu/wiki/BMI) compliance to support differentiable hydrological models in [NOAA-OWP](https://water.noaa.gov/about/owp)'s [NextGen National Water Modeling Framework](https://github.com/NOAA-OWP/ngen). See the NextGen-ready [𝛿HBV2.0](https://github.com/mhpi/dHBV2.0) with 𝛿MG-supported BMI for an example.
 
 </br>
 
-dMG is designed to scale with modern deep learning tools (e.g., foundation models) while maintaining physical interpretability. Our peer-reviewed and published [benchmarks](https://mhpi.github.io/benchmarks/#10-year-training-comparison) show that well-tuned differentiable models can match deep networks in performance—while better extrapolating to extreme or data-scarce conditions and predicting physically meaningful variables.
+𝛿MG is designed to scale with modern deep learning tools (e.g., foundation models) while maintaining physical interpretability. Our peer-reviewed and published [benchmarks](https://mhpi.github.io/benchmarks/#10-year-training-comparison) show that well-tuned differentiable models can match deep networks in performance—while better extrapolating to extreme or data-scarce conditions and predicting physically meaningful variables.
 
 Differentiable modeling introduces more modeling choices than traditional deep learning due to its physical constraints. This includes learning parameters, missing process representations, corrections, or other enhancements for physical models.

@@ -38,7 +38,7 @@ This work is mantained by [MHPI](http://water.engr.psu.edu/shen/) and advised by
 
 ## Installation
 
-To install dMG, clone the repo and install in developer mode with [Astral UV](https://docs.astral.sh/uv/):
+To install 𝛿MG, clone the repo and install in developer mode with [Astral UV](https://docs.astral.sh/uv/):
 
 ```bash
 git clone git@github.com:mhpi/generic_deltamodel.git
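
Note: the hunk above is cut off right after the opening of the README's bash block. A minimal sketch of how the install sequence likely continues, assuming a standard UV editable ("developer mode") install; the `cd` and `uv pip install -e .` steps are assumptions and are not shown in this diff:

```bash
# Sketch only: clone the repo, then install it in editable/developer mode with Astral UV.
git clone git@github.com:mhpi/generic_deltamodel.git
cd generic_deltamodel
uv pip install -e .   # assumed developer-mode (editable) install step
```
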
@@ -137,7 +137,7 @@ Currently in development. Find more details and results in [Aboelyazeed et al. (
 
 </br>
 
-## dMG Architecture
+## 𝛿MG Architecture
 
 - **Data Loaders**: Bulk data preprocessors customized per dataset.
 - **Data Samplers**: Dataset samplers for minibatching during model training and inference.
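
The coupling described at README line 13 (neural networks feeding parameters into differentiable process-based equations, trained end to end through PyTorch autograd) can be illustrated with a short, generic PyTorch sketch. This is not 𝛿MG's API; the process equation, tensor names (`attributes`, `forcing`, `observed`), and shapes below are toy assumptions used only to show the parameter-learning idea from Tsai et al. (2021):

```python
import torch
import torch.nn as nn

# Toy differentiable "process equation": a linear-reservoir bucket whose
# release coefficient k is predicted by a neural network from static attributes.
def process_model(k: torch.Tensor, forcing: torch.Tensor) -> torch.Tensor:
    # k: (batch, 1) learned parameter in (0, 1); forcing: (batch, time)
    storage = torch.zeros(forcing.shape[0])
    fluxes = []
    for t in range(forcing.shape[1]):
        storage = storage + forcing[:, t]
        flux = k.squeeze(-1) * storage   # differentiable w.r.t. k
        storage = storage - flux
        fluxes.append(flux)
    return torch.stack(fluxes, dim=1)

# Small network that maps (hypothetical) site attributes to the process parameter.
param_nn = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
optimizer = torch.optim.Adam(param_nn.parameters(), lr=1e-3)

attributes = torch.randn(16, 8)   # hypothetical static attributes (16 sites, 8 features)
forcing = torch.rand(16, 50)      # hypothetical forcing time series
observed = torch.rand(16, 50)     # hypothetical observations to fit

for _ in range(10):
    k = param_nn(attributes)                 # network proposes physical parameters
    simulated = process_model(k, forcing)    # run the differentiable process model
    loss = torch.mean((simulated - observed) ** 2)
    optimizer.zero_grad()
    loss.backward()   # gradients flow through the process equation into the network
    optimizer.step()
```

The point of the sketch is only that the loss gradient flows through the process equation into the network's weights, which is what lets physical parameters be learned rather than manually calibrated.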
