Commit 4d6992c

Update README.md (#53)

Updating symbology for dMG.

1 parent: a1dc259

1 file changed: README.md (+33, −33)
````diff
@@ -1,4 +1,4 @@
-# dMG: Generic Differentiable Modeling Framework
+# 𝛿MG: Generic Differentiable Modeling Framework

 [![Python](https://img.shields.io/badge/python-3.12%20%7C%203.13-blue)](https://www.python.org/downloads/)
 [![PyTorch](https://img.shields.io/badge/PyTorch-2.7.0-EE4C2C?logo=pytorch)](https://pytorch.org/)
````
````diff
@@ -10,19 +10,19 @@

 ---

-A generic framework for building [differentiable models](https://www.nature.com/articles/s43017-023-00450-9). dMG enables seamless coupling of neural networks with differentiable process-based equations, leveraging PyTorch's auto-differentiation for efficient, large-scale optimization on GPU. The spiritual successor to [HydroDL](https://github.com/zhonghu17/HydroDL), dMG generalizes differentiable modeling for cross-domain application while also imposing basic standardizations for research-to-operations pipelines.
+A generic framework for building [differentiable models](https://www.nature.com/articles/s43017-023-00450-9). 𝛿MG enables seamless coupling of neural networks with differentiable process-based equations, leveraging PyTorch's auto-differentiation for efficient, large-scale optimization on GPU. The spiritual successor to [HydroDL](https://github.com/zhonghu17/HydroDL), 𝛿MG generalizes differentiable modeling for cross-domain application while also imposing basic standardizations for research-to-operations pipelines.

 ### Key Features

 - 🤝 **Hybrid Modeling**: Combine neural networks with process-based equations for enhanced interpretability and generalizability. Instead of manual model parameter calibration, for instance, use neural networks to directly learn robust and interpretable parameters ([Tsai et al., 2021](https://doi.org/10.1038/s41467-021-26107-z)).
 - 🔁 **PyTorch Integration**: Scale with PyTorch for efficient training and compatibility with modern ML tools and numerical solvers.
 - 🧩 **Modular Plugin Architecture**: Swap in domain-specific components and configurations with ease.
--**Benchmarking**: All in one place. dMG + hydroDL2 will enable rapid deployment and replication of key published MHPI results.
-- 🌊 **NextGen-ready**: Designed for [CSDMS BMI](https://csdms.colorado.edu/wiki/BMI) compliance to support differentiable hydrological models in [NOAA-OWP](https://water.noaa.gov/about/owp)'s [NextGen National Water Modeling Framework](https://github.com/NOAA-OWP/ngen). (See the NextGen-ready [𝛿HBV2.0](https://github.com/mhpi/dHBV2.0) for an example with a dMG-supported BMI).
+-**Benchmarking**: All in one place. 𝛿MG + hydroDL2 will enable rapid deployment and replication of key published MHPI results.
+- 🌊 **NextGen-ready**: Designed for [CSDMS BMI](https://csdms.colorado.edu/wiki/BMI) compliance to support differentiable hydrological models in [NOAA-OWP](https://water.noaa.gov/about/owp)'s [NextGen National Water Modeling Framework](https://github.com/NOAA-OWP/ngen). See the NextGen-ready [𝛿HBV2.0](https://github.com/mhpi/dHBV2.0) with 𝛿MG-supported BMI for an example.

 </br>

-dMG is designed to scale with modern deep learning tools (e.g., foundation models) while maintaining physical interpretability. Our peer-reviewed and published [benchmarks](https://mhpi.github.io/benchmarks/#10-year-training-comparison) show that well-tuned differentiable models can match deep networks in performance—while better extrapolating to extreme or data-scarce conditions and predicting physically meaningful variables.
+𝛿MG is designed to scale with modern deep learning tools (e.g., foundation models) while maintaining physical interpretability. Our peer-reviewed and published [benchmarks](https://mhpi.github.io/benchmarks/#10-year-training-comparison) show that well-tuned differentiable models can match deep networks in performance—while better extrapolating to extreme or data-scarce conditions and predicting physically meaningful variables.

 Differentiable modeling introduces more modeling choices than traditional deep learning due to its physical constraints. This includes learning parameters, missing process representations, corrections, or other enhancements for physical models.

````
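The hybrid-modeling idea described in the diff above (a neural network learning parameters of a differentiable process equation, trained end to end via PyTorch autodiff, as in Tsai et al., 2021) can be illustrated with a minimal toy sketch. This is not 𝛿MG's actual API; every name here (`process_model`, `param_net`) is hypothetical.

```python
# Hypothetical toy sketch (not dMG's API): a neural network learns the
# parameter k of a differentiable process equation end to end via autodiff.
import torch
import torch.nn as nn

torch.manual_seed(0)

def process_model(storage, k, dt=1.0):
    """One explicit-Euler step of linear storage decay: dS/dt = -k * S."""
    return storage - k * storage * dt

# Neural network maps (synthetic) basin attributes to a parameter k in (0, 1).
param_net = nn.Sequential(nn.Linear(3, 8), nn.Tanh(), nn.Linear(8, 1), nn.Sigmoid())

attrs = torch.randn(32, 3)            # synthetic basin attributes
storage0 = torch.rand(32, 1) + 1.0    # initial storage states
obs = process_model(storage0, torch.tensor(0.3))  # synthetic observations (true k = 0.3)

opt = torch.optim.Adam(param_net.parameters(), lr=0.01)
for _ in range(500):
    k = param_net(attrs)
    sim = process_model(storage0, k)
    loss = ((sim - obs) ** 2).mean()
    opt.zero_grad()
    loss.backward()  # gradients flow through the process equation into the network
    opt.step()

print(loss.item(), k.mean().item())  # loss near 0; learned k near the true 0.3
```

The same pattern, scaled up to full process models and large datasets, is what the framework generalizes.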
````diff
@@ -38,7 +38,7 @@ This work is mantained by [MHPI](http://water.engr.psu.edu/shen/) and advised by

 ## Installation

-To install dMG, clone the repo and install in developer mode with [Astral UV](https://docs.astral.sh/uv/):
+To install 𝛿MG, clone the repo and install in developer mode with [Astral UV](https://docs.astral.sh/uv/):

 ```bash
 git clone git@github.com:mhpi/generic_deltamodel.git
````
````diff
@@ -137,7 +137,7 @@ Currently in development. Find more details and results in [Aboelyazeed et al. (

 </br>

-## dMG Architecture
+## 𝛿MG Architecture

 - **Data Loaders**: Bulk data preprocessors customized per dataset.
 - **Data Samplers**: Dataset samplers for minibatching during model training and inference.
````
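The component list above reflects the modular plugin architecture; a minimal registry sketch (hypothetical, not 𝛿MG's actual code) shows the usual mechanism by which named components such as data loaders are swapped via configuration.

```python
# Hypothetical sketch (not dMG's actual code): a name -> class registry,
# the common mechanism behind "swap components via configuration".
class Registry:
    def __init__(self):
        self._components = {}

    def register(self, name):
        def wrap(cls):
            self._components[name] = cls
            return cls
        return wrap

    def build(self, name, **kwargs):
        return self._components[name](**kwargs)

data_loaders = Registry()

@data_loaders.register("csv")
class CsvLoader:
    """Toy data loader selected by the config key 'csv'."""
    def __init__(self, path):
        self.path = path

# A config like {"data_loader": "csv", "path": "basins.csv"} then resolves to:
loader = data_loaders.build("csv", path="basins.csv")
print(type(loader).__name__)  # prints: CsvLoader
```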
````diff
@@ -149,32 +149,32 @@ Currently in development. Find more details and results in [Aboelyazeed et al. (

 ## Repo

-```text
-.
-├── src/dMG/
-│   ├── __main__.py            # Runs dMG; models, experiments
-│   ├── core/
-│   │   ├── calc/              # Calculation utilities
-│   │   ├── data/              # Data loaders and samplers
-│   │   ├── post/              # Post-processing utilities; plotting
-│   │   └── utils/             # Helper functions
-│   ├── models/
-│   │   ├── criterion          # Loss functions
-│   │   ├── delta_models       # Differentiable model modalities
-│   │   ├── multimodels        # Multimodeling processors
-│   │   ├── neural_networks/   # Neural network architectures
-│   │   ├── phy_models/        # Physical Models
-│   │   └── model_handler.py   # High-level model manager
-│   └── trainers/              # Model training routines
-├── conf/
-│   ├── hydra/                 # Hydra settings
-│   ├── observations/          # Observation configuration files
-│   ├── config.py              # Configuration validator
-│   └── default.yaml           # Default master configuration file
-├── docs/
-├── envs/                      # Python ENV configurations
-└── example/                   # Tutorials
-```
+```text
+.
+├── src/dMG/
+│   ├── __main__.py            # Runs 𝛿MG; models, experiments
+│   ├── core/
+│   │   ├── calc/              # Calculation utilities
+│   │   ├── data/              # Data loaders and samplers
+│   │   ├── post/              # Post-processing utilities; plotting
+│   │   └── utils/             # Helper functions
+│   ├── models/
+│   │   ├── criterion          # Loss functions
+│   │   ├── delta_models       # Differentiable model modalities
+│   │   ├── multimodels        # Multimodeling processors
+│   │   ├── neural_networks/   # Neural network architectures
+│   │   ├── phy_models/        # Physical Models
+│   │   └── model_handler.py   # High-level model manager
+│   └── trainers/              # Model training routines
+├── conf/
+│   ├── hydra/                 # Hydra settings
+│   ├── observations/          # Observation configuration files
+│   ├── config.py              # Configuration validator
+│   └── default.yaml           # Default master configuration file
+├── docs/
+├── envs/                      # Python ENV configurations
+└── example/                   # Tutorials
+```

 </br>

````