Pip and Conda are also supported, though UV is recommended. See [setup](./docs/setup.md) for further details.
@@ -55,49 +55,49 @@ See [how to run](./docs/how_to_run.md).
**Example -- Differentiable Parameter Learning**: Use an LSTM to learn parameters for the [HBV](https://en.wikipedia.org/wiki/HBV_hydrology_model) hydrological model.
This exposes a key characteristic of the differentiable model `DplModel`: composition of a physical model, `phy_model`, and a neural network, `nn`. Internally, `DplModel` looks like
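The snippet below is a minimal sketch of that composition, assuming PyTorch and an illustrative input key (`'xc_nn_norm'`); consult the repository's `DplModel` for the actual interface.

```python
import torch

class DplModel(torch.nn.Module):
    """Minimal sketch of the differentiable parameter learning composition.
    Names and the forward signature are illustrative, not dMG's exact API."""

    def __init__(self, phy_model: torch.nn.Module, nn: torch.nn.Module) -> None:
        super().__init__()
        self.phy_model = phy_model  # process-based model, e.g., HBV
        self.nn = nn                # parameterization network, e.g., an LSTM

    def forward(self, data_dict: dict) -> dict:
        # The NN maps (normalized) forcings/attributes to physical parameters...
        parameters = self.nn(data_dict['xc_nn_norm'])
        # ...and the physical model runs with those learned parameters, so
        # gradients flow back through the physics to the NN during training.
        predictions = self.phy_model(data_dict, parameters)
        return predictions
```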
-Check out [examples](https://github.com/mhpi/generic_deltamodel/tree/master/example/hydrology) to see model training/testing/simulation in detail. We recommend starting [here](./example/hydrology/example_dhbv_1_0.ipynb), which is a continuation of the above. A [Colab Notebook](https://colab.research.google.com/drive/19PRLrI-L7cGeYzkk2tOetULzQK8s_W7v?usp=sharing) for this δHBV example is also available.
+Check out [examples](https://github.com/mhpi/generic_deltamodel/tree/master/example/hydrology) to see model training/testing/simulation in detail. We recommend starting with the [δHBV 1.0 tutorial](./example/hydrology/example_dhbv_1_0.ipynb), which can also be run in a [Colab Notebook](https://colab.research.google.com/drive/19PRLrI-L7cGeYzkk2tOetULzQK8s_W7v?usp=sharing) to leverage online compute.
</br>
@@ -129,7 +129,7 @@ Currently in development. Find more details and results in [Aboelyazeed et al. (
## Ecosystem Integration
--**HydroDL 2.0 ([`hydroDL2`](https://github.com/mhpi/hydroDL2))**: Home to MHPI's suite of process-based hydrology models and differentiable model augmentations.
+-**HydroDL 2.0 ([`hydrodl2`](https://github.com/mhpi/hydrodl2))**: Home to MHPI's suite of process-based hydrology models and differentiable model augmentations.
<!-- - **HydroData ([`hydro_data_dev`](https://github.com/mhpi/hydro_data_dev))**: Data extraction, processing, and management tools optimized for geospatial datasets. (In development) -->
<!-- - **Config GUI ([`GUI-Config-builder`](https://mhpi-spatial.s3.us-east-2.amazonaws.com/mhpi-release/config_builder_gui/Config+Builder+GUI.zip))([Source](https://github.com/mhpi/GUI-Config-builder))**: An intuitive, user-friendly tool designed to simplify the creation and editing of configuration files for model setup and development. -->
-**Differentiable Ecosystem Modeling ([`diffEcosys (dev version only)`](https://github.com/hydroPKDN/diffEcosys/))**: A physics-informed machine learning system for ecosystem modeling, demonstrated using the photosynthesis process representation within the Functionally Assembled Terrestrial Ecosystem Simulator (FATES) model. This model is coupled to NNs that learn parameters from observations of photosynthesis rates.
@@ -151,8 +151,8 @@ Currently in development. Find more details and results in [Aboelyazeed et al. (

docs/configuration_files.md: 1 addition & 1 deletion

@@ -57,7 +57,7 @@ If you wish to use additional configuration files to store distinguished setting
Configuration file management is handled by the Hydra config manager (see above). Essentially, at the start of a model experiment, Hydra loads the configs into a single Python dictionary containing all settings, which can be accessed throughout the framework.
-You can see this demonstrated in the main dMG run file, `./generic_deltamodel/src/dMG/__main__.py`, where the decorator is called at the start of the main function.
+You can see this demonstrated in the main dMG run file, `./generic_deltamodel/src/dmg/__main__.py`, where the decorator is called at the start of the main function.
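For orientation, a Hydra-decorated entry point generally looks like the sketch below; the config path, config name, and the keys accessed are placeholder assumptions rather than dMG's exact code.

```python
import hydra
from omegaconf import DictConfig, OmegaConf

# Placeholder config_path/config_name -- the real values live in dMG's __main__.py.
@hydra.main(version_base=None, config_path="conf", config_name="config")
def main(config: DictConfig) -> None:
    # Hydra has already merged the YAML files into one nested config object here.
    config_dict = OmegaConf.to_container(config, resolve=True)  # plain Python dict
    print(config_dict.keys())  # every setting is now accessible framework-wide

if __name__ == "__main__":
    main()
```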

docs/configuration_glossary.md: 6 additions & 6 deletions

@@ -50,11 +50,11 @@ The settings are broken down as they appear in the YAML configuration files, wit
**gpu_id**: [0] If `device = cuda`, the index of the GPU in your system to run models on. Index 0 will always be available.
-**data_loader**: Class name of the data loader to use. E.g., *HydroLoader* located in `./generic_deltamodel/src/dMG/core/data/loaders/hydro_loader.py`. Note the class name must be CamelCase (no spaces) and correspond to the file name.
+**data_loader**: Class name of the data loader to use. E.g., *HydroLoader* located in `./generic_deltamodel/src/dmg/core/data/loaders/hydro_loader.py`. Note the class name must be CamelCase (no spaces) and correspond to the file name.
-**data_sampler**: Class name of the data sampler used in training/inference. E.g., *HydroSampler* located in `./generic_deltamodel/src/dMG/core/data/samplers/hydro_sampler.py`. Follows same convention as data_loader.
+**data_sampler**: Class name of the data sampler used in training/inference. E.g., *HydroSampler* located in `./generic_deltamodel/src/dmg/core/data/samplers/hydro_sampler.py`. Follows same convention as data_loader.
56
56
57
-**trainer**: Class name of the trainer used in training/inference. E.g., *Trainer* located in `./generic_deltamodel/src/dMG/trainers/trainer.py`. Follows same convention as data_loader.
+**trainer**: Class name of the trainer used in training/inference. E.g., *Trainer* located in `./generic_deltamodel/src/dmg/trainers/trainer.py`. Follows same convention as data_loader.
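To make the Class-File convention concrete, here is a hypothetical custom sampler: the CamelCase class name `MySampler` matches the snake_case file name `my_sampler.py`, so setting `data_sampler: MySampler` would resolve to it. The constructor and method shown are placeholders, not dMG's actual sampler interface.

```python
# Hypothetical file: ./generic_deltamodel/src/dmg/core/data/samplers/my_sampler.py
class MySampler:
    """Illustrative custom sampler following the Class-File naming convention."""

    def __init__(self, config: dict) -> None:
        self.config = config

    def get_batch(self, dataset: dict, indices) -> dict:
        # Placeholder method; a real dMG sampler implements the framework's
        # expected sampling interface.
        return {key: value[indices] for key, value in dataset.items()}
```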
58
58
59
59
**save_path**: [./output] Root path to which model weights, outputs, statistics, and metrics will be saved.
@@ -116,7 +116,7 @@ The settings are broken down as they appear in the YAML configuration files, wit
**loss_function**:
--**model**: [KgeBatchLoss, KgeNormBatchLoss, MseLoss, NseBatchLoss, NseSqrtBatchLoss, RmseCombLoss, RmseLoss] Loss function for training. See `./generic_deltamodel/src/dMG/models/criterion/` for all available loss functions. You can add custom criteria, but they must follow the Class-File convention as illustrated for `data_loader`, etc.
+-**model**: [KgeBatchLoss, KgeNormBatchLoss, MseLoss, NseBatchLoss, NseSqrtBatchLoss, RmseCombLoss, RmseLoss] Loss function for training. See `./generic_deltamodel/src/dmg/models/criterion/` for all available loss functions. You can add custom criteria, but they must follow the Class-File convention as illustrated for `data_loader`, etc.
120
120
121
121
</br>
@@ -132,7 +132,7 @@ The settings are broken down as they appear in the YAML configuration files, wit
-`HBV_adj`: δHBV with adjoint method.
-`HBV_1_1p`: δHBV 1.1p
-`HBV_2_0`: δHBV 2.0
--`custom_model`: If you create and add a physical model to [phy_model/](../src/dMG/models/phy_models/), this will be the class name. Note it must follow the Class-File convention as illustrated for `data_loader`, etc.
+-`custom_model`: If you create and add a physical model to [phy_model/](../src/dmg/models/phy_models/), this will be the class name. Note it must follow the Class-File convention as illustrated for `data_loader`, etc.
-*nmul*: Number of parallel parameter sets to use. These will be averaged to produce a single physical model output.
@@ -214,7 +214,7 @@ The settings are broken down as they appear in the YAML configuration files, wit
-*scaling_function*: [sigmoid, softmax] Method to use for scaling learned weights.
--*loss_function*: [KgeBatchLoss, KgeNormBatchLoss, MseLoss, NseBatchLoss, NseSqrtBatchLoss, RmseCombLoss, RmseLoss] Loss function for training. See `./generic_deltamodel/src/dMG/models/criterion/` for all available loss functions. You can add custom criteria, but they must follow the Class-File convention as illustrated for `data_loader`, etc.
+-*loss_function*: [KgeBatchLoss, KgeNormBatchLoss, MseLoss, NseBatchLoss, NseSqrtBatchLoss, RmseCombLoss, RmseLoss] Loss function for training. See `./generic_deltamodel/src/dmg/models/criterion/` for all available loss functions. You can add custom criteria, but they must follow the Class-File convention as illustrated for `data_loader`, etc.
218
218
219
219
-*use_rb_loss*: [bool] If True, include range-bound loss regularization. Penalize learned weights when their sum exceeds specific bounds.
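As a rough illustration of the range-bound idea (not dMG's actual criterion, whose bounds and reduction may differ), a penalty of this kind can be written as:

```python
import torch

def range_bound_penalty(weights: torch.Tensor,
                        lower: float = 0.9,
                        upper: float = 1.1) -> torch.Tensor:
    """Toy range-bound regularizer: penalize the per-sample sum of learned
    model weights when it falls outside [lower, upper]. Bounds are assumed."""
    total = weights.sum(dim=-1)
    over = torch.relu(total - upper)   # > 0 only when the sum exceeds the upper bound
    under = torch.relu(lower - total)  # > 0 only when the sum falls below the lower bound
    return (over + under).mean()
```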