diff --git a/README.md b/README.md index e6ec94fa..19a0bf5d 100644 --- a/README.md +++ b/README.md @@ -11,12 +11,11 @@ Machine Learning Collective Variables for Enhanced Sampling **PAPER** [![paper](https://img.shields.io/badge/JCP-10.1063/5.0156343-blue)](https://doi.org/10.1063/5.0156343) [![preprint](https://img.shields.io/badge/arXiv-2305.19980-lightblue)](https://arxiv.org/abs/2305.19980)
-The documentation is available at: -- **stable** version: https://mlcolvar.readthedocs.io -- **latest** version: https://mlcolvar.readthedocs.io/en/latest/ - +--- --- +## Overview + `mlcolvar` is a Python library aimed to help design data-driven collective-variables (CVs) for enhanced sampling simulations. The key features are: 1. A unified framework to help test and use (some) of the CVs proposed in the literature. @@ -26,26 +25,28 @@ The documentation is available at: The library is built upon the [PyTorch](https://pytorch.org/) ML library as well as the [Lightning](https://lightning.ai/) high-level framework. --- +---
-Some of the **CVs** which are implemented, organized by learning setting: -- _Unsupervised_: PCA, (Variational) AutoEncoders [[1](http://dx.doi.org/%2010.1002/jcc.25520),[2](http://dx.doi.org/%2010.1021/acs.jctc.1c00415)] -- _Supervised_: LDA [[3](http://dx.doi.org/10.1021/acs.jpclett.8b00733)], DeepLDA [[4](http://dx.doi.org/%2010.1021/acs.jpclett.0c00535)], DeepTDA [[5](http://dx.doi.org/%2010.1021/acs.jpclett.1c02317)] -- _Time-informed_: TICA [[6](http://dx.doi.org/%2010.1063/1.4811489)], DeepTICA/SRVs [[7](http://dx.doi.org/10.1073/pnas.2113533118),[8](http://dx.doi.org/%2010.1063/1.5092521)], VDE [[9](http://dx.doi.org/10.1103/PhysRevE.97.062412)] -And many others can be implemented based on the building blocks or with simple modifications. Check out the [tutorials](https://mlcolvar.readthedocs.io/en/stable/tutorials.html) and the [examples](https://mlcolvar.readthedocs.io/en/stable/examples.html) section of the documentation.
+## Documentation +The documentation is available at: +- **stable** version: https://mlcolvar.readthedocs.io +- **latest** version: https://mlcolvar.readthedocs.io/en/latest/ +--- --- +## Installation
-**Install with `pip`** +**1. Install latest stable version with `pip`**
-The library is available on [PyPi](https://pypi.org/project/mlcolvar/) and can be installed with `pip`. This is the preferred choice for **users** as it automatically installs the package requirements. +The **latest stable version** of the library is available on [PyPi](https://pypi.org/project/mlcolvar/) and can be installed with `pip`. This is the preferred choice for **users** as it automatically installs the package requirements. ```bash pip install mlcolvar ```
-**Clone from GitHub** +**2. Clone repository from GitHub**
The library can also be installed by cloning the repository from GitHub. This is the preferred choice for **developers** as it provides more flexibility and allows editable installation.
@@ -55,16 +56,52 @@ cd mlcolvar pip install -e .
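# The graph-based CVs added in this PR rely on extra packages listed in
# devtools/conda-envs/test_env.yaml (pyg, mdtraj, matscipy). A possible way to get
# them with pip, assuming the PyPI names torch_geometric, mdtraj and matscipy, is:
pip install torch_geometric mdtraj matscipy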
``` +--- +--- + +## CV methods + +Some of the **CVs** which are implemented, organized by learning setting:
+- _Unsupervised_: PCA, (Variational) AutoEncoders [[1](http://dx.doi.org/%2010.1002/jcc.25520),[2](http://dx.doi.org/%2010.1021/acs.jctc.1c00415)]
+- _Supervised_: LDA [[3](http://dx.doi.org/10.1021/acs.jpclett.8b00733)], DeepLDA [[4](http://dx.doi.org/%2010.1021/acs.jpclett.0c00535)], DeepTDA [[5](http://dx.doi.org/%2010.1021/acs.jpclett.1c02317)]
+- _Time-informed_: TICA [[6](http://dx.doi.org/%2010.1063/1.4811489)], DeepTICA/SRVs [[7](http://dx.doi.org/10.1073/pnas.2113533118),[8](http://dx.doi.org/%2010.1063/1.5092521)], VDE [[9](http://dx.doi.org/10.1103/PhysRevE.97.062412)]
+- _Committor-based_ [[10](https://doi.org/10.1038/s43588-024-00645-0),[11](https://doi.org/10.1038/s43588-025-00799-5)]
+- _Multi-task_ [[12](https://doi.org/10.1063/5.0156343)]
+ +And many others can be implemented based on the building blocks or with simple modifications. Check out the [tutorials](https://mlcolvar.readthedocs.io/en/stable/tutorials.html) and the [examples](https://mlcolvar.readthedocs.io/en/stable/examples.html) sections of the documentation.
+ +--- +--- + +## Model architectures: feed-forward vs graph-based
+ +- **Feed-forward**: All the CV methods can be used with *standard* neural networks as the architecture, either feed-forward networks or autoencoders. +In this case, there are two possibilities for the inputs:
+ - Directly use precomputed physical descriptors, ideally obtained using PLUMED. This option is faster and covers most use cases.
+ - Compute the physical descriptors within the model, starting from the atomic positions, ideally obtained from PLUMED. This can be done by using, as a *preprocessing module*, the tools available in the **transform** module of the library, or by implementing your own descriptors. This option is typically slower and should be chosen, for example, if the desired descriptors are not already available in PLUMED.
+ +- **Graph neural networks**: All the CV methods **not based on autoencoders** can also be used with graph neural networks as the architecture, taking the **atomic positions** directly as inputs, following the scheme reported in [[JCTC 2024](https://doi.org/10.1021/acs.jctc.4c01197)]. In this case, the inputs are the atomic positions and species. +Note that, in general, feed-forward-based methods are faster than graph-based ones. --- +---
+ +### PLUMED interfaces
+ +The resulting CVs can be deployed for enhancing sampling with the [PLUMED](https://www.plumed.org/) plugin compiled with `libtorch`. In particular: -**PLUMED interface**: the resulting CVs can be deployed for enhancing sampling with the [PLUMED](https://www.plumed.org/) package via the [pytorch](https://www.plumed.org/doc-master/user-doc/html/PYTORCH_MODEL/) interface, available since version 2.9.
+- **Feed-forward-based** CV models can be employed via the [pytorch](https://www.plumed.org/doc-master/user-doc/html/PYTORCH_MODEL/) interface, available with the official release of PLUMED since version 2.9.
+ - Note: The transition-state-oriented Kolmogorov bias proposed in [[Nat.Comp.Sci. 2024](https://doi.org/10.1038/s43588-024-00645-0) and [2025](https://doi.org/10.1038/s43588-025-00799-5)] can be employed using the custom interface available at #TODO
+- **Graph-based** models can be employed using the custom interface developed in [[JCTC 2024](https://doi.org/10.1021/acs.jctc.4c01197)] available at #TODO.
+ - Note: This interface already supports the calculation of transition-state-oriented Kolmogorov bias proposed in [[Nat.Comp.Sci. 2024](https://doi.org/10.1038/s43588-024-00645-0) and [2025](https://doi.org/10.1038/s43588-025-00799-5)] --- +--- -**Notes**: in early versions (`v<=0.2.*`) the library was called `mlcvs`. This is still accessible for compatibility with PLUMED masterclasses in the [releases](https://github.com/luigibonati/mlcolvar/releases) or by cloning the `pre-lightning` branch. +## Notes +In early versions (`v<=0.2.*`) the library was called `mlcvs`. This is still accessible for compatibility with PLUMED masterclasses in the [releases](https://github.com/luigibonati/mlcolvar/releases) or by cloning the `pre-lightning` branch. +--- --- Copyright (c) 2023 Luigi Bonati, Enrico Trizio, Andrea Rizzi and Michele Parrinello. diff --git a/devtools/conda-envs/test_env.yaml b/devtools/conda-envs/test_env.yaml index ac8b28e6..077b6e45 100644 --- a/devtools/conda-envs/test_env.yaml +++ b/devtools/conda-envs/test_env.yaml @@ -3,6 +3,7 @@ channels: - pytorch - conda-forge + - pyg - defaults dependencies: @@ -24,9 +25,13 @@ dependencies: - matplotlib - scikit-learn - scipy - + - pyg + # Pip-only installs - pip: - KDEpy - nbmake + - mdtraj + - matscipy + diff --git a/docs/api_core.rst b/docs/api_core.rst index 75af705b..66366362 100644 --- a/docs/api_core.rst +++ b/docs/api_core.rst @@ -1,10 +1,14 @@ Core modules ------------- +============ These are the building blocks which are used to construct the CVs. -.. rubric:: NN +NN +-- +This module implements the architectures with learnable weights that can be used to build CV models. +Descriptors-based +^^^^^^^^^^^^^^^^ .. currentmodule:: mlcolvar.core.nn .. autosummary:: @@ -13,7 +17,31 @@ These are the building blocks which are used to construct the CVs. FeedForward -.. rubric:: Loss +Graphs-based +^^^^^^^^^^^^ +.. currentmodule:: mlcolvar.core.nn.graph + +Base class +"""""""""" +.. autosummary:: + :toctree: autosummary + :template: custom-class-template.rst + + BaseGNN + +Architectures +""""""""""""" +.. autosummary:: + :toctree: autosummary + :template: custom-class-template.rst + + SchNetModel + GVPModel + + +Loss +---- +This module implements the loss functions that can be used to optimize CV models. .. currentmodule:: mlcolvar.core.loss @@ -31,8 +59,13 @@ These are the building blocks which are used to construct the CVs. GeneratorLoss SmartDerivatives -.. rubric:: Stats +Stats +----- +This module implements statistical methods with learnable weights that can be used in CV models. + +Base class +^^^^^^^^^^ .. currentmodule:: mlcolvar.core.stats .. autosummary:: @@ -40,11 +73,26 @@ These are the building blocks which are used to construct the CVs. :template: custom-class-template.rst Stats + +Linear methods +^^^^^^^^^^^^^^ +.. currentmodule:: mlcolvar.core.stats + +.. autosummary:: + :toctree: autosummary + :template: custom-class-template.rst + PCA LDA TICA -.. rubric:: Transform + +Transform +--------- +This module implements **non-learnable** pre/postprocessing tools + +Base class +^^^^^^^^^^ .. currentmodule:: mlcolvar.core.transform @@ -55,7 +103,9 @@ These are the building blocks which are used to construct the CVs. Transform -.. rubric:: Transform.descriptors +Descriptors +^^^^^^^^^^^ +This submodule implements several descriptors that can be computed starting from atomic positions. .. currentmodule:: mlcolvar.core.transform.descriptors @@ -69,7 +119,9 @@ These are the building blocks which are used to construct the CVs. 
EigsAdjMat MultipleDescriptors -.. rubric:: Transform.tools +Tools +^^^^^ +This submodule implements pre/postporcessing tools. .. currentmodule:: mlcolvar.core.transform.tools diff --git a/docs/api_data.rst b/docs/api_data.rst index 671cff12..a3e6be5d 100644 --- a/docs/api_data.rst +++ b/docs/api_data.rst @@ -1,6 +1,9 @@ Data ---- +General: dataset, module and loader +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + .. currentmodule:: mlcolvar.data This module contains the classes used for handling datasets and for feeding them to the Lightning trainer. @@ -11,4 +14,19 @@ This module contains the classes used for handling datasets and for feeding them DictDataset DictLoader - DictModule \ No newline at end of file + DictModule + +Graph specific tools +^^^^^^^^^^^^^^^^^^^^ +.. currentmodule:: mlcolvar.data.graph + +This module contains the classes used for handling and creating graphs. + +.. autosummary:: + :toctree: autosummary + :template: custom-class-template.rst + + AtomicNumberTable + Configuration + get_neighborhood + create_dataset_from_configurations \ No newline at end of file diff --git a/docs/api_explain.rst b/docs/api_explain.rst index 91352802..23f514f0 100644 --- a/docs/api_explain.rst +++ b/docs/api_explain.rst @@ -1,12 +1,16 @@ Explain ------- -.. rubric:: Sensitivity analysis +Sensitivity analysis +^^^^^^^^^^^^^^^^^^^^ Perform sensitivity analysis to identify feature relevances .. currentmodule:: mlcolvar.explain.sensitivity +Descriptors-based +"""""""""""""""""" + .. autosummary:: :toctree: autosummary :template: custom-class-template.rst @@ -14,7 +18,19 @@ Perform sensitivity analysis to identify feature relevances sensitivity_analysis plot_sensitivity -.. rubric:: Sparse linear model +Graph-based +""""""""""" +.. currentmodule:: mlcolvar.explain.graph_sensitivity + +.. autosummary:: + :toctree: autosummary + :template: custom-class-template.rst + + graph_node_sensitivity + + +Sparse linear models +^^^^^^^^^^^^^^^^^^^^ Use sparse models to approximate classification/regression tasks diff --git a/docs/api_utils.rst b/docs/api_utils.rst index 1fe87a1c..d9e5c5e9 100644 --- a/docs/api_utils.rst +++ b/docs/api_utils.rst @@ -1,7 +1,9 @@ Utils ----- -.. rubric:: Input/Output + +Input/Output +^^^^^^^^^^^^ Helper functions for loading dataframes (incl. PLUMED files) and directly creating datasets from them. @@ -14,7 +16,9 @@ Helper functions for loading dataframes (incl. PLUMED files) and directly creati load_dataframe create_dataset_from_files -.. rubric:: Time-lagged datasets + +Time-lagged datasets +^^^^^^^^^^^^^^^^^^^^ Create a dataset of pairs of time-lagged configurations. @@ -26,31 +30,35 @@ Create a dataset of pairs of time-lagged configurations. create_timelagged_dataset -.. rubric:: FES - -.. rubric:: Trainer -Functions used in conjunction with the lightning Trainer (e.g. logging, metrics...). +FES +^^^ +Compute (and plot) the free energy surface along the CVs. -.. currentmodule:: mlcolvar.utils.trainer +.. currentmodule:: mlcolvar.utils.fes .. autosummary:: :toctree: autosummary :template: custom-class-template.rst - MetricsCallback + compute_fes -Compute (and plot) the free energy surface along the CVs. -.. currentmodule:: mlcolvar.utils.fes +Trainer +^^^^^^^ +Functions used in conjunction with the lightning Trainer (e.g. logging, metrics...). + +.. currentmodule:: mlcolvar.utils.trainer .. autosummary:: :toctree: autosummary :template: custom-class-template.rst - compute_fes + MetricsCallback + -Plotting utils +Plot +^^^^ .. 
currentmodule:: mlcolvar.utils.plot diff --git a/docs/autosummary/mlcolvar.core.nn.FeedForward.rst b/docs/autosummary/mlcolvar.core.nn.FeedForward.rst index 1078460e..5ed3efa2 100644 --- a/docs/autosummary/mlcolvar.core.nn.FeedForward.rst +++ b/docs/autosummary/mlcolvar.core.nn.FeedForward.rst @@ -17,6 +17,7 @@ .. autosummary:: ~FeedForward.__init__ + ~FeedForward.backward ~FeedForward.forward @@ -29,27 +30,9 @@ .. autosummary:: - ~FeedForward.CHECKPOINT_HYPER_PARAMS_KEY - ~FeedForward.CHECKPOINT_HYPER_PARAMS_NAME - ~FeedForward.CHECKPOINT_HYPER_PARAMS_TYPE ~FeedForward.T_destination - ~FeedForward.automatic_optimization ~FeedForward.call_super_init - ~FeedForward.current_epoch - ~FeedForward.device - ~FeedForward.dtype ~FeedForward.dump_patches - ~FeedForward.example_input_array - ~FeedForward.fabric - ~FeedForward.global_rank - ~FeedForward.global_step - ~FeedForward.hparams - ~FeedForward.hparams_initial - ~FeedForward.local_rank - ~FeedForward.logger - ~FeedForward.loggers - ~FeedForward.on_gpu - ~FeedForward.trainer ~FeedForward.training diff --git a/docs/autosummary/mlcolvar.core.nn.graph.BaseGNN.rst b/docs/autosummary/mlcolvar.core.nn.graph.BaseGNN.rst new file mode 100644 index 00000000..0d3f69da --- /dev/null +++ b/docs/autosummary/mlcolvar.core.nn.graph.BaseGNN.rst @@ -0,0 +1,41 @@ +mlcolvar.core.nn.graph.BaseGNN +============================== + +.. currentmodule:: mlcolvar.core.nn.graph + +.. autoclass:: BaseGNN + :members: + :show-inheritance: + :inherited-members: Module,LightningModule + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~BaseGNN.__init__ + ~BaseGNN.embed_edge + + + + +.. + + + .. rubric:: Attributes + + .. autosummary:: + + ~BaseGNN.T_destination + ~BaseGNN.call_super_init + ~BaseGNN.dump_patches + ~BaseGNN.in_features + ~BaseGNN.out_features + ~BaseGNN.training + + + + \ No newline at end of file diff --git a/docs/autosummary/mlcolvar.core.nn.graph.GVPModel.rst b/docs/autosummary/mlcolvar.core.nn.graph.GVPModel.rst new file mode 100644 index 00000000..68ba5df3 --- /dev/null +++ b/docs/autosummary/mlcolvar.core.nn.graph.GVPModel.rst @@ -0,0 +1,41 @@ +mlcolvar.core.nn.graph.GVPModel +=============================== + +.. currentmodule:: mlcolvar.core.nn.graph + +.. autoclass:: GVPModel + :members: + :show-inheritance: + :inherited-members: Module,LightningModule + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~GVPModel.__init__ + ~GVPModel.forward + + + + +.. + + + .. rubric:: Attributes + + .. autosummary:: + + ~GVPModel.T_destination + ~GVPModel.call_super_init + ~GVPModel.dump_patches + ~GVPModel.in_features + ~GVPModel.out_features + ~GVPModel.training + + + + \ No newline at end of file diff --git a/docs/autosummary/mlcolvar.core.nn.graph.SchNetModel.rst b/docs/autosummary/mlcolvar.core.nn.graph.SchNetModel.rst new file mode 100644 index 00000000..05a5e532 --- /dev/null +++ b/docs/autosummary/mlcolvar.core.nn.graph.SchNetModel.rst @@ -0,0 +1,42 @@ +mlcolvar.core.nn.graph.SchNetModel +================================== + +.. currentmodule:: mlcolvar.core.nn.graph + +.. autoclass:: SchNetModel + :members: + :show-inheritance: + :inherited-members: Module,LightningModule + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~SchNetModel.__init__ + ~SchNetModel.forward + ~SchNetModel.reset_parameters + + + + +.. + + + .. rubric:: Attributes + + .. 
autosummary:: + + ~SchNetModel.T_destination + ~SchNetModel.call_super_init + ~SchNetModel.dump_patches + ~SchNetModel.in_features + ~SchNetModel.out_features + ~SchNetModel.training + + + + \ No newline at end of file diff --git a/docs/autosummary/mlcolvar.core.nn.graph.radial.RadialEmbeddingBlock.rst b/docs/autosummary/mlcolvar.core.nn.graph.radial.RadialEmbeddingBlock.rst new file mode 100644 index 00000000..962513c3 --- /dev/null +++ b/docs/autosummary/mlcolvar.core.nn.graph.radial.RadialEmbeddingBlock.rst @@ -0,0 +1,39 @@ +mlcolvar.core.nn.graph.radial.RadialEmbeddingBlock +================================================== + +.. currentmodule:: mlcolvar.core.nn.graph.radial + +.. autoclass:: RadialEmbeddingBlock + :members: + :show-inheritance: + :inherited-members: Module,LightningModule + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~RadialEmbeddingBlock.__init__ + ~RadialEmbeddingBlock.forward + + + + +.. + + + .. rubric:: Attributes + + .. autosummary:: + + ~RadialEmbeddingBlock.T_destination + ~RadialEmbeddingBlock.call_super_init + ~RadialEmbeddingBlock.dump_patches + ~RadialEmbeddingBlock.training + + + + \ No newline at end of file diff --git a/docs/autosummary/mlcolvar.cvs.AutoEncoderCV.rst b/docs/autosummary/mlcolvar.cvs.AutoEncoderCV.rst index 13d32fab..93425727 100644 --- a/docs/autosummary/mlcolvar.cvs.AutoEncoderCV.rst +++ b/docs/autosummary/mlcolvar.cvs.AutoEncoderCV.rst @@ -32,10 +32,11 @@ .. autosummary:: - ~AutoEncoderCV.BLOCKS ~AutoEncoderCV.CHECKPOINT_HYPER_PARAMS_KEY ~AutoEncoderCV.CHECKPOINT_HYPER_PARAMS_NAME ~AutoEncoderCV.CHECKPOINT_HYPER_PARAMS_TYPE + ~AutoEncoderCV.DEFAULT_BLOCKS + ~AutoEncoderCV.MODEL_BLOCKS ~AutoEncoderCV.T_destination ~AutoEncoderCV.automatic_optimization ~AutoEncoderCV.call_super_init diff --git a/docs/autosummary/mlcolvar.cvs.BaseCV.rst b/docs/autosummary/mlcolvar.cvs.BaseCV.rst index 33fa73c0..4d368e28 100644 --- a/docs/autosummary/mlcolvar.cvs.BaseCV.rst +++ b/docs/autosummary/mlcolvar.cvs.BaseCV.rst @@ -22,6 +22,7 @@ ~BaseCV.forward_cv ~BaseCV.initialize_blocks ~BaseCV.initialize_transforms + ~BaseCV.parse_model ~BaseCV.parse_options ~BaseCV.setup ~BaseCV.test_step @@ -37,6 +38,8 @@ .. autosummary:: + ~BaseCV.DEFAULT_BLOCKS + ~BaseCV.MODEL_BLOCKS ~BaseCV.example_input_array ~BaseCV.n_cvs ~BaseCV.optimizer_name diff --git a/docs/autosummary/mlcolvar.cvs.Committor.rst b/docs/autosummary/mlcolvar.cvs.Committor.rst index 3aa14710..8b91866a 100644 --- a/docs/autosummary/mlcolvar.cvs.Committor.rst +++ b/docs/autosummary/mlcolvar.cvs.Committor.rst @@ -17,6 +17,7 @@ .. autosummary:: ~Committor.__init__ + ~Committor.forward_nn ~Committor.training_step @@ -29,10 +30,11 @@ .. autosummary:: - ~Committor.BLOCKS ~Committor.CHECKPOINT_HYPER_PARAMS_KEY ~Committor.CHECKPOINT_HYPER_PARAMS_NAME ~Committor.CHECKPOINT_HYPER_PARAMS_TYPE + ~Committor.DEFAULT_BLOCKS + ~Committor.MODEL_BLOCKS ~Committor.T_destination ~Committor.automatic_optimization ~Committor.call_super_init diff --git a/docs/autosummary/mlcolvar.cvs.DeepLDA.rst b/docs/autosummary/mlcolvar.cvs.DeepLDA.rst index 2d1eeb29..d4113105 100644 --- a/docs/autosummary/mlcolvar.cvs.DeepLDA.rst +++ b/docs/autosummary/mlcolvar.cvs.DeepLDA.rst @@ -32,10 +32,11 @@ .. 
autosummary:: - ~DeepLDA.BLOCKS ~DeepLDA.CHECKPOINT_HYPER_PARAMS_KEY ~DeepLDA.CHECKPOINT_HYPER_PARAMS_NAME ~DeepLDA.CHECKPOINT_HYPER_PARAMS_TYPE + ~DeepLDA.DEFAULT_BLOCKS + ~DeepLDA.MODEL_BLOCKS ~DeepLDA.T_destination ~DeepLDA.automatic_optimization ~DeepLDA.call_super_init diff --git a/docs/autosummary/mlcolvar.cvs.DeepTDA.rst b/docs/autosummary/mlcolvar.cvs.DeepTDA.rst index 9ebe1355..011f71b2 100644 --- a/docs/autosummary/mlcolvar.cvs.DeepTDA.rst +++ b/docs/autosummary/mlcolvar.cvs.DeepTDA.rst @@ -29,10 +29,11 @@ .. autosummary:: - ~DeepTDA.BLOCKS ~DeepTDA.CHECKPOINT_HYPER_PARAMS_KEY ~DeepTDA.CHECKPOINT_HYPER_PARAMS_NAME ~DeepTDA.CHECKPOINT_HYPER_PARAMS_TYPE + ~DeepTDA.DEFAULT_BLOCKS + ~DeepTDA.MODEL_BLOCKS ~DeepTDA.T_destination ~DeepTDA.automatic_optimization ~DeepTDA.call_super_init diff --git a/docs/autosummary/mlcolvar.cvs.DeepTICA.rst b/docs/autosummary/mlcolvar.cvs.DeepTICA.rst index 8dc7e0a7..6e8b0929 100644 --- a/docs/autosummary/mlcolvar.cvs.DeepTICA.rst +++ b/docs/autosummary/mlcolvar.cvs.DeepTICA.rst @@ -31,10 +31,11 @@ .. autosummary:: - ~DeepTICA.BLOCKS ~DeepTICA.CHECKPOINT_HYPER_PARAMS_KEY ~DeepTICA.CHECKPOINT_HYPER_PARAMS_NAME ~DeepTICA.CHECKPOINT_HYPER_PARAMS_TYPE + ~DeepTICA.DEFAULT_BLOCKS + ~DeepTICA.MODEL_BLOCKS ~DeepTICA.T_destination ~DeepTICA.automatic_optimization ~DeepTICA.call_super_init diff --git a/docs/autosummary/mlcolvar.cvs.RegressionCV.rst b/docs/autosummary/mlcolvar.cvs.RegressionCV.rst index 54edee56..e9e0422c 100644 --- a/docs/autosummary/mlcolvar.cvs.RegressionCV.rst +++ b/docs/autosummary/mlcolvar.cvs.RegressionCV.rst @@ -29,10 +29,11 @@ .. autosummary:: - ~RegressionCV.BLOCKS ~RegressionCV.CHECKPOINT_HYPER_PARAMS_KEY ~RegressionCV.CHECKPOINT_HYPER_PARAMS_NAME ~RegressionCV.CHECKPOINT_HYPER_PARAMS_TYPE + ~RegressionCV.DEFAULT_BLOCKS + ~RegressionCV.MODEL_BLOCKS ~RegressionCV.T_destination ~RegressionCV.automatic_optimization ~RegressionCV.call_super_init diff --git a/docs/autosummary/mlcolvar.cvs.VariationalAutoEncoderCV.rst b/docs/autosummary/mlcolvar.cvs.VariationalAutoEncoderCV.rst index dbfa2c47..5ac5a041 100644 --- a/docs/autosummary/mlcolvar.cvs.VariationalAutoEncoderCV.rst +++ b/docs/autosummary/mlcolvar.cvs.VariationalAutoEncoderCV.rst @@ -32,10 +32,11 @@ .. autosummary:: - ~VariationalAutoEncoderCV.BLOCKS ~VariationalAutoEncoderCV.CHECKPOINT_HYPER_PARAMS_KEY ~VariationalAutoEncoderCV.CHECKPOINT_HYPER_PARAMS_NAME ~VariationalAutoEncoderCV.CHECKPOINT_HYPER_PARAMS_TYPE + ~VariationalAutoEncoderCV.DEFAULT_BLOCKS + ~VariationalAutoEncoderCV.MODEL_BLOCKS ~VariationalAutoEncoderCV.T_destination ~VariationalAutoEncoderCV.automatic_optimization ~VariationalAutoEncoderCV.call_super_init diff --git a/docs/autosummary/mlcolvar.data.graph.AtomicNumberTable.rst b/docs/autosummary/mlcolvar.data.graph.AtomicNumberTable.rst new file mode 100644 index 00000000..881eff0e --- /dev/null +++ b/docs/autosummary/mlcolvar.data.graph.AtomicNumberTable.rst @@ -0,0 +1,34 @@ +mlcolvar.data.graph.AtomicNumberTable +===================================== + +.. currentmodule:: mlcolvar.data.graph + +.. autoclass:: AtomicNumberTable + :members: + :show-inheritance: + :inherited-members: Module,LightningModule + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~AtomicNumberTable.__init__ + ~AtomicNumberTable.from_zs + ~AtomicNumberTable.index_to_symbol + ~AtomicNumberTable.index_to_z + ~AtomicNumberTable.z_to_index + ~AtomicNumberTable.zs_to_indices + + + + +.. 
+ + + + + \ No newline at end of file diff --git a/docs/autosummary/mlcolvar.data.graph.Configuration.rst b/docs/autosummary/mlcolvar.data.graph.Configuration.rst new file mode 100644 index 00000000..41e2a5fd --- /dev/null +++ b/docs/autosummary/mlcolvar.data.graph.Configuration.rst @@ -0,0 +1,43 @@ +mlcolvar.data.graph.Configuration +================================= + +.. currentmodule:: mlcolvar.data.graph + +.. autoclass:: Configuration + :members: + :show-inheritance: + :inherited-members: Module,LightningModule + + + .. automethod:: __init__ + + + .. rubric:: Methods + + .. autosummary:: + + ~Configuration.__init__ + + + + +.. + + + .. rubric:: Attributes + + .. autosummary:: + + ~Configuration.environment + ~Configuration.system + ~Configuration.weight + ~Configuration.atomic_numbers + ~Configuration.positions + ~Configuration.cell + ~Configuration.pbc + ~Configuration.node_labels + ~Configuration.graph_labels + + + + \ No newline at end of file diff --git a/docs/autosummary/mlcolvar.data.graph.create_dataset_from_configurations.rst b/docs/autosummary/mlcolvar.data.graph.create_dataset_from_configurations.rst new file mode 100644 index 00000000..d07941ab --- /dev/null +++ b/docs/autosummary/mlcolvar.data.graph.create_dataset_from_configurations.rst @@ -0,0 +1,23 @@ +mlcolvar.data.graph.create\_dataset\_from\_configurations +========================================================= + +.. currentmodule:: mlcolvar.data.graph + +.. autoclass:: create_dataset_from_configurations + :members: + :show-inheritance: + :inherited-members: Module,LightningModule + + + .. automethod:: __init__ + + + + + +.. + + + + + \ No newline at end of file diff --git a/docs/autosummary/mlcolvar.data.graph.get_neighborhood.rst b/docs/autosummary/mlcolvar.data.graph.get_neighborhood.rst new file mode 100644 index 00000000..0602c951 --- /dev/null +++ b/docs/autosummary/mlcolvar.data.graph.get_neighborhood.rst @@ -0,0 +1,23 @@ +mlcolvar.data.graph.get\_neighborhood +===================================== + +.. currentmodule:: mlcolvar.data.graph + +.. autoclass:: get_neighborhood + :members: + :show-inheritance: + :inherited-members: Module,LightningModule + + + .. automethod:: __init__ + + + + + +.. + + + + + \ No newline at end of file diff --git a/docs/autosummary/mlcolvar.explain.graph_sensitivity.graph_node_sensitivity.rst b/docs/autosummary/mlcolvar.explain.graph_sensitivity.graph_node_sensitivity.rst new file mode 100644 index 00000000..1da846f6 --- /dev/null +++ b/docs/autosummary/mlcolvar.explain.graph_sensitivity.graph_node_sensitivity.rst @@ -0,0 +1,23 @@ +mlcolvar.explain.graph\_sensitivity.graph\_node\_sensitivity +============================================================ + +.. currentmodule:: mlcolvar.explain.graph_sensitivity + +.. autoclass:: graph_node_sensitivity + :members: + :show-inheritance: + :inherited-members: Module,LightningModule + + + .. automethod:: __init__ + + + + + +.. 
+ + + + + \ No newline at end of file diff --git a/docs/notebooks/examples/ex_TPI-DeepTDA.ipynb b/docs/notebooks/examples/ex_TPI-DeepTDA.ipynb index 52805569..c7ab7826 100644 --- a/docs/notebooks/examples/ex_TPI-DeepTDA.ipynb +++ b/docs/notebooks/examples/ex_TPI-DeepTDA.ipynb @@ -167,7 +167,7 @@ "target_sigmas = [0.2, 0.2]\n", "nn_layers = [45,24,12,1]\n", "# MODEL\n", - "model = DeepTDA(n_states=n_states, n_cvs=1,target_centers=target_centers, target_sigmas=target_sigmas, layers=nn_layers)" + "model = DeepTDA(n_states=n_states, n_cvs=1,target_centers=target_centers, target_sigmas=target_sigmas, model=nn_layers)" ] }, { @@ -417,7 +417,7 @@ "target_sigmas = [0.2, 1.5, 0.2]\n", "nn_layers = [45,24,12,1]\n", "# MODEL\n", - "model = DeepTDA(n_states=n_states, n_cvs=1,target_centers=target_centers, target_sigmas=target_sigmas, layers=nn_layers)" + "model = DeepTDA(n_states=n_states, n_cvs=1,target_centers=target_centers, target_sigmas=target_sigmas, model=nn_layers)" ] }, { diff --git a/docs/notebooks/examples/ex_committor.ipynb b/docs/notebooks/examples/ex_committor.ipynb index c07e874d..b3d41279 100644 --- a/docs/notebooks/examples/ex_committor.ipynb +++ b/docs/notebooks/examples/ex_committor.ipynb @@ -323,7 +323,7 @@ " 'nn' : {'activation' : 'tanh'}}\n", "\n", "# initialize model\n", - "model = Committor(layers=[45, 32, 32, 1],\n", + "model = Committor(model=[45, 32, 32, 1],\n", " atomic_masses=atomic_masses,\n", " alpha=1e1,\n", " options=options, \n", @@ -807,7 +807,7 @@ " 'nn' : {'activation' : 'tanh'}}\n", "\n", "# initialize model\n", - "model = Committor(layers=[45, 32, 32, 1],\n", + "model = Committor(model=[45, 32, 32, 1],\n", " atomic_masses=atomic_masses,\n", " alpha=1e1,\n", " options=options, \n", diff --git a/docs/notebooks/paper_experiments/paper_2_supervised.ipynb b/docs/notebooks/paper_experiments/paper_2_supervised.ipynb index 38842fde..e5a524c4 100644 --- a/docs/notebooks/paper_experiments/paper_2_supervised.ipynb +++ b/docs/notebooks/paper_experiments/paper_2_supervised.ipynb @@ -190,7 +190,7 @@ "options = {'nn' : {'activation' : 'shifted_softplus'} }\n", "# MODEL\n", "if run_calculations:\n", - " model = DeepTDA(n_states=n_states, n_cvs=1,target_centers=target_centers, target_sigmas=target_sigmas, layers=nn_layers)\n", + " model = DeepTDA(n_states=n_states, n_cvs=1,target_centers=target_centers, target_sigmas=target_sigmas, model=nn_layers)\n", "else:\n", " model = torch.jit.load(f'{RESULTS_FOLDER}/model_deepTDA.pt')" ] diff --git a/docs/notebooks/paper_experiments/paper_4_multitask.ipynb b/docs/notebooks/paper_experiments/paper_4_multitask.ipynb index 03ca22bd..ea4deac4 100644 --- a/docs/notebooks/paper_experiments/paper_4_multitask.ipynb +++ b/docs/notebooks/paper_experiments/paper_4_multitask.ipynb @@ -216,7 +216,7 @@ }, { "cell_type": "code", - "execution_count": 6, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ @@ -598,7 +598,7 @@ }, { "cell_type": "code", - "execution_count": 16, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ diff --git a/docs/notebooks/tutorials/adv_gnn_based_cvs.ipynb b/docs/notebooks/tutorials/adv_gnn_based_cvs.ipynb new file mode 100644 index 00000000..d7ac2d9c --- /dev/null +++ b/docs/notebooks/tutorials/adv_gnn_based_cvs.ipynb @@ -0,0 +1,7487 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Using `mlcolvar` with graph neural networks (GNNs)\n", + "[![Open in 
Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/luigibonati/mlcolvar/blob/main/docs/notebooks/tutorials/adv_gnn_based_cvs.ipynb)" + ] + },
+ { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### NOTE\n", + "Most of the workings of the library are the same whether one uses standard feed-forward-NN-based machine-learning CVs or GNN-based ones.\n", + "Thus, it is recommended to first go through the basic tutorials for the standard scenario before moving to this tutorial." + ] + },
+ { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Feed-Forward-based CVs vs GNN-based CVs\n", + "\n", + "The default setting of `mlcolvar` is to represent the CVs as the output nodes of Feed-Forward Neural Networks (FFNNs or NNs, for simplicity), which take as input a set of physical descriptors (e.g., distances, angles, etc.).\n", + "The code is thus designed to reflect this choice, with the default values of the classes set to initialize the CV model in this framework, which is currently the most widespread in the field of machine-learning CVs and suits the needs of most users.\n", + "\n", + "However, a different approach has recently been proposed, in which the CVs are represented as Graph Neural Networks (GNNs) which directly take as input the Cartesian coordinates of the atoms in the studied system and return the CV space after a node-pooling operation on the output layer.\n", + "This approach is thus descriptor-free and goes in the direction of a more automated way of designing CVs.\n", + "Unfortunately, it typically comes at a higher computational cost (i.e., slower training and evaluation of the CV) and the underlying codebase is more complex (i.e., more complex models and data formats).\n", + "\n", + "In this tutorial, we show how GNN models can be used within `mlcolvar` to build CVs using the implemented CV methods.\n" + ] + },
+ { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "# Colab setup\n", + "import os\n", + "\n", + "if os.getenv(\"COLAB_RELEASE_TAG\"):\n", + "    import subprocess\n", + "    subprocess.run('wget https://raw.githubusercontent.com/luigibonati/mlcolvar/main/colab_setup.sh', shell=True)\n", + "    cmd = subprocess.run('bash colab_setup.sh TUTORIAL', shell=True, stdout=subprocess.PIPE)\n", + "    print(cmd.stdout.decode('utf-8'))" + ] + },
+ { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Outline\n", + "Typically, the process of constructing a GNN-based CV requires the following ingredients:\n", + "1. A **dataset** of attributed connected graphs (nodes and edges), which are constructed from the atomic positions.\n", + "2. A **GNN model** to represent the CV. Different architectures can be used in this regard.\n", + "3. A **CV method** and the associated **loss function**. These are all the methods implemented for *standard* machine-learning CVs, except for those based on autoencoders.
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Load data\n", + "#### The inputs of GNNs CVs\n", + "The input of GNN models are attributed and connected graphs, in which nodes (representing the atoms, in our case) are connected by edges (the lines of the graph).\n", + "Nodes and edges are then assigned with scalar and, eventually, vector features that are then processed through the layers of the GNN.\n", + "\n", + "In the context of GNN-CVs, such graphs most likely are created directly from the atomic coordinates from a trajectory file and the connectivity between the nodes is determined according to a radial `cutoff`.\n", + "\n", + "#### Truncated graphs\n", + "In some cases, graphs can be built focusing the attention on a subset of the whole system, e.g., a molecule on a surface, but still keeping into account the interaction with the environment, e.g., the surface.\n", + "In this case, only the ndoes from the `system_selection` will be used for the final pooling, whereas the nodes from the `enviroment_selection` will be used only to update the information through the layers.\n", + "Moreover, to reduce the computational costs, only the atoms closer to the `system_selection` atoms will be included in the graphs, according to the set `cutoff` and a `buffer` value to ensure stability e continuity. \n", + "For example, this setup is useful when treting solvent or surface interactions.\n", + "\n", + "#### Create dataset from trajectory files\n", + "To make this process easier, in `mlcolvar` there is an util function to do this under-the-hood: `create_dataset_from_trajectories`, which is analogous to the create_dataset_from_files used with descriptors.\n", + "The loading process is built on the external library [`MDTraj`](https://www.mdtraj.org/), which can natively load most common trajectory+topology format used in biophysics.\n", + "On the other hand, for less-bio applications (e.g., solids, surfaces, molecules) we recommend using the `.xyz` file format.\n", + "\n", + "One advantage of MDTraj, is that it comes with a simple and user friendly synthax for atom selection, which can be used also here.\n", + "\n", + "Here, as an example, we load some data about the state A and B of Alanine Dipeptide." 
+ ] + },
+ { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Dataset info:\n", + " DictDataset( \"data_list\": 4000, \"z_table\": [6, 7, 8], \"cutoff\": 10.0, \"used_idx\": tensor([0, 1, 2, 3, 4, 5, 6, 7, 8, 9]), \"used_names\": [ACE1-CH3, ACE1-C, ACE1-O, ALA2-N, ALA2-CA, ALA2-CB, ALA2-C, ALA2-O, NME3-N, NME3-C], \"data_type\": graphs )\n", + "\n", + "Datamodule info:\n", + " DictModule(dataset -> DictDataset( \"data_list\": 4000, \"z_table\": [6, 7, 8], \"cutoff\": 10.0, \"used_idx\": tensor([0, 1, 2, 3, 4, 5, 6, 7, 8, 9]), \"used_names\": [ACE1-CH3, ACE1-C, ACE1-O, ALA2-N, ALA2-CA, ALA2-CB, ALA2-C, ALA2-O, NME3-N, NME3-C], \"data_type\": graphs ),\n", + "\t\t train_loader -> DictLoader(length=0.8, batch_size=4000, shuffle=True),\n", + "\t\t valid_loader -> DictLoader(length=0.2, batch_size=4000, shuffle=True))\n" + ] + } + ],
+ "source": [ + "from mlcolvar.data import DictModule\n", + "from mlcolvar.utils.io import create_dataset_from_trajectories\n", + "\n", + "# loading arguments\n", + "# same as for load_dataframe\n", + "load_args = [{'start' : 0, 'stop' : 10000, 'stride' : 5},\n", + "             {'start' : 0, 'stop' : 10000, 'stride' : 5}]\n", + "\n", + "# create dataset\n", + "dataset = create_dataset_from_trajectories(\n", + "    trajectories=[\"alad_A.trr\", \n", + "                  \"alad_B.trr\"],\n", + "    top=\"alad.gro\", \n", + "    folder=\"data/alanine_gnn\", \n", + "    cutoff=10.0, # Angstrom \n", + "    labels=None, \n", + "    system_selection='all and not type H',\n", + "    show_progress=False,\n", + "    load_args=load_args,\n", + "    lengths_conversion=10.0, # MDTraj uses nm by default, we use Angstroms\n", + "    )\n", + "print('Dataset info:\\n', dataset, end=\"\\n\\n\")\n", + "\n", + "# load dataset into a DictModule\n", + "datamodule = DictModule(dataset=dataset)\n", + "print('Datamodule info:\\n', datamodule)" + ] + },
+ { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Accessing graph data\n", + "The built graphs are then stored as `torch_geometric.Data` objects in the usual `DictDataset`, with the information about each graph entry (e.g., node positions, edges, weights, labels, etc.) under the key `data_list`, and the information common to all the graphs (e.g., the map from types to chemical species, the cutoff) in the `metadata` attribute dictionary."
+ ] + },
+ { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Example of a graph entry:\n", + " Data(edge_index=[2, 90], shifts=[90, 3], unit_shifts=[90, 3], positions=[10, 3], cell=[3, 3], node_attrs=[10, 3], graph_labels=[1, 1], n_system=[1, 1], n_env=[1, 1], weight=1.0, names_idx=[10])\n", + "\n", + "Dataset metadata:\n", + " {'z_table': [6, 7, 8], 'cutoff': 10.0, 'used_idx': tensor([0, 1, 2, 3, 4, 5, 6, 7, 8, 9]), 'used_names': [ACE1-CH3, ACE1-C, ACE1-O, ALA2-N, ALA2-CA, ALA2-CB, ALA2-C, ALA2-O, NME3-N, NME3-C], 'data_type': 'graphs'}\n" + ] + } + ],
+ "source": [ + "print('Example of a graph entry:\\n', dataset['data_list'][0], end='\\n\\n')\n", + "print('Dataset metadata:\\n', dataset.metadata)" + ] + },
+ { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Initializing the GNN model\n", + "In contrast to the FFNN case, here the model is initialized **outside** the CV class, to which it is then passed as an input.\n", + "GNN architectures are indeed much more complex than FFNNs and have many parameters that can be set.\n", + "In addition, when introducing GNN models into the code, we kept the standard CVs as the default, which still covers most users.\n", + "\n", + "Here, for example, we initialize a `SchNetModel`.\n", + "Many other architectures are available in [`pytorch_geometric`](https://pytorch-geometric.readthedocs.io/en/latest/) and can be readily adapted to this library.\n", + "\n", + "#### NOTE\n", + "As the input graphs are built with the dataset and then processed in the GNN model, it is wise to initialize the model directly referring to the values stored in `dataset.metadata` (e.g., cutoff, z_table)." + ] + },
+ { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [], + "source": [ + "from mlcolvar.core.nn.graph.schnet import SchNetModel\n", + "\n", + "gnn_model = SchNetModel(n_out=1,\n", + "                        cutoff=dataset.metadata['cutoff'],\n", + "                        atomic_numbers=dataset.metadata['z_table'],\n", + "                        pooling_operation=\"mean\",\n", + "                        n_bases=16,\n", + "                        n_layers=2,\n", + "                        n_filters=16,\n", + "                        n_hidden_channels=16,\n", + "                        w_out_after_pool=True,\n", + "                        aggr='mean'\n", + "                        )" + ] + },
+ { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Initializing the CV class\n", + "The initialization of the CV class is almost identical to the standard case, the only difference being that we provide the initialized GNN object as the model.\n", + "\n", + "Here, for example, we use the `DeepTDA` CV." + ] + },
+ { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/home/etrizio@iit.local/Bin/miniconda3/envs/graph_mlcolvar_test_2.5/lib/python3.9/site-packages/lightning/pytorch/utilities/parsing.py:198: Attribute 'model' is an instance of `nn.Module` and is already saved during checkpointing. 
It is recommended to ignore them using `self.save_hyperparameters(ignore=['model'])`.\n" + ] + } + ], + "source": [ + "import torch\n", + "from mlcolvar.cvs import DeepTDA\n", + "\n", + "# we can still set the options for the optimizer the usual way\n", + "# options for the BLOCKS of the cv are disabled when passing an external model\n", + "options = {'optimizer' : {'lr' : 1e-3},\n", + " 'lr_scheduler': {\n", + " 'scheduler': torch.optim.lr_scheduler.ExponentialLR,\n", + " 'gamma': 0.9999}\n", + " }\n", + "\n", + "model = DeepTDA(n_states=2,\n", + " n_cvs=1,\n", + " target_centers=[-7, 7],\n", + " target_sigmas=[0.2, 0.2],\n", + " model=gnn_model)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Training the CV\n", + "Here, everything works the same!" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "GPU available: True (cuda), used: True\n", + "TPU available: False, using: 0 TPU cores\n", + "IPU available: False, using: 0 IPUs\n", + "HPU available: False, using: 0 HPUs\n", + "LOCAL_RANK: 0 - CUDA_VISIBLE_DEVICES: [0]\n" + ] + }, + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "ad302baf46f44ca4b3aa26af17ab165c", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "Sanity Checking: | | 0/? [00:00" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "fig, ax = plt.subplots(1,1,figsize=(4,3))\n", + "plot_metrics(metrics.metrics,\n", + " keys=['train_loss', 'valid_loss'],\n", + " colors=['fessa1', 'fessa5'],\n", + " yscale='linear',\n", + " ax = ax)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Testing the model\n", + "As the graph data are stored as `torch_geometric.Data` they need to be loaded using a loader object.\n", + "For convenience, we implemented both in `DictDataset ` and `DictModule` a method `.get_graph_data` to do it so that one can simply evaluate the model calling either:\n", + "- `model(dataset.get_graph_data())` --> Returns the **whole dataset**\n", + "- `model(datamodule.get_graph_data())` --> Returns either the **train or valid dataset**" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAz8AAAE8CAYAAADuXg/EAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8g+/7EAAAACXBIWXMAAA9hAAAPYQGoP6dpAAA09ElEQVR4nO3deVyU5f7/8fewI8iQCyAuiGau5S5Rp/SXFCl1Kj2ZHitQjmZhaZqlj8ytxSUrT2aZJ0PPQz2WrSdbzcwWydSyhRSzNLcDaAq4JAhcvz/8MjGyyCDLwP16Ph73Q7ju6577uubW+fie+557bMYYIwAAAACo5zxqewAAAAAAUBMIPwAAAAAsgfADAAAAwBIIPwAAAAAsgfADAAAAwBIIPwAAAAAsgfADAAAAwBIIPwAAAAAsgfADAAAAwBIIPwAAAKiT9u7dK5vNpmXLllVqe5vNphkzZlTpmODeCD9wW8uWLZPNZit1mTx5cm0Pr1zFx+rl5aVGjRqpZ8+eGjdunH766adKP+6pU6c0Y8YMffrpp1U32AuwadMmzZgxQ1lZWbU9FAAoFzWlJHerKUBN8KrtAQDnM2vWLEVGRjq1denSpZZGU3HXXnut7rzzThljlJ2dre+++07Lly/X888/r7lz52rChAkuP+apU6c0c+ZMSVK/fv2qeMSu27Rpk2bOnKmEhAQFBwfX9nAA4LyoKX9yt5oC1ATCD9zegAED1KtXrwr1PX36tHx8fOThUfsnNS+55BLdfvvtTm1z5szRjTfeqIkTJ6pDhw4aOHBgLY0OAKyJmgJYW+3/awYq6dNPP5XNZtPq1as1depUNW/eXA0aNFBOTo4kac2aNerZs6f8/f3VpEkT3X777Tp48KDTYyQkJCgwMFD79u3TDTfcoMDAQDVv3lyLFi2SJP3www+65pprFBAQoIiICK1ateqCxty4cWOtXr1aXl5eevzxxx3teXl5mjZtmnr27Cm73a6AgABdddVV2rBhg6PP3r171bRpU0nSzJkzHZdAFF2r/P333yshIUFt2rSRn5+fwsLCNHLkSP3+++9OYzh+/LjGjx+v1q1by9fXVyEhIbr22mv1zTffOPXbvHmzrr/+etntdjVo0EB9+/bVl19+6Vg/Y8YMTZo0SZIUGRnpGM/evXsv6DkCgNpATalcTZkxY4ZsNpt27dql22+/XXa7XU2bNtUjjzwiY4z279+vm266SUFBQQoLC9NTTz1VYh6ZmZlKTExUaGio/Pz81LVrVy1fvrxEv6ysLCUkJMhutys4OFjx8fGlXnbdr1+/Us9kJSQkqHXr1ud9Xg8ePKiRI0cqNDRUvr6+6ty5s15++eXzboe6gTM/cHvZ2dk6cuSIU1uTJk0cPz/66KPy8fHRAw88oNzcXPn4+GjZsmUaMWKEevfurdmzZysjI0P//Oc/9eWXX+rbb791ukSroKBAAwYM0NVXX6158+Zp5cqVGjt2rAICAvTwww9r+PDhGjRokBYvXqw777xT0dHRJS6ZcEWrVq3Ut29fbdiwQTk5OQoKClJOTo5eeuklDRs2TKNGjdLx48e1dOlSxcbG6uuvv1a3bt3UtGlTvfDCC7r77rt1yy23aNCgQZKkyy67TJK0bt06/frrrxoxYoTCwsKUmpqqJUuWKDU1VV999ZVsNpskacyYMXrttdc0duxYderUSb///ru++OIL7dixQz169JAkffLJJxowYIB69uyp6dOny8PDQ8nJybrmmmv0+eefq0+fPho0aJB27dql//znP3rmmWccx6SomAKAO6KmVG1NKXLbbbepY8eOmjNnjt5991099thjatSokV588UVdc801mjt3rlauXKkHHnhAvXv31tVXXy1J+uOPP9SvXz/t3r1bY8eOVWRkpNasWaOEhARlZWVp3LhxkiRjjG666SZ98cUXGjNmjDp27Kg333xT8fHxlX7uSpORkaHLL79cNptNY8eOVdOmTfX+++8rMTFROTk5Gj9+fJXuD7XAAG4qOTnZSCp1McaYDRs2GEmmTZs25tSpU47t8vLyTEhIiOnSpYv5448/HO1r1641ksy0adMcbfHx8UaSeeKJJxxtx44dM/7+/sZms5nVq1c72nfu3GkkmenTp5937JJMUlJSmevHjRtnJJnvvvvOGGNMfn6+yc3Ndepz7NgxExoaakaOHOloO3z4cJljKP4cFPnPf/5jJJnPPvvM0Wa328sdW2FhoWnXrp2JjY01hYWFTo8fGRlprr32Wkfbk08+aSSZPXv2lPl4AOAOqCnVU1OmT59uJJnRo0c72vLz802LFi2MzWYzc+bMcRqDv7+/iY+Pd7QtWLDASDIrVqxwtOXl5Zno6GgTGBhocnJyjDHGvPXWW0aSmTdvntN+rrrqKiPJJCcnO9r79u1r+vbtW2L88fHxJiIiwqnt3PknJiaaZs2amSNHjjj1Gzp0qLHb7aU+L6hbuOwNbm/RokVat26d01JcfHy8/P39Hb9v3bpVmZmZuueee+Tn5+doj4uLU4cOHfTuu++W2Mc//vEPx8/BwcFq3769AgICNGTIEEd7+/btFRwcrF9//fWC5xQYGCjp7CVokuTp6SkfHx9JUmFhoY4ePar8/Hz16tWrxOVoZSn+HJw+fVpHjhzR5ZdfLklOjxEcHKzNmzfr0KFDpT7O9u3b9fPPP+vvf/+7fv/9dx05ckRHjhzRyZMn1b9/f3322WcqLCx0fdIA4AaoKVVbU4oUn7Onp6d69eolY4wSExMd7UXPRfE5v/feewoLC9OwYcMcbd7e3rrvvvt04sQJbdy40dHPy8tLd999t9N+7r333grNpyKMMXr99dd14403yhjjqH9HjhxRbGyssrOzK/z8wX1x2RvcXp8+fcr9cOq5lwv89ttvks4WlnN16NBBX3zxhVObn59fiUu17Ha7WrRoUeK0vt1u17Fjx1waf2lOnDghSWrYsKGjbfny5Xrqqae0c+dOnTlzxtFe0cshjh49qpkzZ2r16tXKzMx0Wpedne34ed68eYqPj1fLli3Vs2dPDRw4UHfeeafatGkjSfr5558lqdxLCbKzs3XRRRdVaFwA4E6oKVVbU4q0atXK6Xe73S4/Pz+nSwqL2ot/bui3335Tu3btStxUomPHjo71RX82a9bMEfSKlHZcKuvw4cPKysrSkiVLtGTJklL7nPtcoO4h/KDOK/7uVGV4enq61G6MuaD9SdKPP/4oT09PRxFasWKFEhISdPPNN2vSpEkKCQmRp6enZs+erV9++aVCjzlkyBBt2rRJkyZNUrdu3RQYGKjCwkJdf/31TmdqhgwZoquuukpvvvmmPvroIz355JOaO3eu3njjDQ0YMMDR98knn1S3bt1K3de5xQcA6gtqylkVrSlFSptfdc65PDabrdR9FBQUlLtd0bxuv/32Mt8ALPpMFOouwg/qnYiICElSWlqarrnmGqd1aWlpjvW1Zd++fdq4caOio6Md79K99tpratOmjd544w2ndwanT5/utO257xoWOXbsmNavX6+ZM2dq2rRpjvaiszjnatasme655x7dc8
89yszMVI8ePfT4449rwIABatu2rSQpKChIMTEx5c6lrPEAQH1BTTl/TbkQERER+v7771VYWOh09mfnzp2O9UV/rl+/XidOnHB6Ay4tLa3EY1500UWlXk5YdBapLE2bNlXDhg1VUFBw3vqHuovP/KDe6dWrl0JCQrR48WLl5uY62t9//33t2LFDcXFxtTa2o0ePatiwYSooKNDDDz/saC96d6z4O1WbN29WSkqK0/YNGjSQpBK39ixte0lasGCB0+8FBQUlLlcICQlReHi447nq2bOn2rZtq/nz5zsupSju8OHDjp8DAgJKHQ8A1BfUlD+dW1OqwsCBA5Wenq5XXnnF0Zafn6+FCxcqMDBQffv2dfTLz8/XCy+84OhXUFCghQsXlnjMtm3baufOnU716rvvvnP6uobSeHp6avDgwXr99df1448/llhf/PFQd3HmB/WOt7e35s6dqxEjRqhv374aNmyY47akrVu31v33318j49i1a5dWrFghY4xycnL03Xffac2aNTpx4oSefvppXX/99Y6+N9xwg9544w3dcsstiouL0549e7R48WJ16tTJKYD4+/urU6dOeuWVV3TJJZeoUaNG6tKli7p06eK4reqZM2fUvHlzffTRR9qzZ4/TmI4fP64WLVrob3/7m7p27arAwEB9/PHH2rJli+O7Fzw8PPTSSy9pwIAB6ty5s0aMGKHmzZvr4MGD2rBhg4KCgvTOO+9IOhuUJOnhhx/W0KFD5e3trRtvvNERigCgrqOmlF1TqsLo0aP14osvKiEhQdu2bVPr1q312muv6csvv9SCBQscZ7NuvPFGXXnllZo8ebL27t2rTp066Y033ij180cjR47U008/rdjYWCUmJiozM1OLFy9W586dHd/bVJY5c+Zow4YNioqK0qhRo9SpUycdPXpU33zzjT7++GMdPXq0yp8D1LBaucccUAFFtyXdsmVLqeuLbku6Zs2aUte/8sorpnv37sbX19c0atTIDB8+3Bw4cMCpT3x8vAkICCixbd++fU3nzp1LtEdERJi4uLjzjl3FbqHq4eFhgoODTffu3c24ceNMampqif6FhYXmiSeeMBEREcbX19d0797drF27ttTbcm7atMn07NnT+Pj4ON2i88CBA+aWW24xwcHBxm63m1tvvdUcOnTIqU9ubq6ZNGmS6dq1q2nYsKEJCAgwXbt2Nc8//3yJMX377bdm0KBBpnHjxsbX19dERESYIUOGmPXr1zv1e/TRR03z5s2Nh4cHt70G4LaoKVVfU4z581bXhw8frvRzkZGRYUaMGGGaNGlifHx8zKWXXup06+oiv//+u7njjjtMUFCQsdvt5o477jDffvttiVtdG2PMihUrTJs2bYyPj4/p1q2b+fDDDyt0q+ui8SQlJZmWLVsab29vExYWZvr372+WLFlSYkyoe2zGVPOnzgAAAADADfCZHwAAAACWQPgBAAAAYAmEHwAAAACWQPgBAAAAYAmEHwAAAACWQPgBAAAAYAl18ktOCwsLdejQITVs2FA2m622hwMAlmGM0fHjxxUeHi4PD94/K47aBAC1w5XaVCfDz6FDh9SyZcvaHgYAWNb+/fvVokWL2h6GW6E2AUDtqkhtqpPhp2HDhpLOTjAoKKiWRwMA1pGTk6OWLVs6XofxJ2oTANQOV2pTnQw/RZcTBAUFUWAAoBZwWVdJ1CYAqF0VqU1csA0AAADAEgg/AAAAACyB8AMAAADAEgg/AAAAACyB8AMAAADAEgg/AAAAACzBpfBTUFCgRx55RJGRkfL391fbtm316KOPyhjj6GOM0bRp09SsWTP5+/srJiZGP//8s9PjHD16VMOHD1dQUJCCg4OVmJioEydOVM2MAACWQm0CAFSUS+Fn7ty5euGFF/Tcc89px44dmjt3rubNm6eFCxc6+sybN0/PPvusFi9erM2bNysgIECxsbE6ffq0o8/w4cOVmpqqdevWae3atfrss880evToqpsVAMAyqE0AgIqymeJvjZ3HDTfcoNDQUC1dutTRNnjwYPn7+2vFihUyxig8PFwTJ07UAw88IEnKzs5WaGioli1bpqFDh2rHjh3q1KmTtmzZol69ekmSPvjgAw0cOFAHDhxQeHj4eceRk5Mju92u7OxsvkgOAGqQO77+UpsAwNpcef116czPFVdcofXr12vXrl2SpO+++05ffPGFBgwYIEnas2eP0tPTFRMT49jGbrcrKipKKSkpkqSUlBQFBwc7ioskxcTEyMPDQ5s3by51v7m5ucrJyXFaAACQqE0AgIrzcqXz5MmTlZOTow4dOsjT01MFBQV6/PHHNXz4cElSenq6JCk0NNRpu9DQUMe69PR0hYSEOA/Cy0uNGjVy9DnX7NmzNXPmTFeGCgCwCGoTAKCiXDrz8+qrr2rlypVatWqVvvnmGy1fvlzz58/X8uXLq2t8kqQpU6YoOzvbsezfv79a9wcAqDuoTQCAinLpzM+kSZM0efJkDR06VJJ06aWX6rffftPs2bMVHx+vsLAwSVJGRoaaNWvm2C4jI0PdunWTJIWFhSkzM9PpcfPz83X06FHH9ufy9fWVr6+vK0OtMq0nv+v4ee+cuFoZAwCgbFasTQCAynHpzM+pU6fk4eG8iaenpwoLCyVJkZGRCgsL0/r16x3rc3JytHnzZkVHR0uSoqOjlZWVpW3btjn6fPLJJyosLFRUVFSlJwIAsCZqEwCgolw683PjjTfq8ccfV6tWrdS5c2d9++23evrppzVy5EhJks1m0/jx4/XYY4+pXbt2ioyM1COPPKLw8HDdfPPNkqSOHTvq+uuv16hRo7R48WKdOXNGY8eO1dChQyt0Nx0AAIqjNgEAKsql8LNw4UI98sgjuueee5SZmanw8HDdddddmjZtmqPPgw8+qJMnT2r06NHKysrSX/7yF33wwQfy8/Nz9Fm5cqXGjh2r/v37y8PDQ4MHD9azzz5bdbMCAFgGtQkAUFEufc+Pu6jJ71LgMz8A8Ce+y6ZsPDcAUDuq7Xt+AAAAAKCuIvwAAAAAsATCDwAAAABLIPwAAAAAsATCDwAAAABLIPwAAAAAsATCDwAAAABLIPwAAAAAsATCDwAAAABLIPwAAAAAsATCDwAAAABLIPwAAAAAsATCDwAAAABLIPwAAAAAsATCDwAAAABLIPwAAAAAsATCDwAAAABLIPwAAAAAsATCDwAAAABLIPwAAAAAsATCDwAAAABLIPwAAAAAsATCDwAAAABLIPwAAAAAsATCDwAAAABLIPwAAAAAsATCDwAAAABLIPwAAAAAsATCDwAAAABL8KrtAQAAzq/15HcdP++dE1eLIwEAoO7izA8AAAAASyD8AAAAALAEwg8AAAAAS+AzPwBQxxT//I/EZ4AAAKgozvwAAAAAsATCDwAAAABLIPwAAAAAsATCDwAAAABL4IYHAADUEdzsAgAuDGd+AAAAAFgC4QcAAACAJRB+AAAAAFgC4QcAAACAJRB+AAAAAFiCy+Hn4MGDuv3229W4cWP5+/vr0ksv1datWx3rjTGaNm2amjVrJn9/f8XExOjnn
392eoyjR49q+PDhCgoKUnBwsBITE3XixIkLnw0AwJKoTQCAinAp/Bw7dkxXXnmlvL299f777+unn37SU089pYsuusjRZ968eXr22We1ePFibd68WQEBAYqNjdXp06cdfYYPH67U1FStW7dOa9eu1WeffabRo0dX3awAAJZBbQIAVJRL3/Mzd+5ctWzZUsnJyY62yMhIx8/GGC1YsEBTp07VTTfdJEn697//rdDQUL311lsaOnSoduzYoQ8++EBbtmxRr169JEkLFy7UwIEDNX/+fIWHh1fFvAAAFkFtAgBUlEtnfv773/+qV69euvXWWxUSEqLu3bvrX//6l2P9nj17lJ6erpiYGEeb3W5XVFSUUlJSJEkpKSkKDg52FBdJiomJkYeHhzZv3lzqfnNzc5WTk+O0AAAgUZsAABXnUvj59ddf9cILL6hdu3b68MMPdffdd+u+++7T8uXLJUnp6emSpNDQUKftQkNDHevS09MVEhLitN7Ly0uNGjVy9DnX7NmzZbfbHUvLli1dGTYAoB6jNgEAKsql8FNYWKgePXroiSeeUPfu3TV69GiNGjVKixcvrq7xSZKmTJmi7Oxsx7J///5q3R8AoO6gNgEAKsql8NOsWTN16tTJqa1jx47at2+fJCksLEySlJGR4dQnIyPDsS4sLEyZmZlO6/Pz83X06FFHn3P5+voqKCjIaQEAQKI2AQAqzqXwc+WVVyotLc2pbdeuXYqIiJB09gOmYWFhWr9+vWN9Tk6ONm/erOjoaElSdHS0srKytG3bNkefTz75RIWFhYqKiqr0RAAA1kRtAgBUlEt3e7v//vt1xRVX6IknntCQIUP09ddfa8mSJVqyZIkkyWazafz48XrsscfUrl07RUZG6pFHHlF4eLhuvvlmSWffjbv++usdlyScOXNGY8eO1dChQ7mbDgDAZdQmAEBFuRR+evfurTfffFNTpkzRrFmzFBkZqQULFmj48OGOPg8++KBOnjyp0aNHKysrS3/5y1/0wQcfyM/Pz9Fn5cqVGjt2rPr37y8PDw8NHjxYzz77bNXNCgBgGdQmAEBF2YwxprYH4aqcnBzZ7XZlZ2dX+zXWrSe/6/h575y4at0XAJSl+GvRuWrytakmX3/rmpp4bs79e0BdAgDXXn9d+swPAAAAANRVhB8AAAAAlkD4AQAAAGAJhB8AAAAAlkD4AQAAAGAJhB8AAAAAlkD4AQAAAGAJhB8AAAAAlkD4AQAAAGAJhB8AAAAAlkD4AQAAAGAJhB8AAAAAlkD4AQAAAGAJhB8AAAAAlkD4AQAAAGAJhB8AAAAAluBV2wMAAAAAUPe0nvxuqe1758TV8EgqjjM/AAAAACyB8AMAAADAEgg/AAAAACyB8AMAAADAEgg/AAAAACyB8AMAAADAEgg/AAAAACyB7/kpRVn3LAcAAABQdxF+AACoo4q/WefOXyoIAO6Cy94AAAAAWALhBwAAAIAlEH4AAAAAWALhBwAAAIAlEH4AAAAAWAJ3e3MBd9UBAAAA6i7O/AAAAACwBMIPAAAAAEsg/AAAAACwBMIPAAAAAEsg/AAAAACwBMIPAAAAAEsg/AAAAACwBMIPAAAAAEsg/AAAAACwBMIPAAAAAEsg/AAAAACwhAsKP3PmzJHNZtP48eMdbadPn1ZSUpIaN26swMBADR48WBkZGU7b7du3T3FxcWrQoIFCQkI0adIk5efnX8hQAACQRG0CAJSt0uFny5YtevHFF3XZZZc5td9///165513tGbNGm3cuFGHDh3SoEGDHOsLCgoUFxenvLw8bdq0ScuXL9eyZcs0bdq0ys8CAABRmwAA5atU+Dlx4oSGDx+uf/3rX7rooosc7dnZ2Vq6dKmefvppXXPNNerZs6eSk5O1adMmffXVV5Kkjz76SD/99JNWrFihbt26acCAAXr00Ue1aNEi5eXlVc2sAACWQ20CAJxPpcJPUlKS4uLiFBMT49S+bds2nTlzxqm9Q4cOatWqlVJSUiRJKSkpuvTSSxUaGuroExsbq5ycHKWmppa6v9zcXOXk5DgtAAAUR20CAJyPl6sbrF69Wt988422bNlSYl16erp8fHwUHBzs1B4aGqr09HRHn+LFpWh90brSzJ49WzNnznR1qAAAi6A2AQAqwqUzP/v379e4ceO0cuVK+fn5VdeYSpgyZYqys7Mdy/79+2ts3wAA90ZtAgBUlEvhZ9u2bcrMzFSPHj3k5eUlLy8vbdy4Uc8++6y8vLwUGhqqvLw8ZWVlOW2XkZGhsLAwSVJYWFiJO+wU/V7U51y+vr4KCgpyWgAAkKhNAICKcyn89O/fXz/88IO2b9/uWHr16qXhw4c7fvb29tb69esd26SlpWnfvn2Kjo6WJEVHR+uHH35QZmamo8+6desUFBSkTp06VdG0AABWQW0CAFSUS5/5adiwobp06eLUFhAQoMaNGzvaExMTNWHCBDVq1EhBQUG69957FR0drcsvv1ySdN1116lTp0664447NG/ePKWnp2vq1KlKSkqSr69vFU0LAGAV1CYAQEW5fMOD83nmmWfk4eGhwYMHKzc3V7GxsXr++ecd6z09PbV27Vrdfffdio6OVkBAgOLj4zVr1qyqHgoAAJKoTQCAsy44/Hz66adOv/v5+WnRokVatGhRmdtERETovffeu9BdAwBQKmoTAKA0lfqeHwAAAACoawg/AAAAACyB8AMAAADAEgg/AAAAACyB8AMAAADAEgg/AAAAACyB8AMAAADAEgg/AAAAACyB8AMAAADAEgg/AAAAACyB8AMAAADAEgg/AAAAACyB8AMAAADAEgg/AAAAACyB8AMAAADAEgg/AAAAACyB8AMAAADAEgg/AAAAACyB8AMAAADAEgg/AAAAACyB8AMAAADAEgg/AAAAACyB8AMAAADAEgg/AAAAACyB8AMAAADAEgg/AAAAACyB8AMAAADAEgg/AAAAACzBq7YHAAC4MK0nv+v4ee+cuFocCQAA7o0zPwAAAAAsgfADAAAAwBIIPwAAAAAsgfADAAAAwBIIPwAAAAAsgbu9AYCbKn4XNwAAcOE48wMAAADAEgg/AAAAACyBy94AAHBjXP4IAFWHMz8AAAAALIHwAwAAAMASuOwNAAAAQIXU9UtxOfMDAAAAwBIIPwAAAAAswaXwM3v2bPXu3VsNGzZUSEiIbr75ZqWlpTn1OX36tJKSktS4cWMFBgZq8ODBysjIcOqzb98+xcXFqUGDBgoJCdGkSZOUn59/4bMBAFgOtQkAUFEuhZ+NGzcqKSlJX331ldatW6czZ87ouuuu08mTJx197r//fr3zzjtas2aNNm7cqEOHDmnQoEGO9QUFBYqLi1NeXp42bdqk5cuXa9myZZo2bVrVzQoAYBnUJgBARdmMMaayGx8+fFghISHauHGjrr76amVnZ6tp06ZatWqV/va3v0mSdu7cqY4dOyolJUWXX3653n//fd1www06dOiQQkNDJUmLFy/WQw89pMOHD8vHx+e8+83JyZHdbld2draCgoIqO/wyVeSDXHvnxFX5fgGguMp8qLS6X5uq
DgXqhLQNUj/ADFPP/888rPz1evXr30yiuvaMeOHUpLS9OKFSu0c+dOeXp6lrnt6NGjZbfbtWrVqnL3ERoaqgkTJujZZ58tt5/NZlNycrJ27dqlq666Su+9955+/fVXff/993r88cd10003OfW/9dZb5e3trbvuukvXXXedWrZsWfGJAwDcEnUJqFqEH6CYtm3b6ttvv1VMTIymTJmirl27qlevXlq4cKEeeOABPfroo2Vu6+3trUcffVSnT58+734eeOABBQYGnrdfnz59tHXrVl188cUaNWqUOnbsqL/+9a9KTU3VggULnPo2aNBAQ4cO1bFjx/hAKQDUE9QloGrZjCufXgMAAACAOoozPwAAAAAsgfADAAAAwBIIPwAAAAAsgfADAAAAwBIIPwAAAAAsgfADAAAAwBIIPwAAAAAsgfADAAAAwBIIPwAAAAAsgfADAAAAwBIIPwAAAAAs4f8DBp8zTjMGW/YAAAAASUVORK5CYII=", + "text/plain": [ + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "fig, axs = plt.subplots(1,2, figsize=(10,3))\n", + "\n", + "ax = axs[0]\n", + "out_graph = model(dataset.get_graph_inputs())\n", + "ax.hist(out_graph.detach().squeeze(), bins=100)\n", + "ax.set_title('From Dataset')\n", + "ax.set_xlabel('GNN CV')\n", + "ax.set_ylim(0,850)\n", + "\n", + "ax = axs[1]\n", + "out_graph = model(datamodule.get_graph_inputs(\"train\"))\n", + "ax.hist(out_graph.detach().squeeze(), bins=100)\n", + "out_graph = model(datamodule.get_graph_inputs(\"valid\"))\n", + "ax.hist(out_graph.detach().squeeze(), bins=100)\n", + "\n", + "ax.set_title('From Datamodule')\n", + "ax.set_xlabel('GNN CV')\n", + "ax.set_ylim(0,850)\n", + "plt.show()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Save the model to TorchScript\n", + "As for normal CVs, the frozen model can be saved to TorchScript suing the `Lightning` util `to_torchscript` using `method=trace`." + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/home/etrizio@iit.local/Bin/dev/mlcolvar/mlcolvar/data/datamodule.py:322: UserWarning: Length of split at index 1 is 0. This might result in an empty dataset.\n", + " warnings.warn(\n" + ] + }, + { + "data": { + "text/plain": [ + "True" + ] + }, + "execution_count": 14, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "traced_model = model.to_torchscript('gnn_model.pt', method='trace')\n", + "\n", + "# we can also check the outputs coincide\n", + "torch.allclose(model(dataset.get_graph_inputs()), traced_model(dataset.get_graph_inputs()))" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "graph_mlcolvar_test_2.5", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.9.18" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/docs/notebooks/tutorials/adv_newcv_scratch.ipynb b/docs/notebooks/tutorials/adv_newcv_scratch.ipynb index 972775f1..b405b274 100644 --- a/docs/notebooks/tutorials/adv_newcv_scratch.ipynb +++ b/docs/notebooks/tutorials/adv_newcv_scratch.ipynb @@ -53,7 +53,7 @@ }, { "cell_type": "code", - "execution_count": 2, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -72,7 +72,7 @@ "from mlcolvar.cvs import BaseCV\n", "\n", "class AutoEncoderCV(BaseCV, lightning.LightningModule):\n", - " BLOCKS = ['norm_in','encoder','decoder'] " + " DEFAULT_BLOCKS = ['norm_in','encoder','decoder'] " ] }, { @@ -87,7 +87,7 @@ }, { "cell_type": "code", - "execution_count": 3, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ @@ -104,7 +104,7 @@ " with the input 'data'.\n", " \"\"\"\n", " \n", - " BLOCKS = ['norm_in','encoder','decoder'] " + " DEFAULT_BLOCKS = ['norm_in','encoder','decoder'] " ] }, { @@ -136,12 +136,12 @@ }, { "cell_type": "code", - "execution_count": 4, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ "class AutoEncoderCV(BaseCV, lightning.LightningModule):\n", - " BLOCKS = ['norm_in','encoder','decoder'] \n", + " DEFAULT_BLOCKS = ['norm_in','encoder','decoder'] \n", "\n", " def __init__(self,\n", "# 
================================================ LOOK HERE 0.0 ================================================ \n", @@ -165,7 +165,7 @@ " Available blocks: ['norm_in', 'encoder','decoder'].\n", " Set 'block_name' = None or False to turn off that block\n", " \"\"\"\n", - " super().__init__(in_features=encoder_layers[0], out_features=encoder_layers[-1], **kwargs)\n", + " super().__init__(model=encoder_layers, **kwargs)\n", " \n", "# ================================================ LOOK HERE 0.0 ================================================ \n" ] @@ -185,19 +185,19 @@ }, { "cell_type": "code", - "execution_count": 5, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ "class AutoEncoderCV(BaseCV, lightning.LightningModule):\n", - " BLOCKS = ['norm_in','encoder','decoder'] \n", + " DEFAULT_BLOCKS = ['norm_in','encoder','decoder'] \n", " \n", " def __init__(self,\n", " encoder_layers : list, \n", " decoder_layers : list = None, \n", " options : dict = None, \n", " **kwargs):\n", - " super().__init__(in_features=encoder_layers[0], out_features=encoder_layers[-1], **kwargs)\n", + " super().__init__(model=encoder_layers, **kwargs)\n", "\n", "# ================================================ LOOK HERE 0.0 ================================================ \n", " \n", @@ -224,21 +224,21 @@ }, { "cell_type": "code", - "execution_count": 6, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ "from mlcolvar.core.loss import MSELoss\n", "\n", "class AutoEncoderCV(BaseCV, lightning.LightningModule):\n", - " BLOCKS = ['norm_in','encoder','decoder'] \n", + " DEFAULT_BLOCKS = ['norm_in','encoder','decoder'] \n", " \n", " def __init__(self,\n", " encoder_layers : list, \n", " decoder_layers : list = None, \n", " options : dict = None, \n", " **kwargs):\n", - " super().__init__(in_features=encoder_layers[0], out_features=encoder_layers[-1], **kwargs)\n", + " super().__init__(model=encoder_layers, **kwargs)\n", "\n", " # ======= OPTIONS ======= \n", " # parse and sanitize\n", @@ -283,7 +283,7 @@ }, { "cell_type": "code", - "execution_count": 7, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ @@ -291,14 +291,14 @@ "from mlcolvar.core.transform import Normalization\n", "\n", "class AutoEncoderCV(BaseCV, lightning.LightningModule):\n", - " BLOCKS = ['norm_in','encoder','decoder'] \n", + " DEFAULT_BLOCKS = ['norm_in','encoder','decoder'] \n", " \n", " def __init__(self,\n", " encoder_layers : list, \n", " decoder_layers : list = None, \n", " options : dict = None, \n", " **kwargs):\n", - " super().__init__(in_features=encoder_layers[0], out_features=encoder_layers[-1], **kwargs)\n", + " super().__init__(model=encoder_layers, **kwargs)\n", "\n", " # ======= OPTIONS ======= \n", " # parse and sanitize\n", @@ -425,7 +425,7 @@ }, { "cell_type": "code", - "execution_count": 10, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ @@ -442,7 +442,7 @@ " with the input 'data'.\n", " \"\"\"\n", " \n", - " BLOCKS = ['norm_in','encoder','decoder'] \n", + " DEFAULT_BLOCKS = ['norm_in','encoder','decoder'] \n", " \n", " def __init__(self,\n", " encoder_layers : list, \n", @@ -465,7 +465,7 @@ " Available blocks: ['norm_in', 'encoder','decoder'].\n", " Set 'block_name' = None or False to turn off that block\n", " \"\"\"\n", - " super().__init__(in_features=encoder_layers[0], out_features=encoder_layers[-1], **kwargs)\n", + " super().__init__(model=encoder_layers, **kwargs)\n", "\n", " # ======= OPTIONS ======= \n", " # parse and sanitize\n", @@ -625,7 
+625,7 @@ ], "metadata": { "kernelspec": { - "display_name": "pytorch", + "display_name": "graph_mlcolvar_test", "language": "python", "name": "python3" }, @@ -639,14 +639,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.10.8" + "version": "3.9.18" }, - "orig_nbformat": 4, - "vscode": { - "interpreter": { - "hash": "1cbeac1d7079eaeba64f3210ccac5ee24400128e300a45ae35eee837885b08b3" - } - } + "orig_nbformat": 4 }, "nbformat": 4, "nbformat_minor": 2 diff --git a/docs/notebooks/tutorials/adv_preprocessing.ipynb b/docs/notebooks/tutorials/adv_preprocessing.ipynb index dd0b4a7d..766467b8 100644 --- a/docs/notebooks/tutorials/adv_preprocessing.ipynb +++ b/docs/notebooks/tutorials/adv_preprocessing.ipynb @@ -247,7 +247,7 @@ "source": [ "from mlcolvar.cvs import RegressionCV\n", "\n", - "model = RegressionCV(layers=[2,10,10,1], \n", + "model = RegressionCV(model=[2,10,10,1], \n", " preprocessing = pca ) \n", "\n", "# the preprocessing can also be saved later, like in:\n", diff --git a/docs/notebooks/tutorials/cvs_DeepTDA.ipynb b/docs/notebooks/tutorials/cvs_DeepTDA.ipynb index a2ee41e7..47ba9b2e 100644 --- a/docs/notebooks/tutorials/cvs_DeepTDA.ipynb +++ b/docs/notebooks/tutorials/cvs_DeepTDA.ipynb @@ -7,7 +7,7 @@ "source": [ "# Deep-TDA: Deep Targeted Discriminant Analysis\n", "Reference papers: \n", - "- *Deep-TDA*: _Trizio and Parrinello, [JPCL](https://pubs.acs.org/doi/full/10.1021/acs.jpclett.1c02317) (2021)_ [[arXiv]](https://128.84.4.34/abs/2107.05444).\n", + "- *Deep-TDA*: _Trizio and Parrinello, [JPCL](https://pubs.acs.org/doi/full/10.1021/acs.jpclett.1c02021)_ [[arXiv]](https://128.84.4.34/abs/2107.05444).\n", "- *TPI-Deep-TDA*: _Ray, Trizio and Parrinello, [JCP](https://pubs.aip.org/aip/jcp/article/158/20/204102/2891484) (2023)_ [[arXiv]](https://arxiv.org/abs/2303.01629).\n", "\n", "[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/luigibonati/mlcolvar/blob/main/docs/notebooks/tutorials/cvs_DeepTDA.ipynb)" @@ -81,18 +81,10 @@ "execution_count": 1, "metadata": {}, "outputs": [ - { - "name": "stderr", - "output_type": "stream", - "text": [ - "/home/etrizio@iit.local/Bin/miniconda3/envs/mlcvs_test/lib/python3.10/site-packages/tqdm/auto.py:22: TqdmWarning: IProgress not found. Please update jupyter and ipywidgets. 
See https://ipywidgets.readthedocs.io/en/stable/user_install.html\n", - " from .autonotebook import tqdm as notebook_tqdm\n" - ] - }, { "data": { "text/plain": [ - "" + "" ] }, "execution_count": 1, @@ -160,7 +152,7 @@ { "data": { "text/plain": [ - "DictModule(dataset -> DictDataset( \"data\": [4002, 2], \"labels\": [4002] ),\n", + "DictModule(dataset -> DictDataset( \"data\": [4002, 2], \"labels\": [4002], \"data_type\": descriptors ),\n", "\t\t train_loader -> DictLoader(length=0.8, batch_size=0, shuffle=True),\n", "\t\t valid_loader -> DictLoader(length=0.2, batch_size=0, shuffle=True))" ] @@ -200,7 +192,7 @@ "outputs": [ { "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAdMAAAF4CAYAAAAPJROAAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/NK7nSAAAACXBIWXMAAA9hAAAPYQGoP6dpAAEAAElEQVR4nOydd5hU9dmG7zO9bu+VumwDpDepKipib0nsoonRaBJLvphmTIwm+fLFEsVeohK7iF1QQIqAFIFl6WVh+7J9dvrMOd8fk/NzZneBXUBp576uuVh2z5QzsPOctz2vpCiKgoaGhoaGhsZhozvWL0BDQ0NDQ+NERxNTDQ0NDQ2NI0QTUw0NDQ0NjSNEE1MNDQ0NDY0jRBNTDQ0NDQ2NI0QTUw0NDQ0NjSNEE1MNDQ0NDY0jRBNTDQ0NDQ2NI0QTUw0NDQ0NjSNEE1MNDQ0NDY0jRBNTDQ0NDY1jxpIlSzj//PPJyspCkiTee++9Q97nyy+/ZMSIEVgsFvr168dTTz313b/QQ6CJqYaGhobGMcPtdjN06FAef/zxHh2/Z88eZsyYwcSJE/nmm2/4zW9+wx133ME777zzHb/SgyNpRvcaGhoaGscDkiQxd+5cLrroogMe8z//8z+8//77bNmyRXzvlltuYcOGDaxYseJ7eJXdYzhmz3yMkGWZmpoanE4nkiQd65ejoaGhccQoioLL5SIrKwud7ugkHH0+H4FA4LBfT+fPV7PZjNlsPuLXtWLFCqZPnx7zvbPPPpvnn3+eYDCI0Wg84uc4HE45Ma2pqSE3N/dYvwwNDQ2No05lZSU5OTlH/Dg+n4/cxDgafcHDur/D4aCjoyPme/fddx9//OMfj/i11dXVkZ6eHvO99PR0QqEQjY2NZGZmHvFzHA6nnJg6nU4g8p8uLi7uGL8aDQ0NjSOnvb2d3Nxc8fl2pAQCARp9QRZdOAqHUd+r+3YEw0ydt7rLZ+zRiEpVOke9arXyWGYbTzkxVd/suLg4TUw1NDROKo62mDjNehzG3smE9N8s83f1GZuRkUFdXV3M9xoaGjAYDCQnJx/15+spp5yYamhoaGj0DEmSkHS9E+jvOjocN24cH3zwQcz35s+fz8iRI49ZvRS00RgNDQ0NjWNIR0cH69evZ/369UBk9GX9+vXs27cPgHvvvZdrr71WHH/LLbewd+9e7rzzTrZs2cILL7zA888/z913330sXr5Ai0w1NI4xiqKImk/0pJokSeKmoXEskHTfpm17c5/esGbNGqZOnSr+fueddwJw3XXX8dJLL1FbWyuEFaBv3758/PHH/PKXv+SJJ54gKyuLxx57jEsvvbR3T3yU0cRUQ+N7QlEUZFnucusJOp0OvV4v/tQEVuN7Qdf7NC+9PH7KlCkczO7gpZde6vK9yZMns27dut69ru8YTUw1NL4jVPEMh8OEw+EeCWe0SEZ/wHQWXoPBgMFgQKfTacKq8Z0hHYaY9lp8TxKOac30oYceYtSoUTidTtLS0rjooovYtm3bIe93PPoyamhARADD4TB+vx+Px4PP5yMYDAohlCQJvV6P0WjEbDZjsViw2WzYbDbsdrv4OvpmsVgwmUwYDAYhnKFQCJ/Ph8/nIxwOH8tT1jiJUdO8vb2dihzT0/7yyy+57bbbWLlyJQsWLCAUCjF9+nTcbvcB73O8+jJqnNrIsozf78fr9eLz+QiFQuJner0ek8mE1WrFarUKcdTrI/N7oVCIQCCA1+sVN/UxFEWJEV/1/gaDQTyv6lSjOYNqHG2i6/a9uZ2KHNM076effhrz9xdffJG0tDTWrl3LpEmTur3PU089RV5eHo888ggARUVFrFmzhn/84x/HvACtcWqhKAqhUIhQKNQlhdtdGlaNWAOBAKFQiGAw2CMBlCQJk8kk7NgMBoMQ2EAgQDgcJhgMEgqFsFgsR81OTkNDo+ccVzXTtrY2AJKSkg54TG99Gf1+P36/X/y9vb39KL5ijVMRRVEIBoMEg7FWa3q9XgidJEniuEOlY3U6nbhFP4daZ1UUJeb/sclkwul0YjKZsFgsIrJVFAWv14vFYhFRr4bGkaDVTHvOcSOmiqJw5513cvrpp1NaWnrA43rry/jQQw9x//33fyevWePUQpZlEQGqSJIUE4UCBINBUS/tHLEajUZR/zQajTF10AM9pxrRqvXXQCBAU1MTZrMZp9OJ0WhEr9eL5/P5fFitVi1C1Thivo/RmJOF40ZMf/azn7Fx40aWLVt2yGN748t47733irkl+NbDUkOjp3QnojqdToiYJEnIsozb7cbj8XQRW4vFIlK00QIny7JI06pRaHSUqh6vPpfD4SAcDtPR0YHH4xHRanx8vGhUihZULeWrcaRokWnPOS7E9Pbbb+f9999nyZIlh9x40FtfxqO19kfj1ENRFFHfVNHpdJhMJlELDYfDQkSj658WiwWr1YrZbBYpX7/fj9vtFg1G0eWHA6E2HcXHx2O329Hr9eLrtrY2AoEAbW1thEIhnE4nFosFr9crns9isZyyDSEaR44kHUZkeor+dzumYqooCrfffjtz585l8eLF9O3b95D3OV59GTVOHrqriaoiGt2B29HRgdfrFcfo9XrsdrtIsSqKgtvtpq2tjY6OjhhRVlHTxKohgzqbGgqFRHrX7/fT2tqK2WwmKSmJhIQEDAYDSUlJdHR00NHRgdvtRlEU4uLihKCqKWK181dDo7dokWnPOaa/Zbfddhv/+c9/mDdvHk6nU0Sc8fHxWK1WIJKmra6u5uWXXwYivoyPP/44d955JzfffDMrVqzg+eef57XXXjtm56FxcqA2/USPmXRO53YnoiaTCbvdLqJQn89HS0sLbW1tMU1HkiSJeVKLxSJGXA4UOarNSx0dHbS2tuL3+6mtraWpqYmcnB
ysVitOpxO9Xk9bWxsejweDwYDdbsdoNIr6quaYpKHx3XNMxfTJJ58EInZS0bz44otcf/31ACeML6PGiY06JxptrqBGompN1OVyxcxAm81mHA4HJpMJRVFoa2ujubm5S7SqrqKy2WxdapjRFoNATBRpNBoxGo3C1KS1tZXGxkYCgQC7d+8mPT2d5ORkbDabeH3t7e2YTCaMRqOYUw0Gg5hMpu/y7dM4SdEi055zzNO8h+JE8WXUODHpLqWripha61RTqer/1+jRFFmWaWpqoqmpKeYx4uLiSExMxG63i6jQ6/XS2toqGojUJqJoDAaDMHdISkoiJSVFpIGTk5NJSEigpqaG9vZ26uvrCQaDZGRkYLfbCQQC+P1+2traSE5OxmQy4ff7xciYFp1q9Bodvbf2OUV73rRiisYpS/R8JnzrVKRGj36/n/b2dlHrNBgMxMXFYTabhYju379fpHL1ej1JSUkkJiZiNBpRFIXW1lYaGhpoaWnB5/P16DW5XC5cLhcNDQ3odDpSUlLIzc0VKd2cnByam5upq6ujublZ1FLj4+PZv38/wWAQr9eL1WoVFwShUEjrKdDoNVpk2nM0MdU45VA7XVURVFO6aoo1HA7T3t4uxE+n0+F0OkUdv6WlRYgWRCLZlJQUEhIS0Ol0uN1uKioqaGhoiIlWJUnC6XSKURZ1nEUdf1EURRg8qGLq9XppaGigoaGBnJwc+vbtK6JUWZZpaGigrq5OePw6HA4hxlarVbgkaWKqcThoc6Y9RxNTjVOKUCgUk1rtnNL1er20t7eLaNVms+F0OoVI1tXVCZE1GAykpaWRkJAAQGNjI9XV1bS2torHNxgMpKamkpKSQnx8/CE7ax0OBw6Hg5SUFPr06YPL5aKyspL9+/dTVVVFW1sbJSUlWCwWUlJS8Pl8tLe3U11dzYABA7Db7Xg8HsLhsIhOA4GAqMtqc6cavUGLTHuOJqYapwSdo9HuRl3UuU2IiGBCQoLoiq2pqRF2lzqdjtTUVJKSkpAkiYaGBvbu3YvH4xHPl5KSQmZmJomJiULA1FEZl8sVY2avuiJZrVZSUlJEBClJEnFxcZSUlNDU1MSWLVtwuVxs2LCB4cOHYzQaycrKwu12EwgEaG5uJjk5GavVSkdHBz6fTzQ9qWMymphqaHw3aGKqcdLTuTZ6sGhUkiQcDgd2ux2ApqYmGhoaRLdtYmIiaWlp6PV69u/fz549e0T3rl6vJysri+zsbCwWCwBut5vKykrq6+tpamqK6fTtDp1OR1paGllZWQwYMEAIa3JyMiNGjGD9+vV4vV7Ky8sZMmQIer2etLQ0amtr2b9/PwkJCUJM1e5kvV4vxFRL9Wr0Bi0y7TmamGqctHR2MJIkCbPZLKLRcDhMW1tbjIG8mor1+XzU1NQI8bNarWRmZmK1WmltbWXXrl24XC4gEsXm5OSQk5Mj7rtlyxb27t1LU1NTzGtS66ZqvdRgMBAIBAgEAqJruK6ujrq6OrZt28b48eNJS0sTr2Hw4MF88803tLa2UlVVRV5eHomJiTQ1NREIBGhvbycxMRGDwSB2nprNZoLBoLb3VKPXaDXTnqOJqcZJiepPq0ajBoMBk8kkxkNUVyE14nQ6nSIabWxspKGhAUVRRKSYlJREIBBg8+bNNDQ0AJEoMi8vT4hoY2Mj27dvZ+/evTEG96mpqWRnZ4vU8IHqpoqi4HK5qKmpYevWrbjdbhYsWEBxcbGIQh0OBwMGDGDbtm3s3buXjIwMTCYTCQkJNDQ0CDE1m80iIlcbp9Tn0EZkNHqKJB1GZHqK/v/SxFTjpEIdA1Frn4DYAar+PNp8Ibo2GggEqK6uFrVPh8NBVlYWer2e6upq9uzZI6K7zMxM+vTpg8lkoqGhgbKyMurr68VzJiYmMmDAAHJzc2PE7GCoNdK4uDj69+/P2rVr2bVrF5s3b6a1tZXJkyej0+nIyMigpqYGl8vFnj17GDRoEE6nk4aGBtxuN+FwGJPJhNvtJhgMioXNqkGEtp5No6dIUu+9dk9RLdXEVOPkoXNat/PcaCgUorW1VYyr2Gw24uLikCSJ1tZWamtrRcdrRkYGCQkJuN1utm3bJlK6TqeTgoICIV7r169n//79QCRSzc/Pp6CggOTkZHGFLsuycEdqbm7G5/Oh1+vR6/VYrVby8vJER7CK0Whk7NixZGdns3z5cmpqati0aRNDhgxBkiQGDBjAN998Q21tLX379sVsNmMymQgEArjdbhFlq4vLdTqd2EyjialGT4mkeXtbM/2OXsxxjiamGicFndO60U1GAD6fj9bWVpHmTEhIwGKxEA6Hqa2tFZ26NpuN7OxsDAYD+/bto6KiAkVRMBgM9OvXj8zMTDo6Oli6dKmwudTpdPTv35+SkhIhYi6Xi927d7Nnzx727t17yA0xiYmJ9OvXj9GjR+N0OsX3c3NzGTt2LMuXL6e8vJy+ffuKWVW73Y7b7aa5uZmMjAxsNhuBQACv1ysuElS/YfV96InrmIaGRu/RxFTjhCd6drRzk1HntK7RaBRbV7xeL1VVVSIlnJqaSmpqKl6vl7KyMhGNJicnU1BQgF6vZ8OGDWzZsgVZlpEkif79+zN48GBsNhvBYJDNmzdTVlbG3r17Y16jwWAgMTGRpKQkbDabiBJbW1uprq6mpaWFtWvXsmnTJiZPnszQoUOFAObn57Nr1y7q6upYs2YNU6ZMQZIkkpOTcbvdNDU1kZGRITqI1TlYvV4vts9Ej+doaPSYw+jmRevm1dA4sejsqxu9UBsQYqUKbXRat6WlhdraWhRFwWg0kp2djc1mo66ujh07diDLMgaDgYEDB5KWlkZdXR1ff/01HR0dQKRmOmzYMBITE/F6vSxbtox169bFWAZmZWXRr18/+vbtS3p6+gFnPP1+P3v37mXlypXU1dUxf/58KisrmTFjhjDaHzVqFB999BE1NTXU1dWRmZlJcnIy+/bto6WlBUVRhJiq56t29IbD4ZiLCw2NnqJ18/YcTUw1Tkg6mzB07tYNBoO0tLSIn6vzl7IsU1tbS0tLCxBpMsrOzgZg8+bNov6ZkJBAYWEhOp2OFStWsGfPHiAiyCNHjiQ3NxePx8PixYv55ptvhKDHxcVRWlrK4MGDiY+P79G5mM1mCgoKGDBgAOvWrWPx4sVs2bIFm83GGWecIR63f//+7Nixg927d5OZmYnT6RRr4fx+v9gMEwwGRScyIKJo9X3T0Ogp2pxpz9HEVOOEo3N9VF05phJdH9Xr9cJ4PhgMUllZKWZH09LSSElJoaOjg82bN+P1epEkib59+5Kbm0tdXR0rVqwQ3y8oKBDp11WrVrFixQqRIk5LS2PcuHEMHDiwSwSqGiZAJA2tevF2RqfTMXLkSOLj45k7dy5r166loKCA3NxcAPr06cOOHTuoqakRTUVWq1VsoIluYtLqpBpHA01Me44mphonFOFwWKRSu6uPqnZ9EBFZ1c7P4/FQWVlJKBQSm1ccDge1tbUirWs2m0UT0Zo1a9i+fTsQ6eAdN24cKSkpbN++nUWLFtHe3g5ERHTixIn069cvZ
tXali1bqKiooKqqiurq6phRHZvNRnFxMUOHDqWgoKDLrtGBAwcyZMgQNm7cyKeffspNN92EJEmkpKSIjt3m5mZSUlKw2WxCTJOSkmKsA6PF9FSd/dM4MrQ0b8/RxFTjhCEYDApR6lwfVRSF9vZ2MSMaXR9ta2ujuroaRVEwm83k5eVhMBjYvn07NTU1ACQlJVFUVITX6+Wzzz4TaeCCggKGDRuGx+Ph3XffZdeuXUBEYCdNmkRxcTGSJBEIBNiwYQPr169n27ZtB3Ub8ng8rFmzhjVr1mCz2bj66qspKiqKOWbq1Kls27aNlpYW9u3bR35+PjqdjuTkZGpra2ltbSUlJQWz2SzeG/V9kWW5SzSqRacaGt8tmphqHPd0bjTS6/WYzeaYOc7oRqNoN6OGhgZRB3U6nWRnZxMOh9mwYYMYh+nTpw/5+fns2bOH1atXEwqFMJvNjB8/nszMTNatW8eSJUsIBoPodDrGjBnD2LFjMRqNtLa2snz5clasWCE6hgHS09MZNGgQubm55OTkEB8fL0ZVampq2LhxIxs3bqS1tZVnn32WCy64gMmTJ4tzMpvNFBYWsmHDBsrLy8nPzwciNV5ANEJFWyN2977BqetIo3HkaGnenqOJqcZxTedGo87zo+FwmObmZmHUkJiYiMViQZblmE0vycnJpKen43a7KSsrw+/3o9frKSoqIjExka+//pqdO3cCESGcMGECwWCQN998U4y55OTkMH36dFJSUmhvb2fevHmsWLEixgR/9OjRnHbaaWRkZBzwnPr370///v05//zzefvtt1m1ahXz5s3D4/EwY8YMcVxpaSkbNmxg27ZtnHnmmZhMJjGDqqayDyamGhpHipbm7TmamGoct6jLslWxirYFhMh8aXNzs5ijTExMxGQyEQ6HqaysFJFiZmYmSUlJNDU1sXnzZsLhMFarldLSUiRJ4vPPP6exsRGAwYMHU1payrZt25g/fz5+vx+DwcCUKVMYNmwYfr+fTz75hMWLF4uUc//+/Zk0aRIlJSW9chcyGAxceeWVZGRkMG/ePD7//HOKioro27cvEBmtUY0ZGhsbycrKEuMv0TVY9b0CxHslSVLM11qUqnE4aN68PUcTU43jks4duxaLJUaoAoEALS0twh5PNZAPBoPs27cPn8+HTqcjJycHp9NJdXU1O3bsACA+Pp7S0lJaW1tZsmQJPp8Pk8kkNrTMnz+fjRs3AhEhPu+880hMTGTDhg289957ItrNz8/n/PPPp3///od9npIkMWXKFGpqali9ejULFy5k1qxZ4mcpKSnCmCErKytm3CX6T71eL7x31b+rX+t0Ok1MNQ4LLc3bczQx1TjuCIfD+P1+0YVqsVhiRkn8fr8wKjAajSQmJqLX64X5QTAYRK/Xk5+fj8ViYdeuXVRWVgLf1jJVkwRZlklISGDSpEkEAgFeffVVUWMdN24cEyZMoKWlhWeffZYtW7YAkWal888/P8alCBCCvWvXLnbv3k0wGMRkMmE2m8nPz+e8884jOTm523M+88wzWb16NeXl5TQ2NpKSkiKea+/evTQ3NwN0EdPoZefRqV5NTDWOBlqat+doYqpxXBE9+tK5YxciM6Rqp2306IvX62Xv3r1iY0p+fj4Gg4EtW7aIlWnq/GhZWRmbNm0CInXQ8ePHU1FRwUcffUQgEMBmszFz5kzy8/NZtWoV7733nqixnnHGGZxxxhlinKW5uZnly5fHePV2R3l5OfPnz+eMM87gwgsvJCkpKebnaWlpDBgwgJ07d7Jt2zYhpnFxccC3DUedu3LVWrHqdgQI16TuolQNDY3vBk1MNY4boj12dTodFoslJpLyeDwixWqxWEhISECSJDwej9gharFYROdrWVmZEN7CwkJSU1NZsWIFFRUVAGLWc8WKFSxfvhyIiOsFF1yALMs899xzbN68GYB+/fpx5ZVXikXdFRUVvPfee6xatSpmZ+rAgQMZMGAA/fv3x263iy0uixYtYsuWLXz22Wd8/vlCbvzx7ZwxeXTM+ffp04edO3dSVVUlvtc5klRrpepIjPp+qQvAIdKkpRrcQ0RMoyNYDY2eoqV5e44mphrHBdEzpJ1HXwDcbrcwSrBarWLUpKOjg3379qEoCjabjby8PGRZZuPGjbhcLnQ6HSUlJTidThYtWkR9fT2SJDFmzBhyc3OZN2+eqKWOGDGCKVOmsHPnTl599VU6OjrQ6/Wcd955Ypfonj17ePPNN/nmm2/Eaxs0aBATJ05k7NixYnSlM5MmTaK8vJw/Pfgw4XAHT87+F2NGPBlzfE5ODkC3YqoKtiqeamSsRvFms1m8f0ajMSblq6V5NQ4XLc3bczQx1TjmRAtpZ49diBXSaDMGl8tFZWUliqLgcDjIzc0lGAyyYcMGPB4PBoOBIUOGYDAYWLBgAa2trRgMBiZNmoTD4eC1116jvr4evV7P9OnTKSkpYf78+cyfPx9FUcjMzOSaa64hMzMTl8vFG2+8wRdffCFquePGjePCCy8UkfChKCkpIbdwMjs3fIrFEGThwoVccMEF4ufqOI3aWQzE2BDCt+JpsVjE2JD699bWViAitNHpX/hWjLXIVKM3qIvle3ufUxFNTDWOKYcS0o6ODjFTabfbhbl7e3u7aCpyOp3k5OTg9/vZsGGD6M4dOnQosiyzYMECOjo6sFgsTJ06lVAoxKuvvorL5cJms3HxxReTkJDAM888w7Zt2wAYO3YsF198MUajkUWLFjFnzhxRt2z0WPnBD37ATT86u9fn+7ffXMV/3rDz0ftv8sUXXzBz5kwhcKq/cHRUqY73qBGs+l44HA68Xq/w6LVYLDFCGy2mWr1U43DR0rw9RxNTjWNCZ1ejzmYMECukDocDh8Mh7AHVVGhcXBw5OTl4vV42bNiA3+/HYrEwdOhQfD4fCxcuxOfz4XA4OOOMM2hsbGTu3LkEg0GSkpK47LLL8Hq9PPzwwzQ1NWEymbjssssYNWoUTU1NPPPMM2zYsAGArKxsPt8QoN1vZn9r8LDOW6/XcfklM1i44APq6+spLy9n8ODB//1ZVwMGVcCjl45D5AJCFVqbzQbERq3q1waDISYqPVWjBo3DQ0vz9hxNTDW+d7oT0s5m752FVHX+iRbShIQEsrKy8Hq9rF+/XnTiDh06FJfLxaJFiwgEAiQmJjJ16lT27t3Lhx9+iCzL5ObmcvHFF4v6qN/vJykpiVmzZpGVlcXSpUt58cUX8Xg8GI1GrrjiCmbMmMHny7eweUc1N14xCYjUMKurq9m4aRuPPf8+iUkpvPDIPcTHxx3w/C0WC+PGjWPhwoWsX79eiGn0+6OmktX0ttPpJBwOi/ckLi5OdCnb7XaCwaBYtWYymcRxRqMxZkRGQ0Pju0ETU43vlc5C2nl9GvROSD0eDxs2bCAQCGC32xk6dCitra0sWrSIUChESkoKU6dOpby8nAULFgCRhqHzzjuP5cuXM2/ePBRFYcCAAVx33XUYDAZmz57NkiVLgIi70a233ip2np49qZTxw/JZsmQhn3/+OWvWrImJJGv3w4wZiygqKuT8Cy7k
/JnndSti6uOp3cYQm8JVXYvU+mlKSoqYrTWbzdhsNiG0cXFxwuBfdUhS07zq6jnQxFSj92hp3p6jianG98aRCGl7e3uPhLS5uZnFixcTDodJT09n0qRJrFmzhmXLlgEwbNgwpk6dygcffCAEc9y4cVx66aXU1NTw8MMPU1NTgyRJXHbZZVx00UUi/VpfX89LL73Exx9/LM4BIpGhzZHAvnoPRjwYFD+bN29m8+bNPPz0e3w292nMpthftcTERCBWTKPFEaC1tVUYUCQmJoquY9X4Ifp49T2z2WwxCwGizRw0MdXoLZJ0GGneU1NLNTHV+H7oiZBG7yKNFlK1axciVoBqavdgQpqZmcnEiRP56quvWLVqFQDjx49n9OjRvPLKK8IucObMmUybNo2vv/6a2bNn4/f7SUxM5Pbbb6e4uBiIGDO8/PLLotYKEQOIM888kzPOOIPc3FwkSWLlN7swGvRkp1q46zd/Y8/WVQRatrB67XpOHzcy5lw7G9YDohtXFdP6+nogEpXqdDrhzJScnIzb7RaexHa7XRxrtVpjRmSiTR40MdXoLVpk2nM0MdX4XjiUkHo8HhFp2e32mFVjqpDGxcWRnZ2Nz+frIqQtLS1CSLOyspg4cSJffvkla9euBSL7QQcPHsyzzz7Lzp070ev1XHXVVQwdOpS3336bd955B4gY3f/sZz8jPj6ecDjMvHnzePrpp4XoBaV4ZPsALvrRtUweO4i05G9ro0ajnrqGNoaV5PG/D93HVdfdgr99H++8/XYXMfV6vcC3zUOA2K2qjsiokXh2djaKolBXVwdELBFVe8GEhARhv6i+d6qxhclkiqmXas1HGr1FE9Oeo4mpxndOIBA4qJB6vV4hADabTYy/eDweYcigjr+oS7j9fr9oNmppaWHRokUxQrpo0SJhrHDWWWcxcOBAZs+eTWVlJWazmZtuuom8vDweffRREbnOmDGDq666Cr1ez7Zt2/j73/8u/HgLCgoYMHgyz727CTwS9zz0BtkZiSx76zdIksS+6iYu++kTKIrChBEDeOahG6h2J5DCPlatXMb+/ftJTU0V5xzdWKRSXV0NfFtPVcU0JyeHlpYWscEmJSVF2CEmJSWJrl51GYAamaobdECLSjUOE91/b729zynIKXraGt8X3Y2/ROP3+0V602q1CkMGn8/H3r17URQFu91OTk6OMGTw+Xxi/KW9vb1LajdaSM8991z69evH448/TmVlJXa7ndtuu4309HTu/9OfWLVqFbICZ517Cddeey06nY7XX3+dm2++mS1btmC2WLn5x7fy/PPP84uf/pCBfb/dU9rm8oivdXpJ1IqWr93JR4s2cOaU8QSIA0Vm4cKFMeetXjyoKd1wOCwiUzWNrXbr5uTkUFtbC0Q8fBVFEe9ZYmKiGJ9xOBxCSHU6HQaDIcZSUEND47tDE1ON74xoQ4buxl/UNWoQiapUi0B1+4ssy10sAj0eD2azmdNOO0143oZCIdLT00VqVxXSGTNmkJubyxNPPEF9fT3x8fHcfvvtmM1m7rvvPnbt3ElIltiyP5VGr4PW1lbuueceHnvsschj5hSw11/C7Lm7UBRw2Cw8cPcl6PUR1exw+6mqi7z+qtpmzjq9BJNBj9lk4I0PVrFk1TYmnz4KiG00AsTC8egoVB3tSU9PZ/fu3QCkpqbicDhiUr7Nzc0oioLFYsFqtcaIabRXL3zrfKSJqcZhISmHdzsF0dK8Gt8JoVAoRkg7R6TBYFCIgtlsFqb1wWCQvXv3EgqFsFgs5OXloSgKZWVldHR0YDQaGTJkCH6/n4ULFxIIBEhNTWXy5MksW7ZM1EjPOeccsrOzefzxx2lqaiIpKYlbb70Vt9vNgw8+SGtrK8nJyWQMOJ3k/X7izH6uuvoaWpqb0OkNZA0YB7Y85Jo9eDx+1I+H0UP7Icvfflh8tHA9lbXNzHlvpfjeb2+byV+e+BCAhSursPJtjVRFNdvv06cPgOjU7d+/PzqdTohp//798Xq9YkQmJydH3Dc1NRWv10s4HEav12Oz2cRxZrM5JsWr1Us1Dgvpv7fe3ucURItMNY460dtfDAZDF2ejUCgkhFTdRypJEuFwWOwjVdeo6XQ6tmzZQltbG3q9niFDhgCwcOFCYbQwZcoUVq1axerVqwGYPn06ubm5QkiTk5O57bbbaG1t5f7776e1tZW8vDz+/Oc/8/tf/pDcRB8vP/t/tDQ3EZ+YQp1cwtc7wny9YQ8SEAiF+duTH4kob2RpX3EuDz8/n9fmfSuk6jmrZxsMR+6jOhJBpKlKTeGqYqraGA4YMIBwOBwjpmpUmpycjNlspqmpCYh0+XY2dVDnSzuLqYbGYaFTDu92CqL9lmkcVaI7S/V6fRevXVmWaW5uRpZlDAYDSUlJYvfmvn37RJNNfn4+er2e7du309jYiCRJDB48GKPRyMKFC/F4PMTFxTF16lQ2bNjAV199BcAZZ5xB3759mT17Ns3NzaSkpHDbbbdRX1/PAw88gNvtZuDAgfzhD38gISGBp556io/nvoyEjI8ELr7qZ4R1dvF6FSAQCPLsy3P5v0eeYObFV7FywYvIbVtRwj78gRCyAjqdhNNh4Xe3n88PLxhDRloCAKnxkdS23mDhnU/W4Pb4Y3apOhwOXC4Xu3btAiJr4SoqKvD7/djtdjIzM0UkmpubS3Nzs9jZ6nQ6Y8RUFWz1PdfqpRpHjHSYt1MQLc2rcdSQZVl8oHe3Rk0VUnU+MikpSawHq66uxuPxoNPpyMvLw2QyUVFRIRpviouLsdvtfPHFF7S3t2Oz2Zg2bRrbt29n0aJFAJx++ukUFBTw+OOP09jYSFJSErfddhtVVVX8/e9/JxAIUFxczK9+9St0Oh3333+/cEVykY1LyuNvTy9AUSKfB0X902iu2UTVjpUQ8vDI/y0Q56L46qBlI5ItC13iachGB+dNGcJNV04G4J0nb+ORFxawr+xj9rXCW/M3s39eNV+t3UFheqTGqdoIlpWVoSgKOTk5pKSksHJlJNItLCyMaUTq06cPe/bsASLjMV6vl1AohE6nw+FwiHEZdaOMVi/VOGIOpwZ6itZMtchU46gQLaQ6na6LkKodqMFgEEmSSEpKEh/ydXV1tLe3I0kSubm5WK1WamtrRUQ2cOBAkpOTWbp0KU1NTZjNZqZNm0ZNTQ2fffYZAKNGjWLo0KE89dRTNDQ0kJCQwG233UZ1dbUQ0qFDh/LrX/+acDjMXXfdxYIFC9Dp9Pzqf35NYu5wkCTCYQVFkQm3bmbjoqeo2rIQQh5MZiuWuFyk+GJ0icPAlATIKJ4qwo0R8Xvn0zXifF97fxVvfvQ1e/57DpLR8d/3RhLG+aqYrl+/HoDTTjuNQCAg6qdFRUXiPUhLS8NkMokUb1paWkxHsGqKAbFbY/R6vVYv1TghmD17Nn379sVisTBixAiWLl160OPnzJnD0KFDsdlsZGZmcsMNN4jfj2OBJqYaR4yiKPh8PmHObrFYughpe3u7SP8mJSWJhqSmpiYRUWVnZ4sIS60h5uXlkZWVxapVq6itrUWv1zNlyhR
cLhcffPABiqIwePBgxo0bx3PPPUdNTQ1Op5Nbb72VhoYGIaSnnXYad999Nx0dHdx6662sXbsWGR37lUEkpBdw3rRILVbxN6HUzkdu3QhyAAwOdMkjCaefRyhpAvrEIejiB2HImo4ufUrkBP1NKHKIqeOKaGmLzHwW9s9Er/jQE0Cv1/Pa7F/xxJ+u4cZLhtHY2IjJZKK4uJjm5mZ27twJRKwOt2zZQjAYJDExkYyMDPGzfv360dDQIJag22w2Iabx8fHiQsZoNKLX67UUr8bR4Xvq5n3jjTf4xS9+wW9/+1u++eYbJk6cyLnnnsu+ffu6PX7ZsmVce+21zJo1i/Lyct566y1Wr17NTTfddKRnfNhoYqpxRBxKSCFiE6gasSckJIgRmfb2duHqk5aWRnx8PB0dHZSXlwORVGbfvn0pKytj9+7dSJLE6aefDsC7775LKBSif//+nHnmmbz88stUVFRgtVq55ZZbaG1t5a9//St+v58hQ4Zw55130traym233caOHTuw2hw0SaX4pQQCwRCvzVtJuHk94drPkQOtmC12dClj0GfPQOccgKTrWhGRLOmgswAKUrCV+UvLue6uZwE4b9pQ/njrRCCyFLx/n0zOmzaU9d+sAyLCabFYWL16NYqiMHDgQJKSkkTUOnToUBoaGkQHc35+vkh5Z2Vl0dHRIerOdrtddAtbrVZkWRbOR5qYahwRusO8Efn9jr6pF9Pd8c9//pNZs2Zx0003UVRUxCOPPEJubi5PPvlkt8evXLmSPn36cMcdd9C3b19OP/10fvKTn7BmzZpuj/8+0MRU47BRFAW/3y8+uC0WS5fOUa/XG7OD02q1iu+rXaqJiYmkpKTg9/vZuHEj4XCYhIQEBg0axO7duykrKwMiqdz4+HjeeustfD4fWVlZzJw5kzfeeIMtW7ZgNBq5+eabCQaD/PWvf8Xn81FaWsrdd99NS0sLt912G3v37sXuiKfCOxC9OYFf/3QGv/3bHBq3fYjSvhVQ0DnyMeSci97ZF+kgLt93//hcRowYBoDsj0TXba5vR2B2bItcFIwYMSJyjCyzYsUKILJ8XJZlvv76a3Fu9fX11NXVodPpKC0tFenePn364PV66ejoQJIk0tPThWmDaicYneKN7uLVOnk1jogjiExzc3OJj48Xt4ceeqjbpwgEAqxdu5bp06fHfH/69OmisbAz48ePp6qqio8//hhFUaivr+ftt9/mvPPOO7rn3wu0BiSNwyYYDIoP7u6ENBAIiA99m80m/HaDwaCwCXQ4HGRmZiLLMmVlZcK4oKSkhIaGBmH1V1JSQn5+Pv/5z39wuVwkJSVx6aWXMn/+fNauXYtOp+P666/HarXyxz/+EbfbTUFBAXfffTfNzc387Gc/o6amhszMTPoMPY8d8zcTDoR46OGXCe9fDmEvSAZ0KWPQ2XPxhw59/v96aQHpusjVdn5WEgk5OTzy+x/y6ntf0dzSLj4IxowZA8CmTZtoaGjAarUybNgwtm3bRnNzMxaLhSFDhgiXpIKCAhRFEZ7EAwcOFFaDqampKIoijBoSExNF1G82m2PsBA0G7ddb4wg5gjnTyspK4fAF3xqJdKaxsVFseYomPT1dZK46M378eObMmcOVV16Jz+cjFApxwQUX8K9//auXL/booV22ahwW0TaB6od4NOFwWLj+mM3mGNu8ffv2EQqFMJvN5OTkALBlyxaR0hw8eDBer5elS5eiKAr5+fkMHjyY999/n4aGBmw2G5dddhlr164VAnTllVeSmZkpDBny8vL4n//5Hzo6OrjjjjuoqakhOzubJ554gqTkiEeu3LGXcN3CiJAa49BnTUdnz+3xexAIhtlTERG82mY/Hzz3CxpbOvjdP97liefexOPxkJqaSmlpKQDz588HYNKkSVgsFtFgMWbMGMLhMJs3bwYikez27dtRFIW0tLSYrTDZ2dnifbXb7RiNRi3Fq/HdIR3GjOl/I9O4uLiY24HEVDxVp/KQWjrqjs2bN3PHHXfwhz/8gbVr1/Lpp5+yZ88ebrnllqNz3oeBdumq0Ws6uxt1joA6z5Kq7kbqCIzP50Ov15OXl4der2fXrl1ilrS0tBSdTsfixYsJBAKkpKQwduxYFi1axO7duzEYDFxyySVUVVXx7rvvAhHbwNLSUu6//34aGhpIT0/nN7/5DcFgkJ///OdUVVWRlZXFE088QUgx8exrXyK3bUVuWQ+AzpaDlDIGSRfr0tQjwpGosG9+5KIgKy0Bi9mI1RfpKpw2bZpYn6a6M02fPp2Ghga2bNmCJElMmDCBDRs2iKvztLQ0EdUWFhZSW1sroniHwyFqp0lJSQQCAWRZFvVqtYtXS/FqnCikpKSg1+u7RKHq73J3PPTQQ0yYMIF77rkHgCFDhmC325k4cSIPPPAAmZmZ3/nr7oz226bRK2RZ7uJuFI2iKLS1tYn5R3WWFGD//v24XC4kSRKzpPX19SKdWVhYiNPpZNmyZbhcLux2O5MnT6asrIx16yKNOzNnziQcDvPKK6+gKApjx45lypQpPPzww+zdu5f4+Hh+85vfYDQaufPOO9mzZw82Rxy3/fw3pKSk8NXaHRg6yoSQSs4CpNQJhyWkStgHoUj37sUzpvDc61+SkZbApy/eTrwx0ml75plnAvDpp5+iKAolJSVkZ2eLiLq4uJiEhAQhtCNHjmT37t0EAgGRAldTvDk5ObhcLsLhMAaDAafTKTbGWK1WJEkSYqqleDWOCt+DaYPJZGLEiBFi5ltlwYIFjB8/vtv7qDPp0aiZmOgdvt8n2m+cRo9RO3chEvl0djeCiFWeekxiYqL4D97e3i6WW2dlZWGz2Whvb2fr1q1AZAQmPT2dNWvWUFdXh8FgYPLkydTV1fH5558DkfRoWloajzzyCIFAgIKCAi699FKee+45ysrKMJvN/OpXvyIxMZG77rqLrVu3YjBZqXD348bfvMZVF+7m38/9C6Uj4jakSxyKFFd42HOYijeSejXZkvj78xHjCKfDQrB1B4FAgIEDB1JcXIzH4xHied5559Ha2iq6Ds844wzKysrweDzEx8dTUFDARx99BEQuLvbv308gEMBkMpGamirmTpOSkmIubGw2W0yKVxNTjaPC92TacOedd3LNNdcwcuRIxo0bxzPPPMO+fftE2vbee++lurqal19+GYDzzz+fm2++mSeffJKzzz6b2tpafvGLXzB69GiysrJ6/fxHA+03TqNH9GQERu04hcjsozoC4/P5ROduUlISCQkJ+P0RWz1FUUhOTqZv377s3LlTzJeOHz8eRVGYN2+eiOiGDRvG448/TltbG+np6Vx//fW8//77LF68GEmSRJv8H//4x0hTkt6IyzSYUMiAIsuxQpo8Gp2z35G9J95IujVoSEVP5II8Ky2efz4fWTR++eWXI0kSCxcuxOv1kpWVxWmnncb7779POBymf//+5OXlCeOJ0aNHU1lZidvtxmKx0K9fP7EBJycnB5/Ph8/nQ5KkmMYjdUesmnrXjBo0jhrfk9H9lVdeSVNTE3/605+ora2ltLSUjz/+mPz8fA
Bqa2tjZk6vv/56XC4Xjz/+OHfddRcJCQlMmzaNv/3tb71/8qOEJqYah0RRFFGbA7oV0mAwKEwE7HY7NpsN+LbhSN1LmpGRgSzLlJeXi87doqIimpqahFH9kCFDSEtL49VXX8Xv95OVlcX06dN57bXXqKqqwm63c/PNN7Nx40befPNNAG688UZGjBjBE088weeff45Or6dBHkjAY8Bg0OGv//pbIU0Zg87RlyNBCQdQPJH0tM4WqZfedu0ZuJsrqK+vJyEhgbPOOotAICAizZkzZ9LR0RHjI1xWVkZ7ezt2u52SkhI+/fRTAAYNGkRraysejwe9Xk9WVpZI9yYkJKDX64WY2mw2FEXRUrwaR5/DMa4/TKP7W2+9lVtvvbXbn7300ktdvnf77bdz++23H9ZzfRdoNVONQxIMBsUHdXcjMLIs09LSgqIowoAdIiJcVVVFMBjEaDSSk5ODJEns3LmT9vZ29Ho9paWlBINBlixZgizL5ObmUlJSwkcffURTUxMOh4OLLrpI7CnV6XTccMMNuFwuZs+eDUQWgJ911lk88dQLzJkzB4DTz7yUgJQAgL9hHYor4iR0NIQUQOnYA0oYqzMFzCkAjBzch1deeQWAH/zgB5jNZhYsWEBLSwspKSlMmjSJhQsXEgwGyc/PZ8CAATFzp9XV1bhcLkwmEwUFBTE7T0OhkIj6k5OT8Xq9yLKMTqcTs6Vq1kDr4tXQ+P45pmK6ZMkSzj//fLKyspAkiffee++gx6vpvM43te6mcfQJhUJiBMZkMnX5oFY9d9Wdmuo6NYg0HKlGA7m5uRgMBmpra6mpqQEizTdms5mlS5fi9XqJj49n3LhxrFixgp07d6LX67n44ovZu3cvH3/8MQCXXHIJKSkp/OMf/8Dv9zN48GCuvvpqnnrhLV59+TkALrj4CszOSHpIbtv6XzOG/6Z2j4aQKgqyK2Ko4Df1IT0lMvbz+tvvs2fPHhwOB5dccgk+n4958+aJ1+12u0VUes4557Bp0yZcLhcOh4PBgwcLc4qioiJcLhcdHR3odDpycnKE52hcXBwmk0lEpXa7vUvjkZbi1ThqaFtjeswxzQe53W6GDh3KDTfcwKWXXtrj+23bti1mGDg1NfW7eHmnPIfq3IVIw5F6TGJioohaXS5XTMOR1WrF5XKxfft2IOLqk5yczJo1a9i/fz9Go5FJkyZRWVnJ8uXLgcgIicFg4NVXX0VRFMaNG8eYMWP4y1/+QmNjI5mZmfz85z+nurqaN155CgnwkMpzH1QTkqvQB6oIqeMviUPJGTCc6y87nWnjishKTyQQDFFR1chHizbwyrtf4fMHe/S+KN5qCHWAZESy59PY0gGKwuZvvgTg0ksvxeFw8M4779De3k5GRgaTJk3inXfeIRgM0rdvX/r27ctzz0XEf+zYsVRWVsZEpaqwqs0UqvlFcnIygUBAXOCojUeqeYaW4tU4qmhbY3rMMf3NO/fcczn33HN7fb+0tDQSEhKO/gvSEHTXudsZn88X03Ckiq3f74+xCkxISCAYDFJeXi4ajvLz86moqOjScPThhx8CkQ0qgwYN4tFHH8Xn89GnTx8uueQS5syZw5YtW7BarVzxwxv465MfsnHZm/j9XjKy+zD2jB8x+9XFyN56QvWRKFByFnDG2Rfz6H1XEee0itdvs5o4rTiP04rz+MHMMVx/z3NU1jQf8r2R2yKRbnxmKW6dkT7ZKQxID7Hp61bsdjs//OEPaWtr44MPPgDgiiuuYP/+/cLN6fzzz2fNmjW43W4SEhIoLS0VddWSkhLa29txuVxiHZ16UeJwOLDZbGIxgM1mQ6fTicYjbbZU46gT5bXbq/ucgpyQl7HDhg3D5/NRXFzM7373O6ZOnXrAY/1+f4zBsrpMWePAqJ67B+vcDYVCMVaBasORLMtUVlYiyzJWq5WMjAwURWHr1q34fD4sFguFhYW0t7eLvZ0lJSVkZGQwZ84cfD4fGRkZTJ06lTfffJOamhocDgfXXXcdq1at4pNPPgFg3OTzuPaeV3EGN2GhlRAmvJZSZr+6mJQ4qNu3HJCRbLmUjJ7BE3++FpvVRIfHx+xXFrJy3S7MZiPnn3kaP7pgLP3z03jx77O44OZH8XgDB35vfPvB3wjo+MNv7yY+IYnhxTnc8pNZAFx11VXExcXx0ksv4fP56NevH2PHjuW5555DURTRXKWWNCZNmsSuXbvweDzYbDYGDhwoVrKp7lDq+5yamkowGBT/n+12e0zjUXeZAw2NI0KLTHvMCXUNkZmZyTPPPMM777zDu+++y6BBgzjjjDNYsmTJAe/z0EMPxZgt5+b23C7uVCXac7fzXlKIiK3acGQ0GmNS7rW1tfj9fgwGA7m5ueh0OiorK2lqakKSJEpKSpAkiSVLlhAOh8nIyGDIkCEsWrSI+vp6rFYrF110EWvWrGHNmjXodDquu+46Ojo6eOaZZwCobnfyxZoGrOF9WGhFRkezVMiWPc0ocpC6LR9H1qeZktCljOW+n1+EzWoiGApz7S+fZfYrC1lXvpcV63bym7+/zYNPRKLhAX3SxXLv7lAUhXBLZKuLztmXPzz6CfnZyXz6yQfU1dURF5/AD37wA6qrq8UA+g9/+EO2b9/Oli1b0Ol0nHfeeSxdupRgMEhGRgZ9+vRh06ZNQGS/aVNTE263G71eT25urohK1Q5p1aTBYrFgMBi0xiON75bvaQXbycAJJaaDBg3i5ptvZvjw4YwbN47Zs2dz3nnn8Y9//OOA97n33ntpa2sTN9VtR6N7ojeQHKjhKNrhKLrhqLW1VURR2dnZGI1G2tra2L17NwADBgzA6XSyevVq2tvbsVqtTJgwge3bt4t5yvPOOw+XyxVjFZiTk8PDDz+M3+/HFpeGW8pkfGkCTiXyb5ldcDo2Z0qkMWj/Sgi2g96KPm0iQ4v7MG74AADe/PBr1pXv7XLOz77+JTv2RAwYbrxiIgZ9978WintvJCqV9EjxxfgDIdas38oLL7wIwN72FNaU7ePFF18kHA4zYsQIiouLRRR6+umnoygKGzduBCJWg+qIUHx8PH369GHPnj1AxMRClmXxfqalpREKhYQPb/TSANAajzQ0jjUnlJh2x9ixY8Wqqu5QTdajbxrdI8uyqJMeqOHI6/WKD3R13hEi9VO1Szc1NRWHw0EwGBTm7WlpaWRlZbFr1y6xm3TChAl4vV4xWzl27FiysrL497//TSgUoqioiKlTp/LCCy9QXV1NYmIiD//vn/n0xZ/x2fuRpqRBpaNZuzNEe4cPuXVTpDkIHfq005EMVqZPKhGv/a2PV3d73oqi8O5nEUeihDgbY4f173qMHBQWhGMnz6R/vz5ces4Idm9eRiDgJ4ADL6ns2bmFTZs2YTQaufbaa1m6dCn19fXY7XamT58unJAKCwuJi4sTDVkjRowQUb3JZCInJ6dLrVStT5vNZoxGI+FwWMz+aileje+EI9hneqpxQtZMo/nmm2+OianxyYZaJ4UDNxxFGzM4nU6xBUKtk6pm7OqasK1bt+L3+7FarRQUFNDe3h5jzJCSksKcOXMIB
ALk5ORw+umn89prr7F//34SEhK46qqrWLZsGUuWLBEOR3a7nV//+te0trZSUFBA1sDxsPlrFF89Sltkf6guZRSSORmAUUMiLkduj5+ybVUHPP9V3+wWX48a0pdla2Iv0OTWTRD2kZyawZoKE7Cf2uo9JAYjUeavfnU32Vl5zHnpMQAuuOACLBaLuFCYOXMm+/bto7KyUlglrlu3DlmWycrKIiUlRTQo9e3bN+a91qJSjWOGVjPtMcf0GqKjo4P169eLhos9e/awfv16YRt17733cu2114rjH3nkEd577z127NhBeXk59957L++88w4/+9nPjsXLP6mIdjg6UJ1UTTmazWbsdrv4WV1dHYFAAIPBQHZ2NpIkUV1dLeqkxcXFSJLE8uXLRZ20pKSEpUuXUldXh8Vi4fzzz+ebb75hzZo1SJLENddcQ3t7O88//zwAl112GUVFRbzyyits2LABm83GAw88QPmOOpSwn/D+SDOT5OgXM0s6ID8NgL3VjYTD8gHPf+e+BvF1/z6xmyqUQCtKeySCvOvuX2MymUGRiQ9HjCDOP/98rrhoOru2fk1zczPp6elceOGFvPfee/j9fvLz8xk6dKiISseMGUNHRwfV1dVIksTw4cOpqKggFAoJlyh15VpcXBxWq1VEpSaTCZPJFDMOo0WlGt8Z2pxpjzmmYrpmzRqGDRvGsGHDgIjZ8bBhw/jDH/4AdPVjDAQC3H333QwZMoSJEyeybNkyPvroIy655JJj8vpPFkKhkOgINZvN3Y5XRNdJ4+Pjhdi2tbWJ/ZrZ2dkYDAZcLhe7dkWs+/r374/T6WT9+vW0tLRgNpsZP348FRUVfP3110BkRMrv9/PWW28BkfnSvLw8/vWvf+H3+ykqKuLiiy+mvLycF154AYC77rqLnJwciguykBu/FjtJdUnDxWs2mwwkJ0aiuNqGtoO+B+0uL25PJDLPSosX31cUmXDTakDhnHPO4dqrLmPx679mxigrOtlDcnIyt912G3v27BGdxjfeeCO7d+9m/fr1SJLE5ZdfzooVK8QozMiRI8WWmMLCQgwGg7AKHDBgAG63W5hdpKenx0SlqruUGpXq9XptHEbju0PiMBqQjvWLPjYc0zTvlClTDroup7Mf469+9St+9atffcev6tSiszFDd0P/B6qTBgIBUSdNSUnB4XCIJdeKopCSkkJ2djY1NTXCpWrs2LEoiiIcjYYNG0a/fv14/PHH8fv99O3bl7POOos333yT3bt343A4+NnPfobP5+P+++8nHA5z1llnceaZZ/HZkjLmvDrn2zppyjgk3bev3277dhmxx/vteNSB8PgC2G1mbNZv76e0bwd/E0gGZv3k5wC4WuopWxvpIL/rrruw2Ww8+OCDwliiqKiIv//970Bk9MVgMAjxPPPMM9mxYwculwuLxUJpaSlbtmwR72FCQoJo2EpMTMRkMsVkBEwmkzYOo/H9IdH7kOsUFVPtkvYUpid10lAoJGp3DodD1EnVRd/qPGlaWiSdumvXLrxeLyaTiUGDBuH3+4X/bEFBAdnZ2cyfPx+3201ycjJTpkxh4cKFVFRUYLFYuPrqq9m5cyfvv/8+ADfffDPJyck8/vjjVFVVkZaWxt13382dD7zOj381m3BTpAtYlzgEyZwY89rNpm+FNRAKH/L9CAT/6z9sjgiUEmhFao+MrRhTRtDaoRAMBnnwwQcJh8NMmTKFKVOmMG/ePCoqKsQ87Mcff0xTUxMJCQlMnz6dzz77LNIsNWgQqampYhRm+PDhtLa20tLSgiRJ9O/fn9bWVnw+HzqdTsyVHqhWqpk0aGgcP5zwDUgah09P66TqPKn6YQ7Q2NgoFvSqBvaNjY0iUlXTl1999RU+n4/4+HiGDRvGpk2b2L59u5i5rKurE006l1xyCTabjSeffBJFUZg4cSJjxoxh5cqVYrzkd7/7HU6nk+176pCb1oASAnMKUtygLufnD4TE1ybDoWcwTcbIr4PPH0RRwpE6bDiEM6UfHlsf/udvb3LHrj7s2LGD+Ph47r77bvbu3SvGeG644QZaW1vF3PMVV1xBeXk5dXV1mM1mpk2bxurVq0XdODc3VzRkqcvS1bJGamoqBoNBuB1ZLBYRlapiajQatcYjje8WrQGpx2iXtacoPamTdnR0EAwGkSSJhIQE8cHt9XppaIg07GRmZmIymQgEAsIaMCcnh6SkJHbv3k1VVRU6nY7x48fjdrvFou+JEyeSnJzMnDlzkGWZoUOHMnLkSF5//XXq6upISkri+uuvx+1289e//hWINCF9sLSSoef+njjd/v/uE9WhTx7VraioNVAgJnV7IGyWSGTu8fqRW8og2Ao6E5PO/hGSJDEo18K///1vIJLejYuL48knnyQcDjNq1ChGjhzJa6+9hqIojBo1ioyMDJYtWwZEShpNTU3U1tai0+kYNWoU+/btw+/3Y7FYyMvLo7GxkVAohMlkIikpiUAgIDIHnWulmkmDxveC1oDUYzQxPQXpSZ00EAiIDtK4uDhxTDgcFr67cXFxxMfHoygK27ZtIxgMYrfb6devHx0dHaJOOGTIEBITE/nkk08IBoPk5OQwatQoPvvsM+rr63E6nVx++eVs27ZNRKk/+clPsNvtzJ49m4aGBrKyshh1+jn8+53ltLa2smLh2wDoEkqQTPGdXz4QiUybWiLnkJnW/TEqcU6rqLHW1NSKTTOjplzBvx64iQUv/xK5+RvC4TDTp0/nzDPP5K233qKiogKn08msWbP45JNPaGhoIC4ujgsuuIBPP/2UYDBIbm4uBQUFrFkTmWUtLS0VzlAQaToKhUJiM0x6ejqSJAnrS6vVisFg0KJSje8fdZ9pb2+nIJqYnmJE10klSeq2ThrtvGOxWITvLkB9fb0Yg1FX59XX14sxmKKiIiRJYuXKlQSDQVJSUigqKhIjT0ajkXPPPZfKykoxKnL55ZdjNBp56qmngIgzUFFxCQ898gpz584FIGQvxmqJmNTLLetB9oMxHim+8KDnu3NvJILOz05BfwBnI4ABeWni6x3lkS5jKW4QI0dPwGjQ8+q/n6G2tpasrCzuvvtuysvLY+q6TU1NIr175ZVXsn37diorK8X5fvPNN/j9fuLj4ykqKmL79u3C9D85OZm6ujoxp+t0OvH5fCIroEalaiZBkiRtO4zG94MWmfYYTUxPMYLBoKiTdmdgD5FlAOp+0vj4byM6l8slxmBycnLQ6/X4fD7hQNWnTx8cDgfbtm2jvr4eg8HA+PHjaW9vZ/HixUCku9XpdIp06PDhwxk8eDBvv/02dXV1JCYmcvXVV/PKu8t4982XAHCTxuZ9AYaX5vP7H58eWcwN/03vHjzVuWZj5Fi7zczgQTkHPG7MsH7i69WrV4E5FV3iUObOX8fc995nwYIF6PV6/vjHPwLwxBNPoCgK06ZNY/DgwcyZMwdFURgzZgyZmZl8+WVkHdvkyZNxu93CJnDs2LE0NDTQ3t6OTqdj4MCBYnepJElkZGSI9xoinrx6vR5FUcR2GC0q1fje0Lx5e4wmpqcQsizH+O52Vyf1+Xwx
YzDqMeFwWDQXJScni40l27ZtIxwOExcXR25uLi6XS5hwDBs2DIfDEZPuHD58OF988QX19fViifbevXvF6rVZs2Zhs9mo2LYaI15kjIydPJPH/3Q1kiTx8fuvAiDZ+yBZUg55zvOXbhJfXz5jVLfHSJLEJWePACL+wl+t+gZ96ngkSUdrUx2PPPJPAKZOv4CSkhKefvppmpubycjI4Nprr+Xtt9+mpaWFpKQkLrjgAj766CNxviUlJcLZqLCwEKfTKWZw+/bti9FopLa2VryvZrMZt9tNOBxGp9MJc4zoWqkWlWpoHH9oYnqKEJ3e1ev13X4gh8NhMQZjt9tjUsC1tbWiOUYdg6mpqaGlpQWdTkdhYSGSJLFq1SrC4TBpaWkMHDiQjRs3sm/fPgwGA+eccw4NDQ1io8rFF1+M1WrlmWeeQZZlRo8ezciRI/liyTq+mB/ZBXrH7bfz9EM3MX1iKZ/N/5xVK1cCOnSJg3t03hu2VLJqfWRu84qZoxlekt/lmJt/MJmBfSMR4QsvvIiSNAbJYMVu1ZFt3k0oGMRHAm8sbGT+/PmsXr0avV7Pz3/+c8rLy1m3bh06nY5rrrmGDRs2UFNTg8lkYsaMGaxbtw6v14vT6WTIkCFs376dcDiM0+kU/ruhUAij0UhqairhcFjUqp1OJzqdTquVahw7tMi0x2hieooQPQZjMpm6HYNpa2tDlmUMBoOo0wFi4w5E0rs6nQ6v1ysirH79+mGz2di5cyf19fXo9XrGjh2L2+0W6d2JEycSHx/PG2+8QTgcpri4mGHDhvH555+za9curFYr119/PQD/fOQxkEMEcHDBBecB8E35Xn58290ASM4BSIZv7QwPxZ8efQ+vL4DRoOflh2/m1qunMawkj3HD+vPgPZfym9tmApEZ2WffXoNkTgFFwezeRNDnwmRx0iINJM4a5t8vvwzAj370I5xOJ2+/HWmEmj59OhaLheXLlwNw1lln4XK5hAHDuHHjaG5uFrVldQZXbTrKzMxEp9PhcrnEKJLVGqkRa1GpxjFDM7rvMdpv5ilAOBw+5BiM1+sVkWv0GEwoFBJpyNTUVKxWK4qisH37dmRZJj4+nuzsbDwej1ijdtppp+F0Opk3bx5+v5+MjAxGjBjB6tWr2bNnDyaTicsuu4yOjg7efPNNINK0k5SUxObNm2mp++8mldPPY/7SzXyxvJzWhl3IviaQ9OgSint1/uU7avjZfa/y8O9/SJzDyq9umdHlmF27dvH7B2Yj2fPBG8CpVGKhFQUdv/7N77n7bx+QY6tCDocZOXIk06dP51//+hc+n4++ffsyefJkXnnlFWRZZtCgQfTt21e4PBUVFREfHy/sE/Pz87Hb7aKOGhcXh9PpJBAIiBR7XFwckiTFRKXdXQRpaHynaHOmPUYT05Oc6PTugcZgwuGwGMNwOp0xFnV1dXWEw2HMZjMpKZEaZX19vXDtGTQoYpawevVqgsEgycnJFBQUsGvXLrZt24YkSZx99tl4vV4++CCSuj3nnHNITEzkhRdeoKOjg7y8PM466yw63D4eeeQRAGRLFtddOYNrfvksoXAYW3ukU1YXNxBJb+n1+/DF8s2ce90/ueHy05k6vojM1ASCwSB79uzkow8/5KU3F/Laq8/RJyeFC6/+NUpLZPxnwrSLmD5tPKPfegdPWxirPY6f/vSnfPDBB1RWVmKz2bjmmmtYtGgRLS0tOBwOzjrrLL7++mv8fj8JCQkMGTKErVu3EgqFcDgc5OXl0dzcjNfrRafTkZGRITIDEBmFUVPsatORTqfT5ko1vn8Opzv3FL3e08T0JCcQCKAoygHHYNQPcUVRMBgMMdtgXC6X+IDPyspCp9Ph9/vZuTOyLaVPnz7YbDb27dtHVVUVkiQxZswYQqGQqIuOHDmS9PR03njjDdxuN5mZmUyaNIl9+/aJY6677jqee2MJDz/xb5KU7cjoaPBn8dr7q+ibm8L2rRtxtdRGlnLHHXwU5mBU17fwwOMf8MDjH6AEWgnXLQQ5gGRJQ5c+mbmfrSPe7MfoLicADB05mYf++EvmzZuHp60Wo9HIH353Lzt37mTp0qVAJN1bV1dHWVkZEFm1Vl1dTXV1tTCraGpqorGxEUmSKCwsJBQKCdOL9PR0jEYjbrebUCgUMwojy3KMB68WlWp87xzO3Kg2Z6pxstE5vdvdh7HP5+s2vdu5e1edNd25cyehUAin00lubi7BYFCYMxQXF5OYmMjKlStpb28nLi6OCRMmUFlZKTpaL7vsMnQ6Ha+88gqKojB69GhKSkpYtGILTuW/JgbFYyge1Jc1ZRXs3NuAKRCx2JMc/Q4rKu2MEmwnXLcI5ACYk9GlTaR0UB6vzV3MO689TSAQYMTI0Tz+8ANs3LBepKJvuOEGHA4Hr732GhCZh83MzBRGE+PGjSM+Pl68H6eddhpWq1WMDqnp3ZqaGmRZxmazkZiYSDgcFqMwTqdTRKDRHrxaVKqhcXyjielJSuf0bncfxtHduw6HIya9W19f36V7t6mpif379wMR03qdTsfGjRvxeDw4HA5KS0tpamoStcEzzjgDo9HIu+++i6IojBgxgn79+rF+/XrKysowGAxcddVVAJw9KgUjXswWK08+/Ec+fOGXeH0BlHAAX2sFQMye0sN+X4Iu5PrFEdMHUwL6tMkU9M/hpsvHkaxsRU+QIDb6D5lOTU0N//rXv1AUhenTpzNhwgRefPFF/H4//fr1Y/r06bz//vvC1Wns2LFiZ2tmZiaDBg1i27Zt4uIjLy+P1tZW3G43kiQJ0wu16chgMIiLlugLIa1WqnHM0Ewbeowmpicph0rvAjHp3WgTe7fbLcwZ1PRuOBwWEVZOTg5Op5Pm5mbhxztq1Cj0ej1ffPEFsizTr18/BgwYwLp166ioqMBkMjFz5kzC4TBz5swB4OyzzyY9PR1ZlvlifsRNyEUmj/17ERBZ7K14KkGRwRgPptitML1FCbQSrv0CJeQBYxyGjClIehPnnzGERZ/8ByNu0JsJxQ1j/MgB/OMf/8Dn81FcXMw111zDG2+8QW1tLU6nk2uvvZYvv/yShoYGrFYr559/Phs2bBA7W8eNGxczOlRUVEQoFKKurg6AtLQ0zGZzTNORuic22qBBr9drUanGsUMbjekxmpiehBxJeleWZZHeTUxMFDXUvXv34vP5MJvN9OnTB0VRWL16NYqikJ+fT1ZWFjt37qSiogK9Xs8ZZ5xBMBjko48+AiJ7PBMSEli+fDlVVVU4HA7OnXEe/357GZfd+Ht2796Ngp4mfwovvBUxh3/4Dz/CEIx0Euvs+UcUnSn+pv/WSH1gTECfMY3UlFSeuP8q9u9axpo1a7DZbLzw7JOs/vAhPvvgDerr60lNTeUXv/gFS5cu5ZtvvkGn03H99ddTVVUlzCnOO+882traxIXFuHHjCIfDMQvSrVarSO9arVaSk5PFVh4Am80mLnrC4XDMGJOGxjFDG43pMafoaZ+89CS9K8tyjDlDdHq3qalJeO+mp6cDkUg12pTdYDCwZ88eGhsbMRgMDB8+nFAoxKJFkYh
y1KhRJCYmsnTpUlpaWkhISGDy5MmEQiExl3nBBRfwzOvLue/huezbHllD1kEGimTAbDTwwL/ep6W5jYArIqaSLeuw3xPZUytqpCZHOvqMaUh6Cz+8YAw7Ny3hiy++QK/X8+CDDzJo0CCeeeYZysvLsVgs3HPPPdTW1gqHposvvhiHw8Fnn30GwPjx40lLSxM7WwsLC8nIyGDLli0oikJSUhJZWVm0tLSI9G52djaSJNHR0SGcjtSmo+io1GAwaPtKNY4xhxOVnpqRqdbNe5IRDAYPmd51uVzIsoxer48xZ/D7/aImmp6eLjxho03Z1YXV6kxpaWkpNpuNVatW0drait1uZ8yYMXR0dIh1azNmzMBkMvH555/T0NBAfHw8QX0ab3/yCSbaMdGBgo60vCHY/VC3v43n3lhC9d6toIRBb42keXuJoigorh3Izd8ACpIlnVFnXMeYYYN48tWFvP7aHAzeSPT4u9/9jtGjR/Puu++yZMkSdDodP//5z7FarTz11FOiWWrkyJG8/PLLhEIh+vbty9ixY/niiy8IBAIkJydz2mmnsWvXLtxuN0ajkcLCQoLBoEjvpqenYzabCQaDMVt5VNEMhUIoSuTDSItKNY452mhMj9Eue08iwuHwIQf8A4EAHo8H+LZGBxHhqa2tRVEU7Ha7MLhvaGigra1NmLIDlJWV4fP5cDqdFBYW4vF4RGQ2efJkzGYzn3/+OT6fj+zsbEaMGEEoFBILvi+88EKe/M8SmlvdJOgjIyKXXXoxX7z+e4YU5YrX+tGnka0ykiW91yleRZGRm9ciN68DFCRHX3Tpk1i1YR8ffLEeU3CfENKf//znnH322Sxbtkx07l5//fUMGjSIZ599Fq/XS35+PpdeeikfffQRra2txMXFMXPmTDZs2EBjYyMmk4nTTz+d5uZmkSYvKirCaDRSVVWFoijYbDaSkpJiZkrNZjMWi0X8G6hRqdZ0pKFxYqFFpicJnZtWujNn6GwMYDZ/uzC7vb29S5dpKBQSdb/8/HwsFgvt7e2iNjhixAj0ej1fffUVgUCA9PR0SkpKaG1tFbZ6M2fORKfTsWTJEhobG4mPj+fMM8+kJZDII899iLG1CQVYuL4D+emPqa5r+fb1hiKiLxm/bY7q0XsRchPevwL8jZH3I+k0cA4S4qT37SVe2QvAj666liuvvJKNGzfy5JNPitd8xhln8PTTT7N//34SEhK48cYbWbVqFbt27cJgMHDRRRexf/9+tm6N7D0dN24cBoNBvDe5ubkkJSWxf/9+Yc4Qnd5V16tFX9Co/36abaDGcYPmgNRjtN/Yk4RQKHTIphWPx9PFGAAiEa2ahkxJSRH337dvH4FAAIvFQk5OZH3ZunXrkGWZrKwssrOzaW5uFo04U6ZMQZIk5s+fTygUon///gwaNAhZlkVUOnPmTEwmE1ddNA6dt4J//WsRNmcq5XtclO9ZKF6TyaDH+18xRf/tPtVDIburkJu+jsyQSkYyCs+i0RsHQPHALCaVWnj/ncjmGRc5nHXuhezatYv/+7//IxwOM3bsWH74wx8yd+5cduzYgclk4qabbqK2tlZE32effTZWq1WsWSsuLiYrK4v169eLMZi+ffvi9XqFOUNmZiYmk4lQKCRmSuPi4kRNO9qg4UBNYxoa3zuH01B0iuY7T9HTPrmQZTkmPdhd00q0MUD0hzhAY2Oj2FyiWgZ6PJ6YpiO9Xk9dXR3V1dVIksSIEZGVZUuXLhWjMPn5+TQ3NwuDhhkzZiBJEmvWrKGurk5Y7QVDYRRFEU08U884i3inldKCbAAkCQKhMIoSEReU8CHfAyXsI9y4Cnn/soiQmpJIKb6Y5x/7PanJTrLSEpgxyimEtHDIeP503/9QvmUn9//pAfx+P4MHD+a2225jyZIlLF++HEmSuPrqqzEajcJnd+TIkRQUFLBkyRJCoRDp6ekMHTqU3bt3097ejl6vp7i4GEVRqKqqEu93fHx8TGbAZDIJI/vOG320URiN4wZtNKbHaGJ6EhDt33qg9GB7e3uXbSRAzOaSjIwMIcS7du1CURQSExPFGIfadFRQUEBcXBx1dXUirTl58mQAPv/8c2RZpqCggH79Igu3P/nkEyAyHvPx4nIKp93LlT/9B9u3b0ev13Pbj69mwyd/5oPnf8H1l03gv/036GyRaFj+7zLw7lAUGbl9O+Hqj8TScCluEBnFF3HfnVcxtCiX+39xEe31m/jPqy8CcO211/L8k/9LVoqVV158ioDfS3xiGnfeeSebNm3i/fcjM68XXHAB/fr149133yUYDJKfn8/kyZP56quvaG9vx2aziTqpKpyFhYVYrVbq6upEV3RmZiaSJOHxeAgEAl3Su9oojMbxiox0WLdTEU1MT3BCoRDhcCRyO1DTis/nw+fzAQduOnI4HCL129LSIgR2wIABSJJERUUFzc3NGAwGSktLAYQ/bXFxMampqbS0tAj3o+nTpwNQUVHBli1b0Ov1TJ8+nYUrthCWZTaXRSz3hg8fTkJCAhCpFd505RSslsiojuToA+gg0Izib4o5JyXkRW4tJ1z1YaTJSA6CKRF9xpnok4bxPz+dycVnj2DJ19t4+qmniFMiloSXXfEjfvKTn9DW1sYrLz2F2RDGGzRw1swfUFtbKwwlJk6cyIQJE5g7dy7t7e0kJiZywQUXsGXLFqqqqtDpdEycOBFFUUTdNCcnh9TUVNra2sT8aE5ODgaDISa963Q6xUVPdK3baDRqozAaxxXyYd5ORbSa6QlM5w/i7tKDiqKIjTCdZ0pdLpdoOlKjJ0VRhJF9dnY2drudcDjMhg0bACgpKcFisVBVVcWePXvQ6XRMmDABgEWLFhEOhxkwYAD9+/cHYP78+QCMHj2apKQkbrj8dBqbXfhqqmiojfjbArw3fx3zl2ziph9Oxu+PpHclvQXJlo3iqSRctxDJmglyCEX2Q6AVMc+mM6FLGIzk7I8kRcTo3v99h48XbeCbrz7ERmTcZ8LUmdz5i5/R3t7OAw88QOP+BhITk7j1up9Q0Cedf/3rX4TDYUpLS7nwwgv5+OOPqampwWKxcOmll9LU1CTeh1GjRpGUlMQ333wj6qT9+vUjEAiIbt6UlBTsdnvMMgGTySQsAyF2lCn630ZDQ+PEQhPTE5iefBC73W5hDBBtGagoCvX19UDEyF5NL9bX1+N2u9Hr9fTp0weAHTt24Ha7sVqtFBZGtrZ89dVXAAwePJjExETcbreolZ555plApO6qdvVOnz6dVet3ceOvXsDjdpNFpB47btw4ZFnmrr+8Tjgs09bhRZL4VieThiPLfhRfA4qnKvbkzCnonP0xOvMJK7ERXTgYYMNXc7HRigJceuUN3P3zm+no6OChhx6iqqqKxMRE7rvvD5jNZh577DE8Hg/5+flcc801rFy5ki1btqDT6bjwwgvR6/XiXAoKChgwYADbt2/H5XJhMBgoKSkBoKqqSrgcqZ7GB0rvyrKs7SrVOK6RFQlZ6d3/y94ef7Kg5ZROUHryQRwOh4UxgNPpjEkhNjc3EwgE0Ov1oukoHA6LhdV5eX
kYjUaCwSCbNm0CIsJpMBioqqqioqICnU7H2LFjAcR4THZ2NgUFBQAsW7YMv99PdnY2gwYN4oZ7nsft8WPmv1GaNZ5p1zzK4pVbmTx6EJIEk8cMIs4ZidyMBj2SwYoufSq6tInoEk9j5OQruPveBzDmnoct/xwS0otIT0vinEml4txOG5RKKpuw0IrBaOTHt94lhPQvf/kLe/bsIS4ujt/97nc4HA6eeuopWltbSU9P5+abb2b79u1COM866ywyMjJYvHixaDgaMWIEdXV1MfOkFouFhoYGMQaTk5Mjxouid8V2l97VtsJoHK9oad6eo0WmJyg9+SCO3kYS3XQUDoeF01FaWpq4f1VVFX6/H7PZLEZhtm7dit/vx+l0itStGpWWlpYSHx9PKBQS9dOpU6cKYV+8eDEQ2R6j0+nok5PC5h015CSDpxFavBY6dH4+XLiB5/9+Iz5/kJ0V9QzIT+Onv3sZf+C/6V5JQrJFOn037AOdw4Wid5KW5KSqroV2t49AMITVbMSouPDVLMaguElMTORvf/sbpaWluFwu/vKXv1BRUSGENDk5maeeeoqGhgYSEhL4yU9+QmNjo1ipNmbMGAYPHsyiRYvo6OjAbrdz+umn43a72b59OxDZ6ZqcnIzL5RJ15uzsbEwmU4z3buf0bjgcFrVubRRG43jlcBqKTtUGJE1MT0CiP4gP5nTUeRuJyv79+wmHw5jNZhITI5tYgsEg+/ZFmnT69euHXq8nEAiI5prBgwej0+moq6ujoqICSZJEVLphwwZcLhfx8fGcdtppQMQ5affu3UiSJGqq7zz5M/ZUNvLy84+xZMk2Th83nP2+ZGZdOQlJkvho4QbufvCNA563BCgKrCuPGC5URRk8jCjtw9mjEnn0kX/S3hZk4MCB/O1vfyMjI4P29nYefPBBKioqiI+P53e/+x3p6ek8++yz7N27F5vNxi233EIoFGLu3LnIskxhYSETJ04UYz0Gg4HJkyej0+nYtGkTsiyTlJREfn4+wWCQ6upqAJKSkoiLi8y1RpszRC8T0JqONE4UDifS1CJTjROCnhjZRzcdWa3WmHELv99Pc3MzEBmFUT/g9+3bRzgcxm63i1rftm3bCAQCxMXFkZ+fDyDqokVFRaILV41Kx48fL16P2tVbXFwsrAmtFhPFA7OEkcGVF05m0qRJ4rWpIqmik6BfXhrBUJi91U3o9DrC4W9/VROcVixWE0YdpOj28I//fQaASZMm8Yc//AGbzUZjYyMPPvggNTU1xMfH8/vf/56MjAxeeOEFdu7cidls5ic/+QlWq5VXX32VQCBAbm4uM2bMYPv27WLt3IQJE4iPj2fjxo34/X6sVitFRUVAJKIPh8NYLBaxHCAQCIgUe3x8fMy/k9Z0pHGioNVMe44mpicYPTFC9/l8IiKKdjqCSIOROgqjNiT5/X4RWfXr1w9JkggEAmzZsgX4NiptaWkRc6VjxowBoLKykr1796LX6xk3bpx4HlVM1eOiaWyM2PylpqYSCIaY/cpCLGYj9/z4HLbsqGHzzhr8gRCyAjf9YDKTRg/i9Q9WMqw4nz88PJdwWOahX13GsNJ8Wpoa+O1vf8uihbvR6XTMmjWL6667TkTRDzzwAI2NjSQlJfHb3/6WjIwMXnnlFbZs2YLRaOTmm28mNTWVOXPm0NHRQXJyMhdffDH19fWsW7cOiIzv5OTksGPHDlpbW9HpdJSUlGA0Gqmvr8fj8Yg6qU6nQ5Zlkd61WCwxKXat6UhD4+REE9MTiJ4YoSuKIuYZ7XZ7TETk9XrFz9QICiJRqSzLxMXFkZSUBMD27dsJBoPEx8eLqHTt2shsaL9+/UhNTQW+Fc0hQ4YI4Xa73SKiGz58eJfXqDbhhMNhPvh8PY+8EBmfyc1M4qKzhzNyaF9eeHMpNouJ8cMHkJWewJ03nQPA0vGRaFCWZd59912eeOIJ/H4/SUlJ3H///cKZae/evTz00EO0traSkZHB7373OxITE/nPf/7Dhg0b0Ov13HjjjeTm5vLGG2/Q3NyM0+nk8ssvx+v1smzZMhRFoX///hQWFlJbWysuOIqKinA4HLhcLnFhkJWVJbyOXS6X6KBWo3L130ZzOtI4kQj/99bb+5yKaGJ6AqFGNAczQvd4POKDXF3sraKOwsTHx4tNJT6fT3Sl9u3bF0mSCAaDolZaWlqKJEn4fD7KysqAiKUeRKJkNXobPXq0eJ7y8nIURSErK0t0CkfjdDqpq6vD5XLhtDsxGHQY9Xo+WriBjxdvFPOuJpOBvOzkLvevq6vjwQcfZM2aNeL1/OEPfxDPVVZWxj//+U+8Xi95eXn85je/wel08p///Id169ah0+m47rrrGDhwIHPnzqW2thaLxcLll1+OwWDg888/F527o0aNor29PabhKDU1lUAgEFMnVUXT5/OJrTwJCQkx9dDOTkdaVKpxvKMcRgOSojUgaRzP9CQ9KMtyjMtO9Ad5R0eHMGhQa6IQcShSFIWEhATRjLRz507RwZuXlwdEmoyCwSCpqakiUi0vL8fj8ZCQkCDGYQDe/SCyxxRjYrfnokawazds4//mlKOTYM7jP+GTxRGxtttMhEIy2ekJvPzOcjLS4pk+sRSPx8N//vMf5syZI7qOb7vtNi655BJxrkuXLuWpp54iHA5TVFTEXXfdhc1m6yKkJSUlfPjhh+zevRuDwcCll15KfHw8CxYswOv1Eh8fz6RJk8RokKIopKSkkJ+fjyzLVFZWEg6HsVqtIsoPh8MxS9ejt/JoTUcaJyJaA1LP0cT0BKEnozAdHR3djsIoiiKafhITE0Wt1ev1imi1b9++QESQ1ai0uLgYnU6HoijC+WfEiBFCyKNritHisHPnTgzAmq0ttLm8PP2fRQzsk87FZ0dSsIWFhaxbt46PP3gX5P6EJT3tHT7u+fE5TBoziG/K9/LP5z5j49YqNm6tAkXmf64byicfviPSqkOHDuXee+8VYq8oCvPmzeP1118HIs1QP/3pT9HpdMyZMydGSAcPHsz8+fPZunWrMGXIzMxk8eLFtLS0YLFYmDJlCnq9no0bNxIMBrHb7RQWFiJJErW1tfh8PvR6vaiTqi5HsixjMBi61Kq1piONExGtAannaGJ6AtCTUZhQKITb7QYikV/0MS6XS5gJqLVOiDQPqWb2appy7969eDweLBaLENg9e/bQ2tqK2WwWHaw+n080KKnjMOrrMEl+ZAWuuuwsnn1tMbNfiaxWG1Hah7zsZK699lq++OIL6uvrSdTpiMsZy22/f5nEBDsfPv9LMtMSeGXuVwT9PoKuChxKLa+8tBKI1CZvvfXWmHnWQCDA008/LYwWzjvvPK666ipkWebll19m48aN6HQ6rrnmGoYMGcKXX34pLg5mzpxJv379WLlyJbW1tej1eqZMmYLdbmfz5s10dHRgNBopLS3FYDDQ2tpKS0tkJCcnJyfmwkSth0aPwYDWdKRx4qJFpj1HE9PjnOj04IFGYQAxhmEymbqkF9WoNCkpSdRa/X4/tbW1ACJtqyiKEMiCggLxXOq+0tLSUhFVbd68mWAwS
EpKijB4ANVOLzJi8+Orz+HDhRHRSkl0kJgQqeHGxcVx//33c8stP8UiN9C6bwWyFE+jR8c7730EwXbG5O7nm/UbUP67fk2RzPzijlu4+OKL6fAEue/huRT0y2DmlGL+93//l127dqHX67nhhhs488wzCYVCvPTSS5SXl6PX67n++uspLS3lq6++EuM9Z599NoWFhWzYsEHMxJ5++ukkJydTUVHB/v37kSSJkpISrFYrXq9X1JdTU1NFN3Rnl6PoyLNz05G29FtD4+RE+80+zoluWjlQejAUCgmDhs5RaXt7O36/H51OF9MMpEal8fHxYl60oaGBlpYW9Hq9qIG63W527doFRFKrKqrF4NChQ7sYQgDCOP/8M05jWEkeCXE2HDYL23fX8eiL85k0ZhCX/+Aa3nr9ZWzsx6ZE7vf8U9tjzs1giSdgyuOOn17LlRdGzB+efeFzXn73KxwmP0s+DtDe3obD4eCXv/wlJSUl+Hw+XnjhBXbs2IHRaOTGG2+ksLCQr7/+mmXLlgGRReZDhw5lx44d4lxGjx5NTk4ODQ0NVFRUAJGLioSEBEKhkHjPHA6HiPAVRaGlpUWY2Hdu+tLWq2mcyGgOSD1HE9PjGEVRRHrwYE0ratOR2WyO+cBWFEWIW3Jysog0g8GgiErVmiMgaqX9+vUT0a3amZuZmRnj4RtdV41Gna9Um5kAcjKSxNePvjifjxZt5OPFZWxe8BemnzGRG279PaGgD7NRoqh/Ojk5OQwfPpzhw4eTm5vbJS06tCiXDEcH+fGttLdHUr/33HMPmZmZdHR08Oyzz7Jv3z7MZjOzZs1i4MCBrFu3Ttgbnn766YwePZrKykpWr14NRGZpBwwYQHt7e8xKtczMTBRFobq6mmAwiNFoFL676nsfCoW6uByp77/WdKRxIvN9iuns2bP53//9X2praykpKeGRRx5h4sSJBzze7/fzpz/9iVdffZW6ujpycnL47W9/y4033nhYz3+kaGJ6HBMKhQ4ZlQYCAbGrtHPTS3RUmpz87YiJ6thjt9vFXGlHR4dYcD1o0CAgIgZq1KbuMIVIB7DX68Vms4nNMiqqmEbPV0YzacwgPl5cxpjT+mExG2nzm2kI9wMdXDp9JP/32x8c9D1xu91sW7+QPgmR5xk7dqxwMGptbeWpp56ivr4+kmb+8Y/Jy8tj/fr1fP755+L48ePHU19fHzNLOnjwYHw+n7AKTE5OFl7E+/fvp6OjA0mSyMvLExclfr9f1KkTEhK6pOC1piONEx1Zidx6e5/e8sYbb/CLX/yC2bNnM2HCBJ5++mnOPfdcNm/eHHPBH80VV1xBfX09zz//PAMGDKChoYFQKNT7Jz9KaGJ6nNI5Kj1Q04paK7VarV1qdd1FpeFwWMxH5ufni8dVTRYyMzOFEDY2NtLY2IherxeNR4BI+xYUFMREW1+t28mLby8lTuKAtd0rZ47hgjOHYTFHzmlAXhrxTisdbj+TRw866HuyefNmZs+eTWNjIzqdjquvvppzzz0XSZKoq6vj6aefprW1lfj4eH7605+Snp7Ohg0bxE7VkSNHMnHiRFpaWvjyyy+RZZmcnBxGjx5NOBxm06ZNBAIB7HY7RUVFSJKEy+US72NWVpaYz412ObJareL7KlrTkcbJwPcVmf7zn/9k1qxZ3HTTTQA88sgjfPbZZzz55JM89NBDXY7/9NNP+fLLL9m9e7cICDpf2H/faHmn4xTVNvBgUU0gEBDNLdG7SiGSfuwuKq2rqyMUCmGxWETdLxwOC4EcOHCgOFZNd/bt2zdGLNR6Yuf/vK+/v4rmtsjraW5u4UBYLd+KS1pKHAP7phOWZf729MfdHu/3+5kzZw5//vOfaWxsJD09nfvvv58ZM2YgSRK7du3iscceo7W1ldTUVO644w4hpJ999hkQGemZOnUqHR0dLFq0SMzMnn766UiSxJYtW0TnrrpqLhAIiGg9MTFR1JYhEvXLsoxerxfG9tFo69U0TgaOZAVbe3t7zE39rOpMIBBg7dq1TJ8+Peb706dPFxuqOvP+++8zcuRI/v73v4u1j3fffbfoHTkWaJHpcUhPo1K1Vmq1WmO6RKOj0qSkJPFhriiKEIfoul9lZaUwb8/OzhbHqp296kJw9ft790YM6TuL6Y8uHMuOrRuANjwed4/PNystEaggPaWrKJWVlfHss8+KjuQpU6Zw3XXXiTna9evX8+qrrxIOh+nTpw+zZs3C4XB0EdJp06bh9XpZuHAhXq+XhIQEMUu6Y8cOmpqa0Ol0DB48GIvFIowZ1EXfGRkZ4jV5vV7xS9vZ5Qh6NsqkoXGyk5ubG/P3++67jz/+8Y9djmtsbCQcDsdYnELE8rSurq7bx969ezfLli3DYrEwd+5cGhsbufXWW2lubuaFF144aufQGzQxPQ6JjkoPNEoRCARE9NM5KnW73fh8vi5RaXNzM16vF4PBECMOu3fvBqB///5CGBobG2ltbcVgMDBgwABxbGtrqzB2V4VXZeyw/jz255v5/e9/z86dO/F4PDE7PNvaPVz+s9m0tXuY8+gtDMiPODH947dXcvXF4ygtyI55ntdff100DSUlJXHjjTcKK0NFUfjiiy/46KOPgEgD0dVXX43JZGLdunWiRqoKaSAQYOHChXR0dOBwOJg2bRomk4nq6mqR9i4sLBRRZrQxQ25urnhfwuGwGIOx2+3dduj2ZJRJQ+NE4EjsBCsrK2OyNtEje93R+aJT/QzsDlmWkSSJOXPmiLLUP//5Ty677DKeeOKJGNOa7wtNTI8zDqdW2llwVZeghISEmJ+popGRkSG+73a7RWdvv379xLFq2jcvLy9GMNQIMSUlpVuhHzBgANnZ2VRXV7Ns2bKY1M3GbVVs3x250vxy1VYhpiajgdFDI8/t9/v5+OOPmTdvnmismj59Oj/84Q/FL0goFOKNN94Q3rwTJ07koosuQqfTsXr1ahYtWgREaqRTp04lHA6zePFi2trasFqtnHHGGVitVpqamkStuG/fvsJmsaWlRdRDc3JyRJr9UC5H0LNRJg2NE4UjMW2Ii4vrtgTSmZSUFPR6fZcotKGhoUu0qpKZmUl2dnZMo2NRUZHIvkWXq74vNDE9zojuAD1YVHqgWqnX6xUdptFRqcfjEXtMoyNKNSpNT0+PEQdVTKOjUvhWTKP9faORJIkzzzyTf//738yfP59p06aJ8xg9tB+XnD2ClnYPF541LOZ+HR0dLFy4kE8//VS8zv79+3PttdeK7mKAtrY2XnrpJSoqKtDpdFxyySVi+fjKlStZsmQJEOnanThxIrIss3jxYhobGzGZTEybNk1sfNm8ebM4d7Vj0Ov1iouLtLS0mPf3YC5H0NVgQxuF0TjRiXTz9tZOsHfPYTKZGDFiBAsWLODiiy8W31+wYAEXXnhht/eZMGECb731lsg0QWTTlboK8VigielxxNGMSuPj42MiSjUqTU5OFhGeoijs2bMHiI1KozfJRH8fEFZ60ULdmUmTJvHmm29SVVXFfffdxz333ENCQgJmk4F//v6H4rhQKMS2bduECKpClZKSwg9+
8APGjx8fI0gVFRW8+OKLtLe3Y7FYuO666ygsLERRFJYsWSKcjcaPH8+ECRNQFIWlS5dSX1+PwWBg6tSpJCQk4Pf7KSsrIxwOk5CQwKBBg5AkKcaYwel0xphcRKd3O7scqciyrEWlGicV35ed4J133sk111zDyJEjGTduHM888wz79u3jlltuAeDee++lurqal19+GYAf/ehH/PnPf+aGG27g/vvvp7GxkXvuuYcbb7zxmKR4QRPT44qerFgLBoMHjEoDgYD4wO8sBGoKJToqbW5uxuVyibqgiiooSUlJXdI0auNNdC20M3a7ndtvv51HH32UXbt2ceuttzJ48GCGDRtGMBjE5XJRX19PWVmZWFcGkZTyjBkzGD9+fBfziVWrVvH222+LRoVZs2aRmpoqaqeq6f6UKVMYPXo0iqKwYsUKqqurhd9uSkoKoVCIsrIyAoEANpuNkpISYVSvGjOYTCays7PFxYyiKLS2tqIoCkajsYvLUfS/DWhRqcbJQxiJcC9rpr09HuDKK6+kqamJP/3pT9TW1lJaWsrHH38srE5ra2vZt2+fON7hcLBgwQJuv/12Ro4cSXJyMldccQUPPPBAr5/7aKGJ6XHC0YhKm5qagIiYRY+y1NfXEw6HsVgsMc5EaldudnZ2TCSl/qftblhaFdPOc5WdGT58OP/3f//Ho48+yo4dO9iwYYMwl4/G6XQybNgwJk+eTHFxcZfz9vl8vPXWW0IsBw8ezI9+9CMsFgvhcJhPP/2U8vJyIFJbPe2001AUha+//pqKigokSWLixImkp6cjy3KMef3gwYPFeUcbM+Tk5HRZqq6mb7tL70JsB68WlWpo9J5bb72VW2+9tdufvfTSS12+V1hYyIIFC77jV9VzNDE9TlCdOw4WlYZCIdGU050HrNo003kht1oDzMrKiom2DjTioo7PdCem0UYEhyIlJYU///nP1NXVsXz5cnbt2oXNZsPpdJKQkEBxcXFMB3Fnamtreemll2hoaECn03Huuecybdo0dDodwWCQefPmCYP6GTNmUFJSgqIorF27lp07dyJJEuPHjyc7OxtFUdi5cyfNzc3odDpKS0tFOqizMUN0mqhzevdg/zYQMavQolKNkwVta0zP0cT0OKCnUanaWGQ2m7tEP62trciyjNlsjhHajo4OXC4XkiTFjMM0Nzfj8XgwGAxkZmaK7weDwRhh6YwqNL0Zjs7IyODSSy/t8fGyLLNs2TI+/PBDgsEgCQkJXHvttWIlnM/n491336WqqgqDwcAFF1zAgAEDUBSF9evXs23bNiDShKReKFRVVYk6cFFRkegCDAaDop7c2ZgBIoPnh0rvKooixFSLSjVOJmQOY5/pKWp0f0wvoZcsWcL5558vIqb33nvvkPf58ssvGTFiBBaLhX79+vHUU0999y/0O6Ync6XhcFjUFzt/qCuKIjpgk5KSYsRYrZUmJyfHRJOVlZVARDCjn7OhoQFZlrHb7d2Ofqi1UlXYjzYtLS089dRTzJ07l2AwSGFhIXfddZcQ0vb2dl577TWqqqowm81cccUVouN406ZNokN31KhRonlq//79oju5f//+wvlJNWZQU+DRFxsQGdNRMwHx8fEHvMhR07uSJGlRqcZJhWon2NvbqcgxjUzdbjdDhw7lhhtu6FHksmfPHmbMmMHNN9/Mq6++yvLly7n11ltJTU3tVeRzPNHbqNRoNHYZfu7o6CAQCKDT6WIiK1mWqa+vB+giFGoqt7NLSfTx3b0WVWDVrt6jhVrnfP/99/F4PJhMJs4///yYjt79+/eLdni73c7ll18uRnTKy8vZuHEjEKnXqivk2tvbhZNTZmZmTNt8Q0ODWJoebcygvh41vWuz2Q4acapRqcFg0NyONE4qtDRvzzmmYnruuedy7rnn9vj4p556iry8PB555BEgkq5bs2YN//jHPw4opn6/P8YTUv2APF4Ih8MoSmQw60BRqSzLIirt3MELiKi0s7VdS0uLWBummkFDRJjb2tqQJCkmxRv9WAcafVEjxO3btyPL8lGJxOrr63nrrbdijCKuvvpqEUFCpClq7ty5+P1+kpOTufzyy0Wn8datW8UC89NOO02Y8nu9XsrKypBlmaSkJAYOHBizOk1t2MrOzu5SA/Z4PIRCIXQ6XbcRuoqiKCIy1dyONE42jsQB6VTjhKqZrlixoosZ8tlnn83zzz8vRKMzDz30EPfff//39RJ7RU+jUo/Hg6IoGAyGLlGp3+8XHb7RggnfpnjT0tJiRE+tHSYnJ3d5PDXijO76jaZPnz6YzWY6Ojqorq7uEtn2ho6ODhYsWMDy5csJh8OYTCbOOeccJk2aFCNMZWVlfPbZZ2LLyyWXXCK6ibdv387atWuBSKdvSUkJgBiBCQaD2O12iouLxXsQXSftbvwnHA4L32OHw3HQCwb1QgjQUrwaGqcwJ5SY1tXVdWuGHAqFaGxs7BJlQWTY98477xR/b29vPyIBOJr0xHpOURSR4rXb7V0EVxU/h8MRI4yhUEhEXp1TvNHdvZ1RO4I7N+KoGAwGCgoKKCsrY+XKlYf1XgYCAZYvX878+fNFTbKoqIjLLrss5oKgsxnDoEGDmDFjhnivduzYIZZ7FxcXM3jwYCASyZeXl4t0sboFRn3M6upqUSftzq6so6NDXLwcbJ4Wvq2X6nQ6LcWrcdLxfe0zPRk4ocQUujdD7u77Kmaz+ZAGy8eKntTafD6fSKd2dvaI3qnZOSptbGwUG0+iU8OKoghLwM4iqz4fdG1yimb8+PGUlZXx1VdfYbPZxCq0Q+FyuVi2bBnLly8XFwjZ2dnMnDlTuBCpBINBPvnkE7EGTnU1Uo/ZuXMnX3/9NRAR4tNOOw1JksQITEtLS8wWGJXm5mbcbreYJ+1u44uaUo+Li+uxQGpCqnEy8n3tMz0eCIfDlJWVkZ+ff8DM3ME4ocQ0IyOjWzNkg8FwUHu74xFZlns05K9+sFut1i4f2C6Xi3A4jMFg6FJLVcdb0tLSYu6n7hXU6/VdBFhRFCGmB7sAKSws5LzzzuOjjz7i888/x+v1cvHFF3dbM/R6vWzatEl02qoXEImJiZxzzjmMHDmyi6C1trby3nvvifnSc845h9LSUvHzXbt2xUSrw4YNE+dYXV0dMwITXe/0+XyiwSo9Pb3bc1RT5iaTqUeztIe6mNPQOJE5mRuQfvGLXzB48GBmzZpFOBxm8uTJIkD48MMPmTJlSq8e74QS03HjxvHBBx/EfG/+/PmMHDnyhJvvU2ulBxvyDwaDwnmnu0gxur4Z/WEeCoVEI1FnQ/rorS+dxU812YdDr0s688wzsVqtvPPOOyxfvpzly5fTv39/IVJNTU00NzdTW1srLhog0lw0ZcoUhgwZ0q34VlZW8t577+H1erHZbFxwwQUx5hG7du1i5cqVABQUFDBixAhx7s3NzezcuROIeApHNzDJskxVVRWKouBwOLpcSEBsVOpwODSB1DjlOZkj07fffpurr74agA8++IA9e/awdetWXn75ZX7729+yfPnyXj3eMRXTjo4O8eEHkdGX9evXk5SURF5
eXhdz41tuuYXHH3+cO++8k5tvvpkVK1bw/PPP89prrx2rUzgsZFnu0ZC/mgq1WCzdCp/68871zaamJhRFwWq1dhFhtY7a2SUJeh9dTZgwAZvNJtyN1FtnMjIyGDx4MIMHD45ZSh6NoiisW7eORYsWIcsyGRkZXHzxxTGR5e7du4WQDhw4kJEjR4rHcrvdwlYwIyOjSy23sbFRROTRvrvRqEYU3Y0fHYjoPacH27+ooXEicjLXTBsbG0Wp6+OPP+byyy+noKCAWbNm8dhjj/X68Y6pmK5Zs4apU6eKv6uNQtdddx0vvfRSF3Pjvn378vHHH/PLX/6SJ554gqysLB577LETbsZUFVKdTnfAcYpwOCw+3LuLStVaqc1m65KOVAUzOjLr/LPu0uLRozmhUKhHac5hw4YxePBgamtrqa+vp66ujmAwSFJSEsnJyaSnp3f7OqLx+/188sknbN++HejaaAQRIV2xYgUQEdJRo0YJ4QoGg2ILTFxcXMwIDEREUk17Z2ZmdjuCpCiKiEoP1XQUjfrvpyiKJqYaGicQ6enpbN68mczMTD799FNmz54NREprhzPmdkzFdMqUKTGjBZ3pztx48uTJwvT8RKTzOMyBiI6SOh+nbjGBrlGpLMsHFMxQKERbW1u3P4NIZKrX6wmHw0Lwe4LBYCA3N/ewOnsbGhqYN2+eaBiaOnUqw4cPjxGl6NRuZyFVO3d9Ph8Wi4XS0tKYXwS1excOvqzY7/cTDoeRJKlXK5yi37NgMHjcNrtpaBwOJ3Oa94YbbuCKK64gMzMTSZI466yzAFi1ahWFhYW9frwTqmZ6MhBtaH+gq5/OUVLnaEfdYqLT6bqIQ3t7O+FwGKPR2O3PFEXBZDIdUDAcDgdtbW20tbUdUHiOBqoh/ZdffikiygsvvLDLeNPOnTtFs1FnIYVIxNra2irM67uL0tX0rvpL0x2qsUd3jV6HwmAwiAsQnU53wtXvNTQOxMncgPTHP/6R0tJSKisrufzyy8WFsF6v59e//nWvH08T0++ZnozDBAKBg0ZJalTqdDq7CLIalXZuSoJv3Z8O5jObnJxMW1sbjY2N39k8rsvl4pNPPqGiogKI+OWee+65XdKrO3bsEOMvgwYNimk2gsi8rGqLWFhY2KWjORgMioar9PT0AzpMwbdiejiRpcFgQJZl0TCmGuNrKV+NE52TOTIFuOyyy7p877rrrjusx9LE9HukJyYNcPBxGFmWhSh2Z6xwMDtA9X4HizhTU1PZvXs31dXVDBs27CBn03sURWHDhg0sXryYQCCAwWBg6tSpYkY0mq1btwpno8LCwi6p37a2NlFjzc/P79K1DBGTD0VRsNlsBzShgNhdpD2pE3eH+u8ZDAYJBoPC0Ukzc9A4kZGB8CGP6nqf45XeNBbdcccdvXpsTUy/R6J3Xh5sA4k669ldI4zb7RazpZ0bkwKBgOjw7W7oWP1Zd/6+Kv3792fVqlVs27aNKVOmHPTY3tDS0sL8+fPFDtWsrCzOOeecbruKy8vLhdducXFxF7H1+/2Ul5ejKAqpqald9rFCJBWuXjwcLL0LxHjrHq4loCRJmEwmJEkiEAggyzI+nw9JkjAajZoJvobGccDDDz/co+MkSfp+xLRPnz7ceOONXH/99d0ukNboSk93Xh6s8Qhio8vOH85q+tdut3cbYamPfbAGm+zsbLKzs6murmb16tUx3daHg9frZcWKFaxbtw5ZljEYDEyaNInhw4d3ES5FUSgrK6OsrAxAjNNEn6csy2zevJlAIIDNZuvinKQ+jmruER8fH+OA1B1H03jBaDSi1+sJBoNitV4gEBA1br1eL4RVE1eN4x1ZOYx9pr08/vtkz54939ljH9Zl+F133cW8efPo168fZ511Fq+//nrMZhaNrqgdvAcbhznUeEZ0ire7VO2hfHV7MvohSRJjx44FYP369aL7t7cEAgHWrFnDs88+y5o1a5BlmT59+nDDDTd063qkzpmqQjp06FCGDBnSRXB27NhBW1sber2ekpKSbuugHR0deDweJEnqNv3bGTX1frTETafTYTabxdiSeq5qXdXr9eLxeEQjWSgUQpblg3a2a2gcC06FfaaBQIBt27b1aoKhOw5LTG+//XbWrl3L2rVrKS4u5o477iAzM5Of/exnJ/TYyndFdFR6sCYYtdYmSVK30ZTH40GWZfR6fbeCqArfgcRUFfRD1QX79etHVlYWwWCQd955R6Sde4LH42H58uU8/fTTLFy4EJ/PR0pKCpdffjlXXHFFt+lnWZZZtWqV8OEdMWJEjH2gSm1trTDpLy4uPqB/sDpTmpSU1KMaaGfjhaOFmuK1Wq1YrVZMJlPMhZQqrn6/Xwisx+PB5/NpIqtxXCAf5u1EwOPxMGvWLGw2GyUlJcLT4I477uCvf/1rrx/viHZGDR06lEcffZTq6mruu+8+nnvuOUaNGsXQoUN54YUXtA+B/xL9gXgwMVUjR4vF0m3tTl0L5nQ6u0RR4XBY1EQP1GDU072bkiRx4YUXYrfbaWxs5Nlnn2XVqlXC2rAzPp+P8vJy3n77bWbPns3y5cvxer0kJCQwffp0rr/+erEHtbvXpDooSZLEuHHjup3xcrlc7NixA4iUGQ7kxaxGfJIk9divWa11RjtTHW3UkRmLxYLNZsNisWAymTAYDF2Wkqszq5rIahxrTubI9N577xUNkdHBy5lnnskbb7zR68c7ogakYDDI3LlzefHFF1mwYAFjx45l1qxZ1NTU8Nvf/pbPP/+c//znP0fyFCcFakR4sCaUaJP57mqaiqIcNMWrCu3BtuSo6cyeNNk4nU4uv/xyYajw5Zdf8vXXX5Oeno7FYsFsNuNyuWhoaBDm8Crp6emMHj2aQYMGHfS5AoEAS5Ysob6+Hp1Ox4QJE7qtwQeDQcrLy8WS7/z8/AM+ZmNjIxCplfZ03lNtHlLF67ueE1VnjDubSyiKgizLQijVr9WfR3cdq+h0ui43rRarcbQ4me0E33vvPd544w3Gjh0b8ztTXFzcrS3qoTgsMV23bh0vvvgir732Gnq9nmuuuYaHH344JqKYPn06kyZNOpyHP6lQPwTh4FGpz+dDURT0en23qUmfzydMAbpLb0ZHrQdC/Q/T04gmLS2NWbNmsXnzZr766itaW1vFbGhnkpOTKSwspLCwsEcRodfrZdGiRbS0tGAwGJg4cWK3+1UVRWHbtm34fD7MZjNFRUUHFItQKCTeh95uEbJarfj9ftxud4/2mB5t1Iak7mrJBxPZ6K+jH0sVVrVDWRNYDY1Y9u/f321PhbqisbcclpiOGjWKs846iyeffJKLLrqo2yv54uJifvCDHxzOw59URDseHSxKU6NSi8XS7T+kGv3Z7fZuH6cnYy8mk4lAINCrZjHVWaioqIh9+/bhdrvx+Xz4/X5sNhtpaWmkpKT0yuygvb2dRYsW0dHRgcViYerUqd1ucYHISrXGxkYkSaKkpOSgUaMqpBaL5ZAdvJ2xWCxYrVa8Xi9tbW2EQqFu0+nfNz0V2WixjY5i1axIdDSsCq2GxqE4mU0bRo0axUcffc
[base64-encoded image data ("image/png" matplotlib figure output) omitted]
JBgMkpqais1mE/OrdrudpKQkTCaTmGv9vq5hDY120XHs1j5aN6+GxtlDeyldg8EQ1TCk1iLVaFXtqDWZTIRCIQ4fPkxDQ4P4vtFoJDk5maSkJPR6PbIs09DQQF1dHU1NTd/bcARECWJtbS0Gg4G0tDTy8vKIiYnBaDTSqVMnDh8+TH19PbW1tWLuNCEhgfr6ejHXqoqp6tB0pJSyhsaR0CLTjqP9dWmcdbTnHBRZFw0Gg7S0tAjx0+v12Gw2zGYziqJQX19PfX29iETNZjOpqakkJCQgSRJ2u52qqirq6urajMmoKV/Ve9dsNotxmEhzhpaWFg4fPozf76e6upqamho6d+5Mfn4+kiSRkZFBKBSiqamJiooKunbtitlsJiYmBrfbjcPhICUlRdRO1WYkDY1jQZsz7TjaX5fGWUN70WhkSldRFFwuFw6HQ3w/Li6OuLg4ABwOBzU1NWJzi8lkIj09HZvNhqIo1NbWUllZGfV49T5qF+7ROmvVjuH4+HjS0tIoKCigubmZ8vJympqa2L9/Py0tLfTo0QOj0UhmZqaYN62urqZTp07ExcXh8XgIBAL4/X4hpuraNy3Vq3EsaJFpx9HEVOOsoL1oVI0KIZxebW5uFrVTtfHHYDDg8/morq4WCxjU+mViYiKyLFNVVUV5ebloPJIkifT0dLKyskS0CmExdzgcombq8XiQZVk0GsXExJCamhrVhJSUlERiYiK1tbXs2rWLxsZGysrK6Nu3L3q9ntzcXPbs2YPL5cLpdBIfH4/FYhGGDjabTVwoyLKsjcloaPxIaGKqcUYTudFFpXU0GtlgJEmS6LxVFEXUJtWoLiUlhdTUVCRJorq6mgMHDohI12QykZOTQ1ZWlhiTaWlp4dChQ6JJKfI42kOv15OZmUlOTg5du3YV3cSZmZnExsayadMm7HY7u3btori4WMyb1tfXU1NTQ1xcHFarNUpM9Xq9iE41MdU4FrTItONoYqpxxqIoSpTBfOtoNBgM0tzcLNK2kXObbrebyspKIZRxcXFkZmZiMploaGhg3759wvXIbDaTl5dHVlaWeOzOnTs5ePCgWN6gotZN1T2ler0ev9+P3+8Xs6SVlZVUVlaye/duhg8fTmJiIgDx8fH07NmTzZs3U1tbS0pKCunp6aSmptLY2Ijf78flchEbGytqsH6/P0pMNTSOBa1m2nE0MdU4I1GjUXX8ymg0YjQaRcrV7XYL16DW0WhNTQ0NDQ1AOKWbmZmJzWbD4/GwefNmmpqaxPc6d+5MdnY2kiRRU1PD7t27qaioEK+rRpXZ2dmkpaWRmJh4xOhQURSam5uprKxkx44dNDU18dlnn9GvXz+Ki4uRJInk5GQ6derEwYMH2bt3LykpKej1ehISEoRRRFxcHCaTCa/XSyAQEDOmsixrdVONY0KSjiMyPUt/vzQx1TijUJ2D1GhT9beN3AFqt9uF+YLJZBICp0aFaio2MTFReD4fOHCA8vJyIUZ5eXnk5+ej1+upqKigrKxMiCyENxl17dqVnJycDvvjqjXSpKQkCgoKWLVqFVVVVaxfvx673c7gwYORJIn8/Hxqamrw+XxUVFTQqVMn4uPjaWpqwuFwoCiKEFO/309sbGzU+TlbP+w0jh1JOnav3bP110sTU40zhtZpXb1ej9lsFuIRCARoamoS34/s1K2vr+fw4cMoioJeryc7OxubzUZzczM7d+4U4pucnExRUREWi4XKyko2bdpEc3MzEI5Uu3btSlFRkUjNQjhKbm5uFg5GgUAAvV6PTqcjLi5OdOFGYrVaGTt2LLt372bt2rXs2bOH1NRUCgoK0Ov1dO3ale3bt3Po0CHy8vJEajcYDOL1ekU0GggEkCRJpH3VzTQaGh0hnOY91prpj3QwpziamGqcEbRO67ZuMvJ4PKJ+qdPphENQMBiksrJSNCDFx8eLtO3u3buprKwUz1dUVERqaipNTU2sWLFCbC8yGAx0796d4uJiLBYLAI2Njezfv5/9+/dTXl7eZpF4a9LS0igsLGTQoEHiOSRJolu3bvj9fjZt2sSGDRuEJ296ejq7d+8mGAxit9tJTEzEarXicrnwer0kJCQACAGN7CjW0ND44dHEVOO0pvXsqLr+TI2+FEURjT0QbhZKTExEp9PhdDqFJZ9a20xKSsLhcLB9+3bxmMzMTAoKCpBlmTVr1rBnzx4gLMrFxcX07NkTs9mMz+dj48aNlJWVUV1dHXWcke5IVqtVbIVpaGigpqaGuro66urq2Lx5MxMnTqR79+7isT179uTAgQO0tLSwadMmke5NTk4WLkyJiYlYLBYhpklJSeJCIhQKaWKqcXwcRzcvWjevhsbpRWtf3dZpXdUhSK2fxsfHi/phXV0dhw8fBsICq0Z85eXl7N+/X3y9e/fuJCUlceDAAdavXy9WrnXq1Il+/foRFxeHw+Fg+fLlbN68WbyWTqcjLy+PLl260KVLFzFO0x5ut5v9+/ezcuVKmpqa+PDDDxk0aBBjx44VKdpBgwaxcOFCdu/eTXFxMTabjZSUFA4fPkxjYyMFBQUiolWPMbKLN/LiQkOjo2jdvB1HE1ON0xJ12XakL25kt67f76epqUmkOJOSkjCbzYRCoSiXosTERLKysggEAlH1z/T0dIqKiggEAixevJiqqioAbDYbgwcPJiMjA7vdzvz589myZYs4jpSUFEpKSujVq1dU48/RiImJoVevXnTv3p0VK1awevVqvvnmG+Li4hg0aBAQbmjKzs6mqqqKffv20a9fP5HKdbvdwvwBiBJ0IMqsQRNTjWNBmzPtOJqYapx2tK6Pms3mKN9Zt9st6qMGg4GkpCThZFReXo7f70eSJLKyskhKSqKpqYlt27aJxqCioiLS09M5ePAga9euxe/3o9Pp6N27Nz179iQYDLJkyRLWrl0rmpny8vIYNmwYnTp1ahOBqsu+AbFmrb0o1WAwMGbMGGJjY1m0aBHLli2joKCA5ORkADp37kxVVRWVlZX069dPzMyqnr6qYEa+FmgCqnH8aGLacTQx1TitCAaDUbZ9reujDodD2P5ZLBYSExORJAmHw0FFRQWyLGM0GsnLy8NisXDw4EGR1o2NjaVXr14YDAZWrFhBeXk5EI42hw0bhs1mY9OmTSxbtkzUU/Py8hg1ahS5ubniGB0OB9u2bePgwYNUVlZSVVUV1YBks9koKSmhpKSEwsLCNnOnAwcOZO/evRw8eJCFCxdyySWXAJCVlQVAc3MzHo8Hq9VKTEwMTqcTt9tNUlISQJuuXU1MNY4XLc3bcTQx1TgtaD0/2ro+Kssyzc3NQmhbj72onbcxMTHk5eUBsHXrVurr64Fwk1FRURFNTU0sX74cl8uFJEkiZdvQ0MBbb70luntTUlIYO3YsXbt2RZIkPB4P69evZ+PGjezdu/eoAma321mxYgUrVqwgMTGR6667jvz8fPF9SZKYPHkyL7zwAgcOHKC5uVk0GNlsNux2Oy0tLVitVsxmM06nU4zARJ4vDQ2Nnw5NTDVOeVrPj0buHYW2jUbqmIiiKFRXVwszhaSkJDIzM/H5fGz
ZskUIZlFREVlZWWzbto1NmzahKApxcXGMHDmSxMREUcdUo9rRo0fTv39/dDodhw8fZtmyZXzzzTdRvru5ubl069aN3NxccnNziY+PB8Kif/DgQTZv3kxZWRnNzc0888wzXH755fTv3188PjExkU6dOnHgwAG2bNnCyJEjgfBFgt1ux+FwkJmZ2Sa12/q8wdnrSKNx4mhp3o6jianGKU3rRiOTySQMCSCc9m1sbBQdq+r8aCgU4tChQyLlm5mZSXJyMs3NzWzdupVgMIjJZKJXr15YrVaWLl1KRUUFEO7UHTx4MHa7nddff110/RYVFTFx4kTi4+Opq6vjs88+Y8OGDeJYMjIyGDx4MH379iUlJeWI76lHjx706NGDCy+8kNdff51t27bx2muv4fV6GTZsmLhf7969OXDgAFu3bmXEiBFIkiREWZ2LPZqYtj6PoAmrxrGhpXk7jiamGqcsamONKgSqMbyK3++nsbFRuBYlJydjMBgIBAKUl5fj9XqF9V98fDzV1dXs2rULRVGIj4+nd+/eeL1e5s+fj8PhQKfTUVpaSkFBAevXr2fJkiWEQiEsFgvnnHMOxcXFOJ1O/ve//7Fy5UrRKdyjRw9Gjx5Nt27djkmsLBYLs2bN4sMPP2Tp0qXMmTOHbt26CSEuKipCkiRaWlpwuVzExcWJ8ZfW22cURYlK7UqSJC5ANAHVOF40b96Oo4mpxilJZMdu60YjCM9Squlbo9FIUlISer0en8/HwYMHCQQCGAwG8vPzsVgs7Nu3TzQUpaWlUVxcTHV1NStWrCAYDBITE8Po0aOJiYlhzpw57N27F4CuXbty7rnnEhMTw8qVK5k3b57YFlNcXMz5559PTk7Ocb9PnU7HhRdeSFVVFXv27GHJkiXMnDlTvK+EhARhRRgXFxc17hL5X71eH9XFq3b5qq+hRaYax4OW5u04mphqnHJEduy2XpsG0aMvkY5Gbreb8vJyQqEQJpOJTp06YTAY2L59u0jV5ufn07lzZ7Zv387GjRuBcHp25MiRNDY28t5772G329Hr9YwbN47+/ftTVVXFiy++yMGDBwHIzs5m+vTpFBUViWNSFIWDBw+yZ88e9uzZw4EDB5BlGbPZjMlkori4mClTprTx4FXf44QJE9izZw9r1qxh6tSpIgJVU9MNDQ3k5+e3EdPI9XKRnsSR92kttBoaHUVL83YcTUw1TilaC6nFYokSAJfLhd1uB8Jm8AkJCUiShNPpFFtdrFar6I7dvHkzzc3Nwuc2PT2d1atXs2/fPiCcSi0tLWXz5s0sXLgQWZZJTEzkwgsvJDU1lYULF7JgwQJCoRBms5kpU6YwcuRIIVg1NTUsW7aMZcuWCcFuj61bt/Lpp59y7rnnct5557UR1e7du5OcnCw8fXv06AGEx2jU9w1ta5/qyI3BYIj6f0VRoiJT9XuamGpo/DhoYqpxyhAIBITHbuvRFwg33ajORTExMdhsNiRJwm63ix2isbGx5OfnEwgEKCsrw+l0CsOFuLg4Fi9eTE1NDZIkMXDgQAoLC/niiy/YtGkTEBa1c889F7vdzjPPPMOBAwcAKCkp4aKLLiIhIQFFUdi+fTtz584Vj4Pv7AcLCgro2rUrZrMZv9+P3W5nwYIFHDx4kDlz5vDJpwv49b0P0LfXd5GtJEl07tyZxsZGKioqhJi2Nl5QLzRUtyP13+prQTg9HNmQpKV5NY4XLc3bcTQx1TjptJ4hbT360tqMQZ0hlSRJLNOGcBSXk5ODz+dj06ZNYhVZSUkJBoOBL774gubmZgwGgxh7efvtt8Xjx4wZw+DBg1m/fj3vvvsufr8fi8XCzJkzKS0tRZIktm7dyjvvvMOuXbuAsDj16dOHUaNGUVpaKtKzrRk7dizr1q3jL489DT43f37kr7z16uwo56bc3FzWr18vuorV549EFUyz2SxGhtR/q6lvtZsZaGMjqK1f0zgWtDRvx9HEVOOk0lpIW3vsthbS+Ph4kSJtbGwU21kSExPJzs7G7XazadMmIYR9+vQhEAiwYMEC3G43FouFcePGoSgKb7zxBi0tLZjNZqZNm0ZeXh7vv/8+K1euBKCgoIArr7ySpKQk6uvreeONN1i1ahUQFvyxY8dy/vnniwXiR0M1qzem9CdQ9w1GnHzzzTdRozDq86hGEtDWGlA1sVcjUTWVq26tAcRqOfU4NQMHjeNFkqRjzmacrdkPTUw1Thqtt760niFVFAW73S66Z202mzCPV1eXQbhJJzMzE6fTKTa3xMTE0LdvXxwOB4sXL8bn82Gz2Rg3bhwNDQ3MnTsXn89HYmIiF110ETqdjn/+859UVFQgSRITJ07k3HPPRVEUPvzwQz744INvu4uh1hXLfffcwpTxpcf8nl998g5eePEVli9dyMKFC6PEVH3vkSla9SJCfd9qvTg+Pl58z2q1otPpoqLUSDGNrJ2erR90GseHlubtOJqYapwUjlVIExISiImJAaLtAVNSUsjIyMDhcLBp0yZCoRBxcXH06dOHxsZGlixZQjAYJCUlhXHjxrFnzx4+++wzZFkmJyeHGTNmUF1dzSuvvILT6SQ2NparrrqK4uJiKioqeO6558SYTG5+Fz79xoU7YKK+OXrOs6OYTQYuu2Q6K5Z9ydatW6murhaeu+0ZMKjmDHFxcSiKErXEXN1wExsbSygUEtG9xWIRY0ORYqoJqcaxoqV5O44mpho/Oa2FtPXWl44KaWpqKunp6djtdjZv3kwoFCIhIYGSkhLRZSvLMpmZmYwePZqNGzfy1VdfAeEZ0alTp/LNN9/w/vvvC3GdNWsWCQkJfPLJJ7zzzjsiyr3mmmsYNWoUgz9bS32jk0vOHwKEx3QqKytZuWYTL/33C4oKuvKvv92F2Ww+4vtPTU2ld+/elJWVsWnTJiGmKqr4wXeRaFxcHC6XSzg9xcbGitpqbGysSP8ajcao7t3IZiStXqqh8eOhianGT8qJCGlDQ4MQ0rS0NNLT02lpaRFCmpiYSElJCZWVlSxfvhxFUcjLy2P48OHCXxfCW1nGjh3LvHnzhLj269ePyy+/HK/Xy9/+9jcxg9qvXz9uvPFGsQbtkvMG09jYyCcff8jChQujunkBtjeWMWnSPEpKSvjZxZcwdszodiPCnJwcysrKRAQJiE5l1TLQ5XKJ85CSkhJVH9bpdEJobTabuJ/VaiUYDAqzC9URCjQx1Th2tDRvx9HEVOMno7Vh/bEKqVojTUtLIy0tjebmZsrKyqKE9NChQ6xcuRJFUejUqRNDhw7lyy+/FKI3ZswY+vfvzxtvvCEEc8qUKUyaNIkdO3bwz3/+k6amJoxGI9dccw0TJkwQYnjgwAFeeuklvvrqq6joMSEhAfQx1DZ6MeKCoJ8NGzawfsMG0ruOYO7rf2sjqOq6tEgxjRRHgLq6OiAsniaTiYaGBiAsrIFAQKyBs9lsYnl5TExMVDOXel6BNqveNDS+D0k6jjTv2amlmphq/DR0REgdDke7QtrY2CiENDU1lbS0NOx2uxDSpKQkevfuTXl5OV9//TWKotC1a1cGDRrE/Pnz2bZtm1
hrVlhYyOzZs9m/fz96vZ7LL7+cAQMGsGDBAl577TVkWSY7O5u77rpLGD9UV1fz4osvsmDBAiGiPXv2ZOLEiYwbN46MjAwURWHxqh1kpNqIM4e4+c6HaK7dxeF9Kzl8uI6MjPSo86FGn2o0CogaqCqmahSenh5+rCquqampYgzGarViMBjEeYuJiRGNSEajMapeqtVMNY4VLTLtOFreR+NHR03tHklIIdxoo3an2mw2IaTNzc0ivZmSkkJ6erro2lUj0t69e3Po0CEhpIWFhQwaNIh58+axbds2dDod06ZNo1OnTjzzzDPs378fi8XCzTffTN++fXnhhRd45ZVXkGWZkSNH8sgjj5Cfn4/f7+c///kPl19+uWha8krJ6NJHceHltzP1/AvJyMgA+Dalqqe23k5+fj6P//0RdOZkJBRef/PtNudEjSrV9wmI6FIdkVFrotnZ2fj9fhGZZmRk0NjYCIQjXI/HI5aBWywWMYsaOW+qpXg1jgdVTI/1djaiRaYaPyrfVyOFsJBGdqlGjoGohgrJyclkZGTgcrlE167abFRRUSFSuwUFBQwcOJCPPvqIPXv2oNfrueCCC0hKSuLpp5+moaEBm83GTTfdRHx8PH/605/YsWMHkiRxxRVXcP755yNJEmvXruXvf/87hw4dAmDAgAFYUkt4f+FuGutD3PHwGwzp15V3nrkVgFUb9vLzu/8NwPRz+vOrG6ZQ708lmUY+/uhD7rjtpqimpMgRFxVVTLOzs1EURbz3vLw8amtrxbab2NhYkR5OTk6OGp8JhUJCQE0mU5SjlIbGMaPj2EOus/S67Sx92xo/BR0RUrfbLVKdqrMRhAVWjcwSExPJzMzE4/GwadMmgsEg8fHxlJSUUFVVxYoVK0Rqt7S0VAipwWBgxowZ2Gw2IaQpKSn88pe/xGAw8Lvf/54dO3YQUnRcdc0vmDZtGqFQiGeffZZf/vKXHDp0iLj4BO67/3c8/fTT3Hv75aSn2sSxNzQ7xf8bDd+J1dzPN3Cgsp7iXv0JYSbg9wizBxU1TaumdB0Oh/hadnY2hw8fFg5OGRkZIjrPzMzE7Xbj8XiQJInExMSo8ZlIS0FJkrTIVEPjJ0L7C9P40QgEAlFzpK2F1OPxCAGJjY0VQup2uzl06BCKomCz2USac9OmTQQCAWJjY+nTpw91dXWia7dz586UlpbyySefiIh0xowZWK1WnnnmGex2O1lZWdxxxx3Y7XYeeughDtfW4gvq2VKbRrPXSnV1NbfccgtvvvkmAElZxex2duelj/cgSRLpqTYevOMCcfz7D9Xj9YWbfeobHYwe3A2dTiIxPoY/Pf0R5ZWN9O/XG/iuHqqiroNT17ft2bMHCAupujIOwovKJUkSFxY5OTnCISkhIUGcLwiLaaRxQ2STlCamGseFpBzf7SxES/Nq/Cj4/X7RVdrakAHCBu2qwFitVuLj45EkCa/XS3l5ObIsExsbS05ODsFgkE2bNuHz+bBarfTt25empiaWLFmCLMvk5eUxZMgQ5s2bx65du4SQmkwmnn32WVwuFzk5Odxyyy0cOHCAxx57DI/HQ15ePvqkPnQKShyu3sfPr34It8uFwWimU8+xHHbFoEi1OJwecdylJZ3F/4dCMl99vZ2X31vKN5sOiK/fed0k/vjPDwHYsK0SPd/VSCG86UUV086dw8+3e/duAAoLCwGi9qk2Njbi8XgwGAxkZmaKzuS0tDQRlVosFoxGY5SYRvrzas1HGseF9O3tWB9zFqKJqcYPTnteu62/r9b8LBaLWKMWCAQ4ePAgoVBIrFFTFIWysjLcbjcmk4k+ffrgdDpZvHgxoVCIrKwshg8fzueff87OnTvR6XRMnz4ds9nMv/71L9xuN3l5edx8883s2LGDJ554gkAgQM+ePfn1r3+N1Wrl1rv/yLvzP0cCUtJz2VaXTnmZAwinn2vq7Lz07lJmXTKatOQ4OuemcKAi3Ax031/exe70Rr0/i+W79+v1y8QSLaaHDh0iGAxitVpJT09HURR27twJhFfCuVwuUT8tKCiIilqDwaCot6akpIi1b/Hx8fj9fjFfajQahZGDVi/VOG50Svh2rI85C9FyPxo/KMFgMKpup64Ki/x+Y2MjiqJgMplITExEkiSCwSAHDx4kGAxiNpvFWMrWrVux2+0YDAb69u1LMBjkq6++IhAIkJaWxqhRo1iyZAlbtmxBkiQuuOACYmJieO6553C73eTn53PLLbewefNmHn/8cQKBAKWlpdx///0YDAb+/Oc/s2lNWEjdUjrTLrmZkBTe/KL7tivR7fbw//72Is/+6znGTJrB3nUfIDv2oMgBIaQ6HSQnxvLUw1dwzqjexFrDzUZpiWFh9QV1fPjFBgLBEFu2bAHCLkxqCrexsRGj0UhRUZEQ1szMTOLi4sRS8vz8fCGeNpsNo9EoIlObzSbEU91cE7kcXEPjuJCO83YWokWmGj8YoVBIpBkNBkObiDQUCtHY2IgsyxgMBpKSkpAkCVmWKS8vx+fzYTAYyM/PR6/Xs3PnThobG9HpdJSUlKDT6Vi4cCFer5ekpCTGjh3L6tWrWbduHRA2X0hMTOSZZ57B5XKJiHTdunU8++yzKIrC8OHDufXWW/F6vdx///2sX78eSaejWemEi0z+8txnAJiMeoryk6jev47D5etB9vHoI4vEe1E81dC4ASk2H11SP2TM3Hj5WC6cNACA/82+nRffXsKOVbtxA0+/8TWuUBn7yg/jqS0DwjtSAZG27dGjB2azme3bt4t/19XV4XK5MBgM5OTkCKMJ1Y9YURTMZjMmk0nUny0WS9S2GS3Fq3HcHE8N9CytmWqRqcYPQigUikorRu4jhXCU1NTURCgUQq/Xk5ycLJZWV1RU4PF40Ol0dOrUCZPJxIEDB4RRQ8+ePYmJieGrr77C6XQSFxfHuHHj2LJli1iXNnHiRLKzs3nuuedwOBxkZ2dz0003sWHDBiGkY8eO5fbbb6epqYlbb72V9evXYzZb+P2D/w9DQleQJGRFQZEDeGrXsWnRcxze9zXIPqyxNozxeegSe6NL7ANGGyghFOd+5Mb1ALw5d6V4v8+/9RXvf7qGhoaw0YKsD8+TSihCLEtKSlAUJcq6MHI/a3FxMfv37wfCUanP5xORaFpaWlRHcDAYRJZlJEnS6qUapyXPPvssnTt3xmKxMGTIENasWXPU+z/55JN0794dq9VKXl4ed999t/gMOhlokanGCSPLsohIdTodZrM56gNcURSam5sJBALodDqSk5NF6rG2thaHw4EkSeTn52OxWKiurhapzW7dupGcnMzixYtpamrCYrEwfvx4ysvLWbhwIQAjR46kqKiIp59+mubmZtLT07n55pspKyvjmWeeQVEUxo8fzw033EBlZSV33nknNTU1hDBSEehGQVFPBpUcYOGKbcieamhchxL4duzFlITOVow/Ng9J0omrTymhB4pzH3LDNyjecOp1wvBeuD0+YqxmehRkMw8HEpCcnMKrz9zHvvI64g0OVn0RIDU1ldzcXPbt20dDQwNms5mePXuKDxD1Q+XAg
QMAdOnSJWrlnCRJQlgTEhJETVadZW29HFxD47j4iSLTd955h3vuuYfZs2czZMgQnnzySSZPnszOnTuFA1gkb731Fvfffz8vv/wyw4cPZ9euXVx77bVIksQ//vGPY379HwItMtU4IRRFwev1oiiKcOBpLaR2u12IbVJSkhiRaWhoEK4+OTk5xMbG0tjYyK5du4BwNJaVlcXq1auprq5Gr9czduxYmpqa+OSTTwDo378//fv35/nnn6euro6kpCRuueUWdu7cydNPP40sy4wdO5YbbriBgwcPcuutt1JTU0OsLZl6qYQAcXh8fr5aWUaofjVy7RLkgBNzTAK69NHos85BF9cJqZVBqSRJ6OPzAQlCHpSgh1f+t5z7/vIeADdeMZZfzAyPxZSWDqSocwaTR/dm7dpvABg6dCiSJAnx7NevH0ajkc2bNwPQt29fUUOOj48nLS1NiGl2dnaUnaDJZBJX5FarFVmWNT9ejR8G3XHeCBuTRN7Uz4D2+Mc//sEvfvELrrvuOnr27Mns2bOJiYnh5Zdfbvf+K1euZMSIEVxxxRV07tyZc845h8svv/x7o9kfE01MNY6bSCFV04utU4qRm09Uw3YImxSo4pCenk5CQgJOp5OtW7eiKAoZGRl06dKFsrIy9u3bhyRJjBo1ClmW+eCDDwiFQnTr1o0xY8bw8ssvU1lZSVxcHDfffDOHDh3iySefJBQKMWrUKG688Ub27dvHbbfdRkNDA3EJaex1dCUhMYXf3nY+l938d3wVn6M4wylVY1Ix+uxz0cVkHzVFOvuRX5CT1yl8Lvxhe7/GZpf4/q4d4Uaj0tLwEnGv18v69eGU8NChQ/H5fKJeOmjQIPbu3YvL5SImJobCwkLRxVtQUEBjYyOBQACTyURSUpLohk5KSsLv90eleNXZXi3Fq3HCnMCcaV5eHgkJCeL26KOPtvsSfr+fdevWMXHiRPE1nU7HxIkT+frrr9t9zPDhw1m3bp0Qz3379vHpp58yderUH/gEdBwtzatxXKjG9WrHqMViaWMM4PV6hbuRzWbDarWKr6smBElJSaSmpuL3+4VxfUJCAt27d+fAgQOUlYWbdQYPHkxCQgKvv/46fr+f3Nxcpk6dyttvv82ePXswm83ceOON2O12HnvsMYLBIIMHD+bmm29m7969/PKXv6SlpYXu3bvjtvZF3lxBY4uLP/39eeS6VaAEQG9BnzYcxZKOP/j95+A3j7yN/tv7dc1LIT23C088dDnPvbEIJehh69at4tgBVqxYgdfrJSMjg4KCAlauXInP5yMtLY2uXbvyzjvvAOFaalNTEw0NDeh0OgoKCtixYwcQ7vD1er34/X50Oh02m02MyqjnV03xtjbJ0NA4Zk5gzvTQoUPC4Qs44o7f+vp6QqGQ8LlWycjIEL/3rbniiiuor69n5MiRKIpCMBjk5ptv5re//e0xHuwPhxaZahwXgUBAfGi3J6SBQECYMsTExAhD90AgEGXKkJWVhSzLlJWVCVOG3r17U19fLyz4evbsSX5+Pv/73/9wOBwkJSUxY8YM5s+fz4YNG9Dr9Vx33XVIksRf/vIXfD4fJSUl3HHHHZSXl3PnnXfS0tJCz549eeqpp7BYY1EUBbllB/LhZWEhNaeiz5qMZGlbnzkSdqeXum9HVYKKmXefvY1la3bx19mf8uwLbwFhYVRnST///HMAJk2aBMCyZcuAcM23vr6e8vJyJEmif//+4kOkc+fOBINBcS6zs7NFVKrO56op3piYGC3Fq/HDIinfzZp29PZtZGqz2aJuRxLT42Hx4sU88sgj/Otf/2L9+vV88MEHzJs3j//7v//7wV7jWNEuXTWOmUhTBpPJ1OZDWx2BUWdJbTabGIE5dOiQSFfm5uYCsHPnThwOBwaDgZKSErxeL0uXLhXuRn369OHDDz+ktrYWq9XKz372M9atWycWe1922WWkpqby4IMP4nQ6KSgo4Fe/+hWHDx/mzjvvpLm5meLiYp588km2761j8aodyI0bUBzh2qwuvggpuX+buuj3oSgyhMKNPwP6FgPQOS8VvU4iJhSuBY8fPx6AXbt2cfDgQYxGI2PHjmXXrl3U1tZiNpsZNGgQS5YsAcKmDQaDQTgkde/eXXT3pqamYjAYRCSqboyBcBRqMBiijO21FK/G6UBqaip6vV6sHFSpra0VG5Ra8+CDD/Lzn/+cG264AQhftLpcLm688UZ+97vfnRT7TC0y1TgmQqFQlClD61lSRVFoampqM0uqKArV1dV4PB70ej35+flCNA4fPowkSfTq1Quj0ciSJUvw+XwkJyczfPhwli1bJvx2Z86cSXV1NR988AEAU6dOpWfPnvz1r3+lsbGR7Oxs7r//fhwOB3feeSf19fUkpmRw169/h9FkYd2WfUiNq74T0qR+6FIGHrOQAuBvBhR0eiO9ehTw349WMaBXJ95+4ucYFTs6nU6I6bx58wAYMWIEcXFxfPnll0A4BRwMBkVKuLS0lB07dqAoCunp6cTFxYkPmdzcXHGRYrVasVgsoh6tpnjVeqmW4tX4QfgJTBtMJhMDBw4UfxMQnhD48ssvGTZsWLuPcbvdbQRTvahXMzM/NdpfnEaHkWU5apa0PSFtaWkhEAggSRJJSUniF76hoUGkKnNzczGbzdTX14s5yqKiIhITE1m6dCktLS1YrVbGjBnD9u3bRZPBlClTkCSJ1157DUVRGDJkCGPHjuVvf/sb5eXlJCQkcP/996MoCnfffTdVVVXoTXHsaMzjZ7e/wMQRxXz89r9QPFWADl3qYHRxnY/7fCiecANVQmonHv3XpwDkZ6ew8qsFQDh9q3bhfvNNuIv3vPPOo7y8nN27d6PT6Rg7dixr164lFAqRm5tLWloaK1asAMLp7crKSmRZJj4+nvj4eNGUlJKSErVIICYmRqTd1Z+PhsYJ8xONxtxzzz1cc801lJaWMnjwYJ588klcLhfXXXcdAFdffTU5OTmiiWnatGn84x//oH///gwZMoQ9e/bw4IMPMm3atJP2u6+JqUaHUDt3of1ZUgh37qppx8gRGKfTKaIr1SLP6XSybds2IFwHzM7OZtOmTVRUVKDT6Rg9ejRNTU2izjhs2DDy8vJ44okn8Pl8FBYWctFFF/Hyyy9TVlaG2Wzm3nvvJSEhgbvvvpt9+/ahN1ppNpQgByV8Pt93Qirp0aWPRGfNOqFzInvCa9FaAjZ0VjAY9MRZ9Xz2WdhF6eKLLwbgs88+Q1EU+vbtS15enmj3HzBgAFarVZg2DB06lF27dhEMBklMTCQjI0PUjfPy8nA4HASDQQwGAzabLWo8RqfTiYyBwWDQUrwaPww/kdH9pZdeSl1dHQ899BA1NTX069eP+fPni6ak8vLyqEj097//PZIk8fvf/57KykrS0tKYNm0af/7zn4/9xX8gNDHV+F5aj8C0niWF8BaYyM5dtdnA7/dH7SVNTk4mEAiwdetWZFkmMTGRwsJCysvLhWftkCFDMJvNvPPOO2IEZujQoTz33HM0NTWRlpbGtddey+eff85XX32FJEn88pe/pEuXLjz88MNs
3LgRs9nCIX83giEJgwF8lSsihHQUOmv7tZgOn5OAHXxhdyMpJrxG7bHfXsL6NUvwer0UFBQwYMAAmpqaWLQobEN4/vnnU1lZSVlZGZIkMWHCBNasWUMgECA9PZ3c3Fw+/DC8baZXr15UV1cTDAaxWCykpqaKtWzJycnIsiwuXGJiYlAURevi1fjh+QmN7m+//XZuv/32dr+3ePHiqH8bDAYefvhhHn744eN6rR8DrWaqcVTUBd+RIzCthTQYDIoOU6vVKjp31YYjdQtMVlY4Ety+fTsej0e4/tjtdjFPVlxcTH5+PnPmzMHlcpGWlsaUKVOYM2cO+/fvx2KxMGvWLPbs2cMbb7wBwM9//nMGDBjAH/7vL3z55ZcYDAZ6DbmAoBTu2vVV/bBCCiDbw+lWW2pXJEMsACVF2bz77rvimCRJ4sMPPyQQCFBUVETv3r1ZsCCcAu7Xrx82m03MnY4YMYJdu3bh9/ux2Wzk5uZy6NAhIGxe4XK58Pl8wkFKrZWqywTUdK9Op9NSvBoaJ4GTKqZLly5l2rRpZGeHh+Pnzp171PsvXrxYGHdH3tThf40fnmAwKD6ozWZzm6K/2nCkKAoGg0GMa6gNR16vF71eT15eHjqdjgMHDgjz+t69ww5BS5cuJRgMkpGRQb9+/fjiiy+oqanBYrEwY8YM1q5dy9dff40kSfz85z8nFArx1FNPCb/dKVOm8MAfn2ThgrAr0g033k5QnxQef2lcj+I+BOi+Te2euJAqckAYPLj1+cTHhpt/np79Mi0tLeTk5DB+/Hjq6+uF5eEll1xCRUWFiErPOeccEZVmZGTQqVMn4dnbu3dvamtr8fv9mM1mMjMzxUJwtQ6timlsbPiCQe2u1qJSjR8UbWtMhzmpf3kul4u+ffty/fXXM3PmzA4/bufOnVHDwO15N2qcOK07d1t/UKsNR8FgUDQcqVFrU1OTaDjKy8vDaDTS0NAQ5bkbFxfH0qVLcTgcxMTEMHLkSMrKyoTgXHDBBTQ3NzNnzhwg3LnbuXNnfv/73+N2u+nevTuzZs1i8+bNLFv4PwAcUi4PP78GRQG9exchR3jpti51CHmde3LtxSMZP6wHWemJ+ANBDlY2MG/RJl77YAVeX6BD50Vx7gvPphriUSwZOFweJCXE+tXhdO7Pf/5zDAYD77//PsFgkJ49e9KrVy+ef/55IGyBGBsbK6LSkSNHsmPHDhGV5uXliYalvLw8PB4PbrcbSZJISUnB4/Egy7Kwb4ycLdXEVOMHRdsa02FO6l/elClTmDJlyjE/Lj09ncTExB/+gDQE39e5C+H29PYajtxut8gWZGRkEBsbi9frFZFXTk4OmZmZbN26tU3DkdoeP3r0aFJSUnj88ccJhUL07duX8ePH88QTT1BVVUVycjITp/yMB//+NusWv4Eiy3Qu7E3PQefz4jtLkZ0HCNZvAECX1J9Jk6fwxIOXY4uziuOPsZpItMXQt0cel04bzPW/eYmDlQ1HPS+KIiO3hPeNxmeW4JEkBpZ0JjZ0iD1bPGRnZzN16lQOHTokZkcvv/xydu7cya5du9Dr9UydOpXly5cTDAbJzc0lOzubjz76CAh78tbU1ODz+TCbzWRlZYl0b2JiIgaDQVykxMbGiqXqoDUeafwIRHjtHtNjzkJOy8vYfv364fP56N27N3/4wx8YMWLEEe/r8/miDJbVgXeNI6NaBcKRO3f9fr84l/Hx8aLhKBgMcujQIRRFwWazkZKSgizLbN26VZi2FxQUUFNTE+VLGxMTw3vvvUcoFKKoqIiBAwfy/PPP09LSQnp6OpdddhkfffQR33zzDQoSI8ZdyNW/fplUeQtGXASIpcrXiZXvLCXB7KbhQHicRrIVU1I6jmf+eBVWiwmn28u/Xl/EqvV7MZuNTJvYjysuGEpBfjov/30WF8x6CpfnyIbciqscQm7QW3j+yT/g9oXo0y2dq39+JQC/+MUvMBgMvP322yiKwuDBgykoKOCxxx4DEP7Cqk3i2LFjxblJTk4mOzub1atXA9CpUye8Xi8ulwtJkkhNTcXn84lMgNZ4pPGjo0WmHea0uobIyspi9uzZ/O9//+N///sfeXl5jB07VqTL2uPRRx+NMlvOy8v7CY/49CSy4ag9IQ2FQqLhyGKxEBsbbsBRd5MGg0HMZrOohe/Zs0c4HPXq1Qufz8eKFStQFIWuXbvStWtX5s2bh91uJykpialTp7JgwQLhuXv99dezd+9e4V27vymBFZtrSZD3Y8RFCAONUjEHKptQgm4a9nwOyEgxOeiS+vLQndOxWkwEgiGuvvsF/vX6ItZvPcjX6/fw27+9zyPPhmutBfnp/OLyMUc8L4ocQm4OdxzrbN345R//y4BenXj9tVdwu93k5Xdm0qRJlJWVsW7dOnQ6HZdddpnYemO1WpkwYYJwburevTtxcXFiS06/fv2oqKggEAhgsVjIzMykri7cMZyYmIjRaBRr12JiYtDpdCIq1RqPNH4UTsDo/mzjtBLT7t27c9NNNzFw4ECGDx8udtk98cQTR3zMAw88QEtLi7ipKTON9ok0AmjPc1fdTSrLMnq9XjQcAdTV1YkoKjc3F71ez+HDh6mqqgKgR48emEwmYfiekJDAoEGDWLt2bXguVK/nwgsvZP/+/aJx59JLL8VkMvHPf/4TRVEwxmWjmDPpnhkkhsMgSeQVT8BgjkWRg4QOL4eQF4wJ6FKH0q9nPkP6dQXg3U/WsH7rwTbv+YW3l7B7f3gO9rqLR2LQt/9nodh3QNAJeitSfBHNDg8rV6/jww/DKdqNFbHsOVDLf/7zHwDOOecckpKS+PTTsKHD5MmTqamp4cCBA+j1esaMGcOGDRtQFIXs7GxSUlLE72eXLl3weDxRUanf7xfiqTYeaY5HGhqnBqeVmLbH4MGDhStMe5jN5jaGyxrt07rhqL1Ix+l04vf72zgcOZ1OEUVlZ2djsVjweDzs3BmuL+bn55OSksKWLVuora3FYDAwatQoamtrWbp0KQATJkzAZDKJkZcRI0bQt29fnn76aVpaWsjPz+fFZx/lzceuYM57rwFQ1HsUa3c58PuDyA1rwd8IOhP69FFIOiPnjOotjv29T79p930risIHC9YCkBAfw7ABhW3vE3Qht4RNJsadeyld8rO48bIxfPHp+4CChxSCukRWrVxCVVUVCQkJXHzxxXz22Wc4nU4yMjIYNmyYiEpLS0vx+XxUVFQgSRIDBgzg4MGDhEIh4uLiSEtL4/C3JvpJSUmYTKaoqFSv1xMKhcTsryamGj8KJ7DP9GzjtH/bGzduFPOLGsdPZJ30SA1HPp9PfKAnJCSI+wSDwaiVaomJiciyzLZt28RKtc6dO1NbWxu1Us1sNvPxxx8jyzLFxcWUlJTwxhtv4HK5yMnJ4cILL2TOnDls27YNs9nMXXfdRSgU4sEHH8Tv9zN8+HCIKwBA8pajuA4AErq04UjGOABK+3QBwOX2Ubaz4ojvf/WGfeL/S0s6t/m+3LgBlBCF3XqxdIuHAxX1fPjxJ2zYsAGTycRDv7+P1/9xDV9+EXY/uvzyy2l
ubmb58uUAzJgxgw0bNtDU1ERsbCxDhgxh3bp1ABQWFmI0GkUEX1BQIPbARkal6oVOe+MwWuORxo+ClubtMCdVTJ1OJxs3bhR2avv372fjxo1iY8YDDzzA1VdfLe7/5JNP8uGHH7Jnzx62bNnCXXfdxaJFi7jttttOxuGfMahCerQl36FQKGqlmmqsrigKlZWVhEIhMRMJ4WW9ap20R48eBAIB4TlbUFBA586dWbBgAXa7ncTERCZPnsyiRYvYu3cvJpOJa665hr179/L+++8DMGvWLLKzs3nqqac4dOgQ6enpPPjgg2zecQgl4MBfG2440iX2jpolLewUHps6WFlPKCQf8RzsKT8s/r+gc/ReRdldjeKuACTufeD36HQSOiWA3hWOuq+55hpmTB3J10sW4PP56N69OyNHjuS9994TNoJZWVmsXLkSgDFjxlBeXk5TUxMmk4k+ffqwd+9eFEUhOTmZxMREYb+YnJyM0WgU7lJWqxWDwYAsy6Ku3d6Fj4bGD4I2Z9phTqqYrl27lv79+9O/f38gbHbcv39/HnroIQCqq6uFsEK4MeZXv/oVJSUljBkzhk2bNrFw4UImTJhwUo7/TCEYDIqO0PaENLJOqvrCqjQ0NOB0OkWdVKfTUV9fLyLV4uJizGYzq1atwuPxEB8fT2lpKWVlZezcuROdTse0adOorq5m/vz5AFx00UVYrVaefvppFEVh5MiRjB49mq+++oqPP/4YSZJ4+OGHSUhIoFuXDEJ1K0EJIlnSkRJ6iGMzmwykJIUj1OrDLUc9B3aHB5c7HJlnpyd8997lIHJjOAX8i1/cwNRJo/ji9V8zsjgIsp+uXbty1VVXsW7dOr755hv0ej033HADa9as4eDBg5jNZqZPn86XX35JIBAgNzeXgoIC0cncp08f3G43DQ0NSJJEYWEhTU1N+Hw+9Ho9aWlp+Hw+EZXGxYXfjxaVavwkSBxHZHqyD/rkcFILLWPHjj3qupxXXnkl6t/33nsv99577498VGcXkXXS9naTQthcQ62TJiYmig9vt9sdZWBvsVjw+XyiTpqTk0Nqaiq7d+8W86QjR47EbreLedJRo0aRlJTEY489hizLDBgwgEGDBvH000/T0NBARkYGs2bNoq6ujr/+9a9A2BShR8/evPPJalZ+NRf8TaAzoUsdGrVKLTbmu2XE7qOMu4j7eP3ExpiJsX73OLlpEwRdoI/hplvCGZDayr3s2b4BnU7Hb3/7W/x+Py+99BIQ3gpjs9n45JNwh/DUqVOpr69n9+7dSJLEpEmT2LRpE36/n8TERAoKCkQ3ek5ODmazWTQhpaWlodPpomqlalSqXvxoUanGj4rEsYdcmphqnG20rpO218Ti9/ujDOzVD+9QKCSWVttsNpKSwvZ9O3fuJBAIEBsbS0FBAXa7XdQG+/XrR0JCAm+99RaBQID8/HwGDx7Mu+++S0NDA0lJSfzsZz9j1apVrFy5Ep1Ox+23347FYuF3v/sddrtduB5dfOu/2LBxE0rLDgB0KYOQDDFRx242ffd+/MEQ34c/8G0Xszn8HmVPNcq3Dkox2cOpqXMRYzEJUb/00kvp2bMns2fPprGxkczMTGbOnMlbb72Fx+MhNzeXQYMGiYvCQYMGoSgKe/fuFf+uqqrC7XZjNBrp1KkT9fX1BINBTCYTSUlJR41K9Xr9SVmCrKGh0RZNTM9SOlInlWVZ1EktFouokwLU1NTg9/sxGo1inrSiooLGxkYkSaJnz54ArFixglAoRGZmJsXFxaxYsYLq6mrMZjNTp05l69atrFq1CkmSuOKKK/B6vSLKmz59OkVFRcydO5dVq1aFG30eegij0cjuAzWE6tcAClJsPrrYtvPDPn9Q/L/J8P0zmCZj+M/B6wughHzI9eE6bExqT/z6VB74+/sML3BTX19Pfn4+N954Ixs2bBCe0bfccgvbt2+nrKwMnU7H5ZdfzsqVK3E4HCQkJDB06FAx8lNQUEBCQoKI4rt27YqiKDQ0hB2YMjIykCRJXMjExsai1+uRZVmMw2hRqcaPjmba0GG0y9qzlO+rk0LYLSoUCqHT6aLmSe12uxDZnJwc9Ho9LpdLRFwFBQXExsayZcsWGhsbMZlMDBs2jJqaGrEd5pxzzkGn0wkjhrFjx1JQUMCLL76I0+mkc+fOzJw5k5qaGp555hkAbrrpJv764hIGnPcw2dZaCDSH07vJA9p9j2oNFIhK3R6JGIsJCKeE5Ya1EPKAIZ5h42cAkJfo47PPPkOn0/G73/2OQCDACy+8AIStMbOzs/nf/8IewZMmTUJRFBGVn3POOezZs4eWlhbMZjP9+/dn7969hEIhbDYbmZmZ1NbWoigKMTExxMfH4/V6hdtR66hUM2nQ+EnQGpA6jCamZyGyLH9vndTj8Qjf3cTERJFODAQCYoQjJSWF2NhYZFlm+/btohs1JyeHhoYGsZ900KBBmEwm5s2bh6Io9OjRgx49evDBBx/gdDrJzMxk6tSprFixgnXr1mEwGLjtttvQ6/X87W9/w+12U1JSgsFWwPwlW2ior2H7+nCEp0segKS3tPs+ff4gjc0uALIimorawxZvFTXWqsqKbzfNSEy/7GZe/vuNfPjcLziwbTEAV111Fb179+bll1+msbGRrKwsLrnkEt577z1cLhdZWVmMGTNGmDX07t2blJQUMRY0YMAAXC6XmCMtKirC7XYLe0a1I1qNSuPi4tDpdFFRqclkOur70dD4QVD3mR7r7SxEE9OzDHXRNxy5ThoMBmlpCXe/xsXFCd/dyDEYi8UitvWUl5fjdDoxGAx0796dUCjEypUrURSFTp060blzZ5YvX05jYyOxsbFMnDiRzZs3s2HDBpEOdblcvPrqq0C4mzcxOY37//AUq1atQkHCmNIPk9EQXqvW8A0gI1kykWI7HfX97j4QbpDqlJOK/gjORgCF+d9tHtq9LWzuoEvuz+BBpQDM/teTOJ1OevXqxQ033MCyZctEXffWW29ly5YtIr175ZVXsmrVKpqamoiLi2PcuHGsXr0aWZbJysoiPz9fWAjm5OQQFxdHdXU1EJ7TtVqtuFwukRVQ7Rq1qFTjJ0eLTDuMJqZnGX6//6h1UnWtmqIoGI1GkV4EaGxsjLIL1Ol0OBwOsVatqKgIs9nMpk2bsNvtWK1WBg0aRGVlJWvWhOuPkydPRpZl3nvvPQDGjx9Pfn4+r7zyCg6Hg86dOzNt2jT+/tyHLP7iAwAcUh6b9zRzwaT+XHlOLor3cHjRd0rp946FrN0c3jsaG2OmpHvuEe83pH9X8f/ffPMNUmwnpPgi3vrwa15++WU2bdpETEwMf/jDH6ivrxeWgRdffDEpKSkivTt58uTw665dK/596NAhDh8+jMFgYPDgwZSXl+P1ejGZTHTp0oWGhgYxCpOenk4oFBIdvPHx8UiSpEWlGicHzbShw2hiehbRetF3e0LkdrvbHYPx+/1iDCYjIwOz2RyV3k1LSyM9PZ26ujp27Ah32A4ZMgS9Xs9nn4VdgXr16kVhYSEfffSRsNibPHkyGzZsEMu/b7rpJgwGAx
W7V6EnSEgXy/hJ5/H47y4jGAzy1fywCEu2YuFydDQ+X7ZF/P/FUwe1ex9Jkpg5eSAAzc3NrPxma7g7WJI4sG+H6Ma9+PLryMjI4Omnn8bj8VBcXMy0adN488038Xq9dOrUiVGjRol0dq9evcjIyGDDhvAquL59+6Ioihh9KSoqQlEUYcOYkZGBwWDA6XSKZetq01dkVKp18GponHpo3bxnCYqiiDqpwWBoN00YCASi1qqpKWA1vasoCrGxsSQnJwNhxyp1rKOoqAhZllm1ahUQNmrPyclh2bJlIr07fvx4du3axZo1a5AkiUsvvZRgMCi6d6dOnUqXLl145b+fsqMsvIbsr4/8gbGjwyv2/vXcixw6VA46M7qE4g69703bD7F64z6G9OvKJecP5n+frW1jdv+Ly0ZT1CVcp3z5P6+gJA9F0hlIjAWbdy/BgIKLDP77xSHiLO+wd+9eYmNjuf322/nqq6/Yt28fZrOZq666iiVLltDc3IzNZmPChAlib2laWhrdunVj/fr14uIjLS2NQ4cOIcsyVquVxMREAoEAbrcbCI8ctReVaiYNGj8ZWjdvh9HE9Cyg9RhMe2lC1eUIwlFrTMx3M5sNDQ243W50Op0Yg7Hb7SLC6tatGyaTiQ0bNmC327FYLAwcOJDDhw+L3ZyTJk1Cr9eL9O6IESPo0qULb775JvX19aSmpnLxxRejKAovvTgbgIAhXQjpp4s28Mi38526hJ5Iuo6Phfy/p+by/nO3Y7WYeO2JX/Cv1xbx9YY9WEzf7jO9cBgAe/fu5YUPtiAZE0EJYXBsIYgPS1wqVa4uJPrq+fjjjUC4s9jlckW5NjU1NbF582YgfGFw4MABamtr0ev1DBs2jEOHDonacmFhIQ6HQ1y8ZGdnA9/t27VYLKJWrdVKNU4a2nLwDqOJ6VlAKBT63jEYp9MpxjAix2B8Pp/oOs3MzMRkMiHLspiPTE9PJy0tjcbGRrZv3w6ETeyNRiOff/45sizTrVs3unXrxmeffUZ9fT0JCQmcd955VFVViY7X6667DovFwqJFiwh56lHQMf2iK3jhv4vZsquS7RuXQtDz7fqztltdjsbW3VXc/vAbPPHg5djirNx789Q299m7dy8P/+1VMKeDx0eisg8TLmSMPPLIn7nmVy+RZgw3CZ177rn06tWLxx9/HFmW6d+/P8XFxVHmDAkJCcKLt3///uh0Og4cOACEje0NBoOoNaekpIgtO2r2ID4+HkCrlWqcXLTItMNoYnqGI8uycDk60lq1QCAQtQ1GvY+iKFRVVYn0bmJiIhDu3nW5XBiNRgoLC5FlmdWrV6MoCvn5+eTl5bF+/XqqqqowmUxMmDCBw4cPCwvBGTNmYLFYeP311wmFQvTv35+BAwfS1OwQM6W6+ALGjujPlXc9j6KEMNYtC389oSeS7tijsy9XbGPKNf/guotHMm54D7LSEgkE/Ozft5d58z7h1Tmr+eqTV/F4/Vx2/W8weetQgKuuu5X+Jd0YkO9CCSikZeRw5ZVX8tprr9HU1CQi6o8++giPx0N6ejrDhw8PXxSEQmRlZVFUVCT2liYnJ5ORkUFtbS2BQACj0Uh6ejqyLIuoNC4uTqTYVXHV6/VaVKrx03M83blnaRVCE9MzmMg6qU6na9cxR+3ehXDUarF8N7PZ1NTUJr3rdDpFRFVYWIjJZGL79u3CnKG0tBSHwyF2lI4ePZq4uDjeeOMNQqEQxcXF9OnThw0bNrBhwwb0ej0///nPefDxD/jgg/dIUGoIYeKwM5n5S8vISLVRvX8DXrf926i0a5v30FEqa5v40zMf86dnPkb21CDXLgVkpNjO6FKH8MLbS4iRmjH79qIAE879GbfOupgXX3wRJeAgPj6ehx98gOXLl7Nlyxb0ej3XXHMNmzZt4uDBgxiNRqZNmxZ1PoYOHcqhQ4fEBp3u3bvj9XqF01F2djY6nQ673S4Wrqsd1JEZBS0q1TgpHM/c6Fk6Z6qJ6RlMR1yOXC4XgUCgTXo3sns3PT0dk8mEoijs2rULRVFITU0lPT0dt9st6oT9+/fHarXy8ccf4/f7ycrKol+/fpSVlbFr1y4MBgMXXXQRoVCI119/HQinTLOzs/ly+WbilLDXb9eeI8jRZ/L+p2vxeP3oPQeQAZ2tO5J04tGZ4q1DPrwMkJFictClDqZ/r0688d4CUpUydCicf/40Hnjgbj7//HMWLVqEJEncfvvt2O12YWI/Y0bYGWnZsnDUPGHCBAKBAFu3bgXC3cyhUCgqvWs0Gtm3L7w7NSEhgbi4OAKBAC5X2FxCbTpq3TCmdfBqaJzaaH+hZyiRLkdGo7HdD+NAIBBlYh+Z3q2urkaWZWJiYkT3blVVFXa7Hb1eT1FREZIksXbtWoLBIKmpqRQUFHDw4EFRO500aRKhUIgPP/wQgHHjxpGamsqiRYuoqqoiPj6emTNnAjBlSCJ6gqSmZfD67D/y7rO3EgiGUAJ2Au46QEKK63zC50XxNRA6vBSUEJIlE13acEpLunDBuGJSlO3okPGRQHH/iWzdulUYSVxxxRV06dKFV199VWy36d+/Px9//DGKotCzZ0+6desm6qQFBQXk5OSI0aHU1FQyMjKoq6sTM6WZmZkoiiLSu5GZgVAopO0r1Tj5aKYNHUYT0zOQyG0wHU3vRprYt7S0iB2lanrX5/OJiKpLly6YzWYqKio4dOgQkiQxePBgZFkWRu79+/cnMzOTxYsX09jYSEJCAhMmTMDtdouO3osuuojY2Fg8Hg9fL/0cgEpnCq9+sBKDQU9qYhyy8wAAkjXriLaBHUX21BKq+QrkAJjT0KWPRJL0XDJ1AIvmvYYeP4o+FnPGYIryE3nyySeRZZmRI0dy7rnn8sorr2C328nIyODiiy/m008/xeFwkJSUxMSJE1m9ejUejwebzUZpaWnU6FC3bt3wer3U19cD4fSuwWCIajpS98RGRqVHuhDS0PhJ0EwbOoz2V3oGEgwGRVRzNHOG9tK7wWCQmpoaILxPUx3PUE3Z4+PjycnJIRgMCpefHj16kJSUxIYNG2hoaMBqtTJq1CgcDodoOpo2bRpms5nPPvsMh8NBdnY2AwcN49//XcyFV/6K5uZmQlho8Cbw/FuLAXjpb9cjecKp3++zDfw+ZHclcu2Sb5eIZ6DPGENJcSee//PPWTL/Tfbt20dqaiofvPsaC9+8j1f/8zxOp5OuXbty44038uGHH7J//34sFgvXX38969evZ9++fRgMBi688EIOHjwodraOGDECu90uFqR3794dg8EQtbLOZrMhy7LIDETO9QaDQbHnV4tKNU4quuO8nYVoNdMzjI5ENaFQKOpDPLJLtLa2llAohNlsJjU1FQjbCKrjMd26dUOSJLZt24bL5SImJoaSkhLcbjcrVqwAwgu/LRYLH3/8MT6fj/z8fPr374/T6RT1xosvvpgH/zGX+Ys3kaFsQw84pGyQJAKBEM++9iU2q4zsbwEkJGvWcZ8T2bHvWz9fhZjkLvjiS5EkPTdcOoqln7/Lxo0biY2N5fHHHyctLY2//vWvVFRUkJSUx
K9//Ws2bNgg3tuVV16Jy+Vi+fLlwHfzs+qC7/79+xMXFycuNLKzs0lNTaWmpgafz4fBYCArK/xe1KYjg8Eg/Hcjf36aQYPGyed4Is2zMzLVxPQMo6PpXdV7N9KcweVyCeOGrKwsJEkiFAqxe3d4QXZOTg7x8fE4nU62bdsGhMXDYDCwaNEifD4f6enp9OnTh5qaGuGGdMEFF6DT6fj444/xeDzk5+dTtt/LynW7sVKHHj8hjHQp6kdtg5P6Jid///dnDCoIR9eYU5D0x97NqigKctNmFHu4hivFduL8n92MyWzi3U9W8/hjjxFylWM0GvnLX/5CYWEh//73vykrK8NsNnPvvffS0tIi0tLnnnsu+fn5oo7at29funfvzmeffYYsy+Tl5dGtWze2bt2K3+8nJiaGgoIC3G636N7NysrCYDDg8/nEVp7IzIBq0CBJUrtLCDQ0flK00ZgOc5YG5Gcmkd27R4pqvF6vENxI711ZlqM2l6iRUkVFBR6PR5iyA2zYsIFQKERGRgadOnWivr6eTZs2AWHjep1Ox6effoosy/Tu3ZuCggIcDgcLFiwA4JJLLuGJlz/H7vSSbAzXEO+8/SbmvfJrUpPjxbGuWh3e3iJZMo75XChyELlu+XdCmtATXepQPvh8A18u30J8aB8hVzmSpOMPf/gDAwcOZO7cuXz11VdIksQvf/lLEhISePnllwmFQpSUlDB27FjmzJmDx+MhIyOD8ePH8/XXX+N0OomNjWXo0KFUVlbS0NAQtSBdTfcmJCSI9K5ar46JiRFjL7IsCzHVolINjdML7dL3DKF1ere9Af8jGQNAOJWrdplmZITFy+v1ipnSgoICDAYDtbW1lJeXI0kSAwcORJIklixZgqIoFBUVkZ+fT3l5OWVlZUiSxHnnnQfA/PnzhRn8wIED+cVl9fz3gy9QmlvQ6w28saAcJ4uxOz3fHXAw7FHbEUP7qHPhbyZUtxICdkCHPnXwd53AioLRuwcjNSjAAw88wLhx41iyZIlYVH7ttdfSq1cv/vnPf+J0OsnJyeGKK67giy++oLa2FqvVyvTp09m9e7eok44aNQqv1yuatAoLC4mLi6OyslKYM6jpXYfDQSgUQq/XC6cj0AwaNE5BNAekDqOJ6RlC5Gq1IzWtOJ3ONsYAEE4tRm4uUT/I9+3bhyzL2Gw24dKj1gILCwtJSkqivLycvXv3IkkSo0ePBhAWgQMHDiQzMxO32y08bGfMmIEkSdxzw7lg38b774MuJpNte+vZ9fynBEMyRoOOYEhB+VZM0X+Xij4aiqKgOPciN24AJQR6C1k9plDnDDdRjRhYSFFKM5/P/xoAu66A0aPHsW7dOp5//nkAzj//fCZNmsRLL71EdXU18fHxzJo1i7KyMrZt24YkSVxwwQV4vV42btwo3mdCQgJr164VYzDZ2dnY7XaRNs/JyUGv1+P3+4WRfUJCgqhpdySroKHxk6N583aYs/Rtn1mEQqHv3SoSaQwQWaODcNNR5OYSCK8iU5uO1JnS/fv309zcjMlkEuvEFi9eDEC/fv1ISUlh37597Ny5E51OJ3Z7fvXVV7hcLrKzsxk8eDCBYPh41TGaSedMJj7WQo/C7G/fjxzuZlXC7wkl9L3nQAk4kQ8vR25YG54htWZRPPwaHvvjbSTGWynolEZJtoPP538UPt6h5/L03x7gvQ8X8o9/PIEsy4wePZrLL7+cDz74gG3btmEwGJg1axbNzc3ifY4fP560tDSWL1+Ooih06dKFwsJCduzYgdfrxWw20717d4LBIFVVVUDYezc2NjZqmYDVao1auq4ZNGickmijMR1G+6s9zYn8INbr9e02rUTOlEZuI4Fw05H6PbXpSFEU9uzZI74WHx9PMBgUddHevXtjNpvZvXs3NTU1GI1Ghg8fDsDnn4fnRQcPHkxqaiqyLIuo9LzzzuO5N7+i27j7uO2+J2lubiYhIYH777yGsgV/Ys7zdzBlTAnyt3+LUkx4mbfi3H/k9y8HkZu3EKr6DMVTCUjokvqRW3IBf7jnUsYMKea2q8dTv38VH304B4B7772Xf/3jIerrapn/8X8JhYLkdy7ixhtvZOnSpaxYsQJJkrjyyiuJi4vjo48+QlEUSkpK6Nu3L0uXLsXr9ZKUlMTgwYOpqqqivr4eSZLo1auXGIMJhUJYLBbS09OB79K7Op1OzJRC9CiMZhuocSohIx3X7WxEE9PTnMiZ0iN9EEfOlEZ+iKtORxBuOlKNG2pqanA6nej1etF0tGPHDjweD7GxsXTr1g1ZloWNXmlpKbGxsRw4cEBEpRMnTgRg3bp11NXVERcXx8iRI5m/uAxFgfXrwqvZxowZIy4ADAY9N1w2BoM+/Gup+9aHV3FXogRdUe9JCTgJNW4iVPExcvOWbx2N0tFnn4suoZh//ekaRpQWMe/LDbzz5ovEUosC/Obe+5g+fTqVlZXM//ANDDoFh8/ERZdeTVlZGR99FI5cL7jgArp3787//vc/fD4fOTk5TJw4kbVr19LQ0IDJZGL06NG43W727t0LhOvKNpuNhoYGXC4XkiSRm5uLTqfD7/dHZQbU6FMbhdE4lZGP83Y2otVMT2NafxC3lx5sbQwQ2djSXtNRMBgUTTSdO3fGZDLh9XqF32zfvn3R6/Vs3bqVhoYGLBYLgwYNAhBp29LSUlJSUoDvItXx48djNpu5/ZoJvPTOUpp2b8HlD38d4MV3lrJ5+yEuv2AIwVD4z1EyJYEpGfyNhKoWhLt6FT9K0AeB5u/epD4GXXI/pJg8IUZX3PEcQ/t3ZtPKD7HQDJKOmZdcx4zpF1JTU8Of/vQn3G4XWdm53D/rVuLMMs8//xYQnpMdOXIk7777roieZ8yYwb59+0R9eMSIEZjN5qg6aU5ODm63W3gaZ2VlYTab26R3I5cJRC4i0EZhNDROX7S/3tOYjnwQOxwOFEXBYDBEzZSGQiHRdJSeni5EtqKigkAggMViIScnB4CtW7cSDAZJTk6mc+fOyLLM11+Hm3gGDx6MxWKhtraWrVu3IkkSEyZMAMIRrtrVO2nSJOZ+vp7fPPoOit9OuhJeIt6vXz+qDzfzp6fDEaGa7lTRpw4hVP81+JtR3IeividZMpHiCzHZcgm2Kqv6vC7KVr6PBRcyOn71q99y8cyp1NfX86c//YmmpiZyc3N56KGHcDjCq9+CwSAlJSVceOGFLFiwgIqKCkwmEzNnzsRut7Nu3TogXB/OysqirKwMn8+HxWKhe/fuyLIsxmBsNpuoPx8pvduRWreGxslEViRk5dh+L4/1/mcKWpr3NKWjTUdq56i6jUSlrq5OOB0lJSUBYcOH8vJyIOy/q9PpcLlc7Nq1CwiLiCRJYsWYxWKhf//+AKJBp1evXqJGqFoJ9u3bF70xhrv/7y0CgRBmwjXakCGRoTP/TNXhZnoUZqPX6xhRWoTZFL4wMBp0SKYE9FnnoEsdhi6pP1NmzuLam+/HkHs+8Z0nkZBeQE5mCqMHdxPvrW+hjTSlDBMuYmLj+P2Df+LimVM5fPgwf/zjH6mvryczM5Pf//73+P1+
nn/+ebxeLwUFBfz85z9n1apVbNmyRXTuWq1W0XDUuXNnevTowcGDB2lsbESn04k6aVVVlRiDifQ0PlJ6V533NRgM2iiMximJlubtOFpkehrSuumovQ/i1ttIIpuOfD4fjY2NAGRmZgqRPXDgQNQoDEBZWRmyLJORkUFmZmZUVDpo0CDMZjMOh0OMzIwbNw4Ip5fVnaYTJ04kxmoiNTmeugYHucngaIAGlxmn28WS1TuZ9/Jd+PxBynZW8NhvL+WOP7xJIPhtulfSIcWFvXm/3hEk9bATDHHExpipa3TicPmwWkzo9TpSLC4c5evR4yMnJ5fHH3+M/Px8Dh8+zP/7f/9PCOmDDz6IJEk8//zzwit41qxZ7Ny5U1gHTpo0iby8PBYsWIDP5yMlJYUhQ4bQ2Ngo1qp169aN+Ph4GhsbxfnOzc1Fr9dHmTO0Tu9qTUcapwPH01B0tjYgaWJ6GtKRpiOv19tmG4lKbW0tiqIQFxcn5k3dbrdoRiooKECSJBwOh6if9u3bF0mS2L17N42NjZjNZgYMGADAqlWrCAaD5Ofn07VruGlox44dtLS0EBsbS79+/TAYDCx8414O17fw5z/ej6MBRgzpj1tJ5JLzBqHT6fjnK1/w3BtfHfF9S4DT7cPpDkd0dY1O8b0xQ7ozc2Qir77yMl5FobS0lD/96U/YbDZRI62vrycrK4sHH3wQk8nEs88+S319PcnJydx4443U1taKzuMhQ4bQp08flixZQktLC1arldGjR+P3+4WVYnZ2NpmZmXg8HrEcICMjQ6TT7Xa7MGeI/BlErsfT0rsapzLHE2lqkanGaUFHjOwVRRFNR62djlwul/ie2nQEsH9/ePwkJSWFhIQEIFwrVRSFrKws0tLSUBRF+O0OGDAAs9lMKBQSkdzIkSPF861eHe7WHTRokHj9hHgrCfFWMb/661tmUFRUJB6zaVt0TdRsMpCXnUyz3U19oxOTyYDPHxTfz89OweHykpVqpXbXl3y8JvyaM2bM4O6778ZgMFBeXs4jjzxCc3OzEFKr1cq//vUvampqSEhI4JZbbsHr9TJ37lxkWaZHjx6MHj2a9evXU1VVhV6vZ8yYMZhMJtavX08oFMJms1FYWEgoFKKiokJcnKiNVx6PR3jvJiYmRv2ctKYjjdMFrWbacbS/5NMM9YP4aE5HLpdLNLyoHrsQFlk1gkpKShJpR4fDIZqR1FGYyKi0T58+AJSXl1NTU4PBYGDgwIFAWHBbWlqIi4sT9VNFUfjmm7Cv7pAhQ6KOLRAIiM7WtLQ07E4Pz772JZ1zU/nr/Rdz/b0vcbCyAX8ghM8f5A93Tic9xcZHCzfQr2c+9//1PZITY/m/X11E3x557Nm9kwcffJA1u2sxmUzcc889XHDBBQDs3r2bv/zlL7hcLvLz83nggQeIiYnh+eefp6Kigri4OG655RYMBgPvvPMOfr+fvLw8pkyZwp49e9ixYwcAw4cPJzk5ma1bt+J2uzGZTPTq1QtJkqiqqsLv92M0GsnJyRHLAdT0bmxsbFT2IBQKaU5HGhpnIJqYnkbIsvy9TUeyLON0htOf8fHxURGRw+HA6/Wi0+lETRQQ9b/09HSR9lWjUnWFGCDqoiUlJSKVqUagQ4YMEVFWRUUFjY2NQnQi0ev1whgiFArx6pxlYn/p20/fzBUXDmPrrko+WLCO9OR4SrrnkmCL4dc3Tgkfw8jw8wUCAV555RVee+01QqEQeXl5/PnPf6awsBCALVu28Nhjj+H1eikqKuK+++7DaDTy4osvsn//fqxWKzfffDNxcXG8+eabuFwu0tLSmDlzJnV1deJioE+fPuTn57N///4oYwaz2UxjY6MQzdzcXAwGQ9RWHoPBEOW9qzUdaZxuhL69HetjzkY0MT2NONZRGNWEAcIf5Or8Y0pKinh8S0uLWA/WuXNnIBzZqmnf3r17A9DQ0CDMCdSotKWlRURvgwcPFq+1efNmILw0vHVNV6fTERcXh8PhwOFwkGSLRQKSE+P4+/PzWbflAJIEigKZ6Ykk2Nr68u7du5f/+7//E13GEydO5L777hNR+LJly5g9ezahUIjevXvz61//Gp1OxwsvvMCePXswm83ceOONpKam8vbbb4tZ0osvvhi3283SpUtF527v3r2pq6sThv/du3cnISHhiHVSt9vd7lYe0JqONE4/lONoQFK0BiSNU5nW6cH2CAaDRxyFaW5uxu/3o9frRV0PvquVZmZmCkHYvn276OBNS0sDEMuvCwoKSE5OFl+TZZnOnTtHRboffRpuIpKNie0epyqmH3++msdf+4YYq4m5/76D/3v6YwBSkuJxOD0kxFt5+d2l9CjMZtiAQpqbm3n55ZeZM2eOqFv+5je/EXOtiqLw0Ucf8d///heAoUOHcuuttwLw4osvCiG9+eabycnJ4f3336e2tpaYmBguvvhidDodixcvJhgMkpGRwdChQ3E6nWzfHl7jlpubS2ZmJsFgkEOHDqEoCvHx8eJ8BoNB0dFrs9mi0vBa05HG6YjWgNRxNDE9DWhthH6k9KDaWNR6FEaWZdH0k5qaKh7f0tJCc3MzkiTRqVN49MTn8wlfXjVF6/P52LJlCxB2N1JRBTbya4qi0FBXhUEH32xtpPpwMy+/u4zhAwsZN6wHAMXFxVRXV/PFZ3NAycHt8ePxBnjyoctZs2k/C5dv5Y25X7Nk9U6WrN6JnhD3XVvC++++LVLYo0aN4je/+Y1IQYdCIV599VXhuHTeeedx5ZVXEggEeOmll9i9e7eISPPy8pgzZw6HDh3CZDLxs5/9jPj4eL744gvcbjc2m41Ro0YRCoXYsmULsiyTlJRE165dURRFrFUzmUyiThrpcmQymaIMMkBrOtI4PdEakDqO9ld9GhAKhcQozJGajvx+P16vFyCqTgdh28BgMIjRaBRRJSBSl5mZmSIlvHv3bkKhEElJSWRmZgKwbds2AoEASUlJ5OfnA2HTh4qKCiRJEg1KEE4HG3QyCnDDVVP56+xPmfv5ev7z/jK2LPgzFrOR2267jTVr1lBfW45NksnvNozzrn+C4oIsPnj+DqwWIx8t3ICk+JEdB4ilmldeDtdmi4qKuOOOO6IE3Ol08uSTTwrBv/rqq5k6dSper5d///vf7N+/Xwhply5d+OSTT9i3bx8Gg4GLLrqItLQ0lixZQlNTE2azmXHjxmE0Gtm4cSM+nw+r1UrPnj3R6XQcPnwYp9OJJEnk5eWJCxOHwyH8j9tL76pZBbPZrEWlGqcNWmTacTQxPcU51lEYq9UaJbihUIj6+nog3D2rPt5utwvjBlUgQ6EQO3fuBML1TjXiUrfFqA5IgPhaUVFRlHiraeMunTtz/oT+VNQ0h/+dm4bJGBae7Oxs7rvvPh566CFilQoqdi3HoMSye8dhPpm3gJqqgwzJq2Xr1m1AuMaoN8Vz/29+ybnnnktlTTMP/PU9Rg7qRv/iNP72t79RU1OD2WwWQut2u3n++ecpLy/HarVy0003kZ+fz4IFC9i+fTs6nY4LL7yQ3Nxcvv76a6qrq9Hr9YwbN47Y2Fh27ty
J3W5Hr9dTUlKC0WiM6nrOysoS3dCtXY4iMwfaejUNjbMDTUxPcdSmlaONwvj9fvGB3ToqbWpqIhQKYTKZhFcsIGwDMzIyRFR64MABvF4vMTExIu1bW1vL4cOH0ev1ohkJEFFg3759o14vUmwAbrpiLOeOKSEzLWylt2bTPl5+dxkXTSllxOhJrFj6BTFyNWpS9O9//VPU8xmsyQQtXXjo/puYNCr8+n//92d8/OVGPl+4hD55bjxuN6mpqfzmN7+hU6dO2O12nn/+eaqqqoiNjRU10oULF7J582YkSeK8886joKCAjRs3sn//fiRJYtSoUaSkpIgRIAinumNiYvD7/cJ3NykpSVgwyrIcZWIf2fQF4a5j9eenNR1pnG5oDkgdRxPTUxhFUQgEAkA4Km0vPRgZlcbExERFRbIsR0Wl6uPdbrf4el5enngeNSrt1q2biKBU0SwqKhJC4XQ6hRj37Nkz6nhUYVGFW5IkOuemiu//8akP2bqrkjWb9rH+kz+ybPlk7rzvL6AEscUY6ZqfQteuXRkwYAD9+/cXohxJ/175bPxmCTk2Bx53+HjvueceEhMTaWhoYPbs2dTX1xMfH8/NN99MVlYWixcvZsOGDQBMmTKFHj16sHPnTrENZ8iQIeTk5FBXVyfmawsLC0lOTkaWZQ4dOiQWqKvpb3UMRpblNi5H6vn/vp+fhsapzE8pps8++yx///vfqampoW/fvjz99NNRUwKtaW5u5ne/+x0ffPABjY2NdOrUiSeffJKpU6ce1+ufKJqYnsJERjVHalrxer2iVqfOiKo0NjYSCoUwGo3C1Qi+q5WmpKSIxxw+fJimpib0er2Y1QwGg8I6LzIq3b59u5hBjYx2oa2YtmbiiJ5s3VXJhOFhEa5uMdAodQMJrrtqMndcO+mo56SxsZHKnWEhhbAwXnnllRgMBqqrq5k9ezZ2u53k5GRuueUWUlJSWLZsmZgbPeecc+jduzcHDhwQc7N9+vShoKAAh8MhOnezs7PJyclBURSqqqrwer3o9Xry8vLEhYbH4xF16tYuR4AQUq3pSON0RVbCt2N9zLHyzjvvcM899zB79myGDBnCk08+yeTJk9m5c2fUpICK3+9n0qRJpKen8/7775OTk8PBgweP+LnzU6D9hZ+iREalRxqlUBRFdLfGxsZ2KCr1er2is1dN5UK48QjCDkhqJ7Ca9o2NjRUzqICYNy0uLo46no8WbuCzxZuI1XHEjuO7Z03m5ivHYbWEU54l3XMxGw0owLCBRe0+RmX16tW88MILOJ1OzGYzN910E8OHDxfH9NJLL+HxeMjMzOTmm2/GZrOxfPlyYYE4YcKE8Mq36mph1t+tWzd69+4tOpbVzt3CwkIkSYoyZsjLyxOp9sgxmPj4+DYpXG29msaZwE8Vmf7jH//gF7/4Bddddx0As2fPZt68ebz88svcf//9be7/8ssv09jYyMqVK8XfZORn1MlA64Y4RYmMao4kTF6vl2AwiCRJUbaBEB2VRl6tVVZWoigKCQkJIi3p9Xo5dCjsixvplasaMhQXF0dFXapjUutf3pfeWYrdFe5aVVPP7aEKKUCPwmySEmPxB4I89u/P2r2/0+nkueee44knnsDpdNKlSxceeeQRIaQbN25k9uzZeDweOnXqxO233y6EVBXN8ePHM3DgQBoaGli6dCmyLNOpUydKS0uRZZktW7a06dx1u91RxgzqOY50OTIajW3OfUe2+mhonA6cyAo2u90edVPNTFrj9/tZt24dEydOFF/T6XRMnDhR/P225qOPPmLYsGHcdtttZGRk0Lt3bx555BHRNX8y0MT0FKQjtbbWZvaRYifLsnA1ioxKg8EgVVVVwHe1UghHdbIsk5KSIkZnAoGAiFYjI1CPxyOclFqL6S8uGyOERY3mOkJmWkLUfyPf48qVK/nVr37FkiVLkCSJCy+8kP/7v/8Ti8uXLFnCa6+9JhZ733rrrcTExLQR0tLSUpqbm1m0aBHBYJDMzEyGDRsGhNPWDocDo9FInz59MBqNBINBYWAfacwAYYcov9/f7hgMRI8yaU1HGmcreXl5JCQkiNujjz7a7v3q6+sJhUJRizcgfAGrXsy2Zt++fbz//vuEQiE+/fRTHnzwQR5//HH+9Kc/tXv/nwItzXsK0tGoNBQKIUlSG4OA5uZmMVcaWSutra0lFAphtVqFOCiKItK2kVHpwYMHCQQCxMfHk52dLb6uinFSUlKbzuHzJ/Qj2Xop//znP9m6dSuyLEeJfHllA5f98jksZiPvPnsrqUnhx7/55E1s3V1Jv5754r41NTW8/vrrrFu3DoCcnBxuuOEGevQIGz+EQiHmzp3L8uXLARgxYgQzZ85EkiSWLFnCmjVrgPB+1dLSUpxOJ4sWLcLv95OSksLo0aPR6/Xs3btXeO727t0bq9V6RGMG9WejXsTYbLY2tdCOjDJpaJwunIid4KFDh6Ka8iKNZE4UWZZJT0/n3//+N3q9noEDB1JZWcnf//53Hn744R/sdY4FTUxPMTpiZt+6Vhr5ga0oiohKU1JSxPdUgQCixKGurg6Hw4HBYBDzpvBdXVStHaqo9dbWV5EqpaWlxMbGUldXx6ZNm8QmGYAV63ZTVdsMwLqyg0weHW5qio0xM7hveA+q0+lkzpw5zJ8/X+wCnTFjBtOnTxfC5fF4ePXVV9m5c6cYcxk/fjwAixYtEgI8YcIEBg4ciMfjYdGiRXg8HhISEoQpQ1VVlUhvFxcXiwuPIxkzqOldCH8wtB6DgWj/3SONMmlonC6ciGmDzWZr0+HeHqorm5rxUqmtrRWd863JysrCaDRGBRs9evSgpqYGv99/UjJCmpieYkRGpUeKao5WK3U4HMKDN7JW2tTUhNvtRq/XR/2CqmMg+fn54sM/MlpVO3tVVDFVPXtbYzKZGDNmDJ9++inz58+PMnqYMrYPi1Zux2IxMnpwt6jHNTY28sUXX/D5558LA4SSkhKuvvrqqJT04cOHeemllzh8+DAmk4mrrrqKkpISFEXhiy++YOPGjQBMmjSJ/v374/P5WLRoEQ6Hg9jYWMaPH4/ZbKahoUEY5Xfq1ElcHNjtdtG4lZ2dLYwZICz0aud0QkJCmwudjowyaWicToS7eY/VTvDYXsNkMjFw4EC+/PJLpk+fHn4OWebLL7/k9ttvb/cxI0aM4K233orKfu3atYusrKyTVlrRxPQU4oeISlUhSEpKirpqU6PSzMxMEeEFg0ExJtO1a1dx3/r6epxOJ0ajMUrIICzKgPDEbY9Jkybx2WefsWnTJh577DFuu+02YmJiSLTF8MJfrhP38/v9bN26lRUrVvD111+L5oHc3Fyuuuoq+vXrF/W8W7du5Y033sDr9ZKQkMCsWbPIy8tDlmU+/fRTMcZz7rnn0qdPHwKBAF999RXNzc1YrVYmTJhATEwMTqdT3DcjI0PUfv1+vzhPycnJURcjgUBAnPfWLkcqHTHY0NA4nfip7ATvuecerrnmGkpLSxk8eDBPPvkkLpdLdPdeffXV5OTkiLrrLbfcwjPPPMOdd97JHXfcwe7du3
nkkUf45S9/eRyv/sOgiekpxIl28LrdbjweD5IkRTXMeL1ekfpVG3cgXP8MBoPExMREzXKphgzqjs5IPB4PQJs6bSRZWVnMmjWLN998k3Xr1nHTTTfRv39/evfujdfrxeFwUFlZyZYtW0R9EcKp1ilTplBaWtpmzGfhwoXMnz8fRVHo0qUL1157LTabjWAwyMcff8zu3bvR6XScd9559OjRg2AwyJIlS2hoaMBsNjNhwgTi4+Px+XyUlZURCoVITEyke/fuSJJ0RGMGIMrE3mKxtJvejYxKDQaDFpVqnBGEkAgdY830WO8PcOmll1JXV8dDDz1ETU0N/fr1Y/78+SJjVF5eHhU45OXlsWDBAu6++2769OlDTk4Od955J/fdd98xv/YPhSampwito9L2UBRFpEBbR6WAEMzExMQoEVSbhhITE6NEUB1x6dSpU9SHvxqtRtZQVVSTgsj0Z3tMnDiRoqIinn76aSoqKlizZo1oCookJSWF/v37M378+KjoWMVut/Pmm2+KlOyIESNE/dTv9zN37lwOHDiAXq/nwgsvpLCwkFAoxLJly6itrcVgMDBu3DgSEhIIBoOUlZWJEZhevXqh0+lQFIXq6uooY4bI8+FwOAgGg+h0uiPWgEKhkFYr1dA4AW6//fYjpnUXL17c5mvDhg0TM+SnApqYniJ0JCr1+/2iZtc6KvX5fKLLNDIqlWVZtJdHduUGAgEhspEjLoqiCA/a9sQ0sib4fXTq1Im///3vlJeXs2LFCioqKoiLiyM+Pp6kpCRKSkrIz88/YhS3d+9eXn31VRwOByaTiZkzZzJkyBAgHIX/73//o7q6GqPRyMyZM+nUqROyLLN8+XKqqqrQ6/WMHTuWlJQUFEVh+/btIn2tmtdDOHWtRp65ublR7y0QCIgLGJvNdsSfjVYr1TgT0bbGdBxNTE8BFEURUenRREr9ULdarW2iUnUDTFxcXFQLelNTE36/H6PRGFXnrK6uJhQKERcXJ0zbITxWo0Zo7dl4qRGpmu79PtRdqZFuS99HMBjk888/58svv0SWZTIzM7n22mujmoTeffddGhsbsVqt/OxnPyMrKwtZlvn666+pqKhAp9MxZswY8Zg9e/bQ0NCATqejd+/eIkL3eDziYiM9PT3KkrEj3bsQvmBR50o120CNMwmZ49hnepYa3Z/UIbilS5cybdo0srOzkSSJuXPnfu9jFi9ezIABAzCbzRQWFvLKK6/86Mf5Y9PRqFR1EGkdlYZCIRFZRUalEBZNCDfaRM18flsXbZ3SjLx/e8eivrbb7e7YmztGqqurefLJJ/niiy+QZZnS0lLuuusuIYqHDx/mzTffpLGxkfj4eK644gqysrJQFIU1a9Zw4MABsQFGNcmvqKgQjUWRIzChUIhDhw6hKApxcXFtmqo8Hk9U9+6RUC+EjtaBraFxOqLaCR7r7WzkpF5Gu1wu+vbty/XXX8/MmTO/9/779+/nvPPO4+abb+bNN9/kyy+/5IYbbiArK4vJkyf/BEf8w9PRcYrIqLR19KNuLjGZTFFCGwgERB01sqFGluV2nZAAMet1pPkuNXJTu3p/KEKhEEuWLGH+/PkEAgFiY2P52c9+FtXRe/DgQebMmYPf7yc5OZlLLrkEm82Goih888037N27F0mSGD58OLm5uUC4M3nPnj1A2HdYjbZVA/tAIIDRaIyavYXwOYp0mDqaJaDahaxFpRpnGlqat+Oc1L/+KVOmMGXKlA7ff/bs2XTp0oXHH38cCA/pLl++nCeeeOKIYurz+aI8IVVz8lMFVUglSTriB3YwGBSNP+35wKop3uTk5DYGC2rUFZm+rK+vJxAIYDab20Sy6nO1/rpKly5dWLVqFbt27eK88847lrd6RA4ePMi7774rBL64uJjLL788qtlnx44dzJs3j1AoRG5uLjNnzsRisaAoCuvWrRPWh8OGDRM1YIfDIUZgsrKyomrATU1N2O12YczQWgidTieyLGMwGNqc80gURREpXs2DV+NM40QckM42TqtL6a+//jrKDBlg8uTJ3HXXXUd8zKOPPsof//jHH/nIjo+ObIaB76JSs9ncpqbqdrvx+XzodLo264ciTdojUUUrKyurTVpSjTgj66iRdO/eHQhbhTmdzjZr346FxsZG5s+fz9q1a1EUhZiYGC688EIGDRokzoWiKHz99dfCNrBbt26cf/75GAwGFEVh48aNYg/r0KFD6dKlCxDuOi4rKxNbYIqKiqI250TWSdtb6B3ZdHS0hiI1KpUkSUvxamicxZxWYlpTU9OuGbLdbsfj8bTbIPLAAw9wzz33iH/b7fY2qc2ThVprO1pUKsuyaPZpL0JSI8nWRgJut1ukKVufM7Uu2nrxdmTDzZH2AiYkJJCTk0NlZSWrV69mwoQJR32P7aHa+y1evFicg9LSUi688MIocQ6FQsyfP18s8B44cCDjxo0T4yybNm0SkeegQYMoKCgAECMwfr+f2NhYMQID4fOpGtjHxcW1G4Gr581sNn+vn6gWlWqcyfxU+0zPBE4rMT0eOvKBeDLoaK3U7XajKAoGg6HN/GnkTk1124uKavuXlJQU9Ti/3y+iz9Z10UAgICKto5kyDB8+nPfee4958+ZhtVrFKrTvo7GxkSVLlrB69WqRei8sLOT8889v0+3r8Xj48MMPKS8vR5IkJk2aFFU/LSsrEyJbWlpKt25he0J1BMblcokRmMgU7uHDh/H5fOj1+jZ1UvX8qMfWEV9RdbZUG4fROBP5qfaZngqEQiHKysro1KnTETNzR+O0EtPMzMx2zZBtNtsRxxZOVSKH/I/UuKIoiuiajY2NbfOBrYqi1WptY6JQV1cH0Ga8pa6uTqwVay2Yal1Wp9MddURn6NChVFVVsWLFCt577z3cbjcTJkxoV1Dsdjtbtmxhy5Yt7NixQ7znzMxMzjvvPHr16tXmcbW1tcydO5eWlhaMRiPTp08X6VsIC2lZWRkQjlbV1DOEZ1PVEZiSkpI23rqRTlDtnXfVMrC9Ri8NjbONM7kB6a677qKkpIRZs2YRCoUYM2YMK1euJCYmhk8++YSxY8ce0/OdVp8Ww4YN49NPP4362hdffCH2Up5OdMR6zufziTVrrcUy0uKu9VWUy+XC5XIhSVKbcQ81Ym1vhlSNyMxm81EjLZ1Ox0UXXYTVamXhwoXMmzePefPm0a1bNzIyMpAkicbGRhoaGqipqRECCuGa57hx44SNX2t27tzJvHnzCAaDJCYmMmPGjChT/bKyMjZv3gxA//79o3atVldXC8OJ4uLiqMgyFAqJ8Zj21sdB+GeinoMTqQVraJwpnMmR6fvvv89VV10FwMcff8z+/fvZsWMHr7/+Or/73e9YsWLFMT3fSRVTp9MpxhYgPPqyceNGkpOTyc/P54EHHqCyspLXXnsNgJtvvplnnnmGe++9l+uvv55Fixbx7rvvMm/evJP1Fo6LyOXRHTFpiImJadPc4vF48Pv97VrcqWb3iYmJbZ5frbG2Z1R/LA006uqz+Ph41q9fz8GDB9m1a5ew/YskPz+fkpISSkpKjri6TXUuU
u3BunTpwrRp06IuIiKFtF+/fvTs2VN8r7m5Wbx2586d21wsVFdXEwwGMZlMRxz7UbMAFoulw1Gpes7U9LiGxpnEmVwzra+vF58Fn376KRdffDHdunXj+uuv56mnnjrm5zupYrp27VrGjRsn/q02Cl1zzTW88sorVFdXC3MBCH/Azps3j7vvvpunnnqK3NxcXnzxxdNuxlSNSvV6/REFLBAICBP49hqP1Kg0Pj6+TfOLmspsvSat9a7T1qjPozYFdYTRo0czdOhQqqurqampEWn45ORkkpOTycrK+t76g8Ph4OOPPxZRZWSjkcqWLVuihLRXr17iex6Phy1btqAoCmlpaW3qrw6HQzRW5eTktHvOIxu9jlYvbo16zmRZFhtjNDQ0Tn0yMjLYtm0bWVlZzJ8/n+eeew5ArKo8Vk6qmI4dOzYqBdia9tyNxo4dy4YNG37Eo/pxkWVZRDFHi0ojo6TWP1hZloU4tBYqn88nmpJaC6a6j1Ov17fr6BO5mu1YhMFkMh2zZaBKeXk5H3/8MS6XC5PJxOTJk+nRo4f4vqIoUTXS1kKqdu4Gg0Hi4+MpLi6OOu5QKCRGgVJSUo4olB6P54iNXkdDp9MhSZJoKDtZuxQ1NH4MzuQ073XXXccll1xCVlYWkiSJscvVq1dHlY86ymlVMz0TiLSe68g4THsf/g6HA1mWMRqNbb6vNiW19uiF76LZhISEdqMz1fNXdf/pSDfr8RIKhVi+fDmrV68GwlH0hRdeGNWVrCgKmzdvZsuWLUBbIVUUhZ07d+J2uzGZTPTu3bvNOa2trRXp3fbqxCpqrdRqtR5zdGk0GsUSAp1OpzUuaZwxnMkNSH/4wx/o3bs3hw4d4uKLLxafl3q9nvvvv/+Yn0/7q/8J6aihvdfrRVEU9Hp9u5GOKoqJiYltPvjVNG7rURn4zv3pSCKp1+tJSkqioaGB+vr6H01MGxoa+OSTT0RKuE+fPowfPz7qvapzpOr4y4ABA6IiVgivkKurq0OSJHr16tXm4sHj8YiLi/YMKiJfS02pH88YlcFgECv0fD4fiqJoa9g0zgjO5MgU4Gc/+1mbr11zzTXH9VyamP6EqOMwRzNpgO9SvDExMW3EMhgMivGN1qlaWZaFeLRXE1VTw0cTybS0NBoaGqisrGx3v+iJEAqFWL16NStXrkSWZSwWC5MnT44abYGwuK1fv54dO3YA4Rpq67TL4cOHxd7Vbt26tTkX6o5SCJ+no3XnBgIB8XM5nqhSkiRMJhOKohAKhfD7/YRCIYxGo2bmoHFaIwPH2lp3Kkem//znPzt831/+8pfH9NyamP6EdGQcJhAIiPu1NzurRpcWi6VNFOV0OgkGg+j1+nYFUxXpowlLQUEBO3bsYNOmTQwdOvQHi7Bqa2v57LPPxGhO165dmTx5cpsRFdW0XvXaLS0tbSO2LpdLCG1eXl4bJycIp8I9Hg86ne6IHcQqkTXs420gkiQJs9ksfn6hUIhQKCTSvkf7mWtoaPw0PPHEEx26nyRJP42Ydu7cmeuvv55rr7223QXSGm3p6M7LozUewXdi2l4D0dHSv5HPfbRu1eLiYpYtW4bdbqesrIwBAwYc8b4dweFwsGzZMlH3tFgsTJw4kR49erQ5RlmWWbNmDXv37gXC5hCqRaBKIBBgy5YtyLJMYmJilJlD5POoKeSUlJTvvSD4oVyM1AjVYDCI6FSWZfx+vxhjMhgM6PV6JEnSxFXjlEdWjmOf6THe/6dk//79P9pzH5cz91133cUHH3xA165dmTRpEm+//XbUZhaNtnRkHOb7Go+CwWCUAXtrIsW0PdTnPppblF6vZ8iQIUC4q011RTpWPB4Py5cv54UXXhBC2qNHD66//np69uzZRkhCoRArVqyIWqPWWkhVq0CPx4PJZKJnz57tnkt1IbrBYDji9ptI1IucH8qoXqfTYbFYiImJiVpgoAqrx+PB7Xbj9Xrx+/0Eg0ExWqOhcSpxNuwz9fv97Ny585hGAtvjuMV048aNrFmzhh49enDHHXeQlZXF7bffzvr160/ogM5EfqjGo8gUb+vvK4oivt+emKoNMsD3jm+UlJRgs9lwOBzMnTv3mH7J7HY7ixYtYvbs2axcuZJgMEhOTg5XXXUV06ZNazfFHAwGWbJkCeXl5eh0OkaMGNFuxHngwAEaGxuFVWB77yNyljYtLa1DNUtVRE/0j6k1kiSJjmur1YrJZIoS7FAoJFyXVIH1eDyayGqcMsjHeTsdcLvdzJo1i5iYGHr16iU8De644w7+8pe/HPPzndCl+IABA/jnP/9JVVUVDz/8MC+++CKDBg2iX79+vPzyy9qHwLdEboc5WvQTGTm2lwJUt5m0F5V6PB6CwSA6na5dkwc1+oLv33BiMBiYOXMmRqOR8vJyXnzxRTZt2nRElx+n08mGDRv473//y+zZs1m7di2BQID09HQuuOACrrjiCrKzs9t9bCAQ4KuvvqK6uhq9Xs/YsWPbnVdtaGiIajhqzw4Qwk1WgUAAg8FwxAi9NWrtOdLo/4dG9Tu2Wq3ExMSICyKDwRD1O6HOIbcnsj6fTxNZjZ+UMzkyfeCBB9i0aROLFy+OclqbOHEi77zzzjE/3wk1IAUCAebMmcN//vMfvvjiC4YOHcqsWbOoqKjgt7/9LQsXLuStt946kZc4I1DF9GhNKGoXKLSfhg2FQiLF256QqFFpfHx8u4IdKRIdidbS09OZMWMGn376KXa7nQULFrBy5UrS0tIwm82YTCaam5upq6sTtViVvLw8hgwZQpcuXY5aF3S73Xz11Vc0NzdjNBoZO3Zsu7OgXq+X7du3A5CdnX1EO8DIqDQ5ObnDaVu9Xo/BYBCjLcfigHQ8qN3ckT8HRVHEonH1Frl4PLLmHolOp2tz02qxGj8UZ7Kd4Ny5c3nnnXcYOnRo1N9Mr169RN/GsXBcYrp+/Xr+85//8N///hedTsfVV1/NE088ETW+MGPGDAYNGnQ8T39G0dHGIzUqVaOV1rhcLhRFwWQytTsLqUatR4rYIn9ZOhrRdO7cmRtvvJGNGzeyevVqHA6HeJ3WZGRk0KNHjzYG80fCbrfz5Zdf4na7sVgsjB07tt36pizLbN26VTgcFRYWHvE5fT4fXq8XSZKOeYWSxWLB6XRit9tFzfOnRG1Ian0B0J7IRkal7Yms+jxqfV4TWA2NttTV1bV78a4uCTlWjktMBw0axKRJk3juueeYPn16u3XALl26cNlllx3P059RqFHp0RqP4Lv1Z0f6EFdFLC4urt0ftBq1HmnsRR37UBTlmKIvg8FAaWkpffv25eDBg6Km5/P5sNlspKWlkZKSckw2evX19SxevBifz0d8fDzjx48/4nHv27cPh8OBwWA4YsORijpHGxcXd8zzonFxcaLrtqmpCZvN1u6c70/N0US2dQSr/r867xqZjVBFVY2IT/b70jg9OJNNG0pLS5k3bx533HEH8F3A8eKLLx7XJrLjEtN9+/Z9rw9rbGws//nPf47n6c8YIhuPjvbhHgwGRbdve2KqKIowamgv8oz8fnv1UvhuDlIVwmNN
ZRqNxqNGhR1l3759rF69GlmWSU5OZty4cUe8gGhoaBDm9927dz9qF3JkA9bxODdJkkRycjItLS14PB7sdjt+v5/Y2NgTmj/9sWgvVQwIIW0vXRzZhBb5eE1cNY7EmSymjzzyCFOmTGHbtm0Eg0Geeuoptm3bxsqVK1myZMkxP99xNSAdj6H52UhkOu5odcrIFG979/P5fASDQSRJalcEA4GA+JA8mkiqYqQK70+J6rP79f9n77zj26jv//+6O+0tWbK8984ekMUmEFoo8OtglAKlFFpa6ODbAR1A6QBaSumgQPkWKF+gtKWMltIECCSskL3jON7bsq29193vD/H5RLJlW5LtOOOej4ceTqTTLUn3uvfesgU8z6OkpARr166dUEgjkQhtzFBcXDxuAs5YkqfsTOTqngqGYaDX6+n7Q6EQ7HY7RkZG4PF4EAqF0sYtjydIFyeZTEbLc5RKJeRyeUrMntzohcNhmuBEamPFxCYRAqkzzfZxInDGGWdgz549iMViWLBgAd544w3k5+djy5YtWLZsWdbrm9EOSNdffz16e3vx9ttvz+RqT1gySTwCjrp4J7K8JptrChwVY7lcPqlo5+Xlwel0YnR0FKWlpZkdxAwQjUaxdetWmo3b1NSExYsXT3hOBEHA4cOHEY1GoVKpMmprmNxbdzot/BiGgUajgUwmo7WgJPmLfA4kYYk8iAv/eG3GQFy8xDuS7AYm4kks1+R66OQGEyKnJkIO4iicIGIKJDq+PfHEEzOyrhkV06KiohkrfD/RycbFS5abyEojF/GJXLhTxVsJZrMZbW1tdHj4scDpdOK9996D1+sFwzA4/fTTp3QX9/b20nrSpqamjMSRNA2ZqRFoMpmM9tslrnFiuZHHRI1KxmbWkr8T/TvdcrMJsV7TiSv5LibHXIkrWGyJeOrBCyx4IbtrerbLzyXxeBwvv/wyrRZoamrCZZddllOP7mmLaXIrtvvuu2+6qztpIBeiqWpLiRCOLegnCIJAS0+mK6bEVTo6OopgMDhpDHIm6OzsxNatWxGPx6FSqbBmzZpJx6ABiUQr0vKrpqZm0j7CyWTSFCMXGIaBUqmk54pYb+QmiNR8JrtHJypjyWabYwU5OTOXiBv5/0wcIxFXmUxGj4fMtSXCmtwSURRWkROdgwcP4tJLL8XQ0BDt//3AAw/AYrHg3//+N+bPn5/V+nIW0z//+c/4zW9+QxuS19bW4lvf+ha+/OUv57rKk4rkLN5MXLyTxQ7j8TgYhplwmUyFRKfTwWw2Y3R0FK2trVi4cOGUx5ELoVAI27dvpx1FCgsLsWbNminHm8XjcTQ3N0MQBJjN5rQN7CciebD5bMKyLORyedpjSVfCkpxhm8m/yXrGZuNOBBFa4pJN/ptLSUxyYhIR1lgslpLURLKeOY6DVCoVS29OYk623rzJfPnLX8a8efOwY8cOWkrndDrxxS9+ETfffDM+/PDDrNaXk5jeddddeOihh3DbbbfRFOItW7bg29/+Nnp6enDvvffmstqTBnIxBCZ38SbHqCYSSmKVksHd6SDryMQqq6+vp2I6b968GR0RJggCenp6sH37doTDYTAMg/nz52P+/PkZWVDt7e0IBAKQSqWoq6vL6gJNjp3ETueCiTJsM2WimtLkBxE1YgmTcEK6mwhicUql0pS/2VizLMtS1zkR1rEWa/J2RFE9uUg0bchWTGdpZ2aYPXv2pAgpABiNRvz85z/PqUdCTmL66KOP4oknnsDVV19Nn7v00kuxcOFC3Hbbbae8mGbq4iUxN2JNpCOT5vTJVvBUlJWVYdeuXQgGg2hubs7alTERLpcLO3fuxNDQEIBEf+BVq1alHVKejpGREQwMDABITK7JNvZJlg+Hw4hGoyfkcO6Jakongohqcrwz+a8gCCkj/QjEopRKpZDJZBmLIBFWqVQ6TljJdkRr9eTiZI6Z1tXVwWazYd68eSnPDw8P51QGmJOYRqNRLF++fNzzy5Ytm3U324lAcuLGZBcUIqaTuT+nyvRNJtML4pIlS/Dhhx9i3759UCgU06of9fl8OHToENra2iAIAliWxbx587KyesPhMFpaWgAkWhFmMullLHK5HEqlEsFgEL29vaioqDjpk+FIzDTdjUOyxUpivNFoNEWAkycCkYQr8pjsuzTWFTw2dkzmuJLh6KKonricbHWmpBYdAO677z584xvfwD333IOVK1cCAD766CPce++9eOCBB7Jed05ieu211+LRRx/FQw89lPL8n/70J1xzzTW5rPKkIdnFO5mYkE5EwMRimskyuVBRUQGbzYb29nZs3boVgUAACxYsyPiiR3rgNjc3o7e3l8b6SktLsXTp0oyThsi6yPgjjUaTdlpMJjAMg+LiYnR0dCAYDGJgYADFxcWn7IWcTKwhzfUJpIk+eUQiEQiCQOOgBNK2cmx96mTbIVYxEVbi6ifu5VP1sziROdlipmNnPQuCgCuuuCKl/hoAPvWpT2U99GJaCUhvvPEGVfStW7eip6cH1113HW6//Xa63FjBPdnJtFEDiX0BE8c6iSVBuhdNxNgvwlQwDIMVK1ZAoVDg4MGD2L9/PwYGBlBVVYWSkpIJZ6k6nU709vait7c3pfFDQUEB5s2bN2ED+skYHBykZTCNjY3TsiblcjlKS0vR3d0Nt9uNUCgEo9EIvV6fU6r7yQixKEmMntz8RSIRWv6TnGTk9XrBcRzkcjmddDORKCZbq0SsiVAT17soqiJzyTvvvDNr687pCnPgwAEsXboUAGh3fbPZDLPZTAdBA5m5HU82MnXxEiuAxJemWmaydRGhGBsbmwyGYbB48WJotVrs2LEDdrsddrsd27dvh16vT2m47/F4aK0rgeM4lJWVobGxMeum8gSfz4e2tjYAiV7OE5X+ZINGo0FJSQkGBwcRDocxNDSEoaEhqNVq2gmIWFtTNYHPJiEoXfu+sZm6yY/kbSRvP10NanJJDPmbnLU7neYKyWUxKpUqpTNScm1tIBBAIBCgWeVkPmu6bTIMQ2OrxL0siuqJyckWMz377LNnbd05ielsqvuJTiYuXuCoUE6WaEPEcapkHGK15pLJWl1djcLCQnR2dqKvrw+jo6O0YXy67RQUFKCsrAyFhYXTSvKJx+M4dOgQeJ6H0WhESUlJzusai16vh1qthsvlgsfjQTAYTOlglExykwQyCCC5ROVEITmjlggZGZU3UZvKidZD3LYajYaGGkjjCp7nEQwGEQwGwbIsrcFN911IdvGmE1WyX6KoHr+cbG7edAQCAfT09Iy7fmZbOij6vmaQ5PmTMyGmyZbpZBAxJZm/2UImzc+bNw/BYBAOhyOlaF+j0UCv10Mul8/Yha+trQ2BQAAymQyNjY0zfkGVSCTUW0LclWOtrXSW4kRMZCVONE90og5IZF1jjzdZwCeyiJPLYibL3B07X5acD+KqJY9MPk9iiSoUCiqEoVCI9ikmNykSiQRqtRoKhWKcp2WsqJI4bTgcphnCM1miJTJz8AKD+EkqpiMjI7jhhhvw3//+N+3rxyxmKjIeIqRTtYQjF0RgcqHMpFY
VODp2zev1guf5acUdlUoliouLc35/JgwPD2NwcBAA0NjYOGMtACdCJpONyxBON7aMQD6/Y9nibzqQMpWxiUXkkdytKdk6T+7upFQqoVarJ/2ukdi9XC6HTqejTfLJIAa32w2PxzPhupJFlewnz/MIhUK0+9LxfJ5PRU5my/Rb3/oWXC4Xtm7dinPOOQcvv/wybDYbfvazn+HXv/511usTxXQGIeI3VY0dKR8iFs5U65vqrl2j0YDjONqQPdepKccCt9tNp8GUlZXlHG+dLtNtsHA8Qay7iW5KSB9h4q5NtixJLJQgk8mgVquh0WigVqsnPD/JFmvyepLjq3K5nA4NGPteEpMnXgJiYYuu3+MLHiz4LIeLZbv8XPH222/j1VdfxfLly8GyLMrLy3HBBRdAp9Phvvvuw8UXX5zV+kQxnUEydfFm2rEoUzElo8McDgccDsdxK6aBQAD79+8Hz/PIy8tDRUXFXO/SKQHHcVCpVClZ2sRlGwwGqfgRFzgZkA4kQgAajQY6nW7CjHKWZan4RiIR+P1+Kt7hcBgymQwajWbc+1mWhUKhoGJPXL8kI/hkrxMWmVv8fj/tFW40GjEyMoK6ujosWLAAu3btynp9opjOIERMp7oIZDJNBkgdIjAVZrMZDocDg4ODKCsrO+7u7MPhMPbu3YtYLAatVoumpibxYvkxY13O5LnZdDcnu2wNBgOAxPcyEAjA7/fD5/MhEolQoR0eHoZMJoNOp4Ner0/b/jJ5ndFoFH6/n85JdTgckMlk0Gq14yxVjuOgVCqp6zcejyMYDNKsa5G542R289bX16OlpQUVFRVYtGgRHn/8cVRUVOCxxx7Lqi84QfymzhDJF8KpRCLTWGg2WK1WdHR0IBQKYXR0dMph2seSYDCIffv2IRwOQ6lUYsGCBSeFezUdyUk6ybWbyfHMdMlDmZI8tWVsW8DkRgtT1YSmQyKRQKfTQafTAUgkwPl8Pni9Xvj9fkQiEYyOjmJ0dJSKsF6vT+thkUqlMBgM0Gq18Pl8CAQCiEQisNvtNOaa/P1Pdv2SrOFwOExdv8fbzeGpwskspt/85jdp7sbdd9+Niy66CM8++yxkMhn+8pe/ZL0+UUxniEyTj4DseulmCsdxKCoqQk9PD3p7e2E2m4+LC5Db7caBAwcQjUYhl8uxcOHCWU84mm2IYBIrjpSKBINBhEKhWS2pSW6oMBXJpStqtRoqlYr+zcQrIJPJYDKZYDKZEI/H4fV64fF44PP5EA6HYbPZYLPZoNVqYTQaodFoxn3nOI6DXq+HRqOhohoOhzEyMgKVSgWtVpuyL8T1O/bGI12WsMjswyOHOtMTJGb6hS98gf572bJl6O7uxuHDh1FWVgaz2Zz1+kQxnSEytUqBzGOrZF2ZzsYsLi5GX18fPB4Pbac3VwiCgKGhIRw5cgSCIECtVmPhwoUz2hbxWECSa7xeL3w+H3w+H/x+/5Q9qImFmNzvlliRyZbl2BKb5JuxicpkkrNzSakJyeAdWxNKSleSB8KTGCcpedLpdFAoFJPefHEcB4PBAIPBgHg8DrfbDZfLhWAwCK/XC6/XS7OmDQbDuN8BEVWVSkXLlAKBAEKh0LjtEyuV4zgaSxXdvnPDyWaZJnfnm4psu/eJ38wZItN4aXIZxlTLZtvZSC6Xo7KyEu3t7Whra4NSqcx4astMEgwGceTIEZrEkpeXh8bGxhPiQhgOh2mJB7HCJrqZUSqVNLEnucRkppJnkoU1Wy8GKTkhFjMRVb/fj3g8To+PTOqRyWQwGAwwGo0wGAyTDlbgOI5arKFQCE6nEy6XC5FIBIODgxgeHqavj/3MpVIpTCYTPc/xeBwulwtyuRx6vT7lOEkslWQekxsEcdTbseNkE9Pdu3dntFwu36/j/+p2gpDs5s1kuUyWzWXgdUlJCbxeL4aHh3HgwAHMnz//mAlqNBpFX18fent7aU/hioqK4zIhihCNRuFyueB0OuF0OtM2vuA4DlqtllpzxFV6PMd9WZYdl8ELgFp5xJp0u9002Wh4eBjDw8MAElm8pOmFVqud8PNTKBQoLCxEfn4+XC4X7HY7otEoRkZGYLfbYTKZYDabx50ruVwOi8VCrX3i+tXr9SlCTkpwkmPOPM/PaAMRkVOH4643r8h4MrU2s4mtJs/ozBSGYdDQ0IBoNAqn04l9+/ahsrISJSUls3bxD4VC6O3txeDgID0+g8GAurq6tE3z5xq/3097EadrnahWq6n7U6fTQalUnjQXboZhqMharVYAoJYquanwer20xVpPTw/kcjny8/ORn5+fNi4KJG448vLyYDKZ4PF4MDIygnA4jNHRUTidTlgsFphMpnF9iLVaLRQKBdxuN72xCYVC0Ov19LdE3L4Mw9C61FAoNKVrWmT6CAKTdcxUOI4t09lEFNMZINl1m+mPO5PlyB06SWrJdN0sy2LBggVoaWmBzWZDZ2cnBgYGUFlZCavVOiMXoEgkgpGREQwPD6cIkkajQXl5+XGTAAUkPh+/34+RkRGMjIyMa7dH3OHExXm8uKOT46XJZTIzfV45joPRaITRaERlZSVisRjsdjtGR0fhcDgQDofptCCVSoXCwkJYrda0iWSk5lmn08Hr9cJmsyESiWBoaAgOhwMFBQXj6qClUiny8vKolRoKhRCNRmE0GlMyhclQCOL2DQaDYmLSLHOyzTOdTY6Pq8ZJxExe6Igri8SL0tX2TQTLsmhoaIDBYEBnZyfC4TAOHz6Mrq4uakEYDIaMrFUSf/P5fDTxZGzTeIPBQDsaHS8iGo1GMTQ0hMHBwRQBZRgGBoMBeXl5yMvLy2jw+kwgCAI9j6SWk2QAB4NBhMNhmsFK+iKnI7k8hiQ6keHoxAWt0Wig1WpzGkYgkUhgtVphtVoRj8fhcDhgs9lgt9sRCATQ3t6Ojo4OWCwWlJaWpm0SwjAMdDodtFotnE4nhoeHEYlE0NPTA51Oh4KCgpR9I1aqXC6H0+lEPB7H6OjouPhtchyVnE9RUGePky1mOpuIYjoDjO3rOlMwDAO1Wg2fzwePx5OVmJL3k3hWf38/uru7EQqF0N/fj/7+fgAJVzIZTZbcBpH0eiVdctJd2LVaLfLz82GxWLLet9kkGAyit7cXQ0NDKW71vLw8WCwW5OXlzbr1GYvF4HK5aFcqktSUy2SfsSSXx6Rrap+MUqmEXq+nyUUmkwk6nS5j8eE4DhaLBRaLBbFYDDabDYODg/D5fDTGajQaUVZWNm7wMpA47yaTCXq9nsZRyUi/goIC2jCCIJPJYDabaUKTy+Wig+PJukn5jCios8/JNoJtNhHF9BiTXO6SievWYDDA5/PRuFMuYk1mjxYVFdEkEeK+y6ZmUaVSQafT0RKJ461e1OfzoaenhybRAIn4Z1FREaxW66wKKKm7JK5kh8MxoWWpVqtp/1uSAUymuCSXzyRPnUl2+cbjccTjcXqzQ0phSLciYvWGw2GazTs0NES3TybqWCwWejOUiYdCIpGguLgYxcXF8Hq96O3txfDwME3e0uv1qKysHCeQQOI7SMSzv7+f3tT5fD4UFhaOy+
I1mUy0WYTP50M8Hoder59UUE+m2PbxwrG0TB955BH86le/wtDQEBYtWoTf//73OP3006d83wsvvICrr74al112GV555ZWctj0TiGI6A2QTL01eJhMx1Wq14DgOsVgMPp9vWn13k8eSAQk3KLnYkrFYxJIjszGlUilUKtVxnT0ZjUbR2tqaIqKTWUszhdfrRXd3N3p7e+FwOMa9rlAoUmKxJJZ4rLKAw+Ewzdh1Op1wOBxwOp2IxWJ0aDpw1PosLS1FaWlpRm5v0hKysrKSJp+53W7s2bMHRqMRNTU1aYe9KxQKVFVV0ZsOt9uNQCCAkpKSlGQ14ibmOI7OpCWzb6eyUI/X76nIxPztb3/D7bffjsceewwrVqzAww8/jHXr1qGlpYX2z01HV1cXvvOd7+DMM888hnubHkY4kSYgzwAejwd6vR5ut5u2TZsuJLuQZEpOhiAIsNlsEAQBFoslI2tpaGgIdrudXojEi8VRnE4nmpubqXVtsVhQVlY2a83+o9Eouru70dHRgZGRkZTX9Ho9tfQsFgvUavVx91nxPI/m+78Pu9uNYGkt/EXVKeVADMMgPz8flZWVKCsryzjmGgqF0N3djaGhIXqTWFpaivLy8glvHgKBAPr7+xGJRMAwDIqKitJataFQCC6XC4Ig0K5MyeeVJCMBiRuD4/nGb7aY6esaWd83Xv8G5OrsGq2E/WH87pO/y2pfVqxYgdNOOw1/+MMfACQ+09LSUtx2222444470r4nHo/jrLPOwpe+9CW89957cLlcomV6KkFGf5Ekk0zE1Gw2w+l0IhQKweFwjJvNeSoiCAL6+vrQ3t4OIBEbbGxsnLEbpLG43W60tLSgs7OT1v0yDAOr1Yry8nIUFRUdl2VAY2FZFlrEwXYfgkIKVHz9O/B4POjv70dPTw/sdjttE7hjxw6UlZWhpqZmyuxshUKB+vp6lJWVoa2tDXa7nbrc6+vr047aU6lUqKqqwsDAAN2HUCg0LuOcWPgOh4M2zU8W1GQLNR6PIxKJnHCdto5XphMz9Xg8Kc+TRLmxRCIR7Ny5E3feeSd9jmVZrF27Flu2bJlwO/feey/y8/Nx44034r333stqH2cDUUznAKlUShN8MkncIdmVpLuMTqfLKUvzZEEQBLS0tFA3ZUFBAWpra2fFfepwOLBnzx7aEBsAdDodqqurUVFRcUII6FiKv/VT+A/sgLppCS1l0ev1aGpqgs/nQ1dXFzo6OuD1etHR0YGOjg6YTCY0NjairKxs0kQfpVKJ+fPnY3R0FK2trQiFQti7dy+KiopQXV097jPiOA4lJSXU7Wu32xGJRFBSUpKyHWKREkF1Op0pLl9ikZJB5SzLntK/kZliOqUxpaWlKc/ffffduOeee8YtPzo6ing8TuueCVarlc4+Hsv777+PP//5z9izZ09W+zabiGI6AyT3Us2EbNsEAokYILFOu7q6UFFRccpeLHp6eqiQ1tTUoLi4eMbdetFoFPv27UNLSwt1W5aUlKCurm7GanXnCk6lhu70s9O+ptFoMH/+fMybNw8jIyNob29HV1cXHA4HPvjgAxw4cACLFy+e9JwzDAOLxQKj0YiOjg4MDAxQ63PevHnjYrLEtSyXy9Hf3w+v14uenp5xwi2TyWA0GmnynNvtTklKkkgk4HmeJmaxLHtcd6k6EZhOAlJvb2+Kp2imvAVerxfXXnstnnjiiZwa0s8WopjOANkmFcnlcni9Xpr0k2niUmlpKbq6uhCJRE5ZQfV6vejq6gKQmEeYy9zBqRgaGsKWLVto2UlZWRmWLFkCjUYz49vKBP+h3YgOD0B/xjowx6ihBBG4/Px8LFmyBEeOHEFLSwvcbjc2b95Mn5/sYiaRSFBXVwez2Yzm5mb4fD7s3LkTTU1NaVtc6vV6SCQS9PT0wO/3o7u7G+Xl5SmCKpfLqYUaDAZpq0eCVCqlGc9k5N+JfOMz1wg5uHmFj5dPHuc3GaTdpM1mS3neZrOhoKBg3PLkBu9Tn/oUfS45cbKlpQXV1dVZ7fNMIIrpDJOJOJKyB1IvmOkdm0wmQ0VFBRXU7u5ulJWVHXclKrMFz/PUUjSbzWl/aNNd/+7du6lrSaPR4LTTTkNRUdHMbSMShuvt1yAvrYR63lLEAz74drwPVdMSSM1WCIIAp9NJM11dtiEMf7gRcYYDd7ANjMGcMjBcAgFKtQZKtRo6nY42mNfpdDMmIgqFAgsXLkRDQwMOHjyIw4cPY3h4GBs2bEBVVRWWLl066XfYZDJh+fLlOHDgALxeL/bt24fq6upxbkAgUTZUXl6O7u5uBAKBCQVVp9PRQQQSiYRau2RAOamNDofDp2RC0omETCbDsmXLsHHjRlx++eUAEr/FjRs34tZbbx23fENDA/bv35/y3I9+9CN4vV789re/Tfu9OhaIYjoDkItbclvBqZYnP/hQKJSV+4MIKulq1NbWhvz8fOTl5Z30FwyHw0EvnrW1tTN6vIIgYOvWrejo6ACQcB8vXbp0xi1/+6vPYvSfTwEMi5o/voTBPz+EvkP7YC+shn/BKthstvG9mI0fj9LzhwF/f0bbkclkKCwsRFFREYqKilBaWjrtmy6ZTIYlS5agrq4Oe/fuRWdnJzo6OjA4OIjVq1dPenMjl8uxePFitLa2YmhoCO3t7YhGo6isrBz3OapUqhRB7evrQ2lpacpyarUa8Xgcfr8fLpeLlnIBR39fJCEpFoudch6cmeJY1ZnefvvtuP7667F8+XKcfvrpePjhh+H3+3HDDTcAAK677joUFxfjvvvug0KhwPz581PeT7LAxz5/LBHFdIZgWRbxeBw8z2cUp1EqlQgGgwgEAtBoNFnFdmQyGSorK9Hf349AIACbzQa32w2r1XpclmPMFC6XC0Ci/GWmszX37t2Ljo4OMAyDNWvWoLy8fMr3CHwckcE+yApLwLCZfX6c1gABgNNYgDff/xCtvA7hhrMSL/b0ADhaD2w0GqHT6SCPRcBFQlCVVVFR4Hkezg/ehGv3R4hyMqgu+H/wRuO0lpR4Lrq7uwEkvp8lJSWorKxEXV1d2uzaTFGr1Vi9ejVqa2uxZcsWeL1ebNy4EQsWLMD8+fMnTFDiOA719fVQqVTo6OhAT08PBEFIW+5FBLWrqwterxdDQ0PjXPparRaxWAzhcBhOpxNms5lum+M4yGQy2pSEzI0VyY5jJaZXXnklRkZGcNddd2FoaAiLFy/G+vXraVJST0/Pcf/5iXWmMwQZ0Ez6pU6FIAh0XBXpo5otgiDA5XLBZrMhHo8DSLjk8vLysmoZd6KwY8cO+Hw+NDY2jsv8mw5tbW3YunUrAGDlypUZx1v6//BTeN7fAO3pZ6Pk9p9PuXw0GsWOHTuwd+cOeAJHaztlHIuy0jJU1dWhqKgIeXl5Gd1cxX0ejL7yf5CXVMJwzifp8zzPY3R0lCb+9Pb2jpuOY7Va0dDQgEWLFk2rFWQ0GsXOnTtpiVJBQQHWrFkz5Tr7+/vR2toKIDE2sLq6Ou1NoNvtRl9fH1332LIwcqzxeBwKhSKlSQdp5MDzPC2fOVlvNGerzvSGV
+6ALMs604g/jKcuv3/Gr7HHO6JlOkMktwnMBNJ3lzSNz2U+JsMwMBqN0Gq1GBkZoeOr+vv7MTAwAI1GA51OB41Gc9xMQskVnufh8/kAJBJVZopQKITt27cDSLiIsklcCHcnxCDU3Tblsk6nE6+88gpt9CCTyVBfX4+mpqacx+NxGh2sX/g6ACDmsoNVa8FKE4PJSfLQ4sWLaRy2s7MTbW1t6OnpobWkH769EactXYIV563NyRUqlUqxcuVKWK1WbN26FUNDQ3jjjTdw/vnnp+2ARCDZwEeOHEFfXx8kEgkqKirGLafX6xGNRmGz2TA0NAS5XJ6SCMayLAwGA+x2Ox0YQMqVksMpPM+L7t4ciAtAPEtLM35KmWdHOS6usNn0ZHz66aepH51A4iNzSbY9d4GEFSmVShGNRuF2u3OeuCKRSFBYWAiLxUL7pEajUToAGkhcvFUqFe0FSxrbnyiMnYM5U3R0dIDneZhMJixcuDCr9xbd+mO4Nr0O/ZnrJl2utbUVr7/+OsLhMFQqFc455xzU19envbBHhvrQ+6vvQ6I3ofR7vwSrmLq1n2vzfzH46M8htRaj6sH/AytNjY2SZvMmkwnLli2D3+/Hgc0bsWv7NniVOny4dz/2tnXgzDPPRHF/C7xbNsJy5c1Qz1ua8bmorKyEyWTCO++8A6/XizfffBNr166dNAO6qKgIgiCgtbUVXV1dUCqVaT0OeXl5CIfDcLlc6O/vR1VVVcq5k8lk0Gq18Hq98Hg8kMlk9OaRZVnR3TsNxBFsmTPn3yrSk/Huu+/Grl27sGjRIqxbty6lz+pYdDodBgcH6YPEheYSlmXpj5S4XKeCFMwDiT6q070hkEgksFgsqK2tRVVVFcxmM3U5kwkcg4OD6OjoQHNzM1pbW2nNptPphN/vRzQazbhe9ljCMAy9QJIORNNFEAS0tSWsylwSmhTltSi4/ptQVjVMuMyuXbvw8ssvIxwOo7i4GNdffz3mz58/oYXk3bYZkf5uBA7tRrDtUMprjg0v4chXL4X938+nPB9qTywXtfWDD6SOxkuHWq3G6Rd+ApfkybEy5oJOo4Hf78f69evxz492Y7CvD6Mv/2XK9YxFr9fjwgsvhFarhd/vx5tvvklv5iaiuLiYZl8ePnw47bB2Mv1ILpcjFouhv79/3HdUrVZDJpPR0Efy6yR7HgAtRxMRmWnmXEwfeugh3HTTTbjhhhvQ1NSExx57DCqVCk8++eSE72EYBgUFBfQxk/Gz6UBcddlc7KVSKb17d7vdMzKii2EYepdfU1ND27yZzWao1Wq6n5FIBF6vF3a7HQMDA+jq6sKRI0fQ3NyMtrY2OhXE7XYjHA7P+UWICNBMnCMgkR3s9XohkUgySjjKFpvNhrfeegsAsHjxYlx11VVpY+OuTf/B4evOx+CfHoBu1flQVDdCs2wNlHWpmYnDz/4ecZcDw399HEe+/En0/ur7EHgejEwBRiaH4fxLIdFnlljERyKQ51mwYMVK3PSVr+Dcc8+FTCaDS23A+/VnYKhh6mkd6VCpVLjgggug0+kQCASwcePGKW8Sq6qqkJeXB0EQcPDgwbSfL8uyNKPX7/djdHQ05XVyY8owDKLRaMq8XeLuBUAn7ohkBklAyvZxKjKnYkp6Mq5du5Y+l0lPRp/Ph/LycpSWluKyyy7DwYMHJ1w2HA7D4/GkPGYLIlLxeDwr4dFoNPSumrRLm0kkEgm0Wi2sVisqKirQ0NCAuro6lJeXo7CwECaTie4DAFqf5/F4MDIygr6+PrS1tdHh4sPDw/D7/RnHh2cKIkTpJrTkAulApVarZyWW1tzcDCBRZnPBBRdMGBd1v7cBQiQM1+b/QmK2ovLnT6D0uw+AlaUmfnCahBeDlcsR93ng2/kBYm4HXBtfhRAJI9hxGEPP/B59D/0QMZd93HaEWBS+PR8h5nbCueFF2F99FkNP/BLRgW6cdtpp+OKnL0dJPACBYbG5o58mZWWLUqnE2rVrqYX63nvvTfpdYRgGjY2NUKlUiEQiaG5uTvv7kcvlNKN3eHg4pUE/kPiek4QXr9ebclOb3F5QtE4zh/TmzfZxKjKnRz1ZT8bk+YvJ1NfX48knn8Srr76KZ599FjzPY/Xq1TTjbyz33Xcf7T2q1+tntaA3OR6TjXVKEolmU1DHQixik8mEwsJClJeXo7a2Fk1NTaipqUFZWRmsVisMBgPNguR5Hn6/HyMjI9SKJTMpj8XFyWKxAABGRkZmdHuzse+CIODIkSMAgKampkldyOZPfxHK+gWwXncbXS7Yfhj2/7yQyNh99f9w5KaLYVz3aRTe8kOUfu+XUNbNh+nSayA1mmH53JchKyqD7vRz4Hz9b/Bu2wzX2/8et53+3/8Evfd/B10/vAmy0ioADDitHhJDIkM2tvt9LN77FmqGEolVmzdvxjvvvJPT+VEqlTj77LMhkUgwPDyMnTt3Trq8RCLBvHnzwLIsnE4nent70y5nMBioYPb3948TaaVSSW8K3W53yr5LpVJaDz7bv6+TBdEyzZzjIgEpG1atWoVVq1bR/69evRqNjY14/PHH8dOf/nTc8nfeeSduv/12+n+PxzOrgiqRSFLKZDKNw7EsS/uORqNROBwO6HS6Y94OjbjE5HJ5ikuSWKtk+LTf70c8HofL5aJF82R252xlDptMJrAsi1AoRFP3pwM5r7NhYZPzwrIsqqqq6POej95GsOUATJd+HlJjohWfet5SqH/yKF1GiEXRfe+tEMIhhHs64NuzBXGvG+73NqD6188BACrufYwub/rkFTB98grEA36439uAmGMY6oWpblrXptfh3boJABAP+qBdugY1f3wZrFIFTpnIftUuPxOut17BUjWL4jNWY/P7H2L79u2QyWRYs2ZN1udAr9djzZo12Lx5M44cOQKLxZI2Y5egVqtRU1ODI0eOoLOzk2aqJ0Pip36/H+FwGCMjIyk348TdOzIygkgkkja7NxQK0YlNYu/eyRGQvTgKYgLSsSfbnozpkEqlWLJkCU0kGQtpPZb8mE2IkAiCkHWiDMuyMJlM1EIlQ52PhxgPwzB0FFZpaSnq6+tRUVEBo9FIR8oNDw/jyJEjGBgYmLEkoWQ4jqODgtvb26dtUZIYm8/ng9PpnIldpCRb86SkJ+7zoP/hu+H4798TXZAmgmXBqRPfU4nOAMsVN0FWUgnzZ26Y+D1INLCv+vWzqHtyA5Q1TSmvRe1HE/qKv3EvGJaF1GSmQgoA8uJy1Pz+RVT89HGsWH0GLrzwQgDAli1bJk0InIySkhLMmzcPALB9+/ZxrtmxFBYWwmw2QxAEHD58OO2NjkQioS0eR0dH07p7iQh7PJ6U3w/HcfQ3Krp7p0Z082bOnB51ck9GAunJmGx9TkY8Hsf+/ftnpeF5LjAMQ91MufxYiaCSi0E4HMbo6Cj8fv9x9cMndbJFRUWoq6tDcXExFAoFrWlsbW3F6OjojFt9lZWVYFkWHo9nXBJKtigUCpSUlAAAbSM4UyiVSmqFkdgpq1BCak2IgKKidsL3MiyHyvufRNldv4fl6q/AeP6lqH7w
<remainder of base64-encoded PNG figure output omitted>", "text/plain": [ "
" ] @@ -241,7 +233,7 @@ }, { "cell_type": "code", - "execution_count": 4, + "execution_count": 5, "metadata": {}, "outputs": [], "source": [ @@ -254,7 +246,7 @@ "nn_layers = [2,24,12,1]\n", "\n", "# Initialize DeepTDA model\n", - "model = DeepTDA(n_states=n_states, n_cvs=1,target_centers=target_centers, target_sigmas=target_sigmas, layers=nn_layers)" + "model = DeepTDA(n_states=n_states, n_cvs=1,target_centers=target_centers, target_sigmas=target_sigmas, model=nn_layers)" ] }, { @@ -267,7 +259,7 @@ }, { "cell_type": "code", - "execution_count": 5, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -354,7 +346,7 @@ }, { "cell_type": "code", - "execution_count": 6, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -393,7 +385,7 @@ }, { "cell_type": "code", - "execution_count": 7, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -426,7 +418,7 @@ }, { "cell_type": "code", - "execution_count": 8, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -480,7 +472,7 @@ }, { "cell_type": "code", - "execution_count": 9, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -522,7 +514,7 @@ }, { "cell_type": "code", - "execution_count": 10, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -564,7 +556,7 @@ }, { "cell_type": "code", - "execution_count": 11, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ @@ -583,7 +575,7 @@ " n_cvs=2,\n", " target_centers=target_centers, \n", " target_sigmas=target_sigmas,\n", - " layers=nn_layers)" + " model=nn_layers)" ] }, { @@ -596,7 +588,7 @@ }, { "cell_type": "code", - "execution_count": 12, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -683,7 +675,7 @@ }, { "cell_type": "code", - "execution_count": 13, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -722,7 +714,7 @@ }, { "cell_type": "code", - "execution_count": 14, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -756,7 +748,7 @@ }, { "cell_type": "code", - "execution_count": 15, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -809,7 +801,7 @@ }, { "cell_type": "code", - "execution_count": 16, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -851,7 +843,7 @@ }, { "cell_type": "code", - "execution_count": 17, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -890,7 +882,7 @@ }, { "cell_type": "code", - "execution_count": 18, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ @@ -899,7 +891,7 @@ "target_sigmas = [0.2, 0.2, 0.2]\n", "nn_layers = [2,24,12,1]\n", "# MODEL\n", - "model = DeepTDA(n_states=n_states, n_cvs=1,target_centers=target_centers, target_sigmas=target_sigmas, layers=nn_layers)" + "model = DeepTDA(n_states=n_states, n_cvs=1,target_centers=target_centers, target_sigmas=target_sigmas, model=nn_layers)" ] }, { @@ -912,7 +904,7 @@ }, { "cell_type": "code", - "execution_count": 19, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -999,7 +991,7 @@ }, { "cell_type": "code", - "execution_count": 20, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -1038,7 +1030,7 @@ }, { "cell_type": "code", - "execution_count": 21, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -1071,7 +1063,7 @@ }, { "cell_type": "code", - "execution_count": 22, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -1127,7 +1119,7 @@ }, { "cell_type": "code", - "execution_count": 24, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -1176,7 +1168,7 @@ }, { "cell_type": "code", - "execution_count": 25, + 
"execution_count": null, "metadata": {}, "outputs": [], "source": [ @@ -1187,7 +1179,7 @@ "target_sigmas = [0.2, 1.5, 0.2]\n", "nn_layers = [2,24,12,1]\n", "# MODEL\n", - "model = DeepTDA(n_states=n_states, n_cvs=1,target_centers=target_centers, target_sigmas=target_sigmas, layers=nn_layers)" + "model = DeepTDA(n_states=n_states, n_cvs=1,target_centers=target_centers, target_sigmas=target_sigmas, model=nn_layers)" ] }, { @@ -1199,7 +1191,7 @@ }, { "cell_type": "code", - "execution_count": 26, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -1285,7 +1277,7 @@ }, { "cell_type": "code", - "execution_count": 27, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -1322,7 +1314,7 @@ }, { "cell_type": "code", - "execution_count": 28, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -1366,7 +1358,7 @@ ], "metadata": { "kernelspec": { - "display_name": "pytorch", + "display_name": "graph_mlcolvar_test", "language": "python", "name": "python3" }, @@ -1380,14 +1372,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.10.8" + "version": "3.9.18" }, - "orig_nbformat": 4, - "vscode": { - "interpreter": { - "hash": "1cbeac1d7079eaeba64f3210ccac5ee24400128e300a45ae35eee837885b08b3" - } - } + "orig_nbformat": 4 }, "nbformat": 4, "nbformat_minor": 2 diff --git a/docs/notebooks/tutorials/cvs_committor.ipynb b/docs/notebooks/tutorials/cvs_committor.ipynb index 0c76e0fe..29a6616e 100644 --- a/docs/notebooks/tutorials/cvs_committor.ipynb +++ b/docs/notebooks/tutorials/cvs_committor.ipynb @@ -133,7 +133,7 @@ "options = {'optimizer' : {'lr': 1e-3, 'weight_decay': 1e-5}, \n", " 'lr_scheduler' : { 'scheduler' : lr_scheduler, 'gamma' : 0.99999 }}\n", "\n", - "model = Committor(layers=[2, 32, 32, 1],\n", + "model = Committor(model=[2, 32, 32, 1],\n", " atomic_masses=atomic_masses,\n", " alpha=1e-1,\n", " delta_f=0,\n", diff --git a/docs/notebooks/tutorials/data/alanine_gnn/alad.gro b/docs/notebooks/tutorials/data/alanine_gnn/alad.gro new file mode 100644 index 00000000..c633b39b --- /dev/null +++ b/docs/notebooks/tutorials/data/alanine_gnn/alad.gro @@ -0,0 +1,25 @@ +Generated by trjconv : Alanine in vacuum in water t= 0.00000 + 22 + 1ACE HH31 1 0.152 0.743 2.212 1.3463 0.7349 -0.6803 + 1ACE CH3 2 0.131 0.822 2.284 0.3769 0.1596 -0.3198 + 1ACE HH32 3 0.108 0.767 2.375 0.4564 -1.3627 -1.1820 + 1ACE HH33 4 0.052 0.894 2.264 -0.4328 -0.2112 1.3689 + 1ACE C 5 0.265 0.895 2.297 0.3480 0.2517 0.0099 + 1ACE O 6 0.269 0.977 2.388 0.4177 0.6042 0.0340 + 2ALA N 7 0.368 0.871 2.208 -0.4973 0.2362 0.3098 + 2ALA H 8 0.341 0.815 2.129 0.3380 2.0597 -1.3356 + 2ALA CA 9 0.488 0.955 2.197 0.3024 0.6262 0.4482 + 2ALA HA 10 0.544 0.900 2.122 0.3036 -0.7766 1.4157 + 2ALA CB 11 0.448 1.092 2.138 -0.8250 -0.8273 -0.4712 + 2ALA HB1 12 0.538 1.131 2.091 -2.1604 -0.2744 -2.7135 + 2ALA HB2 13 0.382 1.084 2.051 -1.4850 -0.3191 -0.0220 + 2ALA HB3 14 0.423 1.156 2.222 -1.1056 -1.9931 0.3539 + 2ALA C 15 0.582 0.976 2.321 -0.6113 -0.1699 -0.1353 + 2ALA O 16 0.703 0.990 2.301 0.0894 0.1629 0.1034 + 3NME N 17 0.532 0.966 2.446 -0.5630 1.1218 -0.6656 + 3NME H 18 0.432 0.954 2.452 -0.4594 0.4793 0.0514 + 3NME CH3 19 0.599 0.972 2.578 -0.1680 0.4666 0.2671 + 3NME HH31 20 0.661 0.882 2.577 -0.5665 0.1075 -2.8701 + 3NME HH32 21 0.656 1.064 2.574 -1.3881 1.2509 0.5814 + 3NME HH33 22 0.527 0.949 2.656 -1.4662 -1.5983 -1.3731 + 3.02334 3.02334 3.02334 diff --git a/docs/notebooks/tutorials/data/alanine_gnn/alad_A.trr b/docs/notebooks/tutorials/data/alanine_gnn/alad_A.trr new 
file mode 100644 index 00000000..6fe9d209 Binary files /dev/null and b/docs/notebooks/tutorials/data/alanine_gnn/alad_A.trr differ diff --git a/docs/notebooks/tutorials/data/alanine_gnn/alad_B.trr b/docs/notebooks/tutorials/data/alanine_gnn/alad_B.trr new file mode 100644 index 00000000..e37d8648 Binary files /dev/null and b/docs/notebooks/tutorials/data/alanine_gnn/alad_B.trr differ diff --git a/docs/notebooks/tutorials/intro_3_loss_optim.ipynb b/docs/notebooks/tutorials/intro_3_loss_optim.ipynb index 36e27ccf..7eb10823 100644 --- a/docs/notebooks/tutorials/intro_3_loss_optim.ipynb +++ b/docs/notebooks/tutorials/intro_3_loss_optim.ipynb @@ -83,7 +83,7 @@ "from mlcolvar.cvs import RegressionCV\n", "\n", "# define example CV\n", - "cv = RegressionCV(layers=[10,5,5,1], options={})\n", + "cv = RegressionCV(model=[10,5,5,1], options={})\n", "\n", "# choose optimizer\n", "cv.optimizer_name = 'Adam' \n", @@ -123,7 +123,7 @@ "options = {'optimizer' : {'lr' : 2e-3, 'weight_decay' : 1e-4} }\n", "\n", "# define example CV\n", - "cv = RegressionCV(layers=[10,5,5,1], options=options)\n", + "cv = RegressionCV(model=[10,5,5,1], options=options)\n", "\n", "print(f'optimizer_kwargs: {cv.optimizer_kwargs}')" ] @@ -155,7 +155,7 @@ "options = {'lr_scheduler' : { 'scheduler' : lr_scheduler, 'gamma' : 0.9999} }\n", "\n", "# define example CV\n", - "cv = RegressionCV(layers=[10,5,5,1], options=options)" + "cv = RegressionCV(model=[10,5,5,1], options=options)" ] }, { @@ -251,7 +251,7 @@ "from mlcolvar.cvs import DeepTICA\n", "\n", "# define CV\n", - "cv = DeepTICA(layers=[10, 5, 5, 2], options={})\n", + "cv = DeepTICA(model=[10, 5, 5, 2], options={})\n", "\n", "# print default loss mode\n", "print(f'default mode: {cv.loss_fn.mode}')\n", @@ -554,7 +554,7 @@ ], "metadata": { "kernelspec": { - "display_name": "pytorch", + "display_name": "mlcvs_test", "language": "python", "name": "python3" }, @@ -570,12 +570,7 @@ "pygments_lexer": "ipython3", "version": "3.10.8" }, - "orig_nbformat": 4, - "vscode": { - "interpreter": { - "hash": "1cbeac1d7079eaeba64f3210ccac5ee24400128e300a45ae35eee837885b08b3" - } - } + "orig_nbformat": 4 }, "nbformat": 4, "nbformat_minor": 2 diff --git a/docs/requirements.yaml b/docs/requirements.yaml index 6693ccbb..d15ecac4 100644 --- a/docs/requirements.yaml +++ b/docs/requirements.yaml @@ -26,9 +26,12 @@ dependencies: - ipykernel - scikit-learn - scipy + - pyg # Pip-only installs - pip: - sphinx-copybutton - furo - KDEpy + - mdtraj + - matscipy diff --git a/mlcolvar/core/loss/committor_loss.py b/mlcolvar/core/loss/committor_loss.py index 03196b0f..5ed4e132 100644 --- a/mlcolvar/core/loss/committor_loss.py +++ b/mlcolvar/core/loss/committor_loss.py @@ -17,7 +17,10 @@ import torch from typing import Tuple, Union from mlcolvar.core.loss.utils.smart_derivatives import SmartDerivatives +import torch_geometric +import warnings +from mlcolvar.utils._code import scatter_sum # ============================================================================= # LOSS FUNCTIONS # ============================================================================= @@ -91,7 +94,7 @@ def __init__(self, self.n_dim = n_dim def forward(self, - x: torch.Tensor, + x: Union[torch.Tensor, torch_geometric.data.Batch], z: torch.Tensor, q: torch.Tensor, labels: torch.Tensor, @@ -103,7 +106,7 @@ def forward(self, Parameters ---------- - x : torch.Tensor + x : torch.Tensor or torch_geometric.data.Batch Model input, i.e., either positions or descriptors if using descriptors_derivatives z : torch.Tensor Model unactivated output, 
i.e., z value @@ -229,7 +232,23 @@ def committor_loss(x: torch.Tensor, if (z_threshold is not None and (z_regularization == 0 or z_threshold <= 0)) or (z_threshold is None and z_regularization != 0) or z_regularization < 0: raise ValueError(f"To apply the regularization on z space both z_threshold and z_regularization key must be positive. Found {z_threshold} and {z_regularization}!") + + # check if input is graph + if isinstance(x, torch_geometric.data.batch.Batch): + _is_graph_data = True + batch = torch.clone(x['batch']) + node_types = torch.where(x['node_attrs'])[1] + x = x['positions'] + else: + _is_graph_data = False + # checks and warnings + if _is_graph_data and descriptors_derivatives is not None: + raise ValueError("The descriptors_derivatives key cannot be used with GNN-based models!") + + if _is_graph_data and separate_boundary_dataset: + warnings.warn("Using GNN-based models it may be better to set separate_boundary_dataset to False") + # ------------------------ SETUP ------------------------ # inherit right device device = x.device @@ -242,16 +261,33 @@ def committor_loss(x: torch.Tensor, # Create masks to access different states data - mask_A = labels == 0 - mask_B = labels == 1 + mask_A = torch.nonzero(labels == 0, as_tuple=True) + mask_B = torch.nonzero(labels == 1, as_tuple=True) # create mask for variational data if separate_boundary_dataset: - mask_var = labels > 1 + mask_var = torch.nonzero(labels > 1, as_tuple=not(_is_graph_data)) else: mask_var = torch.ones_like(labels, dtype=torch.bool) + if _is_graph_data: + # this needs to be on the batch index, not only the labels + aux = torch.where(mask_var)[0].to(device) + mask_var_batches = torch.isin(batch, aux) + mask_var_batches = batch[mask_var_batches] + else: + mask_var_batches = mask_var + + # setup atomic masses + atomic_masses = atomic_masses.to(device) + + # mass should have size [1, n_atoms*spatial_dims] + if _is_graph_data: + atomic_masses = atomic_masses[node_types[mask_var_batches]].unsqueeze(-1) + else: + atomic_masses = atomic_masses.unsqueeze(0) + # Update weights of basin B using the information on the delta_f delta_f = torch.Tensor([delta_f]).to(device) # B higher in energy --> A-B < 0 @@ -275,44 +311,41 @@ def committor_loss(x: torch.Tensor, grad_outputs=grad_outputs, retain_graph=True, create_graph=create_graph)[0] - grad = grad[mask_var] - - if cell is not None: - grad = grad / cell - - # in case the input is not positions but descriptors, we need to correct the gradients up to the positions - if isinstance(descriptors_derivatives, SmartDerivatives): - # we use the precomputed derivatives from descriptors to pos - gradient_positions = descriptors_derivatives(grad, ref_idx[mask_var]).view(x[mask_var].shape[0], -1) - - # --> If we directly pass the matrix d_desc/d_pos - elif isinstance(descriptors_derivatives, torch.Tensor): - descriptors_derivatives = descriptors_derivatives.to(device) - gradient_positions = torch.einsum("bd,badx->bax", grad, descriptors_derivatives[ref_idx[mask_var]]).contiguous() - gradient_positions = gradient_positions.view(x[mask_var].shape[0], -1) + grad = grad[mask_var_batches] - # If the input was already positions + if descriptors_derivatives is not None: + # in case the input is not positions but descriptors, we need to correct the gradients up to the positions + if isinstance(descriptors_derivatives, SmartDerivatives): + # we use the precomputed derivatives from descriptors to pos + gradient_positions = descriptors_derivatives(grad, 
ref_idx[mask_var]).view(x[mask_var].shape[0], -1) + + # --> If we directly pass the matrix d_desc/d_pos + elif isinstance(descriptors_derivatives, torch.Tensor): + descriptors_derivatives = descriptors_derivatives.to(device) + gradient_positions = torch.einsum("bd,badx->bax", grad, descriptors_derivatives[ref_idx[mask_var]]).contiguous() + gradient_positions = gradient_positions.view(x[mask_var].shape[0], -1) else: + # we get the square of grad(q) and we multiply by the weight gradient_positions = grad - + + if cell is not None: + gradient_positions = gradient_positions / cell + # we do the square grad_square = torch.pow(gradient_positions, 2) - - # multiply by masses - try: - grad_square = torch.sum((grad_square * (1/atomic_masses)), - axis=1, - keepdim=True) - except RuntimeError as e: - raise RuntimeError(e, """[HINT]: Is you system in 3 dimension? By default the code assumes so, if it's not the case change the n_dim key to the right dimensionality.""") + grad_square = torch.sum((grad_square * (1/atomic_masses)), axis=1, keepdim=True) + + if _is_graph_data: + # we need to sum on the right batch first + grad_square = scatter_sum(grad_square, mask_var_batches, dim=0) + # variational contribution to loss: we sum over the batch loss_var = torch.mean(grad_square * w[mask_var]) if log_var: loss_var = torch.log1p(loss_var) else: - loss_var *= gamma - + loss_var = gamma*loss_var # 2. ----- BOUNDARY LOSS loss_A = gamma * torch.mean( q[mask_A].pow(2) ) @@ -329,4 +362,6 @@ def committor_loss(x: torch.Tensor, # 4. ----- TOTAL LOSS loss = loss_var + alpha*(loss_A + loss_B) + loss_z_diff + + # TODO maybe there is no need to detach them for logging return loss, loss_var.detach(), alpha*loss_A.detach(), alpha*loss_B.detach() \ No newline at end of file diff --git a/mlcolvar/core/loss/eigvals.py b/mlcolvar/core/loss/eigvals.py index e0c84345..4fc2c826 100644 --- a/mlcolvar/core/loss/eigvals.py +++ b/mlcolvar/core/loss/eigvals.py @@ -129,7 +129,7 @@ def reduce_eigenvalues_loss( else: n_eig = len(evals) - loss = None + loss = torch.zeros(1, dtype=evals.dtype, device=evals.device) if mode == "sum": loss = torch.sum(evals[:n_eig]) diff --git a/mlcolvar/core/loss/tda_loss.py b/mlcolvar/core/loss/tda_loss.py index 8bd8e830..50234ed4 100644 --- a/mlcolvar/core/loss/tda_loss.py +++ b/mlcolvar/core/loss/tda_loss.py @@ -15,7 +15,7 @@ # GLOBAL IMPORTS # ============================================================================= -from typing import Union +from typing import Union, List, Tuple from warnings import warn import torch @@ -32,10 +32,10 @@ class TDALoss(torch.nn.Module): def __init__( self, n_states: int, - target_centers: Union[list, torch.Tensor], - target_sigmas: Union[list, torch.Tensor], - alpha: float = 1, - beta: float = 100, + target_centers: Union[List[float], torch.Tensor], + target_sigmas: Union[List[float], torch.Tensor], + alpha: float = 1.0, + beta: float = 100.0, ): """Constructor. @@ -66,7 +66,7 @@ def __init__( def forward( self, H: torch.Tensor, labels: torch.Tensor, return_loss_terms: bool = False - ) -> torch.Tensor: + ) -> Union[torch.Tensor, Tuple[torch.Tensor, torch.Tensor, torch.Tensor]]: """Compute the value of the loss function. 
Parameters @@ -107,12 +107,12 @@ def tda_loss( H: torch.Tensor, labels: torch.Tensor, n_states: int, - target_centers: Union[list, torch.Tensor], - target_sigmas: Union[list, torch.Tensor], + target_centers: Union[List[float], torch.Tensor], + target_sigmas: Union[List[float], torch.Tensor], alpha: float = 1, beta: float = 100, return_loss_terms: bool = False, -) -> torch.Tensor: +) -> Union[torch.Tensor, Tuple[torch.Tensor, torch.Tensor, torch.Tensor]]: """ Compute a loss function as the distance from a simple Gaussian target distribution. @@ -148,9 +148,9 @@ def tda_loss( term associated to the standard deviations of the target Gaussians. """ if not isinstance(target_centers, torch.Tensor): - target_centers = torch.Tensor(target_centers) + target_centers = torch.tensor(target_centers, dtype=H.dtype) if not isinstance(target_sigmas, torch.Tensor): - target_sigmas = torch.Tensor(target_sigmas) + target_sigmas = torch.tensor(target_sigmas, dtype=H.dtype) device = H.device target_centers = target_centers.to(device) @@ -165,7 +165,7 @@ def tda_loss( f"State {i} was not represented in this batch! Either use bigger batch_size or a more equilibrated dataset composition!" ) else: - H_red = H[torch.nonzero(labels == i, as_tuple=True)] + H_red = H[labels == i] # compute mean and standard deviation over the class i mu = torch.mean(H_red, 0) @@ -173,7 +173,7 @@ def tda_loss( warn( f"There is only one sample for state {i} in this batch! Std is set to 0, this may affect the training! Either use bigger batch_size or a more equilibrated dataset composition!" ) - sigma = 0 + sigma = torch.Tensor(0) else: sigma = torch.std(H_red, 0) @@ -189,3 +189,18 @@ def tda_loss( if return_loss_terms: return loss, loss_centers, loss_sigmas return loss + +def test_tda_loss(): + H = torch.randn(100) + H.requires_grad = True + labels = torch.zeros_like(H) + labels[-50:] = 1 + + Loss = TDALoss(n_states=2, target_centers=[-1, 1], target_sigmas=[0.1, 0.1]) + + loss = Loss(H=H, labels=labels, return_loss_terms=True) + + loss[0].backward() + +if __name__ == '__main__': + test_tda_loss() \ No newline at end of file diff --git a/mlcolvar/core/loss/utils/smart_derivatives.py b/mlcolvar/core/loss/utils/smart_derivatives.py index ac699da0..a2e32f63 100644 --- a/mlcolvar/core/loss/utils/smart_derivatives.py +++ b/mlcolvar/core/loss/utils/smart_derivatives.py @@ -970,7 +970,7 @@ def test_train_with_smart_derivatives(): datamodule = DictModule(dataset=smart_dataset, lengths=[0.8, 0.2], batch_size=80) - model = Committor(layers=[45, 10, 1], + model = Committor(model=[45, 10, 1], atomic_masses=atomic_masses, alpha=1, separate_boundary_dataset=True, diff --git a/mlcolvar/core/nn/__init__.py b/mlcolvar/core/nn/__init__.py index 4ccf68d1..fa34b8af 100644 --- a/mlcolvar/core/nn/__init__.py +++ b/mlcolvar/core/nn/__init__.py @@ -1,3 +1,4 @@ -__all__ = ["FeedForward"] +__all__ = ["FeedForward", "BaseGNN", "SchNetModel", "GVPModel"] from .feedforward import * +from .graph import * \ No newline at end of file diff --git a/mlcolvar/core/nn/feedforward.py b/mlcolvar/core/nn/feedforward.py index f84596dd..19c233bb 100644 --- a/mlcolvar/core/nn/feedforward.py +++ b/mlcolvar/core/nn/feedforward.py @@ -15,10 +15,9 @@ # GLOBAL IMPORTS # ============================================================================= -from typing import Optional, Union +from typing import Optional, Union, Any import torch -import lightning from mlcolvar.core.nn.utils import get_activation, parse_nn_options @@ -27,7 +26,7 @@ # 
============================================================================= -class FeedForward(lightning.LightningModule): +class FeedForward(torch.nn.Module): """Define a feedforward neural network given the list of layers. Optionally dropout and batchnorm can be applied (the order is activation -> dropout -> batchnorm). @@ -110,3 +109,6 @@ def __init__( def forward(self, x: torch.Tensor) -> torch.Tensor: return self.nn(x) + + def backward(self, loss: torch.Tensor, *args: Any, **kwargs: Any): + return loss.backward() diff --git a/mlcolvar/core/nn/graph/__init__.py b/mlcolvar/core/nn/graph/__init__.py new file mode 100644 index 00000000..a8b3be6d --- /dev/null +++ b/mlcolvar/core/nn/graph/__init__.py @@ -0,0 +1,5 @@ +__all__ = ["BaseGNN", "SchNetModel", "GVPModel"] + +from .gnn import BaseGNN +from .schnet import SchNetModel +from .gvp import GVPModel \ No newline at end of file diff --git a/mlcolvar/core/nn/graph/gnn.py b/mlcolvar/core/nn/graph/gnn.py new file mode 100644 index 00000000..1f2deb1d --- /dev/null +++ b/mlcolvar/core/nn/graph/gnn.py @@ -0,0 +1,241 @@ +import torch +from torch import nn +from typing import List, Dict, Tuple + +from mlcolvar.core.nn.graph import radial +from mlcolvar.utils import _code + +""" +GNN models. +""" + +__all__ = ['BaseGNN'] + + +class BaseGNN(nn.Module): + """ + Base class for Graph Neural Network (GNN) models + """ + + def __init__( + self, + n_out: int, + cutoff: float, + atomic_numbers: List[int], + pooling_operation: str, + n_bases: int = 6, + n_polynomials: int = 6, + basis_type: str = 'bessel', + ) -> None: + """Initializes the core of a GNN model, taking care of edge embeddings. + + Parameters + ---------- + n_out : int + Number of the output scalar node features. + cutoff : float + Cutoff radius of the basis functions. Should be the same as the cutoff + radius used to build the graphs. + atomic_numbers : List[int] + The atomic numbers mapping. + pooling_operation : str + Type of pooling operation to combine node-level features into graph-level features, either mean or sum + n_bases : int, optional + Size of the basis set used for the embedding, by default 6 + n_polynomials : int, optional + Order of the polynomials in the basis functions, by default 6 + basis_type : str, optional + Type of the basis function, by default 'bessel' + """ + super().__init__() + + self._radial_embedding = radial.RadialEmbeddingBlock(cutoff=cutoff, + n_bases=n_bases, + n_polynomials=n_polynomials, + basis_type=basis_type + ) + self.register_buffer( + 'n_out', torch.tensor(n_out, dtype=torch.int64) + ) + self.register_buffer( + 'cutoff', torch.tensor(cutoff, dtype=torch.get_default_dtype()) + ) + self.register_buffer( + 'atomic_numbers', torch.tensor(atomic_numbers, dtype=torch.int64) + ) + self.pooling_operation = pooling_operation + + @property + def out_features(self): + return self.n_out + + @property + def in_features(self): + return None + + def embed_edge( + self, data: Dict[str, torch.Tensor], normalize: bool = True + ) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]: + """ + Performs the model edge embedding form `torch_geometric.data.Batch` object. + + Parameters + ---------- + data: Dict[str, torch.Tensor] + The data dict. Usually from the `to_dict` method of a + `torch_geometric.data.Batch` object. + normalize: bool + If to return the normalized distance vectors, by default True. + + Returns + ------- + edge_lengths: torch.Tensor (shape: [n_edges, 1]) + The edge lengths. 
+ edge_length_embeddings: torch.Tensor (shape: [n_edges, n_bases]) + The edge length embeddings. + edge_unit_vectors: torch.Tensor (shape: [n_edges, 3]) + The normalized edge vectors. + """ + vectors, lengths = get_edge_vectors_and_lengths( + positions=data['positions'], + edge_index=data['edge_index'], + shifts=data['shifts'], + normalize=normalize, + ) + return lengths, self._radial_embedding(lengths), vectors + + def pooling(self, + input : torch.Tensor, + data : Dict[str, torch.Tensor]) -> torch.Tensor: + """Performs pooling of the node-level outputs to obtain a graph-level output + + Parameters + ---------- + input : torch.Tensor + Nodel level features to be pooled + data : Dict[str, torch.Tensor] + Data batch containing the graph data informations + + Returns + ------- + torch.Tensor + Pooled output + """ + if self.pooling_operation == 'mean': + if 'system_masks' not in data.keys(): + out = _code.scatter_mean(input, data['batch'], dim=0) + else: + out = input * data['system_masks'] + out = _code.scatter_sum(out, data['batch'], dim=0) + out = out / data['n_system'] + + elif self.pooling_operation == 'sum': + if 'system_masks' in data.keys(): + out = input * data['system_masks'] + else: + out = _code.scatter_sum(input, data['batch'], dim=0) + else: + raise ValueError (f"Invalid pooling operation! Found {self.pooling_operation}") + + return out + +def get_edge_vectors_and_lengths( + positions: torch.Tensor, + edge_index: torch.Tensor, + shifts: torch.Tensor, + normalize: bool = True, +) -> Tuple[torch.Tensor, torch.Tensor]: + """ + Calculates edge vectors and lengths by indices and shift vectors. + + Parameters + ---------- + positions: torch.Tensor (shape: [n_atoms, 3]) + The positions tensor. + edge_index: torch.Tensor (shape: [2, n_edges]) + The edge indices. + shifts: torch.Tensor (shape: [n_edges, 3]) + The shifts vector. + normalize: bool + If to return the normalized distance vectors, by default True. + + Returns + ------- + vectors: torch.Tensor (shape: [n_edges, 3]) + The distances vectors. + lengths: torch.Tensor (shape: [n_edges, 1]) + The edges lengths. 
+ """ + sender = edge_index[0] + receiver = edge_index[1] + vectors = positions[receiver] - positions[sender] + shifts # [n_edges, 3] + lengths = torch.linalg.norm(vectors, dim=-1, keepdim=True) # [n_edges, 1] + + if normalize: + vectors = torch.nan_to_num(torch.div(vectors, lengths)) + + return vectors, lengths + + +def test_get_edge_vectors_and_lengths() -> None: + dtype = torch.get_default_dtype() + torch.set_default_dtype(torch.float64) + + data = dict() + data['positions'] = torch.tensor( + [[0.0, 0.0, 0.0], [0.07, 0.07, 0.0], [0.07, -0.07, 0.0]], + dtype=torch.float64 + ) + data['edge_index'] = torch.tensor( + [[0, 0, 1, 1, 2, 2], [2, 1, 0, 2, 1, 0]] + ) + data['shifts'] = torch.tensor([ + [0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 0.2, 0.0], + [0.0, -0.2, 0.0], + [0.0, 0.0, 0.0], + ]) + + vectors, distances = get_edge_vectors_and_lengths(**data, normalize=False) + assert(torch.allclose(vectors, torch.tensor([[0.0700, -0.0700, 0.0000], + [0.0700, 0.0700, 0.0000], + [-0.070, -0.0700, 0.0000], + [0.0000, 0.0600, 0.0000], + [0.0000, -0.0600, 0.0000], + [-0.070, 0.0700, 0.0000]]) + ) + ) + assert(torch.allclose(distances,torch.tensor([[0.09899494936611666], + [0.09899494936611666], + [0.09899494936611666], + [0.06000000000000000], + [0.06000000000000000], + [0.09899494936611666]]) + ) + ) + + vectors, distances = get_edge_vectors_and_lengths(**data, normalize=True) + assert(torch.allclose(vectors, torch.tensor([[0.70710678118654757, -0.70710678118654757, 0.0], + [0.70710678118654757, 0.70710678118654757, 0.0], + [-0.7071067811865476, -0.70710678118654757, 0.0], + [0.00000000000000000, 1.00000000000000000, 0.0], + [0.00000000000000000, -1.00000000000000000, 0.0], + [-0.7071067811865476, 0.70710678118654757, 0.0]]) + ) + ) + + assert(torch.allclose(distances, torch.tensor([[0.09899494936611666], + [0.09899494936611666], + [0.09899494936611666], + [0.06000000000000000], + [0.06000000000000000], + [0.09899494936611666]]) + ) + ) + + torch.set_default_dtype(dtype) + +if __name__ == "__main__": + test_get_edge_vectors_and_lengths() \ No newline at end of file diff --git a/mlcolvar/core/nn/graph/gvp.py b/mlcolvar/core/nn/graph/gvp.py new file mode 100644 index 00000000..e1f6d7c7 --- /dev/null +++ b/mlcolvar/core/nn/graph/gvp.py @@ -0,0 +1,920 @@ +import functools +import math +import torch +from torch import nn +from torch_geometric.nn import MessagePassing +from typing import Tuple, Callable, Optional, List, Dict + +from mlcolvar.core.nn.graph.gnn import BaseGNN + +""" +The Geometric Vector Perceptron (GVP) layer. This module is taken from: +https://github.com/chaitjo/geometric-gnn-dojo/blob/main/models/layers/py, +and made compilable. +""" + +__all__ = ['GVPModel', 'GVPConvLayer', 'LayerNorm', 'Dropout'] + + +class GVPModel(BaseGNN): + """ + The Geometric Vector Perceptron (GVP) model [1, 2] with vector gate [2]. + + References + ---------- + .. [1] Jing, Bowen, et al. + "Learning from protein structure with geometric vector perceptrons." + International Conference on Learning Representations. 2020. + .. [2] Jing, Bowen, et al. + "Equivariant graph neural networks for 3d macromolecular structure." + arXiv preprint arXiv:2106.03843 (2021). 
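+
+    Examples
+    --------
+    A minimal sketch of instantiating the model (hypothetical hyperparameters;
+    the cutoff must match the one used to build the graphs):
+
+    >>> model = GVPModel(
+    ...     n_out=2, cutoff=0.1, atomic_numbers=[1, 8],
+    ...     n_layers=2, n_scalars_node=16, n_vectors_node=8, n_scalars_edge=16,
+    ... )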
+ """ + def __init__( + self, + n_out: int, + cutoff: float, + atomic_numbers: List[int], + pooling_operation : str = 'mean', + n_bases: int = 8, + n_polynomials: int = 6, + n_layers: int = 1, + n_messages: int = 2, + n_feedforwards: int = 2, + n_scalars_node: int = 8, + n_vectors_node: int = 8, + n_scalars_edge: int = 8, + drop_rate: int = 0.1, + activation: str = 'SiLU', + basis_type: str = 'bessel', + smooth: bool = False, + ) -> None: + """Initializes a Geometric Vector Perceptron (GVP) model. + + Parameters + ---------- + n_out: int + Number of the output scalar node features. + cutoff: float + Cutoff radius of the basis functions. Should be the same as the cutoff + radius used to build the graphs. + atomic_numbers: List[int] + The atomic numbers mapping + pooling_operation : str + Type of pooling operation to combine node-level features into graph-level features, either mean or sum, by default 'mean' + n_bases: int + Size of the basis set used for the embedding, by default 8. + n_polynomials: bool + Order of the polynomials in the basis functions, by default 6. + n_layers: int + Number of the graph convolution layers, by default 1. + n_messages: int + Number of GVP layers to be used in the message functions, by default 2. + n_feedforwards: int + Number of GVP layers to be used in the feedforward functions, by default 2. + n_scalars_node: int + Size of the scalar channel of the node embedding in hidden layers, by default 8. + n_vectors_node: int + Size of the vector channel of the node embedding in hidden layers, by default 8. + n_scalars_edge: int + Size of the scalar channel of the edge embedding in hidden layers, by default 8. + drop_rate: int + Drop probability in all dropout layers, by default 0.1. + activation: str + Name of the activation function to be used in the GVPs (case sensitive), by default SiLU. + basis_type: str + Type of the basis function, by default bessel. + smooth: bool + If use the smoothed GVPConv, by default False. + """ + super().__init__( + n_out=n_out, + cutoff=cutoff, + atomic_numbers=atomic_numbers, + pooling_operation=pooling_operation, + n_bases=n_bases, + n_polynomials=n_polynomials, + basis_type=basis_type + ) + + self.W_e = nn.ModuleList([ + LayerNorm((n_bases, 1)), + GVP(in_dims=(n_bases, 1), + out_dims=(n_scalars_edge, 1), + activations=(None, None), + vector_gate=True + ) + ]) + + self.W_v = nn.ModuleList([ + LayerNorm((len(atomic_numbers), 0)), + GVP(in_dims=(len(atomic_numbers), 0), + out_dims=(n_scalars_node, n_vectors_node), + activations=(None, None), + vector_gate=True + ) + ]) + + self.layers = nn.ModuleList( + GVPConvLayer(node_dims=(n_scalars_node, n_vectors_node), + edge_dims=(n_scalars_edge, 1), + n_message=n_messages, + n_feedforward=n_feedforwards, + drop_rate=drop_rate, + activations=(eval(f'torch.nn.{activation}')(), None), + vector_gate=True, + cutoff=(cutoff if smooth else -1) + ) + for _ in range(n_layers) + ) + + self.W_out = nn.ModuleList([ + LayerNorm((n_scalars_node, n_vectors_node)), + GVP(in_dims=(n_scalars_node, n_vectors_node), + out_dims=(n_out, 0), + activations=(None, None), + vector_gate=True) + ]) + + def forward( + self, data: Dict[str, torch.Tensor], pool: bool = True + ) -> torch.Tensor: + """The forward pass. + + Parameters + ---------- + data: Dict[str, torch.Tensor] + The data dict. Usually came from the `to_dict` method of a + `torch_geometric.data.Batch` object. + pool: bool + If perform the pooling to the model output, by default True. 
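+
+        Returns
+        -------
+        torch.Tensor
+            Scalar node features of shape [n_nodes, n_out], or graph-level
+            features of shape [n_graphs, n_out] when `pool` is True.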
+ """ + h_V = (data['node_attrs'], None) + for w in self.W_v: + h_V = w(h_V) + h_V_1, h_V_2 = h_V + assert h_V_2 is not None + h_V = (h_V_1, h_V_2) + + h_E = self.embed_edge(data) + lengths = h_E[0] + h_E = (h_E[1], h_E[2].unsqueeze(-2)) + for w in self.W_e: + h_E = w(h_E) + h_E_1, h_E_2 = h_E + assert h_E_2 is not None + h_E = (h_E_1, h_E_2) + + for layer in self.layers: + h_V = layer(h_V, data['edge_index'], h_E, lengths) + + for w in self.W_out: + h_V = w(h_V) + out = h_V[0] + + if pool: + out = self.pooling(input=out, data=data) + + return out + + +class GVP(nn.Module): + """ + Geometric Vector Perceptron (GVP) layer from [1, 2] with vector gate [2]. + + References + ---------- + .. [1] Jing, Bowen, et al. + "Learning from protein structure with geometric vector perceptrons." + International Conference on Learning Representations. 2020. + .. [2] Jing, Bowen, et al. + "Equivariant graph neural networks for 3d macromolecular structure." + arXiv preprint arXiv:2106.03843 (2021). + """ + + def __init__( + self, + in_dims: Tuple[int, Optional[int]], + out_dims: Tuple[int, Optional[int]], + h_dim: Tuple[int, Optional[int]] = None, + activations: Tuple[ + Optional[Callable], Optional[Callable] + ] = (nn.functional.relu, torch.sigmoid), + vector_gate: bool = True, + ) -> None: + r"""Geometric Vector Perceptron layer. + + Updates the scalar node feature s as: + .. math:: bm{s}^n \leftarrow \sigma \left(\bm{s}'\right) \quad\text{with}\quad \bm{s}' \coloneq \bm{W}_m \left[{\|\bm{W}_h\vec{\bm{v}}^o\|_2 \atop \bm{s}^o}\right] + \bm{b} + + And the vector nore features as: + .. math:: \vec{\bm{v}}^n \leftarrow \sigma_g \left(\bm{W}_g\left(\sigma^+ \left(\bm{s}'\right)\right) + \bm{b}_g \right) \odot \bm{W}_\mu\bm{W}_h\vec{\bm{v}}^o + + Parameters + ---------- + in_dims : Tuple[int, Optional[int]] + Dimension of inputs + out_dims : Tuple[int, Optional[int]] + Dimension of outputs + h_dim : Tuple[int, Optional[int]], optional + Intermidiate number of vector channels, by default None + activations : Tuple[ Optional[Callable], Optional[Callable] ], optional + Scalar and vector activation functions (scalar_act, vector_act), by default (nn.functional.relu, torch.sigmoid) + vector_gate : bool, optional + Whether to use vector gating, by default True. 
The vector activation will be used as sigma^+ in vector gating if `True` + """ + super(GVP, self).__init__() + self.si, self.vi = in_dims + self.so, self.vo = out_dims + self.vector_gate = vector_gate + if self.vi: + self.h_dim = h_dim or max(self.vi, self.vo) + self.wh = nn.Linear(self.vi, self.h_dim, bias=False) + self.ws = nn.Linear(self.h_dim + self.si, self.so) + if self.vo: + self.wv = nn.Linear(self.h_dim, self.vo, bias=False) + if self.vector_gate: + self.wsv = nn.Linear(self.so, self.vo) + else: + self.wv = None + self.wsv = None + else: + self.wh = None + self.wv = None + self.wsv = None + self.ws = nn.Linear(self.si, self.so) + + self.scalar_act, self.vector_act = activations + self.dummy_param = nn.Parameter(torch.empty(0)) + + def forward( + self, + x: Tuple[torch.Tensor, Optional[torch.Tensor]] + ) -> Tuple[torch.Tensor, Optional[torch.Tensor]]: + """Forward pass of GVP + + Parameters + ---------- + x : Tuple[torch.Tensor, Optional[torch.Tensor]] + Input scalar and vector node embeddings + + Returns + ------- + Tuple[torch.Tensor, Optional[torch.Tensor]] + Input scalar and vector node embeddings + """ + + s, v = x + if v is not None: + assert self.wh is not None + v = torch.transpose(v, -1, -2) + vh = self.wh(v) + vn = _norm_no_nan(vh, axis=-2) + s = self.ws(torch.cat([s, vn], -1)) + if self.vo: + assert self.wv is not None + v = self.wv(vh) + v = torch.transpose(v, -1, -2) + if self.vector_gate: + assert self.wsv is not None + gate = ( + self.wsv(self.vector_act(s)) + if self.vector_act is not None + else self.wsv(s) + ) + v = v * torch.sigmoid(gate).unsqueeze(-1) + elif self.vector_act is not None: + v = v * self.vector_act( + _norm_no_nan(v, axis=-1, keepdims=True) + ) + else: + s = self.ws(s) + if self.vo: + v = torch.zeros( + s.shape[0], + self.vo, + 3, + device=self.dummy_param.device, + dtype=s.dtype + ) + else: + v = None + + if self.scalar_act is not None: + s = self.scalar_act(s) + + return s, v + + +class GVPConv(MessagePassing): + """ + Graph convolution / message passing with Geometric Vector Perceptrons. + """ + propagate_type = { + 's': torch.Tensor, + 'v': torch.Tensor, + 'edge_attr_s': torch.Tensor, + 'edge_attr_v': torch.Tensor, + 'edge_lengths': torch.Tensor, + } + + def __init__( + self, + in_dims, + out_dims, + edge_dims, + n_layers=3, + aggr='mean', + activations=(nn.functional.relu, torch.sigmoid), + vector_gate=True, + cutoff: float = -1.0, + ) -> None: + """Graph convolution / message passing with Geometric Vector Perceptrons. + Takes in a graph with node and edge embeddings, + and returns new node embeddings. + + This does NOT do residual updates and pointwise feedforward layers + --- see `GVPConvLayer` instead. + + Parameters + ---------- + in_dims : + input node embedding dimensions (n_scalar, n_vector) + out_dims : + output node embedding dimensions (n_scalar, n_vector) + edge_dims : + input edge embedding dimensions (n_scalar, n_vector) + n_layers : int, optional + number of GVPs in the message function, by default 3 + aggr : str, optional + Type of message aggregate function, by default 'mean' + activations : tuple, optional + activation functions (scalar_act, vector_act) to be used use in GVPs, by default (nn.functional.relu, torch.sigmoid) + vector_gate : bool, optional + Whether to use vector gating, by default True. 
The vector activation will be used as sigma^+ in vector gating if `True` + cutoff : float, optional + Radial cutoff, by default -1.0 + """ + super(GVPConv, self).__init__(aggr=aggr) + self.si, self.vi = in_dims + self.so, self.vo = out_dims + self.se, self.ve = edge_dims + self.cutoff = cutoff + + GVP_ = functools.partial( + GVP, activations=activations, vector_gate=vector_gate + ) + + self._module_list = torch.nn.ModuleList() + if n_layers == 1: + self._module_list.append( + GVP_(in_dims=(2 * self.si + self.se, 2 * self.vi + self.ve), + out_dims=(self.so, self.vo), + activations=(None, None)) + ) + else: + self._module_list.append( + GVP_(in_dims=(2 * self.si + self.se, 2 * self.vi + self.ve), + out_dims=out_dims) + ) + for i in range(n_layers - 2): + self._module_list.append(GVP_(out_dims, out_dims)) + self._module_list.append( + GVP_(in_dims=out_dims, + out_dims=out_dims, + activations=(None, None)) + ) + + def forward( + self, + x: Tuple[torch.Tensor, torch.Tensor], + edge_index: torch.Tensor, + edge_attr: Tuple[torch.Tensor, torch.Tensor], + edge_lengths: torch.Tensor, + ) -> Tuple[torch.Tensor, torch.Tensor]: + """Forward pass of GVPConv + + Parameters + ---------- + x : Tuple[torch.Tensor, torch.Tensor] + Input scalar and vector node embeddings + edge_index : torch.Tensor + Index of edge sources and destinations + edge_attr : Tuple[torch.Tensor, torch.Tensor] + Edge attributes + edge_lengths : torch.Tensor + Edge lengths + + Returns + ------- + Tuple[torch.Tensor, torch.Tensor] + Output scalar and vector node embeddings + """ + x_s, x_v = x + assert x_v is not None + message = self.propagate( + edge_index, + s=x_s, + v=x_v.contiguous().view(x_v.shape[0], x_v.shape[1] * 3), + edge_attr_s=edge_attr[0], + edge_attr_v=edge_attr[1], + edge_lengths=edge_lengths, + ) + return _split(message, self.vo) + + def message( + self, + s_i: torch.Tensor, + v_i: torch.Tensor, + s_j: torch.Tensor, + v_j: torch.Tensor, + edge_attr_s: torch.Tensor, + edge_attr_v: torch.Tensor, + edge_lengths: torch.Tensor, + ) -> torch.Tensor: + assert edge_attr_s is not None + assert edge_attr_v is not None + v_j = v_j.view(v_j.shape[0], v_j.shape[1] // 3, 3) + v_i = v_i.view(v_i.shape[0], v_i.shape[1] // 3, 3) + message = _tuple_cat( + (s_j, v_j), (edge_attr_s, edge_attr_v), (s_i, v_i) + ) + message = self.message_func(message) + message_merged = _merge(*message) + if self.cutoff > 0: + # apply SchNet-style cutoff function + c = 0.5 * (torch.cos(edge_lengths * math.pi / self.cutoff) + 1.0) + message_merged = message_merged * c.view(-1, 1) + return message_merged + + def message_func( + self, x: Tuple[torch.Tensor, torch.Tensor] + ) -> Tuple[torch.Tensor, torch.Tensor]: + for m in self._module_list: + x = m(x) + output_1, output_2 = x + assert output_2 is not None + return output_1, output_2 + + +class GVPConvLayer(nn.Module): + """ + Full graph convolution / message passing layer with + Geometric Vector Perceptrons. + Residually updates node embeddings with + aggregated incoming messages, applies a pointwise feedforward + network to node embeddings, and returns updated node embeddings. + + To only compute the aggregated messages, see `GVPConv`. + """ + + def __init__( + self, + node_dims, + edge_dims, + n_message=3, + n_feedforward=2, + drop_rate=0.1, + activations=(nn.functional.relu, torch.sigmoid), + vector_gate=True, + residual=True, + cutoff: float = -1.0, + ) -> None: + """Full graph convolution / message passing layer with + Geometric Vector Perceptrons. 
+ Residually updates node embeddings with + aggregated incoming messages, applies a pointwise feedforward + network to node embeddings, and returns updated node embeddings. + + To only compute the aggregated messages see `GVPConv` instead. + + Parameters + ---------- + node_dims : + node embedding dimensions (n_scalar, n_vector) + edge_dims : + input edge embedding dimensions (n_scalar, n_vector) + n_message : int, optional + number of GVP layers to be used in message function, by default 3 + n_feedforward : int, optional + number of GVPs to be used use in feedforward function, by default 2 + drop_rate : float, optional + drop probability in all dropout layers, by default 0.1 + activations : tuple, optional + activation functions (scalar_act, vector_act) to be used use in GVPs, by default (nn.functional.relu, torch.sigmoid) + vector_gate : bool, optional + whether to use vector gating, by default True. The vector activation will be used as sigma^+ in vector gating if `True` + residual : bool, optional + whether to perform the update residually, by default True + cutoff : float, optional + radial cutoff, by default -1.0 + """ + super(GVPConvLayer, self).__init__() + self.conv = GVPConv( + node_dims, + node_dims, + edge_dims, + n_message, + aggr='mean', + activations=activations, + vector_gate=vector_gate, + cutoff=cutoff, + ) + GVP_ = functools.partial( + GVP, activations=activations, vector_gate=vector_gate + ) + self.norm = nn.ModuleList([LayerNorm(node_dims) for _ in range(2)]) + self.dropout = nn.ModuleList([Dropout(drop_rate) for _ in range(2)]) + + self._module_list = nn.ModuleList() + if n_feedforward == 1: + self._module_list.append( + GVP_(in_dims=node_dims, + out_dims=node_dims, + activations=(None, None)) + ) + else: + hid_dims = 4 * node_dims[0], 2 * node_dims[1] + self._module_list.append(GVP_(node_dims, hid_dims)) + self._module_list.extend( + GVP_(in_dims=hid_dims, out_dims=hid_dims) for _ in range(n_feedforward - 2) + ) + self._module_list.append( + GVP_(in_dims=hid_dims, out_dims=node_dims, activations=(None, None)) + ) + self.residual = residual + + def forward( + self, + x: Tuple[torch.Tensor, torch.Tensor], + edge_index: torch.Tensor, + edge_attr: Tuple[torch.Tensor, torch.Tensor], + edge_lengths: torch.Tensor, + node_mask: Optional[torch.Tensor] = None, + ) -> Tuple[torch.Tensor, torch.Tensor]: + """Forward pass of GVPConvLayer + + Parameters + ---------- + x : Tuple[torch.Tensor, torch.Tensor] + Input scalar and vector node embeddings + edge_index : torch.Tensor + Index of edge sources and destinations + edge_attr : Tuple[torch.Tensor, torch.Tensor] + Edge attributes + edge_lengths : torch.Tensor + Edge lengths + node_mask : Optional[torch.Tensor], optional + Mask to restrict the node update to a subset. + It should be a tensor of type `bool` to index the first dim of node embeddings (s, V), by default None. + If not `None`, only the selected nodes will be updated. 
+ + Returns + ------- + Tuple[torch.Tensor, torch.Tensor] + Output scalar and vector node embeddings + """ + + dh = self.conv(x, edge_index, edge_attr, edge_lengths) + + x_ = x + if node_mask is not None: + x, dh = _tuple_index(x, node_mask), _tuple_index(dh, node_mask) + + if self.residual: + input_1, input_2 = self.dropout[0](dh) + assert input_2 is not None + output_1, output_2 = self.norm[0]( + _tuple_sum(x, (input_1, input_2)) + ) + assert output_2 is not None + x = (output_1, output_2) + else: + x = dh + + dh = self.ff_func(x) + if self.residual: + input_1, input_2 = self.dropout[1](dh) + assert input_2 is not None + output_1, output_2 = self.norm[1]( + _tuple_sum(x, (input_1, input_2)) + ) + assert output_2 is not None + x = (output_1, output_2) + else: + x = dh + + if node_mask is not None: + x_[0][node_mask], x_[1][node_mask] = x[0], x[1] + x = x_ + return x + + def ff_func( + self, x: Tuple[torch.Tensor, torch.Tensor] + ) -> Tuple[torch.Tensor, torch.Tensor]: + for m in self._module_list: + x = m(x) + output_1 = x[0] + output_2 = x[1] + assert output_2 is not None + return output_1, output_2 + + +class LayerNorm(nn.Module): + """ + Combined LayerNorm for tuples (s, V). + Takes tuples (s, V) as input and as output. + """ + + def __init__(self, dims) -> None: + super(LayerNorm, self).__init__() + self.s, self.v = dims + self.scalar_norm = nn.LayerNorm(self.s) + + def forward( + self, + x: Tuple[torch.Tensor, Optional[torch.Tensor]] + ) -> Tuple[torch.Tensor, Optional[torch.Tensor]]: + """Forward pass of LayerNorm + + Parameters + ---------- + x : Tuple[torch.Tensor, Optional[torch.Tensor]] + Input channels, if a single tensor is provided it assumes it to be the scalar channel + + Returns + ------- + Tuple[torch.Tensor, Optional[torch.Tensor]] + Normalized channels + """ + + s, v = x + if not self.v: + return self.scalar_norm(s), None + else: + assert v is not None + vn = _norm_no_nan(v, axis=-1, keepdims=True, sqrt=False) + vn = torch.sqrt(torch.mean(vn, dim=-2, keepdim=True)) + return self.scalar_norm(s), v / vn + + +class Dropout(nn.Module): + """ + Combined dropout for tuples (s, V). + Takes tuples (s, V) as input and as output. + """ + + def __init__(self, drop_rate) -> None: + super(Dropout, self).__init__() + self.sdropout = nn.Dropout(drop_rate) + self.vdropout = _VDropout(drop_rate) + + def forward( + self, + x: Tuple[torch.Tensor, Optional[torch.Tensor]] + ) -> Tuple[torch.Tensor, Optional[torch.Tensor]]: + """Forward pass of Dropout + + Parameters + ---------- + x : Tuple[torch.Tensor, Optional[torch.Tensor]] + Input channels, if a single tensor is provided it assumes it to be the scalar channel + + Returns + ------- + Tuple[torch.Tensor, Optional[torch.Tensor]] + Dropped out channels + """ + s, v = x + if v is None: + return self.sdropout(s), None + else: + assert v is not None + return self.sdropout(s), self.vdropout(v) + + +class _VDropout(nn.Module): + """ + Vector channel dropout where the elements of each + vector channel are dropped together. 
+ """ + + def __init__(self, drop_rate) -> None: + super(_VDropout, self).__init__() + self.drop_rate = drop_rate + self.dummy_param = nn.Parameter(torch.empty(0)) + + def forward(self, x : torch.Tensor) -> torch.Tensor: + """Forward pass of _VDropout + + Parameters + ---------- + x : torch.Tensor + Vector channel + + Returns + ------- + torch.Tensor + Dropped out vector channel + """ + device = self.dummy_param.device + if not self.training: + return x + mask = torch.bernoulli( + (1 - self.drop_rate) * torch.ones(x.shape[:-1], device=device) + ).unsqueeze(-1) + x = mask * x / (1 - self.drop_rate) + return x + + +def _tuple_sum( + input_1: Tuple[torch.Tensor, torch.Tensor], + input_2: Tuple[torch.Tensor, torch.Tensor] +) -> Tuple[torch.Tensor, torch.Tensor]: + """ + Sums any number of tuples (s, V) elementwise. + """ + out = [i + j for i, j in zip(input_1, input_2)] + return out[0], out[1] + + +@torch.jit.script_if_tracing +def _tuple_cat( + input_1: Tuple[torch.Tensor, torch.Tensor], + input_2: Tuple[torch.Tensor, torch.Tensor], + input_3: Tuple[torch.Tensor, torch.Tensor], + dim: int = -1 +) -> Tuple[torch.Tensor, torch.Tensor]: + """Concatenates any number of tuples (s, V) elementwise. + + Parameters + ---------- + input_1 : Tuple[torch.Tensor, torch.Tensor] + First input to concatenate + input_2 : Tuple[torch.Tensor, torch.Tensor] + Second input to concatenate + input_3 : Tuple[torch.Tensor, torch.Tensor] + Third input to concatenate + dim : int, optional + dimension along which to concatenate when viewed + as the `dim` index for the scalar-channel tensors, by default -1. + This means that `dim=-1` will be applied as + `dim=-2` for the vector-channel tensors. + + Returns + ------- + Tuple[torch.Tensor, torch.Tensor] + Concatenated tuple + """ + + dim = int(dim % len(input_1[0].shape)) + s_args, v_args = list(zip(input_1, input_2, input_3)) + return torch.cat(s_args, dim=dim), torch.cat(v_args, dim=dim) + + +@torch.jit.script_if_tracing +def _tuple_index( + x: Tuple[torch.Tensor, torch.Tensor], idx: torch.Tensor +) -> Tuple[torch.Tensor, torch.Tensor]: + """Indexes a tuple (s, V) along the first dimension at a given index. + + Parameters + ---------- + x : Tuple[torch.Tensor, torch.Tensor] + Tuple to be indexed + idx : torch.Tensor + any object which can be used to index a `torch.Tensor` + + Returns + ------- + Tuple[torch.Tensor, torch.Tensor] + Tuple with the element at the given index + """ + return x[0][idx], x[1][idx] + + +@torch.jit.script_if_tracing +def _norm_no_nan( + x: torch.Tensor, + axis: int = -1, + keepdims: bool = False, + eps: float = 1e-8, + sqrt: bool = True +) -> torch.Tensor: + """L2 norm of tensor clamped above a minimum value `eps`. + + Parameters + ---------- + x : torch.Tensor + Input tensor + axis : int, optional + Axis along which to compute the norm, by default -1 + keepdims : bool, optional + Whether to keep the original dimensions, by default False + eps : float, optional + Lowest threshold for clamping the norm, by default 1e-8 + sqrt : bool, optional + Compute the sqaure root in L2 norm, by default True. + If `False`, returns the square of the L2 norm + + Returns + ------- + torch.Tensor + Normed tensor + """ + out = torch.clamp(torch.sum(torch.square(x), axis, keepdims), min=eps) + return torch.sqrt(out) if sqrt else out + + +@torch.jit.script_if_tracing +def _split(x: torch.Tensor, nv: int) -> Tuple[torch.Tensor, torch.Tensor]: + """Splits a merged representation of (s, V) back into a tuple. 
+ Should be used only with `_merge(s, V)` and only if the tuple + representation cannot be used. + + + Parameters + ---------- + x : torch.Tensor + the `torch.Tensor` returned from `_merge` + nv : int + the number of vector channels in the input to `_merge` + + Returns + ------- + Tuple[torch.Tensor, torch.Tensor] + split representation + """ + s = x[..., :-3 * nv] + v = x[..., -3 * nv:].contiguous().view(x.shape[0], nv, 3) + return s, v + + +@torch.jit.script_if_tracing +def _merge(s: torch.Tensor, v: torch.Tensor) -> torch.Tensor: + """Merges a tuple (s, V) into a single `torch.Tensor`, where the + vector channels are flattened and appended to the scalar channels. + Should be used only if the tuple representation cannot be used. + Use `_split(x, nv)` to reverse. + """ + v = v.contiguous().view(v.shape[0], v.shape[1] * 3) + return torch.cat([s, v], -1) + + +def test_gvp() -> None: + from mlcolvar.core.nn.graph.utils import _test_get_data + from mlcolvar.data.graph.utils import create_graph_tracing_example + + torch.manual_seed(0) + torch.set_default_dtype(torch.float64) + + model = GVPModel( + n_out=2, + cutoff=0.1, + atomic_numbers=[1, 8], + n_bases=6, + n_polynomials=6, + n_layers=2, + n_messages=2, + n_feedforwards=1, + n_scalars_node=16, + n_vectors_node=8, + n_scalars_edge=16, + drop_rate=0, + activation='SiLU', + ) + + data = _test_get_data() + ref_out = torch.tensor([[0.6100070244145421, -0.2559670171962067]] * 6) + assert ( torch.allclose(model(data), ref_out) ) + + traced_model = torch.jit.trace(model, example_inputs=create_graph_tracing_example(2)) + assert ( torch.allclose(traced_model(data), ref_out) ) + + model = GVPModel( + n_out=2, + cutoff=0.1, + atomic_numbers=[1, 8], + n_bases=6, + n_polynomials=6, + n_layers=2, + n_messages=2, + n_feedforwards=2, + n_scalars_node=16, + n_vectors_node=8, + n_scalars_edge=16, + drop_rate=0, + activation='SiLU', + ) + + data = _test_get_data() + ref_out = torch.tensor([[-0.3065361946949377, 0.16624918721972567]] * 6) + assert ( torch.allclose(model(data), ref_out) ) + + traced_model = torch.jit.trace(model, example_inputs=create_graph_tracing_example(2)) + assert ( torch.allclose(traced_model(data), ref_out) ) + + + torch.set_default_dtype(torch.float32) + +if __name__ == '__main__': + test_gvp() diff --git a/mlcolvar/core/nn/graph/radial.py b/mlcolvar/core/nn/graph/radial.py new file mode 100644 index 00000000..36224292 --- /dev/null +++ b/mlcolvar/core/nn/graph/radial.py @@ -0,0 +1,381 @@ +import torch +import numpy as np + +""" +The radial functions. This module is taken from MACE directly: +https://github.com/ACEsuit/mace/blob/main/mace/modules/radial.py +""" + +__all__ = ['RadialEmbeddingBlock'] + + +class GaussianBasis(torch.nn.Module): + """ + Gaussian basis functions. 
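+
+    Each basis function is a Gaussian centered on an equally spaced grid
+    between 0 and the cutoff (see the `forward` method below):
+
+    .. math:: RBF_k(d) = \exp\left(-\frac{(d - \mu_k)^2}{2\Delta^2}\right)
+
+    where :math:`\mu_k` are the grid points and :math:`\Delta` is their spacing.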
+ """ + def __init__(self, cutoff: float, n_bases=32) -> None: + """Initialize a Gaussian basis function + + Parameters + ---------- + cutoff : float + Cutoff radius of the basis set + n_bases : int, optional + Size of the basis set, by default 32 + """ + super().__init__() + + offset = torch.linspace( + start=0.0, + end=cutoff, + steps=n_bases, + dtype=torch.get_default_dtype(), + ) + coeff = -0.5 / (offset[1] - offset[0]).item() ** 2 + self.register_buffer( + 'cutoff', torch.tensor(cutoff, dtype=torch.get_default_dtype()) + ) + self.register_buffer( + 'coeff', torch.tensor(coeff, dtype=torch.get_default_dtype()) + ) + self.register_buffer('offset', offset) + + def forward(self, x: torch.Tensor) -> torch.Tensor: + dist = x.view(-1, 1) - self.offset.view(1, -1) + return torch.exp(self.coeff * torch.pow(dist, 2)) + + def __repr__(self) -> str: + result = 'GAUSSIANBASIS [ ' + + data_string = '\033[32m{:d}\033[0m\033[36m 󰯰 \033[0m' + result = result + data_string.format(len(self.offset)) + result = result + '| ' + data_string = '\033[32m{:f}\033[0m\033[36m 󰳁 \033[0m' + result = result + data_string.format(self.cutoff) + result = result + ']' + + return result + + +class BesselBasis(torch.nn.Module): + r""" + Bessel radial basis functions (equation (7) in [1]) + + .. math:: RBF_n(d) = \sqrt{\frac{2}{c}\frac{sin(\frac{n\pi}{c}d)}{d}} + + References + ---------- + .. [1] Gasteiger, J.; Groß, J.; Günnemann, S. Directional Message Passing + for Molecular Graphs; ICLR 2020. + """ + + def __init__(self, cutoff: float, n_bases=8, trainable=False) -> None: + """Initializes Bessel radial basis function + + Parameters + ---------- + cutoff: float + Cutoff radius of the basis set + n_bases: int + Size of the basis set, by default 8 + trainable: bool + If to use trainable basis set parameters + """ + super().__init__() + + bessel_weights = ( + np.pi + / cutoff + * torch.linspace( + start=1.0, + end=n_bases, + steps=n_bases, + dtype=torch.get_default_dtype(), + ) + ) + if trainable: + self.bessel_weights = torch.nn.Parameter(bessel_weights) + else: + self.register_buffer('bessel_weights', bessel_weights) + + self.register_buffer( + 'cutoff', torch.tensor(cutoff, dtype=torch.get_default_dtype()) + ) + self.register_buffer( + 'prefactor', + torch.tensor( + np.sqrt(2.0 / cutoff), dtype=torch.get_default_dtype() + ) + ) + + def forward(self, x: torch.Tensor) -> torch.Tensor: + numerator = torch.sin(self.bessel_weights * x) + return self.prefactor * (numerator / x) + + def __repr__(self) -> str: + result = 'BESSELBASIS [ ' + + data_string = '\033[32m{:d}\033[0m\033[36m 󰯰 \033[0m' + result = result + data_string.format(len(self.bessel_weights)) + result = result + '| ' + data_string = '\033[32m{:f}\033[0m\033[36m 󰳁 \033[0m' + result = result + data_string.format(self.cutoff) + if self.bessel_weights.requires_grad: + result = result + '|\033[36m TRAINABLE \033[0m' + result = result + ']' + + return result + + +class PolynomialCutoff(torch.nn.Module): + r"""Continuous cutoff function (equation (8) in [1]) + + .. math:: u(d) = 1 - \frac{(p+1)(p+2)}{2}d^p + p(p+2)d^{p+1} - \frac{p(p+1)}{2}d^{p+2} + + References + ---------- + .. [1] Gasteiger, J.; Groß, J.; Günnemann, S. Directional Message Passing + for Molecular Graphs; ICLR 2020. + """ + p: torch.Tensor + cutoff: torch.Tensor + + def __init__(self, cutoff: float, p: int = 6) -> None: + """initilalizes a polynomial cutoff function. + + Parameters + ---------- + cutoff: float + The cutoff radius. 
+ p: int + Order of the polynomial, by default 6 + """ + super().__init__() + self.register_buffer( + 'p', torch.tensor(p, dtype=torch.get_default_dtype()) + ) + self.register_buffer( + 'cutoff', torch.tensor(cutoff, dtype=torch.get_default_dtype()) + ) + + def forward(self, x: torch.Tensor) -> torch.Tensor: + # fmt: off + envelope = ( + 1.0 + - (self.p + 1.0) * (self.p + 2.0) / 2.0 + * torch.pow(x / self.cutoff, self.p) + + self.p * (self.p + 2.0) + * torch.pow(x / self.cutoff, self.p + 1) + - self.p * (self.p + 1.0) / 2 + * torch.pow(x / self.cutoff, self.p + 2) + ) + # fmt: on + + # noinspection PyUnresolvedReferences + return envelope * (x < self.cutoff) + + def __repr__(self) -> str: + result = 'POLYNOMIALCUTOFF [ ' + + data_string = '\033[32m{:d}\033[0m\033[36m 󰰚 \033[0m' + result = result + data_string.format(int(self.p)) + result = result + '| ' + data_string = '\033[32m{:f}\033[0m\033[36m 󰳁 \033[0m' + result = result + data_string.format(self.cutoff) + result = result + ']' + + return result + + +class RadialEmbeddingBlock(torch.nn.Module): + """ + Radial embedding block [1] + + References + ---------- + .. [1] Gasteiger, J.; Groß, J.; Günnemann, S. Directional Message Passing + for Molecular Graphs; ICLR 2020. + """ + + def __init__( + self, + cutoff: float, + n_bases: int = 8, + n_polynomials: int = 6, + basis_type: str = 'bessel', + ) -> None: + """Initializes a radial embedding block + + Parameters + ---------- + cutoff : float + Cutoff radius. + n_bases : int, optional + Size of the basis set, by default 8 + n_polynomials : int, optional + Order of the polynomial for the polynomial cutoff, by default 6 + basis_type : str, optional + Type fo the basis function, by default 'bessel' + + Raises + ------ + RuntimeError + _description_ + """ + super().__init__() + self.n_out = n_bases + if basis_type == 'bessel': + self.bessel_fn = BesselBasis(cutoff=cutoff, n_bases=n_bases) + self.cutoff_fn = PolynomialCutoff(cutoff=cutoff, p=n_polynomials) + elif basis_type == 'gaussian': + self.bessel_fn = GaussianBasis(cutoff=cutoff, n_bases=n_bases) + self.cutoff_fn = None + else: + raise RuntimeError( + 'Unknown basis function type "{:s}" !'.format(basis_type) + ) + + def forward(self, edge_lengths: torch.Tensor) -> torch.Tensor: + """ + The forward pass of RadialEmbeddingBlock + + Parameters + ---------- + edge_lengths: torch.Tensor (shape: [n_edges, 1]) + Lengths of edges. + + Returns + ------- + edge_embedding: torch.Tensor (shape: [n_edges, n_bases]) + The radial edge embedding. 
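+
+        Examples
+        --------
+        A minimal sketch with illustrative values (cutoff and edge lengths in
+        the same units used to build the graphs):
+
+        >>> block = RadialEmbeddingBlock(cutoff=6.0, n_bases=8)
+        >>> lengths = torch.ones(10, 1) * 0.5   # shape: [n_edges, 1]
+        >>> embedding = block(lengths)          # shape: [n_edges, 8]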
+ """ + r = self.bessel_fn(edge_lengths) # shape: [n_edges, n_bases] + if self.cutoff_fn is not None: + c = self.cutoff_fn(edge_lengths) # shape: [n_edges, 1] + return r * c + else: + return r + + +def test_bessel_basis() -> None: + dtype = torch.get_default_dtype() + torch.set_default_dtype(torch.float64) + + data = torch.tensor([ + [0.30216178425160090, 0.603495364055576400], + [0.29735174147757487, 0.565596622727919000], + [0.28586135770645804, 0.479487014442650350], + [0.26815929064765680, 0.358867177503655900], + [0.24496326504279375, 0.222421990229218020], + [0.21720530022724968, 0.090319042449653110], + [0.18598678410040770, -0.019467592388889482], + [0.15252575991598738, -0.094266103787986490], + [0.11809918979627002, -0.128642857533393970], + [0.08398320341397922, -0.124823366088228150] + ]) + + rbf = BesselBasis(6.0, 2) + + data_new = torch.stack( + [rbf(torch.ones(1) * i * 0.5 + 0.1) for i in range(0, 10)] + ) + + assert (torch.abs(data - data_new) < 1E-12).all() + + torch.set_default_dtype(dtype) + + print(rbf) + + +def test_gaussian_basis() -> None: + dtype = torch.get_default_dtype() + torch.set_default_dtype(torch.float64) + + data = torch.tensor([ + [0.9998611207557263, 0.6166385641763439], + [0.9950124791926823, 0.6669768108584744], + [0.9833348700493460, 0.7164317992468783], + [0.9650691177896804, 0.7642281651714904], + [0.9405880633643421, 0.8095716486678869], + [0.9103839103891423, 0.8516705072294410], + [0.8750517756337902, 0.8897581848801761], + [0.8352702114112720, 0.9231163463866358], + [0.7917795893122607, 0.9510973184771084], + [0.7453593045429805, 0.9731449630580510] + ]) + + rbf = GaussianBasis(6.0, 2) + + data_new = torch.stack( + [rbf(torch.ones(1) * i * 0.5 + 0.1)[0] for i in range(0, 10)] + ) + + assert (torch.abs(data - data_new) < 1E-12).all() + + torch.set_default_dtype(dtype) + + print(rbf) + + +def test_polynomial_cutoff() -> None: + dtype = torch.get_default_dtype() + torch.set_default_dtype(torch.float64) + + data = torch.tensor([ + [1.0000000000000000], + [0.9999919136092714], + [0.9995588277320531], + [0.9957733154296875], + [0.9803383630544124], + [0.9390599059360889], + [0.8554687500000000], + [0.7184512221655127], + [0.5317786922725198], + [0.3214569091796875] + ]) + + cutoff_function = PolynomialCutoff(6.0) + + data_new = torch.stack( + [cutoff_function(torch.ones(1) * i * 0.5) for i in range(0, 10)] + ) + + assert (torch.abs(data - data_new) < 1E-12).all() + + torch.set_default_dtype(dtype) + + print(cutoff_function) + +def test_radial_embedding_block(): + dtype = torch.get_default_dtype() + torch.set_default_dtype(torch.float64) + + data = torch.tensor([ + [0.302161784075405670, 0.603495363703668900], + [0.297344780473306900, 0.565583382110980900], + [0.285645292705329600, 0.479124599728231300], + [0.266549578182040000, 0.356712961747292670], + [0.238761404317085600, 0.216790818528859370], + [0.201179558989195350, 0.083655164534829570], + [0.154832684273361420, -0.016206633178216297], + [0.104419964978618930, -0.064535087460860160], + [0.057909938358517744, -0.063080025890725560], + [0.023554408472511446, -0.035008673547055544] + ]) + + embedding = RadialEmbeddingBlock(6, 2, 6) + + data_new = torch.stack( + [embedding(torch.ones(1) * i * 0.5 + 0.1) for i in range(0, 10)] + ) + + assert (torch.abs(data - data_new) < 1E-12).all() + + torch.set_default_dtype(dtype) + + +if __name__ == '__main__': + test_bessel_basis() + test_gaussian_basis() + test_polynomial_cutoff() + test_radial_embedding_block() diff --git a/mlcolvar/core/nn/graph/schnet.py 
b/mlcolvar/core/nn/graph/schnet.py new file mode 100644 index 00000000..9b2c0f17 --- /dev/null +++ b/mlcolvar/core/nn/graph/schnet.py @@ -0,0 +1,382 @@ +import math +import torch +from torch import nn +from torch_geometric.nn import MessagePassing + +from mlcolvar.core.nn.graph.gnn import BaseGNN + +from typing import List, Dict + +""" +The SchNet components. This module is taken from the pgy package: +https://github.com/pyg-team/pytorch_geometric/blob/master/torch_geometric/nn/models/schnet.py +""" + +__all__ = ["SchNetModel", "InteractionBlock", "ShiftedSoftplus"] + +class SchNetModel(BaseGNN): + """ + The SchNet [1] model. This implementation is taken from torch_geometric: + https://github.com/pyg-team/pytorch_geometric/blob/master/torch_geometric/nn/models/schnet.py + + Parameters + ---------- + n_out: int + Size of the output node features. + cutoff: float + Cutoff radius of the basis functions. Should be the same as the cutoff + radius used to build the graphs. + atomic_numbers: List[int] + The atomic numbers mapping, e.g. the `atomic_numbers` attribute of a + `mlcolvar.graph.data.GraphDataSet` instance. + n_bases: int + Size of the basis set. + n_layers: int + Number of the graph convolution layers. + n_filters: int + Number of filters. + n_hidden_channels: int + Size of hidden embeddings. + aggr: str + Type of aggregation function for the GNN message passing. + w_out_after_pool: bool + If apply the readout MLP layer after the scatter sum. + References + ---------- + .. [1] Schütt, Kristof T., et al. "Schnet–a deep learning architecture for + molecules and materials." The Journal of Chemical Physics 148.24 + (2018). + """ + + def __init__( + self, + n_out: int, + cutoff: float, + atomic_numbers: List[int], + pooling_operation : str = 'mean', + n_bases: int = 16, + n_layers: int = 2, + n_filters: int = 16, + n_hidden_channels: int = 16, + aggr: str = 'mean', + w_out_after_pool: bool = False, + ) -> None: + """The SchNet model. This implementation is taken from torch_geometric: + https://github.com/pyg-team/pytorch_geometric/blob/master/torch_geometric/nn/models/schnet.py + + Parameters + ---------- + n_out : int + Size of the output node features. + cutoff : float + Cutoff radius of the basis functions. Should be the same as the cutoff + radius used to build the graphs. + atomic_numbers : List[int] + The atomic numbers mapping. 
+ pooling_operation : str + Type of pooling operation to combine node-level features into graph-level features, either mean or sum, by default 'mean' + n_bases : int, optional + Size of the basis set used for the embedding, by default 16 + n_layers : int, optional + Number of the graph convolution layers, by default 2 + n_filters : int, optional + Number of filters, by default 16 + n_hidden_channels : int, optional + Size of hidden embeddings, by default 16 + aggr : str, optional + Type of the GNN aggregation function, by default 'mean' + w_out_after_pool : bool, optional + Whether to apply the last linear transformation form hidden to output channels after the pooling sum, by default False + """ + + super().__init__( + n_out=n_out, + cutoff=cutoff, + atomic_numbers=atomic_numbers, + pooling_operation=pooling_operation, + n_bases=n_bases, + n_polynomials=0, + basis_type='gaussian' + ) + + # transforms embedding into hidden channels + self.W_v = nn.Linear( + in_features=len(atomic_numbers), + out_features=n_hidden_channels, + bias=False + ) + + # initialize layers with interaction blocks + self.layers = nn.ModuleList([ + InteractionBlock( + n_hidden_channels, n_bases, n_filters, cutoff, aggr + ) + for _ in range(n_layers) + ]) + + # transforms hidden channels into output channels + self.W_out = nn.ModuleList([ + nn.Linear(n_hidden_channels, n_hidden_channels // 2), + ShiftedSoftplus(), + nn.Linear(n_hidden_channels // 2, n_out) + ]) + + self._w_out_after_pool = w_out_after_pool + + self.reset_parameters() + + def reset_parameters(self) -> None: + """ + Resets all learnable parameters of the module. + """ + self.W_v.reset_parameters() + + for layer in self.layers: + layer.reset_parameters() + + nn.init.xavier_uniform_(self.W_out[0].weight) + self.W_out[0].bias.data.fill_(0) + nn.init.xavier_uniform_(self.W_out[2].weight) + self.W_out[2].bias.data.fill_(0) + + def forward( + self, data: Dict[str, torch.Tensor], pool: bool = True + ) -> torch.Tensor: + """ + The forward pass. + Parameters + ---------- + data: Dict[str, torch.Tensor] + The data dict. Usually came from the `to_dict` method of a + `torch_geometric.data.Batch` object. + pool: bool + If to perform the pooling to the model output. 
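+
+        Returns
+        -------
+        torch.Tensor
+            Node-level outputs of shape [n_nodes, n_out], or graph-level
+            outputs of shape [n_graphs, n_out] if `pool` is True.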
+ """ + + # embed edges and node attrs + h_E = self.embed_edge(data) + h_V = self.W_v(data['node_attrs']) + + # update through layers + for layer in self.layers: + h_V = h_V + layer(h_V, data['edge_index'], h_E[0], h_E[1]) + + # in case the last linear transformation is performed BEFORE pooling + if not self._w_out_after_pool: + for w in self.W_out: + h_V = w(h_V) + out = h_V + + # perform pooling of the node-level ouptuts + if pool: + out = self.pooling(input=out, data=data) + + # in case the last linear transformation is performed AFTER pooling + if self._w_out_after_pool: + for w in self.W_out: + out = w(out) + + return out + +class InteractionBlock(nn.Module): + def __init__( + self, + hidden_channels: int, + num_gaussians: int, + num_filters: int, + cutoff: float, + aggr: str = 'mean' + ) -> None: + """SchNet interaction block + + Parameters + ---------- + hidden_channels : int + Size of hidden embeddings + num_gaussians : int + Number of Gaussians for the embedding + num_filters : int + Number of filters + cutoff : float + Radial cutoff + aggr : str, optional + Aggregation function, by default 'mean' + """ + super().__init__() + self.mlp = nn.Sequential( + nn.Linear(num_gaussians, num_filters), + ShiftedSoftplus(), + nn.Linear(num_filters, num_filters), + ) + self.conv = CFConv( + hidden_channels, + hidden_channels, + num_filters, + self.mlp, + cutoff, + aggr + ) + self.act = ShiftedSoftplus() + self.lin = nn.Linear(hidden_channels, hidden_channels) + + self.reset_parameters() + + def reset_parameters(self) -> None: + """ + Resets all learnable parameters of the module. + """ + nn.init.xavier_uniform_(self.mlp[0].weight) + self.mlp[0].bias.data.fill_(0) + nn.init.xavier_uniform_(self.mlp[2].weight) + self.mlp[2].bias.data.fill_(0) + self.conv.reset_parameters() + nn.init.xavier_uniform_(self.lin.weight) + self.lin.bias.data.fill_(0) + + def forward( + self, + x: torch.Tensor, + edge_index: torch.Tensor, + edge_weight: torch.Tensor, + edge_attr: torch.Tensor, + ) -> torch.Tensor: + x = self.conv(x, edge_index, edge_weight, edge_attr) + x = self.act(x) + x = self.lin(x) + return x + + +class CFConv(MessagePassing): + """Continuos-filter convolution from SchNet""" + def __init__( + self, + in_channels: int, + out_channels: int, + num_filters: int, + network: nn.Sequential, + cutoff: float, + aggr: str = 'mean' + ) -> None: + """Applies a continuous-filter convolution + + Parameters + ---------- + in_channels : int + Number of input channels + out_channels : int + Number of output channels + num_filters : int + Number of filters + network : nn.Sequential + Neural network + cutoff : float + Radial cutoff + aggr : str, optional + Aggregation function, by default 'mean' + """ + super().__init__(aggr=aggr) + self.lin1 = nn.Linear(in_channels, num_filters, bias=False) + self.lin2 = nn.Linear(num_filters, out_channels) + self.network = network + self.cutoff = cutoff + + self.reset_parameters() + + def reset_parameters(self): + nn.init.xavier_uniform_(self.lin1.weight) + nn.init.xavier_uniform_(self.lin2.weight) + self.lin2.bias.data.fill_(0) + + def forward( + self, + x: torch.Tensor, + edge_index: torch.Tensor, + edge_weight: torch.Tensor, + edge_attr: torch.Tensor, + ) -> torch.Tensor: + C = 0.5 * (torch.cos(edge_weight * math.pi / self.cutoff) + 1.0) + W = self.network(edge_attr) * C.view(-1, 1) + + x = self.lin1(x) + x = self.propagate(edge_index, x=x, W=W) + x = self.lin2(x) + return x + + def message(self, x_j: torch.Tensor, W: torch.Tensor) -> torch.Tensor: + return x_j * W + +# TODO 
maybe remove and use the common one +class ShiftedSoftplus(nn.Module): + def __init__(self) -> None: + super().__init__() + self.shift = torch.log(torch.tensor(2.0)).item() + + def forward(self, x: torch.Tensor) -> torch.Tensor: + return nn.functional.softplus(x) - self.shift + + + +from mlcolvar.core.nn.graph.utils import _test_get_data +from mlcolvar.data.graph.utils import create_graph_tracing_example + +def test_schnet_1() -> None: + torch.manual_seed(0) + torch.set_default_dtype(torch.float64) + + model = SchNetModel( + n_out=2, + cutoff=0.1, + atomic_numbers=[1, 8], + n_bases=6, + n_layers=2, + n_filters=16, + n_hidden_channels=16 + ) + + data = _test_get_data() + ref_out = torch.tensor([[0.40384621527953063, -0.1257513365138969]] * 6) + assert ( torch.allclose(model(data), ref_out) ) + + model = SchNetModel( + n_out=2, + cutoff=0.1, + atomic_numbers=[1, 8], + n_bases=6, + n_layers=2, + n_filters=16, + n_hidden_channels=16, + pooling_operation='sum', + ) + + data = _test_get_data() + ref_out = torch.tensor([[0.5760462255365488, -0.4465858318467991]] * 6) + assert ( torch.allclose(model(data), ref_out) ) + + traced_model = torch.jit.trace(model, example_inputs=create_graph_tracing_example(2)) + assert ( torch.allclose(traced_model(data), ref_out) ) + + +def test_schnet_2() -> None: + torch.manual_seed(0) + torch.set_default_dtype(torch.float64) + + model = SchNetModel( + n_out=2, + cutoff=0.1, + atomic_numbers=[1, 8], + n_bases=6, + n_layers=2, + n_filters=16, + n_hidden_channels=16, + aggr='min', + w_out_after_pool=True + ) + + data = _test_get_data() + ref_out = torch.tensor([[0.3654537816221449, -0.0748265132499575]] * 6) + assert ( torch.allclose(model(data), ref_out) ) + + torch.set_default_dtype(torch.float32) + +if __name__ == "__main__": + test_schnet_1() \ No newline at end of file diff --git a/mlcolvar/core/nn/graph/utils.py b/mlcolvar/core/nn/graph/utils.py new file mode 100644 index 00000000..1a0e3780 --- /dev/null +++ b/mlcolvar/core/nn/graph/utils.py @@ -0,0 +1,54 @@ +import torch +import torch_geometric +import numpy as np + +from mlcolvar.data.graph import atomic, create_dataset_from_configurations +from mlcolvar.data import DictModule + + +def _test_get_data() -> torch_geometric.data.Batch: + # TODO: This is not a real test, but a helper function for other tests. + # Maybe should change its name. 
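+    # Builds six configurations of a three-atom (O, H, H) system in a small
+    # periodic box and returns them as a single deterministic graph batch,
+    # to be used as a reference input by the model tests.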
+ torch.manual_seed(0) + torch.set_default_dtype(torch.float64) + + numbers = [8, 1, 1] + positions = np.array( + [ + [[0.0, 0.0, 0.0], [0.07, 0.07, 0.0], [0.07, -0.07, 0.0]], + [[0.0, 0.0, 0.0], [-0.07, 0.07, 0.0], [0.07, 0.07, 0.0]], + [[0.0, 0.0, 0.0], [0.07, -0.07, 0.0], [0.07, 0.07, 0.0]], + [[0.0, 0.0, 0.0], [0.0, -0.07, 0.07], [0.0, 0.07, 0.07]], + [[0.0, 0.0, 0.0], [0.07, 0.0, 0.07], [-0.07, 0.0, 0.07]], + [[0.1, 0.0, 1.1], [0.17, 0.07, 1.1], [0.17, -0.07, 1.1]], + ], + dtype=np.float64 + ) + cell = np.identity(3, dtype=float) * 0.2 + graph_labels = np.array([[1]]) + node_labels = np.array([[0], [1], [1]]) + z_table = atomic.AtomicNumberTable.from_zs(numbers) + + config = [ + atomic.Configuration( + atomic_numbers=numbers, + positions=p, + cell=cell, + pbc=[True] * 3, + node_labels=node_labels, + graph_labels=graph_labels, + ) for p in positions + ] + dataset = create_dataset_from_configurations( + config, z_table, 0.1, show_progress=False + ) + + datamodule = DictModule( + dataset, + lengths=(1.0,), + batch_size=10, + shuffle=False, + ) + datamodule.setup() + + return next(iter(datamodule.train_dataloader()))['data_list'] \ No newline at end of file diff --git a/mlcolvar/core/transform/__init__.py b/mlcolvar/core/transform/__init__.py index 51d14adf..feaf3d19 100644 --- a/mlcolvar/core/transform/__init__.py +++ b/mlcolvar/core/transform/__init__.py @@ -1,4 +1,14 @@ -__all__ = ["Transform","Normalization","Statistics","SwitchingFunctions","MultipleDescriptors","PairwiseDistances","EigsAdjMat","ContinuousHistogram","Inverse","TorsionalAngle","SequentialTransform"] +__all__ = ["Transform", + "Normalization", + "Statistics", + "SwitchingFunctions", + "MultipleDescriptors", + "PairwiseDistances", + "EigsAdjMat", + "ContinuousHistogram", + "Inverse", + "TorsionalAngle", + "SequentialTransform"] from .transform import * from .utils import * diff --git a/mlcolvar/core/transform/tools/utils.py b/mlcolvar/core/transform/tools/utils.py index 6705ffff..3a73e589 100644 --- a/mlcolvar/core/transform/tools/utils.py +++ b/mlcolvar/core/transform/tools/utils.py @@ -3,7 +3,7 @@ from typing import Union, List -def batch_reshape(t: torch.Tensor, size: torch.Size) -> torch.Tensor: +def batch_reshape(t: torch.Tensor, size: List[int]) -> torch.Tensor: """Return value reshaped according to size. In case of batch unsqueeze and expand along the first dimension. For single inputs just pass. 
diff --git a/mlcolvar/core/transform/utils.py b/mlcolvar/core/transform/utils.py index 5a769ecb..23c35fad 100644 --- a/mlcolvar/core/transform/utils.py +++ b/mlcolvar/core/transform/utils.py @@ -146,7 +146,7 @@ def test_sequential_transform(): import lightning masses = initialize_committor_masses(atom_types=[0,0,0,0], masses=[1.008]) - model = Committor(layers=[6,2,1], atomic_masses=masses, alpha=1) + model = Committor(model=[6,2,1], atomic_masses=masses, alpha=1) model.preprocessing = sequential pos = torch.rand((5, 4, 3)) diff --git a/mlcolvar/cvs/committor/committor.py b/mlcolvar/cvs/committor/committor.py index 65dd5ceb..ad75217f 100644 --- a/mlcolvar/cvs/committor/committor.py +++ b/mlcolvar/cvs/committor/committor.py @@ -1,9 +1,10 @@ import torch import lightning from mlcolvar.cvs import BaseCV -from mlcolvar.core import FeedForward +from mlcolvar.core import FeedForward, BaseGNN from mlcolvar.core.loss import CommittorLoss from mlcolvar.core.nn.utils import Custom_Sigmoid +from typing import Union, List __all__ = ["Committor"] @@ -13,8 +14,10 @@ class Committor(BaseCV, lightning.LightningModule): The committor function q is expressed as the output of a neural network optimized with a self-consistent approach based on the Kolmogorov's variational principle for the committor and on the imposition of its boundary conditions. - **Data**: for training it requires a DictDataset with the keys 'data', 'labels' and 'weights' - + **Data**: for training it requires a DictDataset containing: + - If using descriptors as input, the keys 'data', 'labels' and 'weights'. + - If using graphs as input, `torch_geometric.data` with 'graph_labels' and 'weight' in their 'data_list'. + **Loss**: Minimize Kolmogorov's variational functional of q and impose boundary condition on the metastable states (CommittorLoss) References @@ -34,11 +37,12 @@ class Committor(BaseCV, lightning.LightningModule): Class to optimize the gradients calculation imporving speed and memory efficiency. """ - BLOCKS = ["nn", "sigmoid"] + DEFAULT_BLOCKS = ["nn", "sigmoid"] + MODEL_BLOCKS = ["nn", "sigmoid"] def __init__( self, - layers: list, + model: Union[List[int], FeedForward, BaseGNN], atomic_masses: torch.Tensor, alpha: float, gamma: float = 10000, @@ -74,7 +78,7 @@ def __init__( separate_boundary_dataset : bool, optional Switch to exculde boundary condition labeled data from the variational loss, by default True descriptors_derivatives : torch.nn.Module, optional - `SmartDerivatives` object to save memory and time when using descriptors. + `SmartDerivatives` object to save memory and time when using descriptors. Cannot be used with GNN models. See also mlcolvar.core.loss.committor_loss.SmartDerivatives log_var : bool, optional Switch to minimize the log of the variational functional, by default False. @@ -88,16 +92,18 @@ def __init__( Number of dimensions, by default 3. options : dict[str, Any], optional Options for the building blocks of the model, by default {}. - Available blocks: ['nn'] . + Available blocks: ['nn']. 
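+
+        Examples
+        --------
+        A minimal sketch using a feed-forward architecture (hypothetical layer
+        sizes, with fake atomic masses as in the tests below):
+
+        >>> from mlcolvar.cvs.committor.utils import initialize_committor_masses
+        >>> masses = initialize_committor_masses(atom_types=[0, 1], masses=[15.999, 1.008])
+        >>> cv = Committor(model=[6, 4, 2, 1], atomic_masses=masses, alpha=1e-1, delta_f=0)
+
+        A pre-built `FeedForward` or `BaseGNN` model can be passed through the
+        same `model` argument instead of a list of layer sizes.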
""" - super().__init__(in_features=layers[0], out_features=layers[-1], **kwargs) + super().__init__(model, **kwargs) + + self.register_buffer('is_committor', torch.tensor(1, dtype=int)) # ======= LOSS ======= self.loss_fn = CommittorLoss(atomic_masses=atomic_masses, alpha=alpha, + cell=cell, gamma=gamma, delta_f=delta_f, - cell=cell, separate_boundary_dataset=separate_boundary_dataset, descriptors_derivatives=descriptors_derivatives, log_var=log_var, @@ -111,12 +117,18 @@ def __init__( options = self.parse_options(options) # ======= BLOCKS ======= - # initialize NN turning - o = "nn" - # set default activation to tanh - if "activation" not in options[o]: - options[o]["activation"] = "tanh" - self.nn = FeedForward(layers, **options[o]) + if not self._override_model: + # initialize NN + o = "nn" + # set default activation to tanh + if "activation" not in options[o]: + options[o]["activation"] = "tanh" + self.nn = FeedForward(self.layers, **options[o]) + elif self._override_model: + self.nn = model + + if self.nn.out_features != 1: + raise ValueError('Output of the model must be of dimension 1') # separately add sigmoid activation on last layer, this way it can be deactived o = "sigmoid" @@ -134,13 +146,19 @@ def training_step(self, train_batch, batch_idx): """Compute and return the training loss and record metrics.""" # =================get data=================== - x = train_batch["data"] - # check data are have shape (n_data, -1) - x = x.reshape((x.shape[0], -1)) - x.requires_grad = True - - labels = train_batch["labels"] - weights = train_batch["weights"] + if isinstance(self.nn, FeedForward): + x = train_batch["data"] + # check data have shape (n_data, -1) + x = x.reshape((x.shape[0], -1)) + x.requires_grad = True + + labels = train_batch["labels"] + weights = train_batch["weights"] + elif isinstance(self.nn, BaseGNN): + x = self._setup_graph_data(train_batch) + labels = x['graph_labels'] + weights = x['weight'].clone() + try: ref_idx = train_batch["ref_idx"] except KeyError: @@ -172,8 +190,7 @@ def training_step(self, train_batch, batch_idx): self.log(f"{name}_loss_bound_B", loss_bound_B, on_epoch=True) return loss - -def test_committor(): +def test_committor_1(): from mlcolvar.data import DictDataset, DictModule from mlcolvar.cvs.committor.utils import initialize_committor_masses, KolmogorovBias @@ -218,7 +235,7 @@ def test_committor(): -6.7121, -7.6094, -7.9009, -7.0479, -5.2398, -7.8241, -5.8642, -7.0701, -7.0348, -7.2577, -6.6142, -7.6322, -7.3279, -7.6393, -7.8608, -7.7037, -6.6949, -6.3947, -7.2246, -7.7009, -6.7359, -7.2186, -7.7849, -5.6882]) - model = Committor(layers=[6, 4, 2, 1], atomic_masses=atomic_masses, alpha=1e-1, delta_f=0) + model = Committor(model=[6, 4, 2, 1], atomic_masses=atomic_masses, alpha=1e-1, delta_f=0) trainer.fit(model, datamodule) out = model(X) out.sum().backward() @@ -238,7 +255,7 @@ def test_committor(): [0.0783],[0.1384],[0.0689],[0.0649],[0.0983],[0.1548],[0.0778],[0.0934],[0.0858],[0.1203], [0.1073],[0.1139],[0.0716],[0.0988],[0.0918],[0.1109],[0.0918],[0.0928],[0.1070],[0.0742]]) trainer = lightning.Trainer(max_epochs=5, logger=None, enable_checkpointing=False, limit_val_batches=0, num_sanity_val_steps=0) - model = Committor(layers=[6, 4, 2, 1], atomic_masses=atomic_masses, alpha=1e-1, delta_f=0, separate_boundary_dataset=False) + model = Committor(model=[6, 4, 2, 1], atomic_masses=atomic_masses, alpha=1e-1, delta_f=0, separate_boundary_dataset=False) trainer.fit(model, datamodule) out = model(X) out.sum().backward() @@ -254,7 +271,7 @@ def 
test_committor(): [0.7714],[0.5826],[0.6442],[0.5796],[0.6132],[0.5923],[0.7023],[0.5731],[0.7308],[0.6404], [0.5781],[0.6850],[0.5960],[0.6718],[0.6626],[0.6069],[0.7319],[0.5498],[0.6772],[0.5847]]) trainer = lightning.Trainer(max_epochs=5, logger=None, enable_checkpointing=False, limit_val_batches=0, num_sanity_val_steps=0) - model = Committor(layers=[6, 4, 2, 1], atomic_masses=atomic_masses, alpha=1e-1, delta_f=0, log_var=True) + model = Committor(model=[6, 4, 2, 1], atomic_masses=atomic_masses, alpha=1e-1, delta_f=0, log_var=True) trainer.fit(model, datamodule) out = model(X) out.sum().backward() @@ -270,7 +287,7 @@ def test_committor(): [0.1337],[0.1444],[0.1603],[0.1396],[0.2043],[0.1964],[0.1459],[0.2243],[0.1930],[0.1893], [0.2634],[0.1868],[0.1340],[0.2483],[0.1550],[0.1559],[0.1614],[0.2020],[0.1270],[0.2555]]) trainer = lightning.Trainer(max_epochs=5, logger=None, enable_checkpointing=False, limit_val_batches=0, num_sanity_val_steps=0) - model = Committor(layers=[6, 4, 2, 1], atomic_masses=atomic_masses, alpha=1e-1, delta_f=0, z_regularization=100, z_threshold=0.000001) + model = Committor(model=[6, 4, 2, 1], atomic_masses=atomic_masses, alpha=1e-1, delta_f=0, z_regularization=100, z_threshold=0.000001) trainer.fit(model, datamodule) out = model(X) out.sum().backward() @@ -281,7 +298,7 @@ def test_committor(): for z_regularization, z_threshold in zip([10, 0, -1, 10], [None, 10, 1, -1]): try: - model = Committor(layers=[6, 4, 2, 1], atomic_masses=atomic_masses, alpha=1e-1, delta_f=0, z_regularization=z_regularization, z_threshold=z_threshold, n_dim=2) + model = Committor(model=[6, 4, 2, 1], atomic_masses=atomic_masses, alpha=1e-1, delta_f=0, z_regularization=z_regularization, z_threshold=z_threshold, n_dim=2) trainer.fit(model, datamodule) except ValueError as e: print("[TEST LOG] Checked this error: ", e) @@ -289,10 +306,91 @@ def test_committor(): # test dimension error try: trainer = lightning.Trainer(max_epochs=5, logger=None, enable_checkpointing=False, limit_val_batches=0, num_sanity_val_steps=0) - model = Committor(layers=[6, 4, 2, 1], atomic_masses=atomic_masses, alpha=1e-1, delta_f=0, z_regularization=10, z_threshold=1, n_dim=2) + model = Committor(model=[6, 4, 2, 1], atomic_masses=atomic_masses, alpha=1e-1, delta_f=0, z_regularization=10, z_threshold=1, n_dim=2) trainer.fit(model, datamodule) except RuntimeError as e: print("[TEST LOG] Checked this error: ", e) + + +def test_committor_2(): + from mlcolvar.data import DictDataset, DictModule + from mlcolvar.cvs.committor.utils import initialize_committor_masses, KolmogorovBias + + # create two fake atoms and use their fake positions + atomic_masses = initialize_committor_masses(atom_types=[0,1], masses=[15.999, 1.008]) + # create dataset + samples = 50 + X = torch.randn((4*samples, 6)) + + # create labels + y = torch.zeros(X.shape[0]) + y[samples:] += 1 + y[int(2*samples):] += 1 + y[int(3*samples):] += 1 + + # create weights + w = torch.ones(X.shape[0]) + + dataset = DictDataset({"data": X, "labels": y, "weights": w}) + datamodule = DictModule(dataset, lengths=[1]) + + # train model + trainer = lightning.Trainer(max_epochs=5, logger=False, enable_checkpointing=False, limit_val_batches=0, num_sanity_val_steps=0) + + print() + print('NORMAL') + print() + # dataset separation + model = Committor(model=[6, 4, 2, 1], atomic_masses=atomic_masses, alpha=1e-1, delta_f=0) + trainer.fit(model, datamodule) + model(X).sum().backward() + bias_model = KolmogorovBias(input_model=model, beta=1, epsilon=1e-6, lambd=1) + bias_model(X) + 
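For context, `KolmogorovBias` (exercised right above) wraps a trained committor model q and turns the squared norm of its gradient into a bias potential; the patch extends it so that the gradient is taken with respect to the descriptors for `FeedForward` models and with respect to the atomic positions, summed over the atoms of each configuration, for `BaseGNN` models. The following is an illustrative, self-contained sketch of the descriptor-input expression only, written as a standalone function; the library class already implements this, including the graph branch.

```python
import torch

def kolmogorov_bias(q_model, x, beta=1.0, lambd=1.0, epsilon=1e-6):
    # sketch of V(x) = -(lambd / beta) * [ log(|grad q(x)|^2 + eps) - log(eps) ]
    x.requires_grad_(True)
    q = q_model(x)                                           # committor estimate, shape (n, 1)
    grads = torch.autograd.grad(q, x, torch.ones_like(q), retain_graph=True)[0]
    grads_squared = grads.pow(2).sum(dim=1)                  # |grad q(x)|^2 per sample
    return -lambd / beta * (torch.log(grads_squared + epsilon)
                            - torch.log(torch.tensor(epsilon)))
```

The bias is zero where q is flat (deep inside the metastable basins) and becomes increasingly negative where the committor changes rapidly, which is what the `KolmogorovBias(input_model=model, beta=1, epsilon=1e-6, lambd=1)` calls in these tests rely on.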
+ # naive whole dataset + trainer = lightning.Trainer(max_epochs=5, logger=False, enable_checkpointing=False, limit_val_batches=0, num_sanity_val_steps=0) + model = Committor(model=[6, 4, 2, 1], atomic_masses=atomic_masses, alpha=1e-1, delta_f=0, separate_boundary_dataset=False) + trainer.fit(model, datamodule) + model(X).sum().backward() + + + print() + print('EXTERNAL FEEDFORWARD') + print() + # dataset separation + ff_model = FeedForward([6, 4, 2, 1]) + model = Committor(model=ff_model, atomic_masses=atomic_masses, alpha=1e-1, delta_f=0) + trainer.fit(model, datamodule) + model(X).sum().backward() + bias_model = KolmogorovBias(input_model=model, beta=1, epsilon=1e-6, lambd=1) + bias_model(X) + + # naive whole dataset + trainer = lightning.Trainer(max_epochs=5, logger=False, enable_checkpointing=False, limit_val_batches=0, num_sanity_val_steps=0) + model = Committor(model=ff_model, atomic_masses=atomic_masses, alpha=1e-1, delta_f=0, separate_boundary_dataset=False) + trainer.fit(model, datamodule) + model(X).sum().backward() + + + print() + print('EXTERNAL GNN') + print() + from mlcolvar.core.nn.graph import SchNetModel + from mlcolvar.data.graph.utils import create_test_graph_input + gnn_model = SchNetModel(1, 0.1, [1, 8]) + + model = Committor(model=gnn_model, + atomic_masses=atomic_masses, + alpha=1e-1, + delta_f=0) + + datamodule = create_test_graph_input(output_type='datamodule', n_samples=100, n_states=3, n_atoms=3) + trainer = lightning.Trainer(max_epochs=5, logger=False, enable_checkpointing=False, limit_val_batches=0, num_sanity_val_steps=0, enable_model_summary=False) + trainer.fit(model, datamodule) + + example_input_graph_test = create_test_graph_input(output_type='example', n_atoms=4, n_samples=3, n_states=2) + + model(example_input_graph_test).sum().backward() @@ -352,7 +450,7 @@ def test_committor_with_derivatives(): datamodule = DictModule(dataset, lengths=[1.0]) # seed for reproducibility - model = Committor(layers=[45, 20, 1], + model = Committor(model=[45, 20, 1], atomic_masses=masses, alpha=1, separate_boundary_dataset=separate_boundary_dataset) @@ -408,7 +506,7 @@ def test_committor_with_derivatives(): torch.manual_seed(42) datamodule = DictModule(dataset_desc, lengths=[1.0]) - model = Committor(layers=[45, 20, 1], + model = Committor(model=[45, 20, 1], atomic_masses=masses, alpha=1, separate_boundary_dataset=separate_boundary_dataset, @@ -438,7 +536,7 @@ def test_committor_with_derivatives(): # test errors try: # separate boundary with explicit derivatives - model = Committor(layers=[45, 20, 1], + model = Committor(model=[45, 20, 1], atomic_masses=masses, alpha=1, separate_boundary_dataset=True, @@ -470,7 +568,7 @@ def test_committor_with_derivatives(): torch.manual_seed(42) datamodule = DictModule(smart_dataset, lengths=[1.0]) - model = Committor(layers=[45, 20, 1], + model = Committor(model=[45, 20, 1], atomic_masses=masses, alpha=1, separate_boundary_dataset=separate_boundary_dataset, @@ -516,6 +614,29 @@ def test_committor_with_derivatives(): except ValueError as e: print("[TEST LOG] Checked this error: ", e) + + print() + print('EXTERNAL GNN') + print() + from mlcolvar.core.nn.graph import SchNetModel + from mlcolvar.data.graph.utils import create_test_graph_input + gnn_model = SchNetModel(1, 0.1, [1, 8]) + + model = Committor(model=gnn_model, + atomic_masses=masses, + alpha=1e-1, + delta_f=0) + + datamodule = create_test_graph_input(output_type='datamodule', n_samples=100, n_states=3, n_atoms=3) + trainer = lightning.Trainer(max_epochs=5, logger=False, 
enable_checkpointing=False, limit_val_batches=0, num_sanity_val_steps=0, enable_model_summary=False) + trainer.fit(model, datamodule) + + example_input_graph_test = create_test_graph_input(output_type='example', n_atoms=4, n_samples=3, n_states=2) + + model(example_input_graph_test).sum().backward() + + if __name__ == "__main__": - test_committor() + test_committor_1() + test_committor_2() test_committor_with_derivatives() \ No newline at end of file diff --git a/mlcolvar/cvs/committor/utils.py b/mlcolvar/cvs/committor/utils.py index 040f3a84..4aba561a 100644 --- a/mlcolvar/cvs/committor/utils.py +++ b/mlcolvar/cvs/committor/utils.py @@ -1,6 +1,8 @@ import torch import numpy as np from typing import List +from mlcolvar.core import FeedForward, BaseGNN +from mlcolvar.utils import _code from mlcolvar.data import DictDataset __all__ = ["KolmogorovBias", "compute_committor_weights", "initialize_committor_masses"] @@ -35,12 +37,34 @@ def __init__(self, self.epsilon = epsilon def forward(self, x): - x.requires_grad = True + if isinstance(self.input_model.nn, FeedForward): + x.requires_grad = True + + elif isinstance(self.input_model.nn, BaseGNN): + x['positions'].requires_grad_(True) + x['node_attrs'].requires_grad_(True) + q = self.input_model(x) grad_outputs = torch.ones_like(q) - grads = torch.autograd.grad(q, x, grad_outputs, retain_graph=True)[0] + + if isinstance(self.input_model.nn, BaseGNN): + grads = torch.autograd.grad(q, x['positions'], grad_outputs, retain_graph=True)[0] + + elif isinstance(self.input_model.nn, FeedForward): + grads = torch.autograd.grad(q, x, grad_outputs, retain_graph=True)[0] + grads_squared = torch.sum(torch.pow(grads, 2), 1) - bias = - self.lambd*(1/self.beta)*(torch.log( grads_squared + self.epsilon ) - torch.log(self.epsilon)) + + # gnn models need an additional scatter + if isinstance(self.input_model.nn, BaseGNN): + grads_squared = _code.scatter_sum(grads_squared, + x['batch'], + dim=0) + + print(grads_squared.shape) + + bias = - self.lambd*(1/self.beta)*(torch.log( grads_squared + self.epsilon ) - torch.log(self.epsilon)) + return bias def compute_committor_weights(dataset : DictDataset, @@ -66,23 +90,29 @@ def compute_committor_weights(dataset : DictDataset, ------- Updated dataset with weights and updated labels """ + if len(dataset) != len(bias): + raise ValueError('Dataset and bias have different lenghts!') if bias.isnan().any(): raise(ValueError('Found Nan(s) in bias tensor. Check before proceeding! If no bias was applied replace Nan with zero!')) - n_labels = len(torch.unique(dataset['labels'])) + if dataset.metadata['data_type'] == 'descriptors': + original_labels = dataset['labels'] + else: + original_labels = torch.Tensor([dataset['data_list'][i]['graph_labels'] for i in range(len(dataset))]) + + n_labels = len(torch.unique(original_labels)) if n_labels != len(data_groups): raise(ValueError(f'The number of labels ({n_labels}) and data groups ({len(data_groups)}) do not match! 
Ensure you are correctly mapping the data in your training set!')) - # TODO sign if not from committor bias weights = torch.exp(beta * bias) - new_labels = torch.zeros_like(dataset['labels']) + new_labels = torch.zeros_like(original_labels) data_groups = torch.Tensor(data_groups) # correct data labels according to iteration for j,index in enumerate(data_groups): - new_labels[torch.nonzero(dataset['labels'] == j, as_tuple=True)] = index + new_labels[torch.nonzero(original_labels == j, as_tuple=True)] = index for i in np.unique(data_groups): # compute average of exp(beta*V) on this simualtions @@ -90,10 +120,15 @@ def compute_committor_weights(dataset : DictDataset, # update the weights weights[torch.nonzero(new_labels == i, as_tuple=True)] = coeff * weights[torch.nonzero(new_labels == i, as_tuple=True)] - + # update dataset - dataset['weights'] = weights - dataset['labels'] = new_labels + if dataset.metadata['data_type'] == 'descriptors': + dataset['weights'] = weights + dataset['labels'] = new_labels + else: + for i in range(len(dataset)): + dataset['data_list'][i]['weight'] = weights[i] + dataset['data_list'][i]['graph_labels'] = new_labels[i] return dataset @@ -123,4 +158,72 @@ def initialize_committor_masses(atom_types: list, masses: list): # make it a tensor atomic_masses = torch.Tensor(atomic_masses) - return atomic_masses \ No newline at end of file + return atomic_masses + +def test_Kolmogorov_bias(): + # test on feed forward + from mlcolvar import DeepTDA + model = DeepTDA(n_states=2, + n_cvs=1, + target_centers=[-1,1], + target_sigmas=[0.1, 0.1], + model=[4,2,1]) + inp = torch.randn((10, 4)) + model_bias = KolmogorovBias(input_model=model, beta=1.0) + model_bias(inp) + + # test on GNN + from mlcolvar.core.nn.graph import SchNetModel + from mlcolvar.data.graph.utils import create_test_graph_input + + dataset = create_test_graph_input('dataset') + inp = dataset.get_graph_inputs() + + gnn_model = SchNetModel(n_out=1, + cutoff=0.1, + atomic_numbers=[1,8]) + + model = DeepTDA(n_states=2, + n_cvs=1, + target_centers=[-1,1], + target_sigmas=[0.1, 0.1], + model=gnn_model) + + model_bias = KolmogorovBias(input_model=model, beta=1.0) + model_bias(inp) + + +def test_compute_committor_weights(): + # descriptors + # create dataset + samples = 50 + X = torch.randn((3*samples, 6)) + + # create labels, bias and weights + y = torch.zeros(X.shape[0]) + y[samples:] += 1 + y[int(2*samples):] += 1 + bias = torch.zeros(X.shape[0]) + w = torch.zeros(X.shape[0]) + + # create and edit dataset + dataset = DictDataset({"data": X, "labels": y, "weights": w}) + dataset = compute_committor_weights(dataset=dataset, bias=bias, data_groups=[0,1,2], beta=1.0) + print(dataset) + assert (torch.allclose(dataset['weights'], torch.ones(X.shape[0]))) + + + # graphs + # create dataset + from mlcolvar.data.graph.utils import create_test_graph_input + dataset = create_test_graph_input('dataset', n_states=4, random_weights=True) + bias = torch.zeros(len(dataset)) + dataset = compute_committor_weights(dataset=dataset, bias=bias, data_groups=[0,1,2,3], beta=1) + aux = [] + for i in range(len(dataset)): + aux.append(dataset['data_list'][i]['weight']) + assert (torch.allclose(torch.ones(len(dataset)), torch.Tensor(aux))) + +if __name__ == '__main__': + test_Kolmogorov_bias() + test_compute_committor_weights() \ No newline at end of file diff --git a/mlcolvar/cvs/cv.py b/mlcolvar/cvs/cv.py index 7bfd95c5..0cff165d 100644 --- a/mlcolvar/cvs/cv.py +++ b/mlcolvar/cvs/cv.py @@ -1,5 +1,8 @@ import torch from mlcolvar.core.transform 
import Transform +from typing import Union, List +from mlcolvar.core.nn import FeedForward, BaseGNN +from mlcolvar.data.graph.utils import create_graph_tracing_example class BaseCV: @@ -9,10 +12,12 @@ class BaseCV: To inherit from this class, the class must define a BLOCKS class attribute. """ + DEFAULT_BLOCKS = [] + MODEL_BLOCKS = [] + def __init__( self, - in_features, - out_features, + model: Union[List[int], FeedForward, BaseGNN], preprocessing: torch.nn.Module = None, postprocessing: torch.nn.Module = None, *args, @@ -22,10 +27,6 @@ def __init__( Parameters ---------- - in_features : int - Number of inputs of the CV model - out_features : int - Number of outputs of the CV model, should be the number of CVs preprocessing : torch.nn.Module, optional Preprocessing module, default None postprocessing : torch.nn.Module, optional @@ -35,13 +36,13 @@ def __init__( super().__init__(*args, **kwargs) # The parent class sets in_features and out_features based on their own - # init arguments so we don't need to save them here (see #103). + # init arguments so we don't need to save them here (see #103). + # It is needed for compatibility with multiclass CVs self.save_hyperparameters(ignore=['in_features', 'out_features']) # MODEL + self.parse_model(model=model) self.initialize_blocks() - self.in_features = in_features - self.out_features = out_features # OPTIM self._optimizer_name = "Adam" @@ -59,12 +60,39 @@ def n_cvs(self): @property def example_input_array(self): - return torch.randn( - (1,self.in_features) - if self.preprocessing is None - or not hasattr(self.preprocessing, "in_features") - else self.preprocessing.in_features - ) + if self.in_features is not None: + return torch.randn( + (1,self.in_features) + if self.preprocessing is None + or not hasattr(self.preprocessing, "in_features") + else self.preprocessing.in_features + ) + else: + return create_graph_tracing_example(n_species=len(self.atomic_numbers)) + + + # TODO add general torch.nn.Module + def parse_model(self, model: Union[List[int], FeedForward, BaseGNN]): + if isinstance(model, list): + self.layers = model + self.BLOCKS = self.DEFAULT_BLOCKS + self._override_model = False + self.in_features = self.layers[0] + self.out_features = self.layers[-1] + elif isinstance(model, FeedForward) or isinstance(model, BaseGNN): + self.BLOCKS = self.MODEL_BLOCKS + self._override_model = True + self.in_features = model.in_features + self.out_features = model.out_features + # save buffers for the interface for PLUMED + if isinstance(model, BaseGNN): + self.register_buffer('n_out', model.n_out) + self.register_buffer('cutoff', model.cutoff) + self.register_buffer('atomic_numbers', model.atomic_numbers) + else: + raise ValueError( + f"Keyword model can either accept type list, FeedForward or BaseGNN. Found {type(model)}" + ) def parse_options(self, options: dict = None): """ @@ -78,7 +106,13 @@ def parse_options(self, options: dict = None): """ if options is None: options = {} - + else: + for o in options.keys(): + if o in self.DEFAULT_BLOCKS and self._override_model: + raise ValueError( + "Options on blocks are disabled if a model is provided!" 
+ ) + for b in self.BLOCKS: options.setdefault(b, {}) @@ -225,3 +259,9 @@ def __setattr__(self, key, value): if (key == "loss_fn") and ("cannot assign" in str(e)): del self.loss_fn super().__setattr__(key, value) + + def _setup_graph_data(self, train_batch, key : str='data_list'): + data = train_batch[key] + data['positions'].requires_grad_(True) + data['node_attrs'].requires_grad_(True) + return data \ No newline at end of file diff --git a/mlcolvar/cvs/generator/generator.py b/mlcolvar/cvs/generator/generator.py index 6ce151ea..0d19b94c 100644 --- a/mlcolvar/cvs/generator/generator.py +++ b/mlcolvar/cvs/generator/generator.py @@ -35,7 +35,7 @@ class Generator(BaseCV, lightning.LightningModule): """ - BLOCKS = ["nn"] + DEFAULT_BLOCKS = ["nn"] def __init__(self, r: int, @@ -79,7 +79,7 @@ def __init__(self, Options for the building blocks of the model, by default {}. Available blocks: ['nn'] . """ - super().__init__(in_features=layers[0], out_features=r, **kwargs) + super().__init__(model=layers, **kwargs) # ======= LOSS ======= self.loss_fn = GeneratorLoss(r=r, diff --git a/mlcolvar/cvs/multitask/multitask.py b/mlcolvar/cvs/multitask/multitask.py index 90712122..1be64eaa 100644 --- a/mlcolvar/cvs/multitask/multitask.py +++ b/mlcolvar/cvs/multitask/multitask.py @@ -19,10 +19,11 @@ import torch from mlcolvar.cvs.cv import BaseCV +from mlcolvar.core.nn import BaseGNN # ============================================================================= -# VARIATIONAL AUTOENCODER CV +# MULTITASK CV # ============================================================================= @@ -98,6 +99,10 @@ def __init__( has always coefficient 1). """ + # check if model is GNN, not implemented yet TODO + if hasattr(main_cv, "nn") and isinstance(main_cv.nn, BaseGNN): + raise NotImplementedError('Multitask not supported (yet) for GNN-based CVs') + # This changes dynamically the class of this object to inherit both from # MultiTaskCV and main_cv.__class__ so that we can access all members of # main_cv and still be able to override some of them. diff --git a/mlcolvar/cvs/supervised/deeplda.py b/mlcolvar/cvs/supervised/deeplda.py index 55d949ff..cf754e29 100644 --- a/mlcolvar/cvs/supervised/deeplda.py +++ b/mlcolvar/cvs/supervised/deeplda.py @@ -1,10 +1,12 @@ import torch import lightning from mlcolvar.cvs import BaseCV -from mlcolvar.core import FeedForward, Normalization +from mlcolvar.core import FeedForward, BaseGNN, Normalization from mlcolvar.data import DictModule from mlcolvar.core.stats import LDA from mlcolvar.core.loss import ReduceEigenvaluesLoss +from typing import Union, List + __all__ = ["DeepLDA"] @@ -14,7 +16,9 @@ class DeepLDA(BaseCV, lightning.LightningModule): Non-linear generalization of LDA in which a feature map is learned by a neural network optimized as to maximize the classes separation. The method is described in [1]_. - **Data**: for training it requires a DictDataset with the keys 'data' and 'labels'. + **Data**: for training it requires a DictDataset containing: + - If using descriptors as input, the keys 'data' and 'labels' + - If using graphs as input, `torch_geometric.data` with 'graph_labels' in their 'data_list'. 
**Loss**: maximize LDA eigenvalues (ReduceEigenvaluesLoss) @@ -31,9 +35,14 @@ class DeepLDA(BaseCV, lightning.LightningModule): Eigenvalue reduction to a scalar quantity """ - BLOCKS = ["norm_in", "nn", "lda"] + DEFAULT_BLOCKS = ["norm_in", "nn", "lda"] + MODEL_BLOCKS = ["nn", "lda"] - def __init__(self, layers: list, n_states: int, options: dict = None, **kwargs): + def __init__(self, + model: Union[List[int], FeedForward, BaseGNN], + n_states: int, + options: dict = None, + **kwargs): """ Define a Deep Linear Discriminant Analysis (Deep-LDA) CV composed by a neural network module and a LDA object. @@ -41,8 +50,13 @@ def __init__(self, layers: list, n_states: int, options: dict = None, **kwargs): Parameters ---------- - layers : list - Number of neurons per layer + model : list or FeedForward or BaseGNN + Determines the underlying machine-learning model. One can pass: + 1. A list of integers corresponding to the number of neurons per layer of a feed-forward NN. + The model Will be automatically intialized using a `mlcolvar.core.nn.feedforward.FeedForward` object. + The CV class will be initialized according to the DEFAULT_BLOCKS. + 2. An externally intialized model (either `mlcolvar.core.nn.feedforward.FeedForward` or `mlcolvar.core.nn.graph.BaseGNN` object). + The CV class will be initialized according to the MODEL_BLOCKS. n_states : int Number of states for the training options : dict[str, Any], optional @@ -50,7 +64,8 @@ def __init__(self, layers: list, n_states: int, options: dict = None, **kwargs): Available blocks: ['norm_in','nn','lda'] . Set 'block_name' = None or False to turn off that block """ - super().__init__(in_features=layers[0], out_features=layers[-1], **kwargs) + super().__init__(model=model, **kwargs) + self.save_hyperparameters(ignore=['model']) # ======= LOSS ======= # Maximize the sum of all the LDA eigenvalues. 
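Since `DeepLDA` (like the other supervised CVs below) follows the same `model` convention, the dataset has to match the chosen architecture: plain `DictDataset` entries with 'data' and 'labels' for descriptor inputs, or `torch_geometric` data carrying 'graph_labels' for GNN inputs. A hedged sketch of the two training setups, mirroring `test_deeplda` further down in this patch; the toy graph datamodule comes from the `create_test_graph_input` helper used there.

```python
import torch
import lightning
from mlcolvar.cvs import DeepLDA
from mlcolvar.data import DictDataset, DictModule
from mlcolvar.core.nn.graph import SchNetModel
from mlcolvar.data.graph.utils import create_test_graph_input

n_states = 2

# descriptor inputs: DictDataset with 'data' and 'labels'
X = torch.randn(200, 2)
X[100:] += 1.0                      # shift the second class so LDA has something to separate
y = torch.zeros(200)
y[100:] = 1
datamodule = DictModule(DictDataset({"data": X, "labels": y}), lengths=[0.8, 0.2])
cv = DeepLDA(model=[2, 50, 50, 1], n_states=n_states)
lightning.Trainer(max_epochs=1, logger=None, enable_checkpointing=False).fit(cv, datamodule)

# graph inputs: GNN model + torch_geometric data with 'graph_labels'
cv = DeepLDA(model=SchNetModel(2, 0.1, [1, 8]), n_states=n_states)
graph_datamodule = create_test_graph_input(output_type='datamodule', n_samples=200, n_states=n_states)
lightning.Trainer(max_epochs=1, logger=False, enable_checkpointing=False).fit(cv, graph_datamodule)
```

With a layer list the `norm_in` standardization block is part of `DEFAULT_BLOCKS`; with an external model only the `MODEL_BLOCKS` ('nn' and 'lda') are built, so any input normalization has to live inside the model itself.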
@@ -65,26 +80,31 @@ def __init__(self, layers: list, n_states: int, options: dict = None, **kwargs): # ======= BLOCKS ======= - # initialize norm_in - o = "norm_in" - if (options[o] is not False) and (options[o] is not None): - self.norm_in = Normalization(self.in_features, **options[o]) + if not self._override_model: + # initialize norm_in + o = "norm_in" + if (options[o] is not False) and (options[o] is not None): + self.norm_in = Normalization(self.in_features, **options[o]) + + # initialize nn + o = "nn" + self.nn = FeedForward(self.layers, **options[o]) - # initialize nn - o = "nn" - self.nn = FeedForward(layers, **options[o]) + elif self._override_model: + self.nn = model # initialize lda o = "lda" - self.lda = LDA(layers[-1], n_states, **options[o]) + self.lda = LDA(self.nn.out_features, n_states, **options[o]) # regularization self.lorentzian_reg = 40 # == 2/sw_reg, see set_regularization self.set_regularization(sw_reg=0.05) def forward_nn(self, x: torch.Tensor) -> torch.Tensor: - if self.norm_in is not None: - x = self.norm_in(x) + if not self._override_model: + if self.norm_in is not None: + x = self.norm_in(x) x = self.nn(x) return x @@ -137,13 +157,19 @@ def regularization_lorentzian(self, x): def training_step(self, train_batch, batch_idx): """Compute and return the training loss and record metrics.""" # =================get data=================== - x = train_batch["data"] - y = train_batch["labels"] + if isinstance(self.nn, FeedForward): + x = train_batch["data"] + labels = train_batch["labels"] + elif isinstance(self.nn, BaseGNN): + x = self._setup_graph_data(train_batch) + labels = x['graph_labels'].squeeze() + # =================forward==================== h = self.forward_nn(x) + # ===================lda====================== eigvals, _ = self.lda.compute( - h, y, save_params=True if self.training else False + h, labels, save_params=True if self.training else False ) # ===================loss===================== loss = self.loss_fn(eigvals) @@ -151,6 +177,7 @@ def training_step(self, train_batch, batch_idx): s = self.lda(h) lorentzian_reg = self.regularization_lorentzian(s) loss += lorentzian_reg + # ====================log===================== name = "train" if self.training else "valid" loss_dict = {f"{name}_loss": loss, f"{name}_lorentzian_reg": lorentzian_reg} @@ -164,7 +191,7 @@ def test_deeplda(n_states=2): in_features, out_features = 2, n_states - 1 layers = [in_features, 50, 50, out_features] - + # create dataset n_points = 500 X, y = [], [] @@ -187,6 +214,9 @@ def test_deeplda(n_states=2): "nn": {"activation": "relu"}, "lda": {}, } + print() + print('NORMAL') + print() model = DeepLDA(layers, n_states, options=opts) # create trainer and fit @@ -200,6 +230,55 @@ def test_deeplda(n_states=2): with torch.no_grad(): s = model(X).numpy() + + # feedforward external + print() + print('EXTERNAL') + print() + ff_model = FeedForward(layers=layers) + model = DeepLDA(ff_model, n_states) + + # create trainer and fit + trainer = lightning.Trainer( + max_epochs=1, log_every_n_steps=2, logger=None, enable_checkpointing=False + ) + trainer.fit(model, datamodule) + + # eval + model.eval() + with torch.no_grad(): + s = model(X).numpy() + print(s) + + # gnn external + print() + print('GNN') + print() + from mlcolvar.core.nn.graph.schnet import SchNetModel + from mlcolvar.data.graph.utils import create_test_graph_input + gnn_model = SchNetModel(2, 0.1, [1, 8]) + model = DeepLDA(gnn_model, n_states) + + datamodule = create_test_graph_input(output_type='datamodule', n_samples=200, 
n_states=n_states) + + # create trainer and fit + trainer = lightning.Trainer( + max_epochs=1, log_every_n_steps=2, logger=False, enable_checkpointing=False, enable_model_summary=False + ) + trainer.fit(model, datamodule) + + traced_model = model.to_torchscript( + file_path=None, method="trace") + + example_input_graph_test = create_test_graph_input(output_type='example', n_atoms=4, n_samples=3, n_states=n_states) + assert torch.allclose(model(example_input_graph_test), traced_model(example_input_graph_test)) + + # eval + model.eval() + with torch.no_grad(): + s = model(example_input_graph_test).numpy() + print(s) + if __name__ == "__main__": test_deeplda(n_states=2) diff --git a/mlcolvar/cvs/supervised/deeptda.py b/mlcolvar/cvs/supervised/deeptda.py index e58f215c..1dfeb321 100644 --- a/mlcolvar/cvs/supervised/deeptda.py +++ b/mlcolvar/cvs/supervised/deeptda.py @@ -1,35 +1,36 @@ import torch import lightning from mlcolvar.cvs import BaseCV -from mlcolvar.core import FeedForward, Normalization +from mlcolvar.core import FeedForward, BaseGNN, Normalization from mlcolvar.core.loss import TDALoss from mlcolvar.data import DictModule +from typing import Union, List __all__ = ["DeepTDA"] - class DeepTDA(BaseCV, lightning.LightningModule): """ Deep Targeted Discriminant Analysis (Deep-TDA) CV. Combine the inputs with a neural-network and optimize it in a way such that the data are distributed accordingly to a mixture of Gaussians. The method is described in [1]_. - - **Data**: for training it requires a DictDataset with the keys 'data' and 'labels'. - + **Data**: for training it requires a DictDataset containing: + - If using descriptors as input, the keys 'data' and 'labels'. + - If using graphs as input, `torch_geometric.data` with 'graph_labels' in their 'data_list'. + **Loss**: distance of the samples of each class from a set of Gaussians (TDALoss) - References ---------- .. [1] E. Trizio and M. Parrinello, "From enhanced sampling to reaction profiles", The Journal of Physical Chemistry Letters 12, 8621– 8626 (2021). - See also -------- mlcolvar.core.loss.TDALoss Distance from a simple Gaussian target distribution. """ - BLOCKS = ["norm_in", "nn"] + DEFAULT_BLOCKS = ["norm_in", "nn"] + MODEL_BLOCKS = ["nn"] + # TODO n_states optional? def __init__( @@ -38,14 +39,13 @@ def __init__( n_cvs: int, target_centers: list, target_sigmas: list, - layers: list, + model: Union[List[int], FeedForward, BaseGNN], options: dict = None, **kwargs, ): """ Define Deep Targeted Discriminant Analysis (Deep-TDA) CV composed by a neural network module. By default a module standardizing the inputs is also used. - Parameters ---------- n_states : int @@ -56,15 +56,20 @@ def __init__( Centers of the Gaussian targets target_sigmas : list Standard deviations of the Gaussian targets - layers : list - Number of neurons per layer + model : list or FeedForward or BaseGNN + Determines the underlying machine-learning model. One can pass: + 1. A list of integers corresponding to the number of neurons per layer of a feed-forward NN. + The model Will be automatically intialized using a `mlcolvar.core.nn.feedforward.FeedForward` object. + The CV class will be initialized according to the DEFAULT_BLOCKS. + 2. An externally intialized model (either `mlcolvar.core.nn.feedforward.FeedForward` or `mlcolvar.core.nn.graph.BaseGNN` object). + The CV class will be initialized according to the MODEL_BLOCKS. options : dict[str, Any], optional Options for the building blocks of the model, by default {}. Available blocks: ['norm_in', 'nn']. 
Set 'block_name' = None or False to turn off that block - """ - - super().__init__(in_features=layers[0], out_features=layers[-1], **kwargs) + """ + super().__init__(model, **kwargs) + self.save_hyperparameters(ignore=['model']) # ======= LOSS ======= self.loss_fn = TDALoss( @@ -106,43 +111,54 @@ def __init__( ) # ======= BLOCKS ======= - - # Initialize norm_in - o = "norm_in" - if (options[o] is not False) and (options[o] is not None): - self.norm_in = Normalization(self.in_features, **options[o]) - - # initialize NN - o = "nn" - self.nn = FeedForward(layers, **options[o]) - - def training_step(self, train_batch, batch_idx): + if not self._override_model: + # Initialize norm_in + o = "norm_in" + if (options[o] is not False) and (options[o] is not None): + self.norm_in = Normalization(self.in_features, **options[o]) + + # initialize NN + o = "nn" + self.nn = FeedForward(self.layers, **options[o]) + elif self._override_model: + self.nn = model + + def training_step(self, train_batch, *args, **kwargs) -> torch.Tensor: """Compute and return the training loss and record metrics.""" # =================get data=================== - x = train_batch["data"] - labels = train_batch["labels"] + if isinstance(self.nn, FeedForward): + x = train_batch["data"] + labels = train_batch["labels"] + elif isinstance(self.nn, BaseGNN): + x = self._setup_graph_data(train_batch) + labels = x['graph_labels'].squeeze() + # =================forward==================== z = self.forward_cv(x) + # ===================loss===================== - loss, loss_centers, loss_sigmas = self.loss_fn( - z, labels, return_loss_terms=True - ) - # ====================log=====================+ + loss, loss_centers, loss_sigmas = self.loss_fn(z, + labels, + return_loss_terms=True + ) + + # ====================log===================== name = "train" if self.training else "valid" self.log(f"{name}_loss", loss, on_epoch=True) self.log(f"{name}_loss_centers", loss_centers, on_epoch=True) self.log(f"{name}_loss_sigmas", loss_sigmas, on_epoch=True) + return loss -# TODO signature of tests? 
import numpy as np - def test_deeptda_cv(): from mlcolvar.data import DictDataset + # feedforward with layers for states_and_cvs in [[2, 1], [3, 1], [3, 2], [5, 4]]: + print(states_and_cvs) # get the number of states and cvs for the test run n_states = states_and_cvs[0] n_cvs = states_and_cvs[1] @@ -155,18 +171,18 @@ def test_deeptda_cv(): # test initialize via dictionary options = {"nn": {"activation": "relu"}} + print() + print('NORMAL') + print() model = DeepTDA( n_states=n_states, n_cvs=n_cvs, target_centers=target_centers, target_sigmas=target_sigmas, - layers=layers, + model=layers, options=options, ) - print("----------") - print(model) - # create dataset samples = 100 X = torch.randn((samples * n_states, 2)) @@ -180,17 +196,71 @@ def test_deeptda_cv(): datamodule = DictModule(dataset, lengths=[0.75, 0.2, 0.05], batch_size=samples) # train model trainer = lightning.Trainer( - accelerator="cpu", max_epochs=2, logger=None, enable_checkpointing=False + accelerator="cpu", max_epochs=2, logger=None, enable_checkpointing=False, enable_model_summary=False ) trainer.fit(model, datamodule) # trace model traced_model = model.to_torchscript( - file_path=None, method="trace", example_inputs=X[0] + file_path=None, method="trace") + model.eval() + assert torch.allclose(model(X), traced_model(X)) + + print() + print('EXTERNAL FEEDFORWARD') + print() + # feedforward external + ff_model = FeedForward(layers=layers) + model = DeepTDA( + n_states=n_states, + n_cvs=n_cvs, + target_centers=target_centers, + target_sigmas=target_sigmas, + model=ff_model + ) + + # train model + trainer = lightning.Trainer( + accelerator="cpu", max_epochs=2, logger=None, enable_checkpointing=False, enable_model_summary=False ) + trainer.fit(model, datamodule) + + # trace model + traced_model = model.to_torchscript( + file_path=None, method="trace") model.eval() assert torch.allclose(model(X), traced_model(X)) + print() + print('EXTERNAL GNN') + print() + # gnn external + from mlcolvar.core.nn.graph.schnet import SchNetModel + from mlcolvar.data.graph.utils import create_test_graph_input + gnn_model = SchNetModel(n_cvs, 0.1, [1, 8]) + model = DeepTDA( + n_states=n_states, + n_cvs=n_cvs, + target_centers=target_centers, + target_sigmas=target_sigmas, + model=gnn_model + ) + datamodule = create_test_graph_input(output_type='datamodule', n_samples=100, n_states=n_states) + + # train model + trainer = lightning.Trainer( + accelerator="cpu", max_epochs=2, logger=False, enable_checkpointing=False, enable_model_summary=False + ) + trainer.fit(model, datamodule) + + # trace model + traced_model = model.to_torchscript( + file_path=None, method="trace") + + # check on a different number of atoms + example_input_graph_test = create_test_graph_input(output_type='example', n_atoms=4, n_samples=3, n_states=2) + assert torch.allclose(model(example_input_graph_test), traced_model(example_input_graph_test)) if __name__ == "__main__": test_deeptda_cv() + diff --git a/mlcolvar/cvs/supervised/regression.py b/mlcolvar/cvs/supervised/regression.py index 1ea52331..558db234 100644 --- a/mlcolvar/cvs/supervised/regression.py +++ b/mlcolvar/cvs/supervised/regression.py @@ -1,8 +1,10 @@ import torch import lightning from mlcolvar.cvs import BaseCV -from mlcolvar.core import FeedForward, Normalization +from mlcolvar.core import FeedForward, Normalization, BaseGNN from mlcolvar.core.loss import MSELoss +from typing import Union, List + __all__ = ["RegressionCV"] @@ -12,8 +14,10 @@ class RegressionCV(BaseCV, lightning.LightningModule): Example of 
collective variable obtained with a regression task. Combine the inputs with a neural-network and optimize it to match a target function. - **Data**: for training it requires a DictDataset with the keys 'data' and 'target' and optionally 'weights'. - + **Data**: for training it requires a DictDataset containing: + - If using descriptors as input, the keys 'data', 'target' and optionally 'weights'. + - If using graphs as input, `torch_geometric.data` with 'graph_labels' with the target values and 'weight' in their 'data_list'. + **Loss**: least squares (MSELoss). See also @@ -22,22 +26,34 @@ class RegressionCV(BaseCV, lightning.LightningModule): (weighted) Mean Squared Error (MSE) loss function. """ - BLOCKS = ["norm_in", "nn"] + DEFAULT_BLOCKS = ["norm_in", "nn"] + MODEL_BLOCKS = ["nn"] - def __init__(self, layers: list, options: dict = None, **kwargs): + + def __init__(self, + model: Union[List[int], FeedForward, BaseGNN], + options: dict = None, + **kwargs): """Example of collective variable obtained with a regression task. By default a module standardizing the inputs is used. Parameters ---------- - layers : list - Number of neurons per layer + model : list or FeedForward or BaseGNN + Determines the underlying machine-learning model. One can pass: + 1. A list of integers corresponding to the number of neurons per layer of a feed-forward NN. + The model Will be automatically intialized using a `mlcolvar.core.nn.feedforward.FeedForward` object. + The CV class will be initialized according to the DEFAULT_BLOCKS. + 2. An externally intialized model (either `mlcolvar.core.nn.feedforward.FeedForward` or `mlcolvar.core.nn.graph.BaseGNN` object). + The CV class will be initialized according to the MODEL_BLOCKS. options : dict[str, Any], optional Options for the building blocks of the model, by default None. Available blocks: ['norm_in', 'nn']. Set 'block_name' = None or False to turn off that block """ - super().__init__(in_features=layers[0], out_features=layers[-1], **kwargs) + super().__init__(model, **kwargs) + self.save_hyperparameters(ignore=['model']) + # ======= LOSS ======= self.loss_fn = MSELoss() @@ -46,25 +62,38 @@ def __init__(self, layers: list, options: dict = None, **kwargs): # parse and sanitize options = self.parse_options(options) - # Initialize norm_in - o = "norm_in" - if (options[o] is not False) and (options[o] is not None): - self.norm_in = Normalization(self.in_features, **options[o]) + # ======= BLOCKS ======= + if not self._override_model: + # Initialize norm_in + o = "norm_in" + if (options[o] is not False) and (options[o] is not None): + self.norm_in = Normalization(self.in_features, **options[o]) - # initialize NN - o = "nn" - self.nn = FeedForward(layers, **options[o]) + # initialize NN + o = "nn" + self.nn = FeedForward(self.layers, **options[o]) + elif self._override_model: + self.nn = model def training_step(self, train_batch, batch_idx): """Compute and return the training loss and record metrics.""" # =================get data=================== - x = train_batch["data"] - labels = train_batch["target"] loss_kwargs = {} - if "weights" in train_batch: - loss_kwargs["weights"] = train_batch["weights"] + if isinstance(self.nn, FeedForward): + x = train_batch["data"] + labels = train_batch["target"] + if "weights" in train_batch: + loss_kwargs["weights"] = train_batch["weights"] + elif isinstance(self.nn, BaseGNN): + x = self._setup_graph_data(train_batch) + # TODO maybe add an external key like target? 
+ labels = x['graph_labels'].squeeze() + if "weights" in x: + loss_kwargs["weights"] = x["weights"] + # =================forward==================== y = self.forward_cv(x) + # ===================loss===================== loss = self.loss_fn(y, labels, **loss_kwargs) # ====================log===================== @@ -82,10 +111,13 @@ def test_regression_cv(): in_features, out_features = 2, 1 layers = [in_features, 5, 10, out_features] + print() + print('NORMAL') + print() # initialize via dictionary options = {"nn": {"activation": "relu"}} - model = RegressionCV(layers=layers, options=options) + model = RegressionCV(model=layers, options=options) print("----------") print(model) @@ -123,7 +155,89 @@ def test_regression_cv(): accelerator="cpu", max_epochs=1, logger=None, enable_checkpointing=False ) - model = RegressionCV(layers=[2, 10, 10, 1]) + model = RegressionCV(model=[2, 10, 10, 1]) + model.loss_fn = lambda y, y_ref: (y - y_ref).abs().mean() + trainer.fit(model, datamodule) + + print() + print('EXTERNAL FEEDFORWARD') + print() + ff_model = FeedForward(layers=layers) + # create model + model = RegressionCV(model=ff_model) + + # create dataset + X = torch.randn((100, 2)) + y = X.square().sum(1) + dataset = DictDataset({"data": X, "target": y}) + datamodule = DictModule(dataset, lengths=[0.75, 0.2, 0.05], batch_size=25) + # train model + model.optimizer_name = "SGD" + model.optimizer_kwargs.update(dict(lr=1e-2)) + trainer = lightning.Trainer( + accelerator="cpu", max_epochs=1, logger=None, enable_checkpointing=False + ) + trainer.fit(model, datamodule) + model.eval() + # trace model + traced_model = model.to_torchscript( + file_path=None, method="trace", example_inputs=X[0] + ) + assert torch.allclose(model(X), traced_model(X)) + + # weighted loss + print("weighted loss") + w = torch.randn((100)) + dataset_weights = DictDataset({"data": X, "target": y, "weights": w}) + datamodule_weights = DictModule( + dataset_weights, lengths=[0.75, 0.2, 0.05], batch_size=25 + ) + trainer.fit(model, datamodule_weights) + + # use custom loss + print("custom loss") + trainer = lightning.Trainer( + accelerator="cpu", max_epochs=1, logger=None, enable_checkpointing=False + ) + + model = RegressionCV(model=ff_model) + model.loss_fn = lambda y, y_ref: (y - y_ref).abs().mean() + trainer.fit(model, datamodule) + + print() + print('EXTERNAL GNN') + print() + # gnn external + from mlcolvar.core.nn.graph.schnet import SchNetModel + from mlcolvar.data.graph.utils import create_test_graph_input + gnn_model = SchNetModel(1, 0.1, [1, 8]) + # create model + model = RegressionCV(model=gnn_model) + + datamodule = create_test_graph_input(output_type='datamodule', n_samples=100, n_states=2) + # train model + trainer = lightning.Trainer( + accelerator="cpu", max_epochs=1, logger=False, enable_checkpointing=False, enable_model_summary=False + ) + trainer.fit(model, datamodule) + model.eval() + # trace model + traced_model = model.to_torchscript(file_path=None, method="trace") + example_input_graph_test = create_test_graph_input(output_type='example', n_atoms=4, n_samples=3, n_states=2) + assert torch.allclose(model(example_input_graph_test), traced_model(example_input_graph_test)) + + # weighted loss + print("weighted loss") + datamodule_weights = create_test_graph_input(output_type='datamodule', n_samples=100, n_states=2, random_weights=True) + trainer.fit(model, datamodule_weights) + + # use custom loss + print("custom loss") + trainer = lightning.Trainer( + accelerator="cpu", max_epochs=1, logger=False, 
enable_checkpointing=False, enable_model_summary=False + ) + + model = RegressionCV(model=gnn_model) model.loss_fn = lambda y, y_ref: (y - y_ref).abs().mean() trainer.fit(model, datamodule) diff --git a/mlcolvar/cvs/timelagged/deeptica.py b/mlcolvar/cvs/timelagged/deeptica.py index b95c7547..428c7b41 100644 --- a/mlcolvar/cvs/timelagged/deeptica.py +++ b/mlcolvar/cvs/timelagged/deeptica.py @@ -1,9 +1,10 @@ import torch import lightning from mlcolvar.cvs import BaseCV -from mlcolvar.core import FeedForward, Normalization +from mlcolvar.core import FeedForward, BaseGNN, Normalization from mlcolvar.core.stats import TICA from mlcolvar.core.loss import ReduceEigenvaluesLoss +from typing import Union, List __all__ = ["DeepTICA"] @@ -16,10 +17,12 @@ class DeepTICA(BaseCV, lightning.LightningModule): approximated by TICA. The method is described in [1]_. Note that from the point of view of the architecture DeepTICA is similar to the SRV [2] method. - **Data**: for training it requires a DictDataset with the keys 'data' (input at time t) - and 'data_lag' (input at time t+lag), as well as the corresponding 'weights' and - 'weights_lag' which will be used to weight the time correlation functions. - This can be created with the helper function `create_timelagged_dataset`. + **Data**: for training it requires a DictDataset containing: + - If using descriptors as input, the keys 'data' (input at time t) + and 'data_lag' (input at time t+lag), as well as the corresponding 'weights' and + 'weights_lag' which will be used to weight the time correlation functions. + - If using graphs as input, the keys 'data_list' and 'data_list_lag', each containing the respective 'weight' + This can be created in both cases with the helper function `create_timelagged_dataset`. **Loss**: maximize TICA eigenvalues (ReduceEigenvaluesLoss) @@ -40,17 +43,26 @@ class DeepTICA(BaseCV, lightning.LightningModule): Create dataset of time-lagged data. """ - BLOCKS = ["norm_in", "nn", "tica"] + DEFAULT_BLOCKS = ["norm_in", "nn", "tica"] + MODEL_BLOCKS = ["nn", "tica"] - def __init__(self, layers: list, n_cvs: int = None, options: dict = None, **kwargs): + def __init__(self, + model: Union[List[int], FeedForward, BaseGNN], + n_cvs: int = None, + options: dict = None, **kwargs): """ Define a Deep-TICA CV, composed of a neural network module and a TICA object. By default a module standardizing the inputs is also used. Parameters ---------- - layers : list - Number of neurons per layer + model : list or FeedForward or BaseGNN + Determines the underlying machine-learning model. One can pass: + 1. A list of integers corresponding to the number of neurons per layer of a feed-forward NN. + The model Will be automatically intialized using a `mlcolvar.core.nn.feedforward.FeedForward` object. + The CV class will be initialized according to the DEFAULT_BLOCKS. + 2. An externally intialized model (either `mlcolvar.core.nn.feedforward.FeedForward` or `mlcolvar.core.nn.graph.BaseGNN` object). + The CV class will be initialized according to the MODEL_BLOCKS. n_cvs : int, optional Number of cvs to optimize, default None (= last layer) options : dict[str, Any], optional @@ -58,15 +70,13 @@ def __init__(self, layers: list, n_cvs: int = None, options: dict = None, **kwar Available blocks: ['norm_in','nn','tica']. 
Set 'block_name' = None or False to turn off that block """ - super().__init__( - in_features=layers[0], - out_features=n_cvs if n_cvs is not None else layers[-1], - **kwargs, - ) + super().__init__(model, **kwargs) # ======= LOSS ======= # Maximize the squared sum of all the TICA eigenvalues. self.loss_fn = ReduceEigenvaluesLoss(mode="sum2") + # here we need to override the self.out_features attribute + self.out_features = n_cvs # ======= OPTIONS ======= # parse and sanitize @@ -74,22 +84,27 @@ def __init__(self, layers: list, n_cvs: int = None, options: dict = None, **kwar # ======= BLOCKS ======= - # initialize norm_in - o = "norm_in" - if (options[o] is not False) and (options[o] is not None): - self.norm_in = Normalization(self.in_features, **options[o]) + if not self._override_model: + # initialize norm_in + o = "norm_in" + if (options[o] is not False) and (options[o] is not None): + self.norm_in = Normalization(self.in_features, **options[o]) - # initialize nn - o = "nn" - self.nn = FeedForward(layers, **options[o]) + # initialize nn + o = "nn" + self.nn = FeedForward(self.layers, **options[o]) + + elif self._override_model: + self.nn = model - # initialize lda + # initialize tica o = "tica" - self.tica = TICA(layers[-1], n_cvs, **options[o]) + self.tica = TICA(self.nn.out_features, n_cvs, **options[o]) def forward_nn(self, x: torch.Tensor) -> torch.Tensor: - if self.norm_in is not None: - x = self.norm_in(x) + if not self._override_model: + if self.norm_in is not None: + x = self.norm_in(x) x = self.nn(x) return x @@ -111,10 +126,17 @@ def training_step(self, train_batch, batch_idx): 3) Compute TICA """ # =================get data=================== - x_t = train_batch["data"] - x_lag = train_batch["data_lag"] - w_t = train_batch["weights"] - w_lag = train_batch["weights_lag"] + if isinstance(self.nn, FeedForward): + x_t = train_batch["data"] + x_lag = train_batch["data_lag"] + w_t = train_batch["weights"] + w_lag = train_batch["weights_lag"] + elif isinstance(self.nn, BaseGNN): + x_t = self._setup_graph_data(train_batch, key='data_list') + x_lag = self._setup_graph_data(train_batch, key='data_list_lag') + w_t = x_t['weight'] + w_lag = x_lag['weight'] + # =================forward==================== f_t = self.forward_nn(x_t) f_lag = self.forward_nn(x_lag) @@ -139,12 +161,15 @@ def test_deep_tica(): from mlcolvar.utils.timelagged import create_timelagged_dataset # create dataset - X = np.loadtxt("mlcolvar/tests/data/mb-mcmc.dat") - X = torch.Tensor(X) + # X = np.loadtxt("mlcolvar/tests/data/mb-mcmc.dat") + X = torch.randn((10000, 2)) dataset = create_timelagged_dataset(X, lag_time=1) datamodule = DictModule(dataset, batch_size=10000) # create cv + print() + print('NORMAL') + print() layers = [2, 10, 10, 2] model = DeepTICA(layers, n_cvs=1) @@ -163,5 +188,55 @@ def test_deep_tica(): print(X.shape, "-->", s.shape) + print() + print('EXTERNAL') + print() + ff_model = FeedForward(layers=layers) + model = DeepTICA(ff_model, n_cvs=1) + + # change loss options + model.loss_fn.mode = "sum2" + + # create trainer and fit + trainer = lightning.Trainer( + max_epochs=1, log_every_n_steps=2, logger=None, enable_checkpointing=False + ) + trainer.fit(model, datamodule) + + model.eval() + with torch.no_grad(): + s = model(X).numpy() + print(X.shape, "-->", s.shape) + + + # gnn external + print() + print('GNN') + print() + from mlcolvar.core.nn.graph.schnet import SchNetModel + from mlcolvar.data.graph.utils import create_test_graph_input + gnn_model = SchNetModel(2, 0.1, [1, 8]) + model = 
DeepTICA(gnn_model, n_cvs=1) + + # change loss options + model.loss_fn.mode = "sum2" + + # create trainer and fit + trainer = lightning.Trainer( + max_epochs=1, log_every_n_steps=2, logger=False, enable_checkpointing=False, enable_model_summary=False, + ) + + dataset = create_test_graph_input(output_type='dataset', n_samples=200, n_states=2) + lagged_dataset = create_timelagged_dataset(dataset, logweights=torch.randn(len(dataset))) + + datamodule = DictModule(dataset=lagged_dataset) + trainer.fit(model, datamodule) + + model.eval() + with torch.no_grad(): + example_input_graph_test = create_test_graph_input(output_type='example', n_atoms=4, n_samples=3, n_states=2) + s = model(example_input_graph_test).numpy() + print(X.shape, "-->", s.shape) + if __name__ == "__main__": test_deep_tica() diff --git a/mlcolvar/cvs/unsupervised/autoencoder.py b/mlcolvar/cvs/unsupervised/autoencoder.py index bb9839bb..ac8a7741 100644 --- a/mlcolvar/cvs/unsupervised/autoencoder.py +++ b/mlcolvar/cvs/unsupervised/autoencoder.py @@ -41,7 +41,7 @@ class AutoEncoderCV(BaseCV, lightning.LightningModule): (weighted) Mean Squared Error (MSE) loss function. """ - BLOCKS = ["norm_in", "encoder", "decoder"] + DEFAULT_BLOCKS = ["norm_in", "encoder", "decoder"] def __init__( self, @@ -67,9 +67,16 @@ def __init__( Available blocks: ['norm_in', 'encoder','decoder']. Set 'block_name' = None or False to turn off that block """ - super().__init__( - in_features=encoder_layers[0], out_features=encoder_layers[-1], **kwargs - ) + # external model not supported for autoencoder CVs (yet) + if not isinstance(encoder_layers, list): + raise NotImplementedError( + f"Encoder layer must be a list. found {type(encoder_layers)}" + ) + super().__init__(model=encoder_layers, **kwargs) + # this makes checkpointing safe, to avoid double model keys + self.save_hyperparameters(ignore=['model']) + self.hparams.pop('model') + # ======= LOSS ======= # Reconstruction (MSE) loss diff --git a/mlcolvar/cvs/unsupervised/vae.py b/mlcolvar/cvs/unsupervised/vae.py index 7ff297b4..a530b469 100644 --- a/mlcolvar/cvs/unsupervised/vae.py +++ b/mlcolvar/cvs/unsupervised/vae.py @@ -58,7 +58,7 @@ class VariationalAutoEncoderCV(BaseCV, lightning.LightningModule): Evidence Lower BOund loss function """ - BLOCKS = ["norm_in", "encoder", "decoder"] + DEFAULT_BLOCKS = ["norm_in", "encoder", "decoder"] def __init__( self, @@ -90,7 +90,17 @@ def __init__( Set ``'block_name' = None`` or ``False`` to turn off a block. Encoder and decoder cannot be turned off. """ - super().__init__(in_features=encoder_layers[0], out_features=n_cvs, **kwargs) + if not isinstance(encoder_layers, list): + raise NotImplementedError( + f"Encoder layer must be a list. found {type(encoder_layers)}" + ) + super().__init__(model=encoder_layers, **kwargs) + # this makes checkpointing safe, to avoid double model keys + self.save_hyperparameters(ignore=['model']) + self.hparams.pop('model') + + # here we need to override the self.out_features attribute + self.out_features = n_cvs # ======= LOSS ======= # ELBO loss function when latent space and reconstruction distributions are Gaussians. 
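The time-lagged CVs adopt the same pattern: `DeepTICA` accepts a GNN as `model`, and `create_timelagged_dataset` pairs graph configurations into 'data_list'/'data_list_lag' entries that `DictModule` knows how to load. The sketch below is assembled from the DeepTICA test above; the toy graph dataset again comes from `create_test_graph_input`, and the random `logweights` only exercise the reweighting path. Note also that, as the `autoencoder.py` and `vae.py` hunks above make explicit, autoencoder-based CVs still accept only a plain list of encoder layers and raise `NotImplementedError` for external models.

```python
import torch
import lightning
from mlcolvar.cvs import DeepTICA
from mlcolvar.data import DictModule
from mlcolvar.core.nn.graph import SchNetModel
from mlcolvar.data.graph.utils import create_test_graph_input
from mlcolvar.utils.timelagged import create_timelagged_dataset

# toy graph dataset and its time-lagged version ('data_list' / 'data_list_lag')
dataset = create_test_graph_input(output_type='dataset', n_samples=200, n_states=2)
lagged = create_timelagged_dataset(dataset, logweights=torch.randn(len(dataset)))

cv = DeepTICA(model=SchNetModel(2, 0.1, [1, 8]), n_cvs=1)
trainer = lightning.Trainer(max_epochs=1, logger=False, enable_checkpointing=False)
trainer.fit(cv, DictModule(dataset=lagged))
```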
diff --git a/mlcolvar/data/dataloader.py b/mlcolvar/data/dataloader.py index 7ac18861..cc2c1ed4 100644 --- a/mlcolvar/data/dataloader.py +++ b/mlcolvar/data/dataloader.py @@ -223,7 +223,7 @@ def set_dataset_and_batch_size( self._dataset = old_dataset self._batch_size = old_batch_size raise ValueError( - f"batch_size (length {batch_size_len}) must have length equal to the number of datasets (length {len(self.dataset)}." + f"batch_size (length {len(self._batch_size)} must have length equal to the number of datasets (length {len(self.dataset)}." ) # The number of batches per epoch must be the same for all datasets. diff --git a/mlcolvar/data/datamodule.py b/mlcolvar/data/datamodule.py index 4323b211..2bf26fae 100644 --- a/mlcolvar/data/datamodule.py +++ b/mlcolvar/data/datamodule.py @@ -20,14 +20,14 @@ import warnings import torch +import torch_geometric import numpy as np import lightning -from torch.utils.data import random_split, Subset +from torch.utils.data import Subset from torch import default_generator, randperm from mlcolvar.data import DictLoader, DictDataset - # ============================================================================= # DICTIONARY DATAMODULE CLASS # ============================================================================= @@ -122,7 +122,10 @@ def __init__( """ super().__init__() self.dataset = dataset - self.lengths = lengths + self.DataLoader = self._get_dataloader() + + self._lengths = lengths + # Keeping this private for now. Changing it at runtime would # require changing dataset_split and the dataloaders. self._random_split = random_split @@ -135,6 +138,9 @@ def __init__( ) # Make sure batch_size and shuffle are lists. + + if self._dataset_type == 'graphs' and batch_size == 0: + batch_size = len(dataset) # make this explicit for torch_geometric if isinstance(batch_size, int): self.batch_size = [batch_size for _ in lengths] else: @@ -152,6 +158,25 @@ def __init__( self.valid_loader = None self.test_loader = None + @property + def _dataset_type(self): + if not isinstance(self.dataset, list): + _dataset_type = self.dataset.metadata['data_type'] + else: + it = iter(list(self.dataset)) + _dataset_type = next(it).metadata['data_type'] + if not all(d.metadata['data_type'] for d in it): + raise ValueError("not all the dataset are of the same type!") + return _dataset_type + + def _get_dataloader(self): + # decide which loader to use + if self._dataset_type == 'descriptors': + DataLoader = DictLoader + elif self._dataset_type == 'graphs': + DataLoader = torch_geometric.loader.DataLoader + return DataLoader + def setup(self, stage: Optional[str] = None): if self._dataset_split is None: if isinstance(self.dataset, DictDataset): @@ -165,7 +190,7 @@ def train_dataloader(self): """Return training dataloader.""" self._check_setup() if self.train_loader is None: - self.train_loader = DictLoader( + self.train_loader = self.DataLoader( self._dataset_split[0], batch_size=self.batch_size[0], shuffle=self.shuffle[0], @@ -175,12 +200,12 @@ def train_dataloader(self): def val_dataloader(self): """Return validation dataloader.""" self._check_setup() - if len(self.lengths) < 2: + if len(self._lengths) < 2: raise NotImplementedError( "Validation dataset not available, you need to pass two lengths to datamodule." 
) if self.valid_loader is None: - self.valid_loader = DictLoader( + self.valid_loader = self.DataLoader( self._dataset_split[1], batch_size=self.batch_size[1], shuffle=self.shuffle[1], @@ -190,12 +215,12 @@ def val_dataloader(self): def test_dataloader(self): """Return test dataloader.""" self._check_setup() - if len(self.lengths) < 3: + if len(self._lengths) < 3: raise NotImplementedError( "Test dataset not available, you need to pass three lengths to datamodule." ) if self.test_loader is None: - self.test_loader = DictLoader( + self.test_loader = self.DataLoader( self._dataset_split[2], batch_size=self.batch_size[2], shuffle=self.shuffle[2], @@ -210,11 +235,11 @@ def teardown(self, stage: str): def __repr__(self) -> str: string = f"DictModule(dataset -> {self.dataset.__repr__()}" - string += f",\n\t\t train_loader -> DictLoader(length={self.lengths[0]}, batch_size={self.batch_size[0]}, shuffle={self.shuffle[0]})" - if len(self.lengths) >= 2: - string += f",\n\t\t valid_loader -> DictLoader(length={self.lengths[1]}, batch_size={self.batch_size[1]}, shuffle={self.shuffle[1]})" - if len(self.lengths) >= 3: - string += f",\n\t\t\ttest_loader =DictLoader(length={self.lengths[2]}, batch_size={self.batch_size[2]}, shuffle={self.shuffle[2]})" + string += f",\n\t\t train_loader -> DictLoader(length={self._lengths[0]}, batch_size={self.batch_size[0]}, shuffle={self.shuffle[0]})" + if len(self._lengths) >= 2: + string += f",\n\t\t valid_loader -> DictLoader(length={self._lengths[1]}, batch_size={self.batch_size[1]}, shuffle={self.shuffle[1]})" + if len(self._lengths) >= 3: + string += f",\n\t\t\ttest_loader =DictLoader(length={self._lengths[2]}, batch_size={self.batch_size[2]}, shuffle={self.shuffle[2]})" string += f")" return string @@ -225,7 +250,7 @@ def _split(self, dataset): """ dataset_split = split_dataset( - dataset, self.lengths, self._random_split, self.generator + dataset, self._lengths, self._random_split, self.generator ) return dataset_split @@ -237,6 +262,23 @@ def _check_setup(self): "outside a Lightning trainer please call .setup() first." ) + def get_graph_inputs(self, mode='train'): + """Generate an input that can be used as input for a GNN model + + Parameters + ---------- + mode : str, optional + Type of loader to be used, either 'train' or 'val'/'valid', by default 'train' + """ + self.setup() + if mode == 'train': + loader=self.train_dataloader + elif (mode=='val' or mode=='valid'): + loader=self.val_dataloader + else: + raise ValueError(f"Mode can either be 'train', 'val', 'valid', found {mode}!") + + return next(iter(loader()))['data_list'] def split_dataset( dataset, diff --git a/mlcolvar/data/dataset.py b/mlcolvar/data/dataset.py index d9b274a5..cccf2431 100644 --- a/mlcolvar/data/dataset.py +++ b/mlcolvar/data/dataset.py @@ -1,7 +1,9 @@ import torch +import torch_geometric import numpy as np from mlcolvar.core.transform.utils import Statistics from torch.utils.data import Dataset +from operator import itemgetter __all__ = ["DictDataset"] @@ -14,7 +16,13 @@ class DictDataset(Dataset): 'weights' : np.asarray([0.5,1.5,1.5,0.5]) } """ - def __init__(self, dictionary: dict = None, feature_names=None, create_ref_idx : bool = False, **kwargs): + def __init__(self, + dictionary: dict=None, + feature_names = None, + metadata: dict = None, + data_type : str = 'descriptors', + create_ref_idx : bool = False, + **kwargs): """Create a Dataset from a dictionary or from a list of kwargs. 
         Parameters
@@ -23,6 +31,12 @@ def __init__(self, dictionary: dict = None, feature_names=None, create_ref_idx :
             Dictionary with names and tensors
         feature_names : array-like
             List or numpy array with feature names
+        metadata : dict
+            Dictionary with metadata quantities shared across the whole dataset.
+        data_type : str
+            Type of data stored in the dataset, either 'descriptors' or 'graphs', by default 'descriptors'.
+            This will be stored in the dataset.metadata dictionary.
+
         """

         # assert type dict
@@ -30,7 +44,18 @@ def __init__(self, dictionary: dict = None, feature_names=None, create_ref_idx :
             raise TypeError(
                 f"DictDataset requires a dictionary, not {type(dictionary)}."
             )
-
+
+        if (metadata is not None) and (not isinstance(metadata, dict)):
+            raise TypeError(
+                f"DictDataset metadata requires a dictionary, not {type(metadata)}."
+            )
+
+        # assert data_type is 'descriptors' or 'graphs'
+        if not data_type in ['descriptors', 'graphs']:
+            raise TypeError(
+                f"data_type expected to be either 'descriptors' or 'graphs', found {data_type}"
+            )
+
         # Add kwargs to dict
         if dictionary is None:
             dictionary = {}
@@ -38,10 +63,23 @@ def __init__(self, dictionary: dict = None, feature_names=None, create_ref_idx :
         if len(dictionary) == 0:
             raise ValueError("Empty datasets are not supported")

+        # initialize metadata as dict
+        if metadata is None:
+            metadata = {}
+
+        if 'data_type' in metadata.keys():
+            if not metadata['data_type'] == data_type:
+                raise ValueError(f"Two different data_type values specified. Found {metadata['data_type']} in metadata and {data_type} as keyword")
+        else:
+            metadata['data_type'] = data_type
+
         # convert to torch.Tensors
         for key, val in dictionary.items():
             if not isinstance(val, torch.Tensor):
-                dictionary[key] = torch.Tensor(val)
+                if key in ["data_list", "data_list_lag"]:
+                    dictionary[key] = val
+                else:
+                    dictionary[key] = torch.Tensor(val)

         # save dictionary
         self._dictionary = dictionary
@@ -49,10 +87,13 @@ def __init__(self, dictionary: dict = None, feature_names=None, create_ref_idx :
         # save feature names
         self.feature_names = feature_names

+        # save metadata
+        self.metadata = metadata
+
         # check that all elements of dict have same length
         it = iter(dictionary.values())
         self.length = len(next(it))
-        if not all(len(l) == self.length for l in it):
+        if not all([len(l) == self.length for l in it]):
             raise ValueError("not all arrays in dictionary have same length!")

         # add indexing of entries for shuffling and slicing reference
@@ -62,12 +103,14 @@
     def __getitem__(self, index):
         if isinstance(index, str):
-            # raise TypeError(f'Index ("{index}") should be a slice, and not a string. To access the stored dictionary use .dictionary["{index}"] instead.')
             return self._dictionary[index]
-        else:
+        else:
             slice_dict = {}
             for key, val in self._dictionary.items():
-                slice_dict[key] = val[index]
+                try:
+                    slice_dict[key] = val[index]
+                except TypeError:
+                    # lists (e.g. 'data_list') do not support fancy indexing: fall back to itemgetter
+                    slice_dict[key] = list(itemgetter(*index)(val))
             return slice_dict

     def __setitem__(self, index, value):
@@ -95,6 +138,10 @@ def get_stats(self):
         stats
             dictionary of dictionaries with statistics
         """
+        if self.metadata['data_type'] == 'graphs':
+            raise ValueError(
+                "Method get_stats not supported for graph-based dataset!"
+ ) stats = {} for k in self.keys: print("KEY: ", k, end="\n\n\n") @@ -105,7 +152,12 @@ def get_stats(self): def __repr__(self) -> str: string = "DictDataset(" for key, val in self._dictionary.items(): - string += f' "{key}": {list(val.shape)},' + if key in ["data_list", "data_list_lag"]: + string += f' "{key}": {len(val)},' + else: + string += f' "{key}": {list(val.shape)},' + for key, val in self.metadata.items(): + string += f' "{key}": {val},' string = string[:-1] + " )" return string @@ -124,15 +176,30 @@ def feature_names(self, value): np.asarray(value, dtype=str) if value is not None else value ) + def get_graph_inputs(self): + """Generate and input suitable for graph models. Returns the whole dataset as a single batch not shuffled""" + assert self.metadata['data_type'] == 'graphs', ( + 'Graph inputs can only be generated for graph-based datasets' + ) + loader = torch_geometric.loader.DataLoader(self, + batch_size=len(self), + shuffle=False ) + return next(iter(loader))['data_list'] def test_DictDataset(): + # descriptors based # from list + data = torch.Tensor([[1.0], [2.0], [0.3], [0.4]]) + labels = [0, 0, 1, 1] + weights = np.asarray([0.5, 1.5, 1.5, 0.5]) dataset_dict = { - "data": torch.Tensor([[1.0], [2.0], [0.3], [0.4]]), - "labels": [0, 0, 1, 1], - "weights": np.asarray([0.5, 1.5, 1.5, 0.5]), + "data": data, + "labels": labels, + "weights": weights, } - + + # this to have the right signature in asserts + from mlcolvar.data.dataset import DictDataset dataset = DictDataset(dataset_dict) print(len(dataset)) print(dataset[0]) @@ -141,18 +208,106 @@ def test_DictDataset(): # test with dataloader from torch.utils.data import DataLoader - loader = DataLoader(dataset, batch_size=1) batch = next(iter(loader)) print(batch["data"]) # test with fastdataloader - from .dataloader import DictLoader - + from mlcolvar.data import DictLoader loader = DictLoader(dataset, batch_size=1) batch = next(iter(loader)) print(batch) + from mlcolvar.data.graph.atomic import AtomicNumberTable, Configuration + from mlcolvar.data.graph.utils import create_dataset_from_configurations + # graphs based + numbers = [8, 1, 1] + positions = np.array( + [[[0.0, 0.0, 0.0], [0.07, 0.07, 0.0], [0.07, -0.07, 0.0]], + [[0.0, 0.0, 0.0], [0.07, 0.07, 0.0], [0.07, -0.07, 0.0]], + [[0.0, 0.0, 0.0], [0.07, 0.07, 0.0], [0.07, -0.07, 0.0]]], + dtype=float + ) + cell = np.identity(3, dtype=float) * 0.2 + graph_labels = np.array([[1], [0], [1]]) + node_labels = np.array([[0], [1], [1]]) + z_table = AtomicNumberTable.from_zs(numbers) + + config = [Configuration( + atomic_numbers=numbers, + positions=positions[i] + 0.1*i, + cell=cell, + pbc=[True] * 3, + node_labels=node_labels[i], + graph_labels=graph_labels, + ) for i in range(3)] + graph_dataset = create_dataset_from_configurations(config, + z_table, + 0.1, + show_progress=False + ) + print(graph_dataset) + assert(isinstance(graph_dataset, DictDataset)) + + # check __getitem__ + # string + out = dataset['data'] + assert( torch.allclose(out, data) ) + out = graph_dataset['data_list'] + assert( torch.allclose(out[1]['positions'], torch.Tensor(positions+0.1))) + + # int + out = dataset[1] + assert( torch.allclose(out['data'], data[1]) ) + out = graph_dataset[1] + assert( torch.allclose(out['data_list']['positions'], torch.Tensor(positions+0.1))) + + + # list + out = dataset[[0,1,2]] + assert( torch.allclose(out['data'], data[[0,1,2]]) ) + out = graph_dataset[[0,1,2]] + for i in [0,1,2]: + assert( torch.allclose(out['data_list'][i]['positions'], torch.Tensor(positions+0.1*i))) + + 
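+
+    # Note on the behaviour exercised above: for the descriptors dataset the values
+    # are torch.Tensors, so list/array indices are handled by the tensor itself;
+    # for the graph dataset 'data_list' is a plain Python list of torch_geometric
+    # Data objects, so __getitem__ falls back to operator.itemgetter and returns a
+    # list of Data objects.
+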
# slice + out = dataset[0:2] + assert( torch.allclose(out['data'], data[[0,1]]) ) + out = graph_dataset[0:2] + for i in [0,1]: + assert( torch.allclose(out['data_list'][i]['positions'], torch.Tensor(positions+0.1*i))) + + # range + out = dataset[range(0,2)] + assert( torch.allclose(out['data'], data[[0,1]]) ) + out = graph_dataset[range(0,2)] + for i in [0,1]: + assert( torch.allclose(out['data_list'][i]['positions'], torch.Tensor(positions+0.1*i))) + + # np.ndarray + out = dataset[np.array(1)] + assert( torch.allclose(out['data'], data[1]) ) + out = graph_dataset[np.array(1)] + assert( torch.allclose(out['data_list']['positions'], torch.Tensor(positions+0.1))) + + out = dataset[np.array([0,1,2])] + assert( torch.allclose(out['data'], data[[0,1,2]]) ) + out = graph_dataset[np.array([0,1,2])] + for i in [0,1,2]: + assert( torch.allclose(out['data_list'][i]['positions'], torch.Tensor(positions+0.1*i))) + + # torch.Tensor + out = dataset[torch.tensor([1], dtype=torch.long)] + assert( torch.allclose(out['data'], data[1]) ) + out = graph_dataset[torch.tensor([1], dtype=torch.long)] + assert( torch.allclose(out['data_list']['positions'], torch.Tensor(positions+0.1))) + + out = dataset[torch.tensor([0,1,2], dtype=torch.long)] + assert( torch.allclose(out['data'], data[[0,1,2]]) ) + out = graph_dataset[torch.tensor([0,1,2], dtype=torch.long)] + for i in [0,1,2]: + assert( torch.allclose(out['data_list'][i]['positions'], torch.Tensor(positions+0.1*i))) + if __name__ == "__main__": - test_DictDataset() + test_DictDataset() \ No newline at end of file diff --git a/mlcolvar/data/graph/__init__.py b/mlcolvar/data/graph/__init__.py new file mode 100644 index 00000000..5322ece6 --- /dev/null +++ b/mlcolvar/data/graph/__init__.py @@ -0,0 +1,5 @@ +__all__ = ["AtomicNumberTable", "Configuration", "Configurations", "get_neighborhood", "create_dataset_from_configurations", "create_test_graph_input"] + +from .atomic import * +from .neighborhood import * +from .utils import * \ No newline at end of file diff --git a/mlcolvar/data/graph/atomic.py b/mlcolvar/data/graph/atomic.py new file mode 100644 index 00000000..78c0eb31 --- /dev/null +++ b/mlcolvar/data/graph/atomic.py @@ -0,0 +1,151 @@ +import warnings +import numpy as np +import mdtraj as md +from dataclasses import dataclass +from typing import List, Iterable, Optional + +""" +The helper functions for atomic data. This module is taken from MACE directly: +https://github.com/ACEsuit/mace/blob/main/mace/tools/utils.py +https://github.com/ACEsuit/mace/blob/main/mace/data/utils.py +""" + +__all__ = ['AtomicNumberTable', 'Configuration', 'Configurations'] + + +class AtomicNumberTable: + """The atomic number table. + Used to map between one hot encodings and a given set of actual atomic numbers. 
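+
+    A short usage sketch (values are illustrative, mirroring the tests below):
+
+        z_table = AtomicNumberTable.from_zs([8, 1, 1])   # -> z_table.zs == [1, 8]
+        z_table.z_to_index(8)                            # -> 1
+        z_table.index_to_symbol(0)                       # -> 'H'
+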
+ """ + + def __init__(self, zs: List[int]) -> None: + """Initializes an atomi number table object + + Parameters + ---------- + zs: List[int] + The atomic numbers in this table + """ + self.zs = zs + self.masses = [1.0] * len(zs) + for i in range(len(zs)): + try: + m = md.element.Element.getByAtomicNumber(zs[i]).mass + self.masses[i] = m + except Exception: + warnings.warn( + 'Can not assign mass for atom number: {:d}'.format(zs[i]) + ) + + def __len__(self) -> int: + """Number of elements in the table""" + return len(self.zs) + + def __str__(self) -> str: + return f'AtomicNumberTable: {tuple(s for s in self.zs)}' + + def index_to_z(self, index: int) -> int: + """Maps the encoding to the actual atomic number + + Parameters + ---------- + index: int + Index of the encoding to be mapped + """ + return self.zs[index] + + def index_to_symbol(self, index: int) -> str: + """Map the encoding to the atomic symbol + + Parameters + ---------- + index: int + Index of the encoding to be mapped + """ + return md.element.Element.getByAtomicNumber(self.zs[index]).symbol + + def z_to_index(self, atomic_number: int) -> int: + """Maps an atomic number to the encoding. + + Parameters + ---------- + atomic_number: int + The atomic number to be mapped + """ + return self.zs.index(atomic_number) + + def zs_to_indices(self, atomic_numbers: np.ndarray) -> np.ndarray: + """Maps an array of atomic number to the encodings. + + Parameters + ---------- + atomic_numbers: numpy.ndarray + The atomic numbers to be mapped + """ + to_index_fn = np.vectorize(self.z_to_index) + return to_index_fn(atomic_numbers) + + @classmethod + def from_zs(cls, atomic_numbers: Iterable[int]) -> 'AtomicNumberTable': + """Build the table from an array atomic numbers. + + Parameters + ---------- + atomic_numbers: Iterable[int] + The atomic numbers to be used for building the table + """ + z_set = set() + for z in atomic_numbers: + z_set.add(z) + return cls(sorted(list(z_set))) + + +def get_masses(atomic_numbers: Iterable[int]) -> List[float]: + """Get atomic masses from atomic numbers. + + Parameters + ---------- + atomic_numbers: Iterable[int] + The atomic numbers for which to return the atomic masses + """ + return AtomicNumberTable.from_zs(atomic_numbers).masses.copy() + + +@dataclass +class Configuration: + """ + Internal helper class that describe a given configuration of the system. 
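+
+    A typical construction (values are illustrative; see the tests in
+    mlcolvar.data.graph.utils for complete examples):
+
+        config = Configuration(
+            atomic_numbers=[8, 1, 1],
+            positions=positions,          # (n_atoms, 3) array
+            cell=np.identity(3) * 0.2,
+            pbc=[True] * 3,
+            node_labels=node_labels,      # (n_atoms, n_node_labels)
+            graph_labels=graph_labels,    # (n_graph_labels, 1)
+        )
+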
+ """ + atomic_numbers: np.ndarray # shape: [n_atoms] + positions: np.ndarray # shape: [n_atoms, 3], units: Ang + cell: np.ndarray # shape: [n_atoms, 3], units: Ang + pbc: Optional[tuple] # shape: [3] + node_labels: Optional[np.ndarray] # shape: [n_atoms, n_node_labels] + graph_labels: Optional[np.ndarray] # shape: [n_graph_labels, 1] + weight: Optional[float] = 1.0 # shape: [] + system: Optional[np.ndarray] = None # shape: [n_system_atoms] + environment: Optional[np.ndarray] = None # shape: [n_environment_atoms] + + +Configurations = List[Configuration] + + +def test_atomic_number_table() -> None: + table = AtomicNumberTable([1, 6, 7, 8]) + + numbers = np.array([1, 7, 6, 8]) + assert ( + table.zs_to_indices(numbers) == np.array([0, 2, 1, 3], dtype=int) + ).all() + + numbers = np.array([1, 1, 1, 6, 8, 1]) + assert ( + table.zs_to_indices(numbers) == np.array([0, 0, 0, 1, 3, 0], dtype=int) + ).all() + + table_1 = AtomicNumberTable.from_zs([6] * 3 + [1] * 10 + [7] * 3 + [8] * 2) + assert table_1.zs == table.zs + + +if __name__ == '__main__': + test_atomic_number_table() diff --git a/mlcolvar/data/graph/neighborhood.py b/mlcolvar/data/graph/neighborhood.py new file mode 100644 index 00000000..09791f34 --- /dev/null +++ b/mlcolvar/data/graph/neighborhood.py @@ -0,0 +1,254 @@ +import numpy as np +from matscipy.neighbours import neighbour_list +from typing import Optional, Tuple, List + +""" +The neighbor list function. This module is taken from MACE directly: +https://github.com/ACEsuit/mace/blob/main/mace/data/neighborhood.py +""" + +__all__ = ['get_neighborhood'] + + +def get_neighborhood( + positions: np.ndarray, # [num_positions, 3] + cutoff: float, + pbc: Optional[Tuple[bool, bool, bool]] = None, + cell: Optional[np.ndarray] = None, # [3, 3] + true_self_interaction: Optional[bool] = False, + system_indices: Optional[List[int]] = None, + environment_indices: Optional[List[int]] = None, + buffer: float = 0.0 +) -> Tuple[np.ndarray, np.ndarray, np.ndarray]: + """Get the neighbor list of a given set atoms. + + Parameters + ---------- + positions: numpy.ndarray (shape: [N, 3]) + The positions array. + cutoff: float + The cutoff radius. + pbc: Tuple[bool, bool, bool] (shape: [3]) + If to enable PBC in the directions of the three lattice vectors. + cell: numpy.ndarray (shape: [3, 3]) + The lattice vectors. + true_self_interaction: bool + If to keep self-edges that don't cross periodic boundaries. + system_indices: List[int] + Indices of the atoms to be considered as the 'system' if + restricting the neighborhood to a subsystem (i.e., system + environment), see also Notes section. + environment_indices: List[int] + Indices of the atoms to be considered as the 'environment' if + restricting the neighborhood to a subsystem (i.e., system + environment), see also Notes section. + Only atoms within the cutoff will be included as active enviroment atoms + buffer: float + Buffer size used in finding active environment atoms, if + restricting the neighborhood to a subsystem (i.e., system + environment), see also Notes section. + + Returns + ------- + edge_index: numpy.ndarray (shape: [2, n_edges]) + The edge indices (i.e., source and destination) in the graph. + shifts: numpy.ndarray (shape: [n_edges, 3]) + The shift vectors (unit_shifts * cell_lengths). + unit_shifts: numpy.ndarray (shape: [n_edges, 3]) + The unit shift vectors (number of PBC crossed by the edges). + + Notes + ----- + Arguments `system_indices` and `environment_indices` must present at the + same time. 
When these arguments are given, only edges in the [subsystem] + formed by [the systems atoms] and [the environment atoms within the cutoff + radius of the systems atoms] will be kept. + These two lists could not contain common atoms. + """ + + if system_indices is not None or environment_indices is not None: + assert system_indices is not None and environment_indices is not None + + system_indices = np.array(system_indices) + environment_indices = np.array(environment_indices) + assert np.intersect1d(system_indices, environment_indices).size == 0 + else: + assert buffer == 0.0 + + if pbc is None: + pbc = (False, False, False) + + if cell is None or cell.any() == np.zeros((3, 3)).any(): + cell = np.identity(3, dtype=float) + + assert len(pbc) == 3 and all(isinstance(i, (bool, np.bool_)) for i in pbc) + assert cell.shape == (3, 3) + + pbc_x = pbc[0] + pbc_y = pbc[1] + pbc_z = pbc[2] + identity = np.identity(3, dtype=float) + max_positions = np.max(np.absolute(positions)) + 1 + # Extend cell in non-periodic directions + # For models with more than 5 layers, the multiplicative constant needs to + # be increased. + if not pbc_x: + cell[:, 0] = max_positions * 5 * cutoff * identity[:, 0] + if not pbc_y: + cell[:, 1] = max_positions * 5 * cutoff * identity[:, 1] + if not pbc_z: + cell[:, 2] = max_positions * 5 * cutoff * identity[:, 2] + + sender, receiver, unit_shifts, distances = neighbour_list( + quantities='ijSd', + pbc=pbc, + cell=cell, + positions=positions, + cutoff=float(cutoff + buffer), + # self_interaction=True, # we want edges from atom to itself in different periodic images + # use_scaled_positions=False, # positions are not scaled positions + ) + + if not true_self_interaction: + # Eliminate self-edges that don't cross periodic boundaries + true_self_edge = sender == receiver + true_self_edge &= np.all(unit_shifts == 0, axis=1) + keep_edge = ~true_self_edge + + # NOTE: after eliminating self-edges, it can be that no edges remain + # in this system + sender = sender[keep_edge] + receiver = receiver[keep_edge] + unit_shifts = unit_shifts[keep_edge] + distances = distances[keep_edge] + + if system_indices is not None: + # Get environment atoms that are neighbors of the system. + keep_edge = np.where(np.in1d(receiver, system_indices))[0] + keep_sender = np.intersect1d(sender[keep_edge], environment_indices) + keep_atom = np.concatenate((system_indices, np.unique(keep_sender))) + # Get the edges in the subsystem. 
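+        # Keep only edges whose two endpoints both belong to the retained atoms
+        # (the system atoms plus the environment atoms found above), and discard
+        # edges longer than the bare cutoff: the buffer radius was only used to
+        # collect candidate environment atoms.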
+ keep_sender = np.where(np.in1d(sender, keep_atom))[0] + keep_receiver = np.where(np.in1d(receiver, keep_atom))[0] + keep_edge = np.intersect1d(keep_sender, keep_receiver) + keep_edge_distance = np.where(distances <= cutoff)[0] + keep_edge = np.intersect1d(keep_edge, keep_edge_distance) + # Get the edges + sender = sender[keep_edge] + receiver = receiver[keep_edge] + unit_shifts = unit_shifts[keep_edge] + + # Build output + edge_index = np.stack((sender, receiver)) # [2, n_edges] + + # From the docs: With the shift vector S, the distances D between atoms + # can be computed from: D = positions[j]-positions[i]+S.dot(cell) + shifts = np.dot(unit_shifts, cell) # [n_edges, 3] + + return edge_index, shifts, unit_shifts + + +def test_get_neighborhood() -> None: + + positions = np.array( + [[0, 0, 0], [1, 1, 1], [2, 2, 2], [3, 3, 3]], dtype=float + ) + cell = np.array([[4, 0, 0], [0, 4, 0], [0, 0, 4]], dtype=float) + + n, s, u = get_neighborhood(positions, cutoff=5.0) + assert ( + n == np.array( + [[0, 0, 1, 1, 1, 2, 2, 2, 3, 3], [1, 2, 0, 2, 3, 0, 1, 3, 1, 2]], + dtype=int + ) + ).all() + + n, s, u = get_neighborhood(positions, cutoff=2.0) + assert ( + n == np.array([[0, 1, 1, 2, 2, 3], [1, 0, 2, 1, 3, 2]], dtype=int) + ).all() + + n, s, u = get_neighborhood( + positions, cutoff=2.0, pbc=[True] * 3, cell=cell + ) + assert ( + n == np.array( + [[0, 0, 1, 1, 2, 2, 3, 3], [3, 1, 0, 2, 1, 3, 2, 0]], dtype=int + ) + ).all() + assert ( + s == np.array( + [ + [-4.0, -4.0, -4.0], + [0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [4.0, 4.0, 4.0] + ], + dtype=float + ) + ).all() + assert ( + u == np.array( + [ + [-1, -1, -1], + [0, 0, 0], + [0, 0, 0], + [0, 0, 0], + [0, 0, 0], + [0, 0, 0], + [0, 0, 0], + [1, 1, 1] + ], + dtype=int + ) + ).all() + + n, s, u = get_neighborhood( + positions, + cutoff=2.0, + pbc=[True] * 3, + cell=cell, + system_indices=[0, 1], + environment_indices=[2, 3] + ) + assert ( + n == np.array( + [[0, 0, 1, 1, 2, 2, 3, 3], [3, 1, 0, 2, 1, 3, 2, 0]], dtype=int + ) + ).all() + + n, s, u = get_neighborhood( + positions, + cutoff=2.0, + pbc=[True] * 3, + cell=cell, + system_indices=[0], + environment_indices=[1, 2, 3] + ) + assert ( + n == np.array( + [[0, 0, 1, 3], [3, 1, 0, 0]], dtype=int + ) + ).all() + assert ( + s == np.array( + [ + [-4.0, -4.0, -4.0], + [0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [4.0, 4.0, 4.0] + ], + dtype=float + ) + ).all() + assert ( + u == np.array( + [[-1, -1, -1], [0, 0, 0], [0, 0, 0], [1, 1, 1]], + dtype=int + ) + ).all() + + +if __name__ == "__main__": + test_get_neighborhood() diff --git a/mlcolvar/data/graph/utils.py b/mlcolvar/data/graph/utils.py new file mode 100644 index 00000000..1732ce65 --- /dev/null +++ b/mlcolvar/data/graph/utils.py @@ -0,0 +1,839 @@ +import copy +from collections import defaultdict +from typing import Union + +import torch +import torch_geometric +from torch_geometric.data import Data, HeteroData +from torch_geometric.transforms import BaseTransform + +from mlcolvar.data import DictDataset, DictModule +from mlcolvar.data.graph import atomic +from mlcolvar.data.graph.neighborhood import get_neighborhood +from mlcolvar.utils.plot import pbar + +from typing import List + +__all__ = ["create_dataset_from_configurations", "create_test_graph_input"] + +def _create_dataset_from_configuration( + config: atomic.Configuration, + z_table: atomic.AtomicNumberTable, + cutoff: float, + buffer: float = 0.0, +) -> torch_geometric.data.Data: + """Build the torch_geometric graph data 
object from a configuration. + + Parameters + ---------- + config: mlcolvar.data.graph.atomic.Configuration + The configuration from which to generate the graph data + z_table: mlcolvar.data.graph.atomic.AtomicNumberTable + The atomic number table used to build the node attributes + cutoff: float + The graph cutoff radius + buffer: float + Buffer size used in finding active environment atoms if + restricting the neighborhood to a subsystem (i.e., system + environment), + `see also mlcolvar.data.grap.neighborhood.get_neighborhood` + """ + + assert config.graph_labels is None or len(config.graph_labels.shape) == 2 + + # NOTE: here we do not take care about the nodes that are not taking part + # the graph, like, we don't even change the node indices in `edge_index`. + # Here we simply ignore them, and rely on the `RemoveIsolatedNodes` method + # that will be called later (in `create_dataset_from_configurations`). + edge_index, shifts, unit_shifts = get_neighborhood( + positions=config.positions, + cutoff=cutoff, + cell=config.cell, + pbc=config.pbc, + system_indices=config.system, + environment_indices=config.environment, + buffer=buffer + ) + edge_index = torch.tensor(edge_index, dtype=torch.long) + shifts = torch.tensor(shifts, dtype=torch.get_default_dtype()) + unit_shifts = torch.tensor( + unit_shifts, dtype=torch.get_default_dtype() + ) + + positions = torch.tensor( + config.positions, dtype=torch.get_default_dtype() + ) + cell = torch.tensor(config.cell, dtype=torch.get_default_dtype()) + + indices = z_table.zs_to_indices(config.atomic_numbers) + one_hot = to_one_hot( + torch.tensor(indices, dtype=torch.long).unsqueeze(-1), + n_classes=len(z_table), + ) + + node_labels = ( + torch.tensor(config.node_labels, dtype=torch.get_default_dtype()) + if config.node_labels is not None + else None + ) + + graph_labels = ( + torch.tensor(config.graph_labels, dtype=torch.get_default_dtype()) + if config.graph_labels is not None + else None + ) + + weight = ( + torch.tensor(config.weight, dtype=torch.get_default_dtype()) + if config.weight is not None + else 1 + ) + + n_system = ( + torch.tensor( + [[len(config.system)]], dtype=torch.get_default_dtype() + ) if config.system is not None + else torch.tensor( + [[one_hot.shape[0]]], dtype=torch.get_default_dtype() + ) + ) + + n_env = ( + torch.tensor( + [[one_hot.shape[0] - n_system.to(torch.int).item()]], dtype=torch.get_default_dtype() + ) + ) + + if config.system is not None: + system_masks = torch.zeros((one_hot.shape[0], 1), dtype=torch.bool) + system_masks[config.system, 0] = 1 + else: + system_masks = None + + return torch_geometric.data.Data( + edge_index=edge_index, + shifts=shifts, + unit_shifts=unit_shifts, + positions=positions, + cell=cell, + node_attrs=one_hot, + node_labels=node_labels, + graph_labels=graph_labels, + n_system=n_system, + n_env=n_env, + system_masks=system_masks, + weight=weight, + ) + + +def create_dataset_from_configurations( + config: atomic.Configurations, + z_table: atomic.AtomicNumberTable, + cutoff: float, + buffer: float = 0.0, + atom_names: List = None, + remove_isolated_nodes: bool = False, + show_progress: bool = True +) -> DictDataset: + """Build DictDataset object containing torch_geometric graph data objects from configurations. 
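+
+    A minimal usage sketch (the configuration list, z_table and cutoff value are
+    illustrative, mirroring the tests below):
+
+        dataset = create_dataset_from_configurations(
+            configs, z_table, cutoff=0.1, remove_isolated_nodes=True, show_progress=False
+        )
+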
+
+    Parameters
+    ----------
+    config: mlcolvar.data.graph.atomic.Configurations
+        The configurations from which to generate the dataset
+    z_table: mlcolvar.data.graph.atomic.AtomicNumberTable
+        The atomic number table used to build the node attributes
+    cutoff: float
+        The graph cutoff radius
+    buffer: float
+        Buffer size used in finding active environment atoms if
+        restricting the neighborhood to a subsystem (i.e., system + environment),
+        see also `mlcolvar.data.graph.neighborhood.get_neighborhood`
+    atom_names: List
+        Names to associate to the atoms; if None, placeholder names are generated
+        (X0, X1, ... for system atoms and Y0, Y1, ... for environment atoms)
+    remove_isolated_nodes: bool
+        Whether to remove isolated nodes from the dataset
+    show_progress: bool
+        Whether to show the progress bar
+    """
+    if show_progress:
+        items = pbar(config, frequency=0.0001, prefix='Making graphs')
+    else:
+        items = config
+
+    data_list = [
+        _create_dataset_from_configuration(
+            config=c,
+            z_table=z_table,
+            cutoff=cutoff,
+            buffer=buffer,
+        ) for c in items
+    ]
+
+    if atom_names is None:
+        atom_names_system = [f"X{i}" for i in range(data_list[0]['n_system'].to(torch.int64).item())]
+        atom_names_env = [f"Y{i}" for i in range(data_list[0]['n_env'].to(torch.int64).item())]
+        atom_names = atom_names_system + atom_names_env
+
+    # this is only to check what isolated nodes have been removed
+    _aux_pos = torch.Tensor((np.array([d['positions'].numpy() for d in data_list])))
+    if remove_isolated_nodes:
+        # TODO: not the worst way to fake the `is_node_attr` method of
+        # `torch_geometric.data.storage.GlobalStorage` ...
+        # When there are exactly three atoms in the graph, the
+        # `RemoveIsolatedNodes` method will remove the cell vectors that
+        # correspond to the isolated node ... This is a consequence of
+        # pyg regarding the cell vectors as some kind of node features.
+        # So here we first remove the isolated nodes, then set the cell back.
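+        # The block below therefore proceeds in three explicit steps:
+        #   1. save the cell of every graph,
+        #   2. let _RemoveIsolatedNodes drop the disconnected nodes,
+        #   3. restore the saved cells and, by matching (rounded) positions,
+        #      store in 'names_idx' which original atom indices survived, so
+        #      that names and sensitivities can be mapped back later.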
+ cell_list = [d.cell.clone() for d in data_list] + transform = _RemoveIsolatedNodes() + data_list = [transform(d) for d in data_list] + + # check what have been removed and restore cell + unique_idx = [] # store the indeces of the atoms that have been used at least once + for i in range(len(data_list)): + data_list[i].cell = cell_list[i] + # get and save the original index before removing isolated nodes for each entry + original_idx = torch.unique( torch.where(torch.isin(torch.round(_aux_pos[i], decimals=5), + torch.round(data_list[i]['positions'], decimals=5)) + )[0] + ) + data_list[i]['names_idx'] = original_idx.to(torch.int64) + + # update if needed the overall list + check = np.isin(original_idx.numpy(), unique_idx, invert=True) + if check.any(): + aux = np.where(check)[0] + unique_idx.extend(original_idx[aux].tolist()) + + unique_idx.sort() + unique_idx = torch.Tensor(unique_idx).to(torch.int64) + # here we simply have to take all the atoms + else: + unique_idx = torch.arange(data_list[0]['n_system'].item()).to(torch.int64) + for i in range(len(data_list)): + data_list[i]['names_idx'] = unique_idx + + # we also save the names of the atoms that have been actually used + unique_names = np.array(atom_names)[unique_idx] + unique_names = unique_names.tolist() + + dataset = DictDataset(dictionary={'data_list' : data_list}, + metadata={'z_table' : z_table.zs, + 'cutoff' : cutoff, + 'used_idx' : unique_idx, + 'used_names' : unique_names}, + data_type='graphs') + + return dataset + +def to_one_hot(indices: torch.Tensor, n_classes: int) -> torch.Tensor: + """Generates one-hot encoding with `n_classes` classes from `indices` + + Parameters + ---------- + indices: torch.Tensor (shape: [N, 1]) + Node indices + n_classes: int + Number of classes + + Returns + ------- + encoding: torch.tensor (shape: [N, n_classes]) + The one-hot encoding + """ + shape = indices.shape[:-1] + (n_classes,) + oh = torch.zeros(shape, device=indices.device).view(shape) + + # scatter_ is the in-place version of scatter + oh.scatter_(dim=-1, index=indices, value=1) + + return oh.view(*shape) + + +class _RemoveIsolatedNodes(BaseTransform): + r"""Removes isolated nodes from the graph + This is taken from pytorch_geometric with a small modification to avoid the bug when n_nodes==n_edges + """ + def forward( + self, + data: Union[Data, HeteroData], + ) -> Union[Data, HeteroData]: + # Gather all nodes that occur in at least one edge (across all types): + n_ids_dict = defaultdict(list) + for edge_store in data.edge_stores: + if 'edge_index' not in edge_store: + continue + + if edge_store._key is None: + src = dst = None + else: + src, _, dst = edge_store._key + + n_ids_dict[src].append(edge_store.edge_index[0]) + n_ids_dict[dst].append(edge_store.edge_index[1]) + + n_id_dict = {k: torch.cat(v).unique() for k, v in n_ids_dict.items()} + + n_map_dict = {} + for node_store in data.node_stores: + if node_store._key not in n_id_dict: + n_id_dict[node_store._key] = torch.empty(0, dtype=torch.long) + + idx = n_id_dict[node_store._key] + assert data.num_nodes is not None + mapping = idx.new_zeros(data.num_nodes) + mapping[idx] = torch.arange(idx.numel(), device=mapping.device) + n_map_dict[node_store._key] = mapping + + for edge_store in data.edge_stores: + if 'edge_index' not in edge_store: + continue + + if edge_store._key is None: + src = dst = None + else: + src, _, dst = edge_store._key + + row = n_map_dict[src][edge_store.edge_index[0]] + col = n_map_dict[dst][edge_store.edge_index[1]] + edge_store.edge_index = torch.stack([row, 
col], dim=0) + + old_data = copy.copy(data) + for out, node_store in zip(data.node_stores, old_data.node_stores): + for key, value in node_store.items(): + if key == 'num_nodes': + out.num_nodes = n_id_dict[node_store._key].numel() + elif node_store.is_node_attr(key) and key not in ['shifts', 'unit_shifts']: + out[key] = value[n_id_dict[node_store._key]] + + return data + + +def create_test_graph_input(output_type: str, + n_atoms: int = 3, + n_samples: int = 60, + n_states: int = 2, + random_weights = False, + add_noise = True): + """ + Util function to generate several types of mock graph data objects for testing purposes. + The graphs are created drawing positions from a predefined set of positions that cover most use cases. + It can generate: one or some configuration objects, a dataset, a datamodule, a batch of example inputs or a single item. + + Parameters + ---------- + output_type : str + Type of graph data object to create. Can be: 'configuration', 'configurations', 'datamodule', 'dataset', 'batch', 'example' + n_atoms : int, optional + Number of atoms for creating the graph, either 3 or 4, by default 3 + n_samples : int, optional + Number of samples per state to create, by default 60 + n_states : int, optional + Number of states for which to create data, by default 2. Configurations are then labelled accordingly. + random_weights : bool, optional + If to assign random weights to the entries, otherwise unitary weights are given, by default False + add_noise : bool, optional + If to add a random noise for each entry to the predefined positions, by default True + + Returns + ------- + Graph data object of the chosen type + """ + if n_atoms == 3: + numbers = [8, 1, 1] + node_labels = np.array([[0], [1], [1]]) + _ref_positions = np.array( + [ + [[0.0, 0.0, 0.0], [0.07, 0.07, 0.0], [0.07, -0.07, 0.0]], + [[0.0, 0.0, 0.0], [-0.07, 0.07, 0.0], [0.07, 0.07, 0.0]], + [[0.0, 0.0, 0.0], [0.07, -0.07, 0.0], [0.07, 0.07, 0.0]], + [[0.0, 0.0, 0.0], [0.0, -0.07, 0.07], [0.0, 0.07, 0.07]], + [[0.0, 0.0, 0.0], [0.11, 0.11, 0.11], [-0.07, 0.0, 0.07]], + [[0.1, 0.0, 1.1], [0.17, 0.07, 1.1], [0.17, -0.07, 1.1]], + ], + dtype=np.float64 + ) + + elif n_atoms == 4: + numbers = [8, 1, 1, 8] + node_labels = np.array([[0], [1], [1], [0]]) + _ref_positions = np.array( + [ + [[0.0, 0.0, 0.0], [0.07, 0.07, 0.0] , [0.07, -0.07, 0.0], [0.05, -0.05, 0.0]], + [[0.0, 0.0, 0.0], [-0.07, 0.07, 0.0], [0.07, 0.07, 0.0], [0.05, 0.05, 0.0]], + [[0.0, 0.0, 0.0], [0.07, -0.07, 0.0], [0.07, 0.07, 0.0], [0.05, 0.05, 0.0]], + [[0.0, 0.0, 0.0], [0.0, -0.07, 0.07], [0.0, 0.07, 0.07], [0.0, 0.05, 0.05]], + [[0.0, 0.0, 0.0], [0.11, 0.11, 0.11] , [-0.07, 0.0, 0.07], [-0.05, 0.0, 0.05]], + [[0.1, 0.0, 1.1], [0.17, 0.07, 1.1] , [0.17, -0.07, 1.1], [0.15, -0.05, 1.1]], + ], + dtype=np.float64 + ) + else: + raise ValueError(f'Example input can be generated either with 3 or 4 atoms, found {n_atoms}') + + + idx = np.random.randint(low=0, high=6, size=(n_samples*n_states)) + positions = _ref_positions[idx, :, :] + + # let's add some noise to the positions for fun + if add_noise: + noise = np.random.randn(*positions.shape)*1e-5 + positions = positions + noise + + cell = np.identity(3, dtype=float) * 0.2 + graph_labels = np.zeros((n_samples*n_states, 1, 1)) + for i in range(1, n_states): + graph_labels[n_samples * i :] += 1 + z_table = atomic.AtomicNumberTable.from_zs(numbers) + + if random_weights: + weights = np.random.random_sample((n_samples*n_states, 1, 1)) + else: + weights = np.ones((n_samples*n_states, 1, 1)) + config = [ + 
atomic.Configuration( + atomic_numbers=numbers, + positions=positions[i], + cell=cell, + pbc=[True] * 3, + node_labels=node_labels, + graph_labels=graph_labels[i], + weight=weights[i] + ) for i in range(0, n_samples*n_states) + ] + + if output_type == 'configuration': + return config[0] + if output_type == 'configurations': + return config + + dataset = create_dataset_from_configurations( + config, z_table, 0.1, show_progress=False, remove_isolated_nodes=True + ) + + if output_type == 'dataset': + return dataset + + datamodule = DictModule( + dataset, + lengths=(0.8, 0.2), + batch_size=0, + shuffle=False, + ) + + if output_type == 'datamodule': + return datamodule + + datamodule.setup() + batch = next(iter(datamodule.train_dataloader())) + if output_type == 'batch': + return batch + example = batch['data_list'].get_example(0) + example['batch'] = torch.zeros(len(example['positions']), dtype=torch.int64) + if output_type == 'example': + return example + + return None + +def create_graph_tracing_example(n_species : int): + """ + Util to create a tracing example for graph based models. + + Parameters + ---------- + n_species : int + Number of chemical species to be considered in the model. + + Returns + ------- + dict + Tracing graph input example as dict. + """ + numbers = [1, 1, 1] + node_labels = np.array([[0], [0], [0]]) + _ref_positions = np.array( + [ + [[0.0, 0.0, 0.0], [0.07, 0.07, 0.0], [0.07, -0.07, 0.0]], + [[0.0, 0.0, 0.0], [-0.07, 0.07, 0.0], [0.07, 0.07, 0.0]], + [[0.0, 0.0, 0.0], [0.07, -0.07, 0.0], [0.07, 0.07, 0.0]], + [[0.0, 0.0, 0.0], [0.0, -0.07, 0.07], [0.0, 0.07, 0.07]], + [[0.0, 0.0, 0.0], [0.11, 0.11, 0.11], [-0.07, 0.0, 0.07]], + [[0.1, 0.0, 1.1], [0.17, 0.07, 1.1], [0.17, -0.07, 1.1]], + ], + dtype=np.float64 + ) + + idx = np.random.randint(low=0, high=6, size=1) + positions = _ref_positions[idx, :, :] + cell = np.identity(3, dtype=float) * 0.2 + graph_labels = np.zeros((1, 1, 1)) + + z_table = atomic.AtomicNumberTable.from_zs(numbers) + + weights = np.ones((1, 1, 1)) + config = [ + atomic.Configuration( + atomic_numbers=numbers, + positions=positions[i], + cell=cell, + pbc=[True] * 3, + node_labels=node_labels, + graph_labels=graph_labels[i], + weight=weights[i] + ) for i in range(0, 1) + ] + + # here we do not remove isolated nodes + dataset = create_dataset_from_configurations( + config, z_table, 0.1, show_progress=False, remove_isolated_nodes=False + ) + + datamodule = DictModule( + dataset, + lengths=(0.8, 0.2), + batch_size=0, + shuffle=False, + ) + + datamodule.setup() + batch = next(iter(datamodule.train_dataloader())) + example = batch['data_list'].get_example(0) + example['batch'] = torch.zeros(len(example['positions']), dtype=torch.int64) + + example = example.to_dict() + example['node_attrs'] = torch.cat((example['node_attrs'], torch.zeros(3, n_species - 1)), 1) + return example + +# =============================================================================== +# =============================================================================== +# ==================================== TESTS ==================================== +# =============================================================================== +# =============================================================================== + +import numpy as np + +def test_to_one_hot() -> None: + i = torch.tensor([[0], [2], [1]], dtype=torch.int64) + e = to_one_hot(i, 4) + assert ( + e == torch.tensor( + [[1, 0, 0, 0], [0, 0, 1, 0], [0, 1, 0, 0]], dtype=torch.int64 + ) + ).all() + +def 
test_from_configuration() -> None: + # fake atomic numbers, positions, cell, graph label, node labels + numbers = [8, 1, 1] + positions = np.array([[0.0, 0.0, 0.0], + [0.07, 0.07, 0.0], + [0.07, -0.07, 0.0]], + dtype=float + ) + cell = np.identity(3, dtype=float) * 0.2 + graph_labels = np.array([[1]]) + node_labels = np.array([[0], [1], [1]]) + + # init AtomicNumber object + z_table = atomic.AtomicNumberTable.from_zs(numbers) + + # initialize configuration using all atoms + config = atomic.Configuration( + atomic_numbers=numbers, + positions=positions, + cell=cell, + pbc=[True] * 3, + node_labels=node_labels, + graph_labels=graph_labels, + ) + + # create dataset from a configuration + data = _create_dataset_from_configuration(config, z_table, 0.1) + + # check edges and shifts are created correctly + assert(data['edge_index'] == torch.tensor([[0, 0, 1, 1, 2, 2], + [2, 1, 0, 2, 1, 0]]) + ).all() + + assert(data['shifts'] == torch.tensor([[0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 0.2, 0.0], + [0.0, -0.2, 0.0], + [0.0, 0.0, 0.0]]) + ).all() + + assert(data['unit_shifts'] == torch.tensor([[0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 1.0, 0.0], + [0.0, -1.0, 0.0], + [0.0, 0.0, 0.0]]) + ).all() + + # check correct storage + assert(data['positions'] == torch.tensor([[0.0, 0.0, 0.0], + [0.07, 0.07, 0.0], + [0.07, -0.07, 0.0]]) + ).all() + + assert(data['cell'] == torch.tensor([[0.2, 0.0, 0.0], + [0.0, 0.2, 0.0], + [0.0, 0.0, 0.2]]) + ).all() + + assert(data['node_attrs'] == torch.tensor([[0.0, 1.0], + [1.0, 0.0], + [1.0, 0.0]]) + ).all() + + assert(data['node_labels'] == torch.tensor([[0.0], + [1.0], + [1.0]]) + ).all() + + assert(data['graph_labels'] == torch.tensor([[1.0]])).all() + assert(data['weight'] == 1.0) + + # initialize configuration using two atoms (1 system, 1 env) as a subset + config = atomic.Configuration( + atomic_numbers=numbers, + positions=positions, + cell=cell, + pbc=[True] * 3, + node_labels=node_labels, + graph_labels=graph_labels, + system=[1], + environment=[2] + ) + + data = _create_dataset_from_configuration(config, z_table, 0.1) + + # check edges and shift are computed correctly + assert(data['edge_index'] == torch.tensor([[1, 2], + [2, 1]]) + ).all() + assert (data['shifts'] == torch.tensor([[0.0, 0.2, 0.0], + [0.0, -0.2, 0.0]]) + ).all() + assert(data['unit_shifts'] == torch.tensor([[0.0, 1.0, 0.0], + [0.0, -1.0, 0.0]]) + ).all() + + # initialize configuration using three atoms (1 system, 2 env) as a subset and no buffer + config = atomic.Configuration( + atomic_numbers=numbers, + positions=positions, + cell=cell, + pbc=[True] * 3, + node_labels=node_labels, + graph_labels=graph_labels, + system=[0], + environment=[1, 2] + ) + + data = _create_dataset_from_configuration(config, z_table, 0.1) + assert(data['edge_index'] == torch.tensor([[0, 0, 1, 1, 2, 2], + [2, 1, 0, 2, 1, 0]]) + ).all() + + + # check if pbc and cutoffs works. 
now the third atoms is too far + positions = np.array([[0.0, 0.0, 0.0], + [0.07, 0.07, 0.0], + [0.07, -0.08, 0.0]], + dtype=float + ) + + config = atomic.Configuration( + atomic_numbers=numbers, + positions=positions, + cell=cell, + pbc=[True] * 3, + node_labels=node_labels, + graph_labels=graph_labels, + system=[0], + environment=[1, 2] + ) + # create dataset with same cutoff + data = _create_dataset_from_configuration(config, z_table, 0.1) + + # check third atom is not included anymore + assert (data['edge_index'] == torch.tensor([[0, 1], + [1, 0]]) + ).all() + + # create dataset with slightly large cutoff + data = _create_dataset_from_configuration(config, z_table, 0.11) + + # check the edge with the third atom is created once again + assert(data['edge_index'] == torch.tensor([[0, 0, 1, 1, 2, 2], + [2, 1, 0, 2, 1, 0]]) + ).all() + + # check with buffer layer + # the third atoms should be included but with no edge to the system atom + data = _create_dataset_from_configuration(config, z_table, 0.1, 0.01) + assert(data['edge_index'] == torch.tensor([[0, 1, 1, 2], + [1, 0, 2, 1]]) + ).all() + assert(data['shifts'] == torch.tensor([[0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 0.2, 0.0], + [0.0, -0.2, 0.0]]) + ).all() + assert(data['unit_shifts'] == torch.tensor([[0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 1.0, 0.0], + [0.0, -1.0, 0.0]]) + ).all() + + # create a list of configurations + config = [atomic.Configuration( + atomic_numbers=numbers, + positions=positions, + cell=cell, + pbc=[True] * 3, + node_labels=node_labels, + graph_labels=np.array([[i]]), + ) for i in range(0, 10)] + + # create dataset from list of configurations + dataset = create_dataset_from_configurations(config, + z_table, + 0.1, + show_progress=False) + + # check if the labels of the entries are created correctly + assert dataset.metadata['z_table'] == [1, 8] + assert (dataset[0]['data_list']['graph_labels'] == torch.tensor([[0.0]])).all() + assert (dataset[2]['data_list']['graph_labels'] == torch.tensor([[2.0]])).all() + assert (dataset[4]['data_list']['graph_labels'] == torch.tensor([[4.0]])).all() + + # dataset_1 = dataset[np.array([0, -1])] + assert dataset.metadata['z_table'] == [1, 8] + assert (dataset[ 0]['data_list']['graph_labels'] == torch.tensor([[0.0]])).all() + assert (dataset[-1]['data_list']['graph_labels'] == torch.tensor([[9.0]])).all() + + + +def test_from_configurations() -> None: + # fake atomic numbers, positions, cell, graph label, node labels + numbers = [8, 1, 1] + positions = np.array([[0.0, 0.0, 0.0], + [0.07, 0.07, 0.0], + [0.07, -0.07, 0.0]], + dtype=float + ) + cell = np.identity(3, dtype=float) * 0.2 + graph_labels = np.array([[1]]) + node_labels = np.array([[0], [1], [1]]) + + # init AtomicNumber object + z_table = atomic.AtomicNumberTable.from_zs(numbers) + + # initialize configuration using all atoms + config = atomic.Configuration( + atomic_numbers=numbers, + positions=positions, + cell=cell, + pbc=[True] * 3, + node_labels=node_labels, + graph_labels=graph_labels, + ) + + # create dataset from a configuration, even if single is the multiple function + dataset = create_dataset_from_configurations([config], + z_table, + 0.1, + remove_isolated_nodes=True, + show_progress=False + )[0] + + # take data entry from the DictDataset + data = dataset['data_list'] + + # check edges and shifts are created correctly + assert(data['edge_index'] == torch.tensor([[0, 0, 1, 1, 2, 2], + [2, 1, 0, 2, 1, 0]]) + ).all() + assert(data['shifts'] == torch.tensor([[0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 0.0, 
0.0], + [0.0, 0.2, 0.0], + [0.0, -0.2, 0.0], + [0.0, 0.0, 0.0]]) + ).all() + + assert(data['unit_shifts'] == torch.tensor([[0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 1.0, 0.0], + [0.0, -1.0, 0.0], + [0.0, 0.0, 0.0]]) + ).all() + + # check correct storage + assert(data['positions'] == torch.tensor([[0.0, 0.0, 0.0], + [0.07, 0.07, 0.0], + [0.07, -0.07, 0.0]]) + ).all() + + assert(data['cell'] == torch.tensor([[0.2, 0.0, 0.0], + [0.0, 0.2, 0.0], + [0.0, 0.0, 0.2]]) + ).all() + + assert(data['node_attrs'] == torch.tensor([[0.0, 1.0], + [1.0, 0.0], + [1.0, 0.0]]) + ).all() + assert(data['node_labels'] == torch.tensor([[0.0], + [1.0], + [1.0]]) + ).all() + assert(data['graph_labels'] == torch.tensor([[1.0]])).all() + assert(data['weight'] == 1.0) + + # initialize configuration using three atoms (1 system, 2 env) as a subset and no buffer + config = atomic.Configuration( + atomic_numbers=numbers, + positions=positions, + cell=cell, + pbc=[True] * 3, + node_labels=node_labels, + graph_labels=graph_labels, + system=[1], + environment=[2] + ) + dataset = create_dataset_from_configurations([config], + z_table, + 0.1, + remove_isolated_nodes=True, + show_progress=False + )[0] + + # take data entry from the DictDataset + data = dataset['data_list'] + + assert(data['positions'] == torch.tensor([[0.07, 0.07, 0.0], + [0.07, -0.07, 0.0]]) + ).all() + assert(data['cell'] == torch.tensor([[0.2, 0.0, 0.0], + [0.0, 0.2, 0.0], + [0.0, 0.0, 0.2]]) + ).all() + assert(data['node_attrs'] == torch.tensor([[1.0, 0.0], + [1.0, 0.0]]) + ).all() + assert(data['edge_index'] == torch.tensor([[0, 1], + [1, 0]]) + ).all() + assert(data['shifts'] == torch.tensor([[0.0, 0.2, 0.0], + [0.0, -0.2, 0.0]]) + ).all() + assert(data['unit_shifts'] == torch.tensor([[0.0, 1.0, 0.0], + [0.0, -1.0, 0.0]]) + ).all() + +if __name__ == '__main__': + test_to_one_hot() + test_from_configuration() + test_from_configurations() \ No newline at end of file diff --git a/mlcolvar/data/utils.py b/mlcolvar/data/utils.py new file mode 100644 index 00000000..46deee84 --- /dev/null +++ b/mlcolvar/data/utils.py @@ -0,0 +1,151 @@ +import torch +import numpy as np + +from mlcolvar.data import DictDataset +from mlcolvar.data.graph.atomic import AtomicNumberTable + +__all__ = ["save_dataset", "load_dataset", "save_dataset_configurations_as_extyz"] + +def save_dataset(dataset: DictDataset, file_name: str) -> None: + """Save a dataset to disk. + + Parameters + ---------- + dataset: DictDataset + Dataset to be saved + file_name: str + Name of the file to save to + """ + assert isinstance(dataset, DictDataset) + + torch.save(dataset, file_name) + + +def load_dataset(file_name: str) -> DictDataset: + """Load a dataset from disk. + + Parameters + ---------- + file_name: str + Name of the file to load the dataset from + """ + dataset = torch.load(file_name) + + assert isinstance(dataset, DictDataset) + + return dataset + + +def save_dataset_configurations_as_extyz(dataset: DictDataset, file_name: str) -> None: + """Save a dataset to disk in the extxyz format. 
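+
+    Each frame is written with a `Lattice="..." Properties=species:S:1:pos:R:3 pbc="T T T"`
+    header followed by one `symbol x y z` line per atom. A typical call (the file name is
+    illustrative) would be:
+
+        save_dataset_configurations_as_extyz(graph_dataset, "configurations.xyz")
+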
+ + Parameters + ---------- + dataset: DictDataset + Dataset to be saved with data_type graphs + file_name: str + Name of the file to save to + """ + # check the dataset type is 'graphs' + if not dataset.metadata["data_type"] == "graphs": + raise( + ValueError("Can only save to extxyz dataset with data_type='graphs'!") + ) + + # initialize the atomic number object + z_table = AtomicNumberTable.from_zs(dataset.metadata["z_table"]) + + # create file + fp = open(file_name, 'w') + + for i in range(len(dataset)): + d = dataset[i]['data_list'] + + # print number of atoms + print(len(d['positions']), file=fp) + + # header line for configuration d + # Lattice, properties, pbc + line = ( + 'Lattice="{:s}" '.format((r'{:.5f} ' * 9).strip()) + + 'Properties=species:S:1:pos:R:3 pbc="T T T"' + ) + + # cell info + cell = [c.item() for c in d['cell'].flatten()] + print(line.format(*cell), file=fp) + + # write atoms positions + for j in range(0, len(d['positions'])): + # chemical symbol + s = z_table.index_to_symbol(np.where(d['node_attrs'][j])[0][0]) + print('{:2s}'.format(s), file=fp, end=' ') + + # positions + positions = [p.item() for p in d['positions'][j]] + print('{:10.5f} {:10.5f} {:10.5f}'.format(*positions), file=fp) + fp.close() + + + + +import tempfile + +def test_save_dataset(): + # check using descriptors dataset + dataset_dict = { + "data": torch.Tensor([[1.0], [2.0], [0.3], [0.4]]), + "labels": [0, 0, 1, 1], + "weights": np.asarray([0.5, 1.5, 1.5, 0.5]), + } + dataset = DictDataset(dataset_dict) + + # save to temporary working directory + with tempfile.TemporaryDirectory() as tmpdir: + save_dataset(dataset=dataset, file_name=f'{tmpdir}/saved_dataset') + + # load and check it's ok + loaded = load_dataset(file_name=f'{tmpdir}/saved_dataset') + assert(torch.allclose(dataset['data'], loaded['data'])) + + # check using graph dataset + from mlcolvar.data.graph.atomic import AtomicNumberTable, Configuration + from mlcolvar.data.graph.utils import create_dataset_from_configurations + numbers = [8, 1, 1] + positions = np.array( + [[0.0, 0.0, 0.0], [0.07, 0.07, 0.0], [0.07, -0.07, 0.0]], + dtype=float + ) + cell = np.identity(3, dtype=float) * 0.2 + graph_labels = np.array([[1]]) + node_labels = np.array([[0], [1], [1]]) + z_table = AtomicNumberTable.from_zs(numbers) + + config = [Configuration( + atomic_numbers=numbers, + positions=positions, + cell=cell, + pbc=[True] * 3, + node_labels=node_labels, + graph_labels=graph_labels, + )] + dataset = create_dataset_from_configurations( + config, z_table, 0.1, show_progress=False + ) + + # save dataset + with tempfile.TemporaryDirectory() as tmpdir: + save_dataset(dataset=dataset, file_name=f'{tmpdir}/saved_dataset') + + # load and check it's ok + loaded = load_dataset(file_name=f'{tmpdir}/saved_dataset') + assert(torch.allclose(dataset['data_list'][0]['positions'], loaded['data_list'][0]['positions'])) + + # save to extxyz + with tempfile.TemporaryDirectory() as tmpdir: + save_dataset_configurations_as_extyz(dataset=dataset, file_name=f'{tmpdir}/saved_dataset') + +if __name__ == "__main__": + test_save_dataset() + + diff --git a/mlcolvar/explain/__init__.py b/mlcolvar/explain/__init__.py index 7cfef572..40afa6c4 100644 --- a/mlcolvar/explain/__init__.py +++ b/mlcolvar/explain/__init__.py @@ -1,7 +1,9 @@ __all__ = [ "sensitivity_analysis", "plot_sensitivity", + "graph_node_sensitivity" ] from .sensitivity import * +from .graph_sensitivity import * # from .lasso import * # lasso requires additional dependencies diff --git 
a/mlcolvar/explain/graph_sensitivity.py b/mlcolvar/explain/graph_sensitivity.py new file mode 100644 index 00000000..c6934c22 --- /dev/null +++ b/mlcolvar/explain/graph_sensitivity.py @@ -0,0 +1,277 @@ +import numpy as np +from typing import Dict +import torch + +from mlcolvar.data import DictModule +from mlcolvar.utils.plot import pbar +from mlcolvar.core.nn import BaseGNN + + +__all__ = ['graph_node_sensitivity'] + + +def graph_node_sensitivity( + model, + dataset, + component: int = 0, + device: str = 'cpu', + batch_size: int = None, + show_progress: bool = True +) -> Dict[str, np.ndarray]: + """Performs a sensitivity analysis on a GNN-based CV model using + partial derivatives w.r.t. nodes' positions. + This allows us to measure which atom is most important to the CV model. + + Parameters + ---------- + model: mlcolvar.cvs.BaseCV + Collective variable model based on GNN + dataset: mlcovar.data.DictDataset + Graph-based dataset on which to compute the sensitivity analysis + device: str + Name of the device on which to perform the computation + batch_size: + Batch size used for evaluating the CV + show_progress: bool + If to show the progress bar + + Returns + ------- + results: dictionary + Results of the sensitivity analysis, containing 'node_indices', + 'sensitivities', and 'sensitivities_components', ordered according to + the node indices. + + See also + -------- + mlcolvar.utils.explain.sensitivity_analysis + Perform the sensitivity analysis of a feedforward model. + """ + # check model is GNN-based + if not isinstance(model.nn, BaseGNN): + raise ValueError ( + "The CV model is not based on GNN! Maybe you should use the feedforward sensitivity_analysis from mlcolvar.utils.explain.sensitivity!" + ) + + model = model.to(device) + + gradients = get_dataset_cv_gradients( + model=model, + dataset=dataset, + component=component, + batch_size=batch_size, + show_progress=show_progress, + progress_prefix='Getting gradients' + ) + sensitivities_components = np.linalg.norm(gradients, axis=-1) + + results = {} + results['atoms_list'] = np.array(dataset.metadata['used_names']) + results['node_labels'] = [str(a) for a in results['atoms_list']] + # results['node_labels_components'] = np.array([np.array(dataset.metadata['used_names'])[dataset[i]['data_list']['names_idx']] for i in range(len(dataset))]) + results['sensitivities'] = sensitivities_components.mean(axis=0) + results['sensitivities_components'] = sensitivities_components + + return results + +def get_dataset_cv_values( + model, + dataset, + batch_size: int = None, + show_progress: bool = True, + progress_prefix: str = 'Calculating CV values' +) -> np.ndarray: + """Gets the values of a CV model on a given dataset. + The calculation will run on the device where the model is on. 
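+
+    A minimal sketch (model and dataset are illustrative; batch_size=0 evaluates the
+    whole graph dataset as a single batch, as in the tests below):
+
+        cv_values = get_dataset_cv_values(model=model, dataset=dataset, batch_size=0)
+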
+ + Parameters + ---------- + model: mlcolvar.cvs.BaseCV + Collective variable model + dataset: mlcovar.data.DictDataset + Dataset on which to compute the sensitivity analysis + batch_size: + Batch size used for evaluating the CV + show_progress: bool + If to show the progress bar + """ + datamodule = DictModule( + dataset=dataset, + lengths=(1.0,), + batch_size=batch_size, + random_split=False, + shuffle=False + ) + datamodule.setup() + + cv_values = [] + device = next(model.parameters()).device + + if show_progress: + items = pbar( + datamodule.train_dataloader(), + frequency=0.001, + prefix=progress_prefix + ) + else: + items = datamodule.train_dataloader() + + with torch.no_grad(): + for batchs in items: + outputs = model(batchs['data_list'].to(device).to_dict()) + outputs = outputs.cpu().numpy() + cv_values.append(outputs) + + return np.concatenate(cv_values) + + +def get_dataset_cv_gradients( + model, + dataset, + component: int = 0, + batch_size: int = None, + show_progress: bool = True, + progress_prefix: str = 'Calculating CV gradients' +) -> np.ndarray: + """Get gradients of a GNN-based CV w.r.t. node positions in a given dataset. + The calculation will run on the device where the model is on. + + Parameters + ---------- + model: mlcolvar.cvs.BaseCV + Collective variable model based on GNN + dataset: mlcovar.data.DictDataset + Graph-based dataset on which to compute the sensitivity analysis + component: int + Component of the CV to analyse + batch_size: + Batch size used for evaluating the CV + show_progress: bool + If to show the progress bar + """ + datamodule = DictModule( + dataset=dataset, + lengths=(1.0,), + batch_size=batch_size, + random_split=False, + shuffle=False + ) + datamodule.setup() + + cv_value_gradients = [] + device = next(model.parameters()).device + + if show_progress: + items = pbar( + datamodule.train_dataloader(), + frequency=0.001, + prefix=progress_prefix + ) + else: + items = datamodule.train_dataloader() + + for batchs in items: + batch_dict = batchs['data_list'].to(device) + batch_dict['positions'].requires_grad_(True) + cv_values = model(batch_dict) + cv_values = cv_values[:, component] + grad_outputs = [torch.ones_like(cv_values, device=device)] + gradients = torch.autograd.grad( + outputs=[cv_values], + inputs=[batch_dict['positions']], + grad_outputs=grad_outputs, + retain_graph=False, + create_graph=False, + ) + graph_sizes = batch_dict['ptr'][1:] - batch_dict['ptr'][:-1] + + # if we used the removed isolated atoms this will give an inhomogenous tensor! 
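+        # To keep the stacked output rectangular, each per-graph gradient is
+        # scattered below into a zero tensor of shape (len(metadata['used_idx']), 3)
+        # using the stored 'names_idx', so that every row lines up with
+        # dataset.metadata['used_names'] regardless of which atoms were removed.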
+ gradients = torch.split( + gradients[0].detach(), graph_sizes.cpu().numpy().tolist() + ) + + # here we ensure that all the gradients have the correct shape + # and that each entry is at the correct index accordingly + max_used_atoms = len(dataset.metadata['used_idx']) + for i,g in enumerate(gradients): + aux = torch.zeros((max_used_atoms, 3)) + # this populates the right entries according to the orignal indexing + aux[batch_dict[i]['names_idx'], :] = g + cv_value_gradients.extend(aux.unsqueeze(0).cpu().numpy()) + + return np.array(cv_value_gradients) + + +def test_get_cv_values_graph(): + import lightning + from mlcolvar.cvs import DeepTDA + from mlcolvar.core.nn.graph import SchNetModel + from mlcolvar.data.graph.utils import create_test_graph_input + + # create data, we need the dataset for sensitivity analysis later + dataset = create_test_graph_input(output_type='dataset', n_samples=50, n_states=2, n_atoms=3) + datamodule = DictModule(dataset=dataset, lengths=[0.8, 0.2], shuffle=[1, 0]) + + # create model + gnn_model = SchNetModel(n_out=1, cutoff=0.1, atomic_numbers=[8, 1]) + model = DeepTDA( + n_states=2, + n_cvs=1, + target_centers=[-5, 5], + target_sigmas=[0.2, 0.2], + model=gnn_model + ) + + # train model + trainer = lightning.Trainer( + accelerator="cpu", max_epochs=2, logger=False, enable_checkpointing=False, enable_model_summary=False + ) + trainer.fit(model, datamodule) + + # do analysis + cv_values = get_dataset_cv_values(model=model, dataset=dataset, batch_size=0) + + # print results + print(cv_values) + + assert (torch.allclose(model(dataset.get_graph_inputs()), torch.Tensor(cv_values))) + + + +def test_graph_sensitivity(): + import lightning + from mlcolvar.cvs import DeepTDA + from mlcolvar.core.nn.graph import SchNetModel + from mlcolvar.data.graph.utils import create_test_graph_input + + # create data, we need the dataset for sensitivity analysis later + dataset = create_test_graph_input(output_type='dataset', n_samples=100, n_states=2, n_atoms=3) + datamodule = DictModule(dataset=dataset, lengths=[0.8, 0.2], shuffle=[1, 0]) + + # create model + gnn_model = SchNetModel(n_out=1, cutoff=0.1, atomic_numbers=[8, 1]) + model = DeepTDA( + n_states=2, + n_cvs=1, + target_centers=[-5, 5], + target_sigmas=[0.2, 0.2], + model=gnn_model + ) + + # train model + trainer = lightning.Trainer( + accelerator="cpu", max_epochs=2, logger=False, enable_checkpointing=False, enable_model_summary=False + ) + trainer.fit(model, datamodule) + + # do analysis + test_sensitivity = graph_node_sensitivity(model=model, + dataset=dataset, + batch_size=0) + + # print results + print(test_sensitivity) + +if __name__ == '__main__': + test_graph_sensitivity() + test_get_cv_values_graph() \ No newline at end of file diff --git a/mlcolvar/explain/lasso.py b/mlcolvar/explain/lasso.py index 55d90248..ca44f837 100644 --- a/mlcolvar/explain/lasso.py +++ b/mlcolvar/explain/lasso.py @@ -3,7 +3,6 @@ import matplotlib import matplotlib.pyplot as plt -import mlcolvar.utils.plot try: import sklearn diff --git a/mlcolvar/explain/sensitivity.py b/mlcolvar/explain/sensitivity.py index a553a4ba..f6f37451 100644 --- a/mlcolvar/explain/sensitivity.py +++ b/mlcolvar/explain/sensitivity.py @@ -2,7 +2,6 @@ import torch from matplotlib import patches as mpatches import matplotlib.pyplot as plt -import mlcolvar.utils.plot __all__ = [ "sensitivity_analysis", "plot_sensitivity" ] diff --git a/mlcolvar/tests/data/Cu.xyz b/mlcolvar/tests/data/Cu.xyz new file mode 100644 index 00000000..111af6a3 --- /dev/null +++ 
b/mlcolvar/tests/data/Cu.xyz @@ -0,0 +1,54 @@ +16 +Lattice="7.15486134 0.0 0.0 0.0 3.57743067 0.0 0.0 0.0 7.15486134" Properties=species:S:1:pos:R:3 +Cu 0.0 0.0 0.0 +Cu 0.0 1.78871534 1.78871534 +Cu 1.78871534 0.0 1.78871534 +Cu 1.78871534 1.78871534 0.0 +Cu 0.0 0.0 3.57743067 +Cu 0.0 1.78871534 5.36614601 +Cu 1.78871534 0.0 5.36614601 +Cu 1.78871534 1.78871534 3.57743067 +Cu 3.57743067 0.0 0.0 +Cu 3.57743067 1.78871534 1.78871534 +Cu 5.36614601 0.0 1.78871534 +Cu 5.36614601 1.78871534 0.0 +Cu 3.57743067 0.0 3.57743067 +Cu 3.57743067 1.78871534 5.36614601 +Cu 5.36614601 0.0 5.36614601 +Cu 5.36614601 1.78871534 3.57743067 +16 +Lattice="7.15486134 0.0 0.0 0.0 3.57743067 0.0 0.0 0.0 7.15486134" Properties=species:S:1:pos:R:3 +Cu 0.0 0.0 0.0 +Cu 0.0 1.78871534 1.78871534 +Cu 1.78871534 0.0 1.78871534 +Cu 1.78871534 1.78871534 0.0 +Cu 0.0 0.0 3.57743067 +Cu 0.0 1.78871534 5.36614601 +Cu 1.78871534 0.0 5.36614601 +Cu 1.78871534 1.78871534 3.57743067 +Cu 3.57743067 0.0 0.0 +Cu 3.57743067 1.78871534 1.78871534 +Cu 5.36614601 0.0 1.78871534 +Cu 5.36614601 1.78871534 0.0 +Cu 3.57743067 0.0 3.57743067 +Cu 3.57743067 1.78871534 5.36614601 +Cu 5.36614601 0.0 5.36614601 +Cu 5.36614601 1.78871534 3.57743067 +16 +Lattice="7.15486134 0.0 0.0 0.0 3.57743067 0.0 0.0 0.0 7.15486134" Properties=species:S:1:pos:R:3 +Cu 0.0 0.0 0.0 +Cu 0.0 1.78871534 1.78871534 +Cu 1.78871534 0.0 1.78871534 +Cu 1.78871534 1.78871534 0.0 +Cu 0.0 0.0 3.57743067 +Cu 0.0 1.78871534 5.36614601 +Cu 1.78871534 0.0 5.36614601 +Cu 1.78871534 1.78871534 3.57743067 +Cu 3.57743067 0.0 0.0 +Cu 3.57743067 1.78871534 1.78871534 +Cu 5.36614601 0.0 1.78871534 +Cu 5.36614601 1.78871534 0.0 +Cu 3.57743067 0.0 3.57743067 +Cu 3.57743067 1.78871534 5.36614601 +Cu 5.36614601 0.0 5.36614601 +Cu 5.36614601 1.78871534 3.57743067 \ No newline at end of file diff --git a/mlcolvar/tests/data/Cu_top.pdb b/mlcolvar/tests/data/Cu_top.pdb new file mode 100644 index 00000000..9e7a14a1 --- /dev/null +++ b/mlcolvar/tests/data/Cu_top.pdb @@ -0,0 +1,19 @@ +CRYST1 7.155 3.577 7.155 90.00 90.00 90.00 P 1 +MODEL 1 +ATOM 1 Cu MOL 1 0.000 0.000 0.000 1.00 0.00 CU +ATOM 2 Cu MOL 1 0.000 1.789 1.789 1.00 0.00 CU +ATOM 3 Cu MOL 1 1.789 0.000 1.789 1.00 0.00 CU +ATOM 4 Cu MOL 1 1.789 1.789 0.000 1.00 0.00 CU +ATOM 5 Cu MOL 1 0.000 0.000 3.577 1.00 0.00 CU +ATOM 6 Cu MOL 1 0.000 1.789 5.366 1.00 0.00 CU +ATOM 7 Cu MOL 1 1.789 0.000 5.366 1.00 0.00 CU +ATOM 8 Cu MOL 1 1.789 1.789 3.577 1.00 0.00 CU +ATOM 9 Cu MOL 1 3.577 0.000 0.000 1.00 0.00 CU +ATOM 10 Cu MOL 1 3.577 1.789 1.789 1.00 0.00 CU +ATOM 11 Cu MOL 1 5.366 0.000 1.789 1.00 0.00 CU +ATOM 12 Cu MOL 1 5.366 1.789 0.000 1.00 0.00 CU +ATOM 13 Cu MOL 1 3.577 0.000 3.577 1.00 0.00 CU +ATOM 14 Cu MOL 1 3.577 1.789 5.366 1.00 0.00 CU +ATOM 15 Cu MOL 1 5.366 0.000 5.366 1.00 0.00 CU +ATOM 16 Cu MOL 1 5.366 1.789 3.577 1.00 0.00 CU +ENDMDL diff --git a/mlcolvar/tests/data/p.dcd b/mlcolvar/tests/data/p.dcd new file mode 100644 index 00000000..9ac637e3 Binary files /dev/null and b/mlcolvar/tests/data/p.dcd differ diff --git a/mlcolvar/tests/data/p.pdb b/mlcolvar/tests/data/p.pdb new file mode 100644 index 00000000..9928ce81 --- /dev/null +++ b/mlcolvar/tests/data/p.pdb @@ -0,0 +1,21 @@ +CRYST1 100.000 100.000 100.000 90.00 90.00 90.00 P 1 1 +ATOM 1 C UNL X 1 -2.477 -2.092 -0.388 1.00 0.00 C +ATOM 2 C UNL X 1 -3.520 -1.057 -0.204 1.00 0.00 C +ATOM 3 C UNL X 1 -2.495 -2.603 1.038 1.00 0.00 C +ATOM 4 H UNL X 1 -2.456 -1.819 1.767 1.00 0.00 H +ATOM 5 H UNL X 1 -3.448 -3.157 1.109 1.00 0.00 H +ATOM 6 H UNL X 1 -1.674 -3.291 1.255 
1.00 0.00 H +ATOM 7 C UNL X 1 -2.786 -3.294 -1.306 1.00 0.00 C +ATOM 8 H UNL X 1 -2.634 -2.975 -2.351 1.00 0.00 H +ATOM 9 H UNL X 1 -2.089 -4.135 -1.164 1.00 0.00 H +ATOM 10 H UNL X 1 -3.798 -3.583 -1.079 1.00 0.00 H +ATOM 11 C UNL X 1 -1.151 -1.437 -0.717 1.00 0.00 C +ATOM 12 H UNL X 1 -0.988 -0.621 0.002 1.00 0.00 H +ATOM 13 H UNL X 1 -0.343 -2.138 -0.507 1.00 0.00 H +ATOM 14 H UNL X 1 -1.088 -1.151 -1.779 1.00 0.00 H +ATOM 15 C UNL X 1 -4.964 -1.269 -0.340 1.00 0.00 C +ATOM 16 H UNL X 1 -5.253 -1.213 -1.409 1.00 0.00 H +ATOM 17 H UNL X 1 -5.103 -2.218 0.074 1.00 0.00 H +ATOM 18 H UNL X 1 -5.607 -0.523 0.107 1.00 0.00 H +ATOM 19 F UNL X 1 -3.038 0.027 0.136 1.00 0.00 F +END diff --git a/mlcolvar/tests/data/r.dcd b/mlcolvar/tests/data/r.dcd new file mode 100644 index 00000000..c302acd2 Binary files /dev/null and b/mlcolvar/tests/data/r.dcd differ diff --git a/mlcolvar/tests/data/r.pdb b/mlcolvar/tests/data/r.pdb new file mode 100644 index 00000000..75656832 --- /dev/null +++ b/mlcolvar/tests/data/r.pdb @@ -0,0 +1,21 @@ +CRYST1 100.000 100.000 100.000 90.00 90.00 90.00 P 1 1 +ATOM 1 C UNL X 1 -2.394 -1.013 0.390 1.00 0.00 C +ATOM 2 C UNL X 1 -2.588 -1.774 -0.881 1.00 0.00 C +ATOM 3 C UNL X 1 -2.555 -1.607 1.679 1.00 0.00 C +ATOM 4 H UNL X 1 -2.178 -2.584 1.842 1.00 0.00 H +ATOM 5 H UNL X 1 -2.075 -0.998 2.471 1.00 0.00 H +ATOM 6 H UNL X 1 -3.633 -1.736 1.862 1.00 0.00 H +ATOM 7 C UNL X 1 -1.780 0.373 0.296 1.00 0.00 C +ATOM 8 H UNL X 1 -0.759 0.225 0.823 1.00 0.00 H +ATOM 9 H UNL X 1 -1.819 0.834 -0.686 1.00 0.00 H +ATOM 10 H UNL X 1 -2.358 0.965 1.025 1.00 0.00 H +ATOM 11 C UNL X 1 -1.348 -1.777 -1.672 1.00 0.00 C +ATOM 12 H UNL X 1 -1.110 -0.748 -2.010 1.00 0.00 H +ATOM 13 H UNL X 1 -0.424 -2.178 -1.122 1.00 0.00 H +ATOM 14 H UNL X 1 -1.446 -2.390 -2.576 1.00 0.00 H +ATOM 15 C UNL X 1 -3.773 -1.199 -1.601 1.00 0.00 C +ATOM 16 H UNL X 1 -3.816 -1.599 -2.581 1.00 0.00 H +ATOM 17 H UNL X 1 -4.663 -1.441 -1.003 1.00 0.00 H +ATOM 18 H UNL X 1 -3.757 -0.102 -1.693 1.00 0.00 H +ATOM 19 F UNL X 1 -2.878 -3.051 -0.412 1.00 0.00 F +END diff --git a/mlcolvar/tests/test_core_nn_graph.py b/mlcolvar/tests/test_core_nn_graph.py new file mode 100644 index 00000000..821acc8d --- /dev/null +++ b/mlcolvar/tests/test_core_nn_graph.py @@ -0,0 +1,9 @@ +from mlcolvar.core.nn.graph.gnn import test_get_edge_vectors_and_lengths +from mlcolvar.core.nn.graph.radial import test_bessel_basis, test_gaussian_basis, test_polynomial_cutoff, test_radial_embedding_block + +if __name__ == "__main__": + test_get_edge_vectors_and_lengths() + test_bessel_basis() + test_gaussian_basis() + test_polynomial_cutoff() + test_radial_embedding_block() \ No newline at end of file diff --git a/mlcolvar/tests/test_core_nn_graph_gvp.py b/mlcolvar/tests/test_core_nn_graph_gvp.py new file mode 100644 index 00000000..dcbc8176 --- /dev/null +++ b/mlcolvar/tests/test_core_nn_graph_gvp.py @@ -0,0 +1,4 @@ +from mlcolvar.core.nn.graph.gvp import test_gvp + +if __name__ == "__main__": + test_gvp() \ No newline at end of file diff --git a/mlcolvar/tests/test_core_nn_graph_schnet.py b/mlcolvar/tests/test_core_nn_graph_schnet.py new file mode 100644 index 00000000..2d83fdbf --- /dev/null +++ b/mlcolvar/tests/test_core_nn_graph_schnet.py @@ -0,0 +1,5 @@ +from mlcolvar.core.nn.graph.schnet import test_schnet_1, test_schnet_2 + +if __name__ == "__main__": + test_schnet_1() + test_schnet_2() \ No newline at end of file diff --git a/mlcolvar/tests/test_cvs.py b/mlcolvar/tests/test_cvs.py index 40d1425a..46f44d97 100644 --- a/mlcolvar/tests/test_cvs.py +++ 
b/mlcolvar/tests/test_cvs.py @@ -64,10 +64,10 @@ def dataset(): # ============================================================================= @pytest.mark.parametrize("cv_model", [ - mlcolvar.cvs.DeepLDA(layers=LAYERS, n_states=N_STATES), - mlcolvar.cvs.DeepTDA(n_states=N_STATES, n_cvs=1, target_centers=[-1., 1.], target_sigmas=[0.1, 0.1], layers=LAYERS), - mlcolvar.cvs.RegressionCV(layers=LAYERS), - mlcolvar.cvs.DeepTICA(layers=LAYERS, n_cvs=1), + mlcolvar.cvs.DeepLDA(model=LAYERS, n_states=N_STATES), + mlcolvar.cvs.DeepTDA(n_states=N_STATES, n_cvs=1, target_centers=[-1., 1.], target_sigmas=[0.1, 0.1], model=LAYERS), + mlcolvar.cvs.RegressionCV(model=LAYERS), + mlcolvar.cvs.DeepTICA(model=LAYERS, n_cvs=1), mlcolvar.cvs.AutoEncoderCV(encoder_layers=LAYERS), mlcolvar.cvs.VariationalAutoEncoderCV(n_cvs=1, encoder_layers=LAYERS[:-1]), ]) @@ -113,7 +113,7 @@ def test_lr_scheduler(): initial_lr = 1e-3 options = {'optimizer' : {'lr' : initial_lr}, 'lr_scheduler' : { 'scheduler' : lr_scheduler, 'gamma' : 0.9999}} - model = mlcolvar.cvs.RegressionCV(layers=[2,5,1], options=options) + model = mlcolvar.cvs.RegressionCV(model=[2,5,1], options=options) # check training and lr scheduling trainer = lightning.Trainer(max_epochs=10, diff --git a/mlcolvar/tests/test_cvs_committor.py b/mlcolvar/tests/test_cvs_committor.py index ac941394..aa658601 100644 --- a/mlcolvar/tests/test_cvs_committor.py +++ b/mlcolvar/tests/test_cvs_committor.py @@ -1,5 +1,10 @@ -from mlcolvar.cvs.committor.committor import test_committor, test_committor_with_derivatives +from mlcolvar.cvs.committor.committor import test_committor_1, test_committor_2 , test_committor_with_derivatives +from mlcolvar.cvs.committor.utils import test_compute_committor_weights, test_Kolmogorov_bias + if __name__ == "__main__": - test_committor() - test_committor_with_derivatives() \ No newline at end of file + test_committor_1() + test_committor_2() + test_committor_with_derivatives() + test_Kolmogorov_bias() + test_compute_committor_weights() \ No newline at end of file diff --git a/mlcolvar/tests/test_cvs_multitask_multitask.py b/mlcolvar/tests/test_cvs_multitask_multitask.py index 67a23087..75106305 100644 --- a/mlcolvar/tests/test_cvs_multitask_multitask.py +++ b/mlcolvar/tests/test_cvs_multitask_multitask.py @@ -21,6 +21,7 @@ import lightning import torch +from mlcolvar.core.nn import FeedForward from mlcolvar.core.loss import TDALoss, FisherDiscriminantLoss, AutocorrelationLoss from mlcolvar.cvs.cv import BaseCV from mlcolvar.cvs.multitask.multitask import MultiTaskCV @@ -62,11 +63,13 @@ def forward(self, data, data_lag=None, **kwargs): class MockCV(BaseCV, lightning.LightningModule): """Mock CV for mock testing.""" - BLOCKS = [] + DEFAULT_BLOCKS = [] + MODEL_BLOCKS = [] def __init__(self, in_features=N_DESCRIPTORS, out_features=N_CVS): """Constructor.""" - super().__init__(in_features=in_features, out_features=out_features) + model = FeedForward(layers=[in_features, in_features]) + super().__init__(model=model) self.loss_fn = MockAuxLoss(in_features, out_features) def training_step(self, train_batch, batch_idx): @@ -129,7 +132,7 @@ def create_cv(cv_name, n_descriptors=N_DESCRIPTORS, n_cvs=N_CVS): n_cvs=n_cvs, encoder_layers=[n_descriptors, 10] ) elif cv_name == "deeptica": - returned = "time-lagged", DeepTICA(layers=[n_descriptors, 10, n_cvs]) + returned = "time-lagged", DeepTICA(model=[n_descriptors, 10, n_cvs]) else: raise ValueError("Unrecognized cv_name.") diff --git a/mlcolvar/tests/test_cvs_supervised_tda.py 
b/mlcolvar/tests/test_cvs_supervised_tda.py index c1ec6bac..b3300b7f 100644 --- a/mlcolvar/tests/test_cvs_supervised_tda.py +++ b/mlcolvar/tests/test_cvs_supervised_tda.py @@ -1,4 +1,6 @@ from mlcolvar.cvs.supervised.deeptda import test_deeptda_cv +from mlcolvar.core.loss.tda_loss import test_tda_loss if __name__ == "__main__": test_deeptda_cv() + test_tda_loss() diff --git a/mlcolvar/tests/test_data_graph.py b/mlcolvar/tests/test_data_graph.py new file mode 100644 index 00000000..9c18f8c7 --- /dev/null +++ b/mlcolvar/tests/test_data_graph.py @@ -0,0 +1,6 @@ +from mlcolvar.data.graph.atomic import test_atomic_number_table +from mlcolvar.data.graph.neighborhood import test_get_neighborhood + +if __name__ == '__main__': + test_atomic_number_table() + test_get_neighborhood() \ No newline at end of file diff --git a/mlcolvar/tests/test_data_graph_utils.py b/mlcolvar/tests/test_data_graph_utils.py new file mode 100644 index 00000000..f765c6a9 --- /dev/null +++ b/mlcolvar/tests/test_data_graph_utils.py @@ -0,0 +1,6 @@ +from mlcolvar.data.graph.utils import test_from_configuration, test_from_configurations, test_to_one_hot + +if __name__ == "main": + test_to_one_hot() + test_from_configuration() + test_from_configurations() \ No newline at end of file diff --git a/mlcolvar/tests/test_data_utils.py b/mlcolvar/tests/test_data_utils.py new file mode 100644 index 00000000..6ef6eb43 --- /dev/null +++ b/mlcolvar/tests/test_data_utils.py @@ -0,0 +1,4 @@ +from mlcolvar.data.utils import test_save_dataset + +if __name__=="main": + test_save_dataset() \ No newline at end of file diff --git a/mlcolvar/tests/test_explain_sensitivity.py b/mlcolvar/tests/test_explain_sensitivity.py index 0b80078d..69d4edef 100644 --- a/mlcolvar/tests/test_explain_sensitivity.py +++ b/mlcolvar/tests/test_explain_sensitivity.py @@ -1,6 +1,9 @@ import pytest from mlcolvar.explain.sensitivity import test_sensitivity_analysis +from mlcolvar.explain.graph_sensitivity import test_graph_sensitivity, test_get_cv_values_graph if __name__ == "__main__": test_sensitivity_analysis() + test_graph_sensitivity() + test_get_cv_values_graph() diff --git a/mlcolvar/tests/test_utils_io.py b/mlcolvar/tests/test_utils_io.py index 629ff095..173f49b3 100644 --- a/mlcolvar/tests/test_utils_io.py +++ b/mlcolvar/tests/test_utils_io.py @@ -2,6 +2,9 @@ import urllib from mlcolvar.utils.io import load_dataframe from mlcolvar.utils.io import test_datasetFromFile +from mlcolvar.utils.io import test_datasesetFromTrajectories +from mlcolvar.utils.io import test_create_dataset_from_trajectories +from mlcolvar.utils.io import test_dataset_from_xyz example_files = { "str": "mlcolvar/tests/data/state_A.dat", @@ -23,6 +26,66 @@ def test_loadDataframe(file_type): df = load_dataframe(filename, start=0, stop=10, stride=1) +inputs = [""" +CRYST1 2.000 2.000 2.000 90.00 90.00 90.00 P 1 1 +ATOM 1 OH2 TIP3W 1 0.000 0.000 0.000 1.00 0.00 WT1 O +ATOM 2 H1 TIP3W 1 0.700 0.700 0.000 1.00 0.00 WT1 H +ATOM 3 H2 TIP3W 1 0.700 -0.700 0.000 1.00 0.00 WT1 H +ENDMODEL +ATOM 1 OH2 TIP3W 1 0.000 0.000 0.000 1.00 0.00 WT1 O +ATOM 2 H1 TIP3W 1 0.700 0.700 0.000 1.00 0.00 WT1 H +ATOM 3 H2 TIP3W 1 0.700 -0.700 0.000 1.00 0.00 WT1 H +END +""", +""" +CRYST1 2.000 2.000 2.000 90.00 90.00 90.00 P 1 1 +ATOM 1 OH2 TIP3W 1 0.000 0.000 0.000 1.00 0.00 WT1 O +ATOM 2 H1 TIP3W 1 0.700 0.700 0.000 1.00 0.00 WT1 H +ATOM 3 H2 TIP3W 1 0.700 -0.700 0.000 1.00 0.00 WT1 H +ATOM 4 OH2 XXXXW 2 0.000 0.000 0.000 1.00 0.00 WT1 O +ATOM 5 H1 XXXXW 2 0.300 0.300 0.000 1.00 0.00 WT1 H +ATOM 6 H2 XXXXW 2 0.300 
-0.300 0.000 1.00 0.00 WT1 H +ENDMODEL +ATOM 1 OH2 TIP3W 1 0.000 0.000 0.000 1.00 0.00 WT1 O +ATOM 2 H1 TIP3W 1 0.700 0.700 0.000 1.00 0.00 WT1 H +ATOM 3 H2 TIP3W 1 0.700 -0.700 0.000 1.00 0.00 WT1 H +ATOM 4 OH2 XXXXW 2 0.000 0.000 0.000 1.00 0.00 WT1 O +ATOM 5 H1 XXXXW 2 0.300 0.300 0.000 1.00 0.00 WT1 H +ATOM 6 H2 XXXXW 2 0.300 -0.300 0.000 1.00 0.00 WT1 H +END +""", +""" +CRYST1 2.000 2.000 2.000 90.00 90.00 90.00 P 1 1 +ATOM 1 OH2 XXXXW 1 0.000 0.000 0.000 1.00 0.00 WT1 O +ATOM 2 OH2 TIP3W 2 0.000 0.000 0.000 1.00 0.00 WT1 O +ATOM 3 H1 XXXXW 1 0.300 0.300 0.000 1.00 0.00 WT1 H +ATOM 4 H1 TIP3W 2 0.700 0.700 0.000 1.00 0.00 WT1 H +ATOM 5 H2 XXXXW 1 0.300 -0.300 0.000 1.00 0.00 WT1 H +ATOM 6 H2 TIP3W 2 0.700 -0.700 0.000 1.00 0.00 WT1 H +ENDMODEL +ATOM 1 OH2 XXXXW 1 0.000 0.000 0.000 1.00 0.00 WT1 O +ATOM 2 OH2 TIP3W 2 0.000 0.000 0.000 1.00 0.00 WT1 O +ATOM 3 H1 XXXXW 1 0.300 0.300 0.000 1.00 0.00 WT1 H +ATOM 4 H1 TIP3W 2 0.700 0.700 0.000 1.00 0.00 WT1 H +ATOM 5 H2 XXXXW 1 0.300 -0.300 0.000 1.00 0.00 WT1 H +ATOM 6 H2 TIP3W 2 0.700 -0.700 0.000 1.00 0.00 WT1 H +END +""" +] + +@pytest.mark.parametrize("text,selection", + [(inputs[0], None), + (inputs[1], 'not resname XXXX'), + (inputs[2], 'not resname XXXX') + ] + ) +# @pytest.mark.parametrize("text", inputs) +def test_dataset_from_trajectories(text, selection): + print(selection) + test_create_dataset_from_trajectories(text, selection) + + if __name__ == "__main__": - # test_loadDataframe() + test_dataset_from_xyz() test_datasetFromFile() + test_datasesetFromTrajectories() \ No newline at end of file diff --git a/mlcolvar/utils/io.py b/mlcolvar/utils/io.py index 7bbc96db..608f29f5 100644 --- a/mlcolvar/utils/io.py +++ b/mlcolvar/utils/io.py @@ -9,10 +9,24 @@ import numpy as np import torch import os +import tempfile import urllib.request -from typing import Union +from typing import Union, List, Tuple +import mdtraj +from warnings import warn + +# Import ASE for xyz to pdb conversion. +try: + from ase.io import read, write + from ase import Atoms +except ImportError as e: + raise ImportError("ASE is required for xyz to pdb conversion.", e) + from mlcolvar.data import DictDataset +from mlcolvar.data.graph.atomic import AtomicNumberTable, Configuration, Configurations +from mlcolvar.data.graph.utils import create_dataset_from_configurations + __all__ = ["load_dataframe", "plumed_to_pandas", "create_dataset_from_files"] @@ -117,7 +131,11 @@ def load_dataframe( if "http" in filename: download = True url = filename - filename = "tmp_" + filename.split("/")[-1] + if delete_download: + temp = tempfile.NamedTemporaryFile() + filename = temp.name + else: + filename = "tmp_" + filename.split("/")[-1] urllib.request.urlretrieve(url, filename) # check if file is in PLUMED format @@ -137,7 +155,7 @@ def load_dataframe( # delete temporary data if necessary if download: if delete_download: - os.remove(filename) + temp.close() else: print(f"downloaded file ({url}) saved as ({filename}).") @@ -251,13 +269,371 @@ def create_dataset_from_files( dictionary = {"data": torch.Tensor(df_data.values)} if create_labels: dictionary["labels"] = torch.Tensor(df["labels"].values) - dataset = DictDataset(dictionary, feature_names=df_data.columns.values) + dataset = DictDataset(dictionary, feature_names=df_data.columns.values, data_type='descriptors') if return_dataframe: return dataset, df else: return dataset +def create_pdb_from_xyz(input_filename: str, output_filename: str) -> str: + """ + Convert the first frame of an XYZ file into a PDB file using ASE. 
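+    The species and, when available, the cell of the first frame are used,
+    and the coordinates are written in Angstrom.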
+    This PDB file can then serve as the topology for MDTraj.
+
+    Parameters
+    ----------
+    input_filename: str
+        Path to the input .xyz file.
+    output_filename: str
+        Path to the output .pdb file.
+
+    Returns
+    -------
+    str
+        The path to the generated PDB file.
+    """
+    atoms: Atoms = read(input_filename, index=0)
+
+    if (atoms.cell == 0).all():
+        warn("A topology file was generated from the xyz trajectory file but no cell information was provided!")
+    if not atoms.pbc.any():
+        warn("A topology file was generated from the xyz trajectory file but no PBC information was provided!")
+    elif not atoms.pbc.all():
+        warn(f"Partial PBC is not supported! The provided input has pbc {atoms.pbc}")
+
+    write(output_filename, atoms, format='proteindatabank')
+    return output_filename
+
+
+def create_dataset_from_trajectories(
+    trajectories: Union[List[str], str],
+    top: Union[List[str], str, None],
+    cutoff: float,
+    buffer: float = 0.0,
+    z_table: AtomicNumberTable = None,
+    load_args: list = None,
+    folder: str = None,
+    labels: list = None,
+    system_selection: str = None,
+    environment_selection: str = None,
+    return_trajectories: bool = False,
+    remove_isolated_nodes: bool = True,
+    show_progress: bool = True,
+    save_names: bool = True,
+    lengths_conversion: float = 10.0,
+) -> Union[
+    DictDataset,
+    Tuple[
+        DictDataset,
+        Union[List[List[mdtraj.Trajectory]], List[mdtraj.Trajectory]]
+    ]
+]:
+    """
+    Create a dataset from a set of trajectory files.
+
+    Parameters
+    ----------
+    trajectories: Union[List[str], str]
+        Paths to the trajectory files.
+    top: Union[List[str], str, None]
+        Path to the topology files. For .xyz files only, it can be set to None
+        or empty to automatically generate a topology file.
+    cutoff: float (units: Ang)
+        The graph cutoff radius.
+    buffer: float
+        Buffer size used in finding active environment atoms.
+    z_table: mlcolvar.data.graph.atomic.AtomicNumberTable
+        The atomic number table used to build the node attributes. If not
+        given, it will be created from the given trajectories.
+    load_args: list[dict], optional
+        List of dictionaries with loading options for each file
+        (keys: start, stop, stride), by default None
+    folder: str
+        Common path for the files to be imported. If set, filenames become
+        `folder/file_name`.
+    labels: list
+        List of labels to be assigned to the given files, by default None.
+        If None, it simply enumerates the files.
+    system_selection: str
+        MDTraj style atom selection [1] of the system atoms. If given, only
+        selected atoms will be loaded from the trajectories. This option may
+        increase the speed of building graphs.
+    environment_selection: str
+        MDTraj style atom selection [1] of the environment atoms. If given,
+        only the system atoms and [the environment atoms within the cutoff
+        radius of the system atoms] will be kept in the graph.
+    return_trajectories: bool
+        Whether to also return the loaded trajectory objects.
+    remove_isolated_nodes: bool
+        Whether to remove isolated nodes from the dataset.
+    show_progress: bool
+        Whether to show the progress bar.
+    save_names: bool
+        Whether to save the atom names from the topology file, by default True
+    lengths_conversion: float
+        Conversion factor for length units, by default 10.
+        MDTraj uses nanometers, the default converts to Angstroms.
+
+    Returns
+    -------
+    dataset: mlcolvar.data.DictDataset
+        The graph dataset.
+    trajectories: Union[List[List[mdtraj.Trajectory]], List[mdtraj.Trajectory]]
+        The loaded trajectory objects.
+
+    Notes
+    -----
+    The logic behind this method is as follows:
+    1. If only `system_selection` is given, the method will only load atoms
+       selected by this selection from the trajectories.
+    2. If both `system_selection` and `environment_selection` are given,
+       the method will load the atoms selected by both selections, but will
+       build graphs using [the system atoms] and [the environment atoms within
+       the cutoff radius of the system atoms].
+
+    References
+    ----------
+    .. [1] https://www.mdtraj.org/1.9.8.dev0/atom_selection.html
+    """
+
+    # check if using truncated graph
+    if environment_selection is not None:
+        assert system_selection is not None, (
+            'the `environment_selection` argument requires the ' +
+            '`system_selection` argument to be defined!'
+        )
+        selection = '({:s}) or ({:s})'.format(
+            system_selection, environment_selection
+        )
+    elif system_selection is not None:
+        selection = system_selection
+    else:
+        selection = None
+
+    if environment_selection is None:
+        assert buffer == 0, (
+            'No `environment_selection` given! Cannot define a buffer size!'
+        )
+
+    # initialize simple labels if not provided
+    if labels is None:
+        labels = [i for i in range(len(trajectories))]
+    else:
+        assert len(labels) == len(trajectories), (
+            "Number of labels and trajectories must be the same!"
+        )
+
+    # check topologies if given
+    if top is not None:
+        assert len(trajectories) == len(top) or len(top) == 1 or isinstance(top, str), (
+            'Provide either a single topology file or one per trajectory file!'
+        )
+
+    # ensure trajectories is a list
+    if isinstance(trajectories, str):
+        trajectories = [trajectories]
+
+    # --- Handle topologies input ---
+    # Allow top to be None or empty. In that case, create a list of empty strings.
+    if isinstance(top, str):
+        top = [top for _ in trajectories]
+    if top is None or (isinstance(top, list) and len(top) == 0):
+        top = ["" for _ in trajectories]
+    elif len(top) == 1 and len(trajectories) > 1:
+        top = [top[0] for _ in trajectories]
+
+    # For each trajectory file (and its associated topology), if the trajectory file
+    # has a ".xyz" extension and no topology is provided, convert it.
+    for i in range(len(trajectories)):
+        if folder is not None:
+            trajectories[i] = os.path.join(folder, trajectories[i])
+            if top[i]:
+                top[i] = os.path.join(folder, top[i])
+        assert isinstance(trajectories[i], str)
+        _, ext = os.path.splitext(trajectories[i])
+        if (ext.lower() == ".xyz") and (not top[i]):
+            pdb_file = trajectories[i].replace('.xyz', '_top.pdb')
+            top[i] = create_pdb_from_xyz(trajectories[i], pdb_file)
+
+    # check that per-file load arguments, if given, match the number of trajectories
+    if load_args is not None:
+        if (not isinstance(load_args, list)) or (len(trajectories) != len(load_args)):
+            raise TypeError(
+                "load_args should be a list of dictionaries of arguments of same length as trajectories."
+            )
+
+    # load topologies and trajectories
+    topologies = []
+    trajectories_in_memory = []
+    for i in range(len(trajectories)):
+        # load trajectory
+        traj = mdtraj.load(trajectories[i], top=top[i])
+        traj.top = mdtraj.core.trajectory.load_topology(top[i])
+
+        # mdtraj does not load cell info from xyz, so we use ASE and add it
+        _, ext = os.path.splitext(trajectories[i])
+        if (ext.lower() == ".xyz"):
+            ase_atoms = read(trajectories[i], index=':')
+            ase_cells = np.array([a.get_cell().array for a in ase_atoms], dtype=float)
+            # ASE cells are in Angstrom while mdtraj works in nm, so we need to rescale them
+            traj.unitcell_vectors = ase_cells / 10
+
+        if selection is not None:
+            subset = traj.top.select(selection)
+            assert len(subset) > 0, (
+                'No atoms will be selected with selection string ' +
+                '"{:s}"!'.format(selection)
+            )
+            traj = traj.atom_slice(subset)
+        trajectories_in_memory.append(traj)
+        topologies.append(traj.top)
+
+    if z_table is None:
+        z_table = _z_table_from_top(topologies)
+
+    if save_names:
+        atom_names = _names_from_top(topologies)
+    else:
+        atom_names = None
+
+    # create configuration objects from the trajectories
+    configurations = []
+    for i in range(len(trajectories_in_memory)):
+        configuration = _configures_from_trajectory(
+            trajectory=trajectories_in_memory[i],
+            label=labels[i],
+            system_selection=system_selection,
+            environment_selection=environment_selection,
+            start=load_args[i]['start'] if load_args is not None else 0,
+            stop=load_args[i]['stop'] if load_args is not None else None,
+            stride=load_args[i]['stride'] if load_args is not None else 1,
+            lengths_conversion=lengths_conversion,
+        )
+        configurations.extend(configuration)
+
+    # convert configurations into a DictDataset
+    dataset = create_dataset_from_configurations(
+        config=configurations,
+        z_table=z_table,
+        cutoff=cutoff,
+        buffer=buffer,
+        atom_names=atom_names,
+        remove_isolated_nodes=remove_isolated_nodes,
+        show_progress=show_progress
+    )
+
+    if return_trajectories:
+        return dataset, trajectories_in_memory
+    else:
+        return dataset
+
+
+def _names_from_top(top: List[mdtraj.Topology]):
+    it = iter(top)
+    atom_names = list(next(it).atoms)
+    if not all([atom_names == list(n.atoms) for n in it]):
+        raise ValueError(
+            "The atom names or their order differ between the topology files. Check them or deactivate save_names"
+        )
+
+    return atom_names
+
+
+def _z_table_from_top(
+    top: List[mdtraj.Topology]
+) -> AtomicNumberTable:
+    """
+    Create an atomic number table from the topologies.
+
+    Parameters
+    ----------
+    top: List[mdtraj.Topology]
+        The topology objects.
+    """
+    atomic_numbers = []
+    for t in top:
+        atomic_numbers.extend([a.element.number for a in t.atoms])
+    # atomic_numbers = np.array(atomic_numbers, dtype=int)
+    z_table = AtomicNumberTable.from_zs(atomic_numbers)
+    return z_table
+
+
+def _configures_from_trajectory(
+    trajectory: mdtraj.Trajectory,
+    label: int = None,
+    system_selection: str = None,
+    environment_selection: str = None,
+    start: int = 0,
+    stop: int = None,
+    stride: int = 1,
+    lengths_conversion: float = 10.0
+) -> Configurations:
+    """
+    Create configurations from one trajectory.
+
+    Parameters
+    ----------
+    trajectory: mdtraj.Trajectory
+        The MDTraj Trajectory object.
+    label: int
+        The graph label.
+    system_selection: str
+        MDTraj style atom selection of the system atoms. If given, only
+        selected atoms will be loaded from the trajectories. This option may
+        increase the speed of building graphs.
+    environment_selection: str
+        MDTraj style atom selection of the environment atoms.
If given, + only the system atoms and [the environment atoms within the cutoff + radius of the system atoms] will be kept in the graph. + lengths_conversion: float, + Conversion factor for length units, by default 10. + MDTraj uses nanometers, the default sends to Angstroms. + """ + if label is not None: + label = np.array([[label]]) + + if system_selection is not None and environment_selection is not None: + system_atoms = trajectory.top.select(system_selection) + assert len(system_atoms) > 0, ( + 'No atoms will be selected with `system_selection`: ' + + '"{:s}"!'.format(system_selection) + ) + environment_atoms = trajectory.top.select(environment_selection) + assert len(environment_atoms) > 0, ( + 'No atoms will be selected with `environment_selection`: ' + + '"{:s}"!'.format(environment_selection) + ) + else: + system_atoms = None + environment_atoms = None + + atomic_numbers = [a.element.number for a in trajectory.top.atoms] + if trajectory.unitcell_vectors is not None: + pbc = [True] * 3 + cell = trajectory.unitcell_vectors + else: + pbc = [False] * 3 + cell = [None] * len(trajectory) + + if stop is None: + stop = len(trajectory) + + configurations = [] + + for i in range(start,stop,stride): + configuration = Configuration( + atomic_numbers=atomic_numbers, + positions=trajectory.xyz[i] * lengths_conversion, + cell=cell[i] * lengths_conversion, + pbc=pbc, + graph_labels=label, + node_labels=None, # TODO: Add supports for per-node labels. + system=system_atoms, + environment=environment_atoms + ) + configurations.append(configuration) + + return configurations + + +# ================================================================================================= +# ============================================= TESTS ============================================= +# ================================================================================================= def test_datasetFromFile(): # Test with unlabeled dataset @@ -316,6 +692,230 @@ def test_modifier(x): stride=1, ) +def test_datasesetFromTrajectories(): + create_dataset_from_trajectories( + trajectories=['r.dcd', + 'p.dcd'], + top=['r.pdb', + 'p.pdb'], + folder="mlcolvar/tests/data", + cutoff=8.0, # Ang + labels=None, + system_selection='all and not type H', + show_progress=False, + ) + + dataset = create_dataset_from_trajectories( + trajectories=['r.dcd', + 'p.dcd'], + top=['r.pdb', + 'p.pdb'], + folder="mlcolvar/tests/data", + cutoff=8.0, # Ang + labels=[0,1], + system_selection='all and not type H', + show_progress=False, + load_args=[{'start' : 0, 'stop' : 10, 'stride' : 1}, + {'start' : 6, 'stop' : 10, 'stride' : 2}] + ) + assert(len(dataset)==12) + + dataset = create_dataset_from_trajectories( + trajectories=['r.dcd', 'r.dcd', + 'p.dcd', 'p.dcd'], + top=['r.pdb', 'r.pdb', + 'p.pdb', 'p.pdb'], + folder="mlcolvar/tests/data", + cutoff=8.0, # Ang + labels=[0,1,2,3], + system_selection='all and not type H', + show_progress=False, + load_args=[{'start' : 0, 'stop' : 10, 'stride' : 1}, {'start' : 0, 'stop' : 10, 'stride' : 1}, + {'start' : 6, 'stop' : 10, 'stride' : 2}, {'start' : 6, 'stop' : 10, 'stride' : 2}] + ) + assert(len(dataset)==24) + + +def test_create_dataset_from_trajectories(text: str = """ +CRYST1 2.000 2.000 2.000 90.00 90.00 90.00 P 1 1 +ATOM 1 OH2 TIP3W 1 0.000 0.000 0.000 1.00 0.00 WT1 O +ATOM 2 H1 TIP3W 1 0.700 0.700 0.000 1.00 0.00 WT1 H +ATOM 3 H2 TIP3W 1 0.700 -0.700 0.000 1.00 0.00 WT1 H +ENDMODEL +ATOM 1 OH2 TIP3W 1 0.000 0.000 0.000 1.00 0.00 WT1 O +ATOM 2 H1 TIP3W 1 0.700 0.700 0.000 1.00 0.00 WT1 H 
+ATOM 3 H2 TIP3W 1 0.700 -0.700 0.000 1.00 0.00 WT1 H +END +""", +system_selection: str = None +) -> None: + import tempfile + with tempfile.TemporaryDirectory() as tmpdir: + test_dataset_path = "test_dataset.pdb" + test_dataset_path = os.path.join(tmpdir, test_dataset_path) + with open(test_dataset_path, 'w') as fp: + print(text, file=fp) + + dataset, trajectories = create_dataset_from_trajectories( + trajectories=[test_dataset_path, test_dataset_path, test_dataset_path], + top=[test_dataset_path, test_dataset_path, test_dataset_path], + cutoff=1.0, + system_selection=system_selection, + return_trajectories=True, + show_progress=False + ) + + assert len(dataset) == 6 + assert dataset.metadata["cutoff"] == 1.0 + assert dataset.metadata["z_table"] == [1, 8] + assert len(trajectories[0]) == 2 + assert len(trajectories[1]) == 2 + assert len(trajectories[2]) == 2 + + assert dataset[0]["data_list"]['graph_labels'] == torch.tensor([[0.0]]) + assert dataset[1]["data_list"]['graph_labels'] == torch.tensor([[0.0]]) + assert dataset[2]["data_list"]['graph_labels'] == torch.tensor([[1.0]]) + assert dataset[3]["data_list"]['graph_labels'] == torch.tensor([[1.0]]) + assert dataset[4]["data_list"]['graph_labels'] == torch.tensor([[2.0]]) + assert dataset[5]["data_list"]['graph_labels'] == torch.tensor([[2.0]]) + + dataset, trajectories = create_dataset_from_trajectories( + trajectories=[test_dataset_path, test_dataset_path, test_dataset_path], + top=test_dataset_path, + cutoff=1.0, + labels=None, + system_selection=system_selection, + return_trajectories=True, + show_progress=False + ) + + assert dataset[0]["data_list"]['graph_labels'] == torch.tensor([[0.0]]) + assert dataset[1]["data_list"]['graph_labels'] == torch.tensor([[0.0]]) + assert dataset[2]["data_list"]['graph_labels'] == torch.tensor([[1.0]]) + assert dataset[3]["data_list"]['graph_labels'] == torch.tensor([[1.0]]) + assert dataset[4]["data_list"]['graph_labels'] == torch.tensor([[2.0]]) + assert dataset[5]["data_list"]['graph_labels'] == torch.tensor([[2.0]]) + + def check_data_1(data) -> None: + assert(torch.allclose(data["data_list"]['edge_index'], torch.tensor([[0, 0, 1, 1, 2, 2], + [2, 1, 0, 2, 1, 0]]) + ) + ) + assert(torch.allclose(data["data_list"]['shifts'], torch.tensor([[0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 2.0, 0.0], + [0.0, -2.0, 0.0], + [0.0, 0.0, 0.0]]) + ) + ) + assert(torch.allclose(data["data_list"]['unit_shifts'], torch.tensor([[0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 0.0, 0.0], + [0.0, 1.0, 0.0], + [0.0, -1.0, 0.0], + [0.0, 0.0, 0.0]]) + ) + ) + assert(torch.allclose(data["data_list"]['positions'], torch.tensor([[0.0, 0.0, 0.0], + [0.7, 0.7, 0.0], + [0.7, -0.7, 0.0]]) + ) + ) + assert(torch.allclose(data["data_list"]['cell'], torch.tensor([[2.0, 0.0, 0.0], + [0.0, 2.0, 0.0], + [0.0, 0.0, 2.0]]) + ) + ) + assert(torch.allclose(data["data_list"]['node_attrs'], torch.tensor([[0.0, 1.0], + [1.0, 0.0], + [1.0, 0.0]]) + ) + ) + + for i in range(6): + check_data_1(dataset[i]) + + if system_selection is not None: + + dataset = create_dataset_from_trajectories( + trajectories=[test_dataset_path, test_dataset_path, test_dataset_path], + top=[test_dataset_path, test_dataset_path, test_dataset_path], + cutoff=1.0, + system_selection='type O and {:s}'.format(system_selection), + environment_selection='type H and {:s}'.format(system_selection), + show_progress=False + ) + + for i in range(6): + check_data_1(dataset[i]) + + dataset = create_dataset_from_trajectories( + trajectories=[test_dataset_path, 
test_dataset_path, test_dataset_path], + top=[test_dataset_path, test_dataset_path, test_dataset_path], + cutoff=1.0, + system_selection='name H1 and {:s}'.format(system_selection), + environment_selection='name H2 and {:s}'.format(system_selection), + show_progress=False + ) -if __name__ == "__main__": - test_datasetFromFile() + def check_data_2(data) -> None: + assert(torch.allclose(data["data_list"]['edge_index'], torch.tensor([[0, 1], [1, 0]]))) + assert(torch.allclose(data["data_list"]['shifts'], torch.tensor([[0.0, 2.0, 0.0], + [0.0, -2.0, 0.0]]) + ) + ) + assert(torch.allclose(data["data_list"]['unit_shifts'], torch.tensor([[0.0, 1.0, 0.0], + [0.0, -1.0, 0.0]]) + ) + ) + assert(torch.allclose(data["data_list"]['positions'], torch.tensor([[0.7, 0.7, 0.0], + [0.7, -0.7, 0.0]]) + ) + ) + assert(torch.allclose(data["data_list"]['cell'], torch.tensor([[2.0, 0.0, 0.0], + [0.0, 2.0, 0.0], + [0.0, 0.0, 2.0]]) + ) + ) + assert(torch.allclose(data["data_list"]['node_attrs'], torch.tensor([[1.0], + [1.0]]) + ) + ) + + for i in range(6): + check_data_2(dataset[i]) + + +def test_dataset_from_xyz(): + # load single file + load_args = [{'start' : 0, 'stop' : 2, 'stride' : 1}] + dataset = create_dataset_from_trajectories(trajectories="Cu.xyz", + folder="mlcolvar/tests/data", + top=None, + cutoff=3.5, # Ang + labels=None, + system_selection="index 0", + environment_selection="not index 0", + show_progress=False, + load_args=load_args, + buffer=1, + ) + + print(dataset) + + # load multiple files + load_args = [{'start' : 0, 'stop' : 2, 'stride' : 1}, + {'start' : 0, 'stop' : 4, 'stride' : 2}] + dataset = create_dataset_from_trajectories(trajectories=["Cu.xyz", "Cu.xyz"], + folder="mlcolvar/tests/data", + top=None, + cutoff=3.5, # Ang + labels=None, + system_selection="index 0 or index 1", + environment_selection="not index 0 and not index 1", + show_progress=False, + load_args=load_args, + buffer=1, + ) + print(dataset) \ No newline at end of file diff --git a/mlcolvar/utils/plot.py b/mlcolvar/utils/plot.py index 62f1a3ee..41b86d5f 100644 --- a/mlcolvar/utils/plot.py +++ b/mlcolvar/utils/plot.py @@ -328,6 +328,72 @@ def plot_features_distribution(dataset, features, titles=None, axs=None): ax.set_yticks([]) ax.legend([],[],title=feat,loc='upper center',frameon=False) +import sys +import time +import typing + +""" +A simple progress bar. +""" + +__all__ = ['pbar'] + + +def pbar( + item: typing.List[int], + prefix: str = '', + size: int = 25, + frequency: int = 0.05, + use_unicode: bool = True, + file: typing.TextIO = sys.stdout +): + """ + A simple progress bar. Taken from stackoverflow: + https://stackoverflow.com/questions/3160699 + Parameters + ---------- + it : List[int] + The looped item. + prefix : str + Prefix of the bar. + size : int + Size of the bar. + frequency : float + Flush frequency of the bar. + use_unicode : bool + If use unicode char to draw the bar. + file : TextIO + The output file. + """ + if (use_unicode): + c_1 = '' + c_2 = '█' + c_3 = '━' + c_4 = '' + else: + c_1 = '|' + c_2 = '|' + c_3 = '-' + c_4 = '|' + count = len(item) + start = time.time() + interval = max(int(count * frequency), 1) + + def show(j) -> None: + x = int(size * j / count) + remaining = ((time.time() - start) / j) * (count - j) + mins, sec = divmod(remaining, 60) + time_string = f'{int(mins):02}:{sec:02.1f}' + output = f' {prefix} {c_1}{c_2 * (x - 1) + c_4}{c_3 * (size - x)} ' + \ + f'{j}/{count} Est. 
{time_string}' + print('\x1b[1A\x1b[2K' + output, file=file, flush=True) + + for i, it in enumerate(item): + yield it + if ((i % interval) == 0 or i in [0, (count - 1)]): + show(i + 1) + print(flush=True, file=file) + def test_utils_plot(): import matplotlib @@ -344,3 +410,10 @@ def test_utils_plot(): cmap = matplotlib.colors.Colormap("fessa_r", 2) cmap = matplotlib.colors.Colormap("cortina80", 2) cmap = matplotlib.colors.Colormap("cortina80_r", 2) + + import time + for i in pbar(range(15), "Computing: ", 40): + time.sleep(0.1) + + for i in pbar(range(15), "Computing: ", 40, use_unicode=False): + time.sleep(0.1) \ No newline at end of file diff --git a/mlcolvar/utils/timelagged.py b/mlcolvar/utils/timelagged.py index 3d751f0a..989fd64e 100644 --- a/mlcolvar/utils/timelagged.py +++ b/mlcolvar/utils/timelagged.py @@ -3,6 +3,8 @@ from bisect import bisect_left from mlcolvar.data import DictDataset import warnings +from typing import Union +import copy # optional packages # pandas @@ -193,7 +195,7 @@ def progress(iter, progress_bar=progress_bar): def create_timelagged_dataset( - X: torch.Tensor, + X: Union[torch.Tensor, np.ndarray, DictDataset], t: torch.Tensor = None, lag_time: float = 1, reweight_mode: str = None, @@ -223,8 +225,8 @@ def create_timelagged_dataset( Parameters ---------- - X : array-like - input descriptors + X : torch.Tensor or np.ndarray or DictDataset + Input data, graph data can only be provided as DictDataset t : array-like, optional time series, by default np.arange(len(X)) reweight_mode: str, optional @@ -287,13 +289,23 @@ def create_timelagged_dataset( tprime = t # find pairs of configurations separated by lag_time - x_t, x_lag, w_t, w_lag = find_timelagged_configurations( - X, - tprime, - lag_time=lag_time, - logweights=logweights if reweight_mode == "weights_t" else None, - progress_bar=progress_bar, - ) + if isinstance(X, torch.Tensor) or isinstance(X, np.ndarray): + x_t, x_lag, w_t, w_lag = find_timelagged_configurations( + X, + tprime, + lag_time=lag_time, + logweights=logweights if reweight_mode == "weights_t" else None, + progress_bar=progress_bar, + ) + elif isinstance(X, DictDataset): + index = torch.arange(len(X), dtype=torch.long) + x_t, x_lag, w_t, w_lag = find_timelagged_configurations( + index, + tprime, + lag_time=lag_time, + logweights=logweights if reweight_mode == "weights_t" else None, + progress_bar=progress_bar, + ) # return only a slice of the data (N. 
Pedrani) if interval is not None: @@ -306,34 +318,92 @@ def create_timelagged_dataset( data[i] = data[i][interval[0] : interval[1]] x_t, x_lag, w_t, w_lag = data - dataset = DictDataset( - {"data": x_t, "data_lag": x_lag, "weights": w_t, "weights_lag": w_lag} - ) - - return dataset + if isinstance(X, torch.Tensor) or isinstance(X, np.ndarray): + dataset = DictDataset({"data": x_t, + "data_lag": x_lag, + "weights": w_t, + "weights_lag": w_lag}, + data_type='descriptors') + return dataset + + elif isinstance(X, DictDataset): + if X.metadata["data_type"] == "descriptors": + dataset = DictDataset({"data": X['data'][x_t], + "data_lag": X['data'][x_lag], + "weights": w_t, + "weights_lag": w_lag}, + data_type='descriptors') + + elif X.metadata["data_type"] == "graphs": + # we use deepcopy to avoid editing the original dataset + dataset = DictDataset(dictionary={"data_list" : copy.deepcopy(X[x_t.numpy().tolist()]["data_list"]), + "data_list_lag" : copy.deepcopy(X[x_lag.numpy().tolist()]["data_list"])}, + metadata={"z_table" : X.metadata["z_table"], + "cutoff" : X.metadata["cutoff"]}, + data_type="graphs") + # update weights + for i in range(len(dataset)): + dataset['data_list'][i]['weight'] = w_t[i] + dataset['data_list_lag'][i]['weight'] = w_lag[i] + + return dataset def test_create_timelagged_dataset(): in_features = 2 - n_points = 100 + n_points = 20 X = torch.rand(n_points, in_features) * 100 + dataset = DictDataset(data=X, data_type='descriptors') + # unbiased case t = np.arange(n_points) - dataset = create_timelagged_dataset(X, t, lag_time=10) - print(len(dataset)) + lagged_dataset_1 = create_timelagged_dataset(X, t, lag_time=10) + print(len(lagged_dataset_1)) + lagged_dataset_2 = create_timelagged_dataset(dataset, t, lag_time=10) + print(len(lagged_dataset_2)) + assert(torch.allclose(lagged_dataset_1['data'], lagged_dataset_2['data'])) + assert(torch.allclose(lagged_dataset_1['data_lag'], lagged_dataset_2['data_lag'])) + assert(torch.allclose(lagged_dataset_1['weights'], lagged_dataset_2['weights'])) + # reweight mode rescale_time (default) logweights = np.random.rand(n_points) - dataset = create_timelagged_dataset(X, t, logweights=logweights) - print(len(dataset)) + lagged_dataset_1 = create_timelagged_dataset(X, t, logweights=logweights) + print(len(lagged_dataset_1)) + lagged_dataset_2 = create_timelagged_dataset(dataset, t, logweights=logweights) + print(len(lagged_dataset_2)) + assert(torch.allclose(lagged_dataset_1['data'], lagged_dataset_2['data'])) + assert(torch.allclose(lagged_dataset_1['data_lag'], lagged_dataset_2['data_lag'])) + assert(torch.allclose(lagged_dataset_1['weights'], lagged_dataset_2['weights'])) + # reweight mode weights_t logweights = np.random.rand(n_points) - dataset = create_timelagged_dataset( + lagged_dataset_1 = create_timelagged_dataset( X, t, logweights=logweights, reweight_mode="weights_t" ) + print(len(lagged_dataset_1)) + lagged_dataset_2 = create_timelagged_dataset( + dataset, t, logweights=logweights, reweight_mode="weights_t" + ) + print(len(lagged_dataset_2)) + assert(torch.allclose(lagged_dataset_1['data'], lagged_dataset_2['data'])) + assert(torch.allclose(lagged_dataset_1['data_lag'], lagged_dataset_2['data_lag'])) + assert(torch.allclose(lagged_dataset_1['weights'], lagged_dataset_2['weights'])) + + + + # graph data + from mlcolvar.data.graph.utils import create_test_graph_input + dataset = create_test_graph_input('dataset') + print(dataset['data_list'][0]) + lagged_dataset = create_timelagged_dataset(dataset, 
logweights=torch.randn(len(dataset))) + print(lagged_dataset['data_list'][0]) + print(dataset['data_list'][0]) + print(len(dataset)) + if __name__ == "__main__": diff --git a/requirements.txt b/requirements.txt index c5283ab0..ba6536de 100644 --- a/requirements.txt +++ b/requirements.txt @@ -3,4 +3,7 @@ torch numpy<2 pandas matplotlib -kdepy \ No newline at end of file +kdepy +torch_geometric +matscipy +mdtraj \ No newline at end of file
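For reference, a minimal end-to-end sketch of the graph-based workflow touched by this patch, assembled from the new test functions (`test_datasesetFromTrajectories`, `test_graph_sensitivity`). The file names point to the test data added here; the cutoff, atomic numbers and training settings are illustrative placeholders, not recommended values:

```python
import lightning
from mlcolvar.cvs import DeepTDA
from mlcolvar.core.nn.graph import SchNetModel
from mlcolvar.data import DictModule
from mlcolvar.utils.io import create_dataset_from_trajectories
from mlcolvar.explain.graph_sensitivity import graph_node_sensitivity

# build a graph dataset directly from trajectories (test data shipped with this patch)
dataset = create_dataset_from_trajectories(
    trajectories=["r.dcd", "p.dcd"],
    top=["r.pdb", "p.pdb"],
    folder="mlcolvar/tests/data",
    cutoff=8.0,  # Angstrom, illustrative
    system_selection="all and not type H",
    show_progress=False,
)
datamodule = DictModule(dataset=dataset, lengths=[0.8, 0.2])

# GNN-based DeepTDA CV: the SchNet architecture is passed through the `model` keyword
gnn_model = SchNetModel(n_out=1, cutoff=8.0, atomic_numbers=[6, 9])  # species illustrative
model = DeepTDA(
    n_states=2,
    n_cvs=1,
    target_centers=[-5, 5],
    target_sigmas=[0.2, 0.2],
    model=gnn_model,
)

# short training run, then per-atom sensitivity of the trained CV
trainer = lightning.Trainer(max_epochs=2, logger=False, enable_checkpointing=False)
trainer.fit(model, datamodule)
sensitivity = graph_node_sensitivity(model=model, dataset=dataset, batch_size=0)
```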