NIF is a mesh-agnostic dimensionality reduction paradigm for parametric spatio-temporal fields. For decades, dimensionality reduction (e.g., proper orthogonal decomposition, convolutional autoencoders) has been the first step in reduced-order modeling of large-scale spatio-temporal dynamics.
Unfortunately, these frameworks either do not extend to realistic industrial scenarios, e.g., adaptive mesh refinement, or cannot precede nonlinear operations without resorting to lossy interpolation onto a uniform grid. Details can be found in our paper.
NIF is built on top of Keras to minimize the user's effort and to make the most of Keras's existing functionality:
- built on top of TensorFlow 2 with Keras, giving hassle-free access to many up-to-date advanced concepts and features
- distributed learning: data parallelism across multiple GPUs on a single node (see the sketch below)
- flexible training schedule: e.g., Adam first, then fine-tuning with L-BFGS
- performance monitoring: model weight checkpoints and restoration
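As a concrete illustration of the distributed-learning feature, single-node data parallelism follows the standard TensorFlow 2 recipe: build and compile the model inside a `tf.distribute.MirroredStrategy` scope, then call `fit` as usual. The sketch below is a minimal, self-contained stand-in that uses a plain `Sequential` network and random point-wise data; a compiled NIF model would be used in exactly the same way.

```python
import numpy as np
import tensorflow as tf

# Dummy point-wise training data: (x, t) query points and field values u(x, t).
x_train = np.random.rand(10000, 2).astype("float32")
y_train = np.random.rand(10000, 1).astype("float32")

# Mirror model variables across all GPUs visible on this node.
strategy = tf.distribute.MirroredStrategy()
print("Replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():
    # Any compiled Keras model works here; a NIF model plugs in the same way.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="swish", input_shape=(2,)),
        tf.keras.layers.Dense(64, activation="swish"),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-3), loss="mse")

# The global batch is split evenly across the available replicas.
model.fit(x_train, y_train, batch_size=1024, epochs=10)
```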
Hello world! A simple fit of a 1D travelling wave
- learn how to use class `nif.NIF`
- model checkpoints/restoration
- mixed precision training (see the sketch below)
- L-BFGS fine-tuning
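A minimal, self-contained sketch of the checkpointing and mixed-precision ingredients from this tutorial is shown below. It fits a synthetic 1D travelling wave with a small stand-in MLP rather than `nif.NIF` itself (see the tutorial for the actual model construction), and only notes in a comment where L-BFGS fine-tuning would come in.

```python
import numpy as np
import tensorflow as tf

# Mixed precision: compute in float16 while keeping variables in float32.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

# Synthetic 1D travelling wave u(x, t) = sin(2*pi*(x - t)) sampled point-wise.
xx, tt = np.meshgrid(np.linspace(0, 1, 200), np.linspace(0, 1, 50))
inputs = np.stack([xx.ravel(), tt.ravel()], axis=-1).astype("float32")
targets = np.sin(2 * np.pi * (xx - tt)).ravel()[:, None].astype("float32")

# A small MLP stands in for nif.NIF here; keep the final layer in float32.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="swish", input_shape=(2,)),
    tf.keras.layers.Dense(64, activation="swish"),
    tf.keras.layers.Dense(1, dtype="float32"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3), loss="mse")

# Checkpointing: keep the best weights seen so far during training.
ckpt = tf.keras.callbacks.ModelCheckpoint(
    "wave.weights.h5",
    save_weights_only=True,
    save_best_only=True,
    monitor="loss",
)
model.fit(inputs, targets, batch_size=512, epochs=50, callbacks=[ckpt])

# Restoration: rebuild the same architecture and reload the saved weights.
model.load_weights("wave.weights.h5")

# L-BFGS fine-tuning (as in the tutorial) is commonly done by wrapping the
# flattened model weights with tfp.optimizer.lbfgs_minimize; omitted here.
```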
Learning high-frequency data
- learn how to use class `nif.NIFMultiScale`
- demonstrate the effectiveness of learning high-frequency data
Learning a linear representation
- learn how to use class `nif.NIFMultiScaleLastLayerParameterized`
- demonstrate on a (shortened) flow over a cylinder case from an AMR solver
If you find NIF helpful, please cite our paper using the following BibTeX entry:
@misc{pan2022neural,
title={Neural Implicit Flow: a mesh-agnostic dimensionality reduction paradigm of spatio-temporal data},
author={Shaowu Pan and Steven L. Brunton and J. Nathan Kutz},
year={2022},
eprint={2204.03216},
archivePrefix={arXiv},
primaryClass={cs.LG}
}