Data Driven PDE Solvers

\( \newcommand{\states}{\mathcal{S}} \newcommand{\actions}{\mathcal{A}} \newcommand{\observations}{\mathcal{O}} \newcommand{\rewards}{\mathcal{R}} \newcommand{\traces}{\mathbf{e}} \newcommand{\transition}{P} \newcommand{\reals}{\mathbb{R}} \newcommand{\naturals}{\mathbb{N}} \newcommand{\complexs}{\mathbb{C}} \newcommand{\field}{\mathbb{F}} \newcommand{\numfield}{\mathbb{F}} \newcommand{\expected}{\mathbb{E}} \newcommand{\var}{\mathbb{V}} \newcommand{\by}{\times} \newcommand{\partialderiv}[2]{\frac{\partial #1}{\partial #2}} \newcommand{\defineq}{\stackrel{{\tiny\mbox{def}}}{=}} \newcommand{\defeq}{\stackrel{{\tiny\mbox{def}}}{=}} \newcommand{\eye}{\Imat} \newcommand{\hadamard}{\odot} \newcommand{\trans}{\top} \newcommand{\inv}{{-1}} \newcommand{\argmax}{\operatorname{argmax}} \newcommand{\Prob}{\mathbb{P}} \newcommand{\avec}{\mathbf{a}} \newcommand{\bvec}{\mathbf{b}} \newcommand{\cvec}{\mathbf{c}} \newcommand{\dvec}{\mathbf{d}} \newcommand{\evec}{\mathbf{e}} \newcommand{\fvec}{\mathbf{f}} \newcommand{\gvec}{\mathbf{g}} \newcommand{\hvec}{\mathbf{h}} \newcommand{\ivec}{\mathbf{i}} \newcommand{\jvec}{\mathbf{j}} \newcommand{\kvec}{\mathbf{k}} \newcommand{\lvec}{\mathbf{l}} \newcommand{\mvec}{\mathbf{m}} \newcommand{\nvec}{\mathbf{n}} \newcommand{\ovec}{\mathbf{o}} \newcommand{\pvec}{\mathbf{p}} \newcommand{\qvec}{\mathbf{q}} \newcommand{\rvec}{\mathbf{r}} \newcommand{\svec}{\mathbf{s}} \newcommand{\tvec}{\mathbf{t}} \newcommand{\uvec}{\mathbf{u}} \newcommand{\vvec}{\mathbf{v}} \newcommand{\wvec}{\mathbf{w}} \newcommand{\xvec}{\mathbf{x}} \newcommand{\yvec}{\mathbf{y}} \newcommand{\zvec}{\mathbf{z}} \newcommand{\Amat}{\mathbf{A}} \newcommand{\Bmat}{\mathbf{B}} \newcommand{\Cmat}{\mathbf{C}} \newcommand{\Dmat}{\mathbf{D}} \newcommand{\Emat}{\mathbf{E}} \newcommand{\Fmat}{\mathbf{F}} \newcommand{\Gmat}{\mathbf{G}} \newcommand{\Hmat}{\mathbf{H}} \newcommand{\Imat}{\mathbf{I}} \newcommand{\Jmat}{\mathbf{J}} \newcommand{\Kmat}{\mathbf{K}} \newcommand{\Lmat}{\mathbf{L}} 
\newcommand{\Mmat}{\mathbf{M}} \newcommand{\Nmat}{\mathbf{N}} \newcommand{\Omat}{\mathbf{O}} \newcommand{\Pmat}{\mathbf{P}} \newcommand{\Qmat}{\mathbf{Q}} \newcommand{\Rmat}{\mathbf{R}} \newcommand{\Smat}{\mathbf{S}} \newcommand{\Tmat}{\mathbf{T}} \newcommand{\Umat}{\mathbf{U}} \newcommand{\Vmat}{\mathbf{V}} \newcommand{\Wmat}{\mathbf{W}} \newcommand{\Xmat}{\mathbf{X}} \newcommand{\Ymat}{\mathbf{Y}} \newcommand{\Zmat}{\mathbf{Z}} \newcommand{\Sigmamat}{\boldsymbol{\Sigma}} \newcommand{\identity}{\Imat} \newcommand{\epsilonvec}{\boldsymbol{\epsilon}} \newcommand{\thetavec}{\boldsymbol{\theta}} \newcommand{\phivec}{\boldsymbol{\phi}} \newcommand{\muvec}{\boldsymbol{\mu}} \newcommand{\sigmavec}{\boldsymbol{\sigma}} \newcommand{\jacobian}{\mathbf{J}} \newcommand{\ind}{\perp!!!!\perp} \newcommand{\bigoh}{\text{O}} \)

Projects

Questions

Are there methods for learning a PDE from data generated by a controller? How well do such methods work on real-world data? Can they give better explanations of how the controller performs? Can we learn the implicit PDE of a controller we can't analyze directly? How do we validate that we have learned it correctly?

Notes

TODO Background

The idea is to estimate a PDE from data alone (i.e. without an explicit model).

  • What is the mathematical formulation of the problem?
  • What is the data used to estimate the mapping?
  • What PDEs can this approximate? And how do the different approaches approximate these PDEs?

Finite-dimensional operators: approximate the parametric solution map on a fixed discretization grid, e.g. through convolutional networks (Guo, Li, and Iorio 2016; Zhu and Zabaras 2018; Bhatnagar et al. 2019).
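As a minimal illustration of this finite-dimensional view, the sketch below applies a single hand-picked (not learned) 3×3 convolution to a discretized field in pure Python. A CNN surrogate stacks many learned layers of this kind, and the resulting map is tied to the fixed grid resolution it was trained on:

```python
def conv2d_same(field, kernel):
    # One 3x3 convolution with zero padding: a building block of a CNN
    # surrogate mapping a discretized PDE input (e.g. a coefficient field)
    # to a discretized output field on the SAME fixed grid.
    n, m = len(field), len(field[0])
    out = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            s = 0.0
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < n and 0 <= jj < m:
                        s += kernel[di + 1][dj + 1] * field[ii][jj]
            out[i][j] = s
    return out

# Hand-picked kernel for illustration: the 5-point Laplacian stencil.
lap = [[0, 1, 0], [1, -4, 1], [0, 1, 0]]
field = [[float(i * j) for j in range(8)] for i in range(8)]
out = conv2d_same(field, lap)
```

With the Laplacian stencil the interior output is the discrete Laplacian of the input (zero here, since i·j is harmonic); a trained surrogate would instead learn kernels approximating the parameter-to-solution map, and the learned map only makes sense at this grid resolution.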

Physics informed machine learning

TODO Physics Informed Machine Learning (Karniadakis et al. 2021)
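The core physics-informed idea (Raissi, Perdikaris, and Karniadakis 2019) is to minimize the squared PDE residual at collocation points. The sketch below does this for \(-u'' = \pi^2 \sin(\pi x)\) on \([0, 1]\) with \(u(0) = u(1) = 0\), substituting a small sine basis for the neural network (an assumption made here so plain gradient descent with hand-derived gradients suffices):

```python
import math

# Collocation points in (0, 1).
N = 50
xs = [(i + 0.5) / N for i in range(N)]

# u(x; c) = sum_k c_k sin(k*pi*x) satisfies u(0) = u(1) = 0 by construction,
# so only the PDE residual needs to be penalized.
K = 3
c = [0.0] * K

def residual(x, c):
    # r(x) = -u''(x) - f(x) with f(x) = pi^2 sin(pi*x).
    r = -math.pi ** 2 * math.sin(math.pi * x)
    for k in range(1, K + 1):
        r += c[k - 1] * (k * math.pi) ** 2 * math.sin(k * math.pi * x)
    return r

lr = 1e-4
for step in range(2000):
    grad = [0.0] * K
    for x in xs:
        r = residual(x, c)
        for k in range(1, K + 1):
            grad[k - 1] += 2 * r * (k * math.pi) ** 2 * math.sin(k * math.pi * x) / N
    c = [ci - lr * gi for ci, gi in zip(c, grad)]

# c converges to approximately (1, 0, 0), i.e. u(x) = sin(pi*x).
```

With a neural network in place of the basis, the residual derivatives come from automatic differentiation instead of the hand-derived gradient above, but the loss has the same collocation structure.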

TODO

Neural Operators (Kovachki et al. 2024)
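In a Fourier neural operator, the central layer transforms the input to Fourier space, reweights a truncated set of low modes with learned complex weights, and transforms back; because it acts on modes rather than grid points, the layer is discretization-independent. A pure-Python sketch using a naive DFT (the mode weights here are identity placeholders, not learned):

```python
import cmath
import math

def dft(x):
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * k * j / n) for j in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    # Real part only: the caller enforces conjugate symmetry.
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * j / n) for k in range(n)).real / n
            for j in range(n)]

def spectral_conv(v, weights):
    # Keep only the lowest len(weights) Fourier modes and reweight them:
    # the core operation of one Fourier-neural-operator layer.
    V = dft(v)
    out = [0j] * len(V)
    for k in range(len(weights)):
        out[k] = weights[k] * V[k]
        if k > 0:
            out[-k] = weights[k].conjugate() * V[-k]  # keep the output real
    return idft(out)

# Identity weights on the lowest 3 modes pass a pure sine through unchanged.
n = 16
v = [math.sin(2 * math.pi * j / n) for j in range(n)]
w = spectral_conv(v, [1 + 0j, 1 + 0j, 1 + 0j])
err = max(abs(a - b) for a, b in zip(v, w))
```

Because the weights multiply Fourier modes, the same learned layer can in principle be evaluated at a different resolution, which is the key contrast with the fixed-grid CNN surrogates above.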

DeepONets (Lu et al. 2021)
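A DeepONet (Lu et al. 2021) approximates an operator \(G\) by the inner product of a branch network, applied to the input function's values at \(m\) fixed sensor locations, and a trunk network, applied to the query point \(y\). The sketch below shows just the forward pass with random untrained weights (the layer sizes are arbitrary choices for illustration):

```python
import math
import random

random.seed(0)

def linear(x, W, b):
    return [sum(wij * xj for wij, xj in zip(row, x)) + bi
            for row, bi in zip(W, b)]

def rand_layer(n_in, n_out):
    W = [[random.gauss(0, 1 / math.sqrt(n_in)) for _ in range(n_in)]
         for _ in range(n_out)]
    return W, [0.0] * n_out

def mlp(x, layers):
    for i, (W, b) in enumerate(layers):
        x = linear(x, W, b)
        if i < len(layers) - 1:
            x = [math.tanh(v) for v in x]
    return x

m, p = 10, 8  # number of input sensors, shared latent width
branch = [rand_layer(m, 16), rand_layer(16, p)]
trunk = [rand_layer(1, 16), rand_layer(16, p)]

def deeponet(u_sensors, y):
    # G(u)(y) ~ <branch(u(x_1), ..., u(x_m)), trunk(y)>
    b = mlp(u_sensors, branch)
    t = mlp([y], trunk)
    return sum(bi * ti for bi, ti in zip(b, t))

# Input function u(x) = sin(pi*x) sampled at m fixed sensor locations.
u = [math.sin(math.pi * i / (m - 1)) for i in range(m)]
val = deeponet(u, 0.3)
```

In training, both networks are fit jointly on triples \((u, y, G(u)(y))\); the fixed sensor grid for the branch net is a standing assumption of the architecture.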

References

Adler, Jonas, and Ozan Öktem. 2017. “Solving Ill-Posed Inverse Problems Using Iterative Deep Neural Networks.” Inverse Problems 33 (12). IOP Publishing: 124007. doi:10.1088/1361-6420/aa9581.
Bhatnagar, Saakaar, Yaser Afshar, Shaowu Pan, Karthik Duraisamy, and Shailendra Kaushik. 2019. “Prediction of Aerodynamic Flow Fields Using Convolutional Neural Networks.” Computational Mechanics 64 (2): 525–45. doi:10.1007/s00466-019-01740-0.
Guo, Xiaoxiao, Wei Li, and Francesco Iorio. 2016. “Convolutional Neural Networks for Steady Flow Approximation.” In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 481–90. KDD ’16. New York, NY, USA: Association for Computing Machinery. doi:10.1145/2939672.2939738.
Jiang, Chiyu “Max”, Soheil Esmaeilzadeh, Kamyar Azizzadenesheli, Karthik Kashinath, Mustafa Mustafa, Hamdi A. Tchelepi, Philip Marcus, Mr Prabhat, and Anima Anandkumar. 2020. “MeshfreeFlowNet: A Physics-Constrained Deep Continuous Space-Time Super-Resolution Framework.” In SC20: International Conference for High Performance Computing, Networking, Storage and Analysis, 1–15. doi:10.1109/SC41405.2020.00013.
Karniadakis, George Em, Ioannis G. Kevrekidis, Lu Lu, Paris Perdikaris, Sifan Wang, and Liu Yang. 2021. “Physics-Informed Machine Learning.” Nature Reviews Physics 3 (6). Nature Publishing Group: 422–40. doi:10.1038/s42254-021-00314-5.
Khoo, Yuehaw, Jianfeng Lu, and Lexing Ying. 2021. “Solving Parametric PDE Problems with Artificial Neural Networks.” European Journal of Applied Mathematics 32 (3): 421–35. doi:10.1017/S0956792520000182.
Kovachki, Nikola, Zongyi Li, Burigede Liu, Kamyar Azizzadenesheli, Kaushik Bhattacharya, Andrew Stuart, and Anima Anandkumar. 2024. “Neural Operator: Learning Maps Between Function Spaces with Applications to PDEs.” Journal of Machine Learning Research 24 (89): 1–97. doi:10.5555/3648699.3648788.
Kutyniok, Gitta, Philipp Petersen, Mones Raslan, and Reinhold Schneider. 2022. “A Theoretical Analysis of Deep Neural Networks and Parametric PDEs.” Constructive Approximation 55 (1): 73–125. doi:10.1007/s00365-021-09551-4.
Lu, Lu, Pengzhan Jin, Guofei Pang, Zhongqiang Zhang, and George Em Karniadakis. 2021. “Learning Nonlinear Operators via DeepONet Based on the Universal Approximation Theorem of Operators.” Nature Machine Intelligence 3 (3). Nature Publishing Group: 218–29. doi:10.1038/s42256-021-00302-5.
Raissi, M., P. Perdikaris, and G.E. Karniadakis. 2019. “Physics-Informed Neural Networks: A Deep Learning Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations.” Journal of Computational Physics 378 (February): 686–707. doi:10.1016/j.jcp.2018.10.045.
Ummenhofer, Benjamin, Lukas Prantl, Nils Thuerey, and Vladlen Koltun. 2019. “Lagrangian Fluid Simulation with Continuous Convolutions.” https://openreview.net/forum?id=B1lDoJSYDH.
Zhu, Yinhao, and Nicholas Zabaras. 2018. “Bayesian Deep Convolutional Encoder–Decoder Networks for Surrogate Modeling and Uncertainty Quantification.” Journal of Computational Physics 366 (August): 415–47. doi:10.1016/j.jcp.2018.04.018.