- Here we collect works that may be useful for writing our paper
- We group these works by topic to give the review structure
Note: this review table will be updated, so it is not a final version.
| Topic | Title | Year | Authors | Paper | Code | Summary |
|---|---|---|---|---|---|---|
| Neural Operators - Theory | Operator Learning: Algorithms and Analysis | 2024 | Nikola B. Kovachki, Samuel Lanthaler, Andrew M. Stuart | arXiv:2402.15715 | — | Review of approximation theory for neural operators mapping between infinite-dimensional Banach spaces. Covers fundamental algorithms (DeepONet, FNO, PCA-Net, Random Features) with theoretical analysis including universal approximation theorems, complexity bounds, and curse of dimensionality. Best for theory |
| DeepONet | DeepONet: Learning Nonlinear Operators via Universal Approximation | 2019 | Lu Lu, Pengzhan Jin, George Em Karniadakis | arXiv:1910.03193 | — | Foundational DeepONet paper. Branch network encodes input function; trunk network encodes query location. Grounded in Chen & Chen (1995) universal operator approximation theorem. |
| Fourier Neural Operator | Fourier Neural Operator for Parametric Partial Differential Equations | 2020 | Zongyi Li, Nikola Kovachki, Kamyar Azizzadenesheli, Burigede Liu, Kaushik Bhattacharya, Andrew Stuart, Anima Anandkumar | arXiv:2010.08895 | — | Introduces FNO: learns solution operator in Fourier space via parameterized integral kernel. Resolution-invariant and mesh-independent |
| POD-based Comparison & Extensions | A Comprehensive and Fair Comparison of Two Neural Operators (with Practical Extensions) Based on FAIR Data | 2022 | Lu Lu, Xuhui Meng, Shengze Cai, Zhiping Mao, Somdatta Goswami, Zhongqiang Zhang, George Em Karniadakis | arXiv:2111.05512 | GitHub | Benchmark comparison of DeepONet and FNO across 16 diverse tasks. Introduces extensions: POD-DeepONet, improved boundary condition handling, and FNO variants for complex domains. DeepONet proves more robust to noise. POD concept: precomputed POD basis as the trunk net |
| POD | Neural-POD: A Plug-and-Play Neural Operator Framework for Infinite-Dimensional Functional Nonlinear Proper Orthogonal Decomposition | 2026 | Changhong Mou, Binghang Lu, Guang Lin | arXiv:2602.15632 | — | Best. Introduces a neural operator framework that constructs nonlinear, orthogonal basis functions in infinite-dimensional space using NNs. Basis construction is posed as a sequence of residual-minimization problems solved through neural network training |
| POD-based Nonlinear Reduction | Nonlinear Model Reduction for Operator Learning | 2024 | Hamidreza Eivazi, Stefan Wittek, Andreas Rausch | arXiv:2403.18735 | GitHub | Best. Proposes KPCA-DeepONet, combining kernel principal component analysis with DeepONet for nonlinear model order reduction. Extends POD-based approaches with nonlinear dimensionality reduction, achieving superior performance on operator learning benchmarks |
| POD-based Ensemble & Mixture-of-Experts | Ensemble and Mixture-of-Experts DeepONets For Operator Learning | 2025 | Ramansh Sharma, Varun Shankar | arXiv:2405.11907 | GitHub | Interesting. Introduces ensemble DeepONets with multiple distinct trunk networks and a Partition-of-Unity Mixture-of-Experts (PoU-MoE) architecture. Achieves 2-4x lower errors than standard DeepONets. Combines POD modes with MoE for improved expressivity and spatial locality |
| POD-based | PODNO: Proper Orthogonal Decomposition Neural Operators | 2025 | Zilan Cheng, Zhongjian Wang, Li-Lian Wang, Mejdi Azaiez | arXiv:2504.18513 | — | Proper Orthogonal Decomposition Neural Operators (PODNO): a neural operator architecture that replaces the Fourier transform in FNO with data-driven POD basis functions to solve PDEs dominated by high-frequency oscillatory components. Unlike FNO, which truncates high-frequency modes, PODNO computes an optimal basis from training snapshots that captures both low- and high-frequency content. Cool GSO. PODNO outperforms FNO on high-frequency problems |
| PI-DeepONets | Learning the solution operator of parametric partial differential equations with physics-informed DeepONets | 2021 | Sifan Wang, Hanwen Wang, Paris Perdikaris | DOI | GitHub | Physics-informed DeepONet: a framework for rapidly predicting solutions of various types of parametric PDEs; biases DeepONet outputs toward physically consistent predictions |
| PI-DeepONets | Separable Physics-Informed DeepONet: Breaking the Curse of Dimensionality in Physics-Informed Machine Learning | 2024 | Luis Mandla, Somdatta Goswami, Lena Lamberts, Tim Ricken | arXiv:2407.15887 | GitHub | Tackles the curse of dimensionality by introducing a separable physics-informed DeepONet (Sep-PI-DeepONet): coordinates are factorized, with a separate sub-network for each one-dimensional coordinate |
| One-shot Learning for PDEs | One-shot learning for solution operators of partial differential equations | 2025 | Anran Jiao, Haiyang He, Rishikesh Ranade, Jay Pathak, Lu Lu | DOI | GitHub | One-shot PDE solving (learning a PDE solution operator from only one data point): predicts solutions for new input functions via mesh-based fixed-point iteration or meshfree neural-network approaches. Three methods: fixed-point iteration (FPI), local-solution-operator-informed neural network (LOINN), and LOINN with correction (cLOINN) |
| Domain decomposition DeepONets | DD-DeepONet: Domain decomposition and DeepONet for solving partial differential equations in three application scenarios | 2025 | Bo Yang, Xingquan Li, Jie Zhao, Ying Jiang | arXiv | — | Introduces a framework for decomposing complex geometries into simple structures and vice versa. Uses domain decomposition methods (DDM) to break a problem into several smaller subproblems |
| Efficiency & Acceleration | DeepONet Augmented by Randomized Neural Networks for Efficient Operator Learning in PDEs | 2025 | Zhaoxi Jiang, Fei Wang | arXiv:2503.00317 | GitHub | Integrates randomized neural networks (RaNNs) with the DeepONet architecture to balance accuracy and efficiency. Also develops PI-RaNN-DeepONets |
| POD | Neural Basis Functions for Accelerating Solutions to High Mach Euler Equations | 2022 | David Witman, Alexander New, Hicham Alkandry, Honest Mrema | arXiv:2208.01687 | — | Neural Basis Functions (NBF) = POD in DeepONets: starts from the Finite Element Method (FEM) approach and uses a set of approximating neural networks to learn the underlying bases and explicit unknowns of the governing equations |
| PI | Targeted Digital Twin via Flow Map Learning and Its Application to Fluid Dynamics | 2025 | Qifan Chen, Zhongshu Xu, Jinjin Zhang, Dongbin Xiu | arXiv:2510.07549 | — | Targeted digital twin: directly models the dynamics of specific quantities of interest by learning from short simulation bursts of a full digital twin. Not exactly what we need |
| PI | Canonical and Noncanonical Hamiltonian Operator Inference | 2023 | Anthony Gruber, Irina Tezaur | arXiv:2304.06262 | — | Interesting. Structure-preserving model reduction of canonical and non-canonical Hamiltonian systems. Builds a POD of the Hamiltonian; the math may be useful for physical systems |
| PI | Hamiltonian Learning using Machine Learning Models Trained with Continuous Measurements | 2025 | Kris Tucker, Amit Kiran Rege, Conor Smith, Claire Monteleoni, Tameem Albash | arXiv:2404.05526 | — | Qubit simulation using supervised and unsupervised learning. The second maps Hamiltonian parameters to a measurement record using an integrator with an RNN. Interesting connection between LSTMs and solvers (integrators) |
| PI | FourCastNet: A Global Data-driven High-resolution Weather Model using Adaptive Fourier Neural Operators | 2022 | Jaideep Pathak, Shashank Subramanian, Peter Harrington, Sanjeev Raja, Ashesh Chattopadhyay, Morteza Mardani, Thorsten Kurth, David Hall, Zongyi Li, Kamyar Azizzadenesheli, Pedram Hassanzadeh, Karthik Kashinath, Animashree Anandkumar | arXiv:2202.11214 | GitHub | FNO applied to global weather forecasting. Generates a week-long forecast in under 2 seconds |
| PI | Physics-informed machine learning | 2021 | George Em Karniadakis, Ioannis G. Kevrekidis, Lu Lu, Paris Perdikaris, Sifan Wang, Liu Yang | DOI | — | Foundational principles of physics-informed ML systems. Core idea: integrating physical laws (PDEs) directly into the training process of NNs, which allows models to learn from both data and underlying mathematical principles |
| PI-ROM | Physics-informed non-intrusive reduced-order modeling of parameterized dynamical systems | 2025 | Himanshu Dave, Leo Cotteleer, Alessandro Parente | DOI:10.1016 | — | Physics-informed non-intrusive ROM framework for parametrized PDEs. Uses POD decomposition to extract POD modes from high-fidelity training data. Outperforms vanilla POD + ANN ROM by one order of magnitude in relative error. Includes self-calibration technique for loss function hyperparameter tuning. |
| Neural POD | OrthoSolver: A Neural Proper Orthogonal Decomposition Solver For PDEs | 2026 | Anonymous authors (paper under double-blind review) | OpenReview | — | OrthoSolver: a neural POD framework that generalizes POD's energy-maximization principle from an information-theoretic perspective. Maximizes mutual information between modes and data rather than just variance. Outperforms Neural-POD under different constraints |
| POD | Geometry-agnostic model reduction with GNN-generated reduced POD bases and boosted PGD enrichment for (non)linear structural elastodynamics | 2025 | Victor Matray, David Néron, Frédéric Feyel, Faisal Amlani | HALscience | — | Hybrid method that combines GNNs with POD to create reduced-order models. Uses a Grassmannian distance as a training objective and introduces a "Boosted PGD" enrichment mechanism to improve accuracy for both linear and nonlinear structural dynamics problems. Good handling of geometry |
| PI | Physics-informed neural network for volumetric sound field reconstruction of speech signals | 2024 | Marco Olivieri, Xenofon Karakonstantis, Mirco Pezzoli, Fabio Antonacci, Augusto Sarti, Efren Fernandez-Grande | Springer Nature | — | PINN-based approach for recovering arbitrary volumetric acoustic fields. The method learns the physical law of sound propagation; kernel ridge regression (KRR) is mentioned. Uses the inhomogeneous wave equation, which the NN is trained to satisfy. The architecture includes a SIREN-inspired neural network |
| Activations | Implicit Neural Representations with Periodic Activation Functions | 2020 | Vincent Sitzmann, Julien N. P. Martel, Alexander W. Bergman, David B. Lindell, Gordon Wetzstein | arXiv:2006.09661 | GitHub | Proposes periodic activation functions (SIREN) and solves boundary conditions with this method. Better than ReLU for shape representations; also solves the Helmholtz equation with this technique |
| Data | PDEBench: An Extensive Benchmark for Scientific Machine Learning | 2022 | Makoto Takamoto, Timothy Praditia, Raphael Leiteritz, Dan MacKinlay, Francesco Alesiani, Dirk Pflüger, Mathias Niepert | arXiv:2210.07182 | GitHub | Best. Benchmark suite with datasets for 11 PDEs (Burgers, Navier-Stokes, Darcy flow, etc.), baseline models (FNO, U-Net, PINN), and standardized evaluation metrics for operator learning |
| Data | Arc-lake v1.1: Per-lake Environmental and Climate Forcing Data (1995–2009) | 2011 | MacCallum S, Merchant C | DOI:10.7488/ds/159 | — | Long-term environmental and climate forcing data for multiple lakes. Contains meteorological variables and water quality parameters for hydrodynamic modeling. |
| Data | PAMAP2 Physical Activity Monitoring Dataset | 2012 | Reiss, A. | UCI ML Repository | — | Wearable sensor dataset with IMU data for physical activity recognition. Benchmark dataset for human activity classification and motion analysis. |
| Data | CARDIODAT: PTB Electrocardiogram Signal Database | 1995 | Bousseljot R, Kreiseler D, Schnabel A | PhysioNet | — | ECG database for cardiac arrhythmia detection and cardiovascular signal processing validation. |
| Manifold learning | A kernel ridge regression combining nonlinear ROMs for accurate flow-field reconstruction with discontinuities | 2025 | Weiji Wang, Chunlin Gong, Xuyi Jia, Chunna Li | arXiv:2502.11363 | — | Proposes the KRR-DCR method to reconstruct discontinuous regions accurately. Constructs a set of modes that reconstruct discontinuous flow fields nonlinearly. Demonstrates better accuracy and physical consistency than POD and NPBM; the modes obtained via KRR capture local discontinuities more precisely than POD modes |
| Manifold learning | PDE-Free Mass-Constrained Learning of Complex Systems with Hidden States | 2026 | Gianmaria Viola, Alessandro Della Pia, Lucia Russo, Ioannis Kevrekidis, Constantinos Siettos | arXiv:2510.17657 | GitHub | Interesting. Next-generation equation-free algorithms for learning the spatio-temporal dynamics of mass-constrained complex systems with hidden states: Diffusion Maps + ROMs with Sparse Identification of Nonlinear Dynamics and multivariate autoregressive models + convex interpolation based on kNN. Proves that both the linear POD and k-NN lifting operators preserve mass (the constraints). Methods mentioned: kernel-PCA, Laplacian Eigenmaps, Diffusion Maps, and Autoencoders, but only PODs (trivial) and DMs (out-of-sample extension, pre-image) are used. Good theory for PODs and also DMs |
| EM Algorithm | Maximum Likelihood from Incomplete Data via the EM Algorithm | 1977 | A. P. Dempster, N. M. Laird, D. B. Rubin | Paper | — | Foundational: introduces the EM algorithm for MLE with latent variables. Proves monotone likelihood convergence; essential |
| Mixture | Local Maxima in the Likelihood of Gaussian Mixture Models: Structural Results and Algorithmic Consequences | 2016 | Jin, Zhang, Balakrishnan, Wainwright, Jordan | arXiv:1609.00978 | — | EM can converge to arbitrarily bad local maxima even for well-separated Gaussians; multiple EM runs are needed for robustness. Uses k-means for initialization |
| EM + GMM Implementation | Expectation-Maximization for Gaussian Mixture Model from Scratch | 2021 | S. Agac | Tutorial | GitHub | Complete GMM-EM implementation with k-means++ initialization |
| Clustering | Considerably Improving Clustering Algorithms Using UMAP Dimensionality Reduction Technique: A Comparative Study | 2020 | Mebarka Allaoui, Mohammed Lamine Kherfi, Abdelhakim Cheriet | SpringerNature | — | Link to ICISP book. Using UMAP dimensionality reduction as a preprocessing step can significantly improve the performance of common clustering algorithms; the UMAP embedding helps overcome the curse of dimensionality |
| Clustering | Nonlinear Model Reduction Using Space Vectors Clustering (SVC) POD with Application to the Burgers Equation | 2014 | Samir Sahyoun, Seddik Djouadi | ResearchGate | — | Introduces Space Vector Clustering (SVC): proposes clustering the spatial domain based on the solution's behavior over all time at each spatial location |
| Clustering | Nonlinear model reduction for transport-dominated problems | 2026 | Jan S. Hesthaven, Benjamin Peherstorfer, Benjamin Unger | arXiv:2602.01397 | — | Introduces linear model reduction with a good pipeline. Three routes to nonlinear model reduction: nonlinear parametrizations, reduced dynamics, and online solvers. Interpolation and theory |
| Clustering | Analysis of an innovative sampling strategy based on k-means clustering algorithm for POD and POD-DEIM reduced order models of a 2-D reaction-diffusion system | 2023 | Enrico Alberto Cutillo, Gianmarco Petito, Katarzyna Bizon, Gaetano Continillo | Paper | — | Investigates model-order reduction using k-means for sampling snapshots, so the model is stable and efficient, achieving a speedup of up to 1020 times compared to the original model (no full-text access) |
| Local ROM | Nonlinear Model Order Reduction Based on Local Reduced-Order Bases | 2012 | David Amsallem, Matthew J. Zahr, Charbel Farhat | DOI:10.1002/nme.4371 | — | Local ROM paper: partitions the snapshot state space via k-means, builds one POD basis per cluster offline, and selects the nearest basis online via hard assignment |
| Local ROM | A Localized Reduced-Order Modeling Approach for PDEs with Bifurcating Solutions | 2019 | Martin Hess, Alessandro Alla, Annalisa Quaini, Gianluigi Rozza, Max Gunzburger | arXiv:1807.08851 | — | Applies k-means clustering to snapshots and constructs local POD bases per cluster, targeting PDEs whose solutions bifurcate across parameter regimes. Hard cluster assignments; offline only |
| Local ROM | Model Order Reduction Assisted by Deep Neural Networks (ROM-net) | 2020 | Thomas Daniel, Fabien Casenave, Nissrine Akkari, David Ryckelynck | DOI:10.1186/s40323-020-00153-6 | — | Clusters solution subspaces on the Grassmann manifold via k-medoids with a ROM-oriented dissimilarity (principal angles), builds a dictionary of local POD+DEIM ROMs, and trains a DNN ensemble classifier to recommend the appropriate model at inference. Offline fixed snapshot pipeline; hard assignment |
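To make the DeepONet row (Lu et al., 2019) concrete, here is a minimal NumPy sketch of the branch-trunk contraction G(u)(y) ≈ ⟨branch(u), trunk(y)⟩. Layer sizes, the toy initializer, and the random inputs are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes):
    """Toy random initializer for a small MLP (illustrative only)."""
    return [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp(params, x):
    """Forward pass: tanh hidden layers, linear output layer."""
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

# Branch net encodes the input function u sampled at m sensor points;
# trunk net encodes a query coordinate y; both map to p latent features.
m, p = 32, 16
branch = init_mlp([m, 64, p])
trunk = init_mlp([1, 64, p])

def deeponet(u_sensors, y):
    """G(u)(y) approximated as a dot product of branch and trunk features."""
    b = mlp(branch, u_sensors)   # (batch, p)
    t = mlp(trunk, y)            # (n_query, p)
    return b @ t.T               # (batch, n_query)

u = rng.standard_normal((4, m))       # 4 input functions at 32 sensors
y = np.linspace(0, 1, 10)[:, None]    # 10 query points
out = deeponet(u, y)
print(out.shape)                      # prints (4, 10)
```

The untrained output is of course meaningless; the point is the shape contract: one scalar per (input function, query point) pair, which is what makes the evaluation mesh-free in y.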
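The FNO row (Li et al., 2020) hinges on one operation: a spectral convolution that multiplies the lowest Fourier modes by learned complex weights and discards the rest. A single-channel 1D sketch in NumPy, with made-up weights in place of trained ones:

```python
import numpy as np

rng = np.random.default_rng(0)

def spectral_conv_1d(v, weights, modes):
    """One FNO-style spectral convolution layer, 1D, single channel:
    FFT the signal, scale the lowest `modes` frequencies by complex
    weights, zero the higher frequencies, and inverse-FFT back."""
    v_hat = np.fft.rfft(v)
    out_hat = np.zeros_like(v_hat)
    out_hat[:modes] = v_hat[:modes] * weights
    return np.fft.irfft(out_hat, n=len(v))

n, modes = 128, 16
weights = rng.standard_normal(modes) + 1j * rng.standard_normal(modes)

x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
v = np.sin(3 * x) + 0.5 * np.cos(7 * x)   # toy input signal
out = spectral_conv_1d(v, weights, modes)
print(out.shape)                          # prints (128,)
```

Because the layer acts on frequencies rather than grid points, the same weights can be applied at any resolution, which is the source of FNO's mesh independence (and, per the PODNO row, also why it truncates high-frequency content).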
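Many rows above (POD-DeepONet, Neural-POD, PODNO, the PI-ROM and local-ROM papers) build on the same POD-by-SVD primitive. A minimal sketch, assuming a snapshot matrix with one snapshot per column and an energy-based truncation rule; the function names and toy data are ours:

```python
import numpy as np

def pod_basis(snapshots, energy=0.99):
    """POD basis from a snapshot matrix X (n_dof x n_snap) via thin SVD.
    Keeps the smallest r such that the retained singular-value energy
    (sum of squared singular values) reaches the `energy` fraction."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cum = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(cum, energy)) + 1
    return U[:, :r], s

# Toy snapshots: decaying sine profile plus a small oscillatory perturbation.
x = np.linspace(0.0, 1.0, 100)
snaps = np.stack(
    [np.exp(-t) * np.sin(np.pi * x) + 0.01 * np.sin(3 * np.pi * x * t)
     for t in np.linspace(0.1, 2.0, 50)],
    axis=1,
)

Phi, s = pod_basis(snaps, energy=0.999)
coeffs = Phi.T @ snaps        # reduced coordinates (Galerkin projection)
recon = Phi @ coeffs          # lift back to full space
rel_err = np.linalg.norm(recon - snaps) / np.linalg.norm(snaps)
print(Phi.shape[1], rel_err)
```

The truncation rule guarantees the relative Frobenius reconstruction error is at most sqrt(1 - energy); the POD-DeepONet row amounts to using `Phi` as a fixed trunk net.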
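The three Local ROM rows share one pipeline: cluster the snapshots offline, build one POD basis per cluster, and hard-assign a basis online. A NumPy sketch in the spirit of Amsallem et al. (2012), with a hand-rolled k-means (deterministic farthest-point init instead of the papers' choices) and toy two-regime data:

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Tiny Lloyd's k-means on rows of X; farthest-point init (sketch)."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(X[int(np.argmax(d))])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), 1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(0)
    return labels, centers

def local_pod_bases(snaps, k, r):
    """Offline stage: cluster snapshots (columns), one rank-r POD basis
    per cluster."""
    labels, centers = kmeans(snaps.T, k)
    bases = []
    for j in range(k):
        U, _, _ = np.linalg.svd(snaps[:, labels == j], full_matrices=False)
        bases.append(U[:, :min(r, U.shape[1])])
    return bases, centers

def nearest_basis(state, bases, centers):
    """Online stage: hard assignment to the closest cluster center."""
    return bases[int(np.argmin(((centers - state) ** 2).sum(1)))]

# Toy snapshots from two well-separated regimes (illustrative data).
x = np.linspace(0.0, 1.0, 64)
regime_a = np.stack([1.0 + a * np.sin(np.pi * x) for a in np.linspace(0.2, 1.0, 20)], 1)
regime_b = np.stack([-1.0 + a * x for a in np.linspace(0.2, 1.0, 20)], 1)
snaps = np.hstack([regime_a, regime_b])

bases, centers = local_pod_bases(snaps, k=2, r=3)
u = regime_a[:, 0]
Phi = nearest_basis(u, bases, centers)
err = np.linalg.norm(u - Phi @ (Phi.T @ u)) / np.linalg.norm(u)
print(f"reconstruction error in local basis: {err:.2e}")
```

Each local basis only has to represent one regime, so it can be much smaller than a single global POD basis of the same accuracy; the ROM-net row replaces the nearest-center rule with a trained DNN classifier.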
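For the EM rows (Dempster et al., 1977, and the GMM tutorial), a minimal 1D Gaussian-mixture EM in NumPy. The quantile-based mean initialization is our cheap deterministic stand-in for the k-means++ init the tutorial row uses; all data are synthetic:

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=100):
    """Minimal EM for a 1D Gaussian mixture (sketch of the classic
    algorithm). Returns mixture weights, means, and variances."""
    n = len(x)
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))   # quantile init
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i),
        # computed in log space for numerical stability.
        log_p = (-0.5 * (x[:, None] - mu) ** 2 / var
                 - 0.5 * np.log(2 * np.pi * var) + np.log(pi))
        log_p -= log_p.max(1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(1, keepdims=True)
        # M-step: closed-form responsibility-weighted MLE updates.
        nk = r.sum(0)
        pi = nk / n
        mu = (r * x[:, None]).sum(0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(0) / nk
    return pi, mu, var

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-3, 1, 500), rng.normal(3, 1, 500)])
pi, mu, var = em_gmm_1d(x)
print(np.sort(mu))   # close to [-3, 3] for this well-separated mixture
```

Each iteration provably does not decrease the likelihood (the monotonicity result of the 1977 row), but as the Jin et al. row warns, convergence to the global maximum is not guaranteed, hence multi-start initialization in practice.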