2109.00092
GFINNs: GENERIC Formalism Informed Neural Networks for Deterministic and Stochastic Dynamical Systems
Zhen Zhang, Yeonjong Shin, George Em Karniadakis
incomplete (high confidence)
- Category
- Not specified
- Journal tier
- Specialist/Solid
- Processed
- Sep 28, 2025, 12:56 AM
- arXiv Links
- Abstract ↗ · PDF ↗
Audit review
Case 2b: The paper’s construction A_NN(z) = Q_G(z)^T B_NN(z) Q_G(z) and the proof of Theorem 2 are coherent and correct under the stated hypotheses (compact Ω, a nonvanishing component of ∇G, A skew-symmetric or symmetric positive semi-definite, A∇G = 0), culminating in a uniform approximation of A with the degeneracy condition preserved; see the statement and proof sketch using q̃_k = P_k∇G and the reduction to approximating Λ̃(z) (Theorem 2 and its proof).

Case 1: The paper’s Theorem 1 argues that there exists f with ∇f∘P_A ≈ c_A∘P_A by invoking a generic universal approximation theorem, but it never justifies that the target field c_A|_{P_A(Ω)} is (even approximately) a gradient field, which is generally false without topological/integrability conditions; the proof sketch therefore has a gap.

The candidate model handles Case 2b correctly (using Q_G built from a skew basis and a pseudoinverse factorization), but in Case 1 it asserts that closedness of the 1-form P_A^*(c_A·dξ) implies that c_A is exact on each connected component of M, which does not hold without additional assumptions (e.g., H^1(M) = 0). Hence both the paper and the candidate model are incomplete for Case 1, while both handle Case 2b correctly.
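The Case 2b construction is purely algebraic, so its degeneracy preservation can be checked directly. Below is a minimal NumPy sketch (illustrative only, not the paper’s implementation): G, the skew matrices P_k, the core matrix B, and all dimensions are placeholder choices, whereas in GFINNs the P_k and B would be neural-network parametrized. It shows that A(z) = Q_G(z)^T B Q_G(z), with rows q_k = P_k ∇G(z) and P_k skew-symmetric, satisfies A(z)∇G(z) = 0 by construction and inherits skewness or SPSD-ness from B.

```python
import numpy as np

# Minimal sketch: build A(z) = Q_G(z)^T B Q_G(z) so that A(z) @ grad_G(z) == 0
# holds identically. Rows of Q_G are q_k = P_k @ grad_G(z) with P_k skew, hence
# Q_G(z) @ grad_G(z) = 0 regardless of how P_k and B are chosen. Fixed random
# matrices stand in for network outputs purely to illustrate the structure.

rng = np.random.default_rng(0)
n, K = 4, 3  # state dimension and number of rows of Q_G (illustrative)

def skew(W):
    """Return the skew-symmetric part of a square matrix."""
    return 0.5 * (W - W.T)

# Placeholder "parameters" standing in for neural-network outputs.
P = [skew(rng.standard_normal((n, n))) for _ in range(K)]  # skew basis P_k
C = rng.standard_normal((K, K))
B_spsd = C.T @ C                                           # SPSD core matrix
B_skew = skew(rng.standard_normal((K, K)))                 # skew core matrix

def grad_G(z):
    """Gradient of a stand-in scalar function G(z) = 0.5 * ||z||^2."""
    return z

def build_A(z, B):
    """A(z) = Q_G(z)^T B Q_G(z), rows of Q_G orthogonal to grad_G(z)."""
    g = grad_G(z)
    Q = np.stack([Pk @ g for Pk in P])   # K x n, row k is P_k grad_G(z)
    return Q.T @ B @ Q                   # n x n

z = rng.standard_normal(n)
A_sym = build_A(z, B_spsd)   # SPSD output
A_skw = build_A(z, B_skew)   # skew-symmetric output

print(np.allclose(A_sym @ grad_G(z), 0.0))  # True: degeneracy preserved
print(np.allclose(A_skw @ grad_G(z), 0.0))  # True: degeneracy preserved
print(np.allclose(A_skw, -A_skw.T))         # True: skewness inherited from B
```

Because each row q_k of Q_G satisfies q_k · ∇G = (∇G)^T P_k ∇G = 0, any core matrix B yields A∇G = Q_G^T B (Q_G ∇G) = 0, which is the mechanism Theorem 2 exploits.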
Referee report (LaTeX)
\textbf{Recommendation:} major revisions
\textbf{Journal Tier:} specialist/solid
\textbf{Justification:}
Strong contribution on structure-preserving learning for GENERIC systems with convincing Case 2b theory and practice. However, the universality proof for Case 1 relies on approximating an arbitrary continuous vector field on the reduced space by gradients of scalar networks without establishing conservativity or adding topological conditions. This is a substantive but fixable gap. Clarifying assumptions and providing a correct integrability argument (or narrowing the target class) would make the paper rigorous end-to-end.
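For concreteness, the obstruction in Case 1 is the standard closed-but-not-exact phenomenon; the sketch below is a textbook example (not taken from the paper) of the kind a revision could cite to motivate the extra topological assumption.

% Illustrative counterexample (standard, not from the paper): on a domain with
% nontrivial first cohomology, a closed 1-form need not be exact, so a continuous
% vector field need not be a gradient field.
Take $M = \mathbb{R}^2 \setminus \{0\}$ and
\[
  c(x,y) = \Bigl( \tfrac{-y}{x^2+y^2},\ \tfrac{x}{x^2+y^2} \Bigr).
\]
The 1-form $c \cdot \mathrm{d}\xi$ is closed, yet
$\oint_{S^1} c \cdot \mathrm{d}\xi = 2\pi \neq 0$, so there is no scalar $f$ with
$\nabla f = c$ on all of $M$. Exactness is recovered only under additional
assumptions such as $H^1(M) = 0$ (e.g., simply connected $M$), which is the
condition missing from the Case~1 argument.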