2108.12453
Convolutional Autoencoders for Reduced-Order Modeling
Sreeram Venkat, Ralph C. Smith, Carl T. Kelley
correct (medium confidence)
- Category
- Not specified
- Journal tier
- Specialist/Solid
- Processed
- Sep 28, 2025, 12:56 AM
- arXiv Links
- Abstract ↗ · PDF ↗
Audit review
The paper defines a PDE-agnostic manifold LSPG ROM using a single convolutional autoencoder trained on randomized smooth functions (Brownian bridge plus spline, or trigonometric sums), then solves a per-step latent least-squares problem û_n = argmin_ξ ||r_n(x_ref + g(ξ))||_2^2 with Gauss–Newton, and demonstrates reuse of the same decoder across heat, 1D wave, Kuramoto–Sivashinsky (KS), and 2D wave models. These elements are explicitly in Section 4.1: the LSPG objective and the reference-state choice x_ref = u_0 − autoencoder(u_0), together with the residuals
- heat: r_n(y) = Ay − Bu_{n−1}
- 1D wave: r_n(y) = (4/r^2 I + K)y − 2(4/r^2 I − K)u_{n−1} + (4/r^2 I + K)u_{n−2}
- KS: r_n(y) = Ay − Du_{n−1}
- 2D wave: r_n(y) = y − (2I − K_2D)u_{n−1} + u_{n−2}
The discretizations and matrices A, B, K, K_2D and the KS coefficients are also stated in the paper (Crank–Nicolson for heat: A, B with r = Δt/Δx^2; the wave schemes; the KS finite-difference matrices with a, b_i^n, c_i^n, e; D). The training-data generation algorithms and their typical parameters N_max = 10, A_max = 5, ω_max = 10 are given (Algorithms 1–2), as are implementation remarks on computing Jacobians via autodiff versus 3-point finite differences and the associated memory/time tradeoffs. The candidate solution reproduces all of these components and adds the explicit Gauss–Newton normal equations, with Hessian approximation H = J_g^T A_n^T A_n J_g and gradient J_g^T A_n^T (A_n(x_ref + g(ξ)) − b_n), along with standard rank/regularity conditions and matrix-free product strategies. These additions are consistent with the cited LSPG setup (the paper references Gauss–Newton but does not derive the normal equations) and with the residual forms given. One minor caveat: the model assumes g ∈ C^1, while the paper's architecture uses PReLU, which is piecewise linear and not C^1 at the kink; in practice, the Jacobian exists almost everywhere and autodiff works, so this does not affect the practical method. Overall, the methods match in substance, with the model giving a more explicit optimization step and implementation detail consistent with the paper.
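The Gauss–Newton step described above (residual r_n(y) = A_n y − b_n composed with the decoder, normal matrix J_g^T A_n^T A_n J_g) can be sketched as follows. This is a minimal illustration, not the paper's implementation: `g` and `J_g` are hypothetical callables standing in for the convolutional decoder and its autodiff Jacobian, and the dense `solve` stands in for whatever (possibly matrix-free) linear solver is actually used.

```python
import numpy as np

def lspg_gauss_newton_step(A_n, b_n, x_ref, g, J_g, xi0, tol=1e-10, max_iter=50):
    """One manifold-LSPG time step: min_xi ||A_n (x_ref + g(xi)) - b_n||_2^2.

    g is the decoder and J_g its Jacobian (hypothetical callables here).
    """
    xi = np.asarray(xi0, dtype=float).copy()
    for _ in range(max_iter):
        J = A_n @ J_g(xi)                # chain rule: d r_n / d xi = A_n J_g(xi)
        r = A_n @ (x_ref + g(xi)) - b_n  # residual at the current latent iterate
        grad = J.T @ r                   # gradient J_g^T A_n^T (A_n(x_ref + g(xi)) - b_n)
        if np.linalg.norm(grad) < tol:
            break
        H = J.T @ J                      # Gauss-Newton normal matrix J_g^T A_n^T A_n J_g
        xi -= np.linalg.solve(H, grad)   # Gauss-Newton update (assumes H nonsingular)
    return xi
```

With a linear decoder this reduces to ordinary linear least squares and converges in one iteration, which makes it easy to sanity-check before swapping in a trained network.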
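The trigonometric-sum training data can be sketched in the same spirit. N_max, A_max, and ω_max match the typical parameters reported for Algorithms 1–2, but the specific uniform sampling distributions and the phase term are assumptions of this sketch, not the paper's exact recipe (and the Brownian-bridge-plus-spline variant of Algorithm 1 is omitted).

```python
import numpy as np

def random_trig_function(x, rng, N_max=10, A_max=5.0, omega_max=10.0):
    """Randomized smooth training sample as a trigonometric sum.

    Assumed sampling: mode count, amplitudes, frequencies, and phases
    drawn uniformly; only the parameter ranges come from the paper.
    """
    N = rng.integers(1, N_max + 1)                  # number of modes
    amps = rng.uniform(-A_max, A_max, size=N)       # amplitudes in [-A_max, A_max]
    omegas = rng.uniform(0.0, omega_max, size=N)    # frequencies in [0, omega_max]
    phases = rng.uniform(0.0, 2.0 * np.pi, size=N)  # random phase shifts
    return sum(a * np.sin(w * x + p) for a, w, p in zip(amps, omegas, phases))

# Example: a batch of samples on a uniform grid, e.g. for autoencoder training.
rng = np.random.default_rng(42)
x = np.linspace(0.0, 1.0, 128)
batch = np.stack([random_trig_function(x, rng) for _ in range(16)])
```

The point of such PDE-agnostic data is that the samples bound the amplitude (at most N_max · A_max here) and frequency content without referencing any particular equation's solutions.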
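The autodiff-versus-finite-differences remark can also be made concrete. A 3-point (central) difference Jacobian of the decoder costs 2·dim(ξ) decoder evaluations per call but needs no autodiff graph in memory, which is the memory/time tradeoff at issue; this helper is a generic sketch, not code from the paper.

```python
import numpy as np

def fd_jacobian(g, xi, h=1e-6):
    """3-point (central) finite-difference Jacobian of a decoder g at xi.

    Each column uses two decoder evaluations, g(xi + h e_j) and g(xi - h e_j),
    trading extra forward passes for not storing an autodiff graph.
    """
    xi = np.asarray(xi, dtype=float)
    k = xi.size
    g0 = np.asarray(g(xi))
    J = np.empty((g0.size, k))
    for j in range(k):
        e = np.zeros(k)
        e[j] = h                                                   # perturb one latent coordinate
        J[:, j] = (np.asarray(g(xi + e)) - np.asarray(g(xi - e))) / (2.0 * h)
    return J
```

For small latent dimensions (as in the paper's examples) the 2k forward passes are cheap, which is when the finite-difference route tends to win.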
Referee report (LaTeX)
\textbf{Recommendation:} minor revisions
\textbf{Journal Tier:} specialist/solid
\textbf{Justification:}
The submission convincingly demonstrates a PDE-agnostic manifold ROM workflow using a single autoencoder trained on randomized smooth functions, then applied via manifold LSPG to heat, wave (1D/2D), and KS models. The approach is practically valuable, and the implementation guidance (Jacobians via autodiff versus 3-point finite differences, dimension choices) is useful. Minor issues include the light treatment of Gauss--Newton regularity assumptions and a couple of small referencing slips; addressing these would improve rigor and clarity. Overall, a solid and well-motivated contribution.