2102.11923
UNIVERSAL APPROXIMATION PROPERTIES OF NEURAL NETWORKS FOR ENERGY-BASED PHYSICAL SYSTEMS
Yuhan Chen, Takashi Matsubara, Takaharu Yaguchi
incomplete (medium confidence)
- Category
- math.DS
- Journal tier
- Specialist/Solid
- Processed
- Sep 28, 2025, 12:56 AM
- arXiv Links
- Abstract ↗ · PDF ↗
Audit review
The paper’s Theorem 4 claims a universal approximation result for the transformed vector field (Du)^{-1} G (Du)^{-T} ∇H, but its proof relies only on pointwise continuity of the determinant and of the matrix-inverse map; it never establishes a uniform margin of invertibility, nor a quantitative perturbation bound on the inverse Jacobian, over the compact phase space. This is insufficient. The candidate solution supplies the missing uniform stability arguments: a minimum singular value gap on K, a Neumann-series bound for the perturbed inverse, and a clean error decomposition. Together these yield a complete and correct proof under the paper’s assumptions.
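A minimal sketch of the Neumann-series stability bound the audit refers to. The notation here is assumed, not taken from the paper: write $A = Du(z)$, suppose $\sigma_{\min}(A) \ge \sigma_0 > 0$ uniformly over the compact set $K$, and let $E$ denote the Jacobian perturbation with $\|E\| \le \sigma_0/2$:

```latex
% Assumed setting: A = Du(z), sigma_min(A) >= sigma_0 > 0 on K,
% perturbation E = D\tilde{u}(z) - Du(z) with \|E\| <= sigma_0 / 2.
\[
  \|A^{-1}\| \le \sigma_0^{-1}, \qquad
  (A+E)^{-1} = \sum_{k=0}^{\infty} \bigl(-A^{-1}E\bigr)^{k} A^{-1},
\]
% The series converges because \|A^{-1}E\| <= \|A^{-1}\|\,\|E\| <= 1/2 < 1,
% and it gives the quantitative perturbation bound
\[
  \bigl\|(A+E)^{-1} - A^{-1}\bigr\|
  \;\le\; \frac{\|A^{-1}\|^{2}\,\|E\|}{1 - \|A^{-1}\|\,\|E\|}
  \;\le\; \frac{2\,\|E\|}{\sigma_0^{2}},
\]
% uniformly over K, since \|A^{-1}\|\,\|E\| <= 1/2.
```

This is the standard step that converts pointwise continuity of inversion into a uniform bound on the compact set, which is exactly what the theorem's proof is missing.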
Referee report (LaTeX)
\textbf{Recommendation:} major revisions
\textbf{Journal Tier:} specialist/solid
\textbf{Justification:}
The latent-variable universal approximation theorem targets a practically important setting, but its proof is incomplete. The argument invokes continuity of the determinant and of matrix inversion without establishing a uniform spectral gap on the compact phase space or providing quantitative stability bounds for the inverse Jacobian. These omissions can be remedied with standard perturbation techniques, and doing so would materially improve rigor and clarity. The remainder of the paper is clear and useful; with these revisions the contribution would be solid.
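The error decomposition mentioned in the audit can be sketched as a standard telescoping split. The hatted symbols below denote the learned approximants and $J = Du$; this notation is assumed for illustration, not taken from the paper:

```latex
% Telescoping decomposition of the approximation error (assumed notation:
% J = Du, hatted quantities are the learned approximants).
\begin{align*}
  \widehat{J}^{-1}\widehat{G}\,\widehat{J}^{-T}\nabla\widehat{H}
  - J^{-1} G\, J^{-T}\nabla H
  &= \bigl(\widehat{J}^{-1}-J^{-1}\bigr)\widehat{G}\,\widehat{J}^{-T}\nabla\widehat{H} \\
  &\quad + J^{-1}\bigl(\widehat{G}-G\bigr)\widehat{J}^{-T}\nabla\widehat{H} \\
  &\quad + J^{-1} G\,\bigl(\widehat{J}^{-T}-J^{-T}\bigr)\nabla\widehat{H} \\
  &\quad + J^{-1} G\, J^{-T}\bigl(\nabla\widehat{H}-\nabla H\bigr).
\end{align*}
```

Each term swaps one factor at a time, so each is controlled by the sup-norm error of a single component times uniform bounds on the remaining factors; the inverse-Jacobian terms are the ones requiring the uniform singular-value gap.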