arXiv:2104.13911

Discovery of Slow Variables in a Class of Multiscale Stochastic Systems via Neural Networks

Przemysław Zieliński, Jan S. Hesthaven

incomplete · medium confidence
Category
math.DS
Journal tier
Specialist/Solid
Processed
Sep 28, 2025, 12:56 AM

Audit review

The paper informally argues that once an encoder–decoder N = D ∘ E approximates the slow projection P, encoder level sets must align with fast fibers, and even states that they are D_f-dimensional closed submanifolds. However, it supplies neither the regularity nor the identifiability assumptions needed to justify “submanifold” and “dimension D_f,” nor does it make explicit when E must be constant along each fiber. The candidate solution repairs these gaps: it (i) derives that each level set of E lies in a unique fast fiber from N = D ∘ E = P together with the injectivity of P across fibers, (ii) notes that E is constant on each fiber once D is injective, i.e., once D parametrizes the slow manifold S, and (iii) invokes the Regular Value (Preimage) Theorem to conclude that level sets at regular values are D_f-dimensional embedded submanifolds, closed because they are preimages of closed sets under a continuous map. This matches the paper’s intended claims while supplying the missing hypotheses and the standard theorem. Compare the paper’s sketch in Section 4, which asserts these conclusions without the needed conditions, and its definition of slow maps via alignment with fast fibers in Section 3.4.
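To make the intended geometry concrete, a minimal numerical sketch follows. Everything in it is an illustrative assumption rather than the paper’s construction: a two-dimensional toy system whose fast fibers are vertical lines, a slow manifold x2 = sin(x1) (so P(x1, x2) = (x1, sin x1)), small MLP architectures, and a direct least-squares fit of N = D ∘ E to the known P (in the paper, P is of course not available to the learner).

# Hypothetical toy setup: fast fibers are vertical lines {x1 = const},
# the slow manifold is S = {(s, sin s)}, so the slow projection is
# P(x1, x2) = (x1, sin x1). None of this comes from the paper.
import jax
import jax.numpy as jnp

def init_mlp(key, sizes):
    # Small fully connected network: a list of (W, b) layer pairs.
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

def mlp(params, x):
    for W, b in params[:-1]:
        x = jnp.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

def P(x):  # toy slow projection, constant on each vertical fiber
    return jnp.stack([x[..., 0], jnp.sin(x[..., 0])], axis=-1)

key = jax.random.PRNGKey(0)
k_e, k_d, k_x = jax.random.split(key, 3)
params = (init_mlp(k_e, [2, 32, 1]),   # encoder E: R^2 -> R (one slow variable)
          init_mlp(k_d, [1, 32, 2]))   # decoder D: R -> R^2 (point near S)
X = 2.0 * jax.random.normal(k_x, (512, 2))  # off-manifold training states

def loss(params, X):
    E_p, D_p = params
    N = mlp(D_p, mlp(E_p, X))           # N = D ∘ E
    return jnp.mean(jnp.sum((N - P(X)) ** 2, axis=-1))

@jax.jit
def step(params, X):
    grads = jax.grad(loss)(params, X)
    return jax.tree_util.tree_map(lambda p, g: p - 1e-2 * g, params, grads)

for _ in range(3000):
    params = step(params, X)

# Two states on the same fast fiber (x1 = 0.5, different fast coordinates):
fiber = jnp.array([[0.5, -1.0], [0.5, 2.0]])
print(mlp(params[0], fiber))                  # encoder values: nearly equal if the
                                              # fit succeeded and D is injective
print(mlp(params[1], mlp(params[0], fiber)))  # both rows close to (0.5, sin 0.5)

If the fit succeeds and D stays injective, the two printed encoder values should nearly coincide, which is exactly the constancy-on-fibers property discussed above; the second print checks N ≈ P along the fiber.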

Referee report (LaTeX)

\textbf{Recommendation:} major revisions

\textbf{Journal Tier:} specialist/solid

\textbf{Justification:}

The core methodological idea is solid and well-motivated, and the empirical sections convincingly demonstrate the intended geometric alignment. However, a few central claims about encoder level sets (dimension $D_f$, embedded submanifold status, and strict constancy on fibers) are asserted without the mild but necessary mathematical conditions. Making these assumptions explicit and adding a short, formal proposition with proof would bring the paper to a robust standard without altering its contributions.
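For concreteness, one possible shape for such a proposition is sketched below, using the review’s notation ($E$, $D$, $P$, $D_f$), an ambient split $\mathbb{R}^{D_s + D_f}$ introduced here only for bookkeeping, and stated for the idealized exact case $N = P$ (the paper trains $N \approx P$).

\textbf{Proposition (sketch):} Assume the slow projection $P$ is constant on each fast fiber and separates fibers, i.e., $P(x) = P(x')$ only if $x$ and $x'$ lie in the same fiber. Let $E \colon \mathbb{R}^{D_s + D_f} \to \mathbb{R}^{D_s}$ be smooth, let $D$ be continuous, and suppose $N = D \circ E = P$. Then: (i) every nonempty level set $E^{-1}(y)$ is contained in a single fast fiber, since $P \equiv D(y)$ on $E^{-1}(y)$ and $P$ separates fibers; (ii) if $D$ is injective, then $E$ is constant on each fiber, since $D \circ E = P$ is constant there; (iii) if $y$ is a regular value of $E$, the Regular Value Theorem gives that $E^{-1}(y)$ is an embedded submanifold of dimension $D_f$, and it is closed as the preimage of the closed set $\{y\}$ under the continuous map $E$.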