arXiv:2106.08928

Recursive Construction of Stable Assemblies of Recurrent Neural Networks

Michaela Ennis, Leo Kozachkov, Jean-Jacques Slotine

correct (medium confidence)
Category
math.DS
Journal tier
Strong Field
Processed
Sep 28, 2025, 12:56 AM

Audit review

The paper’s Theorem 1 proves contraction of τẋ = −x + Wφ(x) + v(t) under the diagonal-metric condition P(g|W| − I) + (g|W| − I)^T P ≺ 0, and notes the off-diagonal refinement available when W_ii ≤ 0. The candidate solution establishes the same condition via the same Lyapunov/quadratic-form bounding strategy and reaches the identical conclusion, including the same off-diagonal refinement. One minor flaw in the model’s write-up is that it treats |W|^T P and P|W| as equal matrices for diagonal P; this matrix identity is false in general (the two products agree only as quadratic forms, u^T|W|^T P u = u^T P|W| u). The step is unnecessary in any case and is easily replaced by bounding the two terms separately, exactly as in the paper’s proof. With this correction, the model’s derivation is essentially the same as the paper’s.
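For reference, here is a minimal reconstruction of the shared bounding step, with notation assumed from the paper: Φ′(x) = diag(φ′(x)), slope restriction 0 ≤ φ′ ≤ g, and P ≻ 0 diagonal. The differential dynamics have Jacobian

\[
J(x) = \frac{1}{\tau}\bigl(-I + W\,\Phi'(x)\bigr),
\]

and for any perturbation \(\delta x\),

\[
\delta x^{T}\bigl(PJ + J^{T}P\bigr)\,\delta x
= \frac{2}{\tau}\Bigl(-\,\delta x^{T} P\,\delta x + \delta x^{T} P W \Phi'\,\delta x\Bigr)
\le \frac{1}{\tau}\,\lvert\delta x\rvert^{T}\Bigl(P\bigl(g\lvert W\rvert - I\bigr) + \bigl(g\lvert W\rvert - I\bigr)^{T}P\Bigr)\lvert\delta x\rvert,
\]

where the inequality bounds the two terms separately: δx^T P δx = |δx|^T P |δx| because P is diagonal, and δx^T P W Φ′ δx ≤ g |δx|^T P|W| |δx| entrywise. The metric condition then makes the right-hand side uniformly negative in x, i.e. contraction in the metric P, with no commutation of P and |W| required. (As matrices the two products genuinely differ: P = diag(1, 2) with |W| = [[0, 1], [0, 0]] gives P|W| ≠ |W|^T P.)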

Referee report (LaTeX)

\textbf{Recommendation:} minor revisions

\textbf{Journal Tier:} strong field

\textbf{Justification:}

The manuscript provides clear sufficient conditions for global exponential stability (contraction) of continuous-time RNNs with slope-restricted nonlinearities, and gives several practically useful special cases and parameterizations. The main diagonal-metric result and its off-diagonal refinement are correct and well motivated. The work is relevant to robust RNN design, Neural ODEs, and modular network assemblies. Minor presentation issues (missing explicit rate bounds, occasionally inconsistent notation, and unclear handling of the τ-scaling) merit small revisions.
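As an illustrative aside (not from the manuscript; the function name and parameters below are hypothetical), the diagonal-metric certificate is easy to check numerically for a given W, candidate diagonal P, and slope bound g. A minimal NumPy sketch:

```python
import numpy as np

def contraction_certificate(W, p, g=1.0):
    """Largest eigenvalue of P(g|W| - I) + (g|W| - I)^T P for diagonal P = diag(p).

    A strictly negative return value certifies contraction of
    tau * dx/dt = -x + W phi(x) + v(t) in the metric P, assuming 0 <= phi' <= g.
    """
    n = W.shape[0]
    P = np.diag(p)
    A = g * np.abs(W) - np.eye(n)
    S = P @ A + A.T @ P          # symmetric form of the metric condition
    return np.linalg.eigvalsh(S).max()

# Example: small random weights with the identity metric p = 1; here
# g|W| - I is strictly diagonally dominant, so the certificate is negative.
rng = np.random.default_rng(0)
n = 5
W = 0.1 * rng.standard_normal((n, n))
print(contraction_certificate(W, np.ones(n)))  # negative => contracting
```

With p fixed to the all-ones vector this reduces to checking that the symmetric part of g|W| − I is negative definite; since g|W| − I is Metzler, the existence of a valid diagonal P can equivalently be checked by a linear program over p, which recovers the more general diagonal metrics the theorem allows.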