2102.03934

A Convex Optimization Approach to Learning Koopman Operators

M. Sznaier

incomplete (medium confidence)
Category
Not specified
Journal tier
Specialist/Solid
Processed
Sep 28, 2025, 12:55 AM

Audit review

The candidate reproduces Theorem 2’s construction (factoring K to obtain y, deriving (8)–(9) from (11)–(12), and relating G(i) to the Hankel blocks) and, crucially, supplies the missing global-rank step: from r* = rank([G(1); … ; G(N)]) and the identity (H_y)^T H_y = Σ_i G(i), it proves rank(H_y) ≤ r* < r+1, hence (7). The paper’s appendix proof only bounds rank(H_{y(i)}) via rank(G(i)) ≤ r*; it does not explicitly justify the stacked Hankel rank bound (7) across all trajectories, nor does it address cross-trajectory pairs in (11) unless the full K is factored. The model closes both gaps, matching the intended result of Theorem 2. See Problem 2 and Theorem 2 in the paper and the cited appendix proof snippet: (7)–(9) and (10)–(12) define the setting, and the proof sketch lacks the global stacking argument that the model provides.
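The global-rank step referenced above can be sketched as follows (a reconstruction in the audit's own symbols, not a quote from the paper; the superscript notation G(i), H_{y(i)} follows the text):

```latex
% Stacking the per-trajectory Hankel blocks and using H^T H:
\[
H_y \;=\; \begin{bmatrix} H_{y(1)} \\ \vdots \\ H_{y(N)} \end{bmatrix}
\quad\Longrightarrow\quad
H_y^{\mathsf T} H_y \;=\; \sum_{i=1}^{N} H_{y(i)}^{\mathsf T} H_{y(i)}
\;=\; \sum_{i=1}^{N} G(i).
\]
% Since the sum factors through the stacked Gramians,
% \(\sum_i G(i) = [\,I \;\cdots\; I\,]\,[G(1);\,\dots;\,G(N)]\), we get
\[
\operatorname{rank}(H_y)
\;=\; \operatorname{rank}\!\big(H_y^{\mathsf T} H_y\big)
\;=\; \operatorname{rank}\!\Big(\sum_{i} G(i)\Big)
\;\le\; \operatorname{rank}\!\big([G(1);\,\dots;\,G(N)]\big)
\;=\; r^{*} \;<\; r+1.
\]
```

The first equality uses the standard fact that rank(H) = rank(H^T H) for real matrices; the inequality holds because the sum is a left-multiple of the stacked block matrix.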

Referee report (LaTeX)

\textbf{Recommendation:} minor revisions

\textbf{Journal Tier:} specialist/solid

\textbf{Justification:}

The approach is sound and practically relevant, with a clear path from rank-minimization to Koopman embeddings and operators. The current proof of Theorem 2, however, omits the global stacking argument required to conclude the stacked Hankel rank constraint and does not clarify whether neighborhood constraints include cross-trajectory pairs. These issues are easily remedied with brief additions and do not undermine the main result.
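A quick numerical sanity check of the stacking identity underlying the requested addition can be sketched in Python. This is illustrative only (not from the paper): the Hankel depth, trajectory lengths, and the helper `hankel` are assumptions chosen for the demonstration.

```python
import numpy as np

def hankel(y, rows):
    """Build a Hankel matrix with `rows` rows from the 1-D sequence y.
    (Illustrative helper, not an API from the paper.)"""
    cols = len(y) - rows + 1
    return np.array([y[i:i + cols] for i in range(rows)])

rng = np.random.default_rng(0)
rows = 4
# Three synthetic trajectories standing in for y(1), ..., y(N)
trajs = [rng.standard_normal(10) for _ in range(3)]

H_blocks = [hankel(y, rows) for y in trajs]
H = np.vstack(H_blocks)              # stacked Hankel matrix H_y
G = [Hi.T @ Hi for Hi in H_blocks]   # per-trajectory Gramians G(i)

# Identity: (H_y)^T H_y equals the sum of the Gramians
assert np.allclose(H.T @ H, sum(G))

# Rank chain: rank(H_y) = rank(sum_i G(i)) <= rank([G(1); ...; G(N)]) = r*
r_star = np.linalg.matrix_rank(np.vstack(G))
assert np.linalg.matrix_rank(H) == np.linalg.matrix_rank(sum(G))
assert np.linalg.matrix_rank(H) <= r_star
```

The check exercises exactly the two facts the referee asks the authors to add: the Gramian-sum identity and the resulting bound by the stacked-Gramian rank r*.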