2103.01700
Estimates on the Dimension of Self-Similar Measures with Overlaps
De-Jun Feng, Zhou Feng
correct (medium confidence)
- Category: Not specified
- Journal tier: Strong Field
- Processed: Sep 28, 2025, 12:56 AM
- arXiv Links: Abstract, PDF
Audit review
The paper (Feng–Feng, arXiv:2103.01700, v1 dated 2021-03-02) gives a computer-assisted proof establishing the uniform lower bound dim(μβ) ≥ 0.98040856 for all β ∈ (1, 2). It further shows that dim(μβ) > dim(μβ3) for all β ∈ (√2, 2) outside a 10^{-8}-neighborhood of the tribonacci number β3 (the largest root of x^3 − x^2 − x − 1), via an explicit parameter sweep with rigorously controlled estimates; see the abstract and Theorem 1.2, as well as the proof details and computation tables (including the smallest interval achieving the minimum bound) in Section 5 and Table 6. The approach rests on a general lower-bounding inequality for self-similar measures (Theorem 1.1), derived from projection entropy and combined with carefully designed partitions and certified upper bounds on the relevant preimage measures, yielding uniform bounds over 132,530 subintervals that together cover the parameter range. The model's claim that such results were likely open as of the cutoff is incorrect: the paper itself, dated March 2, 2021, already proves these statements and even compares its bound to the then-recent 0.96399 bound of Kleptsyn–Pollicott–Vytnova, which it improves substantially.
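As a quick sanity check on the numbers quoted above (this is illustrative only, not part of the paper's certified computation), the tribonacci number β3 can be located by bisection on the polynomial x^3 − x^2 − x − 1 from the abstract, confirming that it lies in the interval (√2, 2) where the paper's second theorem applies:

```python
import math

def p(x):
    # Polynomial whose largest root is the tribonacci number beta_3.
    return x**3 - x**2 - x - 1

# p changes sign on [1.8, 1.9]: p(1.8) < 0 and p(1.9) > 0,
# so bisection converges to the root in that bracket.
lo, hi = 1.8, 1.9
for _ in range(60):
    mid = (lo + hi) / 2
    if p(mid) < 0:
        lo = mid
    else:
        hi = mid
beta3 = (lo + hi) / 2

# The tribonacci number is approximately 1.839286755...,
# which indeed lies in (sqrt(2), 2).
assert math.sqrt(2) < beta3 < 2
print(f"{beta3:.9f}")  # prints 1.839286755
```

The paper's own verification is far more delicate (certified bounds, uniform over 132,530 subintervals); this sketch only pins down the special parameter around which the 10^{-8}-neighborhood is excluded.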
Referee report (LaTeX)
\textbf{Recommendation:} minor revisions
\textbf{Journal Tier:} strong field
\textbf{Justification:}
The paper achieves a notable advance by providing a certified uniform lower bound of 0.98040856 for Bernoulli convolutions across the full parameter range and pinpoints the near-minimizer location up to a very small interval around the tribonacci parameter. The theoretical framework is robust, and the computational certification is extensive and well organized. Minor improvements to documentation of the verification pipeline and error control would further strengthen the presentation and reproducibility.