This is actually a very strong data point for Grok. I checked, and yes, it really did improve the lower bound from a serious 2025 paper in probability. Multi-agent with search and code execution, but why limit yourself when you can actually use tools?
DS (web-only) fails/gives up.

Feb 18, 2026
Grok 4.20 (Beta) improves the lower bound on the Gaussian perimeter of convex sets by 9.1%, in two minutes.
This is something that was pointed out to me by Xinyuan Xie. Back in 1993, Keith Ball showed that the Gaussian perimeter of a convex body in n-dimensional Euclidean space is bounded from above by 4n^{1/4}. As for the lower bound, Ball showed that for a cube (of appropriate size) the perimeter can grow as \sqrt{\log(n)}. So for a while there was a gap as to which bound is sharp, until 2003, when, in a beautiful paper, Fedor Nazarov showed on the example of a random polyhedron (the intersection of many random half-spaces) that the lower bound can grow as C n^{1/4}, with C = \exp(-5/4) = 0.286…. In the same paper, Nazarov also improved the constant 4 in the upper bound (replacing it with 0.64) for large n. These bounds stayed unbeaten until 2019, when Martin Raič managed to improve the upper-bound constant from 0.64 to 0.59.
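As a quick numerical illustration of the \sqrt{\log(n)} growth for the cube (my own sketch, not from the thread): the Gaussian surface measure of [-a, a]^n splits over its 2n faces, each face contributing \varphi(a) \cdot P(|X| \le a)^{n-1} with \varphi the standard normal density. The function names and the crude grid search below are mine.

```python
import math

def cube_perimeter(a: float, n: int) -> float:
    # Gaussian surface measure of [-a, a]^n: 2n faces, each contributing
    # phi(a) * P(|X| <= a)^(n-1); note P(|X| <= a) = erf(a / sqrt(2)).
    phi = math.exp(-a * a / 2) / math.sqrt(2 * math.pi)
    return 2 * n * phi * math.erf(a / math.sqrt(2)) ** (n - 1)

def best_cube_perimeter(n: int) -> float:
    # Crude grid search over the half side-length a (Ball's "appropriate size").
    return max(cube_perimeter(0.01 * k, n) for k in range(1, 1200))

for n in (10**2, 10**4, 10**6):
    p = best_cube_perimeter(n)
    print(f"n = {n}: perimeter ~ {p:.3f}, "
          f"ratio to sqrt(log n) = {p / math.sqrt(math.log(n)):.3f}")
```

The optimized perimeter grows while its ratio to \sqrt{\log(n)} stays bounded, consistent with Ball's lower bound, and everything sits far below the 4n^{1/4} upper bound.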
Grok 4.20 (Beta), by more carefully optimizing Nazarov’s construction, managed to improve the lower-bound constant from 0.286 to 0.3126. I find this surprising even if it is just playing within the techniques of Nazarov’s paper, because very recently Nadimpalli--Pascale (2025) posted a preprint where, with a different approach, they recovered Nazarov’s lower bound with the same constant factor 0.286….
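The headline 9.1% is just the ratio of the two constants; a one-line arithmetic check (my own, not from the thread):

```python
import math

nazarov = math.exp(-5 / 4)  # Nazarov's lower-bound constant, ~0.2865
grok = 0.3126               # constant reported for Grok 4.20 (Beta)

improvement = grok / nazarov - 1
print(f"{nazarov:.4f} -> {grok:.4f}: +{100 * improvement:.1f}%")
# → 0.2865 -> 0.3126: +9.1%
```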
Grok was very generous in its response: it said that the improvement it provided follows the same argument of Nazarov "line-by-line," whereas when I asked other models (other than Grok) to verify Grok's claim, they agreed on everything except this part; they said the improvement is not really "line-by-line" :D.
Finally, I would not say that Nazarov missed this improvement. Having known him for a long time, I am pretty confident it is common for him to sacrifice optimal constants for algebraic elegance.
Why is all this interesting? Having control of the Gaussian perimeter allows one to control Fourier tails of characteristic functions of these sets, which leads to controlling the time complexity of PAC learning and agnostic learning algorithms for this family (see Klivans--O’Donnell--Servedio).
References:
Chat link with Grok 4.20 (Beta).
Keith Ball. The Reverse Isoperimetric Problem for Gaussian Measure. Discrete and Computational Geometry, 10:411–420, 1993.
Adam Klivans, Ryan O'Donnell, and Rocco A. Servedio. Learning geometric concepts via Gaussian surface area. In Proc. 49th IEEE Symposium on Foundations of Computer Science (FOCS), pages 541–550, 2008.
Shivam Nadimpalli and Caleb Pascale. On the Maximal Gaussian Perimeter of Convex Sets, Revisited. Preprint, 2025.
Fedor Nazarov. On the maximal perimeter of a convex set in R^n with respect to a Gaussian measure. In Geometric Aspects of Functional Analysis (2001–2002), pages 169–187. Lecture Notes in Mathematics, Volume 1807, Springer, 2003.
Martin Raič. A multivariate Berry–Esseen theorem with explicit constants. Bernoulli, 25(4A):2824–2853, 2019.

To be clear, if I tell DS NOT to give up, it thinks much harder (12 minutes here) and proposes an idea for how the constant could be improved. But the code it generates doesn't work, and in the end it gives up. Qualitatively it actually seems "right," but it gets 0.3116 < Grok's 0.3126.

If DeepSeek's code is fixed (even by DeepSeek itself), it produces a result that converges to Grok's value. So I suppose that with a fairly trivial REPL it would have "succeeded" just as well. Either way, higher utility for Grok here.