GenAI Meets Mathematical Physics: B Capital, Greycroft, Madrona, and Menlo Ventures signal investor confidence in AI solving math conjectures
In my previous post, I explored Yang-Mills theory alongside Dr. Yau’s seminal paper. Gerard ’t Hooft, a Nobel laureate and one of the most eminent theoretical physicists, was honoured with the Niels Bohr Institute Medal of Honour in 2016 for his groundbreaking work on black holes, quantum gravity, and quantum field theory. He proved that Yang-Mills theories, which underpin the weak and strong nuclear forces, are renormalizable when treated quantum mechanically, a milestone that secured these theories’ role in particle physics. His doctoral work demonstrated how to renormalize massless Yang-Mills fields, and he later extended the result to massive fields with spontaneous symmetry breaking, helping to establish the mathematical consistency of the Standard Model. ’t Hooft’s work on instantons resolved critical puzzles involving particle masses, and his large-N expansion, which treats the strong force as if it were mediated by a larger variety of quarks and gluons, provided powerful new mathematical tools for studying strongly interacting quarks. In 2025 he was awarded the Breakthrough Prize in Fundamental Physics. This prestigious prize, often dubbed the ‘Oscar of Science,’ recognized his foundational contributions to gauge theory and the Standard Model, and carries a financial reward of $3 million.
In the last post, I asked about recreating aspects of Calabi-Yau and Yang-Mills research with transformer models. There have been attempts to develop a Gauge Equivariant Transformer: a paper presented at NeurIPS 2021 introduced a model named GET that achieves gauge equivariance and rotation invariance by making the attention mechanism agnostic to the orientation of local coordinate systems. The model was implemented efficiently on triangle meshes and demonstrated state-of-the-art performance on standard recognition tasks.
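The core idea behind making attention agnostic to local coordinate frames can be illustrated with a toy sketch. The snippet below is not the GET architecture itself; it is a minimal, hypothetical example showing one way attention weights can be built purely from rotation-invariant quantities (here, pairwise distances between points), so that rotating the input coordinates leaves the output unchanged. The function name `invariant_attention` and the distance-based scoring are my own illustrative assumptions, not details from the paper.

```python
import numpy as np

def invariant_attention(points, features):
    # Score pairs by (negative) Euclidean distance -- a quantity unchanged
    # by any rotation of the coordinate system.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    scores = -d  # nearby points attend to each other more strongly
    # Row-wise softmax over the scores.
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    # Aggregate features with the attention weights.
    return w @ features

rng = np.random.default_rng(0)
pts = rng.normal(size=(5, 3))     # 5 points in 3D
feats = rng.normal(size=(5, 4))   # a 4-dim feature per point

# Build a random 3D orthogonal matrix (QR of a Gaussian matrix).
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))

out_original = invariant_attention(pts, feats)
out_rotated = invariant_attention(pts @ Q.T, feats)

# Outputs agree: the attention mechanism never sees the coordinate frame,
# only frame-independent distances.
assert np.allclose(out_original, out_rotated)
```

Genuine gauge equivariance on meshes involves considerably more machinery (parallel transport of features between local frames), but the principle is the same: the attention computation must depend only on quantities that transform predictably, or not at all, under changes of local coordinates.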
A more recent effort in that direction: the Axiom team, led by Carina Hong and François Charton, published a new paper advancing automated mathematical reasoning through CYTransformer, a transformer-based deep learning model designed to automate the generation of fine, regular, star triangulations (FRSTs), the combinatorial data underlying Calabi-Yau constructions. Their recent paper [arXiv:2507.03732] details this approach, including AICY: a community-driven platform designed to combine self-improving machine learning models with a continuously expanding database to explore and catalogue the Calabi-Yau landscape. Axiom has raised $64 million in seed funding at a $300 million valuation, led by B Capital alongside prominent venture firms including Greycroft, Madrona, and Menlo Ventures, signalling strong investor confidence in AI for solving math conjectures, what the team calls Mathematical Superintelligence. By integrating AI with foundational theoretical physics, the horizon of discovery is broadened, promising not only acceleration in mathematical research but also potential breakthroughs in understanding the universe’s fundamental laws.
