Speakers, Titles, and Abstracts

Below is the list of all speakers, titles, and abstracts of the conference talks:

Invited speaker: Prof. Tony Shaska, Oakland University, MI, USA.

Minicourse (Lecture 1): An Introduction to Equivariant Neural Networks  (slides-1)

Abstract: An introduction to group actions, orbit spaces, and invariant and covariant maps, which will lead to equivariant networks, translation equivariance, and Euclidean CNNs. Time permitting, we will define Euclidean coordinate-independent CNNs. We intend to give non-traditional examples from classical mathematics where such networks can be used. Many open problems will be suggested throughout.
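As a small illustration of the central notion in this lecture (our own sketch, not material from the talk), the following pure-Python snippet checks translation equivariance for a discrete circular convolution: convolving a shifted signal gives the same result as shifting the convolved signal. All names here are our own choices.

```python
def conv(signal, kernel):
    """Circular 1D convolution of a signal with a kernel."""
    n = len(signal)
    return [sum(kernel[j] * signal[(i - j) % n] for j in range(len(kernel)))
            for i in range(n)]

def shift(signal, t):
    """Cyclically shift a signal by t positions."""
    n = len(signal)
    return [signal[(i - t) % n] for i in range(n)]

x = [1.0, 2.0, 0.0, -1.0, 3.0, 0.5]   # a toy signal
k = [0.5, 0.25, 0.25]                 # a toy filter

# Translation equivariance: conv(shift(x)) == shift(conv(x)).
assert conv(shift(x, 2), k) == shift(conv(x, k), 2)
```

This commuting property, for the translation group acting on signals, is the prototype that equivariant networks generalize to other group actions.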

Minicourse (Lecture 2): A machine learning approach to Julia reduction (slides-2)

Abstract: The reduction of integral binary forms is a classical problem in mathematics: the idea is to pick a coordinate system in which the binary form has "small" coefficients. However, the only case that is fully understood is that of quadratics. In 1917, in the first part of his thesis, Gustav Julia suggested a very interesting reduction method for binary forms of arbitrary degree. It is based on the idea of defining a quadratic (the Julia quadratic) J_f which is covariant under the action of the modular group via coordinate changes. This quadratic is positive definite and therefore has exactly one root in the upper half-plane H_2, say a_f. Since J_f is an SL(2, Z)-covariant, bringing a_f to the fundamental domain F by a matrix M induces an action f → f^M on binary forms. In this talk we explore the idea of a neural network which performs Julia reduction for binary forms of degree d > 2. The talk will be accessible to mathematicians, computer scientists, and graduate students who are familiar with basic concepts of machine learning.
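The classical step underlying this reduction, moving a point of the upper half-plane into the fundamental domain F of SL(2, Z) by repeated translations z → z + n and inversions z → -1/z, can be sketched in a few lines of code. This is our own illustration of the standard algorithm, not the neural network discussed in the talk.

```python
def reduce_to_fundamental_domain(z):
    """Move z (with positive imaginary part) into the standard fundamental
    domain of SL(2, Z); return (w, (a, b, c, d)) with w = (a*z + b)/(c*z + d)
    and a*d - b*c == 1."""
    a, b, c, d = 1, 0, 0, 1               # accumulate the SL(2, Z) matrix
    while True:
        n = round(z.real)                 # translate: z -> z - n
        z = z - n
        a, b = a - n * c, b - n * d
        if abs(z) < 1 - 1e-12:            # invert: z -> -1/z
            z = -1 / z
            a, b, c, d = -c, -d, a, b
        else:
            return z, (a, b, c, d)

# Example: reduce a point and recover the matrix that moves it.
w, M = reduce_to_fundamental_domain(0.7 + 0.2j)
```

For a binary form f, acting on f by the matrix that reduces the root a_f of the Julia quadratic is precisely the action f → f^M described above; the network in the talk learns to perform this reduction directly.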

Minicourse (Lecture 3):  Deep learning in the moduli space of algebraic curves (slides-3)

Abstract: Studying arithmetic properties of the moduli space of algebraic curves is a classical problem with many implications in arithmetic geometry and applications to cryptography. We explore how to use methods of deep learning to better understand the rational points of the moduli space. Several examples of such methods, from the moduli space of genus two curves, will be described in detail illustrating our approach. The talk will be accessible to a general audience.

 

Invited speaker: Prof. Matija Kazalicki, Zagreb University, Zagreb, Croatia

Title: Ranks of elliptic curves and deep neural networks (slides)

Abstract: Determining the rank of an elliptic curve E/Q is a difficult problem. In applications such as the search for curves of high rank, one often relies on heuristics to estimate the analytic rank (which is equal to the rank under the Birch and Swinnerton-Dyer conjecture). 

This talk discusses a novel rank classification method based on deep convolutional neural networks (CNNs). The method takes as input the conductor of E and a sequence of normalized Frobenius traces a_p for primes p in a certain range (p<10^k for k=3,4,5), and aims to predict the rank or detect curves of "high" rank. We compare our method with eight simple neural network models of the Mestre-Nagao sums, which are widely used heuristics for estimating the rank of elliptic curves.

We evaluate our method on the LMFDB dataset and a custom dataset consisting of elliptic curves with trivial torsion, conductor up to 10^30, and rank up to 10. Our experiments demonstrate that the CNNs outperform the Mestre-Nagao sums on the LMFDB dataset. The performance of the CNNs and the Mestre-Nagao sums is comparable on the custom dataset. 
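To make the inputs concrete, the following self-contained sketch (our own illustration, not the talk's code) computes the Frobenius traces a_p of an elliptic curve y^2 = x^3 + A*x + B by naive point counting, normalizes them by 2*sqrt(p) as a CNN-style input, and evaluates one Mestre-Nagao-flavored sum; the actual variants compared in the talk differ in their precise form.

```python
from math import isqrt, log, sqrt

def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for p in range(2, isqrt(n) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [p for p, is_p in enumerate(sieve) if is_p]

def legendre(a, p):
    """Legendre symbol (a/p) for an odd prime p, via Euler's criterion."""
    a %= p
    if a == 0:
        return 0
    return 1 if pow(a, (p - 1) // 2, p) == 1 else -1

def a_p(A, B, p):
    """Frobenius trace a_p = p + 1 - #E(F_p) by brute-force counting."""
    count = 1  # the point at infinity
    for x in range(p):
        rhs = (x * x * x + A * x + B) % p
        count += 1 + legendre(rhs, p)  # number of y with y^2 = rhs
    return p + 1 - count

A, B = -1, 0                                   # the curve y^2 = x^3 - x
ps = [p for p in primes_up_to(200) if p > 3]   # good primes for this curve
traces = [a_p(A, B, p) / (2 * sqrt(p)) for p in ps]   # CNN-style input
nagao = sum(-a_p(A, B, p) * log(p) / p for p in ps)   # one heuristic sum
```

By the Hasse bound |a_p| <= 2*sqrt(p), the normalized traces lie in [-1, 1], which is what makes them convenient network inputs.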

 

This is joint work with Domagoj Vlah.

 

Additionally, we will elaborate on an ongoing project with Zvonimir Bujanović, focusing on a detailed analysis of some aspects of Mestre-Nagao sums through the use of neural networks.

 

Invited speaker: Prof. Steven J. Miller, Williams College, US 

Title: Machine Learning in Elliptic Curves and Beyond: From Conjectures to Theorems to Conjectures. (slides)

 

Abstract: Many delicate problems in number theory have only recently become amenable to numerical exploration due to the painfully slow convergence rate. Quantities associated to elliptic curves often converge at the scale of the logarithm of the conductor; thus while we may have millions of curves with conductors at most 10^20, this translates to less than 50 (and one would never conjecture on properties of primes from integers up to 50!). Improvements in computing power have led to larger data sets, which in conjunction with machine learning (ML) techniques have found new behavior. We discuss the recent successes of He, Lee, and Oliver, who used ML to distinguish numerous curves based on standard invariants, and discovered oscillatory behavior in the coefficients of the associated L-functions, which agrees with recently developed theoretical models. We report on work of the author and his colleagues on lower order terms in coefficients in families, describing an open conjecture where the "nice" term is hard to extract due to large, fluctuating terms, in the hopes of forming collaborations with audience members.

 

This is joint work with Zoe Batterman, Aditya Jambhale, Akash L. Narayanan, Kishan Sharma, Andrew Yang, and Chris Yao.


Invited speaker:  Prof. Kyu-Hwan Lee, University of Connecticut, US

Title: AI-assisted mathematical discovery: "murmurations of elliptic curves" (slides)

Abstract: Elliptic curves have been studied for a long time for their importance in number theory and applications in cryptography. In this talk, I will explain how interpretable machine learning techniques led to the discovery of a new phenomenon from the dataset of elliptic curves, called "murmuration", which shows a striking oscillating pattern. Understanding this new phenomenon has been a challenge to the number theory community, resulting in a recent hot-topics workshop at ICERM and several papers on the subject.

This talk is mainly based on the paper https://arxiv.org/pdf/2204.10140.pdf and is joint work with He, Oliver, Pozdnyakov, and Sutherland.


Speaker: Prof. Thomas Oliver, University of Westminster, UK

Title: PCA and arithmetic  (slides)

Abstract: PCA is a simple technique for unsupervised machine learning which amounts to the diagonalization of covariance matrices. Following a series of supervised learning experiments in arithmetic, we applied PCA to a database of elliptic curves and were surprised to discover a seemingly new phenomenon in number theory (known as murmuration). In this talk, I will explain the relationship between murmuration and PCA, describe other covariance matrices in number theory, and outline other arithmetic settings to which PCA could be applied. This talk is based upon joint work with Malik Amir, Yang-Hui He, Kyu-Hwan Lee, Alexey Pozdnyakov, Andrew Sutherland, and Eldar Sultanow.
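As a minimal, pure-Python sketch of PCA as covariance diagonalization (our own illustration, restricted to 2-dimensional data so the eigenproblem has a closed form; all names are our choices):

```python
from math import atan2, cos, sin, sqrt

def pca_2d(points):
    """PCA of 2D data: return the eigenvalues and orthonormal principal
    axes of the 2x2 covariance matrix [[sxx, sxy], [sxy, syy]]."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / n
    syy = sum((y - my) ** 2 for _, y in points) / n
    sxy = sum((x - mx) * (y - my) for (x, y) in points) / n
    # Closed-form eigenvalues of a symmetric 2x2 matrix.
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    gap = sqrt(tr * tr - 4 * det)
    lam1, lam2 = (tr + gap) / 2, (tr - gap) / 2
    # Angle of the first principal axis (largest eigenvalue).
    theta = 0.5 * atan2(2 * sxy, sxx - syy)
    return (lam1, lam2), ((cos(theta), sin(theta)),
                          (-sin(theta), cos(theta)))
```

Projecting data onto the leading axes is what revealed the unexpected structure in the elliptic curve database; here the sketch only shows the diagonalization step itself.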


20-minute talks:


Speaker 1: Nikola Adžaga, University of Zagreb, Croatia

Title: Automated conjecturing of Frobenius numbers by grammatical evolution (slides)

Abstract: Conjecturing formulas and other symbolic relations occurs frequently in number theory and combinatorics. If we could automate conjecturing, we could benefit not only from the speed-up but also from finding conjectures previously out of our grasp. Grammatical evolution, a genetic programming technique, can be used for automated conjecturing in mathematics. This talk describes how one can interpret the Frobenius problem as a symbolic regression problem and then apply grammatical evolution to it. In this manner, a few formulas for Frobenius numbers of specific quadruples were found automatically.
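To fix ideas (our own sketch, not the talk's system): the Frobenius number g(a_1, ..., a_k) is the largest integer not representable as a nonnegative integer combination of the a_i. Grammatical evolution searches for symbolic formulas matching such data; below we only generate ground-truth values by dynamic programming and check them against Sylvester's classical two-generator formula g(a, b) = a*b - a - b.

```python
from functools import reduce
from math import gcd

def frobenius(gens):
    """Frobenius number of a coprime generator set, by dynamic programming
    over a crude upper bound on the answer."""
    assert reduce(gcd, gens) == 1
    bound = 2 * max(gens) ** 2          # crude upper bound on g(gens)
    reachable = [False] * (bound + 1)
    reachable[0] = True
    for g in gens:
        for n in range(g, bound + 1):
            if reachable[n - g]:
                reachable[n] = True
    return max(n for n in range(bound + 1) if not reachable[n])

# Sylvester's formula for two coprime generators: g(3, 5) = 3*5 - 3 - 5 = 7.
assert frobenius([3, 5]) == 7
```

A symbolic regressor is then asked to recover formulas like a*b - a - b from tables of (generators, Frobenius number) pairs; for quadruples no general closed form is known, which is what makes automated conjecturing attractive there.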



Speaker 2: Tomás Silva, graduate student at the University of Campinas (Unicamp)

Title: Machine-learning Sasakian and G2 topology on contact Calabi-Yau 7-manifolds

Abstract: We propose a machine-learning approach to study topological quantities related to the Sasakian and G2-geometries of contact Calabi-Yau 7-manifolds. Specifically, we compute datasets for certain Sasakian Hodge numbers and for the Crowley-Nordström invariant of the natural G2-structure of the 7-dimensional link of a weighted projective Calabi-Yau 3-fold hypersurface singularity, for each of the 7555 possible P^4(w) projective spaces. These topological quantities are then machine learnt with high accuracy, along with properties of the respective Gröbner basis, leading to a vast improvement in computation speeds which may be of independent interest. We observe promising results in machine learning the Sasakian Hodge numbers from the P^4(w) weights alone, using both neural networks and a symbolic regressor, which achieve R^2 scores of 0.969 and 0.993 respectively, prompting novel conjectures.

 

Speaker 3: Dr. Parisa Hemmatian Dehkordi, Payame Noor University, Iran

Title: The role of artificial intelligence in the parametric design of geometric forms (slides)

Abstract: In this research, our focus is the use of artificial intelligence (AI) in the development of Grasshopper plugins and components in the Rhino software to create creative and beautiful architectural and geometric forms. We realize ideas that were once beyond our imagination and show that, by combining the potential of AI with data-management, optimization, mathematical, and geometric tools, together with algorithmic thinking in the Grasshopper plugin, the teaching of mathematics and geometry becomes beautiful and attractive, with many applications in architecture, geometry, mathematics, graphs, structural engineering, and other sciences.

 

Speaker 4: Juan Pablo Mamani Bustamante, Statistical Consultant at EstatCamp

Title: Local Influence Diagnostics with Forward Search in Regression Analysis

Abstract: Regression analysis is one of the most widely used statistical techniques. It is well known that the least squares estimates are sensitive to atypical and/or influential observations. Many methodologies have been proposed to detect influential observations considering case deletion (global influence). On the other hand, Cook (1986) developed a general and powerful methodology to obtain a group of observations that might be jointly influential considering the local influence. However, these techniques may fail to detect masked influential observations. In this work, we propose a methodology to detect masked influential observations in a local influence framework considering the forward search (Atkinson and Riani (2000)). The usefulness of the proposed methodology is illustrated with data sets that were previously analyzed in the literature to detect outliers and/or influential observations. Masked influential observations were successfully identified in these studies. The proposed methodology may be used in any model where the local influence analysis (Cook (1986)) is appropriate.
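For readers unfamiliar with deletion diagnostics, here is a pure-Python sketch (ours, not the authors' forward-search procedure) of the classical single-case diagnostic that such proposals generalize: Cook's distance for simple linear regression, which flags observations whose removal would most change the fitted line.

```python
def cooks_distances(xs, ys):
    """Cook's distance of each observation in a simple linear regression
    y = alpha + beta * x fitted by least squares."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    beta = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    alpha = ybar - beta * xbar
    resid = [y - (alpha + beta * x) for x, y in zip(xs, ys)]
    p = 2                                    # parameters: intercept, slope
    s2 = sum(e * e for e in resid) / (n - p) # residual variance estimate
    dists = []
    for i in range(n):
        h = 1 / n + (xs[i] - xbar) ** 2 / sxx    # leverage of case i
        dists.append(resid[i] ** 2 * h / (p * s2 * (1 - h) ** 2))
    return dists

# A point far from the bulk of the data dominates the diagnostic.
ds = cooks_distances([1, 2, 3, 4, 5, 10], [1, 2, 3, 4, 5, 20])
```

When several influential points mask one another, single-case measures like this can fail, which is exactly the situation the forward search in this talk is designed to address.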