Publications

You can also find my articles on my Google Scholar profile.

Journal Articles


Dynamical mean field theory for models of confluent tissues and beyond

Published in SciPost Physics, 2023

We consider a recently proposed model to understand the rigidity transition in confluent tissues, and we derive the dynamical mean field theory (DMFT) equations that describe several types of dynamics of the model in the thermodynamic limit: gradient descent, thermal Langevin noise and active drive. In particular, we focus on gradient descent dynamics and integrate numerically the corresponding DMFT equations. In this case we show that gradient descent is blind to the zero-temperature replica symmetry breaking (RSB) transition point. This means that, even if the Gibbs measure in the zero-temperature limit displays RSB, this algorithm is able to find its way to a zero-energy configuration. We include a discussion of possible extensions of the DMFT derivation to study problems rooted in high-dimensional regression and optimization via the square loss function.

Download Paper
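The abstract above studies gradient-descent dynamics on a square-loss energy reaching zero-energy configurations. As a minimal toy sketch (not the paper's DMFT equations or its tissue model; dimensions, the learning rate, and the random constraints are all illustrative assumptions), one can discretize the gradient flow on a simple under-constrained square loss and watch it relax to zero energy:

```python
import numpy as np

# Toy sketch, NOT the paper's model: Euler-discretized gradient flow
# x' = -grad E(x) on the square-loss energy E(x) = (1/2)||J x - 1||^2.
# With fewer constraints than dimensions (M < N) the constraints are
# satisfiable, so gradient descent can relax to zero energy.
rng = np.random.default_rng(0)
N, M = 50, 25                       # dimension and number of constraints
J = rng.standard_normal((M, N)) / np.sqrt(N)

def energy_and_grad(x):
    r = J @ x - 1.0                 # residuals of the constraints
    return 0.5 * np.sum(r**2), J.T @ r

x = rng.standard_normal(N)          # random initial condition
lr = 0.05                           # time step of the discretized flow
for _ in range(5000):
    _, g = energy_and_grad(x)
    x -= lr * g

E, _ = energy_and_grad(x)
print(f"final energy: {E:.2e}")     # relaxes toward zero energy
```

The DMFT analysis in the paper tracks the exact high-dimensional limit of this kind of dynamics through correlation and response functions, rather than simulating a single finite-size trajectory as above.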

Pre-prints


Stochastic Gradient Descent outperforms Gradient Descent in recovering a high-dimensional signal in a glassy energy landscape

Pre-print, 2023

Stochastic Gradient Descent (SGD) is an out-of-equilibrium algorithm used extensively to train artificial neural networks. However, very little is known about the extent to which SGD is crucial to the success of this technology and, in particular, how effective it is at optimizing high-dimensional non-convex cost functions compared to other optimization algorithms such as Gradient Descent (GD). In this work we leverage dynamical mean field theory to analyze its performance exactly in the high-dimensional limit. We consider the problem of recovering a hidden high-dimensional non-linearly encrypted signal, a prototypical high-dimensional non-convex hard optimization problem. We compare the performance of SGD to that of GD and show that SGD largely outperforms GD. In particular, a power-law fit of the relaxation time of these algorithms shows that the recovery threshold for SGD with small batch size is smaller than the corresponding threshold for GD.

Download Paper
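The abstract above compares full-batch GD with minibatch SGD for recovering a planted signal. The following is only a mechanical illustration of the difference between the two update rules on a convex linear toy problem (not the paper's non-linearly encrypted model, and not a demonstration of the glassy regime where SGD outperforms GD); all sizes, step sizes and names are illustrative assumptions:

```python
import numpy as np

# Toy sketch, NOT the paper's model: recover a planted signal x_star
# from linear measurements y = A x_star by minimizing the square loss,
# once with full-batch GD and once with minibatch SGD. In this convex,
# noiseless setting both algorithms recover the signal; the paper's
# result concerns the non-convex regime near the recovery threshold.
rng = np.random.default_rng(1)
N, M, B = 40, 200, 20               # dimension, samples, SGD batch size
x_star = rng.standard_normal(N)
A = rng.standard_normal((M, N)) / np.sqrt(N)
y = A @ x_star

def grad(x, idx):
    """Gradient of (1/2|idx|) * sum over idx of (a_mu . x - y_mu)^2."""
    r = A[idx] @ x - y[idx]
    return A[idx].T @ r / len(idx)

x_gd = np.zeros(N)
x_sgd = np.zeros(N)
full = np.arange(M)
lr = 2.0
for _ in range(3000):
    x_gd -= lr * grad(x_gd, full)                   # full-batch GD step
    batch = rng.choice(M, size=B, replace=False)    # random minibatch
    x_sgd -= lr * grad(x_sgd, batch)                # SGD step

err_gd = np.linalg.norm(x_gd - x_star)
err_sgd = np.linalg.norm(x_sgd - x_star)
print("GD  recovery error:", err_gd)
print("SGD recovery error:", err_sgd)
```

The only algorithmic difference is which rows enter the gradient at each step; the paper's DMFT analysis makes this comparison exact in the high-dimensional limit of the non-convex problem, where the batch size changes the recovery threshold itself.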