Courses a.y. 2024/2025
I teach and have taught a number of programming courses, from undergraduate to PhD level, using Python and Julia.
Biographical note
I graduated in Trieste in Theoretical Physics, with a thesis on applications to Computer Science and large-scale discrete optimization problems. I then moved to Turin, where I did a PhD in Computational Neuroscience and also became interested in Bioinformatics and inference problems. Throughout, I have kept an eye on theoretical aspects of Machine Learning and Neural Networks, which have lately become my main focus.
About
I've been the director of the BSc in Mathematical and Computing Sciences for Artificial Intelligence (BAI) since 2023/24.
Research interests
My research interests center on the application of Statistical Mechanics to machine learning and computational neuroscience, and more generally to large-scale inference and optimization problems. Lately I have been particularly interested in studying the loss landscape of neural networks, both analytically and numerically.
Selected Publications
Subdominant Dense Clusters Allow for Simple Learning and High Computational Performance in Neural Networks with Discrete Synapses
Physical Review Letters, 2015
Unreasonable effectiveness of learning neural networks: from accessible states and robust ensembles to basic algorithmic schemes
Proceedings of the National Academy of Sciences, 2016
Entropy-SGD: biasing gradient descent into wide valleys
ICLR 2017; Journal of Statistical Mechanics: Theory and Experiment, 2019
Role of synaptic stochasticity in training low-precision neural networks
Physical Review Letters, 2018
Efficiency of quantum vs. classical annealing in nonconvex learning problems
Proceedings of the National Academy of Sciences, 2018
Properties of the geometry of solutions and capacity of multilayer neural networks with rectified linear unit activations
Physical Review Letters, 2019
Shaping the learning landscape in neural networks around wide flat minima
Proceedings of the National Academy of Sciences, 2020