
Omar Rivasplata
Senior Research Fellow, University College London
Algorithmic Learning Theory. Machine Learning. Mathematics. Probability and Statistics.

I am affiliated with the Department of Statistical Science at UCL, where I lead the DELTA group.
My department is one of the two departments of the upcoming Institute for Mathematical and Statistical Sciences, the other being the Department of Mathematics.
I collaborate closely with the Centre for Artificial Intelligence at UCL, and maintain a network of connections at other UK institutions and abroad. I am a member of the European Laboratory for Learning and Intelligent Systems, and a fellow of the Royal Statistical Society and the Institute of Mathematics and its Applications.
My work is on machine learning research. This field is fascinating! One of the things I enjoy most about it is the confluence of maths and stats with computer science experiments to answer research questions. Besides statistical learning I am also interested in other learning frameworks such as online learning and reinforcement learning, and of course deep learning, which is quite popular these days. Optimization seems to be one pervasive theme in machine learning, though it comes up in such a variety of flavours and colours that it never gets boring. It reminds me of the least action principle of Maupertuis, which says that "everything happens as if some quantity was to be made as small as possible." (This principle has led the optimists to believe that we live in the best possible world.) But optimization alone doesn't quite do it for machine learning... to really talk about learning one has to pay attention to generalization!
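To make the optimization-versus-generalization point concrete, here is a toy sketch (an editorial illustration using NumPy, not taken from any of the papers below, and the data and degrees are arbitrary choices): a degree-9 polynomial drives the training loss on 10 noisy points to essentially zero, so the optimizer "wins", yet a plain line predicts unseen points from the same distribution better.

```python
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)

def make_data(n):
    # A simple linear trend plus Gaussian noise.
    x = np.linspace(0.0, 1.0, n)
    y = 2.0 * x + 0.3 * rng.standard_normal(n)
    return x, y

x_train, y_train = make_data(10)
x_test, y_test = make_data(200)

def mse(p, x, y):
    # Mean squared error of polynomial p on the points (x, y).
    return float(np.mean((p(x) - y) ** 2))

# Degree 9 with 10 points: the fit interpolates the noisy training data,
# so the training loss is essentially zero -- optimization succeeds.
overfit = Polynomial.fit(x_train, y_train, deg=9)
# Degree 1: larger training loss, but closer to the true linear trend.
simple = Polynomial.fit(x_train, y_train, deg=1)

print("deg 9: train %.2e  test %.2e"
      % (mse(overfit, x_train, y_train), mse(overfit, x_test, y_test)))
print("deg 1: train %.2e  test %.2e"
      % (mse(simple, x_train, y_train), mse(simple, x_test, y_test)))
```

The high-degree fit minimizes the empirical risk far better, but its test error tells the real story: a generalization bound, not the training loss, is what certifies learning.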

I did my research studies in machine learning at the Department of Computer Science, University College London, sponsored by DeepMind. In parallel, I was affiliated with DeepMind as a research scientist intern for three years.
People with whom I have worked at UCL include John Shawe-Taylor, María Pérez-Ortiz, and Benjamin Guedj.
People with whom I have worked at DeepMind include Laurent Orseau, Marcus Hutter, Ilja Kuzborskij, Csaba Szepesvári, and András György from my own team; and Amal Rannen-Triki, Razvan Pascanu, Agnieszka Grabska-Barwińska, Geoffrey Irving, and Thore Graepel from friend teams.
I spent a year at the Department of Computing Science, University of Alberta. During this time I started building my mental model of the machine learning field and fine-tuning my hyperparameters. My host was Rich Sutton in theory, but in practice I was developing my research plans together with Csaba Szepesvári.

I spent some time with Mauricio's group looking at problems related to seismic signal analysis. Before that I worked with Sasha and Nicole on the smallest singular value of a sparse random matrix, using methods from geometric functional analysis and probability. Even before that I worked with Byron on the reversibility of a Brownian motion with drift. As an undergrad, I worked with Loretta on a fun project about repeated two-player games with incomplete information on one side.

Talks 
 Tighter risk certificates for (probabilistic) neural networks.
UCL Centre for AI.
Slides
Video
 Statistical Learning Theory: A Hitchhiker's Guide.
NeurIPS 2018 Tutorial. (with J. Shawe-Taylor)
Slides
Video
Conference & Journal Papers 
 I. Kuzborskij, Cs. Szepesvári, O. Rivasplata, A. Rannen-Triki, R. Pascanu,
On the Role of Optimization in Double Descent: A Least Squares Study.
NeurIPS 2021.
arXiv PDF
 M. Pérez-Ortiz, O. Rivasplata, J. Shawe-Taylor, Cs. Szepesvári,
Tighter risk certificates for neural networks.
JMLR, 22, 227 (2021), 1-40.
PDF /
revised PDF /
published PDF
 M. Haddouche, B. Guedj, O. Rivasplata, J. Shawe-Taylor,
PAC-Bayes unleashed: generalisation bounds with unbounded losses.
Entropy, 23, 10 (2021).
arXiv PDF
published PDF
 L. Orseau, M. Hutter, O. Rivasplata,
Logarithmic pruning is all you need.
NeurIPS 2020.
PDF
 O. Rivasplata, I. Kuzborskij, Cs. Szepesvári, J. Shawe-Taylor,
PAC-Bayes analysis beyond the usual bounds.
NeurIPS 2020.
PDF
 O. Rivasplata, E. Parrado-Hernández, J. Shawe-Taylor, S. Sun, Cs. Szepesvári,
PAC-Bayes bounds for stable algorithms with instance-dependent priors.
NeurIPS 2018.
PDF
 A.E. Litvak, O. Rivasplata,
Smallest singular value of sparse random matrices.
Studia Math., 212, 3 (2012), 195-218.
PDF
 O. Rivasplata, J. Rychtar, B. Schmuland,
Reversibility for diffusions via quasi-invariance.
Acta Univ. Carolin. Math. Phys., 48, 1 (2007), 3-10.
PDF
 O. Rivasplata, J. Rychtar, C. Sykes,
Evolutionary games in finite populations.
Pro Mathematica, 20, 39/40 (2006), 147-164.
PDF
 O. Rivasplata, B. Schmuland,
Invariant and reversible measures for random walks on Z.
Pro Mathematica, 19, 37/38 (2005), 117-124.
PDF
Preprints & Notes 
 O. Rivasplata,
A note on a confidence bound of Kuzborskij and Szepesvári.
(2021)
PDF
 O. Rivasplata,
Subgaussian random variables: An expository note.
(2012)
PDF
 M. Haddouche, B. Guedj, O. Rivasplata, J. Shawe-Taylor,
Upper and Lower Bounds on the Performance of Kernel PCA.
PDF
 O. Rivasplata, V. Tankasali, Cs. Szepesvári,
PAC-Bayes with Backprop.
(2019)
PDF
Probability Links (probably accessible) 
Stats Links (most significant) 
Peruvian Links 
My birth town is Trujillo, the marinera dance town. 
Sometimes people ask me about Machu Picchu; it's a great place to see. 
They ask me less about Arequipa, though it is also a great place to visit. 
The last link, in case you care to know, is about Pisco. 

