Omar Rivasplata

(No longer at the)
Department of Computing Science
University of Alberta

(Former) Office: 1-47 Athabasca Hall

Email: try or

Here is my Curriculum Vitae
Last updated: not so long ago

News:   here

I work with John at UCL and with Csaba at DeepMind. Actually these days I work from home.

In the Fall term of 2016 I joined Csaba's group at the U of A to work in Statistical Machine Learning. This is fascinating! Besides statistical learning I am also interested in other learning frameworks such as online learning and reinforcement learning, and of course deep learning, which is quite popular these days. It seems that optimization is one pervasive theme in machine learning, though it comes up in such a variety of flavours and colours that it isn't boring. Like the 'least action principle' of that guy Maupertuis, who said that "everything happens as if some quantity was to be made as small as possible." But optimization alone doesn't quite do it... to really talk about learning one has to pay attention to generalization.
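To make that last point concrete, here is a toy sketch (a hypothetical example in plain Python, not from any of my papers): a "learner" that only optimizes the training objective, by memorizing the training set, drives its training error to zero yet says nothing useful about unseen data.

```python
import random

random.seed(0)

# Target concept: the parity of a 20-bit string.
def parity(x):
    return sum(x) % 2

# Random training and test samples of bit-strings.
train = [tuple(random.randint(0, 1) for _ in range(20)) for _ in range(100)]
test = [tuple(random.randint(0, 1) for _ in range(20)) for _ in range(100)]

# "Optimize" by memorization: a lookup table fits the training data exactly.
table = {x: parity(x) for x in train}
predict = lambda x: table.get(x, 0)  # guesses 0 on any unseen input

train_err = sum(predict(x) != parity(x) for x in train) / len(train)
test_err = sum(predict(x) != parity(x) for x in test) / len(test)

print(train_err)  # 0.0: the optimization problem is solved perfectly
print(test_err)   # roughly 0.5: no better than a coin flip on fresh data
```

The training loss is as small as possible, but the gap between training and test error, which is what generalization bounds such as the PAC-Bayes ones below try to control, is as bad as it can be.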

Much Before
I spent some time with Mauricio's group looking at signal analysis related things. Before that I was working with Sasha and Nicole using geometric functional analysis and probability for estimating the smallest singular value of a sparse random matrix. Even before that I worked with Byron on reversibility of a Brownian motion with drift. As an undergrad, with Loretta I worked on a fun project about repeated games with incomplete information.

Research Interests
Foundations of AI.   Machine Learning.   Statistical Learning Theory.   Probability and Statistics.  

Talks (sample)
  • Tighter risk certificates for (probabilistic) neural networks. UCL Centre for AI. Slides Video
  • Statistical Learning Theory: A Hitchhiker's Guide. NeurIPS 2018 Tutorial. (with J. Shawe-Taylor) Slides Video

Publications (sample)
  • L. Orseau, M. Hutter, O. Rivasplata, Logarithmic pruning is all you need. Submitted (2020). PDF
  • O. Rivasplata, I. Kuzborskij, Cs. Szepesvári, J. Shawe-Taylor, PAC-Bayes Analysis Beyond the Usual Bounds. Submitted (2020 upgrade of the 2019 workshop paper with the same title). PDF
  • O. Rivasplata, I. Kuzborskij, Cs. Szepesvári, J. Shawe-Taylor, PAC-Bayes Analysis Beyond the Usual Bounds. NeurIPS 2019 Workshop on Machine Learning with Guarantees. PDF
  • O. Rivasplata, E. Parrado-Hernández, J. Shawe-Taylor, S. Sun, Cs. Szepesvári, PAC-Bayes bounds for stable algorithms with instance-dependent priors. NeurIPS 2018. PDF
  • A.E. Litvak, O. Rivasplata, Smallest singular value of sparse random matrices. Studia Math., 212, 3 (2012), 195-218. PDF
  • O. Rivasplata, Subgaussian random variables: An expository note. Unpublished note. PDF
  • O. Rivasplata, J. Rychtar, B. Schmuland, Reversibility for diffusions via quasi-invariance. Acta Univ. Carolin. Math. Phys., 48, 1 (2007), 3-10. PDF
  • O. Rivasplata, J. Rychtar, C. Sykes, Evolutionary games in finite populations. Pro Mathematica, 20, 39/40 (2006), 147-164. PDF
  • O. Rivasplata, B. Schmuland, Invariant and reversible measures for random walks on Z. Pro Mathematica, 19, 37/38 (2005), 117-124. PDF

Machine Learning Links
Bandit Algorithms
Advice for Machine Learning students

Math Links
The complex number operations neatly visualised.
Math Seminars (beta).

Probability Links (accessible with probability one)
The Probability Web
The Gaussian Processes Website
Research in Probability
Advice for Probability students

Writing aids
How to Write Mathematics, tips from the Mathematics Student Handbook at Trent University.
A Guide to Writing Mathematics by Kevin Lee.
Writing Mathematics by Berry & Lawson.
The Underground Grammarian by Richard Mitchell.

Peruvian Links
Instituto de Matemática y Ciencias Afines
My hometown is Trujillo, the marinera dance town.
Sometimes people ask me about Machu Picchu; it's a great place to see.
They ask me less about Arequipa, though it is also a great place to visit.
But sometimes they ask me about Pisco.