

Omar Rivasplata

(No longer at the)
Department of Computing Science
University of Alberta


Email:

Here is my Curriculum Vitae
Last updated: not so long ago

And here is my genealogy

News

Look me up in Google Scholar
Also in ResearchGate





Research Interests
Foundations of AI.   Machine Learning.   Statistical Learning Theory.   Probability and Statistics.  

Currently
I work with John Shawe-Taylor at UCL and with Csaba Szepesvári at DeepMind. Actually, these days I work from home. People I work with at UCL include María Pérez-Ortiz and Benjamin Guedj. People I work with at DeepMind include Laurent Orseau, Marcus Hutter, Claire Vernade, Ilja Kuzborskij, András György, Tor Lattimore from my own team (Foundations); and Razvan Pascanu, Amal Rannen-Triki, Soham De, Sam Smith from friendly neighbouring teams.

Before
In the Fall term of 2016 I joined the Department of Computing Science at the U of A to work in Statistical Machine Learning. This is fascinating! Besides statistical learning I am also interested in other learning frameworks such as online learning and reinforcement learning, and of course deep learning, which is quite popular these days. It seems that optimization is one pervasive theme in machine learning, though it comes up in such a variety of flavours and colours that it isn't boring. It reminds me of the least action principle of Maupertuis, which says that "everything happens as if some quantity was to be made as small as possible." (This principle has led the optimists to believe that we live in the best possible world.) But optimization alone doesn't quite do it for machine learning... to really be talking about learning, one has to pay attention to generalization!
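To make that last point a bit more concrete, here is a minimal sketch in generic notation (the symbols below are chosen for illustration and are not tied to any particular paper listed further down). Writing $R(h) = \mathbb{E}_{(x,y)\sim D}[\ell(h(x),y)]$ for the risk and $\hat{R}_n(h) = \frac{1}{n}\sum_{i=1}^{n} \ell(h(x_i),y_i)$ for the empirical risk on a sample of size $n$, optimization drives down $\hat{R}_n(h)$, but what one actually cares about is
\[
R(h) \;=\; \hat{R}_n(h) \;+\; \underbrace{\big( R(h) - \hat{R}_n(h) \big)}_{\text{generalization gap}},
\]
and statistical learning theory is about controlling that gap. For instance, one standard PAC-Bayes bound (in the McAllester/Maurer style, for a loss taking values in $[0,1]$) says that with probability at least $1-\delta$ over the sample, simultaneously for all distributions $Q$ over hypotheses,
\[
\mathbb{E}_{h\sim Q}[R(h)] \;\le\; \mathbb{E}_{h\sim Q}[\hat{R}_n(h)] + \sqrt{\frac{\mathrm{KL}(Q\,\|\,P) + \ln(2\sqrt{n}/\delta)}{2n}},
\]
where $P$ is a "prior" distribution over hypotheses fixed before seeing the data.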

Much Before
I spent some time with Mauricio's group looking at things related to signal analysis. Before that I was working with Sasha and Nicole, using geometric functional analysis and probability to estimate the smallest singular value of a sparse random matrix. Even before that I worked with Byron on reversibility of a Brownian motion with drift. As an undergrad, I worked with Loretta on a fun project about repeated games with incomplete information.

Talks (sample)
  • Tighter risk certificates for (probabilistic) neural networks. UCL Centre for AI. Slides Video
  • Statistical Learning Theory: A Hitchhiker's Guide. NeurIPS 2018 Tutorial. (with J. Shawe-Taylor) Slides Video

Submitted Papers
  • M. Haddouche, B. Guedj, O. Rivasplata, J. Shawe-Taylor, Upper and Lower Bounds on the Performance of Kernel PCA. Submitted (2020). PDF
  • M. Pérez-Ortiz, O. Rivasplata, J. Shawe-Taylor, Cs. Szepesvári, Tighter risk certificates for neural networks. Submitted (2020). PDF

Undergoing Renovations
  • M. Haddouche, B. Guedj, O. Rivasplata, J. Shawe-Taylor, PAC-Bayes unleashed: Generalisation bounds with unbounded losses. (2020). PDF

Conference & Journal Papers
  • L. Orseau, M. Hutter, O. Rivasplata, Logarithmic pruning is all you need. NeurIPS 2020. PDF
  • O. Rivasplata, I. Kuzborskij, Cs. Szepesvári, J. Shawe-Taylor, PAC-Bayes Analysis Beyond the Usual Bounds. (Upgrade of the 2019 workshop paper with the same title.) NeurIPS 2020. PDF
  • O. Rivasplata, E. Parrado-Hernández, J. Shawe-Taylor, S. Sun, Cs. Szepesvári, PAC-Bayes bounds for stable algorithms with instance-dependent priors. NeurIPS 2018. PDF
  • A.E. Litvak, O. Rivasplata, Smallest singular value of sparse random matrices. Studia Math., 212, 3 (2012), 195-218. PDF
  • O. Rivasplata, J. Rychtar, B. Schmuland, Reversibility for diffusions via quasi-invariance. Acta Univ. Carolin. Math. Phys., 48, 1 (2007), 3-10. PDF
  • O. Rivasplata, J. Rychtar, C. Sykes, Evolutionary games in finite populations. Pro Mathematica, 20, 39/40 (2006), 147-164. PDF
  • O. Rivasplata, B. Schmuland, Invariant and reversible measures for random walks on Z. Pro Mathematica, 19, 37/38 (2005), 117-124. PDF

Workshop Papers
  • M. Pérez-Ortiz, O. Rivasplata, J. Shawe-Taylor, Cs. Szepesvári, Towards self-certified learning: Probabilistic neural networks trained by PAC-Bayes with Backprop. NeurIPS 2020 Workshop - Beyond BackPropagation. PDF
  • O. Rivasplata, I. Kuzborskij, Cs. Szepesvári, J. Shawe-Taylor, PAC-Bayes Analysis Beyond the Usual Bounds. NeurIPS 2019 Workshop - Machine Learning with Guarantees. PDF

Unpublished Notes
  • O. Rivasplata, A note on a confidence bound of Kuzborskij and Szepesvári. (2021) PDF
  • O. Rivasplata, Subgaussian random variables: An expository note. (2012) PDF


Machine Learning Links
Bandit Algorithms
The Cross-Entropy Method
Advice for Machine Learning students

Math Links
What's new
The complex number operations neatly visualised.
Math Seminars (beta).
Advice for Math students

Probability Links (accessible with high probability)
Almost Sure and Random
Research in Probability
The Gaussian Processes Website
Advice for Probability students

Writing aids
How to Write Mathematics, tips from the Mathematics Student Handbook at Trent University.
A Guide to Writing Mathematics by Kevin Lee.
Writing Mathematics by Berry & Lawson.
The Underground Grammarian by Richard Mitchell.

Peruvian Links
Instituto de Matemática y Ciencias Afines
My birth town is Trujillo, the marinera dance town.
Sometimes people ask me about Machu Picchu; it's a great place to see.
They ask me less about Arequipa, though it is also a great place to visit.
The last link, in case you care to know, is about Pisco.