Laurent Dinh

Machine Learning Researcher
lastname.firstname@gmail.com | http://laurent-dinh.github.io

Research Experience

2022-present Senior Research Scientist, Apple

2018-2021 Research Scientist, Google Brain, Montréal, Canada

2014-2017 Graduate Researcher, Mila, Montréal, Canada
Supervisor: Yoshua Bengio

2017 Research Intern, DeepMind, London, United Kingdom
Supervisors: Nando de Freitas, Misha Denil

2016-2017 Software Engineering Intern, Google Brain, Montréal, Canada
Supervisor: Samy Bengio

2016 Software Engineering Intern, Google Brain, Mountain View, California, United States
Supervisors: Samy Bengio, Jascha Sohl-Dickstein

2015 Software Engineering Intern, Google Brain, Mountain View, California, United States
Supervisor: Samy Bengio

2013 Visiting Researcher, University of British Columbia, Vancouver, Canada
Supervisors: Nando de Freitas, Misha Denil

2011-2012 Visiting Researcher, Mila, Montréal, Canada
Supervisor: Yoshua Bengio

Education

2014-2018 Philosophiæ Doctor (Computer Science), Université de Montréal, Montréal, Canada
Supervisor: Yoshua Bengio

2012-2013 Master of Science (Machine Learning and Computer Vision), ÉNS Paris-Saclay, Cachan, France

2012-2013 Master of Engineering (Applied Mathematics), École Centrale Paris, Châtenay-Malabry, France

2009-2012 Bachelor of Engineering, École Centrale Paris, Châtenay-Malabry, France

Publications and Preprints

Perfect Density Models Cannot Guarantee Anomaly Detection

Charline Le Lan, Laurent Dinh
I Can’t Believe It’s Not Better! Workshop (Neural Information Processing Systems 2020) (oral, Entropic Paper Award)
Special Issue: Probabilistic Methods for Deep Learning (MDPI Entropy)

Augmented Normalizing Flows: Bridging the Gap Between Generative Flows and Latent Variable Models

Chin-Wei Huang, Laurent Dinh, Aaron Courville

Solving ODE with Universal Flows: Approximation Theory for Flow-Based Models

Chin-Wei Huang, Laurent Dinh, Aaron Courville
Workshop on Integration of Deep Neural Models and Differential Equations (International Conference on Learning Representations 2020) (oral)

Discrete Flows: Invertible Generative Models of Discrete Data

Dustin Tran, Keyon Vafa, Kumar Krishna Agrawal, Laurent Dinh, Ben Poole
Deep Generative Models for Highly Structured Data (International Conference on Learning Representations 2019)
Neural Information Processing Systems 2019

Invertible Convolutional Flow

Mahdi Karami, Dale Schuurmans, Jascha Sohl-Dickstein, Laurent Dinh, Daniel Duckworth
Neural Information Processing Systems 2019 (spotlight)

A RAD approach to deep mixture models

Laurent Dinh, Jascha Sohl-Dickstein, Hugo Larochelle, Razvan Pascanu
Deep Generative Models for Highly Structured Data (International Conference on Learning Representations 2019)

VideoFlow: A Flow-Based Generative Model for Video

Manoj Kumar, Mohammad Babaeizadeh, Dumitru Erhan, Chelsea Finn, Sergey Levine, Laurent Dinh, Durk Kingma
Workshop on Invertible Neural Nets and Normalizing Flows (International Conference on Machine Learning 2019) (oral)
International Conference on Learning Representations 2020

Conference ticket allocation via non-uniform random selection to address systemic biases

Jessica Thompson, Laurent Dinh, Layla El Asri, Nicolas Le Roux
Critiquing and Correcting Trends in Machine Learning (Neural Information Processing Systems 2018) (spotlight)

Reparametrization in Deep Learning

Laurent Dinh
PhD thesis

Learning Awareness Models

Brandon Amos, Laurent Dinh, Serkan Cabi, Thomas Rothörl, Sergio Gómez Colmenarejo, Alistair Muldal, Tom Erez, Yuval Tassa, Nando de Freitas, Misha Denil
International Conference on Learning Representations 2018 (conference track)

Sharp Minima Can Generalize For Deep Nets

Laurent Dinh, Razvan Pascanu, Samy Bengio, Yoshua Bengio
International Conference on Machine Learning 2017

Density estimation using Real NVP

Laurent Dinh, Jascha Sohl-Dickstein, Samy Bengio
Deep Learning Symposium (Neural Information Processing Systems 2016) (oral)
International Conference on Learning Representations 2017 (conference track)

Deep independence network analysis of structural brain imaging: A simulation study

Eduardo Castro, Devon Hjelm, Sergey Plis, Laurent Dinh, Jessica Turner, Vince Calhoun
IEEE 25th International Workshop on Machine Learning for Signal Processing 2015

A Recurrent Latent Variable Model for Sequential Data

Junyoung Chung, Kyle Kastner, Laurent Dinh, Kratarth Goel, Aaron Courville, Yoshua Bengio
Neural Information Processing Systems 2015

NICE: Non-linear Independent Components Estimation

Laurent Dinh, David Krueger, Yoshua Bengio
International Conference on Learning Representations 2015 (workshop track)

Techniques for Learning Binary Stochastic Feedforward Neural Networks

Tapani Raiko, Mathias Berglund, Guillaume Alain, Laurent Dinh
International Conference on Learning Representations 2015 (conference track)

Predicting Parameters in Deep Learning

Misha Denil, Babak Shakibi, Laurent Dinh, Marc’Aurelio Ranzato, Nando de Freitas
Neural Information Processing Systems 2013

Talks

Invertible Models and Normalizing Flows: A Retrospective Talk

Primer on Normalizing Flows

A RAD approach to deep mixture models

Building a Tractable Generator Network

Conference ticket allocation via non-uniform random selection to address systemic biases

Reparametrization in Deep Learning

Sharp Minima Can Generalize For Deep Nets

Density estimation using Real NVP

NICE: Non-linear Independent Components Estimation

Training Neural Bayesian Nets

Academic Service

Diversity and Inclusion Chair

Area Chair

Action Editor

Reviewer

Panelist

Mentor

Lecturer