
Real World Real Time and Five Papers for Mike Tipping
02/15/19 • 61 min
In season five episode three we chat about Five Papers for Mike Tipping, take a listener question on AIAI and chat with Eoin O'Mahony of Uber.
Here are Neil's five papers. What are yours?
Stochastic variational inference by Hoffman, Wang, Blei and Paisley
http://arxiv.org/abs/1206.7051
A way of doing approximate inference for probabilistic models with potentially billions of data ... need I say more?
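To give a flavour of the recipe, here is a minimal sketch in Python on a toy conjugate Gaussian model (the model, data and step-size schedule are illustrative choices, not anything taken from the paper): draw a minibatch, scale its sufficient statistics up to the full dataset size, and take a decaying step in the natural parameters of the approximate posterior.

# A minimal sketch of stochastic variational inference on a toy conjugate
# model: y_i ~ N(mu, sigma^2) with prior mu ~ N(0, tau^2). The variational
# posterior q(mu) = N(m, s^2) is updated with noisy natural-gradient steps
# computed from minibatches. Model, data and schedule are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, sigma, tau = 100_000, 1.0, 10.0
y = rng.normal(2.5, sigma, size=N)        # synthetic data, true mean 2.5

# Natural parameters of q(mu): lam1 = m / s^2, lam2 = -1 / (2 s^2)
lam1, lam2 = 0.0, -0.5                    # initialise at N(0, 1)
batch_size, kappa, delay = 100, 0.7, 1.0

for t in range(1, 501):
    batch = y[rng.integers(0, N, size=batch_size)]
    # Noisy estimate of the optimal full-data natural parameters, with the
    # minibatch sufficient statistic scaled up by N / batch_size.
    lam1_hat = (N / batch_size) * batch.sum() / sigma**2   # prior mean is 0
    lam2_hat = -0.5 * (1.0 / tau**2 + N / sigma**2)
    rho = (t + delay) ** -kappa           # Robbins-Monro step size
    lam1 = (1 - rho) * lam1 + rho * lam1_hat
    lam2 = (1 - rho) * lam2 + rho * lam2_hat

s2 = -1.0 / (2.0 * lam2)                  # back to mean / variance form
print("q(mu) = N(%.3f, %.2e)" % (lam1 * s2, s2))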
Austerity in MCMC Land: Cutting the Metropolis-Hastings Budget by Korattikara, Chen and Welling
http://arxiv.org/abs/1304.5299
Oh ... I do need to say more ... because these three are at it as well but from the sampling perspective. Probabilistic models for big data ... an idea so important it needed to be in the list twice.
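A rough caricature of the trick, again on a toy Gaussian-mean model with a flat prior and a symmetric random-walk proposal (all illustrative assumptions, and the paper's actual sequential test is more careful than this sketch): estimate the average log-likelihood difference from a growing subsample, and stop as soon as a t-test is confident about which side of the Metropolis-Hastings threshold it falls on.

# A caricature of the subsampled Metropolis-Hastings test: grow a subsample
# of log-likelihood differences until a t-test is confident about which side
# of the accept/reject threshold their mean lies on. The toy 1-D Gaussian
# model, flat prior and random-walk proposal are illustrative assumptions.
import numpy as np
from scipy.stats import t as student_t

rng = np.random.default_rng(1)
N = 20_000
x = rng.normal(1.0, 1.0, size=N)          # data; infer the mean, sigma known

def loglik(theta, data):
    return -0.5 * (data - theta) ** 2     # up to a constant, sigma = 1

def approx_mh_accept(theta, theta_new, eps=0.05, batch=500):
    mu0 = np.log(rng.uniform()) / N       # threshold: flat prior, symmetric proposal
    perm = rng.permutation(N)
    diffs = np.empty(0)
    while True:
        idx = perm[len(diffs):len(diffs) + batch]
        diffs = np.append(diffs, loglik(theta_new, x[idx]) - loglik(theta, x[idx]))
        n, mean, sd = len(diffs), diffs.mean(), diffs.std(ddof=1)
        se = sd / np.sqrt(n) * np.sqrt(1.0 - (n - 1.0) / (N - 1.0))  # finite-population correction
        delta = 1.0 - student_t.cdf(abs(mean - mu0) / (se + 1e-12), df=n - 1)
        if delta < eps or n >= N:         # confident enough, or used all the data
            return mean > mu0

theta, samples = 0.0, []
for _ in range(1000):
    theta_new = theta + rng.normal(0.0, 0.05)
    if approx_mh_accept(theta, theta_new):
        theta = theta_new
    samples.append(theta)
print("posterior mean estimate: %.3f (sample mean %.3f)" % (np.mean(samples[200:]), x.mean()))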
Practical Bayesian Optimization of Machine Learning Algorithms by Snoek, Larochelle and Adams
http://arxiv.org/abs/1206.2944
This paper represents the rise of probabilistic numerics; I could also have chosen papers by Osborne, Hennig or others, there are too many papers out there already to choose from. It's definitely an exciting area, be it optimisation, integration or differential equations. I chose this paper because it seems to have blown the field open to a wider audience, focussing as it did on deep learning as an application, so it lets me capture both an area of developing interest and an area that hits the national news.
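The basic loop is easy to sketch. Below is a minimal version that minimises a cheap stand-in objective with a hand-rolled Gaussian process and expected improvement (the kernel, its hyperparameters, the candidate grid and the objective are all illustrative; the paper layers a lot more on top of this, such as integrating out the GP hyperparameters).

# A minimal sketch of GP-based Bayesian optimisation with expected
# improvement, minimising a cheap stand-in objective. Kernel, grid and
# objective are illustrative assumptions, not the paper's setup.
import numpy as np
from scipy.stats import norm

def objective(x):
    # Stand-in for an expensive run, e.g. validation loss at hyperparameter x.
    return np.sin(3.0 * x) + 0.3 * x ** 2

def rbf(a, b, ell=0.5, sf=1.0):
    return sf ** 2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=3)        # a few initial evaluations
y = objective(X)
grid = np.linspace(-2.0, 2.0, 400)        # candidate points to choose from

for _ in range(10):
    # GP posterior mean and variance at the candidate points.
    K = rbf(X, X) + 1e-6 * np.eye(len(X))
    Ks = rbf(grid, X)
    Kinv = np.linalg.inv(K)
    mu = Ks @ (Kinv @ y)
    var = 1.0 - np.einsum('ij,ij->i', Ks @ Kinv, Ks)   # prior variance sf**2 = 1
    sd = np.sqrt(np.maximum(var, 1e-12))
    # Expected improvement over the best value seen so far (minimisation).
    best = y.min()
    z = (best - mu) / sd
    ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)
    x_next = grid[np.argmax(ei)]
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))

print("best x = %.3f, best value = %.3f" % (X[np.argmin(y)], y.min()))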
Kernel Bayes' Rule by Fukumizu, Song and Gretton
http://arxiv.org/abs/1009.5736
One of the great things about ML is how we have different (and competing) philosophies operating under the same roof. But because we still talk to each other (and sometimes even listen to each other) these ideas can merge to create new and interesting things. Kernel Bayes' Rule makes the list.
ImageNet Classification with Deep Convolutional Neural Networks by Krizhevsky, Sutskever and Hinton
http://www.cs.toronto.edu/~hinton/absps/imagenet.pdf
An obvious choice, but you don't leave the Beatles off lists of great bands just because they are an obvious choice.
Previous Episode

The Bezos Paradox and Machine Learning Languages
In episode two of season five we unpack the Bezos Paradox (TM Neil Lawrence), take a listener question about best papers and chat with Dougal Maclaurin of Google Brain.
Next Episode

Jupyter Notebooks and Modern Model Distribution
In episode four of season five we talk about Jupyter Notebooks and Neil's dream of a world of craft software and devices, take a listener question about the conversation surrounding OpenAI's GPT-2, its announcement and the coverage, and hear an interview with Brooks Paige of the Alan Turing Institute.