Date: Fri, 15 Feb 2019 01:11:22 GMT
<p>In season five episode three we chat about Five Papers for Mike Tipping, take a listener question on the <a href="https://www.whitehouse.gov/articles/accelerating-americas-leadership-in-artificial-intelligence/" target="_blank">American AI Initiative</a> and chat with <a href="https://www.linkedin.com/in/eoin-o-mahony-274b967/" target="_blank">Eoin O'Mahony</a> of Uber</p><br /><p>Here are Neil's five papers. What are yours?</p>
<p>Stochastic Variational Inference by Hoffman, Blei, Wang and Paisley</p><p><a href="http://arxiv.org/abs/1206.7051" target="_blank">http://arxiv.org/abs/1206.7051</a></p><p>A way of doing approximate inference for probabilistic models with potentially billions of data points ... need I say more? (There's a toy sketch of the minibatch idea at the end of these notes.)</p><br />
<p>Austerity in MCMC Land: Cutting the Metropolis-Hastings Budget by Korattikara, Chen and Welling</p><p><a href="http://arxiv.org/abs/1304.5299" target="_blank">http://arxiv.org/abs/1304.5299</a></p><p>Oh ... I do need to say more ... because these three are at it as well, but from the sampling perspective. Probabilistic models for big data ... an idea so important it needed to be in the list twice. (A sketch of the subsampled accept test is also at the end of these notes.)</p><br />
<p>Practical Bayesian Optimization of Machine Learning Algorithms by Snoek, Larochelle and Adams</p><p><a href="http://arxiv.org/abs/1206.2944" target="_blank">http://arxiv.org/abs/1206.2944</a></p><p>This paper represents the rise of probabilistic numerics; I could also have chosen papers by Osborne, Hennig or others, but there are too many papers out there to list. Definitely an exciting area, be it optimisation, integration or differential equations. I chose this paper because it seems to have blown the field open to a wider audience, focussing as it did on deep learning as an application, so it lets me capture both an area of developing interest and an area that hits the national news. (A toy example of the surrogate-plus-acquisition loop is at the end of these notes too.)</p><br />
<p>Kernel Bayes' Rule by Fukumizu, Song and Gretton</p><p><a href="http://arxiv.org/abs/1009.5736" target="_blank">http://arxiv.org/abs/1009.5736</a></p><p>One of the great things about ML is how we have different (and competing) philosophies operating under the same roof. But because we still talk to each other (and sometimes even listen to each other) these ideas can merge to create new and interesting things. Kernel Bayes' Rule makes the list.</p><br />
<p>ImageNet Classification with Deep Convolutional Neural Networks by Krizhevsky, Sutskever and Hinton</p><p><a href="http://www.cs.toronto.edu/~hinton/absps/imagenet.pdf" target="_blank">http://www.cs.toronto.edu/~hinton/absps/imagenet.pdf</a></p><p>An obvious choice, but you don't leave the Beatles off lists of great bands just because they are an obvious choice.</p>
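<p>Three of these ideas are easy to sketch in a few lines of Python; the snippets below are illustrative toy versions under stated assumptions, not code from the papers. First, stochastic variational inference: subsample the data, rescale the minibatch term by N over the batch size to get an unbiased noisy gradient of the ELBO, and follow it with a decaying Robbins-Monro step size. The Gaussian toy model and all constants here are illustrative assumptions (the paper works with models like LDA).</p>
<pre><code>
import numpy as np

# Toy model: x_i ~ N(mu, 1) with prior mu ~ N(0, 1).
# Variational family q(mu) = N(m, s^2); we update only the mean m with
# noisy minibatch gradients of the ELBO, the core move in stochastic
# variational inference. (Toy setup, not the paper's experiments.)

rng = np.random.default_rng(0)
N = 1_000_000
x = rng.normal(2.0, 1.0, size=N)

m, batch_size = 0.0, 100
for t in range(1, 2001):
    rho = (t + 10.0) ** -0.7                 # decaying Robbins-Monro step size
    batch = rng.choice(x, size=batch_size)
    # Unbiased gradient estimate: rescale the minibatch sum by N / batch_size.
    grad = (N / batch_size) * np.sum(batch - m) - m
    m += rho * grad / (N + 1)                # precondition by the posterior precision

# SVI estimate vs the exact posterior mean N * xbar / (N + 1)
print(m, N * x.mean() / (N + 1))
</code></pre>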
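<p>Second, the austerity idea: rather than summing the log-likelihood ratio over all N points in the Metropolis-Hastings accept test, look at a growing subsample and stop as soon as a t-test is confident about which side of the threshold the full-data average would fall on. This is a simplified version of the sequential test in the paper, with a flat prior and a symmetric proposal assumed so the threshold only involves log u.</p>
<pre><code>
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
N = 100_000
data = rng.normal(1.0, 1.0, size=N)      # x_i ~ N(theta, 1), flat prior on theta

def loglik_diffs(theta_new, theta_old, x):
    # l_i = log N(x_i | theta_new, 1) - log N(x_i | theta_old, 1)
    return -0.5 * (x - theta_new) ** 2 + 0.5 * (x - theta_old) ** 2

def approx_mh_accept(theta_new, theta_old, eps=0.05, batch=500):
    # Sequential-test version of the MH accept step (a sketch of the idea,
    # not the paper's exact algorithm).
    u = rng.uniform()
    mu0 = np.log(u) / N                  # per-point threshold (flat prior, symmetric proposal)
    perm = rng.permutation(N)
    seen = np.empty(0)
    n = 0
    while n < N:
        take = min(batch, N - n)
        new = loglik_diffs(theta_new, theta_old, data[perm[n:n + take]])
        seen = np.concatenate([seen, new])
        n += take
        se = seen.std(ddof=1) / np.sqrt(len(seen))
        if se == 0.0:
            break
        t = (seen.mean() - mu0) / se
        # Confident enough to decide without touching the rest of the data?
        if stats.t.sf(abs(t), df=len(seen) - 1) < eps:
            break
    return seen.mean() > mu0

theta, samples = 0.0, []
for _ in range(200):                     # a short random-walk Metropolis chain
    prop = theta + 0.05 * rng.normal()
    if approx_mh_accept(prop, theta):
        theta = prop
    samples.append(theta)
print(np.mean(samples[-100:]))           # should sit near the posterior mean, ~1.0
</code></pre>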
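<p>And third, Bayesian optimization: a toy version of the loop the Snoek, Larochelle and Adams paper popularised. Fit a Gaussian process to the hyperparameter evaluations seen so far, then pick the next point by maximising expected improvement. The objective below is a stand-in for a validation loss, and the RBF kernel, fixed length-scale and grid search over the acquisition are all simplifying assumptions.</p>
<pre><code>
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

def objective(lam):
    # Stand-in for "validation loss as a function of one hyperparameter".
    return np.sin(3 * lam) + 0.1 * lam ** 2

def rbf(a, b, ell=0.5):
    # Squared-exponential kernel on 1-D inputs.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def gp_posterior(X, y, Xs, jitter=1e-6):
    # Standard GP regression equations: posterior mean and std at Xs.
    K = rbf(X, X) + jitter * np.eye(len(X))
    Ks = rbf(X, Xs)
    sol = np.linalg.solve(K, Ks)
    mu = sol.T @ y
    var = np.clip(np.diag(rbf(Xs, Xs) - Ks.T @ sol), 1e-12, None)
    return mu, np.sqrt(var)

grid = np.linspace(-2, 2, 401)
X = rng.uniform(-2, 2, size=3)           # a few random initial evaluations
y = objective(X)

for _ in range(10):
    mu, sd = gp_posterior(X, y, grid)
    best = y.min()
    z = (best - mu) / sd
    ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement (minimisation)
    x_next = grid[np.argmax(ei)]
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))

print(X[np.argmin(y)], y.min())          # best hyperparameter found and its loss
</code></pre>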