Math Colloquium: On the Ubiquity of Kernels in Statistical Machine Learning
Dr. Ernest Fokoué
School of Mathematical Sciences, RIT
In this lecture, I will present a general tour of some of the most commonly used kernel methods in statistical machine learning and data mining. I will touch on elements of artificial neural networks and then highlight their intricate connections to some general-purpose kernel methods like Gaussian process learning machines. I will also resurrect the famous universal approximation theorem and will most likely ignite a [controversial] debate around the theme: could it be that [shallow] networks like radial basis function networks or Gaussian processes are all we need for well-behaved functions? Do we really need many hidden layers, as the hype around Deep Neural Network architectures seems to suggest, or should we heed Ockham’s principle of parsimony, namely “Entities should not be multiplied beyond necessity” (“Entia non sunt multiplicanda praeter necessitatem”)?
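For readers unfamiliar with the kernel methods the abstract alludes to, here is a minimal illustrative sketch (not material from the talk itself): kernel ridge regression with a Gaussian (RBF) kernel, one of the simplest "shallow" kernel machines. The toy function, bandwidth, and regularization values are arbitrary choices for illustration.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix K[i, j] = exp(-gamma * ||a_i - b_j||^2)."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

# Toy 1-D regression: recover a smooth ("well-behaved") function from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

lam = 1e-2                                             # ridge regularization
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)   # dual coefficients

X_test = np.linspace(-3, 3, 5)[:, None]
y_hat = rbf_kernel(X_test, X) @ alpha                  # predictions at test points
print(np.round(y_hat, 2))
```

The same predictor, with the regularization term reinterpreted as observation noise, coincides with the posterior mean of a Gaussian process regression under the same kernel, which is one concrete instance of the connections the lecture surveys.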
Ernest Fokoué is Professor of Statistics in the School of Mathematical Sciences at Rochester Institute of Technology. He enjoys the honor of being the firstborn of the SAMSI postdoctoral fellows who got the institute going with the Data Mining and Machine Learning (DMML) program in 2003. He is one of the co-leaders of the 2019-2020 SAMSI Games, Decisions, Risk and Reliability (GDRR) program, and will be spending his whole sabbatical year contributing to its activities. His areas of research and teaching interest are Bayesian Statistics, Statistical Machine Learning, Computational Statistics, Epistemology, Theology, and Linguistics.
All are welcome, especially those with an interest in the topic.
When and Where
This is an RIT Only Event