18 Dec 2019, 11:00
Spiegelgasse 5, Room 00.003
Seminar in probability theory: Nicolas Macris (EPFL)
High-dimensional generalized linear models are basic building blocks of current data analysis tools, including multilayer neural networks. They arise in signal processing, statistical inference, machine learning, communication theory, and other fields. I will explain how to rigorously establish the intrinsic information-theoretic limits of inference and learning for a class of randomly generated instances of generalized linear models, thereby settling several old conjectures. Examples will be shown where one can delimit parameter regions in which the optimal error rates are efficiently achievable with currently known algorithms. I will discuss how the proof technique, based on the recently developed adaptive interpolation method, handles the output nonlinearity and, to some extent, non-separable input distributions.
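For orientation, a minimal sketch of what a randomly generated generalized linear model instance typically looks like in this line of work (the notation below is illustrative and not taken from the abstract): a signal $x^* \in \mathbb{R}^n$ is drawn from a prior $P_0$, a sensing matrix $A \in \mathbb{R}^{m \times n}$ has i.i.d. random (e.g. standard Gaussian) entries, and one observes
\[
y_\mu \sim P_{\mathrm{out}}\!\left(\,\cdot\;\middle|\;\frac{1}{\sqrt{n}}\sum_{i=1}^{n} A_{\mu i}\, x_i^*\right), \qquad \mu = 1,\dots,m,
\]
where the channel $P_{\mathrm{out}}$ encodes the (possibly nonlinear, noisy) output. The inference task is to recover $x^*$ from $(y, A)$, studied in the high-dimensional limit $n \to \infty$ with the ratio $m/n$ held fixed.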
https://probability.dmi.unibas.ch/seminar.html