SEMPARIS: The Paris Seminars Server

Status Confirmed
Series SEM-LPTMC
Domains cond-mat.mes-hall
Date Monday, July 16, 2018
Time 10:45
Institute LPTMC
Room Jussieu, room 5-23, 5th floor, tower 13-12
Speaker's last name Gabrié
Speaker's first name Marylou
Speaker's email address marylou [dot] gabrie [at] ens [dot] fr
Speaker's institution LPS-ENS
Title Entropy and mutual information in models of deep neural networks
Abstract The successes and the multitude of applications of deep learning methods have spurred efforts towards quantitative modeling of the performance of deep neural networks. In particular, an information-theoretic approach has been receiving increasing interest. Nevertheless, computing entropies and mutual informations in industry-sized neural networks is computationally intractable in practice. In this talk, we will instead consider a class of models of deep neural networks for which an expression for these information-theoretic quantities can be derived from the replica method. We will examine how mutual informations between hidden and input variables can be tracked during the training of such neural networks on synthetic datasets. Finally, we will discuss the numerical results of a few training experiments. (A minimal illustration of the one tractable closed-form case, a linear Gaussian layer, is sketched below the seminar details.)
arXiv preprint number
Comments
Attached files
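
As context for the abstract above, here is a minimal sketch of the one setting where the mutual information between an input X and a noisy hidden representation T is available in closed form: a single linear layer with Gaussian input and additive Gaussian noise, where I(X; T) = (1/2) log det(I + W W^T / sigma^2). This is an illustrative assumption on our part, not the replica computation presented in the talk; the dimensions, noise level, and random weights below are arbitrary choices.

    import numpy as np

    # Illustrative sketch, not the talk's replica method: for a linear layer
    # T = W X + xi with input X ~ N(0, I_n) and noise xi ~ N(0, sigma^2 I_k),
    # the mutual information is exactly
    #     I(X; T) = (1/2) * log det(I_k + W W^T / sigma^2)   [in nats]
    # All sizes below (n, k, sigma2) are arbitrary illustrative choices.
    rng = np.random.default_rng(0)
    n, k = 100, 20                                       # input / hidden widths
    sigma2 = 0.1                                         # hidden-layer noise variance
    W = rng.normal(scale=1.0 / np.sqrt(n), size=(k, n))  # random Gaussian weights

    # slogdet avoids the overflow that log(det(...)) would risk at these sizes.
    _, logdet = np.linalg.slogdet(np.eye(k) + (W @ W.T) / sigma2)
    mi_nats = 0.5 * logdet
    print(f"I(X; T) = {mi_nats:.3f} nats")

As soon as a nonlinearity is applied to W X, no such determinant formula exists; this is the intractability that motivates the replica-based expressions discussed in the seminar.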
