Status | Confirmed
Seminar Series | SEM-LPTMC
Subjects | cond-mat.mes-hall
Date | Monday 16 July 2018
Time | 10:45
Institute | LPTMC
Seminar Room | Jussieu, room 5-23, 5th floor, tower 13-12
Speaker's Last Name | Gabrié
Speaker's First Name | Marylou
Speaker's Email Address | marylou [dot] gabrie [at] ens [dot] fr
Speaker's Institution | LPS-ENS
Title | Entropy and mutual information in models of deep neural networks
Abstract | The successes and multitude of applications of deep learning methods have spurred efforts toward quantitative modeling of the performance of deep neural networks. In particular, an information-theoretic approach has been receiving increasing interest. Nevertheless, computing entropies and mutual information in industry-sized neural networks is computationally intractable in practice. In this talk, we will instead consider a class of models of deep neural networks for which an expression for these information-theoretic quantities can be derived using the replica method. We will examine how the mutual information between hidden and input variables can be tracked over the course of training such networks on synthetic datasets. Finally, we will discuss numerical results from a few training experiments.
arXiv Preprint Number |
Comments |
Attachments |