Europe/Lisbon
Online

Robert Nowak, University of Wisconsin-Madison
The Neural Balance Theorem and its Consequences

Rectified Linear Units (ReLUs) are the most common activation function in deep neural networks, and weight decay is the most prevalent form of regularization in deep learning. Together, ReLUs and weight decay lead to an interesting effect known as “Neural Balance”: at any global minimum of the training objective, the norms of the input and output weights of each ReLU are automatically equalized. Neural Balance has a number of important consequences, ranging from characterizations of the function spaces naturally associated with neural networks and their immunity to the curse of dimensionality, to new and more effective architectures and training strategies.
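
As a brief illustration of where the balance comes from (a sketch of the standard rescaling argument; the notation below is chosen for this sketch rather than taken from the talk): the ReLU $\sigma(z) = \max(z, 0)$ is positively homogeneous, $\sigma(cz) = c\,\sigma(z)$ for every $c > 0$, so a hidden unit $x \mapsto v\,\sigma(w^\top x)$ computes the same function after the rescaling $(w, v) \mapsto (c\,w, v/c)$. Its weight decay penalty, however, satisfies
$$\|c\,w\|_2^2 + |v/c|^2 \;\ge\; 2\,|v|\,\|w\|_2,$$
with equality (by the AM–GM inequality) exactly when $c\,\|w\|_2 = |v|/c$. A global minimizer of the regularized objective must therefore have equal input and output weight norms at every hidden unit; otherwise, rescaling would lower the penalty without changing the network's predictions.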

This talk is based on joint work with Rahul Parhi and Liu Yang.