[Seminar] Expected Expressivity and Gradients of Maxout Networks by Ms. Hanna Tseran, MPI

Monday, June 5, 2023, 4:30 PM
Zoom: https://oist.zoom.us/j/95478318970?pwd=NjNUQVJFWmRZRGVOWTBpeFdNRUtyQT09

Description

Speaker: Ms. Hanna Tseran, MPI

Title: Expected Expressivity and Gradients of Maxout Networks

Abstract: Learning with neural networks relies not only on the complexity of the representable functions but, more importantly, on the particular assignment of typical parameters to functions of different complexity. Taking the number of activation regions as an expressivity measure, we show that the practical complexity of networks with maxout activation functions is often far from the theoretical maximum. Continuing the analysis of the expected behavior, we study the expected gradients of a maxout network with respect to inputs and parameters and obtain bounds for the moments depending on the architecture and the parameter distribution. We observe that the distribution of the input-output Jacobian depends on the input, which complicates stable parameter initialization. Nevertheless, based on the moments of the gradients, we formulate parameter initialization strategies that avoid vanishing and exploding gradients in wide networks.
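For readers unfamiliar with the maxout activation discussed in the abstract, the sketch below shows a single maxout layer: each output unit takes the maximum over K affine pre-activations. This is only an illustration of the activation function itself; the dimensions, the plain Gaussian initialization, and all variable names are assumptions for the example, not the initialization strategy derived in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def maxout_layer(x, W, b):
    """Maxout layer: max over K affine pre-activations per output unit.

    x: (in_dim,), W: (K, out_dim, in_dim), b: (K, out_dim).
    Returns a vector of shape (out_dim,).
    """
    pre = np.einsum("koi,i->ko", W, x) + b  # (K, out_dim) pre-activations
    return pre.max(axis=0)                  # elementwise max over the K maps

# Illustrative sizes (K is the maxout rank, i.e. pieces per unit).
in_dim, out_dim, K = 8, 4, 5

# A simple 1/sqrt(in_dim) Gaussian init for the sketch; the talk's
# strategies instead choose the scale from bounds on gradient moments
# so that gradients neither vanish nor explode in wide networks.
W = rng.normal(0.0, 1.0 / np.sqrt(in_dim), size=(K, out_dim, in_dim))
b = np.zeros((K, out_dim))

x = rng.normal(size=in_dim)
y = maxout_layer(x, W, b)
print(y.shape)  # (4,)
```

Note that with K = 2, W fixed to stacked identity and negated identity, and zero bias, the same construction recovers the absolute value, so ReLU-like nonlinearities are special cases of maxout.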
