January 31, 2022
Build & Learn

What is Weight Initialization for Neural Networks?

In this video, we learn about Weight Initialization for Deep Feedforward Neural Networks.

Mısra Turp
Developer Educator

Weight initialization, though it may seem like a minor detail, has serious effects on the deep feedforward neural networks we train.

Thanks to Xavier Glorot and Yoshua Bengio, we know that initializing weights from a normal distribution with mean 0 and variance 1 contributes to the unstable-gradients problem: as signals pass through many layers, their variance grows or shrinks, so gradients can explode or vanish. New initialization techniques have been proposed to overcome this.
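To make the scaling idea concrete, here is a minimal NumPy sketch of the kind of fix Glorot and Bengio proposed: instead of a fixed variance of 1, scale the variance by the layer's fan-in and fan-out. The helper name `glorot_normal` is illustrative, not a library API.

```python
import numpy as np

def glorot_normal(fan_in, fan_out, seed=0):
    """Illustrative Glorot/Xavier normal initialization for a dense layer.

    Draws weights with mean 0 and std = sqrt(2 / (fan_in + fan_out)),
    so the variance of activations stays roughly constant across layers.
    """
    rng = np.random.default_rng(seed)
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_in, fan_out))

# Compare with the problematic variance-1 scheme for a 256 -> 256 layer:
naive = np.random.default_rng(0).normal(0.0, 1.0, size=(256, 256))
scaled = glorot_normal(256, 256)
print(naive.std())   # close to 1.0
print(scaled.std())  # close to sqrt(2/512) = 0.0625
```

With variance-1 weights, each layer multiplies the signal variance by roughly the fan-in (256 here), so activations blow up after a few layers; the scaled version keeps that product near 1.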

In this video, we learn what these techniques are, how they differ from each other, and which activation function each one is best paired with.

