Margi Patel

Vanishing and Exploding Gradients in Neural Networks




In this blog, you will learn what the Vanishing and Exploding Gradient problems are and why they occur during the training of neural networks.

What is a Gradient?

The gradient is simply the derivative of the loss function with respect to the weights. It is used to update the weights during backpropagation so that the loss function is minimized.
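To make this concrete, here is a minimal sketch of a single gradient-descent update. The toy loss function, starting weight, and learning rate are illustrative assumptions, not values from this post.

```python
import numpy as np

# Toy loss L(w) = (w - 3)^2, so its derivative is dL/dw = 2 * (w - 3).
def grad(w):
    return 2.0 * (w - 3.0)

w = 0.0              # current weight (assumed starting value)
learning_rate = 0.1  # step size (assumed)

# Weight update rule used during backpropagation:
# new weight = old weight - learning_rate * gradient
w = w - learning_rate * grad(w)
print(w)  # 0.6 -> the weight moves toward the minimum at w = 3
```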

What are Vanishing Gradients?

The Vanishing Gradient problem occurs when the derivative (slope) gets smaller and smaller as we move backward through the layers during backpropagation.


When the weight updates are very small, or become exponentially small, training takes much longer, and in the worst case the neural network may stop learning entirely.


The Vanishing Gradient problem occurs with the sigmoid and tanh activation functions because their derivatives are bounded: the derivative of sigmoid lies between 0 and 0.25, and the derivative of tanh lies between 0 and 1. Multiplying many such small derivatives during backpropagation makes the weight updates tiny, so the new weight values are almost identical to the old ones. This leads to the Vanishing Gradient problem. We can avoid it by using the ReLU activation function, because its derivative is 0 for negative and zero inputs and 1 for positive inputs, so the gradient is not repeatedly shrunk as it flows backward.
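The sketch below illustrates this with NumPy. It is a simplified picture that keeps only the activation-derivative factors of the chain rule (ignoring the weight terms), and the depth of 10 layers is an assumption chosen for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    s = sigmoid(x)
    return s * (1.0 - s)   # maximum value is 0.25, reached at x = 0

# During backpropagation the chain rule multiplies one activation
# derivative per layer. Even in the best case (0.25 per layer),
# the product shrinks exponentially with depth.
grad_signal = 1.0
for layer in range(10):
    grad_signal *= sigmoid_derivative(0.0)   # 0.25 each time

print(grad_signal)   # 0.25 ** 10 ≈ 9.5e-07 -> the gradient has vanished

# With ReLU, the derivative is 1 for positive inputs, so the same
# product stays at 1 and the gradient does not vanish.
```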

What are Exploding Gradients?

The Exploding Gradient problem occurs when the derivatives (slopes) get larger and larger as we move backward through the layers during backpropagation. This is the exact opposite of the Vanishing Gradient problem.


This problem is caused by the weights, not by the activation function. When the weight values are large, the derivatives also become large, so the new weights differ greatly from the old weights and the gradient never converges. As a result, the optimization may oscillate around the minima and never reach the global minimum.
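Here is a minimal NumPy sketch of the same chain-rule picture, this time keeping the weight factors. The weight value of 5.0 and the depth of 10 layers are illustrative assumptions.

```python
import numpy as np

# During backpropagation the gradient is also multiplied by the weight
# of each layer. If those weights are large, the product grows
# exponentially with depth.
large_weight = 5.0      # assumed, unusually large weight value
relu_derivative = 1.0   # ReLU derivative for a positive input

grad_signal = 1.0
for layer in range(10):
    grad_signal *= large_weight * relu_derivative

print(grad_signal)   # 5.0 ** 10 ≈ 9.8e+06 -> the gradient has exploded

# A weight update of this size overshoots the minimum, so the loss
# oscillates instead of converging.
```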

Conclusion:

During backpropagation in deep neural networks, the Vanishing Gradient problem occurs because of the sigmoid and tanh activation functions, and the Exploding Gradient problem occurs because of large weights.


Thank you.
