Main Types of Gradient Descent | Batch, Stochastic and Mini-Batch Explained! | Which One to Choose?

AI For Beginners · Beginner · 📝 ML Fundamentals · 1y ago
🔥 There are three main types of gradient descent: Batch, Stochastic, and Mini-Batch. Batch gradient descent uses every observation to compute the gradient, which is accurate but resource-heavy. Stochastic gradient descent uses only one randomly chosen observation, a noisy approximation of the true gradient that introduces helpful randomness. Mini-Batch gradient descent is a mix of the two: it computes the gradient on a small random sample of the data. Each type has its own advantages and disadvantages. Batch gradient descent requires more resources and converges steadily to a minimum (sometimes a local minimum), while stochastic converges faster due to freque…
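The three variants described above differ only in how many observations feed each gradient step. A minimal sketch on a toy 1-D linear regression (the data, learning rate, and helper names below are illustrative assumptions, not from the video):

```python
import random

# Hypothetical toy data: fit w in y = w * x by minimizing mean squared error.
data = [(x, 3.0 * x) for x in range(1, 11)]  # true w = 3.0
lr = 0.005  # learning rate (assumed for this sketch)

def gradient(w, samples):
    # d(MSE)/dw averaged over whichever samples the variant selects
    return sum(2 * (w * x - y) * x for x, y in samples) / len(samples)

def train(w, steps, pick_batch):
    # pick_batch decides which observations enter each gradient step
    for _ in range(steps):
        w -= lr * gradient(w, pick_batch())
    return w

random.seed(0)
w_batch = train(0.0, 100, lambda: data)                    # batch: all observations
w_sgd   = train(0.0, 100, lambda: [random.choice(data)])   # stochastic: one random observation
w_mini  = train(0.0, 100, lambda: random.sample(data, 4))  # mini-batch: small random sample
print(w_batch, w_sgd, w_mini)  # each should approach the true w = 3.0
```

Only `pick_batch` changes between the variants: the full dataset gives a smooth, expensive step; a single observation gives a cheap, noisy step; a small sample trades between the two.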

Chapters (8)

0:00 Introduction.
0:10 Batch gradient descent.
0:20 Stochastic gradient descent.
0:34 Mini-batch gradient descent.
0:48 Batch gradient descent pros and cons.
1:21 Stochastic gradient descent pros and cons.
1:51 Mini-batch gradient descent pros and cons.
2:10 Subscribe to us!