
Gradient Descent Simulator

Visualize how models find optimal parameters through iterative optimization


What is Gradient Descent?

Gradient descent is the workhorse of machine learning: the optimization algorithm that lets models learn by iteratively adjusting parameters to minimize loss. Think of it as hiking down a mountain in fog, taking each step in the steepest downward direction.

๐Ÿ”๏ธ The Mountain Analogy

๐Ÿ“
Current Position
Your current parameter values
Weight = 5.0
๐Ÿงญ
Gradient
Direction of steepest ascent
Slope = +2.0 (upward)
๐Ÿ‘Ÿ
Step
Move opposite to gradient
New weight = 5.0 - 0.1ร—2.0
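The step card above works out to a single line of code. Here is a minimal sketch using the same numbers from the analogy (weight 5.0, slope +2.0, learning rate 0.1, which is implied by the 0.1 factor in the update):

```python
# One gradient-descent step, using the numbers from the analogy above.
weight = 5.0          # current position on the "mountain"
gradient = 2.0        # slope of the loss at this weight (steepest ascent)
learning_rate = 0.1   # step size

# Move opposite to the gradient, i.e. downhill.
new_weight = weight - learning_rate * gradient
print(new_weight)     # 4.8
```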

🔄 The Algorithm

1. Initialize Parameters: start with random weights (θ = random values)
2. Compute Loss: calculate the prediction error (L = loss(y, ŷ))
3. Calculate Gradient: find the direction that reduces loss (∇L = ∂L/∂θ)
4. Update Parameters: take a step in the negative gradient direction (θ = θ − α·∇L)
5. Repeat: continue until convergence (loop until |∇L| ≈ 0)
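The five steps above can be sketched as a short loop. This is an illustrative example, not the simulator's code: the toy loss L(θ) = (θ − 3)², its gradient 2(θ − 3), and the stopping threshold are assumptions chosen to keep the sketch self-contained.

```python
import random

def gradient_descent(lr=0.1, tol=1e-6, max_iters=10_000):
    theta = random.uniform(-10, 10)        # 1. initialize parameters
    for _ in range(max_iters):
        loss = (theta - 3) ** 2            # 2. compute loss
        grad = 2 * (theta - 3)             # 3. calculate gradient
        theta = theta - lr * grad          # 4. update parameters
        if abs(grad) < tol:                # 5. repeat until |∇L| ≈ 0
            break
    return theta

print(gradient_descent())  # converges near the minimum at theta = 3
```

Each pass through the loop shrinks the distance to the minimum by a constant factor (1 − 2·lr here), so convergence is fast for this well-behaved quadratic.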

💡 Key Concepts

Gradient (∇): vector of partial derivatives pointing in the direction of steepest ascent
Learning Rate (α): step size, i.e. how far to move along the negative gradient each iteration
Convergence: when the gradient approaches zero, we have reached a minimum
Local vs Global Minimum: gradient descent finds a local minimum, which is not always the globally best solution
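To make the learning-rate and convergence concepts concrete, here is a small sketch. The quadratic loss L(θ) = θ² (gradient 2θ) and the specific α values are illustrative assumptions, not taken from the simulator.

```python
# Run a fixed number of gradient steps on L(theta) = theta**2 and
# return the final |theta|; the minimum is at theta = 0.
def run(lr, steps=50, theta=5.0):
    for _ in range(steps):
        theta -= lr * 2 * theta   # theta = theta - alpha * dL/dtheta
    return abs(theta)

print(run(0.01))  # alpha too small: still far from the minimum
print(run(0.4))   # alpha well chosen: essentially at the minimum
print(run(1.1))   # alpha too large: each step overshoots, diverges
```

With α = 1.1 every update multiplies θ by (1 − 2·1.1) = −1.2, so the iterates oscillate with growing magnitude instead of converging, which is the classic failure mode of an oversized learning rate.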