

Stochastic Gradient Descent (SGD)

SGD (Stochastic Gradient Descent) is an optimization algorithm used in machine learning to minimize a model's loss function by updating its parameters iteratively. At each step it randomly selects a single training example (or, in the widely used mini-batch variant, a small subset of the training data), computes the gradient of the loss function with respect to the model parameters on that sample, and then updates the parameters in the direction of the negative gradient.
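The update rule described above can be sketched in a few lines of NumPy. The example below is a minimal illustration, not a production implementation: it fits a simple linear model with mini-batch SGD on mean-squared error, and all names (`sgd_linear_regression`, the toy data) are invented for this sketch.

```python
import numpy as np

def sgd_linear_regression(X, y, lr=0.01, epochs=100, batch_size=8, seed=0):
    """Fit y ~ X @ w + b by mini-batch SGD on mean-squared error."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        # Shuffle the data once per epoch, then walk through mini-batches.
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            # Gradient of the mini-batch MSE w.r.t. the parameters.
            err = Xb @ w + b - yb
            grad_w = 2 * Xb.T @ err / len(batch)
            grad_b = 2 * err.mean()
            # Update the parameters in the direction of the negative gradient.
            w -= lr * grad_w
            b -= lr * grad_b
    return w, b

# Toy data: y = 3x + 1 with a little noise.
rng = np.random.default_rng(42)
X = rng.uniform(-1, 1, size=(200, 1))
y = 3 * X[:, 0] + 1 + 0.01 * rng.standard_normal(200)
w, b = sgd_linear_regression(X, y, lr=0.1, epochs=200)
```

Because each update uses only a small random sample rather than the full dataset, the gradient is noisy but cheap to compute, which is what makes SGD practical for large training sets.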


by Devansh Shukla

"AI Tamil Nadu formely known as AI Coimbatore is a close-Knit community initiative by Navaneeth with a goal to offer world-class AI education to anyone in Tamilnadu for free."