Nadam Optimiser

Gradient descent is an optimization algorithm that follows the negative gradient of an objective function in order to locate the minimum of that function; it is widely used to train machine learning models by minimizing the error between predicted and actual results.

Nesterov-accelerated Adaptive Moment Estimation, or Nadam, extends the Adaptive Moment Estimation (Adam) optimization algorithm with Nesterov's Accelerated Gradient (NAG), also known as Nesterov momentum, an improved form of momentum.
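As a concrete illustration, here is a minimal NumPy sketch of the Nadam update rule (following Dozat, 2016). The hyperparameter defaults (lr=0.002, beta1=0.9, beta2=0.999, eps=1e-8) follow common convention, and the momentum warm-up schedule from the original paper is omitted for simplicity, so treat this as a sketch rather than a reference implementation.

import numpy as np

def nadam_step(theta, grad, m, v, t, lr=0.002, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam-style moment estimates
    m = beta1 * m + (1 - beta1) * grad        # first moment (momentum)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (adaptive scaling)
    m_hat = m / (1 - beta1 ** t)              # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)              # bias-corrected second moment
    # Nesterov look-ahead: blend the corrected momentum with the current
    # gradient, instead of using m_hat alone as plain Adam would
    m_bar = beta1 * m_hat + (1 - beta1) * grad / (1 - beta1 ** t)
    return theta - lr * m_bar / (np.sqrt(v_hat) + eps), m, v

# Minimise the toy objective f(x) = x^2, whose gradient is 2x
x, m, v = 5.0, 0.0, 0.0
for t in range(1, 5001):
    x, m, v = nadam_step(x, 2 * x, m, v, t)
print(x)  # ends up close to the minimum at x = 0

The only difference from plain Adam is the blending line: mixing the current gradient into the bias-corrected momentum effectively evaluates the momentum one step ahead, which is the essence of Nesterov acceleration.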

by Devansh Shukla

"AI Tamil Nadu formely known as AI Coimbatore is a close-Knit community initiative by Navaneeth with a goal to offer world-class AI education to anyone in Tamilnadu for free."