Various Optimization Algorithms For Training Neural Network [Repost]

2021/4/11 10:58:30


Source: https://towardsdatascience.com/optimizers-for-training-neural-network-59450d71caf6

 

Optimizers adjust a network's weights to reduce the loss, helping training converge to good results faster.

Gradient Descent

Stochastic Gradient Descent

Mini-Batch Gradient Descent
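These three variants differ only in how much data each gradient is computed on. A standard formulation of their update rules (with learning rate \(\eta\) and loss \(J\)):

```latex
% Batch gradient descent: gradient over the full training set
\theta = \theta - \eta\, \nabla_\theta J(\theta)

% Stochastic gradient descent: gradient on a single example (x^{(i)}, y^{(i)})
\theta = \theta - \eta\, \nabla_\theta J(\theta;\, x^{(i)}, y^{(i)})

% Mini-batch gradient descent: gradient on a batch of n examples
\theta = \theta - \eta\, \nabla_\theta J(\theta;\, x^{(i:i+n)}, y^{(i:i+n)})
```

Mini-batch gradient descent trades off the stable but expensive full-batch gradient against the cheap but noisy single-example gradient, and is the default in practice.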

Momentum
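Momentum's standard update rule accumulates a decaying sum of past gradients in a velocity term \(v_t\) (the coefficient \(\gamma\) is typically around 0.9):

```latex
v_t = \gamma\, v_{t-1} + \eta\, \nabla_\theta J(\theta)
\theta = \theta - v_t
```

This dampens oscillations across the slopes of a ravine while accelerating progress along its floor.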

Nesterov Accelerated Gradient

NAG vs. momentum at a local minimum
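Nesterov Accelerated Gradient evaluates the gradient at the look-ahead position \(\theta - \gamma v_{t-1}\) rather than at the current parameters, which lets it correct course before overshooting a minimum:

```latex
v_t = \gamma\, v_{t-1} + \eta\, \nabla_\theta J(\theta - \gamma\, v_{t-1})
\theta = \theta - v_t
```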

Adagrad

The derivative of the loss function with respect to a given parameter at time step t.

Update rule for parameter i at time step t, with the learning rate scaled per parameter.
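The two quantities above are commonly written as follows, where \(G_{t,ii}\) accumulates the squared past gradients of parameter \(i\), so frequently-updated parameters get smaller effective learning rates:

```latex
% Gradient of the loss w.r.t. parameter i at time step t
g_{t,i} = \nabla_\theta J(\theta_{t,i})

% Per-parameter update: the learning rate is divided by the square root
% of the accumulated sum of squared past gradients G_{t,ii}
\theta_{t+1,i} = \theta_{t,i} - \frac{\eta}{\sqrt{G_{t,ii} + \epsilon}}\; g_{t,i}
```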

AdaDelta

Update the parameters
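AdaDelta fixes Adagrad's ever-shrinking learning rate by replacing the full sum of squared gradients with a decaying average, and scaling updates so that no hand-set learning rate is needed:

```latex
% Decaying average of squared gradients (decay rate γ)
E[g^2]_t = \gamma\, E[g^2]_{t-1} + (1-\gamma)\, g_t^2

% Update scaled by the RMS of past updates over the RMS of gradients
\Delta\theta_t = -\frac{RMS[\Delta\theta]_{t-1}}{RMS[g]_t}\; g_t
\theta_{t+1} = \theta_t + \Delta\theta_t
```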

Adam

First- and second-moment estimates of the gradient

Update the parameters
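Adam's moment estimates and parameter update are standardly written as:

```latex
% Biased first- and second-moment estimates of the gradient
m_t = \beta_1\, m_{t-1} + (1-\beta_1)\, g_t
v_t = \beta_2\, v_{t-1} + (1-\beta_2)\, g_t^2

% Bias correction (m and v start at zero, so early estimates are biased low)
\hat{m}_t = \frac{m_t}{1-\beta_1^t}, \qquad \hat{v}_t = \frac{v_t}{1-\beta_2^t}

% Parameter update
\theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{\hat{v}_t} + \epsilon}\; \hat{m}_t
```

The first moment acts like momentum; the second moment adapts the step size per parameter, as in Adagrad/AdaDelta.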

Comparison between various optimizers

[Figures in the original post: Comparison 1, Comparison 2]
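A minimal runnable sketch of such a comparison, using a hypothetical 1-D toy problem (minimizing f(x) = x² from the same starting point); step sizes here are illustrative choices for this problem, not tuned recommendations:

```python
import math

def grad(x):
    return 2.0 * x  # derivative of f(x) = x^2

def run_gd(steps=100, lr=0.1):
    # Plain gradient descent
    x = 5.0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def run_momentum(steps=100, lr=0.1, gamma=0.9):
    # Gradient descent with momentum
    x, v = 5.0, 0.0
    for _ in range(steps):
        v = gamma * v + lr * grad(x)
        x -= v
    return x

def run_adam(steps=100, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Adam with standard defaults for b1, b2, eps
    x, m, v = 5.0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        m = b1 * m + (1 - b1) * g          # first-moment estimate
        v = b2 * v + (1 - b2) * g * g      # second-moment estimate
        m_hat = m / (1 - b1 ** t)          # bias correction
        v_hat = v / (1 - b2 ** t)
        x -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

for name, fn in [("GD", run_gd), ("Momentum", run_momentum), ("Adam", run_adam)]:
    print(f"{name}: x = {fn():.6f}")
```

All three should end up near the minimum at x = 0; on harder, non-convex surfaces their trajectories diverge much more, which is what the original post's figures visualize.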

Conclusions

 


