Optimization algorithms underlying neural networks: Classification of meditative states by use of recurrent neural networks
Neural networks can be applied to an ever-widening range of tasks. This thesis investigates the most common optimization algorithms underlying neural networks: classical momentum, Nesterov momentum, AdaGrad, AdaDelta, RMSprop, Adam, AdaMax, and Nadam. The mathematics on which these algorithms are based is described, and a summary is given of the key components of a neural network—ac
