Neural Networks with Keras Cookbook

There's more...

Some of the other optimizers available in Keras are as follows:

  • RMSprop
  • Adagrad
  • Adadelta
  • Adamax
  • Nadam

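As a minimal sketch, any of the optimizers above can be passed to `model.compile`, either by name as a string (using default hyperparameters) or as an optimizer instance. The toy model below is illustrative only, and the imports assume TensorFlow's bundled Keras (`tensorflow.keras`); with standalone Keras, the imports would be `from keras.models import Sequential`, and so on.

```python
# Illustrative sketch: compiling the same small model with each optimizer.
# Assumes tensorflow.keras; the model and data shapes are arbitrary examples.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

def build_model(optimizer):
    model = Sequential([
        Dense(8, activation='relu', input_shape=(4,)),
        Dense(1, activation='sigmoid'),
    ])
    # The optimizer can be a string (default hyperparameters)
    # or an optimizer instance when you need to tune them.
    model.compile(optimizer=optimizer, loss='binary_crossentropy')
    return model

for name in ['rmsprop', 'adagrad', 'adadelta', 'adamax', 'nadam']:
    model = build_model(name)
```

Passing the string form is convenient for quick experiments; instantiating the optimizer class instead lets you set hyperparameters such as the learning rate explicitly.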
You can learn more about the various optimizers here: https://keras.io/optimizers/.

Additionally, you can find the source code of each optimizer here: https://github.com/keras-team/keras/blob/master/keras/optimizers.py.