Keras and TensorFlow
with Daniel L. Silver, Ph.D. and Christian Frey, BBA
April 11-12, 2017
Keras
A framework on top of TensorFlow or Theano.
Follows the principle of layers: layers can be stacked, split, or merged to build unique network architectures.
Calculates the connection size between hidden layers based on each layer's size.
Allows GPU acceleration with minimal configuration.
http://Keras.io
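As a sketch of the split-and-merge idea above, the Keras functional API lets two parallel branches share one input and be concatenated back together. The layer sizes here are arbitrary, chosen only for illustration:

```python
# Sketch (Keras 2 functional API): split one input into two
# parallel Dense branches, then merge them by concatenation.
from keras.layers import Input, Dense, concatenate
from keras.models import Model

inp = Input(shape=(16,))
a = Dense(8, activation='relu')(inp)    # first branch
b = Dense(8, activation='relu')(inp)    # parallel branch (the "split")
merged = concatenate([a, b])            # merge: shape (None, 16)
out = Dense(1, activation='sigmoid')(merged)

model = Model(inputs=inp, outputs=out)
```

Keras computes the merged width (8 + 8 = 16) itself; no connection sizes are specified by hand.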
The Sequential Model
Allows you to stack many layers on top of each other to generate arbitrarily deep models.
Layers can be of different types (e.g., Conv -> Flatten -> Dense).
Offers the ability to create your own custom layers.
You can merge or multiply layers to combine them.
You can add callbacks for early stopping, logging, checkpoints, etc.
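A minimal sketch of the points above: a Sequential model mixing layer types (Conv -> Flatten -> Dense) with an EarlyStopping callback. The input shape and layer sizes are assumptions for illustration, not from the slides:

```python
# Sketch (Keras 2 API): a Sequential model stacking different
# layer types, compiled with an early-stopping callback ready.
from keras.models import Sequential
from keras.layers import Conv2D, Flatten, Dense
from keras.callbacks import EarlyStopping

model = Sequential()
# Assumed input: 28x28 single-channel images (MNIST-like)
model.add(Conv2D(16, kernel_size=(3, 3), activation='relu',
                 input_shape=(28, 28, 1)))
model.add(Flatten())                      # 2D feature maps -> 1D vector
model.add(Dense(10, activation='softmax'))

model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# Callback: stop training when validation loss stops improving
early_stop = EarlyStopping(monitor='val_loss', patience=3)
# model.fit(x_train, y_train, validation_split=0.1,
#           epochs=50, callbacks=[early_stop])
```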
Relation of Keras to TensorFlow
Keras sits on top of TensorFlow or Theano (it supports both, but only one at a time) and makes setting up tensors easier.
Reduces the overhead of creating layers (such as needing to create the weight, bias, and operation for each layer).
Automatically handles the connections between layers, which is very helpful when dealing with convolution: calculating the output size of the third convolution layer so you can design the input to the fourth is an annoyance, and it is easy to make a mistake.
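To illustrate the shape-inference point, the sketch below stacks two convolution layers (with assumed filter counts and a 32x32x3 input) and asks Keras for each layer's output shape, rather than computing them by hand:

```python
# Sketch: Keras infers every layer's output size, so the
# flattened width feeding the Dense layer is never hand-computed.
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

model = Sequential()
model.add(Conv2D(8, (5, 5), input_shape=(32, 32, 3)))  # -> (28, 28, 8)
model.add(MaxPooling2D((2, 2)))                        # -> (14, 14, 8)
model.add(Conv2D(16, (3, 3)))                          # -> (12, 12, 16)
model.add(Flatten())                                   # -> 12*12*16 = 2304
model.add(Dense(10))

# Inspect the shapes Keras worked out automatically
for layer in model.layers:
    print(layer.name, layer.output_shape)
```

In raw TensorFlow, the 2304-unit input to the Dense layer would have to be derived by hand from the convolution and pooling arithmetic.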
Keras vs. TensorFlow
Keras TensorFlow Integration – Future Work
As announced in January 2017, Keras will be included directly in the TensorFlow core in the future. This will make it easier to build models with TensorFlow: you can just use the standard Keras functions!
For further reading, see this Reddit thread and a comment by the author of Keras:
https://www.reddit.com/r/MachineLearning/comments/5jg7b8/p_deep_learning_for_coders18_hours_of_lessons_for/dbhaizx/
And an article by fast.ai: http://www.fast.ai/2017/01/03/keras/
"Using TensorFlow makes me feel like I'm not smart enough to use TensorFlow; whereas using Keras makes me feel like neural networks are easier than I realized." – from the fast.ai article