Introduction to TensorFlow, by Machine Learning at Berkeley
Agenda
Our goals for tonight
1. Neural Network Review
2. What is TensorFlow?
3. Building Neural Nets
4. TensorBoard Visualization
Neural Networks Review
● Layers that combine previous features to form new features
● Each layer applies a linear transformation, "weighing" previous features
● Nonlinear activation function
● Predicts through feedforward propagation
● Learns through gradient descent and backpropagation
[Diagram: feedforward pass through the layers, then backpropagation of gradients]
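The review above can be sketched in NumPy. This is a minimal single-layer illustration; the names (X, y, W) and the learning rate 0.1 are ours, not from the talk:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))                 # 4 examples, 3 input features
y = np.array([[0.0], [1.0], [1.0], [0.0]])  # targets
W = rng.normal(size=(3, 1))                 # weights: "weighing" previous features
b = np.zeros((1, 1))

# Feedforward: linear transformation followed by a nonlinear activation
pred = sigmoid(X @ W + b)

# Backpropagation for a squared-error loss: chain rule through the layer,
# then a gradient descent update of the weights
grad_pred = 2.0 * (pred - y) / len(X)
grad_z = grad_pred * pred * (1.0 - pred)    # derivative of sigmoid
W -= 0.1 * (X.T @ grad_z)
b -= 0.1 * grad_z.sum(axis=0, keepdims=True)
```

Repeating the feedforward/backpropagate loop over many steps is exactly the training procedure the slide describes.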
Neural Network Libraries
Caffe
Pros: - Strength in CNNs - Image processing - Python interface
Cons: - Inflexible - C++

Theano
Pros: - Widely used - High performance - Python
Cons: - Somewhat bulky - Can get low-level

Torch
Pros: - Slimmer - High performance - Modular
Cons: - Mostly academic use - Lua

TensorFlow
Pros: - Gaining support - TensorBoard - Python
Cons: - Performance still improving - Less example content available
Why Learn TensorFlow?
➢ Backed by Google
○ Constant development and frequent updates
○ DeepMind moving from Torch to TensorFlow
➢ Growing community
○ Amount of example code and tutorials is growing
○ Most commonly mentioned ML library on Stack Overflow
➢ Long-term support
○ With the recent TensorFlow 1.0 update, all code will be compatible with 1.x updates
➢ Performance is not very good yet, but getting better
○ About an order of magnitude slower than Theano
Tensors (side note)
➢ For programmers: tensors generalize multidimensional arrays.
➢ For mathematicians: tensors generalize scalars, vectors, matrices, and linear operators!
➢ TensorFlow describes data as tensors and passes them through its computation graph.
➢ Tensors flow through the network.
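The "generalized multidimensional array" view can be illustrated with NumPy arrays standing in for tensors (the shapes below are our own examples):

```python
import numpy as np

scalar = np.array(3.0)          # rank-0 tensor (a scalar)
vector = np.array([1.0, 2.0])   # rank-1 tensor (a vector)
matrix = np.eye(2)              # rank-2 tensor (a matrix)
batch = np.zeros((32, 28, 28))  # rank-3 tensor, e.g. a batch of 28x28 images

print(scalar.ndim, vector.ndim, matrix.ndim, batch.ndim)  # 0 1 2 3
```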
TensorFlow Basics
Computation Graph
➢ Two phases: construction (build the graph) and execution (run it)
TensorFlow Basics

Variables
➢ Store parameters in the graph
➢ Can be trainable (optimized during backprop) or untrainable
➢ Variety of initializers (e.g. constant, normal, etc.)

Operations
➢ Take in variables and/or outputs from other operations
➢ Can be fed into other ops and linked in the graph
[Example graph: tf.constant(5.0) and tf.constant(3.0) feed tf.add(); the tf.add() output and tf.random_normal(mean=1, stddev=2) feed tf.mul()]
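A minimal sketch of variables and initializers, assuming TensorFlow 2.x with the v1 compatibility API (the talk targets TF 1.x, where these calls live at the top level). The variable names are ours:

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()  # build a graph rather than executing eagerly

# A trainable variable with a normal initializer
w = tf1.get_variable(
    "w", shape=[3],
    initializer=tf1.random_normal_initializer(mean=1.0, stddev=2.0))
# An untrainable variable with a constant initializer
step = tf1.get_variable(
    "step", shape=[], trainable=False,
    initializer=tf1.constant_initializer(0.0))

# Operations take variables (or other ops' outputs) and can be chained
doubled = tf1.multiply(w, 2.0)

sess = tf1.Session()
sess.run(tf1.global_variables_initializer())
print(sess.run(step))  # 0.0
```

Only `w` appears in the trainable collection, so only it would be updated during backprop.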
TensorFlow Basics

Sessions
➢ Handle post-construction interactions with the graph
➢ Call the run method to evaluate tensors
[Example evaluation: tf.constant(3.0) and tf.constant(5.0) feed tf.add() → 8.0; tf.random_normal(mean=1, stddev=2) samples 1.68; the tf.add() output and the sample feed tf.mul() → 13.44]

sess = tf.Session()
sess.run(tf.global_variables_initializer())
sess.run(mult_op)
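An end-to-end sketch of the slide's session example, assuming TensorFlow 2.x with the v1 compatibility API (the original slides use the TF 1.x names tf.Session, tf.random_normal, and tf.mul, which became tf.multiply in 1.0):

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

a = tf1.constant(3.0)
b = tf1.constant(5.0)
r = tf1.random_normal([], mean=1.0, stddev=2.0)
add_op = tf1.add(a, b)             # always 8.0
mult_op = tf1.multiply(add_op, r)  # 8.0 times a random sample

sess = tf1.Session()
sess.run(tf1.global_variables_initializer())
print(sess.run(add_op))   # 8.0
print(sess.run(mult_op))  # e.g. 13.44 when the sample is 1.68
```

Note that nothing is computed during construction; values only materialize when `sess.run` evaluates a tensor.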
TensorFlow Basics

Optimizers
➢ Subclasses of tf.train.Optimizer
➢ Main functions: compute_gradients, apply_gradients, and minimize

def minimize(self, loss_fn):
    grads = self.compute_gradients(loss_fn)  # backpropagation on ops
    self.apply_gradients(grads)              # update trainable variables
Some loss functions are built into TensorFlow
➢ For example, tf.losses.mean_squared_error
➢ You can also define your own loss functions by combining ops
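A hedged sketch of an optimizer in use, fitting y = 2x by gradient descent; it assumes TensorFlow 2.x with the v1 compatibility API (tf.train.GradientDescentOptimizer and tf.losses.mean_squared_error are the TF 1.x names), and the data and learning rate are our own:

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

x = tf1.constant([1.0, 2.0, 3.0])
y = tf1.constant([2.0, 4.0, 6.0])  # targets: y = 2x
w = tf1.Variable(0.0)              # trainable parameter

# Built-in loss, then minimize() wires compute_gradients + apply_gradients
loss = tf1.losses.mean_squared_error(y, w * x)
train_op = tf1.train.GradientDescentOptimizer(0.1).minimize(loss)

sess = tf1.Session()
sess.run(tf1.global_variables_initializer())
for _ in range(100):
    sess.run(train_op)             # gradients computed and applied each step
print(sess.run(w))                 # approaches 2.0
```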
TensorFlow Basics
Placeholders
➢ A placeholder is a value that is filled in during execution
➢ On evaluation, specify a dictionary with placeholder key-value pairs
[Example: placeholders P1 and P2 feed tf.add(); with P1 = 1.0 and P2 = 2.0, tf.add() → 3.0; tf.random_normal(mean=1, stddev=2) samples 0.94; tf.mul() → 2.82]

sess.run(mul, feed_dict={P1: 1.0, P2: 2.0})
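A runnable sketch of the placeholder mechanism, assuming TensorFlow 2.x with the v1 compatibility API (tf.placeholder is the TF 1.x name):

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

p1 = tf1.placeholder(tf.float32)  # values supplied at execution time
p2 = tf1.placeholder(tf.float32)
add_op = tf1.add(p1, p2)

sess = tf1.Session()
# feed_dict maps each placeholder to the value it takes for this run
result = sess.run(add_op, feed_dict={p1: 1.0, p2: 2.0})
print(result)  # 3.0
```

Evaluating `add_op` without feeding both placeholders would raise an error, since the graph has no values for them.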
TensorBoard
Graph Visualization
➢ See a visual representation of the graph
➢ Check and debug construction

Scoping
➢ Used to create abstractions
➢ Without scoping, the graph can become a convoluted mess
➢ Created by using a with statement
➢ Scope name gets prepended to variable and operation names
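A minimal sketch of scoping via a with statement, assuming TensorFlow 2.x with the v1 compatibility API; the scope and op names are our own:

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

with tf1.name_scope("layer1"):
    w = tf1.constant(1.0, name="weights")
    out = tf1.add(w, 2.0, name="output")

print(out.name)  # "layer1/output:0" -- the scope name is prepended
```

In TensorBoard, everything under "layer1" collapses into a single box, which is what keeps large graphs readable.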
TensorBoard
Visualizing Learning
➢ See tensor values during training
➢ Check and debug execution
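A hedged sketch of logging tensor values for TensorBoard, assuming TensorFlow 2.x with the v1 summary API; the loss values and the temporary log directory are stand-ins:

```python
import tempfile
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

logdir = tempfile.mkdtemp()  # stand-in for a real log directory

loss = tf1.placeholder(tf.float32)
tf1.summary.scalar("loss", loss)
merged = tf1.summary.merge_all()

writer = tf1.summary.FileWriter(logdir)
sess = tf1.Session()
for step, value in enumerate([3.0, 2.0, 1.0]):
    summary = sess.run(merged, feed_dict={loss: value})
    writer.add_summary(summary, step)  # one logged value per training step
writer.close()
# Inspect the curves with: tensorboard --logdir <logdir>
```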
Review
➢ TensorFlow is one of several ML libraries, each with pros and cons
➢ Expected long-term support for TensorFlow
➢ Two stages: construction and execution
➢ Tensors are passed through chained operations
➢ Operations are evaluated at the execution stage with a session object
➢ Use optimizers to find and apply gradients for the training step
➢ TensorBoard is used for graph visualization and visualizing learning
Thank You for Coming!
Please fill out this feedback form: https://mlab.typeform.com/to/t51Y09
Like our page on Facebook: www.facebook.com/berkeleyml
Email us: [email protected]
Visit our website: ml.berkeley.edu
Check out our blog: ml.berkeley.edu/blog