Learn Tensorflow by Examples Series - Part 1

Tensorflow Introduction


Tensorflow is a powerful open-source library for large-scale machine learning from Google. In this tutorial, we'll discuss the basic principle of Tensorflow, with a code example.
Tensorflow is:
  • A numerical computation library well suited for deep learning applications.
  • Modular and actively maintained.
  • Allows parallelization and efficient usage of available hardware. Supports distributed computing.
  • Powers many of Google’s large-scale services, including Search, Photos, Cloud Speech etc.
  • Comes with a great visualization tool called Tensorboard.
  • Easy to install with pip: pip3 install --upgrade tensorflow

Code example:
import tensorflow as tf

# Declare variables (with initial values)
x = tf.Variable(3, name='x')
y = tf.Variable(4, name='y')

# Function (i.e. graph) to compute
f = (x * x * y) + y + 2

'''
Up to this point no actual computation has happened:
running the code so far only declares a
computation graph.
Even the variables are not actually initialized yet.
Let's run the computation using a Tensorflow session!
'''

# Create session
sess = tf.Session()

# Initialize variables
sess.run(x.initializer)
sess.run(y.initializer)

# Evaluate result
result = sess.run(f)
print(result)  # 42

# Release the session's resources
sess.close()
A couple of things to notice:
  • When we define a graph node, e.g. f = (x * x * y) + y + 2, Tensorflow doesn’t compute the value of f immediately; it only records how to compute it. This is the basic principle of Tensorflow: separation of declaration and execution.
  • Tensorflow graphs are computed inside a session and must be explicitly run with the session.run() function. A session is an isolated environment that holds the state of the variables and graphs being computed. If a new session is created, the previous values are cleared (except in distributed computing).
  • Tensorflow variables must be explicitly declared and initialized before they can be evaluated.
Q. Which of these can be a potential advantage of separation of declaration and execution?