TensorFlow Notes (1)

Getting Started with TensorFlow

Notes from the official website.

Tensors: The central unit of data in TensorFlow is the tensor. A tensor consists of a set of primitive values shaped into an array of any number of dimensions. A tensor’s rank is its number of dimensions.
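The rank idea can be illustrated without TensorFlow at all. Below is a minimal sketch using plain Python nested lists (mirroring the guide's own examples); the `rank` helper is a hypothetical illustration that counts nesting depth, not a TensorFlow API:

```python
# Tensors of increasing rank, written as plain Python nested lists.
t0 = 3.                                  # rank 0: a scalar, shape []
t1 = [1., 2., 3.]                        # rank 1: a vector, shape [3]
t2 = [[1., 2., 3.], [4., 5., 6.]]        # rank 2: a matrix, shape [2, 3]
t3 = [[[1., 2., 3.]], [[7., 8., 9.]]]    # rank 3, shape [2, 1, 3]

def rank(t):
    """Hypothetical helper: count nesting depth (list levels)."""
    depth = 0
    while isinstance(t, list):
        depth += 1
        t = t[0]
    return depth

print(rank(t0), rank(t1), rank(t2), rank(t3))  # 0 1 2 3
```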

You might think of TensorFlow Core programs as consisting of two discrete sections:

  1. Building the computational graph.
  2. Running the computational graph.
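As a rough analogy (plain Python, not TensorFlow code): "building" the graph is like composing deferred functions without evaluating anything, and "running" it is like finally calling them:

```python
# Build phase: compose functions; nothing is computed yet.
node1 = lambda: 3.0
node2 = lambda: 4.0
node3 = lambda: node1() + node2()  # analogous to tf.add(node1, node2)

# Run phase: actually execute the "graph"
# (roughly what sess.run does in TensorFlow).
print(node3())  # 7.0
```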

The Computational Graph: A computational graph is a series of TensorFlow operations arranged into a graph of nodes. Let’s build a simple computational graph.

Node: Each node takes zero or more tensors as inputs and produces a tensor as an output.

Constant: One type of node is a constant. Like all TensorFlow constants, it takes no inputs, and it outputs a value it stores internally.

**Placeholder**: A graph can be parameterized to accept external inputs, known as placeholders. A placeholder is a promise to provide a value later.

```
a = tf.placeholder(tf.float32)
```

Variables: In machine learning we will typically want a model that can take arbitrary inputs, such as the one above. To make the model trainable, we need to be able to modify the graph to get new outputs with the same input. Variables allow us to add trainable parameters to a graph.

Constants are initialized when you call tf.constant, and their value can never change. By contrast, variables are not initialized when you call tf.Variable. To initialize all the variables in a TensorFlow program, you must explicitly call a special operation as follows:

```
init = tf.global_variables_initializer()
sess.run(init)
```

**tf.assign**: A variable is initialized to the value provided to tf.Variable but can be changed using operations like tf.assign.

**Session**: To actually evaluate the nodes, we must run the computational graph within a session. A session encapsulates the control and state of the TensorFlow runtime.

```
import tensorflow as tf
```

constant

```
node1 = tf.constant(3.0, dtype=tf.float32)
node2 = tf.constant(4.0)  # also tf.float32 implicitly
print(node1, node2)
```

session

```
sess = tf.Session()
print(sess.run([node1, node2]))

node3 = tf.add(node1, node2)
print("node3:", node3)
print("sess.run(node3):", sess.run(node3))
```

placeholder

```
a = tf.placeholder(tf.float32)
b = tf.placeholder(tf.float32)
adder_node = a + b  # + provides a shortcut for tf.add(a, b)

print(sess.run(adder_node, {a: 3, b: 4.5}))
print(sess.run(adder_node, {a: [1, 3], b: [2, 4]}))

add_and_triple = adder_node * 3.
print(sess.run(add_and_triple, {a: 3, b: 4.5}))
```
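The values the session prints can be verified by hand. Here is the same arithmetic in plain Python (no TensorFlow needed), assuming elementwise addition for the list feeds as shown above:

```python
# Scalar feed: a = 3, b = 4.5
scalar_sum = 3 + 4.5
print(scalar_sum)   # 7.5

# Vector feed: elementwise addition of the two lists
vector_sum = [p + q for p, q in zip([1, 3], [2, 4])]
print(vector_sum)   # [3, 7]

# add_and_triple: (a + b) * 3
tripled = (3 + 4.5) * 3.
print(tripled)      # 22.5
```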

variable

```
W = tf.Variable([.3], dtype=tf.float32)
b = tf.Variable([-.3], dtype=tf.float32)
x = tf.placeholder(tf.float32)
linear_model = W*x + b

init = tf.global_variables_initializer()
sess.run(init)

print(sess.run(linear_model, {x: [1, 2, 3, 4]}))

y = tf.placeholder(tf.float32)
squared_deltas = tf.square(linear_model - y)
loss = tf.reduce_sum(squared_deltas)
print(sess.run(loss, {x: [1, 2, 3, 4], y: [0, -1, -2, -3]}))
```
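The loss value can be recomputed by hand to check the session output; a plain-Python sketch of the same sum-of-squared-errors with W = 0.3 and b = -0.3:

```python
# Recompute the loss by hand: predictions from linear_model = W*x + b
W, b = 0.3, -0.3
xs = [1, 2, 3, 4]
ys = [0, -1, -2, -3]
preds = [W * x + b for x in xs]  # approximately [0.0, 0.3, 0.6, 0.9]
loss = sum((p - y) ** 2 for p, y in zip(preds, ys))
print(round(loss, 2))  # 23.66
```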

assign

```
fixW = tf.assign(W, [-1.])
fixb = tf.assign(b, [1.])
sess.run([fixW, fixb])
print(sess.run(loss, {x: [1, 2, 3, 4], y: [0, -1, -2, -3]}))
```
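The reassigned parameters W = -1 and b = 1 happen to fit the data exactly, so the loss should drop to 0; again checkable in plain Python:

```python
# With the "perfect" parameters the model matches every target exactly.
W, b = -1., 1.
xs = [1, 2, 3, 4]
ys = [0, -1, -2, -3]
preds = [W * x + b for x in xs]  # [0.0, -1.0, -2.0, -3.0]
loss = sum((p - y) ** 2 for p, y in zip(preds, ys))
print(loss)  # 0.0
```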