Computation graph

Everything in TensorFlow is represented as a computational graph consisting of nodes and edges, where nodes are mathematical operations (say, addition or multiplication) and edges are the tensors flowing between them. Representing computation as a graph makes it easier to optimize resource usage, and it also promotes distributed computing.
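
As a quick sketch (assuming the TensorFlow 1.x graph-mode API used in this chapter), the following builds a tiny graph of three nodes and evaluates it inside a session:

import tensorflow as tf

# Each operation becomes a node in the default graph; the tensors
# flowing between the operations are the edges.
a = tf.constant(5)
b = tf.constant(3)
c = tf.add(a, b)  # this node consumes the outputs (edges) of a and b

with tf.Session() as sess:
    print(sess.run(c))  # prints 8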

Say we have node B, whose input depends on the output of node A; this type of dependency is called a direct dependency.

For example:

A = tf.multiply(8, 5)
B = tf.multiply(A, 1)

When node B doesn't depend on node A for its input, this is called an indirect dependency.

For example:

A = tf.multiply(8, 5)
B = tf.multiply(4, 3)

So, if we can understand these dependencies, we can distribute the independent computations across the available resources and reduce the computation time.
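
As an illustrative sketch only (the device names '/cpu:0' and '/gpu:0' are assumptions about the available hardware, and tf.device is the TensorFlow 1.x way of pinning operations to a device), independent nodes can be placed on separate devices so they run concurrently:

import tensorflow as tf

# A and B have no dependency on each other, so they can be placed on
# different devices and computed in parallel.
with tf.device('/cpu:0'):
    A = tf.multiply(8, 5)
with tf.device('/gpu:0'):  # assumes a GPU is available
    B = tf.multiply(4, 3)

# allow_soft_placement lets TensorFlow fall back to the CPU
# if the requested GPU is not present.
with tf.Session(config=tf.ConfigProto(allow_soft_placement=True)) as sess:
    print(sess.run([A, B]))  # prints [40, 12]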

Whenever we import TensorFlow, a default graph is created automatically, and any nodes we create are associated with that default graph.
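
For instance (a minimal check, assuming the TF 1.x API), tf.get_default_graph() returns that graph, and any node we create is attached to it:

import tensorflow as tf

A = tf.multiply(8, 5)

# The operation we just created lives in the default graph.
print(A.graph is tf.get_default_graph())  # prints True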
