SciTech-BigDataAIML-Tensorflow-Introduction to graphs and tf.function

  • Graphs are data structures that contain:

    1. a set of tf.Operation objects,
      which represent units of computation;
    2. and tf.Tensor objects,
      which represent the units of data that flow between operations.
  • Graphs are defined in a tf.Graph context. Since these graphs are data structures, they can be saved, run, and restored all without the original Python code.
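
    A minimal sketch of these two building blocks (the function name `double` and the sample input are illustrative, not from the original guide): tracing a Python function with tf.function produces a ConcreteFunction backed by a tf.Graph, whose nodes are tf.Operation objects and whose edges are tf.Tensor objects.

```python
import tensorflow as tf

# An ordinary Python function, traced into a tf.Graph by tf.function.
@tf.function
def double(x):
    return x * 2

# Tracing with a sample input yields a ConcreteFunction backed by a tf.Graph.
concrete_fn = double.get_concrete_function(tf.constant(3.0))
graph = concrete_fn.graph

# The nodes of the graph are tf.Operation objects (units of computation)...
for op in graph.get_operations():
    print(op.name, op.type)

# ...and the values flowing between them are tf.Tensor objects (units of data).
print(concrete_fn(tf.constant(3.0)))  # tf.Tensor(6.0, shape=(), dtype=float32)
```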

  • The benefits of graphs:
    In short, graphs are extremely useful and let TensorFlow run fast, run in parallel, and run efficiently on multiple devices.
    However, you still want to define your machine learning models (or other computations) in Python for convenience, and then automatically construct graphs when you need them.

    1. TensorFlow uses graphs as the format for saved models when it exports them from Python (see the sketch after this list).
    2. With a graph, you have a great deal of flexibility. You can use your TensorFlow graph in environments that don't have a Python interpreter, like mobile applications, embedded devices, and backend servers.
    3. Graphs are also easily optimized, allowing the compiler to do transformations like:
      - Statically infer the value of tensors by folding constant nodes in your computation ("constant folding").
      - Separate sub-parts of a computation that are independent and split them between threads or devices.
      - Simplify arithmetic operations by eliminating common subexpressions.
      There is an entire optimization system, Grappler, to perform these and other speedups.
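
    As a hedged sketch of points 1 and 2 above (the module name `Doubler` and the export path `/tmp/doubler` are made up for illustration): a tf.function can be traced into a graph, exported as a SavedModel, and then reloaded and run without the original Python class definition.

```python
import tensorflow as tf

class Doubler(tf.Module):
    # input_signature fixes the trace, so a single graph is stored in the SavedModel.
    @tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.float32)])
    def __call__(self, x):
        return x * 2.0

module = Doubler()

# Export: the traced graph (not the Python source) is what gets saved.
tf.saved_model.save(module, "/tmp/doubler")

# Reload: the graph runs even if the Doubler class is no longer defined,
# which is what makes deployment to servers or devices possible.
restored = tf.saved_model.load("/tmp/doubler")
print(restored(tf.constant([1.0, 2.0])))  # tf.Tensor([2. 4.], shape=(2,), dtype=float32)
```

    Because only the graph is needed at run time, the same SavedModel can be served by TensorFlow Serving or converted for environments without a Python interpreter.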