Loading a graph from a .meta file from TensorFlow in C++ for inference

I have trained some models using TensorFlow 1.5.1 and I have the checkpoints for those models (including .ckpt and .meta files). Now I want to do inference in C++ using those files.

In Python, I would do the following to save and load the graph and the checkpoints. For saving:

    images = tf.placeholder(...)  # the input layer
    # ... the graph definition ...
    output = tf.nn.softmax(net)  # the output layer
    tf.add_to_collection('images', images)
    tf.add_to_collection('output', output)

For inference, I restore the graph and the checkpoint, then retrieve the input and output layers from the collections like so:

    meta_file = './models/last-100.meta'
    ckpt_file = './models/last-100'
    with tf.Session() as sess:
        saver = tf.train.import_meta_graph(meta_file)
        saver.restore(sess, ckpt_file)
        images = tf.get_collection('images')[0]  # collections are lists
        output = tf.get_collection('output')[0]
        outputTensors = sess.run(output, feed_dict={images: np.array(an_image)})

Now, assuming that I did the saving in Python as usual, how can I restore the model and do inference in C++ with code as simple as in Python?

I have found examples and tutorials, but only for TensorFlow versions 0.7 and 0.12, and the same code doesn't work for version 1.5. I found no tutorials on the TensorFlow website for restoring models using the C++ API.

1 Answer


For the sake of this thread, I will rephrase my comment into an answer.

Posting a full example would require either a CMake setup or putting the files into a specific directory to run Bazel. As I favor the first way, and covering all the parts would burst the limits of this post, I would like to point to a complete implementation in C99, C++, and Go without Bazel, which I tested for TF > v1.5.

Loading a graph in C++ is not much more difficult than in Python, given that you have already compiled TensorFlow from source.

Starting with an MWE that creates a very dumb network graph is always a good idea to figure out how things work:

    import tensorflow as tf

    x = tf.placeholder(tf.float32, shape=[1, 2], name='input')
    output = tf.identity(tf.layers.dense(x, 1), name='output')

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        saver = tf.train.Saver(tf.global_variables())
        saver.save(sess, './exported/my_model')

There are probably tons of answers about this part here on SO, so I will just leave it here without further explanation.

Loading in Python

Before doing stuff in other languages, we can try to do it properly in Python first -- in the sense that we only need to rewrite it in C++ afterwards. Even restoring is very easy in Python:

    import tensorflow as tf

    with tf.Session() as sess:

        # load the computation graph
        loader = tf.train.import_meta_graph('./exported/my_model.meta')
        sess.run(tf.global_variables_initializer())
        loader.restore(sess, './exported/my_model')

        x = tf.get_default_graph().get_tensor_by_name('input:0')
        output = tf.get_default_graph().get_tensor_by_name('output:0')

But it is not that helpful, as most of these API endpoints do not exist in the C++ API (yet?). An alternative version would be

    import tensorflow as tf

    with tf.Session() as sess:

        metaGraph = tf.train.import_meta_graph('./exported/my_model.meta')
        restore_op_name = metaGraph.as_saver_def().restore_op_name
        restore_op = tf.get_default_graph().get_operation_by_name(restore_op_name)
        filename_tensor_name = metaGraph.as_saver_def().filename_tensor_name
        sess.run(restore_op, {filename_tensor_name: './exported/my_model'})

        x = tf.get_default_graph().get_tensor_by_name('input:0')
        output = tf.get_default_graph().get_tensor_by_name('output:0')

Hang on -- you can always use print(dir(object)) to discover properties like restore_op_name, and so on. Restoring a model is an operation in TensorFlow like any other operation: we just run this operation, providing the path (a string tensor) as an input. We can even write our own restore function:

    def restore(sess, metaGraph, fn):
        restore_op_name = metaGraph.as_saver_def().restore_op_name   # u'save/restore_all'
        restore_op = tf.get_default_graph().get_operation_by_name(restore_op_name)
        filename_tensor_name = metaGraph.as_saver_def().filename_tensor_name  # u'save/Const'
        sess.run(restore_op, {filename_tensor_name: fn})

Even though this looks strange, it now greatly helps us to do the same thing in C++.

Loading in C++

We start with the usual setup:

    #include <tensorflow/core/public/session.h>
    #include <tensorflow/core/public/session_options.h>
    #include <tensorflow/core/protobuf/meta_graph.pb.h>
    #include <string>
    #include <iostream>

    typedef std::vector<std::pair<std::string, tensorflow::Tensor>> tensor_dict;

    int main(int argc, char const *argv[]) {

      const std::string graph_fn = "./exported/my_model.meta";
      const std::string checkpoint_fn = "./exported/my_model";

      // prepare session
      tensorflow::Session *sess;
      tensorflow::SessionOptions options;
      TF_CHECK_OK(tensorflow::NewSession(options, &sess));

      // here we will put our loading of the graph and weights

      return 0;
    }
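
As an aside, TF_CHECK_OK aborts the program when the call fails; if you prefer to handle errors yourself, you can inspect the returned tensorflow::Status directly. A small sketch (reusing the sess and options variables from the skeleton above):

    // Sketch: check the Status manually instead of aborting via TF_CHECK_OK.
    tensorflow::Status status = tensorflow::NewSession(options, &sess);
    if (!status.ok()) {
      std::cerr << "Could not create session: " << status.ToString() << std::endl;
      return 1;
    }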

You should be able to compile this either by putting it into the TensorFlow repo and using Bazel, or by following the instructions here to use CMake.

We need the same meta graph that tf.train.import_meta_graph loads in Python. In C++ this can be done with

    tensorflow::MetaGraphDef graph_def;
    TF_CHECK_OK(ReadBinaryProto(tensorflow::Env::Default(), graph_fn, &graph_def));

In C++, reading a graph from a file is not the same as importing a graph in Python. We still need to create the graph in the session:

    TF_CHECK_OK(sess->Create(graph_def.graph_def()));

Looking at the strange Python restore function above:

    restore_op_name = metaGraph.as_saver_def().restore_op_name
    restore_op = tf.get_default_graph().get_operation_by_name(restore_op_name)
    filename_tensor_name = metaGraph.as_saver_def().filename_tensor_name

we can write the equivalent piece in C++:

    const std::string restore_op_name = graph_def.saver_def().restore_op_name();
    const std::string filename_tensor_name = graph_def.saver_def().filename_tensor_name();
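
The Run call below also needs a feed_dict that maps the saver's filename tensor to the checkpoint path. A minimal sketch of how it could be built (this part is not spelled out in the snippets above; it uses the TF 1.x C++ tensor API):

    // Sketch: wrap the checkpoint path in a scalar string tensor and feed it
    // to the saver's filename tensor (TF 1.x uses std::string here; newer
    // versions use tensorflow::tstring instead).
    tensorflow::Tensor checkpointPathTensor(tensorflow::DT_STRING,
                                            tensorflow::TensorShape());
    checkpointPathTensor.scalar<std::string>()() = checkpoint_fn;
    tensor_dict feed_dict = {{filename_tensor_name, checkpointPathTensor}};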

Having this in place, we just run the restore operation:

    TF_CHECK_OK(sess->Run(feed_dict,          // inputs
                          {},                 // output_tensor_names (we do not need them)
                          {restore_op_name},  // target_node_names
                          nullptr));          // outputs (there are no outputs this time)

Creating the feed_dict for the actual inference inputs is probably a post of its own, and this answer is already long enough; it only covers the most important parts. Again, I would like to point to a complete implementation in C99, C++, and Go without Bazel, which I tested for TF > v1.5. It is not that hard -- it just can get very long for the plain C version.
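
For the tiny MWE above, though, a rough sketch of the inference step could look like this (the tensor names 'input:0' and 'output:0' come from the MWE; treat this as an illustration rather than production code):

    // Sketch: feed a 1x2 float input and fetch the dense layer's output.
    tensorflow::Tensor input(tensorflow::DT_FLOAT, tensorflow::TensorShape({1, 2}));
    input.matrix<float>()(0, 0) = 1.0f;
    input.matrix<float>()(0, 1) = 2.0f;

    tensor_dict inference_feed = {{"input:0", input}};
    std::vector<tensorflow::Tensor> outputs;
    TF_CHECK_OK(sess->Run(inference_feed, {"output:0"}, {}, &outputs));
    std::cout << outputs[0].DebugString() << std::endl;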

