multi-threading with a Keras model / what is model._make_predict_function() used for?

When predict() is called from multiple threads, Keras often raises an exception: Tensor *** is not an element of this graph.

There are two ways to solve this problem:

1) call model._make_predict_function() right after loading the model

_make_predict_function() is normally called only on the first call to predict(). I believe this is a flaw in Keras's design: that code is lazy and not thread-safe. That's why I need to call this function before starting the threads.

It is used in conjunction with:

    self.default_graph.finalize()  # avoid further modifications to the graph

I don't want the model to be modified while the computation is running.

2) As noted in 1), _make_predict_function() is called on the first call to predict(), so we can simply call predict() once on dummy data right after loading the model.
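The warm-up variant can be sketched like this (again assuming Keras 2.x on TF 1.x; the input shape and dummy data are assumptions, and a tiny in-memory model stands in for `load_model`):

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Stand-in for keras.models.load_model("..."): a tiny dense model.
model = Sequential([Dense(1, input_shape=(4,))])

# One dummy predict() in the loading thread forces Keras to build its
# internal predict function now, so worker threads never trigger it.
warm = model.predict(np.zeros((1, 4)))
print(warm.shape)  # (1, 1)
```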

 

refer: https://github.com/jaara/AI-blog/issues/2

 

I am using Flask and ran into the same problem.
The solution in #2397 (comment) works for me.

Details: TypeError: Cannot interpret feed_dict key as Tensor
In the global (main) thread:

    self.model = load_model(model_path)
    self.model._make_predict_function()
    self.graph = tf.get_default_graph()

In another thread:

    with self.graph.as_default():
        labels = self.model.predict(data)

refer: https://github.com/keras-team/keras/issues/5640


https://github.com/keras-team/keras/issues/2397

problem:

ValueError: Tensor Tensor("Sigmoid_2:0", shape=(?, 17), dtype=float32) is not an element of this graph.

I had this problem when doing inference in a different thread from the one where I loaded my model. Here's how I fixed it:

Right after loading or constructing your model, save the TensorFlow graph:

graph = tf.get_default_graph()

In the other thread (or perhaps in an asynchronous event handler), do:

global graph
with graph.as_default():
    (... do inference here ...)

I learned about this from https://www.tensorflow.org/versions/r0.11/api_docs/python/framework.html#get_default_graph

 


Copyright notice: this is an original article by xiewenbo, licensed under the CC 4.0 BY-SA license. Please include a link to the original source and this notice when reposting.