Based on the issue above, I was able to save the model in pb (SavedModel) format; previously we had saved it in h5 format (a Keras model).
I am trying to serve the model with TensorFlow Serving, but the SavedModel has no signature definition, so I can't run inference from the client.
Is there another way to do inference?
saved_model_cli show --dir /home/dsdev/gitrepo/ReferringRelationships/model/1/ --all
MetaGraphDef with tag-set: '' contains the following SignatureDefs:
Traceback (most recent call last):
File "/data/anaconda/envs/py35/bin/saved_model_cli", line 11, in <module>
sys.exit(main())
File "/data/anaconda/envs/py35/lib/python3.5/site-packages/tensorflow/python/tools/saved_model_cli.py", line 910, in main
args.func(args)
File "/data/anaconda/envs/py35/lib/python3.5/site-packages/tensorflow/python/tools/saved_model_cli.py", line 611, in show
_show_all(args.dir)
File "/data/anaconda/envs/py35/lib/python3.5/site-packages/tensorflow/python/tools/saved_model_cli.py", line 199, in _show_all
signature_def_map = get_signature_def_map(saved_model_dir, tag_set)
File "/data/anaconda/envs/py35/lib/python3.5/site-packages/tensorflow/python/tools/saved_model_cli.py", line 243, in get_signature_def_map
meta_graph = saved_model_utils.get_meta_graph_def(saved_model_dir, tag_set)
File "/data/anaconda/envs/py35/lib/python3.5/site-packages/tensorflow/python/tools/saved_model_utils.py", line 49, in get_meta_graph_def
' could not be found in SavedModel')
RuntimeError: MetaGraphDef associated with tag-set could not be found in SavedModel
and
dsdev@dsdev:/mnt/data/dataset-visualgenome/train$ tensorflow_model_server --port=8500 --rest_api_port=8501 --model_name=refrel --model_base_path=/home/dsdev/gitrepo/ReferringRelationships/model/
2019-11-12 07:09:14.359747: I tensorflow_serving/model_servers/server.cc:82] Building single TensorFlow model file config: model_name: refrel model_base_path: /home/dsdev/gitrepo/ReferringRelationships/model/
2019-11-12 07:09:14.359974: I tensorflow_serving/model_servers/server_core.cc:461] Adding/updating models.
2019-11-12 07:09:14.359995: I tensorflow_serving/model_servers/server_core.cc:558] (Re-)adding model: refrel
2019-11-12 07:09:14.460213: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: refrel version: 1}
2019-11-12 07:09:14.460243: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: refrel version: 1}
2019-11-12 07:09:14.460261: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: refrel version: 1}
2019-11-12 07:09:14.460283: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /home/dsdev/gitrepo/ReferringRelationships/model/1
2019-11-12 07:09:14.460298: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /home/dsdev/gitrepo/ReferringRelationships/model/1
2019-11-12 07:09:14.533650: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
2019-11-12 07:09:14.538127: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:259] SavedModel load for tags { serve }; Status: fail. Took 77816 microseconds.
2019-11-12 07:09:14.538160: E tensorflow_serving/util/retrier.cc:37] Loading servable: {name: refrel version: 1} failed: Not found: Could not find meta graph def matching supplied tags: { serve }. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: `saved_model_cli`