
Platform For AI:Export TensorFlow models in the SavedModel format

Last Updated:Feb 26, 2024

This topic describes how to export TensorFlow models in the SavedModel format.

SavedModel format

Before you use the built-in official processor in Elastic Algorithm Service (EAS) of Machine Learning Platform for AI (PAI) to deploy a TensorFlow model service online, you must export the model in the SavedModel format. The SavedModel format is defined and recommended by TensorFlow. The SavedModel format uses the following directory structure:

assets/
variables/
    variables.data-00000-of-00001
    variables.index
saved_model.pb|saved_model.pbtxt

The directory structure includes the following items:

  • The assets directory is optional. It stores supporting files that are used by the prediction service.

  • The variables directory stores the variables that are saved by calling the tf.train.Saver method.

  • The saved_model.pb or saved_model.pbtxt file stores the MetaGraphDef and SignatureDef. MetaGraphDef describes the computation graph and training logic of the model, and SignatureDef specifies the inputs and outputs of the model service.

Export models in the SavedModel format

For more information about how to use TensorFlow to export models in the SavedModel format, visit Saving and Restoring. If the model is simple, you can use the following method to export it in the SavedModel format:

tf.saved_model.simple_save(
  session,
  "./savedmodel/",
  inputs={"image": x},   # x specifies the input variables of the model.
  outputs={"scores": y}  # y specifies the output variables of the model.
)

When you call the online prediction service, you must set the signature_name parameter for the model in the request. If the model is exported by calling the simple_save() method, the default value of the signature_name parameter is serving_default.

If the model is complex, you can manually export it in the SavedModel format, as shown in the following sample code:

import tensorflow as tf
# x and y are the input and output tensors of the trained model, and
# sess is the tf.Session that holds the trained variables.
print('Exporting trained model to', export_path)
builder = tf.saved_model.builder.SavedModelBuilder(export_path)
tensor_info_x = tf.saved_model.utils.build_tensor_info(x)
tensor_info_y = tf.saved_model.utils.build_tensor_info(y)

prediction_signature = (
    tf.saved_model.signature_def_utils.build_signature_def(
        inputs={'images': tensor_info_x},
        outputs={'scores': tensor_info_y},
        method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME)
)

legacy_init_op = tf.group(tf.tables_initializer(), name='legacy_init_op')

builder.add_meta_graph_and_variables(
    sess, [tf.saved_model.tag_constants.SERVING],
    signature_def_map={
        'predict_images': prediction_signature,
    },
    legacy_init_op=legacy_init_op
)

builder.save()
print('Done exporting!')

Take note of the following parameter descriptions:

  • The export_path parameter specifies the path to which the model is exported.

  • The prediction_signature parameter indicates the SignatureDef that specifies the input and output of the model. For more information, visit SignatureDef. In this example, the signature_name parameter is set to predict_images.

  • The builder.add_meta_graph_and_variables method adds the graph, the variables of the session, the serving tag, and the signature map to the builder for export.

Note
  • When you export the model required for prediction, you must specify tf.saved_model.tag_constants.SERVING as the tag of the exported model.

  • For more information about TensorFlow models, visit TensorFlow SavedModel.

Convert a Keras model to the SavedModel format

You can call the model.save() method of Keras to save a Keras model in the H5 format. However, the model must be converted to the SavedModel format for online prediction. To convert the format, call the load_model() method to load the H5 model, and then export the loaded model in the SavedModel format, as shown in the following sample code:

import tensorflow as tf
with tf.device("/cpu:0"):
    model = tf.keras.models.load_model('./mnist.h5')
    tf.saved_model.simple_save(
      tf.keras.backend.get_session(),
      "./h5_savedmodel/",
      inputs={"image": model.input},
      outputs={"scores": model.output}
    )

Convert a Checkpoint model to the SavedModel format

When you call the tf.train.Saver() method during model training, the model is saved in the Checkpoint format. You must convert the model to the SavedModel format for online prediction. To do so, call the saver.restore() method to load the Checkpoint model into a tf.Session, and then export the model in the SavedModel format, as shown in the following sample code:

import tensorflow as tf
# Define the model variables here ...
saver = tf.train.Saver()
with tf.Session() as sess:
    # Restore the variables from the checkpoint. No initializer is needed.
    saver.restore(sess, "./lr_model/model.ckpt")
    # simple_save() expects the input and output tensors themselves,
    # not TensorInfo objects built by build_tensor_info().
    tf.saved_model.simple_save(
        sess,
        "./savedmodel/",
        inputs={"image": x},
        outputs={"scores": y}
    )