

Note: The following sections assume you have both installed TensorFlow 2.x and trained your models in TensorFlow 2.x. If you only have model checkpoints, first convert them to a frozen GraphDef file and then use this API.

The following example shows how to convert a SavedModel into a TensorFlow Lite model:

```python
import tensorflow as tf

# Convert the model.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)  # path to the SavedModel directory
tflite_model = converter.convert()
```

The following example shows how to convert a Keras model into a TensorFlow Lite model:

```python
import tensorflow as tf

# Create a model using high-level tf.keras.* APIs.
model = tf.keras.models.Sequential([
    tf.keras.layers.Dense(units=1, input_shape=[1]),
    tf.keras.layers.Dense(units=16, activation='relu'),
    tf.keras.layers.Dense(units=1),
])
model.compile(optimizer='sgd', loss='mean_squared_error')  # compile the model
model.fit(x=[-1, 0, 1], y=[-3, -1, 1], epochs=5)  # train the model
# (to generate a SavedModel) tf.saved_model.save(model, "saved_model_keras_dir")

# Convert the model.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
```

The following example shows how to convert concrete functions into a TensorFlow Lite model:

```python
import tensorflow as tf

# Create a model using low-level tf.* APIs.
class Squared(tf.Module):
  @tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.float32)])
  def __call__(self, x):
    return tf.square(x)

model = Squared()
# (to run your model) result = model(tf.constant([5.0]))  # This prints "[25.0]"
# (to generate a SavedModel) tf.saved_model.save(model, "saved_model_tf_dir")
concrete_func = model.__call__.get_concrete_function()

# Convert the model.
# Note that for versions earlier than TensorFlow 2.7, the
# from_concrete_functions API accepted only the first argument:
# > converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func], model)
tflite_model = converter.convert()
```

After converting your model, you can also:

* Apply optimizations such as post-training quantization, which can further reduce your model latency and size with minimal loss in accuracy.
* Add metadata, which makes it easier to create platform-specific wrapper code when deploying models on devices.
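The conversion examples stop once `tflite_model` bytes are produced; the text does not show how to execute the result. One way to sanity-check a converted model is to run it in-process with `tf.lite.Interpreter`, which ships with TensorFlow itself. A minimal sketch, rebuilding the small `Squared` model from the concrete-function example (the two-argument `from_concrete_functions` call assumes TensorFlow 2.7 or later):

```python
import numpy as np
import tensorflow as tf

# Rebuild and convert the small Squared model from the concrete-function example.
class Squared(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.float32)])
    def __call__(self, x):
        return tf.square(x)

model = Squared()
concrete_func = model.__call__.get_concrete_function()
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func], model)
tflite_model = converter.convert()

# Run the converted bytes with the TFLite interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_index = interpreter.get_input_details()[0]['index']
output_index = interpreter.get_output_details()[0]['index']
interpreter.set_tensor(input_index, np.array([5.0], dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(output_index)
print(result)
```

Loading from `model_content` avoids writing the `.tflite` file to disk first; in a deployment you would pass `model_path` instead.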

