A runner to do inference with the TFLite model.
```python
mediapipe_model_maker.model_util.LiteRunner(
    tflite_model: bytearray
)
```
| Args | |
|---|---|
| `tflite_model` | A valid flatbuffer representing the TFLite model. |
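A minimal usage sketch: the constructor takes the raw TFLite flatbuffer as bytes, so one way to build a runner is to read an exported `.tflite` file from disk. The file path and the top-level `from mediapipe_model_maker import model_util` import are assumptions for illustration; adjust them to your project layout.

```python
from mediapipe_model_maker import model_util

# Hypothetical path to a TFLite flatbuffer exported elsewhere.
with open('model.tflite', 'rb') as f:
    tflite_model = bytearray(f.read())

# Construct the runner from the raw flatbuffer bytes.
lite_runner = model_util.LiteRunner(tflite_model)
```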
Methods
run
```python
run(
    input_tensors: Union[List[tf.Tensor], Dict[str, tf.Tensor]]
) -> Union[List[tf.Tensor], tf.Tensor]
```
Runs inference with the TFLite model.
| Args | |
|---|---|
| `input_tensors` | A list or dict of input tensors for the TFLite model. If a list is given, the order must match the Keras model's inputs. A single tensor may also be passed directly if the model has only one input. |
| Returns | |
|---|---|
| A list of output tensors for multi-output models, or a single output tensor otherwise. The order matches the Keras model's outputs. |
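A short sketch of calling `run`, assuming the `lite_runner` built above and a single-input image model; the `[1, 224, 224, 3]` input shape is a placeholder, so replace it with your model's actual input signature.

```python
import tensorflow as tf

# Hypothetical input for a single-input model; adjust the shape and dtype
# to match the model's expected input.
single_input = tf.random.uniform(shape=[1, 224, 224, 3], dtype=tf.float32)

# A single-input model accepts the tensor directly...
output = lite_runner.run(single_input)

# ...or the same tensor wrapped in a list, ordered like the Keras model's inputs.
outputs = lite_runner.run([single_input])
```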