GestureRecognizer for building a hand gesture recognizer model.
mediapipe_model_maker.gesture_recognizer.GestureRecognizer(
    label_names: List[str],
    model_options: mediapipe_model_maker.gesture_recognizer.ModelOptions,
    hparams: mediapipe_model_maker.gesture_recognizer.HParams
)
| Args | |
| --- | --- |
| `label_names` | A list of label names for the classes. |
| `model_options` | Options to create the gesture recognizer model. |
| `hparams` | The hyperparameters for training the hand gesture recognizer model. |
| Attributes | |
| --- | --- |
| `embedding_size` | Size of the input gesture embedding vector. |
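The constructor is rarely called directly; training workflows typically go through the `create` classmethod below. As a minimal sketch, direct construction might look like the following, where the option field values and label names are illustrative assumptions:

```python
from mediapipe_model_maker import gesture_recognizer

# Assumed illustrative values; see ModelOptions and HParams for the actual fields.
model_options = gesture_recognizer.ModelOptions(dropout_rate=0.05)
hparams = gesture_recognizer.HParams(export_dir="exported_model", epochs=30)

recognizer = gesture_recognizer.GestureRecognizer(
    label_names=["none", "thumbs_up", "thumbs_down"],  # example class labels
    model_options=model_options,
    hparams=hparams,
)
```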
Methods
create
@classmethod
create(
    train_data: mediapipe_model_maker.face_stylizer.dataset.classification_dataset.ClassificationDataset,
    validation_data: mediapipe_model_maker.face_stylizer.dataset.classification_dataset.ClassificationDataset,
    options: mediapipe_model_maker.gesture_recognizer.GestureRecognizerOptions
) -> 'GestureRecognizer'
Creates and trains a hand gesture recognizer with input datasets.
If a checkpoint file exists in the {options.hparams.export_dir}/checkpoint/ directory, the training process loads the weights from the checkpoint file to continue training.
| Args | |
| --- | --- |
| `train_data` | Training data. |
| `validation_data` | Validation data. |
| `options` | Options for creating and training the gesture recognizer model. |

| Returns |
| --- |
| An instance of `GestureRecognizer`. |
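A typical end-to-end call, following the pattern from the Model Maker gesture recognizer guide. The dataset directory name is a placeholder; the data is expected to be a folder of per-label image subfolders loaded via `Dataset.from_folder`:

```python
from mediapipe_model_maker import gesture_recognizer

# Load a folder-structured dataset ("hand_gesture_dataset" is a placeholder path)
# and split it into train/validation/test partitions.
data = gesture_recognizer.Dataset.from_folder(
    dirname="hand_gesture_dataset",
    hparams=gesture_recognizer.HandDataPreprocessingParams(),
)
train_data, rest_data = data.split(0.8)
validation_data, test_data = rest_data.split(0.5)

# Train; checkpoints are written to {export_dir}/checkpoint/ and reloaded on rerun.
options = gesture_recognizer.GestureRecognizerOptions(
    hparams=gesture_recognizer.HParams(export_dir="exported_model")
)
model = gesture_recognizer.GestureRecognizer.create(
    train_data=train_data,
    validation_data=validation_data,
    options=options,
)
```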
evaluate
evaluate(
    data: mediapipe_model_maker.model_util.dataset.Dataset,
    batch_size: int = 32
) -> Any
Evaluates the classifier with the provided evaluation dataset.
| Args | |
| --- | --- |
| `data` | Evaluation dataset. |
| `batch_size` | Number of samples per evaluation step. |

| Returns |
| --- |
| The loss value and accuracy. |
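Continuing the training sketch above, evaluation on the held-out split might look like:

```python
# `model` and `test_data` come from the create() example above.
loss, accuracy = model.evaluate(test_data, batch_size=1)
print(f"Test loss: {loss:.4f}, test accuracy: {accuracy:.4f}")
```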
export_labels
export_labels(
    export_dir: str,
    label_filename: str = 'labels.txt'
)
Exports classification labels into a label file.
| Args | |
| --- | --- |
| `export_dir` | The directory to save exported files. |
| `label_filename` | File name to save the labels to. The full export path is {export_dir}/{label_filename}. |
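For example, to write the label file next to the other exported artifacts (directory name as used in the sketches above):

```python
# Writes exported_model/labels.txt, then exported_model/gesture_labels.txt.
model.export_labels(export_dir="exported_model")
model.export_labels(export_dir="exported_model", label_filename="gesture_labels.txt")
```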
export_model
export_model(
    model_name: str = 'gesture_recognizer.task'
)
Converts the model to TFLite and exports as a model bundle file.
Saves a model bundle file and a metadata JSON file to hparams.export_dir. The resulting model bundle file contains the necessary models for hand detection, canned gesture classification, and customized gesture classification. Only the model bundle file is needed for the downstream gesture recognition task; the metadata.json file is saved only to document the contents of the model bundle.
The customized gesture model is exported in float precision without quantization; the model is lightweight, so there is no need to trade accuracy for efficiency through quantization. The default score threshold is set to 0.5, since it can be adjusted at inference time.
| Args | |
| --- | --- |
| `model_name` | File name to save the model bundle file. The full export path is {export_dir}/{model_name}. |
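Continuing the sketch, exporting with the default and then a custom bundle name:

```python
# Writes {export_dir}/gesture_recognizer.task and a metadata.json alongside it.
model.export_model()

# Same export under a custom bundle file name.
model.export_model(model_name="my_gestures.task")
```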
export_tflite
export_tflite(
    export_dir: str,
    tflite_filename: str = 'model.tflite',
    quantization_config: Optional[mediapipe_model_maker.quantization.QuantizationConfig] = None,
    preprocess: Optional[Callable[..., bool]] = None
)
Converts the model to requested formats.
| Args | |
| --- | --- |
| `export_dir` | The directory to save exported files. |
| `tflite_filename` | File name to save the TFLite model. The full export path is {export_dir}/{tflite_filename}. |
| `quantization_config` | The configuration for model quantization. |
| `preprocess` | A callable to preprocess the representative dataset for quantization. The callable takes three arguments in order: feature, label, and is_training. |
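As a sketch, exporting a float16-quantized TFLite model; `QuantizationConfig.for_float16()` is assumed to be available in `mediapipe_model_maker.quantization`, mirroring the TFLite Model Maker quantization helpers:

```python
from mediapipe_model_maker import quantization

# Assumed factory method for a float16 post-training quantization config.
config = quantization.QuantizationConfig.for_float16()
model.export_tflite(
    export_dir="exported_model",
    tflite_filename="model_fp16.tflite",
    quantization_config=config,
)
```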
summary
summary()
Prints a summary of the model.