This is a minimal example of how to build a MediaPipe component that uses PyTorch for inference. Currently, this only supports building a native C++/desktop component; there is no support for mobile platforms or JS.
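For orientation, the heart of such a component is a MediaPipe calculator that loads a TorchScript module with libtorch in `Open()` and runs it on each packet in `Process()`. The sketch below is illustrative rather than this repository's actual code: the calculator name, the use of raw `torch::Tensor` packets, the tuple handling, and the status type (`absl::Status` in recent MediaPipe versions) are all assumptions.

```cpp
#include "mediapipe/framework/calculator_framework.h"
#include "torch/script.h"

namespace mediapipe {

// Illustrative calculator: receives a torch::Tensor packet, runs the
// TorchScript module on it, and emits the resulting tensor.
class PyTorchInferenceCalculator : public CalculatorBase {
 public:
  static absl::Status GetContract(CalculatorContract* cc) {
    cc->Inputs().Index(0).Set<torch::Tensor>();
    cc->Outputs().Index(0).Set<torch::Tensor>();
    return absl::OkStatus();
  }

  absl::Status Open(CalculatorContext* cc) override {
    // The model path is hard-coded, mirroring the example above.
    module_ = torch::jit::load("examples/yolov5s.torchscript.pt");
    module_.eval();
    return absl::OkStatus();
  }

  absl::Status Process(CalculatorContext* cc) override {
    torch::NoGradGuard no_grad;  // Inference only; skip autograd bookkeeping.
    const auto& input = cc->Inputs().Index(0).Get<torch::Tensor>();
    // forward() takes IValues; a single tensor converts implicitly.
    torch::jit::IValue result = module_.forward({input});
    // YOLOv5 TorchScript exports typically return a tuple whose first
    // element is the detection tensor (assumption; adjust per model).
    torch::Tensor output = result.isTuple()
        ? result.toTuple()->elements()[0].toTensor()
        : result.toTensor();
    cc->Outputs().Index(0).AddPacket(
        MakePacket<torch::Tensor>(std::move(output)).At(cc->InputTimestamp()));
    return absl::OkStatus();
  }

 private:
  torch::jit::script::Module module_;
};

REGISTER_CALCULATOR(PyTorchInferenceCalculator);

}  // namespace mediapipe
```

In a graph config, such a node would then be referenced by its registered name (here, the hypothetical `PyTorchInferenceCalculator`).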
- To build the example application, run:

      bazel build -c opt --define MEDIAPIPE_DISABLE_GPU=1 examples:pytorch_inference
- Download the model file from here to `examples/yolov5s.torchscript.pt`. The path to the model is currently hard-coded. Any TorchScript model with the same input-output interface should work, though; see the standalone load check after this list.
- To launch the application, run:

      GLOG_logtostderr=1 bazel-bin/examples/pytorch_inference
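To check whether a different TorchScript file exposes a compatible interface, it can help to load it with libtorch directly and run a dummy forward pass before wiring it into the graph. A minimal sketch, assuming the standard YOLOv5s export; the 1x3x640x640 float input shape and the claim that the export returns a tuple are assumptions to verify against your own export:

```cpp
#include <iostream>

#include "torch/script.h"

int main(int argc, char** argv) {
  // Default to the hard-coded example path; pass another file to test it.
  const char* path = argc > 1 ? argv[1] : "examples/yolov5s.torchscript.pt";
  torch::jit::script::Module module = torch::jit::load(path);
  module.eval();

  torch::NoGradGuard no_grad;
  // Standard YOLOv5 exports take a 1x3x640x640 float image tensor (assumption).
  torch::Tensor input = torch::zeros({1, 3, 640, 640});
  torch::jit::IValue out = module.forward({input});

  // YOLOv5 TorchScript exports typically return a tuple of tensors.
  std::cout << "output kind: " << out.tagKind() << std::endl;
  return 0;
}
```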
- [ ] Expand support to Android and iOS. The main issue is that, at the time of this writing, libtorch's Bazel manifest only supports desktop builds; mobile Bazel builds would require wrapping the CMake-based mobile build.