# mediapipe-pytorch

This is a minimal example of how to build a MediaPipe component that uses PyTorch for inference. Currently, this only supports building a native C++/desktop component -- there is no support for mobile platforms or JS.

## Setup (TBC)

1. To build the example application, run:

   ```sh
   bazel build -c opt --define MEDIAPIPE_DISABLE_GPU=1 examples:pytorch_inference
   ```

2. Download the model file from here to `examples/yolov5s.torchscript.pt`. The path to the model is currently hard-coded, but any TorchScript model with the same input-output interface should work.

3. To launch the application, run:

   ```sh
   GLOG_logtostderr=1 bazel-bin/examples/pytorch_inference
   ```
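Since any TorchScript model with the same input-output interface can be swapped in, a minimal sketch (assuming PyTorch is installed, and using a hypothetical stand-in module rather than the real yolov5s) of scripting and saving a compatible model looks like this:

```python
import torch

# Hypothetical stand-in for yolov5s: any nn.Module scripted to TorchScript
# with a compatible input-output interface could be dropped in instead.
class TinyDetector(torch.nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # yolov5s takes an image batch (N, 3, H, W); this stub just
        # returns a dummy detection tensor of shape (N, 1, 6).
        n = x.shape[0]
        return torch.zeros(n, 1, 6)

# Script the module and save it where the example expects the model.
scripted = torch.jit.script(TinyDetector())
scripted.save("yolov5s.torchscript.pt")

# Sanity check: reload and run on a dummy 640x640 image batch.
model = torch.jit.load("yolov5s.torchscript.pt")
out = model(torch.zeros(1, 3, 640, 640))
print(out.shape)  # torch.Size([1, 1, 6])
```

The file would need to be placed at `examples/yolov5s.torchscript.pt` to match the hard-coded path.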

## TODO

- [ ] Expand support to Android or iOS. The main issue is that, at the time of writing, libtorch's Bazel manifest only supports desktop builds; mobile Bazel builds would require wrapping the CMake-based mobile build.
