MediaPipe: Video-Touch edition [Accepted to SIGGRAPH Asia 2020 Emerging Technologies]

MediaPipe fork with hand gesture recognition and message passing to other software.

Getting started

This code was tested on macOS Big Sur 11.4 with an 8th/9th-gen Intel Core i7 CPU. It should also work on Linux if you build libzmq.a yourself (see step 3 of the next section).

Installation

  1. git clone https://github.com/Arxtage/videotouch.github.io
  2. Follow the official MediaPipe installation instructions;
  3. To use the ZeroMQ message-passing mechanism, you need to build libzmq.a: follow the Build instructions section of the cppzmq repo (at the end of its main README.md).

Usage

  1. Build the hand tracking desktop CPU example:
bazel build -c opt --define MEDIAPIPE_DISABLE_GPU=1 mediapipe/examples/desktop/hand_tracking:hand_tracking_cpu
  2. Run it:
GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/hand_tracking/hand_tracking_cpu
  3. By default, the example sends hand tracking data via ZeroMQ to a server. You can use zmq_server_demo.py to check the full pipeline:
python mediapipe/examples/desktop/hand_tracking/zmq_server_demo.py
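To give a flavor of the Replier side of the pipeline, here is a minimal Python REP server sketch in the spirit of zmq_server_demo.py. The endpoint, message format, and function name are illustrative assumptions, not the actual demo code:

```python
import zmq  # pyzmq


def run_server(endpoint="tcp://127.0.0.1:5557", n_messages=None):
    """Reply (REP) side: receive hand-tracking messages and acknowledge each.

    n_messages caps how many request/reply cycles to serve (None = forever);
    the real demo script may behave differently.
    """
    ctx = zmq.Context()
    sock = ctx.socket(zmq.REP)
    sock.bind(endpoint)
    try:
        served = 0
        while n_messages is None or served < n_messages:
            msg = sock.recv_string()        # blocks until the requester sends
            print("received:", msg)
            sock.send_string("ack:" + msg)  # REP must reply before next recv
            served += 1
    finally:
        sock.close(linger=0)
        ctx.term()
```

Run it in one terminal, then start the hand tracking binary in another; each tracked frame's message should be printed and acknowledged.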

Description

We made two key modifications to the original version:

Hand gesture recognition

We added the hand_tracking_cpu_main to make the system recognize hand gestures in real time. To make this work, we employed hand gesture recognition calculators and made changes to the original .pbtxt graphs (see the latest commits).

Currently there are two versions of the hand gesture calculator:

  1. HandGestureCalculator: rule-based hand gesture recognition. Inspired by the code from the TheJLifeX repo.

  2. HandGestureCalculatorNN: neural network-based gesture recognition.

By default, HandGestureCalculator is used. Feel free to modify the hand_landmark_cpu.pbtxt graph to change the gesture calculator.
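To illustrate the kind of logic a rule-based calculator like HandGestureCalculator applies, here is a hedged Python sketch, not the C++ calculator itself. Landmark indices follow MediaPipe's 21-point hand model; the "tip above PIP" rule and the gesture names are simplifying assumptions (e.g. it presumes a roughly upright hand and ignores the thumb):

```python
# MediaPipe 21-point hand model: 0 = wrist, 8 = index fingertip,
# 6 = index PIP joint, and so on for the other fingers.
FINGER_TIPS = {"index": 8, "middle": 12, "ring": 16, "pinky": 20}
FINGER_PIPS = {"index": 6, "middle": 10, "ring": 14, "pinky": 18}


def extended_fingers(landmarks):
    """landmarks: list of 21 (x, y) pairs in image coords (y grows downward).

    A finger counts as extended when its tip sits above its PIP joint --
    a simplification that assumes an upright hand. The thumb is ignored.
    """
    return {name for name in FINGER_TIPS
            if landmarks[FINGER_TIPS[name]][1] < landmarks[FINGER_PIPS[name]][1]}


def classify(landmarks):
    """Map a set of extended fingers to a (hypothetical) gesture label."""
    up = extended_fingers(landmarks)
    if len(up) == 4:
        return "OPEN_HAND"
    if not up:
        return "FIST"
    if up == {"index"}:
        return "POINTING"
    return "UNKNOWN"
```

A neural-network calculator like HandGestureCalculatorNN replaces such hand-written rules with a classifier trained on labeled landmark data.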

We used the Jesture AI SDK (python/annotation.py) to collect the data for neural network training.

ZeroMQ message passing

ZeroMQ is a tool for message passing between processes. It allows, for example, a binary compiled from C++ to communicate with a Python script. In our code, hand_tracking_cpu_main acts as the Requester and zmq_server_demo.py as the Replier (see the REQ-REP strategy).

To wire all of this up, we used the cppzmq header files (see the examples/desktop/hand_tracking dir).
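For testing a Replier without building the C++ binary, a Python stand-in for the Requester side can be sketched as follows (the endpoint, timeout, and function name are assumptions for illustration):

```python
import zmq  # pyzmq


def send_gesture(message, endpoint="tcp://127.0.0.1:5557", timeout_ms=5000):
    """Request (REQ) side: send one message and block for the reply.

    Mirrors the REQ-REP strategy used between hand_tracking_cpu_main and
    zmq_server_demo.py; raises zmq.Again if no reply arrives in time.
    """
    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.REQ)
    sock.setsockopt(zmq.RCVTIMEO, timeout_ms)  # don't block forever on recv
    sock.setsockopt(zmq.LINGER, 0)             # drop unsent messages on close
    sock.connect(endpoint)
    try:
        sock.send_string(message)
        return sock.recv_string()
    finally:
        sock.close()
```

In strict REQ-REP, each send must be followed by a receive before the next send, which is why the function performs exactly one round trip per call.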

Citation

If you find this code useful, please cite our original SIGGRAPH Asia 2020 publication:

@inproceedings{
    10.1145/3415255.3422892, 
    author = {Zakharkin, Ilya and Tsaturyan, Arman and Cabrera, Miguel Altamirano and Tirado, Jonathan and Tsetserukou, Dzmitry}, 
    title = {ZoomTouch: Multi-User Remote Robot Control in Zoom by DNN-Based Gesture Recognition}, 
    year = {2020}, 
    publisher = {Association for Computing Machinery}, 
    url = {https://doi.org/10.1145/3415255.3422892}, 
    doi = {10.1145/3415255.3422892}, 
    booktitle = {SIGGRAPH Asia 2020 Emerging Technologies}, 
    keywords = {Robotics, Hand Tracking, Gesture Recognition, Teleoperation}, 
    location = {Virtual Event, Republic of Korea}, 
    series = {SA '20} 
}

We thank TheJLifeX for the ideas on how to implement the rule-based gesture recognition calculator.

License

This repo is open-sourced under the Apache License 2.0. Please also see the original MediaPipe license.