Hand-Gesture-Controlled-Mobile-Robot

Control a mobile robot on a 2D map using hand gestures with the MediaPipe library, replacing traditional WASD keyboard inputs.

Watch Our Demo

demo-control-mobile-robot.mp4

Project Setup and Running Guide

Prerequisites

  • Ubuntu 20.04
  • ROS Noetic
  • MediaPipe
  • Stage ROS

Installation

1. Setting Up ROS Noetic

Follow the official ROS installation instructions for Ubuntu 20.04. Make sure to configure your environment by sourcing the ROS setup script:

echo "source /opt/ros/noetic/setup.bash" >> ~/.bashrc
source ~/.bashrc

2. Creating a ROS Workspace

Create a new ROS workspace to host the packages:

mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/
catkin_make

3. Cloning and Building the Project

Clone this repository into the src directory of your workspace:

cd ~/catkin_ws/src
git clone <repository-url> hand_gesture_control
cd ..
catkin_make
source devel/setup.bash

Running the Project

To run the project, open three terminals and execute the following commands in sequence:

Terminal 1: Start ROS Core

Start the ROS master node:

roscore

Terminal 2: Launch Stage ROS

In a new terminal, start Stage ROS with one of its predefined simulation worlds:

rosrun stage_ros stageros $(rospack find stage_ros)/world/willow-erratic.world

This command opens a 2D simulation world where the robot will navigate.
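
As a quick sanity check before adding gesture control, you can drive the simulated robot directly from a short rospy script. This is only a minimal sketch; it assumes the single-robot world exposes the default /cmd_vel topic that stageros subscribes to.

#!/usr/bin/env python3
# sanity_check.py -- drive the Stage robot forward briefly (assumes /cmd_vel)
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('sanity_check')
pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
rate = rospy.Rate(10)            # publish commands at 10 Hz
cmd = Twist()
cmd.linear.x = 0.3               # move forward at 0.3 m/s
start = rospy.Time.now()
while not rospy.is_shutdown() and rospy.Time.now() - start < rospy.Duration(3.0):
    pub.publish(cmd)
    rate.sleep()
pub.publish(Twist())             # send zero velocity to stop the robot

If the robot in the Stage window moves forward for a few seconds and stops, the simulation is wired up correctly.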

Terminal 3: Launch Hand Gesture Control

Finally, in a third terminal, launch the hand gesture control nodes:

roslaunch hand_gesture_control hand_gesture_control_launch.launch

This command starts the nodes necessary for hand gesture recognition and robot control.

Usage

After launching all components:

  • The Stage ROS window displays the robot in a 2D environment.
  • The MediaPipe node processes hand gestures and translates them into robot movement commands (an illustrative sketch of this idea follows below).
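
The actual gesture-to-command mapping lives in the nodes started by the launch file. The following is only an illustrative sketch of the general idea, not the repository's implementation: it counts extended fingers with MediaPipe Hands and publishes a corresponding geometry_msgs/Twist. The topic name /cmd_vel, the script name, and the specific gesture mapping are assumptions made for illustration.

#!/usr/bin/env python3
# gesture_teleop_sketch.py -- illustrative only, not the package's actual node
import cv2
import rospy
import mediapipe as mp
from geometry_msgs.msg import Twist

rospy.init_node('gesture_teleop_sketch')
pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)   # assumed topic name
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)

def count_fingers(lm):
    # Crude finger count: a fingertip is "up" if it lies above its PIP joint
    # in image coordinates (y grows downward). The thumb is ignored.
    tips, pips = [8, 12, 16, 20], [6, 10, 14, 18]
    return sum(lm[t].y < lm[p].y for t, p in zip(tips, pips))

while not rospy.is_shutdown():
    ok, frame = cap.read()
    if not ok:
        continue
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    cmd = Twist()                                   # default: stop
    if result.multi_hand_landmarks:
        fingers = count_fingers(result.multi_hand_landmarks[0].landmark)
        if fingers >= 4:        # open palm -> drive forward
            cmd.linear.x = 0.5
        elif fingers == 2:      # two fingers -> turn left
            cmd.angular.z = 0.5
        elif fingers == 1:      # one finger -> turn right
            cmd.angular.z = -0.5
        # fist (no extended fingers) -> stop
    pub.publish(cmd)

cap.release()

The point of the sketch is simply to show how hand landmarks can be reduced to Twist commands that replace WASD keyboard input; the launched package nodes handle this pipeline for you.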

Contributions

We welcome contributions! Please read our contributing guidelines to learn about our review process, coding conventions, and more. Contributions can be made via pull requests to the repository.
