Jetbot Tools with Jetson Inference DNN Vision Library for NAV2 ROS2 Robot

Jetbot tools is a set of ROS2 nodes that utilize the Jetson inference DNN vision library for NVIDIA Jetson. With Jetbot tools, you can build your own low-cost 2-wheel robot with a camera and a lidar sensor and make it do the following amazing things:

  • Lidar-assisted object avoidance self-driving: Your robot can navigate autonomously and avoid obstacles using the lidar sensor.
  • Real-time object detection and tracking: Your robot can detect objects using the SSD Mobilenet V2 model. You can also make your robot follow a specific object that it detects.
  • Real-time object detection and distance measurement: Your robot can detect and measure the distance of objects using the SSD Mobilenet V2 model and the lidar sensor. You can also make your robot follow a specific object that it detects and stop when it is too close to the object.
  • NAV2 TF2 position tracking and following: Your robot can track its own position and follow another Jetbot robot using the NAV2 TF2 framework.
  • Empower your robot with voice control: Unleash the power of voice control for your ROS2 robot with Jetbot Voice-to-Action Tools.

Here is a brief overview of the Jetbot tools design/architecture:

Jetbot tools source code and video demos:


  • Lidar-assisted object avoidance self-driving:
    • Code logic explanation (a minimal Python sketch follows the usage commands below):
      • Use the lidar sensor to collect distance readings from all directions and divide them into 12 segments of 30 degrees each
      • Compare the farthest readings in the front three segments (front 90 degrees) and select the segment with the most open area
      • If even the selected segment is closer than the threshold distance:
        • Repeat the comparison over the front six segments (front 180 degrees) and select the segment with the most open area
        • If the selected segment is still closer than the threshold distance:
          • Repeat the comparison over all 12 segments (360 degrees) and select the segment with the most open area
          • Rotate the robot to face the selected segment
      • Publish a ROS2 Twist message to move the robot toward the open area
    • Source code:
    • Usage:
      • ros2 launch jetbot_tools laser_avoidance.launch.py param_file:=./jetbot_tools/param/laser_avoidance_params.yaml
      • ros2 param get /laser_avoidance start
      • ros2 param set /laser_avoidance start true
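
To make the segment-scan logic above concrete, here is a minimal rclpy sketch, not the actual jetbot_tools implementation: the `scan`/`cmd_vel` topic names, the forward-facing segment-0 assumption, and the 0.5 m threshold and speeds are illustrative assumptions.

```python
import math

import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
from sensor_msgs.msg import LaserScan


class LaserAvoidanceSketch(Node):
    SEGMENTS = 12       # 12 segments x 30 degrees = full 360-degree scan
    THRESHOLD = 0.5     # assumed open-area threshold in meters

    def __init__(self):
        super().__init__('laser_avoidance_sketch')
        self.pub = self.create_publisher(Twist, 'cmd_vel', 10)
        self.create_subscription(LaserScan, 'scan', self.on_scan, 10)

    def on_scan(self, scan):
        # Farthest valid reading in each 30-degree segment.
        n = len(scan.ranges) // self.SEGMENTS
        far = []
        for i in range(self.SEGMENTS):
            vals = [r for r in scan.ranges[i * n:(i + 1) * n]
                    if math.isfinite(r) and r > 0.0]
            far.append(max(vals) if vals else 0.0)

        # Assume segment 0 faces forward. Widen the search until a segment
        # clears the threshold: front 90, front 180, then all 360 degrees.
        best = 0
        for window in (3, 6, self.SEGMENTS):
            candidates = [(i - window // 2) % self.SEGMENTS
                          for i in range(window)]
            best = max(candidates, key=lambda i: far[i])
            if far[best] >= self.THRESHOLD:
                break

        twist = Twist()
        if best == 0:
            twist.linear.x = 0.1    # open area ahead: drive forward
        else:
            # Rotate toward the selected segment (30 degrees per index).
            signed = best if best <= self.SEGMENTS // 2 else best - self.SEGMENTS
            twist.angular.z = 0.5 * math.copysign(1.0, signed)
        self.pub.publish(twist)


def main():
    rclpy.init()
    rclpy.spin(LaserAvoidanceSketch())


if __name__ == '__main__':
    main()
```
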
  • Real-time object detection and tracking:
    • Code logic explanation (a hedged sketch follows the usage commands below):
      • Use the Jetson inference ROS2 detectnet node to detect the target object's position in the image captured from the camera
      • Calculate the angle between the image center and the target position
      • Use the size of the detected bounding box to estimate the distance from the robot to the target
      • Publish a ROS2 Twist message to make the robot follow the detected object
      • Stop the robot if it is too close to the target
    • Source code:
    • Usage:
      • ros2 launch jetbot_tools DNN_SSD_source.launch.py model_path:=/home/jetbot/dev_ws/pytorch-ssd/models/toy/ssd-mobilenet.onnx class_labels_path:=/home/jetbot/dev_ws/pytorch-ssd/models/toy/labels.txt launch_video_source:=false topic:=/video_source/raw
      • ros2 launch jetbot_tools detect_copilot.launch.py param_file:=./jetbot_tools/param/detect_toys_copilot_params.yaml
      • ros2 param get /detect_copilot follow_detect
      • ros2 param set /detect_copilot follow_detect true
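
The following is a hedged sketch of the follow logic above. It assumes the detectnet node publishes vision_msgs/Detection2DArray on /detectnet/detections (the ros_deep_learning default) and a Foxy-era vision_msgs where bbox.center is a geometry_msgs/Pose2D; the image width, gains, and target area are illustrative, not the jetbot_tools parameters.

```python
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
from vision_msgs.msg import Detection2DArray


class DetectFollowSketch(Node):
    IMAGE_WIDTH = 1280.0     # assumed camera resolution
    TARGET_AREA = 90000.0    # assumed bbox area (px^2) meaning "close enough"

    def __init__(self):
        super().__init__('detect_follow_sketch')
        self.pub = self.create_publisher(Twist, 'cmd_vel', 10)
        self.create_subscription(Detection2DArray, '/detectnet/detections',
                                 self.on_detections, 10)

    def on_detections(self, msg):
        if not msg.detections:
            return
        # Follow the largest detection (closest target by apparent size).
        det = max(msg.detections, key=lambda d: d.bbox.size_x * d.bbox.size_y)
        twist = Twist()

        # Steer proportionally to the horizontal offset from the image center
        # (Foxy-era vision_msgs: bbox.center is a geometry_msgs/Pose2D).
        offset = (det.bbox.center.x - self.IMAGE_WIDTH / 2.0) / (self.IMAGE_WIDTH / 2.0)
        twist.angular.z = -0.8 * offset

        # A bigger bounding box means a closer target; stop when close enough.
        area = det.bbox.size_x * det.bbox.size_y
        twist.linear.x = 0.1 if area < self.TARGET_AREA else 0.0
        self.pub.publish(twist)


def main():
    rclpy.init()
    rclpy.spin(DetectFollowSketch())


if __name__ == '__main__':
    main()
```
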
  • Real-time object detection and distance measurement:
    • Code logic explanation (a sketch of the sensor fusion follows the usage commands below):
      • Detect objects with the SSD Mobilenet V2 model and locate their bearing in the camera image
      • Use the lidar reading along that bearing to measure the distance to the detected object
      • Follow the selected object and stop when it is closer than the threshold distance
    • Source code:
    • Usage:
      • ros2 launch jetbot_tools DNN_SSD_source.launch.py model_name:=ssd-mobilenet-v2 launch_video_source:=false topic:=/video_source/raw
      • ros2 launch jetbot_tools follow_copilot.launch.py param_file:=./jetbot_tools/param/follow_copilot_params.yaml
      • ros2 param get /follow_copilot follow_detect
      • ros2 param set /follow_copilot follow_detect true
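
One way to fuse the two sensors is to convert the detection's pixel offset into a bearing and read the lidar range along that bearing. The sketch below assumes a camera with a known horizontal field of view and a lidar scan that covers the forward direction; the constants and the helper name are illustrative, not the jetbot_tools API.

```python
import math

CAMERA_HFOV = math.radians(160.0)   # assumed Jetbot wide-angle camera FOV
IMAGE_WIDTH = 1280.0                # assumed camera resolution


def target_distance(bbox_center_x, scan):
    """Look up the lidar range (meters) along the camera bearing of a
    detection; scan is a sensor_msgs/LaserScan."""
    # Pixel offset from image center -> bearing angle (radians, +left,
    # following the ROS convention of counterclockwise-positive yaw).
    bearing = -((bbox_center_x - IMAGE_WIDTH / 2.0) / IMAGE_WIDTH) * CAMERA_HFOV
    # Map the bearing onto a scan index via the scan's angular metadata.
    idx = int(round((bearing - scan.angle_min) / scan.angle_increment))
    idx = max(0, min(idx, len(scan.ranges) - 1))
    r = scan.ranges[idx]
    return r if math.isfinite(r) else None
```
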
  • NAV2 TF2 position tracking and following:
    • Code logic explanation (a tf2 listener sketch follows the usage commands below):
      • To run the tf2_follow_copilot program, you need two robots, each using a tf2 broadcaster to publish its coordinate frame.
      • The tf2_follow_copilot program uses a tf2 listener to compute the offset between the two robot frames and determine the direction and distance to follow.
      • The program publishes a ROS2 Twist message to control the GoPiGo3 robot's speed and steering so that it follows the Jetbot robot.
    • Source code:
    • Usage:
      • Prerequisites: ros2 launch <follow_copilot.launch.py> or <detect_copilot.launch.py>
      • ros2 launch jetbot_tools tf2_follow_copilot.launch.py param_file:=./jetbot_tools/param/tf2_follow_copilot_params.yaml
      • ros2 param set /tf2_follow start_follow true
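
The sketch below shows the standard rclpy tf2 listener pattern behind this idea; the frame names ('gopigo3_base', 'jetbot_base'), the gains, and the 0.5 m stop distance are assumptions, not the actual tf2_follow_copilot code.

```python
import math

import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
from tf2_ros import TransformException
from tf2_ros.buffer import Buffer
from tf2_ros.transform_listener import TransformListener


class Tf2FollowSketch(Node):
    def __init__(self):
        super().__init__('tf2_follow_sketch')
        self.buffer = Buffer()
        self.listener = TransformListener(self.buffer, self)
        self.pub = self.create_publisher(Twist, 'cmd_vel', 10)
        self.create_timer(0.1, self.follow)

    def follow(self):
        try:
            # Offset of the leader's frame expressed in the follower's frame.
            t = self.buffer.lookup_transform('gopigo3_base', 'jetbot_base',
                                             rclpy.time.Time())
        except TransformException:
            return   # frames not broadcast yet
        dx = t.transform.translation.x
        dy = t.transform.translation.y
        twist = Twist()
        # Turn toward the leader; drive forward until about 0.5 m away.
        twist.angular.z = 1.0 * math.atan2(dy, dx)
        twist.linear.x = 0.2 if math.hypot(dx, dy) > 0.5 else 0.0
        self.pub.publish(twist)


def main():
    rclpy.init()
    rclpy.spin(Tf2FollowSketch())


if __name__ == '__main__':
    main()
```
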
  • Empower your robot with voice control:
    • Unleash the power of voice control for your ROS2 robot with the Jetbot Voice-to-Action Tools!
    • The Jetbot Voice-to-Action tools integrate the Jetson Automatic Speech Recognition (ASR) library, empowering your robot to understand and respond to spoken commands. Enhance interactions with features like natural chat greetings, object-avoidance self-driving, real-time person following, and basic robot navigation movements.

Requirements:

References
