
Self-Driving and ROS 2 - Learn by Doing! Map & Localization



Cover Map & Localization


About the Course

This repository contains the material used in the course Self-Driving and ROS 2 - Learn by Doing! Map & Localization, currently available on online learning platforms such as Udemy.

Have you ever developed a mapping and a localization algorithm for your robot? Do you want to know more about SLAM (Simultaneous Localization and Mapping) and how to use it to enable your robot to create a nice and accurate map of the environment using a 2D LiDAR sensor?

Then this course will teach you exactly that, along with many more topics:

  • Robot Localization
  • Map Representations
  • Mapping
  • SLAM
  • Obstacle Avoidance
  • Speed and Separation monitoring
  • Using LiDAR Sensors

Furthermore, all the laboratory classes, in which we develop the actual software of our mobile robot, are available both in Python and in C++, giving you the freedom to choose the programming language you like the most or to become proficient in both!
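
As a small taste of what those labs look like on the Python side, here is a minimal sketch (not taken from the course material) of a ROS 2 node that reads the 2D LiDAR scan and logs the distance to the closest obstacle. The topic name /scan and the message type sensor_msgs/LaserScan are common defaults and may differ in the actual robot code.

import math
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan


class ClosestObstacle(Node):
    def __init__(self):
        super().__init__("closest_obstacle")
        # "/scan" is a typical default topic for a 2D LiDAR; adjust if needed
        self.create_subscription(LaserScan, "/scan", self.scan_callback, 10)

    def scan_callback(self, msg):
        # Keep only finite readings within the sensor's valid range
        valid = [r for r in msg.ranges
                 if math.isfinite(r) and msg.range_min < r < msg.range_max]
        if valid:
            self.get_logger().info(f"Closest obstacle at {min(valid):.2f} m")


def main():
    rclpy.init()
    rclpy.spin(ClosestObstacle())
    rclpy.shutdown()


if __name__ == "__main__":
    main()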

Other Courses

Self Driving and ROS 2 - Learn by Doing! Odometry & Control

If you are passionate about Self-Driving and you want to make a real robot navigate autonomously, then this course is for you! Apart from explaining in detail all the functionalities and the logic of ROS 2, the latest version of the Robot Operating System, it covers some key concepts of Autonomous Navigation, such as:

  • Sensor Fusion
  • Kalman Filter
  • Probability Theory
  • Robot Kinematics
  • Odometry
  • Robot Localization
  • Control

Enroll on the following platforms:

Cover Odometry & Control

Robotics and ROS 2 - Learn by Doing! Manipulators

If you find this course interesting and you are passionate about robotics in general (not limited to autonomous mobile robots), then you definitely have to take a look at my other courses!

Cover Manipulators 2

In this course I'll guide you through the creation of a real robotic arm that you can control with your voice using the Amazon Alexa voice assistant. Some of the concepts that are covered in this course are

  • Gazebo Simulation
  • Robot Kinematics
  • ROS 2 Basics
  • MoveIt 2
  • Using Arduino with ROS 2
  • Interface Alexa with ROS 2

Sounds fun? Check it out on the following platforms:

ROS 1 Nostalgic?

Do you want to master Self-Driving or Manipulation using ROS, the first version of the Robot Operating System?

Although many companies have already started switching to ROS 2, knowing both ROS 1 and ROS 2 will position you at the forefront of this demand, making you an attractive candidate for a wide range of roles.

Here you can access the same courses, in which the same robots are built and the same functionalities are implemented, using ROS 1.

Getting Started

You can decide whether to build the real robot or just have fun with the simulated one. The course can be followed either way: most of the lessons and most of the code work the same in simulation as on the real robot.

Prerequisites

You don't need any prior knowledge of ROS 2 or of Self-Driving; I'll explain all the concepts as they come up and as they are needed to add new functionalities to our robot. A basic knowledge of programming, in either C++ or Python, is required, as this is not a programming course and I won't dwell too much on basic programming concepts.

To prepare your PC you need:

  • Install Ubuntu 22.04 on your PC or in a Virtual Machine. Download the Ubuntu 22.04 ISO for your PC
  • Install ROS Humble or ROS Iron on your Ubuntu 22.04
  • Install the missing ROS 2 libraries. Some libraries used in this project are not part of the standard ROS 2 installation. Install them with the command below (a short usage sketch of one of these packages follows):
sudo apt-get update && sudo apt-get install -y \
     ros-humble-ros2-controllers \
     ros-humble-gazebo-ros \
     ros-humble-gazebo-ros-pkgs \
     ros-humble-ros2-control \
     ros-humble-gazebo-ros2-control \
     ros-humble-joint-state-publisher-gui \
     ros-humble-joy \
     ros-humble-joy-teleop \
     ros-humble-turtlesim \
     ros-humble-robot-localization \
     ros-humble-tf-transformations
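
For example, the tf-transformations package installed above provides the quaternion/Euler conversion helpers that come up frequently in localization code. A minimal usage sketch (independent of the course sources):

import math
from tf_transformations import euler_from_quaternion, quaternion_from_euler

# Build a quaternion (x, y, z, w) for a 90-degree rotation about the z axis
q = quaternion_from_euler(0.0, 0.0, math.pi / 2)

# Convert it back to roll, pitch, yaw (radians)
roll, pitch, yaw = euler_from_quaternion(q)
print(f"yaw = {math.degrees(yaw):.1f} degrees")  # -> 90.0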

Usage

To Launch the Simulation of the Robot

  1. Clone the repo
git clone https://github.com/AntoBrandi/Self-Driving-and-ROS-2-Learn-by-Doing-Map-Localization.git
  2. Build the ROS 2 workspace
cd ~/Self-Driving-and-ROS-2-Learn-by-Doing-Map-Localization/Section10_SLAM/bumperbot_ws
colcon build
  3. Source the ROS 2 workspace
. install/setup.bash
  4. Launch the Gazebo simulation
ros2 launch bumperbot_bringup simulated_robot.launch.py

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

License

Distributed under the Apache 2.0 License. See LICENSE for more information.

Contact

Antonio Brandi - LinkedIn - [email protected]

My Projects: https://github.com/AntoBrandi

Acknowledgements