Fab-Ver/ros2-ekf-localization

BAGS USED

| Experiment type | Bag to use |
| --- | --- |
| real | real_08-Dec_21_48_13 |

Lab 04 — Localization with EKF

In this exercise, you will implement Extended Kalman Filter (EKF)–based localization for a robot using ROS 2.
You will use sensor fusion from odometry, IMU, and landmark detection to improve localization accuracy.
The system will first be tested in Gazebo simulation and then on a real TurtleBot3 robot.


Objectives

  • Develop a ROS 2 package that runs an EKF to estimate the robot’s pose.
  • Localize the robot using landmark measurements.
  • Fuse measurements from odometry and IMU for improved estimation accuracy.

Requirements

  • Concepts from lectures: motion models, sensor models, and EKF.
  • Packages:
    • turtlebot3_simulations
    • turtlebot3_perception
  • Install the perception package with:
git clone https://github.com/SESASR-Course/turtlebot3_perception.git

Follow the instructions in its README.md file to complete installation.


Task 0 — Probabilistic Models (1 point)

Implement probabilistic motion and measurement models as standalone Python functions.

  • Velocity-based motion model (sampling)

    • Input: current state x, command u, and noise parameters.
    • Output: new pose x' as a NumPy array.
    • Sample 500 poses from an initial state using two different noise settings:
      • One emphasizing angular uncertainty.
      • One emphasizing linear uncertainty.
    • Plot the sample distribution and compute the Jacobian matrices G and V.
  • Landmark-based measurement model (sampling)

    • Input: state x, landmark position m, and noise parameters.
    • Output: a NumPy array [range, bearing] for the detected landmark.
    • Sample 1000 poses and plot their distribution.
    • Compute and report the Jacobian matrix H.
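The two sampling models above can be sketched as standalone NumPy functions. This is a minimal sketch, not the required implementation: the function names, the noise parameterization (six alpha coefficients for the velocity model, per-component standard deviations for the measurement model), and the default timestep are assumptions, and the straight-line (ω ≈ 0) branch of the motion model is only noted, not handled.

```python
import numpy as np

def sample_velocity_motion_model(x, u, a, dt, rng=None):
    """Sample a new pose [x, y, theta] from the velocity motion model.
    x: current pose, u: command [v, w], a: six noise coefficients, dt: timestep."""
    if rng is None:
        rng = np.random.default_rng()
    v_hat = u[0] + rng.normal(0.0, np.sqrt(a[0] * u[0]**2 + a[1] * u[1]**2))
    w_hat = u[1] + rng.normal(0.0, np.sqrt(a[2] * u[0]**2 + a[3] * u[1]**2))
    gamma = rng.normal(0.0, np.sqrt(a[4] * u[0]**2 + a[5] * u[1]**2))
    r = v_hat / w_hat  # assumes w_hat != 0; the w ~ 0 case needs a straight-line branch
    x_new = x[0] - r * np.sin(x[2]) + r * np.sin(x[2] + w_hat * dt)
    y_new = x[1] + r * np.cos(x[2]) - r * np.cos(x[2] + w_hat * dt)
    th_new = x[2] + w_hat * dt + gamma * dt
    return np.array([x_new, y_new, th_new])

def sample_landmark_measurement(x, lm, std, rng=None):
    """Sample a noisy [range, bearing] measurement of landmark lm from pose x."""
    if rng is None:
        rng = np.random.default_rng()
    dx, dy = lm[0] - x[0], lm[1] - x[1]
    r = np.hypot(dx, dy) + rng.normal(0.0, std[0])
    b = np.arctan2(dy, dx) - x[2] + rng.normal(0.0, std[1])
    return np.array([r, np.arctan2(np.sin(b), np.cos(b))])  # bearing wrapped to [-pi, pi]

# e.g. 500 motion samples with angular-dominant noise, for the scatter plot
samples = np.array([
    sample_velocity_motion_model(np.zeros(3), [1.0, 0.5],
                                 [0.001, 0.001, 0.1, 0.1, 0.001, 0.001], 0.5)
    for _ in range(500)
])
```

The resulting `samples` array can be scattered with Matplotlib to visualize how the two noise settings deform the pose cloud.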

Task 1 — EKF Localization in ROS 2 (3 points)

Implement an EKF node that estimates the robot’s position from landmark measurements.

  • State: [x, y, θ] — robot pose in the global frame.

  • Prediction step (20 Hz):

    • Use the velocity motion model with control inputs (v, ω) from /odom.
    • Implement prediction using a timer callback.
  • Update step:

    • Landmarks are published on /landmarks (or /camera/landmarks on the real robot)
      with message type landmark_msgs/msg/LandmarkArray.
    • For each landmark, perform the EKF update inside the subscription callback.
    • Publish the estimated state on /ekf (nav_msgs/msg/Odometry).
  • Landmark coordinates are provided in turtlebot3_perception/config/landmarks.yaml.
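Stripped of the ROS plumbing, the prediction and per-landmark update steps amount to the following. This is a sketch under assumptions: the function names are illustrative, it only covers the ω ≠ 0 branch of the motion model, and in the actual node `ekf_predict` would run in the 20 Hz timer callback while `ekf_update_landmark` runs once per landmark in the subscription callback.

```python
import numpy as np

def g(x, u, dt):
    """Velocity motion model mean (w != 0 branch)."""
    px, py, th = x
    v, w = u
    return np.array([px - v / w * np.sin(th) + v / w * np.sin(th + w * dt),
                     py + v / w * np.cos(th) - v / w * np.cos(th + w * dt),
                     th + w * dt])

def G_jac(x, u, dt):
    """Jacobian of g with respect to the state."""
    th = x[2]
    v, w = u
    return np.array([[1, 0, -v / w * np.cos(th) + v / w * np.cos(th + w * dt)],
                     [0, 1, -v / w * np.sin(th) + v / w * np.sin(th + w * dt)],
                     [0, 0, 1]])

def ekf_predict(mu, Sigma, u, dt, R):
    """Propagate mean and covariance through the motion model."""
    mu_bar = g(mu, u, dt)
    Gt = G_jac(mu, u, dt)
    return mu_bar, Gt @ Sigma @ Gt.T + R

def ekf_update_landmark(mu, Sigma, z, lm, Q):
    """Correct the estimate with one [range, bearing] landmark measurement."""
    dx, dy = lm[0] - mu[0], lm[1] - mu[1]
    q = dx * dx + dy * dy
    z_hat = np.array([np.sqrt(q), np.arctan2(dy, dx) - mu[2]])
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q), 0],
                  [dy / q, -dx / q, -1]])
    K = Sigma @ H.T @ np.linalg.inv(H @ Sigma @ H.T + Q)
    innov = z - z_hat
    innov[1] = np.arctan2(np.sin(innov[1]), np.cos(innov[1]))  # wrap bearing error
    return mu + K @ innov, (np.eye(3) - K @ H) @ Sigma
```

Wrapping the bearing innovation is easy to forget and typically the first thing to check when the filter diverges near θ = ±π.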


Task 2 — Extended State EKF (1 point)

Extend the EKF state to include linear and angular velocities:

[x, y, θ, v, ω]
  • Create new prediction and Jacobian functions g(u, x), G(u, x), and V(u, x).
  • Update step now includes:
    • Landmark model (ht_landmark)
    • Wheel encoder model (ht_odom) using /odom
    • IMU model (ht_imu) for updating ω
  • Compute the Jacobians:
    • Ht_landmark, Ht_odom, and Ht_imu
  • Define and tune noise parameters (std_odom, std_imu) to set the covariance matrix Q.
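For the extended state, a simple option is a constant-velocity prediction where v and ω carry over unchanged and the control enters the filter only through the odometry and IMU updates. The sketch below assumes that modeling choice (a first-order Euler model rather than the exact arc); the linear measurement matrices follow directly from the fact that /odom observes (v, ω) and the IMU observes ω.

```python
import numpy as np

def g_ext(x, dt):
    """Constant-velocity prediction for the extended state [x, y, th, v, w].
    Velocities are carried over unchanged (a modeling assumption); controls
    enter via the odometry/IMU update steps instead."""
    px, py, th, v, w = x
    return np.array([px + v * np.cos(th) * dt,
                     py + v * np.sin(th) * dt,
                     th + w * dt,
                     v,
                     w])

def G_ext(x, dt):
    """Jacobian of g_ext with respect to the 5D state."""
    _, _, th, v, _ = x
    return np.array([[1, 0, -v * np.sin(th) * dt, np.cos(th) * dt, 0],
                     [0, 1,  v * np.cos(th) * dt, np.sin(th) * dt, 0],
                     [0, 0, 1, 0, dt],
                     [0, 0, 0, 1, 0],
                     [0, 0, 0, 0, 1]])

# Linear measurement models: odometry observes (v, w), the IMU observes w,
# so their Jacobians are constant selection matrices.
H_odom = np.array([[0, 0, 0, 1, 0],
                   [0, 0, 0, 0, 1]], dtype=float)
H_imu = np.array([[0, 0, 0, 0, 1]], dtype=float)
```

With these, Q for each update is built from the tuned std_odom and std_imu values (e.g. `np.diag([std_odom[0]**2, std_odom[1]**2])` for the odometry update).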

Task 3 — Real Robot Test (1 point)

Run your EKF implementation (Task 1 or 2) on the real TurtleBot3 robot.

  • Launch camera and landmark detection:
    ros2 launch turtlebot3_perception camera.launch.py
    ros2 launch turtlebot3_perception apriltag.launch.py
  • Landmarks are published on /camera/landmarks at ~6 Hz.
  • Record /odom and /ekf topics for later analysis and comparison.

Report Requirements

  • Provide a short explanation of your ROS 2 system architecture (nodes, topics, parameters).
  • For Task 0:
    • Include motion and measurement model plots.
    • Discuss effects of different uncertainty parameters.
    • Report computed Jacobians and their linearized forms.
  • For Task 1:
    • Compare /ground_truth, /odom, and /ekf data.
    • Plot each state component (x, y, θ) over time.
    • Plot the 2D trajectory and compute RMSE and MAE.
  • For Task 2:
    • Repeat the analysis and compare results with Task 1.
  • For Task 3:
    • Compare /odom vs /ekf on the real robot and discuss the differences.
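The RMSE and MAE comparisons above can be computed with a few lines of NumPy once the recorded topics are exported to arrays. One caveat the sketch does not handle: /ground_truth, /odom, and /ekf are published at different rates, so the trajectories must first be resampled to common timestamps (e.g. with `np.interp` per component) before subtracting.

```python
import numpy as np

def rmse(est, gt):
    """Root-mean-square error per state component.
    est, gt: (N, d) arrays of time-aligned states, e.g. columns [x, y, theta]."""
    return np.sqrt(np.mean((est - gt)**2, axis=0))

def mae(est, gt):
    """Mean absolute error per state component."""
    return np.mean(np.abs(est - gt), axis=0)
```

For θ, take the wrapped angular difference before applying either metric, otherwise a crossing at ±π inflates the error.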

How to Test Your Algorithms

Simulation Environment

Run the Gazebo simulation (with both turtlebot3_simulations and turtlebot3_perception installed):

ros2 launch turtlebot3_gazebo lab04.launch.py

The environment includes white columns acting as landmarks (IDs shown in orange).

Real Robot

  1. Start the camera driver and landmark detector (as shown in Task 3).
  2. Landmarks will appear on /camera/landmarks if detected.

About

ROS2 EKF Localization developed for the "Sensors, embedded systems and algorithms for Service Robotics" course @polito. Tech stack: Python, ROS2
