This repository is part of Udacity's Sensor Fusion Nanodegree, where the goal is to detect and track objects using a combination of LiDAR, camera, and radar data. By fusing different sensor modalities, the system improves detection reliability and precision using KF, EKF, and UKF techniques.


Sensor Fusion & Perception Projects

Welcome to my collection of perception-based projects using LiDAR, camera, and radar data. These projects explore how autonomous vehicles detect obstacles, track objects, and estimate motion using real-world sensor data and algorithms such as RANSAC, Kalman filters, and Euclidean clustering.


Project 1: LiDAR Obstacle Detection

In this project, I worked with LiDAR point cloud data to detect obstacles like vehicles and roadside objects.

The pipeline includes (a code sketch follows the list):

  • Voxel Grid & ROI Filtering
  • 3D RANSAC Plane Segmentation
  • KD-Tree-Based Euclidean Clustering
  • 3D Bounding Box Visualization
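
Below is a minimal sketch of the filtering and plane-segmentation stages, assuming the PCL library; the function names, thresholds, and box bounds are illustrative rather than the project's exact code:

```cpp
#include <pcl/ModelCoefficients.h>
#include <pcl/point_types.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/filters/crop_box.h>
#include <pcl/segmentation/sac_segmentation.h>

using PointT = pcl::PointXYZ;

// Downsample with a voxel grid, then crop to a region of interest.
pcl::PointCloud<PointT>::Ptr filterCloud(pcl::PointCloud<PointT>::Ptr cloud,
                                         float leafSize,               // illustrative, e.g. 0.2 m
                                         const Eigen::Vector4f& minPt,
                                         const Eigen::Vector4f& maxPt) {
    pcl::PointCloud<PointT>::Ptr downsampled(new pcl::PointCloud<PointT>);
    pcl::VoxelGrid<PointT> vg;
    vg.setInputCloud(cloud);
    vg.setLeafSize(leafSize, leafSize, leafSize);  // one point per leafSize^3 cube
    vg.filter(*downsampled);

    pcl::PointCloud<PointT>::Ptr roi(new pcl::PointCloud<PointT>);
    pcl::CropBox<PointT> box;
    box.setMin(minPt);
    box.setMax(maxPt);
    box.setInputCloud(downsampled);
    box.filter(*roi);
    return roi;
}

// RANSAC plane fit: inliers approximate the road surface,
// everything else becomes an obstacle candidate for clustering.
pcl::PointIndices::Ptr segmentPlane(pcl::PointCloud<PointT>::Ptr cloud,
                                    float distThreshold) {            // e.g. 0.2 m
    pcl::PointIndices::Ptr inliers(new pcl::PointIndices);
    pcl::ModelCoefficients::Ptr coeffs(new pcl::ModelCoefficients);
    pcl::SACSegmentation<PointT> seg;
    seg.setModelType(pcl::SACMODEL_PLANE);
    seg.setMethodType(pcl::SAC_RANSAC);
    seg.setDistanceThreshold(distThreshold);
    seg.setInputCloud(cloud);
    seg.segment(*inliers, *coeffs);
    return inliers;
}
```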

Here's a sample output where the street is shown in green, and detected obstacles are wrapped in red bounding boxes:

LiDAR Detection Result

👉 More details in Lidar_Obstacle_Detection


Project 2: 3D Object Tracking with Camera & LiDAR

This project tracks a leading vehicle using both camera images and LiDAR data to estimate Time-to-Collision (TTC).

Key highlights (see the TTC sketch after this list):

  • Keypoint Detection & Matching (e.g., SIFT, ORB)
  • Bounding Box Association between frames
  • TTC Estimation from both Camera and LiDAR
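
As a sketch of the estimation step, here are the two standard constant-velocity TTC formulas this kind of pipeline relies on; the helper names and the median-ratio smoothing are illustrative:

```cpp
#include <algorithm>
#include <vector>

// Constant-velocity TTC from two consecutive LiDAR range readings.
// d0: closest distance in the previous frame, d1: in the current frame,
// dt: elapsed time between frames (s). Assumes the gap is closing (d0 > d1).
double lidarTTC(double d0, double d1, double dt) {
    return d1 * dt / (d0 - d1);
}

// Camera TTC from matched keypoint distances: under a pinhole model the
// scale change of the preceding vehicle in the image encodes the range rate.
// distRatios holds current/previous distance ratios for matched keypoint pairs;
// taking the median makes the estimate robust to outlier matches.
double cameraTTC(std::vector<double> distRatios, double dt) {
    std::sort(distRatios.begin(), distRatios.end());
    double medianRatio = distRatios[distRatios.size() / 2];
    return -dt / (1.0 - medianRatio);  // undefined when medianRatio == 1
}
```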

See it in action — the green box tracks the vehicle, and TTC estimates are shown at the top:

👉 More details in Project-2D-Feature_Matching

👉 More details in Project-3D-Object-Tracking

Frames 1–5: tracked vehicle with TTC estimates overlaid


Project 3: Radar-Based Velocity & Range Detection

Using FMCW radar simulation, this project identifies a moving target by estimating its range and velocity.

Core steps (sketched in code below the list):

  • FMCW Signal Simulation
  • 2D FFT to extract Range & Doppler
  • CFAR Detector for target identification
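
As a sketch of the chirp design and range recovery, the snippet below redoes the arithmetic in C++ (the course radar project is typically MATLAB); the 200 m max range, 1 m resolution, and 5.5 sweep-time factor are assumed spec values, not taken from this repo:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const double c        = 3.0e8;   // speed of light (m/s)
    const double rangeMax = 200.0;   // assumed max range (m)
    const double rangeRes = 1.0;     // assumed range resolution (m)

    double bandwidth = c / (2.0 * rangeRes);      // B = c / (2 * dRes)
    double chirpTime = 5.5 * 2.0 * rangeMax / c;  // common sweep-time rule of thumb
    double slope     = bandwidth / chirpTime;     // frequency slope (Hz/s)

    // The range FFT yields a beat frequency f_beat = 2 * slope * R / c,
    // so range is recovered as R = c * f_beat * Tchirp / (2 * B).
    double fBeat = 2.0 * slope * 81.0 / c;        // beat freq for an 81 m target
    double range = c * fBeat * chirpTime / (2.0 * bandwidth);

    std::printf("B = %.3e Hz, Tchirp = %.3e s, slope = %.3e Hz/s\n",
                bandwidth, chirpTime, slope);
    std::printf("recovered range = %.1f m\n", range);  // ~81 m
    return 0;
}
```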

Output below: the peak in the plot shows a detected object ~81 m away, moving at ~-20 m/s.

Radar Range Output · Radar FFT Output

A lecture exercise implementing sensor fusion with radar is also included.

👉 More details in Radar/project/README.md


Project 4: Sensor Fusion with Kalman Filters

Each vehicle (except the ego car) is tracked with an Unscented Kalman Filter (UKF) that fuses LiDAR and radar measurements into accurate position and velocity estimates.

Highlights (a sigma-point sketch follows):

  • Lidar & Radar Integration
  • Real-Time Position & Velocity Estimation
  • Predict-Update Cycles of UKF
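
As a taste of the predict step, here is a sketch of sigma-point generation for the augmented state, assuming Eigen for the linear algebra; the dimensions and names follow common UKF conventions rather than this repo's exact code:

```cpp
#include <cmath>
#include <Eigen/Dense>

// Sigma-point generation for the augmented state of a UKF.
// x_aug: augmented mean (state + process-noise terms),
// P_aug: augmented covariance, lambda: spreading parameter.
Eigen::MatrixXd generateSigmaPoints(const Eigen::VectorXd& x_aug,
                                    const Eigen::MatrixXd& P_aug,
                                    double lambda) {
    int n_aug = x_aug.size();
    Eigen::MatrixXd Xsig(n_aug, 2 * n_aug + 1);

    // Matrix square root of P_aug via Cholesky decomposition.
    Eigen::MatrixXd L = P_aug.llt().matrixL();
    double scale = std::sqrt(lambda + n_aug);

    // Mean point plus a symmetric pair per state dimension.
    Xsig.col(0) = x_aug;
    for (int i = 0; i < n_aug; ++i) {
        Xsig.col(i + 1)         = x_aug + scale * L.col(i);
        Xsig.col(i + 1 + n_aug) = x_aug - scale * L.col(i);
    }
    return Xsig;
}
```

Each sigma point is then pushed through the process model to predict the state, and the update step folds in the next LiDAR or radar measurement.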

UKF Sensor Fusion

👉 More details in Project_Unscented_Kalman_Filter

