
Atharva Jamsandekar | Robotics Engineer πŸ‘‹


  • Email: jamsandekar.a@northeastern.edu
  • Location: Boston, MA
  • Specialization: Perception, Localization, and Multi-Agent Systems
  • Education: MS in Robotics, Northeastern University (Grad: Dec 2025)
  • Key Skills: Spatial AI, VLM Integration, Safety-Critical Control (CBFs), State Estimation (EKF)
  • Fun Fact: Global Winner of Shell Eco-marathon 2021 with Team AVERERA.

Experience

πŸ”¬ Graduate Research Assistant | Northeastern University (Dec 2023 – Present)

  • Working on multi-robot coordination, VLM integration, and sim-to-real deployment on AgileX Scout robots.

πŸš— Controls Software Intern | Robert Bosch LLC (July – Dec 2024)

  • Pioneered validation of the Automotive Connectivity Hub (C-Hub) using hardware-in-the-loop (HIL) testing and the SAE J1939 standard.
  • Integrated ADAS predictive cruise control.

πŸ€– Software Developer Intern | Drobot Inc. (April – Aug 2022)

  • Improved control performance by 60% with a PID controller.
  • Refined localization with an Extended Kalman Filter (EKF) in a UWB-based real-time location system (RTLS).

Key Focus Areas

My work centers on delivering robust, high-performance autonomy solutions:

  • Safety-Critical Planning: Engineered a decentralized collision avoidance algorithm for swarm robots using Control Barrier Functions (CBFs) for formal safety guarantees.
  • Advanced Perception: Researching amodal 3D reconstruction of partially occluded objects, alongside multi-robot coordination using Vision-Language Models (VLMs) such as CLIP, together with SuperGlue feature matching, for zero-shot object search.
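To give a flavor of the safety-critical planning work above, here is a minimal CBF safety filter. This is an illustrative sketch only, not code from my repositories: it assumes single-integrator dynamics and a single circular obstacle, and uses the closed-form solution of the one-constraint QP instead of a QP solver.

```python
import numpy as np

def cbf_safety_filter(x, u_des, x_obs, r, alpha=1.0):
    """Minimally modify a desired velocity command to keep a robot safe.

    Barrier function: h(x) = ||x - x_obs||^2 - r^2  (positive outside the obstacle).
    For single-integrator dynamics (x_dot = u), safety requires
        dh/dt = a^T u >= -alpha * h,  with  a = 2 (x - x_obs).
    The QP  min ||u - u_des||^2  s.t.  a^T u >= -alpha * h  has a closed form:
    return u_des if it already satisfies the constraint, otherwise project it
    onto the constraint boundary.
    """
    a = 2.0 * (x - x_obs)
    h = float(np.dot(x - x_obs, x - x_obs) - r**2)
    slack = float(a @ u_des + alpha * h)
    if slack >= 0.0:
        return u_des          # nominal command is already safe
    # Shift u_des by the smallest amount that restores a^T u = -alpha * h
    return u_des + (-slack / float(a @ a)) * a
```

For example, a robot at the origin commanded straight toward an obstacle at (1, 0) with radius 0.5 gets its command scaled back from (1, 0) to (0.375, 0), which sits exactly on the safety boundary a^T u = -alpha * h.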

Technical Skills

  • Languages & Frameworks: C++, Python, ROS2
  • Deployment: Sim-to-Real Transfer, Edge Deployment (Jetson Orin)
  • Concepts: Multi-View Geometry, Game Theory, ADAS, Vehicle Controls

My Skillset: C, C++, Python, ROS, NumPy, MATLAB, Arduino, Simulink, Raspberry Pi, CANalyzer, Git, Docker, Jira

Pinned Repositories

  1. gesture-based-teleop — Gesture-based control interface for a robot using ROS2 (Python)

  2. Leo_explore — Forked from arjunjyothieswarb/Leo_explore; an advanced autonomous navigation system for exploring unknown environments (Python)

  3. Control-Barrier-Gazebo — Control Barrier Functions on double-integrator robots (Jupyter Notebook)

  4. field_robotics (Jupyter Notebook)