Hi, I'm Jie Yin (殷杰), a self-motivated research engineer specializing in robotics 🤖. I received both my B.Eng. and M.Eng. degrees from Shanghai Jiao Tong University (上海交通大学). I am broadly interested in Robotics and Embodied AI, including Sensor Fusion, Reinforcement Learning, Manipulation, Locomotion, and their intersections. Recently, I have been exploring dexterous manipulation combined with locomotion and whole-body control. Previously, I focused on advancing the performance of multi-sensor fusion SLAM algorithms in corner cases, supported by comprehensive benchmarks.
My research has been published in top-tier venues such as ICRA, IROS, RA-L, CVPR, TRO, and GPS Solutions. My highlighted research includes:
- 🔥[RA-L'2021] M2DGR (ESI HIGHLY CITED PAPER, TOP 1%)
- 🔥[ICRA'2024] Ground-Fusion
- [IROS'2024] DAF
- [CVPR'2024 Highlight] EN-SLAM
- [TRO'2025] LIGO
- [GSIS'2024] Sky-GVINS
- [ROBIO'2023] Ground-Challenge
In addition, I have curated several high-quality resource collections on robotics, listed under "Awesome Lists" in the project table below.
These projects have collectively received over 3000 GitHub stars, reflecting their impact in the robotics and SLAM community. To give back to the academic community, I have also served as a reviewer for ICRA, IROS, RA-L, TASE, and TRO.
I'm a self-motivated research engineer in robotics. Previously, I conducted research at the MIT-IBM Watson AI Lab, HKU Mars Lab, Tencent Robotics X Lab, Shanghai AI Lab, SJTU Beidou NLS, and the Shanghai Beidou Research Institute. For more information, please visit my personal website.
I'm currently working on the following topics. If you'd like to collaborate, feel free to email me at robot_yinjie@outlook.com. 🤝🏻
- Dexterous Hand Manipulation
- Mobile Manipulation
- Reinforcement Learning
- SLAM for Corner Cases
- Robotic Arm Control
- Languages & Frameworks:
- Simulation Platforms: Isaac Gym; Isaac Sim (OIGE); Isaac Lab; MuJoCo; PyBullet (a short usage sketch follows this list)
- Sensors: LiDAR (Velodyne, Robosense, Livox, RPLIDAR); Camera (Intel RealSense D435i & D455, PointGrey, Hikon, Indemind, Orbbec); IMU (Xsens series, Handsfree); GNSS (u-blox); Mocap System (Vicon)
- Robots: Wheeled Robot (WheelTec, Agile); Robotic Arm (Franka, Flexiv, Realman); Robotic Hand (Inspire, Shadow)
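As a small illustration of the simulation stack above, here is a minimal, hedged MuJoCo sketch (not code from any of my projects): it compiles an inline MJCF model with the official `mujoco` Python bindings and steps the passive dynamics. The MJCF string is purely illustrative.

```python
# Minimal sketch (assumption: the official `mujoco` Python bindings are installed,
# e.g. via `pip install mujoco`; the inline MJCF model below is illustrative only).
import mujoco

MJCF = """
<mujoco>
  <worldbody>
    <light pos="0 0 3"/>
    <geom type="plane" size="1 1 0.1"/>
    <body pos="0 0 0.5">
      <joint type="free"/>
      <geom type="box" size="0.05 0.05 0.05"/>
    </body>
  </worldbody>
</mujoco>
"""

model = mujoco.MjModel.from_xml_string(MJCF)  # compile the MJCF model
data = mujoco.MjData(model)                   # allocate the simulation state

for _ in range(1000):                         # integrate the passive dynamics
    mujoco.mj_step(model, data)

print("final box height:", data.qpos[2])      # z-coordinate of the free body
```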
Title | Description | Stars | Forks |
---|---|---|---|
Cutting-edge Algorithms | ⚡Ground-Fusion: A Low-cost Ground SLAM System Robust to Corner Cases (ICRA 2024) | | |
 | ⚡DAF: Disentangled Acoustic Fields for Multimodal Physical Scene Understanding (IROS 2024) | | |
 | EN-SLAM: Implicit Event-RGBD Neural SLAM (CVPR 2024) | | |
 | Sky-GVINS: A Sky-segmentation Aided GNSS-Visual-Inertial System for Navigation in Urban Canyons (Geo-spatial Information Science, 2023) | | |
SLAM Benchmarks | ⚡M2DGR: A Multi-modal and Multi-scenario SLAM Dataset for Ground Robots (RA-L & ICRA 2022) | | |
 | ⚡M2DGR-plus: Extension and update of M2DGR, a novel Multi-modal and Multi-scenario SLAM Dataset for Ground Robots (ICRA 2022 & ICRA 2024) | | |
 | M2DGR-Benchmark: A benchmark based on the M2DGR and M2DGR-plus datasets with adapted SOTA SLAM algorithms | | |
 | Ground-Challenge: A Multi-sensor SLAM Dataset Focusing on Corner Cases for Ground Robots (ROBIO 2023) | | |
 | SJTU_GVI: A GNSS-Visual-IMU benchmark dataset for SLAM; test dataset for M2C-VIO | | |
Awesome Lists | ⚡awesome-isaac-gym: A curated list of awesome NVIDIA Isaac Gym frameworks, papers, software, and resources | | |
 | awesome-LiDAR-Visual-SLAM: A curated list of resources relevant to LiDAR-Visual-Fusion SLAM | | |
 | Awesome-Grasp-List: A curated list of awesome open-source grasping libraries and resources | | |
 | awesome-wheel-slam: A curated list of resources relevant to wheel-based SLAM | | |
 | awesome-isaac-sim: A collection of resources related to NVIDIA Isaac Sim | | |