Author: Adnan Munawar (amunawa2@jh.edu)
The Asynchronous Multi-Body Framework (AMBF) provides real-time dynamic simulation of robots and soft bodies, coupled with real-time haptic interaction via several haptic devices supported through CHAI-3D (including dVRK manipulators and Razer Hydras). It also provides a Python client for training neural network (NN) and reinforcement learning (RL) agents on real-time data with the simulation in the loop. The framework is built around several external tools, including an extended version of CHAI-3D (developed alongside AMBF), Bullet Physics, OpenGL, GLFW, yaml-cpp, pyyaml, and Eigen, to name a few. Each external library has its own license, which can be found in the corresponding subfolder.
2. Wiki:
Please refer to the Wiki for usage, examples, and concepts.
3. Discussions:
Please refer to the Discussions page for questions and suggestions.
Below is a list of some projects developed on or using AMBF. Please click on a project title to navigate to its webpage.
[Video: drilling_matcap.mp4]
The bone drilling simulator also provides a stereoscopic view on supported Virtual Reality (VR) head-mounted displays (HMDs):
[Video: Barrel.Roll.Distortion.mp4]
[Video: surgical_robotics_half.mp4]
[Video: space_robotics_half.mp4]
If this work is helpful for your research, please cite it using the following reference:
@INPROCEEDINGS{8968568,
  author={A. {Munawar} and Y. {Wang} and R. {Gondokaryono} and G. S. {Fischer}},
  booktitle={2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  title={A Real-Time Dynamic Simulator and an Associated Front-End Representation Format for Simulating Complex Robots and Environments},
  year={2019},
  pages={1875-1882},
  doi={10.1109/IROS40897.2019.8968568},
  ISSN={2153-0858},
  month={Nov},
}
