QFAAP is designed to enhance the safety of vision-guided robot grasping in Human-Robot Interaction (HRI) scenarios. It introduces an Adversarial Quality Patch (AQP) and Projected Quality Gradient Descent (PQGD), which adapts to human hand shapes, from the perspective of benign adversarial attacks. Together, these reduce the grasping priority of hands and nearby objects, enabling robots to focus on safer, more appropriate grasping targets.
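The core idea can be illustrated with a toy example (a minimal sketch, not the paper's implementation; the arrays and names are invented for illustration): suppressing the grasp-quality map inside a detected hand region moves the selected grasp point away from the hand.

```python
import numpy as np

# Toy grasp-quality map in [0, 1] and a binary hand mask
# (illustrative values only).
quality = np.array([
    [0.2, 0.9, 0.1],
    [0.3, 0.8, 0.4],
    [0.1, 0.2, 0.7],
])
hand_mask = np.array([
    [0, 1, 0],
    [0, 1, 0],
    [0, 0, 0],
])

# Reduce grasping priority where the hand is detected.
safe_quality = quality * (1.0 - hand_mask)

# The argmax grasp point shifts from the hand region to a safer target.
best = np.unravel_index(np.argmax(safe_quality), safe_quality.shape)
print(best)  # → (2, 2)
```

Without the mask, the highest-quality pixel (0.9) lies on the hand; after suppression, the selected grasp moves to the safest remaining candidate.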
If you use this work, please cite:
@article{clee2025qfaap,
  title={Quality-focused Active Adversarial Policy for Safe Grasping in Human-Robot Interaction},
  author={Li, Chenghao and Beuran, Razvan and Chong, Nak Young},
  journal={arXiv preprint arXiv:2503.19397},
  year={2025}
}
Contact
For any questions or comments, please contact Chenghao Li.
This code was developed with Python 3.8 on Ubuntu 22.04. Python requirements can be installed by:
pip install -r requirements.txt
Currently, the following datasets are supported:
- Download and extract the Cornell Dataset.
- Download and extract the OCID Dataset.
- Download and extract the Jacquard Dataset.
All pre-trained grasping models for GG-CNN, GG-CNN2, GR-Convnet, and others can be downloaded from here.
All AQP trained by different grasping models and datasets can be downloaded from here.
All pre-trained Hand Segmentation models can be downloaded from here or here.
- Training of the AQP is done by AQP_training.py.
- Training of the grasping models is done by train_grasping_network.py.
- Evaluation is performed after each training run.
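The projected-gradient idea behind PQGD can be sketched as follows (a hedged toy example, not the repository's code: the loss, learning rate, and mask are all illustrative assumptions). Each gradient step on the patch is projected onto the hand-shape mask, so the perturbation adapts to the hand silhouette while staying bounded.

```python
import numpy as np

rng = np.random.default_rng(0)
H, W = 8, 8

# Toy hand silhouette: the patch is only allowed to live here.
hand_mask = np.zeros((H, W))
hand_mask[2:6, 3:5] = 1.0

patch = rng.normal(0.0, 0.1, (H, W))

def quality_grad(p):
    # Stand-in gradient of a quadratic surrogate loss whose minimum
    # is p = -1 (i.e., driving the patched region's "quality" down).
    return p + 1.0

lr = 0.5
for _ in range(50):
    patch = patch - lr * quality_grad(patch)   # gradient descent step
    patch = patch * hand_mask                  # project onto hand shape
    patch = np.clip(patch, -1.0, 1.0)          # keep perturbation bounded

# The patch converges toward -1 inside the mask and stays 0 outside it.
```

The projection step is what distinguishes this from plain PGD on a fixed rectangular patch: the perturbation's support follows the segmented hand shape frame by frame.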
- Offline and real-time prediction with QFAAP is done by QFAAP_offline.py and QFAAP_realtime.py.
- For the deployment of real-time hand segmentation, please refer to this repository: https://github.com/Unibas3D/Upper-Limb-Segmentation
- For robot grasping deployment, please refer to https://github.com/dougsm/ggcnn_kinova_grasping or https://github.com/clee-jaist/MCIGP