6th Annual Conference on Robot Learning (CoRL) 2022
Website • Paper • Presentation • YCB-Slide
MidasTouch performs online global localization of a vision-based touch sensor on an object's surface during sliding interactions, using the local geometry captured by the sensor. For details and further results, refer to our website and paper.
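At a high level, the filter keeps a set of particle poses on the object surface, propagates them with sensor odometry, reweights them by how well the live tactile embedding matches a precomputed codebook, and resamples. A simplified illustrative sketch of one update (NumPy, 3D positions instead of full SE(3) poses; not the repo's actual API):

# Illustrative Monte-Carlo localization step over tactile embeddings.
import numpy as np

def filter_step(particles, weights, odometry, live_emb, cb_poses, cb_embs):
    """One update: propagate, reweight against the codebook, resample."""
    # 1. Propagate particles with noisy sensor odometry.
    particles = particles + odometry + np.random.normal(0.0, 1e-3, particles.shape)
    # 2. Reweight: similarity between the live tactile embedding and the
    #    codebook embedding nearest to each particle (unit-norm embeddings assumed).
    nearest = np.argmin(
        np.linalg.norm(cb_poses[None, :, :] - particles[:, None, :], axis=-1), axis=1
    )
    sim = cb_embs[nearest] @ live_emb
    weights = weights * np.clip(sim, 1e-9, None)
    weights /= weights.sum()
    # 3. Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights**2) < 0.5 * len(weights):
        keep = np.random.choice(len(particles), size=len(particles), p=weights)
        particles = particles[keep]
        weights = np.full(len(weights), 1.0 / len(weights))
    return particles, weights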
1. Clone repository
git clone git@github.com:facebookresearch/MidasTouch.git
git submodule update --init --recursive
2. Download YCB-Slide dataset
cd YCB-Slide
chmod +x download_dataset.sh && ./download_dataset.sh # requires gdown
cd ..
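The download script uses gdown to fetch files from Google Drive; if it isn't already installed:
pip install gdown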
3. Download weights and codebooks
chmod +x download_assets.sh && ./download_assets.sh
4. Set up the midastouch conda environment
sudo apt install build-essential python3-dev libopenblas-dev
conda create -n midastouch
conda activate midastouch
conda env update --file environment.yml --prune
conda install pytorch torchvision torchaudio pytorch-cuda=11.7 -c pytorch -c nvidia # install torch
conda install -c conda-forge cudatoolkit-dev
pip install theseus-ai
pip install -e .
If the install fails with:
ImportError: cannot import name 'gcd' from 'fractions' (/private/home/suddhu/.conda/envs/midastouch/lib/python3.9/fractions.py)
pin networkx to a compatible version:
conda install -c conda-forge networkx=2.5
5. Install MinkowskiEngine
Follow the conda instructions from the NVIDIA MinkowskiEngine webpage.
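After these steps, a quick import check catches most environment problems early (a minimal sketch; output will vary by machine):

# Sanity-check the key dependencies installed above.
import torch
import theseus as th          # from pip install theseus-ai
import MinkowskiEngine as ME  # sparse-tensor library used by MinkLoc3D

print("torch", torch.__version__, "| CUDA available:", torch.cuda.is_available())

# Tiny sparse tensor to confirm MinkowskiEngine works end to end.
coords = torch.IntTensor([[0, 0, 0, 0], [0, 1, 1, 1]])  # (batch, x, y, z)
feats = torch.rand(2, 3)
x = ME.SparseTensor(features=feats, coordinates=coords)
print("MinkowskiEngine OK, features:", x.F.shape)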
Run interactive filtering experiments on our YCB-Slide data, covering both simulated and real-world tactile interactions.
Simulated interactions:
python midastouch/filter/filter.py expt=ycb # default: 004_sugar_box log 0
python midastouch/filter/filter.py expt.obj_model=035_power_drill expt.log_id=3 # 035_power_drill log 3
python midastouch/filter/filter.py expt.off_screen=True # disable visualization
python midastouch/filter/filter.py expt=mcmaster # small parts: cotter-pin log 0
Real-world interactions:
python midastouch/filter/filter_real.py expt=ycb # default: 004_sugar_box log 0
python midastouch/filter/filter_real.py expt.obj_model=021_bleach_cleanser expt.log_id=2 # 021_bleach_cleanser log 2
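The entry points use Hydra configs (see config/), so standard Hydra overrides compose; for instance, Hydra's --multirun flag should sweep several logs back-to-back (an untested sketch built from the overrides above):
python midastouch/filter/filter_real.py --multirun expt.log_id=0,1,2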
With your own DIGIT, you can simply plug in the sensor and experiment with the image-to-3D and tactile-codes visualizer.
python midastouch/filter/live_demo.py expt.obj_model=025_mug
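To check the raw sensor stream outside the demo, the digit-interface driver can grab frames directly (a minimal sketch; "D12345" is a placeholder for your sensor's serial number):

# Capture a single frame from a connected DIGIT sensor.
from digit_interface import Digit

d = Digit("D12345")  # placeholder serial number
d.connect()
frame = d.get_frame()  # numpy image array
print("captured frame:", frame.shape)
d.disconnect()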
midastouch
├── bash # bash scripts for filtering, codebook generation
├── config # hydra config files
├── contrib # modified third-party code for TDN, TCN
├── data_gen # generate tactile simulation data for training/eval
├── eval # select evaluation scripts
├── filter # filtering and live demo scripts
├── modules # helper functions and classes
├── render # DIGIT tactile rendering class
├── tactile_tree # codebook scripts
└── viz # pyvista visualization
- To generate your own tactile simulation data on object meshes, refer to the midastouch/data_gen/ scripts.
- To collect tactile data in the real world, refer to our experimental scripts in the YCB-Slide repository.
@inproceedings{suresh2022midastouch,
title={{M}idas{T}ouch: {M}onte-{C}arlo inference over distributions across sliding touch},
author={Suresh, Sudharshan and Si, Zilin and Anderson, Stuart and Kaess, Michael and Mukadam, Mustafa},
booktitle = {Proc. Conf. on Robot Learning, CoRL},
address = {Auckland, NZ},
month = dec,
year = {2022}
}
The majority of MidasTouch is licensed under the MIT license; however, portions of the project are available under separate license terms: MinkLoc3D is licensed under the MIT license; FCRN-DepthPrediction is licensed under the BSD 2-clause license; pytorch3d is licensed under the BSD 3-clause license. Please see the LICENSE file for more information.
We actively welcome your pull requests! Please see CONTRIBUTING.md and CODE_OF_CONDUCT.md for more info.