
Blind object recognition with soft grippers

In scenarios where visual feedback is unavailable or unreliable, tactile sensing provides crucial information for object identification. The compliance of soft grippers yields rich contact signals that can be used to recognize objects by touch alone.

This repository contains experiments with several machine learning models for object classification from tactile and proprioceptive data collected with a real-world soft robotic gripper. Metric learning techniques (e.g., Deep Attentive Time Warping [1]) were also investigated to enable few-shot object recognition, allowing the models to generalize to novel objects from only a small amount of additional data.
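
As an illustration of the metric-learning idea (not the repository's actual architecture), the sketch below pairs a 1D-CNN encoder for multivariate tactile time series with a contrastive loss: same-object pairs are pulled together in the embedding space, different-object pairs are pushed apart. All names, layer sizes, and hyperparameters here are illustrative assumptions.

```python
# Minimal metric-learning sketch for few-shot tactile recognition.
# Assumptions: input tensors of shape (batch, channels, timesteps);
# layer sizes and the margin value are arbitrary choices.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TactileEncoder(nn.Module):
    """1D-CNN encoder mapping a tactile time series to an embedding."""
    def __init__(self, in_channels: int, embed_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # pool over time -> (batch, 64, 1)
        )
        self.fc = nn.Linear(64, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc(self.net(x).squeeze(-1))

def contrastive_loss(z1, z2, same_object, margin: float = 1.0):
    """Hadsell-style contrastive loss: pull same-object pairs together,
    push different-object pairs at least `margin` apart.
    `same_object` is a float tensor of 0/1 labels."""
    d = F.pairwise_distance(z1, z2)
    return torch.mean(same_object * d.pow(2)
                      + (1 - same_object) * F.relu(margin - d).pow(2))
```

At few-shot evaluation time, a novel object can then be recognized by embedding its few support samples and assigning each query the label of the nearest support embedding.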

Directory Structure

The project's directory structure includes the following main files and folders:

blind-object-recognition-soft-grippers/
  ├── dataset/                     # contains the dataset
  ├── deep_attentive_time_warping/ # original implementation of the Deep Attentive Time Warping [1] method
  │     ├── dataloader.py          # data loading utilities
  │     ├── DATW.py                # Deep Attentive Time Warping core class
  │     ├── experiments.sh         # script to run the full training and few-shot evaluation pipeline
  │     ├── few_shot_eval.py       # script for few-shot evaluation
  │     ├── model.py               # model definition
  │     ├── training.py            # script to run training
  │     └── utils.py               # utility functions
  ├── results/                     # stores the results of the experiments
  ├── siamese_network/             # implementation of a Siamese Network
  │     ├── dataloader.py          # data loading utilities
  │     ├── SN.py                  # Siamese Network core class
  │     ├── experiments.sh         # script to run the full training and few-shot evaluation pipeline
  │     ├── few_shot_eval.py       # script for few-shot evaluation
  │     ├── model.py               # model definition
  │     └── training.py            # script to run training
  ├── CNN.ipynb                    # experiments with a Convolutional Neural Network
  ├── DATW.ipynb                   # experiments with the Deep Attentive Time Warping method
  ├── DTW.ipynb                    # experiments with Dynamic Time Warping
  ├── LSTM.ipynb                   # experiments with a Long Short-Term Memory network
  ├── preprocessing.ipynb          # data preprocessing steps
  ├── results_summary.ipynb        # summary of all experiments' results
  ├── shapelet+XGB.ipynb           # experiments with XGBoost on shapelet features
  ├── slides.pdf                   # slides for the project presentation
  ├── SN.ipynb                     # experiments with the Siamese Network
  ├── stats+XGB.ipynb              # experiments with XGBoost on time- and frequency-domain features
  ├── transformer.ipynb            # experiments with a Transformer model
  └── utils.py                     # utility functions
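
For reference, the classical baseline explored in DTW.ipynb (classification by nearest neighbor under a Dynamic Time Warping distance) can be sketched as follows. This is a minimal, self-contained version, not the notebook's exact code; function and variable names are assumptions.

```python
# Illustrative DTW 1-NN baseline for multivariate time-series classification.
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """DTW distance between two sequences of shape (T, channels),
    with Euclidean distance between frames as the local cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping steps
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def knn_dtw_predict(query, train_seqs, train_labels):
    """Label of the training sequence closest to `query` under DTW."""
    dists = [dtw_distance(query, s) for s in train_seqs]
    return train_labels[int(np.argmin(dists))]
```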

References

[1] Matsuo, Shinnosuke, et al. "Attention to warp: Deep metric learning for multivariate time series." Document Analysis and Recognition – ICDAR 2021: 16th International Conference, Lausanne, Switzerland, September 5–10, 2021, Proceedings, Part III. Springer International Publishing, 2021. GitHub repository: https://github.com/matsuo-shinnosuke/deep-attentive-time-warping/.


This project was developed for the "Robotics" course at the University of Pisa (a.y. 2024/2025).
