
Haptic Knowledge Transfer Between Heterogeneous Robots using Kernel Manifold Alignment


Abstract:

Humans learn about object properties using multiple modes of perception. Recent advances show that robots can use non-visual sensory modalities (e.g., haptic and tactile sensory data) coupled with exploratory behaviors (e.g., grasping, lifting, pushing, and dropping) to learn object properties such as shape, weight, material, and affordances. However, non-visual sensory representations cannot be easily transferred from one robot to another, as different robots have different bodies and sensors. Therefore, each robot needs to learn its task-specific sensory models from scratch. To address this challenge, we propose a framework for knowledge transfer using kernel manifold alignment (KEMA) that enables source robots to transfer haptic knowledge about objects to a target robot. The idea behind our approach is to learn a common latent space from the feature spaces produced by multiple robots' sensory data while interacting with objects. To test the method, we used a dataset in which 3 simulated robots interacted with 25 objects, and showed that our framework speeds up haptic object recognition and enables novel object recognition.

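The repository's KEMA implementation is in MATLAB, but the core idea described above (a shared latent space learned from multiple robots' differently-sized feature spaces) can be sketched in a few lines of numpy/scipy. The sketch below is a simplified illustration, not the repository's code: it keeps only KEMA's class-similarity and class-dissimilarity terms and omits the topology-preservation term, and all function names and parameters are assumptions.

```python
import numpy as np
from scipy.linalg import block_diag, eigh
from scipy.spatial.distance import cdist

def rbf_kernel(A, B, gamma=0.1):
    """RBF kernel matrix between row-sample matrices A (m, d) and B (n, d)."""
    return np.exp(-gamma * cdist(A, B, "sqeuclidean"))

def kema_fit(Xs, ys, n_components=2, gamma=0.1, ridge=1e-6):
    """Simplified kernel manifold alignment over several labelled domains.

    Xs: list of (n_i, d_i) feature arrays (d_i may differ per robot).
    ys: list of (n_i,) label arrays sharing one set of class ids.
    Returns per-domain projection coefficients into a shared latent space.
    """
    K = block_diag(*[rbf_kernel(X, X, gamma) for X in Xs])  # (N, N) block kernel
    y = np.concatenate(ys)
    N = len(y)
    same = (y[:, None] == y[None, :]).astype(float)  # same-class graph
    diff = 1.0 - same                                # different-class graph
    Ls = np.diag(same.sum(axis=1)) - same            # graph Laplacians
    Ld = np.diag(diff.sum(axis=1)) - diff
    A = K @ Ls @ K + ridge * np.eye(N)  # same-class spread (to minimise)
    B = K @ Ld @ K + ridge * np.eye(N)  # between-class spread (to maximise)
    # Smallest generalized eigenvectors of A v = w B v pull same-class
    # samples together across domains while keeping classes apart.
    _, V = eigh((A + A.T) / 2, (B + B.T) / 2)
    V = V[:, :n_components]
    offsets = np.cumsum([0] + [len(labels) for labels in ys])
    return [V[offsets[i]:offsets[i + 1]] for i in range(len(Xs))]

def kema_transform(X_new, X_train, alpha, gamma=0.1):
    """Map samples from one domain into the shared latent space."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Toy demo: two "robots" with different feature dimensionalities.
rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(20, 4)), rng.normal(size=(20, 6))
y1 = np.repeat([0, 1], 10)
y2 = np.repeat([0, 1], 10)
a1, a2 = kema_fit([X1, X2], [y1, y2], n_components=2)
Z1 = kema_transform(X1, X1, a1)  # (20, 2), in the shared latent space
Z2 = kema_transform(X2, X2, a2)  # (20, 2), in the same latent space
```

Once both robots' samples live in the same latent space, a classifier trained on the source robot's projected data can be applied to (or fine-tuned on) the target robot's projected data, which is the mechanism behind the transfer condition below.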

Development Environment

Python 3.7.6 and MATLAB R2019b Update 3 (9.7.0.1261785) were used for development, and the following packages are required to run the code:

Python Dependencies

```shell
pip install scipy==1.1.0
pip install scikit-learn==0.22.1
pip install numpy==1.18.1
pip install tensorflow-gpu==2.0.0
pip install matplotlib==3.1.2
```

The MATLAB Engine API for Python is also required; install it following MathWorks' instructions for your MATLAB release.

MATLAB Dependencies

Statistics and Machine Learning Toolbox

Discretized Mean (10 bins)

Discretized Range (15 bins)
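The "Discretized Mean" and "Discretized Range" features above refer to temporal binning of the raw haptic signals. A minimal sketch of how such binned features could be computed follows; the function name and the example channel count are assumptions, and the repository's actual preprocessing may differ.

```python
import numpy as np

def binned_features(signal, n_bins=10, stat="mean"):
    """Summarise a (timesteps, channels) haptic signal as the per-channel
    mean or range of n_bins near-equal temporal bins."""
    bins = np.array_split(signal, n_bins, axis=0)
    if stat == "mean":
        per_bin = [b.mean(axis=0) for b in bins]                  # "discretized mean"
    elif stat == "range":
        per_bin = [b.max(axis=0) - b.min(axis=0) for b in bins]   # "discretized range"
    else:
        raise ValueError(f"unknown stat: {stat}")
    return np.concatenate(per_bin)  # shape: (n_bins * channels,)

# e.g. 101 timesteps of 7 joint-effort channels from one behavior
signal = np.random.default_rng(0).normal(size=(101, 7))
mean_feats = binned_features(signal, n_bins=10, stat="mean")    # (70,)
range_feats = binned_features(signal, n_bins=15, stat="range")  # (105,)
```

Each robot then has a fixed-length feature vector per interaction, with a dimensionality that depends on its own sensor channels, which is exactly the heterogeneous-feature-space setting KEMA is designed to align.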

Dataset Collection

Baxter: https://github.com/medegw01/baxter_tufts
Fetch: https://github.com/gtatiya/Fetch_Pick_and_Place
Sawyer: https://github.com/medegw01/sawyer_tufts

Baxter, Fetch, and Sawyer simulation environments (figures)

How to run the code?

Speeding up object recognition

Baseline condition: python MulitBehaviorsObjectClassification.py
Transfer condition: python MulitBehaviorsOC_KEMA.py

Novel object recognition

python MulitBehaviorsNovelOC_KEMA.py

Results (Discretized Mean 10 bins)

Illustrative Example

(figure)

Speeding up object recognition results

Baxter as Target Robot (figure)

Fetch as Target Robot (figure)

Sawyer as Target Robot (figure)

Novel object recognition results

Baxter as Target Robot (figure)

Fetch as Target Robot (figure)

Sawyer as Target Robot (figure)
