The Bot Appetit Dataset is a multimodal dataset (think-aloud transcripts, VR video recordings, 3D object coordinates) collected from 22 participants who were tasked with preparing a kitchen for collaboration with one of several robots in a Virtual Reality (VR) environment.
[Figure: comparison of the data recorded in the dataset]
The data is split into folders by participant ID. Each participant folder contains the ego- and frontal-perspective videos in MP4 format for each task bundle, a video combining all task bundles, and the corresponding think-aloud transcripts in TXT format (one per task bundle plus one for the full session). The layout is shown below; a minimal loading sketch follows the tree.
```
BotAppetitDataset/
- P1/
  - Almond.mp4
  - Broccoli.mp4
  - P1.txt
  - P1_Almond.txt
  - P1_Broccoli.txt
- P2/
  - Almond.mp4
  - Broccoli.mp4
  - P2.txt
  - P2_Almond.txt
  - P2_Broccoli.txt
- ...
- README.md
- LICENSE.md
```
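As a minimal sketch of how the files could be indexed per participant, assuming the layout above (the helper name `index_dataset` is ours, not part of the dataset):

```python
from pathlib import Path

def index_dataset(root: str) -> dict[str, dict[str, list[Path]]]:
    """Collect the MP4 videos and TXT transcripts for each participant folder (P1, P2, ...)."""
    index = {}
    for participant_dir in sorted(Path(root).glob("P*")):
        if not participant_dir.is_dir():
            continue
        index[participant_dir.name] = {
            "videos": sorted(participant_dir.glob("*.mp4")),
            "transcripts": sorted(participant_dir.glob("*.txt")),
        }
    return index

# Example: list each participant's files.
for pid, files in index_dataset("BotAppetitDataset").items():
    print(pid, [p.name for p in files["videos"] + files["transcripts"]])
```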
- 189 videos in MP4 format with a total of 417 minutes of material
- Bot Appetit Dataset Repository - Link to the dataset videos hosted on university servers.
Link and BibTeX for the RO-MAN paper here
- Rachel Ringe - rringe@uni-bremen.de
- Leandra Thiele - le_th@uni-bremen.de
- Mihai Hawkin - mpomarlan@uni-bremen.de
- Nima Zargham - zargham@tzi.de
- Robin Nolte - nolte@uni-bremen.de
- Lars Hurrelbrink - lhurrelb@uni-bremen.de
- Rainer Malaka - malaka@tzi.de
This project is licensed under the CC BY license; see the LICENSE.md file for details.
This work was funded by the FET-Open project #951846 MUHAI – Meaning and Understanding for Human-centric AI, under the EU Pathfinder and Horizon 2020 Programme.
This work has also been supported by the German Research Foundation (DFG) as part of Collaborative Research Center 1320 "EASE – Everyday Activity Science and Engineering" (Project-ID 329551904) at the University of Bremen. The research was conducted in subproject P01 – Embodied semantics for the language of action and change: Combining analysis, reasoning and simulation.