Publications

ActionSense: A Multimodal Dataset and Recording Framework for Human Activities Using Wearable Sensors in a Kitchen Environment

Publication

NeurIPS 2022

Authors

Joseph DelPreto∗, Chao Liu∗, Yiyue Luo, Michael Foshey, Yunzhu Li, Antonio Torralba, Wojciech Matusik, and Daniela Rus

Abstract

This paper introduces ActionSense, a multimodal dataset and recording framework with an emphasis on wearable sensing in a kitchen environment. It provides rich, synchronized data streams along with ground truth data to facilitate learning pipelines that could extract insights about how humans interact with the physical world during activities of daily living, and help lead to more capable and collaborative robot assistants. The wearable sensing suite captures motion, force, and attention information; it includes eye tracking with a first-person camera, forearm muscle activity sensors, a body-tracking system using 17 inertial sensors, finger-tracking gloves, and custom tactile sensors on the hands that use a matrix of conductive threads. This is coupled with activity labels and with externally captured data from multiple RGB cameras, a depth camera, and microphones. The specific tasks recorded in ActionSense are designed to highlight lower-level physical skills and higher-level scene reasoning or action planning. They include simple object manipulations (e.g., stacking plates), dexterous actions (e.g., peeling or cutting vegetables), and complex action sequences (e.g., setting a table or loading a dishwasher). The resulting dataset and underlying experiment framework are available at https://action-net.csail.mit.edu. Preliminary networks and analyses explore modality subsets and cross-modal correlations. ActionSense aims to support applications including learning from demonstrations, dexterous robot control, cross-modal predictions, and fine-grained action segmentation. It could also help inform the next generation of smart textiles that may one day unobtrusively send rich data streams to in-home collaborative or autonomous robot assistants.
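Because the dataset emphasizes synchronized multimodal streams, a typical first step in a learning pipeline is resampling each modality onto a common timeline and attaching per-frame activity labels. The sketch below is illustrative only and does not use the actual ActionSense release format or tooling; the stream names, sampling rates, channel counts, and label tuples are assumptions made for demonstration (see the project site for the real data layout).

```python
import numpy as np

def resample_stream(timestamps, values, target_times):
    """Linearly interpolate a multichannel stream onto a shared timeline.

    timestamps: (T,) seconds; values: (T, C); target_times: (N,) seconds.
    Returns an (N, C) array aligned to target_times.
    """
    values = np.asarray(values, dtype=float)
    return np.stack(
        [np.interp(target_times, timestamps, values[:, c]) for c in range(values.shape[1])],
        axis=1,
    )

# Hypothetical streams at different rates (placeholders, not the released format).
emg_t = np.arange(0.0, 10.0, 1 / 200.0)      # forearm muscle activity at ~200 Hz
emg = np.random.randn(emg_t.size, 8)         # 8 example channels
imu_t = np.arange(0.0, 10.0, 1 / 60.0)       # body tracking at ~60 Hz
imu = np.random.randn(imu_t.size, 17 * 3)    # 17 sensors x 3 axes (illustrative)

# Fuse both modalities on a common 50 Hz timeline.
common_t = np.arange(0.0, 10.0, 1 / 50.0)
fused = np.hstack([
    resample_stream(emg_t, emg, common_t),
    resample_stream(imu_t, imu, common_t),
])

# Example activity labels as (start_s, end_s, name); assign one label per frame.
labels = [(0.0, 4.0, "peel cucumber"), (4.0, 10.0, "slice cucumber")]
frame_labels = np.array(
    [next((name for s, e, name in labels if s <= t < e), "none") for t in common_t]
)
print(fused.shape, frame_labels[:3])
```

The resulting per-frame feature matrix and label vector can then be windowed for tasks such as fine-grained action segmentation or cross-modal prediction.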

Paper

actionsense_neurips.pdf
