KIT Whole-Body Human Motion Database

Multi-Agent Collaboration Dataset

With the KIT Whole-Body Human Motion Database, we aim to provide a simple way of sharing high-quality motion capture recordings of human whole-body motion. The database is run by the High Performance Humanoid Technology (H²T) Lab.




Details

Created: Sep. 2025
Modified: Sep. 2025
Write Groups: KIT
Read Protected Groups: KIT
Name: Multi-Agent Collaboration Dataset
Comment: Collaborative multi-agent task execution represents a fundamental challenge in assistive robotics, particularly when agents possess varying sensorimotor capabilities and robots have to augment human abilities. This multimodal dataset captures the complex dynamics of collaborative household tasks performed by multiple agents: humans with different abilities, pairs of humans, and human-robot teams. The dataset encompasses rich sensory modalities, including high-precision motion capture trajectories, RGB-D video streams, proprioceptive robot data, and wearable sensor measurements from IMUs and a novel Wearable Sensori-Motor Interface (WSMI). The recordings vary in task complexity, required dexterity levels, and the number of hands involved, providing a unique resource for understanding how collaborative strategies adapt to agent capabilities and task demands. Data collection occurred at two institutions (Karlsruhe Institute of Technology, KIT, and Fondazione Santa Lucia, FSL) using complementary sensing approaches, ensuring broad applicability for developing adaptive robotic assistance strategies. The data enables research into multi-agent coordination, capability-aware task planning, and human-robot collaboration.

The dataset, corresponding to Deliverable D4.3 of the HARIA project, consists of two parts:

- Motion capture recordings at KIT: Bi-manual, human-human interaction, and human-robot interaction variants of household tasks, such as handover, collaborative carrying, tray (un-)loading, pouring, and cutting. Motion capture recordings, RGB-D videos, hand shape trajectories recorded by data gloves, IMU measurements, and robot proprioception data are provided (a minimal loading sketch for the motion capture files follows the details below).

- Camera- and wearable sensor-based recordings at FSL: Human-robot interaction through a Wearable Sensori-Motor Interface (WSMI) in a pouring task. The available modalities span RGB video recordings, trajectory data of trackers worn by the subject, WSMI data, and robot proprioception data.
Release: Sep. 2025
Paper:
Bibtex:
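
For readers who want to get started with the motion capture part once access is granted: the database conventionally distributes raw marker trajectories as C3D files, and the sketch below shows how such a file could be read with the open-source ezc3d Python library. This is a minimal sketch under assumptions: the file name recording.c3d is a placeholder, and whether this dataset's protected files follow the C3D convention is not confirmed by this page.

    # Minimal sketch: reading one motion capture recording with the
    # open-source ezc3d library, assuming the recording is a C3D file
    # (the database's usual raw mocap format; not confirmed for this
    # dataset, whose files are access-protected).
    # Requires: pip install ezc3d
    import ezc3d

    # "recording.c3d" is a placeholder name, not an actual file from this dataset.
    c3d = ezc3d.c3d("recording.c3d")

    # Marker trajectories: array of shape (4, n_markers, n_frames);
    # rows 0-2 hold x/y/z positions, row 3 the homogeneous coordinate.
    points = c3d["data"]["points"]

    # Marker labels and capture rate from the C3D parameter section.
    labels = c3d["parameters"]["POINT"]["LABELS"]["value"]
    rate = c3d["parameters"]["POINT"]["RATE"]["value"][0]

    print(f"{points.shape[1]} markers, {points.shape[2]} frames at {rate} Hz")
    print("First markers:", labels[:5])

The other modalities (RGB-D video, IMU and data glove streams, robot proprioception, WSMI data) come in their own formats and are not covered by this sketch.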

Attached Files


8 protected files hidden.

