The Ice interface of the database now requires Ice encoding version 1.1 (available since Ice 3.5). Please be sure to update your configuration file and also use the latest Slice file (see FAQ page).
Starting today, downloading files from the KIT Whole-Body Human Motion Database using the webpage or the Ice interface requires an account. Registration is free and new accounts are active immediately.
Since the database now uses regular expressions instead of file extensions for file type detection, the labels of the file types returned by the Ice API have changed. Code that tests for predefined file types should be adapted.
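The shift from extension matching to regular expressions can be illustrated with a small sketch. Note that the patterns and type labels below are hypothetical examples for illustration only; the actual labels returned by the Ice API are defined by the database itself.

```python
import re

# Hypothetical patterns and labels -- the database's real ones may differ.
FILE_TYPE_PATTERNS = {
    "Vicon C3D File": re.compile(r".*\.c3d$", re.IGNORECASE),
    "MMM Motion File": re.compile(r".*\.xml$", re.IGNORECASE),
    "Video File": re.compile(r".*\.(avi|mp4|webm)$", re.IGNORECASE),
}

def detect_file_type(filename):
    """Return the label of the first pattern that matches the filename."""
    for label, pattern in FILE_TYPE_PATTERNS.items():
        if pattern.match(filename):
            return label
    return "Unknown"

print(detect_file_type("walking01.c3d"))  # -> Vicon C3D File
```

Because matching is now done against the returned labels rather than raw extensions, client code should compare against the label strings delivered by the API instead of hard-coding extensions.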
All MMM motions in the database will be updated to the latest version of the MMM reference model (4 feet DoF) over the course of the next few days. To work with these motions, please ensure that you are using the current version of the model, which has been contained in MMMTools since 09.10.2015.
The KIT Whole-Body Human Motion Database has been presented at the International Conference on Advanced Robotics (ICAR) 2015. You can read the paper here.
This page contains answers to some frequently (or less frequently) asked questions about the Motion Database. If your question is not answered here, please contact us.
The major types of data available in the database are:
Depending on the type of motion capture experiment, there may be additional data available, e.g. measurements from force sensors or an inertial measurement unit.
Downloading files from the KIT Whole-Body Human Motion Database requires an account. Registration is free and only takes a few seconds. Browsing the database content to see what is available, including previews of video recordings, is possible without a login (some data may be hidden for privacy protection, e.g. non-anonymized video recordings or subject names).
Users who are already registered with the H²T project management system (Redmine), i.e. lab staff and students, can use that account to log in. In this case, please contact us to have your Redmine account activated.
In addition to this web interface, you can also access the KIT Whole-Body Human Motion Database through an API and thus integrate it into your own tools, e.g. for automatically processing motions from the database.
The API is based on the Internet Communications Engine (Ice), an object-oriented middleware framework. Ice is available for many popular programming languages, such as C++, Java, Python and .NET, so the API can be used from any of them.
You can download an example client written in Python here. The example client establishes a connection to the database and demonstrates the usage of several API functions. To run the example client, you need to install Ice (in Ubuntu 14.04, you can install the zeroc-ice35 package) and download the Ice interface specification (Slice file) for our API here: MotionDatabase.ice
If you need assistance in using the Ice interface, feel free to contact us.
In general, we aim to make content freely available to the whole scientific community. Some files, however, need to be protected for certain reasons, e.g. video recordings of motions that are not yet anonymized and would allow the identification of human subjects.
On a more technical level, users in the KIT Whole-Body Human Motion Database are associated with a number of user groups. These groups determine which protected files the user can access and which database entries they can edit. When logged in, you can see the groups your account is assigned to on the user profile page.
For every file uploaded to a database entry (motion, subject, or object), the uploader can select whether the file is public or protected. For every database entry, groups can be selected for two different levels of access:
In addition to the group-based permission system, the user that created a database entry always retains full read and write access.
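The access rules described above can be modeled with a short sketch. The class and attribute names here are hypothetical and purely illustrative; they are not the database's actual schema.

```python
# Illustrative model of the group-based permission system: a user may access
# a protected entry if they share a group with it, and the creator always
# retains full read and write access. Names are hypothetical.
class DatabaseEntry:
    def __init__(self, creator, read_groups, write_groups):
        self.creator = creator
        self.read_groups = set(read_groups)
        self.write_groups = set(write_groups)

    def can_read(self, user, user_groups):
        # Creator override, then group intersection.
        return user == self.creator or bool(self.read_groups & set(user_groups))

    def can_write(self, user, user_groups):
        return user == self.creator or bool(self.write_groups & set(user_groups))

entry = DatabaseEntry("alice", read_groups={"lab-staff"}, write_groups={"editors"})
print(entry.can_read("bob", {"lab-staff"}))   # -> True
print(entry.can_write("bob", {"lab-staff"}))  # -> False
print(entry.can_write("alice", set()))        # -> True (creator)
```

The groups your own account belongs to, and hence which protected files you can see, are listed on your user profile page when logged in.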
If you are using the KIT Whole-Body Human Motion Database in work that leads to a publication, we kindly ask you to cite the following paper:
Master Motor Map (MMM) is a conceptual framework for perception, visualization, reproduction, and recognition of human motion in order to decouple motion capture data from further post-processing tasks, such as execution on a humanoid robot. The MMM framework has been developed in our lab at KIT and is freely available on GitLab under the GNU General Public License (see next question).
In addition to raw C3D motion capture data (which can be used without MMM by all kinds of motion processing tools), the KIT Whole-Body Human Motion Database also provides the motions converted to the MMM reference model in the XML-based MMM motion format.
MMM consists of two packages:
The documentation can be found at mmm.humanoids.kit.edu and a discussion of the core ideas and principles of MMM is provided in the following paper:
Ö. Terlemez, S. Ulbrich, C. Mandery, M. Do, N. Vahrenkamp and T. Asfour, “Master Motor Map (MMM) - Framework and Toolkit for Capturing, Representing, and Reproducing Human Motion on Humanoid Robots”, IEEE/RAS International Conference on Humanoid Robots (Humanoids), pp. 894 - 901, 2014 [BibTeX] [PDF]
The Motion Description Tree (MDT) consists of a hierarchical structure of tags that can be used to describe human motion. Every motion entry in the database can be assigned one or more of the "motion descriptions" available in the MDT.
When filtering for a motion description using the filter panel in the right bar, only motions contained in one of the selected subtrees are considered. Additionally, the "Advanced MDT search term" filter can be used to construct more complex search queries (see next question).
The "Advanced MDT search term" filter allows filtering motions based on their classification in the Motion Description Tree. If a search term is provided, the simpler "Motion descriptions" filter is ignored.
Search terms consist of queries chained with the logical operators "x AND y", "x OR y" and "NOT(x)", and can be of (almost) arbitrary length.
Motion description tags that contain spaces must be written within quotation marks when used within a search term (e.g.: "hand stand").
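A minimal evaluator for such search terms can be sketched as follows. This is only an illustration of the query semantics described above, assuming a flat tag set per motion; the database's actual parser (including subtree matching in the MDT) may behave differently.

```python
import re

def evaluate_search_term(term, motion_tags):
    """Evaluate an MDT-style search term against a motion's set of tags.

    Supports AND, OR, NOT(...), parentheses, and quoted multi-word tags
    such as "hand stand". Illustrative sketch only.
    """
    # Tokenize into quoted tags, parentheses, and bare words/operators.
    tokens = re.findall(r'"[^"]*"|\(|\)|\w+', term)

    def substitute(token):
        operators = {"AND": "and", "OR": "or", "NOT": "not", "(": "(", ")": ")"}
        if token in operators:
            return operators[token]
        # A tag: test membership and replace with a boolean literal.
        return str(token.strip('"') in motion_tags)

    expression = " ".join(substitute(t) for t in tokens)
    # Safe here: the expression contains only True/False/and/or/not/parentheses.
    return eval(expression)

print(evaluate_search_term('walk AND NOT("hand stand")', {"walk", "run"}))  # -> True
```

Note how the quoted tag "hand stand" survives tokenization as a single unit, which is exactly why multi-word tags must be quoted in a search term.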
Video previews that are shown on the motion list and the motion detail page use the excellent VP8 codec from Google in a WebM container. They should work in any major browser (Firefox, Chrome/Chromium, Opera) except Internet Explorer.
Preview videos are subject to the same access restrictions as their corresponding video files. Therefore, if a video file is not accessible to you (e.g. because you are not logged in and the video is not yet properly anonymized), you will not be able to see its preview either. Additionally, preview videos are generated once daily, which is why they are not shown for very recently uploaded videos. Of course, you can still download such a video file to inspect its content.
Starting in June 2016, we have integrated motion recordings from the CMU Graphics Lab Motion Capture Database as a subset into our motion database. These motions can be found by filtering the list of motions for the "Carnegie Mellon University (CMU)" institution. The motion recordings are provided as C3D files and as the corresponding MMM representations (see above).
However, some important limitations and differences from the rest of our data should be noted when working with these recordings:
Acknowledgments: Motion data from the CMU Graphics Lab Motion Capture Database was obtained from mocap.cs.cmu.edu. This database was created with funding from NSF EIA-0196217.