
News

08.06.2016:

The KIT Whole-Body Human Motion Database now provides motion data from the CMU Graphics Lab Motion Capture Database as a subset. Please see the FAQ page for further details.

01.02.2016:

The Ice interface of the database now requires Ice encoding version 1.1 (available since Ice 3.5). Please be sure to update your configuration file and also use the latest Slice file (see FAQ page).

08.01.2016:

Starting today, downloading files from the KIT Whole-Body Human Motion Database using the webpage or the Ice interface requires an account. Registration is free and new accounts are active immediately.

05.01.2016:

Since the database now uses regular expressions instead of file extensions for file type detection, the labels of the file types returned by the Ice API have changed. Code that tests for predefined file types should be adapted.

12.11.2015:

All MMM motions in the database will be updated to the latest version of the MMM reference model (4 DoF in the feet) over the course of the next days. To work with these motions, please ensure that you are using the current version of the model, contained in MMMTools since 09.10.2015.

29.07.2015:

The KIT Whole-Body Human Motion Database has been presented at the International Conference on Advanced Robotics (ICAR) 2015. You can read the paper here.

Frequently Asked Questions

This page contains answers to some frequently (or less frequently) asked questions about the Motion Database. If your question is not answered here, please contact us.

Which types of data are offered by the KIT Whole-Body Human Motion Database?

The major types of data available in the database are:

  • MMM Motions (*.xml): Human motion represented on the Master Motor Map (MMM) reference model, a well-specified kinematic and dynamic model of the human body. For every timestep (100 Hz), the root position, root rotation and joint angle values of the reference model are given. These files also include the motion of environmental objects. For more information about the MMM framework, see the corresponding question below; a minimal parsing sketch follows after this list.
  • C3D Files (*.c3d): Raw recordings (100 Hz) from the Vicon motion capture system in the industry-standard C3D file format.
  • Video Files (*.avi): Complementary video recordings for the captured motions. Publicly available videos are anonymized, i.e. the audio track is removed and the subjects' faces are blurred.
  • Information about Subjects: Body height and weight, segment lengths according to the Anthropometric Data Table, gender, age.
  • Information about Objects: 3D models (Blender and Simox), images.

Depending on the type of motion capture experiment, there may be additional data available, e.g. measurements from force sensors or an inertial measurement unit.
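
As an illustration of the MMM motion format described above, the XML files can be read with standard XML tooling outside the MMM framework. The following Python sketch is based only on the format description given here; the element names (MotionFrame, Timestep, RootPosition, RootRotation, JointPosition) are assumptions, so please consult the MMM documentation for the authoritative format:

    # Minimal sketch for reading per-timestep data from an MMM XML motion file.
    # Element names are assumptions based on the format description above.
    import xml.etree.ElementTree as ET

    def read_mmm_frames(path):
        """Yield (timestep, root position, root rotation, joint angles)."""
        tree = ET.parse(path)
        for frame in tree.getroot().iter('MotionFrame'):
            t = float(frame.findtext('Timestep'))
            # Poses and joint angles are assumed to be whitespace-separated floats.
            root_pos = [float(v) for v in frame.findtext('RootPosition').split()]
            root_rot = [float(v) for v in frame.findtext('RootRotation').split()]
            joints = [float(v) for v in frame.findtext('JointPosition').split()]
            yield t, root_pos, root_rot, joints

    # At 100 Hz, consecutive timesteps should be 0.01 s apart.
    for t, pos, rot, q in read_mmm_frames('motion.xml'):
        print(t, len(q))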

Do I need an account to access the KIT Whole-Body Human Motion Database? How can I register?

Downloading files from the KIT Whole-Body Human Motion Database requires an account. Registration is free and only takes a few seconds. Browsing the database content to see what is available, including previews of video recordings, is possible without a login (some data may be hidden for privacy protection, e.g. non-anonymized video recordings and subject names).

To create an account, fill out the registration form. To change your account information, such as your e-mail address or password, you can edit your user profile after login.

Users who are already registered with the H²T project management system (Redmine), i.e. lab staff and students, can use that account for login. In this case, please contact us to have your Redmine account activated for the motion database.

Which methods exist to access the KIT Whole-Body Human Motion Database?

In addition to using this web interface, you can access the KIT Whole-Body Human Motion Database through an API and thus integrate the database into your own tools, e.g. for automatically processing motions from the database.

The API is based on the Internet Communications Engine (Ice), an object-oriented middleware framework. Ice is available for many popular programming languages such as C++, Java, Python and .NET, so the API can be used from any of them.

You can download an example client written in Python here. The example client establishes a connection to the database and demonstrates the usage of several API functions. To run the example client, you need to install Ice (on Ubuntu 14.04, you can install the zeroc-ice35 package) and download the Ice interface specification (Slice file) for our API here: MotionDatabase.ice
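
For orientation, a minimal Python sketch of the general Ice workflow follows. The proxy string and the proxy class name below are placeholders, not the actual values; please take those from the example client and the Slice file:

    # Sketch of an Ice client session; the proxy string and proxy class name
    # are placeholders, see the example client and MotionDatabase.ice for the
    # actual values.
    import sys
    import Ice

    # Generate Python bindings for the Slice definitions at runtime.
    Ice.loadSlice('MotionDatabase.ice')
    import MotionDatabase  # module generated from the Slice file above

    ic = Ice.initialize(sys.argv)
    try:
        base = ic.stringToProxy('MotionDatabase:tcp -h <host> -p <port>')
        db = MotionDatabase.MotionDatabasePrx.checkedCast(base)  # hypothetical proxy class
        if db is None:
            raise RuntimeError('Invalid proxy: wrong interface or server unreachable')
        # API functions defined in the Slice file can now be called on db.
    finally:
        ic.destroy()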

If you need assistance in using the Ice interface, feel free to contact us.

How is access to the content in the KIT Whole-Body Human Motion Database controlled?

In general, we aim to make content freely available to the whole scientific community. Some files, however, need to be protected for certain reasons, e.g. video recordings of motions that have not yet been anonymized and would allow the identification of human subjects.

On a more technical level, users in the KIT Whole-Body Human Motion Database are associated with a number of user groups. These groups determine which protected files the user can access and which database entries they can edit. When logged in, you can see the groups your account is assigned to on the user profile page.

For every file uploaded to a database entry (motion, subject, or object), the uploader can select whether the file is public or protected. For every database entry, groups can be selected for two different levels of access:

  • Read protected groups: Users in one of the "read protected groups" can download files marked as protected that are associated with this database entry.
  • Write groups: Users in one of the "write groups" can alter the database entry: they can edit it (including the assigned groups), delete it, and upload, edit or delete associated files. Users in one of the "write groups" can also download protected files, just like users in the "read protected groups".

In addition to the group-based permission system, the user that created a database entry always retains full read and write access.
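
The rules above amount to the following checks, sketched here with hypothetical, simplified data structures (sets of group names) that do not mirror the actual database schema:

    # Hypothetical simplification of the access rules described above.
    def can_read_protected(user, entry):
        # Creator, read protected groups and write groups may download
        # files marked as protected.
        return bool(user['name'] == entry['creator']
                    or user['groups'] & entry['read_protected_groups']
                    or user['groups'] & entry['write_groups'])

    def can_write(user, entry):
        # Creator and write groups may edit or delete the entry and
        # upload/edit/delete its associated files.
        return bool(user['name'] == entry['creator']
                    or user['groups'] & entry['write_groups'])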

How can I cite the KIT Whole-Body Human Motion Database?

If you are using the KIT Whole-Body Human Motion Database in work that leads to a publication, we kindly ask you to cite the following paper:

C. Mandery, Ö. Terlemez, M. Do, N. Vahrenkamp and T. Asfour, “The KIT Whole-Body Human Motion Database”, International Conference on Advanced Robotics (ICAR), pp. 329 - 336, 2015 [BibTeX] [PDF]

What is the Master Motor Map (MMM) framework and where can I find its code and documentation?

Master Motor Map (MMM) is a conceptual framework for the perception, visualization, reproduction, and recognition of human motion. It decouples motion capture data from further post-processing tasks, such as execution on a humanoid robot. The MMM framework has been developed in our lab at KIT and is freely available on GitLab under the GNU General Public License (see below).

In addition to raw C3D motion capture data (which can be used without MMM by all kinds of motion processing tools), the KIT Whole-Body Human Motion Database also provides the motions converted to the MMM reference model in the XML-based MMM motion format.

MMM consists of two packages:

  • MMMCore contains the data structures, kinematic models and code for reading and writing motion data.
  • MMMTools contains tools for the visualization, reproduction and recognition of motion, e.g. the converters used to transfer raw motion capture recordings to the MMM reference model.

The documentation can be found at mmm.humanoids.kit.edu and a discussion of the core ideas and principles of MMM is provided in the following paper:

Ö. Terlemez, S. Ulbrich, C. Mandery, M. Do, N. Vahrenkamp and T. Asfour, “Master Motor Map (MMM) - Framework and Toolkit for Capturing, Representing, and Reproducing Human Motion on Humanoid Robots”, IEEE/RAS International Conference on Humanoid Robots (Humanoids), pp. 894 - 901, 2014 [BibTeX] [PDF]

What is the Motion Description Tree?

The Motion Description Tree (MDT) consists of a hierarchical structure of tags that can be used to describe human motion. Every motion entry in the database can be assigned one or more of the "motion descriptions" available in the MDT.

When filtering for a motion description using the filter panel in the right bar, only motions contained in one of the selected subtrees are considered. Additionally, the "Advanced MDT search term" filter can be used to construct more complex search queries (see next question).
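
One way to picture the subtree matching: if every motion description is identified by its path in the tree, a motion matches a selection whenever one of its tags lies in a selected subtree. A hypothetical Python illustration (the database does not actually expose tags as such paths):

    # Hypothetical illustration of subtree matching in the MDT.
    def in_subtree(tag_path, subtree_path):
        return tag_path == subtree_path or tag_path.startswith(subtree_path + '/')

    motion_tags = ['locomotion/walk/forward', 'speed/slow']  # invented example tags
    selected = ['locomotion/walk']  # selection in the filter panel
    print(any(in_subtree(t, s) for t in motion_tags for s in selected))  # True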

How does the "Advanced MDT search term" filter work?

The "Advanced MDT search term" filter allows to filter for motions based on their classification in the Motion Description Tree. If a search term is provided, the simpler "Motion descriptions" filter is ignored.

A search term consists of queries chained together with the logical operators "x AND y", "x OR y" and "NOT(x)". Search terms can be of (almost) arbitrary length.

Examples:

  • "run AND forward": Returns all running motions directed forwards.
  • "carry AND drop": Returns all motions where an object is carried and dropped (a specific object may also be included in the search by using the object filter).
  • "run OR (walk AND NOT(slow))": Returns all motions where the subject is running or walking, but not slow.

Motion description tags that contain spaces must be written within quotation marks when used within a search term (e.g.: "hand stand").
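
The operator semantics can be illustrated with a small evaluator. This is not the database's implementation, just a sketch that evaluates a search term against the set of tags assigned to a single motion:

    # Illustrative evaluator for search terms such as
    # 'run OR (walk AND NOT(slow))'; not the database's implementation.
    import re

    def tokenize(term):
        # Quoted tags (e.g. "hand stand"), parentheses, operators, plain tags.
        return re.findall(r'"[^"]*"|\(|\)|[^\s()]+', term)

    def evaluate(term, tags):
        tokens = tokenize(term)
        pos = 0

        def peek():
            return tokens[pos] if pos < len(tokens) else None

        def expr():  # expr := conj (OR conj)*
            nonlocal pos
            value = conj()
            while peek() == 'OR':
                pos += 1
                value = conj() or value
            return value

        def conj():  # conj := atom (AND atom)*
            nonlocal pos
            value = atom()
            while peek() == 'AND':
                pos += 1
                value = atom() and value
            return value

        def atom():  # atom := NOT ( expr ) | ( expr ) | tag
            nonlocal pos
            token = tokens[pos]
            pos += 1
            if token == 'NOT':
                pos += 1            # skip '('
                value = not expr()
                pos += 1            # skip ')'
                return value
            if token == '(':
                value = expr()
                pos += 1            # skip ')'
                return value
            return token.strip('"') in tags

        return expr()

    print(evaluate('run OR (walk AND NOT(slow))', {'walk', 'forward'}))  # True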

Why do some videos not offer a preview video on the webpage?

Video previews shown in the motion list and on the motion detail page use the excellent VP8 codec from Google in a WebM container. They should work in all major browsers (Firefox, Chrome/Chromium, Opera), but not in Internet Explorer.

Preview videos are subject to the same access restrictions as their corresponding video files. Therefore, if a video file is not accessible to you (e.g. because you are not logged in and the video has not yet been anonymized), you will not be able to see its preview either. Additionally, preview videos are generated once daily, which is why they are not yet shown for very recently uploaded videos. Of course, you can still download the video file to inspect its content in this case.

How is data from the CMU Graphics Lab Motion Capture Database contained within the KIT Whole-Body Human Motion Database?

Starting in June 2016, we have integrated motion recordings from the CMU Graphics Lab Motion Capture Database as a subset into our motion database. These motions can be found by filtering the list of motions for the "Carnegie Mellon University (CMU)" institution. The motion recordings are provided as C3D files and as the corresponding MMM representations (see above).

However, some important limitations and differences from the rest of our data should be noted when working with these recordings:

  • The CMU data does not contain information about objects with which the human subject is interacting.
  • The CMU recordings use a slightly different marker set, which is described here (in contrast to our KIT reference marker set here).
  • Data imported from the CMU database is not labeled according to our Motion Description Tree and only contains the imported free-text description (this may change in the future).
  • The date of the recordings is not available and has been set arbitrarily to 2010-01-01 in our database.
  • The CMU data does not provide information about subjects. Therefore, a separate "dummy subject" has been created in our database for every motion experiment. These "dummy subjects" do not contain anthropometric measurements, and the subject height is estimated based only on the head markers in the initial pose.
  • Some recordings are missing some of the markers defined in the CMU marker set and have therefore been skipped.

Acknowledgments: Motion data from the CMU Graphics Lab Motion Capture Database was obtained from mocap.cs.cmu.edu. This database was created with funding from NSF EIA-0196217.