Welcome to choreography_dataset’s documentation!

The data presented in this page can be downloaded at https://flowers.inria.fr/choreography_database.html.

If you use this database in your experiments, please cite the following paper:
Mangin O., Oudeyer P.Y., Learning to recognize parallel combinations of human motion primitives with linguistic descriptions using non-negative matrix factorization. To appear in IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura, Algarve (Portugal).

Presentation

This database contains choreography motions recorded with a Kinect device. These motions have a combinatorial structure: from a given set of primitive dance motions, choreographies are constructed as the simultaneous execution of several of these primitive motions.

Primitive dance motions are chosen from a set of 47 motions (listed in the appendix) and each spans one or two limbs: either the legs (e.g. walk, squat), the left or right arm (e.g. wave hand, punch), or both arms (e.g. clap hands, paddle).

Complex choreographies are produced as the simultaneous demonstration of two or three of these primitive motions: either one for the legs and one for both arms, or one for the legs and one for each arm.

Each example (or record) contained in the dataset consists of two elements:
  • the motion data,
  • labels identifying which primitive motions are combined to produce the choreography.
Three separate sets of examples are included in this dataset:
  • primitive: in each example, only one primitive motion is demonstrated; the set of labels associated with each example is thus a singleton (326 examples).
  • mixed small: demonstrations of complex choreographies composed of primitive motions taken from a subset of 16 possible motions (137 examples).
  • mixed full: demonstrations of complex choreographies composed of primitive motions taken from the full set of motions (277 examples).

Description of the data

The data was acquired with a Kinect camera and the OpenNI drivers, which yield a stream of positions of markers on the body.

Each example from the dataset is associated with a sequence of 3D positions for each of the 24 markers. Thus, for a sequence of length T, the example corresponds to T*24*3 values.

The Kinect device recognizes the following markers:
head, neck, waist, left_hip, left_shoulder, left_elbow, left_hand, left_knee, left_foot, left_collar, left_wrist, left_fingertip, left_ankle, right_hip, right_shoulder, right_elbow, right_hand, right_knee, right_foot, right_collar, right_wrist, right_hand, right_fingertip, right_ankle
These markers are, however, not all tracked with the same accuracy, and it may be better to keep only a subset of them. For example, the following list is a good starting point (a code sketch for this filtering is given after the list):
head, neck, left_hip, left_shoulder, left_elbow, left_hand, left_knee, left_foot, right_hip, right_shoulder, right_elbow, right_hand, right_knee, right_foot
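
As an illustration, the short sketch below keeps only such a subset of markers, assuming an example has already been loaded as a (T, M, 3) numpy array together with the list of marker names (see the numpy format below); the function and variable names are only illustrative.

import numpy as np

# Subset of markers that is usually tracked reliably (from the list above).
KEPT_MARKERS = ['head', 'neck',
                'left_hip', 'left_shoulder', 'left_elbow', 'left_hand',
                'left_knee', 'left_foot',
                'right_hip', 'right_shoulder', 'right_elbow', 'right_hand',
                'right_knee', 'right_foot']


def filter_markers(example, marker_names, kept=KEPT_MARKERS):
    # example: (T, M, 3) array; marker_names: list of the M names, in the
    # same order as the second dimension of the array.
    indices = [marker_names.index(name) for name in kept]
    return example[:, indices, :]  # shape (T, len(kept), 3)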

Labels are provided as lists of identifiers: one identifier for the primitive set, two or three for the other sets.

A list of primitives and their descriptions can be found at the end of this document.

Format

The data is available in three formats:
  • text
  • numpy
  • Matlab

The text format

A set of examples consists of:
  • a json file describing metadata and labels,
  • a directory containing one text file for each example.

These are distributed in a compressed archive (tar.gz).
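
The archive can be extracted with any standard tool; as a minimal sketch, the Python standard library can also be used (the archive name below is only an example):

import tarfile

# Extract the json file and the data directory into the current directory.
with tarfile.open('mixed_full.tar.gz', 'r:gz') as archive:
    archive.extractall('.')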

An example of such a json file is given below; all sets share a similar structure.

{
    "marker-names": [
        "head",
        "neck",
        ...
    ],
    "data-dir": "mixed_partial_data",
    "name": "mixed_partial",
    "records": [
        {
            "data-id": 0,
            "labels": [
                20,
                26
            ]
        },
        {
            "data-id": 1,
            "labels": [
                19,
                28
            ]
        },
        ...
    ]
}
It contains the following data:
  • name: name of the set of examples,
  • marker-names: list of the marker names, in the same order as they appear in the data,
  • data-dir: path to the data directory,
  • records: list of records; each record contains a ‘data-id’ field and a ‘labels’ field containing the list of labels as integers.

For each record listed in the json file, there is a text file in the ‘data-dir’ directory whose name is the ‘data-id’ followed by a ‘.txt’ extension.

Each text file contains the sequence of positions of the markers. The set of values at a given time is given as one line of space-separated floating point numbers (formatted as ‘5.948645401000976562e+01’).

Each line contains 3 successive values for each marker, which are its 3D coordinates as provided by the OpenNI framework during capture. Thus each line contains 3*M values, where M is the number of markers.
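
As a sketch, the following Python code loads a set of examples from the text format; it relies only on the structure described above (one line of 3*M values per time step) and reshapes each file into a (T, M, 3) array. The function name is illustrative.

import os
import json

import numpy as np


def load_text_set(meta_path):
    # Load the metadata and labels from the json file.
    with open(meta_path, 'r') as meta_file:
        meta = json.load(meta_file)
    data_dir = os.path.join(os.path.dirname(meta_path), meta['data-dir'])
    n_markers = len(meta['marker-names'])
    data = []
    labels = []
    for record in meta['records']:
        # One text file per record, named after its 'data-id'.
        flat = np.loadtxt(os.path.join(data_dir, '%d.txt' % record['data-id']))
        data.append(flat.reshape((-1, n_markers, 3)))  # (T, M, 3)
        labels.append(record['labels'])
    return data, labels, meta['marker-names']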

The numpy format

In this format each set of examples is described by two files: a json file and a compressed numpy data file (.npz).

The json file is very similar to the one from the text format; the only difference is that the ‘data-dir’ element is replaced by a ‘data-file’ element containing the path to the data file.

The data file is a compressed numpy archive (.npz) storing one array per example; the name of each array is given by the ‘data-id’ element. Each data array (one for each record) has shape (T, M, 3), where T is the length of the example and M is the number of markers.

The following code can be used to load a set of examples in Python:

import os
import json

import numpy as np


FILE = 'path/to/mixed_full.json'


def load_numpy_set(meta_path):
    # Load the metadata and labels from the json file.
    with open(meta_path, 'r') as meta_file:
        meta = json.load(meta_file)
    path_to_data = os.path.join(os.path.dirname(meta_path), meta['data-file'])
    loaded_data = np.load(path_to_data)
    data = []
    labels = []
    for r in meta['records']:
        data.append(loaded_data[str(r['data-id'])])  # (T, M, 3) numpy array
        labels.append(r['labels'])  # list of labels as integers
    print("Loaded %d examples for ``%s`` set." % (len(data), meta['name']))
    print("Each data example is a (T, %d, 3) array." % len(meta['marker-names']))
    print("The second dimension corresponds to markers:")
    print("\t- %s" % '\n\t- '.join(meta['marker-names']))
    return (data, labels, meta['marker-names'])


data, labels, marker_names = load_numpy_set(FILE)

The Matlab format

In the Matlab format, a set of examples is described by a single ‘.mat’ file containing the following elements (a loading sketch is given after the list):
  • a ‘name’ variable (string) containing the name of the set of examples,
  • a ‘marker_names’ variable containing a list of marker names (strings),
  • a ‘data’ variable containing a list of data arrays (one for each record) of size (T, M, 3) where T is the length of the example and M the number of markers,
  • a ‘labels’ variable, which is a list of lists of labels (one list per example).
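
Such a file is naturally read from Matlab with ‘load’; as a hedged sketch, it can also be read from Python with scipy.io.loadmat, assuming scipy is installed (depending on how the cells were saved, some squeezing of singleton dimensions may be needed):

from scipy.io import loadmat

mat = loadmat('path/to/mixed_full.mat', squeeze_me=True)

name = str(mat['name'])               # name of the set of examples
marker_names = list(mat['marker_names'])
data = list(mat['data'])              # one (T, M, 3) array per example
labels = list(mat['labels'])          # one list of label ids per example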

Examples

The following examples represent choreographies taken from the mixed full set.

(Animated illustrations: _images/110.gif, _images/210.gif, _images/310.gif, _images/48.gif, _images/51.gif, _images/61.gif)

Contact

Appendix

List of primitive dance motions

A table with illustrations is presented on the primitive illustration page.

Id  Limb(s)    Description
1   right arm  hold horizontal
2   right arm  hold vertical (down)
3   right arm  hold vertical (up)
4   right arm  from horizontal on side, bend over the head
5   right arm  raise from horizontal to vertical
6   right arm  lower from horizontal to vertical
7   right arm  from horizontal side, bend in front of the torso
8   right arm  from horizontal side, bend elbow to get vertical forearm pointing up
9   right arm  mimic punching
10  right arm  hold horizontal and bring from side to front
11  right arm  from horizontal side, bend elbow to get vertical forearm pointing down
12  right arm  from horizontal side, bring hand to shoulder (elbow moving vertically)
13  right arm  hold horizontal and bring from right side to left side
14  right arm  swing forearm downward with horizontal upper arm
15  right arm  draw circles with arm extended on the right
16  right arm  wave motion of the arm held horizontal on the side
17  right arm  wave hand (shoulder level)
18  right arm  wave hand (over the head)
19  both arms  clap hands (at varying positions)
20  both arms  mimic paddling on the left
21  both arms  mimic paddling on the right
22  both arms  mimic pushing on ski sticks
23  legs       un-squat
24  legs       mimic walking
25  legs       stay still
26  legs       step on the right
27  legs       step on the left
28  right leg  raise and bend leg to form a flag (or P) shape
29  left leg   raise and bend leg to form a flag (or P) shape
30  left arm   hold horizontal
31  left arm   hold vertical (down)
32  left arm   hold vertical (up)
33  left arm   from horizontal on side, bend over the head
34  left arm   raise from horizontal to vertical
35  left arm   lower from horizontal to vertical
36  left arm   from horizontal side, bend in front of the torso
37  left arm   from horizontal side, bend elbow to get vertical forearm pointing up
38  left arm   mimic punching
39  left arm   hold horizontal and bring from side to front
40  left arm   from horizontal side, bend elbow to get vertical forearm pointing down
41  left arm   from horizontal side, bring hand to shoulder (elbow moving vertically)
42  left arm   hold horizontal and bring from left side to right side
43  left arm   swing forearm downward with horizontal upper arm
44  left arm   draw circles with arm extended on the left
45  left arm   wave motion of the arm held horizontal on the side
46  left arm   wave hand (shoulder level)
47  left arm   wave hand (over the head)
The mixed small dataset uses the following subset of labels (a filtering sketch follows the list):
1, 5, 6, 10, 19, 20, 21, 22, 23, 24, 25, 28, 30, 38, 40, 43
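
As an illustration, the following sketch keeps only the examples from a loaded set whose labels all belong to this subset (‘data’ and ‘labels’ are assumed to come from one of the loading snippets above):

SMALL_SUBSET = {1, 5, 6, 10, 19, 20, 21, 22, 23, 24, 25, 28, 30, 38, 40, 43}

kept = [(example, example_labels)
        for example, example_labels in zip(data, labels)
        if set(example_labels) <= SMALL_SUBSET]
print("%d examples use only labels from the small subset." % len(kept))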