UMAHand: Hand Activity Dataset (Universidad de Málaga)
Files
Description: Text file with the description of the dataset.
Description: Dataset containing the samples.
Abstract
The objective of the UMAHand dataset is to provide a systematic, Internet-accessible benchmarking database for evaluating algorithms for the automatic identification of manual activities. The database was created by monitoring 29 predefined activities involving specific movements of the dominant hand. These activities were performed by 25 participants, each completing a certain number of repetitions. During each movement, participants wore a sensing 'mote' (a Shimmer sensor device) on the wrist of their dominant hand. This sensor, comparable in weight and volume to a wristwatch, was attached with an elastic band in a predetermined orientation.
The Shimmer device contains an Inertial Measurement Unit (IMU) with a triaxial accelerometer, gyroscope, magnetometer, and barometer. These sensors recorded measurements of acceleration, angular velocity, magnetic field, and atmospheric pressure at a constant sampling frequency of 100 Hz during each movement.
Description
The UMAHand Dataset comprises a main directory and three subdirectories: TRACES (containing measurements), VIDEOS (containing video sequences) and SCRIPTS (with two scripts that automate the downloading, unzipping and processing of the dataset). The main directory also includes three descriptive plain text files and an image:
• "readme.txt": this file is a brief guide of the dataset which describes the basic characteristics of the database, the testbed or experimental framework used to generate it and the organization of the data file.
• "user_characteristics.txt": which contains a line of six numerical (comma-separated) values for each participant describing their personal characteristics in the following order: 1) an abstract user identifier (a number from 01 to 25), 2) a binary value indicating whether the participant is left-handed (0) or right-handed (1), 3) a numerical value indicating gender: male (0), female (1), undefined or undisclosed (2), 4) the weight in kg, and 5) the height in cm and 6) the age in years.
• "activity_description.txt": For each activity, this text file incorporates a line with the activity identifier (numbered from 01 to 29) and an alphanumeric string that briefly describes the performed action.
• "sensor_orientation.jpg": a JPEG-type image file illustrating the way the sensor is carried and the orientation of the measurement axes.
The TRACES subfolder with the data is, in turn, organized into 25 secondary subfolders, one for each participant, named with the word "output" followed by an underscore (_) and the corresponding participant identifier (a number from 1 to 25). Each subdirectory contains one CSV (Comma Separated Values) file for each trial (each repetition of any activity) performed by the corresponding volunteer.
The data filenames follow the format "user_XX_activity_YY_trial_ZZ.csv", where XX, YY, and ZZ represent the identifiers of the participant (XX), the activity (YY) and the repetition number (ZZ), respectively.
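For instance, these three identifiers can be recovered from a filename with a simple regular expression; a minimal Python sketch (the helper name is illustrative):

    import re

    # Pattern for trace filenames: user_XX_activity_YY_trial_ZZ.csv
    TRACE_NAME = re.compile(r"user_(\d+)_activity_(\d+)_trial_(\d+)\.csv")

    def parse_trace_name(filename):
        """Return (user, activity, trial) identifiers extracted from a trace filename."""
        match = TRACE_NAME.fullmatch(filename)
        if match is None:
            raise ValueError(f"unexpected filename: {filename}")
        return tuple(int(g) for g in match.groups())

    print(parse_trace_name("user_03_activity_12_trial_05.csv"))  # (3, 12, 5)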
The files do not include any header; each line corresponds to one sample taken by the sensing node, i.e. the set of simultaneous measurements captured by the sensors of the Shimmer mote at a given instant (see the loading sketch after this list). The values in each line are arranged as follows:
•Timestamp, Ax, Ay, Az, Gx, Gy, Gz, Mx, My, Mz, P
where:
-Timestamp is the instant at which the measurements in that line were taken, expressed in milliseconds elapsed since the start of the recording. Therefore, the first sample, in the first line of the file, has a timestamp of zero, while the rest of the timestamps in the file are relative to this first sample.
-Ax, Ay, Az are the measurements of the three axes of the triaxial accelerometer (in g units).
-Gx, Gy, Gz indicate the components of the angular velocity measured by the triaxial gyroscope (in degrees per second or dps).
-Mx, My, Mz represent the 3-axis data in microteslas (µT) captured by the magnetometer.
-P is the measurement of pressure in millibars.
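Given this layout, a single trace can be loaded as an 11-column numeric array, for example with NumPy; the path below is only an illustrative example, and the column slicing follows the order listed above:

    import numpy as np

    # Headerless CSV: one 100 Hz sample per row, 11 columns as described above.
    trace = np.loadtxt("UMAHand_Dataset/TRACES/output_1/user_01_activity_01_trial_01.csv",
                       delimiter=",")

    timestamps_ms = trace[:, 0]      # milliseconds since the first sample
    acceleration = trace[:, 1:4]     # Ax, Ay, Az (g)
    angular_rate = trace[:, 4:7]     # Gx, Gy, Gz (dps)
    magnetic_field = trace[:, 7:10]  # Mx, My, Mz (µT)
    pressure = trace[:, 10]          # P (millibars)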
In addition, the VIDEOS directory includes 29 anonymized video clips, one per activity, illustrating the 29 manual activities carried out by the participants. The video files are encoded in MPEG4 format and named according to the format "Example_Activity_XX.mp4", where XX indicates the identifier of the movement (as described in the activity_description.txt file).
Finally, the SCRIPTS subfolder contains two scripts, one written in Python and one in Matlab. These two programs (both named Load_traces) perform the same function and are designed to automate the downloading and processing of the data. Specifically, these scripts perform the following tasks:
1. Download the database from the public repository as a single compressed zip file.
2. Unzip the aforementioned file and create the subfolder structure of the dataset in a specific directory named UMAHand_Dataset. As noted above, in the subfolder named TRACES, one CSV trace file per experiment (i.e. per movement, user, and trial) is created.
3. Read all the CSV files and store their information in a list of dictionaries (Python) or an array of structures (Matlab) named datasetTraces. Each element in that list/array has two fields: the filename (which identifies the user, the type of performed activity, and the trial number) and a numerical array of 11 columns containing the timestamps and the measurements of the sensors for that experiment (arranged as mentioned above). An approximate Python equivalent of this structure is sketched after this list.
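A minimal sketch of that third step only, assuming the dataset has already been downloaded and unzipped into a local UMAHand_Dataset folder (this is not a reproduction of the distributed Load_traces scripts):

    import glob
    import os
    import numpy as np

    # Build a list of dictionaries analogous to datasetTraces: each entry holds
    # the trace filename and its 11-column measurement array.
    datasetTraces = []
    for path in sorted(glob.glob("UMAHand_Dataset/TRACES/output_*/user_*.csv")):
        datasetTraces.append({
            "filename": os.path.basename(path),       # encodes user, activity and trial
            "data": np.loadtxt(path, delimiter=","),  # timestamps + sensor measurements
        })

    print(len(datasetTraces), "traces loaded")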
All experiments and data acquisition were conducted in private home environments. Participants were asked to perform activities involving sustained or continuous hand movements (e.g. clapping hands) for at least 10 seconds. In the case of brief, one-off movements that may require less than 10 seconds (e.g. picking up an object from the floor), volunteers were simply asked to execute the action until its conclusion. In total, 752 traces were collected, with durations ranging from 1.98 to 119.98 seconds.
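Because timestamps are relative to the first sample and expressed in milliseconds, the duration of each trace can be checked directly from its last timestamp; a small sketch building on the datasetTraces list above:

    # Duration of each trace in seconds: the last timestamp (in ms) gives its length.
    durations = [entry["data"][-1, 0] / 1000.0 for entry in datasetTraces]
    print(f"shortest trace: {min(durations):.2f} s, longest trace: {max(durations):.2f} s")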
Creative Commons license
Except where otherwise noted, this item's license is described as Attribution-NonCommercial 4.0 International.










