Abstract Details

Abstract Title

Human Action Recognition Study with Motion Capture System Based on Inertial Sensors

Abstract Theme

Technology in sports

Presentation Type

Oral presentation

Abstract Authors

Presenter Jing Chen - Peking University (Kinesiology laboratory, Institute of sports science) - CN
Xiaoqing Fu - Peking University (Kinesiology laboratory, Institute of sports science) - CN
Ning Li - Peking University (Department of physical education) - CN
Yu Huang - Peking University (Department of physical education) - CN
Lei Che - Peking University (Department of physical education) - CN
Fuquan Lu - Peking University (Kinesiology laboratory, Institute of sports science) - CN

Presentation Details

Room: Mercúrio        Date: 3 September        Time: 10:20:00        Presenter: Jing Chen

Abstract Summary

Background:
Human identity recognition is a very active research field. Not only walking style but also other activities can provide identity information. The purpose of this study was to examine whether persons
performing the same actions can be recognized by analyzing data captured by inertial sensors.

Methods:
Nine active volunteer aerobics athletes performed a set of three aerobic actions at the same rhythm of an 8/8 beat while wearing the Noitom Legacy motion capture system, which contains 17
inertial sensors and wireless motion capture analysis software. We chose leg-raising, jumping with overhead clapping, and arm-stretching as the set of actions because they represent movements of the
lower body, the upper body, and the whole body, respectively. Each action was repeated three times to capture each subject's individual motion characteristics.

Results:
Displacement and orientation data of 21 body segments, captured by the inertial sensors in Biovision Hierarchy (BVH) format, were collected. K-means clustering was applied to these BVH data in MATLAB
to identify the most representative orientation angles and joint positions. The F-values for leg-raising, jumping with overhead clapping, and arm-stretching were 0.90, 0.91, and 0.90, respectively,
indicating good accuracy of person identity recognition for all three actions.
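The abstract does not describe its implementation, so the following Python sketch is only an illustration of the general pipeline it reports: per-repetition orientation features derived from the BVH stream are grouped with k-means into one cluster per subject, and an F-measure is computed against the true subject labels. The feature layout, the Hungarian cluster-to-subject matching, and the synthetic example data are assumptions for illustration, not the authors' MATLAB code.

```python
# Illustrative sketch only -- the abstract does not publish its code.
# Assumed preprocessing: each repetition of one action has already been
# reduced to a fixed-length feature vector of segment orientation angles.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import f1_score
from scipy.optimize import linear_sum_assignment

def identity_f_measure(features, subject_labels, n_subjects=9, seed=0):
    """Cluster repetitions of one action and score cluster-subject agreement.

    features: (n_repetitions, n_features) array of orientation / position
              features extracted from the BVH data (assumed layout).
    subject_labels: (n_repetitions,) integer subject IDs.
    """
    km = KMeans(n_clusters=n_subjects, n_init=10, random_state=seed)
    cluster_ids = km.fit_predict(features)

    # Match clusters to subjects with the Hungarian algorithm so the
    # F-measure is computed against the best cluster-to-subject mapping.
    cost = np.zeros((n_subjects, n_subjects))
    for c in range(n_subjects):
        for s in range(n_subjects):
            cost[c, s] = -np.sum((cluster_ids == c) & (subject_labels == s))
    rows, cols = linear_sum_assignment(cost)
    mapping = dict(zip(rows, cols))
    predicted = np.array([mapping[c] for c in cluster_ids])

    return f1_score(subject_labels, predicted, average="macro")

# Toy example: 9 subjects x 3 repetitions of one action,
# 21 segments x 3 orientation angles = 63 synthetic features.
rng = np.random.default_rng(0)
labels = np.repeat(np.arange(9), 3)
X = rng.normal(size=(27, 63)) + labels[:, None]
print(f"F-measure for this action: {identity_f_measure(X, labels):.2f}")
```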

Conclusions:
The experimental analysis highlights that the way humans perform various actions can be used for identity recognition. For all three actions, without exception, the best values came from the upper
segments of the body, suggesting that upper-body movements could serve as one of the most sensitive non-invasive biometric identifiers besides gait. Verification and identification applications
based on these inertial sensors still need to be validated.
