Human identity recognition is a very active research field. Not only walking style but also other activities can provide identity information. The purpose of this study was to examine whether persons
performing the same actions can be recognized by analyzing data captured by inertial sensors.
Nine active volunteer aerobics athletes performed a set of three aerobic actions at the same rhythm of an 8/8 beat while wearing the motion capture system Noitom Legacy, which contains 17
inertial sensors and wireless motion capture analysis software. We chose leg-raising, jumping with clapping overhead, and arm-stretching as the set of actions because they represent movements of the
body's lower part, upper part, and the whole body, respectively. Each action was repeated three times to capture each subject's individual motion characteristics.
Displacement and orientation data of 21 body segments in Biovision Hierarchy (BVH) format captured by the inertial sensors were collected. K-means clustering was applied to these BVH data in MATLAB to identify the
most representative orientation angles or joint positions. The F-values for the three actions were 0.90, 0.91, and 0.90 for leg-raising, jumping with clapping overhead, and
arm-stretching, respectively, indicating good accuracy in person identification for all three actions.
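The clustering step can be sketched as follows. This is a minimal illustration in Python/NumPy rather than MATLAB; the feature layout (each row a repetition of an action, each column an orientation-angle feature) and the synthetic data are assumptions for demonstration only, not the study's actual features.

```python
import numpy as np

def kmeans(X, k, init=None, iters=100, seed=0):
    """Minimal K-means (Lloyd's algorithm).

    X: (n_samples, n_features) array of motion features.
    Returns per-sample cluster labels and final centroids.
    """
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)] if init is None else init.copy()
    for _ in range(iters):
        # Distance from every sample to every centroid, then nearest assignment.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute centroids; keep the old one if a cluster emptied out.
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Hypothetical example: 3 subjects x 3 repetitions, 4 orientation-angle
# features each. Synthetic, well-separated per-subject clusters stand in
# for real BVH segment orientations.
rng = np.random.default_rng(1)
subjects = 3
X = np.vstack([rng.normal(loc=10 * s, scale=0.5, size=(3, 4))
               for s in range(subjects)])
# Seed one centroid per subject so the illustration converges cleanly.
labels, _ = kmeans(X, k=subjects, init=X[[0, 3, 6]])
```

In this sketch, repetitions by the same subject fall into the same cluster, mirroring how clustering the orientation features groups recordings by individual.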
Experimental analysis highlights the fact that the way humans perform various actions can be used for identity recognition. For every action, the three best calculated values came from
upper body segments without exception, suggesting that upper-body movements may be among the most sensitive non-invasive biometric identifiers besides gait, although
verification and identification applications using these inertial sensors still require further validation.