TuA1.1P IEEE RO-MAN: The 22nd IEEE International Symposium on Robot and Human Interactive Communication Gyeongju, Korea, August 26-29, 2013


TuA1.1P.25 978-1-4799-0509-6/13/$31.00 ©2013 IEEE

rates above 80% are achieved when identifying one out of 30-36 subjects [9,10,14,15]. Most approaches segment the continuous data into single strides and calculate the similarity between the test sample and the training templates assembled in the gait gallery. Only a few approaches avoid segmentation into single strides, e.g., [9,16]. The closest match between a test gait cycle and a template in the gait gallery is found by a similarity score applied to pre-processed measures of the IMU [4] or by dynamic time warping [5,7]. In [17], descriptive statistics of each gait cycle are calculated as features, and a decision tree or neural network is used for classification. [10] calculates the correlation between the test cycle and the template separately for the left and the right step. They segment the data into single steps but do not recognize whether a step is a left or a right one. During classification, they correlate the templates of the left and the right step with both steps of the test cycle; the maximum correlation describes the closeness to a template of the gallery. However, they do not consider the correlation between the left step and the right step of the same stride. We investigate whether the asymmetry in walking can be used to identify a person. Therefore, we introduce a feature which correlates the measures of the left step with the measures of the right step of the same stride.

Related work investigates the following conditions which may affect recognition accuracy: different footwear [30], walking over several days [5,15], and carrying a backpack [8]. Different sensor placement between trials is included when recording over several days [5,11,15] or is considered explicitly for recording in one session [8]. This aspect is important because small variations in the position of the sensor on the body cannot be avoided. We also include slight variations of the sensor placement when recording our dataset.
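The proposed feature correlates the left step with the right step of the same stride. A minimal sketch of such a step-correlation feature follows; the function name, the fixed resampling length, and the use of Pearson correlation are our assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def step_correlation(left_step: np.ndarray, right_step: np.ndarray) -> float:
    """Correlate the left step with the right step of the same stride.

    The two steps generally span different numbers of samples, so both
    are resampled onto a common time base before computing the Pearson
    correlation coefficient. A value near 1 indicates a symmetric gait.
    """
    n = 100  # common length after resampling (an arbitrary choice)
    t_left = np.linspace(0.0, 1.0, len(left_step))
    t_right = np.linspace(0.0, 1.0, len(right_step))
    t_common = np.linspace(0.0, 1.0, n)
    left = np.interp(t_common, t_left, left_step)
    right = np.interp(t_common, t_right, right_step)
    return float(np.corrcoef(left, right)[0, 1])
```

Applied per sensor channel, this yields one asymmetry value per stride that can be appended to the feature vector.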
Furthermore, we investigate the robustness of identification against variations in speed. [11] includes slow, normal, and fast gait cycles in the gait gallery and reports an equal error rate of 7%. We further investigate the impact of speed when slow or fast walking is not included in the training data, and how the symmetry-based approach is influenced by speed changes. Furthermore, walking on a curved route versus straight walking has not yet been considered when recording gait data with IMUs. Walking on a curved route is a great challenge for computer vision because the perspective changes; considering this aspect, motion capture with IMUs is preferable. We investigate to what extent an identification approach developed for straight walking can be transferred to walking on a curved route without further modifying the method.

III. SEGMENTATION

In order to realize online recognition and increase the number of gait samples, an auto-segmentation process is necessary. Segmentation is the process of extracting the temporal extents of single strides from continuous time series data of gait. In this paper, we aim for a computationally light segmentation method that detects a single stride based only on the sensor at the belt.

A. Significant dimension for segmentation

First, we analyzed single stride hip movement waveforms of 20 subjects during normal gait and concluded that they share the same significant features. One participant's single stride data of an IMU attached to the pelvis (forward centered) were extracted manually, and spectral analysis results are shown in Fig. 2 as an example. The z-axis is the axis of the forward direction of movement, and this direction can be automatically determined in the case of arbitrary sensor orientation. As can be seen in Fig. 2, the angular velocity about the z-axis, gyro-z(t), has less noise than the other components.
Furthermore, gyro-z(t) has two principal frequencies, which appear as two positive peaks: the smaller one (around 1.5 Hz) represents the frequency of a single stride and the larger one (around 3 Hz) represents the frequency of single steps. We can observe that the waveform of a single stride of gyro-z(t) has more obvious features than the other degrees of freedom, even after using a Butterworth low-pass filter (3 Hz). Therefore, the rotational speed gyro-z(t) from the sensor on the pelvis is chosen as the significant dimension for segmentation.

Figure 2. Frequency analysis and waveform of single strides of a participant.

B. Feature based segmentation

To enlarge the feature of gyro-z(t) of the pelvis, equation (1) is applied after gyro-z(t) is filtered. The results of this transformation for one participant are shown in Fig. 3. Furthermore, to make the segment point easy to detect, we can approximately regard the waveform shown in Fig. 4 as the feature f(t) of a single stride. One gait cycle begins with the heel contact of one foot and ends with the heel contact of the same foot; in this paper, we use the right foot. The gait cycle can also be divided into two phases, the stance phase and the swing phase. The cycle samples of the hip in Fig. 3 are manually extracted based on the ground truth of the right heel contacting the ground. Thus in Fig. 4, the first negative peak P1 represents the beginning of the stance phase of the right foot with the heel contact of the
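The significant-dimension analysis above (a 3 Hz low-pass on the belt gyroscope channel and a spectrum with stride and step peaks near 1.5 Hz and 3 Hz) can be sketched as follows. The filter order, the zero-phase filtering, and the function names are our assumptions for illustration; only the 3 Hz cutoff and the 128 Hz sampling rate come from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 128  # sampling frequency used in the experiment [Hz]

def lowpass_3hz(signal: np.ndarray, fs: int = FS, order: int = 4) -> np.ndarray:
    """Zero-phase Butterworth low-pass at 3 Hz (order is an assumption)."""
    b, a = butter(order, 3.0 / (fs / 2.0), btype="low")
    return filtfilt(b, a, signal)

def dominant_frequencies(signal: np.ndarray, fs: int = FS, k: int = 2):
    """Return the k strongest positive frequencies of the spectrum.

    For a normal gait gyroscope trace, the two strongest peaks are
    expected near the stride (~1.5 Hz) and step (~3 Hz) frequencies.
    """
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    top = np.argsort(spectrum)[-k:]
    return sorted(freqs[top])
```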


V. TRAINING AND CLASSIFICATION

In order to test whether the proposed asymmetry-based features work for human identification, a simple classifier is used. In this paper, a simple Bayes-based Gaussian classifier is used for training and classification. Compared to using Hidden Markov Models for gait recognition, our approach is computationally lighter and hence faster. To retain all features, all the components of m_i are used for computation without any dimensionality reduction. According to Bayes' rule, the class Y to be identified from the maximum posterior probability of Y given the test data X can be transformed from (3) to (4).

Y(X) = argmax_Y P(Y|X) (3)

Y(X) = argmax_Y [P(X|Y)P(Y)] (4)

To ensure that the prior probability is uniform, one sample is left out from every labeled class in the same order during leave-one-out cross validation. Therefore, (4) can be simplified to (5) and (6) (i = 1 ~ total number of classes):

Y(X) = argmax_i [P(X|Y_i)] (5)

P(X|Y_i) = Gaussian(X; μ_i, Σ_i) (6)

VI. EXPERIMENT

To validate the proposed approach, a human gait database was collected. This experiment was conducted at the University of Waterloo. 20 participants (12 males and 8 females) were asked to walk on a straight route and two different curved routes while wearing IMUs. Participants' demographic information is summarized in Table I. The experiment was approved by the University of Waterloo Research Ethics Board, and signed consent was obtained from all participants.

TABLE I. PARTICIPANTS' DEMOGRAPHIC INFORMATION.

       Age    Height [cm]   Weight [kg]
Mean   27     172           73
SD     4.84   10.62         9.37

The data collection was carried out with the Shimmer IMU, a small wearable wireless inertial measurement sensor device [24]. It collects linear acceleration and angular velocity and transmits this information via Bluetooth to a nearby computer. In this experiment, the sampling frequency was set to 128 Hz. Three IMUs were used in this experiment.
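The Gaussian Bayes classification of equations (3)-(6) can be sketched as below: one Gaussian is fitted per class, and a test sample is assigned to the class with the highest likelihood (the prior being uniform). The function names and the small diagonal regularization term are our additions for numerical stability, not part of the paper.

```python
import numpy as np
from scipy.stats import multivariate_normal

def fit_class_gaussians(X: np.ndarray, y: np.ndarray, reg: float = 1e-6):
    """Fit one Gaussian N(mu_i, Sigma_i) per class, as in eq. (6)."""
    models = {}
    for c in np.unique(y):
        Xc = X[y == c]
        mu = Xc.mean(axis=0)
        # reg * I keeps the covariance invertible (our addition)
        sigma = np.cov(Xc, rowvar=False) + reg * np.eye(X.shape[1])
        models[c] = (mu, sigma)
    return models

def classify(models, x: np.ndarray):
    """Eq. (5): argmax_i P(x | Y_i), with a uniform prior over classes."""
    scores = {c: multivariate_normal.logpdf(x, mean=mu, cov=sigma)
              for c, (mu, sigma) in models.items()}
    return max(scores, key=scores.get)
```

During leave-one-out cross validation, fit_class_gaussians would be re-run on the training portion of each fold, with one sample per class held out so the prior stays uniform.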
One is attached to the pelvis (forward centered); this sensor was used for both single stride segmentation and human identification. The second is attached to the right ankle and only used to provide the ground truth for the validation of single stride segmentation during walking, since the z-component of its acceleration shows a spike due to impact when the right foot contacts the ground. The third sensor is attached to the thorax but was not used in this paper.

A. Straight route

10 rounds of walking on a straight route (approximately 12 m distance) at different speeds were collected from each participant: normal, slow, and fast. Participants were asked to walk in the order normal, slow, fast in each round, wearing the 3 IMUs. Thus, in total 30 trials were recorded from each participant. Considering the fact that walking speed is related to the personality of the individual, we did not give any explicit guidelines on how fast is "fast" and how slow is "slow"; thus the speed of the gait is based on the individual. Additionally, before each round the placement of the IMU on the trunk was altered by asking the participant to move the belt, in order to exclude the possibility that persons are identified based on the sensor placement.

B. Curved route

Participants were asked to walk both clockwise and counter-clockwise on circles of two different sizes: a bigger circle (r = 3.7 m) and a smaller circle (r = 1.7 m). Thus, in total 4 trials of curved route walking were recorded per participant. In summary, there are 30 trials per subject under the condition of straight route walking and 4 trials per subject under the condition of curved route walking. 5 continuous single strides were extracted from the middle of each trial via the segmentation algorithm described in Section III; the number of samples that can be used for validation is shown in Table II.
Since we only use the data recorded by the unit attached to the pelvis, 6-dimensional time series data consisting of the 3 dimensions of angular velocity and the 3 dimensions of linear acceleration are contained in each sample. Hence, in equation (2), q_i[k] ∈ ℝ^6.

TABLE II. SUMMARY OF THE NUMBER OF SAMPLES PER SUBJECT.

Route: straight
Condition   normal   slow   fast
Trial       10       10     10
Stride      50       50     50
Total       150

Route: curved
Condition   small clockwise   small counter-clockwise   big clockwise   big counter-clockwise
Trial       1                 1                         1               1
Stride      5                 5                         5               5
Total       20

VII. EXPERIMENTAL RESULTS AND VALIDATION

When the feature-based segmentation algorithm is applied to our database, the thresholds set in the algorithm are the same for every speed and participant. The feature-based segmentation algorithm is applied to all trials collected in our experiment, and the parameter values used are shown in Table III. The segmentation results from Participant 1 (P1) are shown in Fig. 5 as an example. The red lines are the ground truth obtained from the unit attached at the ankle, and the black lines are the segment points obtained from the feature-based algorithm described above.

TABLE III. THRESHOLDS USED IN SEGMENTATION.

Threshold   T1   T2   T3    T4    M1    M2    M3    M4   M5   M6
Value       75   10   230   320   -30   -12   480   60   60   90
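The ankle-based ground truth described above (a spike in the z-acceleration at each right heel contact) can be located with a simple peak detector. The height threshold and minimum stride time below are tuning assumptions for illustration, not values from the paper.

```python
import numpy as np
from scipy.signal import find_peaks

def heel_strikes(acc_z: np.ndarray, fs: int = 128,
                 min_height: float = 2.0, min_stride_s: float = 0.6):
    """Locate right heel contacts as impact spikes in ankle z-acceleration.

    min_height (in the sensor's acceleration units) rejects ordinary
    swing-phase motion; the minimum stride time prevents one impact
    from being counted twice. Both are tuning assumptions.
    Returns the sample indices of the detected heel strikes.
    """
    peaks, _ = find_peaks(acc_z, height=min_height,
                          distance=int(min_stride_s * fs))
    return peaks
```

Each pair of consecutive detected indices then delimits one ground-truth stride against which the feature-based segment points can be compared.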


• We find a significant dimension (gyro-z(t) in this paper) for feature-based segmentation. Identification is based on the observation of a single stride (one left and one right step) after segmentation. This makes it possible to identify people online by giving the identification result right after a single stride. In addition, a voting algorithm can be applied with our approach, identifying people from three or more strides instead of one to enhance robustness.

• Robustness of the approach is tested under different speed conditions and sensor placements. Additionally, we investigate whether walking in a curved line has an effect on the identification process.

Future tasks include automatically optimizing the thresholds in the segmentation algorithm based on walking speed, finding new features of gait that are robust to path curvature, and comparing the proposed approach with other existing approaches.

REFERENCES

[1] D. Gafurov, "A survey of biometric gait recognition: Approaches, security and challenges," in Proceedings of Annual Norwegian Computer Science Conference, 2007.
[2] D. Gafurov, E. Snekkenes, and P. Bours, "Spoof attacks on gait," IEEE Transactions on Information Forensics and Security, Special Issue on Human Detection and Recognition, 2007.
[3] D. Gafurov, "Security analysis of impostor attempts with respect to gender in gait biometrics," in Biometrics: Theory, Applications, and Systems, First IEEE International Conference on, 2007.
[4] D. Gafurov and P. Bours, "Improved hip based individual recognition using wearable motion recording sensor," in Security Technology, Disaster Recovery and Business Continuity, volume 122 of Communications in Computer and Information Science, pages 179-186, Springer Berlin Heidelberg, 2010.
[5] M. O. Derawi, P. Bours, and K. Holien, "Improved cycle detection for accelerometer based gait authentication," in Intelligent Information Hiding and Multimedia Signal Processing (IIH-MSP), Sixth International Conference on, 2010.
[6] T. Kobayashi, K. Hasida, and N. Otsu, "Rotation invariant feature extraction from 3-d acceleration signals," in International Conference on Acoustics, Speech, and Signal Processing, pages 3684-3687, 2011.
[7] M. O. Derawi, C. Nickel, P. Bours, and C. Busch, "Unobtrusive user-authentication on mobile phones using biometric gait recognition," in Intelligent Information Hiding and Multimedia Signal Processing (IIH-MSP), Sixth International Conference on, 2010.
[8] D. Gafurov, E. Snekkenes, and P. Bours, "Gait authentication and identification using wearable accelerometer sensor," in 5th IEEE Workshop on Automatic Identification Advanced Technologies, 2007.
[9] J. R. Kwapisz, G. M. Weiss, and S. A. Moore, "Cell phone based biometric identification," in IEEE Fourth International Conference on Biometrics: Theory, Applications and Systems, 2010.
[10] H. J. Ailisto, M. Lindholm, J. Mantyjarvi, E. Vildjiounaite, and S.-M. Makela, "Identifying people from gait pattern with accelerometers," in Biometric Technology for Human Identification, SPIE, volume 5779, 2005.
[11] J. Mantyjarvi, M. Lindholm, E. Vildjiounaite, S.-M. Makela, and H. Ailisto, "Identifying users of portable devices from gait pattern with accelerometers," in Proc. of IEEE International Conference on Acoustics, Speech, and Signal Processing, 2005.
[12] L. Rong, Z. Jianzhong, L. Ming, and H. Xiangfeng, "A wearable acceleration sensor system for gait recognition," in 2nd IEEE Conference on Industrial Electronics and Applications, pages 2654-2659, 2007.
[13] N. Trung, Y. Makihara, H. Nagahara, R. Sagawa, Y. Mukaigawa, and Y. Yagi, "Phase registration in a gallery improving gait authentication," in the International Joint Conference on Biometrics (IJCB 2011), IEEE and IAPR, 2011.
[14] E. Vildjiounaite, S.-M. Makela, M. Lindholm, R. Riihimaki, V. Kyllonen, J. Mantyjarvi, and H. Ailisto, "Unobtrusive multimodal biometrics for ensuring privacy and information security with personal devices,"
in Pervasive Computing, 4th International Conference, PERVASIVE, pages 187-201, 2006.
[15] L. Rong, D. Zhiguo, Z. Jianzhong, and L. Ming, "Identification of individual walking patterns using gait acceleration," in The 1st International Conference on Bioinformatics and Biomedical Engineering, pages 543-546, 2007.
[16] T. Zhang and G. Venture, "Individual recognition from gait using feature value method," Cybernetics and Information Technologies, Vol. 12, No. 3, pages 86-95, 2012.
[17] B. Huang, M. Chen, P. Huang, and Y. Xu, "Gait modeling for human identification," in IEEE International Conference on Robotics and Automation, 2007.
[18] N. Trung, Y. Makihara, H. Nagahara, Y. Mukaigawa, and Y. Yagi, "Performance evaluation of gait recognition using the largest inertial sensor-based gait database," Proc. of the 5th IAPR Int. Conf. on Biometrics, Paper ID 182, pp. 1-7, New Delhi, India, Mar. 2012.
[19] M. Karg, K. Kühnlenz, and M. Buss, "Recognition of affect based on gait patterns," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 40 (2010), no. 4, pages 1050-1061.
[20] B. de Gelder, "Towards the neurobiology of emotional body language," Nature Reviews Neuroscience, vol. 7, pages 242-249, March 2006.
[21] C. Roether, L. Omlor, A. Christensen, and M. Giese, "Critical features for the perception of emotion from gait," Journal of Vision, vol. 9, no. 6, pages 1-32, 2009.
[22] H. Sadeghi, "Local or global asymmetry in gait of people without impairments," Gait Posture, 17(3), pages 197-204, 2003.
[23] M. K. Seeley, B. R. Umberger, J. L. Clasey, and R. Shapiro, "The relation between mild leg-length inequality and able-bodied gait asymmetry," Journal of Sports Science and Medicine, 9, pages 572-579, 2010.
[24] K. R. Kaufman, L. S. Miller, and D. H. Sutherland, "Gait asymmetry in patients with limb-length inequality," Journal of Pediatric Orthopaedics, 16(2), pages 144-150, 1996.
[25] A. Burns, B. R. Greene, M. J. McGrath, T. J. O'Shea, B. Kuris, S. M. Ayer, F. Stroiescu, and V. Cionca,
"Shimmer: A wireless sensor platform for noninvasive biomedical research," IEEE Sensors Journal, 10:1527-1534, 2010.
[26] S. Sarkar, P. J. Phillips, Z. Liu, I. R. Vega, P. Grother, and K. W. Bowyer, "The HumanID gait challenge problem: Data sets, performance, and analysis," IEEE Trans. on Pattern Analysis and Machine Intelligence, pp. 162-177, Vol. 27, No. 2, 2005.
[27] N. Boulgouris, D. Hatzinakos, and K. N. Plataniotis, "Gait recognition: A challenging signal processing technology for biometric identification," IEEE Signal Processing Magazine, pp. 78-90, Vol. 22, No. 6, 2005.
[28] K. Frank, M. Nadales, P. Robertson, and M. Angermann, "Reliable real-time recognition of motion related human activities using MEMS inertial sensors," Proc. of the 23rd Int. Technical Meeting of the Satellite Division of the Institute of Navigation, 2010.
[29] C. Strohrmann, H. Harms, C. Setz, and G. Tröster, "Monitoring kinematic changes with fatigue in running using body-worn sensors," IEEE Transactions on Information Technology in Biomedicine, pp. 983-990, Vol. 16, 2012.
[30] D. Gafurov and E. Snekkenes, "Towards understanding the uniqueness of gait biometric," 8th IEEE Int. Conf. on Automatic Face & Gesture Recognition, pp. 1-8, 2008.