WALKING MOTION ANALYSIS USING SMALL ACCELERATION SENSORS


Teruaki Ito
Institute of Technology and Science, University of Tokushima, Tokushima, Japan
e-mail: ito@me.tokushima-u.ac.jp

Abstract - Human walking motion conveys various kinds of information, including involuntary characteristics such as walking patterns, ground conditions and health conditions, as well as conscious actions such as system control and communication. Such information can be used to establish an interface with ubiquitous network systems, so the detection of human walking motion plays a key role in such an interface. Human motion, including walking motion, can be analyzed with motion capture systems. However, those systems require special hardware such as sensors and cameras, as well as special measurement environments, and are therefore not suitable for everyday life. Video image analysis has also been proposed as an alternative approach to tracing human motion, but it is not suitable for daily use either. This paper proposes a new approach to the estimation of human walking motion using small wearable acceleration sensors and several other sensors, which can be used in everyday life. Comparing it with the typical approach of capturing and tracking human motion with a motion capture system, this paper discusses the feasibility of the proposed approach using a walking motion simulation.

Keywords - walking motion analysis; gesture recognition; acceleration sensor; locomotion model

I. INTRODUCTION

Gesture recognition techniques enable humans to interface with a machine and interact naturally without any mechanical devices [1]. Gestures can originate from any bodily motion but commonly originate from the face or hands [2][3]. This study focuses on human walking motion as an interface to a system.
Human walking motion conveys various kinds of information, including involuntary characteristics such as walking patterns, ground conditions and health conditions, as well as conscious actions such as system control and communication. How to capture human motion is the key to using it as an interface, and tracing walking motion in this study is no exception. In addition to the objective of tracing the walking motion, motion capture in mobile use is a critical requirement of this study. Motion capture techniques based on image processing are often used for this purpose; however, they are not suitable for use in this study. This study proposes an approach that uses small acceleration sensors to detect human walking motion and to analyze the status and conditions of that motion. First, this paper gives an overview of the typical motion capture approach. Then the proposed approach is presented, including model building, simulation, and the method behind the calculation. The results of the experiments are also presented to show the feasibility of the proposed method.

II. MOTION CAPTURING OVERVIEW

A motion capture system traces human motion and builds a computer simulation model. Originally developed and applied in medical analysis for athletes, motion capture systems are now also used to model human behavior for computer graphics characters.

Figure 1. Motion capturing system overview.

Figure 1 shows a snapshot of a typical motion capture system called KinemaTracer [4]. The snapshot shows a video clip, a 3D model, analytical tools, graph views, etc. The basic procedure of KinemaTracer includes data acquisition, reconstruction of the 3D representation, model matching, and kinematic extraction. Motion capture can be achieved as shown in Figure 2. Color markers attached to the joints of the human subject, shown in the video images on the left side of the figure, correspond to the joints

IJSSST, Vol. 10, No. 3, April 2009, p. 65. ISSN: 1473-804x online, 1473-8031 print

of the ball-stick model on the right-hand side of the figure. By tracing the trajectories of the markers, the ball-stick model, or kinematic model, precisely reproduces the original motion of the human subject.

Figure 2. Motion capturing for walking motion.

Figure 3. Motion capture using 4 video cameras.

A motion capture system such as KinemaTracer requires a special environment in which the human motion is measured using installed cameras, as shown in Figure 3. Before the motion capture measurement, a series of preprocessing steps is required. For example, calibration of the system using a control object is required each time before a measurement. Colored markers need to be attached to the joints of the human subject, and the subject wears tight clothes so that the video cameras can trace the body motion precisely. All of these requirements may be acceptable for experimental purposes; however, this kind of system is not suitable for daily use.

To reduce the burden on the subject, alternative approaches to motion capture without colored markers have been reported [5-9]. For example, markerless motion capture of human subjects using multiple calibrated cameras has been reported, which uses articulated shape models such as super-quadrics to represent the humans. The markerless motion capture procedure consists of estimating the human body model parameters for the subject, initializing the pose for a set of frames, and tracking the pose over the complete sequence. However, like the motion capture system, it is not suitable for human interface applications because of the application environment it requires. This study therefore proposes an approach to motion capture for mobile use with small acceleration sensors and applies it to building a locomotion model for biped walking. The next sections briefly give an overview of this approach.

III. LOCOMOTION MODEL FOR BIPED WALKING

Motion capture can cover the whole movement of the human body. However, this study focuses on the walking motion because its objective is to propose an approach to a human interface system. Therefore, the target portion of the body in this study can be regarded as the biped walking portion of the human body.

Figure 4. Biped locomotion model.

A biped walking locomotion model could be built by the conventional motion capture method, as shown in Figure 4. The left picture in the figure shows the model of the whole body; the middle picture shows the leg portion of the model used for the locomotion model; and the right picture shows the geometrical representation of the locomotion model.

The goal of this study is to build the biped walking locomotion model without using a motion capture system like KinemaTracer. Figure 5 shows the flow of walking motion detection in this study. The lower flow in Figure 5 shows the basic approach of building the locomotion model using the motion capture system. Meanwhile, this study proposes a small-acceleration-sensor approach, or sensor-based approach, instead of the motion capture method; the upper flow in Figure 5 shows this proposed approach. The next section gives an overview of sensor-based walking motion detection and describes how the approach is implemented.

IV. OVERVIEW OF THE SENSOR-BASED WALKING MOTION DETECTION

This section covers the overview of the sensor-based walking motion capture approach proposed in this study.
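The geometrical representation in Figure 4 is not reproduced here, but the locomotion model it describes is, at its core, a planar two-link leg suspended from the hip. The following Python sketch (the paper's own tool is written in MATLAB) illustrates that reading; the segment lengths and joint angles are illustrative assumptions, not values from the paper.

```python
import math

# Minimal sagittal-plane reading of the biped locomotion model (Figure 4):
# each leg is a thigh-shank linkage suspended from the hip joint.
# Segment lengths and joint angles are illustrative assumptions only.

THIGH, SHANK = 0.45, 0.45   # assumed segment lengths [m]

def leg_points(hip, hip_angle, knee_flexion):
    """Forward kinematics hip -> knee -> ankle.

    Angles are in radians, measured from the downward vertical;
    knee_flexion is the additional rotation of the shank."""
    hx, hy = hip
    kx = hx + THIGH * math.sin(hip_angle)
    ky = hy - THIGH * math.cos(hip_angle)
    ax = kx + SHANK * math.sin(hip_angle + knee_flexion)
    ay = ky - SHANK * math.cos(hip_angle + knee_flexion)
    return (kx, ky), (ax, ay)

# One sample pose: hip 0.9 m above the ground, thigh swung 20 deg forward,
# knee flexed 30 deg back.
knee, ankle = leg_points((0.0, 0.9), math.radians(20), math.radians(-30))
```

A sequence of such poses, one per sampling interval, is all that a stick-figure walking animation of this model needs.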

Figure 5. Sensor-based walking motion detection.

Instead of using a motion capture system based on video images, the proposed method uses small acceleration sensors attached to the human body so that the walking motion status can be recognized in daily use.

Figure 6. Sensor-based walking motion detection.

Figure 6 shows the schematic image of the wearable equipment used in the sensor-based approach. Considering mobile use, the equipment is designed to be wearable; it is composed of touch sensors installed inside a pair of shoes and a belt pouch holding a small acceleration sensor and a PIC microcomputer. For experimental purposes, the pouch is connected to a walking stick, which is equipped with two small acceleration sensors and a touch sensor at its end point.

Figure 7. Process flow.

Figure 7 shows the process flow of the sensor-based approach. Streaming data transmitted from the sensors are processed by the PIC microcontroller and post-processed by a PC connected via a serial connection. A locomotion model simulation using MATLAB [11] is executed on the PC based on the calculated data. This simulation is implemented to evaluate the data; therefore, it is not required for the user interface purpose in a mobile environment. Figure 8 shows the circuit diagram of the control board in the belt pouch. The interval time of data transmission between the control box and the PC was set at 0.1 sec at a speed

of 9600 bps in the experiment. The small acceleration sensor used in the experiment was the KMX-52-15, whose measuring range is +/- 2 G and whose output is measured by the A/D converter of a PIC16F819. The touch sensor output was recorded as a 1-bit digital signal.

Figure 8. Circuit diagram.

Figure 9. Control box.

Figure 9 shows a photo of the inside of the control box.

Figure 10. Snapshot of the GUI window for the serial data transmission.

Figure 10 shows a snapshot of the GUI window for the serial data transmission. Start, stop, time reset, and data saving can be controlled by this program.

Figure 11. Acceleration sensor data.

Figure 11 shows an example of the acceleration sensor data obtained during walking motion.

V. RESULTS AND DISCUSSION

A. Data acquisition and analysis to recognize walking status

Three healthy male subjects were recruited from among undergraduate students for the experiments. The test task was to walk 5.2 m under five different conditions: flat floor, up/down slope, and up/down staircase.

Flat floor: The subject was asked to walk at a constant speed for 5.2 m; the walking speed, however, depended on each subject. Before and after the walk, a 2 sec pause was taken so that the measurement could detect the start and end of the walk.

Figure 12. Acceleration graph for walking on a flat floor.

Up/Down stairs: The subject was asked to walk up/down the stairs in the same manner as in the flat floor walk. In addition to the one-step walk on each stair, a two-step walk on each stair was also performed. The staircase used in this experiment had a horizontal length of 2.9 m and a total height of 1.85 m over 10 steps.

Up/Down slope: The subject was asked to walk up/down the slope in the same manner as in the flat floor walk. The

slope used in this experiment was 5.2 m in horizontal length and 0.35 m in height.

Figure 13. Acceleration graph for walking up a staircase.

Figure 12 shows the acceleration graph for the flat floor walk, and Figure 13 shows that for walking up the staircase. The frequencies of each step detected by the touch sensor were 0.756 on the flat floor walk and 0.156 on the one-step staircase walk.

Figure 14. Frequency graph for walking on a flat floor.

Figure 15. Frequency graph for walking up the staircase.

Figures 14 and 15 show the frequency graphs for the flat floor walk and the staircase walk. A peak was observed in the vertical axis and the walking direction at a frequency of 1.4 Hz in the flat floor walk, whereas peaks were observed at 0.781 Hz and 0.625 Hz in the staircase walk. These values show a significant difference between the flat floor walk and the staircase walk, which provides a hint for recognizing the ground condition and the status of the walking motion.

B. Locomotion model building

This section covers the procedure used to build the locomotion model. The locomotion model is built from the physical measurement data and the acceleration data.

Figure 16. Calculation of coordinates for hip, knee and heel.

Figure 16 shows the basic procedure for calculating the leg coordinates. The distances a and b can be measured directly, whereas the distances c and d can be calculated from the coordinates of the waist and ankle; the remaining distance can then be calculated from c and d.

Figure 17. Calculation of coordinates for hip, knee and heel.

Figure 17 shows the calculation of the hip coordinates. The knee coordinates can also be calculated, as shown in Figures 18, 19 and 20, considering the data sizes of Yk and Yb.
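The peak frequencies reported above can be extracted with a standard FFT of the acceleration signal. The following Python sketch (the paper's analysis was done in MATLAB) assumes a 10 Hz sampling rate, consistent with the 0.1 sec transmission interval, and uses synthetic sinusoidal cadences in place of real sensor data.

```python
import numpy as np

FS = 10.0   # sampling rate implied by the 0.1 sec transmission interval [Hz]
N = 512     # number of samples (~51 s of data; an assumption for illustration)

def dominant_frequency(signal, fs):
    """Return the peak frequency of the one-sided amplitude spectrum,
    ignoring the DC (gravity) component by removing the mean first."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

t = np.arange(N) / FS
flat  = np.sin(2 * np.pi * 1.4 * t)    # synthetic flat-floor cadence
stair = np.sin(2 * np.pi * 0.78 * t)   # synthetic staircase cadence

print(dominant_frequency(flat, FS))    # close to 1.4 Hz
print(dominant_frequency(stair, FS))   # close to 0.78 Hz
```

A simple threshold on the dominant frequency (for example, above vs. below about 1 Hz) would then separate the flat floor walk from the staircase walk in this synthetic setting.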
Figure 21 shows how the walking distance is calculated using the touch sensors; this distance can be used as the ankle coordinates.
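The paper does not spell out the distance formula, and Figure 21 is not reproduced here. One simple reconstruction is to advance the supporting ankle by a fixed stride length on every rising edge of the 1-bit touch signal; the stride value and the data layout in this sketch are assumptions for illustration only.

```python
# Hypothetical reconstruction of the touch-sensor distance calculation:
# each detected footfall (rising edge of the 1-bit touch signal) advances
# the supporting ankle by a fixed stride length.

STRIDE_M = 0.65   # assumed stride length per step [m]

def ankle_x_positions(touch_bits, stride=STRIDE_M):
    """Map rising edges of the touch signal to cumulative ankle x-coordinates."""
    xs, x, prev = [], 0.0, 0
    for bit in touch_bits:
        if bit == 1 and prev == 0:
            x += stride
            xs.append(x)
        prev = bit
    return xs

# Eight footfalls -> about 5.2 m, matching the walkway length in the experiment.
touch = [0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0]
positions = ankle_x_positions(touch)
print(len(positions), positions[-1])   # step count and total distance [m]
```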

Figure 18. Knee coordinates calculation.

Figure 19. Knee coordinates calculation.

Figure 20. Knee coordinates calculation.

Figure 21. Calculation of coordinates for hip, knee and heel.

The KinemaTracer software provides a tool to build a walking simulation based on the coordinate data recorded by the motion capture procedure. However, the software does not provide an SDK for customization in an extended study. Therefore, this study implemented a simulation tool as a MATLAB program.

C. Simulation tool

This section covers the simulation tool developed in this study to show the walking motion based on the locomotion model described in the previous section. To use the walking motion status as an interface to a system, a graphical simulation is not needed. However, to analyze the status of the walking motion, this simulation tool shows a walking animation based on the results of the analysis; the simulation software is therefore a critical part of this study.

Figure 22. Process flow of the simulation tool.

Figure 22 shows the process flow using the tool, which is composed of three basic modules: a CSV file loader, a coordinate calculator and an animation generator. The first module is the CSV file loader, which loads the data file obtained from the sensors. The data file contains raw data, which is transferred to the coordinate calculator to generate coordinate data for the locomotion model animation. The second module is the coordinate calculator, which calculates the coordinates. For example, the X, Y and Z waist coordinates are calculated from the data obtained from the small acceleration sensor, and the knee coordinates are calculated in combination with the physical measurements of the subject. Figure 16 shows the geometric graph of the subject's leg, which illustrates a primitive calculation of the coordinates.
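Figures 16-20 are not reproduced here, but the knee coordinate calculation they describe amounts to intersecting two circles in the sagittal plane: one of radius a (thigh length) about the hip and one of radius b (shank length) about the ankle. The following Python sketch works under that reading; the sample coordinates and the forward-knee selection rule are illustrative assumptions.

```python
import math

def knee_position(hip, ankle, a, b):
    """Intersect a circle of radius a (thigh) about the hip with a circle of
    radius b (shank) about the ankle; of the two intersections, return the
    one lying forward (larger x), since the knee bends forward."""
    hx, hy = hip
    ax, ay = ankle
    dx, dy = ax - hx, ay - hy
    d = math.hypot(dx, dy)
    if not abs(a - b) <= d <= a + b:
        raise ValueError("segment lengths cannot reach the ankle")
    # distance from the hip to the foot of the perpendicular from the knee
    t = (a * a - b * b + d * d) / (2 * d)
    h = math.sqrt(max(a * a - t * t, 0.0))
    # base point on the hip-ankle line, then offset perpendicular to it
    bx, by = hx + t * dx / d, hy + t * dy / d
    k1 = (bx + h * dy / d, by - h * dx / d)
    k2 = (bx - h * dy / d, by + h * dx / d)
    return k1 if k1[0] >= k2[0] else k2

# Illustrative pose: hip 0.9 m up, ankle slightly ahead on the ground,
# thigh 0.45 m and shank 0.47 m (assumed measurements).
hip, ankle = (0.0, 0.9), (0.1, 0.0)
knee = knee_position(hip, ankle, a=0.45, b=0.47)
```

By construction the returned point is exactly a from the hip and b from the ankle, which is what the coordinate calculator module needs per frame.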
The third module is the animation generator, which shows the walking animation based on the calculation results from the coordinate calculator module. Figure 23 shows the simulation of the biped walk extracted from the flat floor walk by motion capturing, which is identical to the simulation in KinemaTracer. This means

that the simulation of the walking motion can be carried out with this simulation tool without using KinemaTracer.

Figure 23. Biped walking animation by motion capturing.

Figure 24. Biped walking animation by sensor-based motion analysis.

Figure 24 shows the walking animation based on the sensor-based walking motion tracking data when a subject walks on the flat floor. The animation is not as smooth as that of Figure 23; however, the walking status can be recognized from this simulation.

D. Discussion

The proposed method of sensor-based motion detection has been presented, covering the hardware and software for this approach. The experimental results showed that data collection is feasible during walking motion and that the walking motion status can be recognized.

VI. CONCLUDING REMARKS

This paper proposed a new approach to recognizing walking status using small acceleration sensors. Comparing it with the alternative approach of capturing and tracking human motion with a motion capture system, this paper discussed the feasibility of the proposed approach using a walking motion simulation. The goal of this approach is not merely motion capture, but to find a way toward a ubiquitous network interface. The test results show that the walking status can be recognized based on the data obtained from the small acceleration sensors and some additional sensors. It is assumed that additional small acceleration sensors would increase the capacity of walking status recognition. Future work needs to clarify the accuracy of the recognition for use as an interface to ubiquitous network systems.

ACKNOWLEDGMENT

This research was partly supported by the Grants-in-Aid for Scientific Research in 2006-2008 (Scientific Research (B) 18317) from the Japan Society for the Promotion of Science. The author would like to thank Mr.
Takashi Niwa and Mr. Kyouhei Matsuyama of the collaborative engineering laboratory at the University of Tokushima for building the devices and implementing the software.

REFERENCES

[1] G. Eason, B. Noble, and I. N. Sneddon, "On certain integrals of Lipschitz-Hankel type involving products of Bessel functions," Phil. Trans. Roy. Soc. London, vol. A247, pp. 529-551, April 1955.
[2] R. Bolt, "Put-That-There: Voice and Gesture at the Graphics Interface," Computer Graphics, vol. 14, no. 3, 1980, pp. 262-270.
[3] T. G. Zimmerman et al., "A hand gesture interface device," ACM SIGCHI Bulletin, vol. 18, no. 4, 1987, pp. 189-192.
[4] F. K. H. Quek, "Towards a Vision-Based Hand Gesture Interface," Virtual Reality Software & Technology, 1994, pp. 17-31.
[5] KinemaTracer, URL: http://www.kinematracer.com/
[6] S. Corazza, L. Muendermann, A. Chaudhari, T. Demattio, C. Cobelli, and T. Andriacchi, "A markerless motion capture system to study musculoskeletal biomechanics: visual hull and simulated annealing approach," Annals of Biomedical Engineering, 2006, 34(6):1019-29.
[7] K. M. Cheung, S. Baker, and T. Kanade, "Shape-From-Silhouette Across Time Part I: Theory and Algorithms," International Journal of Computer Vision, vol. 62, no. 3, May 2005, pp. 221-247.
[8] K. M. Cheung, S. Baker, and T. Kanade, "Shape-From-Silhouette Across Time Part II: Applications to Human Modeling and Markerless Motion Tracking," International Journal of Computer Vision, vol. 63, no. 3, August 2005, pp. 225-245.
[9] K. M. Cheung, S. Baker, J. K. Hodgins, and T. Kanade, "Markerless Human Motion Transfer," Proceedings of the 2nd International Symposium on 3D Data Processing, Visualization and Transmission, September 2004.
[10] A. Sundaresan and R. Chellappa, "Markerless Motion Capture using Multiple Cameras," Computer Vision for Interactive and Intelligent Environments (Eds. C. Jaynes and R. Collins), IEEE Press, 2006.
[11] T. Aoyama, Matlab, Kodan-sha, 2007. (in Japanese)