Motion Intention Estimation using sEMG-ACC Sensor Fusion

Jose Alejandro Lopez, The University of Western Ontario

Abstract

Musculoskeletal injuries can severely impact the ability to produce and control body motion. To regain function, rehabilitation is often required. Wearable smart devices are currently under development to provide therapy and assistance for people with impaired arm function. Electromyography (EMG) signals are used as inputs to pattern recognition systems that determine intended movements.

However, there is a gap between the accuracy of pattern recognition systems in constrained laboratory settings and their usability for detecting dynamic, unconstrained movements. Motion factors, such as limb position, interaction force, and velocity, are known to degrade pattern recognition performance. A possible solution is to combine the EMG signals with data from other sensors, such as accelerometers (ACC), when training and using classifiers, in order to improve classification accuracy.
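As an illustration of this feature-level fusion idea, the sketch below concatenates time-domain EMG features with simple ACC features into a single feature vector per analysis window. The window lengths, sampling rates, channel counts, and feature choices (mean absolute value, waveform length, mean, standard deviation) are assumptions for illustration, not the exact pipeline used in the thesis.

```python
import numpy as np

def emg_features(window: np.ndarray) -> np.ndarray:
    """Time-domain EMG features per channel: mean absolute value (MAV)
    and waveform length (WL). Window shape: [samples, channels]."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    return np.concatenate([mav, wl])

def acc_features(window: np.ndarray) -> np.ndarray:
    """Simple ACC features per axis: mean (captures gravity/orientation)
    and standard deviation (captures movement intensity)."""
    return np.concatenate([np.mean(window, axis=0), np.std(window, axis=0)])

def fused_features(emg_win: np.ndarray, acc_win: np.ndarray) -> np.ndarray:
    """Feature-level sensor fusion: concatenate EMG and ACC features."""
    return np.concatenate([emg_features(emg_win), acc_features(acc_win)])

# Example: one 250 ms window of 8-channel EMG at 1 kHz and 3-axis ACC
# at 100 Hz (synthetic data; sizes are hypothetical).
rng = np.random.default_rng(0)
emg_win = rng.standard_normal((250, 8))
acc_win = rng.standard_normal((25, 3))
x = fused_features(emg_win, acc_win)
print(x.shape)  # (22,) = 8 MAV + 8 WL + 3 ACC means + 3 ACC std devs
```

Because the ACC mean reflects limb orientation relative to gravity, fusing it with EMG features gives the classifier information about the motion factors that would otherwise confound the EMG patterns.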

The objectives of this study were to quantify the impact of motion factors on ACC signals, and to use these ACC signals along with EMG signals to classify categories of motion factors. To address these objectives, a dataset containing EMG and ACC signals recorded while individuals performed unconstrained arm motions was studied. The EMG and accelerometer signals were analysed and used to train classification models that predict characteristics of the intended motion.

The results quantify how accelerometer features change with variations in arm position, interaction force, and motion velocity. The results also show that combining EMG and ACC data improved the accuracy of motion intention detection. Velocity could be classified as stationary or moving with less than 10% error using a Decision Tree ensemble classifier.
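A minimal sketch of this classification step is shown below, using a bagged ensemble of decision trees from scikit-learn to label windows as stationary or moving. The fused feature matrix, labels, and ensemble size are hypothetical stand-ins on synthetic data; the sub-10% error figure comes from the thesis, not from this toy example.

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hypothetical fused EMG-ACC feature matrix: one row per analysis window
# (see the fusion sketch above), with binary velocity labels
# (0 = stationary, 1 = moving). Synthetic data for illustration only.
rng = np.random.default_rng(0)
X = rng.standard_normal((600, 22))
y = (X[:, 0] + 0.5 * rng.standard_normal(600) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Decision Tree ensemble via bagging; the ensemble size is an assumption.
clf = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0)
clf.fit(X_tr, y_tr)
error = 1.0 - clf.score(X_te, y_te)
print(f"Stationary-vs-moving classification error: {error:.1%}")
```

Bagging many trees reduces the variance of a single decision tree, which helps when feature distributions shift across arm positions and interaction forces.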

Future work should expand on motion factors and EMG-ACC sensor fusion to identify interactions between a person and the environment, in order to guide the tuning of control models for wearable mechatronic devices during dynamic movements.