Master of Engineering Science
Trejos, Ana Luisa
According to the World Health Organization, stroke is the third leading cause of disability. A common consequence of stroke is hemiparesis, which leads to the impairment of one side of the body and affects the performance of activities of daily living. It has been shown that targeting the motor impairments as early as possible, using wearable mechatronic devices as a form of robot-assisted therapy, and letting the patient be in control of the robotic system can improve rehabilitation outcomes. However, despite the increased progress on control methods for wearable mechatronic devices, the need for a more natural interface that allows for better control remains. This work presents a user-independent gesture classification method based on a sensor fusion technique that combines surface electromyography (EMG) and an inertial measurement unit (IMU). The Myo Armband was used to measure muscle activity and motion data from healthy subjects. Participants were asked to perform 10 types of gestures in 4 different arm positions while wearing the Myo on their dominant limb. Data obtained from 22 participants were used to classify the gestures using 4 different classification methods. Finally, for each classification method, 5-fold cross-validation was used to test the efficacy of the classification algorithms. Overall classification accuracies in the range of 33.11%-72.1% were obtained. However, following the optimization of the gesture datasets, the overall classification accuracies increased to the range of 45.5%-84.5%. These results suggest that by using the proposed sensor fusion approach, it is possible to achieve a more natural human-machine interface that allows better control of wearable mechatronic devices during robot-assisted therapies.
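The fusion and evaluation pipeline described in the abstract (concatenating EMG and IMU features, then scoring a classifier with 5-fold cross-validation) can be sketched as follows. This is a minimal illustration using synthetic data and scikit-learn; the feature set, the SVM classifier, and the feature-level fusion strategy are assumptions for demonstration, not the thesis's actual implementation.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

n_samples, n_gestures = 200, 10
# Hypothetical feature vectors: 8 EMG channels (e.g. mean absolute value
# per Myo electrode) plus 6 IMU channels (3-axis accelerometer and
# gyroscope means). Real features would be windowed from raw signals.
emg_features = rng.normal(size=(n_samples, 8))
imu_features = rng.normal(size=(n_samples, 6))
labels = rng.integers(0, n_gestures, size=n_samples)

# Feature-level sensor fusion: concatenate EMG and IMU feature vectors.
fused = np.hstack([emg_features, imu_features])

# 5-fold cross-validation of one candidate classifier (an SVM here;
# the thesis compared four classifiers, which are not named in the abstract).
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(SVC(kernel="rbf"), fused, labels, cv=cv)
print(f"per-fold accuracies: {np.round(scores, 3)}")
print(f"mean accuracy: {scores.mean():.3f}")
```

With real EMG/IMU features the same loop would be repeated once per classifier, and the EMG-only condition can be evaluated by passing `emg_features` alone instead of `fused`.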
Summary for Lay Audience
According to the World Health Organization, stroke is the third leading cause of disability. A common consequence of stroke is paralysis of one side of the body, which affects the performance of activities of daily living. It has been shown that targeting the motor paralysis as early as possible, using wearable devices that combine both electrical and mechanical components as a form of robot-assisted therapy, and letting the patient be in control of the robotic system can improve rehabilitation outcomes. However, despite the increased progress on control methods for these robotic devices, a need for a more natural interface that allows for an intuitive interaction remains. This work presents a comparison of multiple interfaces based on gesture recognition that allow a natural interaction with a wearable robotic device. Electrical activity of the forearm muscles and motion data were collected from 22 healthy participants while they performed 10 types of gestures in 4 different arm positions. These data were used to train four interfaces to recognize the 10 gestures. Each interface was evaluated on its ability to differentiate between gestures after being trained using only the muscles' electrical activity, and again after being trained using both the muscle electrical activity and the motion data. The results obtained suggest that it is possible to achieve a more natural interaction with wearable mechatronic devices during robot-assisted therapies.
Collí Alfaro, José Guillermo, "Implementation of User-Independent Hand Gesture Recognition Classification Models Using IMU and EMG-based Sensor Fusion Techniques" (2019). Electronic Thesis and Dissertation Repository. 6347.
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 License.