Institutional Repository, University of Moratuwa.

Development of a real-time grasping pattern classification system by fusing EMG-vision for hand prostheses

dc.contributor.advisor Punchihewa HKG
dc.contributor.advisor Madusanka DGK
dc.contributor.author Perera GDM
dc.date.accessioned 2021
dc.date.available 2021
dc.date.issued 2021
dc.identifier.citation Perera, G.D.M. (2021). Development of a real-time grasping pattern classification system by fusing EMG-vision for hand prostheses [Master's thesis, University of Moratuwa]. Institutional Repository University of Moratuwa. http://dl.lib.uom.lk/handle/123/22460
dc.identifier.uri http://dl.lib.uom.lk/handle/123/22460
dc.description.abstract Electromyography (EMG) based trans-radial prostheses have revolutionized the prosthetic industry due to their ability to control the robotic hand using human intention. Although recently developed EMG-based prosthetic hands can classify a significant number of wrist motions, classifying grasping patterns in real-time is challenging. However, wrist motions alone cannot enable a prosthetic hand to grasp objects properly without performing an appropriate grasping pattern. The combination of EMG and vision has addressed this problem to a certain extent; however, such systems have not achieved significant performance in real-time. This study proposed a vision-EMG fusion method that improves the real-time prediction accuracy of the EMG classification system by merging a probability matrix representing the usage of the six grasping patterns for the targeted object. The You Only Look Once (YOLO) object detection algorithm was utilized to retrieve the probability matrix of the identified object, which was then used to correct the classification error of the EMG classification system by applying Bayesian fusion. Experiments were carried out to collect EMG data from six muscles of 15 subjects during grasping actions for classifier development. In addition, an online survey was conducted to collect data to calculate the respective conditional probability matrix for the selected objects. Finally, five optimized supervised-learning EMG classifiers, Artificial Neural Network (ANN), K-Nearest Neighbor (KNN), Linear Discriminant Analysis (LDA), Naive Bayes (NB), and Decision Tree (DT), were compared to select the best classifier for fusion. The real-time experiment results revealed that the ANN outperformed the other selected classifiers, achieving the highest mean True Positive Rate (mTPR) of M = 72.86% (SD = 17.89%) across all six grasping patterns. Furthermore, the feature set identified in the experiment (age, gender, and handedness of the user) was shown to increase the mTPR of the ANN by M = 16.05% (SD = 2.70%). The proposed system takes M = 393.89 ms (SD = 178.23 ms) to produce a prediction; therefore, the user did not feel a delay between intention and execution. Furthermore, the proposed system allowed the user to apply multiple suitable grasping patterns to a single object, as in real life. In future work, the functionality of the system should be expanded to include wrist motions, and the system should be evaluated on amputees. en_US
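The abstract describes correcting the EMG classifier output with an object-conditional probability matrix via Bayesian fusion. The sketch below is a minimal illustration of that idea only, assuming a naive element-wise combination of the two distributions followed by renormalization; the grasp-pattern names, probability values, and function names are hypothetical and are not taken from the thesis.

```python
import numpy as np

# Illustrative six grasping patterns (assumed labels, not from the thesis).
GRASP_PATTERNS = ["power", "precision", "lateral", "tripod", "spherical", "hook"]

def fuse(emg_probs: np.ndarray, object_prior: np.ndarray) -> int:
    """Return the index of the most probable grasping pattern.

    emg_probs    : P(pattern | EMG features), shape (6,), from the EMG classifier.
    object_prior : P(pattern | detected object), shape (6,), looked up from a
                   survey-based conditional probability matrix for the YOLO class.
    """
    fused = emg_probs * object_prior   # element-wise Bayesian combination
    fused /= fused.sum()               # renormalize into a valid distribution
    return int(np.argmax(fused))

# Example: the EMG output is ambiguous between two patterns, but the detected
# object (e.g. a mug) makes a power grasp far more likely after fusion.
emg_probs = np.array([0.35, 0.30, 0.10, 0.10, 0.10, 0.05])
object_prior = np.array([0.60, 0.10, 0.05, 0.05, 0.15, 0.05])
print(GRASP_PATTERNS[fuse(emg_probs, object_prior)])   # -> "power"
```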
dc.language.iso en en_US
dc.subject SURFACE ELECTROMYOGRAPHY en_US
dc.subject REAL-TIME CLASSIFICATION en_US
dc.subject GRASPING PATTERN en_US
dc.subject SENSOR FUSION en_US
dc.subject VISION FEEDBACK en_US
dc.subject MECHANICAL ENGINEERING- Dissertation en_US
dc.title Development of a real-time grasping pattern classification system by fusing EMG-vision for hand prostheses en_US
dc.type Thesis-Abstract en_US
dc.identifier.faculty Engineering en_US
dc.identifier.degree MSc in Mechanical Engineering by research en_US
dc.identifier.department Department of Mechanical Engineering en_US
dc.date.accept 2021
dc.identifier.accno TH5086 en_US

