Abstract:
Over the years, intelligent wheelchairs, which can gather information from the environment to make decisions on their own, have been developed to fulfil the needs of wheelchair users. To improve the human-robot interaction between wheelchairs and their users, wheelchairs come with several types of controllers designed to meet user satisfaction. The use of biological/electrophysiological signals for improved user satisfaction is in increasing demand all over the world. Electrophysiological signals can be used to incorporate user intentions for the navigation and object manipulation purposes of an intelligent wheelchair. Due to its high signal-to-noise ratio and the fact that it does not require invasive surgical procedures to extract the signal, electromyography is used for the above-mentioned purposes. Since electromyography is generated by the motion intention of the user, it can accurately represent the intention of the user. Furthermore, the number of working muscles required to build a controller for the above-mentioned purposes is high. Hence, it can be seen that most of the existing electromyography-based controllers use muscles associated with hand movements, as these can generate several combinations of muscle activations depending on the hand movement/grasping pattern. However, such controllers are invalidated if the wheelchair user is suffering from physical conditions like trans-radial amputation, where the upper limb is amputated between the wrist and the elbow; trans-humeral amputation, where the upper limb is amputated between the shoulder and the elbow; or partial/complete paralysis of the upper limb. This thesis proposes an electromyography-based controller for navigation and intelligent object manipulation of the wheelchair, which can be used even by wheelchair users with trans-radial amputation, trans-humeral amputation and partial limb function.
Any wheelchair user who has partial/complete function of the biceps brachii of both arms, the triceps brachii of the dominant arm and the right and left sternocleidomastoid muscles of the neck can use the proposed electromyography-based controller. Moreover, the controller is enhanced with a vision sensor and a proximity sensor for the intelligent object manipulation task. The controller was designed to be user friendly by using common user preferences and the user's previous experience in arranging objects on a wheelchair tray. Experiments were carried out to monitor the adaptability and usability of the proposed electromyography-based controller among different users. The results confirmed that it can be used by any user after calibrating it for a few trials. A human study was performed with different subjects to monitor whether patterns emerge in placing objects in different situations. After implementing a clustering algorithm, common locations and arrangements for object placements were identified. Furthermore, after the whole system was built inside the CoppeliaSim simulation environment, experiments were performed to monitor the capability of the proposed system in manipulating and arranging objects according to user preferences in different situations. The simulation results proved that the proposed system can place the objects in accordance with the results of the human study.
Citation:
Abayasiri, R.A.M. (2022). Incorporating biological signals for understanding user intentions for intelligent wheelchair [Master's thesis, University of Moratuwa]. Institutional Repository University of Moratuwa. http://dl.lib.uom.lk/handle/123/21382