Institutional Repository, University of Moratuwa.

Enhancing the capabilities of intelligent wheelchair robots in approaching and docking to service scenarios


dc.contributor.advisor Jayasekara AGBP
dc.contributor.author Hiroshaan V
dc.date.accessioned 2021
dc.date.available 2021
dc.date.issued 2021
dc.identifier.citation Hiroshaan V. (2021). Enhancing the capabilities of intelligent wheelchair robots in approaching and docking to service scenarios [Master's thesis, University of Moratuwa]. Institutional Repository University of Moratuwa. http://dl.lib.uom.lk/handle/123/20864
dc.identifier.uri http://dl.lib.uom.lk/handle/123/20864
dc.description.abstract The world presently faces the issues of aging and disability resulting from accidents and other causes. Against this background, wheelchairs are an indispensable partner in the lives of many differently-abled people, supporting their day-to-day activities. Many researchers are therefore focusing on wheelchairs to develop better robotic solutions. However, automated powered wheelchairs have not yet reached the required level of autonomy in areas such as docking behavior for a specific task. Approaching and docking require correct layout information, identification of suitable docking positions in the environment, and assessment of those positions for safety. A machine learning system trained on various docking-level setups is one viable approach; however, retraining is required for different scenarios such as changed furniture placements or random real-time changes in the layout, and vision data is affected by changes in lighting conditions. The method for detecting docking surfaces used here, in contrast, depends on geometric information computed from depth data, which makes it invariant to scene or lighting changes. As the main aim of this research, a human study was performed, using 3D point cloud data, to identify the docking behavior of a wheelchair at a table or desk in four scenarios: writing, reading, eating, and using a laptop. This research developed a novel method for determining comfortable docking locations based on ergonomics data gathered from human subjects during actual wheelchair usage. The analyzed data can be applied within a single algorithm to obtain a safe docking location from 3D point cloud data. For docking at a table, two situations were evaluated: a table with an object on it and a table with no object on it. If an object is on the table, the wheelchair docks based on the object's location and the availability of open space; if no object is present, it docks based on the user's preference and the available free space. The wheelchair also has built-in navigation and obstacle avoidance so that it can travel more independently in a residential setting. From the human study, the optimal distances between the wheelchair's back end and the table were identified as 29 cm, 27.75 cm, 27.25 cm, and 40 cm for eating, writing, reading, and using a laptop respectively. The optimal height difference between the table surface and the wheelchair seat was the same for all scenarios, 32.25 cm for a particular table (height = 81 cm); seat height did not depend on the scenario. The obtained results were applied to the simulation design for the above two situations and validated through forty test cases. en_US
dc.language.iso en en_US
dc.subject DOCKING BEHAVIOR en_US
dc.subject POINT CLOUD en_US
dc.subject WHEELCHAIR en_US
dc.subject ROS en_US
dc.subject NAVIGATION en_US
dc.subject HUMAN STUDY en_US
dc.subject INDUSTRIAL AUTOMATION -Dissertation en_US
dc.subject ELECTRICAL ENGINEERING -Dissertation en_US
dc.title Enhancing the capabilities of intelligent wheelchair robots in approaching and docking to service scenarios en_US
dc.type Thesis-Abstract en_US
dc.identifier.faculty Engineering en_US
dc.identifier.degree MSc. in Industrial Automation en_US
dc.identifier.department Department of Electrical Engineering en_US
dc.date.accept 2021
dc.identifier.accno TH4708 en_US
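
Illustrative note (not part of the item record): the abstract reports scenario-dependent docking distances and a fixed seat-to-table height gap. The following is a minimal Python sketch of how such values could be applied when selecting a docking target; it is not the thesis implementation. The distances (29, 27.75, 27.25, 40 cm) and the 32.25 cm gap are taken from the abstract, while the function and type names (plan_docking, DockingTarget), the default wheelchair width, and the free-space check are assumptions introduced here for illustration.

    from dataclasses import dataclass

    # Optimal wheelchair-back-end-to-table distances from the human study (cm),
    # as reported in the abstract.
    DOCKING_DISTANCE_CM = {
        "eating": 29.0,
        "writing": 27.75,
        "reading": 27.25,
        "laptop": 40.0,
    }

    # Height difference between table surface and wheelchair seat (cm); the study
    # found the same value for all four scenarios for an 81 cm table.
    SEAT_TO_TABLE_GAP_CM = 32.25

    @dataclass
    class DockingTarget:
        distance_cm: float      # gap between wheelchair back end and table edge
        seat_height_cm: float   # target seat height for the detected table

    def plan_docking(scenario: str, table_height_cm: float,
                     free_width_cm: float, wheelchair_width_cm: float = 65.0) -> DockingTarget:
        """Return a docking target for the given activity scenario.

        free_width_cm is the open space along the table edge, e.g. estimated
        from the 3D point cloud after accounting for objects on the table.
        The wheelchair width default is an assumed value, not from the thesis.
        """
        if scenario not in DOCKING_DISTANCE_CM:
            raise ValueError(f"unknown scenario: {scenario!r}")
        if free_width_cm < wheelchair_width_cm:
            raise ValueError("not enough free space along the table edge to dock")
        return DockingTarget(
            distance_cm=DOCKING_DISTANCE_CM[scenario],
            seat_height_cm=table_height_cm - SEAT_TO_TABLE_GAP_CM,
        )

    if __name__ == "__main__":
        target = plan_docking("writing", table_height_cm=81.0, free_width_cm=80.0)
        print(target)   # DockingTarget(distance_cm=27.75, seat_height_cm=48.75)

For the 81 cm table used in the study, this yields a target seat height of 48.75 cm, consistent with the 32.25 cm gap reported in the abstract.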

