Abstract:
Assistive robots are developed to improve human living standards.
To be a human-like companion, an assistive robot needs to be
friendly, reliable, and understandable. A robot should be able to
navigate an environment according to the user's commands, which
may be vocal or gestural. Combining these command types to
identify navigation information enhances user-robot interaction
and makes navigation more effective and purposeful. Moreover,
understanding the relationships between navigation commands
makes the robot a more human-friendly companion. A robot should
be capable of interpreting spatial information expressed in
uncertain terms, and it should be able to improve its knowledge
by communicating with the user. The Cognitive Map Creator (CMC)
has been introduced to create a cognitive map of direct
navigational commands, enhancing the interaction between the
human and the robot and helping to make the robot more
human-friendly. The Conversation Management Module (CMM), the
Spatial Data Interpreter (SDI), and the Navigational Command
Identifier (NCI) have been introduced in order to create this
cognitive map of navigational commands. The capabilities of the
robot have been demonstrated and evaluated through experimental
results.
Citation:
Bandara, H.M.R.T., Basnayake, B.M.S.S., Jayasekara, A.G.B.P., & Chandima, D.P. (2018). Identification of cognitive navigational commands for mobile robots based on hand gestures and vocal commands. In R. Samarasinghe & S. Abeygunawardana (Eds.), Proceedings of the 2nd International Conference on Electrical Engineering 2018 (pp. 144–149). Institute of Electrical and Electronics Engineers, Inc. https://ieeexplore.ieee.org/xpl/conhome/8528200/proceeding