Abstract:
Intelligent robot companions contribute significantly to improving living standards in
modern society. Therefore, human-like decision-making skills and perception are
sought after in the design of such robots. On the one hand, such features enable
a robot to be easily handled by a non-expert human user. On the other hand, the
robot will have the capability of dealing with humans without causing any disturbance
through its behavior. Mimicking human emotional intelligence is one of the most
reasonable ways of laying the foundation for emotional intelligence in robots.
As robots are widely deployed in social environments today, perception of the situation
or the intentions of a user prior to an interaction is required for a robot to be proactive.
Proactive robots need to understand what is communicated by human body
language before approaching a human. Social constraints on an interaction can be
alleviated through such an assessment.
This thesis addresses the problem of perceiving nonverbals in human behavior and
fusing human-environment semantic representations with a robot's cognition before
interacting with humans. The novelty lies in laying the groundwork to relate nonverbal
human behavior and features of the environment in order to generate context-aware
interactive responses during robot-initiated interaction, thereby informing the robot about
its environment. Toward this end, we introduce novel methods of perceiving human
nonverbals and spatial factors in the environment which make up a context, and
we integrate that knowledge to determine appropriate responses from a robot to assist
its user. Visual information acquired by a vision sensor was analyzed, and the level of
emotional engagement demanded by the user's nonverbals was evaluated, before the robot
initiated an interaction. Based on this analysis, the robot's conversational and proxemic
behavior was adjusted to maintain an empathetic relationship between the user and
the robot. Our algorithms efficiently sustained the empathy between user and robot so
that the interaction resembles human-human interaction to a large extent. To assist
the main problem, we formulated novel methods to recognize human nonverbals such
as postures, gestures, hand poses, psychophysiological behavior and human
activities, and to decode the emotional hints displayed to the outside world. In support
of this work, we conducted a series of human studies to explore the patterns in human
behavior which could be perceived by a proactive robot using its cognitive capabilities.
We introduce separate systems which can decode the sentiments of humans using
observable cues based on accepted social norms. We detail the meanings of human
nonverbals by observing human behavior over time and evaluating the context for
any patterns in behavior. Ambiguities in human behavior and random, meaningless
behaviors can be filtered out through this approach. This approach further reduces the
negative effect of human responses that can be faked, such as facial expressions and
words. Finally, we introduce an adaptive approach towards robot-initiated human-robot
interaction by letting a robot observe a context and generate responses, continuously
adapting those responses as human behavior changes. We first developed algorithms
based on a limited number of observable human cues and decoded their sentiments
based on modern psychophysiological interpretations of human behavior. Next,
we expanded such approaches to multiple observable human cues. Finally, we
integrated observations of the human and the environment which together create the context
during human-robot interaction (HRI). Hence we combined all the recognition
approaches to perceive a complete scenario which comprises the user, robot and the
environment.
Building upon unimodal systems that identify these features independently, we propose a
multimodal approach that evaluates the above features together to understand a scenario. Through
this approach, we made an effort to make the proactive behavior of a social robot
more instinctive, ethical and socially acceptable, or simply, humanlike. We evaluate
this approach by means of physical experiments in simulated social and domestic
environments and demonstrate its performance on appropriate occasions as determined
by a robot according to the formulated criteria for perceiving a context.
Citation:
Sirithunge, H.P.C. (2020). Framework for adaptive human-robot interaction initiation for domestic environments [Master's thesis, University of Moratuwa]. Institutional Repository University of Moratuwa. http://dl.lib.uom.lk/handle/123/18712