Multi-Platform Intelligent System for Multimodal Human-Computer Interaction
keywords: Human-computer interaction, multi-platform, intelligent system architecture, multimodal system, humanoid robot
We present a flexible human-robot interaction architecture that incorporates emotions and moods to provide a natural experience for humans. To determine the user's emotional state, information on eye gaze and facial expression is combined with contextual cues such as whether the user is asking questions or has been quiet for some time. An appropriate robot behaviour is then selected from a multi-path scenario. The architecture can easily be adapted to interaction with non-embodied agents, such as avatars on a mobile device or a PC. We present the results of evaluating an implementation of the proposed architecture as a whole, as well as of its modules for detecting emotions and questions. The results are promising and provide a basis for further development.
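The abstract does not specify how the modalities are fused or how a behaviour is chosen; the following Python sketch is only a minimal illustration of the described pipeline (gaze and expression scores combined with contextual cues into a coarse emotional-state label, which then indexes one path of a behaviour scenario). All names, weights, and thresholds here are hypothetical, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """Hypothetical per-modality inputs; field names are illustrative."""
    gaze_engagement: float      # 0..1, e.g. fraction of time gazing at the robot
    expression_valence: float   # -1..1, output of a facial-expression classifier
    asked_question: bool        # contextual cue: user asked a question
    quiet_seconds: float        # contextual cue: seconds since the last utterance

def estimate_emotional_state(obs: Observation) -> str:
    """Fuse multimodal cues into a coarse emotional-state label.

    A simple weighted-score fusion chosen for illustration; the paper's
    actual fusion mechanism is not described in the abstract.
    """
    # Map gaze engagement from [0, 1] to [-1, 1] and combine with valence.
    score = 0.6 * obs.expression_valence + 0.4 * (2 * obs.gaze_engagement - 1)
    if obs.asked_question:
        score += 0.2            # asking questions suggests active engagement
    if obs.quiet_seconds > 30:
        score -= 0.3            # prolonged silence suggests disengagement
    if score > 0.3:
        return "engaged"
    if score < -0.3:
        return "disengaged"
    return "neutral"

# One path of a multi-path scenario: each estimated state maps to a behaviour.
BEHAVIOUR_BY_STATE = {
    "engaged": "continue_current_topic",
    "neutral": "ask_clarifying_question",
    "disengaged": "change_topic_to_re_engage",
}

if __name__ == "__main__":
    obs = Observation(gaze_engagement=0.8, expression_valence=0.4,
                      asked_question=True, quiet_seconds=2.0)
    state = estimate_emotional_state(obs)
    print(state, "->", BEHAVIOUR_BY_STATE[state])   # engaged -> continue_current_topic
```

Because the fusion and the behaviour table are decoupled, the same state estimator could serve an embodied robot or an on-screen avatar, which matches the multi-platform adaptability the abstract claims.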
reference: Vol. 40, 2021, No. 1, pp. 83–103