Head of the Social Cognitive Systems Group at the Faculty of Technology and the Research Center on “Cognitive Interaction Technology” (CITEC) at Bielefeld University
Human-aware conversational agents
The development of conversational agents has made impressive advances in recent decades, ranging from statistical and neural approaches to dialogue management, to the understanding and generation of multimodal signals, to the large-scale deployment of voice assistants. This progress was made possible mainly by applying machine learning techniques to ever-increasing amounts of training data. While these approaches yield dialogue, language, or behavior models that cover larger domains, they impose assumptions of universality and generality on the structures and features of dialogue and even of interlocutors. In many applications, however, conversational agents need to adapt rapidly and continuously to an individual user and an individual interaction, from as little data as a few verbal or nonverbal signals. I will argue that to achieve this goal we need to make conversational agents aware of how specific human users coordinate communication and dialogue, how they cognitively process socio-communicative behavior, and how they perceive conversational assistants. Along these lines, I will discuss work on conversational agents that are attentive and responsive to the interaction-relevant mental states (stance) of their human interlocutor and that can process, i.e., understand and generate, the semantic and pragmatic functions of multimodal behavior. I will also present results from several studies on how different kinds of conversational agents are perceived by different kinds of users.