Abstract
To achieve believable interactions with artificial agents, dialogue must be not only relevant but also emotionally appropriate and consistent. This paper presents a comprehensive system that models the emotional states of users and an agent to dynamically adapt dialogue utterance selection. A Partially Observable Markov Decision Process (POMDP) with an online solver is used to model user reactions in real-time. The model decides the emotional content of the next utterance based on rewards derived from the users and the agent. Previous approaches are extended by jointly modeling user and agent emotions, maintaining this model over time with a memory, and enabling interactions with multiple users. A proof-of-concept user study demonstrates that the system can deliver and maintain distinct agent personalities during multiparty interactions.
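The abstract's core loop (maintain a belief over the user's hidden emotional state, then pick the utterance emotion that maximizes expected reward) can be sketched as follows. This is an illustrative toy only: the emotion labels, observation likelihoods, and reward values are hypothetical placeholders, and a greedy one-step lookahead stands in for the paper's online POMDP solver.

```python
EMOTIONS = ["positive", "neutral", "negative"]

# Hypothetical observation likelihoods P(observed cue | true user emotion);
# the paper's actual observation model is not reproduced here.
OBS_MODEL = {
    "positive": {"smile": 0.7, "flat": 0.2, "frown": 0.1},
    "neutral":  {"smile": 0.2, "flat": 0.6, "frown": 0.2},
    "negative": {"smile": 0.1, "flat": 0.3, "frown": 0.6},
}

def update_belief(belief, observation):
    """Bayesian belief update over the user's hidden emotional state."""
    posterior = {e: belief[e] * OBS_MODEL[e][observation] for e in EMOTIONS}
    total = sum(posterior.values())
    return {e: p / total for e, p in posterior.items()}

def expected_reward(belief, utterance_emotion):
    """Hypothetical reward: matching the user's likely emotion scores highest."""
    return sum(belief[e] * (1.0 if e == utterance_emotion else -0.5)
               for e in EMOTIONS)

def select_utterance_emotion(belief):
    """Greedy one-step lookahead (a stand-in for an online POMDP solver)."""
    return max(EMOTIONS, key=lambda a: expected_reward(belief, a))

# Example: start from a uniform prior, observe a frown, pick a response emotion.
belief = {e: 1.0 / len(EMOTIONS) for e in EMOTIONS}
belief = update_belief(belief, "frown")
chosen = select_utterance_emotion(belief)
```

A full solver would also plan over future observations and jointly track the agent's own emotional state, as the paper describes; the sketch shows only the belief-update and reward-maximization skeleton.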
DOI
10.1145/3383652.3423881
Publication Date
2020-10-20
Event
IVA '20: ACM International Conference on Intelligent Virtual Agents
Publication Title
Proceedings of the 20th ACM International Conference on Intelligent Virtual Agents
Publisher
ACM
ISBN
9781450375863
Embargo Period
2024-11-22
First Page
1
Last Page
8
Recommended Citation
Irfan, B., Narayanan, A., & Kennedy, J. (2020) 'Dynamic Emotional Language Adaptation in Multiparty Interactions with Agents', Proceedings of the 20th ACM International Conference on Intelligent Virtual Agents, pp. 1-8. ACM. Available at: https://doi.org/10.1145/3383652.3423881