Emotional input for character-based interactive storytelling

Hdl Handle:
http://hdl.handle.net/10149/91146
Title:
Emotional input for character-based interactive storytelling
Book Title:
The eighth international conference on autonomous agents and multiagent systems, Budapest, Hungary, May 11-15, 2009, proceedings
Authors:
Cavazza, M. O. (Marc); Pizzi, D. (David); Charles, F. (Fred); Vogt, T. (Thurid); André, E. (Elisabeth)
Editors:
Sierra, C. (Carles); Castelfranchi, C. (Cristiano); Decker, K. S. (Keith); Sichman, J. S. (Jaime)
Affiliation:
University of Teesside. School of Computing.
Citation:
Cavazza, M. O. et al. (2009) 'Emotional input for character-based interactive storytelling', 8th international conference on autonomous agents and multiagent systems (AAMAS), Budapest, Hungary, May 11-15, 2009, in Sierra, C. et al. (eds) The eighth international conference on autonomous agents and multiagent systems, Budapest, Hungary, May 11-15, 2009, proceedings. International Foundation for Autonomous Agents, pp. 313-320.
Publisher:
International Foundation for Autonomous Agents
Conference:
8th international conference on autonomous agents and multiagent systems (AAMAS), Budapest, Hungary, May 11-15, 2009.
Issue Date:
May-2009
URI:
http://hdl.handle.net/10149/91146
Additional Links:
http://portal.acm.org/toc.cfm?id=1558013&type=proceeding&coll=GUIDE&dl=GUIDE&CFID=76447834&CFTOKEN=16479542
Abstract:
In most Interactive Storytelling systems, user interaction is based on natural language communication with virtual agents, either through isolated utterances or through dialogue. Natural language communication is also an essential element of interactive narratives in which the user is supposed to impersonate one of the story's characters. Whilst techniques for narrative generation and agent behaviour have made significant progress in recent years, natural language processing remains a bottleneck hampering the scalability of Interactive Storytelling systems. In this paper, we introduce a novel interaction technique based solely on emotional speech recognition. It allows the user to take part in dialogue with virtual actors without any constraints on style or expressivity, by mapping the recognised emotional categories to narrative situations and virtual characters' feelings. Our Interactive Storytelling system uses an emotional planner to drive characters' behaviours. The main feature of this approach is that characters' feelings are part of the planning domain and are at the heart of narrative representations. The emotional speech recogniser analyses the speech signal to produce a variety of features which can be used to define ad hoc categories on which to train the system. The content of our interactive narrative is an adaptation of one chapter of the nineteenth-century classic novel, Madame Bovary, which is well suited to a formalisation in terms of characters' feelings. At various stages of the narrative, the user can address the main character or respond to her, impersonating her lover. The emotional category extracted from the user's utterance is analysed in terms of the current narrative context, which includes characters' beliefs, feelings and expectations, to produce a specific influence on the target character. This influence becomes visible through a change in the character's behaviour, achieving a high level of realism for the interaction.
A limited number of emotional categories is sufficient to drive the narrative across multiple courses of action, since the narrative comprises over thirty narrative functions. We report results from a fully implemented prototype, both as a proof of concept and, through a preliminary user study, in terms of usability.
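The core mapping described in the abstract, from a recognised emotional category, interpreted against the current narrative context, to an influence on the target character's feelings, can be sketched as follows. This is a minimal illustrative sketch only: the category names, context keys and feeling values are hypothetical and are not taken from the actual system or its planning domain.

```python
# Hypothetical sketch: a recognised emotional category, combined with the
# narrative context (here, the character's current expectation), selects an
# influence on the target character's feelings. All labels are illustrative.

def influence(category: str, context: dict) -> dict:
    """Return feeling updates for the target character, given the
    recognised emotional category and the narrative context."""
    rules = {
        # (recognised category, character's expectation) -> feeling deltas
        ("affection", "expects_reassurance"): {"hope": 1, "despair": -1},
        ("anger", "expects_reassurance"): {"despair": 1, "hope": -1},
        ("neutral", "expects_reassurance"): {},
    }
    return rules.get((category, context.get("expectation", "")), {})

# A planner-style update of the character's feelings, which would in turn
# change her visible behaviour.
feelings = {"hope": 0, "despair": 0}
deltas = influence("affection", {"expectation": "expects_reassurance"})
for feeling, delta in deltas.items():
    feelings[feeling] += delta
print(feelings)  # {'hope': 1, 'despair': -1}
```

In the actual system the feelings are part of the planning domain, so such an update would alter which operators the emotional planner can apply, rather than a simple dictionary as above.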
Type:
Meetings and Proceedings; Book Chapter
Language:
en
Keywords:
interactive narrative; embodied conversational agents; affective interfaces; emotional input; interactive storytelling
Series/Report no.:
Proceedings; 1
ISBN:
9780981738161

All Items in TeesRep are protected by copyright, with all rights reserved, unless otherwise indicated.