Abstract
We present our experiments on attitude detection based on annotated multi-modal dialogue data. Our long-term goal is to establish a computational model able to predict attitudinal patterns in human-human dialogue. We believe such prediction algorithms are useful tools in the pursuit of realistic discourse behavior in conversational agents and other intelligent man-machine interfaces. The present paper deals with two important subgoals in particular: how to establish a meaningful and consistent set of annotation categories for attitude annotation, and how to relate the annotation data to the recorded data (audio and video) in computational models of attitude prediction. We present our current results, including a recommended set of analytical annotation labels and a recommended setup for extracting linguistically meaningful data even from noisy audio and video signals.
Original language | English |
---|---|
Title of host publication | NEALT2012: Proceedings of the 4th Nordic Symposium on Multimodal Communication, Nov. 15-16, Gothenburg, Sweden |
Editors | Jens Allwood, Elisabeth Ahlsén, Patrizia Paggio, Kristiina Jokinen |
Place of Publication | Göteborg |
Publisher | Göteborg Universitet |
Publication date | 2013 |
Pages | 47-53 |
Publication status | Published - 2013 |
Event | The 4th Nordic Symposium on Multimodal Communication, University of Gothenburg, Gothenburg, Sweden, 15-16 Nov 2012 (Conference number: 4) |
Conference
Conference | The 4th Nordic Symposium on Multimodal Communication |
---|---|
Number | 4 |
Location | University of Gothenburg |
Country/Territory | Sweden |
City | Gothenburg |
Period | 15/11/2012 → 16/11/2012 |
Series | Linköping Electronic Conference Proceedings |
---|---|
Number | 93 |
ISSN | 1650-3686 |