Individualizing the New Interfaces: Extraction of User's Emotions from Facial Data

dc.contributor.author: Hupont, Isabelle
dc.contributor.author: Cerezo, Eva
dc.contributor.editor: Pere Brunet and Nuno Correia and Gladimir Baranoski
dc.date.accessioned: 2014-01-31T18:53:40Z
dc.date.available: 2014-01-31T18:53:40Z
dc.date.issued: 2006
dc.description.abstract: When developing new multimodal user interfaces, emotional information about the user may be of great interest. In this paper we present a simple and computationally feasible method for the automatic emotional classification of facial expressions. We propose the use of 10 characteristic points (a subset of the MPEG-4 feature points) to extract the relevant emotional information (basically five distances, the presence of wrinkles, and the mouth shape). The method defines and detects the six basic emotions (plus the neutral one) in terms of this information and has been fine-tuned with a database of 399 images. We analyze the effect of the different facial parameters, as well as of issues such as gender and ethnicity, on the classification results. For the moment, the method is applied to static images.
dc.description.seriesinformation: SIACG 2006: Ibero-American Symposium in Computer Graphics
dc.identifier.isbn: 3-905673-60-6
dc.identifier.uri: https://doi.org/10.2312/LocalChapterEvents/siacg/siacg06/179-185
dc.publisher: The Eurographics Association
dc.subject: Categories and Subject Descriptors (according to ACM CCS): I.3.6 [Computer Graphics]: Interaction Techniques; I.4.8 [Image Processing and Computer Vision]: Scene Analysis
dc.title: Individualizing the New Interfaces: Extraction of User's Emotions from Facial Data
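
The abstract above describes a classifier built on distances between facial feature points, plus wrinkle and mouth-shape cues. The following is a minimal, hypothetical Python sketch of that general idea only; the landmark names, the five distance definitions, and the nearest-prototype rule are illustrative assumptions and do not reproduce the authors' actual parameters or classification procedure.

    # Hypothetical sketch of distance-based emotion features from facial points.
    # Landmark names, distance choices, and the nearest-prototype rule are
    # assumptions for illustration, not the method described in the paper.
    import math

    def dist(p, q):
        # Euclidean distance between two 2-D points (x, y).
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def extract_features(pts):
        # pts: dict mapping landmark names to (x, y) pixel coordinates.
        # Distances are normalized by the inter-ocular distance so the
        # features do not depend on the size of the face in the image.
        scale = dist(pts["left_eye"], pts["right_eye"])
        return {
            "brow_to_eye": dist(pts["left_brow"], pts["left_eye"]) / scale,
            "eye_opening": dist(pts["left_eye_top"], pts["left_eye_bottom"]) / scale,
            "mouth_opening": dist(pts["mouth_top"], pts["mouth_bottom"]) / scale,
            "mouth_width": dist(pts["mouth_left"], pts["mouth_right"]) / scale,
            "brow_to_mouth": dist(pts["left_brow"], pts["mouth_top"]) / scale,
        }

    def classify(features, prototypes):
        # Nearest-prototype label: prototypes maps each emotion label
        # (six basic emotions plus neutral) to a feature dict with the
        # same keys as `features`; the closest prototype wins.
        def sq_err(a, b):
            return sum((a[k] - b[k]) ** 2 for k in a)
        return min(prototypes, key=lambda label: sq_err(features, prototypes[label]))

In use, one would fill the prototype dictionaries from labeled training images (the paper tunes its rules on a 399-image database) and call classify(extract_features(pts), prototypes) for each detected face.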
Files
179-185.pdf (307.14 KB, Adobe Portable Document Format)