Emotion-based Interaction Technique Using User's Voice and Facial Expressions in Virtual and Augmented Reality

Date
2023
Publisher
The Eurographics Association
Abstract
This paper presents a novel interaction approach based on a user's emotions within augmented reality (AR) and virtual reality (VR) environments to achieve immersive interaction with virtual intelligent characters. To identify the user's emotions from voice, the Google Speech-to-Text API transcribes speech, and the RoBERTa language model then classifies the emotions expressed in the transcript. In the AR environment, the intelligent character changes the styles and properties of objects based on the user's recognized emotions during a dialog. In the VR environment, by contrast, the movement of the user's eyes and lower face is tracked with the VIVE Pro Eye and Facial Tracker, and EmotionNet performs emotion recognition; the virtual environment then changes according to the recognized emotions. Our findings suggest a promising direction for integrating emotionally intelligent characters in AR/VR using generative AI and facial expression recognition.
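The voice-based pipeline described above (transcribe speech, classify the transcript's emotion, then adapt AR object styles) can be sketched in Python. This is a minimal illustration only: the transcription and RoBERTa classification steps are stubbed with a keyword matcher, and all function names, emotion labels, and style properties are hypothetical, not taken from the paper.

```python
# Hedged sketch of the voice-emotion pipeline: a transcript (the paper
# obtains it via the Google Speech-to-Text API) is classified into an
# emotion (the paper uses a RoBERTa model; a keyword stub stands in
# here), and the emotion is mapped to AR object style changes.

EMOTION_KEYWORDS = {  # stand-in for a trained classifier's label space
    "joy": {"great", "happy", "wonderful"},
    "sadness": {"sad", "lonely", "gloomy"},
    "anger": {"angry", "hate", "furious"},
}

STYLE_FOR_EMOTION = {  # hypothetical AR object styles per emotion
    "joy": {"color": "warm_yellow", "particles": "sparkles"},
    "sadness": {"color": "muted_blue", "particles": "rain"},
    "anger": {"color": "deep_red", "particles": "embers"},
    "neutral": {"color": "default", "particles": "none"},
}

def classify_emotion(transcript: str) -> str:
    """Stub for the RoBERTa classifier: keyword match on the transcript."""
    words = set(transcript.lower().split())
    for emotion, keywords in EMOTION_KEYWORDS.items():
        if words & keywords:
            return emotion
    return "neutral"

def style_for_utterance(transcript: str) -> dict:
    """Map a spoken utterance to the object style the AR character applies."""
    return STYLE_FOR_EMOTION[classify_emotion(transcript)]

print(style_for_utterance("I feel so happy today"))
# → {'color': 'warm_yellow', 'particles': 'sparkles'}
```

In a real implementation, `classify_emotion` would call a fine-tuned RoBERTa text-classification model on the Speech-to-Text transcript; the mapping from emotion label to scene changes would stay structurally the same.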
CCS Concepts: Human-centered computing -> Human computer interaction (HCI); Hardware -> VIVE Pro Eye; Facial Tracker

        
@inproceedings{10.2312:pg.20231286,
  booktitle = {Pacific Graphics Short Papers and Posters},
  editor    = {Chaine, Raphaëlle and Deng, Zhigang and Kim, Min H.},
  title     = {{Emotion-based Interaction Technique Using User's Voice and Facial Expressions in Virtual and Augmented Reality}},
  author    = {Ko, Beom-Seok and Kang, Ho-San and Lee, Kyuhong and Braunschweiler, Manuel and Zünd, Fabio and Sumner, Robert W. and Choi, Soo-Mi},
  year      = {2023},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-03868-234-9},
  DOI       = {10.2312/pg.20231286}
}