Deep Learning on Eye Gaze Data to Classify Student Distraction Level in an Educational VR Environment -- Honorable Mention for Best Paper Award

Date
2021
Publisher
The Eurographics Association
Abstract
Educational VR may increase engagement and retention compared to traditional learning, at least for some topics or students. However, a student can still become distracted and disengaged due to stress, mind-wandering, unwanted noise, external alerts, etc. Student eye gaze can be useful for detecting distraction. For example, we previously considered gaze visualizations to help teachers understand student attention and better identify or guide distracted students. However, it is not practical for a teacher to monitor a large number of student indicators while teaching. To help filter students by distraction level, we consider a deep learning approach that detects distraction from gaze data. The key aspects are: (1) we created a labeled eye gaze dataset (3.4M data points) from an educational VR environment, (2) we propose an automatic system to gauge a student's distraction level from gaze data, and (3) we apply and compare three deep neural classifiers for this purpose. A proposed CNN-LSTM classifier achieved an accuracy of 89.8% for classifying distraction, per educational activity section, into one of three levels.
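The paper's details are not reproduced on this page, but the abstract's CNN-LSTM idea (convolutions over gaze-feature sequences followed by an LSTM and a three-class output) can be sketched as follows. This is a minimal illustration, not the authors' architecture: the feature count, kernel sizes, hidden size, and the use of PyTorch are all assumptions.

```python
import torch
import torch.nn as nn

class GazeCNNLSTM(nn.Module):
    """Hypothetical CNN-LSTM sketch for classifying a gaze sequence
    into one of three distraction levels (not the paper's exact model)."""

    def __init__(self, n_features=6, n_classes=3, hidden=64):
        super().__init__()
        # 1-D convolutions extract short-range temporal patterns
        # from per-sample gaze features (e.g., gaze direction, pupil size)
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # LSTM models longer-range structure over the conv features
        self.lstm = nn.LSTM(32, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):            # x: (batch, n_features, time)
        z = self.conv(x)             # (batch, 32, time // 2)
        z = z.transpose(1, 2)        # (batch, time // 2, 32)
        _, (h, _) = self.lstm(z)     # final hidden state per sequence
        return self.head(h[-1])      # (batch, n_classes) logits

model = GazeCNNLSTM()
logits = model(torch.randn(2, 6, 100))  # two sequences of 100 gaze samples
```

Training would pair such sequences with per-section distraction labels and minimize cross-entropy over the three classes.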
@inproceedings{10.2312:egve.20211326,
  booktitle = {ICAT-EGVE 2021 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Orlosky, Jason and Reiners, Dirk and Weyers, Benjamin},
  title     = {{Deep Learning on Eye Gaze Data to Classify Student Distraction Level in an Educational VR Environment -- Honorable Mention for Best Paper Award}},
  author    = {Asish, Sarker Monojit and Hossain, Ekram and Kulshreshth, Arun K. and Borst, Christoph W.},
  year      = {2021},
  publisher = {The Eurographics Association},
  issn      = {1727-530X},
  isbn      = {978-3-03868-142-7},
  doi       = {10.2312/egve.20211326}
}