Browsing by Author "Borst, Christoph W."
Now showing 1 - 2 of 2
Item
Deep Learning on Eye Gaze Data to Classify Student Distraction Level in an Educational VR Environment -- Honorable Mention for Best Paper Award
(The Eurographics Association, 2021) Asish, Sarker Monojit; Hossain, Ekram; Kulshreshth, Arun K.; Borst, Christoph W.; Orlosky, Jason and Reiners, Dirk and Weyers, Benjamin
Educational VR may increase engagement and retention compared to traditional learning, for some topics or students. However, a student could still get distracted and disengaged due to stress, mind-wandering, unwanted noise, external alerts, etc. Student eye gaze can be useful for detecting distraction. For example, we previously considered gaze visualizations to help teachers understand student attention to better identify or guide distracted students. However, it is not practical for a teacher to monitor a large number of student indicators while teaching. To help filter students based on distraction level, we consider a deep learning approach to detect distraction from gaze data. The key aspects are: (1) we created a labeled eye gaze dataset (3.4M data points) from an educational VR environment, (2) we propose an automatic system to gauge a student's distraction level from gaze data, and (3) we apply and compare three deep neural classifiers for this purpose. A proposed CNN-LSTM classifier achieved an accuracy of 89.8% for classifying distraction, per educational activity section, into one of three levels.

Item
Towards Improving Educational Virtual Reality by Classifying Distraction using Deep Learning
(The Eurographics Association, 2022) Khokhar, Adil; Borst, Christoph W.; Hideaki Uchiyama; Jean-Marie Normand
Distractions can cause students to miss out on critical information in educational Virtual Reality (VR) environments. Our work uses generalized features (angular velocities, positional velocities, pupil diameter, and eye openness) extracted from VR headset sensor data (head-tracking, hand-tracking, and eye-tracking) to train a deep CNN-LSTM classifier to detect distractors in our educational VR environment. We present preliminary results demonstrating a 94.93% accuracy for our classifier, an improvement over two recent approaches in both accuracy and the generality of the features used. We believe that our work can be used to improve educational VR by providing a more accurate and generalizable approach for distractor detection.
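
Both abstracts above describe CNN-LSTM classifiers over windowed headset sensor features. As a rough illustration only, the following Keras sketch shows the general shape of such a sequence classifier; the window length, feature count, layer sizes, and variable names are hypothetical placeholders, not the architectures reported in either paper.

```python
# Minimal sketch of a CNN-LSTM sequence classifier for gaze/headset features,
# assuming fixed-length windows of per-frame features and three
# distraction-level classes. All sizes here are illustrative assumptions.
import numpy as np
from tensorflow.keras import layers, models

WINDOW_LEN = 120   # hypothetical: frames per activity-section window
N_FEATURES = 6     # hypothetical: features per frame (gaze angles, pupil diameter, ...)
N_CLASSES = 3      # distraction levels, as in the first abstract

model = models.Sequential([
    layers.Input(shape=(WINDOW_LEN, N_FEATURES)),
    # 1D convolutions extract local temporal patterns from the sensor stream
    layers.Conv1D(32, kernel_size=5, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.Conv1D(64, kernel_size=5, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    # LSTM summarizes longer-range temporal context before classification
    layers.LSTM(64),
    layers.Dropout(0.3),
    layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Smoke test with random stand-in data; real use would window a labeled
# eye-gaze dataset like the one described above.
X = np.random.rand(8, WINDOW_LEN, N_FEATURES).astype("float32")
y = np.random.randint(0, N_CLASSES, size=8)
model.fit(X, y, epochs=1, batch_size=4, verbose=0)
```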