Eye-Tracking-Based Prediction of User Experience in VR Locomotion Using Machine Learning

dc.contributor.author: Gao, Hong
dc.contributor.author: Kasneci, Enkelejda
dc.contributor.editor: Umetani, Nobuyuki
dc.contributor.editor: Wojtan, Chris
dc.contributor.editor: Vouga, Etienne
dc.date.accessioned: 2022-10-04T06:41:55Z
dc.date.available: 2022-10-04T06:41:55Z
dc.date.issued: 2022
dc.description.abstract: VR locomotion is one of the most important design features of VR applications and is widely studied. When evaluating locomotion techniques, user experience is usually the first consideration, as it provides direct insight into the usability of a locomotion technique and users' thoughts about it. In the literature, user experience is typically measured with post-hoc questionnaires or surveys, while users' behavioral (i.e., eye-tracking) data during locomotion, which can reveal deeper subconscious thoughts of users, have rarely been considered and thus remain to be explored. To this end, we investigate the feasibility of classifying users experiencing VR locomotion into L-UE and H-UE (i.e., low- and high-user-experience groups) based on eye-tracking data alone. To collect data, we conducted a user study in which participants navigated a virtual environment using five locomotion techniques while their eye-tracking data were recorded. A standard questionnaire assessing usability and participants' perception of the locomotion technique was used to establish the ground-truth user experience. We trained our machine learning models on eye-tracking features extracted from the time-series data using a sliding-window approach. The best random forest model achieved an average accuracy of over 0.7 across 50 runs. Moreover, the SHapley Additive exPlanations (SHAP) approach uncovered the underlying relationships between eye-tracking features and user experience, and these findings were further supported by the statistical results. Our research provides a viable tool for assessing user experience with VR locomotion, which can in turn drive the improvement of locomotion techniques. Beyond VR locomotion, it also benefits the design of VR systems more broadly, where a good user experience is essential.
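The pipeline the abstract describes (summary features extracted from eye-tracking time series with a sliding window, then fed to a random forest classifier) can be sketched as below. The window length, step size, and the particular summary statistics are illustrative assumptions, not the paper's actual parameters; `sliding_window_features` is a hypothetical helper, not code from the paper.

```python
from statistics import mean, stdev

def sliding_window_features(samples, window_size, step):
    """Slide a window over a scalar eye-tracking signal (e.g. pupil
    diameter) and compute per-window summary features.

    Illustrative sketch: the real study may use different window
    parameters and a richer feature set (fixations, saccades, etc.).
    """
    features = []
    for start in range(0, len(samples) - window_size + 1, step):
        window = samples[start:start + window_size]
        features.append({
            "mean": mean(window),
            "std": stdev(window),
            "min": min(window),
            "max": max(window),
        })
    return features

# Example: ten pupil-diameter samples, window of 4, step of 2.
signal = [3.1, 3.3, 3.0, 3.4, 3.6, 3.5, 3.2, 3.8, 3.7, 3.9]
feats = sliding_window_features(signal, window_size=4, step=2)

# Each feature vector would then be labelled L-UE or H-UE from the
# questionnaire ground truth and passed to a classifier such as
# scikit-learn's RandomForestClassifier; SHAP values could afterwards
# attribute the prediction to individual eye-tracking features.
```

With the example above, the signal yields four overlapping windows, each summarized as a small feature vector; stacking these vectors across participants produces the training matrix for the classifier.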
dc.description.number: 7
dc.description.sectionheaders: Perception and Visualization
dc.description.seriesinformation: Computer Graphics Forum
dc.description.volume: 41
dc.identifier.doi: 10.1111/cgf.14703
dc.identifier.issn: 1467-8659
dc.identifier.pages: 589-599 (11 pages)
dc.identifier.uri: https://doi.org/10.1111/cgf.14703
dc.identifier.uri: https://diglib.eg.org:443/handle/10.1111/cgf14703
dc.publisher: The Eurographics Association and John Wiley & Sons Ltd.
dc.subject: CCS Concepts: Computing methodologies → Classification and regression trees; Human-centered computing → Empirical studies in HCI; Virtual reality
dc.title: Eye-Tracking-Based Prediction of User Experience in VR Locomotion Using Machine Learning
Files
Original bundle (2 files):
- v41i7pp589-599.pdf (Adobe Portable Document Format, 4.41 MB)
- cgf14703_v41i7pp589-599.pdf (Adobe Portable Document Format, 4.49 MB; ProjektDeal version)