Browsing by Author "Kurzhals, Kuno"
Item
Been There, Seen That: Visualization of Movement and 3D Eye Tracking Data from Real-World Environments (The Eurographics Association and John Wiley & Sons Ltd., 2023)
Pathmanathan, Nelusa; Öney, Seyda; Becher, Michael; Sedlmair, Michael; Weiskopf, Daniel; Kurzhals, Kuno; Bujack, Roxana; Archambault, Daniel; Schreck, Tobias
The distribution of visual attention can be evaluated using eye tracking, providing valuable insights into usability issues and interaction patterns. However, when used in real, augmented, and collaborative environments, new challenges arise that go beyond desktop scenarios and purely virtual environments. Toward addressing these challenges, we present a visualization technique that provides complementary views on the movement and eye tracking data recorded from multiple people in real-world environments. Our method is based on a space-time cube visualization and a linked 3D replay of recorded data. We showcase our approach with an experiment that examines how people investigate an artwork collection. The visualization provides insights into how people moved and inspected individual pictures in their spatial context over time. In contrast to existing methods, this analysis is possible for multiple participants without extensive annotation of areas of interest. Our technique was evaluated with a think-aloud experiment to investigate analysis strategies and an interview with domain experts to examine the applicability in other research fields.

Item
Visual Analysis of Spatio-temporal Phenomena with 1D Projections (The Eurographics Association and John Wiley & Sons Ltd., 2021)
Franke, Max; Martin, Henry; Koch, Steffen; Kurzhals, Kuno; Borgo, Rita; Marai, G. Elisabeta; Landesberger, Tatiana von
To understand critical spatio-temporal events such as earthquakes, fires, or the spreading of a disease, it is crucial to visually extrapolate the characteristics of their evolution.
Animations embedded in the spatial context can be helpful for understanding details, but have proven to be less effective for overview and comparison tasks. We present an interactive approach for the exploration of spatio-temporal data, based on a set of neighborhood-preserving 1D projections which help identify patterns and support the comparison of numerous time steps and multivariate data. An important objective of the proposed approach is the visual description of local neighborhoods in the 1D projection to reveal patterns of similarity and propagation. As this locality cannot generally be guaranteed, we provide a selection of different projection techniques, as well as a hierarchical approach, to support the analysis of different data characteristics. In addition, we offer an interactive exploration technique to reorganize and improve the mapping locally according to users' foci of interest. We demonstrate the usefulness of our approach with different real-world application scenarios and discuss the feedback we received from domain and visualization experts.

Item
Visual Gaze Labeling for Augmented Reality Studies (The Eurographics Association and John Wiley & Sons Ltd., 2023)
Öney, Seyda; Pathmanathan, Nelusa; Becher, Michael; Sedlmair, Michael; Weiskopf, Daniel; Kurzhals, Kuno; Bujack, Roxana; Archambault, Daniel; Schreck, Tobias
Augmented Reality (AR) provides new ways for situated visualization and human-computer interaction in physical environments. Current evaluation procedures for AR applications rely primarily on questionnaires and interviews, providing qualitative means to assess usability and task solution strategies. Eye tracking extends these existing evaluation methodologies by providing indicators for visual attention to virtual and real elements in the environment. However, the analysis of viewing behavior, especially the comparison of multiple participants, is difficult to achieve in AR.
Specifically, the definition of areas of interest (AOIs), which is often a prerequisite for such analysis, is cumbersome and tedious with existing approaches. To address this issue, we present a new visualization approach to define AOIs, label fixations, and investigate the resulting annotated scanpaths. Our approach utilizes automatic annotation of gaze on virtual objects and an image-based approach that also considers spatial context for the manual annotation of objects in the real world. Our results show that, with our approach, eye tracking data from AR scenes can be annotated and analyzed flexibly with respect to data aspects and annotation strategies.
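To illustrate the kind of annotated scanpath the last abstract describes, the following is a minimal, hypothetical sketch (not the authors' implementation): fixation points are hit-tested against named rectangular AOIs to produce one label per fixation. The function name, the AOI rectangles, and the "unlabeled" fallback are all assumptions made for this example.

```python
# Hypothetical sketch of fixation-to-AOI labeling (not the paper's method):
# assign each fixation point the name of the first rectangular AOI containing it.
def label_scanpath(fixations, aois):
    """fixations: list of (x, y) points; aois: dict name -> (x0, y0, x1, y1)."""
    labels = []
    for x, y in fixations:
        # Find the first AOI whose rectangle contains the fixation point.
        hit = next((name for name, (x0, y0, x1, y1) in aois.items()
                    if x0 <= x <= x1 and y0 <= y <= y1), "unlabeled")
        labels.append(hit)
    return labels

# Example: two AOIs standing in for annotated objects in the scene.
aois = {"painting_A": (0, 0, 10, 10), "painting_B": (20, 0, 30, 10)}
print(label_scanpath([(5, 5), (25, 3), (15, 5)], aois))
# → ['painting_A', 'painting_B', 'unlabeled']
```

In practice, AOI definition in AR is exactly the hard part the paper addresses: virtual objects can be annotated automatically, while real-world objects need the image-based manual annotation the abstract mentions; the rectangles above are only a 2D stand-in.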