ICAT-EGVE2016
Browsing ICAT-EGVE2016 by Subject "H.5.1 [Information Interfaces and Presentation]"
Item: Passive Arm Swing Motion for Virtual Walking Sensation (The Eurographics Association, 2016)
Authors: Saka, Naoyuki; Ikei, Yasushi; Amemiya, Tomohiro; Hirota, Koichi; Kitazaki, Michiteru
Editors: Dirk Reiners, Daisuke Iwai, and Frank Steinicke
Abstract: The present paper describes the characteristics of an arm swing display, part of a multisensory display that creates a walking sensation for a user seated on a vestibular display (a motion chair). The passive arm swing produced by the display was evaluated with respect to the sensation of walking. A passive swing angle about 20% smaller than that of a real walking motion (from 25 to 35 degrees) effectively enhanced the sensation of walking when presented as a single-modality stimulus for walking with a 1.4 s period. The flexion/extension ratio was shifted forward relative to the real walk. The optimal swing obtained by the method of adjustment showed the same characteristics. The sensation of walking increased markedly when the passive arm swing and the vestibular stimulus were presented synchronously. Active arm swing induced a weaker walking sensation than passive arm swing, which might be ascribed to the inherent passiveness of the arm swing during real walking.

Item: Real-Time 3D Peripheral View Analysis (The Eurographics Association, 2016)
Authors: Moniri, Mohammad Mehdi; Luxenburger, Andreas; Schuffert, Winfried; Sonntag, Daniel
Editors: Dirk Reiners, Daisuke Iwai, and Frank Steinicke
Abstract: Human peripheral vision suffers from several limitations that differ across regions of the visual field. Since these limitations result in natural visual impairments, many intelligent user interfaces based on eye tracking could benefit from peripheral-view calculations that compensate for events occurring outside the very center of gaze. We present a general peripheral view calculation model which extends previous work on attention-based user interfaces that use eye gaze. An intuitive, two-dimensional visibility measure based on the concept of solid angle is developed to determine to what extent an object of interest observed by a user intersects each region of the underlying visual field model. The results are weighted by the visual acuity in each visual field region to determine the total visibility of the object. We exemplify the proposed model in a virtual reality car simulation application incorporating a head-mounted display with integrated eye-tracking functionality. In this context, we provide a quantitative evaluation in terms of a runtime analysis of the different steps of our approach. We also provide several example applications, including an interactive web application that visualizes the concepts and calculations presented in this paper.
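The second abstract's core idea, splitting an object's angular footprint across visual-field regions and weighting each share by that region's acuity, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the region names, eccentricity boundaries, and acuity weights below are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of acuity-weighted visibility across visual-field regions.
# Region boundaries (eccentricity in degrees) and acuity weights are
# hypothetical placeholders chosen for illustration only.
REGIONS = [
    # (name, inner eccentricity deg, outer eccentricity deg, acuity weight)
    ("foveal",      0.0,  2.0, 1.0),
    ("parafoveal",  2.0,  5.0, 0.5),
    ("near",        5.0, 30.0, 0.2),
    ("far",        30.0, 60.0, 0.05),
]

def total_visibility(fractions):
    """Combine per-region coverage into one visibility score.

    fractions maps a region name to the fraction of the object's
    solid angle falling inside that region (fractions sum to <= 1).
    The score is the acuity-weighted sum of those fractions.
    """
    weights = {name: w for name, _, _, w in REGIONS}
    return sum(weights[name] * frac for name, frac in fractions.items())

# An object seen half in the fovea and half parafoveally:
print(total_visibility({"foveal": 0.5, "parafoveal": 0.5}))  # 0.75
```

In this sketch, an object entirely inside the fovea scores 1.0, while the same object pushed into the far periphery scores only 0.05, mirroring the paper's intuition that total visibility should fall off with eccentricity.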