ICAT-EGVE2016

For Your Eyes Only I
An Efficient Interpolation Approach for Low Cost Unrestrained Gaze Tracking in 3D Space
Christian Scheel, A. B. M. Tariqul Islam, and Oliver Staadt
Effects of Viewing Condition on User Experience of Panoramic Video
Peter J. Passmore, Maxine Glancy, Adam Philpot, Amelia Roscoe, Andrew Wood, and Bob Fields
Use All Your Senses
Passive Arm Swing Motion for Virtual Walking Sensation
Naoyuki Saka, Yasushi Ikei, Tomohiro Amemiya, Koichi Hirota, and Michiteru Kitazaki
AquaCAVE: An Underwater Immersive Projection System for Enhancing the Swimming Experience
Shogo Yamashita, Xinlei Zhang, Takashi Miyaki, and Jun Rekimoto
Going Wide: Degrees matter
View Dependent Tone Mapping of HDR Panoramas for Head Mounted Displays
Steve Cutchin and Yuan Li
Real-Time 3D Peripheral View Analysis
Mohammad Mehdi Moniri, Andreas Luxenburger, Winfried Schuffert, and Daniel Sonntag
A Superwide-FOV Optical Design for Head-Mounted Displays
Ismo Rakkolainen, Matthew Turk, and Tobias Höllerer
For Your Eyes Only II
Blurry (Sticky) Finger: Proprioceptive Pointing and Selection of Distant Objects for Optical See-through based Augmented Reality
Ja Eun Yu and Gerard J. Kim
Dynamic View Expansion for Improving Visual Search in Video See-through AR
Yuki Yano, Jason Orlosky, Kiyoshi Kiyokawa, and Haruo Takemura
When Virtual Is Not Enough
MR Work Supporting System Using Pepper's Ghost
Hiroto Tsuruzoe, Satoru Odera, Hiroshi Shigeno, and Ken-ichi Okada
Simulation based Camera Localization under a Variable Lighting Environment
Tomohiro Mashita, Alexander Plopski, Akira Kudo, Tobias Höllerer, Kiyoshi Kiyokawa, and Haruo Takemura
Synchronized Scene Views in Mixed Virtual Reality for Guided Viewing
Iker Vazquez and Steve Cutchin
From Observations to Collaborative Simulation: Application to Surgical Training
Guillaume Claude, Valérie Gouranton, Benoit Caillaud, Bernard Gibaud, Pierre Jannin, and Bruno Arnaldi
Being There
Natural Interaction in Asymmetric Teleconference using Stuffed-toy Avatar Robot
Samratul Fuady, Masato Orishige, Li Haoyan, Hironori Mitake, and Shoichi Hasegawa
Is This Seat Taken? Behavioural Analysis of the Telethrone: A Novel Situated Telepresence Display
John O'Hare, Robert C. A. Bendall, John Rae, Graham Thomas, Bruce Weir, and David J. Roberts
The Effects of Indirect Real Body Cues of Irrelevant Parts on Virtual Body Ownership and Presence
Sungchul Jung and Charles E. Hughes
The Influence of Real Human Personality on Social Presence with a Virtual Human in Augmented Reality
Kangsoo Kim, Gerd Bruder, Divine Maloney, and Greg Welch

BibTeX (ICAT-EGVE2016)
@inproceedings{10.2312:egve.20161427,
  booktitle = {ICAT-EGVE 2016 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Dirk Reiners and Daisuke Iwai and Frank Steinicke},
  title     = {{An Efficient Interpolation Approach for Low Cost Unrestrained Gaze Tracking in 3D Space}},
  author    = {Scheel, Christian and Islam, A. B. M. Tariqul and Staadt, Oliver},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-03868-012-3},
  DOI       = {10.2312/egve.20161427}
}

@inproceedings{10.2312:egve.20161428,
  booktitle = {ICAT-EGVE 2016 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Dirk Reiners and Daisuke Iwai and Frank Steinicke},
  title     = {{Effects of Viewing Condition on User Experience of Panoramic Video}},
  author    = {Passmore, Peter J. and Glancy, Maxine and Philpot, Adam and Roscoe, Amelia and Wood, Andrew and Fields, Bob},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-03868-012-3},
  DOI       = {10.2312/egve.20161428}
}

@inproceedings{10.2312:egve.20161429,
  booktitle = {ICAT-EGVE 2016 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Dirk Reiners and Daisuke Iwai and Frank Steinicke},
  title     = {{Passive Arm Swing Motion for Virtual Walking Sensation}},
  author    = {Saka, Naoyuki and Ikei, Yasushi and Amemiya, Tomohiro and Hirota, Koichi and Kitazaki, Michiteru},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-03868-012-3},
  DOI       = {10.2312/egve.20161429}
}

@inproceedings{10.2312:egve.20161430,
  booktitle = {ICAT-EGVE 2016 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Dirk Reiners and Daisuke Iwai and Frank Steinicke},
  title     = {{AquaCAVE: An Underwater Immersive Projection System for Enhancing the Swimming Experience}},
  author    = {Yamashita, Shogo and Zhang, Xinlei and Miyaki, Takashi and Rekimoto, Jun},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-03868-012-3},
  DOI       = {10.2312/egve.20161430}
}

@inproceedings{10.2312:egve.20161431,
  booktitle = {ICAT-EGVE 2016 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Dirk Reiners and Daisuke Iwai and Frank Steinicke},
  title     = {{View Dependent Tone Mapping of HDR Panoramas for Head Mounted Displays}},
  author    = {Cutchin, Steve and Li, Yuan},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-03868-012-3},
  DOI       = {10.2312/egve.20161431}
}

@inproceedings{10.2312:egve.20161432,
  booktitle = {ICAT-EGVE 2016 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Dirk Reiners and Daisuke Iwai and Frank Steinicke},
  title     = {{Real-Time 3D Peripheral View Analysis}},
  author    = {Moniri, Mohammad Mehdi and Luxenburger, Andreas and Schuffert, Winfried and Sonntag, Daniel},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-03868-012-3},
  DOI       = {10.2312/egve.20161432}
}

@inproceedings{10.2312:egve.20161434,
  booktitle = {ICAT-EGVE 2016 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Dirk Reiners and Daisuke Iwai and Frank Steinicke},
  title     = {{Blurry (Sticky) Finger: Proprioceptive Pointing and Selection of Distant Objects for Optical See-through based Augmented Reality}},
  author    = {Yu, Ja Eun and Kim, Gerard J.},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-03868-012-3},
  DOI       = {10.2312/egve.20161434}
}

@inproceedings{10.2312:egve.20161433,
  booktitle = {ICAT-EGVE 2016 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Dirk Reiners and Daisuke Iwai and Frank Steinicke},
  title     = {{A Superwide-FOV Optical Design for Head-Mounted Displays}},
  author    = {Rakkolainen, Ismo and Turk, Matthew and Höllerer, Tobias},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-03868-012-3},
  DOI       = {10.2312/egve.20161433}
}

@inproceedings{10.2312:egve.20161435,
  booktitle = {ICAT-EGVE 2016 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Dirk Reiners and Daisuke Iwai and Frank Steinicke},
  title     = {{Dynamic View Expansion for Improving Visual Search in Video See-through AR}},
  author    = {Yano, Yuki and Orlosky, Jason and Kiyokawa, Kiyoshi and Takemura, Haruo},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-03868-012-3},
  DOI       = {10.2312/egve.20161435}
}

@inproceedings{10.2312:egve.20161436,
  booktitle = {ICAT-EGVE 2016 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Dirk Reiners and Daisuke Iwai and Frank Steinicke},
  title     = {{MR Work Supporting System Using Pepper's Ghost}},
  author    = {Tsuruzoe, Hiroto and Odera, Satoru and Shigeno, Hiroshi and Okada, Ken-ichi},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-03868-012-3},
  DOI       = {10.2312/egve.20161436}
}

@inproceedings{10.2312:egve.20161437,
  booktitle = {ICAT-EGVE 2016 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Dirk Reiners and Daisuke Iwai and Frank Steinicke},
  title     = {{Simulation based Camera Localization under a Variable Lighting Environment}},
  author    = {Mashita, Tomohiro and Plopski, Alexander and Kudo, Akira and Höllerer, Tobias and Kiyokawa, Kiyoshi and Takemura, Haruo},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-03868-012-3},
  DOI       = {10.2312/egve.20161437}
}

@inproceedings{10.2312:egve.20161439,
  booktitle = {ICAT-EGVE 2016 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Dirk Reiners and Daisuke Iwai and Frank Steinicke},
  title     = {{From Observations to Collaborative Simulation: Application to Surgical Training}},
  author    = {Claude, Guillaume and Gouranton, Valérie and Caillaud, Benoit and Gibaud, Bernard and Jannin, Pierre and Arnaldi, Bruno},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-03868-012-3},
  DOI       = {10.2312/egve.20161439}
}

@inproceedings{10.2312:egve.20161438,
  booktitle = {ICAT-EGVE 2016 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Dirk Reiners and Daisuke Iwai and Frank Steinicke},
  title     = {{Synchronized Scene Views in Mixed Virtual Reality for Guided Viewing}},
  author    = {Vazquez, Iker and Cutchin, Steve},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-03868-012-3},
  DOI       = {10.2312/egve.20161438}
}

@inproceedings{10.2312:egve.20161441,
  booktitle = {ICAT-EGVE 2016 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Dirk Reiners and Daisuke Iwai and Frank Steinicke},
  title     = {{Is This Seat Taken? Behavioural Analysis of the Telethrone: A Novel Situated Telepresence Display}},
  author    = {O'Hare, John and Bendall, Robert C. A. and Rae, John and Thomas, Graham and Weir, Bruce and Roberts, David J.},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-03868-012-3},
  DOI       = {10.2312/egve.20161441}
}

@inproceedings{10.2312:egve.20161440,
  booktitle = {ICAT-EGVE 2016 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Dirk Reiners and Daisuke Iwai and Frank Steinicke},
  title     = {{Natural Interaction in Asymmetric Teleconference using Stuffed-toy Avatar Robot}},
  author    = {Fuady, Samratul and Orishige, Masato and Haoyan, Li and Mitake, Hironori and Hasegawa, Shoichi},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-03868-012-3},
  DOI       = {10.2312/egve.20161440}
}

@inproceedings{10.2312:egve.20161443,
  booktitle = {ICAT-EGVE 2016 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Dirk Reiners and Daisuke Iwai and Frank Steinicke},
  title     = {{The Influence of Real Human Personality on Social Presence with a Virtual Human in Augmented Reality}},
  author    = {Kim, Kangsoo and Bruder, Gerd and Maloney, Divine and Welch, Greg},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-03868-012-3},
  DOI       = {10.2312/egve.20161443}
}

@inproceedings{10.2312:egve.20161442,
  booktitle = {ICAT-EGVE 2016 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor    = {Dirk Reiners and Daisuke Iwai and Frank Steinicke},
  title     = {{The Effects of Indirect Real Body Cues of Irrelevant Parts on Virtual Body Ownership and Presence}},
  author    = {Jung, Sungchul and Hughes, Charles E.},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-03868-012-3},
  DOI       = {10.2312/egve.20161442}
}
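
The entries above can also be processed programmatically. As a minimal sketch (assuming the listing is saved as icat_egve2016.bib and that the third-party bibtexparser package is installed; both choices are illustrative and not part of this collection page), one could list the citation keys, titles and DOIs:

# Minimal sketch: load the ICAT-EGVE 2016 entries and list key/title/DOI.
# Assumes the BibTeX above was saved to "icat_egve2016.bib" and that the
# third-party bibtexparser package (v1.x API) is installed.
import bibtexparser

with open("icat_egve2016.bib", encoding="utf-8") as f:
    db = bibtexparser.load(f)

for entry in db.entries:
    # bibtexparser lower-cases field names; the citation key is under "ID".
    print(entry["ID"], "-", entry.get("title", ""), "-", entry.get("doi", ""))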

Recent Submissions

  • Item
    ICAT-EGVE 2016: Frontmatter
    (Eurographics Association, 2016) Dirk Reiners; Daisuke Iwai; Frank Steinicke; Dirk Reiners and Daisuke Iwai and Frank Steinicke
  • Item
    An Efficient Interpolation Approach for Low Cost Unrestrained Gaze Tracking in 3D Space
    (The Eurographics Association, 2016) Scheel, Christian; Islam, A. B. M. Tariqul; Staadt, Oliver; Dirk Reiners and Daisuke Iwai and Frank Steinicke
    We present a first attempt to use an interpolation-based approach to combine a mobile eye tracker with an external tracking system to obtain a 3D gaze vector for a freely moving user. Our method captures calibration points at varying distances, along with pupil positions and head positions/orientations, while the user can move freely within the range of the external tracking system. For this approach, it is not necessary to know the position of the eye or the orientation of the eye coordinate system. In addition to the calibration of the external tracking system, we can calibrate the head-tracked eye tracker in a one-step process which only requires the user to look at the calibration points. Here, we don’t need any extra calibration of the eye tracker, because the raw pupil position from the eye tracker can be used. Moreover, we use low-cost tracking hardware, which might be affordable for a wide range of application setups. Our experiment and evaluation show that the average accuracy of the visual angle is better than 0.85 degrees under unrestrained head movement with a relatively low-cost system.
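    As a rough illustration of the interpolation idea described in this abstract (not the authors' implementation), one could interpolate directly from raw pupil coordinates plus head pose to a gaze direction over the calibration samples. The feature layout, the synthetic data and the use of SciPy's RBFInterpolator below are assumptions made purely for the sketch:

# Illustrative sketch only: interpolate a 3D gaze direction from raw pupil
# position and head pose, without modelling the eye explicitly.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)

# Synthetic calibration set: each sample is (pupil_x, pupil_y, head_x, head_y,
# head_z, head_yaw, head_pitch) recorded while the user fixates a known point.
n_cal = 200
features = rng.uniform(-1.0, 1.0, size=(n_cal, 7))

# Ground-truth gaze directions (unit vectors) for the calibration fixations.
targets = rng.normal(size=(n_cal, 3))
targets /= np.linalg.norm(targets, axis=1, keepdims=True)

# One interpolator maps the raw measurements straight to gaze components.
gaze_model = RBFInterpolator(features, targets, kernel="thin_plate_spline")

# At runtime, a new (pupil, head pose) sample yields an interpolated gaze vector.
query = rng.uniform(-1.0, 1.0, size=(1, 7))
gaze = gaze_model(query)[0]
gaze /= np.linalg.norm(gaze)
print("interpolated gaze direction:", gaze)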
  • Item
    Effects of Viewing Condition on User Experience of Panoramic Video
    (The Eurographics Association, 2016) Passmore, Peter J.; Glancy, Maxine; Philpot, Adam; Roscoe, Amelia; Wood, Andrew; Fields, Bob; Dirk Reiners and Daisuke Iwai and Frank Steinicke
    Panoramic video arises at the convergence of TV and virtual reality, and it is necessary to understand how these technologies interact to affect user experience in order to produce useful content. TV and film makers have developed a sophisticated language and set of techniques to achieve directed linear storytelling on fixed screens, whereas virtual worlds more often emphasise user-led exploration of possibly non-linear narrative and aspects such as presence and immersion in navigable 3D environments. This study focused on the user experience of panoramic video as viewed over two conditions, on a VR headset and using a handheld phone, and compared this to watching on a static screen, thus emphasising the differences between traditional and panoramic TV. A qualitative approach to analysis was taken in which users participated in semi-structured interviews. A thematic analysis was performed which produced thematic maps describing user experience for each condition. A detailed and nuanced account of emerging themes is given. Subsequently, key themes were identified and graphed to produce user response profiles for the three viewing conditions that highlight differences in user experience in terms of presence, attention, engagement, concentration on story, certainty, comfort and social ease.
  • Item
    Passive Arm Swing Motion for Virtual Walking Sensation
    (The Eurographics Association, 2016) Saka, Naoyuki; Ikei, Yasushi; Amemiya, Tomohiro; Hirota, Koichi; Kitazaki, Michiteru; Dirk Reiners and Daisuke Iwai and Frank Steinicke
    The present paper describes the characteristics of an arm swing display as part of a multisensory display for creating a walking sensation for a user sitting on a vestibular display (a motion chair). The passive arm swing produced by the display was evaluated with regard to the sensation of walking. A passive swing angle about 20 % smaller than that of a real walking motion (from 25 to 35 degrees) effectively enhanced the sensation of walking when displayed as a single-modality stimulus for a walking period of 1.4 s. The flexion/extension ratio was shifted forward relative to the real walk. The optimal swing obtained by the method of adjustment showed the same characteristics. The sensation of walking was markedly increased when both the passive arm swing and the vestibular stimulus were presented synchronously. The active arm swing produced less walking sensation than the passive arm swing, which might be ascribed to the inherent passiveness of the arm swing during real walking.
  • Item
    AquaCAVE: An Underwater Immersive Projection System for Enhancing the Swimming Experience
    (The Eurographics Association, 2016) Yamashita, Shogo; Zhang, Xinlei; Miyaki, Takashi; Rekimoto, Jun; Dirk Reiners and Daisuke Iwai and Frank Steinicke
    AquaCAVE is an underwater immersive projection environment which faithfully reproduces the swimming experience in virtual space. AquaCAVE is inspired by the surrounding projection system known as the CAVE Automatic Virtual Environment, where stereoscopic images are projected onto the surfaces surrounding the user, but addresses several water-specific problems that were not studied in previous systems. In this paper, we describe techniques to overcome the water-specific issues in configuring the immersive projection system. Three characteristics of water that mainly cause problems are pincushion distortion, reflection, and infrared (IR) radiation absorption. Existing motion capture systems based on IR or blue lights are not feasible for an underwater immersive projection environment, since IR is absorbed and blue light interferes with the user’s view of the stereoscopic images. Therefore, we propose a setup for visible-light head tracking, which is functional for AquaCAVE. As a result, the proposed circular polarization-based method was shown to provide a constantly clear view, stable head tracking, and reflection reduction. With this methodology, we can build the proposed AquaCAVE, which is applicable to future underwater entertainment and enhanced swimming training.
  • Item
    View Dependent Tone Mapping of HDR Panoramas for Head Mounted Displays
    (The Eurographics Association, 2016) Cutchin, Steve; Li, Yuan; Dirk Reiners and Daisuke Iwai and Frank Steinicke
    Head-mounted displays (HMDs) are characterized by relatively low resolution and low dynamic range. These limitations significantly reduce the visual quality of photorealistic captures on such displays. In this paper we present an interactive view-dependent tone mapping technique for viewing up to 16K-wide high dynamic range panoramas on HMDs via a view-adjusted mapping function stored in a separate texture file. We define this technique as ToneTexture. The use of view-adjusted tone mapping allows for expansion of the perceived color space available to the end user. This yields an improved visual appearance of both high dynamic range panoramas and low dynamic range panoramas on such displays. We present comparisons of the results produced by this technique against Reinhard tone mapping operators. Demonstration systems are available for WebGL and head-mounted displays such as the Oculus Rift and GearVR.
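    The view-adjusted mapping described above can be pictured as a small lookup table of tone parameters indexed by view direction, applied on top of a global operator. The sketch below assumes a per-direction exposure texture and uses the classic global Reinhard curve L_d = L / (1 + L) as the baseline; the actual contents of the paper's ToneTexture are not reproduced here:

# Rough sketch: view-dependent tone mapping of an HDR panorama via a small
# "tone texture" of per-direction exposure values, compared with a global
# Reinhard mapping. Data and texture layout are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

# Fake HDR equirectangular panorama (height x width, luminance only).
hdr = rng.lognormal(mean=0.0, sigma=2.0, size=(64, 128))

# Tone texture: one exposure multiplier per coarse view-direction cell.
tone_tex = np.full((8, 16), 1.0)
tone_tex[:, 8:] = 0.25  # e.g. darker exposure when looking toward a bright region

def reinhard(lum):
    """Global Reinhard operator: L_d = L / (1 + L)."""
    return lum / (1.0 + lum)

def tone_map_view(hdr_img, tex, yaw_frac, pitch_frac):
    """Apply the exposure stored for the current view direction, then Reinhard."""
    ty = int(pitch_frac * (tex.shape[0] - 1))
    tx = int(yaw_frac * (tex.shape[1] - 1))
    exposure = tex[ty, tx]
    return reinhard(hdr_img * exposure)

global_ldr = reinhard(hdr)                            # baseline, view independent
view_ldr = tone_map_view(hdr, tone_tex, 0.9, 0.5)     # looking toward the bright side
print(global_ldr.mean(), view_ldr.mean())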
  • Item
    Real-Time 3D Peripheral View Analysis
    (The Eurographics Association, 2016) Moniri, Mohammad Mehdi; Luxenburger, Andreas; Schuffert, Winfried; Sonntag, Daniel; Dirk Reiners and Daisuke Iwai and Frank Steinicke
    Human peripheral vision suffers from several limitations that differ among various regions of the visual field. Since these limitations result in natural visual impairments, many interesting intelligent user interfaces based on eye tracking could benefit from peripheral view calculations that aim to compensate for events occurring outside the very center of gaze. We present a general peripheral view calculation model which extends previous work on attention-based user interfaces that use eye gaze. An intuitive, two-dimensional visibility measure based on the concept of solid angle is developed for determining the extent to which an object of interest observed by a user intersects with each region of the underlying visual field model. The results are weighted by the visual acuity in each visual field region to determine the total visibility of the object. We exemplify the proposed model in a virtual reality car simulation application incorporating a head-mounted display with integrated eye tracking functionality. In this context, we provide a quantitative evaluation in terms of a runtime analysis of the different steps of our approach. We also provide several example applications, including an interactive web application which visualizes the concepts and calculations presented in this paper.
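    A toy version of such an acuity-weighted, solid-angle-based visibility measure is sketched below. The bounding-sphere approximation, region boundaries and acuity weights are illustrative assumptions, and the object is assigned to a single region by its center direction rather than intersected with each region as in the paper:

# Toy sketch of an acuity-weighted visibility measure: approximate the object
# by a bounding sphere, compute the solid angle it subtends at the eye, and
# weight it by the visual-field region its direction falls into.
# Region boundaries and acuity weights below are illustrative assumptions.
import numpy as np

# (eccentricity upper bound in degrees, relative acuity weight)
REGIONS = [(2.0, 1.0),     # foveal
           (30.0, 0.4),    # near periphery
           (60.0, 0.15),   # mid periphery
           (110.0, 0.05)]  # far periphery

def solid_angle(radius, distance):
    """Solid angle (sr) subtended by a sphere of given radius at given distance."""
    half_angle = np.arcsin(min(radius / distance, 1.0))
    return 2.0 * np.pi * (1.0 - np.cos(half_angle))

def visibility(gaze_dir, obj_center, obj_radius):
    """Acuity-weighted visibility of a bounding-sphere object for one eye/gaze."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    to_obj = obj_center / np.linalg.norm(obj_center)
    eccentricity = np.degrees(np.arccos(np.clip(gaze_dir @ to_obj, -1.0, 1.0)))
    weight = 0.0
    for bound, acuity in REGIONS:
        if eccentricity <= bound:
            weight = acuity
            break
    return weight * solid_angle(obj_radius, np.linalg.norm(obj_center))

# Example: a 0.3 m sphere 5 m ahead, seen 20 degrees off the gaze direction.
print(visibility(np.array([0.0, 0.0, 1.0]),
                 np.array([np.sin(np.radians(20.0)) * 5.0, 0.0,
                           np.cos(np.radians(20.0)) * 5.0]), 0.3))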
  • Item
    Blurry (Sticky) Finger: Proprioceptive Pointing and Selection of Distant Objects for Optical See-through based Augmented Reality
    (The Eurographics Association, 2016) Yu, Ja Eun; Kim, Gerard J.; Dirk Reiners and Daisuke Iwai and Frank Steinicke
    Most AR interaction techniques are focused on direct interaction with close objects within one’s reach (e.g. using the hands). Interacting with distant objects, especially those that are real, has not received much attention. The most prevalent method is using a hand-held device to control the cursor to indirectly designate a target object on the AR display. This may not be a natural and efficient method when used with an optical see-through glass due to its multi-focus problem. In this paper, we propose the "Blurry (Sticky) Finger" in which one uses the finger to aim and point at a distant object, but focusing only on the target with both eyes open (thus without the multi-focus problem) and relying upon the proprioceptive sense. We demonstrate and validate our claim through an experiment comparing three distant pointing/selection methods: (1) indirect cursor based method using a 3D air mouse, (2) proprioceptive finger aiming (Blurry Finger) with a cursor, (3) proprioceptive finger aiming without a cursor. In the experiment, Blurry Finger showed superior performance for selecting relatively small objects and in fact showed low sensitivity to the target object size. It also clearly showed advantages in the initial object selection where the hand/finger starts from a rest position. The Blurry Finger was also evaluated to be the most intuitive and natural.
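    The pointing principle described here, aiming with the fingertip while fixating the distant target, reduces to a ray from the (dominant) eye through the fingertip. The sketch below uses made-up positions and an assumed angular selection threshold; it is not the authors' implementation:

# Small sketch: proprioceptive pointing as a ray from the dominant eye through
# the fingertip; the distant object closest in angle to that ray is selected.
# Positions and the selection threshold are illustrative assumptions.
import numpy as np

def select_target(eye, fingertip, objects, max_angle_deg=3.0):
    """Return the index of the object nearest (in angle) to the eye-fingertip ray,
    or None if nothing lies within the angular threshold."""
    ray = fingertip - eye
    ray /= np.linalg.norm(ray)
    best_idx, best_angle = None, max_angle_deg
    for i, center in enumerate(objects):
        to_obj = center - eye
        to_obj /= np.linalg.norm(to_obj)
        angle = np.degrees(np.arccos(np.clip(ray @ to_obj, -1.0, 1.0)))
        if angle < best_angle:
            best_idx, best_angle = i, angle
    return best_idx

eye = np.array([0.0, 1.6, 0.0])                    # dominant-eye position (m)
fingertip = np.array([0.22, 1.52, 0.55])           # tracked fingertip
objects = [np.array([2.4, 0.8, 5.0]),              # distant real objects
           np.array([0.4, 1.45, 1.0]),
           np.array([-1.0, 1.7, 4.0])]
print("selected object:", select_target(eye, fingertip, objects))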
  • Item
    A Superwide-FOV Optical Design for Head-Mounted Displays
    (The Eurographics Association, 2016) Rakkolainen, Ismo; Turk, Matthew; Höllerer, Tobias; Dirk Reiners and Daisuke Iwai and Frank Steinicke
    We present a new optical design for head-mounted displays (HMD) that has an exceptionally wide field of view (FOV). It can cover even the full human FOV. It is based on seamless lenses and screens curved around the eyes. We constructed several compact and lightweight proof-of-concept prototypes of the optical design. One of them far exceeds the human FOV, although the anatomy of the human head limits the effective FOV. The presented optical design has advantages such as compactness, light weight, low cost and superwide FOV with high resolution. The prototypes are promising, and though this is still work-in-progress and display functionality is not yet implemented, it suggests a feasible way to significantly expand the FOV of HMDs.
  • Item
    Dynamic View Expansion for Improving Visual Search in Video See-through AR
    (The Eurographics Association, 2016) Yano, Yuki; Orlosky, Jason; Kiyokawa, Kiyoshi; Takemura, Haruo; Dirk Reiners and Daisuke Iwai and Frank Steinicke
    The extension or expansion of human vision is often accomplished with video see-through head mounted displays (HMDs) because of their clarity and ability to modulate background information. However, little is known about how we should control these augmentations, and continuous augmentation can have negative consequences such as distorted motion perception. To address these problems, we propose a dynamic view expansion system that modulates vergence, translation, or scale of video see-through cameras to give users on-demand peripheral vision enhancement. Unlike other methods that modify a user’s direct field of view, we take advantage of ultrawide fisheye lenses to provide access to peripheral information that would not otherwise be available. In a series of experiments testing our prototype in real world search, identification, and matching tasks, we test these expansion methods and evaluate both user performance and subjective measures such as fatigue and simulation sickness. Results show that less head movement is required with dynamic view expansion, but performance varies with application.
  • Item
    MR Work Supporting System Using Pepper's Ghost
    (The Eurographics Association, 2016) Tsuruzoe, Hiroto; Odera, Satoru; Shigeno, Hiroshi; Okada, Ken-ichi; Dirk Reiners and Daisuke Iwai and Frank Steinicke
    Recently, MR (Mixed Reality) techniques have been used in many fields, one of which is work support. Work support using MR techniques can display work instructions directly in the work space and help users work effectively, especially in assembling tasks. MR work support for assembling tasks often uses an HMD (Head Mounted Display) to construct the MR environment. However, there are some problems with the use of HMDs, such as the burden on the head, a narrow view and motion sickness. One technique to solve such problems is Pepper’s ghost, an optical illusion using a glass plate. In this paper, we propose a naked-eye MR work supporting system for assembling tasks using Pepper’s ghost. This system enables a beginner to assemble blocks into one object with the naked eye and with little burden.
  • Item
    Simulation based Camera Localization under a Variable Lighting Environment
    (The Eurographics Association, 2016) Mashita, Tomohiro; Plopski, Alexander; Kudo, Akira; Höllerer, Tobias; Kiyokawa, Kiyoshi; Takemura, Haruo; Dirk Reiners and Daisuke Iwai and Frank Steinicke
    Localizing the user from a feature database of a scene is a basic and necessary step for the presentation of localized augmented reality (AR) content. Commonly, such a database depicts a single appearance of the scene, due to the time and effort required to prepare it. However, the appearance depends on various factors, e.g., the position of the sun and cloudiness. Observing the scene under different lighting conditions results in a decreased success rate and accuracy of the localization. To address this, we propose to generate the feature database from a simulated appearance of the scene model under a number of different lighting conditions. We also propose to extend the feature descriptors used in the localization with a parametric representation of their changes under varying lighting conditions. We compare our method with a standard representation and L2-norm-based matching in simulation and in real-world experiments. Our results show that our simulated environment is a satisfactory representation of the scene’s appearance and improves feature matching over a single database. The proposed feature descriptor achieves a higher localization ratio with fewer feature points and a lower processing cost.
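    One way to picture the multi-condition database described above: store one descriptor per simulated lighting condition for each scene point and match a query against all of them. The sketch below uses synthetic descriptors and the plain L2 nearest-neighbour baseline mentioned in the abstract; it does not reproduce the paper's parametric descriptor:

# Sketch of matching against a feature database built from several simulated
# lighting conditions (baseline L2 matching only; data are synthetic).
import numpy as np

rng = np.random.default_rng(2)

n_points, n_conditions, dim = 500, 6, 64

# Canonical descriptors for the scene's feature points...
base = rng.normal(size=(n_points, dim))
# ...perturbed per simulated lighting condition (illustrative noise model).
database = base[None, :, :] + 0.3 * rng.normal(size=(n_conditions, n_points, dim))

def localize_matches(query_desc, db):
    """Return, per query descriptor, the index of the best-matching scene point
    over all simulated lighting conditions (smallest L2 distance)."""
    flat = db.reshape(-1, db.shape[-1])                     # (conditions*points, dim)
    d = np.linalg.norm(flat[None, :, :] - query_desc[:, None, :], axis=-1)
    return d.argmin(axis=1) % db.shape[1]                   # map back to point index

# Query descriptors captured under yet another lighting condition.
query = base[:20] + 0.3 * rng.normal(size=(20, dim))
matches = localize_matches(query, database)
print("correctly matched:", int((matches == np.arange(20)).sum()), "of 20")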
  • Item
    From Observations to Collaborative Simulation: Application to Surgical Training
    (The Eurographics Association, 2016) Claude, Guillaume; Gouranton, Valérie; Caillaud, Benoit; Gibaud, Bernard; Jannin, Pierre; Arnaldi, Bruno; Dirk Reiners and Daisuke Iwai and Frank Steinicke
    In surgical training, Virtual Reality systems are mainly focused on technical surgical skills, leaving out procedural aspects. Our project aims at providing a novel approach to the use of Virtual Reality that addresses this point. We propose an innovative workflow to integrate a generic model of the procedure, generated from observations of real surgical cases, as the scenario model in the virtual reality training system. In this article we present how the generic procedure model is generated and how it is integrated in the virtual environment.
  • Item
    Synchronized Scene Views in Mixed Virtual Reality for Guided Viewing
    (The Eurographics Association, 2016) Vazquez, Iker; Cutchin, Steve; Dirk Reiners and Daisuke Iwai and Frank Steinicke
    Virtual Reality devices are available with different resolutions and fields of view. Users can simultaneously interact within environments on head mounted displays, cell phones, tablets, and PowerWalls. Sharing scenes across devices requires solutions that smoothly synchronize shared navigation, minimize jitter and avoid visual confusion. In this paper we present a system that allows a single user to remotely guide many remote users within a virtual reality environment. A variety of mixed device environments are supported to let different users connect to the system. Techniques are implemented to minimize jitter and synchronize views, and deal with different fields of view.
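    Jitter-minimizing synchronization of a shared, guided camera is commonly done by easing each follower's pose toward the guide's latest pose instead of snapping to it. The sketch below (exponential smoothing of position plus quaternion slerp for orientation) shows that generic technique, not necessarily the paper's implementation:

# Minimal sketch: follow a guide camera with smoothing to reduce jitter.
# Exponential smoothing for position, slerp for orientation (unit quaternions).
# Generic technique for illustration, not the paper's specific method.
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between two unit quaternions."""
    dot = np.dot(q0, q1)
    if dot < 0.0:                    # take the short way around
        q1, dot = -q1, -dot
    if dot > 0.9995:                 # nearly parallel: fall back to lerp
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

def follow(local_pos, local_rot, guide_pos, guide_rot, alpha=0.15):
    """Move the follower a fraction alpha toward the guide's pose each frame."""
    new_pos = (1 - alpha) * local_pos + alpha * guide_pos
    new_rot = slerp(local_rot, guide_rot, alpha)
    return new_pos, new_rot

# Example frame updates.
pos = np.array([0.0, 1.6, 0.0])
rot = np.array([1.0, 0.0, 0.0, 0.0])            # identity quaternion (w, x, y, z)
guide_pos = np.array([2.0, 1.6, -1.0])
guide_rot = np.array([0.924, 0.0, 0.383, 0.0])  # roughly a 45 degree yaw
for _ in range(10):
    pos, rot = follow(pos, rot, guide_pos, guide_rot)
print(pos, rot)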
  • Item
    Is This Seat Taken? Behavioural Analysis of the Telethrone: A Novel Situated Telepresence Display
    (The Eurographics Association, 2016) O'Hare, John; Bendall, Robert C. A.; Rae, John; Thomas, Graham; Weir, Bruce; Roberts, David J.; Dirk Reiners and Daisuke Iwai and Frank Steinicke
    We present research with two novel components: a system which may improve current small-group telecommunication, and an experiment to test its efficacy. Telethrone projects a remote user onto a chair, bringing them into your space. The chair acts as a situated display which can support multi-party head gaze, eye gaze, and body torque such that each observer knows where the projected user is looking. It is simpler to implement and cheaper than current systems. Our primary contribution is a counterbalanced repeated measures experiment to analyse gaze interactions. We analyse the multiple independent viewpoint support offered by the system to test whether it demonstrates an advantage over a set-up which shows a single view to both observers; here, the results are inconclusive. Self-report questionnaire data suggests that the current implementation still gives the impression of being a display despite its situated nature, although participants did feel the remote user was in the space with them. Results from the eye gaze analysis suggest that the remote user is not excluded from three-way poker game-play.
  • Item
    Natural Interaction in Asymmetric Teleconference using Stuffed-toy Avatar Robot
    (The Eurographics Association, 2016) Fuady, Samratul; Orishige, Masato; Haoyan, Li; Mitake, Hironori; Hasegawa, Shoichi; Dirk Reiners and Daisuke Iwai and Frank Steinicke
    In this paper, we propose an asymmetric teleconference system using a stuffed-toy robot as the representative of the remote user. Our main goal is to realize a system that can provide natural conversation on both the remote and the local side without immersing the user in a virtual environment. We consider envelope feedback gestures a significant part of realizing natural interaction, specifically turn-taking, eye gazing and beat gestures. We use a stuffed-toy robot as the avatar robot because it can move very fast as well as increase familiarity and improve the interaction due to its soft structure. Furthermore, the soft structure also brings safety and robustness because it is not easily broken. On the remote side, we use a wearable sensor and an eye-tracking sensor to capture the remote user’s movement, thus enabling him/her to move and interact naturally as well. We conducted a teleconference experiment to evaluate our system. The results suggested that our system can improve the experience of the teleconference, especially in the turn-taking aspect and in transferring beat gestures.
  • Item
    The Influence of Real Human Personality on Social Presence with a Virtual Human in Augmented Reality
    (The Eurographics Association, 2016) Kim, Kangsoo; Bruder, Gerd; Maloney, Divine; Welch, Greg; Dirk Reiners and Daisuke Iwai and Frank Steinicke
    Human responses to an interaction with a Virtual Human (VH) can be influenced by both external factors, such as technology-related limitations, and internal factors, such as individual differences in personality. While the impacts of external factors have been studied widely, and are typically controlled for in application scenarios, less attention has been devoted to the impacts of internal factors. We present the results of a human-subject experiment in which we investigated a particular internal factor: the effects of participants' extraversion-introversion traits on the sense of social presence with a VH in an Augmented Reality (AR) setting. Our results indicate a positive association between a condition where the VH proactively requests help from the participant, and participants indicating higher social presence with the VH, regardless of their personality. However, we also found that extraverted participants tended to report higher social presence with the VH, compared to introverted participants. In addition, there were differences in how long participants looked at the VH during the interaction, according to their extraversion-introversion traits. Our results suggest that a real human's personality plays a significant role in interactions with a VH, and thus should be considered when carrying out experiments that include measures of the effects of controlled manipulations on interactions with a VH. We present the details of our experiment and discuss the findings and potential implications related to human perceptions and behaviors with a VH.
  • Item
    The Effects of Indirect Real Body Cues of Irrelevant Parts on Virtual Body Ownership and Presence
    (The Eurographics Association, 2016) Jung, Sungchul; Hughes, Charles E.; Dirk Reiners and Daisuke Iwai and Frank Steinicke
    The employment of visual, auditory and tactile senses directly related to specific body limbs associated with task performance has been shown to give a person a perception of body ownership. However, there is much less evidence of the influence on body ownership of sensory data associated with parts of the user’s body that are not directly associated with the task being performed. For example, if the arms and hands are the functional parts in a first-person-perspective game, do other body parts such as the torso or legs affect the person’s sense of illusion in ways that can increase or decrease the sense of body ownership? To show the effectiveness of appropriate indirect cues to an irrelevant body part, we conducted a virtual-mirror-based experiment. Specifically, we created a virtual reality system that has four mirror-reflected body conditions in which a participant can see his or her real lower body, a human avatar’s lower body, a generic avatar’s lower body or no lower body in the virtual mirror. With this system, we observed the effects of each condition on the user’s sense of body ownership and presence, even when the lower body parts played no role in the participant’s activities. The results indicate a tendency for an indirect real-body cue associated with one’s legs to elicit a higher sense of body ownership, and an unexpected result for presence, during the performance of a task involving only virtual hands.