ICAT-EGVE2017


Adelaide, Australia, November 22 – 24, 2017
Tracking
Real-time Ambient Fusion of Commodity Tracking Systems for Virtual Reality
Jake Fountain and Shamus P. Smith
A Mutual Motion Capture System for Face-to-face Collaboration
Atsuyuki Nakamura, Kiyoshi Kiyokawa, Photchara Ratsamee, Tomohiro Mashita, Yuki Uranishi, and Haruo Takemura
Won by a Head: A Platform Comparison of Smart Object Linking in Virtual Environments
Barrett Ens, Fraser Anderson, Tovi Grossman, Michelle Annett, Pourang Irani, and George Fitzmaurice
Facial Performance Capture by Embedded Photo Reflective Sensors on A Smart Eyewear
Nao Asano, Katsutoshi Masai, Yuta Sugiura, and Maki Sugimoto
Beyond Visuals
Tour de Tune - Auditory-game-motor Synchronisation in Exergames
Jenna Finlayson, Jamie Peterson, Joshua Free, Michael Lo, Lindsay A. Shaw, Christof Lutteroth, and Burkhard C. Wünsche
VibVid: VIBration Estimation from VIDeo by using Neural Network
Kentaro Yoshida, Seki Inoue, Yasutoshi Makino, and Hiroyuki Shinoda
Development of Olfactory Display Using Solenoid Valves Controlled Atomization for High Concentration Scent Emission
Yossiri Ariyakul
On the Analysis of Acoustic Distance Perception in a Head Mounted Display
Felix Dollack, Christina Imbery, and Jörg Bitzer
Immersion & Interaction
The Effect of User Embodiment in AV Cinematic Experience
Joshua Chen, Gun A. Lee, Mark Billinghurst, Robert W. Lindeman, and Christoph Bartneck
Evaluating the Effects of Hand-gesture-based Interaction with Virtual Content in a 360° Movie
Humayun Khan, Gun A. Lee, Simon Hoermann, Rory M. S. Clifford, Mark Billinghurst, and Robert W. Lindeman
360° versus 3D Environments in VR Headsets for an Exploration Task
Mehdi Boukhris, Alexis Paljic, and Dominique Lafon-Pham
An Augmented Reality and Virtual Reality Pillar for Exhibitions: A Subjective Exploration
Zi Siang See, Mohd Shahrizal Sunar, Mark Billinghurst, Arindam Dey, Delas Santano, Human Esmaeili, and Harold Thwaites
Asymmetric Bimanual Interaction for Mobile Virtual Reality
Huidong Bai, Alaeddin Nassani, Barrett Ens, and Mark Billinghurst
Avatars & Agents
Real-time Visual Representations for Mixed Reality Remote Collaboration
Lei Gao, Huidong Bai, Thammathip Piumsomboon, Gun A. Lee, Robert W. Lindeman, and Mark Billinghurst
Effects of Personalized Avatar Texture Fidelity on Identity Recognition in Virtual Reality
Jerald Thomas, Mahdi Azmandian, Sonia Grunwald, Donna Le, David Krum, Sin-Hwa Kang, and Evan Suma Rosenberg
Viewpoint-Dependent Appearance-Manipulation with Multiple Projector-Camera Systems
Toshiyuki Amano, Shun Ushida, and Yusuke Miyabayashi
User Interface Agents for Guiding Interaction with Augmented Virtual Mirrors
Gun A. Lee, Omprakash Rudhru, Hye Sun Park, Ho Won Kim, and Mark Billinghurst
Gaming
Enjoyment, Immersion, and Attentional Focus in a Virtual Reality Exergame with Differing Visual Environments
Michael Abernathy, Lindsay A. Shaw, Christof Lutteroth, Jude Buckley, Paul M. Corballis, and Burkhard C. Wünsche
Archives of Thrill: The V-Armchair Experience
Peter J. Passmore, Paul Tennent, Brendan Walker, Adam Philpot, Ha Le, Marianne Markowski, and Mehmet Karamanoglu
Evaluating and Comparing Game-controller based Virtual Locomotion Techniques
Bhuvaneswari Sarupuri, Simon Hoermann, Mary C. Whitton, and Robert W. Lindeman
Ethical Considerations for the Use of Virtual Reality: An Evaluation of Practices in Academia and Industry
Francisco Lopez Luro, Diego Navarro Prada, and Veronica Sundstedt
The Eyes Have It
Assessing the Relevance of Eye Gaze Patterns During Collision Avoidance in Virtual Reality
Kamala Varma, Stephen J. Guy, and Victoria Interrante
Dwarf or Giant: The Influence of Interpupillary Distance and Eye Height on Size Perception in Virtual Environments
Jangyoon Kim and Victoria Interrante
Moving Towards Consistent Depth Perception in Stereoscopic Projection-based Augmented Reality
Susanne Schmidt, Gerd Bruder, and Frank Steinicke
Exploring Pupil Dilation in Emotional Virtual Reality Environments
Hao Chen, Arindam Dey, Mark Billinghurst, and Robert W. Lindeman
Applications & Collaborations
Sharing Gaze for Remote Instruction
Sathya Barathan, Gun A. Lee, Mark Billinghurst, and Robert W. Lindeman
A New Approach to Utilize Augmented Reality on Precision Livestock Farming
Zongyuan Zhao, Wenli Yang, Winyu Chinthammit, Richard Rawnsley, Paul Neumeyer, and Stephen Cahoon
Collaborative View Configurations for Multi-user Interaction with a Wall-size Display
Hyungon Kim, Yeongmi Kim, Gun A. Lee, Mark Billinghurst, and Christoph Bartneck
Improving Collaboration in Augmented Video Conference using Mutually Shared Gaze
Gun A. Lee, Seungwon Kim, Youngho Lee, Arindam Dey, Thammathip Piumsomboon, Mitchell Norman, and Mark Billinghurst
Graphics & Metrics
Towards Precise, Fast and Comfortable Immersive Polygon Mesh Modelling: Capitalising the Results of Past Research and Analysing the Needs of Professionals
Philipp Ladwig, Jens Herder, and Christian Geiger
Fast and Accurate Simulation of Gravitational Field of Irregular-shaped Bodies using Polydisperse Sphere Packings
Abhishek Srinivas, Rene Weller, and Gabriel Zachmann
3D Reconstruction of Hand Postures by Measuring Skin Deformation on Back Hand
Wakaba Kuno, Yuta Sugiura, Nao Asano, Wataru Kawai, and Maki Sugimoto
Reference Framework on vSRT-method Benchmarking for MAR
Ryosuke Ichikari, Takeshi Kurata, Koji Makita, Takafumi Taketomi, Hideaki Uchiyama, Tomotsugu Kondo, Shohei Mori, and Fumihisa Shibata

BibTeX (ICAT-EGVE2017)
@inproceedings{10.2312:egve.20171331,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Real-time Ambient Fusion of Commodity Tracking Systems for Virtual Reality}},
  author = {Fountain, Jake and Smith, Shamus P.},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171331}
}
@inproceedings{10.2312:egve.20171332,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{A Mutual Motion Capture System for Face-to-face Collaboration}},
  author = {Nakamura, Atsuyuki and Kiyokawa, Kiyoshi and Ratsamee, Photchara and Mashita, Tomohiro and Uranishi, Yuki and Takemura, Haruo},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171332}
}
@inproceedings{10.2312:egve.20171333,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Won by a Head: A Platform Comparison of Smart Object Linking in Virtual Environments}},
  author = {Ens, Barrett and Anderson, Fraser and Grossman, Tovi and Annett, Michelle and Irani, Pourang and Fitzmaurice, George},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171333}
}
@inproceedings{10.2312:egve.20171334,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Facial Performance Capture by Embedded Photo Reflective Sensors on A Smart Eyewear}},
  author = {Asano, Nao and Masai, Katsutoshi and Sugiura, Yuta and Sugimoto, Maki},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171334}
}
@inproceedings{10.2312:egve.20171335,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Tour de Tune - Auditory-game-motor Synchronisation in Exergames}},
  author = {Finlayson, Jenna and Peterson, Jamie and Free, Joshua and Lo, Michael and Shaw, Lindsay A. and Lutteroth, Christof and Wünsche, Burkhard C.},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171335}
}
@inproceedings{10.2312:egve.20171336,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{VibVid: VIBration Estimation from VIDeo by using Neural Network}},
  author = {Yoshida, Kentaro and Inoue, Seki and Makino, Yasutoshi and Shinoda, Hiroyuki},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171336}
}
@inproceedings{10.2312:egve.20171337,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Development of Olfactory Display Using Solenoid Valves Controlled Atomization for High Concentration Scent Emission}},
  author = {Ariyakul, Yossiri},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171337}
}
@inproceedings{10.2312:egve.20171338,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{On the Analysis of Acoustic Distance Perception in a Head Mounted Display}},
  author = {Dollack, Felix and Imbery, Christina and Bitzer, Jörg},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171338}
}
@inproceedings{10.2312:egve.20171339,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{The Effect of User Embodiment in AV Cinematic Experience}},
  author = {Chen, Joshua and Lee, Gun A. and Billinghurst, Mark and Lindeman, Robert W. and Bartneck, Christoph},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171339}
}
@inproceedings{10.2312:egve.20171340,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Evaluating the Effects of Hand-gesture-based Interaction with Virtual Content in a 360° Movie}},
  author = {Khan, Humayun and Lee, Gun A. and Hoermann, Simon and Clifford, Rory M. S. and Billinghurst, Mark and Lindeman, Robert W.},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171340}
}
@inproceedings{10.2312:egve.20171342,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{An Augmented Reality and Virtual Reality Pillar for Exhibitions: A Subjective Exploration}},
  author = {See, Zi Siang and Sunar, Mohd Shahrizal and Billinghurst, Mark and Dey, Arindam and Santano, Delas and Esmaeili, Human and Thwaites, Harold},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171342}
}
@inproceedings{10.2312:egve.20171341,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{360° versus 3D Environments in VR Headsets for an Exploration Task}},
  author = {Boukhris, Mehdi and Paljic, Alexis and Lafon-Pham, Dominique},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171341}
}
@inproceedings{10.2312:egve.20171343,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Asymmetric Bimanual Interaction for Mobile Virtual Reality}},
  author = {Bai, Huidong and Nassani, Alaeddin and Ens, Barrett and Billinghurst, Mark},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171343}
}
@inproceedings{10.2312:egve.20171344,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Real-time Visual Representations for Mixed Reality Remote Collaboration}},
  author = {Gao, Lei and Bai, Huidong and Piumsomboon, Thammathip and Lee, Gun A. and Lindeman, Robert W. and Billinghurst, Mark},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171344}
}
@inproceedings{10.2312:egve.20171345,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Effects of Personalized Avatar Texture Fidelity on Identity Recognition in Virtual Reality}},
  author = {Thomas, Jerald and Azmandian, Mahdi and Grunwald, Sonia and Le, Donna and Krum, David and Kang, Sin-Hwa and Rosenberg, Evan Suma},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171345}
}
@inproceedings{10.2312:egve.20171346,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Viewpoint-Dependent Appearance-Manipulation with Multiple Projector-Camera Systems}},
  author = {Amano, Toshiyuki and Ushida, Shun and Miyabayashi, Yusuke},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171346}
}
@inproceedings{10.2312:egve.20171347,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{User Interface Agents for Guiding Interaction with Augmented Virtual Mirrors}},
  author = {Lee, Gun A. and Rudhru, Omprakash and Park, Hye Sun and Kim, Ho Won and Billinghurst, Mark},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171347}
}
@inproceedings{10.2312:egve.20171349,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Archives of Thrill: The V-Armchair Experience}},
  author = {Passmore, Peter J. and Tennent, Paul and Walker, Brendan and Philpot, Adam and Le, Ha and Markowski, Marianne and Karamanoglu, Mehmet},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171349}
}
@inproceedings{10.2312:egve.20171348,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Enjoyment, Immersion, and Attentional Focus in a Virtual Reality Exergame with Differing Visual Environments}},
  author = {Abernathy, Michael and Shaw, Lindsay A. and Lutteroth, Christof and Buckley, Jude and Corballis, Paul M. and Wünsche, Burkhard C.},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171348}
}
@inproceedings{10.2312:egve.20171350,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Evaluating and Comparing Game-controller based Virtual Locomotion Techniques}},
  author = {Sarupuri, Bhuvaneswari and Hoermann, Simon and Whitton, Mary C. and Lindeman, Robert W.},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171350}
}
@inproceedings{10.2312:egve.20171352,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Assessing the Relevance of Eye Gaze Patterns During Collision Avoidance in Virtual Reality}},
  author = {Varma, Kamala and Guy, Stephen J. and Interrante, Victoria},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171352}
}
@inproceedings{10.2312:egve.20171353,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Dwarf or Giant: The Influence of Interpupillary Distance and Eye Height on Size Perception in Virtual Environments}},
  author = {Kim, Jangyoon and Interrante, Victoria},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171353}
}
@inproceedings{10.2312:egve.20171351,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Ethical Considerations for the Use of Virtual Reality: An Evaluation of Practices in Academia and Industry}},
  author = {Luro, Francisco Lopez and Prada, Diego Navarro and Sundstedt, Veronica},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171351}
}
@inproceedings{10.2312:egve.20171355,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Exploring Pupil Dilation in Emotional Virtual Reality Environments}},
  author = {Chen, Hao and Dey, Arindam and Billinghurst, Mark and Lindeman, Robert W.},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171355}
}
@inproceedings{10.2312:egve.20171354,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Moving Towards Consistent Depth Perception in Stereoscopic Projection-based Augmented Reality}},
  author = {Schmidt, Susanne and Bruder, Gerd and Steinicke, Frank},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171354}
}
@inproceedings{10.2312:egve.20171356,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Sharing Gaze for Remote Instruction}},
  author = {Barathan, Sathya and Lee, Gun A. and Billinghurst, Mark and Lindeman, Robert W.},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171356}
}
@inproceedings{10.2312:egve.20171358,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Collaborative View Configurations for Multi-user Interaction with a Wall-size Display}},
  author = {Kim, Hyungon and Kim, Yeongmi and Lee, Gun A. and Billinghurst, Mark and Bartneck, Christoph},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171358}
}
@inproceedings{10.2312:egve.20171357,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{A New Approach to Utilize Augmented Reality on Precision Livestock Farming}},
  author = {Zhao, Zongyuan and Yang, Wenli and Chinthammit, Winyu and Rawnsley, Richard and Neumeyer, Paul and Cahoon, Stephen},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171357}
}
@inproceedings{10.2312:egve.20171359,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Improving Collaboration in Augmented Video Conference using Mutually Shared Gaze}},
  author = {Lee, Gun A. and Kim, Seungwon and Lee, Youngho and Dey, Arindam and Piumsomboon, Thammathip and Norman, Mitchell and Billinghurst, Mark},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171359}
}
@inproceedings{10.2312:egve.20171360,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Towards Precise, Fast and Comfortable Immersive Polygon Mesh Modelling: Capitalising the Results of Past Research and Analysing the Needs of Professionals}},
  author = {Ladwig, Philipp and Herder, Jens and Geiger, Christian},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171360}
}
@inproceedings{10.2312:egve.20171361,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Fast and Accurate Simulation of Gravitational Field of Irregular-shaped Bodies using Polydisperse Sphere Packings}},
  author = {Srinivas, Abhishek and Weller, Rene and Zachmann, Gabriel},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171361}
}
@inproceedings{10.2312:egve.20171362,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{3D Reconstruction of Hand Postures by Measuring Skin Deformation on Back Hand}},
  author = {Kuno, Wakaba and Sugiura, Yuta and Asano, Nao and Kawai, Wataru and Sugimoto, Maki},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171362}
}
@inproceedings{10.2312:egve.20171363,
  booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
  editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
  title = {{Reference Framework on vSRT-method Benchmarking for MAR}},
  author = {Ichikari, Ryosuke and Kurata, Takeshi and Makita, Koji and Taketomi, Takafumi and Uchiyama, Hideaki and Kondo, Tomotsugu and Mori, Shohei and Shibata, Fumihisa},
  year = {2017},
  publisher = {The Eurographics Association},
  ISSN = {1727-530X},
  ISBN = {978-3-03868-038-3},
  DOI = {10.2312/egve.20171363}
}


Recent Submissions

  • ICAT-EGVE 2017: Frontmatter
    (Eurographics Association, 2017) Lindeman, Robert W.; Bruder, Gerd; Iwai, Daisuke
  • Real-time Ambient Fusion of Commodity Tracking Systems for Virtual Reality
    (The Eurographics Association, 2017) Fountain, Jake; Smith, Shamus P.
    Cross-compatibility of virtual reality devices is limited by the difficulty of alignment and fusion of data between systems. In this paper, a plugin for ambiently aligning the reference frames of virtual reality tracking systems is presented. The core contribution consists of a procedure for ambient calibration. The procedure describes ambient behaviors for data gathering, system calibration and fault detection. Data is ambiently collected from in-application self-directed movements, and calibration is automatically performed between dependent sensor systems. Sensor fusion is then performed by taking the most accurate data for a given body part amongst all systems. The procedure was applied to aligning a Kinect v2 with an HTC Vive and an Oculus Rift in a variety of common virtual reality scenarios. The results were compared to alignment performed with a gold standard OptiTrack motion capture system. Typical results were 20 cm and 4° of error compared to the ground truth, which compares favorably with the accepted accuracy of the Kinect v2. Data collection for full calibration took on average 13 seconds of in-application, self-directed movement. This work represents an essential development towards plug-and-play sensor fusion for virtual reality technology.
  • A Mutual Motion Capture System for Face-to-face Collaboration
    (The Eurographics Association, 2017) Nakamura, Atsuyuki; Kiyokawa, Kiyoshi; Ratsamee, Photchara; Mashita, Tomohiro; Uranishi, Yuki; Takemura, Haruo
    In recent years, motion capture technology to measure the movement of the body has been used in many fields. Moreover, motion capture targeting multiple people is becoming necessary in multi-user virtual reality (VR) and augmented reality (AR) environments. It is desirable that motion capture require no wearable devices, so that natural motion can be captured easily. Some systems achieve this by using an RGB-D camera fixed in the environment, but the user then has to stay in front of the fixed RGB-D camera. Therefore, this research proposes a motion capture technique for a multi-user VR/AR environment using head mounted displays (HMDs) that neither limits the working range of the user nor requires any wearable devices. In the proposed technique, an RGB-D camera is attached to each HMD and motion capture is carried out mutually. The motion capture accuracy is improved by correcting the depth image. A prototype system has been implemented to evaluate the effectiveness of the proposed method, and motion capture accuracy has been compared under two conditions, with and without depth information correction, while rotating the RGB-D camera. As a result, it was confirmed that the proposed method could decrease the number of frames with erroneous motion capture by 49% to 100% in comparison with the case without depth image correction.
  • Won by a Head: A Platform Comparison of Smart Object Linking in Virtual Environments
    (The Eurographics Association, 2017) Ens, Barrett; Anderson, Fraser; Grossman, Tovi; Annett, Michelle; Irani, Pourang; Fitzmaurice, George
    Mixed-reality platforms and toolkits are now more accessible than ever, bringing a renewed interest in interactive mixed-reality applications. However, more research is required to determine which available platforms are best suited for different situated tasks. This paper presents a user study that compares headworn and handheld platforms on a smart object linking task in interactive virtual environments. Both platforms have potential benefits for supporting spatial interaction when users are situated in the spatial context of the objects being connected. Results show that the immersive, headworn platform has several benefits over the handheld tablet, including better performance and user experience. Findings also show that semantic knowledge about a spatial environment can provide advantages over abstract object identifiers.
  • Facial Performance Capture by Embedded Photo Reflective Sensors on A Smart Eyewear
    (The Eurographics Association, 2017) Asano, Nao; Masai, Katsutoshi; Sugiura, Yuta; Sugimoto, Maki
    Facial performance capture is used for animation production that projects a performer's facial expression onto a computer graphics model. Retro-reflective markers and cameras are widely used for performance capture. To capture expressions, we need to place markers on the performer's face and calibrate the intrinsic and extrinsic parameters of the cameras in advance. However, the measurable space is limited to the calibrated area. In this paper, we propose a system to capture facial performance using smart eyewear with photo reflective sensors and a machine learning technique.
  • Tour de Tune - Auditory-game-motor Synchronisation in Exergames
    (The Eurographics Association, 2017) Finlayson, Jenna; Peterson, Jamie; Free, Joshua; Lo, Michael; Shaw, Lindsay A.; Lutteroth, Christof; Wünsche, Burkhard C.
    Exergaming has been heralded as a promising approach to increase physical activity in hard-to-reach populations such as sedentary young adults. By combining physical activity with entertainment, researchers and developers hope that the excitement and immersion provided by a computer game will result in increased motivation and dissociation from the discomfort of physical exercise. A different approach to improve physical activity is the use of music. Music, in particular if synchronised with the rhythm of exercise, has been shown to increase performance and decrease the amount of perceived effort for the same performance. So far little research has been done on the combined effect of music and gameplay in exergaming. In this paper we investigate the effect of game-music synchronisation for an immersive exergame. We present a simple yet effective music analysis algorithm, and a novel exergame enabling synchronisation of gameplay with the music's intensity. Our results indicate that our exergame significantly increases enjoyment and motivation compared to music alone. It slightly increases performance, but also increases perceived effort. We did not find any significant differences between gameplay synchronised and not synchronised with the music. Our results confirm the positive effects of music while exercising, but suggest that gameplay might have a bigger effect on exergame effectiveness, and more research on the interaction between gameplay and music needs to be done.
  • Item
    VibVid: VIBration Estimation from VIDeo by using Neural Network
    (The Eurographics Association, 2017) Yoshida, Kentaro; Inoue, Seki; Makino, Yasutoshi; Shinoda, Hiroyuki; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    Along with advances in video technology in recent years, there is an increasing need to add tactile sensation to video. Many studies on models for estimating appropriate tactile information from the images and sounds contained in videos have been reported. In this paper, we propose a method named VibVid that uses machine learning to estimate the tactile signal from video with audio, and that can deal with videos where the visual and tactile information are not obviously related. As an example, we evaluated the method by estimating and imparting the vibration transmitted to a tennis racket from first-person view video of tennis. The waveform generated by VibVid was almost in line with the actual vibration waveform. We then conducted an experiment with 20 participants, which showed good results on the four evaluation criteria of harmony, fun, immersiveness, and realism.
  • Item
    Development of Olfactory Display Using Solenoid Valves Controlled Atomization for High Concentration Scent Emission
    (The Eurographics Association, 2017) Ariyakul, Yossiri; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    This paper reports on an atomization technique controlled by high-speed switching solenoid valves for presenting smells. Even though atomization has been widely used to release smells in commercial aroma diffusers, the intensity of the released odor cannot normally be controlled. In the proposed display, high-speed ON/OFF switching of the solenoid valves makes it possible to control odor intensity precisely and rapidly, while atomization enables the emission of odors at higher concentrations than those generated by the natural evaporation method. The proposed olfactory display was evaluated using an odor sensing system composed of a quartz crystal microbalance (QCM) gas sensor. The results confirmed the reproducibility of the proposed olfactory display and its capability to present high-concentration odors with adjustable intensity.
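    The intensity control described above is essentially pulse-width modulation of the valves. A minimal sketch of the idea, assuming a linear intensity-to-duty-cycle mapping and an illustrative switching period (the paper's actual timing parameters are not given):

    ```python
    def valve_timing(intensity, period_ms=100.0):
        """Map a desired odor intensity in [0, 1] to ON/OFF durations of a
        high-speed solenoid valve within one switching period (PWM-style
        duty-cycle control). Period and linear mapping are illustrative."""
        intensity = max(0.0, min(1.0, intensity))  # clamp to valid range
        on_ms = intensity * period_ms              # valve open time per period
        return on_ms, period_ms - on_ms            # (ON, OFF) durations

    # e.g. 25% intensity -> valve open 25 ms, closed 75 ms per 100 ms period
    ```

    Because the valve switches much faster than the odor diffuses, the perceived intensity tracks the duty cycle rather than the individual pulses.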
  • Item
    On the Analysis of Acoustic Distance Perception in a Head Mounted Display
    (The Eurographics Association, 2017) Dollack, Felix; Imbery, Christina; Bitzer, Jörg; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    Recent work has shown that distance perception in virtual reality differs from reality. Several studies have tried to quantify the discrepancy between virtual and real visual distance perception, but little work has been done on how visual stimuli affect acoustic distance perception in virtual environments. The present study investigates how a visual stimulus affects acoustic distance perception in virtual environments. Virtual sound sources based on binaural room impulse response (BRIR) measurements, recorded at distances ranging from 0.9 to 4.9 m in a lecture room, were used as auditory stimuli. Visual stimulation was provided through a head mounted display (HMD). Participants were asked to estimate the egocentric distance to the sound source in two conditions: auditory with GUI (A), and auditory with HMD (A+V). Each condition was presented in its own block to a total of eight participants. We found that the visual stimulus introduces a systematic offset.
  • Item
    The Effect of User Embodiment in AV Cinematic Experience
    (The Eurographics Association, 2017) Chen, Joshua; Lee, Gun A.; Billinghurst, Mark; Lindeman, Robert W.; Bartneck, Christoph; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    Virtual Reality (VR) is becoming a popular medium for viewing immersive cinematic experiences using 360° panoramic movies and head mounted displays. There is previous research on user embodiment in real-time rendered VR, but not in relation to cinematic VR based on 360° panoramic video. In this paper we explore the effects of introducing the user's real body into cinematic VR experiences. We conducted a study evaluating how the type of movie and user embodiment affect the sense of presence and user engagement. We found that when participants were able to see their own body in the VR movie, there was a significant increase in the sense of presence, yet user engagement was not significantly affected. We discuss the implications of the results and how the work can be expanded in the future.
  • Item
    Evaluating the Effects of Hand-gesture-based Interaction with Virtual Content in a 360° Movie
    (The Eurographics Association, 2017) Khan, Humayun; Lee, Gun A.; Hoermann, Simon; Clifford, Rory M. S.; Billinghurst, Mark; Lindeman, Robert W.; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    Head-mounted displays are becoming increasingly popular as home entertainment devices for viewing 360° movies. This paper explores the effects of adding gesture interaction with virtual content and two different hand-visualisation modes for 360° movie watching experience. The system in the study comprises a Leap Motion sensor to track the user's hand and finger motions, in combination with a SoftKinetic RGB-D camera to capture the texture of the hands and arms. A 360° panoramic movie with embedded virtual objects was used as content. Four conditions, displaying either a point-cloud of the real hand or a rigged computer-generated hand, with and without interaction, were evaluated. Presence, agency, embodiment, and ownership, as well as the overall participant preference were measured. Results showed that participants had a strong preference for the conditions with interactive virtual content, and they felt stronger embodiment and ownership. The comparison of the two hand visualisations showed that the display of the real hand elicited stronger ownership. There was no overall difference for presence between the four conditions. These findings suggest that adding interaction with virtual content could be beneficial to the overall user experience, and that interaction should be performed using the real hand visualisation instead of the virtual hand if higher ownership is desired.
  • Item
    An Augmented Reality and Virtual Reality Pillar for Exhibitions: A Subjective Exploration
    (The Eurographics Association, 2017) See, Zi Siang; Sunar, Mohd Shahrizal; Billinghurst, Mark; Dey, Arindam; Santano, Delas; Esmaeili, Human; Thwaites, Harold; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    This paper presents the development of an Augmented Reality (AR) and Virtual Reality (VR) pillar, a novel approach for showing AR and VR content in a public setting. A pillar in a public exhibition venue was converted into a four-sided AR and VR showcase, and a cultural heritage exhibit, ''Boatbuilders of Pangkor'', was shown. Multimedia tablets and mobile AR head-mounted displays (HMDs) were provided for visitors to experience the multisensory AR and VR content demonstrated on the pillar. The content included AR-based videos, maps, images and text, and VR experiences that allowed visitors to view reconstructed 3D subjects and remote locations in a 360° virtual environment. In this paper, we describe the prototype system, a user evaluation study and directions for future work.
  • Item
    360° versus 3D Environments in VR Headsets for an Exploration Task
    (The Eurographics Association, 2017) Boukhris, Mehdi; Paljic, Alexis; Lafon-Pham, Dominique; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    For entertainment, pedagogical or cultural purposes, there is a need for fast and easy setup of virtual environments that represent real ones. The use of 360° video in Virtual Reality headsets seems like a powerful tool for producing fun and engaging content in a fast manner. This applies even more when we need to set up realistic views of actual environments. However, in terms of user experience in virtual reality headsets, can 360° shots of a real environment be an interesting alternative to a full 3D model? In this work, we conducted a user study during a film festival comparing the reaction of a wide public to two versions of a Virtual Reality cultural heritage visit of a Paleolithic cave, the "Grotte de Commarque" located in the south of France. The first version is a full 3D textured model of the cave; the second is a series of 360° pictures, presented in a VR headset. We set up a scenario of observation and exploration. The users were able to navigate with the same teleportation metaphor in both conditions. We focused on evaluating the sense of presence during the visit, and also looked for trends in perceived fun, sickness and ease of navigation. Our results suggest that participants feel more present in the full 3D environment. However, the differences in ratings between the two conditions were not strongly marked. Moreover, presence ratings were correlated with the user's degree of familiarity with virtual reality.
  • Item
    Asymmetric Bimanual Interaction for Mobile Virtual Reality
    (The Eurographics Association, 2017) Bai, Huidong; Nassani, Alaeddin; Ens, Barrett; Billinghurst, Mark; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    In this paper, we explore asymmetric bimanual interaction with mobile Virtual Reality (VR). We have developed a novel two-handed interface for mobile VR which uses 6 degree of freedom (DoF) controller input for the dominant hand and full-hand gesture input for the non-dominant hand. We evaluated our method in a pilot study by comparing it to three other asymmetric bimanual interfaces: (1) 3D controller and 2D touchpad, (2) 3D gesture and 2D controller, and (3) 3D gesture and 2D touchpad, in a VR translation and rotation task. We observed that using our position-aware handheld controller with gesture input provided an easy and natural experience.
  • Item
    Real-time Visual Representations for Mixed Reality Remote Collaboration
    (The Eurographics Association, 2017) Gao, Lei; Bai, Huidong; Piumsomboon, Thammathip; Lee, Gun A.; Lindeman, Robert W.; Billinghurst, Mark; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    We present a prototype Mixed Reality (MR) system with a hybrid interface to support remote collaboration between a local worker and a remote expert in a large-scale work space. By combining a low-resolution 3D point-cloud of the environment surrounding the local worker with a high-resolution real-time view of small focused details, the remote expert can see a virtual copy of the local workspace with independent viewpoint control. Meanwhile, the expert can also check the current actions of the local worker through a real-time feedback view. We conducted a pilot study to evaluate the usability of our system by comparing the performance of three different interface designs, showing the real-time view as a 2D first-person view, a 2D third-person view, or a 3D point-cloud view. We found no difference in average task completion time between the three interfaces, but there was a difference in user preference.
  • Item
    Effects of Personalized Avatar Texture Fidelity on Identity Recognition in Virtual Reality
    (The Eurographics Association, 2017) Thomas, Jerald; Azmandian, Mahdi; Grunwald, Sonia; Le, Donna; Krum, David; Kang, Sin-Hwa; Rosenberg, Evan Suma; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    Recent advances in 3D scanning, reconstruction, and animation techniques have made it possible to rapidly create photorealistic avatars based on real people. While it is now possible to create personalized avatars automatically with consumer-level technology, their visual fidelity still falls far short of 3D avatars created with professional cameras and manual artist effort. To evaluate the importance of investing resources in the creation of high-quality personalized avatars, we conducted an experiment to investigate the effects of varying their visual texture fidelity, specifically focusing on identity recognition of specific individuals. We designed two virtual reality experimental scenarios: (1) selecting a specific avatar from a virtual lineup and (2) searching for an avatar in a virtual crowd. Our results showed that visual fidelity had a significant impact on participants' abilities to identify specific avatars from a lineup wearing a head-mounted display. We also investigated gender effects for both the participants and the confederates from which the avatars were created.
  • Item
    Viewpoint-Dependent Appearance-Manipulation with Multiple Projector-Camera Systems
    (The Eurographics Association, 2017) Amano, Toshiyuki; Ushida, Shun; Miyabayashi, Yusuke; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    This paper proposes a novel projection display technique that realizes viewing-direction-dependent appearance manipulation. The proposed method employs multiple projector-camera feedback systems, and each projector-camera system simultaneously manipulates the apparent color or contrast from a different viewing direction. Since we assume mirror reflection is the dominant component, we placed each camera on the opposite side of its projector. Experimental results confirmed that our multiple projector-camera system enables viewpoint-dependent appearance manipulation on an anisotropic reflection surface. Interestingly, the application target is not limited to metallic surfaces: we confirmed that the technique can also be applied to matte paper media via glossy ink reflection.
  • Item
    User Interface Agents for Guiding Interaction with Augmented Virtual Mirrors
    (The Eurographics Association, 2017) Lee, Gun A.; Rudhru, Omprakash; Park, Hye Sun; Kim, Ho Won; Billinghurst, Mark; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    This research investigates using user interface (UI) agents for guiding gesture-based interaction with Augmented Virtual Mirrors. Compared to prior work in gesture interaction, where graphical symbols are used for guiding user interaction, we propose using UI agents. We explore two approaches for using UI agents: 1) using a UI agent as a delayed cursor and 2) using a UI agent as an interactive button. We conducted two user studies to evaluate the proposed designs. The results from the user studies show that UI agents are effective for guiding user interactions in a similar way to a traditional graphical user interface providing visual cues, while also being useful for emotionally engaging users.
  • Item
    Archives of Thrill: The V-Armchair Experience
    (The Eurographics Association, 2017) Passmore, Peter J.; Tennent, Paul; Walker, Brendan; Philpot, Adam; Le, Ha; Markowski, Marianne; Karamanoglu, Mehmet; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    Technology for older people is typically concerned either with health care or accessibility of existing systems. In this paper we take a more 'entertainment-oriented' approach to developing experiences aimed at older users. We describe here the design, development and a user study of the V-Armchair, a virtual reality and motion platform based roller coaster experience. The V-Armchair constitutes a blueprint for the digital archiving of physical ride experiences through the simultaneous capture of 360° video, sound and motion. It gives access to thrill experiences to those who may not be able to go on real thrill rides, such as older riders, and it can be considered as a class of technology that could help to support 'active aging' as defined by the World Health Organisation. We discuss strategies for capturing and then 'toning down' motion experiences to make them accessible for older users. We present a study exploring the user experience of the V-Armchair with an older group (median age 63) using a DK2 headset and a younger group (median age 25) using a CV1 headset, via thematic analysis of semi-structured interviews and a modified version of the Game Experience Questionnaire. We discuss emergent themes such as the role of the presenter, reminiscence, presence and immersion.
  • Item
    Enjoyment, Immersion, and Attentional Focus in a Virtual Reality Exergame with Differing Visual Environments
    (The Eurographics Association, 2017) Abernathy, Michael; Shaw, Lindsay A.; Lutteroth, Christof; Buckley, Jude; Corballis, Paul M.; Wünsche, Burkhard C.; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    Virtual reality exergames provide a compelling distraction from the possible discomfort and negative perception of exercise by immersing users in three dimensional virtual worlds. Prior studies have looked at the effects of immersion in exergames, from the technologies used, to gameplay elements, to sensory stimulation. This study examines the level of immersion and distraction caused by various visual environments, including urban, rural, and desert landscapes, and the effects on users' performance, enjoyment, and motivation. The environments were found to have little effect on the user. It appears that the core gameplay elements have a far greater effect, being essential for the immersion a user experiences.
  • Item
    Evaluating and Comparing Game-controller based Virtual Locomotion Techniques
    (The Eurographics Association, 2017) Sarupuri, Bhuvaneswari; Hoermann, Simon; Whitton, Mary C.; Lindeman, Robert W.; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    The incremental hardware costs of virtual locomotion are minimized when the technique uses interaction capabilities available in controllers and devices that are already part of the VE system, e.g., gamepads, keyboards, and multi-function controllers. We used a different locomotion technique for each of these three devices: gamepad thumb-stick (joystick walking), a customized hybrid keyboard for gaming (speedpad walking), and an innovative technique that uses the orientation and triggers of the HTC Vive controllers (TriggerWalking). We explored the efficacy of locomotion techniques using these three devices in a hide-and-seek task in an indoor environment. We measured task performance, simulator sickness, system usability, perceived workload, and preference. We found that users had a strong preference for TriggerWalking, which also had the least increase in simulator sickness, the highest performance score, and highest perceived usability. However, participants using TriggerWalking also had the most object and wall collisions. Overall we found that TriggerWalking is an effective locomotion technique and that it has significant and important benefits. Future research will explore whether TriggerWalking can be used with equal benefits in other virtual environments, on different tasks, and with different types of movement.
  • Item
    Assessing the Relevance of Eye Gaze Patterns During Collision Avoidance in Virtual Reality
    (The Eurographics Association, 2017) Varma, Kamala; Guy, Stephen J.; Interrante, Victoria; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    Increasing presence in virtual reality environments requires a meticulous imitation of human behavior in virtual agents. In the specific case of collision avoidance, agents' interaction will feel more natural if they are able to both display and respond to non-verbal cues. This study informs their behavior by analyzing participants' reactions to non-verbal cues. Its aim is to confirm previous work that shows head orientation to be a primary factor in collision avoidance negotiation, and to extend this to investigate the additional contribution of eye gaze direction as a cue. Fifteen participants were directed to walk towards an oncoming agent in a virtual hallway, who would exhibit various combinations of head orientation and eye gaze direction cues. Shortly before the potential collision, the display turned black and the participant had to move to avoid the agent as if she were still present. Meanwhile, their own eye gaze was tracked to identify where their focus was directed and how it related to their response. Results show that the natural tendency was to avoid the agent by moving right. However, participants showed a greater compulsion to move leftward if the agent cued her own movement to the participant's right, whether through head orientation cues (consistent with previous work) or through eye gaze direction cues (extending previous work). The implications of these findings are discussed.
  • Item
    Dwarf or Giant: The Influence of Interpupillary Distance and Eye Height on Size Perception in Virtual Environments
    (The Eurographics Association, 2017) Kim, Jangyoon; Interrante, Victoria; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    This paper addresses the question: to what extent can deliberate manipulations of interpupillary distance (IPD) and eye height be used in a virtual reality (VR) experience to influence a user's sense of their own scale with respect to their surrounding environment - evoking, for example, the illusion of being miniaturized, or of being a giant? In particular, we report the results of an experiment in which we separately study the effect of each of these body scale manipulations on users' perception of object size in a highly detailed, photorealistically rendered immersive virtual environment, using both absolute numeric measures and body-relative actions. Following a real world training session, in which participants learn to accurately report the metric sizes of individual white cubes (3''-20'') presented one at a time on a table in front of them, we conduct two blocks of VR trials using nine different combinations of IPD and eye height. In the first block of trials, participants report the perceived metric size of a virtual white cube that sits on a virtual table, at the same distance used in the real-world training, within a realistic virtual living room filled with many objects capable of providing familiar size cues. In the second block of trials, participants use their hands to indicate the perceived size of the cube. We found that size judgments were moderately correlated (r = 0.4) between the two response methods, and that neither altered eye height (± 50cm) nor reduced (10mm) IPD had a significant effect on size judgments, but that a wider (150mm) IPD caused a significant (μ = 38%, p < 0.01) decrease in perceived cube size. These findings add new insights to our understanding of how eye height and IPD manipulations can affect peoples' perception of scale in highly realistic immersive VR scenarios.
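    A first-order geometric account of the IPD effect: stereoscopic scale varies inversely with the rendered eye separation, so widening the IPD should make the world appear smaller. A sketch of this prediction (the 63 mm baseline is an assumed average IPD, not a value from the paper, and the pure-geometry model ignores the familiar-size cues in the scene, which plausibly explains why the measured 38% decrease is smaller than the geometric prediction):

    ```python
    def predicted_scale(rendered_ipd_mm, natural_ipd_mm=63.0):
        """First-order stereoscopic prediction: perceived world scale is
        inversely proportional to the rendered interpupillary distance.
        Baseline of 63 mm is an assumed population-average IPD."""
        return natural_ipd_mm / rendered_ipd_mm

    wide = predicted_scale(150.0)   # ~0.42: world predicted to shrink ~58%
    narrow = predicted_scale(10.0)  # ~6.3: world predicted to look giant
    ```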
  • Item
    Ethical Considerations for the Use of Virtual Reality: An Evaluation of Practices in Academia and Industry
    (The Eurographics Association, 2017) Luro, Francisco Lopez; Prada, Diego Navarro; Sundstedt, Veronica; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    The following article offers a set of recommendations that are considered relevant for designing and executing experiences with Virtual Reality (VR) technology. It presents a brief review of the history and evolution of VR, along with the physiological issues related to its use. Additionally, typical practices in VR, used by both academia and industry, are discussed and contrasted. These were further analysed from an ethical perspective, guided by legal and Corporate Social Responsibility (CSR) frameworks, to understand their motivation and goals, and the rights and responsibilities related to the exposure of research participants and final consumers to VR. Our results showed that there is a significant disparity between practices in academia and industry, and that in industry specifically there can be breaches of user protection regulations and poor ethical practices. The differences found mainly concern the type of content presented, the overall setup of VR experiences, and the amount of information provided to participants or consumers respectively. To contribute to this issue, this study highlights some ethical aspects and offers practical considerations that aim not only to promote more appropriate practices with VR in public spaces, but also to motivate discussion and reflection to ease the adoption of this technology in the consumer market.
  • Item
    Exploring Pupil Dilation in Emotional Virtual Reality Environments
    (The Eurographics Association, 2017) Chen, Hao; Dey, Arindam; Billinghurst, Mark; Lindeman, Robert W.; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    Previous investigations have shown that pupil dilation can be affected by emotive pictures, audio clips, and videos. In this paper, we explore how emotive Virtual Reality (VR) content can also cause pupil dilation. VR has been shown to be able to evoke negative and positive arousal in users when they are immersed in different virtual scenes. In our research, VR scenes were used as emotional triggers. Five emotional VR scenes were designed in our study and each scene had five emotion segments: happiness, fear, anxiety, sadness, and disgust. While participants experienced the VR scenes, their pupil dilation and the brightness in the headset were captured. We found that both the negative and positive emotion segments produced pupil dilation in the VR environments. We also explored the effect of showing heart beat cues to the users, and whether this could cause differences in pupil dilation. In our study, three different heart beat cues were shown to users using a combination of three channels: haptic, audio, and visual. The results showed that the haptic-visual cue caused the most significant pupil dilation change from the baseline.
  • Item
    Moving Towards Consistent Depth Perception in Stereoscopic Projection-based Augmented Reality
    (The Eurographics Association, 2017) Schmidt, Susanne; Bruder, Gerd; Steinicke, Frank; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    Stereoscopic projection-based augmented reality (AR) is a promising technology for creating an effective illusion of virtual and real objects coexisting within the same space. By using projection technology, two-dimensional (2D) textures as well as three-dimensional (3D) virtual objects can be displayed on arbitrary physical objects. However, depending on the geometry of the projection surface, even a single virtual object could be projected with varying depths, orientations, and forms. For these reasons, it is an open question whether or not a geometrically-correct projection leads to a consistent depth perception of the AR environment. We performed an experiment to analyze how humans perceive the depths of objects that are stereoscopically projected onto different surfaces in a projection-based AR environment. In a perceptual matching task the participants had to adjust the depth of one of two visual stimuli, which were displayed at different depths with varying parallaxes, until they estimated the depths of both stimuli to match. The results indicate that the effect of parallax on the estimation of matching depths significantly depends on the participant's experience with stereoscopic displays. Regular users were able to match the depths of both stimuli with a mean absolute error of less than one centimeter, whereas less experienced users made errors in the range of more than 2cm on average. We performed a confirmatory study to verify our findings with more ecologically valid projection-based AR stimuli.
  • Item
    Sharing Gaze for Remote Instruction
    (The Eurographics Association, 2017) Barathan, Sathya; Lee, Gun A.; Billinghurst, Mark; Lindeman, Robert W.; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    In this paper, we report on how sharing gaze cues can assist remote instruction. A person wearing a head-mounted display and camera can share his or her view with a remote collaborator and get assistance on completing a real-world task. This configuration has been extensively studied in the past, but there has been little research on how the addition of sharing gaze cues might affect the collaboration. This paper reports on a user study exploring how sharing the gaze of a remote expert affects the quality of collaboration over a head-worn video conferencing link. The results showed that the users performed faster when the local workers were aware of their remote collaborator's gaze, and the remote experts were in favour of shared gaze cues because of the ease-of-use and improved communication.
  • Item
    Collaborative View Configurations for Multi-user Interaction with a Wall-size Display
    (The Eurographics Association, 2017) Kim, Hyungon; Kim, Yeongmi; Lee, Gun A.; Billinghurst, Mark; Bartneck, Christoph; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    This paper explores the effects of different collaborative view configurations on face-to-face collaboration using a wall-size display, and the relationship between view configuration and multi-user interaction. Three different view configurations (shared view, split screen, and split screen with navigation information) for multi-user collaboration with a wall-size display were introduced and evaluated in a user study. From the experimental results, several insights for designing a virtual environment with a wall-size display were derived. The shared view configuration does not disturb collaboration despite control conflicts and can support effective collaboration. The split screen configuration enables independent collaboration, although it can divide users' attention. Navigation information can reduce the interaction required for navigational tasks, although overall interaction performance may not increase.
  • Item
    A New Approach to Utilize Augmented Reality on Precision Livestock Farming
    (The Eurographics Association, 2017) Zhao, Zongyuan; Yang, Wenli; Chinthammit, Winyu; Rawnsley, Richard; Neumeyer, Paul; Cahoon, Stephen; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    This paper proposes a new method that utilizes AR to assist pasture-based dairy farmers in identifying and locating animals within large herds. Our proposed method uses GPS collars on cows, together with the digital camera and on-board GPS of a mobile device, to locate a selected cow and show its behavioral and other associated key metrics in our mobile application. The augmented cow information shown over the real-scene video stream will help users (farmers) manage their animals with respect to welfare, health, and management interventions. By integrating GPS data with computer vision (CV) and machine learning, our mobile AR application has two major functions: (1) searching for a cow by its unique ID, and (2) displaying information associated with a selected cow visible on screen. Our proof-of-concept application shows the potential of utilizing AR in precision livestock farming.
  • Item
    Improving Collaboration in Augmented Video Conference using Mutually Shared Gaze
    (The Eurographics Association, 2017) Lee, Gun A.; Kim, Seungwon; Lee, Youngho; Dey, Arindam; Piumsomboon, Thammathip; Norman, Mitchell; Billinghurst, Mark; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    To improve remote collaboration in video conferencing systems, researchers have been investigating augmenting visual cues onto a shared live video stream. In such systems, a person wearing a head-mounted display (HMD) and camera can share her view of the surrounding real-world with a remote collaborator to receive assistance on a real-world task. While this concept of augmented video conferencing (AVC) has been actively investigated, there has been little research on how sharing gaze cues might affect the collaboration in video conferencing. This paper investigates how sharing gaze in both directions between a local worker and remote helper in an AVC system affects the collaboration and communication. Using a prototype AVC system that shares the eye gaze of both users, we conducted a user study that compares four conditions with different combinations of eye gaze sharing between the two users. The results showed that sharing each other's gaze significantly improved collaboration and communication.
  • Item
    Towards Precise, Fast and Comfortable Immersive Polygon Mesh Modelling: Capitalising the Results of Past Research and Analysing the Needs of Professionals
    (The Eurographics Association, 2017) Ladwig, Philipp; Herder, Jens; Geiger, Christian; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    More than three decades of ongoing research in immersive modelling has revealed many advantages of creating objects in virtual environments. Despite these benefits, the potential of immersive modelling has only been partly exploited, due to unresolved issues such as ergonomics, numerous challenges with user interaction, and the inability to perform exact, fast and progressive refinements. This paper explores past research, shows alternative approaches and proposes novel interaction tools for pending problems. An immersive modelling application for polygon meshes is created from scratch and tested by professional users of desktop modelling tools, such as Autodesk Maya, in order to assess the efficiency, comfort and speed of the proposed application in direct comparison to professional desktop modelling tools.
  • Item
    Fast and Accurate Simulation of Gravitational Field of Irregular-shaped Bodies using Polydisperse Sphere Packings
    (The Eurographics Association, 2017) Srinivas, Abhishek; Weller, Rene; Zachmann, Gabriel; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    Currently, interest in space missions to small bodies (e.g., asteroids) is increasing, both scientifically and commercially. One of the important aspects of these missions is to test the navigation, guidance, and control algorithms. The most cost- and time-efficient way to do this is to simulate the missions in virtual testbeds. To do so, a physically-based simulation of the small bodies' physical properties is essential. One of the most important physical properties, especially for landing operations, is the gravitational field, which can be quite irregular, depending on the shape and mass distribution of the body. In this paper, we present a novel algorithm to simulate gravitational fields for small bodies like asteroids. The main idea is to represent the small body's mass by a polydisperse sphere packing. This allows for an easy and efficient parallelization. Our GPU-based implementation outperforms traditional methods by more than two orders of magnitude while achieving a similar accuracy.
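    The sphere-packing idea works because, by Newton's shell theorem, a homogeneous sphere attracts external points exactly like a point mass at its center, so the body's field is a sum of independent per-sphere terms. A minimal CPU sketch of that superposition (uniform density and the function signature are assumptions; the paper's GPU implementation is not reproduced here):

    ```python
    import numpy as np

    def gravitational_field(point, centers, radii, rho, G=6.674e-11):
        """Gravitational field at `point` from a polydisperse sphere packing.

        centers: (N, 3) sphere centers, radii: (N,) sphere radii,
        rho: uniform mass density. Each sphere contributes the field of a
        point mass at its center (shell theorem), so the per-sphere terms
        are independent and trivially parallelizable.
        """
        masses = rho * (4.0 / 3.0) * np.pi * radii**3
        d = centers - point                 # vectors from query point to centers
        r = np.linalg.norm(d, axis=1)       # distances to each sphere center
        return (G * masses / r**3) @ d      # sum of G * m_i * d_i / |d_i|^3
    ```

    Mapping each sphere's contribution to one GPU thread and reducing the partial sums is what makes the approach parallelize so well.
    
    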
  • Item
    3D Reconstruction of Hand Postures by Measuring Skin Deformation on Back Hand
    (The Eurographics Association, 2017) Kuno, Wakaba; Sugiura, Yuta; Asano, Nao; Kawai, Wataru; Sugimoto, Maki; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    In this research, we propose a method for reconstructing hand posture by measuring the deformation of the back of the hand with a wearable device. The deformation of the skin on the back of the hand is measured by several photo-reflective sensors attached to the wearable device. In the learning phase, our method constructs a regression model from data on hand posture captured by a depth camera and data on the skin deformation of the back of the hand captured by the photo-reflective sensors. In the estimation phase, this regression model reconstructs the posture of the hand from the photo-reflective sensor data in real time. The posture of the fingers can be estimated without hindering their natural movement, since the deformation of the back of the hand is measured without directly measuring the position of the fingers. This method lets users manipulate information in a virtual environment with their fingers. We conducted an experiment to evaluate the accuracy of reconstructing hand posture with the proposed system.
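    The learning/estimation pipeline described above can be sketched with an ordinary least-squares regressor mapping sensor readings to joint parameters. This is an illustrative toy on synthetic data: the sensor count, joint parametrization, and choice of a linear model are all assumptions, not details from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed setup: 8 photo-reflective sensor readings per frame,
    # 15 joint-angle targets per frame (both counts hypothetical).
    n_frames, n_sensors, n_joints = 500, 8, 15
    X = rng.normal(size=(n_frames, n_sensors))        # skin-deformation readings
    W_true = rng.normal(size=(n_sensors, n_joints))   # hidden sensor-to-joint map
    Y = X @ W_true + 0.01 * rng.normal(size=(n_frames, n_joints))  # depth-camera angles

    # Learning phase: fit the regression model by least squares.
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)

    # Estimation phase: reconstruct joint angles from sensor readings alone.
    def estimate_posture(sensor_frame):
        return sensor_frame @ W

    mean_abs_error = np.abs(estimate_posture(X) - Y).mean()
    ```

    In practice the regressor would be trained once against depth-camera ground truth and then run per frame on sensor data only, which is what makes real-time estimation possible.
    
    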
  • Item
    Reference Framework on vSRT-method Benchmarking for MAR
    (The Eurographics Association, 2017) Ichikari, Ryosuke; Kurata, Takeshi; Makita, Koji; Taketomi, Takafumi; Uchiyama, Hideaki; Kondo, Tomotsugu; Mori, Shohei; Shibata, Fumihisa; Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
    This paper presents a reference framework for benchmarking vision-based spatial registration and tracking (vSRT) methods for Mixed and Augmented Reality (MAR). The framework provides typical benchmarking processes, benchmark indicators, and trial-set elements that are necessary to successfully identify, define, design, select, and apply benchmarking of vSRT methods for MAR. In addition, we summarize findings from benchmarking activities to share how to organize and conduct on-site and off-site competitions.