JVRC12: Joint Virtual Reality Conference of ICAT - EGVE - EuroVR
Browsing JVRC12: Joint Virtual Reality Conference of ICAT - EGVE - EuroVR by Subject "Artificial"
Now showing 1 - 6 of 6
Item: 3D User Interfaces Using Tracked Multi-touch Mobile Devices (The Eurographics Association, 2012)
Authors: Wilkes, Curtis B.; Tilden, Dan; Bowman, Doug A.
Editors: Ronan Boulic, Carolina Cruz-Neira, Kiyoshi Kiyokawa, David Roberts
Abstract: Multi-touch mobile devices are becoming ubiquitous due to the proliferation of smartphone platforms such as the iPhone and Android. Recent research has explored the use of multi-touch input for 3D user interfaces on displays including large touch screens, tablets, and mobile devices. This research explores the benefits of adding six-degree-of-freedom tracking to a multi-touch mobile device for 3D interaction. We analyze and propose benefits of using tracked multi-touch mobile devices (TMMDs), with the goal of developing effective interaction techniques for a variety of tasks within immersive 3D user interfaces. We developed several techniques using TMMDs for virtual object manipulation and compared them to existing best-practice techniques in a series of user studies. We did not, however, find performance advantages for TMMD-based techniques. We discuss our observations and propose alternate interaction techniques and tasks that may benefit from TMMDs.

Item: Comparing Auditory and Haptic Feedback for a Virtual Drilling Task (The Eurographics Association, 2012)
Authors: Rausch, Dominik; Aspöck, Lukas; Knott, Thomas; Pelzer, Sönke; Vorländer, Michael; Kuhlen, Torsten
Editors: Ronan Boulic, Carolina Cruz-Neira, Kiyoshi Kiyokawa, David Roberts
Abstract: While visual feedback is dominant in Virtual Environments, the use of other modalities such as haptics and acoustics can enhance believability, immersion, and interaction performance. Haptic feedback is especially helpful for many interaction tasks, such as working with medical or precision tools. However, unlike visual and auditory feedback, haptic reproduction is often difficult to achieve due to hardware limitations.
This article describes a user study examining how auditory feedback can substitute for haptic feedback when interacting with a vibrating tool. Participants remove target material with a round-headed drill while avoiding damage to the underlying surface. In the experiment, varying combinations of surface force feedback, vibration feedback, and auditory feedback are used. We describe the design of the user study and present the results, which show that auditory feedback can compensate for the lack of haptic feedback.

Item: An Empiric Evaluation of Confirmation Methods for Optical See-Through Head-Mounted Display Calibration (The Eurographics Association, 2012)
Authors: Maier, Patrick; Dey, Arindam; Waechter, Christian A. L.; Sandor, Christian; Tönnis, Marcus; Klinker, Gudrun
Editors: Ronan Boulic, Carolina Cruz-Neira, Kiyoshi Kiyokawa, David Roberts
Abstract: The calibration of optical see-through head-mounted displays (OSTHMDs) is an important foundation for correct object alignment in augmented reality. Any calibration process for OSTHMDs requires users to align 2D points in screen space with 3D points and to confirm each alignment. In this paper, we investigate how different confirmation methods affect calibration quality. In an empirical evaluation, we compared four confirmation methods: Keyboard, Hand-held, Voice, and Waiting. We let users calibrate with a video see-through head-mounted display, which allowed us to record videos of the alignments in parallel. Later image processing provided baseline alignments for comparison against the user-generated ones. Our results provide design constraints for future calibration procedures. The Waiting method, designed to reduce head motion during confirmation, showed significantly higher accuracy than all other methods. Averaging alignments over a time frame further improved the accuracy of all methods.
We validated our results by numerically comparing the user-generated projection matrices with calculated ground-truth projection matrices. The findings were also confirmed by several calibration procedures performed with an OSTHMD.

Item: Indoor Tracking for Large Area Industrial Mixed Reality (The Eurographics Association, 2012)
Authors: Scheer, Fabian; Müller, Stefan
Editors: Ronan Boulic, Carolina Cruz-Neira, Kiyoshi Kiyokawa, David Roberts
Abstract: For mixed reality (MR) applications, tracking a video camera in a rapidly changing large environment of several hundred square meters still represents a challenging task. In contrast to an installation in a laboratory, industrial scenarios such as a running factory require minimal setup, calibration, and training times for a tracking system, and only minimal changes to the environment. This paper presents a tracking system that computes the pose of a video camera mounted on a mobile carriage-like device in very large indoor environments consisting of several hundred square meters. The carriage is equipped with a touch-sensitive monitor to display a live augmentation. The tracking system is based on an infrared laser device that detects at least three of a few retroreflective targets in the environment and compares the current target measurements with a precalibrated 2D target map. The device outputs a 2D position and orientation. To obtain a six-degree-of-freedom (DOF) pose, a coordinate system adjustment method is presented that determines the transformation between the 2D laser tracker and the image sensor of a camera. To analyze the different error sources contributing to the overall error, the accuracy of the system is evaluated in a controlled laboratory setup. Beyond that, an evaluation of the system in a large factory building is shown, as well as the application of the system for industrial MR discrepancy checks of complete factory buildings.
Finally, the utility of the 2D scanning capabilities of the laser in conjunction with a virtually generated 2D map of the 3D model of a factory is demonstrated for MR discrepancy checks.

Item: Modifying an Identified Size of Objects Handled with Two Fingers Using Pseudo-Haptic Effects (The Eurographics Association, 2012)
Authors: Ban, Yuki; Narumi, Takuji; Tanikawa, Tomohiro; Hirose, Michitaka
Editors: Ronan Boulic, Carolina Cruz-Neira, Kiyoshi Kiyokawa, David Roberts
Abstract: In our research, we aim to construct a visuo-haptic system that employs pseudo-haptic effects to provide users with the sensation of touching virtual objects of varying shapes. Thus far, we have shown that it is possible to modify the identified shape of a curved surface or the angle of edges by displacing the visual representation of the user's hand. However, this method is limited in that it cannot accommodate touching with two or more fingers through visual displacement of the hand alone. To solve this problem, we need not only to displace the visual representation of the user's hand but also to deform it. Hence, in this paper, we focus on modifying the identified size of objects handled with two fingers. This was achieved by deforming the visual representation of the user's hand in order to construct a novel visuo-haptic system. We devised a video see-through system that enables us to change the perceived shape of an object that a user is visually touching. The visual representation of the user's hand is deformed as if the user were handling the visual object, when in actuality the user is handling an object of another size. Using this system, we performed an experiment to investigate the effects of visuo-haptic interaction and evaluated its effectiveness. The results showed that the perceived size of objects handled with the thumb and other finger(s) could be modified if the difference between the sizes of the physical and visual stimuli was in the range of -40% to 35%.
This indicates that our method can be applied to the visuo-haptic shape display system that we proposed.

Item: Redirected Steering for Virtual Self-Motion Control with a Motorized Electric Wheelchair (The Eurographics Association, 2012)
Authors: Fiore, Loren Puchalla; Phillips, Lane; Bruder, Gerd; Interrante, Victoria; Steinicke, Frank
Editors: Ronan Boulic, Carolina Cruz-Neira, Kiyoshi Kiyokawa, David Roberts
Abstract: Redirection techniques have shown great potential for enabling users to travel in large-scale virtual environments while their physical movements are limited to a much smaller laboratory space. Traditional redirection approaches introduce a subliminal discrepancy between the real and virtual motions of the user through subtle manipulations, which are thus highly dependent on the user and on the virtual scene. In the worst case, such approaches may produce failure cases that have to be resolved by obvious interventions, e.g., when a user faces a physical obstacle and tries to move forward. In this paper, we introduce a remote steering method for redirection techniques used for physical transportation in an immersive virtual environment. We present a redirection controller for turning a legacy wheelchair device into a remote-control vehicle. In a psychophysical experiment, we analyze automatic angular motion redirection with our proposed controller with respect to the detectability of discrepancies between real and virtual motions. Finally, we discuss this redirection method and its novel affordances for virtual traveling.
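The last abstract above describes angular motion redirection: scaling the user's real rotation by a gain so that the real and virtual trajectories diverge without the discrepancy being detectable. A minimal sketch of that idea is shown below; it is not the authors' controller, and the function names and the clamp band (0.8-1.25) are illustrative assumptions, not thresholds reported in the paper.

```python
def clamp_gain(gain: float, lo: float = 0.8, hi: float = 1.25) -> float:
    """Keep the rotation gain inside an assumed subliminal band so the
    real/virtual discrepancy stays below the user's detection threshold."""
    return max(lo, min(hi, gain))


def redirect_rotation(real_delta_deg: float, gain: float) -> float:
    """Map a real rotation to a virtual rotation.

    gain > 1 amplifies the user's real turn in the virtual scene;
    gain < 1 dampens it, steering the physical path away from obstacles.
    """
    return gain * real_delta_deg


# Example: the controller requests a 30% amplification, which is clamped
# to the assumed subliminal band before being applied to a 10-degree turn.
g = clamp_gain(1.30)                        # clamped to 1.25
virtual_turn = redirect_rotation(10.0, g)   # 12.5 virtual degrees
```

The same structure applies per frame: the controller picks a gain from the current real and desired virtual headings, clamps it, and applies it to each incremental rotation reported by the tracker.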