ICAT-EGVE2020 - Posters and Demos
Browsing ICAT-EGVE2020 - Posters and Demos by Subject "Human centered computing"
Now showing 1 - 13 of 13
Item: AR Avatar Separated from Lecturer for Individual Communication in One-to-many Communication (The Eurographics Association, 2020)
Authors: Kitagishi, Yuki; Tanaka, Yuki; Yonezawa, Tomoko
Editors: Kulik, Alexander; Sra, Misha; Kim, Kangsoo; Seo, Byung-Kuk
In this paper, we propose an augmented reality agent, separated from the lecturer as his/her avatar, to promote individual communication between the lecturer and the audience in parallel with one-to-many communication such as lectures. The proposed agent is expressed by 1-1) a shadow and 1-2) its trajectory, both drawn on the floor by projection mapping, as well as 2) localized footsteps produced with a parametric speaker. The agent approaches a particular audience member who seems to need to talk with the lecturer individually. To emphasize that the agent is the lecturer's avatar, the proposed system draws the trajectory from the lecturer's feet to the agent's current position and shakes the shadow's tail like a will-o'-the-wisp. During the lecture, the agent approaches an audience member who wants individual communication with the lecturer and then talks to that member on the lecturer's behalf. We expect that the proposed agent will support individual communication between the lecturer and the audience without hesitation on the audience's part, and will enable multiple parallel conversations with few additional resources required of the lecturer.

Item: Dark/Light Mode Adaptation for Graphical User Interfaces on Near-Eye Displays (The Eurographics Association, 2020)
Authors: Erickson, Austin; Kim, Kangsoo; Bruder, Gerd; Welch, Gregory F.
In the fields of augmented reality (AR) and virtual reality (VR), many applications involve user interfaces (UIs) that display various types of information to users.
Such UIs are an important component influencing user experience and human factors in AR/VR, because users directly face and interact with them to absorb the visualized information and manipulate the content. While consumer interest in different forms of near-eye displays, such as AR/VR head-mounted displays (HMDs), is increasing, research on design standards and human factors for AR/VR UIs is becoming increasingly timely and important. Although UI configurations such as dark mode and light mode have grown in popularity on other display types over the last several years, they have yet to make their way into AR/VR devices as built-in features. This demo showcases several use cases of dark mode and light mode UIs on AR/VR HMDs, and provides general guidelines for when each should be used to provide perceptual benefits to the user.

Item: Effect of Motion and Hand Shape of a Massage Robot on Social Impression: Exploratory Study in a Virtual Environment (The Eurographics Association, 2020)
Authors: Yamamoto, Kyosuke; Kato, Yuki; Tasaki, Ryosuke; Akiduki, Takuma; Mashimo, Tomoaki; Honna, Atsuo; Kitazaki, Michiteru
We aimed to investigate the effects of motion pattern and hand shape on social impressions of a massage robot. The experiment was performed in a virtual environment as an exploratory study. Participants observed a massage robot touching their body and answered the Robot Social Attributes Scale. There were two motion patterns (discontinuous and smooth) and three types of end effectors (ball, robot hand, and human hand). We found that the massage robot was perceived as more competent, warmer, and more comfortable when it moved smoothly than when it moved discontinuously, and the impression of warmth was higher for the human hand than for the ball end effector.
These results suggest that a massage robot should move smoothly and that a human-hand-like end effector is preferable.

Item: Empathy with Human's and Robot's Embarrassments in Virtual Environments (The Eurographics Association, 2020)
Authors: Sugiura, Maruta; Higashihata, Kento; Sato, Atsushi; Itakura, Shoji; Kitazaki, Michiteru
We feel embarrassed not only when we ourselves are embarrassed but also when we watch others being embarrassed. Humans show empathy for pain not only toward other humans but also toward robots. However, whether humans show empathy for a robot's embarrassment has not been investigated. Thus, we aimed to test whether humans can empathize with a robot's embarrassment in virtual environments. Four situations, each with non-embarrassing and embarrassing stimuli, were presented on an HMD, and participants were asked to rate both their own feeling of embarrassment and the actor's feeling of embarrassment. We found that participants' own feeling of embarrassment was higher for human than for robot actors, and higher in embarrassing than in non-embarrassing conditions. The actor's feeling of embarrassment was rated higher in embarrassing than in non-embarrassing conditions, and this effect was much larger for human than for robot actors. These results suggest that participants empathized with both humans and robots in embarrassing situations, but inferred that the robot felt less embarrassed than a human.

Item: Novel Tap Operation on Capacitive Touch Screen for People with Visual Impairment (The Eurographics Association, 2020)
Authors: Funahashi, Kenji; Maki, Hayato; Iwahori, Yuji
We propose a novel tap method for capacitive touch screens called inverse tap (iTap). Unlike a normal tap, it is performed by first lifting a finger off the screen and then placing it back down.
It enables people with visual impairment to locate a braille marker on a touch screen and operate a button, after memorizing the button and function layout. It also enables drivers to operate the car audio and air conditioning through a touch-screen instrument panel after only a quick glance while driving, when they cannot watch it closely.

Item: Partial Finger Involvement Reflects into Grasping Tasks Performance and Accuracy (The Eurographics Association, 2020)
Authors: Boban, Loën; Delahaye, Mathias; Boulic, Ronan
This paper presents a user study comparing a grasping algorithm we adapted for the Valve Index controller to common grasping approaches using the HTC Vive Wand or the Oculus Quest's computer-vision-based finger tracking. It involved 24 participants performing two widely known manipulation tasks with each device: "Pick and Place" and "Grab and Reorient". Our results place our Index controller approach as a good trade-off between the other two approaches regarding completion time and accuracy.

Item: Pseudo Physical Contact and Communication in VRChat: A Study with Survey Method in Japanese Users (The Eurographics Association, 2020)
Authors: Nagamachi, Kazuya; Kato, Yuki; Sugimoto, Maki; Inami, Masahiko; Kitazaki, Michiteru
VRChat is a popular social virtual reality platform in which pseudo physical contact is used for communication. We conducted a questionnaire survey of VRChat users (N=341) in Japan to gather statistics on users and their avatars, and to investigate the effects of pseudo physical contact on interpersonal attractiveness and communication. Users were 87% male, 8% female, and 4% neutral genders in the real world, while their avatars were 4% male, 87% female, and 9% neutral.
Participants answered that interpersonal attractiveness increased and communication difficulty decreased after pseudo physical contact, suggesting that pseudo physical contact may improve social relationships without actual touch.

Item: Situational Awareness in Human Computer Interaction: Diana's World (The Eurographics Association, 2020)
Authors: Krishnaswamy, Nikhil; Beveridge, Ross; Pustejovsky, James; Patil, Dhruva; McNeely-White, David G.; Wang, Heting; Ortega, Francisco R.
In this paper, we illustrate the role that situated awareness plays in modeling human interactions with Intelligent Virtual Agents (IVAs). We describe Diana, a multimodal IVA who exists within an embodied Human-Computer Interaction (EHCI) environment. Diana is a multimodal dialogue agent enabling communication through language, gesture, action, facial expressions, and gaze tracking, in the context of task-oriented interactions.

Item: Spatialized AR Polyrhythmic Metronome Using Bose Frames Eyewear (The Eurographics Association, 2020)
Authors: Pinkl, James; Cohen, Michael
Polyrhythms, combinations of contrasting but mathematically related rhythms, are challenging to play even for skilled musicians. This project entails the development of an augmented reality metronome application with an auditorily spatialized, polyphonic soundscape. The audio objects within the scene are metronomes set at polyrhythmic tempos. User control is through a pair of Bose Frames eyewear and the touch screen of a phablet hosting the application. Audio output of the scene is spatialized using the Google Resonance SDK.
Options for reverb, rhythmic animation of the sound sources, and visualization of the tempo are also featured.

Item: Tactile Telepresence for Isolated Patients (The Eurographics Association, 2020)
Authors: Mostofa, Nafisa; Avendano, Indira; McMahan, Ryan P.; Welch, Gregory F.
For isolated patients, for example COVID-19 patients in an intensive care unit, conventional televideo tools can provide a degree of visual telepresence, but at best approximate a "through a window" metaphor: visitors such as loved ones cannot touch the patient. We present preliminary work aimed at providing an isolated patient and remote visitors with visual interactions that are augmented by touch: a perception of being touched for the isolated patient, and a perception of touching for the visitors.

Item: Towards Interactive Virtual Dogs as a Pervasive Social Companion in Augmented Reality (The Eurographics Association, 2020)
Authors: Norouzi, Nahal; Kim, Kangsoo; Bruder, Gerd; Welch, Greg
Pets and animal-assisted intervention sessions have been shown to benefit humans' mental, social, and physical health. However, for specific populations, factors such as hygiene restrictions, allergies, and care and resource limitations reduce interaction opportunities. In parallel, understanding the capabilities of technological representations of animals, such as robotic and digital forms, has received considerable attention and has fueled the use of many such representations. Additionally, recent advances in augmented reality technology have made it possible for virtual animals with flexible appearances and behaviors to exist in the real world.
In this demo, we present a companion virtual dog in augmented reality that aims to facilitate a range of interactions with populations such as children and older adults. We discuss the potential benefits and limitations of such a companion and propose future use cases and research directions.

Item: Virtual Reality Training for Proper Recycling Behaviors (The Eurographics Association, 2020)
Authors: Do, Tiffany D.; Yu, Dylan S.; Katz, Alyssa; McMahan, Ryan P.
We present an immersive virtual reality (VR) application designed to train users in recycling behaviors. In this demonstration, users play a recycling game in which they are rewarded for proper recycling procedures. The application provides users with visual and auditory feedback as well as various interaction cues. Prior research shows that VR experiences can influence behavior in the physical world and can be particularly powerful for behavior modification. This application aims to transfer real recycling behaviors to participants and can be used to study the effects of VR on behavior modification.

Item: A Wind Interaction System from the Real to the Virtual World (The Eurographics Association, 2020)
Authors: Liu, Jingyi; Wakita, Wataru
We propose a wind interaction system from the real to the virtual world. Specifically, when a spectator waves a VIVE controller at the VR player, wind is generated according to the direction and acceleration of the wave on both the real and virtual sides. In the flight content of this system, the aircraft tilts under the influence of the wind, and a motion platform reproduces that tilt physically. With this system, spectators waiting for a VR experience, as well as people who avoid wearing an HMD due to VR sickness, can enjoy VR content together with the player by disturbing or supporting them.
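The wind-interaction abstract above does not specify how a controller wave is translated into a wind force or an aircraft tilt, so the following is only a minimal illustrative sketch of one plausible mapping: a speed threshold gates the wave, wind strength grows with the excess speed along the wave direction, and the roll angle is clamped to a motion-platform-safe range. All function names, thresholds, and gain values here are hypothetical, not taken from the paper.

```python
import math

def wind_from_wave(velocity, speed_threshold=2.0, gain=0.5):
    """Map a controller wave velocity (vx, vy, vz in m/s) to a wind vector.

    Waves slower than the threshold produce no wind; faster waves produce
    wind along the wave direction, with strength growing linearly with the
    excess speed. (Hypothetical mapping, for illustration only.)
    """
    vx, vy, vz = velocity
    speed = math.sqrt(vx * vx + vy * vy + vz * vz)
    if speed < speed_threshold:
        return (0.0, 0.0, 0.0)
    # Scale factor combines normalization (1/speed) with linear strength.
    scale = gain * (speed - speed_threshold) / speed
    return (vx * scale, vy * scale, vz * scale)

def aircraft_roll_deg(wind, sensitivity=10.0, max_roll_deg=15.0):
    """Roll the virtual aircraft in proportion to the lateral wind
    component, clamped so a motion platform could reproduce it safely."""
    roll = wind[0] * sensitivity
    return max(-max_roll_deg, min(max_roll_deg, roll))

# A slow wave produces no wind; a fast lateral wave tilts the aircraft.
calm = wind_from_wave((1.0, 0.0, 0.0))   # below threshold: no wind
gust = wind_from_wave((4.0, 0.0, 0.0))   # above threshold: lateral wind
tilt = aircraft_roll_deg(gust)
```

The clamp in `aircraft_roll_deg` reflects the abstract's note that the motion platform reproduces the tilt physically: hardware limits make an upper bound on the commanded roll a natural design choice.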