Browsing by Author "Kim, Kangsoo"
Now showing 1 - 7 of 7
Item: An Automated Virtual Receptionist for Recognizing Visitors and Assuring Mask Wearing (The Eurographics Association, 2020)
Authors: Zehtabian, Sharare; Khodadadeh, Siavash; Kim, Kangsoo; Bruder, Gerd; Welch, Greg; Bölöni, Ladislau; Turgut, Damla
Editors: Kulik, Alexander; Sra, Misha; Kim, Kangsoo; Seo, Byung-Kuk
Intelligent virtual agents have many societal uses, specifically in situations in which the presence of real humans would be prohibitive. In particular, virtual receptionists can perform a variety of tasks associated with visitor and employee safety, e.g., during the COVID-19 pandemic. In this poster, we present our prototype of a virtual receptionist that employs computer vision and meta-learning techniques to identify and interact with a visitor in a manner similar to that of a real human receptionist. Specifically, we employ a meta-learning-based classifier to learn visitors' faces from the minimal data collected during a first visit, such that the receptionist can recognize the same visitor during follow-up visits. The system also makes use of deep neural network-based computer vision techniques to recognize whether the visitor is wearing a face mask.

Item: Blowing in the Wind: Increasing Copresence with a Virtual Human via Airflow Influence in Augmented Reality (The Eurographics Association, 2018)
Authors: Kim, Kangsoo; Bruder, Gerd; Welch, Gregory
Editors: Bruder, Gerd; Yoshimoto, Shunsuke; Cobb, Sue
In a social context where two or more interlocutors interact with each other in the same space, one's sense of copresence with the others is an important factor for the quality of communication and engagement in the interaction. Although augmented reality (AR) technology enables the superposition of virtual humans (VHs) as interlocutors in the real world, the resulting sense of copresence is usually far lower than with a real human interlocutor.
In this paper, we describe a human-subject study in which we explored the effects that subtle multi-modal interaction between the virtual environment and the real world, where a VH and human participants were co-located, can have on copresence. We compared two levels of gradually increased multi-modal interaction: (i) virtual objects being affected by real airflow, as commonly experienced with fans in summer, and (ii) a VH showing awareness of this airflow. We chose airflow as one example of an environmental factor that can noticeably affect both the real and virtual worlds and also cause subtle responses in interlocutors. We hypothesized that our two levels of treatment would gradually increase the sense of being together with the VH, i.e., that participants would report higher copresence with airflow influence than without it, and that copresence would be even higher when the VH showed awareness of the airflow. A statistical analysis of the participant-reported copresence scores showed an improvement in perceived copresence with the VH when both the physical-virtual interactivity via airflow and the VH's awareness behaviors were present together. As the considered environmental factors are directed at the VH, i.e., they are not part of the direct interaction with the real human, they can provide a reasonably generalizable approach to supporting copresence in AR beyond the particular use case in the present experiment.

Item: Dark/Light Mode Adaptation for Graphical User Interfaces on Near-Eye Displays (The Eurographics Association, 2020)
Authors: Erickson, Austin; Kim, Kangsoo; Bruder, Gerd; Welch, Gregory F.
Editors: Kulik, Alexander; Sra, Misha; Kim, Kangsoo; Seo, Byung-Kuk
In the fields of augmented reality (AR) and virtual reality (VR), many applications involve user interfaces (UIs) that display various types of information to users.
Such UIs are an important component influencing user experience and human factors in AR/VR, because users directly face and interact with them to absorb the visualized information and manipulate the content. As consumer interest in different forms of near-eye displays, such as AR/VR head-mounted displays (HMDs), increases, research on design standards and human factors for AR/VR UIs becomes increasingly timely and important. Although UI configurations such as dark mode and light mode have grown in popularity on other display types over the last several years, they have yet to make their way into AR/VR devices as built-in features. This demo showcases several use cases of dark mode and light mode UIs on AR/VR HMDs, and provides general guidelines for when they should be used to provide perceptual benefits to the user.

Item: ICAT-EGVE 2020 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments (The Eurographics Association, 2020)
Authors: Argelaguet, Ferran; McMahan, Ryan; Sugimoto, Maki; Kulik, Alexander; Sra, Misha; Kim, Kangsoo; Seo, Byung-Kuk
Editors: Argelaguet, Ferran; McMahan, Ryan; Sugimoto, Maki

Item: A Review of Visual Perception Research in Optical See-Through Augmented Reality (The Eurographics Association, 2020)
Authors: Erickson, Austin; Kim, Kangsoo; Bruder, Gerd; Welch, Gregory F.
Editors: Argelaguet, Ferran; McMahan, Ryan; Sugimoto, Maki
In the field of augmented reality (AR), many applications involve user interfaces (UIs) that overlay visual information over the user's view of their physical environment, e.g., as text, images, or three-dimensional scene elements.
In this scope, optical see-through head-mounted displays (OST-HMDs) are particularly interesting, as they typically use an additive light model: the perception of the displayed virtual imagery is a composite of the lighting conditions of one's environment, the coloration of the objects that make up the virtual imagery, and the coloration of the physical objects that lie behind them. While a large body of literature has focused on investigating the visual perception of UI elements in immersive and flat-panel displays, comparatively less effort has been spent on OST-HMDs. Due to these unique visual effects of OST-HMDs, we believe it is important to review the field to understand the perceptual challenges, research trends, and future directions. In this paper, we present a systematic survey of literature based on the IEEE and ACM digital libraries that explores users' perception of text-based information displayed on an OST-HMD, and we aim to provide relevant design suggestions based on the meta-analysis results. We carefully review 14 key papers relevant to visual perception research in OST-HMDs with UI elements, and present the current state of the research field, associated trends, noticeable research gaps in the literature, and recommendations for potential future research in this domain.

Item: A Systematic Literature Review of Embodied Augmented Reality Agents in Head-Mounted Display Environments (The Eurographics Association, 2020)
Authors: Norouzi, Nahal; Kim, Kangsoo; Bruder, Gerd; Erickson, Austin; Choudhary, Zubin; Li, Yifan; Welch, Greg
Editors: Argelaguet, Ferran; McMahan, Ryan; Sugimoto, Maki
Embodied agents, i.e., computer-controlled characters, have proven useful for various applications across a multitude of display setups and modalities.
While most traditional work focused on embodied agents presented on a screen or projector, and a growing number of works focus on agents in virtual reality, a comparatively small number of publications have looked at such agents in augmented reality (AR). Such AR agents, specifically when using see-through head-mounted displays (HMDs) as the display medium, show multiple critical differences from other forms of agents, including their appearance, behavior, and physical-virtual interactivity. Due to the unique challenges in this specific field, and due to the comparatively limited attention from the research community so far, we believe it is important to map the field to understand the current trends, challenges, and future research. In this paper, we present a systematic review of the research performed on interactive, embodied AR agents using HMDs. Starting with 1261 broadly related papers, we conducted an in-depth review of 50 directly related papers from 2000 to 2020, focusing on papers that reported on user studies aiming to improve our understanding of interactive agents in AR HMD environments or their utilization in specific applications. We identified common research and application areas of AR agents through a structured iterative process, present research trends and gaps, and share insights on future directions.

Item: Towards Interactive Virtual Dogs as a Pervasive Social Companion in Augmented Reality (The Eurographics Association, 2020)
Authors: Norouzi, Nahal; Kim, Kangsoo; Bruder, Gerd; Welch, Greg
Editors: Kulik, Alexander; Sra, Misha; Kim, Kangsoo; Seo, Byung-Kuk
Pets and animal-assisted intervention sessions have been shown to be beneficial for humans' mental, social, and physical health. However, for specific populations, factors such as hygiene restrictions, allergies, and care and resource limitations reduce interaction opportunities.
In parallel, understanding the capabilities of animals' technological representations, such as robotic and digital forms, has received considerable attention and has fueled the utilization of many of these representations. Additionally, recent advances in augmented reality technology have allowed virtual animals with flexible appearances and behaviors to exist in the real world. In this demo, we present a companion virtual dog in augmented reality that aims to facilitate a range of interactions with populations such as children and older adults. We discuss the potential benefits and limitations of such a companion and propose future use cases and research directions.
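The additive light model mentioned in the visual perception survey above can be illustrated with a minimal sketch. The function names and RGB values below are illustrative assumptions (not from any of the listed papers): an OST-HMD can only add light to the transmitted real-world background, which is why a dark UI panel washes out over a bright scene, whereas a video see-through display can alpha-blend and therefore darken it.

```python
def additive_composite(virtual, background):
    """OST-HMD model (sketch): displayed light adds to the real background light."""
    return [min(v + b, 1.0) for v, b in zip(virtual, background)]

def alpha_composite(virtual, background, alpha):
    """Video see-through model (sketch): the virtual pixel can occlude the background."""
    return [alpha * v + (1 - alpha) * b for v, b in zip(virtual, background)]

# Hypothetical example: a dark-gray virtual UI panel (RGB in [0, 1]) over a bright wall.
virtual = [0.2, 0.2, 0.2]
background = [0.8, 0.8, 0.8]

print(additive_composite(virtual, background))   # saturates to white: the panel washes out
print(alpha_composite(virtual, background, 0.9)) # stays mostly dark: the panel remains legible
```

This asymmetry is one reason dark-mode UI guidance from flat-panel displays does not transfer directly to optical see-through hardware.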