Neural3Points: Learning to Generate Physically Realistic Full-body Motion for Virtual Reality Users

Date
2022
Publisher
The Eurographics Association and John Wiley & Sons Ltd.
Abstract
Animating an avatar that reflects a user's actions in the VR world enables natural interaction with the virtual environment, and it has the potential to let remote users communicate and collaborate as if they had met in person. However, a typical VR system provides only a very sparse set of up to three positional trackers: a head-mounted display (HMD) and, optionally, two hand-held controllers. This sparsity makes estimating the user's full-body movement a difficult problem. In this work, we present a data-driven, physics-based method that predicts realistic full-body movement from the transformations of these VR trackers and simulates an avatar character that mimics the user's actions in the virtual world in real time. We train our system using reinforcement learning with carefully designed pretraining processes to ensure the success of the training and the quality of the simulation. We demonstrate the effectiveness of the method with an extensive set of examples.
CCS Concepts: Computing methodologies → Physical simulation; Virtual reality; Motion capture; Theory of computation → Reinforcement learning

        
@article{10.1111:cgf.14634,
  journal   = {Computer Graphics Forum},
  title     = {{Neural3Points: Learning to Generate Physically Realistic Full-body Motion for Virtual Reality Users}},
  author    = {Ye, Yongjing and Liu, Libin and Hu, Lei and Xia, Shihong},
  year      = {2022},
  publisher = {The Eurographics Association and John Wiley & Sons Ltd.},
  ISSN      = {1467-8659},
  DOI       = {10.1111/cgf.14634}
}