Emotion Transfer for 3D Hand Motion using StarGAN

dc.contributor.author: Chan, Jacky C. P. (en_US)
dc.contributor.author: Irimia, Ana-Sabina (en_US)
dc.contributor.author: Ho, Edmond S. L. (en_US)
dc.contributor.editor: Ritsos, Panagiotis D. and Xu, Kai (en_US)
dc.date.accessioned: 2020-09-10T06:27:44Z
dc.date.available: 2020-09-10T06:27:44Z
dc.date.issued: 2020
dc.description.abstract: In this paper, we propose a new data-driven framework for emotion transfer in 3D hand motion. Specifically, we first capture high-quality hand motion using VR gloves. The hand motion data is then annotated with emotion types and converted to images to facilitate the motion synthesis process, and the new dataset will be made available to the public. To the best of our knowledge, this is the first public dataset of emotion-annotated hand motions. We further formulate emotion transfer for 3D hand motion as an image-to-image translation problem, which we solve by adapting the StarGAN framework. Our new framework is able to synthesize new motions given a target emotion type and an unseen input motion. Experimental results show that our framework produces high-quality and consistent hand motions. (en_US)
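The abstract mentions converting hand motion data to images so that an image-to-image translation network such as StarGAN can operate on it. A minimal sketch of one common encoding, assuming the motion is stored as a frames-by-joint-channels array (the paper's exact encoding and channel layout are not specified here and the function name is hypothetical):

```python
import numpy as np

def motion_to_image(motion, out_range=255):
    """Encode a motion sequence (frames x joint channels) as an 8-bit
    grayscale image for an image-to-image translation network.

    Each joint channel is min-max normalized independently so all
    channels share the same pixel range. This is an illustrative
    assumption, not the authors' documented preprocessing."""
    motion = np.asarray(motion, dtype=np.float64)
    lo = motion.min(axis=0, keepdims=True)
    hi = motion.max(axis=0, keepdims=True)
    # Avoid division by zero for constant channels.
    scale = np.where(hi - lo > 0.0, hi - lo, 1.0)
    img = (motion - lo) / scale * out_range
    return img.astype(np.uint8)

# Example: 64 frames of 22 joint-angle channels become a 64x22 image.
demo_motion = np.random.default_rng(0).normal(size=(64, 22))
img = motion_to_image(demo_motion)
```

The inverse mapping (image back to joint angles) would need the stored per-channel minima and maxima, which is why such pipelines typically keep the normalization statistics alongside the image.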
dc.description.sectionheaders: Visualisation and Machine Learning
dc.description.seriesinformation: Computer Graphics and Visual Computing (CGVC)
dc.identifier.doi: 10.2312/cgvc.20201146
dc.identifier.isbn: 978-3-03868-122-9
dc.identifier.pages: 19-26
dc.identifier.uri: https://doi.org/10.2312/cgvc.20201146
dc.identifier.uri: https://diglib.eg.org:443/handle/10.2312/cgvc20201146
dc.publisher: The Eurographics Association (en_US)
dc.title: Emotion Transfer for 3D Hand Motion using StarGAN (en_US)