Interval-Based Motion Blending for Hand Grasping
dc.contributor.author | Brisbin, Matt | en_US |
dc.contributor.author | Benes, Bedrich | en_US |
dc.contributor.editor | Ik Soo Lim and David Duce | en_US |
dc.date.accessioned | 2014-01-31T19:58:15Z | |
dc.date.available | 2014-01-31T19:58:15Z | |
dc.date.issued | 2007 | en_US |
dc.description.abstract | For motion to appear realistic and believable, proper motion blending methods must be used with respect to the goal or task at hand. We present a method that extends the theory of move trees [MBC01] by tagging (attaching) information to each clip within a database at intervals and finding the shortest distance per tag while pruning the tree using convergence priority. Our goal is to retain the physical characteristics of motion capture data while using non-destructive blending in a goal-based scenario. Given the intrinsically high dimensionality of the human hand, our method is also concerned with intelligent pruning of the move tree. By constructing a move tree for hand-grasping scenarios that is sampled per interval within clips and adheres to a convergence priority, we plan to develop a method that will autonomously conform a hand to the object being grasped. | en_US |
dc.description.seriesinformation | Theory and Practice of Computer Graphics | en_US |
dc.identifier.isbn | 978-3-905673-63-0 | en_US |
dc.identifier.uri | https://doi.org/10.2312/LocalChapterEvents/TPCG/TPCG07/201-205 | en_US |
dc.publisher | The Eurographics Association | en_US |
dc.subject | Categories and Subject Descriptors (according to ACM CCS): I.3.3 [Computer Graphics]: Motion Blending | en_US |
dc.title | Interval-Based Motion Blending for Hand Grasping | en_US |