A framework to manage multimodal fusion of events for advanced interactions within Virtual Environments
dc.contributor.author | TOURAINE, Damien | en_US |
dc.contributor.author | BOURDOT, Patrick | en_US |
dc.contributor.author | BELLIK, Yacine | en_US |
dc.contributor.author | BOLOT, Laurence | en_US |
dc.contributor.editor | S. Mueller and W. Stuerzlinger | en_US |
dc.date.accessioned | 2014-01-27T10:15:27Z | |
dc.date.available | 2014-01-27T10:15:27Z | |
dc.date.issued | 2002 | en_US |
dc.description.abstract | This paper describes the EVI3d framework, a distributed architecture developed to enhance interactions within Virtual Environments (VE). The framework manages many multi-sensory devices, such as trackers, data gloves, haptic devices, and speech or gesture recognition systems. The structure of this architecture allows device services and their clients to be distributed across as many machines as required. With the time-stamped events provided by its time synchronization system, it becomes possible to design a specific module to manage multimodal fusion processes. To this end, we describe how the EVI3d framework manages not only low-level events but also abstract modalities. Moreover, the data flow service of the EVI3d framework solves the problem of sharing the virtual scene between modality modules. | en_US |
dc.description.seriesinformation | Eurographics Workshop on Virtual Environments | en_US |
dc.identifier.isbn | 1-58113-535-1 | en_US |
dc.identifier.issn | 1727-530X | en_US |
dc.identifier.uri | https://doi.org/10.2312/EGVE/EGVE02/159-168 | en_US |
dc.publisher | The Eurographics Association | en_US |
dc.title | A framework to manage multimodal fusion of events for advanced interactions within Virtual Environments | en_US |
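
The abstract mentions time-stamped events delivered through a time synchronization system and a module that fuses them across modalities. The minimal C++ sketch below, not taken from the paper, only illustrates the general idea of fusing events from two modalities within a temporal window; all names (DatedEvent, fuse, the "speech"/"tracker" labels, the 500 ms window) are hypothetical and do not reflect the actual EVI3d API.

    // Illustrative sketch (not the EVI3d API): time-stamped events from
    // different modalities are merged when they occur close in time.
    #include <cstdint>
    #include <iostream>
    #include <optional>
    #include <string>
    #include <vector>

    // A hypothetical time-stamped event, analogous to the "dated events"
    // delivered by the framework's time synchronization system.
    struct DatedEvent {
        std::string modality;  // e.g. "speech" or "tracker"
        std::string payload;   // recognized word or pointed object id
        uint64_t timestampUs;  // microseconds on a common clock
    };

    // Pair a speech command with a pointing event that occurred within a
    // small temporal window (default 500 ms).
    std::optional<std::string> fuse(const std::vector<DatedEvent>& events,
                                    uint64_t windowUs = 500000) {
        for (const auto& speech : events) {
            if (speech.modality != "speech") continue;
            for (const auto& point : events) {
                if (point.modality != "tracker") continue;
                uint64_t delta = speech.timestampUs > point.timestampUs
                                     ? speech.timestampUs - point.timestampUs
                                     : point.timestampUs - speech.timestampUs;
                if (delta <= windowUs)
                    return speech.payload + " -> " + point.payload;
            }
        }
        return std::nullopt;
    }

    int main() {
        std::vector<DatedEvent> events = {
            {"speech", "select", 1000000},
            {"tracker", "object_42", 1200000},
        };
        if (auto cmd = fuse(events))
            std::cout << "Fused command: " << *cmd << "\n";  // select -> object_42
    }

In a distributed setup like the one the abstract describes, the common clock behind timestampUs would be supplied by the framework's time synchronization service, so events produced on different machines remain comparable.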