CLIPE 2024

Mocap and Authoring Virtual Humans
A CRITS Foray Into Cultural Heritage: Background Characters For The SHELeadersVR Project
Jean-Benoit Culié, Bojan Mijatovic, David Panzoli, Davud Nesimovic, Stéphane Sanchez, and Selma Rizvic
Overcoming Challenges of Cycling Motion Capturing and Building a Comprehensive Dataset
Panayiotis Kyriakou, Marios Kyriakou, and Yiorgos Chrysanthou
Capture and Automatic Production of Digital Humans in Real Motion with a Temporal 3D Scanner
Eduardo Parrilla, Alfredo Ballester, Jordi Uriel, Ana V. Ruescas-Nicolau, and Sandra Alemany
LexiCrowd: A Learning Paradigm towards Text to Behaviour Parameters for Crowds
Marilena Lemonari, Nefeli Andreou, Nuria Pelechano, Panayiotis Charalambous, and Yiorgos Chrysanthou
Embodied Augmented Reality for Lower Limb Rehabilitation
Froso Sarri, Panagiotis Kasnesis, Spyridon Symeonidis, Ioannis Th. Paraskevopoulos, Sotiris Diplaris, Federico Posteraro, George Georgoudis, and Katerina Mania
Interacting with a Virtual Cyclist in Mixed Reality Affects Pedestrian Walking
Vinu Kamalasanan, Melanie Krüger, and Monika Sester

BibTeX (CLIPE 2024)
@inproceedings{10.2312:cl.20242004,
  booktitle = {CLIPE 2024 - Creating Lively Interactive Populated Environments},
  editor    = {Pelechano, Nuria and Pettré, Julien},
  title     = {{CLIPE 2024: Frontmatter}},
  author    = {Pelechano, Nuria and Pettré, Julien},
  year      = {2024},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-03868-241-7},
  DOI       = {10.2312/cl.20242004}
}
@inproceedings{10.2312:cl.20241046,
  booktitle = {CLIPE 2024 - Creating Lively Interactive Populated Environments},
  editor    = {Pelechano, Nuria and Pettré, Julien},
  title     = {{A CRITS Foray Into Cultural Heritage: Background Characters For The SHELeadersVR Project}},
  author    = {Culié, Jean-Benoit and Mijatovic, Bojan and Panzoli, David and Nesimovic, Davud and Sanchez, Stéphane and Rizvic, Selma},
  year      = {2024},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-03868-241-7},
  DOI       = {10.2312/cl.20241046}
}
@inproceedings{10.2312:cl.20241047,
  booktitle = {CLIPE 2024 - Creating Lively Interactive Populated Environments},
  editor    = {Pelechano, Nuria and Pettré, Julien},
  title     = {{Overcoming Challenges of Cycling Motion Capturing and Building a Comprehensive Dataset}},
  author    = {Kyriakou, Panayiotis and Kyriakou, Marios and Chrysanthou, Yiorgos},
  year      = {2024},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-03868-241-7},
  DOI       = {10.2312/cl.20241047}
}
@inproceedings{10.2312:cl.20241048,
  booktitle = {CLIPE 2024 - Creating Lively Interactive Populated Environments},
  editor    = {Pelechano, Nuria and Pettré, Julien},
  title     = {{Capture and Automatic Production of Digital Humans in Real Motion with a Temporal 3D Scanner}},
  author    = {Parrilla, Eduardo and Ballester, Alfredo and Uriel, Jordi and Ruescas-Nicolau, Ana V. and Alemany, Sandra},
  year      = {2024},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-03868-241-7},
  DOI       = {10.2312/cl.20241048}
}
@inproceedings{10.2312:cl.20241049,
  booktitle = {CLIPE 2024 - Creating Lively Interactive Populated Environments},
  editor    = {Pelechano, Nuria and Pettré, Julien},
  title     = {{LexiCrowd: A Learning Paradigm towards Text to Behaviour Parameters for Crowds}},
  author    = {Lemonari, Marilena and Andreou, Nefeli and Pelechano, Nuria and Charalambous, Panayiotis and Chrysanthou, Yiorgos},
  year      = {2024},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-03868-241-7},
  DOI       = {10.2312/cl.20241049}
}
@inproceedings{10.2312:cl.20241050,
  booktitle = {CLIPE 2024 - Creating Lively Interactive Populated Environments},
  editor    = {Pelechano, Nuria and Pettré, Julien},
  title     = {{Embodied Augmented Reality for Lower Limb Rehabilitation}},
  author    = {Sarri, Froso and Kasnesis, Panagiotis and Symeonidis, Spyridon and Paraskevopoulos, Ioannis Th. and Diplaris, Sotiris and Posteraro, Federico and Georgoudis, George and Mania, Katerina},
  year      = {2024},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-03868-241-7},
  DOI       = {10.2312/cl.20241050}
}
@inproceedings{10.2312:cl.20241051,
  booktitle = {CLIPE 2024 - Creating Lively Interactive Populated Environments},
  editor    = {Pelechano, Nuria and Pettré, Julien},
  title     = {{Interacting with a Virtual Cyclist in Mixed Reality Affects Pedestrian Walking}},
  author    = {Kamalasanan, Vinu and Krüger, Melanie and Sester, Monika},
  year      = {2024},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-03868-241-7},
  DOI       = {10.2312/cl.20241051}
}

  • CLIPE 2024: Frontmatter
    (The Eurographics Association, 2024) Pelechano, Nuria; Pettré, Julien
  • A CRITS Foray Into Cultural Heritage: Background Characters For The SHELeadersVR Project
    (The Eurographics Association, 2024) Culié, Jean-Benoit; Mijatovic, Bojan; Panzoli, David; Nesimovic, Davud; Sanchez, Stéphane; Rizvic, Selma
    This article presents CRITS, a software framework designed to enhance virtual environments, particularly in the context of cultural heritage and immersive learning simulations. CRITS enables the easy integration of autonomous, human-like characters into virtual settings, enriching the user's experience by simulating the dynamic activities and social presence of background characters. The framework is showcased through its application in the SHELeaders VR project, which aims to recreate historical settings and narratives centered around medieval female leaders in the Balkans. The article discusses the technical implementation of CRITS, its benefits for creating lively and populated environments, and reflects on potential improvements and future research directions.
  • Overcoming Challenges of Cycling Motion Capturing and Building a Comprehensive Dataset
    (The Eurographics Association, 2024) Kyriakou, Panayiotis; Kyriakou, Marios; Chrysanthou, Yiorgos
    This article describes a methodology for capturing cyclist motion using motion capture (mocap) hardware. It also details the creation of a comprehensive dataset that will be publicly available. The methodology involves a modular system and an innovative marker placement scheme. The resulting dataset is used to create 3D visualizations and diverse data representations, shared in an online library for public access and collaborative research.
  • Capture and Automatic Production of Digital Humans in Real Motion with a Temporal 3D Scanner
    (The Eurographics Association, 2024) Parrilla, Eduardo; Ballester, Alfredo; Uriel, Jordi; Ruescas-Nicolau, Ana V.; Alemany, Sandra
    The demand for virtual human characters in Extended Realities (XR) is growing across industries from entertainment to healthcare. Achieving natural behaviour in virtual environments requires digitizing real-world actions, a task typically laborious and requiring specialized expertise. This paper presents an advanced approach for digitizing humans in motion, streamlining the process from capture to virtual character creation. By integrating the proposed hardware, algorithms, and data models, this approach automates the creation of high-resolution assets, reducing manual intervention and software dependencies. The resulting sequences of rigged and textured meshes ensure lifelike virtual characters with detailed facial expressions and hand gestures, surpassing the capabilities of static 3D scans animated via separate motion captures. Robust pose-dependent shape corrections and temporal consistency algorithms guarantee smooth, artifact-free body surfaces in motion, while the export capability in standard formats enhances interoperability and further character development possibilities. Additionally, this method facilitates the efficient creation of large datasets for learning human models, thus representing a significant advancement in XR technologies and digital content creation across industries.
  • LexiCrowd: A Learning Paradigm towards Text to Behaviour Parameters for Crowds
    (The Eurographics Association, 2024) Lemonari, Marilena; Andreou, Nefeli; Pelechano, Nuria; Charalambous, Panayiotis; Chrysanthou, Yiorgos
    Creating believable virtual crowds, controllable by high-level prompts, is essential for creators to trade off authoring freedom against simulation quality. The flexibility and familiarity of natural language in particular motivates the use of text to guide the generation process. Capturing the essence of textually described crowd movements in the form of meaningful and usable parameters is challenging due to the lack of paired ground-truth data and the inherent ambiguity between the two modalities. In this work, we leverage a pre-trained Large Language Model (LLM) to create pseudo-pairs of text and behaviour labels. We train a variational auto-encoder (VAE) on the synthetic dataset, constraining the latent space into interpretable behaviour parameters by incorporating a latent label loss. To showcase our model's capabilities, we deploy a survey where humans provide textual descriptions of real crowd datasets. We demonstrate that our model is able to parameterise unseen sentences and produce novel behaviours, capturing the essence of the given sentence; our behaviour space is compatible with simulator parameters, enabling the generation of plausible crowds (text-to-crowds). We also conduct feasibility experiments exhibiting the potential of the output text embeddings for full-sentence generation from a behaviour profile.
  • Embodied Augmented Reality for Lower Limb Rehabilitation
    (The Eurographics Association, 2024) Sarri, Froso; Kasnesis, Panagiotis; Symeonidis, Spyridon; Paraskevopoulos, Ioannis Th.; Diplaris, Sotiris; Posteraro, Federico; Georgoudis, George; Mania, Katerina
    Immersive platforms have emerged as valuable tools in rehabilitation, with the potential to enhance patient engagement and recovery outcomes. Addressing the limitations of traditional Virtual Reality (VR) setups that restrict physical movement, this paper presents the system architecture of a novel, head-worn Augmented Reality (AR) system for lower limb rehabilitation. The rehabilitation experience is enhanced by embodying avatars that replicate patients' movements. The system integrates varied avatar perspectives, such as mirror and follow modes, based on an avatar-centered interface. The proposed system architecture supports seated and standing exercises, expanding the scope of rehabilitation beyond gait alone. Computer-vision-based 3D pose estimation captures patients' movement and maps it onto the avatar in real time, accurately estimating the coordinates of 3D body landmarks. Wearable sensors evaluate patients' movements by utilizing deep learning to discern movement patterns. Feedback is provided to patients via visual cues indicating which limb areas to adjust, improving exercise execution. Embodiment has the potential to improve exercise understanding and to assist patients' rehabilitation recovery.
  • Interacting with a Virtual Cyclist in Mixed Reality Affects Pedestrian Walking
    (The Eurographics Association, 2024) Kamalasanan, Vinu; Krüger, Melanie; Sester, Monika
    When walking in shared traffic spaces, the nearby presence and movement of other pedestrians and cyclists can prompt individuals to make speed and path adjustments to avoid potential collisions. Studying such collision avoidance strategies in virtual settings allows the environmental complexity present in real situations to be scaled in a controlled way, while ensuring pedestrian safety. Our pilot study makes an early effort towards understanding the influence of cyclist movements on human walking using mixed reality (MR). To this end, the collision avoidance behavior of pedestrians crossing the path of a moving virtual cyclist avatar was examined by analyzing the temporal and spatial characteristics of the participants' walking trajectories using speed profiles and the Post Encroachment Time (PET) metric. The early results from our pilot study demonstrate that mixed reality cyclist experiments can be used to study pedestrian-cyclist interactions. Furthermore, across the interactions observed in the study, a significant proportion of participants decided to cross ahead of the virtual cyclist, while others preferred to give the right of way. We also discuss our current findings, insights, and the implications of studying pedestrian behaviours using virtual cyclists.