Marlene Huber received the Best Student Paper Award at VISIGRAPP 2025, held in Porto, Portugal, for the paper "Exploring Seated Locomotion Techniques in Virtual Reality for People with Limited Mobility".
Abstract:
Virtual reality (VR) is often designed as a standing experience, excluding individuals with limited mobility. Given that a significant portion of the population experiences lower-body mobility restrictions, accessible VR locomotion must accommodate users without requiring lower-body movement. To build a comprehensive understanding of suitable locomotion techniques (LTs) for this demographic, it is crucial to evaluate the feasibility of various approaches in virtual environments (VEs). As a starting point, we present our evaluation approach and a user study on the feasibility and potential of selected LTs for accessible seated locomotion in VR. Our findings indicate that common LTs can be adapted for seated stationary VR. Teleportation-based techniques, in particular, stand out as viable options for accessible locomotion. Although our simulated wheelchair was less popular with non-disabled participants, it was well-received by wheelchair users and shows promise as an intuitive LT for this target group.
Authors:
Marlene Huber, Simon Kloiber, Annalena Ulschmid, Agata Marta Soccini, Alessandro Clocciatti, Hannes Kaufmann, Katharina Krösl
Annalena Ulschmid also received a Best Student Paper Award, for the paper "Single-Exemplar Lighting Style Transfer via Emissive Texture Synthesis and Optimization".
Abstract:
Lighting is a key component in how scenes are perceived. However, many interior lighting situations are currently either handcrafted by expert designers or simply consist of basic regular arrangements of luminaires, arranged to achieve uniform lighting at a predefined brightness. Our method aims to bring more interesting lighting configurations to various scenes in a semi-automatic manner designed for fast prototyping by non-expert users. Starting from a single photograph of a lighting configuration, we allow users to quickly copy and adapt a lighting style to any 3D scene. Combining image analysis, texture synthesis, and light parameter optimization, we produce a lighting design for the target 3D scene matching the input image. We validate via a user study that our results transfer the desired lighting style more accurately and realistically than state-of-the-art generic style transfer methods. Furthermore, we investigate the behaviour of our method under potential alternative choices in an ablation study.
Authors:
Pierre Ecormier-Nocca, Lukas Lipp, Annalena Ulschmid, David Hahn, Michael Wimmer