
Janina Fels
Interactive Virtual Reality for Hearing Research:
Opportunities, Limitations, and New Insights
CV
Janina Fels is full professor and director of the Chair and Institute for Hearing Technology and Acoustics at RWTH Aachen University. She has been on the RWTH faculty since 2012, previously as professor of medical acoustics. She studied electrical engineering and received her diploma (2002) and Ph.D. (2008) from RWTH Aachen. After a postdoctoral stay at the Center for Applied Hearing Research (DTU) and Widex in Denmark, she was visiting scientist at the Institute of Neuroscience and Medicine (INM-1), Forschungszentrum Jülich (2012–2015).
Her research focuses on perception and communication in complex acoustic environments across different listener groups, including classroom and open-plan office acoustics. She develops methods and technical systems for realistic listening experiments in virtual acoustic scenes and is spokesperson of the DFG priority program SPP 2236 “AUDICTIVE”.
Fels has received several distinctions, including the Lothar Cremer Prize of the German Acoustical Society (2013), membership in the Young Academy of the North Rhine-Westphalian Academy of Sciences and Arts (2014), the FAMOS für Familie Award from RWTH Aachen (2017), the Journal of the Audio Engineering Society Best Paper Award (2019), and the Brigitte Gilles Award (2021).
She served as general co-chair of DAGA 2016 and vice chair of the 2019 International Congress on Acoustics in Aachen. She is a member of the DFG Review Board for Mechanics and Mechanical Engineering (Acoustics), currently serving as deputy spokesperson. In 2025 she was elected vice president (2025–2028) and president-elect (2028–2031) of the German Acoustical Society (DEGA).
Understanding everyday hearing requires approaches that go beyond traditional laboratory experiments with isolated sounds. This plenary introduces interactive audiovisual virtual reality (VR) as a methodology to “bring real life into the lab” and to study perception, attention, memory, and communication in complex, yet experimentally controlled environments. VR can recreate realistic acoustic scenes while allowing for systematic manipulation. However, it also has limitations, such as reduced social and situational context, which must be carefully considered.
This talk will highlight the general principles that emerge when auditory and visual information are combined in such environments. Visual cues have been found not to automatically improve listening; their benefit appears to depend on task demands, the difficulty of the acoustic scene, and the need to integrate information across modalities. Overall, VR will be discussed as a tool for investigating these often nonintuitive interactions between seeing and hearing and for linking controlled experimentation with ecologically meaningful listening situations.
These developments point toward a future in which hearing is studied within interdisciplinary frameworks that embed auditory perception and processing in rich, audiovisual VR environments. These approaches draw on expertise from acoustics, psychology, and computer science to achieve a more comprehensive understanding of human multimodal perception.