AUDICTIVE
Deadline:
31.08.2023
Funding body:
Deutsche Forschungsgemeinschaft (DFG)
In March 2019, the Senate of the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) established the Priority Programme “Auditory Cognition in Interactive Virtual Environments – AUDICTIVE” (SPP 2236). The programme is designed to run for six years. After a successful first phase that started in January 2021, the present call invites proposals for the second three-year funding period.
Considerable progress has been made over the past years in the understanding of auditory cognitive processes and capabilities – from perception, attention and memory to complex performances such as scene analysis and communication. The first phase of “AUDICTIVE” focused mostly on porting the well-controlled but often unrealistic stimulus presentations used in auditory cognition research to more comprehensive virtual or mixed reality environments. This reflected recent developments in hardware and software technologies, with audiovisual virtual and mixed reality (VR, MR) reaching a high level of perceptual plausibility. In the first phase, research in “AUDICTIVE” aimed at converting and/or applying interactive VR and/or MR technology to explore auditory cognition in audiovisual scenes that are closer to real life. Using a variety of interactive virtual environment technologies and scenes, controlled research was conducted on how acoustic and visual components and further contextual factors affect perception, cognition and interaction. The focus lay on the applicability and transfer of established paradigms from auditory cognition research with rather simple audiovisual environments, significantly expanding beyond the basic audio-only or vision-only approaches used previously.
The results of the first phase have laid the foundation for the second phase of “AUDICTIVE”, which aims to identify or (further) develop suitable paradigms to use in more realistic scenes with the intention to elicit a close-to-natural perception, experience and/or behaviour.
Hence, in the second phase of “AUDICTIVE”, we seek to further expand the scientific level of knowledge, theories and models that have been developed within basic auditory perception and cognition research to even more realistic daily-life situations. Here, new knowledge, methods and techniques shall be generated (e.g., psychometric and cognitive assessment, QoE evaluation, physiological or behavioural analysis, signal acquisition and analysis, VR/MR technology enhancement) for richer and more complex scenarios, involving interactive VR and/or MR technology. This reflects the fact that virtual reality technology is continuing to mature, opening up new areas of application that include acoustics as a key component, such as immersive social MR (various parties being co-present in one environment); health-promoting measures through research on the effects of complex acoustic environments and/or noise (e.g., classroom, working environment); required training, therapy and/or rehabilitation (e.g., social anxiety, speech training after sudden hearing loss); assistive systems, possibly for user groups with special needs such as being hard of hearing (e.g., telepresence designed for the elderly).
“AUDICTIVE” brings together researchers from different disciplines – acoustics, cognitive psychology and computer science/virtual reality – by encouraging joint research efforts to enhance our understanding and competence in the field of auditory cognition in interactive virtual environments, as a proxy to the real world. Up to now, most relevant research efforts have been confined to individual scientific research communities, using stimuli that often lack the realism of real-life complex scenes. Today, the more realistic and interactive virtual environments that can be created using current state-of-the-art VR and/or MR technology can empower the acoustics (perception and rendering), computer science (virtual reality) and cognitive psychology communities to fully exploit the huge potential of VR and MR for testing and extending their existing theories. At the same time, VR and MR research can benefit from knowledge of auditory perception and cognition to understand the important quality criteria that need to be met in order to optimize virtual or mixed reality environments in terms of perception, cognitive performance, subjective experience or (social) presence.
“AUDICTIVE” aims to significantly extend the knowledge of hearing-related cognitive performances in real-life scenes and to enable creating “auditory-cognition-validated” VR technology. “AUDICTIVE” targets fundamental research addressing the three research priorities (a) “auditory cognition”, (b) “interactive audiovisual virtual environments” and (c) “quality evaluation methods”, the latter being located at the interface between (a) and (b).
“AUDICTIVE” targets the three main research priorities:
- Auditory cognition (a): Projects in AUDICTIVE are sought to investigate which auditory cognition research paradigms, methods, models and theories can be established that consider the relevant aspects and features of interactive (virtual) environments (as determined in the first phase) as key components. Here, projects aiming for further advancement of the VR and/or MR-extended, more basic cognitive research paradigms from Phase 1 are sought, as well as novel research paradigms specifically catering for the opportunities brought along with VR and/or MR technology.
- Interactive virtual environments (b): Regarding VR or MR technology, projects in AUDICTIVE will investigate to what extent audiovisual components (identified and/or developed in the first phase) need to be provided and integrated to achieve a new level of vibrancy and fidelity of virtual environments. Further, research in AUDICTIVE projects may address how audiovisual cues in social VR and, in particular, for conversational virtual humans, can be realized with different instances of VR techniques to significantly improve user experience and social presence, with outreach to real-life situations.
- Quality evaluation methods (c): Using cognitive performances to assess auditory and audiovisual cognition based on VR and/or MR scenes can be considered as a cognition-based assessment of the VR and/or MR technology employed. In this second phase, projects are sought to investigate evaluation methods more systematically, answering questions such as how instrumental models may be developed that predict VR and/or MR quality or QoE in terms of auditory cognition performances, or what guidelines need to be met by audiovisual VR and/or MR systems and environments to validly enable specific auditory cognition research.
To achieve the above objectives, projects within “AUDICTIVE” should be organised as follows:
- Each project should preferably be designed as a multidisciplinary project (with a focus on acoustics) involving two principal investigators.
- Each project should preferably address two (ideally all three) research priorities.
- Each partner within a project must define their own research questions/work packages, and no partner should merely provide a service to the other partner.
The coordination project will foster an open science approach in order to continue developing a comprehensive database of results. All projects must include an evaluation either of the quality of VR and/or MR environments for research into auditory cognition, or of the validity of auditory cognition research results obtained in VR and/or MR environments, and all projects must contribute to the central research data management. To this end, the individual projects commit to providing methods, (meta)data and/or tools that the AUDICTIVE coordination project will integrate into a cross-project evaluation-method framework, building on the results obtained in phase 1.
Proposals and CVs must be written in English and submitted to the DFG by 3 July 2023.
Further Information:
https://www.dfg.de/foerderung/info_wissenschaft/ausschreibungen/info_wissenschaft_23_24/index.html