Presenting Archaeoacoustics Results Using Multimedia and VR Technologies

Research output: Contribution to journal › Article › peer-review

Abstract

Music and sound cannot be experienced through writing and numbers. Writing freezes time onto paper; as a time-based medium, sound cannot be heard without temporal motion, and acoustic metrics are silent data. A complete experience of sound needs to engage our bodies. Digital multimedia technologies offer powerful approaches to understanding the acoustics of the past, and this work explores a number of those affordances. In particular, it examines the use of apps that illustrate archaeoacoustic effects, set digitally within visual and acoustic archaeological cultures. Ways of immersing audiences through projection, acoustic simulation, field and studio recordings, and musical performance are discussed. The use of virtual reality (VR) headsets is explored as a means of creating a sense of deep flow and presence among audiences, total immersion in an experiential, phenomenological understanding of interacting audio and visual fields, as well as setting such results within an appropriate context. This study examines how acoustics results at caves in Northern Spain, in various phases of Stonehenge, and at Paphos Theatre (all World Heritage Sites) can be explored using VR and multimedia technologies, evaluating the comparative advantages of different technologies. It proposes that such integration of visual and sonic modelling using interactive digital technologies is effective as a nonrepresentational theory approach to complement empirical studies, allowing understanding that goes beyond numerical analysis and binary dialectics to engage directly with the material of archaeological sites in an embodied manner, and to address the real-world complexities of acoustic ecologies and their contexts.

Original language: English
Article number: 20220340
Number of pages: 25
Journal: Open Archaeology
Volume: 9
Issue number: 1
Early online date: 22 Dec 2023
DOIs
Publication status: Published - 22 Dec 2023
