Published June 1, 2023 | Version v1
Conference paper | Open Access

A Human-Agents Music Performance System in an Extended Reality Environment

Description

This paper proposes a human-machine interactive music system for live performance based on autonomous agents and implemented in an immersive extended reality environment. The interaction between humans and agents is grounded in concepts from Swarm Intelligence and Multi-Agent Systems, which are reflected in a technological platform built around a 3D physical-virtual solution. This approach engages the visual, auditory, haptic, and proprioceptive modalities, making it necessary to integrate technologies capable of providing such a multimodal environment. The prototype of the proposed system is implemented by combining Motion Capture, Spatial Audio, and Mixed Reality technologies. The system is evaluated through objective measurements and tested with users in music improvisation sessions. The results demonstrate that the system supports the intended multimodal interaction with musical agents, and they validate the novel design and integration of the required technologies presented in this paper.
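
The record does not include source code, so the following is only a minimal Python sketch of the kind of swarm-intelligence agent model the abstract describes: boids-style rules (cohesion, separation, alignment) plus attraction toward a performer's motion-captured position, with each agent's state mapped to sound parameters for a spatial-audio renderer. Every name and constant here (Agent, step, sonify, the rule weights) is a hypothetical placeholder, not taken from the paper.

```python
import random
from dataclasses import dataclass

# Illustrative constants; the paper's actual agent model and weights are not published here.
NUM_AGENTS = 8
COHESION, SEPARATION, ALIGNMENT, ATTRACTION = 0.01, 0.05, 0.05, 0.02
SEP_RADIUS = 0.5
MAX_SPEED = 0.2

@dataclass
class Agent:
    pos: list  # [x, y, z] position in the shared physical-virtual space
    vel: list  # [x, y, z] velocity

def step(agents, performer_pos):
    """One simulation step: classic boids rules plus attraction
    toward the performer's motion-captured position."""
    n = len(agents)
    centroid = [sum(a.pos[i] for a in agents) / n for i in range(3)]
    mean_vel = [sum(a.vel[i] for a in agents) / n for i in range(3)]
    for a in agents:
        for i in range(3):
            a.vel[i] += COHESION * (centroid[i] - a.pos[i])        # steer toward flock centroid
            a.vel[i] += ALIGNMENT * (mean_vel[i] - a.vel[i])       # match the flock's heading
            a.vel[i] += ATTRACTION * (performer_pos[i] - a.pos[i]) # follow the performer
        for other in agents:                                       # short-range separation
            d = [a.pos[i] - other.pos[i] for i in range(3)]
            dist = sum(x * x for x in d) ** 0.5
            if other is not a and 0 < dist < SEP_RADIUS:
                for i in range(3):
                    a.vel[i] += SEPARATION * d[i] / dist
        speed = sum(v * v for v in a.vel) ** 0.5                   # clamp speed
        if speed > MAX_SPEED:
            a.vel = [v * MAX_SPEED / speed for v in a.vel]
        a.pos = [p + v for p, v in zip(a.pos, a.vel)]

def sonify(agent):
    """Map an agent's 3D state to sound parameters:
    height -> pitch, lateral position -> pan, speed -> amplitude."""
    pitch = 48 + 24 * max(0.0, min(1.0, agent.pos[1]))  # MIDI note 48..72
    pan = max(-1.0, min(1.0, agent.pos[0]))             # azimuth/stereo pan
    amp = min(1.0, sum(v * v for v in agent.vel) ** 0.5 / MAX_SPEED)
    return pitch, pan, amp

agents = [Agent([random.uniform(-1, 1) for _ in range(3)],
                [random.uniform(-0.1, 0.1) for _ in range(3)])
          for _ in range(NUM_AGENTS)]
for _ in range(100):  # stand-in for the MoCap-driven update loop
    step(agents, performer_pos=[0.0, 1.0, 0.0])
print([sonify(a) for a in agents[:3]])
```

In a live system of the kind described, `performer_pos` would come from the Motion Capture stream each frame, and the `sonify` output would feed a Spatial Audio engine rendered inside the Mixed Reality scene; here the rule weights and mappings are arbitrary stand-ins.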

Files

nime2023_2.pdf (1.5 MB)
md5:3eb51eef77ce0d74d5a261dd61bf332a