Telepresence is an eight-minute, 8.2 channel surround sound VR performance experience. The performance takes place in a physical performance space with an octophonic sound system, four tube amplifiers and a live trumpet performer. Eight audience members are seated in rotating chairs in the center of the sound system and performance space, each wearing an Oculus Go VR HMD, as the trumpet player performs around them.
Diagram of sound staging and audience seating. (Legend: blue = speaker, red = subwoofer, green = tube amplifier.)
Telepresence invites participants to create their own narratives between worlds and to explore their possibilities based on their sensory perceptions, which create shifts in their physical movement and gaze.
Telepresence has been performed five times at Emily Carr University in Vancouver, Canada, in September 2018, twice at our artist studio in Vancouver’s Chinatown in November 2018, and ten times at the artist-run centre Western Front in December 2018.
Telepresence seeks to redefine conventions of the concert setting by reversing the hierarchy between visual and audio elements within a virtual performance environment. Our main research question is: How can virtual reality enhance a live musical performance without overshadowing the aural experience?
To that end, we must understand how our bodies respond to different media. We therefore adopt a choreographic perspective on viewing the world, informed by Michael Klein. Through this perspective, we can better understand how physical, sonic, and virtual spaces impact audiences’ bodies and their perception of performance.
Unlike most traditional VR experiences, Telepresence is designed as a collective and social experience. Much like attendees at a music concert or a cinema, participants view the virtual composition simultaneously with one another. However, instead of wearing the binaural headphones traditionally used to experience VR works, the audience is united by the 8.2 channel surround sound design, the four tube amplifiers, and the trumpet performer playing around them.
Music Composition and Sound Choreography
Since our intention was to support the aural experience, it was important to establish the sound world first. Creating a sense of space through sound became the first objective of the composition. We conducted experiments with different timbres, frequencies, and their velocities across the 8.2 speaker array in order to establish the dimensions of the room.
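As one illustration of the kind of spatialization these experiments involve (the paper does not specify the authors' actual tooling, which is likely a dedicated audio environment), moving a source around a ring of eight speakers can be sketched with equal-power pairwise panning. The function name and structure here are hypothetical:

```python
import math

NUM_SPEAKERS = 8  # main channels of the 8.2 array (the two subwoofers omitted)

def ring_gains(azimuth_deg):
    """Equal-power gains for a source at the given azimuth on a ring of
    eight equally spaced speakers. The source is crossfaded between the
    two nearest speakers; all other channels stay silent. Hypothetical
    sketch, not the authors' implementation."""
    spacing = 360.0 / NUM_SPEAKERS            # 45 degrees between speakers
    pos = (azimuth_deg % 360.0) / spacing     # fractional speaker index
    lo = int(pos) % NUM_SPEAKERS              # nearest speaker behind the source
    hi = (lo + 1) % NUM_SPEAKERS              # nearest speaker ahead of it
    frac = pos - int(pos)                     # position between the pair, 0..1
    gains = [0.0] * NUM_SPEAKERS
    # A cosine/sine crossfade keeps the summed power (and hence perceived
    # loudness) constant as the source travels around the ring.
    gains[lo] = math.cos(frac * math.pi / 2)
    gains[hi] = math.sin(frac * math.pi / 2)
    return gains
```

Sweeping `azimuth_deg` over time at different rates is one way to realize the "velocities" described above: faster sweeps read as motion, slow ones as a gradually shifting sense of the room.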
Next, we examined how Carter’s presence and performance characteristics affected our focus. We coined the term “Sound Choreography” for these studies, which were led by choreographer Emmalena Fredriksson. During our full team rehearsals, we sat in the performance space with our eyes closed and listened to Carter’s trumpet playing in various physical configurations and with varying sound characteristics.
Visuals: Skybox and Game Objects
Keeping in mind that the VE supports the sonic environment, the main challenge with the skybox was creating an “illusion of depth” and a sense of spaciousness to support the spatialized live performance. How could we create a VE that reflected the physical space established by the sound? Similarly, we experimented with different unobtrusive game objects that contribute to the volume and ambience of the VE.
The technological outcome of Telepresence is a system that allows audiences to view a VR work simultaneously with one another. The VE was deployed as a stand-alone Android application. A network client was developed so that all of the Oculus Go headsets connect to a centralized server, which launches the VR experience for playback at the same time on every headset.
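The synchronization scheme described above can be sketched as a central server that waits for every headset client to connect and then broadcasts a single start cue. This is a hypothetical reconstruction in Python; the actual client runs inside the stand-alone Android application, and all names and the wire format here are invented for illustration:

```python
import socket
import threading

def run_server(host, port, expected_clients, ready):
    """Accept one connection per headset, then broadcast a start cue so
    every client begins playback at (approximately) the same moment.
    Hypothetical sketch of the centralized server described above."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((host, port))
    server.listen(expected_clients)
    ready.set()  # signal that the server is accepting connections
    conns = [server.accept()[0] for _ in range(expected_clients)]
    for conn in conns:           # everyone is connected: send the cue
        conn.sendall(b"START\n")
        conn.close()
    server.close()

def run_client(host, port):
    """Connect to the server and block until the start cue arrives;
    on the real headset this is where local playback of the VE begins."""
    with socket.create_connection((host, port)) as sock:
        return sock.makefile().readline().strip()
```

Because the cue is only sent once all expected clients have connected, no headset can start early; network latency on a local Wi-Fi network keeps the residual offset between headsets well below what audiences would notice against the live sound.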