From March to November 2018, our team questioned notions of accessibility and designed exciting, inclusive and accessible virtual instruments which expanded the musical practice of two disabled musicians. The musicians performed in front of a full house alongside professional musicians from the Hard Rain Soloist Ensemble in our final showcase event.
One such instrument was a version of EXA: The Infinite Instrument (EXA), an immersive musical studio developed by Zach Kinstner. EXA allows users to compose, record, and perform music using expressive instruments of their own design. These characteristics made EXA an ideal platform on which to design a fully customisable instrument that takes different types of mobility into account. To that end, Damian Mills and Mary Louise McCord (Drake Music NI) worked closely together to design an instrument that enabled her expressive upper-body movement.
At the second workshop, Damian Mills watched the Hard Rain Soloist Ensemble, James Cunningham and Mary Louise McCord play together for the first time. Communication within the ad-hoc ensemble was instantly fluid. However, observing Mary Louise’s dissatisfaction with the limited range of possibilities of the Soundbeam, her usual instrument, Damian identified scope for a potential VR integration. He gathered a few essential considerations, taking into account Mary Louise’s mobility, needs and creative interests, as well as the dynamics of the ad-hoc ensemble:
- To work as an ensemble and make music together.
- To adapt the flexibility of EXA.
- To fit the instrument to Mary Louise’s ergonomics.
- To explore how Mary Louise could navigate the software from within the physical environment.
- To identify the limitations of the hardware, and of Mary Louise being tethered to a machine.
- To manipulate VR software in real space.
Damian Mills’ design process:
As musicians, we use all our available senses to interact with others when creating music. To deprive one musician of visual communication, as is the case for a VR headset wearer, would place an artificial limitation on that interaction.
The first important step was to remove the visually confining headset. By placing the headset just behind Mary Louise’s head, the controllers and virtual instruments could be shown on a monitor placed in front of her. This meant Mary Louise could see how she was manipulating the notes, which light up as you hit them, and could feel the haptic feedback from the controllers, reinforcing the feeling of having played a physical note even though the note only existed in VR.
In order to create a bespoke instrument for Mary Louise, we first had to see how she could manipulate the controllers. This was quickly achieved with the help of her mum and the tether straps that the controllers came with, looping them around the index and middle fingers of Mary Louise’s hands. My first job was to measure the comfortable limits of movement that Mary Louise could create in space. With the headset off I could see her comfortable zones of movement; once these were established, I put the headset on to see how the “ringer”, or striker, attached to the hand controllers was moving. A quick sketch in VR around those movements gave me an ergonomic map to work within.
I chose to make a large stack of MIDI triggers in the EXA software. Large, to allow for the probable inaccuracy in the real-world placement of Mary Louise’s wheelchair, whose position was mapped with tape on the floor.
I also tried out circular, triangular and square stacks of note/trigger designs, mapped within the EXA programme to a chromatic scale, that is, 12 notes arranged vertically. I reasoned that, because of the difficulty of moving through VR space to place instruments where needed without a set of goggles on, only one pre-set set of notes might be needed, and we could bring Mary Louise to the marked spot to play. By creating a large instrument space in front of Mary Louise, it was then possible to perform music without a visual monitor, using just the haptic and audio feedback, much as a Soundbeam would.
Sound was created by routing MIDI to Ableton Live using a virtual MIDI device. A chromatic stack can be programmed in Ableton to trigger the notes of a scale. This does mean that, in any given scale, some notes triggered within EXA do not trigger a note in Ableton, but it also means that one instrument design could be used for all 5 pieces of music planned for performance. The Ableton instrument was designed to assist Mary Louise in making music her way; MIDI effects, including note length and velocity stabilising, and audio effects were applied to Mary Louise’s choice of electronic instrument to create a more balanced audio output. Mary Louise was consulted throughout on the MIDI effects, and permission was sought for all the effects used, though to be honest she was game for trying everything out.
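The scale mapping described above can be pictured in code. This is a minimal illustrative sketch, not the project’s actual Ableton configuration: the root note and C major scale here are assumptions chosen purely for demonstration. The idea is that EXA’s chromatic stack sends all 12 notes, and only those falling within the chosen scale produce a sound.

```python
# Sketch of filtering a chromatic trigger stack to a scale.
# The root (middle C) and C major scale are illustrative assumptions.

C_MAJOR = {0, 2, 4, 5, 7, 9, 11}  # semitone offsets within one octave

def filter_to_scale(note, root=60, scale=C_MAJOR):
    """Return the MIDI note if it is in the scale, else None (a silent trigger)."""
    return note if (note - root) % 12 in scale else None

# A full chromatic run from middle C: only 7 of the 12 triggers sound.
triggered = [60 + i for i in range(12)]
sounding = [n for n in (filter_to_scale(t) for t in triggered) if n is not None]
print(sounding)  # → [60, 62, 64, 65, 67, 69, 71]
```

Changing the `scale` set is all it would take to reuse the same 12-trigger instrument design for a piece in a different scale, which is the trade-off described above.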
From an audience perspective, it was now possible to see how Mary Louise was manipulating an instrument. We set up a screen that follows what the VR goggles see, placed the goggles in front of Mary Louise so that the perspective matched the audience’s, and coupled that image to a projector. Because Ableton was set up to apply a scale, some triggers would not make a sound in Ableton but would still register as being struck in the projection. A work-around for this now uses the Ableton “Sampler” feature: for each instrument created, a MIDI map assigns triggered notes to samples taken from the Ableton Live instrument rack itself (cheers Conor for the idea). It is more work to set up each scale, but it means no more ghost notes.
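One way to think about the Sampler work-around is that every one of the 12 chromatic triggers gets assigned something to sound, so nothing that lights up in the projection stays silent. The hypothetical sketch below illustrates this by snapping each trigger down to the nearest in-scale note; the root and scale are assumptions for demonstration, and in the real setup the mapping points each trigger at a sample inside Ableton’s Sampler rather than at a note number.

```python
# Illustrative sketch of mapping all 12 chromatic triggers to sounding notes,
# so no trigger is left silent. Root and scale are assumed for the example.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of the scale

def snap_to_scale(note, root=60, scale=C_MAJOR):
    """Snap a chromatic MIDI note down to the nearest scale degree at or below it."""
    offset = (note - root) % 12
    degree = max(d for d in scale if d <= offset)
    return note - offset + degree

# Every chromatic trigger now maps to an in-scale note: no ghost notes.
note_map = {60 + i: snap_to_scale(60 + i) for i in range(12)}
```

Here an out-of-scale trigger such as C#4 (MIDI 61) simply doubles the C below it rather than going silent, which matches the goal of the work-around: every visible strike is also an audible one.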
Having created the main performance instrument, the possibility of using it as a template for a light-based sonic sculpture in the virtual world was too tempting. After seeing how Mary Louise would enter the space, and mapping her movements, I created a shape that mimicked a gramophone, opening up onto her placed instruments, which Mary Louise would move through on entering the space. The audience could now see her journey through VR space from her perspective.
For further information on the overall process and the design process, see the article “How we’re designing musical instruments with the help of disabled musicians and VR”, which Dr Franziska Schroeder and Dr Matilde Meireles wrote about the Immersive Inclusive Music Technologies project for The Conversation, an independent news and commentary website produced by academics and journalists.