
This blog is written by Ian Riley, an inspiring musician, as part of the Humans of Mixed Reality series. He shares his experience in music and technology, which led him to develop music in Mixed Reality.


 


[Cover image: Touching Light]


Touching Light is an original musical work for Percussionist and Mixed Reality Environment that explores the border areas between the physical world that we see around us, and the worlds of infinite possibility that each of us holds in our imagination.


 


 


“A dream we dream together is called reality.”
          – Alex Kipman at the Microsoft Ignite Keynote, 2021

Mixed Reality, fundamentally, asks us to see the world differently, much as we performers ask our audiences not just to hear, but to listen. When we draw the attention of those around us to something we believe to be compelling, and even more when we share something we have had a hand in creating, we access a unique moment, a shared imaginative space, and in my experience this is just the sort of thing that users of Mixed Reality are hoping to find.

“My dad’s a computer programmer.” I usually lead with this, as it seems to put folks at ease when they contact me hoping that there is some ‘secret’ to how I, someone with a doctorate in music, not computer science, learned to work with Mixed Reality. Yet while his influence has certainly been a continual inspiration to me, it was in fact my mother’s encouragement to pursue training in the arts that positioned me to begin developing Touching Light. Despite its deep connectedness to technology, Touching Light is first and foremost a musical MR application.


 


Music and Technology


It was in pursuit of my master’s degree that I first became deeply interested in music technology. I was fascinated by the sounds that electronic instruments could create, and that curiosity would eventually lead me to perform a final recital entirely for percussion and live electronics during my first graduate degree. This sort of recital was a first for the small college I was attending and, though I was unaware of it at the time, something that is still uncommon in the world of contemporary percussion. Those experiences would eventually lead me to pursue a DMA in Percussion Performance at West Virginia University, with a desire to continue to explore and innovate with percussion and live electronics.


 


When I first started my DMA, I was aware of the work that Microsoft was doing with the HoloLens 1 (introduced in 2016), but it wasn’t until my wife and I moved to Morgantown, West Virginia that I saw the first marketing for the Microsoft HoloLens 2 on February 24th, 2019. I was amazed. Watching it again today still makes me smile, but I guess that’s good marketing for you! As I continued my studies at WVU, I kept thinking about that video, about the HoloLens 2, and about Mixed Reality. What seemed like a pipe dream in February, making music in Mixed Reality, would become a real possibility in my mind in November of that same year.


 


Look toward the future – stop thinking about what is cutting edge right now and start thinking about the cutting edge of the cutting edge, because that’s where we’re going to need people to do work.
          – Dr. Norman Weinberg, at PASIC 2019

And I knew that the future was Mixed Reality.


 


[Screenshot from Simplicity]

 Playing vibraphone while using a holographic audio mixer from Touching Light

 


Preparing for HoloLens 2


Sometimes it is the mere fact that you know what you don’t know that can provide the clearest path forward. Soon after the reveal of the HoloLens 2 in early 2019, the first seeds of what would eventually become Touching Light began to take root. At the time, while I had some minimal computer programming experience from high school (Java and some HTML), I had had little time or reason since beginning to study music in college to engage with the ‘coding’ side of technology, apart from some basic formatting for websites.


 


Knowing that the HoloLens 2 would likely be programmed in something like C# or Visual Basic, I began thinking about other ways that I could engage with code-based music technology, and I would eventually teach myself how to build rudimentary circuits to trigger lighting and audio effects. Alongside this work, I also invested myself more fully in learning about audio recording and engineering, recording and editing my own performance videos from recitals and other concerts. Yet for all this experience, I still didn’t know how to program the HoloLens 2.


 


Learning Mixed Reality


When the first news of the global coronavirus pandemic entered public awareness in the United States, it was met with a mixture of genuine concern, reasonable skepticism, and, in some cases, outright dismissal. Living in West Virginia, I didn’t feel the scope of the pandemic really hit home until we received email correspondence from the university president outlining the realities of campus closures and the transition to online delivery for the remainder of the semester, as WVU endeavored to minimize the risk to its community in uncertain times. In the face of what seemed at the time to be indefinite lockdown, I found myself able to do what anyone would do with a sudden abundance of free time… learn how to code for Mixed Reality!


 


Over the course of the next several months, particularly during the summer of 2020, I worked through a series of free tutorials and learned the basics of 3-D modeling using a program called Blender, whose interface is similar in many ways to the one I would eventually work with in Unity. Upon ordering a HoloLens 2 from Microsoft in early July, I quickly transitioned to Unity while familiarizing myself with the sorts of gestures and interactions that drive the HoloLens 2 holographic interface.


 


With all the components finally in hand, the work of writing, rehearsing, and performing Touching Light began. The same sorts of interactions that I already employed as a performer, core to the performative practice of music and particularly to that of the percussionist, would serve as the conceptual framework from which the three ‘dimensions of translucence’ would be derived. These dimensions (modeled after the three coordinate dimensions in physical space) would serve to ground my creative work in the sorts of real decisions that I already knew how to make because of my work with percussion.


 


[Screenshot from Soliloquy]

Improvising on a marimba in response to a rotating carousel of landscapes 

 


Developing Music in Mixed Reality


I knew that I wanted Touching Light to be mobile. The promise of the HoloLens 2, and Mixed Reality in general, is that there are ‘no strings attached’: if you wear this device, that is all you need to enter a Mixed Reality environment. I intentionally connected that idea of mobility to the sorts of interactions and environments that the user engages with throughout the work. Even Soliloquy, the second movement of Touching Light, which features a large carousel of static images, does not extend far beyond the anticipated ‘near-field’ (that which is within reach) that a percussionist will be used to engaging with. Everything in Touching Light, whether virtual or physical, follows the design ethos of ‘always being within reach.’
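To make that ethos concrete, here is a minimal Unity C# sketch. It is an illustration of mine rather than code from the Touching Light source, and the names and distances (PlaceWithinReach, forwardDistance, and so on) are my own: at startup it places a hologram roughly half a metre in front of the headset, well inside a percussionist’s near-field.

```csharp
using UnityEngine;

// Hypothetical sketch (not from the Touching Light source) of the
// "always within reach" idea: on Start, the hologram is placed a short
// distance in front of the headset and slightly below eye level.
public class PlaceWithinReach : MonoBehaviour
{
    [Tooltip("Distance in front of the user, in metres.")]
    public float forwardDistance = 0.5f;

    [Tooltip("Vertical offset below eye level, in metres.")]
    public float verticalOffset = 0.2f;

    void Start()
    {
        // On HoloLens, Unity's main camera follows the wearer's head.
        Transform head = Camera.main.transform;

        // Position the object in the near-field.
        transform.position = head.position
                           + head.forward * forwardDistance
                           - Vector3.up * verticalOffset;

        // Standard billboard orientation so the hologram faces the user.
        transform.rotation = Quaternion.LookRotation(transform.position - head.position);
    }
}
```

Attached to any GameObject in the scene, this keeps the object where a performer can reach it without stepping away from the instrument.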


 


The unique opportunity to engage music-making and Mixed Reality is not something that I take lightly; what began as a pipe dream just over a year ago has had a significant impact on the ways that I engage with both music and technology. I was pleasantly surprised to discover that Mixed Reality is a profoundly creative medium and, as such, engages easily with the process of music-making. From the deeply satisfying manipulation of a standing wave through the minuscule gestures of a rotating hand, to the shocking immersion of a massive holographic carousel slowly rotating around you while you perform, there is something far more connective about these spatial interactions than about controlling the same musical and visual elements with peripherals like a mouse and keyboard.
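For readers curious how a gesture like that can map onto sound, here is a hypothetical Unity C# sketch, again not drawn from the Touching Light source: it reads the roll of a tracked-hand Transform (something you would wire up from your own hand-tracking setup, for example an MRTK hand-joint proxy) and maps it onto the pitch of an AudioSource, so a slow turn of the wrist bends a sustained tone.

```csharp
using UnityEngine;

// Hypothetical sketch: map the roll of a tracked hand onto audio pitch.
// The "trackedHand" Transform is assumed to be driven by your own
// hand-tracking setup; it is not part of the Touching Light source.
[RequireComponent(typeof(AudioSource))]
public class HandRotationToPitch : MonoBehaviour
{
    [Tooltip("Transform that follows the performer's hand.")]
    public Transform trackedHand;

    public float minPitch = 0.5f;
    public float maxPitch = 2.0f;

    AudioSource source;

    void Awake()
    {
        source = GetComponent<AudioSource>();
    }

    void Update()
    {
        if (trackedHand == null) return;

        // Take the hand's roll angle (0-360 degrees) and fold it into
        // 0-180 so the mapping stays continuous as the wrist turns.
        float roll = trackedHand.localEulerAngles.z;
        float folded = roll > 180f ? 360f - roll : roll;

        // Interpolate between the pitch limits.
        source.pitch = Mathf.Lerp(minPitch, maxPitch, folded / 180f);
    }
}
```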


 


[Screenshot from Synecdoche]

Exploring tuned Thai gongs while manipulating spatialized virtual instruments 


 

Making Music in Mixed Reality (How to Get Started, and Why You Should)


Already, so much of what we do as musicians is, within the context of society at large, a niche endeavor; for the percussionist, these degrees of separation can seem even more severe. But just as we as artists commit ourselves to the craft of music and the practice of music-making, engaging with MR has only served to deepen those commitments for me.


 


For Musicians (or “Performers”)


For those individuals who are interested in the musical side of Mixed Reality, the first step is to get your hands on a platform. Touching Light is obviously designed with the Microsoft HoloLens 2 in mind, but similar functionality is available through any number of other VR headsets. Once you have a platform, you will need to decide what you will perform. If you are working with the Microsoft HoloLens 2, a great place to start is with Touching Light! You can download the complete Unity file package here. Follow the instructions from the Microsoft Mixed Reality Documentation, beginning at “1. Build the Unity Project.” Once you have deployed the application to your HoloLens 2, load it up and explore!


 


One of the most profound discoveries that I have made while working with this technology is just how musical it can be. There is something about engaging with technology within the Mixed Reality volume, about ‘spatial computing,’ that seems intuitive and artistic. This simple fact has even more deeply convinced me that music-making in Mixed Reality is not just an interesting possibility, but a deeply meaningful inevitability.


 


For Programmers (or “Composers”)


For those individuals who may be more interested in the nuts-and-bolts of developing musical applications for Mixed Reality, the first step is to familiarize yourself with a compiler. If you are interested in programming for the Microsoft HoloLens 2, the de facto solution at present is the Unity Development Engine, though support for other compilers is becoming increasingly available. You can download the Unity Hub for free from the Unity website and then, following the instructions in the Microsoft Mixed Reality Documentation beginning at “1. Introduction to the MRTK tutorials,” you can begin to develop your first Mixed Reality application.
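As a sense of what a very first musical interaction might look like once the MRTK is imported, here is a short sketch of mine, not code from Touching Light. It assumes MRTK 2’s IMixedRealityPointerHandler interface and an object that already has a collider so it can receive focus: air-tapping the hologram plays an attached audio clip.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hypothetical first experiment: play a sample when the user air-taps
// this hologram. Attach it, together with an AudioSource and a collider,
// to any object in an MRTK 2 scene.
[RequireComponent(typeof(AudioSource))]
public class TapToSound : MonoBehaviour, IMixedRealityPointerHandler
{
    AudioSource source;

    void Awake()
    {
        source = GetComponent<AudioSource>();
    }

    public void OnPointerDown(MixedRealityPointerEventData eventData)
    {
        // Trigger the sample on the down event for a percussive feel.
        source.Play();
    }

    // The interface requires these handlers even if they do nothing here.
    public void OnPointerDragged(MixedRealityPointerEventData eventData) { }
    public void OnPointerUp(MixedRealityPointerEventData eventData) { }
    public void OnPointerClicked(MixedRealityPointerEventData eventData) { }
}
```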


 


I would strongly advise that, once you get a handle on the basic functionality of the compiler and complete some of the beginning MRTK tutorials, you take some time to consider what sorts of functionality you would like your application to demonstrate, then connect with the Microsoft MR community (via Slack or the Microsoft MR Tech Community forums) and with others who may be able to answer your questions and even help you with your project design.


 


Throughout the development process of Touching Light, I was surprised not only by how easy it was to onboard myself to Mixed Reality development by using the MRTK, but also by how friendly and helpful the then-current MR development community was. Whenever I had a question or was struggling with some element of implementation, I would quickly be directed to the relevant documentation, YouTube video, or other resource that very often addressed the exact issue I was having, without ever needing to post snippets of code or consult more directly with someone on the project. As a bonus, I was also able to connect with a handful of individuals who had a particular interest in developing creative applications for the HoloLens 2.


 


Touching Light


I had the distinct opportunity to present Touching Light in a public recital on Saturday, May 1st, 2021. 

 


Only the beginning


Touching Light is only the beginning. It is my sincere hope that this project will serve to orient, assist, and inspire musicians, artists, and audiences alike as we continue to navigate an increasingly digital and virtual existence. Perhaps more than at any other time in history, compounded by the incredible circumstances surrounding global health and the responses those circumstances require, we have been forced to think differently about technology. For those of us who suddenly found ourselves unable to engage in live musical performances, whether as artists or as audiences, it is my conviction that mediums like Mixed Reality will only become more essential to exploring ‘liveness’ within the context of digital and virtual spaces.


 


The work was designed during the global coronavirus pandemic of 2020-21, and it is my hope that Touching Light reminds each of us that, despite everything, we are never truly alone; there is a world beyond this one, if we are only willing to reach out and touch it.


 




A photo with members of the WVU Percussion Faculty after the recital
[from left: Prof. Mark Reilly, Dr. Mike Vercelli, Ian Riley, and Prof. George Willis]


 


Resources for Making Music in Mixed Reality


Microsoft HoloLens 2


Unity Hub


Microsoft MRTK & MR Tutorials


HoloDevelopers Slack Channel


Microsoft MR Tech Community Forums


Touching Light Source Code


ianrileypercussion.com 


Riley, Ian T. “Touching Light: A Framework for the Facilitation of Music-Making in Mixed Reality.” West Virginia University, West Virginia University Press, 2021.

