Mvmt. is a tool for reaching new audiences with classical music experiences. Many people fear for the future of our symphony orchestras, concert halls, and both traditional and contemporary classical music. Inspired by the way musical experiences can be shared yet are different for every individual, I wanted to explore how technology could augment musical experiences and encourage conversations around them. I’m wary of technology when it comes to music, but what about when computing becomes ubiquitous to the point that the technology fades into the background? What about when technology finds us in our world instead of us delving into its world and mentally leaving our present world behind?
This project is an exercise in designing for ubiquitous communication in "tabs, pads, boards, and beyond" as well as a framework for learning UI/UX prototyping tools. Our concepts for this project had to include designs for tabs, pads, boards, and beyond (I chose to design for an iPhone 6S, an iPad Air, a desktop iMac, and augmented reality). Emphasis was placed on high-fidelity UI mockups, creating a UX flow, and prototyping the interactions using an animation library for After Effects.
Can technology augment musical experiences and encourage conversations around them? What might happen when technology finds us in our world instead of us leaving our present world behind?
How It Works
With Mvmt., concertgoers have access to information about the pieces, composers, and performers they'll be experiencing before the concert. During the concert, Mvmt. provides a participatory platform that creates visualizations of audience and performer experiences alike in augmented reality. After the fact, users have access to recordings from their concerts and can continue the conversation around that musical experience.
Augmented Reality Visualizer
The highlight moment of Mvmt. is the live concert experience, which uses augmented reality to visualize the music. The concertgoer would use their mobile device, connected to the technology ecosystem within the concert hall, to express the emotions and movement they experience as the orchestra plays. Their input would affect the visualization, allowing for a participatory experience that not only intrigues the concertgoer but empowers them to contribute to a multisensory experience. The visualization is also affected by the score, each musician, and the conductor.
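As a rough illustration of how those inputs might combine, here is a minimal sketch of blending crowd input with score-driven intensity into parameters that could drive the AR visualization. All names and weights here (`AudienceInput`, `blend_visual_params`, the 60/40 weighting) are hypothetical, not part of the actual design:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class AudienceInput:
    # Each concertgoer submits values in [0, 1] from their device.
    energy: float   # how energetic the music feels to them
    warmth: float   # how warm or tender it feels

def blend_visual_params(inputs, score_intensity, conductor_gain=1.0):
    """Combine audience input with score-driven intensity into one
    parameter set driving the visualization (hypothetical sketch)."""
    if inputs:
        crowd_energy = mean(i.energy for i in inputs)
        crowd_warmth = mean(i.warmth for i in inputs)
    else:
        crowd_energy = crowd_warmth = 0.5  # neutral default before input arrives
    # Weight the score's own dynamics more heavily than the crowd,
    # so the visualization always tracks the music first.
    brightness = min(1.0, 0.6 * score_intensity * conductor_gain
                          + 0.4 * crowd_energy)
    hue = 30 + 180 * crowd_warmth  # warm orange -> cool blue, in degrees
    return {"brightness": round(brightness, 3), "hue": round(hue, 1)}
```

The point of the weighting is the design intent described above: the audience contributes to the visualization, but the score, musicians, and conductor remain its backbone.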
Originally, I envisioned this highlight moment as a real-time tangible interaction: an interface with depth and resistance built into each seat, plus real, physical lights controlled by the audience. However, to let more traditional classical concertgoers have the experience they expect (and to keep within the constraints of the assignment), I moved the experience to a visualization through augmented reality and a tab-based contribution method.
Using Cinema 4D Lite in After Effects CC, I experimented with various effects to prototype a real-time music visualizer that might be seen through augmented reality. Because I didn’t have access to a performing ensemble to personally record, I asked a friend to play a short tune on the violin that I could start visualizing. While I did use the camera tracking method to put a flat-plane visualization into this video, I’m wondering what would be possible if a more three-dimensional visualization overlay were used.
I designed interfaces for the highlight moments of the ideal user journey. Through the mobile app, a user can see their upcoming and previous concerts, listen to recordings of the pieces from those concerts, and get a preview of "live mode." Their digital ticket also resides in the app, and it will appear as the user gets close to the concert hall. Entering live mode during the concert itself while in the hall will darken the screen and allow the user to participate in expressing what they feel.
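The "ticket appears as the user gets close" behavior is essentially a geofence check. A minimal sketch of that logic, using the haversine great-circle distance (the function names and the 300 m radius are my own assumptions for illustration):

```python
from math import radians, sin, cos, asin, sqrt

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters (haversine)."""
    r = 6371000  # mean Earth radius in meters
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * r * asin(sqrt(a))

def should_surface_ticket(user_lat, user_lon, hall_lat, hall_lon, radius_m=300):
    """Surface the digital ticket once the user enters the hall's geofence."""
    return distance_m(user_lat, user_lon, hall_lat, hall_lon) <= radius_m
```

In a shipped iOS app this check would likely be delegated to the system's region-monitoring APIs rather than computed by hand, but the underlying decision is the same.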
The tablet experience focuses on giving the user more information about the performance and specific pieces. When the user engages with both the tablet and the desktop apps at the same time, they sync up, letting the user follow an automatically scrolling score in time with the music.
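Keeping the scrolling score in time with the music boils down to mapping playback time onto a position in the rendered score. A minimal sketch, assuming the app knows the start time of each measure in the recording (the function name, timing data, and fixed per-measure height are hypothetical):

```python
import bisect

def scroll_offset(playback_s, measure_times, measure_height_px=120):
    """Map the recording's playback time (seconds) to a vertical scroll
    offset in the rendered score, interpolating within the current measure.

    measure_times: ascending start times (seconds) of each measure.
    """
    i = bisect.bisect_right(measure_times, playback_s) - 1
    i = max(0, min(i, len(measure_times) - 1))
    start = measure_times[i]
    # Assume the final measure lasts ~2 s if no end time is known.
    end = measure_times[i + 1] if i + 1 < len(measure_times) else start + 2.0
    frac = 0.0 if end == start else (playback_s - start) / (end - start)
    frac = max(0.0, min(frac, 1.0))
    return (i + frac) * measure_height_px
```

Both devices can then compute the same offset from a shared playback clock, which is what makes the tablet and desktop views stay in step.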
Lastly, the desktop experience is mainly about viewing video recordings and their corresponding AR visualizations.
In this class we were also given the opportunity to explore UI animation in After Effects. I created all animations needed for the concept video.
I started by looking into current projects being done to capture, communicate, and enhance musical experiences. MIT’s Media Lab has a group called Opera of the Future, whose opera Death and the Powers brings robots and other technologies on stage that capture and communicate the behavior, expression, and interactions of a hidden actor. Groups at MIT, CMU, Queen Mary University of London, and more are working on augmenting acoustic instruments and digitizing parts of the performance experience.
Many city orchestras have programs that specifically cater to people who don’t regularly listen to classical music. For example, FUSE@PSO is a non-traditional concert series put on by the Pittsburgh Symphony Orchestra in collaboration with composer and conductor Stephen Hackman that brings fans of classical music together with fans of modern rock acts like Coldplay and Radiohead at Heinz Hall. Hackman writes “mash-ups” of, for example, Coldplay’s discography with Beethoven’s “Eroica”. Other organizations offer cheaper tickets for young people, like the Kennedy Center’s MyTix program.
Beyond making tickets more affordable and changing up repertoire, using similar technologies to the work being done to capture, communicate, and enhance musical expression may be a way to make classical music more accessible to new patrons. I was also interested in audience participation and interaction during classical music concerts: the general expectation is that audience members remain silent, undisruptive, and passive. Etiquette aside, how could technology be used to augment the concert experience in a way that is both enlightening and engaging? Other research ideas, notes, and questions appear below.
I began this project by researching, brainstorming, and sketching out various scenarios in storyboard form before settling on the current concept behind Mvmt. Drawing out these storyboards helped me explore the end-to-end experience as it extends before, during, and after the live concert itself.
Final scenario sketches before diving into making
I sketched many versions of low-fidelity user interfaces before bringing them into Illustrator for static prototyping and After Effects for animations. Taking into account the different use cases for tabs, pads, boards, and beyond, I took this opportunity to explore what functions each part of the ecosystem would have, a process that began while I was working through the storyboards. Keeping track of which functions happened on which devices could have become very confusing, but sketching them out and writing them down helped me organize my ideas.
After deciding on what elements were most important to include in my highlight moment flows, I experimented with the layout for the most important screens.
My next steps include expanding the UX flows beyond the key moments shown for each size of device. As I've continued to learn about UI design and augmented reality prototyping, I've also wanted to update this project with my new skills, both software- and aesthetics-wise. I would also love to take this to concertgoers of all types, even if they don't frequent classical concerts; if this is to expand the audience and appreciation for classical music, I'd need to seek out that perspective in particular while keeping in mind what I've learned from musicians and avid classical music concertgoers alike.