Beginnings of a Musical Future
Week 1 + 2 of the Open Music Initiative Summer Lab 2017
The Open Music Initiative Summer Lab is an 8-week program in which the Open Music Initiative hosts nineteen student software developers, musicians, and visual artists at the IDEO Cambridge Studio and Workbar Cambridge. Over those eight weeks, four teams are challenged to envision a future for music and its industry through the lens of human-centered design, distributed ledgers, and the OMI API. Throughout the program, each team generates prototypes with the intention of developing and refining a venture concept.
On Monday June 5th, nineteen OMI fellows gathered on the first floor of 80 Prospect, the home of IDEO Cambridge, unsure of what was to come. What followed was a whirlwind of three days filled with content designed to equip the fellows with the tools and the mindset to humanize the music industry.
The agenda simply read: Day 1 — Human-Centered Design, Day 2 — State of Music, Day 3 — Technologies of Tomorrow. I structured it so each day would help the fellows answer the age-old questions of venture design: desirability, viability, and feasibility.
After the fellows settled in, Panos Panay, co-founder of OMI and Vice President of Innovation and Strategy at Berklee College of Music, reiterated what we were all here to explore:
“How might the advances in new expressions for music combined with the rate of technological growth lead to new experiences and commercializations for creators?”
As the day progressed, Grace Nicklin and Yael Yungster from the IDEO Cambridge Studio gave a crash course in design research—a methodology for understanding humans and designing for them. I (Eric Chan) walked the fellows through a workshop on how to brainstorm and prototype effectively. By the end of the day, one could hear the hum of excitement fill the room as the fellows were sent out into Central Square to talk, observe, and interview people. Megan Griffith, a student at Berklee who runs her own music-based YouTube channel, reflected on that session.
“I did something similar in a business class at Sloan. I think it is really important to talk to the potential user. They taught us to always ask the magic wand question: if I had a magic wand to solve X problem, what would it do?”
The next day was filled with speakers from the music industry and those seeking to push the boundaries of musical expression. George Howard provided a look into how the music industry nearly killed itself, and into the copyright and licensing workflows that run this multi-billion-dollar industry. He was followed by R. Michael Hendrix, who explored the hacks and inspirations shaping music today as it unravels new expressions of sound.
A panel followed, composed of Eric Gunther, Joseph Paradiso, and Yadid Ayzenberg, who discussed the intersection of music as data and music as art. Dr. Joseph Paradiso, who directs the Responsive Environments group and has a particular focus on the sonification of data, talked about turning data into music as a new medium for visualization. He showed off Quantizer, Tidmarsh, and Fragile Instruments, and technology’s ability to create even playing fields.
“There are two ways to be creative, one can sing or dance or one can create an environment where singers and dancers can flourish.”
Eric Gunther, co-founder of SOSO Limited, specializes in crafting interactive multi-sensory environments and experiences informed by data. He emphasized the value of derivative works for both music and other mediums such as sculptures. Using Diffusion Choir and the CSIS Data Chandelier, he made the point that:
“Consumers and creators are now one and the same. They do not even have to be human.”
Yadid Ayzenberg, co-founder of the Sync Project, has been using data to improve music’s capability as precision medicine for relaxation, better sleep, and more. He lamented the state of music consumption:
“It is rare to listen to a whole album from a single artist. Most services give us transient playlists made up of single tracks, where we attempt to bond with an artist for a minuscule amount of time.”
We then dove headfirst into day three and the possibilities enabled by technology, via presentations and workshops from Intel, Context Labs, and IDEO. Starting with a basic understanding of what OMI and its API are trying to accomplish, Dan Harple and Gavin Nicol from Context Labs hammered home the importance of interoperable design.
“All stakeholders collaborate at a fundamental level to develop core interoperable technologies which strengthen and benefit the entire industry.”
I (Eric Chan) provided an introduction to cryptocurrencies and distributed ledgers, focusing on their attributes and properties, such as the technology’s ability to create digital scarcity, peer-to-peer exchanges, trust + authenticity, and transparency.
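One property from that introduction can be shown in a few lines of code: hash-linking each record to its predecessor makes a ledger tamper-evident, which is what underpins trust, authenticity, and transparency. The sketch below is a minimal illustration, not any real OMI or blockchain implementation, and the record fields ("artist", "work", "rights_holder") are hypothetical examples.

```python
import hashlib
import json

def record_hash(entry: dict) -> str:
    """Deterministically hash an entry (sorted keys for a stable encoding)."""
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append(ledger: list, payload: dict) -> None:
    """Append a payload, linking it to the hash of the previous entry."""
    prev = record_hash(ledger[-1]) if ledger else "0" * 64
    ledger.append({"prev": prev, "payload": payload})

def verify(ledger: list) -> bool:
    """Re-walk the chain; altering any entry breaks every later link."""
    prev = "0" * 64
    for entry in ledger:
        if entry["prev"] != prev:
            return False
        prev = record_hash(entry)
    return True

# Hypothetical rights records, for illustration only.
ledger: list = []
append(ledger, {"artist": "A", "work": "Song 1", "rights_holder": "Label X"})
append(ledger, {"artist": "B", "work": "Remix of Song 1", "rights_holder": "A + B"})
print(verify(ledger))  # True

# Rewriting history invalidates the chain:
ledger[0]["payload"]["rights_holder"] = "Someone Else"
print(verify(ledger))  # False
```

A real distributed ledger adds consensus and peer-to-peer replication on top of this chaining, but the tamper-evidence shown here is the core reason the technology is interesting for rights attribution.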
Simon Peffers from Intel described the importance of flexibility and interoperability of not only intents but also technology.
“Blockchain enables improved activity ‘auditability’, and traceability into the provenance and history of goods, while letting parties work cooperatively to manage their transactional records.”
After the kickoff week, each of the teams started to interpret their briefs and challenges in their own way.
Team NotTomatoLovers is examining methodologies for cataloging, attributing, and distributing live DJ mixes, using an augmented space that captures the ethereal experiences we attribute to live performances.
Team Fruit Basket is exploring the fundamental structures of licenses and rights in the music industry, in order to examine how distributed ledgers can change the way the industry operates today. Their fundamental question is: “In the future of derivative music works, can commercialization and licensing be monetized and enjoyable for copyright holders and all creators of the derivative work?”
Team Blue is organizing and attributing fragmented derivative content generated for and about artists in order to properly compensate them. They are experimenting with both a customer-facing prototype and a backend, artist-facing one.
Team Bread has started building prototypes to create an experience around identifying individuals for their contributions to works. They are attempting to create an immersive, interactive virtual music environment where listeners can touch, pick apart, and follow the story of their music.
Next week, we’ll be diving into more conversations with people based on the prototypes being made.