How Do You Collaborate Part 3: The Future of Music Collaboration Technology
12-21-2014 11:00 AM
In this final part of our series on creative collaboration, Bob Brown considers what the future of collaboration might look like. If you missed part 1 or part 2, you can read them here.
In my final post about music collaboration, I will discuss what the future of music collaboration may look like, how new technologies will help drive these changes, and which companies are already changing the landscape of the tools musicians use to create.
The Future of Music Collaboration Technology
So, what will drive the future of music collaboration technology? At its core is the development and adoption of transport protocols, control protocols, and file formats. Most companies today rely on a limited set of incompatible transport and control protocols, along with proprietary file formats. This greatly limits how musicians and studio engineers can combine different tools and move between them.
Many of the network protocols we rely on, such as UDP, TCP, and FTP, were designed years ago with the technological limits of the time in mind. Up-and-coming transport protocols such as IEEE 802.1 AVB, DiGiGrid, Ravenna, and Dante are breaking new ground for high-bandwidth, low-latency audio delivery.
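To see why low latency pushes audio networking toward UDP rather than TCP, consider what a minimal audio-over-UDP sender looks like: small fixed-size frames, fire-and-forget delivery, and a sequence number so the receiver can spot loss instead of waiting for retransmission. This is only an illustrative sketch, not how AVB, Ravenna, or Dante actually frame their packets; the frame size, port number, and function names here are assumptions for the example.

```python
import socket
import struct

FRAME_SAMPLES = 64   # small frames keep per-packet latency low
SAMPLE_BYTES = 3     # 24-bit PCM samples
PORT = 9000          # hypothetical port, chosen for this sketch

def make_packet(seq, payload):
    """Prefix each audio frame with a 32-bit sequence number so the
    receiver can detect loss or reordering (UDP guarantees neither)."""
    return struct.pack(">I", seq) + payload

def send_frames(sock, addr, frames):
    """Send each frame immediately; unlike TCP, nothing blocks
    waiting for acknowledgements or retransmissions."""
    for seq, frame in enumerate(frames):
        sock.sendto(make_packet(seq, frame), addr)

if __name__ == "__main__":
    # Loopback demo: a receiver and sender on the same machine.
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("127.0.0.1", PORT))
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    silence = bytes(FRAME_SAMPLES * SAMPLE_BYTES)  # one frame of silence
    send_frames(tx, ("127.0.0.1", PORT), [silence] * 4)
    data, _ = rx.recvfrom(2048)
    seq = struct.unpack(">I", data[:4])[0]
    print(seq, len(data) - 4)  # e.g. seq 0 with a 192-byte payload
```

The design point is the trade-off itself: TCP's retransmissions add unbounded delay, so real-time audio protocols accept occasional loss in exchange for bounded latency.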
Some of the protocols for controlling transport playback, such as Sony 9-pin (Sony RS-422) and MIDI Machine Control, were developed over 20 years ago and focused on one controller controlling one device. More recently, Propellerhead/Steinberg's ReWire and Avid's Satellite protocols were designed to provide more advanced control and faster synchronization between DAWs.
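MIDI Machine Control shows how simple that one-controller-one-device model was: each command is a short SysEx message of the form `F0 7F <device> 06 <command> F7`. The command bytes below come from the MMC specification; the helper function and its name are just a sketch for illustration.

```python
# MMC command bytes, per the MIDI Machine Control specification.
MMC_COMMANDS = {
    "stop": 0x01,
    "play": 0x02,
    "deferred_play": 0x03,
    "fast_forward": 0x04,
    "rewind": 0x05,
    "record_strobe": 0x06,
    "pause": 0x09,
}

def mmc_message(command, device_id=0x7F):
    """Build an MMC SysEx message: F0 7F <device> 06 <cmd> F7.
    A device_id of 0x7F addresses all listening devices."""
    return bytes([0xF0, 0x7F, device_id, 0x06, MMC_COMMANDS[command], 0xF7])

if __name__ == "__main__":
    print(mmc_message("play").hex())  # f07f7f0602f7
```

Six bytes to tell one tape machine to play: the protocol has no notion of multiple peers, shared session state, or sample-accurate sync, which is exactly the gap ReWire and Satellite were designed to fill.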
Generally, each DAW has its own complex, proprietary file format for storing session information. In the video industry, standards organizations such as SMPTE, EBU, and AMWA have driven the design of file formats that are now widely adopted. The audio industry has yet to gravitate toward standardized file formats. Today, collaboration between different DAWs is primarily done by bouncing to WAV files or by using interchange formats from the video world such as AAF and OMF. Neither process provides an ideal scenario for collaboration. Bringing DAW manufacturers and standards organizations together to develop the next generation of file formats is key to moving collaborative workflows forward.
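Bouncing to WAV works precisely because it throws session detail away: a bounce is just raw audio with no tracks, plugins, or automation attached. A minimal sketch of such a bounce, rendering a sine tone with Python's standard-library `wave` module, makes the point; the function name and parameters are assumptions for this example.

```python
import math
import struct
import wave

def bounce_sine(path, freq=440.0, seconds=1.0, rate=44100):
    """Render a mono 16-bit PCM sine tone to a WAV file — the lowest
    common denominator nearly every DAW can import."""
    n = int(rate * seconds)
    samples = (int(32767 * math.sin(2 * math.pi * freq * i / rate))
               for i in range(n))
    with wave.open(path, "wb") as w:
        w.setnchannels(1)      # mono
        w.setsampwidth(2)      # 16-bit samples
        w.setframerate(rate)
        w.writeframes(b"".join(struct.pack("<h", s) for s in samples))

if __name__ == "__main__":
    bounce_sine("bounce.wav", seconds=0.5)
```

Notice what the file format cannot carry: no edit history, no plugin settings, no per-track structure. That loss is exactly why flat bounces fall short for collaboration, and why the post argues for richer standardized session formats.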
Who Drives These Changes?
Several university research groups are working on solving real time collaboration problems. These include McGill University’s Shared Reality Lab, The MIT Media Lab, UCSD’s CineGrid, and Stanford CCRMA. With some of the best and brightest engineers and access to cutting-edge technology such as dark fiber connections, these institutions are pushing the envelope of what was thought possible.
Additionally, music and movie studios are pushing the boundaries of off-the-shelf and custom-designed hardware and software to create new workflows that enable faster, more cost-effective media production. DigitalFilm Tree, Maker Studios, and Skywalker Sound are all at the forefront here.
Companies such as Apple, Avid, Gobbler, and Steinberg, to name a few, are building new features into their products to reduce friction and expand the boundaries of what is possible today.
What’s Next? What Does the Future of Music Production Look Like?
As technology evolves and software developers invent and adopt new technologies, musicians and recording studios will continue to benefit. In the coming years, network bandwidth will increase and latency will decrease enabling new tools to be created which make collaboration easier.
It will be up to existing companies, entrepreneurs, and researchers to surface and adopt the best technologies to create the next generation of music production tools. And it is up to us, the musicians, to help drive this by encouraging companies to build them.
In the coming years, the boundaries between desktop, laptop, tablet, phone, and web-browser-based workflows will slowly blur together. Musicians will be able to move seamlessly between these tools without having to think about copying data, network latency, or the CPU and DSP power needed to record, edit, mix, and master.
Imagine recording a musical idea on your phone on the way to the studio, seamlessly moving it to your studio computer to continue working on it, having a band member listen and contribute from the web browser on their office computer, then streaming or sending the mix to a producer in another city, all without thinking about how any of it works. That is the future of music creation.