Published Dec 12, 2019

Humans are born to collaborate; we can't help but bounce our ideas off someone else every once in a while. An entirely new musical partner has been emerging recently, however, and it's not human. Artificial intelligence has played a significant role in music this year, and its influence is likely to spread over the coming years. If Grimes' recent comments are anything to go by, then A.I. and other technological advancements are soon going to make live music obsolete — although we're not too sure about that. It's too early to tell whether we're furthering creativity or relinquishing it, but the algorithms are here, and they're making waves — even if they are just sine waves.
Holly Herndon Collaborates with Spawn
Earlier this year, Holly Herndon released PROTO, an album created in collaboration with Spawn, a machine intelligence. (It's also one of our best dance and electronic albums of 2019.) Artificial or no, intelligence does need to be fostered — it's not as if Herndon plugged Spawn in and flew into a jam session; she had to teach it. To do that, she brought together vocalists and developers, who taught Spawn how to identify and reinterpret sounds over many singing sessions. The result is an oft-jarring cacophony of vocals, where Spawn's actual influence isn't immediately apparent.
On PROTO, you get the impression that the humans are still very much in control. No doubt, this is the biggest worry for some as we dive further down this technological rabbit hole, but Herndon is already jumping ahead of that conversation, as she notes in the album's press release, "I don't want to live in a world in which humans are automated off stage. I want an A.I. to be raised to appreciate and interact with that beauty."
YACHT Hand Over Control
Synth pop band YACHT (Young Americans Challenging High Technology) gave decidedly more control to A.I. programs for their recent Chain Tripping record. There's A.I.-generated artwork, band photos made with machine learning tools, and even strange A.I. fonts from an online program called Bit Tripper. These are all just trimmings, though: the album itself was also made using an A.I.
YACHT fed all of their songs from the last 17 years (82 tracks in total) into a series of machines that analyzed both their lyrics and melodies. The A.I. then wrote ten original songs based on all that information. The band weren't completely hands-off, however: they added nothing to the lyrics or sounds the algorithms produced, but they did allow themselves to edit, cutting pieces they didn't like. So, in a way, this is a YACHT album, but in an equally real way, it's not at all.
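YACHT haven't published the details of their pipeline, but one classic way an algorithm "learns" lyrics from a back catalogue is a Markov chain: record which word tends to follow which, then walk those links to spit out new lines. A toy sketch of that idea (the corpus and function names here are our own, not YACHT's actual method):

```python
import random

def build_chain(corpus):
    """Map each word to the list of words that follow it in the corpus."""
    chain = {}
    words = corpus.split()
    for current, following in zip(words, words[1:]):
        chain.setdefault(current, []).append(following)
    return chain

def generate(chain, start, length=8, seed=0):
    """Walk the chain from a starting word, picking each next word at random."""
    random.seed(seed)
    line = [start]
    for _ in range(length - 1):
        options = chain.get(line[-1])
        if not options:  # dead end: no word ever followed this one
            break
        line.append(random.choice(options))
    return " ".join(line)
```

Feed it 82 tracks' worth of lyrics instead of a sentence and you get output that sounds eerily like the band without ever repeating them verbatim — which is roughly the uncanny territory Chain Tripping lives in.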
Kıvanç Tatar Predicts Emotion
A little closer to home, Kıvanç Tatar, a recent postdoctoral graduate from Simon Fraser University in BC, has been working on an entirely new interactive A.I. model for music. Revive, his latest project, "explores the affordances of live interaction between the artificial musical agent MASOM, human electronic musicians, and visual generation agents," according to his website.
What makes Tatar's A.I. stand out from the rest is that it contains an emotion prediction algorithm: the system essentially calculates a perceived emotion in relation to sound. Tatar and his colleagues achieved this by running listening experiments in which humans labeled sounds with the emotions they perceived. Once they had gathered a dataset of these sound-affect values, they could train the machine learning algorithm and, ultimately, allow it to improvise on stage with human musicians. Upcoming Revive shows include dates in Braga, Portugal and Vancouver.
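Tatar's actual system is far more sophisticated, but the core idea — predict a new sound's emotion from human-labeled examples — can be sketched as a simple nearest-neighbor lookup. Everything below (the two-number "features," the labels, the function names) is invented for illustration:

```python
import math

# Hypothetical listener-study data: each sound is reduced to a feature
# vector (say, loudness and brightness), hand-labeled with an emotion.
labeled_sounds = {
    (0.9, 0.8): "tense",
    (0.2, 0.3): "calm",
    (0.7, 0.2): "warm",
    (0.1, 0.9): "eerie",
}

def predict_emotion(features):
    """Return the label of the closest labeled sound in feature space."""
    nearest = min(labeled_sounds, key=lambda f: math.dist(f, features))
    return labeled_sounds[nearest]
```

A loud, bright new sound lands nearest the "tense" example, so that's the emotion the machine "perceives" — and with enough labeled data, a real model can make that call fast enough to react to bandmates on stage.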
Endel Gets a Record Deal
A.I.s are becoming great collaborators, but they still need humans, right? Not according to Warner Music Group they don't. In March of this year, the label — home to Madonna, Ed Sheeran, and Coldplay — signed a bunch of code to their ranks. Endel, a sound startup run by a combination of scientists, artists and developers, created an algorithm that uses A.I. to make personalized audio tracks for the purposes of boosting people's mood and productivity.
To be clear, this is the first-ever record label contract with an algorithm, and it has released albums — feel free to check out records like Cloudy Afternoon and Foggy Morning. Ultimately though, Endel's future goals are far more ambitious, and just a tad bit creepy. In what seems like the beginning of a Black Mirror episode, the company would eventually like to keep tabs on the rhythms of users' daily lives, like their driving habits and events on their calendar, to create custom soundscapes. Good night, Dave!
Algoraves Reveal Code
Algorave (short for algorithmic rave) has been around for a while, but it's mostly been hovering on the outer circle of dance music, known to and performed by a very small subset of the electronic music community. With a slew of media coverage, a documentary by Resident Advisor, and some spots in mainstream festivals like SXSW, this seemingly esoteric genre has gained quite a bit of traction in 2019.
For anyone who's not familiar with the movement, it basically involves watching people use live coding to create music and sometimes visuals. Typically the raw code will be projected onto screens, so you can see how the music is being "played" in real time. This may just sound like a nerd's dream, and in some ways it absolutely is. Still, not only does algorave remove some of the mysticism behind electronic music by actually showing you how it's created, it can also be oddly off-the-cuff, as software crashes and algorithms run amok, while their masters attempt to fix problems with new lines of code. As niche as all this is, we've no doubt that algorave events will see an even bigger platform in the coming year.
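Algorave performers typically work in live-coding environments like TidalCycles or Sonic Pi, where a whole rhythm fits in one terse line of pattern notation. As a rough illustration of that pattern idea — this is a toy Python sketch, not any real tool's API — here's how a string like "bd sn ~ sn" (kick, snare, rest, snare) could become timed events:

```python
def pattern(spec, cycle_length=1.0):
    """Turn a space-separated pattern string into (onset_time, sample_name)
    events spread evenly across one cycle; '~' marks a rest."""
    steps = spec.split()
    step_length = cycle_length / len(steps)
    return [(i * step_length, name)
            for i, name in enumerate(steps)
            if name != "~"]
```

Retype the string mid-set and the groove changes on the next cycle — which is exactly the kind of edit the audience watches happen on the projected code, crashes and all.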