• The band YACHT, named for a mysterious sign seen in Portland around the turn of the century. YACHT / Google I/O 2019
  • YACHT's Claire Evans takes the stage not to rock out, but to talk out the band's new album leveraging artificial intelligence and machine learning. Google I/O 2019
  • Album art for Chain Tripping. YACHT / DFA Records

The dance punk band YACHT has always felt like a somewhat techy act since debuting in the early 2000s. They famously recorded instrumental versions of two earlier albums and made them available to artists under a Creative Commons license at the Free Music Archive. Post-Snowden, they wrote a song called “Party at the NSA” and donated the proceeds to the EFF. One of their album covers could initially only be accessed via fax, sent through a Web app YACHT built to identify the fax machine nearest a given group of fans (OfficeMax must've loved it). And singer Claire L. Evans literally wrote the book (Broad Band) on the female pioneers of the Internet.

So when Evans showed up at Google I/O this summer, we knew she wasn't merely making a marketing appearance à la Drake or the Foo Fighters. In a talk titled “Music and Machine Learning,” Evans instead walked a room full of developers through a pretty cool open secret that music fans wouldn't get to hear until this weekend: YACHT had spent the last three years writing a new album called Chain Tripping (out yesterday, August 30). And the process took that long because the band wanted to do it with what Evans called “a machine-learning generated composition process.”

“I know this isn't the technical way to explain it, but this allowed us to find melodies hidden in between songs from our back catalog,” she said during her I/O talk. “Here's what the user-facing side of the model looked like when we recorded the album last May—it's a Colab Notebook, not the kind of thing musicians usually bring into the studio.”

A look at YACHT's work with the MusicVAE Colab Notebook. YACHT / Google I/O 2019

YACHT had long been interested in AI and its potential applications in music. But the band tells Ars it wasn't until recently, around 2016, that the idea of doing a full album this way seemed feasible. Research entities had long been experimenting with AI and machine learning, letting computers autonomously generate music, but the results felt more like science projects than albums suitable for DFA Records (home to labelmates like Hot Chip and LCD Soundsystem). Ultimately, a slow trickle of simplified apps leveraging AI (face-swap apps felt huge around then, and Snapchat's dynamic filters were rising to prominence) finally gave the band the sense that now could be the time.

“We may be a very techy band, but none of us are coders,” Evans tells Ars. “We tend to approach stuff from the outside looking in and try to figure out how to manipulate and bend tools to our strange, specific purposes. AI seemed like an almost impossible thing; it was so much more advanced than anything we had dealt with… And we wanted to use this to not just technically achieve the goal of making music—so we can say, ‘Hey, an AI wrote this pop song’—rather, we wanted to use this tech to make YACHT music, to make music we identify with and feel comes from us.”

Bringing a Colab Notebook to a rock studio

Having the idea to use artificial intelligence to somehow make music was one thing; doing it proved to be something else entirely. The band started by surveying everything out there: “We messed around with everything that was publicly available, some tools that were only privately available—we cold-emailed every single person or entity or company working with AI and creativity,” as YACHT founder Jona Bechtolt puts it. But no single existing solution offered the combination of quality and ease of use the band had hoped for. So they ultimately decided to build their own system by borrowing bits and pieces from all over, leveraging their entire back catalog in the process.

One instrument of note

Looking through the liner notes for Chain Tripping, you'll find people thanked for “neurography” and specific bits of software shouted out. Among those, you might notice NSynth, one of the highest-profile releases from Google's Magenta team (which focuses on harnessing artificial intelligence for music). Ars tried using it last year to compose soundtracks for home movies, but it's a bit wonky and advanced for a non-musician (the tool essentially allows you to interweave two instruments to generate a new sound). In the hands of YACHT, however, you might find yourself grooving to the odd riffs it generates in the background of a track like “Blue on Blue.”
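For the curious, that interweaving looks roughly like this in code. This is a minimal sketch patterned after Magenta's public NSynth demos, not anything from YACHT's sessions; the checkpoint path and audio file names are placeholders.

import numpy as np
from magenta.models.nsynth import utils
from magenta.models.nsynth.wavenet import fastgen

CKPT = '/path/to/wavenet-ckpt/model.ckpt-200000'  # placeholder path
SAMPLE_LENGTH = 64000  # four seconds of 16kHz audio

# Load two source instruments as mono 16kHz audio.
guitar = utils.load_audio('guitar.wav', SAMPLE_LENGTH, sr=16000)
flute = utils.load_audio('flute.wav', SAMPLE_LENGTH, sr=16000)

# Encode each sound into the WaveNet autoencoder's latent space.
enc_guitar = fastgen.encode(guitar, CKPT, SAMPLE_LENGTH)
enc_flute = fastgen.encode(flute, CKPT, SAMPLE_LENGTH)

# "Interweave" the instruments: average the two embeddings.
blend = (enc_guitar + enc_flute) / 2.0

# Decode the blended embedding back into audio (slow without a GPU).
fastgen.synthesize(blend, save_paths=['hybrid.wav'], checkpoint_path=CKPT)

The result is less a crossfade than a brand-new timbre, which is where the wonkiness comes in.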

“A lot of these music-making tools right now are made by engineers who love music, but they're made by engineers,” Evans adds. “So they're often in love with the math in this way that doesn't ultimately take into consideration that the audio output of these tools isn't objectively very impressive. You can have this incredible piece of tech that uses advanced ML techniques to split the difference between two different sounds, but what if the output sounds like a fart?”

Ultimately, YACHT made it work by embracing that, er, fart-iness. (“The NSynth for us, we thought it sucked at first,” Bechtolt admits.) Rather than treating the NSynth as something that could replicate or replace a traditional guitar or even a synth within a composition, the band embraced its oddity and found more success. Bechtolt notes music has a long legacy of this type of repurposing: the 808 drum machine didn't sound like real drums, but its unique sound ultimately spawned entire genres. Not that the band expects the NSynth to have that kind of legacy.

“It's not good at what it's trying to do; it's good at something it didn't set out to do—that's what's interesting,” Evans adds. “It sounds wonky, reedy, lo-fi, and kind of shitty, but in a way that speaks to us as lo-fi, DIY artists.”

“We knew we'd have to base everything on some kind of dataset, so early on, we thought, ‘What if we used our back catalog?’” Bechtolt says. “We naively thought it'd be something like Shazam, where we could throw raw audio at an algorithm. That isn't really possible…”

“Or, at least, not within the realm of our computing capacity,” Evans interjects.

“So we had to notate all our songs in MIDI, which is a laborious process,” Bechtolt continues. “We have 82 songs in our back catalog, which is still not really enough to train a full model, but it was enough to work with the tools we had.”
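For illustration, assembling that kind of corpus with Magenta's open source note-seq library might look something like the sketch below; the folder name is hypothetical, and this is a guess at the shape of the workflow rather than the band's actual pipeline.

import glob
import note_seq

# Convert each hand-notated song into a NoteSequence, the protobuf
# format Magenta's models consume.
corpus = []
for path in sorted(glob.glob('back_catalog_midi/*.mid')):
    seq = note_seq.midi_file_to_note_sequence(path)
    # Snap notes to a 16th-note grid so extracted loops line up cleanly.
    corpus.append(note_seq.quantize_note_sequence(seq, steps_per_quarter=4))

print(len(corpus), 'songs notated')  # 82, in YACHT's case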

With that MIDI data, Bechtolt and longtime collaborator Rob Kieswetter (bass and keyboards) started by identifying small segments—a particular guitar riff, a vocal melody, a drum pattern, anywhere from two bars to 16 bars—that could be looped, combined, and ultimately run through the band's simplified AI and ML model. The band relied heavily on Colab Notebooks in a Web browser—specifically, the MusicVAE model from Google's Magenta team—manually inputting the data and then waiting (and waiting) for a fragment of output. That AI/ML-generated fragment, of course, was nothing more than data: more MIDI information. Evans told the I/O crowd the band ran pairs of those loops through the Colab Notebook at different temperatures (the parameter controlling how random the output gets) “dozens, if not hundreds of times to generate this massive body of melodic information” as source material for new songs. From there, it became the humans' turn.
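Magenta publishes the model YACHT used, so the core of that loop-pairing step can be sketched with its public API. The checkpoint and MIDI file names below are placeholders, and the band's actual notebook surely differed in the details.

import note_seq
from magenta.models.music_vae import configs
from magenta.models.music_vae.trained_model import TrainedModel

# Load a pretrained 2-bar melody MusicVAE (checkpoint path is a placeholder).
model = TrainedModel(
    configs.CONFIG_MAP['cat-mel_2bar_big'],
    batch_size=4,
    checkpoint_dir_or_path='/path/to/cat-mel_2bar_big.ckpt')

# Two loops from the back catalog, notated as short monophonic MIDI melodies.
start = note_seq.midi_file_to_note_sequence('loop_a.mid')
end = note_seq.midi_file_to_note_sequence('loop_b.mid')

# Walk through the latent space between the two loops. Higher temperature
# means more randomness; rerunning this at different temperatures, dozens
# or hundreds of times, builds up a large pool of candidate melodies.
fragments = model.interpolate(start, end, num_steps=8, temperature=0.8)

# Each output is itself just MIDI data, written out for the studio.
for i, seq in enumerate(fragments):
    note_seq.sequence_proto_to_midi_file(seq, 'fragment_%02d.mid' % i)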

“It still couldn't make a song just by pushing a button; it was not at all an easy or fun flow to work through,” Bechtolt says. “So after three days, we were like, ‘OK, I think we have enough stuff.’ By that point we had a few thousand clips between two and 16 bars, and we just had to call it quits at some point.”

“It wasn't something where we fed something into a model, hit print, and had songs,” Evans adds. “We'd have to be involved. There'd have to be a human involved at every step of the process to ultimately make music… The larger structure, lyrics, the relationship between lyrics and structure—all of these other things are beyond the technology's capacity, which is good.”


Evans demonstrates how, with MusicVAE, YACHT could take two old songs and generate new ideas. Here, the band's tracks “Holograms” and “I Wanna Fuck You Til I'm Dead” generate a new melody that ends up on Chain Tripping.

So with their troves of data, the band hit the studio and started interpreting the computer-generated information with instruments in hand. They also adopted one rule from the start of the studio sessions: “We were very strict about not adding anything ourselves,” Bechtolt says. “We weren't going to improvise or jam on top of algorithmic output; we were only to use things it generated.” In case of emergency, they could revisit their Colab Notebooks—if, say, none of the files had the right melody, the band could take parts of old songs and “finagle the algorithm to make this happen”—but artificial intelligence had to be a constant collaborator on the compositions.

“We usually go into the studio as human beings with a notebook full of lyrics, ideas, and a few riffs we've been thinking about. But to build up the repository of information we were working with… in two seconds, we could have 10,000 words of lyrics. In a few minutes, we had hundreds of four-bar segments of MIDI data,” Evans says. “There's so much stuff on the cutting room floor, cutting room hard drive if you will, that may go into future albums or just simply disappear into the multidimensional mathematical obscurity from which it emerged.”
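The band doesn't name the text model behind those lyrics here, but even a toy word-level Markov chain (plain Python, no ML framework, offered purely as a stand-in illustration) shows how a model trained on a small corpus can pour out thousands of words in seconds. The lyrics file below is hypothetical.

import random
from collections import defaultdict

def build_chain(text, order=2):
    # Map each `order`-word prefix to the words that follow it in the corpus.
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, n_words=10000, order=2):
    # Random-walk the chain to produce n_words of pseudo-lyrics.
    state = random.choice(list(chain))
    out = list(state)
    while len(out) < n_words:
        followers = chain.get(tuple(out[-order:]))
        if not followers:  # dead end: jump to a fresh random state
            out.extend(random.choice(list(chain)))
            continue
        out.append(random.choice(followers))
    return ' '.join(out)

corpus = open('yacht_lyrics.txt').read()  # hypothetical back-catalog lyrics
print(generate(build_chain(corpus))[:300])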

In this light, Chain Tripping challenged YACHT to be less a traditional band at times and more a DJ or mash-up artist, taking existing bits of sound (er, computer-generated instructions for sounds) and combining them in creative ways. “It's not unlike making hip-hop music or DJing,” Bechtolt admits. “But instead of crate digging and finding samples on different records, it was all found in the latent space of our own music and output, which is really trippy to think about.”

The band also utilized AI when making the video for the album's first single, "Downtown Dancing."

But does it sound good or math-y?

AI-generated music has been done before, from researchers teaming up with orchestras a decade ago to YouTubers and startups more recently. But YACHT's Chain Tripping may be the highest-profile traditional album yet to be released after being fully composed with AI/ML.

Yet, the first time anyone listens to Chain Tripping, especially fans of the band's prior work, I think they'd be hard-pressed to know anything unusual happened behind the scenes. Bass riffs on “Downtown Dancing” or “DEATH” will make you move like the best of what you can find at DFA Records. You leave singing to yourself, “California Dali, th…”