This spring, during Mission Creek Festival, a lucky few were treated to a unique sci-fi dining experience. The Sci-Fi CoLab brought together dancing, storytelling and food to take attendees on an otherworldly journey. Providing the soundtrack was electronic musician Brendan Hanks.
Hanks’ new record, Generations, which drops Friday, mines that science fiction experience. The full-length album, which Hanks wrote and produced, consists of just four tracks: “Our Futures,” “Your Exile By Algorithm,” “Our Furies” and “Your Infinite Exile.” Tracks one and three were co-produced by Philip Rabalais (Trouble Lights, Utopia Park); Phil Young (Tires, Anna Libera, Night Stories) provided additional production on “Your Exile By Algorithm.”
Little Village is pleased to premiere “Your Exile By Algorithm” today. It’s a sci-fi collaboration, of sorts — Hanks produced the track by feeding one of his earlier recordings, “Your Exile,” through an algorithm in the program Ableton, which converted his composition into MIDI notes that were then played back on a digital piano.
“Other than cleaning up the most sour bits,” Hanks said in an email, “this is entirely a machine’s interpretation of my music.”
The title is twofold: Quite literally, of course, it is Hanks’ track, “Your Exile,” as interpreted by an algorithm. But it also forces the listener to confront the possibility of being exiled by an algorithm — of the way a musician might be pushed aside when algorithms take over the interpretation of music.
Hanks answered a few questions over email on the ideas and philosophy behind the track and the album.
You mentioned that this was music created for the Space CoLab. Was it that science fiction context that interested you in the idea of feeding your track into an algorithm for “Your Exile By Algorithm”?
The story that Richard Siken read at the Space CoLab event entails a generation ship (a hypothetical means of transporting humans to distant stars) and a class of robots charged with absorbing the negative emotions of humans in order to make the trip possible. As such, it seemed appropriate to let a machine assist me in writing the music over which he would be reading. I have used different methods to do this in the past, such as defining a rhythm, allowing the computer to choose pitches and then fitting them to a traditional scale. Feeding a track into this algorithm in Ableton wasn’t something I had done before, and was more of a “let’s see what happens” kind of decision, the result of which disarmed me enough that it immediately felt like the correct choice.
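Hanks doesn’t spell out his implementation, but the method he describes — fix a rhythm, let the computer pick pitches, then fit them to a traditional scale afterward — can be sketched in a few lines of Python. Everything here is assumed for illustration: the function names, the pitch range and the choice of C major are not his.

```python
import random

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # pitch classes of one traditional scale

def quantize_to_scale(pitch, scale=C_MAJOR):
    # Snap an arbitrary MIDI pitch to the nearest pitch class in the scale,
    # staying within the same octave.
    octave, pc = divmod(pitch, 12)
    nearest = min(scale, key=lambda s: abs(s - pc))
    return octave * 12 + nearest

def computer_chosen_line(rhythm, low=48, high=72, seed=None):
    # For each duration in a fixed rhythm, let the computer choose a random
    # MIDI pitch, then fit it to the scale after the fact.
    rng = random.Random(seed)
    return [(quantize_to_scale(rng.randint(low, high)), dur) for dur in rhythm]
```

Quantizing after the random choice, rather than constraining the choice up front, mirrors the order of operations Hanks describes.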
Are the other three tracks more standard recordings, or do they experiment with mechanical interpretation as well?
“Your Infinite Exile” is another iteration of “Your Exile” derived from an algorithm, this one being the famous PaulStretch algorithm (if you’ve listened to a YouTube video that’s like “JUSTIN BIEBER SLOWED DOWN 800% IS AMAZING” then you’ve heard its effect, though my personal favorite is “Don’t Fear The Reaper,” which ends up sounding like a lost Tim Hecker album). As the story wraps up with the ship crossing a threshold, it felt appropriate to emulate the closest thing we can experience to time dilation.
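For readers curious how that extreme stretching works: Paul Nasca’s PaulStretch keeps each analysis window’s magnitude spectrum but discards its phase information, which smears transients into a continuous wash. A heavily simplified sketch of that idea in Python — not the actual PaulStretch implementation, and not anything from Hanks’ session — might look like this:

```python
import numpy as np

def paulstretch_like(samples, stretch, window_size=4096):
    # Simplified sketch of extreme time-stretching in the spirit of
    # PaulStretch: window the input, keep each window's magnitude
    # spectrum, randomize the phases, resynthesize, and advance through
    # the input more slowly than through the output.
    window = np.hanning(window_size)
    hop_out = window_size // 2
    hop_in = hop_out / stretch  # smaller input hop => longer output
    n_frames = int((len(samples) - window_size) / hop_in)
    out = np.zeros(n_frames * hop_out + window_size)
    pos = 0.0
    for i in range(n_frames):
        start = int(pos)
        frame = samples[start:start + window_size] * window
        spectrum = np.fft.rfft(frame)
        # Keep magnitudes, scramble phases: this is what smears
        # transients into a continuous ambient wash.
        phases = np.random.uniform(0, 2 * np.pi, len(spectrum))
        frame = np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases))
        out[i * hop_out:i * hop_out + window_size] += frame * window
        pos += hop_in
    return out
```

With a stretch factor of 8, a one-second input comes out several times longer; the randomized phases are what give the result its characteristic blur.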
“Our Futures” is a fairly standard recording — I had a sample of the trains coming into the yard near my house that felt like a good base and built an ambient track from there. “Our Furies” was meant to be a shadow version of that: I re-keyed all the pitches to the relative minor, subbed a fog horn for the trains, and recorded some modular synth patches I had been working with in order to lean into the idea of robots behaving in unexpected ways.
What are your overall thoughts on computerization and art? Where does “Your Exile By Algorithm” fall, conceptually, on a scale of zero to Hatsune Miku?
It’s probably closer to Hatsune Miku than I realized! The original “Your Exile” was recorded by me sitting down with a synthesizer, playing chords and manipulating values to get different sounds out of it, while “Your Exile By Algorithm” is the result of a piece of software mapping frequencies to MIDI data. The funny thing to me about these tracks is that if you played them to someone with no prior knowledge and asked them which sounded more “human,” I’m certain they’d pick the algorithmically generated one.
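The frequency-to-MIDI mapping Hanks mentions rests on a standard conversion: MIDI note 69 is A4 at 440 Hz, and each semitone is a factor of 2^(1/12) in frequency. A minimal sketch of that general formula (not Ableton’s implementation):

```python
import math

def freq_to_midi(freq_hz):
    # Standard equal-temperament mapping: A4 = 440 Hz = MIDI note 69,
    # and each semitone multiplies frequency by 2**(1/12).
    return round(69 + 12 * math.log2(freq_hz / 440.0))
```

So 440 Hz maps to note 69 and middle C (about 261.63 Hz) to note 60; an audio-to-MIDI converter pairs a mapping like this with pitch detection on the recorded signal.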
I think of algorithmic generation and other forms of computer-assisted/generated art as just another tool — if you built the algorithm then you made artistic decisions in its design, and if you’re using someone else’s then you’ll make editorial decisions about what to include and what to discard (and in this case making no decision is also making a decision). You can end up with something that sounds good or you can end up with crap.