#671: “Electronauts” Democratizes Remix DJing & Musical Improvisation

Electronauts is a cross between a simplified DJ simulator and an improvisational music tool from Survios that is launching today. It's releasing with over 40 electronic songs that let you step into the DJ booth and control the flow of the music. Each song has been broken down into component parts in a way that lets the DJ add and subtract musical instruments, as well as layer in new looping melodies played on a variety of synthesized instruments. Survios has found a careful balance between providing the underlying architecture of a pre-authored song and giving the user a lot of their own generative agency to explore, improvise, and control the overall flow of the song.

I had a chance to talk with Survios co-founder and president Nathan Burba about their journey toward democratizing DJing and musical improvisation, collaborating with musical artists, developing a customized Music Reality Engine™, and some of the VR design considerations behind creating an intuitive and empowering experience for musical remixing. Electronauts was inspired in part by RiP! A Remix Manifesto, a documentary about remix culture featuring artists like Girl Talk. VR is helping to transform passive media consumption into active participation through remix culture, which Electronauts' user interface simplifies and makes more accessible.


Here’s a video of Electronauts produced by LIV:

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality & Summer Trip

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. So if there's one consistent thing that virtual reality is doing across all media, it's that we're going to be having media that's much more immersive and interactive. And so what does that look like for music? Well, Survios has done a number of different experiences, from Raw Data to Sprint Vector, and now they're coming out with Electronauts, which allows you to step into the DJ booth and tries to democratize access to something that is normally very complicated. Survios has come up with this Music Reality Engine, which is able to break down songs into their component parts, and they're trying to give you access to being able to improvise in the music booth and to remix a song. So I had a chance to sit down with the co-founder and president of Survios, Nathan Burba, to talk about their journey of creating Electronauts and everything that it took on the back end in order to even make this possible. So that's what we'll be talking about on today's episode of the Voices of VR podcast. So this interview with Nathan happened on Monday, July 30th, 2018. So with that, let's go ahead and dive right in.

[00:01:18.352] Nathan Burba: I'm Nathan Burba. I'm the co-founder and president of Survios. Electronauts is a project that we've been working on for a long time. It actually started as a prototype back in early 2016, and we are finally, finally getting close to shipping it, and it's really, really exciting.

[00:01:35.108] Kent Bye: So yeah, I had a chance to play it at GDC, and I just actually spent about an hour in it now, just playing around. What it seems like you've been able to do is translate the process of being a DJ into not necessarily a game, but more of an optimized VR experience. So maybe you could tell me a bit about how you think of what you've created?

[00:01:50.978] Nathan Burba: Yeah, so it's kind of like a sonic playground, essentially. This came from myself not being a musician, not being a DJ, and, to be honest, just being too lazy to want to actually learn how to use the complex interfaces that other people are using, and to get good at tempo and rhythm and all of those things that separate a musician from someone who's not a musician. So we wanted to create technology that would empower people to essentially be a musician without having to practice for years or learn a whole new complex interface. And so, yeah, Electronauts basically lets you do that. It's a whole entire DJ kit, but what you do is you DJ with pre-existing songs that are done by some of the top artists today. You basically go in, and it's kind of just like a magical musical experience. It's kind of hard to describe, honestly. We go back and forth between: is it a tool, is it an app, is it a game, is it an experience? And we're honestly still not even sure. We know it's damn cool and it's really, really fun, but it's very much a kind of weird, next-generation virtual reality experience. And it's very powerful. It's one of those things where, if you like a song, you might go in there and spend 30 minutes just playing around inside of that song. It contains a number of different elements. Some of them are a little bit more traditional, kind of like track-switching DJ-type elements, and some of them are more musical, instrumental elements. The ability to actually play different instruments in real time and feel like you're doing a guitar solo off the top of your head is a really cool aspect of the experience. Actually, tell me a little more: what songs did you play? What did you think? What was your impression?

[00:03:28.168] Kent Bye: Well, I first played it at GDC, and it was working then, but it feels like the level of polish you've reached now is such that you're able to jump into any song if you know the basic structure of crescendo and decrescendo. After playing it at GDC, I had an idea of the user interface, but I also went through the tutorial. I have to say that a lot of the music creation tools out there are rhythm games, like AudioShield or Soundboxing or Beat Saber, where you're going into a game that's already been created and having an embodied experience of the music coming at you. This seems to be less of a game where you're playing a run and more of an open-ended creation tool where you can modulate the song as you're playing it. If you understand music enough, you can play with adding a little melody, or just adding different musical elements, adding and subtracting different tracks, and trying to create an overall flow of the music.

[00:04:32.221] Nathan Burba: Exactly. It's very much about giving someone control of the music. So I was inspired by this documentary called RiP! A Remix Manifesto. It's about Girl Talk and remix culture, but there was a really interesting restatement of the thesis at the end of it, which was basically this idea that a record is a concept that is not music, right? That concept arrived with the phonograph, basically the first recording of music. But music has existed for all of human history, way before that. And before you could record music, music was a live, dynamic thing. Every time you heard a song, it was different, because someone had to play the song for you, or you had to play it for yourself, or you had to sing it to yourself. And so the big trend we're seeing is that music needs to get more dynamic. Music needs to become something that people actually take ownership of. Basically, every time you listen to a song or play a song, it should be dynamic. And to do that, we really had to take the concept of what a song is and just blow it apart, explode it into all of its requisite pieces. If you know much about game development and virtual reality development, a lot of it is about chopping everything up into pieces and then using software to recombine them in new and interesting and dynamic ways. And that's essentially what this is with music. We take all of the different pieces of a song: the backing track, the vocals, the lead instruments, the bass guitar, every little piece, even the percussion set. Whether it's the kick drum, the snare, the clap, whatever it is, we take all those pieces individually, and now you have finite, very specific control over each one. So you're basically building a completely new remix of the track in real time, and you're essentially now the musician. If you take the Chainsmokers track, for example, it's like you're the Chainsmokers now. You're basically them going up on stage, and essentially you can play the artist's music. Whatever it is, you feel like them. You feel like you have that full control. And then the way people play the game is pretty open-ended, but the best way to play it, in my opinion, is just to hang out with some friends, put it on a television or some speakers, and take turns being the DJ for a party. It really is a party game. It's one you want to play on a Friday or Saturday night with a bunch of friends over. And this gives you that control where you're not just saying, hey, I'm going to choose a song, or I'll choose the playlist of songs. It's not even, I'm going to have two turntables and mix the songs and create the transitions. This is: let me play a guitar solo for you. This actually makes you feel like you're the cool guy with the acoustic guitar by the fire out in the woods. You can actually have that agency over the music. And it's a feeling that I think everyone has wanted to feel at some point in their life. We really think this gives you that feeling.

[00:07:20.523] Kent Bye: Yeah, I played a number of different tracks, and the thing that I found was that you're trying to find this sweet spot between agency and having a set of pre-configured elements that make up a song. There's a distinct difference between being in different songs, and in the agency and kind of free will that you can express by playing a harpsichord or the orbs, or doing things with the sound effects. So there are these different ways that you can express your individuality and your agency, but it's still within the context of an overall song that has already been created outside of the context of virtual reality. You're importing the synths and the loops and the lyrics and all the components of the song, so you still have that sense that this is actually a different song and it sounds completely different, but within that context you're able to just kind of jam out and improv. That's what I found myself doing: recording these different loops on the orbs, turning them off and on, and really feeling like I was in a lot more control of the flow of the song without knowing much about the song. And once you know the structure of the different parts of the song, you can use your intuition to move between them. So I guess the most difficult thing was probably finding that balance between the authorial control and the generative agency that you're giving someone. And it sounds like you've built your own engine to facilitate this, with MIDI or stems or whatever you're using, to take the component parts of the song and then provide the framework, within the context of this game, to play the whole song and modulate different aspects of it.

[00:09:04.468] Nathan Burba: Exactly. You're able to dynamically control that flow, and, in the instance of the orbs and the laser harp and some of the immediate instruments like the synth ball, you're able to feel as if you are the musician playing the actual notes of the song itself. One thing that is amazing, too, is when you get in there with two players and you start playing together. You kind of have a duet: you're playing call and response, someone else is doing the drop, and you're syncing it up by playing a solo when the drop happens. It's a mode of expression, kind of like how Tilt Brush is a mode of expression where you're creating a piece of art and then someone can go inside of it. This is like you're creating the music, but you're creating it more through time, so you're able to express yourself through a performance. It's really a performance tool, is the way I think about it. And you mentioned the engine that we created. So we built what's called the Music Reality Engine, which is an engine that actually sits separate from Unity; Unity connects and talks to the Music Reality Engine. That engine is what holds all of the musical elements and then generates the audio that the game spits out, because there are a lot of very dynamic things that have to happen at very low latency. You have to have a lot of different sound channels and different pieces loaded in on very short order, and it all has to run in real time, in virtual reality, on a modern system today. So the Music Reality Engine is basically the core of the product, and we actually think it has applications outside of this particular product as well. Now that we've built this thing that can control music at the stem level, we're starting to see all of the possibilities, whether it's dynamic AI-based generation of music or other form factors. We actually had other versions of this game: we had a mobile version at one point that wasn't even in VR, and we had a version with a controller that was just a basic prototype. It's kind of its own technology. That's what makes it so exciting, and it's honestly what made it take so long to make the thing in the first place.
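
To make the engine's job a little more concrete: Survios hasn't published the Music Reality Engine's internals, but the core behavior Burba describes, stems toggling on and off with every change landing on the musical grid, can be sketched in a few lines. This is a toy illustration under assumptions, not their implementation; the names, the 128 BPM, and the sample rate are all invented.

```python
import numpy as np

SR = 48000                        # sample rate in Hz -- assumed
BPM = 128                         # fixed per song in the format Burba describes
BAR = int(SR * 60 / BPM) * 4      # samples per 4/4 bar

class StemMixer:
    """Toy stand-in for one job of a stem-level engine: sum whichever
    stems are active each audio block, applying mute/unmute requests
    only at bar boundaries so edits always land on the musical grid."""

    def __init__(self, stems):
        # stems: dict of name -> 1-D float32 loop, all the same length
        self.stems = stems
        self.active = {name: True for name in stems}
        self.pending = {}         # toggles waiting for the next bar line
        self.pos = 0              # playhead position, in samples

    def toggle(self, name):
        # Requested now, heard at the next bar boundary.
        self.pending[name] = not self.active[name]

    def render(self, block_size):
        out = np.zeros(block_size, dtype=np.float32)
        loop_len = len(next(iter(self.stems.values())))
        for i in range(block_size):
            # Apply any queued toggles exactly on the bar line.
            if (self.pos + i) % BAR == 0 and self.pending:
                self.active.update(self.pending)
                self.pending.clear()
            for name, buf in self.stems.items():
                if self.active[name]:
                    out[i] += buf[(self.pos + i) % loop_len]
        self.pos += block_size
        return out
```

A real engine would run vectorized native code on a dedicated audio thread to hit VR-grade latency; the point here is only that "remixing" reduces to per-stem gates evaluated against a shared musical clock.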

[00:10:54.845] Kent Bye: So I know that there are technologies like MIDI. Is this using a MIDI format to break down what are essentially waveforms into more of a symbolic, digital representation? Because you're trying to quantize the music to be able to dynamically switch it, and in order to do that, you kind of need some way to programmatically describe the music rather than just using waveforms. So are you using MIDI, or what kind of formats are driving it?

[00:11:19.937] Nathan Burba: Yeah, MIDI is a file format standard. I don't think we're actually using MIDI anywhere in this. If we wanted to make full-blown mixes that you could create and export, we might use MIDI for that. But really, this was all about creating a pretty simplistic standard. Simplistic from a musical standpoint, in the sense that you couldn't apply every song to it, but you can apply most songs to it. We created a standard that involves a number of instruments and a number of tracks. Everything is in 4/4 time. You have different tempos, but you can't change the tempo, for example. You can have up to six tracks, up to nine stems. When you put all those things together, it's basically its own music format that we've created. I mean, it's not anything more complicated than a set of folders with files in them, with certain naming conventions, and then we have some data files, some JSON files, that have information about the song itself. It's something that we want to standardize a little bit more moving forward, but it's much more constricting than something like MIDI. MIDI is much more open and can be used for virtually anything, and this is not about that. This is not about creating a song as the end result. It's about creating something that is easy to use for the user. So it's simplified; it's very much on rails. The format is very rigid, so that if you remember the six tracks from song one, you go to song two and it has six tracks as well. It has nine stems, or fewer, as well. It has a few instruments as well. The idea is that once people learn the interface, the end users, the people who want to be musicians and DJs, they know how to play every song, even if they've never heard it before. So now, as a musician and a music producer, you can create those tracks, and people can immediately come in and DJ to them, which is very, very empowering. It's a fairly rigid standardization that allows for that level of fun to be had.
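
Burba describes the format only loosely: folders of WAV files with naming conventions, JSON metadata, a fixed tempo, 4/4 time, and hard limits of six tracks and nine stems. Here is a purely speculative sketch of what such a song descriptor might look like; every field name is invented, and how stems relate to tracks is a guess. Only the constraints come from the interview.

```python
import json

# Hypothetical song descriptor -- the real Survios schema is not public.
song = {
    "title": "Example Track",
    "artist": "Example Artist",
    "bpm": 128,               # fixed: the format doesn't allow tempo changes
    "time_signature": "4/4",  # everything in the catalog is 4/4
    "tracks": [               # up to six tracks the player steps through
        {"name": "intro", "stems": ["drums.wav", "bass.wav", "pads.wav"]},
        {"name": "verse", "stems": ["drums.wav", "bass.wav", "lead.wav"]},
        {"name": "drop",  "stems": ["drums.wav", "bass.wav", "lead.wav", "vox.wav"]},
    ],
}

def validate(song):
    # Enforce the limits Burba mentions in the interview.
    assert song["time_signature"] == "4/4", "format requires 4/4 time"
    assert len(song["tracks"]) <= 6, "at most six tracks per song"
    for track in song["tracks"]:
        assert len(track["stems"]) <= 9, "at most nine stems"

validate(song)
print(json.dumps(song, indent=2))
```

The rigidity is the design point: because every song obeys the same shape, an interface learned once works on every song in the catalog.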

[00:13:12.008] Kent Bye: So can you talk a bit about the process of collaborating with these different musicians? What do they need to do with their digital audio workstations, their DAWs, to translate the music they've already created into the specific format of this Music Reality Engine?

[00:13:29.078] Nathan Burba: That's a great question. So I will preface this by saying that one thing we wanted to do in the game was have what we call mod support, which is essentially the ability for you to put any music that you have, or that you can make, into the game. The game is not shipping with that, but man, we really want to get that in there. We're hoping that if there's enough community engagement around it, we'll be able to put mod support in a future update. A lot of that is about translating our internal process into something that's a little bit more streamlined and, you know, idiot-proof, for lack of a better word, so that you can just drop in files and folders and it just works. Right now, it's a little bit more involved. Typically, it involves having your entire song laid out in something like Ableton or one of the other DAWs; Ableton is the one we use most commonly here. Then you have to have all of the elements of the song broken up into their constituent parts. So you don't want to have one percussion track; you have to actually break the percussion up into its various elements, which are then arranged on the grid in Ableton. And then you essentially have to chop all those pieces up into their individual wave files. They have to be timed correctly, so that if they were pieced back together, they would reconstruct the song, without extra space at the beginning or the end of the wave files. But you might have to make some new elements as well. Once you play the game, you get a sense of what's in there, but some of the songs that we've adapted might not have enough material for two playable lead instruments, or they might have a guitar that can't be isolated because the stem doesn't exist, because the artist didn't have those files. So we have to remake the guitar to put it into the game, with a guitar that sounds very similar to what was in there, but it's not exactly the same. So you have to go through that process of cutting up those individual elements. You might only have enough material for, say, three grenades, right? Three audio grenades. So you have to make up two more. Fitting a song into that format sometimes necessitates creating a little bit more material. But from a general standpoint, as long as your song is in 4/4 time, as long as it's fairly dynamic, with a number of different elements going on, it's not super repetitive, there's some uniqueness to the individual elements, and you have all the original source material, it can be chopped up pretty well and put into that format. And then where it gets really fun is when people make music specifically for the format. That's something we've done internally with our musicians. We haven't done it with external musicians, but now that we're getting the game out there and getting this format a little bit known, when people make things directly for this and start hacking the systems to make something incredible, it gets even more interesting. And not to get too technical, but working with the artists has been a blast, and it's been a learning process. We've worked with artists that are huge, artists that are not very well-known, artists that are incredibly technical. Other artists just send us their stems and they're like, have fun.
If you get something working, great. If not, don't worry about it. It really depends on everyone's process. But at the end of the day, where it's most advantageous is with the artists who are the most technical. The ones like ODESZA, or Tipper, really the people who actually do live improvisation at their shows. The stuff they do translates really well into Electronauts, because this is a live improvisation musical toolkit that's very much inspired by some of what those artists do. I think I saw Aphex Twin at LIB, and I could be misremembering this, but they had a really incredible improvisational thing they were doing on stage, where they would basically do something similar to playing the orbs, but with these pads that they had. That kind of stuff, for example, would work really well in Electronauts.
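
The chopping step Burba describes is the kind of thing you could script. As a rough sketch, here is how slicing one exported stem into bar-aligned loops might look with the pydub library; the filename, BPM, and loop length are placeholders, and this assumes the export already starts exactly on beat one with no silence padding, as he notes the engine requires.

```python
from pydub import AudioSegment

BPM = 128
BAR_MS = 4 * 60_000 / BPM          # one 4/4 bar, in milliseconds

def chop_stem(path, bars_per_loop=4, out_prefix="loop"):
    """Slice a stem into fixed-length loops with no leading or trailing
    silence, so the loops reassemble back-to-back without gaps."""
    stem = AudioSegment.from_wav(path)
    loop_ms = bars_per_loop * BAR_MS
    for i in range(int(len(stem) // loop_ms)):
        start, end = round(i * loop_ms), round((i + 1) * loop_ms)
        stem[start:end].export(f"{out_prefix}_{i:02d}.wav", format="wav")

chop_stem("lead_guitar.wav", bars_per_loop=4, out_prefix="lead")
```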

[00:17:01.079] Kent Bye: Well, I'm a fan of people like Ulrich Schnauss, and I've seen him in concert a number of times, and it's a little bit of a letdown, because I love the music, but when I was watching live performances by some of these electronic dance musicians, specifically of the shoegaze variety, you'd be looking at this Ableton Live controller, which is basically a square grid of buttons, and they're pushing buttons, and that's the song, you know. I think the exciting thing about Electronauts is that it translates the abstractions of playing electronic music into more embodied movements, such that you actually move your entire body and swing your arm to push buttons. And I think that will do a couple of things. One is that as people are watching you, they'll get a better sense of the improv that you're doing, especially when you start to play the orbs and other melody parts, play a little bit, and maybe loop it. Then people can see what's happening. But there's also this live mixed reality component, where now all of a sudden you're able to step behind and see into the DJ booth and see somebody's body as they're playing the music. It's taking something that has been a little bit abstracted when it comes to electronic dance music and putting embodiment and natural movements back into it. And I think there's going to be this kind of new performative element to that.

[00:18:20.857] Nathan Burba: I completely agree. When you really look at this from a high level, it really is a new musical augmented reality interface. We have it embodied in a virtual reality world, but this is one that could live just as nicely in augmented reality. It's really awesome when you see someone doing a guitar solo and you can see exactly what they're doing. It's so much fun in the moment to be watching them and following along. And so the idea here, you're exactly correct, is to take that kind of obscure thing; there's an SNL sketch about this with Andy Samberg, about the DJ dropping the bass or whatever, making fun of how no one knows what he's doing up there. It's about democratizing music, and it's about disambiguating that, so that people do know what's going on and you are able to perform in that way. What I'm excited about is when you have an augmented interface where you can press a button and it just pops up all around you, and now you're jamming out with your friends with a completely virtual interface, Tom Cruise Minority Report style, or more like Tony Stark in Iron Man 2. You have an interface like that that's visible everywhere and everyone can touch it, but you can do it without any musical expertise. To me, we're going to get to that future, and virtual reality is the technology that we incubate those interfaces in.

[00:19:37.646] Kent Bye: Yeah, I think the big challenge for an experience like Electronauts is this trade-off between making it easy for people to use, especially people using it for the first time, versus the complexity and robustness you get from being able to control things at a very fine-grained level. So I imagine there have been a lot of trade-offs you had to make where you're trying to create a seamless user experience for someone who's essentially a non-professional musician, giving them the experience of being a musician, versus somebody who may be a professional musician, or who wants to get to the point of having a little bit more fine-grained control. Just as an example, and I don't know if this was already there and I just didn't find it, but when I would play the orbs, sometimes I just wanted to turn the volume of the orbs down a little bit. I didn't see a master mixer level to tune that down so that I could listen to the other parts. I still wanted it there, but I just didn't want it at the same volume. And I didn't know if the volume control was there and hidden in some interface, or if it's simply not possible to do that level of mixing. So I'm curious if you could take that as an example to talk about the larger trade-off of making it easy for new people to use versus that level of fine-grained control.

[00:20:54.711] Nathan Burba: Yeah, it's definitely possible to have it both ways, but it's hard, and you have to plan for it. And they are very different markets of people. But basically, you have your basic-level controls, and then you dig a little deeper and there are more advanced controls, and you dig a little deeper, more advanced controls, and so on. A lot of the best modern operating systems, in my opinion, follow this paradigm. Windows and OS X both do it, in different ways, but it's that same kind of idea. And we have some of that, but there are also the constraints of developing something like this, and honestly just the madness of trying to make something like this at all. I think we've done it with a fair amount of success, but, for example, being able to turn off quantization if you want to, that's on the cutting room floor. There are various more complex things that we wanted to get in there. In terms of mixing, this is a very, very hard project to mix, unlike anything else. To get everything to sound similar enough, but still let you hear the instruments individually, all these completely different songs that have nothing to do with each other, that shouldn't be on the same album, now all need to be mixed relative to each other. It's a very difficult process, so we hadn't thought about giving the user control over the mixing, because we wanted things to be easy to pick up and play, and we were worried about mixing it well generally first. I think that might actually be exposed in Unity itself, and we might be able to expose it to users in a future update. This game is a slice of a much larger idea, I think, and that fine level of control would need to be encapsulated in that larger idea.
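
Quantization, mentioned above as one of the knobs that stayed locked on, is the mechanism that makes non-musicians sound on-beat: the engine snaps slightly early or late triggers onto the musical grid. The arithmetic is simple. This sketch assumes a 16th-note grid and a fixed 128 BPM, since Electronauts' actual grid resolution isn't documented.

```python
BPM = 128
SIXTEENTH = 60.0 / BPM / 4        # one 16th note, in seconds

def quantize(t, grid=SIXTEENTH, mode="nearest"):
    """Snap a timestamp (seconds since song start) onto the grid.

    'nearest' suits played notes; 'next' defers the event to the
    upcoming grid line, which suits toggles that must never land
    mid-beat."""
    if mode == "nearest":
        return round(t / grid) * grid
    return (int(t / grid) + 1) * grid

print(quantize(10.03))            # a hair off the grid -> snapped onto it
print(quantize(10.03, mode="next"))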

[00:22:32.059] Kent Bye: Yeah, the other thing that I notice as a general trend within rhythm games is this visualization of music. Especially with something like Beat Saber, you're literally seeing the rhythm of the song come at you, and you're able to embody that. And you have something similar here in this latest version of Electronauts, in the sense that you are in this tunnel that you're going down, and you have these different ways of visualizing the music around you. So maybe you could talk a bit about that process of music visualization: taking the individual components of the song and giving some sort of visual representation of what's happening in the song, so that the crescendos and decrescendos are represented in your environment.

[00:23:15.672] Nathan Burba: Yeah, so the basic place to start with this is a concept that I like to call spectacle, which is actually similar to what game designers call juicy interactions. Basically, it's just a bunch of stuff happening all at once. You see something, the animation you're seeing has a corresponding sound, and everything is very synchronized. That's what creates a spectacle. You walk into a room where a bunch of people are all moving and talking and doing things all at once, in kind of a symphony of movement; that's what Survios looks like on a Monday. That's kind of a spectacle, right? So what we really want to do, and I think what every music show ever wants to do, is create a spectacle. And the first piece of that is synchronizing the music to visuals. When we looked at that, we had to break down what that means, and what we came up with is that you want to take a sound that you can identify and isolate in your head. I hear a kick drum, right? Or I hear a hi-hat or something; you can identify that individual sound. We wanted to be able to link that specific sound to an action or an animation going on in the visualizer, so that we create the spectacle. But to do that, we had to create a unique technology, which we call the score following system. This is basically us writing out these little timing files of which beat each of these things happens on, so that we can drive the visualizer from those, as opposed to doing signal analysis to determine when those events occur, because that's a very imprecise science, and it can actually be very difficult to identify exactly when those things happen. So we built this system, and then we had different pieces of the visualizer animate everything that's going on. But I will say that building a visualizer is incredibly difficult, like super, super difficult. At one point, I was trying to explain what I envisioned to the team, and the only words I could find to describe it were mesh dragons. Basically, there were meshes that would move through the world as you're moving through a tunnel, and they would animate in a certain way as you move through them. It's very hard to describe this particular kind of art form, especially putting it into virtual reality. There are some applications out there that I think do this sort of thing pretty well. I think VRChat is going into this territory, and The Wave has some really nice visualizer elements. But it's honestly very abstract, so abstract that it's hard to even talk about. Ultimately, what we wanted to create was what a lot of people expect out of a music visualizer, which does revolve around that tunnel. We went to a Flying Lotus show pretty recently, and there was a lot of tunnel visualization in the 3D projection show that he had there. So it's a very common trope, that flying through space. And it fit very well with our overall trope, which is exploring electronic music as an astronaut, hence an Electronaut, with you blasting off through the sonic accelerator and flying into a musical dimension. So it just ended up making sense. And we ended up building a number of visualizers that we could reskin and essentially reuse in different songs.
And then once again, I think we did it with a moderate amount of success, but it was very much a technically difficult thing to pull off.
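
The score following system is worth pausing on, because it inverts the usual approach: instead of analyzing the audio signal to detect beats, someone hand-authors a timing file of which event fires on which beat, and the visualizer replays it against the song clock. Here's a minimal sketch, with an invented event format; only the beats-not-signal-analysis idea comes from the interview.

```python
BPM = 128
BEAT = 60.0 / BPM                 # seconds per beat

# Hand-authored per song: (beat number, event name) -- no signal analysis.
score = [
    (0, "kick"), (1, "kick"), (2, "kick"), (2, "clap"),
    (3, "kick"), (3.5, "hihat"), (4, "kick"),
]

def events_between(t0, t1):
    """Events whose authored time falls in [t0, t1) seconds. Called once
    per frame with the previous and current song clock, so each event
    fires exactly once, exactly on time."""
    return [name for beat, name in score if t0 <= beat * BEAT < t1]

prev = 0.0
for frame in range(120):          # simulate two seconds at 60 fps
    now = (frame + 1) / 60.0
    for name in events_between(prev, now):
        print(f"{now:7.3f}s  trigger visual for {name!r}")
    prev = now
```

For contrast, the signal-analysis route Kent mentions next would look something like this with the librosa library; it works tolerably on clean four-on-the-floor material but gets unreliable fast, which is presumably why hand-authored timing won out. The filename is a placeholder.

```python
import librosa

# Estimate tempo and beat positions from audio alone -- the imprecise
# route the score following system avoids.
y, sr = librosa.load("some_track.wav")
tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
beat_times = librosa.frames_to_time(beat_frames, sr=sr)

print(f"estimated tempo: {float(tempo):.1f} BPM")
print("first beats (s):", [round(float(t), 3) for t in beat_times[:8]])
```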

[00:26:22.103] Kent Bye: Yeah, I know that when I first did a game jam, I wanted to just throw in a beat detection algorithm on the side, and then I realized, after looking into it, how difficult a problem it actually is to detect the beats. So it makes sense that you would have to go to something more handcrafted in order to do this translation between what's happening in the song and how that gets visualized. It's not an insignificant problem. But I'm curious what kind of reactions you've had, whether from professional DJs who are in the business of playing other people's music, or from the musicians themselves, or from what is perhaps a new class of music enthusiasts who up to this point haven't had tools intuitive enough for them to use. So I see these different audiences: the musicians who are generating music, the DJs who are already playing it in a live performative context, and then this whole new class of maybe casual DJs who want access to experiences that up to this point they haven't had.

[00:27:19.314] Nathan Burba: Yeah, I mean, I will say, everybody who tries it is blown away, but different people are blown away in different ways. For the musicians, it really gets them thinking about how music technology is evolving, how they need to be changing and updating their music, and how to use technology to help with their live shows. It really opens the door to a lot of that kind of thinking. It's really fascinating to see one of the musicians we work with come in and actually play their song in VR; they look at it with new eyes. And then you have people who are really music fans but have never really played music themselves. To them, and this is a lot of us on the team, the game is just a dream come true. People come out and they're just like, I need to be in there for hours. This feels like home, right? Because it gives them that control, that thing that they've always wanted. And then, actually, what we've seen is a lot of kids. Kids pick it up incredibly fast, almost like they know exactly what they're doing, and they just straight up have fun with it. It just comes naturally to them, which makes me very happy, because it means we're on the cutting edge of interface design, not the other side of things. But yeah, there are a lot of different experiences. Generally, people get in there, and some people, the first time they play the game, will spend 45 minutes in one song, because they've never had the experience of having so much control over music and being able to play music like a real musician. It's almost like sitting down at a piano and all of a sudden you're a prodigy, right? You didn't know it before. They have that experience at first, and it's so immersive and mind-blowing that we almost have to remind them, hey, there are other songs, or, here are a few other things you could do. That initial experience is just very, very powerful.

[00:28:59.358] Kent Bye: Yeah, I had a chance to play it at GDC, where I got a walkthrough step by step, but after jumping into it today, I went through the tutorial, and I have to say that it's probably one of the better VR tutorials I've ever seen, in the sense that there are a lot of really complicated things you can do with music here. A lot of the music creation tools that are out there in VR suffer from a level of complexity of, where do you even begin? You've built a structure where the architecture of the song is already there, but you're given enough control to feel like you're actually manipulating things without being completely responsible for the overall feeling of the song, and the tutorial steps you through that process and opens the doors into this new world. It's something that has been very intimidating for a lot of people, and I have to say that even after going through this short tutorial of five to ten minutes, I felt like I got a sense of what I could do. There are still things I don't understand, but I think I got 80% of what I need to know to go in there and play an entire song.

[00:30:06.489] Nathan Burba: Exactly. The baseline aspect of the game is very simple, right? You go from track one to track five or six, basically playing through the tracks of the song, and you jam on a few instruments while you do it. And that right there is you playing the game, and it's incredibly fun, but it's just scratching the surface of what's possible. What we really want to do is get people to feel like they actually can become a musician. The game is very empowering, and the way I think about it is that it empowers people to play music using this advanced new technology, and then something in their head flips and they say, hmm, maybe I could be a musician. Maybe that dream I gave up on ten years ago, maybe it's not so dumb to actually pick up the guitar again, or pick up an instrument and start teaching myself again, or teach myself how to DJ, how to create music. It gives people that courage through that empowerment. We want people to get to that empowering point very, very quickly and then have the depth that they can go into, and we want people to play music for music's sake, not just, I'm going to play this game to complete this game. You should be doing things for yourself: going through Electronauts and exploring all of the different genres, all of the different songs that are in there. We have a lot in there now, and there's a ton more coming. Go in there and understand what your music is. What's the music that you are really into, and then maybe, what is the music that you can actually create for yourself?

[00:31:26.508] Kent Bye: Yeah, one of the things that I found a little difficult was knowing how to end a song or to not just end it, but go to the next song. So is that something that you're working on in terms of making it a little bit more of like transitioning from one song to the next? Seemed like something that wasn't necessarily all that intuitive.

[00:31:43.237] Nathan Burba: I mean, it's as intuitive as it's going to be. This has definitely been a challenge from an interface design standpoint. So the way you change songs is through two different menus you can pull up. One we call the hand menu; on the Vive it's the left menu button, and on the Rift, I think it's the topmost button on either hand. You pop that open and you click Switch Song, and that'll open up the song select. Alternatively, there's another menu which has presets on it, and it has a big button that says Switch Song. So you can pull that menu up, hover, and let go, and that will pull up Switch Song as well. But it was hard for us to put Switch Song somewhere in the world. Ultimately, it had to go on one of those menus, because in VR there are tons of problems with putting things on a menu versus actually in the world: you never know where the user is looking, so how do you consistently represent information that they might need to see all the time, like HUD-style information? So basically, we had to find creative ways of getting those things in there. But to be honest, I don't worry about those design things too much, because when the game comes out and there are a lot of YouTube videos and a bunch of people playing the game, if you watch someone play for five minutes on Twitch, those five minutes will show you where some of the rough edges of the interface might be, and you'll be able to get over the hump that way.

[00:32:57.616] Kent Bye: Yeah, and looking at Survios in terms of your lineup of experiences, starting with Raw Data, then Sprint Vector, and then at the latest GDC you were showing both Electronauts and Creed, which is a boxing game: is there a through line between all of these different experiences? There seems to be a variety of explorations of the unique affordances of VR, but also spanning all these different genres. So I'm curious what the story is that you tell yourself of who Survios is, and what the commonalities of all these experiences are.

[00:33:27.488] Nathan Burba: I can tell you exactly what that is. The commonalities are the people, right? When you listen to a Beatles song, in the early days you could tell what a John Lennon song was and what a Paul McCartney song was. And then later on, George Harrison came out with the first hit that he wrote, and it was a different song, a pretty different feel from some of the other stuff in the past. So to me, the games that we have coming out are very much individual people, or groups of people, who are like, this is their baby. They bring their own experiences and their own culture and taste into these different projects, so each one is its own little soup of different people's ideas that comes through. Of course, a ton of people work on each one of these, but they're driven in different directions. Raw Data had a lot of our feelings about corporatism, our love of sci-fi, our love of action games, and a lot of the core mechanics that we had been developing over a long period of time got into Raw Data. On the art side, our need to make something realistic with a very high level of polish was all in there. And then with Sprint Vector, it was kind of like: holy crap, making the art for Raw Data was really hard. Let's make something cartoonier, right? We also felt like things were a bit serious in Raw Data, so we wanted to lighten up a little. And then you have different people working on the project. There were a ton of people working on Sprint Vector, but the influence of Spencer and Hunter Kitagawa and Bennett Jobling, a number of people who had not had that influence on Raw Data, people who had come into the company and were now able to add to the soup, so to speak, added a more fun and lighthearted vibe. Andrew Bedian, Chris Thompson, they all added a flavor to Sprint Vector that really was not there in Raw Data. So, once again, the group of people creates something completely different. Well, I'll talk about Electronauts last. With Creed, a great example is Eugene Elkin, the project lead. He's like the embodiment of Creed. He's originally Belarusian. He's probably one of the best boxers at the company, if I had to put money on it. Eugene is the one person in the studio you could probably imagine in a Rocky movie. So he's really bringing his personal preferences and his tastes. Mike McTire as well, massive Rocky fan, massive Creed fan, right? Those guys are bringing a certain kind of essence into that particular project. With Electronauts, there's the Electronauts team, and this is really a team that lives and breathes music, lives and breathes music technology and music festivals, and it creates a whole different kind of project, one that has its own unique culture. So that's what we strive to do with any of the titles that we develop.
Each one should feel like its own unique thing relative to the other projects. When you're creating art, and I'd say we're essentially an art company, each one of those projects should feel unique, but it should still come back to our core values. It should be immersive and empowering. It should connect people. It should have incredibly high production value and incredible gameplay and mechanics. All of the things that we're known for, it's still going to have those, but it's very much its own unique project.

[00:36:50.620] Kent Bye: What's next for Electronauts?

[00:36:52.903] Nathan Burba: So we are coming out on August 7th, which is a Tuesday. So please, everyone, mark it on your calendars. We have, I think, close to 50 songs at launch, and then we're going to have a number of other tracks that we're working on with other artists coming post-launch. Then we're going to see how everything goes, see how the community likes it. We want to start supporting the community and supporting people really getting into music, and then hopefully there'll be new features and other things down the road. There are a lot of possibilities. We want to take it on tour. We were at EDC this past year, which was really, really fun. We're also launching to VR arcades. So if you're listening to this and you don't have a headset, but you're close to a big city, honestly, there are VR arcades all over the place. Find your local VR arcade, and you'll most likely be able to try it out there. We have a very aggressive arcade marketing campaign, and basically we're just super, super excited to get this into people's hands. What I want to do personally is go on Twitch when people are playing it and start chatting them up and giving them pointers while they play, because that's the most fun thing you can do, in my opinion: to chat with people while they're actually in VR.

[00:37:56.570] Kent Bye: That's great. And for you, what are some of the biggest open questions or challenges that you're trying to solve as a company with Survios? Especially in the context of VR, you probably have very good insight into the state of the VR market. So I'm curious what some of the biggest challenges or questions are that you have in this larger context of where VR is at and where it's going.

[00:38:20.732] Nathan Burba: I mean, I think the simple answer is that we want to see the market grow. We want to see the market get bigger. VR is incredibly, incredibly challenging. We call it VR as a catch-all term, but virtually everything we talk about in the future with regard to experiential technology, whether it's reading a web page or watching a movie or playing a game, is basically all going to be VR at the end of the day, because reality just encapsulates everything. What that means for virtual reality is that there's way too much to bite off and chew, and the technology is incredibly complicated, because you're trying to simulate these super fundamental things that exist in reality. So the technology ends up being expensive, and it requires all the different huge companies to work together. For example, Electronauts will be the first title we're shipping on Mac. It'll be coming out for HTC Vive with the iMac, iMac Pro, all the different Mac products. And for that process, it's not like Apple or Valve is the end-all be-all. Apple and Valve have to work together. These massive companies who we think own the world still have to play nice in the sandbox, so to speak, because VR is bigger than all of those companies. VR is about everyone and everything. What I'm excited about is innovative new ways that the market itself expands. Arcades are a great example of that. As products get more inexpensive, get smaller, and in particular get easier to use, I think we're going to see growth in the market. And I'm excited about AR, because I consider all of these things to be part of the revolution of natural user interface devices that people will start using more and more.

[00:39:59.170] Kent Bye: Great, and finally, what do you think is kind of the ultimate potential of virtual reality, and what it might be able to enable?

[00:40:08.598] Nathan Burba: Yeah, so that's a great question, and if there was ever the definition of a loaded question, it's that; that's the best way I can describe it. But I actually wrote something before we started the podcast, which was kind of my answer to that. I would say you can define someone or something as a god if they have control over their reality, right? That's basically what a god does: he controls things. So when you have control over a virtual reality, which is kind of like a sub-reality, you basically become like a demigod. That kind of control over what we're experiencing, as a sub-layer of our own reality, is obviously going to have a massive impact on the world. I think we're already doing it. Honestly, with all media, you can go back to the printing press; that's basically one of the earliest versions of virtual reality. You can trace what virtual reality is going to do for society back to what all media, and media democratization, has done for society: it's made people more free, more equal, and it's connected people across the world. I think all of those trends are going to continue, and I think in a world that has no real new frontiers anymore, virtual reality is going to be a very necessary semi-final frontier, until space actually becomes the final frontier, if that makes any sense. Human beings need somewhere to go, and the only place left to go for a while, until Elon Musk gets us to Mars, is virtual reality.

[00:41:31.918] Kent Bye: Nice. Yeah, I know that in talking to different people in the sci-fi realm, there's been this sci-fi trope about outer exploration, and I think that to some extent VR is about inner exploration. So I think we'll always have both. We probably will eventually try to go to Mars, but at the same time, this VR technology allows us to go inward and have this inner exploration. And whether or not it makes you a god, I think defining and operationalizing what a god actually is, I don't think we can necessarily settle that question. But there's a deeper question in what you say, which is the balance between fate and free will: what is the degree to which you have agency within a context in which you are fated? I think there are always going to be limitations in that fatedness, until we get to some completely unconstrained free will, and I don't know, I think there will always be limitations, even in the structure of reality. So given that, I think your answer brings to mind this fundamental tension in a media ecosystem where we've usually been passive recipients of media, and now, with VR technologies, we're taking the lessons we've learned from video games and meshing them together with all sorts of different dimensions of storytelling and immersion and all these life experiences. And so the challenge of how you balance fate and free will is, I think, the fundamental question here.

[00:42:50.763] Nathan Burba: Yeah, and to bounce off of that, in terms of passive consumption versus things that are more active, what I've learned about the VR market generally is that half of all the customers are basically game developers, and the other half are kind of aspiring game developers, if you will. The line between production and consumption is really starting to blur. That sounds like a small market, but it's possible that in 100 years everyone's a game developer, right? Because what is a game but virtual reality? We're all going to be producers in some capacity. So I think that piece is going to start breaking down, and this is going to be the fundamental way that people connect with each other, and the fundamental way that they're empowered beyond how they're empowered in real life. If they want to fly and run up walls, if they want to be a musical genius, if they want to exist in the future or exist in the past, or what have you, this is going to allow them to unchain themselves a little bit from reality, get some of those experiences, and explain what's in their head to other people.

[00:43:53.632] Kent Bye: Great. And is there anything else that's left unsaid that you'd like to say?

[00:43:57.273] Nathan Burba: Like I said, just Electronauts comes out August 7th. We'll be really excited to see everyone playing the game and seeing what people can create. I think we have a Reddit AMA coming up. I can't remember exactly when that is. But otherwise, yeah, stay tuned. August 7th is the worldwide launch.

[00:44:12.756] Kent Bye: Awesome. Great. Well, Nathan, I just want to thank you for joining me today on the podcast. So thank you.

[00:44:16.888] Nathan Burba: Yeah, thanks for having me.

[00:44:18.829] Kent Bye: So that was Nathan Burba. He's the co-founder and president of Survios, and they just launched Electronauts on August 7th. So I have a number of different takeaways from this interview. First of all, I really love how they've been able to democratize the process of feeling like you're just jamming out and really improvising within the context of a larger song. Electronauts is a little bit different from a lot of the other music generation tools that I've seen, in that the underlying architecture of a song is already there. There's this authored architecture that you can manipulate in different ways. You can push different buttons to control the flow of the music. They've broken down a song into these different component parts, so that you can sequentially go through a song, even if you've never heard it before, and follow the same kind of algorithmic process of the different build-ups, the beat dropping, the chorus and the verses. And there are lots of different things you can do to interact with it, whether it's adding special effects or bringing up the different musical instruments, playing a synth, adding a melody, recording it, and looping it. I really felt like I had a lot of control and agency over the process of listening to and participating in the music. So there's this larger trend of consumers not just being passive consumers of media, but moving into this place of becoming active participants, helping to remix and improvise and collaborate with other people in jamming out and creating music. Jamming with one of your friends is an aspect that wasn't released when I had a chance to play it, but it's coming soon. Another big thing is that they've been able to collaborate with a lot of these artists and translate their existing songs. They have this Music Reality Engine that is able to break the song down into its component parts. It sounds like they're using WAV files in a very specific folder structure, and that there's a certain amount of hand-tweaking to make sure that the wave file lengths are all consistent and that they all sound good when played on top of each other. They also do a lot of pre-mixing, presumably against the worst-case scenario where everything is playing at the same time, to make sure it still sounds good. There's a trade-off here between giving fine-grained control over all these component parts and the layers of complexity that control adds. I think they've found a nice balance of giving you a lot of control and power without getting so far down into the weeds that you step into the DJ booth and just don't know what to do because there are too many options. As you go in there and start to play a song, you learn what sounds good and how to time things in a specific way.
It's just a fun way to explore music, and it was really inspired by the documentary RiP! A Remix Manifesto by Brett Gaylor, specifically its look at Girl Talk and remix culture, which is about moving away from passive consumption and enabling and empowering the audience to take the culture that's being produced and remix it, combine it, and juxtapose it in new and different ways. And I think this is where this is going. Right now it's pretty simple, in the sense that you're playing a single song. You're not able to do something as crazy or wild as Girl Talk, mixing five or six different tracks on top of each other at the same time. Maybe it'll eventually get to that point, but they're starting at this baseline of making it so that anybody can just step into this virtual reality experience, and even if they have no musical background and haven't done anything like this before, they can start to learn the timing of pushing these various buttons. It's very embodied; you're moving your arms around, and you feel like you're in this control booth, really able to play with and interact with the music. I think it's very empowering, and it's really starting to democratize this process of music making. It sounds like a lot of the musicians who participated in the process started to really think about how they could re-architect the way they create their music, so that it could be broken down to fit into the format of the Music Reality Engine, which splits a song into component parts so that somebody can step in and start to control its flow. One of the things that I hope happens with this experience is that people just start to check it out and have an experience of it. If there's enough engagement and it becomes popular enough, then there's the potential for them to really flesh out the modding support, to allow anybody to add their own music in there. I think that's where this is going to really take off: if you're able to easily record a sample or add your own lyrics. Right now there's not a lot of capability, with the tools provided, to sing your own lyrics or play your own instruments from outside the virtual reality experience. If you were sophisticated enough, you could potentially pipe something in, mix it, and add it to your performance, but it's not baked into the virtual reality experience. Their focus has been to make it super easy for people to be completely immersed in VR and do everything they'd want to do within the context of this experience. But being able to add and create your own music within this context may actually simplify the process, or at least give people a structure by which they could get a sense and a feel for what other songs are doing, and maybe craft and create their own song. There are so many different processes involved, from fine-tuning the waveforms and the synths to coming up with the different melodies and whatnot.
But in the future, this provides the groundwork to really revolutionize the process of music creation. And if there were some way for them to create a music sharing tool, a kind of Creative Commons way to upload different samples, it really would empower people to create an ecosystem of music creation. Maybe there would be ways to tip people, or ways to explicitly charge for access to some of the specific tools people make, so that you could create your own songs. But to start, they're launching with musicians that have established tracks, and you're able to go in there, play around, and create your own remix of some of these songs. So I'm really excited to see where this is going. I think it's on the right path and trajectory for where all of VR is taking different aspects of media, especially with this remix culture: really encouraging people to become active producers of music and collaborators, with a platform that allows them both to explore a song in new and different ways and to express themselves by improvising with the tools provided. So, Electronauts launched today, Tuesday, August 7th, so definitely go check it out. So, that's all that I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and I do rely upon your donations in order to continue to bring you this coverage. So you can donate today at patreon.com/voicesofvr. Thanks for listening.
