One of the most immersive and memorable experiences that I tried at the Silicon Valley Virtual Reality Conference yesterday was TheWave. Just as TiltBrush unlocks creative expression through 3D drawing, TheWave has the potential to lower the barrier to composing music with its 3D sequencer tools. TheWave's development team is made up of VR developers who are also musicians: they want to use VR to unlock their musical creativity, and eventually to help revitalize the music industry by giving working musicians another outlet for live virtual performances.
One of the developers is Unello Design's Aaron Lemke, who originally got into VR because he wanted an outlet for his ambient music with experiences like Eden River. I had a chance to catch up with Aaron at SVVR, where he talks about TheWave's musical composition Create mode as well as a DJ performance mode, the cross-platform networked experience they premiered at the VR Mixer at GDC, and how he sees VR playing into the future of music composition and performance.
LISTEN TO THE VOICES OF VR PODCAST
Here’s a video demo of the DJ performance mode for TheWave
Become a Patron! Support The Voices of VR Podcast Patreon
Theme music: “Fatality” by Tigoolio
Subscribe to the Voices of VR podcast.
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye and welcome to the Voices of VR Podcast. Today I talked to Aaron Lemke of Unello Design, and he's also been working with some other musicians on this really amazing experience that I just saw at the Silicon Valley Virtual Reality Conference. It's called The Wave, and basically what it is is that you go into this experience and you start to just make music with this 3D sequencer, where you're really starting to kind of engage and understand music in a new way. Aaron talks about how, as a musician, you're trying to come up with the complexity of all these different MIDI files and you're looking at them one by one. It's kind of like, what if you were able to stack up all these different MIDI files into one immersive experience and just start playing? And so I started to talk to Aaron about how this came about, where it's going, some of the live DJ functions they have, and how VR could potentially help revitalize the music industry by allowing performers to perform virtually and allowing people to virtually attend these different performances. And so they've got some other really interesting networked experiences with the DJ mode where you're able to, in the Gear VR, witness some of the performances that are happening, whether it be a live music performance using these digital instruments, or whether it's live mixing of a DJ set. So a lot of really interesting stuff about VR being made by musicians for musicians. So that's what we'll be covering in today's episode. Today's episode is brought to you by the Virtual World Society. Now, if you're a listener to the Voices of VR podcast, then you know that at the end of every interview, I ask my guests what they think the ultimate potential of virtual reality is. Tom Furness has basically been thinking about that for the last 50 years, and he's made a career out of that: he eventually left the military to start the Human Interface Technology Lab at the University of Washington to start proving out all the potential applications of VR for education, VR for medicine, and all the different possibilities for how to bring this VR technology into the world to do social good. If you're interested and you're into that, then totally go check out the Virtual World Society, sign up for the email list, send an email saying, hey, I want to get more involved. They're still really early in what they're doing, but I personally want to see the ultimate potential of VR be brought about by what the Virtual World Society wants to do. So with that, this interview was conducted on Wednesday, April 27th, on the first day of the Silicon Valley Virtual Reality Conference here in San Jose. HTC has a lot of different booths for independent developers to come show off some of their Vive experiences, so there's basically a whole line of different indie devs showing off their Vive experiences. And Aaron Lemke is one of those developers who was invited to come show off what he's doing with The Wave. So with that, let's go ahead and dive right in.
[00:03:24.507] Aaron Lemke: So I'm Aaron Lemke from Unello Design. I made games like Eden River and Zen Zone for the Gear VR. And my latest project, separate from Unello Design, is called The Wave. And it's a platform for music performance in VR. So I just showed Kent the Create mode, which is sort of a compositional room-scale musical playground. So there's a 3D sequencer where you can make a little beat. There's a couple different motion-based instruments, sort of theremin-like things. And the other kind of mode is called DJ mode, and in that mode it's a fully functioning DJ deck. You can launch two tracks and sync them up and crossfade and do filters and everything. And then we have a networked audience mode where people can hop in and watch the DJ perform. And that's what we did at GDC at the VR Mixer, like we were just talking about. We had one person in the Vive DJing, and then we had a bunch of Gear VRs out in the audience that were networked in over the local area network, and they could see the DJ, and they were all in a virtual space together. It was actually the first cross-platform co-presence experience.
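The networked audience mode Aaron describes here, with one Vive performer and a room of Gear VR viewers on the same local network, suggests a simple broadcast-and-render pattern. Below is a minimal sketch of that idea in Python; the message format, port number, and update rate are my own assumptions for illustration, not The Wave's actual networking code.

```python
# Illustrative sketch only: broadcast the performer's head/hand poses and
# current deck state over UDP so LAN audience clients can render the DJ.
# Field names, port, and tick rate are assumptions, not The Wave's protocol.

import json
import socket
import time

BROADCAST_ADDR = ("255.255.255.255", 47474)   # hypothetical LAN port
TICK_HZ = 20                                  # pose updates per second

def run_performer(get_pose, get_deck_state):
    """get_pose() -> dict of head/hand transforms; get_deck_state() -> dict."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    while True:
        packet = {"t": time.time(), "pose": get_pose(), "deck": get_deck_state()}
        sock.sendto(json.dumps(packet).encode(), BROADCAST_ADDR)
        time.sleep(1.0 / TICK_HZ)

def run_audience(render):
    """Each audience client listens and renders the latest performer state."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", BROADCAST_ADDR[1]))
    while True:
        data, _ = sock.recvfrom(65535)
        render(json.loads(data))              # draw avatar + deck in the scene
```

In a shipping app this would likely go through a game engine's networking layer with interpolation between pose updates, but the core idea is the same: the performer's transforms and deck state get replicated to every audience client, which renders them locally.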
[00:04:33.867] Kent Bye: Wow, nice. So I just had a chance to try this out, and my impression is that it's a super immersive way of creating music. So there's like a 3D immersive sequencer. You know, I've seen sequencers online where you use the mouse to kind of click on a grid, and there's a line that's going through the measures and repeating, so you can start to see the cause and effect of putting this dot here makes this sound. And what you're creating here is basically an immersive version of that, which I feel is a little bit more intuitive and easier to play around with. So as a musician, talk a bit about designing this and some of your design goals, and what you're able to do in terms of composing music.
[00:05:12.757] Aaron Lemke: Yeah. That's awesome to hear that it was intuitive for you. The idea for that sequencer kind of came out of... I play a lot of iPad apps, like music creation iPad apps, and even on those and on things like GarageBand and Logic, if you're editing MIDI, you can only really edit one track at a time, and to edit that track you sort of have to zoom into it, go into the MIDI editor, then you can edit only the notes in that track. And then if you want to change something else, you have to kind of zoom out, zoom into this other instrument. So there's a lot of layer switching. You basically never get to see all the instruments at once lined up with each other. And so with this, I figured, hey, since we're in 3D, we can just stack all that MIDI on top of each other. So they're all lined up. Each beat is lined up. So you can see, oh, this snare drum is hitting a little before this piano. Right, and that's like a really obvious perspective line that you can kind of draw. So it's really a better way to edit MIDI, I think, if you're editing a bunch of different MIDI tracks at once.
[00:06:15.982] Kent Bye: So if we kind of break this down in terms of what's actually happening, it sounds like there's three dimensions. There's the horizontal axis, which is time, as there's this plane that's moving through. You see it moving, and that's the beats. On the y-axis, that's the pitch of the different musical notes, and you're kind of reducing it down to a diatonic scale just to make it sound pretty. And then the z-axis, is that the different instruments then?
[00:06:39.252] Aaron Lemke: Yeah, exactly. Yeah, z-axis is instruments, and then each of those has a different color just to sort of help drive that point home.
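To make that layout concrete, here is a minimal sketch of a 3D step-sequencer grid along the lines Kent and Aaron describe: x is the time step, y is a scale degree constrained to a diatonic scale, and z is the instrument track. The class and method names are hypothetical and purely illustrative, not The Wave's code.

```python
# Minimal sketch of the 3D sequencer grid described above (my own
# illustration). Notes live on a grid: x = time step, y = scale degree
# (mapped onto a diatonic scale so everything sounds consonant),
# z = instrument track.

from dataclasses import dataclass, field

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of a diatonic scale

@dataclass
class SequencerGrid:
    steps: int = 16                            # x-axis: beats per loop
    tracks: int = 4                            # z-axis: instruments
    notes: set = field(default_factory=set)    # {(x, y, z), ...}

    def toggle(self, x: int, y: int, z: int) -> None:
        """Place or remove a note cube at a grid position."""
        self.notes.symmetric_difference_update({(x, y, z)})

    def midi_pitch(self, degree: int, root: int = 60) -> int:
        """Map a y-axis degree onto the diatonic scale (root = middle C)."""
        octave, step = divmod(degree, len(C_MAJOR))
        return root + 12 * octave + C_MAJOR[step]

    def tick(self, playhead: int) -> list:
        """Called as the playhead plane sweeps the x-axis: return the
        (track, pitch) pairs to trigger on this beat."""
        x = playhead % self.steps
        return [(z, self.midi_pitch(y)) for (px, y, z) in self.notes if px == x]

grid = SequencerGrid()
grid.toggle(0, 0, 0)   # root note on beat 1, track 0
grid.toggle(0, 4, 2)   # a fifth above the root on track 2, same beat
print(grid.tick(0))    # -> (0, 60) and (2, 67), in either order
```

Constraining the y-axis to scale degrees rather than raw semitones is what would keep everything a novice places on the grid sounding consonant, which matches the "reduced down to make it sound pretty" idea above.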
[00:06:46.351] Kent Bye: Great. So there's another thing: I clicked the grip button and I got transported away from myself, looking at myself, myself being mirrored as I'm dancing on stage. And so I get to kind of teleport my consciousness into what it would look like from the audience to watch myself on stage. But yet my movements are still doing things on stage. So talk a bit about what you're trying to do there.
[00:07:11.437] Aaron Lemke: Yeah, we just wanted to let people know what they look like in the audience, because that's a big part of this whole thing is kind of like being in the audience. We want to have a totally cross-platform audience app that will probably be free, but don't quote me on that. So, you know, Google Cardboard all the way up to Vive. Anyone will be able to hop in and watch somebody perform. So that was the first test of like, what does it look like in the audience, you know? And there's a kind of a virtual stage where the performer is up, you know, four or five feet above the audience members. And yeah, the coolest part is you can still control your avatar on stage, which is this big rainbow cat monster thing. And you can also still play all the instruments, so you can kind of see what their effects are on the world. But the coolest thing we realized is if you teleport people out there and they can still control their avatar, most of the time they're kind of like, oh, this is cool, and they start dancing. And it's really hard to get people to dance. We kind of stumbled across this great gamified trick to get people to dance around a little bit. And everybody does it. I took a picture of what you're doing because it's funny. But everybody does that, you know? And so we kind of just stumbled across that.
[00:08:18.973] Kent Bye: Yeah, it was an interesting moment because I was watching myself and it's a little bit of like, when you move you see the avatar move, so it's a little bit like the extreme case of, yeah, you're testing it.
[00:08:28.319] Aaron Lemke: Yeah, you're testing it out. Just to be like, wait, is that me? Does that, when I move this, does that move that? Oh, cool, it is me! And then you do a little dance.
[00:08:34.884] Kent Bye: Yeah, so it's interesting. A good way to get people to dance is to teleport their consciousness into a third person perspective and mirror them so they can see what they're doing. Yeah. So talk a bit about the other features that you wanted to have for performance, because there's pitch bending and other things that I couldn't even quite figure out. So as people are trying to do live performances of music, what were some of the things that you were trying to do?
[00:08:56.410] Aaron Lemke: Well, in that mode it'd be really nice... so there's the sequencer, there's a couple of other motion-based instruments that you can go play with, and there's some effects controllers that change the global effects mix of the whole track. So it'd be really nice if you could sort of loop any of those elements. There's this idea in music recording of automation, which is basically controlling an effect or a fader over time. So you can imagine going over to one of those 3D effects cubes, where each axis is controlling a different effects mix, and doing a little dance over the course of a measure, moving it around, changing the effects, and then once you're done, it sort of loops that and you can see whatever you just did repeating. Likewise, you can do the same thing with the instruments. You walk over to the little theremin thing and you make some chords in it, and then you kind of step back and there's a clone of you, of your head and your hands, still playing that instrument. I think it would be really cool. That's kind of always been my dream. It's like, I wish I had a band of Aaron Lemke clones so we could all just play together, you know? And that's the goal. That's one of the goals. It'd also be nice to be able to switch scenes in that mode. So like, okay, I've made my loop for this. This is the chorus or whatever. I've made that loop. And then there'd need to be some kind of timeline switcher thing, and you walk over to that and you could trigger scenes, kind of like Ableton's Session View, if people have ever messed with that. It's got quantized launching of different scenes, which are like your different song structure pieces.
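The automation idea Aaron sketches, recording a gesture over one measure and then letting it loop, can be illustrated with a small record-and-replay structure. This is a hedged sketch under assumed names, not The Wave's implementation: record (beat, value) samples while the performer moves the effects cube, then interpolate over them on every later pass through the measure.

```python
# Illustrative sketch of a looped automation lane: capture a hand-driven
# effect parameter for one measure, then replay the gesture indefinitely.
# Names and the one-measure length are assumptions for illustration.

class AutomationLoop:
    def __init__(self, measure_beats: float = 4.0):
        self.measure_beats = measure_beats
        self.samples = []          # recorded (beat_in_measure, value) pairs
        self.recording = True

    def record(self, beat: float, value: float) -> None:
        """Capture the controller value while the performer 'dances' it."""
        if self.recording:
            self.samples.append((beat % self.measure_beats, value))

    def finish(self) -> None:
        """Stop recording; from now on the loop replays the gesture."""
        self.recording = False
        self.samples.sort()

    def value_at(self, beat: float) -> float:
        """Linearly interpolate the recorded gesture at any later beat."""
        t = beat % self.measure_beats
        pts = self.samples
        for (t0, v0), (t1, v1) in zip(pts, pts[1:]):
            if t0 <= t <= t1:
                return v0 if t1 == t0 else v0 + (v1 - v0) * (t - t0) / (t1 - t0)
        return pts[-1][1] if pts else 0.0

loop = AutomationLoop()
loop.record(0.0, 0.2)
loop.record(2.0, 0.9)
loop.record(3.9, 0.3)
loop.finish()
print(loop.value_at(5.0))   # beat 5.0 wraps to beat 1.0 -> about 0.55
```

The "clone of you still playing the theremin" works the same way, just with head and hand transforms recorded instead of a single effect value.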
[00:10:26.584] Kent Bye: One of the things that I found in starting to create some music is that if you put the notes right on the exact beat, it sounds fake, it sounds electronic. And there's some element of imperfection when people play an instrument or the drums that actually sounds better than if it was completely generated. One explanation I've heard is that there's fractal noise in the distances between when you're playing: there are these little imperfections where you're just slightly off, but if you add those up you get this fractal patterning that we hear as beautiful. And so I kind of sense that this is a sequencer that's not necessarily concerned about locking things exactly on the beat, but you have the ability to get things a little off, to get a little bit more of that humanity into the performance.
[00:11:12.473] Aaron Lemke: Yeah, that's a feature we definitely want to add at some point into the sequencer, because right now everything's pretty quantized. But, you know, for instance, on the looping stuff, that's all still human-driven, so that would have that kind of fractalness that you were talking about. And on some different recording softwares, you can introduce, like, fake humanness into your drum beats, and it really makes a big difference.
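The humanization Aaron mentions is commonly implemented by nudging each quantized onset (and velocity) by a small random amount. A tiny illustrative sketch, with jitter amounts chosen arbitrarily rather than taken from any particular product:

```python
# Small sketch of the 'humanize' idea: nudge each quantized note off the
# grid by a few milliseconds, and vary velocity slightly, so the pattern
# doesn't sound machine-perfect. Jitter amounts are illustrative.

import random

def humanize(onsets_ms, timing_jitter_ms=12.0, velocity_jitter=0.08):
    """Return (onset_ms, velocity) pairs with small random imperfections.

    onsets_ms: perfectly quantized note start times in milliseconds.
    """
    humanized = []
    for onset in onsets_ms:
        onset += random.gauss(0.0, timing_jitter_ms)      # push/pull timing
        velocity = max(0.0, min(1.0, random.gauss(0.85, velocity_jitter)))
        humanized.append((onset, velocity))
    return humanized

# e.g. a 120 BPM hi-hat pattern, one note every 250 ms
print(humanize([i * 250.0 for i in range(8)]))
```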
[00:11:35.578] Kent Bye: And so, what's next for this project then?
[00:11:38.321] Aaron Lemke: So, the other mode, the DJ mode, the plan is we're trying to work with a couple of big artists to do a big show, to throw the first virtual rave. And so what we want to do is finish out the audience app, so people will be able to attend, create a little avatar, make an account and everything, and then hop into a room with whoever, some badass DJ playing, and then watch them do a show. So yeah, that's the plan.
[00:12:07.203] Kent Bye: And for you, what do you see as what VR can do for performance? What do you see as the future of performance in VR?
[00:12:14.157] Aaron Lemke: Well, I don't know. I mean, we're kind of just scratching the surface here. I think everyone that's working on The Wave right now actually is a musician, and so we all have firsthand experience kind of in the music industry. And the biggest sort of long-term goal of this thing would be to create a way for musicians to make a decent wage again, because it's really hard to support yourself as a musician right now. The music industry is still kind of floundering in the face of the internet. We haven't figured out how to monetize properly. We haven't figured out how to pay creators enough money to sustain themselves. So if we could somehow create a way for performers, DJs, to set a price, like a ticket price, 99 cents, you can come see my concert, and if they're good enough and they have enough followers, you know, hey, you could make a thousand bucks in an hour in your basement in your underwear. So that's really the long-term goal: to kind of build this virtual economy.
[00:13:08.660] Kent Bye: And finally, what do you see as kind of the ultimate potential of virtual reality and what it might be able to enable?
[00:13:14.907] Aaron Lemke: Rescuing the music industry. I think it really could. I think if we get money flowing, if we get people creating not only performances but also virtual goods, you know, so if your favorite DJ is playing, maybe you can make an avatar that's sort of like a branded avatar and sell it, and do some kind of revenue share there. We're still kind of working that out. But yeah, the long-term goal for me would be, like, can we build a virtual economy that can help support musicians and creators?
[00:13:45.407] Kent Bye: Great. Well, thank you so much.
[00:13:46.888] Aaron Lemke: Yeah. Thank you.
[00:13:48.165] Kent Bye: And so that was Aaron Lemke of Unello Design, and he's been collaborating with a number of different musicians on this project called The Wave, which I am totally going to check out when it comes out. Because for me, I feel like this is one of those sweet spots of VR, where you go into a VR experience and you actually start creating, and in this sense, it's really starting to create music. And for me, there's all sorts of different possibilities for where this could go in the future. Right now, it's just using one sort of scale, meaning that you don't have the complexity of an entire piano's worth of keys, and so it's simplifying it in a way that whatever you do is going to sound good. They can change the key and whatnot, I'm sure, at some point. But as this moves forward, what I expect is that they're trying to make it simple and easy enough for just a novice to go in there and start playing around, having fun, and making something that sounds good. And from there, you start to add all sorts of other complexities for people who are actual musicians and know what they're doing, the types of features that they want, you know, being able to change the key signature or do different scales, all sorts of different variations from what they have at this point, which is very simplified, just works, and is dead simple. There's just something really satisfying about being able to play these virtual instruments, because it's such a fast feedback loop between you interacting with an object and you hearing what it's doing, and if it sounds good enough and it's dynamic enough, then you can start to really set up all sorts of different loops and really get deep into producing music that actually is quite good and that you could use for all sorts of other applications. You know, for me, this is something where I have some music background, but I get caught up with the complexities of not knowing all the theory, and this feels like a way that was very accessible, like I could just spend hours in this, you know, starting to create different musical experiences. And for me, I really wanted it to start to teach me music theory in a way that makes it dead simple. You know, there's this website out there, if you are into learning more about music, check out Hooktheory, which is essentially like a Wikipedia where people have gone through thousands of different pop songs and broken down the actual chord progressions and melodies, so you can start to see the different keys and chords and do searches in terms of, like, after G major, what's the chord that is most popular in all of pop music?
To a certain extent, that's a little bit of groupthink, and you could ask, okay, well, are you really just going to replicate what everyone else has done, and what are you really doing that's new? But at the same time, some of these chord progressions are just really good because they work, and they've been used over and over and over again. And so if you don't know anything about music theory, then going into a website like Hooktheory can give you a little bit of that basis. What I would want to see is this merging of a website like Hooktheory into a virtual reality experience. And then eventually, when it comes to AR, once you know enough about music theory, I can imagine these different augmented reality experiences where you're actually sitting in front of a keyboard and it's able to overlay where you're supposed to push your fingers in order to play, so you're starting to teach yourself how to create music. And so, you know, to me, The Wave is this unleashing of all the potential of your creativity, lowering the barrier between your imagination and being able to create an expression of your emotions, of what you're going through. And it's just very cathartic and really quite amazing when you start to get into that zone of getting over that hump of feeling like you really suck. Because, you know, frankly, when you start to do something as complex as music, most people at the very beginning aren't going to be all that great. But it's one of those things that you can learn over time, and a tool like this can simplify the process a little bit so that it's going to at least sound good, and then you can start to play with the different rhythms and eventually go beyond what you're getting out of the box with the very simple setup they have at first, toward a full-fledged 3D immersive music sequencer that has all the features that you would possibly want, to be able to create the music that you've always wanted to create but were never able to, because of the hard part of getting over that learning curve. So I know for me personally, I'm really excited for experiences like this, because it's going to be, for me, kind of like the Tilt Brush of music creation. So, with that, I'll be giving my talk at the Silicon Valley Virtual Reality Conference today. I'll probably be giving some summary or highlights at some point on the podcast, maybe through the process of a conversation. We'll see, but I have got a lot of really interesting ideas, because, you know, over the last two years, I have been really thinking about the ultimate potential of VR. And part of my process has been, how do I think about the landscape of VR? Tipatat Chennavasin has come up with this great VR fund where he's painting this picture of the landscape of VR, and it's basically like this huge logo graveyard of hundreds of different companies and startups that are in this space, trying to really paint the picture of what's happening from a startup and enterprise lens. When I look at that, probably about 30 to 40% of those companies have been represented on the Voices of VR podcast as interviews. But yet there's all sorts of other dimensions of VR that were not on Tipatat's landscape, and ways that I think about it in my own mind in terms of what is interesting about VR.
And so as I started to think about the challenge of trying to summarize the 400 interviews that I've done so far in a 20-minute talk at this conference, I really started to think about, well, how do I think about the landscape? How do I think about the ultimate potential? How do I map it out? How do I describe it and think about it myself? I really had to come up with a framework for how I think about VR from the perspective of the human experience. And so that's kind of what I'll be talking about, and if you're not able to make it, then hopefully I'll be able to share the insights and graphics and visualizations that have come out of that. I think it's already starting to help contextualize for me the relationships between all the different dimensions of VR, as some sort of framework for describing the complexity of the human experience and how VR fits into all these different aspects of our life, where this is all going, and where you fit into that, as a listener to the Voices of VR, and what you want to do to help bring out all these different potentials. So that's what this podcast has been about. It's been about trying to figure out what that is. And I just wanted to thank you for listening and thank you for supporting me, for your emails, and for coming up to me. It has been great to be here at the conference and to get all that feedback, hearing all the different ways that people listen and what this podcast has meant for them. And if you do enjoy the podcast, then please do consider becoming a contributor to the Patreon at patreon.com slash Voices of VR.