It was on a San Jose sidewalk in 2015 that I first tried the SubPac, and it blew me away. I felt like I was immediately transported into a dance club, standing in front of giant subwoofers with soundwaves of bass rippling through my body, yet no one around me could hear a thing. The magic of the SubPac is that it translates the inaudible frequencies below 40 Hz into vibrating haptic feedback that provides a much more immersive experience.
I've seen a lot of different experiences at conferences over the past year using the SubPac to increase immersion, and I was able to catch up with business developer Zach Jaffe at VRLA to hear about some of their content partnerships and the new S2 Backpac. We talk about some of the VR experiences and songs with good low-frequency bass design that specifically take the SubPac into consideration. We also talk about the future of immersive sound design, and his prediction that music labels will want to remix albums to work well within spatialized audio environments.
LISTEN TO THE VOICES OF VR PODCAST
There isn't an open standard for spatialized audio yet, and so the music industry is waiting to see what formats emerge. At the moment, a fully immersive VR experience is the best option for full audio spatialization, but a yet-to-be-determined standard format for object-oriented or spatialized audio could be used with head-tracked headphones like the Ossic X, as well as with future versions of SubPac devices that incorporate directional bass.
Here’s a number of VR experiences and music videos that are SubPac-optimized:
- Jurassic World: Apatosaurus
- Suicide Squad: Special Ops VR
- The Wave Rave at VRLA
- Mr. Robot VR experience by Within
- The Presence Gear VR video
- Rocket Launch 360: Delta IV NROL-45 by Koncept VR
- In the Eyes of the Animal at Sundance
- Inner Activity Meditative Experience at SIGGRAPH 2016
- Run The Jewels – Crown (Official VR 360 Music Video)
- Grandtheft – Summer In The Winter | 360 VR
- Jazz Cartier – Red Alert / 100 Roses (360° Virtual Reality Video)
- Moderat – Running (Official Video)
3 @SubPac playlists to try out @tiltbrush's audio reactive brushes: https://t.co/Tdy5RH573X https://t.co/lPa8M3KrLZ https://t.co/p4RAeyT9wR
— Kent Bye (Voices of VR) (@kentbye) August 2, 2016
Donate to the Voices of VR Podcast Patreon
Music: Fatality & Summer Trip
Rough Transcript
[00:00:05.412] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to the Voices of VR podcast. So, over the last couple of years, I've been going to a lot of different VR expos and conferences, and tried out a number of different demos that have been using the SubPac, which is essentially like a haptic vest that you put on your back, and it translates all these sub-audible frequencies into visceral vibrations that just ripple through your body. So I had a chance to catch up again with Zach Jaffe, who does business development for SubPac, and we talked about how people have been using the SubPac in different types of experiences, from audio experiences all the way to horror experiences, to make them more immersive. So we talk about a number of different experiences and trends that he's seeing within the industry using the SubPac, as well as the future of music and immersive sound, and what he thinks is going to happen with all these music studios as they start to move and remaster a lot of their library into fully immersive experiences within VR. So that's what we'll be covering on today's episode of the Voices of VR podcast.

But first, a quick word from our sponsors. This is a paid sponsored ad by the Intel Core i7 processor. If you're going to be playing the best VR experiences, then you're going to need a high-end PC. So Intel asked me to talk about my process for why I decided to go with the Intel Core i7 processor. I figured that the computational resources needed for VR are only going to get bigger. I researched online, compared CPU benchmark scores, and read reviews over at Amazon and Newegg. What I found is that the i7 is the best of what's out there today. So future-proof your VR PC and go with the Intel Core i7 processor.

Today's episode is also brought to you by VR on the Lot. VR on the Lot is an education summit from the VR Society happening at Paramount Studios October 13th and 14th. More than 1,000 creators from Hollywood studios and over 40 VR companies will be sharing immersive storytelling best practices and industry analytics, as well as a VR expo with the latest world premiere VR demos. This is going to be the can't-miss networking event of the year, with exclusive access to the thought leaders of immersive entertainment. So purchase your tickets today while early bird pricing is still in effect at vronthelot.com.

So this interview with Zach happened at VRLA, which was happening at the LA Convention Center from August 5th to 6th. So with that, let's go ahead and dive right in.
[00:02:46.616] Zach Jaffe: My name is Zach Jaffe, and I do business development for a company called SubPac. We make physical audio technology that allows you to feel the virtual experience inside and out, so it's like you're at a live event, or, you know, if you would feel an explosion in VR, you would feel that in your body, using the pre-existing audio with our hardware.
[00:03:07.886] Kent Bye: Nice. And so I think I first tried the SubPac back at GDC 2015, the same time that the Vive was being announced. And so I was on the streets wearing the SubPac, listening to some music that made me feel like I was in a dance club. It was probably one of the most immersive experiences that I had at GDC, and I wasn't even in VR. I think the great thing I love about the SubPac is that you just put it on your back and you don't have to do any type of special software SDK integration. You just plug in the audio, and then all the kind of lower-register frequencies get translated into just visceral bass that goes directly to your body.
[00:03:45.505] Zach Jaffe: Yeah, that's right. And the reason we made it as, for lack of a better word, a dumb peripheral, is because there's not really a standard right now in audio. We have some incredible companies out there who are trying to create those standards. And I could go on for quite a while about those companies who are out there trying to really make a model for what we're going to do with audio. But we didn't want to come out there right now and say, this is how you have to use our product. We didn't want to make people use middleware. We didn't want to make people use an SDK and, you know, go back in with an API to start off with. Because this is a new form of technology, so having them easily fit it into their chain is the best way to do it. Because we want to take a collaborative approach to VR audio and really, you know, bring everybody up together and figure out what the needs are, who needs them, and how to solve those problems together using those other, you know, groundbreaking companies out there, you know, like the Ossics and Dysonics of the world, who are really trying to, you know, and Dolby and, you know, it's almost endless.
[00:04:50.862] Kent Bye: Yeah, well, I guess the thing when talking to Ossic, you know, one of the things is that they have to integrate a special SDK because there's no kind of standardized spatialized audio format right now. And I'm curious if that would even be helpful for a SubPac, because bass is kind of omnidirectional in a way. You don't really know which direction it's coming from. And so, you know, I could imagine you could try to do directional audio, but I don't know if that would even make sense or work.
[00:05:17.727] Zach Jaffe: Well, hypothetically, yes, we could do directional audio, and I'll say that we're constantly innovating, and we'll be making things in the years to come that I think will be interesting and relevant to your line of questioning there. Yes, bass is mono, but you still can have it coming from a different direction depending on where the subwoofer, or in this case the SubPac, is placed. So while bass is mono, physical bass may not be.
[00:05:54.531] Kent Bye: Yeah, I mean, I really see the SubPac as almost like this haptic suit that, to me, is bringing up the level of immersion. One good example that I saw at the Silicon Valley Virtual Reality Conference was this rocket launch experience while wearing a SubPac, and it just gave that extra visceral feeling of presence. And I think that a lot of experiences that I saw at Sundance, and cinematic experiences, and even here at the Rave, are using SubPacs as part of the whole setup. So it feels like it's one dimension of really easy-to-do haptics at this point. It's really kind of just plug-and-play; you don't even have to do anything additional other than some sound design.
[00:06:34.410] Zach Jaffe: Yeah, and Koncept VR's rocket launch experience is super cool. We love those guys. And I think since the last time we talked, we had really just started, almost two years ago now, really getting deep into this world. And fast forward two years, we've done content partnerships with everyone from Jurassic World to most recently Suicide Squad, where people are using SubPac and Koncept. And the Wave guys, who we're partnered with on the Wave Rave here at VRLA, which is a whole ton of people wearing SubPacs, Skullcandy headphones, 3D glasses, Gear VR headsets, while a DJ is performing in the Wave VR virtual reality performance platform. You can see that the range of use is continually growing. We've got people using it for meditative VR experiences now. And then we still have the creators using this to create content. Because it's not just about using the pack to demo content, it's about using it to create content. Because, as you know, we come from the pro audio world. People like Timbaland and Richie Hawtin are partners in the company. We work with Grammy award-winning artists all the time, because that's what the standard for VR audio needs to be. It needs to be the standard that we have in movies and games and music at the professional level. And right now, I demo the pack with a lot of different content. You know, I work out of the Upload Collective. Shout out Will and Taylor in San Francisco. And, you know, we have pretty much everything that's been released publicly, and some not publicly, on the Vives and Oculuses we have in there. And in my personal Gear VR, which I've spent way too much time in at this point. And not everything works with a pack, because sometimes the sound design just isn't there. And that's not a fault of ours right now, or necessarily a fault of the people who are making the content, but it speaks to what is happening in the industry as a whole, that even though people are starting to focus on audio, we're still not there. People are still mixing in white earbuds in their mom's basement for some of these experiences. And no offense to people who are doing that. That's awesome. I would love to empower more people to create as much content as possible, because that's how we're going to find the killer apps and videos and experiences of this new virtual world. As we move forward and push the boundaries of what we can do in audio, the message we really want to send to people is that we're here to help you make your content better. So you should create with it as well as showcase with it, because you get so much more room in your low-end mix. You can go down to five hertz, which is in the inaudible range. So you actually have narrative room to do physical cues in a virtual experience that you can't do just with your eyes and your ears.
[00:09:26.427] Kent Bye: What's the sort of normal frequency response cutoff for, like, earbud headphones versus, like, good headphones versus actually wearing the SubPac?
[00:09:34.789] Zach Jaffe: I mean, you read some headphone literature out there that says, you know, frequency cutoff at 10 hertz, which is not true or even possible. So usually probably about 40, 50, depending on, you know, the headphones. So we're adding in that whole extra bottom range. And we've had people actually creating sub-frequency content. You know, we've gone to deaf schools with Eskmo, who's an artist and friend of ours, Brendan, and he's created music that was only sub-frequency for deaf and hard of hearing first and second graders with a project called Feel Harmonic. So there's a lot of different tools you can use just with those low frequencies without having to go into extra coding, and this is something your sound designers can use. And I talked to some unnamed big sound designers for a couple of gaming and hardware companies who were telling me, you know, like, check out our beautiful studio that we do all our mixing in, you know, go into the editing bay, and they're like, we've got a million dollars of equipment. And we start our mixes in there, and then we go to here, and they pick up a pair of headphones. And they say it's killing us; we can't use the sub, it's turned off. Because eventually, no matter what, if you're mixing VR audio in an editing bay, it's still going to have to go to headphones for your final mixdown regardless. So that solves again the problem that we solved for bedroom producers when we started really in the music world in the early days, giving them a reference tool that was pure and clean and able to be used anywhere and everywhere, and more accurate than the ear, because everybody was producing in headphones in their bedroom. The same thing is now happening at the professional level in VR, because we have these audio professionals who have these beautiful systems that are finely tuned and very expensive and paid for by their studios, but they still, no matter what, have to end up mixing in headphones at the end of the day. So we give them back that low-frequency response that they're missing when they're using headphones, just like the underground bass music producers that we started with originally.
[00:11:32.604] Kent Bye: And so when somebody's composing in the sub-audible frequencies, just below the headphones' range, from 40 Hz down to 5 Hz, that's 35 Hz of frequency range to be able to play with. And so how does somebody compose within that spectral range?
[00:11:49.175] Zach Jaffe: Well, first of all, use a SubPac. Actually, a good example of this is the Presence VR experience. Have you done that? It's a couple guys here in LA. It's actually a very, very interesting proof-of-concept video. So it's an active and reactive piece of video content. So there's a seance going on with a bunch of kids in an apartment, and you can choose the direction that you look around the apartment. You can either sit at the table during the seance, or go into the kitchen, the living room, the hallway, and the video continues going on while that's happening. Or you can do the reactive mode, where depending on where your gaze is in the video, you'll be transported to different parts, and, you know, it's a spooky thing. The seance goes on, a kid's nose starts bleeding, he goes in the bathroom, eventually people start dropping dead, becoming zombies. So it's kind of a spooky experience, but they took the SubPac and added sub-frequencies into their sound mix to scare people even more than they would have otherwise. So, instead of seeing something frightening or hearing something frightening, they're feeling something frightening. I can't wait to play Resident Evil 7 on the PSVR with the pack and feel some zombies coming up behind me. I like to be scared shitless.
[00:13:05.253] Kent Bye: So is it basically using like any sort of like MIDI keyboard and just bumping down the frequency so that when you're playing things you actually feel it in your body like live?
[00:13:12.724] Zach Jaffe: Yeah, it's literally just using that bottom-end room like you talked about to mix where you wouldn't have been able to before; now you can.
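To make that bottom-end room a bit more concrete, here is a minimal Python sketch (using NumPy and SciPy) of splitting out the sub-40 Hz band that a tactile device like the SubPac turns into vibration. The 48 kHz sample rate, fourth-order Butterworth filter, and 40 Hz cutoff are assumptions chosen for illustration, not SubPac's actual signal chain.

```python
# Illustrative sketch only: isolate the content below ~40 Hz that a tactile
# transducer such as the SubPac renders as vibration. Sample rate, filter
# order, and cutoff are assumptions, not SubPac's actual DSP.
import numpy as np
from scipy.signal import butter, sosfilt

FS = 48_000  # assumed sample rate in Hz

def sub_bass_channel(audio: np.ndarray, cutoff_hz: float = 40.0) -> np.ndarray:
    """Low-pass a mono signal to keep only the sub-bass 'feel' band."""
    sos = butter(4, cutoff_hz, btype="lowpass", fs=FS, output="sos")
    return sosfilt(sos, audio)

if __name__ == "__main__":
    t = np.arange(FS * 2) / FS                   # two seconds of test signal
    mix = 0.5 * np.sin(2 * np.pi * 25 * t)       # 25 Hz rumble: felt more than heard
    mix += 0.5 * np.sin(2 * np.pi * 440 * t)     # 440 Hz tone: heard, not felt
    feel = sub_bass_channel(mix)
    # The 440 Hz tone is strongly attenuated; the 25 Hz rumble passes through.
    print(f"full mix RMS: {np.sqrt(np.mean(mix ** 2)):.3f}")
    print(f"sub-bass RMS: {np.sqrt(np.mean(feel ** 2)):.3f}")
```

The practical takeaway from the conversation is simply that anything a sound designer places below roughly 40 Hz lives in this band, so it can carry physical cues without cluttering the audible mix.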
[00:13:20.638] Kent Bye: Now, when I was at SIGGRAPH, there was an experience there which felt like this meditative experience where I was sitting in this tent and wearing a SubPac, but I felt like the use of the bass was so constant that it didn't have this rhythmic kind of pulsing, and so it just didn't have any tension between going either slower or faster. I almost wanted just this rhythmic kind of pulsing. So when you've looked at different ways that people use the SubPac, what have you seen as the best practices for how to actually use some of these sub frequencies?
[00:13:52.968] Zach Jaffe: Well, shout out to Anshul and her Inner Activity team. If you've been to gaming conferences for the past couple of years, you'll have seen their tent. They're some recent grads of USC's grad program in the MxR lab. And, you know, they, and I would say the Presence guys that I talked about, the Jurassic World Apatosaurus experience that's been out for some time, and actually the Mr. Robot experience by Within, have incredible sound design in the low frequencies. And as for a standard in how to do it, you know, instead of just mixing in your headphones, you incorporate the pack into your mixing, whether you're a sound designer or, you know, moonlighting as one because you're a one-man army working on your content. No matter what, having that incorporated into your mixing process will let you understand how that mix needs to be, because the body is just such a powerful tool when it comes to mixing audio.
[00:14:53.993] Kent Bye: And, yeah, I'm wondering if you could talk a bit more about these content partnerships, because I imagine that the SubPac is going to be used in digital out-of-home entertainment experiences, but I'm just curious what some of these early content agreements that you've come to actually look like.
[00:15:10.120] Zach Jaffe: Well, first of all, talking about our out-of-home experience, we just opened our first co-branded movie theater with CGV in Seoul, South Korea, which is pretty cool. I know it's not VR related, but it's individualized control seats with a SubPac OEMed into the actual seats and headphone bass. So there's two theaters, 75, so we can actually take our tech, it doesn't have to necessarily be in the form factors that you see on the website, and put it into other things. But in terms of actual content partnerships, Samsung just got a whole bunch of units from us that they were using for the Suicide Squad premiere. They had a VR experience that I think is available on Samsung VR. And more recently coming up, there are some things I can't quite talk about yet. But for most of our partnerships, we're trying to reach out, me specifically and our other team member Ryan, to find people who are already doing, you know, proper sound design and getting them to, you know, showcase with us. And, you know, we're going to be working on a developer program where things are going to be, quote, you know, SubPac approved, meaning something that we know feels good and the sound design is proper. And, you know, that's how we want to help bump up that standard that doesn't quite exist, at least in terms of quality. Not necessarily in terms of software, or, well, obviously the hardware that's being used, we want to be ours, but just the bar of quality, you know, needs to be bumped up, and I think we're finally starting to turn a corner on that point. And, you know, you had those great podcasts a couple weeks ago focused on audio with, you know, Jason from Mosaic and Pete Moss at Unity, and I really enjoyed listening to those gentlemen speak, because I think they have a great vision of where things need to be in terms of bar of quality, not necessarily in terms of standards, even though those standards need to be developed. I'm really optimistic about where things are going to be going, especially in the next six to 12 months.
[00:16:56.811] Kent Bye: Yeah, and by the time this podcast is released, you're going to be releasing some new hardware technology. Maybe you could talk a bit about what that is.
[00:17:04.053] Zach Jaffe: Yeah, actually, so finally, after years of requests, we are actually going to be taking the SubPac S2, which is our seated model, and turning it into a convertible backpack. So you'll be able to buy what's going to be, at least tentatively right now, called the Backpac, B-A-C-K-P-A-C, and you'll be able to modularly insert your S2 into the back of the backpack. So you'll be able to have your seated experience and your wearable experience, and it's just large enough that you can fit a nice-sized Alienware PC to power a mobile VR experience without being tethered to anything. So, really interested to see how people start to use that.
[00:17:46.754] Kent Bye: Maybe you could describe to me a little bit more, like how is that different from the existing version?
[00:17:50.457] Zach Jaffe: So the existing version, the M2, doesn't have any compartment on it. It's a different use case. So when we came out with the S2 and the M2, the S2 was going to be for seated VR and music studio production, home listening experiences. The M2 was made for active lifestyle and mobile VR gaming experiences, because unless, you know, you're going the direction that I was just speaking of, having a powerful PC in the bag, self-contained, powering the HMD with a battery, you don't necessarily want a giant thing on you while you're roaming around in VR, you know. The M2 is slim, a nice form factor. Like, you know, it's being used at the Rave. You can dance around, it doesn't encumber you. It's less than four pounds, and it still packs a punch. So this is going to be more for the person on the go, not just specifically geared towards, you know, a use case of active and mobile VR experiences, but everyone, and especially, you know, kids moving around campuses. You know, now they get their bass in the classroom or on the street or in the dorm room. And hopefully this, you know, will help RAs everywhere not have to write as many decibel-level violation tickets.
[00:19:01.770] Kent Bye: Does this new backpack have front sensors at all, or is it just on the back still?
[00:19:05.631] Zach Jaffe: Still just on the back. But we are constantly innovating and experimenting with new things, and you can expect a lot more from us within the next year. And we're really excited to work with each and every person in the VR community, from other hardware manufacturers out there to the content providers. We're here to work with you and raise the bar of what we're all doing, and help create this marketplace for people who are going to be feeling and not just hearing and, for lack of a better word, being as immersed as possible, even though I really hate saying that word these days.
[00:19:38.725] Kent Bye: I'm gonna start looking up synonyms for immerse. Do you have any favorite stories or anecdotes of people using a SubPac in VR?
[00:19:47.371] Zach Jaffe: Yeah, actually, so I believe it was at VR Fest. I don't remember the gentleman's name, but he had created this kind of cool, castle-like experience, and it was a mystery. We wouldn't say he was a professional game designer; it was more of a passion project for him. He was a composer and sound designer mainly. So I was demoing the game, and I told him, the sound design in this is really good. And it was a pretty crude experience. And I told him about the pack, and we had brought enough for other exhibitors to use. He was kind of hesitant at first, saying, you know, I'm in the middle of doing demos, I don't know if I want to incorporate something new. I said, look, just take it, you know, take it with you, leave it here. You know, if you don't want it, in 10 minutes I'll come back. So I walked across the room back to my booth, and I'm standing there, and, you know, almost as soon as I had walked across the room, I hear somebody yelling, hey, hey, hey. And I turn around to see this gentleman sprinting across the room, and he just runs up to me and gives me a hug and says, man, I made that experience. I've played it thousands of times. That was like doing it for the first time, when he incorporated the pack into it. Stuff like that is super special to me, and that's happened a few times on a couple different content pieces over at different conferences, where I'll be showing something and someone will sit down and say, what do you got? And I'll say, XYZ is what we're showing today. And they'll sit down, they'll do it, and they'll go, so, I didn't want to offend you and say I didn't want to do it, but that was my content. I had never done it like this before, and this is the only way it should be experienced. So that's fun. We like to give life to people's projects, give them positivity. We want to spread bass like we've always done. And bass is a positive, energetic thing. We've got basketball players in the NBA out at the Olympics right now rocking the SubPac to be inspired, and, you know, we want to just inspire people to create cool content and feel inspired when they're doing this beautiful stuff that everybody is creating, because there's so much beautiful stuff out there, and, you know, to not feel it and not have that visceral physical sensation in your body, you know, for some of it is just a disservice, because you should be able to reach through the digital medium in a physical way to bring people together, and that's really what we want to do. I think the physical sensation of sound is something that can bind us together and something everyone can identify with, not to speak in too lofty of terms, but feeling sound is happiness, and there's nothing we would want more than for everyone to be happy in VR.
[00:22:25.397] Kent Bye: Yeah, and because you're so tied into the music industry at large, I'm also curious whether you've seen a shift or a sea change in terms of doing more than just 5.1-channel mixes, and doing full spatialized audio mixes in the process of creating music. In addition to, you know, the more haptic and tactile side, do you see a larger trend within the music industry of moving more towards spatialized audio?
[00:22:52.219] Zach Jaffe: I think it will be. I don't think it is just yet. Talking to a lot of record labels and artists on those major labels, and even some of the smaller indie ones, all of them are looking at VR with wide eyes, going, oh my god, this is a new, amazing medium in which to create and, for them, to monetize music videos again. So we're at a point where people are still experimenting with 360 video and virtual reality. But there have been some pretty amazing pieces of VR music video content so far, like Sasuk in Germany made a video for Moderat's "Running." It's available on Oculus; I think they'll be porting it to the Gear pretty soon. The Run the Jewels "Crown" video is absolutely incredible. Grandtheft's "Summer in the Winter" is pretty awesome. And then Jazz Cartier, a hip-hop guy out of Toronto, has a video in the Samsung VR store called "Red Alert / 100 Roses," which is a pretty rad piece of content too. But no one's really experimenting with spatialized audio, at least publicly, yet. They're really kind of seeing how to even use the medium visually, but the interest that we're seeing in that space is, you know, off the charts. And I, for one, as just an old-school music video fanatic, can't wait to see the type of narrative music videos and then, you know, the audio that comes with it, spatialized with, you know, companies like Ossic, who are going to be able to combine with us and those video content creators and make something that's truly special.
[00:24:20.363] Kent Bye: Yeah, I think probably one limitation is that even if they produced it, the output of it would probably have to be within the context of a virtual reality experience, because you have the head tracking to actually be able to determine where people are looking, to actually experience the spatialized nature of the audio. Now, Ossic is kind of creating a system to be able to have that independent of VR, but that hasn't been released yet, and there's pretty much nothing else out there where you can really actually listen to the audio unless you have a whole huge array of ambisonic speakers and a huge, you know, $30,000 to $50,000 setup.
[00:24:52.437] Zach Jaffe: Right, so I guess only time will tell, and I think that we'll be there way quicker than we anticipate, and I can't wait to really be able to experience the audio that I've loved my whole life in, you know, new ways. Because I think that's actually coming to the VR space: we've had remasters and remixes of classic albums like Zeppelin, The Beatles, Neil Young, Bob Dylan. All those old catalogs have subsequently been remastered as time's gone on, because you were able to listen better, going from mono to stereo, and then opening up the frequency range because now we had speakers that could go down lower and higher for personal listening purposes. And I think that's going to happen again in VR. I see no reason why major labels won't take old catalog content and say, oh, you loved Abbey Road? Well, wait until you see Abbey Road in a full stereo VR, 360 experience, kind of like the Beatles LOVE show in Vegas.
[00:25:55.303] Kent Bye: Wow, so because they have the individual recordings of each of the instruments, they'll be able to potentially kind of recreate a spatialized version of it.
[00:26:01.690] Zach Jaffe: Yeah, I totally see that. I mean, why not? They have it. It's not like it's not going to sell.
[00:26:08.036] Kent Bye: Great. And finally, what do you see as kind of the ultimate potential of virtual reality, and what it might be able to enable?
[00:26:15.574] Zach Jaffe: I just, you know, I love the idea of being able to go someplace that you've never been, whether it's a real place or, you know, something that came out of somebody's imagination. And I think the collective ability to go places together, but in VR, is going to do wonders for human connection. Not in the sense of becoming empathetic towards somebody else, but friendships. I know a girl who met her husband in World of Warcraft, and that's not the only story like that. There's a lot of people out there who meet significant others and dear friends through a digital world, a game, or a platform like Second Life. That stuff happens all the time. And I think that type of human-to-human connection is going to come much quicker and be way more prevalent once we go into VR. Unless, of course, we're using fake avatars, Ready Player One style, where no one knows what everyone looks like, and we're modulating our voices, and it's completely confusing, and everybody's lying and wearing a mask. But I hope that doesn't happen. I'm sure there will be places for that, but I hope for open and honest human connection. And then they can feel each other with SubPacs. We'll all communicate through low-frequency transmissions, like whales. So yes, I'm excited for everyone to become whales in VR in the future.
[00:27:39.780] Kent Bye: Awesome. Anything else left unsaid that you'd like to say?
[00:27:45.221] Zach Jaffe: Love your podcast, man. And everybody out there, looking forward to working with you. Keep on doing the amazing, beautiful work you're doing. We're still only at the beginning, and can't wait to see what comes next.

Kent Bye: Awesome. Well, thank you so much.

Zach Jaffe: You're very welcome, Kent.
[00:28:02.206] Kent Bye: So that was Zach Jaffe. He does business development for SubPac, which is an audio device that translates all the sub-audible frequencies into haptic feedback within immersive experiences.

So I have a number of different takeaways from this interview. First of all, usually the subwoofer is a non-directional, mono experience within a typical stereo setup, but it sounds like SubPac is eventually going to be able to start experimenting with directional audio when it comes to physical bass. So that's something that's within the technology roadmap, I think, that they're going to be working on, and it kind of makes sense as the next step: being immersed within a VR experience and then being able to have different directional and positional sound. I think with something like that, they're going to have to move away from something that's been pretty lightweight, with no SDK or special integration, and it sounds like that level of integration is, at some point, either going to need an open standard that can handle it, or they're going to have to develop their own SDK and have people do special integrations with their backpack peripheral.

So I think that's been one of the beautiful things about the SubPac, and why it's one of my favorite VR peripherals: it's so lightweight to do an integration. You don't have to do anything specific other than a little bit more sound design in the sub-audible frequencies, from like 5 Hz to 40 Hz. These are things that you can't necessarily hear with your ears when you're mixing with headphones, something where you really need to actually be wearing the SubPac in order to experience the effects of some of those lower registers. So again, among the experiences that I've seen, I really like it when there's a little bit more heartbeat or pulsing behavior. I think if you just turn on the SubPac and have a constant low rumble, it doesn't necessarily give enough of a dynamic experience to really stand out. So the experiences that I've had that are really successful are kind of using it for different impulses.

And so I thought it was really interesting to hear some of Zach's insights on the music industry: how a lot of the different big music labels are likely, at some point, going to start to remix a lot of their albums so that they're kind of a fully immersive experience. Right now there's not really any other way to experience that other than through a VR experience. I think the Ossic X headphones are going to be another way to be able to actually move your head around and have some sort of way of tracking your head position so that it can actually change dynamically what you're hearing through the headphones. But I think at some point they're also going to need, as an industry, a bit of an open standard for how to treat some of this spatialized audio. I know that using ambisonic recordings you're able to have these four different channels, and from those four channels you're able to capture a sound field that you can then tweak and adjust based upon how your head is turned. But in terms of a lot of the game engines, there are going to be a little bit more dynamic room reflections and other things, and the more professional approaches like Dolby Atmos are kind of at the pinnacle of integration, being able to tweak and change on the fly.
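To make that head-tracked sound-field adjustment a little more concrete, here is a minimal sketch of the core operation a head-tracked ambisonic renderer performs: counter-rotating a first-order B-format (W, X, Y, Z) recording by the listener's yaw before decoding to headphones. The channel ordering, the FuMa-style W scaling, and the yaw-only rotation are simplifying assumptions for illustration, not any particular product's implementation.

```python
# Minimal sketch: counter-rotate a first-order ambisonic (B-format: W, X, Y, Z)
# sound field by the listener's head yaw, so sources stay fixed in the world as
# the head turns. Channel order and sign conventions are assumptions; a real
# renderer would also handle pitch/roll and then decode to binaural.
import numpy as np

def rotate_bformat_yaw(bformat: np.ndarray, yaw_rad: float) -> np.ndarray:
    """bformat: array of shape (4, n_samples) holding the W, X, Y, Z channels."""
    w, x, y, z = bformat
    c, s = np.cos(-yaw_rad), np.sin(-yaw_rad)   # counter-rotate by the head yaw
    x_rot = c * x - s * y                        # rotation about the vertical axis
    y_rot = s * x + c * y
    return np.stack([w, x_rot, y_rot, z])        # W and Z (height) are unchanged

if __name__ == "__main__":
    n = 48_000
    mono = np.sin(2 * np.pi * 220 * np.arange(n) / n)
    az = np.deg2rad(90.0)                        # encode a source 90 degrees to the left
    bfmt = np.stack([mono / np.sqrt(2),          # W (omni), FuMa-style -3 dB convention
                     mono * np.cos(az),          # X (front/back)
                     mono * np.sin(az),          # Y (left/right)
                     np.zeros(n)])               # Z (up/down)
    # If the listener turns their head 90 degrees to the left, the rotated field
    # should place most of the source energy back on the X (front) axis.
    rotated = rotate_bformat_yaw(bfmt, np.deg2rad(90.0))
    print("front-axis energy before:", float(np.sum(bfmt[1] ** 2)))
    print("front-axis energy after: ", float(np.sum(rotated[1] ** 2)))
```

The same per-sample matrix multiply is what lets a head-tracked playback system keep the mix anchored to the world rather than to the listener's head, whatever open or proprietary format the channels are ultimately delivered in.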
So I think that at some point we'll start to see some sort of open standard that is a competitor to Dolby Atmos, and once that comes into place, then I can expect to see some of these other audio peripherals, like the Ossic headphones or the SubPac, be able to take advantage of that without having to do their own specialized SDK. But for right now, as a stopgap solution, I think we're going to be seeing a lot of these kind of one-off SDKs to do special integrations, to take full advantage of some of these immersive audio soundscapes that may not necessarily be bundled in with a VR experience that is using something like Unity or the Unreal Engine to take care of a lot of that spatialization. Also, SubPac has now launched the S2 Backpac, so you can go check that out.

So that's all that I have for today. I'd like to just thank you for listening, and if you would like to support the podcast, then tell your friends, spread the word, and become a donor. Just a few dollars a month really does make a difference and adds up in the end. So become a donor at patreon.com slash Voices of VR.