#1456: Turning Your Phone into a Spectacles AR controller with DB Creations’ “Tiny Motor Arcade”

I interviewed Dustin Kochensparger and Blake Gross at the Snap Partner Summit about the Snap Spectacles. See more context in the rough transcript below.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So continuing my series of looking at different announcements around the Snap Spectacles, today's episode is with the independent game development shop called DB Creations that has an experience called Tiny Motors Arcade. So DB Creations is another independent mixed reality development studio. They've been building some AR games and some VR games. This is a little bit more of an AR game where they wanted to see to what degree you can start to look at other existing toys, like drones or RC cars, and then turn your phone into a controller. So you have all the different controls that we're all familiar with from the last number of generations of video games, and then use that as an input controller paired with the Snap Spectacles. So you can have a little bit more of an abstracted input mechanism, but also play around with games where you need to have very precise and timed clicks and inputs. So I think probably most of the eventual AR games that are really AR native are going to focus on the affordances of hand tracking and other gesture-based controls. But both Dustin and Blake see that it's a little bit too early for some of those best practices for hand gestures and hand interactions with XR games to be fully translated into what's going to work well for the Snap Spectacles AR standalone headset. So again, this is another independent development shop that was getting some financial support from Snap. And also, a lot of the stuff that they're developing is going to be baked into some of the core operating systems. So you'll have lots of other ways that you could use your phone as some sort of input device that you could scan and rescan and have different controls on there as well. So I know there's also using your phone as a golf club, as well as using it as a controller to control these different games within the context of their game called Tiny Motors Arcade. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Dustin and Blake happened on Tuesday, September 17th, 2024. So with that, let's go ahead and dive right in.

[00:02:16.156] Dustin Kochensparger: I'm Dustin Kochensparger. I'm one of the co-founders of DB Creations. We're an MR game studio based in Bellevue, Washington. And specifically at the studio, I work on a lot of the biz dev, production, project management, and keeping the team running.

[00:02:28.951] Blake Gross: I'm Blake Gross. I'm the other co-founder of DB Creations. And then I'm in charge of a lot of the technical areas, as well as UX. And I was the creative director on the lens we're releasing for the spectacles called Tiny Motors Arcade.

[00:02:40.977] Kent Bye: Great. Maybe you could each give a bit more context as to your background and your journey into this space.

[00:02:46.078] Dustin Kochensparger: Sure. So I went to school for game design at the Rochester Institute of Technology in upstate New York. And then after I graduated, I ended up working at Bungie for seven years in AAA game dev, working as a producer on the Destiny project. And so I was doing a lot of team management stuff across all sorts of things, from cinematics and art all the way through monetization and retention design. And I met Blake when I was at school at RIT. That's where we originally started working together. And then after a bit of time in industry, we decided we wanted to get into the XR space more completely and start working together on the new studio that we founded, DB Creations.

[00:03:19.885] Blake Gross: Yeah, and then prior to DB Creations, I was at Microsoft. So I started working, well, I was a DK1 backer. I was an early believer. And also, my first experience was at Disney Quest. So that really got me hooked. It was the Disney Quest Aladdin VR experience. And that kind of stayed with me. And then I was a DK1 backer. But then my first professional experience was helping Minecraft get authentication working on Gear VR when I was on Xbox Live. And then I worked on HoloLens enterprise applications from about 2017 to 2019. And then I left to start doing professional AR game development.

[00:03:54.234] Kent Bye: So I know that in a lot of different gaming and VR gaming, there's traditional gaming. Why AR or mixed reality gaming? What's the draw or allure for why this was an interesting area for you to look into?

[00:04:06.536] Blake Gross: I think for us, there's a lot. I mean, we constantly draw from the power of taking childhood experiences playing with toys and bringing them to life in AR, and that's been a huge theme. And the other kind of big theme for us has been multi-user. We both grew up where you would have friends over and do local gameplay sessions in front of the TV. That was a real bonding experience. That kind of disappeared a bit with the advent of online gaming, and local AR gaming kind of forces you to have that connection again, where you're both playing in the same space. So a lot of the work we have done is in multi-user AR, and it's just magical when you can both kind of see these shared hallucinations together and be interacting with them.

[00:04:45.526] Dustin Kochensparger: Yeah, I mean, the most magic moment I remember was when we were building our first AR game together and we just got, like, gray cubes networked and connected in the same way. We were just using mobile at the time. And the fact that I could move my cubes and he could see it and he could move his cubes and I could see it, there was something that just clicked for me about, like, that experience. It was like, wow, now we're having this shared experience together that's just for us that, like, nobody else can see, but we're participating in this way that interacts with our world together. And we find that that's just really sticky with people compared to doing a thing on your own. When multiple people are sharing together and experiencing that, it really goes to another level that I think is just awesome.

[00:05:21.336] Kent Bye: Great. So maybe you could give a bit more context to your historical relationship with Snap and working with some of their previous platforms, both their mobile platform, AR lenses, Lens Studio, and Spectacles, previous iterations, just to catch me up a little bit of your historical relationship with Snap.

[00:05:37.805] Dustin Kochensparger: Sure, yeah. So we started working with Snapchat actually while we were working on a VR project at the time. And we used their face lens system to build a promotional piece that we were kind of playing around with, because we hadn't done much with the Snap platform. We were working in, like Blake was talking about, sort of traditional VR at the time. And they saw the sort of rigged face model that we made as a promo piece and reached out to us, actually, and were like, hey, you guys are AR developers and you do this VR, AR stuff. We're looking for people like you to work with. We should talk more. And that's how we learned about the Ghost Innovation Lab, which was the first place we started working with at Snapchat. And we built a strategy game for that. And we also learned about their connected lens flow and doing multiplayer together. And so that was the first thing we kind of built with that. And then Blake, do you want to take it from here with where we got after that?

[00:06:21.272] Blake Gross: Yeah, so that was the start, kind of building connected strategy game lenses. And that one won Lens of the Year at their awards that year. And then we got another contract through their Ghost program to continue building mobile lenses. And we built a city builder called Table Towers, which won Lens of the Year last year. And then we also built an archery game that's kind of a normal room-spawning type thing. And we built a sushi eating game that's had like 700 million plays. So we've had a big experience with them on mobile, but then we also got the opportunity, I think in early 2023, to build a prototype on the 2021 Spectacles that was playing with the custom controller and all this stuff, which we then got kind of a full project approval for about a year ago that turned into the lens that we showed you today. So it's kind of been this journey from building these higher quality AR experiences for the mobile audience, to transitioning to headset, and then building these kind of larger scope AR headset projects for their glasses.

[00:07:17.133] Kent Bye: What's the Ghost Innovation Lab?

[00:07:19.275] Dustin Kochensparger: So did they rebrand it to just Ghost? Yeah. I think it's just called Ghost now. It was called the Ghost Innovation Lab at the time. But it is their sort of developer relations program with funding that connects to developers that want to push the platform in new and different ways and kind of see where the limits of Lens Studio are, essentially. And so we work closely with them when they say, hey, we've got this new feature rolling out, we'd like to see how it could be developed or built into a lens of some complexity. We'll work with them to figure out a way to build something (for us, it's a game; we're a game studio, right?) to test that feature, push that functionality, and give them feedback on that to help improve the product before maybe they'd roll it out to the larger audience of Snap developers.

[00:07:57.006] Kent Bye: And as far as I can tell, with this new Snap Spectacles platform, the only way to actually get experiences onto the platform is if you use their own Lens Studio and not something like Unity or Unreal or other traditional game engines. And so as game developers, I'd love to hear some of your thoughts in terms of whether there have been enough of the interactive features to make it a proper game engine, or if you've also been at the frontiers of nagging them and pushing them in order to get more and more interactivity and to become more and more of a proper game engine.

[00:08:28.303] Dustin Kochensparger: Sure, yeah. The only piece I'll say, and then I'll pass it over to Blake, who has much to say about this topic, is we do love the speed of innovation on Lens Studio. Building in Unity is something we do a lot with some of our other partners and platforms, and we love using it. But it can take a bit of time to get something up and running, even in a gray box form. And Lens Studio is just very quick to get going, because it's an AR-first engine. And so we like that a lot for, like, quickly testing out ideas. I mean, we were able to ship multiple fairly complex lenses in less than three months, you know, and it's pretty hard to put out a Unity product of any quality or worth looking at in that time. So we love that a lot about working with the platform. But Blake has a lot to say on that, I'm sure.

[00:09:04.162] Blake Gross: Yeah, I think it's kind of both of the things you said: it's a full enough engine to build experiences on, while it also has, I think, the limitations that come with a platform-proprietary engine, where you have to work with the team. And I think a lot of the value that we, and developers like us, bring is giving them that type of feedback of, OK, here's where the spatial capabilities need to be improved, here's issues with the audio system we're hitting. There's a lot of platform validation work that we're definitely assisting with. But I really do enjoy working with Lens Studio overall. I think my controversial take maybe is that I think that the path they're taking is going to become a bit more popular. I wouldn't be surprised to see Apple really doubling down and making their own tool chain a bit better. I think there are a lot of advantages to working with tightly coupled software within that. I think Unity, especially AR Foundation, is not very user-friendly. It's very complicated to get started. It can be difficult to get Unity in the right state, and there's all this packaging stuff. It's not a pleasant developer experience, especially for new developers. I think one of the real advantages Snap has over other platforms is that you can get really novice developers creating interesting stuff in their ecosystem pretty quickly.

[00:10:14.586] Kent Bye: OK. And so maybe you could give a bit more context for how the project that you just showed me, how that came about. Because this is such a new platform, there's no real market. And is Snap funding these things? Or how did it come to be?

[00:10:28.464] Dustin Kochensparger: So yeah, we partnered closely with Snapchat from the beginning to build this product. And so we worked with them on the funding side as well as the creative side. And they're a great partner in that way, in that we work together to kind of figure out, hey, how do we want to push this thing? What are we trying to prove out? Obviously, we're starting from games, so we're kind of always talking about, let's do a game of some kind. But we really wanted to see, for this one especially, how we could integrate a controller. Because we think that's a very comfortable place for people to come into things like this, because everything else is so new and different that at least you're putting your hands on something that feels like a controller. If you've ever played any video game console, you'll immediately feel a bit more at home compared to learning a gesture set or something, which is kind of overwhelming for people. So that was our core conceit: OK, we need to build something that uses a controller input. What would be fun with that? And we'd been toying around, like Blake said, with the earlier iteration of the specs to do an RC vehicle prototype, because we thought that would be a good fit. OK, what's a natural thing that I want to do indoors, but it's kind of annoying to set up and do? It's like, oh, RC vehicles are a thing from our childhood that we used to play with. They were a lot of fun until they'd knock stuff over and you'd get yelled at. And so we kind of wanted to say, OK, let's bring that to this experience and design a game that's going to work to those strengths and allow us to build something cool.

[00:11:37.170] Kent Bye: So did you design the controller, or is that something that Snap had developed as part of their Lens Studio?

[00:11:42.663] Blake Gross: We designed the controller. The intention for the controller is that it's going to become generally available for developers. And I think the technical capability of it is just a Camera Kit application, basically. So I think developers will be able to customize that in the future. But we designed the one for our game.

[00:11:58.840] Kent Bye: So you have your Snap Spectacles, and then you have a phone that's tethered to that. And so maybe just talk through a little bit, because you showed me your headset and your phone. But it sounds like whatever phone is associated or paired with the Spectacles is going to be the thing that then gets transformed into a controller, so using your phone as a touchpad controller.

[00:12:18.911] Blake Gross: Yeah, that's exactly right. Basically, when you set up your Spectacles, you have to set it up with a phone, and it gets paired to their custom Spectacles application. And then when you launch our game through the lens carousel on the Spectacles device, the controller just pops up. You don't have to do anything special, and then you're in and playing our game. So it's just integrated into their application directly.
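To make the phone-as-touchpad idea concrete, here is a minimal, hypothetical sketch of how normalized touch positions from a paired phone could be mapped to steering and throttle for a virtual RC car. None of this is Snap's actual Spectacles or Camera Kit API; the type names, event plumbing, and tuning values are illustrative assumptions, and in a real lens the touch samples would arrive from the platform's controller integration.

```typescript
// Hypothetical sketch: mapping a phone touchpad position to RC-car controls.
// The names here are invented for illustration, not Snap's actual API.

interface TouchSample {
  x: number; // normalized 0..1 across the phone screen, left to right
  y: number; // normalized 0..1 down the phone screen, top to bottom
}

interface DriveInput {
  steer: number;    // -1 (full left) to +1 (full right)
  throttle: number; // -1 (reverse) to +1 (forward)
}

const DEADZONE = 0.08; // ignore tiny finger drift near the center of the pad

function applyDeadzone(v: number): number {
  if (Math.abs(v) < DEADZONE) return 0;
  // Rescale so output still reaches +/-1 at the edges of the pad.
  return Math.sign(v) * (Math.abs(v) - DEADZONE) / (1 - DEADZONE);
}

// Convert a raw touch sample into steering/throttle for the virtual car.
function touchToDrive(t: TouchSample): DriveInput {
  const steerRaw = t.x * 2 - 1;    // center of pad = straight ahead
  const throttleRaw = 1 - t.y * 2; // top of pad = full throttle
  return {
    steer: applyDeadzone(steerRaw),
    throttle: applyDeadzone(throttleRaw),
  };
}

// Example: a touch near the top-right of the pad drives forward and right.
console.log(touchToDrive({ x: 0.85, y: 0.2 })); // steer ~0.67, throttle ~0.57
```

The deadzone-and-rescale step is the main design choice here: without it, a resting finger on a touchscreen (which has no physical centering spring like a gamepad stick) would constantly nudge the vehicle.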

[00:12:38.863] Kent Bye: OK. And so you've had access to the Snap Spectacles for a while. What are some of the other apps or experiences that you've enjoyed seeing and that you think starts to unlock the potential for where this all might be going?

[00:12:51.038] Blake Gross: I mean, our friends at Wabi Sabi made a Capture the Flag lens that we think is really awesome. I mean, like we said, we love multiplayer experiences, and they kind of tackled that challenge for launch. So that's really cool. It's always fun to see games.

[00:13:03.272] Dustin Kochensparger: I really like the one that Evan showed off, the just-generate-stuff one. Like, it's very, very simple. It's complex to build, I'm sure, but functionally it's very simple for users, and I think it's a really fun one to get people in that creative mindset of just, like, no, say anything and it will make it for you. I love seeing things like that for new technology, because it is so foreign to people. And like we said, you know, we want to bring that sort of magic of childhood memories to these platforms to give people that sort of comfortable experience to get into. And I think that one's just a really good way to get people thinking in the mindset of, oh, okay, this is how these things work and can behave and do. And so I like that one a lot, because I think it does all those things really well.

[00:13:40.894] Blake Gross: Yeah, I don't think they talked about it today, but I think that lens is multi-user as well. So multiple people can be in a session together, all kind of being creative, almost like a brainstorming session. And you can see the objects popping up and interact with them between each other. So it's a pretty neat lens.

[00:13:53.641] Dustin Kochensparger: I do really like the piano lens as well. I don't play the piano. Maybe I would like to learn at some point, but I think that lens does a great job of showing kind of the opposite of what we do, of very functional, usability-focused things, which I think are compelling and necessary for these kind of platforms to be successful as well. And so as much as we're game developers and want to just see cool games all the time, I really appreciate seeing creative and new ways to bring people into AR from wherever they're coming from. Because obviously, we're not all gamers. Not everybody wants to play games all the time on their devices.

[00:14:23.157] Kent Bye: And is your experience single player and/or multiplayer?

[00:14:26.900] Blake Gross: Tiny Motors is currently a single-player-only game. We're looking into, for our next project, having it be a multiplayer game.

[00:14:33.305] Kent Bye: OK. And so you have a number of different types of vehicles, like a jet, a helicopter, a tank. Maybe you could talk about some of the different types of vehicles and, I guess, the process of trying to find the fun. So what were the things that were fun about each of these different vehicles and the different tasks that you want to do with them?

[00:14:52.361] Dustin Kochensparger: Sure. Yeah, I mean, so we started off with the vehicles themselves in prototyping to figure out, like, what feels good to play with in this way and kind of worked from there. So, like, I really like the jet personally, because that one sort of started as, okay, we're going to have a vehicle that moves all the time, and that's going to be the challenge the player needs to overcome: managing a vehicle that is consistently moving, and you have to kind of wrangle that. And so the gameplay needs to be somewhat simple around that, because the player is kind of fighting against, okay, the jet is moving, I need to turn it and move it and control it. But I love the freedom that it offers, to have it flip around and do tricks and move around to catch those drones. And the other mode for it is a tractor beam, so kind of an iteration on the helicopter mode. So flying around, but more close to the ground, picking things up and dropping them. And it's more of a timing challenge as you zip over the deposit zone, making sure you drop them at the right time. So we really took an iterative approach with all of the modes and said, OK, what's interesting about these vehicles? What makes them fun? And how do we make them different from each other? Because we also didn't want them to be carbon copies of each other. So like I said, the chopper mode and the jet mode have a similar mechanic where you have an ability to pick things up and drop them off. That's the core loop mechanic. But then, OK, let's make them different in how they do it. The chopper is going to move in place, so that's more of a precision picking-up and optimization challenge. The jet is more of that timing challenge of getting a bunch of stuff and then dropping it all at once. And I think that worked really well for kind of iterating on the modes.

[00:16:12.022] Blake Gross: Yeah. We actually started with, when we were prototyping, we came up with 16 modes, and we cut it down to eight, because we were really trying to go blue sky, find the fun, and then kind of narrow it down to what worked the best. And so you covered helicopter and jet. For the car, we started off with racing in procedurally generated space. We wanted everything to be really simple. We didn't want the user to have to do a bunch of setup. Our game, you can kind of get into within 10 seconds, even with room scanning. So there's not a lot of manual setup. And so we didn't want players to have to create racetracks or go over the room. So we ended up finding that for the car, catching drones was a lot of fun, kind of zipping around. And the whole point of that game is you're trying to get combos. You have a boost meter, and your boost stays on while you're catching the drones, but if you hit a bomb, you lose the boost. And then the other mode we have is vacuum. So we have a few different types of cleanup modes, like you're putting crates away or cleaning up gunk. And this one is you're vacuuming up balls and putting them in kind of a vacuum receiver type thing. And then for the tank, the fun was all about shooting. So we have different turret games. One is kind of like a whack-a-bot type thing. The other one is like you have a water cannon and you're putting out fires everywhere. So it was kind of figuring out, okay, what are the unique traits of each of these vehicles? And the whole game was built around these tools that the different vehicles have, and the tools correspond directly with the different game modes.
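As an illustration of the car mode's rules Blake describes (the combo builds while you catch drones, boost stays on, and hitting a bomb resets everything), here is a tiny, hypothetical sketch of that bookkeeping. The class name, methods, and multiplier values are invented for clarity and are not taken from the actual lens.

```typescript
// Illustrative sketch of the car mode's combo/boost rules as described above.
// All names and tuning numbers here are hypothetical.

class BoostMeter {
  private combo = 0;
  private boostActive = false;

  catchDrone(): void {
    this.combo += 1;
    this.boostActive = true; // boost stays on while the combo is alive
  }

  hitBomb(): void {
    this.combo = 0;
    this.boostActive = false; // hitting a bomb loses the boost
  }

  // Score multiplier grows with the combo, capped so it doesn't run away.
  scoreMultiplier(): number {
    return Math.min(1 + this.combo * 0.25, 4);
  }

  speedMultiplier(): number {
    return this.boostActive ? 1.5 : 1.0;
  }
}

// Usage: catch three drones, then clip a bomb.
const meter = new BoostMeter();
meter.catchDrone();
meter.catchDrone();
meter.catchDrone();
console.log(meter.scoreMultiplier(), meter.speedMultiplier()); // 1.75 1.5
meter.hitBomb();
console.log(meter.scoreMultiplier(), meter.speedMultiplier()); // 1 1
```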

[00:17:28.302] Kent Bye: Yeah, it's interesting to see the controller come back into AR gaming. I know that when the Rift first launched, it launched with just the Xbox controller, because all of the games had been developed with those controllers, and there wasn't enough to turn it into a fully spatial, embodied game with 6DOF controllers. So I feel like there's this trade-off between having precise agency, with input that's always being taken in and that you can be assured is going to be highly reactive to that agency, but that is abstracted, so you don't have as much of the embodiment that you might have if you're actually using your body or using your hands. And so I feel like there's this trade-off between the embodiment that you get through AR, which is kind of a natural affordance of it, versus having that high-agency abstraction where you have more precise controls. So it's certainly a trade-off that you're leaning into in order to develop some of these games, but I'd love to hear any reflections on the controller being, I guess, antithetical to where the zeitgeist of most AR gaming is going, with this natural, intuitive embodiment.

[00:18:27.645] Blake Gross: Yeah, I think that... I mean, the nice thing about a controller for an RC game is that it's a very, like, direct mapping. Like, if you're using RC vehicles, you're going to be having a controller, so it's not controlling an avatar like maybe in other games or like you saw with a lot of kind of early Rift stuff. I do think that you're right, that, like, there is a lot of energy around hand tracking. I mean, I think that there's still kind of this tough nut to crack where there aren't a lot of core experiences using hand tracking still. I've seen with a lot of the Vision Pro applications that they're a little bit more casual and they're more hand tracking focused. So we're very interested in trying to keep that core. Like, Quest 3 is still very controller focused. Like, the Quest 3 game that we're working on, we're not really even thinking about hand tracking for that. So I guess in our minds, we're not jumping on the hand tracking, hand controls hype train. We're kind of, I guess, letting the industry figure it out a bit more first and then going from there. But I'm pretty convinced about the power of controllers for having the precise ability to, like, interact with things in this type of gameplay. They also just drive that affordance of being, like, held and touched and used, right? Like, I think we talked before we started this interview about...

[00:19:34.787] Dustin Kochensparger: Something that is challenging with hand tracking is having to teach the user how the hand tracking works and what its limitations are and what its functionality is for any experience, especially if the platform isn't strongly enforcing, like, you must do taps in this way or drags in this way or whatever. Then it's on your app to also train people on that. And in my experience, it's very easy for new users to get a bit lost if they forget what those things are, or they can't quite remember exactly what the motion or the gesture is, and you have to do a lot of work to get them back out of there. Whereas with the controller, you can only press a button. There are only so many buttons; you can only press two buttons if there are two buttons. So you don't have to worry about, oh, I could tap anywhere and something might happen. So I like a lot that by having the controller, you know, once people learn what the controller does and they're into it, it kind of keeps them safe in a way, like they're on rails inside of what it can do. And that lets them get to the fun a little bit more quickly for our more complicated style of experiences, where we want you to be able to feel like you're moving these RC vehicles around.

[00:20:33.052] Kent Bye: Yeah, well, as we start to reflect on the Snapchat Spectacles and what they mean for the larger industry: with the Apple Vision Pro, it feels like, even though it's not called a developer kit, it's essentially a developer kit for mixed reality. And the Snapchat Spectacles, similarly, feel very much like a developer kit to me, in the sense that it's for developers who pay $99 a month for however long they're going to be paying that. But you essentially have this developer kit for an outdoors-first mixed reality, augmented reality, see-through headset, and there aren't a lot of other companies out there that are really pushing on that specific use case. So I'd love to hear some of your reflections on why you feel like this is a compelling platform, why you want to continue to push it forward, and what gets you excited about it.

[00:21:17.315] Dustin Kochensparger: Sure, yeah. I mean, one of my favorite features, and they showed it on stage as well, is the electrochromic dimming. The fact that you can dim down the display with the touch of a button, essentially, is very cool. I think it feels super sci-fi, but it also has this extremely useful functionality of enabling more of that outdoor use case, which is something that we are really excited about doing more of, as platforms are able to do more of that. AR is really fun, but it's even more fun when you can do it literally wherever you are. And outdoors is a big place that a lot of people are regularly. So that's something that we're very excited about, seeing more of that kind of utility coming from devices like the Spectacles to enable us to bring things outdoors more and get people to be able to do things that maybe didn't make a lot of sense when you're indoors, inside of your living room, and mom's china is set up nicely on a cabinet nearby. You know, there's only so much movement you want to push players to do when they're indoors, because you don't know what kind of space they're going to be in, you know, what the constraints are going to be. Outdoors, they have a little bit more freedom, so we're really excited about that. I think the other big piece is just pushing forward that untethered, fully free-to-use pair of glasses feeling, right? Like, it's the sci-fi fantasy that I think a lot of us looked to and said, oh, this is why I want to do this, I want to see this happen. And so having more devices that are pushing us in that direction, to where, you know, we can work around the trade-offs that are there to get closer to that vision of a seamless AR experience like Evan was talking about, that really does get us excited. You know, I don't like wearing my Quest outside walking around. It's very fun for the experiences I do with it, but it's not something that I relish, you know, taking outside to wander around. And hopefully we'll see more people playing in that space too, but I think that's what most excites me about the Spectacles right now.

[00:22:49.522] Blake Gross: Yeah, I think there's this continuous debate in the AR space about see-through displays like this and pass-through headsets. And there's these trade-offs that we all talk about all the time, like, oh, do you want to see through cameras or not? I think there is a lot of value to being able to see the real world while you're interacting with content, and that obviously comes with these FOV trade-offs. I think the real core of the Snap offering is the untethered part. It's the real AR, like you're seeing the real world still. It's the outdoor, and it's the multi-user, especially the seamless multi-user stuff. And that's a lot of the direction that we're looking as we think about future product development. And I think it is kind of a unique selling point. You don't see a lot of other companies trying to tackle outdoor use cases right now. And I think that having glasses that you can put on, go to the park, and play some experiences with, that's kind of where our minds are at. And I think that it's a very unique opportunity. And it's also where AR has had some history, like on mobile, like Niantic had a lot of success with the outdoor use case. So I think there's something there.

[00:23:46.920] Kent Bye: Awesome. And finally, what do you each think is the ultimate potential for these spatial computing devices and what they might be able to enable?

[00:23:56.224] Dustin Kochensparger: I mean, we're really excited about figuring out what gaming looks like in this space. And I feel like that's a cop-out answer, but it really is. For us, that's what drives us most of the time: no one's made games for spatial computing before in these ways that we're starting to be able to. What is an outdoor AR game, right? Like, there are people that are starting to try to answer that question, but there's so much blue ocean space to try and actually investigate that. And all of us together can build different takes on what does that look like and what does that mean. And I think that is really exciting coming from the AAA world where I came from, where a lot of the same stuff is what everybody's kind of working on, because that's what's selling. We don't know what's going to sell in this space, and so we can kind of have a little more freedom to just figure that out together and make cool stuff that maybe we would never have thought of before. Like, I would never have said in this interview right now, ah yes, the game is going to be this. To me, the journey is what is driving a lot of the excitement for me. Like, I don't know what the future holds, but I'm very excited to bring games to it. We know games are probably the biggest content category on most platforms that exist, and people love to play with whatever the thing is that they have. It's just a very natural thing to want to do. And so I think enabling that play in new ways, powered by whatever this technology can bring us, is what gets me excited in the morning.

[00:25:11.812] Blake Gross: For me, I think that traditionally gaming as a category was often competitive and defined in terms of graphical enhancements and having stuff look more realistic. I think for AR, it's all going to be about how immersive the content is with the actual world, and the improvements we're going to see are in how immersive the content is with the world. So I'm really excited about making games that are more aware of cities, parks, more spatially aware. I think that's kind of where this is all going. And, like, when you're in a living room, there's this demo that Michael Abrash showed off a long time ago where he has these glasses on and it perfectly maps the living room and does perfect semantic mapping and all that stuff, and the game is highly contextually aware of all of that. That's the direction that I'm really excited about seeing things head: where these games, they call it, like, AI, but where the games are really intelligent about the environments that they're existing in, and it's not just kind of the heuristic nature that we're all kind of doing right now.

[00:26:06.946] Kent Bye: Yeah, here playing your demo, there were some objects that were in the far distance, and the different vehicles were just kind of clipping through them. But do you imagine that's something that would improve in the future, being able to make a world mesh in a way that is able to be a little bit more reactive?

[00:26:23.243] Blake Gross: Yes, right now it's a really complicated problem, especially for gaming. Like, mesh refresh is a whole UX category of problem, I feel like. And right now, the solution we all kind of have is you build up a static mesh and then maybe do real-time occlusion so that things look a little better. And that's what we're doing here. That's kind of where the Quest platform is as well: you kind of build up the static mesh, and then you do all your reasoning based on that. I definitely think that the direction everyone's looking is, like, OK, how do we make this all work dynamically? Like, how are all objects in the world affected by dynamic real-world objects? And that's definitely kind of the future direction everything's looking towards.
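The "static mesh plus real-time occlusion" pattern Blake mentions can be sketched roughly as follows. This is not Lens Studio or Quest code; the WorldMesh interface and its raycast method are stand-ins for whatever scanned-mesh query a real engine provides. The sketch only shows the core occlusion test: hide a virtual object when the scanned real-world mesh sits closer to the camera along the same view ray.

```typescript
// Minimal sketch of static-mesh occlusion, assuming a scanned world mesh
// built once at room-scan time. All names here are hypothetical stand-ins.

interface Vec3 { x: number; y: number; z: number; }

interface WorldMesh {
  // Distance from `origin` along `dir` to the first real-world surface,
  // or Infinity if the ray hits nothing in the scanned mesh.
  raycastDistance(origin: Vec3, dir: Vec3): number;
}

function distance(a: Vec3, b: Vec3): number {
  return Math.hypot(b.x - a.x, b.y - a.y, b.z - a.z);
}

function direction(from: Vec3, to: Vec3): Vec3 {
  const d = distance(from, to);
  return { x: (to.x - from.x) / d, y: (to.y - from.y) / d, z: (to.z - from.z) / d };
}

// A virtual object is occluded if the scanned mesh is hit closer to the
// camera than the object is, along the same view ray.
function isOccluded(camera: Vec3, objectPos: Vec3, mesh: WorldMesh): boolean {
  const dir = direction(camera, objectPos);
  const meshHit = mesh.raycastDistance(camera, dir);
  return meshHit < distance(camera, objectPos);
}
```

Because the mesh is captured once and not updated per frame, anything that moves in the real world after the scan (a person walking through, a chair being pushed) won't occlude or collide correctly, which is exactly the dynamic-mesh limitation discussed above.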

[00:26:57.777] Kent Bye: Great. Any final thoughts? Anything else that's left unsaid that you'd like to say to the broader immersive community?

[00:27:05.305] Dustin Kochensparger: We're just really excited to continue to be a part of this. This is what we got into this space to do: to make cool stuff and get to show people a little bit of where gaming can be brought to these categories and these devices. And so we're just really excited, and we hope people get to check out the device and see what the Spectacles are all about. I think it is a really cool device once you get to look at it. And it's very hard to demo AR content and show it off not through the device. I was sharing the video and launch trailer for this game that you just saw with people, and they were having a hard time grokking it, because it's weird when you're doing a capture of a thing on AR glasses. My biggest encouragement to the people in the community is, if you have the opportunity to try this stuff out, check it out, because seeing it happening in front of you is a little bit more magic than looking at screenshots of it. It just doesn't do it justice.

[00:27:52.878] Blake Gross: I mean, I guess my mentality is always like, oh, I'm too late to this opportunity, and I've never been too late to any opportunity or any market. When I left my job at Microsoft in late 2019 to get into this stuff, I was like, oh no, I'm going to miss out on the entire MR industry. And it was like, no, it's very nascent still. It's very early. I think everyone's been saying that for 10 years now. And so if you have an idea, build it. It's not too late. It's never too late. I always feel like our ideas are too late, and they aren't. And then also, it's still just so cool to see devices like this. I remember using the HoloLens for the first time, and the HoloLens is a lot bigger than this device and kind of less feature rich in a lot of ways. So it's awesome to be part of the space and see it evolve.

[00:28:30.402] Kent Bye: Yeah, as we're talking here, we're standing outside at this festival. It feels like the context of this conference is all around mimicking where they want to take the future of computing, with these outdoor spatial computing devices. But yeah, it's just really interesting also to hear about Snap and the way that they've supported developers over time. I feel like the way that they've been investing in and supporting an ecosystem, and really cultivating an ecosystem, is distinctly different than any other company. And it's really encouraging to meet a lot of independent developers like yourselves who have been beneficiaries of some of that funding to help innovate and push the platform forward. And I feel like that's something that I don't see as much in the other sectors of the immersive industry. And so I feel like the way that Snap has been cultivating their developer ecosystem has really been helping to develop the platform and to also just grow the ecosystem. So yeah, anyway, it was just great to be able to see a little bit of what you were working on and hear a little bit more about your story and get a little bit of a sneak peek at where this all might be going. So thanks again for joining me here on the podcast to help break it all down.

[00:29:27.121] Dustin Kochensparger: Awesome. Yeah, thank you so much. Really appreciate it. Yeah, thank you so much.

[00:29:30.779] Kent Bye: Thanks again for listening to this episode of the Voices of VR podcast. That's a part of my larger series of doing a deep dive into both the announcements around Snap Spectacles, as well as the AR ecosystem at Snap. What I do here at the Voices of VR podcast is fairly unique. I really like to lean into oral history, so to capture the stories of people who are on the front lines, but also to have my own experiences and to try to give a holistic picture of what's happening, not only with the company, but also the ecosystem of developers that they've been able to cultivate. And so for me, I find the most valuable information comes from the independent artists and creators and developers who are at the front lines of pushing the edges of what this technology can do, and listening to what their dreams and aspirations are for where this technology is going to go in the future. So I feel like that's a little bit different approach than what anybody else is doing. But it also takes a lot of time and energy to go to these places and to do these interviews and put it together in this type of production. So if you find value in that, then please do consider becoming a member of the Patreon. Just $5 a month will go a long way toward helping me sustain this type of coverage. And if you could give more, $10 or $20 or $50 a month, that has also been a huge help for allowing me to continue to bring this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.
