#1454: Niantic Launches Virtual Pet AR Game “Peridot Beyond” on Snap’s Spectacles

I interviewed Niantic’s Erin Schaefer and Asim Ahmed at the Snap Partner Summit about the Snap Spectacles as well as their game Peridot Beyond. See more context in the rough transcript below.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So continuing my series of looking at different announcements around the Snap Spectacles, today's episode is with Niantic, who were announcing a couple of new things at the Snap Partner Summit. First of all, there's Niantic's Peridot Beyond, which is their app to experiment with head-worn augmented reality devices. They've already had this as an experience. It's like a virtual pet that you can have these different interactions with, and they're kind of sandboxing it out from their other properties like Pokemon Go and Ingress. They've been a real pioneer when it comes to, quote unquote, real-world gaming, with the worldwide team-based geo-mobile territory-control game of Ingress and the free-to-play, open-world, location-based augmented reality game of Pokemon Go. These are both games that really encourage people to be out in the world, engaging with different landmarks and locations, and have this mashup between what's happening in physical reality and what's happening in these digital realms. So Niantic's really been a pioneer in this form of gaming and in encouraging people to get outside. It's no surprise that they want to be involved with Snap and have some of the first experiments on the Snap Spectacles, which are really tuned to be this outdoor augmented reality device. I mean, it can also do indoors as well, but they're really interested in pushing the edge of what's possible with getting you out into the world. So this is kind of like their sandbox game that they're developing. They're also announcing, with Scaniverse, the ability to do volumetric capture with Gaussian splats and then bring those captures into Lens Studio to be used in different lenses. And so I had a chance to talk to a couple of folks from Niantic, Erin Schaefer and Asim Ahmed, about each of these announcements. So that's what we're covering on today's episode of the Voices of VR podcast. This interview with Erin and Asim happened on Tuesday, September 17th, 2024. So with that, let's go ahead and dive right in.

[00:02:08.311] Erin Schaefer: Hi, I'm Erin Schaefer. I look after our US studio and publishing at Niantic, along with go-to-market for our platform. In terms of how I'm involved in spatial computing, that means I get to help work on our games that are leveraging AR out in the real world, and I also get to make all of our tools extensible to other developers or businesses that want to build augmented reality solutions.

[00:02:31.055] Asim Ahmed: Awesome. Hi, my name is Asim Ahmed. I work on our product marketing team at Niantic. I've been with Niantic for about eight years and currently work on Peridot. And today we announced Peridot Beyond, so I'm excited to share more about that.

[00:02:42.280] Kent Bye: Maybe you could each give a bit more context as to your background, the different disciplines you're bringing into your practice, and your journey into the space.

[00:02:49.306] Erin Schaefer: Sure. I've been at Niantic for about four years. Before Niantic, I spent over a decade at Google and YouTube, mostly in the entertainment space. I had kids that loved watching video, but one of the things that made me sad about their viewing experiences was that they were very much seated on the couch, staring at screens, not necessarily out in the real world interacting with other kids or with me and my husband. I got introduced to John Hanke, our CEO at Niantic, and I knew a little bit about Niantic because it incubated at Google, and I got very excited about the experience I might be able to bring from a different side of the entertainment world to this company that was pioneering entertaining and educational experiences out in the real world.

[00:03:34.195] Asim Ahmed: Yeah, and I joined Niantic, as I mentioned, eight years ago. I actually joined directly out of college and was fortunate to find this awesome opportunity. I'd like to consider myself a lifelong gamer, so finding an opportunity where I could really build on that passion, but also venture into this new era of technology, was amazing. I've been really excited about augmented reality specifically, and I really love our mission, which is that we just want to get people a little bit healthier outdoors, discovering the beauty of the world. And so, yeah, that's my passion in the space, and that's kind of how I got into it.

[00:04:03.886] Kent Bye: Great. So we're here at the Snap Partner Summit, and there was a big keynote this morning where the Snap Spectacles fifth generation was just announced, with some new integrations from Niantic. Maybe you could give a little bit of a recap of what was announced on behalf of Niantic today.

[00:04:17.278] Erin Schaefer: Absolutely, I'll talk about one and I'll let Asim talk about the other. So we announced two things today. The first is that we're bringing our technology to scan objects or locations with an app called Scaniverse. And you get to build those scans as Gaussian splats, which are incredibly rich 3D captures. We are allowing developers to bring those to Spectacles and to Lens Studio. So that was one of our big announcements today: to bring the real world, and anything out in the real world, into your development experience. And I'll let Asim talk about the awesome game and delightful experience we also announced today.

[00:04:49.537] Asim Ahmed: Yeah, so today we announced Peridot Beyond, which is an extension of our Peridot franchise. Peridot started as a mobile game, which we launched back in 2023. It's this adorable virtual creature that enjoys time outdoors in the real world with you. You can go on daily walks together, get some really nice, fun times outdoors, and be accompanied by this companion. And so we're extending that into this new era of spatial computing. We partnered with Snap to bring an extension of that onto their Spectacles, and we're really excited. In the Peridot Beyond experience, you can really immerse yourself in the real world with your Dot in this very hands-free way. So you can walk up to your Dot and actually give it pets. You can grab a Frisbee with your hand and toss it and play catch back and forth. And there are a variety of other things you can do, and it's an experience we're going to continue to build on over time.

[00:05:37.802] Kent Bye: So when did it originally launch, then? When was the initial launch of Peridot?

[00:05:42.704] Asim Ahmed: Yeah, so we launched Peridot in May of 2023, but it's been an experience we've been developing for a while. I joined the project back in 2020. And really, the idea with Peridot was to build for the future, for this day where we actually have glasses that you can wear outdoors and really be immersed in the experience. But we wanted to push the boundaries of the mobile technology. And so we launched Peridot on mobile in 2023, really leveraging the power of our Lightship platform. The creatures feel realistically present in the world with you. As you're walking down the street, they'll naturally navigate; that's leveraging real-time mapping. Semantic segmentation is this idea where you can understand the real world to a greater degree. So if I'm near water, the game can actually tell that that's water, and you can have a really unique experience from that. So that's the mobile game which we launched. We've been experimenting for a long time. We've actually leveraged generative AI to make these creatures feel more sophisticated, so they understand the world better. So if you show your Dot a flower in the real world, it might go up and smell it and admire it and sit back. Or you can ask your Dot, hey, do you want to go on an adventure? Do you want to go outside? It might start spinning really quickly, letting you know that it wants to go outdoors, so you grab your shoes and head on an adventure.
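
To make the semantic segmentation idea above concrete, here is a minimal sketch of how gameplay code might sample a per-pixel label mask to decide that a Dot is near water. This is an illustration only; it is not Niantic's Lightship API, and every name in it is hypothetical.

```js
// Hypothetical illustration of a semantic segmentation lookup; not the
// actual Lightship ARDK API. A segmentation model labels every pixel of
// the camera frame, and gameplay code samples those labels.
const LABELS = ["ground", "sky", "water", "foliage", "building"];

// A toy 4x4 label mask (row-major indices into LABELS), standing in for
// one frame of model output.
const mask = {
  width: 4,
  height: 4,
  data: [
    1, 1, 1, 1,
    4, 4, 0, 0,
    0, 0, 2, 2,
    0, 0, 2, 2,
  ],
};

// Return the label under a normalized screen point (u, v in 0..1).
function labelAt(m, u, v) {
  const x = Math.min(m.width - 1, Math.floor(u * m.width));
  const y = Math.min(m.height - 1, Math.floor(v * m.height));
  return LABELS[m.data[y * m.width + x]];
}

// If the ground under the Dot reads as water, pick a water behavior.
if (labelAt(mask, 0.9, 0.9) === "water") {
  console.log("Dot splashes around");
}
```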

[00:06:51.539] Kent Bye: So if you were to describe the core gameplay loop of Peridot, then what are the types of actions and reactions that are happening within the context of the game?

[00:07:00.661] Asim Ahmed: Yeah, so Peridot is a mobile game. We like to think of it as kind of Tamagotchi reimagined for the real world. If you're familiar with Tamagotchis, maybe from growing up, it's this idea that you can keep this creature by your side and enjoy its company. There are a variety of different ways that you can engage with it. Like I mentioned, you can play catch with it. You can just admire it and have it in your world with you. But in the mobile game, we also have a variety of other game loops, where you can hatch new Dot generations over time. For the Peridot Beyond experience, we're really leaning in on this very simplified kind of pet-simulation experience where you can really feel immersed in the world with your creature. I think over time there are opportunities to bring in things like multiplayer, where multiple people can see and interact with the same Dot, and a variety of other ways where I think we can leverage generative AI to make your creature really understand the world.

[00:07:51.367] Kent Bye: OK. And to go back to the Gaussian splats, I know that was something that was announced back at SIGGRAPH last year. And it seems like it's taken the world by storm as a new way of capturing and rendering spatial objects: a point cloud that gets splatted out into something with a completely new rendering pipeline. So maybe you could describe a little bit about what you're enabling with the Gaussian splat integration, and what that is going to allow end users to do that they couldn't do before.

[00:08:19.426] Erin Schaefer: So we've had a technology, we acquired a company a couple years ago called Scaniverse, that would allow you to scan any 3D object or place and create a terrific 3D mesh of it. What's now possible is that instead of just a 3D mesh, we actually will create a Gaussian splat of whatever you have scanned. You can do that today in Scaniverse, and you can take it with you anywhere; you can actually share it on a public map as well if you'd like. But what we've done today is actually enabled developers on the Snap platform to leverage it and build it into Lens experiences that they're building, either for Spectacles or for mobile. We think that will allow you to bring the outdoors in, bring the real world in, in a much richer, more realistic, incredibly robust way. And again, this is a free app experience that we have right now, and we're allowing developers who are using Lens Studio to use Scaniverse at no cost to build those Gaussian splats for their experiences.

[00:09:18.308] Kent Bye: OK. And is there anything on the back end that you had to do? Like, is it being converted back into a mesh, or is it actually being rendered in real time as a Gaussian splat? Because I know there are new implications for the types of rendering pipelines that are being developed around this completely new paradigm.

[00:09:35.236] Erin Schaefer: Yes, great question. It is being rendered within seconds as a Gaussian splat. And again, as a user of Scaniverse, which is an app that you can get on iOS or Android, you can choose to either share that splat with others or keep it private to yourself. And developers who can leverage that via Spectacles and via Lens Studio can choose to include that splat in an experience. But yes, the breakthrough for us is that we've been able to render that splat incredibly quickly. So if you're taking a scan, it turns into a splat within seconds.
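
As background on what "rendered as a Gaussian splat" means: in the 3D Gaussian splatting formulation presented at SIGGRAPH 2023 (Kerbl et al.), a scene is a set of anisotropic 3D Gaussians rather than a triangle mesh. This is a sketch of that standard formulation; the interview doesn't confirm how closely Scaniverse's pipeline follows it. Each Gaussian i has a mean mu_i, a covariance factored into a rotation and per-axis scales, an opacity alpha_i, and a view-dependent color c_i:

```latex
% Each primitive is an anisotropic 3D Gaussian with a covariance
% factored into a rotation R_i and axis scales S_i:
G_i(x) = \exp\!\left(-\tfrac{1}{2}\,(x-\mu_i)^{\top}\Sigma_i^{-1}(x-\mu_i)\right),
\qquad
\Sigma_i = R_i S_i S_i^{\top} R_i^{\top}.
% After projecting the Gaussians to screen space and sorting them by
% depth, a pixel's color is front-to-back alpha blending over the splats:
C = \sum_{i=1}^{N} c_i\,\alpha_i \prod_{j=1}^{i-1}\bigl(1-\alpha_j\bigr).
```

Because the primitives are rasterized and alpha-blended rather than ray-marched, this is the kind of pipeline that makes near-instant display of a fresh scan plausible, consistent with the "within seconds" claim above.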

[00:10:07.417] Kent Bye: And what do you imagine that this is going to enable in terms of having the ability for people to take these existing physical objects and then bring them into a virtual instance and then start to play around or remix or modify them? What do you see as this opening up?

[00:10:20.711] Erin Schaefer: Well, we feel like there are all sorts of different ways you could leverage these things. What it really does is allow you to bring the real world in, or bring the real world to other people. So you can imagine there could be an experience where you and I are in two different places, but we can share an incredibly realistic view of where we are with one another in a gameplay experience. Instead of an AI-generated view of the Croatian coast, you can now have a real view of the Croatian coast built from an actual splat. So it's the magic of a real place, today, not tomorrow. You can also look at the same place over time and bring that realistic view into any experience you want. So you could do anything from tourism to gameplay to an educational experience. But we think making the real world, both objects and places, much more tangible with these splats will unlock all sorts of creativity.

[00:11:15.086] Kent Bye: OK. And so from the experiential perspective of looking at something like the Snap Spectacles fifth generation that was just announced, it's kind of a developer kit in a lot of ways. But I did a demo, and I've done a lot of XR over the years, and I'm trying to think about the differences between, say, looking at a phone, where you're looking at a 2D screen and a portal, versus being fully immersed and fully embodied, lessening the distance between the virtual and the physical in a way that blends them a little more seamlessly. So I'm curious to hear some of your reflections comparing the 2D mobile version with the new experiential affordances you're seeing now that you're starting to experiment with something like the Snap Spectacles.

[00:11:57.315] Asim Ahmed: I think the exciting part of it is that these are glasses that can be worn on your face, and you're kind of hands-free. And so the really amazing part of it is that you can be fully immersed in the experience. And when you build for an experience where your hands are part of it, you're doing that in a very different way than you might imagine on your phone. On your phone, you're just getting this little glimpse, this little window, into this beautiful world around you. With the glasses, you can actually experience what that looks like. And so we've been thinking about how we design for it, and the Peridot team has really been an experimental team. We're really trying to push the boundaries. We're thinking of all the ways that we can make that moment even more delightful. So instead of 2D panels where I might be selecting between character A or character B, are there different ways that I can make that more immersive? We've actually built an experience for mixed reality, for example, where you can pick up your Dot and dunk it into a paint bucket, and that's how you would change its appearance, versus just swiping back and forth between a few characters. So really, we're using this opportunity to learn this new UI, this new experience, for what hands-free, everyday outdoor wear can look like. And I think there are a lot of amazing opportunities to come. And really, with Peridot and a bunch of other Niantic experiences, we're looking at this early, kind of messy middle of where the technology is heading as an experimental opportunity to build those delightful experiences we want to see when these glasses become more ubiquitous and mass market.

[00:13:25.959] Erin Schaefer: It is amazing what just this first version of Spectacles, this first version of outdoor glasses, unlocks. You really do feel like you are petting this Peridot. You really do feel like you are feeding this Peridot, playing catch with this Peridot, in a way that you just can't capture on mobile. And the fact that it can only get better is a little mind-blowing. You literally feel that sort of haptic response, that emotional response, when you're petting these Dots through Spectacles. It's pretty magical. So we're so excited about what it will unlock going forward.

[00:13:57.375] Kent Bye: And in terms of Niantic, there's also Pokemon Go. Were there any discussions of using that as an IP to start off? Or why start with Peridot rather than something like Pokemon Go?

[00:14:08.496] Erin Schaefer: We have nothing to announce about Pokemon Go today, but the reason we very proactively wanted to start with Snap on Peridot is, as Asim was saying earlier, we really built Peridot to be an AR-first experience. It was really meant to be truly immersive in the real world. And in fact, if we had a hands-free, heads-up device ready sooner, we would have launched on glasses-like devices as opposed to launching on mobile. Unfortunately, the hardware market wasn't ready for us when our game was ready, so we wanted to get it out in the world on mobile first. But we always envisioned this as truly a head-worn experience, and so we felt like it was natural that this was the one that we should bring to Spectacles first.

[00:14:47.315] Kent Bye: When I've talked to different AI implementers and researchers like Michael Mateas and Andrew Stern, they talk about virtual pets as being this playground for innovations in artificial intelligence. I think a lot of folks may look at virtual pets as kind of a toy, or not all that interesting, but it also feels like their core mechanics are a testbed for innovation when it comes to different types of interaction design for how you relate to these virtual beings. So I'd love to hear any reflections on working with the medium of virtual pets and what you're able to do with them.

[00:15:19.645] Asim Ahmed: I love the idea of virtual pets because everyone knows what a pet is and how you would interact with a pet. And so as we're exploring this future of pushing the boundaries of the technology, it's really comfortable to have that familiar anchor of a virtual pet. One of the ways we're experimenting with it, when you think about generative AI, for example, is that we know pets can understand the world, but there's always some allowance for the fact that maybe they don't understand to the full degree. And so if your Dot makes a mistake in what it's thinking, that's pretty normal for what a pet might do. And so I think that's a great opportunity to explore this new medium with generative AI, with glasses, while leveraging the technology.

[00:16:01.761] Erin Schaefer: I think for us it's been less about leveraging generative AI just to generate the pet itself. I think what's much more exciting is what Asim talked about, which is how do you take the magical pet that's been generated and actually make it a much more interactive, much more realistic companion. That's where we think gen AI is most interesting in this use case. I think what I've heard a lot of people talk about is using gen AI to create the landscape, the world around a player, and I think we're very excited about what it can unlock in terms of the interaction between player and game, and not just the scape around you, though that is interesting too.

[00:16:39.096] Kent Bye: I know I've had some previous conversations with Niantic about the Lightship API, and you have your existing infrastructure for whatever game engines you're using. It was just announced today that a part of the Spectacles is Snap OS. So maybe you could elaborate a little as to whether this is still the same game engine architecture that you have, if it's Unity, or if there are other innovations where you can start to build more native integrations with what you're doing with these IPs.

[00:17:04.838] Erin Schaefer: So the Peridot Beyond experience was built in Lens Studio, Snap's own proprietary platform, and we were excited to do that in order to learn what's possible on this device. We have solutions on both Unity and on web, formerly known as Lightship on Unity and as 8th Wall on web, but just think of them as our Niantic spatial services, available whether you build in Unity or on the web. We're excited to offer those to developers because those are such common ways for people to build experiences and games, but we would like to enable people to use our tools on whatever platform they'd like to build on. We see today as just the early days of our partnership with Snap, and we're excited to work with them in a lot of different ways to bring a lot of our technology and our experiences to their hardware and software.

[00:17:52.056] Kent Bye: Does that mean that there's a fork now, where there's a Unity version for mobile but now a Lens Studio version? Or is everything being built on one code base on top of Lens Studio for both mobile and Spectacles?

[00:18:02.972] Erin Schaefer: So our mobile game is built on what we call our ARDK, our Unity-based set of tools, and that is not going to change for Spectacles. The only option was to build in Lens Studio, and so we did that. And we're open to seeing where Snap wants to take this, whether they want to open up to other platforms or not. But we have now learned a lot about building in Lens Studio, which we think will help us as we think about anything else that we want to bring to the platform going forward.
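
For readers unfamiliar with Lens Studio, lenses are scripted in JavaScript rather than C#, which is part of why it is a separate code base from a Unity project. Below is a minimal sketch of what a simple follow-the-player behavior could look like in that scripting model. The //@input directives and the UpdateEvent pattern are Lens Studio conventions; the pet and camera objects and the followSpeed parameter are hypothetical, not Niantic's actual Peridot Beyond code.

```js
// Minimal Lens Studio-style script: ease a pet object toward the camera
// every frame. The scene objects and followSpeed are hypothetical.
//@input SceneObject pet
//@input SceneObject camera
//@input float followSpeed = 1.0

script.createEvent("UpdateEvent").bind(function (eventData) {
  var petTransform = script.pet.getTransform();
  var petPos = petTransform.getWorldPosition();
  var camPos = script.camera.getTransform().getWorldPosition();

  // Move a frame-rate-independent fraction of the way to the camera.
  var t = Math.min(1.0, script.followSpeed * eventData.getDeltaTime());
  petTransform.setWorldPosition(vec3.lerp(petPos, camPos, t));
});
```

The same event-driven structure is how taps, hand tracking, and other interactions get wired up, which suggests why a Unity code base can't simply be recompiled for Spectacles and instead has to be rebuilt in Lens Studio.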

[00:18:27.830] Kent Bye: OK, and so we're just learning a lot of the initial specs for the Snap Spectacles, and one of the things that was mentioned was a battery life of 45 minutes. One of my first thoughts is: is that enough? Is that for very specific things that people are doing? If there are social experiences, maybe sometimes those go longer. So I'm just curious, from your own testing, if you feel like the battery life of 45 minutes is sufficient or more than sufficient, and what the experience is like if you're running out of battery, or if it feels fine for what you're doing.

[00:18:58.732] Asim Ahmed: I think 45 minutes is a great start, and as time goes on, we will see that battery life get better and better. The way that we designed Peridot Beyond specifically was really meant to be a session-based experience, where you can jump into it for a handful of minutes, engage with your Dot, go about your day, and then bring your Dot back into the real world later. But as I mentioned, over time, as the technology gets better, we will continue to design around what the devices have to offer. And so for Peridot Beyond, I think 45 minutes is a perfect fit.

[00:19:28.615] Kent Bye: And is the Apple Vision Pro something that you've looked at at all in terms of experimenting with that platform? Or is there anything that's been announced yet?

[00:19:35.837] Asim Ahmed: I think mixed reality is incredibly exciting. As I mentioned earlier, we have a mixed reality-based experience for Peridot. It's called Hello, Dot. It's a really amazing opportunity to get immersed and get your hands on your Dot in your play space indoors. As part of the Peridot franchise, we also have a weather app called SunnyTune, and Peridot makes a cameo in that. But the Apple Vision Pro is a very exciting device, and it's something that we're definitely looking into.

[00:20:00.892] Kent Bye: One of the things at the end of today's keynote, I think, was in the same branding as what John Hanke has talked a lot about: the real metaverse being out in the physical world. I have the Voices of VR podcast, so I'm covering both VR and AR, and I tend to see a little bit more of a spectrum, where there's a blending and blurring of these different realities depending on the context. So I feel like these outdoor, world-scale types of experiences are kind of a new genre that we're starting to push into with hardware that's specifically designed for that, especially considering that a lot of the Meta Quest experiences are more indoors, and even the Apple Vision Pro is not really optimized to be used outside. And so I'd love to hear any reflections on this idea of being immersed in physical reality, and what these specific sets of hardware are going to help open up or unlock in terms of creating new social dynamics.

[00:20:55.009] Erin Schaefer: Well, I think we're excited to experiment on lots of different types of hardware. As Asim said, we have an MR experience for Peridot, and we are looking at building other MR experiences as well. Our ultimate goal, our mission, as Asim shared, is to really get people out in the real world exploring together. So we are most excited about AR glasses, but we also think there's a lot to learn from MR experiences. First, we get to learn a lot about what it's like to have a head-worn device that's hands-free. Also, there might be times when you want to have an experience inside and other times when you want to have an experience outside. As we look forward in how we design experiences, we'd like to make sure that they work really well across multiple different screens, whether it's a mobile screen, an MR device screen, a smart glasses screen with notifications, et cetera, or true AR glasses. But we're certainly most excited about the AR glasses future, because we think that will unlock the most exploring in the real world.

[00:21:53.223] Asim Ahmed: Yeah, one thing I'd love to add is there's this really exciting idea of asymmetric gaming. As Erin was mentioning, the future is outdoors; that's what we're really excited about. You can imagine a world where you can take content with you. Let's take Peridot, for example. You can take this content with you outdoors on your mobile phone or through your glasses, but then you can bring it indoors, and maybe you can experience it in a headset. So I think that's a really exciting opportunity to explore. One part is thinking about how we build spatial game design, but the other is how you can bring this content with you no matter what platform you are on. I think there's a lot of opportunity to build experiences like that.

[00:22:33.396] Kent Bye: And when you look at the evolution of XR, you've seen that the early players and early movers were folks that didn't have their own mobile platform. So you have Meta and Microsoft making some early moves because Android and iOS already existed, and folks like Microsoft or Meta would always have to build their apps to be compatible with all the different devices that are out there. So I kind of see Niantic filling in that role where you're a little bit platform-agnostic, really being one of the first third-party developers on these new headsets that are out there. But I'd love to hear any reflections on this embracing of spatial computing and how that fits into the overall thrust of where you see this all going.

[00:23:13.171] Erin Schaefer: Well, as we said, we're really excited about all the different head-worn device types that are coming, and we'd like to build for all of them. We also want to make sure that as we enable developers to build experiences, not just the stuff that we at Niantic and our own developers build, we make it easy for them to do that however they feel most comfortable building, whether that's in Unity or web or somewhere in between. That's been our North Star: really enable experiences everywhere and enable developers everywhere. It's not easy, because some of our technology works better on one type of medium versus another. We're trying to get to a place where there's parity, where our visual positioning system works as well on Unity as it works on web, and your experience of a virtual Peridot is just as delightful on an MR device as it is on a Spectacles outdoor device. But that's our North Star: to really make sure that regardless of the device you're on, you're having a delightful experience that's optimized for that device type, and, again, that developers can build wherever they want and we can get their experiences onto all the screens they want to be on.

[00:24:20.851] Kent Bye: And is Peridot an app that you buy, or is it more of an R&D project that you're doing to prototype? Or what's the business model for where you see how revenue is going to start to come into these different types of experiences?

[00:24:32.185] Asim Ahmed: So Peridot is a completely free experience that is available on mobile, in mixed reality, and now on the Snap Spectacles. I really look at Peridot as this really awesome sandbox of experimentation, and from a Niantic standpoint, it's something that we're going to continue to invest in: this opportunity to keep experimenting as these new devices and new paradigms for how to leverage the technology come to life. So from my standpoint, it's more of a research and development project.

[00:25:02.170] Erin Schaefer: Our mobile game is free to play, as Asim said, and we do allow people to purchase things in-app, so there is revenue associated with this game, and you can imagine that different SKUs we might bring to head-worn devices might have paid options as well. But everything that's out in the world today is free to play, and then if users want to spend money, they can. We want as many people enjoying these virtual pets as possible, and that's the approach we've taken to date.

[00:25:29.431] Kent Bye: Yeah, I just came back from Venice Immersive, where a lot of the immersive storytelling folks are trying to find the business models that really work. There's Emissive, which has these immersive expeditions where hundreds of people per hour are coming through the experiences, 80 to 100,000 people over a number of months. So there seems to be a movement toward these types of immersive entertainment, but I feel like the business models are still evolving. And I know that Niantic has had things like Pokemon Go and Ingress, where there's kind of a mapping of the geography of different spaces, with sites of interest that people go to, trying to create these emergent social dynamics. So I'm just curious whether you see the same types of monetization strategies that have worked on mobile being ported over into these site-specific monetization strategies for Spectacles, or if you think it's so early that you're not really thinking about monetization yet, because it's kind of a developer platform at this point.

[00:26:25.051] Erin Schaefer: I imagine that monetization is going to look fairly similar to how it has historically, in that if you're talking about making a more immersive experience at, say, a theme park or a retail store, the reason the owner would want to do that is to drive further engagement of their customers and bring their customers back; the experience itself might not need to be paid. It's about actually bringing people back more often and monetizing them the way they've always monetized, through tickets or concessions or name your thing. And AR is really the mechanism for making that thing more engaging, new again, a reason to return. If you're talking about a game itself, we've seen this free-to-play model work really, really well. Some publishers monetize through ads; we primarily have in-app purchases. We actually have most of our users play free and not purchase, but enough do that it's been a great model for Niantic with our games. So I think we'll probably return to some of those well-known modes of monetization. And really, when you think about the AR experience, it's about that being a more engaging way to have that experience, one that drives more repeat usage, more time used, et cetera, as opposed to it having to have its own form of monetization in and of itself.

[00:27:42.028] Kent Bye: OK, great. And finally, I'd love to hear what each of you think is the ultimate potential of some of these spatial computing platforms and what they might be able to enable.

[00:27:52.924] Erin Schaefer: This may sound trite and too simplistic, but the way I always think about it is: could I be wearing glasses that really just let me layer interesting information and entertainment onto the real world as I live the life I live today? Could it be as seamless as that, where I get to invoke the experience or information when I want it, and if I don't want it, I don't have to? I really think we're not that far from that, and that to me is just yet another reason to go outside versus being indoors in front of screens that don't move. The fact that I can move around with my glasses and have this delightful experience, and have technology simply augment my world and make it a little better: I really think we're not far from that.

[00:28:37.611] Asim Ahmed: I have a very similar view to Erin. For me, I love this idea of technology that can just follow you as you're going about your normal day and bring in those moments of delight when you need them, but that isn't distracting, because we live in such a beautiful world and we should admire it. We shouldn't layer over this world with something that's always ever-present; it should be there in the right moments. So that's why I'm excited about Peridot, because it's just this little creature that follows you as you're going about your day. Maybe you just need to take a break from the world for a second and want to have that kind of heartwarming interaction. It's those right moments of delight when you need them the most, and then you can get back to your day and enjoy it. You can have this content follow you and be really immersed in your world at the moments you need it, versus all the time.

[00:29:23.680] Kent Bye: Any other final thoughts or anything else that's left unsaid that you'd like to say to the broader immersive community?

[00:29:29.431] Erin Schaefer: We're just really excited about all the progress that's being made on hardware and software. I think we all feel at Niantic that there have been so many breakthroughs in the last year, both with this Gaussian splat technology that really allows you to bring the outside in, and then of course with hardware, whether it's the Quest, smart glasses from Meta, or certainly today these Snap Spectacles. We're just so excited to see the world evolving quickly, and we feel like we're getting closer and closer to the moment where our vision truly gets to come to life.

[00:30:01.826] Asim Ahmed: I'm just gonna say I've had an amazing eight years with Niantic, and I've seen the potential of what this technology can offer. I think we're at this really amazing inflection point; today marks a significant milestone in where we see this technology heading. Right now, we're getting a glimpse of the future that I think we'll see in a few years. And as Erin mentioned, we're not so far away. So from my point of view, we're going to continue to experiment, we're going to continue to tinker, and we're going to find those really amazing, delightful moments of gameplay and other opportunities to bring joy into people's worlds.

[00:30:34.630] Kent Bye: And I know a number of developers are going to get their hands on the Spectacles as soon as tomorrow. So is Peridot going to be a launch title? Is it going to be available as soon as the device is available to developers?

[00:30:46.055] Erin Schaefer: Peridot is available today and tomorrow, and so is the ability to bring in scans and Gaussian splats. So we're super excited for the developers who are going to have their hands on the devices to play this great game, which we hope consumers, when they have their hands on these devices, get to play as well. And then we're also thrilled for developers to have access to this great Gaussian splat technology and be able to bring scans into the experiences that they're building.

[00:31:09.830] Kent Bye: Awesome. Well, it's super exciting to be on the bleeding edge of new platforms and to see Niantic continuing to push the edge of what's possible, integrating all these different technologies and building platforms and APIs like the Lightship API. And yeah, I'm very excited to get my hands on it, get a chance to play, and see where it all goes here in the future. So thanks again for joining me here on the podcast to help break it all down.

[00:31:30.070] Erin Schaefer: Thank you so much, Kent. It's been great chatting with you.

[00:31:32.753] Kent Bye: Thank you. This was fantastic. Thanks again for listening to this episode of the Voices of VR podcast, part of my larger series doing a deep dive into both the announcements around Snap Spectacles and the AR ecosystem at Snap. What I do here at the Voices of VR podcast is fairly unique. I really like to lean into oral history, to capture the stories of people who are on the front lines, but also to have my own experiences and to try to give a holistic picture of what's happening, not only with the company, but also with the ecosystem of developers that they've been able to cultivate. And so for me, the most valuable information comes from the independent artists and creators and developers who are at the front lines of pushing the edges of what this technology can do, and from listening to what their dreams and aspirations are for where this technology is going to go in the future. So I feel like that's a little bit of a different approach than what anybody else is doing, but it also takes a lot of time and energy to go to these places, do these interviews, and put it all together in this type of production. So if you find value in that, then please do consider becoming a member of the Patreon. Just $5 a month will go a long way toward helping me sustain this type of coverage. And if you can give more, $10 or $20 or $50 a month, that has also been a huge help in allowing me to continue to bring this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.
