#1525: Niantic’s “Into the Scaniverse” Maps Over 50k Gaussian Splats from Around the World on Quest and WebXR

Niantic launched their Into the Scaniverse application on Quest 3 on February 26th, 2025, featuring over 50,000 Gaussian Splats from 120 different countries. They originally launched the WebXR version on December 10th, 2024 at IntoTheScaniverse.com, which was built using Niantic Studio (be sure to check out their comprehensive history of Gaussian Splats by Kirsten M. Johnson released at the same time). Users can use the Scaniverse mobile app on Android or iOS to capture, render, geotag, and upload their own Gaussian Splats onto the Into the Scaniverse map, which can be viewed on either mobile phones or XR devices.

I had a chance to speak more about Into the Scaniverse with Joel Udwin, who is Niantic's Director of Product for AR Research, Developer Platforms, and Scaniverse. Gaussian Splats are only about a year and a half old, as the original "3D Gaussian Splatting for Real-Time Radiance Field Rendering" paper was presented at SIGGRAPH in August 2023, but they represent a new rendering pipeline for volumetrically captured content. Niantic's Into the Scaniverse apps are able to process and render these splats locally on the phone or Quest devices, and they have a lot of plans for how they will continue to utilize and develop this as a core part of their technology infrastructure for enabling new mixed reality applications.
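For readers new to the technique: a Gaussian Splat scene is essentially a large cloud of soft, anisotropic 3D ellipsoids, each carrying a position, shape, color, and opacity, which get depth-sorted and alpha-blended for every view. Below is a deliberately minimal Python sketch of that core idea; it is an illustration, not Niantic's renderer, and real implementations project each Gaussian to a screen-space ellipse, evaluate its falloff per pixel, and store view-dependent color as spherical harmonics.

```python
# Minimal sketch of the 3D Gaussian splat data model and the
# front-to-back alpha compositing at the heart of splat rendering.
from dataclasses import dataclass
import numpy as np

@dataclass
class Splat:
    position: np.ndarray  # (3,) world-space center of the Gaussian
    scale: np.ndarray     # (3,) per-axis extent of the ellipsoid
    rotation: np.ndarray  # (4,) quaternion orienting the ellipsoid
    color: np.ndarray     # (3,) RGB; full models use spherical harmonics
    opacity: float        # alpha in [0, 1]

def composite(splats: list[Splat], camera_pos: np.ndarray) -> np.ndarray:
    """Blend splat colors along one view ray, nearest splat first."""
    ordered = sorted(splats, key=lambda s: np.linalg.norm(s.position - camera_pos))
    color = np.zeros(3)
    transmittance = 1.0  # fraction of light still passing through
    for s in ordered:
        color += transmittance * s.opacity * s.color
        transmittance *= 1.0 - s.opacity
        if transmittance < 1e-4:  # stop once the ray is effectively opaque
            break
    return color
```

Because rendering is just sorting and blending, with no ray marching or neural network inference at draw time, a mobile GPU or a Quest 3 can plausibly draw millions of splats per frame, which is what makes the fully local rendering discussed in the interview below work.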

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. It's a podcast that looks at the structures and forms of immersive storytelling and the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So in today's episode, we're going to be doing a deep dive into Niantic's Into the Scaniverse application that just launched on February 26th on the Meta Quest store. They also have a WebXR version that they previously launched back on December 10th of 2024. And more generally around the application of Scaniverse, which they acquired on August 10th of 2021 from Toolbox AI. So what Scaniverse does is there's an app that you can have on your phone. You can basically do what would normally be like a photogrammetry type of scan, where you push the record button and you walk around an object, and it's able to take that information and translate it into like a point cloud. And those point clouds are then smoothed out through this Gaussian splatting process. There's like a whole layer of neural rendering that's in there too. And so basically you get a very high-resolution capture of an object very quickly. And this is all local to your phone. So the Scaniverse app is able to scan everything, process everything locally, and then you can choose whether or not you want to upload it. Into the Scaniverse is also actually rendering everything locally on your device as well. And so it doesn't have to do the cloud rendering that we saw from Hyperscape from Meta to be able to do this type of Gaussian splat. But those were much bigger scenes that were higher fidelity. This is more for like small objects, medium-range and large objects, and it does a pretty good job of capturing those objects. And Into the Scaniverse is a way to have a map-based approach to be able to scan, save, upload, and attach to a specific location. And so they have this map where you can zoom in and out and find all these different Gaussian splats from around the world. Over 50,000 splats are now available on this Into the Scaniverse application. So we're covering all that and more on today's episode of the Voices of VR podcast. So this interview with Joel happened on Wednesday, February 26th, 2025. So with that, let's go ahead and dive right in.

[00:02:14.335] Joel Udwin: My name's Joel. I'm Director of Product for Niantic's AR Research, Developer Platforms, and Scaniverse. And I'm really excited and passionate about how we can use XR to solve everyday problems. And there's lots of people in our industry who are really excited and amazing forward thinkers. And I'm a very, very practical tomorrow thinker, rather than like a year or even two years from now. So I just get really excited when we are able to bring XR into people's everyday lives in a way that's like meaningful and a value add today.

[00:02:53.101] Kent Bye: Okay, great. And maybe you could give a bit more context as to your background and your journey into this space.

[00:02:59.764] Joel Udwin: So I started in tech as a product manager at a company called Paytronix, which was making loyalty and customer service software for restaurants and convenience stores. Again, really aligned with that value of just like being really meaningful and practical to people every day. So if you've used, for example, the Panera loyalty program, that was powered by Paytronix under the hood. And that was early in my career. And so I'm sharing this because, you know, as a big fan of your podcast, I know there's other early-career people listening to your podcast. So don't be confused as an early-career person like I was. I thought, you know, I've been at Paytronix for five years. I was really worried. Oh, no, I'm going to be stuck in this domain for the rest of my life. I'll never be able to leave. In case anyone's early career listening, that's not true. You can always change, reinvent yourself, learn new things. So anyway, though, I left that company. I was like, what do I want to do? Where do I want to spend my time? What am I passionate about? And at that time, I had an Oculus Go. I have to keep all the names straight now. And I thought that was so cool. And it reminded me of when I was in high school driving my dad's midlife crisis car. Don't judge me, dad. Sorry for saying that on a podcast. It had that heads-up display that had the miles per hour and other stuff. And I was like, oh my gosh, AR is actually everywhere already. VR is everywhere already. It's like ready today. And I got really excited about finding ways to use that and make people aware that they're probably already using AR in their everyday lives. And this was during the pandemic. I went to the virtual VR/AR Association conference and I saw Tom Emrich there. And I was like, oh, that guy's so smart. I have to work for Tom. And I joined 8th Wall so I could work with Tom.

[00:04:50.809] Kent Bye: Oh, wow. Okay. So it goes back to 8th Wall. Okay.

[00:04:53.971] Joel Udwin: Which was then acquired, of course, by Niantic.

[00:04:56.960] Kent Bye: Right, right. Well, 8th Wall is really quite an interesting entry point, because I see this larger trend right now of this kind of battle that we've had between like the closed walled-garden model that we've seen in the XR industry for the last decade plus, and then now starting to really have some real movements into the open metaverse, open technology platforms, or at least technology that's getting outside of that normal app ecosystem mindset. You know, the acquisition of 8th Wall is another indication of not just having one app-based technology stack, but having the web and integrating into the web. And so, yeah, maybe you could just talk a little bit around like what drew you to like the open web aspect of what was happening in the XR space and why you wanted to go work specifically for what 8th Wall was doing.

[00:05:45.734] Joel Udwin: Yeah, especially like remembering the hardware limitations at the time, using sort of the walled-garden AR was very limited in terms of the devices that could access that. Obviously, over time, people upgrade their devices. Maybe that's less of a concern now, but it still is a real thing, especially in markets where maybe it's not economical to buy devices with LiDAR or sort of other things like that. So, you know, at the time, I was really excited about how 8th Wall and other, you know, open web-based technologies were making AR really democratized and accessible to a huge array of people. And interestingly, it's really a full-circle moment. Behind the scenes, under the hood, Niantic's Into the Scaniverse, which we're announcing today, was actually built on Niantic Studio. So that same engine and platform that can publish to the web published this app to the native Meta Horizon store as well.

[00:06:46.229] Kent Bye: Okay, yeah, I've noticed that Niantic's in this really interesting position where, for every new AR, XR device that's out there, you're out there with like the latest cutting-edge mixed reality device. I know at the Snap Summit where the Spectacles were announced, you were showing some latest iterations of this intersection between how to start to use mixed reality in these cutting-edge XR devices, in the context of mixed reality, I'd say, with AR and this blending and blurring of the principles of experiential design in the broader XR sphere. So maybe you could just take a step back and walk me through, how did you go from what was going on in 8th Wall and what you were working on there, and then eventually how you got into now working on what is now the Scaniverse?

[00:07:30.444] Joel Udwin: Yeah, absolutely. So with Tom Emrich and Megan Boo, and obviously Erik Murphy-Chutorian, an amazing product team, and a huge and fantastic engineering team and marketing and others at 8th Wall, we were really focused on democratizing access and the ability to build XR content, particularly there. I worked on our AR feature set, working with our computer vision team to bring things like curved image targets to market, sky effects, hand tracking, and adding in things like multiple faces being tracked for face effects on the web rather than just like one face at a time. And a lot of these were driven by really strong product-market fit we had with marketers and advertising to create these campaigns, or sort of moments in time where you stop the scroll and really lean into the experience and the immersive nature of storytelling that XR enables. And so that was a really fantastic moment. We had a growing platform there. And then I think Niantic, and you've previously chatted about this when we launched Niantic Studio at AWE, I think last year now, with Erik, Niantic saw that and saw this growing platform and was really excited to marry that developer growth with this vision for the map of the world and VPS that Niantic had been developing. And then after 8th Wall was acquired, we worked together on the first major launch, which was bringing VPS to the web, which was, I think, the first web-based VPS solution at the time.

[00:09:05.652] Kent Bye: Okay, and so you're announcing today a free downloadable app for the Meta Quest 3, Into the Scaniverse. Maybe just give a bit more context for what Into the Scaniverse is.

[00:09:15.237] Joel Udwin: Yeah, Into the Scaniverse is really hitting on the core themes and values at Niantic. We wanna get people together to go outdoors, or at least bring the outdoors in, and explore the world together. And we have an ecosystem that's helping to do this. One of the parts of that ecosystem is Scaniverse. And we've really focused on development in Scaniverse to enable people to take these stunning 3D immersive photos of the world around them. And they can keep them private and keep them in their own library. They can share them privately, or they can contribute them to the public map so that everyone else can stumble upon and discover the amazing things people are recording around the world today. And I really think of it as this amazing 3D photo experiences app where you can visit famous landmarks and hidden treasures. There's already over 50,000 3D photo scenes available on the map today. And of course, you can view, as I mentioned, and share your own private ones instantly. And it really creates this amazing loop with the Meta Quest where, you know, previous to this app, I think it was not necessarily the easiest thing for the everyday user to just scan something in 3D and get it onto an immersive device. And now anyone can just download Scaniverse, scan, and either view privately or post to the map and see it within minutes on their Meta Quest.

[00:10:54.938] Kent Bye: Okay. And so in terms of rendering out these Gaussian splats, I know that Meta had announced and launched Hyperscape, which was at Meta Connect last fall, and they were doing all this backend cloud rendering in order to do that. So how are you rendering out these scenes on the Meta Quest? Is there a backend of cloud rendering, or are there ways that you've found to render everything locally within whatever compute power you have in the context of the Meta Quest 3?

[00:11:20.410] Joel Udwin: Yeah, absolutely. So everything is rendered locally, and we do that across our ecosystem. So a thing that's unique about Scaniverse is when you're scanning, we don't send it to the cloud in order to process. It's processing locally on your device. And then if you choose to share it, you're basically just uploading that completed splat to the map. And we also developed and then open-sourced our own file format, the SPZ file format, which dramatically reduces the file size of splats, with the purpose of enabling this quick transportation between mobile and headset and all the other devices, as well as allowing us to render these amazing scenes on device. And it's really important for us to render on device. Hyperscape is amazing because it gives you these really well thought out, detailed 3D spaces that Meta has curated. And it also requires a really strong internet connection to actually be able to access that content. At Niantic, it was really important for us to make the Into the Scaniverse experience available to as many people as possible. And so getting rid of that requirement to have a strong internet connection, just being able to download and then locally render, really enhances that accessibility.
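To give a feel for why a compact splat format matters here: a raw export from the original 3DGS pipeline stores dozens of float32 attributes per splat, and positions, scales, rotations, and colors survive aggressive fixed-point quantization. The sketch below shows that general principle only; the 16-bit width is my own illustrative assumption, and the actual SPZ layout that Niantic open-sourced differs in its details.

```python
# Illustrative only: shrink splat positions by quantizing float32 values
# to 16-bit fixed point, then entropy-code the result. Compact formats
# like SPZ build on this principle with carefully chosen per-attribute widths.
import gzip
import numpy as np

def quantize(values: np.ndarray, bits: int) -> np.ndarray:
    """Map floats onto evenly spaced integer levels spanning `bits` bits."""
    lo, hi = values.min(), values.max()
    levels = (1 << bits) - 1
    return np.round((values - lo) / max(hi - lo, 1e-9) * levels).astype(np.uint16)

rng = np.random.default_rng(0)
positions = rng.normal(size=(100_000, 3)).astype(np.float32)  # a small scene

raw = positions.tobytes()                   # 12 bytes per position
packed = quantize(positions, 16).tobytes()  # 6 bytes per position
compressed = gzip.compress(packed)          # real scans, unlike random data,
                                            # compress well on top of this
print(len(raw), len(packed), len(compressed))
```

Shaving every attribute this way is what turns a scan that would be hundreds of megabytes as raw floats into something a phone can upload and a headset can pull down in seconds.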

[00:12:40.103] Kent Bye: Nice. That's super exciting. Has Niantic been involved with some of the cutting edge discussions around Gaussian splats and kind of the research around it? Like how have you been able to do this type of innovation? Is this just from working internally with the resources that you have there, either at Niantic or with Alphabet, Google, or are you just tracking the latest innovations and research that's happening in this space?

[00:13:03.191] Joel Udwin: Yeah, I mean, all of the above. So Gaussian splats debuted at SIGGRAPH. I'm going to mess up my dates now, like a year and a half, two years ago, something like that.

[00:13:13.788] Kent Bye: I believe it was like, yeah, August of 2023.

[00:13:16.929] Joel Udwin: Yeah. So we saw that, and I think everyone across the industry saw that, and everyone's mouths sort of collectively dropped and said, wow, this is a really exciting and compelling way to display this type of content that we're capturing. And especially at Niantic, where we had been building VPS and really focusing on how computers see the world over maybe humans when we're collecting that data, we saw the possibility all of a sudden of moving to... Well, now we can actually display this in a meaningful way for human viewers outside of just, you know, AI and computers understanding the world. So that was a really exciting moment. At Niantic, we have research teams, and our research teams contribute to CVPR and ICCV and CHI and all these other academic conferences. So we're part of the research universe. We learn from others, we expand, and we move forward together. And then we're also part of forums for open web standards. The Metaverse Standards Forum was just a place where we spoke about the SPZ file format and talked about how we can continue to expand on it together, maybe work with Khronos and others to integrate it into glTF to really expand how splats can be understood in an open format. So there's a lot of working together to push the industry to where we all know it can be. And there's no, you know, in my opinion, there's no world where anyone can do it alone. We have to work together to maximize the potential that we all have.

[00:14:50.080] Kent Bye: And so there's also this really great blog post that Niantic had from December 10th, 2024 from Kirsten M. Johnson called Splats Change Everything: Why a Once-Obscure Technology is Taking 3D Graphics and Niantic by Storm. So I just wanted to point to that blog post, because I think that it was a really comprehensive history of Gaussian splats, because there's developments that happened before the latest iteration of that. But I think with the latest innovations with XR technologies and AI and everything else, it feels like it's at a critical mass. I read this blog post of someone saying that this is a completely new render pipeline. And I think that's an important thing for folks to realize: beyond physically based rendering and polygons and all that existing infrastructure, Gaussian splats introduce like a whole new era. And I think that Niantic is really picking up on that because you have the capability to have, well, let me have you explain: what are Gaussian splats allowing you at Niantic to do that you weren't able to do before with the existing render pipelines? And what is this opening up now?

[00:15:55.819] Joel Udwin: Yeah, absolutely. So, I mean, we can already see it with Into the Scaniverse and, you know, as I mentioned a little bit earlier, Niantic has been collecting this data of the world that people have been sharing with us. And we have been using that to build out the understanding that computers have of the world so that we can create these persistent and meaningful XR experiences. There was another blog post about Niantic's large geospatial model. It's contributing to that spatial understanding of the world. Today, when people think about Gen AI, Gen AI is really good at text. It's really good at 2D images. It's getting really pretty good at video. Some of the latest stuff is really cool. It's not yet amazing at 3D or real-world space. And we need to bring data in to help it be really good at that so it can solve a lot of meaningful use cases for people, as well as push forward on some of the really exciting gaming and other entertainment things that are also possible in XR. And when we think about Gaussian splats, that allowed us to take a step back and say, hey, in addition to giving all this data to computers so computers can understand the world, we now also can meaningfully render and visualize these large areas where, you know, you might have a lot of points in one area when you scan with Scaniverse, because you're really concentrating or focusing your camera on one area. But as you, you know, sort of move outwards, your points get less concentrated and more sparse. And splats are a fantastic format for taking that data, whether it's dense or a little bit more sparse, and making it meaningful to the human eye so that there's fewer holes, fewer weird gaps, and it's more of a real place or a real object or whatever it is that you're scanning. So that's really exciting, to be able to visualize things really well. Splats being so new and being this new rendering pipeline, they also have problems that they aren't yet solving. For example, you can't use a splat necessarily as a physics object or for other interactive elements that you would expect from a mesh in your scene. And those are areas that I know we at Niantic are thinking about, and others are thinking about as well.
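To make that interactivity gap concrete: a splat cloud has no surface for a physics engine to collide against. One simple stopgap, sketched below as my own illustration rather than anything Niantic has described, is to voxelize the splat centers into an occupancy grid that interaction code can query.

```python
# Build a coarse collision proxy from splat centers: any voxel containing
# at least one splat center is treated as solid for interaction queries.
import numpy as np

def occupancy_grid(centers: np.ndarray, voxel: float) -> set:
    """Return the set of voxel coordinates occupied by splat centers."""
    cells = np.floor(centers / voxel).astype(int)
    return {tuple(c) for c in cells}

def is_solid(grid: set, point: np.ndarray, voxel: float) -> bool:
    """Cheap query: does this world-space point fall inside the scan?"""
    return tuple(np.floor(point / voxel).astype(int)) in grid

centers = np.random.default_rng(1).uniform(-1.0, 1.0, size=(5_000, 3))
grid = occupancy_grid(centers, voxel=0.05)
print(is_solid(grid, np.array([0.0, 0.0, 0.0]), voxel=0.05))
```

Production systems would more likely fit meshes or signed distance fields, but the voxel trick shows how little extra machinery is needed to make a static splat scene start responding to virtual objects.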

[00:18:28.086] Kent Bye: Yeah, so I'm seeing that there was a previous WebXR version of Into the Scaniverse. I think it launched back in December of 2024. I see a blog post from December 11th, 2024, Building Into the Scaniverse in Niantic Studio. And so it sounds like you've been able to have this mobile app where people are able to capture these Gaussian splats and then have like this web-based way of uploading. And maybe you could take me back to like the first iteration of Into the Scaniverse, this WebXR version, and give me a bit of context for where you started. And then we can get a little bit more into what the app on the Quest 3 is now going to be able to enable. But yeah, so maybe take me back to the initial launch of Into the Scaniverse.

[00:19:11.385] Joel Udwin: Yeah, absolutely. So this was around December 10th, I think, of 2024. And this was all about the vision of really making it tangible for people who are using Scaniverse. And we have a really dedicated cohort of individuals who love using Scaniverse. We just had a meetup in Japan, where, I don't want to butcher the town name, so I'm not going to say the town name, but where everyone came together and they basically scanned every square inch of this town. And these are people who, this is just like a Scaniverse meetup. They're just like really excited and motivated by collecting sort of the world around them digitally. And so certainly there are users, many users like that in Scaniverse. There's many users of Scaniverse who are using it to scan objects, might be using it for professional purposes, maybe are using it to scan people and create like fun 3D memories. And we wanted to expand the universe of Scaniverse and make it something that can be even more immersive and exciting to people, and also an opportunity to bring sort of the more casual user in and give them something to be excited about. And we found there to be a lot of overlap between the personas and users who are interested in Scaniverse and people who would happen to own a Meta Quest device. And so we saw that as an opportunity to grow and to expand that user base and to make something that's meaningful to everyone and really motivates people to scan and share more. Because ultimately, Scaniverse has the capability to be this place where people share and store and celebrate their 3D memories.

[00:21:01.274] Kent Bye: Yeah, going back into my backlog from episode 475, I talked to the developers of Google Earth VR. That was published back on November 16th, 2016. And so I remember seeing that demo and being completely blown away in terms of being able to travel all over the world. And at that point it was more of like a high-level view. And so you were like at a bird's-eye view flying over spaces, and you would get down to the ground level and it would be almost corrupting my memories of places because it was so low-poly and fragmented. And if you were flying over plains, then the plains would just have a lot more weird digital artifacts at the ground level. They did eventually launch Street View within Google Earth VR, but that was always a monoscopic view and there was no ability to kind of move around. And it was still a lot better than what you would get from the low-poly version of Google Earth VR. And there's since been some folks who have been doing some navigating through some of those Google Street View images. But I'd love to have you compare and contrast what Scaniverse is doing, because I see that, from my perspective, Scaniverse is that high-resolution street view, but then maybe eventually it gets to the point where you can have that overview effect that you get with Google Earth. But you're kind of incrementally getting these objects of interest that are going to be super high fidelity at a human scale rather than flying around. But I could see where this could be going, where you eventually have like a Gaussian splat of the earth that you're able to then navigate around, similar to what Google Earth VR does. So you're kind of starting at the opposite end from where Google Earth was: they started from on high, kind of seeing the forest, and here you're starting with the trees, and maybe eventually you'll get up to the forest. So let me hear some of your perspectives on that.

[00:22:46.767] Joel Udwin: Yeah, well, and there's a lot of, you know, thought leadership and thinking that have happened between, you know, Niantic, obviously, as well as Google. Niantic was a spinoff from Google, including people from Google Maps and Google Earth. What's really interesting, as you sort of mentioned, is Google starts at the bird's-eye, or maybe satellite's-eye, view and goes in. And today, it sort of goes all the way into things that are right next to streets. And at Niantic, where we're focused on helping people explore the world outside together, we're really excited to democratize and enable the collection of the everyday moments: the alleyway that the car can't go down, the path in that ancient city that is full of cobblestone and not always accessible, and the top of that mountain, right? And so we're sort of approaching it maybe from the other direction of, let's collect those really hard, only-humans-can-go-out-there-and-collect places that people are willing to share with us. And then from that, we can expand out over time as we're interested and able. There are a lot of places in the world, though, and the world is always changing. And so enabling people to collect and be excited about collecting and sharing, that's a big win for getting this type of data, which doesn't really exist in a huge capacity outside of Niantic, into this new world of like Gen AI and these other algorithms that need this data, or else it's going to be a big blind spot for computing in the future.

[00:24:27.125] Kent Bye: Well, I know one of the things that has happened on Google Earth is that they started with the satellite view, and then with the street view, especially in the European Union, there's a lot more concern around privacy and blurring out license plate numbers or addresses. And there's an option for Google Earth: if you want to have your home blurred out, you can, but you have to go out of your way to do that. So how do you at Niantic start to handle some of these more intractable privacy issues where, let's say, somebody comes over and scans my living room and uploads it against my consent? What are the controls to have those different types of boundaries, but also how do you navigate some of the privacy issues that are starting to come up with something like being able to do high-resolution scans of locations and upload them to the internet?

[00:25:17.441] Joel Udwin: Yeah, so starting as a value when I was working at 8th Wall and carrying through to Niantic and Scaniverse here, we're really private by design. And so in Scaniverse, what does that look like? Every scan you take is in your library, and your library is only on your device. So if you take a scan, it's only yours. Niantic only receives and can use your scan if you proactively share it to the map. So you know you're publicly posting something in the same way that you would if you are writing a status update and posting on Facebook or LinkedIn or wherever else. You know you're posting publicly at that point. That's the same approach we're taking with scanning. So everything is private first. And it's local on your device. It's not on any Niantic server. And it's only shared when you are proactively sharing it.

[00:26:10.174] Kent Bye: And then how do you prevent someone from, say, scanning something against someone's consent and then uploading it? Is that under the DMCA? I'm thinking of things like the Fourth Amendment, the fact that you have protections from unreasonable search and seizure. However, there's the third-party doctrine of the Fourth Amendment, meaning that whenever you are uploading stuff to the internet, then it has no reasonable expectation of privacy. So if people are going into houses and doing scans, it creates all these Fourth Amendment implications. It's also a larger question around, like, what if someone comes into your home and without you knowing starts to upload stuff? Even if you're uploading from your own home, then there may be Fourth Amendment implications of that from a privacy perspective because of the third-party doctrine. So just curious, how does that get handled with these intimate private spaces of people?

[00:27:02.448] Joel Udwin: Yeah. Well, first of all, I feel like I just learned a lot. I feel like I got a little law school lesson, which I like. At Niantic and in Scaniverse, our terms of service require that if you are contributing something, you have the right to contribute that. Obviously, that looks different in different countries. So it's hard to like spell out exactly, this is what you must do in each locale. We really depend on people to have the correct knowledge and also, outside of just legal, to just do the right thing and be kind neighbors to each other. Outside of that, within Pokemon Go and Ingress and Niantic's titles, we've been using location-based content for a very long time, and we have a robust system in place that's also in place in Scaniverse, where users can report things that should be taken down, can request takedowns for various reasons, can have areas blocked from even being able to submit for scanning, whether it's in Pokemon Go or elsewhere. And so, you know, Scaniverse was released in 2021, and we added all of this really map-based focus in the last year or so. It's built on a tool stack and a stack at Niantic that is privacy-first by design and really focused on ensuring that people are able to comply with laws and that they are able to be respectful of each other and report when things are going wrong.

[00:28:21.359] Kent Bye: Okay. Well, there seems to be like a pretty big social component. You had mentioned that there was like a Scaniverse meetup, which sounds like it's in the vein of, say, Pokemon Go or Ingress, where you're creating these location-based services where you're encouraging people to go out into the physical world, the physical metaverse, as Niantic likes to say, so that you're able to have these social interactions that are happening in physical reality. Are there other ways that Niantic is thinking about how to catalyze and facilitate those meetings in physical space? Or are there also, now that it's on the Quest 3, ways for people to have those different types of social interactions but be on the Quest and have a social VR? I know that Niantic has a lot of strong opinions in terms of physical reality and preferencing that, so I would be surprised if there's too many social features within the context of the Meta Quest 3. But I'm curious if you've thought around, like, it seems like a natural next step that if people are not in those physical locations, are you hoping that people could start to have these social interactions within the context of the Scaniverse as building the scaffolding of a metaverse platform where people could meet up in these virtual spaces?

[00:29:29.727] Joel Udwin: Yeah, that's definitely something that's on the radar. You know, you mentioned Niantic's value of like bringing people together, exploring the real world and the, you know, real-world metaverse, which is based on the real world. And I think there's definitely a world where Into the Scaniverse would resonate even more strongly with those values with multiplayer features or the ability to see and engage others in those exciting spaces that they can discover in Into the Scaniverse. It's not something that's currently available. We've done a lot of work to get the rendering on device right and be able to add in the 50,000 splats that are available today, and more being added every day, and complete that loop of scan into headset really quickly. But it's definitely something that is on the radar that we're thinking about in the future. A really good, not exactly for the end user, more on the developer side, but a related item we've worked on that showcases that is a paper that was just accepted at the CHI conference, the Computer-Human Interaction conference, called CoCreatAR. And that was a study we did with developers, really testing out whether it's helpful for developers who are building real-world content to have someone on site. So, you know, imagine I'm in San Francisco, maybe there's someone in London, and we're developing for a location in London. It's really hard as a developer in San Francisco to sort of, you know, just download that splat or mesh or whatever and sort of think about how I'm building everything in the real world. But I haven't validated it. I don't know that I have the most confidence. You know, sure, virtually it looks great. But if I'm using VPS and doing it like actually attached to the real world, does it work and operate correctly? Does it look right? So we created the system and tested out whether it makes developers feel more confident, whether they're able to build faster, and things like that, where you could create the shared space using all of this technology that's also in Scaniverse and Into the Scaniverse and everywhere else. And we tested it out with developers to see, hey, does this make your development experience building real-world content more meaningful, faster, easier to iterate and understand? And basically I'd be developing here, someone would be looking live at what I'm developing on site, and there's like this two-way channel of communication to talk about it and sort of iterate together. And it had a great impact for developers. I don't want to reveal too much because I don't want to like scoop the paper release, but that goes to show sort of the shared nature we're thinking about, both for people creating these experiences and, over time, we'll hopefully see it for the end users who are enjoying them as well.

[00:32:10.336] Kent Bye: Yeah, and I know when Google Earth VR launched as well, they had these more cinematic versions where you could take this edited journey through all these different spaces, or you could go to different hotspots that were the wonders-of-the-world type of places, that were a lot more majestic in terms of the size and scope of places that people already go to in physical reality to visit. And so it sounds like you've also got a number of different starter locations that you're listing here: the Beomeosa Temple in Busan, South Korea; the Banksy Tunnel in London; Jameos del Agua in the Canary Islands in Spain; the Piazza Mincio in Rome, Italy; and then the High Line in New York City, USA. So it sounds like you're also starting to take people on these guided tours. Can you speak to that curation of these locations to kickstart people into putting them into places that you think are the paradigmatic examples of what is possible with this technology?

[00:33:08.993] Joel Udwin: Yeah, and I would add, it really is more curation and enhancing the discoverability, because all content in Into the Scaniverse is user-generated content. We didn't go and scan those locations personally as Niantic. We definitely have people here who could do amazing professional scans and have the drones fly in and everything else that is fun when you get to like the really, really advanced scanning. And these are all created by either Scaniverse users or players of Pokemon Go, Ingress, and other Niantic games who have contributed scans to our public map. And what we're doing when we surface things up in our discovery feed, we have this discovery feed just highlighting some really exciting and fun locations, is we've actually either looked at information about where people have gone and are interested in going, or have just personally encountered these splats in our own testing and said, wow, this is a really great example. The filtering that we do in Scaniverse is really more around the performance of the location when it's rendering, rather than, is this location in and of itself like the most meaningful location. As long as it's meaningful enough for one person to have scanned it and posted it, it will appear in Into the Scaniverse, as long as it will perform well when it appears there.

[00:34:28.679] Kent Bye: Okay. Yeah. And speaking of Pokemon Go and Ingress, there was a report that came out of Bloomberg back on February 18th, 2025, that reported that Niantic's in talks to sell off Pokemon Go and the games unit from Niantic in a $3.5 billion deal to Saudi Arabia-owned Scopely Inc. Is there anything that you can comment on that? I was honestly a little bit surprised to see that, but just curious to hear if you have any comment in terms of the games component at Niantic, or if that is something that can't be confirmed right now.

[00:35:03.133] Joel Udwin: I can't comment on that. I'm just really excited about Into the Scaniverse and all the work that we're doing and launching today.

[00:35:09.485] Kent Bye: Okay, yeah, I figured that would be as much, but even if that does happen, you still have a commitment to this type of scanning of the world. What I'm seeing with Into the Scaniverse in this move is this mapping out of the world. I'm curious to hear any comments on, Gaussian splats are already using a neural rendering technology, which is the new render pipeline, but there's machine learning AI that's integrated into the core of how this even works. And so you had mentioned briefly that AI is really good at text and images and video because there's a lot of content to train on. And so there hasn't traditionally been a lot of content that is more 3D-based to train on. But I think that with the Gaussian splats, you're opening up these new vectors and possibilities to do all sorts of new things with this content, including putting shaders and other things to kind of like modulate physical reality, you know, augmented reality. There's lots of things where this could go in the future, things with mixed reality devices and everything else. So let me hear a little bit more of your pie in the sky of, like, now what is going to be enabled with this repository of content? You are reporting that there's around 50,000 splats that you've gathered so far. That's starting to get to the point where you would be able to start to train AI, large language models or other models, on it. And so just curious to hear how AI starts to play into the future of the Scaniverse and where you see that going.

[00:36:29.062] Joel Udwin: Yeah, well, since founding Niantic, we've had an ambition to build a world map, and Scaniverse and Into the Scaniverse fit right into that mission of contributing to this spatial intelligence that we'd like to build out together. And Into the Scaniverse provides both a human-level interface to visualize and see this amazing world that you're building out, so people get to finally see their contributions, and it also provides an amazing incentive to continue contributing for those who are motivated by seeing what they've built out of the world. When you get all of that data, you can bring it in and do all sorts of amazing things. I'm one person. I have some thoughts. But the power of large language models that we've seen is that when you bring the data all together, train the model, and make it available, people do some incredible, incredible things with it. I was just listening to the Economics of Everyday Things podcast. Sorry for plugging another podcast. But they were doing an amazing episode about the economic realities of guide dogs. I never knew that a guide dog costs $75,000. And usually, hopefully, for someone with limited or no vision, it's given for free by a nonprofit who took that cost and collected donations. Imagine a world where we have this spatial intelligence, and technology can help make things like guide dogs even more accessible to everyone, including parts of the world where maybe either caring for a dog or being able to spend $75,000 on training a dog is out of reach. And that might be out of reach, but across the globe, cell phones are broadly accessible. And so if we can leverage that camera and a large geospatial model for that... You know, that's just one example that is top of mind because of that podcast. But there's so many other examples like that that can be used in everyday practice, as well as, you know, the pie in the sky, which I'll leave to the big dreamers.

[00:38:32.486] Kent Bye: Yeah, well, one of the other things that I've noticed about Niantic is that you're really on the bleeding edge of integrating with all the existing hardware manufacturers, where at this point, I haven't seen much indication that Niantic's working on their own hardware, but you are on the front lines of integrating with all the existing platforms that are out there. So you have the Snap Spectacles, you have the Meta Quest, and I'm sure that there's integrations with Android XR, since that's still in its nascent beginnings. But with the close relationship with Google, I can imagine that there's some Android XR headsets on site that you're also working on. But I'm wondering if you could just speak generally in terms of, like, there's the VR side of things, there's the mixed reality side of things, there's the AR side of things. How are you starting to see how Scaniverse might be living in each of these different domains of virtual reality, mixed reality, and augmented reality?

[00:39:25.347] Joel Udwin: Yeah, that's a really great question, because Scaniverse absolutely plays across them all. When you scan something with Scaniverse, via Into the Scaniverse, as well as just exporting your scan, that can be something that's really powerful for humans to view in VR, as it is in Into the Scaniverse, where you can visit all these 3D moments around the world. When we think about AR, those scans can become VPS locations where you can anchor real-world content, and while you're out in the real world, encounter these digital moments that are meaningfully tied to that real-world place. And then mixed reality gets the benefit, I guess, of both worlds. You know, as long as the manufacturers enable that camera access so that you can actually localize, you could get the benefits of VPS. If you think about the headsets that maybe aren't displaying things, you could think about an audio guide, or sort of like localization via camera and then audio output. And then of course, mixed reality is also capable of VR. So you sort of get the best of both worlds for mixed reality, but also thinking about audio augmented reality, which is becoming really prevalent in the hardware space.
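For developers wondering what anchoring via VPS amounts to under the hood: the service, conceptually, hands back your device's pose in a persistent location frame, and placing content then reduces to composing rigid transforms. The sketch below is plain 4x4 matrix math with made-up numbers; Niantic's real interfaces (ARDK, Niantic Studio) have a different shape, so treat every name and value here as hypothetical.

```python
# Composing rigid transforms to place VPS-anchored content in device space.
import numpy as np

def pose(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 rigid transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Hypothetical localization result: the device stands 2 m in front of the
# anchor's origin, 1.5 m up (identity rotation for simplicity).
device_in_anchor = pose(np.eye(3), np.array([0.0, 1.5, 2.0]))

# Content authored 1 m above the anchor's origin.
content_in_anchor = pose(np.eye(3), np.array([0.0, 1.0, 0.0]))

# Content pose in device coordinates: inverse(device pose) @ content pose.
content_in_device = np.linalg.inv(device_in_anchor) @ content_in_anchor
print(content_in_device[:3, 3])  # where the renderer should draw the content
```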

[00:40:35.572] Kent Bye: So I know that at the Snap Summit, there was Peridot that was being announced on the Snap Spectacles. And in discussion around that, that was really the very early beginnings of like taking the existing Peridot version that's on the Quest platform. And like each of the different platforms provides a new opportunity to explore the unique affordances of that specific device. And so I'm curious what you foresee the unique affordances of something like Into the Scaniverse on an AR-specific headset like the Snap Spectacles, where you're able to potentially overlay on top of physical reality or modulate it or add or have additional context information. Just curious to hear where you see the AR component of Into the Scaniverse starting to continue to be played out.

[00:41:21.420] Joel Udwin: Yeah, well, certainly if you're using those Spectacles and they're capable of localizing, then you can imagine using that for real-world anchored content, for guidance, for things like that. You can also imagine, maybe you're not at that location, some sort of tabletop experience where maybe a location is annotated or you're getting historical information or something like that. Another sort of area of the story is, you know, you always ask your guests about, like, what's the potential you're excited about with this technology? And something that keeps me up at night is, I think there's so much potential, and I'm really worried that we won't get to harness all of it because we're not making the technology accessible to enough people. And that's sort of a reason why I'm really proud and excited that Into the Scaniverse was built on Niantic Studio, and why it's amazing to see Niantic go after all of these early devices, as well as, beyond all of the mobile devices, supporting the web and supporting Unity. You know, with Niantic Studio, we want to make it the best and fastest place to get started with 3D development. And we want to make it as easy as possible for eventually everyone to be able to build and deploy wherever they want to deploy, and in that way, harness the creativity and potential of everyone to drive the unlimited potential that XR represents.

[00:42:47.166] Kent Bye: I'd love to ask about the Lightship API. Is there anything that you're using in the context of Into the Scaniverse that is using any of the Lightship APIs?

[00:42:55.638] Joel Udwin: So Lightship APIs typically refer to our ARDK, or like our Unity solution. So we're absolutely building on Niantic's tech stack through Niantic Studio.

[00:43:06.755] Kent Bye: Okay. And do you see that Into the Scaniverse is creating new opportunities for new APIs that could be provided in the future because of the data that's now available? Like, what type of core infrastructure capabilities might Gaussian splats from Into the Scaniverse lead to in the future?

[00:43:26.638] Joel Udwin: Yeah, certainly. I mean, we have this whole new data type of the Gaussian splat, where, you know, previously we had APIs to return a mesh of a location. I can imagine over time exposing more of that data via API so people can build other meaningful things via our, you know, mapping SDKs and APIs that are available. So it's definitely, you know, something that we want to make sure we're getting right and that it's working well, because certainly, you know, when you put someone in a headset, you don't want to make them sick. So, you know, you mentioned earlier in our conversation, you know, what led up to the release on web? You know, the biggest thing we were doing when we released on web was making sure that we're rendering performantly and then, you know, testing it before going into a store environment where people will very strongly tell you if it's not working up to their standards. And so that's been... a real amount of technical work that the team here has done. And once we feel good about it, which we do now that it's out on the Meta Horizon store, there's opportunities to safely make it available to others as well over time.

[00:44:36.971] Kent Bye: Well, I know a lot of times when I've been traveling to somewhere, going to a place and I want to just get a sense of what it's like. Sometimes I'll go to Google Maps and go to Street View and just get a sense of like, what does this place look like? And I can imagine that there could be a use case for businesses who want to show their showroom floor or to do updates around what's happening in their business. And so what kind of enterprise use cases for businesses do you see starting to adopt a technology like this to start to make available what's happening in the context of their business?

[00:45:08.171] Joel Udwin: Yeah, absolutely. I saw a demo the other day of someone scanning their latest in-store display and saying, like, hey, here, wanted to show you what's going on, and do little hotspots. This is something that we've previously already seen built with Niantic's 8th Wall product, where people might integrate into e-commerce. They might show location information. We had digitally anchored content for store-based experiences using VPS. So it's something that's already in our DNA. We've seen people using our technology that way already, and we're very happy to see them continue to use Scaniverse and scanning, as well as the rest of our stack, in enterprise applications. It's really important that we enable people to build the exciting, meaningful content that they're excited by to put out there, because we need more and more people to encounter this type of 3D content so it becomes normal, understood, and they know how to interact with it. So part of the reason splats are so phenomenal today is because it's this jaw-dropping wow moment of, wow, I can really visualize this thing that was scanned and I can see it. And it's a really robust visual representation. Eventually that is going to go away, and people are going to expect all visual representations to be like that. And we're going to have to move forward on the interactivity and adding more meaning behind the visualization. So I'm really excited about that. There's an opportunity there. And it's just work to add that interactivity.

[00:46:47.964] Kent Bye: Yeah, and as I talk to different people around Gaussian splats, it sounds like part of the next frontier is the 4D Gaussian splats, which I think probably requires more of a Lytro-style array of perspectives that are able to capture the same moment at the same time. Because if you have one single perspective, it's really difficult to have a video or dynamic motion happening, because you're only getting, at that point, it'd be more like a Depthkit type of thing with a single perspective, and there'd be a lot of artifacts behind it. But just curious if 4D Gaussian splats are anything on your radar, or if that's something that would be more beyond the technical capabilities of a single-perspective, mobile-based solution. Because I'm thinking that what Scaniverse is, is like a still life that's representing like a snapshot of a moment in time. It's actually like many moments that are stitched together into a single moment. But I'm just curious to hear some of your thoughts on the future of 4D Gaussian splats and more dynamic moving motion animations, capturing things that are unfolding over time.

[00:47:45.774] Joel Udwin: Yeah, I mean, it's definitely on the radar, nothing active that I can comment on. When we think about the evolution of moments on the internet and in apps, you went from Instagram sort of covering up the fact that cell phone cameras weren't so great, to TikTok, where you have to have that movement. You've got to get the dancing in there, right? And so we're in our Instagram moment of casual splat capture across the world. And I love to look at history and see how that's going to unfold for new things. And I could totally see a world where over time we get to, you know, the TikTok moment for splats. You know, other areas that are on the radar that we're thinking about are even things just like adding more to the capture, like audio. Today, when you go into Into the Scaniverse, we have this amazing, fun, kind of like on-theme background music. If you're capturing this moment, wouldn't it be interesting and fun to capture the audio that goes along with it and have that as the background? So there's a lot of different possibilities for making things more interactive, engaging, and bringing people more into that moment. And certainly 4D is definitely one of those possibilities as well.
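As a footnote on what "4D" could mean mechanically: one naive formulation (my own illustration, not a Niantic roadmap) keeps each splat's static attributes and adds a time-varying position, interpolated between captured keyframes.

```python
# Naive 4D splats: linearly interpolate per-splat positions between keyframes.
import numpy as np

def positions_at(keyframes: np.ndarray, times: np.ndarray, t: float) -> np.ndarray:
    """keyframes: (K, N, 3) positions of N splats at K ascending timestamps."""
    i = np.clip(np.searchsorted(times, t, side="right") - 1, 0, len(times) - 2)
    w = (t - times[i]) / (times[i + 1] - times[i])  # blend weight in [0, 1]
    return (1.0 - w) * keyframes[i] + w * keyframes[i + 1]

times = np.array([0.0, 1.0, 2.0])
keyframes = np.random.default_rng(2).normal(size=(3, 100, 3))
print(positions_at(keyframes, times, t=0.25).shape)  # (100, 3)
```

Published 4D Gaussian splatting research is more sophisticated, fitting deformation fields or per-splat motion bases, but the capture and storage burden Kent raises is visible even in this toy version: every added keyframe multiplies the data.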

[00:48:55.182] Kent Bye: And so one of the things that I know Niantic's been looking at is the frontiers of what the emerging business models are gonna be with this physical-based metaverse, geolocated business models that you've been able to experiment with in both Pokemon Go and Ingress. And so what kind of business models are you foreseeing with something like Scaniverse? Do you see an immediate monetization strategy, or do you see this more as like an R&D building up of the infrastructure, and there's no real short-term interest in trying to find emerging business models for where this might lead?

[00:49:28.001] Joel Udwin: Yeah, so Niantic acquired Scaniverse in 2021. And prior to that, I believe Scaniverse had like a pro tier, or there was like a paid element to Scaniverse. And when Niantic brought Scaniverse in-house, Scaniverse became a free application. I don't know of any current plans to monetize within Scaniverse directly. There's always, I guess, possibilities. I would say, though, that there's a lot of value in the data that users choose to share with us. And so having Scaniverse as a free application that is really a meaningful part of the work of many professionals, who use it to scan things for their own professions, in addition to scanning locations and sharing things with Niantic, for now, it's really valuable as this entry point for people to contribute. And there's value in the data when they choose to contribute. And that's something that we're really excited about, something that they're proactively choosing to do. So I would hope the users are excited about that too. And it creates this map of the world that we can all see Into the Scaniverse building out together.

[00:50:34.773] Kent Bye: Awesome. And finally, what do you think the ultimate potential of Gaussian Splats and physical-based Metaverse might be and what it may be able to enable?

[00:50:47.543] Joel Udwin: Yeah, I mean, that's such a big question because I'm just one person. I'm really excited about, as I mentioned, sort of the everyday practical things that you can do with XR and splats, but there's so much more out there. And as I mentioned earlier, my biggest fear is that we're not making the tools accessible enough for everyone to contribute and unleash the full potential. And so I'm really passionate about Niantic Studio, about Scaniverse, about making all of the tools that Niantic has available to as many people as possible, because that potential isn't going to come from me. It's going to come from some amazing person who has an amazing idea out there. And so if we can get them the right tool set so that anyone can access it, anyone can create and show the meaningful thing that they've built, that to me is the sum of the potential that maybe I and the team I work with at Niantic can bring to the game.

[00:51:46.321] Kent Bye: Awesome. And is there anything else that's left unsaid that you'd like to say to the broader immersive community?

[00:51:52.005] Joel Udwin: First of all, thank you. And thanks to Kent as well. As I mentioned when we were just chatting, like, I've been in the immersive space now for five years, and I got up to speed because of podcasts like this and communities like the VR/AR Association. And so just: thank you, community, for being there, for providing resources, for educating. It's been such a ride. I'm really enjoying it, and it wouldn't have been possible without all of the resourcing that a lot of people have spent a lot of time putting together to bring more people on board.

[00:52:24.966] Kent Bye: Awesome. Well, Joel, thanks so much for joining me here on the podcast. It's really exciting to see all these announcements and innovations from Niantic and, you know, from your own personal journey coming from 8th Wall and WebXR and this kind of like open web technology stack, and how you had to really optimize for that and how that has led into being able to render Gaussian splats locally. Just really exciting to see this type of allowing people to go out and capture different aspects of physical reality. And it really feels like it's the beginning of the physical-based metaverse, to have this baseline of content, the physical reality, and also just how that's going to bring people out. And it's really exciting just to hear that there's communities and groups of people that are doing this collaboratively together, because it's something they're able to create together. And then by the end, they're able to all enjoy that for themselves within their community, but also share it with the wider world. So I'm really excited to dive deeper into this latest Into the Scaniverse release on the Quest 3. I'm going to be diving in right after this conversation to go check it out myself. And yeah, looking forward to seeing where Niantic takes all this here in the future. So thanks again for joining me here on the podcast.

[00:53:28.248] Joel Udwin: Thanks, Kent. See you in the Scaniverse.

[00:53:31.024] Kent Bye: So that was Joel Udwin. He's the Director of Product for Niantic's AR Research, Developer Platforms, and Scaniverse. So I've had a number of different takeaways from this interview. First of all, Scaniverse is a really cool app. You should definitely check it out, because you can do very quick scans, you can have everything processed locally, and it's rendering all on your phone as well. Also super fascinating to learn about the connection between 8th Wall and Niantic and what they're doing with all these different technologies, and just how, in order to get things to work on WebXR, there's been a lot of optimizations that have had to happen. And then those optimizations can be translated into the different applications that they're building, making for just a really smooth overall user experience. I will say that when you're zooming in and out of the map, there's this kind of annoying thing where, when you get super close, rather than the center being the point of focus, it ends up being a little bit higher because it does this tilting of the map. And I found it a little bit difficult to navigate close in and also to try to get your bearings to find specific locations. You have to kind of know your geography and know all the cities surrounding it as you're zooming in, and it can be a little bit tricky to find wherever you might live to see if there's any local scans there. But there is a way to zoom in and find more and more of these different splats that are being featured within the context of your town or anywhere that you want to go around the world. So it's really cool to see another type of mapping application and to explore around. One of the other things that would be nice would be to be able to turn off the audio. One of the things I noticed on the Meta Quest is that you can pull up a browser and, like, pull up a YouTube video and start playing it, so you could listen to some other audio or podcast or something in the background while you're exploring around the Scaniverse. But there's no easy way to turn off the audio. Generally, I think it would be nice if the Quest offered some way to manage the sound sources from different apps as you do more and more of these different mixed reality applications. But yeah, just overall, I'm really impressed with the application. I appreciate the privacy-first types of design where you don't have to upload things; you can scan everything locally. I didn't find a way to look up the splats that I recorded unless I upload them. At least it wasn't syncing up for me if I just had them privately. I think he mentioned that you have to actually upload them in order to actually see them. I think it'd be great if you could highlight some of the different scans that you'd want to jump in and take a look at, to be on your mobile phone and find a scan, because the mobile phone interface is a lot easier to find things with, just generally, given the map interface that they have there, and then to be able to launch it onto the Quest. It'd be nice to have a feature like that, or at least to flag them and have them be synced across your accounts. And he did mention VPS a number of times. If folks don't know what that is, it's the Visual Positioning System, where it's basically like GPS, but in a hyper-localized context so that you're able to do even more mixed reality applications.
And it's pretty interesting to see the ways in which they're blending and blurring the more VR versions and having a way to develop and preview things, but also have someone there in real time, seeing how those kinds of mixed reality applications are looking within the actual physical context. And so this seems like it'd be a great way to do some experimentations within the context of VR. So I think overall, they're way more interested in the more AR version. So to me, it's interesting the fact that they're launching this on a VR platform, but I think VR is just a much better platform to see some of these different Gaussian splats. It feels really native to VR to be immersed in a space and to be able to get close up. They have stick locomotion; you can move around, you can fly up and down. Just a really great interface, and it's a great way to jump in and see all these different objects that they're scanning. Most of them are like monuments or statues or things that are in different cities, or maybe a mural or big objects, and it's whatever the interest was for that person who happened to take the scan. So I think overall, it'd be nice if there were a guided tour that would have some themes, or at least if you could search for different tags, so if you were interested in one thing, you could just go through and find those different places. So that type of discovery thing could use the next iteration of trying to help people discover these different splats, or allow people to record like a guided tour or something like that as you are popping around to these different places. So having other audio components, whether it's just ambisonic captures of different places, although, you know, with the phone it's hard to capture ambisonics because you'd need an actual ambisonic microphone, but in the future I think this is headed towards having more and more immersive components on the audio level: having people add stories or add narrative parts, or having different playlists that you could go and watch. And just like Google Earth VR would have some more cinematic guided tours that they could take you on, it'd be nice to have a little bit more of those things as well. They have a shuffle button that can allow you to kind of randomly find different splats from around the world. But yeah, just curating things in a way that categorizes things into different types of objects and different genres. I was looking for a little bit more of, like, vegetation or plants that would be unique to different geographies. And so as time goes on, it's going to get pretty cluttered, and so discoverability is going to be more and more of a priority to figure out how you're going to actually find some of these different things that you're interested in. But super cool to see this pipeline from the phone into the Meta Quest and then eventually into more AR mixed reality applications. It sounds like they're on that path. And Niantic has been really on the bleeding edge: whenever there's a new platform announced, they usually have something there, showing off all the different R&D that they're doing. I'm really impressed with a lot of the different stuff that Niantic has been doing. And you can see that this passion for mapping is a part of the DNA that just makes a lot of sense for augmented reality and mixed reality. So expect to see a lot more of this type of stuff from Niantic in the future.
And I'm very curious to see how they continue to develop this line of products and how that continues to build on all the other things that they have planned for here in the future. So that's all I have for today, and I just want to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.
