Today Niantic’s 8th Wall is announcing Niantic Studio, which is “a new visual interface for Niantic 8th Wall developers that offers an entirely new way to build immersive 3D and XR experiences.” It’s essentially a web-based game engine using three.js that can render WebXR experiences and integrates a few of Niantic’s Lightship APIs, with more integrated computer vision and geospatial mapping features coming soon.
I had a chance to speak with 8th Wall founder Erik Murphy-Chutorian at length to get a lot more details. Be sure to tune into the podcast or read more information below for a lot more context on this announcement, which is being made at the Augmented World Expo. You can read more about Niantic Studio in their blog announcement.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Rough Transcript
[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast, the podcast that looks at the future of spatial computing. You can support the podcast at patreon.com. So in today's episode, I'm featuring some news that's being announced at AWE, which is starting today. I'm speaking with the founder of 8th Wall, which was founded in 2016 and acquired by Niantic in 2022. And they're announcing a brand new Niantic Studio, which is a web-based WebXR game engine, essentially, where you can build immersive experiences and then deploy them out into WebXR experiences, and it's using three.js. And they have an entity component system and all sorts of other pieces. And yeah, essentially, it's like a game engine that's within the browser that they've built. So in the past, 8th Wall has slowly integrated APIs that have been developed by Niantic. They have their mobile games of Ingress and Pokémon Go. They've abstracted out a lot of their geospatial mapping APIs into something called the Lightship API, which you can use for mobile applications. But they've also taken some of those same APIs and slowly integrated them into their 8th Wall AR projects. So they have WebAR, which is essentially like a WebXR skin that is able to do augmented reality, because WebXR can do both VR and AR. And so they're creating this whole new game engine so that you can build even more nuanced scenes. And eventually they're going to be tying in more and more of their Niantic APIs and their computer vision stuff that is going to be baked into this game engine. So it'll be just easier for you to start to utilize some of the cutting edge of computer vision and augmented reality within the context of WebXR. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Erik happened on Friday, June 14th, 2024. So with that, let's go ahead and dive right in.
[00:02:02.992] Erik Murphy-Chutorian: My name is Erik Murphy-Chutorian. I am a VP of engineering at Niantic. I was the founder and CEO of 8th Wall, which is a company Niantic acquired a little over two years ago. And once we came in house, the 8th Wall products and platform were rebranded under the Niantic name. And we continue to run Niantic's 8th Wall platform, where we power a lot of the web-based augmented reality on the internet today. And we're continuing to expand that and bring more Niantic technology into it. We have a pretty exciting new roadmap and set of projects.
[00:02:37.926] Kent Bye: Maybe you could give a bit more context as to your background and your journey into working with XR and launching 8th Wall.
[00:02:45.188] Erik Murphy-Chutorian: Sure thing. So, I mean, my personal background, so I was the sole founder of 8th Wall, but before that I had been at Google for about eight years, I think, between internship and full-time, and then Facebook for just under two years. Prior to that, I had done a PhD in computer vision, and that kind of whetted my appetite for the 3D geometry and a lot of the key components that go into the augmented reality stuff that we ended up doing, you know, many years later at 8th Wall. But I had a career that focused on applied computer vision in search infrastructure at Google and then on the Google Photos product. And then at Facebook, I worked on the Messenger team for core engagement, which was things like our sticker sends and photo sends and GIFs and a lot of the kind of rich media that goes into the Facebook Messenger product. The idea for 8th Wall came in 2016, when people were getting pretty interested in AR and VR, but a lot of the focus was going to be on hardware and headsets. And it felt like at the time we had another decade until we were going to really see kind of critical mass in some of these devices. And the desire was, could we bring some of that technology to market faster on mobile phones and empower creators and developers with tools that would allow them to build augmented reality content? And that was the genesis for the 8th Wall idea. I founded the company then, and while it's been not quite a decade later, we've really seen this almost come full circle, with hardware attaining critical mass from some of the biggest hardware manufacturers. And 8th Wall at Niantic has now become a premier platform for augmented reality development. Awesome.
[00:04:20.704] Kent Bye: Well, I know you're going to be announcing some stuff at AWE. And before we dive into that, I want to just give a broader context to WebAR and doing augmented reality on the web. Because I remember seeing a demo back at AWE, it must have been like 2019, where you had an 8th Wall demo that was showing on a phone. It was on the HoloLens. It was basically cross-platform. But yet when I asked Tom Emrich, is this OpenXR? He said, no, it's kind of like our own thing that's not quite using OpenXR. I see a lot of the use of WebAR, and I know that WebVR for a long time was the initial specification that Mozilla had implemented, but everything kind of shifted over from WebVR into WebXR, and everything else in the open standards realm has been operating under WebXR. So I guess the first question is, I know in the past you weren't using OpenXR, but as you move forward, is that something that you're looking at, or have you decided to go down your own path in terms of how you're mediating between the hardware and what you're doing on the web?
[00:05:26.647] Erik Murphy-Chutorian: OK, so there are a number of questions I can answer there. The first is, there are a lot of open source WebXR technologies, and the naming can get a little bit confusing around them. So there are two separate specs. One is called OpenXR, and one is called WebXR. OpenXR is a native hardware specification that allows headset manufacturers to essentially implement the same set of functions so that a native software product can work on them, whether it's a Quest or a Magic Leap or other devices that implement the OpenXR spec. WebXR is the W3C specification for XR content in web browsers. And that supports a lot of VR content as well as some mixed reality content. And WebXR, in many ways, mimics the OpenXR spec. It has a lot of the same functions, so that on an OpenXR-compliant device, the browser can then support it in a WebXR context. So the 8th Wall product certainly has been supporting what we've called metaversal deployment, where you can build an application and run it on both mobile phones and on headsets, since I think about 2020, I want to make sure I get that date right. But that is actually based off of the WebXR spec. So when you're running a headset session, it uses the standard WebXR, which is the successor of WebVR. And it's a more mature specification. It's the one used predominantly. And we've been supporting that. I've been a big proponent of it since we really came out with that big change a number of years ago.
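For context on what entering a session through that W3C spec looks like in practice, here is a minimal sketch using the standard WebXR API. This is generic WebXR rather than 8th Wall's specific wrapper, and it assumes a page served over HTTPS in a WebXR-capable browser (Quest Browser, visionOS Safari, etc.):

```javascript
// Minimal sketch of entering a WebXR session with the standard W3C API.
// Generic WebXR, not 8th Wall's wrapper; assumes an HTTPS page in a
// WebXR-capable browser.
async function enterXR(renderer) {
  if (!navigator.xr) {
    console.warn('WebXR not available in this browser');
    return;
  }
  // Prefer an AR session when the device supports it, fall back to VR.
  const mode = (await navigator.xr.isSessionSupported('immersive-ar'))
    ? 'immersive-ar'
    : 'immersive-vr';
  const session = await navigator.xr.requestSession(mode, {
    optionalFeatures: ['local-floor', 'hand-tracking'],
  });
  // Hand the session to a renderer (e.g. a three.js WebGLRenderer with
  // renderer.xr.enabled = true).
  await renderer.xr.setSession(session);
  session.addEventListener('end', () => console.log('XR session ended'));
}
```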
[00:06:53.871] Kent Bye: Awesome. And so now that you're announcing Niantic Studio at AWE, maybe you could just give a bit more context as to the types of things that you're going to be announcing, in terms of what looks to be a web browser-based tool to produce 8th Wall experiences, presumably using some features from the Lightship API. But maybe just give a bit more context for what's happening with Niantic Studio.
[00:07:21.614] Erik Murphy-Chutorian: So this is really a major launch and a rethinking and revisiting of the 8th Wall product suite. And really thinking of it holistically as, where do we want to be at Niantic, and how do we allow people to create bigger and better types of content than they ever have before on the 8th Wall platform? So when 8th Wall started, it was a technology stack. Originally, it was a native library to be able to do augmented reality across devices. And in 2018, it became this WebAR technology product that allowed you to do augmented reality in the browser on mobile phones. But one of the things that launched, I think, the year after that was what we called for a long time the cloud editor. It was a web development and hosting platform that integrated natively with our technology set. And that's been the way that people have been building 8th Wall applications for the last four years. And what that is is, I mean, you have code editing. You have website hosting. You use our technology, but you bring your own rendering engine. So we've supported as many as five different rendering engines, including well-known ones like three.js and A-Frame, Babylon.js, and PlayCanvas. And we were agnostic to the building of these experiences. You would build with these other tool sets. You would use our website development, our hosting, and our technology stack for XR. But most of the development experience was code-based, which is great for traditional websites, but it's a challenge to build immersive 3D content without being able to see it as you develop. And so this flow would be, you'd make some code changes, you would compile it, you would view it, and you would go back and forth. And we wanted to rethink this, not only from the standpoint of how do we build a visual editor that allows you to do visual development, like you might think of with 3D development or 3D game engine tools, but then also take it further: can we expand this platform from the kind of augmented reality marketing content that's been built traditionally to support casual gaming, as well as our existing use cases, as well as XR, and then as well as a lot of the mapping goals and initiatives in Niantic? And that is the genesis of this new Niantic Studio product. It is an in-web development platform that has visual editing, 3D content creation, a full, very performant game engine that's perfectly tailored for the casual gaming use case we're going after, and then the ability to deploy that content across devices on desktop, mobile, and importantly, on headsets, where we support both VR and AR headsets through the WebXR spec. Awesome.
[00:09:55.147] Kent Bye: I guess to take a step back, Niantic as a company has developed very popular AR games like Pokémon Go, but they've mostly been native applications to integrate this type of interactive immersive experience where people are going out into the world and catching their Pokémon. As this product is more web-based, I'm wondering if you could speak to what it was that Niantic saw in the open web that was different than what you could do on, say, a native app. Because a lot of the native apps have all of these integrated APIs to do experiences that you can't do on the web. But at the same time, there are other affordances with the web that have other trade-offs, like a lower friction to get in there, or having a URL and not having to download anything. And so just walk through a little bit of what some of the major metrics were that Niantic looked at in order to say, okay, this is a viable thing, to really invest in the open web as part of how to propagate what the medium of AR could do.
[00:10:58.457] Erik Murphy-Chutorian: Yeah. Well, so Niantic is a company that is about getting people together, getting them outside, you know, walking with friends and having in-person interactions. And throughout that process, Niantic has really built a foundation on the Niantic map: our knowledge of where people go, what are interesting places to visit, what are routes to get from place to place. And that technology stack is kind of independent from the way in which you view it and the way in which you use it. Our native applications are certainly some of the top mobile games in the world. They have ardent followings and are really incredibly fun ways to get outside and play. But there are a number of internal technology stacks that are used for both developing native applications on this technology stack and web applications on the stack. And so I think Niantic's interest in 8th Wall at the time, and this is going back a couple of years, was that there is a flagship platform that Niantic has for building native applications and integrating core technology within it. And that works within the context of Unity applications and the native apps; it certainly runs within our games and is used internally. But 8th Wall had targeted a different demographic, which was more casual augmented reality on the web. The applications that are built on the 8th Wall platform generally have a much shorter development time. Many of them are used for marketing and branded use cases. And even by then, I think we had powered over a few thousand commercial AR experiences for a lot of the world's top brands. This really carved out what's now called WebAR, this clear category of web-based casual AR content. And as part of Niantic, one of the first things we did in year one was bring Niantic's visual positioning system, the VPS query that allows you to figure out where in the world you are from your camera, into the 8th Wall platform. So now you could do VPS on web for the first time anywhere. And we've brought Niantic's map tiles into the web, and we now allow you to build applications that use that same kind of stack but run in a web context as well. And then we've gone the other way as well, where we've taken our web platform and we've used it to bring in-game features into our native applications, where you can have in-game web views that open up and have certain features that are web-developed but still have the native feel, and you can go in and out of them within the application.
[00:13:23.223] Kent Bye: Nice. And so this is ahead of AWE. You're going to be announcing this on the 18th, so I haven't had a chance to actually play around with it. But as I was looking on your website, I noticed that in your previous iterations you actually had a number of the other existing frameworks like three.js, Babylon.js, and A-Frame. Maybe you could just speak about how, in the previous iterations, you were able to integrate these different frameworks if people already had projects or if they wanted to integrate with more of the backend for what 8th Wall was doing. But as you move forward with Niantic Studio, are you planning on having support for some of these existing 3D frameworks like three.js, Babylon.js, and potentially others?
[00:14:03.425] Erik Murphy-Chutorian: So, well, yes and no is the answer. So the existing platform that we have today, which is the existing 8th Wall web hosting and development platform, is continuing, and we're not getting rid of it. So the same way that you can build applications today with three.js or A-Frame or Babylon.js and bring in your own pieces, that functionality is still there and still can be used by developers. But one of the things that we've seen is we've really focused on the technology stack, and not on the rendering, and not on the physics, or animations, or particles, or the way these interact with each other inside of an entity component system. And so that development was kind of a bring-your-own-pieces approach: bring your own renderer, bring your own parts. And we would really focus on giving examples of how to use a lot of these complementary and great open source technologies to build applications. But it became an extra challenge to do so, because these requisite parts don't work well together. You can't see them in a single visual editor. You can't make a change that applies a physics force to an object and have it both render correctly and change places, follow, and collide with other objects. And so Niantic Studio takes a more opinionated viewpoint. It takes the right set of internal technologies and open source technologies to solve the holistic game engine and development piece. It stands on the shoulders of giants with some of the permissive open source packages, but it has a lot of in-house technology and things that we've built up through four years of expertise, to really try and build a holistic, performant experience. And the end result is something that's much, much easier to develop with and has the ability to go much further in terms of scope and scale and size than the kinds of applications you could build with 8th Wall previously.
[00:15:47.399] Kent Bye: Does that mean that you have chosen one specific JavaScript framework like three.js or Babylon.js in order to build Niantic Studio?
[00:15:57.546] Erik Murphy-Chutorian: Yes, we have chosen one, yes, is the correct answer. So in its initial launch, Niantic Studio uses three.js internally as its renderer.
[00:16:07.111] Kent Bye: Okay.
[00:16:07.531] Erik Murphy-Chutorian: And it uses a number of other technologies. So it uses other code for physics and for animation and for particle effects, for some of the entity component system pieces. And it uses a mix of WebAssembly as well as JavaScript code to build out the full feature set.
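Since Studio renders through three.js, a plain three.js scene wired for WebXR gives a feel for the kind of loop a web game engine drives internally. Niantic Studio's actual engine code is not public, so the sketch below is illustrative only:

```javascript
import * as THREE from 'three';

// A plain three.js scene wired for WebXR. Illustrative only; Studio's
// internal engine, physics, and ECS layers are not shown here.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, innerWidth / innerHeight, 0.01, 100);
const renderer = new THREE.WebGLRenderer({ antialias: true, alpha: true });
renderer.setSize(innerWidth, innerHeight);
renderer.xr.enabled = true; // let WebXR drive the camera and framebuffer
document.body.appendChild(renderer.domElement);

const cube = new THREE.Mesh(
  new THREE.BoxGeometry(0.2, 0.2, 0.2),
  new THREE.MeshStandardMaterial({ color: 0x44aa88 })
);
cube.position.set(0, 1.2, -0.8);
scene.add(cube, new THREE.HemisphereLight(0xffffff, 0x444444, 1));

// setAnimationLoop is required for WebXR; requestAnimationFrame won't
// fire inside an immersive session.
renderer.setAnimationLoop((time) => {
  cube.rotation.y = time / 1000;
  renderer.render(scene, camera);
});
```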
[00:16:24.200] Kent Bye: Okay. And I know that when I did the demo, it might have been actually at AWE in 2022 when I saw it, but I saw it on the HoloLens and was able to see the same kind of volcano experience on these different headsets. And so as you start to have different cross-compatible capabilities, are you just using the browser for these different headsets to be able to render out the WebAR experience, or are you actually compiling it down to a native application that would be deployed to some of these different headsets like the Quest or other augmented reality devices?
[00:16:57.785] Erik Murphy-Chutorian: Oh, yeah. So Niantic Studio is a web-based development platform that is targeting web-based content at launch. And so the content that's produced is hosted in a web browser, can be viewed on mobile web on Android or iPhones, can be viewed on desktop web on laptops, on Windows and Mac, and then it's viewed as WebXR-compatible content on headsets like the Quest and the AVP, the Vision Pro. And so there you would navigate to the content in the browser, and then you would have an immersive session that you enter, and then you can view the experience in the content.
[00:17:31.653] Kent Bye: Okay, yeah, and I know that at WWDC, they just announced that with VisionOS 2.0 that launches sometime in the fall, that WebXR is going to be implemented in Safari without any feature flags. So I imagine that ahead of that time, if people do have a Vision Pro, they would have to enable the feature flags in order to see some of the WebXR content. But it sounds like that some of the stuff that you're building is also targeting both the Quest platform with their Chromium version of their Quest browser, and then with Apple Vision Pro with Safari.
[00:18:01.016] Erik Murphy-Chutorian: Yeah, that's 100% correct. So the first version of the Vision Pro had the WebXR spec behind the flag. A lot of people were developing by turning that on and gaining experience, but moving into this next version, you're going to see that on by default, and that will make it much easier to get into content.
[00:18:16.567] Kent Bye: Okay. Well, as I'm looking at your website, there's a number of different product suites that include things like world tracking; human AR, which is sort of like the facial tracking where, from the front-facing camera, you can do different types of effects; sky effects; Lightship Maps; image targets; Lightship VPS; the cloud editor; modules; and templates. And so as you're launching this new Niantic Studio, are you essentially including all of those, or some subsection of all those other features, within the context of this new platform to develop almost like a web-based game engine, a development place to be able to put together these immersive AR experiences? So just curious to hear how many of these existing features are going to be baked in at launch.
[00:19:04.096] Erik Murphy-Chutorian: So at launch, we're launching as a public beta, because we have some, but not all, of the features already working in a visual editing mode. So at launch, we will have world effects, which is world tracking and the ability to simulate and place objects in a scene. You will also have our face effects feature, so you'll have the ability to see a face mask, be able to add content to it, and develop in the visual debugger and editor. You will be able to bring in a number of our other technologies through code. And we'll be working shortly after launch to bring those all to visual parity. And you'll be able to bring in our mapping features like VPS, Gaussian splats, and map tiles through code as well. And we envision bringing those to market with visual tooling shortly thereafter. And so that beta label is really coming out as a result of not having all the feature parity on day one, but the team is going to be working hard to bring that all in and support all the things that we do at 8th Wall today, both with visual editing, simulation, and then obviously deployment.
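For a rough sense of what "through code" has meant on the classic 8th Wall stack, developers register camera pipeline modules against the XR8 engine. The sketch below is based on 8th Wall's documented pipeline-module pattern; treat the exact field shapes as version-dependent assumptions, not Niantic Studio's new API:

```javascript
// Sketch based on 8th Wall's documented XR8 camera pipeline pattern;
// field shapes vary by version, so treat this as an assumption-laden
// illustration rather than Studio's API.
XR8.addCameraPipelineModule(XR8.XrController.pipelineModule()); // built-in 6-DoF world tracking
XR8.addCameraPipelineModule({
  name: 'frame-logger',
  onStart: () => console.log('camera pipeline started'),
  onUpdate: () => {
    // Per-frame hook: read tracking results, update app state, etc.
  },
});
XR8.run({ canvas: document.getElementById('camerafeed') });
```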
[00:20:02.748] Kent Bye: Okay, and very curious to hear about Gaussian splats, because I know that was presented at SIGGRAPH back in August of last year, and it's really a completely new way of doing rendering of volumetric content, with this kind of neural rendering approach that was a completely new paradigm. Very curious to hear what kind of specific features you've built, because I know that there have been different WebXR implementations to render out Gaussian splats. There are many different types of iterations, and this is a very rapidly developing area with new, novel ways of displaying volumetric content. But just curious to hear what type of stuff you've been able to integrate.
[00:20:42.628] Erik Murphy-Chutorian: Well, so I mean, again, we're pretty passionate about mapping and 3D reconstruction and services in Niantic. And we also have a number of other internal products. But one of them particularly is this product called Scaniverse, which is an iOS app for photogrammetry and being able to scan objects and places. And this was an acquisition of Niantic many years ago, before 8th Wall. And it's been one of the most popular ways on iOS to do object scanning. At the beginning of this year, we launched Gaussian splats inside of Scaniverse. And it was, as far as I know, the first mobile application that ever did on-device splatting. So a lot of these services are taking video content, sending it to a cloud, allowing it to be processed and brought back down. But Scaniverse was able to do on-device local splatting: you could scan, and within a minute or so, you can view these splats directly on-device. And then a couple of weeks ago, we launched Scaniverse for Android, which also has splatting capabilities. And so these two products are providing Niantic users with the ability to very easily create high-quality 3D Gaussian splat-based content. And a big goal of the Studio product will be allowing people to use that content in their development. And so at launch, we're going to show some examples of Gaussian splats being brought into Studio-created products and being used as backdrops for applications, so the ability to navigate and view scenes. But we envision there is going to be a lot more here as the year goes on, and we'll really focus on bringing the map features in as first-class citizens and making Studio almost the best way to engage with this type of mapping content. Awesome.
[00:22:21.248] Kent Bye: Well, one of the curious things that you alluded to that I just wanted to elaborate on a little here is that internally, while developing applications like Ingress as well as Pokémon Go, you were including things like the visual positioning system that may have been internally used features. And then as you launched the Lightship API, that was productized in a way that allowed other developers who wanted to do augmented reality experiences to use that API. And so then you have many more different contexts and use cases that are being used. And so I'm just curious to hear a little bit more, now that that's been out there with many other different use cases and contexts: what are the types of things that have fed back into that system and bled back into specific features that could be baked into an experience like what you might want to create on something like Niantic Studio?
[00:23:13.828] Erik Murphy-Chutorian: We've seen so many great examples of applications and so many different learnings. Certainly, 8th Wall has been used for storefront activations, for out-of-home activations, for CPG and unboxing packaging. We've seen people build games, infomercials. We've seen people use it for educational purposes, for artistic purposes. And so one of the goals of Studio is a product that really is generic enough to use for a lot of these different types of use cases. The Lightship product has seen a lot of use in gaming, things like Skatrix Pro, which is on AVP, but also, I mean, applications that really make the best use of the environment around you, where the game structure can play in your space. You can throw balls and bounce them off your walls and couches, have real occlusions and semantic understanding of the world around you. And I think being able to make it very easy to develop with these game elements and this kind of environmental sensing, environmental understanding, and then real live iterative development is where Studio is going to shine and where we really want to continue to push it. So I expect to see a lot of different use cases in these different types of domains. And it'll be really exciting to see what developers do when they get their hands on it.
[00:24:23.965] Kent Bye: Okay. And when you think about the different applications that you've seen and on your website, you list a lot of the ways that different companies, advertising agencies and brands have been using 8th Wall. What are some of the metrics that you look at to see like, okay, was this successful or why is augmented reality on the web something that should be compelling for brands to consider?
[00:24:49.125] Erik Murphy-Chutorian: Yeah, so two of the big hero metrics that we've seen: one is time spent in experience. Certainly with this content, especially with the branded content, the brands want to engage with consumers that are fans of their content or fans of the experiences they build. And we've really tried to encourage and provide them with tools to be able to understand how much time people are spending in these. Where especially marketing-based AR has really outshined traditional advertising and marketing is in that time-spent metric. People like to engage with these immersive experiences; they allow them to spend time, play games, interact with content, explore spaces, and get informational content in a very new way. And that's where we've seen lots of success, and really one of the best value pieces for the content itself. You can also measure views, the total number of times or unique sessions that people are spending in these experiences. And certainly, more sessions, more time, and more engagement lead to more value for those marketing-based experiences.
[00:25:49.892] Kent Bye: I saw an old Augmented World Expo talk where you were talking about 8th Wall and three major learnings: that augmented reality is really meant to have you move around, that you could use the phone as a wand, and the way that you use your peripheral vision rather than having everything be an interaction on the screen. And so I'm just wondering if you could comment on some of the different best practices that you've learned about what makes a compelling augmented-reality-on-the-web type of experience.
[00:26:20.173] Erik Murphy-Chutorian: Absolutely. I mean, I think for one, really play to and understand the form factor. You know, if your prominent target is headsets, which we do support with our platform, really the focus is immersion. It's being able to explore, to see things, to engage, to interact. I mean, being able to grab something, feel it, throw it. On the mobile side, the goal really is to play to the value add of the phone. Certainly, you can build interactive things where you're moving around or you're playing a game where you're walking up to something, you're walking behind it, you can engage and you can see it. Or ways in which engaging with your space provides extra fun, like face effects on yourself, where you can take pictures and transform yourself, or world effects where you can decorate the space that you're in and create novel views or photos that you can share with friends. And then we've really up-leveled things recently with generative AI. So we have a number of integrations with 8th Wall that allow you to tie into large language models and image generation, to be able to unlock really new types of content in their entirety. A great example was the WOL application, W-O-L, which was built earlier this year, where you could be in either mobile or headset. We partnered with Inworld to bring an AI-generated owl to life in this space, where you have a real conversation with this owl character who tells you about the forest that you're in and describes things, and you use natural language. And it's a transformational experience to have an artificial character that you can speak with and have a natural language conversation with while exploring a space. And I think these things that are really pushing the boundaries of the immersion and the visual content are where these applications really shine.
[00:27:58.806] Kent Bye: Nice. Yeah, I had a chance to check that out, and last year at AWE I did an interview with Keiichi Matsuda and also inworld.ai. Do you know if that's using the Web Speech API on the backend, or if that is somehow directly feeding into inworld.ai, or how does the audio actually get into inworld.ai?
[00:28:18.393] Erik Murphy-Chutorian: Yeah, so it is using the browser's microphone API to be able to capture audio. That audio can be transcribed into text using services that... Actually, I don't know offhand what speech-to-text we're using. And then it goes to the Inworld API, and Inworld is able to then take that text, if not the audio, and use that to create the conversational elements as well. Yeah, and Keiichi Matsuda at Liquid City did the creative part for that project. He really helped tie this together with the Inworld AI components, and it's a really excellent result.
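In a browser, microphone capture for a voice pipeline like that starts with the standard getUserMedia and MediaRecorder APIs. A minimal sketch follows; the downstream speech-to-text and Inworld calls are omitted since they weren't specified in the conversation, and sendToSpeechService is a hypothetical stand-in:

```javascript
// Standard browser microphone capture, the first step of a web voice
// pipeline. Downstream speech-to-text and Inworld calls are omitted;
// the onChunk callback (e.g. a hypothetical sendToSpeechService) is
// where captured audio would be forwarded.
async function captureVoice(onChunk) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream);
  recorder.ondataavailable = (event) => {
    if (event.data.size > 0) onChunk(event.data); // event.data is an audio Blob
  };
  recorder.start(250); // emit audio chunks every 250 ms
  // Return a stop function that ends recording and releases the mic.
  return () => {
    recorder.stop();
    stream.getTracks().forEach((track) => track.stop());
  };
}
```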
[00:28:52.142] Kent Bye: I noticed on your website that for trying out some of the existing products there's a two-week trial, and I saw that the public beta is going to have a free period, or I don't know if there's going to be a free period and then it's going to go back into a paid model. But maybe you could just describe a little bit about how the revenue on your end comes in, and as a consumer or producer of some of these pieces, what can people expect the pricing structure might be for using some of these different tools, like this new Niantic Studio?
[00:29:24.122] Erik Murphy-Chutorian: So internally at Niantic, we wanted ways to get our technology and our stack in the hands of as many people as possible. And one of 8th Wall's longest-standing and biggest concerns is that as a professional tool, it's an expensive tool and out of reach for casual developers and hobbyists, artists, and education. So an additional announcement is that not only is Niantic Studio coming out next week, we are also launching a new free tier. And this will be a totally free way for anyone to use the platform, develop applications, build stuff, and get access to 8th Wall in a way that we never had previously.
[00:30:00.215] Kent Bye: Nice. And when you say next week, this is going to be airing next week, and so it's going to be presumably... Yeah, exactly.
[00:30:08.280] Erik Murphy-Chutorian: On June 18th, we will be launching Niantic Studio, and we'll also be launching a free tier so that anyone can try it out at 8thwall.com.
[00:30:16.271] Kent Bye: Okay. Well, I know one of the hot topics of debate for people who want to do augmented reality development on these different platforms is that on a lot of these platforms, you don't get access to the camera, like the raw camera footage, to be able to overlay information. Apple just announced that there are enterprise APIs on the Vision Pro to get access to the camera, but even on the Quest, it can be difficult to get access to what's happening on the camera to overlay things. In some ways, it's kind of ironic that phone-based AR can sometimes have more access to do things than you have on the XR, VR, mixed reality, spatial computing headsets like the Apple Vision Pro or the Quest. So I'd love to hear any of your reflections on what is available now and what you'd like to see to be able to take it to the next level for where the medium of AR could go with head-mounted displays.
[00:31:07.495] Erik Murphy-Chutorian: Yeah. And so we at 8th Wall and Niantic build a lot of our own technology stack that's computer vision and camera based. And so on mobile phones, we're able to offer a suite of technologies, including things like face effects and image target tracking and world effects and semantic understanding and the visual positioning system. And that all works by having access to the camera and being able to use that with our in-house algorithms. Headsets have been another story. The headset manufacturers have had a lot of privacy concerns about camera use. And so access has been fairly restricted to certain devices and certain use cases. Magic Leap 2, for example, provides camera access, but the Quest doesn't. Apple Vision Pro had not provided any camera access until the announcement last week, which is that enterprises of certain sizes will have the ability to use the camera for internal applications, which could be useful for a number of cases. But I feel like, as you said, we're all carrying around cameras with phones in our pockets all the time. We're very comfortable with what it means to capture and share, and we understand a lot of the privacy expectations around doing so. And I would love to see us really understand that in the headset world and be able to come out with conventions for devices that provide camera access, so it can be used for these types of great technology use cases while retaining a lot of the privacy and safety, and a lot of the situational and conventional knowledge about when is it OK to be using these? Where is the data going? How does it play into privacy and safety? And hopefully, we'll see as time goes on the ability to really innovate on top of these new hardware platforms.
[00:32:41.347] Kent Bye: Well, another thing I noticed on your website is that there's a lot of project files, which I'm presuming are there if people want to see some code examples of different things that they can start to dig into. Is that something you're also planning on having with Niantic Studio, a number of different examples showing a range of different archetypal use cases that you can dive into, with some example projects that you can start to build off of?
[00:33:06.943] Erik Murphy-Chutorian: Yeah, absolutely. So today, I think we have over 100 example projects on our website for the traditional 8th Wall stack. When Niantic Studio launches, we're going to have a handful of new applications that really highlight a lot of the key features and use cases, and we expect that library to grow dramatically as time goes on and we continue to add to it. But something that you can do on 8thwall.com today, and you'll be able to do with Studio, is that anyone can publish public project content and code. So as a developer, you can build an application, you can publish it, and then allow other people to see what you've done, clone your experience if you want to, and then build on top of it. And you have this optional but really important and influential way to knowledge share by creating new applications.
[00:33:49.844] Kent Bye: Yeah, one of the interesting things about what you're doing there with Niantic Studio with this new product is that, on one hand, it's open web standards. On the other hand, it's like a closed platform. Would there be, for example, the possibility to export any version of what you create so that if... people wanted to take it into other contexts or just archive it? I think that's a concern for some creators: if they are developing for a closed walled-garden platform, and then if that platform shuts down at some point, then they no longer have access to all of the content that they made. It's kind of like you're in between worlds there, where you're creating something that is a turnkey solution, but at the same time using all the open web standards. So is exportability something that would be of interest, both exporting, but also potentially importing stuff as well?
[00:34:41.553] Erik Murphy-Chutorian: Yeah, so absolutely. I mean, we've built this on an open web framework and the ability to, you know, at least at your discretion, create open source samples and use them. But there's a lot of open source permissive technology that's involved. And we've really tried to build a platform that encourages people to share and remix and build libraries and reuse components. We certainly have the ability to import content into it. And we have the ability to export the individual set of files that go into the application. And we've talked about whether we'll be able to build some features to do a full project export, where you could use the tools for creation and then do external hosting on other sites. That's something that's on our product team's radar, and we'll definitely be looking for more feedback and input on whether that's what we build next or where it slots in. The traditional 8th Wall product has offered a self-hosting option on our professional tiers, where you can use our technology stack, but you can use it on your own website with your own content hosting and your own interactions and pieces. And we become essentially a JavaScript client library that you can use to access a lot of our CV technology. Due to Studio's complexity, we don't have an easy way to bring all these pieces together and do it in the current self-hosting mechanism. But we know that that's something our developers have been pretty passionate about, and we'll be thinking about ways to use our development for the creative side, but also still give you more flexibility on the hosting and serving side.
[00:36:04.421] Kent Bye: Yeah, that's really great to hear, just because it's sort of the spirit of the open web to be able to offer something like that. And in terms of the 3D object format, are you recommending people use glTF, or is USDZ also supported, or are there any specific types of 3D objects that people should be working with?
[00:36:22.591] Erik Murphy-Chutorian: Yeah. So at launch, Studio is going to support glTF as the primary 3D object format, which has excellent browser performance and support today and has been the most used 3D format on 8th Wall to date. But we certainly have requests to support other 3D formats. They won't be there at launch in the beta, but that's something we'll keep on our radar for being able to expand that support as time goes on, with USDZ and FBX being two of the most popular formats that we'd love to be able to support.
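For reference, loading a glTF asset in three.js, the renderer Studio uses internally, looks like the stock snippet below. This is plain three.js, not Studio's own asset pipeline, and the model path is a placeholder:

```javascript
import { GLTFLoader } from 'three/addons/loaders/GLTFLoader.js';

// Stock three.js glTF loading, shown because glTF is Studio's primary
// format at launch; Studio's own asset pipeline wraps this kind of call.
// 'models/scene.glb' is a placeholder path; .glb (binary) and .gltf both work.
const loader = new GLTFLoader();
loader.load(
  'models/scene.glb',
  (gltf) => scene.add(gltf.scene),                              // add the loaded hierarchy to an existing scene
  (xhr) => console.log(`${(xhr.loaded / xhr.total) * 100}% loaded`), // progress callback
  (err) => console.error('glTF load failed', err)               // error callback
);
```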
[00:36:52.915] Kent Bye: Okay. Well, I know that the founder of Niantic, John Hanke, has talked about how the metaverse is a dystopian nightmare, let's build a better reality. And so there's a bit of having people treat physical reality as the metaverse. So I'd love to hear if you have any thoughts on how you, either personally or as a company, think about physical reality as the perfect place to start to build out what would be a more exalted version of the metaverse.
[00:37:18.715] Erik Murphy-Chutorian: Yeah, absolutely. I mean, you know, part of Niantic's core mission and identity is to get people outside, walking around, engaging with each other in the real world. Where the dystopian metaverse is like the opposite of that: it's everybody in a closet with a headset, engaging with each other in virtual space and not ever having real in-person interaction. When we think through what Studio is doing, we really are trying to bring a lot of Niantic's knowledge about mapping real-world spaces, splatting and scanning, and be able to bring that into applications that can then take place in the real world. And so this is why we've integrated things like the visual positioning system, which allows you to go out in the world, certainly hold up your phone to content, but then have that augment the space you're in, where you can see virtual content appear in the world around you. And that can be used for game interactions or for social interactions, being able to do real-time shares. One of the features we also support is shared augmented reality, where multiple people can be in the same virtual space and playing a game with each other at the same time. And as we think through the features that we're building, that is at the top, first and foremost for us: how do we get the stuff that's being built to really engage with people in real time, in the real world, and be able to make sure that even for things like headsets, we're bringing it all together? It's like, can you have asynchronous operations where someone's in a headset, someone's on the phone, maybe one person's in the real world, one person's at home, and the ability to share and do things together? It speaks to our ethos when we think of our products at Niantic and at 8th Wall.
[00:38:54.655] Kent Bye: Yeah. There's a bit of a philosophical difference that I have with John Hanke on calling it the real-world metaverse. I would say that there are certainly affordances of physical reality where you can do things with social dynamics and group dynamics, especially connected to physical locations as a center for how you're bringing people together and having new social dynamics that are unique, that could be facilitated through technology. I guess for me, I feel, as David Chalmers argues, that virtual reality is a genuine reality, and that there are still meaningful social interactions and ways that you are interacting and having agency and different levels of embodiment in virtual spaces. And for me, it feels like less of a hierarchy of saying that in-real-life or physical reality is somehow better than the virtual life. But I do see that as a company, Niantic is trying to make that value judgment of saying that bringing people together in physical reality does have something that's unique and different. And that's something that is maybe different than what other companies are really trying to focus on.
[00:39:53.854] Erik Murphy-Chutorian: Yeah. I think we're at an interesting waypoint in the development of this technology, in that we've really come a long way and have head-mounted computing that is spectacular. It really goes beyond anything we've seen previously. But we're still not quite yet at that device that I can wear with me throughout the day, be in the real world, going about my business, engaging with computing through sight rather than through a 2D screen on my phone. And so we kind of have to, to some extent, meet the technology where it is. I mean, VR is providing excellent immersive sessions in indoor use cases. Mixed reality devices have opened that up tremendously and allowed you to kind of have a new kind of immersive session that's both in your space and maybe not as distracting. We even have devices like the Meta Ray-Bans where you can walk around and have AI interactions and take photos. But we're not quite there yet with this, call it, mixed reality outdoor device that doesn't give you eye strain on a daily basis and cadence. And so I think playing to the strengths of the technology is where we really can help benefit everyone here. It's like, you know, focus on content for mixed reality headsets that works in an indoor setting but still meets a lot of Niantic's values around real-world interactivity: interactions with other people, whether it's virtual content or asymmetric with devices. And then on mobile phones, which are excellent devices for going out and about and which you have with you always, play to the walking around and engagement and the ability to capture content in the world and share it with friends. And so I think that's one of the key pieces in Studio, thinking about how do you design applications that can work across these mediums and handle them differently: certainly different interactions, different experiences, the ability to have the same content connect to each other, but work differently. And I think that's what we're going to see for the next few years. Yeah, we really do believe that still fits with this vision of really trying to get to a device and a world where people are outdoors, are engaging with each other, and can use this technology and its evolution to its best.
[00:42:01.411] Kent Bye: Nice. That's really well said. So I guess as you're launching the Niantic Studio, I'm assuming that you're going to have some demo projects because we're recording this ahead of the announcement. I haven't had a chance to see any of those, but maybe you could just describe some of the paradigmatic examples or use cases that you're going to be shipping this with so that when people check it out, they can get a sense of what might be possible.
[00:42:22.765] Erik Murphy-Chutorian: Yep. So we have a number of demo projects and applications that are going to come out with the launch. Some of them focus on the entity component system and physics. So you have the ability to roll a ball around a number of blocks and platforms and have it jump, knock over bowling pins, and really do a lot of physics engagement that shows how objects can interact with each other and how you can manipulate them. We'll have projects that focus on the particle system and how you can create a lot of effects within your applications. We'll have some projects that really focus on the animation parts and how you have animated objects that move around and walk and engage with each other and do animation blending and transitions. We have some game content that's there to really highlight how easy it is to make fun, casual applications. One of them is a soccer game that has elements of the original Pong, where you play with a friend. Each has a character. You're trying to get a soccer ball into the opposing goal. But the physics interactions really play: you knock into the ball to move it, but you can knock the other player around, and you can really kick them around the pitch and try to get your ball in, which is a lot of fun. And there's another one, which is like a cannon shooter. I forget what the name of it is, but you have, you know, a wooden cannon and you're firing it in VR or mobile and you're trying to hit targets, and it gets increasingly difficult as time goes on. And we have another experience that's showing off, you know, ways to do attraction or repulsion of game entities and elements. And so there should be a good set of examples, but certainly a lot more work to go to get to the level of those hundreds of examples we have in the existing 8th Wall product.
[00:43:59.144] Kent Bye: Nice. Well, I know that on May 29th, 2024, three.js announced that they have a new shader language, the Three.js Shading Language, and I know Niantic Studio has probably been in the works for a long time. I don't know if it was robust and nimble enough to be able to start to integrate some of those brand-new features that are coming out, but maybe you could have a few comments on, moving forward, if you have a preference for the different types of shader languages, if people want to use shaders in Niantic Studio.
[00:44:24.942] Erik Murphy-Chutorian: Yeah. So we're predominantly using GLSL shaders for almost every custom shading use case we have. three.js has always provided a number of ways to engage with shading. One is you can use the standard three.js stack. You can also build raw shaders. When we render Gaussian splats inside of our engine, we use raw shading and a lot of in-house work to be able to perform shading of the splats at runtime. In the beta launch, we don't yet have custom shading on launch day, or at least we're not going to have examples around it, although there are hooks to be able to do that within the renderer. But we expect to have, in the near future as this product evolves, a really good story and examples around building custom shaders: how to use them for all sorts of different use cases, some for visuals, some for rendering, some for compute, and really try to help build up the expertise around this. The launch story will be that you'll definitely need a good fundamental background in computer graphics to be able to tie into the system, but we hope to make that easier as time goes on.
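To make the GLSL piece concrete, here is what a minimal custom shader looks like through three.js's standard ShaderMaterial hook. Studio's own shader integration isn't documented in this conversation, so this shows only the underlying three.js mechanism:

```javascript
// A minimal custom GLSL shader via three.js ShaderMaterial, the kind of
// raw hook described above. Illustrative of the three.js mechanism only,
// not Studio's shader integration.
const pulseMaterial = new THREE.ShaderMaterial({
  uniforms: { uTime: { value: 0 } },
  // ShaderMaterial (non-raw) injects built-ins like projectionMatrix,
  // modelViewMatrix, and position automatically.
  vertexShader: /* glsl */ `
    void main() {
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    uniform float uTime;
    void main() {
      float pulse = 0.5 + 0.5 * sin(uTime * 3.0);
      gl_FragColor = vec4(0.2, pulse, 0.8, 1.0);
    }
  `,
});
// Drive the uniform each frame, e.g. inside renderer.setAnimationLoop:
//   pulseMaterial.uniforms.uTime.value = time / 1000;
```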
[00:45:27.574] Kent Bye: Yeah, that makes sense. And I think with even more low-level geometry manipulation, like the low-level mesh APIs that Apple Vision Pro just launched with, there are going to be more ways of creating more dynamic, fluid geometries. And I feel like playing with shaders is a whole new area of moving beyond the more static 3D objects and into much more dynamic, interactive, immersive experiences. So yeah, that's cool to hear that that's certainly on the radar and the long-term roadmap. So yeah, I guess as we start to wrap up, I'd love to hear what you think the ultimate potential of XR and WebAR might be and what it might be able to enable.
[00:46:06.482] Erik Murphy-Chutorian: I mean, in many ways, I think the potential is limitless. I mean, having seen all of the incredible things people have built on the 8th Wall platform over the last several years has led me to believe that people are incredibly talented and smart and can build incredible things when given the right set of tools. And so my feeling is Niantic Studio is bringing a much more compelling and powerful set of tools than we've ever had before. And the visual editing capabilities allow a level of immediate interaction and visualization that we've never had previously. And then the entity component system, which underlies this, is allowing you to design applications around objects, custom scripts on those objects, and physics interactions between them. And it's made developing with 8th Wall an order of magnitude faster and easier and better. And bringing all this together, bringing in our mapping technology and our stack, is going to provide people with this new tooling platform to go far beyond anything we've seen previously. And I think we're going to see some incredible applications built in the next year.
[00:47:09.352] Kent Bye: Just a quick follow-on in terms of the scripting: are you going to allow people to enter in their own JavaScript? I know with content management systems, there are security concerns with just allowing anybody to put anything in there. So just curious if you have a system for people to add custom types of scripting and interactive components.
[00:47:29.560] Erik Murphy-Chutorian: Yeah, so I mean, absolutely. So at launch, custom components can be written in JavaScript or TypeScript. They execute on the page. They can do whatever behaviors the developer wants. What we've done at 8th Wall is that every developer account gets a subdomain, and essentially their own domain, to be able to run their own scripting and their own experiences. So they are building their own application. They're deploying it on their own domain. You can also connect custom domains. So if you want to take your own DNS names and connect them to your applications, that's a feature we've had for a long time, and we continue to have that in Studio. But in that way, certainly the scripts are written by the developer. They're running on a developer domain. They are at the discretion of what people are building. And you certainly need to adhere to privacy and safety; if we have reason to believe that a developer is doing something malicious or abusing an account, they'd be in violation of our terms of service. But this isn't code running on 8th Wall's website. It's running on the developer-created website.
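For a feel of what an entity-component script like that might look like, here is a hypothetical sketch modeled on common web ECS conventions (A-Frame-style components, for example). The names ecs.registerComponent, schema, and tick are assumptions for illustration, not Niantic Studio's confirmed API:

```javascript
// Hypothetical entity-component script, modeled on common web ECS
// conventions. `ecs.registerComponent`, `schema`, and `tick` are
// assumed names, not Niantic Studio's confirmed API.
ecs.registerComponent({
  name: 'spin',
  schema: {
    speed: { type: 'number', default: 1.0 }, // radians per second
  },
  // Called every frame with the owning entity, its properties, and the
  // elapsed time in seconds since the previous frame.
  tick(entity, { speed }, dt) {
    entity.rotation.y += speed * dt;
  },
});
```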
[00:48:27.200] Kent Bye: Okay. And is there anything else that's left unsaid that you'd like to say to the broader immersive community?
[00:48:32.982] Erik Murphy-Chutorian: I really appreciate all of the support and people who've been building with 8th Wall to date and Niantic and look forward to bringing better and greater technology and offerings and can't wait to see what you're going to build. Awesome.
[00:48:48.201] Kent Bye: Well, Niantic Studio is launching on June 18th, and I'm really excited to see some of the initial demo projects and to just see where people take it. I've been doing a lot of interviews about open standards and WebXR and how people are starting to use the open web technologies, and it's just really exciting to see a company like Niantic really embrace it and promote it as a viable way of delivering these immersive experiences, but also to develop this type of infrastructure and tool set for people to be able to push the limits of what's possible with these types of experiences. So thanks again for joining me here on the podcast to help break it all down.
[00:49:24.690] Erik Murphy-Chutorian: Thank you so much, Kent, for inviting me. It was really a pleasure having this conversation, and I look forward to staying in touch.
[00:49:31.181] Kent Bye: So that was Erik Murphy-Chutorian. He's the founder of 8th Wall, which was founded in 2016 and acquired by Niantic in 2022. And we were talking about Niantic Studio, which is launching today and freely available as a WebXR game engine. So I have a number of different takeaways from this interview. First of all, it's great to see a company like Niantic, which has built a lot of native applications on both Android and iOS, abstract out a lot of their APIs into the Lightship API and then slowly bring some of those same APIs into a web-based platform. Those already existing products will still be there, but they're launching a brand new product, which is Niantic Studio. It's essentially a three.js-driven WebXR game engine that you can use in the browser to build out all the different scenes with this entity component system. If you've done anything with Unity or Unreal Engine, this has got a very similar type of structure in that entity component system. But you're able to compose these 3D scenes and then deploy them out into WebXR experiences that can be seen on mobile platforms, but also on these headsets. So the main mechanism to see these on the headsets is through the browser. So I'll be very curious to see, for what has traditionally been a very AR-focused company, what type of experiences they're able to do now that they're leaning a little bit more into VR. A lot of their branding over time has really been emphasizing, okay, let's bring people together in the physical world. But this is the first time that one of their products is really exploring some of the different virtual reality types of experiences that could be built with WebXR, with this new product of Niantic Studio. It has its own physics engine. You can do your own coding and your own components. Essentially, it's using WebXR, but it's also tying into some of their native APIs. And as time goes on, they're going to be expanding out and featuring more and more. And because it's using the open web technologies, they're hoping that at some point there will be export options if you want to do self-hosting, but that's not at launch. So everything is hosted by them right now. And they're also going to be bringing in more shader support and everything else at some point as well. They're using GLSL, while WebGPU actually uses a completely different shader language. The Three.js Shading Language can actually render out to both of those shading languages, GLSL as well as the WebGPU shading language, which is WGSL. So hopefully at some point they'll also have support for the Three.js Shading Language, just because that would be a shader language that could actually support both WebGPU and WebGL. So yeah, I'm really excited to see that Niantic has one foot in the native world and one in the WebXR world, that they acquired 8th Wall, and that they're really embracing the open web in a way that is pushing forward a lot of these different open standards. And it is a turnkey solution. And at the moment it's free. They didn't really announce at what point they're going to start to charge for this, but at least for now it's going to be freely available until they have, I guess, more information at some point here in the future. So I guess stay tuned. But yeah, definitely check it out. It looks like a really robust game engine that is built on WebXR. So yeah, really curious to see where they take this in the future. So that's all I have for today.
And I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.