#1170: Niantic’s Lightship AR Developer Kit Enables Third-Party Devs to Build the Real-World Metaverse

Niantic’s Lightship Augmented Reality Developer Kit celebrated its one-year anniversary in November 2022, and I had a chance to catch up with Caitlin Lacey, a senior director of AR platform marketing at Niantic, to talk about the evolution of the Lightship ARDK. Their Visual Positioning System has the ability to “determine a user’s position and orientation with centimeter-level accuracy”, and their free Unity SDK enables third-party AR developers to build site-specific apps using the same systems that Niantic has built for AR games like Ingress and Pokémon GO. We talk about some of the other major features, and their intention to help build what they call the “real-world metaverse.”

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to The Voices of VR Podcast. So on today's episode, I'm going to be talking about the frontiers of augmented reality from the perspective of Niantic. They obviously started with both Ingress and Pokemon Go as video games, but back in November of 2021, they released this Lightship ARDK, an augmented reality developer kit. They're taking some of the aspects that they were developing in the context of their first-party apps and making them widely available for developers. They launched it back in 2021, and I had a chance to see it there at Augmented World Expo. They were celebrating their one-year anniversary and reached out to me to have a conversation. So I ended up having this conversation with Caitlin Lacey, the senior director of platform marketing at Niantic, back on Wednesday, December 7th, 2022. So with that, let's go ahead and dive right in.

[00:01:04.686] Caitlin Lacey: Hi, I'm Caitlin Lacy and I am the Senior Director of Platform Marketing at Niantic, which includes overseeing product marketing and community marketing for our AR developer platforms. And I've been in the AR space for just a few years now, but I've been in the emerging technology spaces for about five years.

[00:01:22.991] Kent Bye: Okay. And maybe you could give a bit more context as to your background and your journey into AR.

[00:01:27.863] Caitlin Lacey: Yeah, I have fallen into AR. I've always been a curious person and a curious kid with regards to technology, but didn't find my way into the emerging spaces until I was at Meta. I had been at Meta for more than a decade of my career. And in about 2016, I joined the Emerging Platforms business marketing team, where I focused primarily at first on business messaging and the emergence of chatbots, helping businesses understand the impact of chat technology with regards to customer service. And then from there I actually started to lead, alongside one of our business leaders, the emerging AR and VR strategy for business decision makers. So helping them understand how the technology that we were starting to invest in after the acquisition of Oculus could actually help them drive customer engagement, create new channels for ROI, and ultimately help them be first to market in something that was unique.

[00:02:26.527] Kent Bye: Okay. And then what brought you to Niantic then?

[00:02:30.508] Caitlin Lacey: I believe wholeheartedly that augmented reality is going to be the transitionary technology that gets us to the longer term vision for the Metaverse. And Meta as a whole is invested in both AR and VR, but in my role as head of AR and MR business marketing, I was excited to kind of explore more about the specific AR technology outside of a walled garden like Spark AR. And Niantic has been a long time passion project of mine, a passion partner of mine. I wanted to work with them when I was at Meta and the opportunity came to lead this team. And having seen the launch of Lightship last year, I was really excited to join and figure out how do I make that the foundational technology for what everyone uses in the future.

[00:03:12.697] Kent Bye: Yeah. So maybe let's go back a little bit to some of the original AR apps, with Ingress launching December of 2013 and then Pokemon Go in July of 2016. So these were two games that I think were using different aspects of, I'd say, probably more site-specific augmented reality gaming rather than anything that I would necessarily consider AR proper. But with 6D.ai, which was acquired in March of 2020, and now with the Lightship ARDK that was launched in November of 2021, you're starting to do a little bit more of augmented reality. Let's go back and set a bit of the context for Niantic for both Ingress and Pokemon Go, these original forays into augmented reality gaming.

[00:03:54.562] Caitlin Lacey: Yeah, I think, you know, one of the magic things of how Niantic came to be was actually their history in mapping and just the importance of mapping to the Niantic brand. And part of what makes Ingress and Pokemon Go magical is exactly what you're talking about: that site-specific opportunity to bring people to a location and give new meaning to that location through a whimsical, magical experience that you're seeing through the AR features that are in the games. And I think one thing that is so critical for AR right now, and has been over the last few decades, too, as we've seen the technology evolve, is just user education, consumer education. And I think one of the benefits that we've seen with Ingress and Pokemon Go is customers' desire to interact with AR in a new and different way that adds value to them outside of just how they look, or potentially what they're looking at in some regards with some of the product placement AR experiences that we're seeing. So that would be what I would say are the unique differentiators of what Ingress and Pokemon Go have brought to the market through AR. And then, you know, you think about our acquisition of 8th Wall this year, you know, we are leaning more into the traditional AR space, but even there, we're seeing Niantic technology take on new forms and functions within WebAR, and just unlocking a whole new channel of opportunity for not only developers, but brands and agencies too.

[00:05:13.876] Kent Bye: Yeah, I know that the CEO and founder, John Hanke, talked a lot about what he calls the real-world metaverse, which I take some opposition to because I feel like there's this dialectic of the virtual and the real. And for me, it's much more about the physical versus the virtual, because I do think that there's different aspects of even virtual reality where you have different qualities of mental presence or active presence or social presence or emotional presence. And I think the big differentiating factor between VR and AR is that sense of embodied and environmental presence and the physical sensory experience of the medium. But John Hanke is taking a very strict look at this kind of real-world metaverse. And so I'd love to have your take on how you or Niantic sees this real-world metaverse.

[00:06:00.881] Caitlin Lacey: I think one of the values that we come back to in our mission is just inspiring people to explore the world together. And I think it provides a different kind of presence than what you're finding in VR. So with VR, it's not isolated, because you see examples of things like Horizon Worlds or Rec Room where people are gathering together and they're having an experience despite the physical distance that may be between them. Whereas with real-world AR, you're bringing people together in the physical world to have a magical experience through the technology. And I think part of what Niantic is hoping to do with communicating and conveying this idea of the real-world metaverse is showing that the physical world is just as magical as it was before, and it can only be enhanced through technology. And I think you see that with some of the phenomena that happened with Pokemon Go, right? I mean, those were experiences that brought hundreds of communities together in the physical world without any prompting, back in 2016. And then you've seen that just continue to grow with the GoFests that we've had around the world in the last few years. Most recently this year, it was brought back post-pandemic. So really interesting to kind of see the differences. And I agree with you. I think that there is a natural embodiment and sense of presence that kind of overtakes you in VR. And yet there's something so interesting about looking through, you know, the lenses of our phones now, but someday in the future, maybe glasses or a headset, that can bring you together with people in a physical location and all see the same thing from a different angle that no one else can see. There's some kind of secretive opportunity there. That's very fun.

[00:07:26.925] Kent Bye: Yeah. I guess my pushback would be that I would call it the physical metaverse rather than the real world metaverse, because I feel like that sets up saying that the experiences that people have in VR are somehow not real. And I think that gets into some deeper philosophical questions that I would say it's physical rather than real.

[00:07:44.093] Caitlin Lacey: Yes. And I think that's a big difference between Meta and Niantic that I've found. So at Meta, my old boss, Asher Rabkin, actually continued to have this conversation around, it was the physical world, the augmented world, and the virtual world. And you couldn't say real world because, to your exact point, you would dismiss some of the real experiences that people have in both augmented and virtual worlds. So I've definitely been pushing back on that at Niantic as well. And it's a personal opinion, but I think real world is sometimes easier for the consumer to understand, even if we do mean the physical world.

[00:08:18.603] Kent Bye: Let's talk a little bit about the Lightship ARDK that launched last year, in November of 2021. I was actually at Augmented World Expo when the first demos were publicly being shown. So I had a chance to see some of the early takes of segmentation and the semantics, and also the visual positioning system. I think it was formally launched later, but it was kind of announced initially with these six different cities. And now we're coming up on the one-year anniversary, and that's in some ways why we're talking now, taking a bit of a retrospective look back at this Lightship ARDK. So I'd love to hear a little bit about this initial launch that happened last November, and then what's been happening over the last year.

[00:08:59.305] Caitlin Lacey: We, yes, launched last November, which was probably the biggest thing Niantic has ever done, because it unleashed into the world a lot of the technology that Niantic had been building for its own games and started to share it. And I think one of the values that Niantic wants to live by is sharing our technology with others and getting them to build the foundational infrastructure that we're going to need to make AR as ubiquitous as the smartphone. And one of the things that we have just been blown away by over the last year is what people are creating with it. And to your point, VPS, our visual positioning system, has been widely adopted. We launched that in May, and that was right after the acquisition of 8th Wall. But within six months, 8th Wall actually integrated the visual positioning system into WebAR, which was net new, a great experience for WebAR, first of its kind. And with Lightship, the goal was to create this beacon of partnership and leadership in the AR technology space for others. At Niantic, we have a lot of nautical themes. And so Lightship is not a play on words. It's intentional to be the ships that are out at sea that guide people into shore. And we hope that through this technology that we're unleashing to developers, they're going to find a great fit for their business or their ideas that can be built. And you see this across a couple of the experiences that are in testing right now and being announced. A great story that we have coming out of a couple of our developers is TRIPP, who most recently were just on the Today Show showing their VR experience that is about meditation and mental well-being, and they're going to be testing an augmented reality case in the next year. We have Pixelynx, which is co-founded by deadmau5, who are releasing an AR-inspired game around music. And they just tested this in Amsterdam and at Art Basel last week and have seen just incredible adoption and engagement.
And that'll be widely released next year as well. And actually they were just acquired by a larger company. And so that just shows that there is momentum with this technology that is creating something that is unique and differentiated from potentially what other people are sharing and creating.

[00:11:06.746] Kent Bye: Yeah. So maybe if we take a step back and look at Niantic doing the first-party applications of, say, Ingress and Pokemon Go, there's obviously different aspects of advertising. And, you know, with Alphabet, the parent company, I guess the sister company to Niantic would be Google. And so you have these larger things that were born out of different technologies. Maybe you could go back into the history. Was Niantic spun out of Google, or what is the shared relationship there with Alphabet?

[00:11:33.594] Caitlin Lacey: Yes, Niantic was spun out of Google. So the original history, if I remember it right, was John Hanke had founded a company named Keyhole, and Keyhole was acquired by Google to help be the foundation for Google Maps. And while there, he took on a number of different product roles that ultimately created Niantic Labs, which was kind of an incubator for a ton of different ideas related to mapping technology and AR technology, which ultimately released Ingress and Pokemon Go as games that helped kind of build out the digital map of the world. And then they spun out based on, you know, product priorities and things like that, and then were ultimately made into their own private company with an investment from Google. Google is definitely one of the early investors in the company, but that's the history as I know it today.

[00:12:21.455] Kent Bye: And so I guess there's this transition there with this acquisition of 6D.ai, which was trying to do these different types of scanning of the world. But there's this, I guess, this larger vision of trying to create this AR cloud, or there's lots of different words for this, you call it the real-world metaverse, but it's in some ways creating a digital twin of physical reality. And there's this larger strategy of scanning these different locations and places so that you could have, say, a model of a building and then have something that's occluded by that building and go behind it, or maybe even these site-specific locations. And so it originally launched in May with like six different cities. And now, I guess with this latest announcement, I'm seeing in the press release that it's expanded to 120,000 locations. But I guess with the caveat that there's still a center of gravity of what cities are made available, and expanding from the original six cities to 125 cities, I was personally trying to see, okay, is this even available where I live in Portland, Oregon? And it was a little bit difficult for me to know where it was at. But maybe you could just expand on that larger strategy of creating this digital twin through this visual positioning system, and this launching of this ARDK as part of this larger mission of creating this AR cloud or real-world metaverse and, you know, kind of physical digital twin of reality.

[00:13:40.869] Caitlin Lacey: You know, with the mapping history of the company, I think to John, I feel like he's a cartographer. I feel like he is just someone who is fascinated by the physical world and wants to find ways to enhance that reality. And so that's where a lot of the drive comes from creating this digital twin. And it is exactly that, right? It is trying to scan the world so that you can put persistent AR into those places and allow people to experience the same things, you know, no matter where they are or when they visit from wherever they're coming. And I think part of the reason why I'm so drawn to that mission is the idea of bringing new meaning to places and leaving kind of a historical understanding of where the world is. You see so much of augmented reality, you know, a lot of the benefit I see is around education and learning. And if you think about the physical world and how often it changes, you know, the Maps view that we see and know today actually can't keep up with a lot of the dynamics of the physical world. Whereas a lot of what we're asking folks to do through scanning and VPS and even scans within our games is actually help us keep the world updated so that as you leave your footprint, others behind you can know how the history has changed. So that's a value that I really see in kind of what we're offering people. And then I think the other part of it too is just this nature of persistent AR and the ability to actually leave a digital footprint for, you know, a gaming purpose or for a learning purpose, or even just for an artistic purpose. There's a great developer out of the UK, JR Reality, that is leaving this artistic experience of a photograph. Someone can take a selfie and then pin it to this location in London and leave it for others to find. And I think that there's so much power in the ability to kind of leave your mark and know that it's not going to go anywhere. It's just going to be there for others to find.

[00:15:31.898] Kent Bye: And if we look at both the Ingress and Pokemon Go, can you talk a bit about the revenue sources of these apps? And if it's really around trying to give advertising to specific locations and that, you know, maybe just talk a bit about the revenue streams of the previous apps so I can kind of get a sense of where things might be going.

[00:15:50.829] Caitlin Lacey: I would love to, but I actually don't have those answers. I'm only on the platform side, so I can't even speak to the games revenue streams, unfortunately. But I'm happy to have you meet someone else who can give you better advice on that.

[00:16:02.418] Kent Bye: Oh, okay. Well, I guess one of the questions that comes up is how do these different ARDKs get funded then? Is this something that is coming from the revenue streams of these other apps, or is this something that you feel like is going to have a similar type of site-specific, you know, advertising model for these different types of applications?

[00:16:21.333] Caitlin Lacey: I don't know if it'll be advertising. I mean, one of the things is, right now it's free. You can download it and it works within Unity. You know, we've talked about pricing models and strategies and things like that, but right now we're just trying to get feedback on the technology. Is it working? Is it working for you? What value does it add? What do you want to see? So eventually we'll find ways to kind of find a revenue stream that benefits Niantic and isn't too, too difficult for others to take advantage of. But right now it's free, funded by larger company initiatives.

[00:16:49.602] Kent Bye: And so I guess one of the things that I noticed in terms of the previous Ingress and Pokemon Go was that it was kind of like overlaying objects that could have really been anywhere. And so what are some of the things that the visual positioning system is able to do that you couldn't do just by putting an object in a scene? Like, what is the real benefit of being able to be really site-specific with VPS?

[00:17:11.789] Caitlin Lacey: The example that I like to give is actually a marketing and advertising example. So if you think about a window display of a physical retail location, I mean, so much of the conversation post-pandemic is, is brick and mortar no more? And one of the benefits that I see with AR is actually, again, bringing people back to a location and offering them something that they can't necessarily experience anywhere else. So if you think about, you know, a Christmas window display in New York, I just read an article about how people aren't investing, brands aren't investing in those window displays as much as they have previously, again, just because foot traffic is down. And if you think about the ability to create a window display that costs a good amount of money and takes time from a design perspective to make it whimsical and delightful, what if you could augment that and bring people there for a different experience and offer them something that they wouldn't be able to have anywhere else? And you have to physically visit that location in order to unlock that magic. The other example that I love to give is one from the WebAR side of the house. So 8th Wall just worked with Coca-Cola to produce a Halloween campaign that was site-specific to the USC campus. And ultimately what it did was it turned physical vending machines into treasure boxes. And folks could go to these vending machines and actually augment them and then take away either a digital or physical treat. So it was a trick-or-treat across the USC campus, again, to just drive engagement with a younger audience and offer them kind of a spooky experience. And that's just a really cool way of thinking about it: you're not just offering a piece of AR in a physical place and just leaving it there. You're actually augmenting it and inviting people to participate, which is another benefit of AR. You're definitely inviting that participation.
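The window-display example hinges on gating content behind a physical visit. Niantic's VPS does this with centimeter-level visual localization rather than GPS, but as a simplified, hypothetical illustration of the gating logic, a plain great-circle proximity check captures the idea (the function names, coordinates, and 25-meter radius below are all invented for illustration, not the Lightship API):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two lat/lon points.
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def can_unlock(user, anchor, radius_m=25.0):
    # The experience "unlocks" only when the user is physically near the anchor.
    return haversine_m(user[0], user[1], anchor[0], anchor[1]) <= radius_m

anchor = (40.7580, -73.9855)       # hypothetical store-window anchor in Times Square
nearby = (40.7581, -73.9856)       # roughly 14 meters away
far = (40.7484, -73.9857)          # roughly a kilometer away
print(can_unlock(nearby, anchor))  # True
print(can_unlock(far, anchor))     # False
```

VPS improves on this sketch in precision (orientation as well as position) so content can be pinned to a specific window pane rather than a whole block, but the unlock-on-visit logic is the same shape.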

[00:18:52.715] Kent Bye: Yeah, I know that there was the acquisition of 8th Wall that happened, and, you know, looking at this aspect of the web versus native apps. I know a lot of stuff that's happened within the context of phone-based AR has been focused on these native applications, or built into things like Instagram or, you know, these existing apps that have facial filters, whether Snapchat, TikTok, or Instagram. But the web is this whole other aspect of what a lot of people, when they think about the metaverse, they think about: the ability to have things that are beyond the native applications with both WebXR and WebAR. And so I'd love to hear a little bit of the backstory for why Niantic saw an opportunity with 8th Wall and what they were doing with web-based AR. And I know the last demo that I saw at Augmented World Expo, they were showing one application that was showing on the web, it was showing on a HoloLens application, and then on another platform, I think it was on three different platforms, the same application. But this idea that you could have the API functionality not only within the context of a native app, if you're using something like Unity, or building something natively, or having something on the web. And so, yeah, I'd love to hear a little bit more context as to how Niantic sees the web and the role of 8th Wall.

[00:20:04.268] Caitlin Lacey: I think one of the values that we're deeply committed to at Niantic is this idea of building open tools and services and platforms that can kind of support anyone across devices, across operating systems. And one of the net benefits I see from acquiring a company like 8th Wall is now we can help support everyone, both across native apps, if you want to build a standalone experience that can drive more immersion and translate really well to a future where headsets are our primary devices, and then you've got these WebAR extensions that kind of allow you to be anywhere for anyone, you know, with just a scan of a QR code or some kind of image target, without any app required. And I think part of that benefit, again, goes back to just consumer education. And so much of the shift that is going to need to happen to make AR, you know, an everyday experience is going to have to be through these one-off delightful experiences that prove AR is more than just a gimmick. And I realize that when I say WebAR is a one-off, it makes it sound like it's a gimmick, but actually people are creating these repeatable experiences that people want to go back to. And it is that repeatable experience, that delight, that magic, that generates the excitement for folks and starts building community around it. And I think that's another element that is often not talked about enough with augmented reality: you can create community. Another great example of a developer that has taken to our technology is Foundry 6. And it's a group that is creating an augmented reality game based on Dungeons and Dragons. It's an RPG, and they've led with AR, and there's a lot of AR experiences within that, and the visual positioning system as well. And so they're driving people to different places within, right now it's based in LA, the surrounding LA area, and having them do experiences there together in order to fight battles and move on through the journeys that are there.
And I think that sense of community, by bringing people together through a digital experience, is something that we're going to see more and more of. And I think WebAR is going to help us start to unlock that chapter, because there is no barrier to entry with having to download an app, which not everyone wants to do.

[00:22:07.098] Kent Bye: Yeah, maybe you can go into some of those trade-offs that you have, because with WebAR, you could send somebody a link and they could open up a web browser, but sometimes you still have to give permissions to, like, the camera and everything else. And so there's still a way that you have to consent to different things. And then I guess the other dynamic is that you send people to download an app, and then they have to download something that they're going to have on their phone for a long time. And if they just want something quick and easy, is it something they want to have to kind of manage the files of over a long period of time? So I guess there's other trade-offs there that I see, but in some ways there's more potential for native integration for some of the different stuff. So I guess, what are the different trade-offs that you look at when people are trying to decide, okay, is this a type of experience that should be in the 8th Wall WebAR version? Or is this something that we should build our own native app for, using the Lightship ARDK and a Unity app that we're asking people to download? So how do you think of the different trade-offs between those two options?

[00:23:04.652] Caitlin Lacey: I always like to ask clients, you know, what is your end goal? What are you really trying to do? What value are you trying to provide? I think a lot of the time people are trying to be first to market, and they're trying to do something unique and see what the return is going to be. So a lot of AR right now is sitting in the innovation bucket of budgets, and we want to move it over to the marketing budgets. And I think when people see that people are spending more time in their augmented reality experience, which is ultimately proven through dwell time or return time, or return sessions, I should say, then they're understanding that there is something magical there. And once they have proven enough times that there's a return there, then we can guide them into a longer experience, or we can have them integrate into their existing app through a number of different tools that can ultimately bring AR into their experience. On top of that, I think it is just around consumer education, because you're right, there is camera access that has to be approved, and microphone access on the WebAR side. There is the barrier to entry of the download and what that includes for a native app experience. But mostly when I talk to clients, it's like, what are you really trying to do? What are you really trying to gain from this experience? And then helping them kind of go through the decision tree of what makes sense when. And again, it comes down to what level of immersion are you trying to get for people? Because I do think that with WebAR, there are still limitations with the web as far as the technology goes that can be kind of bypassed if you develop in Unity. But again, you're looking at build time too. I mean, we've seen a couple of experiences built with WebAR that take a matter of weeks, whereas with native app development, you're looking at potentially several months to longer.
And if you're thinking about a game, I mean, game development takes months to years to really knock it out of the park. So you do have to consider the trade off of time as well. So those are the things that I think about. And then it's really just a decision tree of questions. And that's one of my favorite things to do is just have conversations about where do you want to be? What's the press release you want to write at the end of this project? And what's that headline going to say?

[00:24:57.539] Kent Bye: Yeah, this shifting of budgets from R&D to marketing is really intriguing to me, because I feel like there's ways in which the existing, let's say, online marketing or advertising has these certain metrics of impressions. And, you know, did you watch a video? Did they click through? Did they actually sign up for something? And so I guess as we do this shift from 2D to 3D, we have new metrics to be able to say, is this successful? What kind of engagement, and how long are people looking at this, or are they actually physically engaging and walking out? And so I guess from a marketing perspective, what are some of the different metrics that you look to, to say that this is a successful campaign that someone used augmented reality to market something?

[00:25:40.350] Caitlin Lacey: It's a huge conversation that's happening in the industry right now. And when I was at Meta, we were having these conversations every day around how we cannot assign previous metrics in digital marketing to AR and VR. It's just not going to be apples to apples. So there's a lot of work being done by the IAB right now to actually articulate what those metrics for success are for this next wave of technology, and who needs to be part of those conversations, and how do we communicate those out. For me, when I was working within the Spark AR realm, we were looking at repeat sessions and dwell time. And then ultimately through AR ads, which were available on the Meta platform as well, you could actually track conversion. And for CPG, retail, e-commerce, DTC, all of those industries, that conversion rate was super important, because it proved that with this digital experience, especially around furniture or makeup, people were not only engaging, but because of that experience of being able to try before you buy in your own physical location, people were buying, and there was a lower opportunity for returns. So you see that there are benefits there from an augmented reality perspective. And it's only going to get better over time as the technology improves to make it feel more realistic, either in your space, from a size perspective or a color perspective, or even just from a makeup perspective, like making sure that the face tracking is correct. And on the Niantic side, I think one of the things that we're really paying attention to is, again, that repeatable experience, because there's value: if someone comes back, it means that something was triggered for them that they wanted to return to. And then dwell time, making sure that people are actually spending time with this technology, not only coming back to it, but then playing around with it, showing it to their friends, sharing it in the social realm too.
Sharing is obviously one of the best reviews you can get because people are saying that this is worthy of my time. And I want to share my time with my family and my friends.
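Dwell time and repeat sessions are the two metrics Lacey keeps returning to. As a rough sketch of how they might be computed from a hypothetical session log (the schema, names, and numbers below are invented for illustration, not any Niantic or Meta API):

```python
from collections import defaultdict

# Hypothetical session log: (user_id, session_start_seconds, session_end_seconds)
sessions = [
    ("ana", 0, 180),
    ("ana", 86400, 86700),    # returns the next day
    ("ben", 50, 110),
    ("cho", 0, 600),
    ("cho", 172800, 173100),  # returns two days later
]

def engagement_summary(sessions):
    per_user = defaultdict(list)
    for user, start, end in sessions:
        per_user[user].append(end - start)
    total_dwell = sum(end - start for _, start, end in sessions)
    repeat_users = sum(1 for durations in per_user.values() if len(durations) > 1)
    return {
        "avg_dwell_s": total_dwell / len(sessions),   # average seconds per session
        "repeat_rate": repeat_users / len(per_user),  # share of users who came back
    }

summary = engagement_summary(sessions)
print(summary)
```

With the sample log above, the average dwell time works out to 288 seconds per session, and two of the three users return, giving a repeat rate of about 0.67, the kind of signal Lacey describes using to argue AR belongs in the marketing budget rather than the innovation bucket.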

[00:27:31.145] Kent Bye: You mentioned the AIB, is that the Academy for International Business or what is this organization that's starting to look at some of these AR metrics?

[00:27:39.267] Caitlin Lacey: Interactive Advertising Bureau. And they set a lot of the standards for measurement and marketing.

[00:27:44.149] Kent Bye: Okay. So the IAB, the Interactive Advertising Bureau. So they're starting to look into this as well, and I guess they're talking to the larger advertising and marketing communities to think about this. When I think about this concept of experiential marketing, it has a lot of overlaps with presence research from researchers I've interviewed before, who are looking at different things like active presence and mental and social presence and emotional presence and embodied and environmental presence. So there are different ways that people are engaging in these experiences that are deeper than just, say, looking at a video. But there seems to be another degree to which people are having these multimodal ways of interacting and engaging with this more experiential marketing method. Dustin Chertoff was a researcher doing presence research, and he was making this connection between what was happening in the advertising world and the experiential marketing world and presence research. And so, yeah, there's this interesting overlap between these concepts of presence and this future of experiential marketing. I don't know if Niantic internally has their own way of thinking about this new immersive or experiential way of engaging with media.

[00:28:54.113] Caitlin Lacey: I haven't heard of a strategy yet. Doesn't mean it doesn't exist. I think AR needs to be something that we all add on to our marketing and our event experiences, because it's such a natural fit to offer an extension to something that's already happening. A great example of this is an experience that, and I will have to send you the link after the fact, because I don't actually remember the name of it, but it was happening in Dallas last March. A woman that I spoke on a panel with at South by Southwest, Lauren Ruffin, was telling me about it. It was an experience of walking through an exhibit that was meant to be like crossing the border. You take off your shoes and walk through sand and dirt, you have the spatial audio around you to give you the sounds that you would be hearing in that environment, and you physically experience similar temperatures. And at the end of it, you were shown pictures of other immigrants that had gone on this journey. If you think about that as a physical experience that is meant to provoke all of the emotions and physical sensations that are possible through the human body, and then you offered people the ability to share that experience with others who couldn't physically be there through some kind of AR experience too, I just think that would have been a net benefit for the exhibit, because I only heard about it through word of mouth. I wish that Lauren could have shared with me some kind of example that I could then take and visually represent elsewhere. I think about pictures a lot. I think about the static nature of images and how AR can actually turn those representations into a physical moving memory. There was a great example at a GO Fest in Berlin this last year, where the marketing team tried something different and created a video wall with a bunch of different monitors. What happened on the physical monitor was that you could trigger the experience in AR, ultimately take a selfie with a character in camera, and then see it projected on this larger display. And people were so used to just taking the selfie with the phone that they missed out on the opportunity to actually record the experience of interacting with the digital object. Again, it just comes back to user education. And so with experiential marketing, I would love to see AR as the main event at every physical or virtual experience. But right now, I just want people to interact with it and experience it. So AR as a feature of experiential marketing is a net benefit for the industry as well.

[00:31:32.043] Kent Bye: Yeah. And that experience that you were talking about, I believe, is probably Alejandro Iñárritu's Carne y Arena, which started in LA a number of years ago. It's kind of making the rounds, and I think it shows the power of that kind of experiential storytelling. So as I look at the Lightship ARDK, there are a number of different things beyond just the visual positioning system. It's a Unity SDK that people can download, and as they're building their app, they have other things like the real-time mapping, multiplayer integrations, and semantic segmentation. Maybe you could talk about some of the other features that the Lightship ARDK has for people who want to go and download the Unity SDK and integrate it into their project. What other types of things could they do that may not already be baked in within the context of Unity?

[00:32:16.574] Caitlin Lacey: Yeah, I think you named all of them. And I think the areas of interest that we've seen from developers are really around that connection, the multiplayer experience, and then again, location, bringing new meaning to place. And from a standards perspective, I think what Niantic really brings to the table is the level of immersion that these tools and features ultimately unlock for folks who are building in Unity. I mean, the experiences are second to none as far as what's possible. If you go onto lightship.dev slash partners, you'll see some of the experiences that launched this year. Coachella is a great example of allowing digital experiences to kind of fly over the festival, and the butterfly was just next level. It feels real when you look at it in video. I think another great example is from Historic Royal Palaces. In celebration of the Queen's Jubilee, they wanted to celebrate the site-specific area of the Tower of London. So they actually planted thousands of physical seeds in the moat, and it grew this wildflower garden that people could visit physically. But knowing how many expats are around the world, they wanted to invite anyone to visit that experience. So they worked with a developer called Preloaded to create a Superbloom app, which you could then download and use to place those same wildflowers in and around your physical location. It's so realistic, down to the pollinators that visit the flowers, just the level of immersion that you could get. I'm just so excited to imagine the future state where we're wearing glasses or headsets and we're seeing this almost-real piece of technology. Obviously it will still be digital, not something we can physically touch, but overlaid on the real physical world. It would just be amazing.

[00:34:02.922] Kent Bye: Yeah, just a few clarifying questions on these different features, because within Unity, there are already other networking solutions like Photon to be able to have multiplayer apps. But most of the time, that's in a virtual environment that people are playing in, and so you have a shared experience within that context. What are the different types of things that you're able to bring in a multiplayer AR context? Is it like persistent objects that, if someone places them, multiple people can see them? Or maybe you could talk about what this multiplayer aspect of the Lightship ARDK is?

[00:34:33.945] Caitlin Lacey: Yes, it's exactly that. So it's an object that everyone can see and interact with. There's a great demo experience on lightship.dev that shows people passing a virtual ball. And then there's another great example, I think on lightship.dev, and if it's not, it's on our youtube.com slash Lightship AR channel, that shows kicking a ball around with Captain Doty, who's one of our mascots at Niantic. So it's the ability to play with not only the digital object, but also with each other, and pass these virtual experiences across. So that's a shared experience that's offered through the ARDK.
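[Editor's note: the core trick behind shared AR objects like the passed ball can be sketched outside of Unity. The idea is that devices never exchange raw world coordinates, only poses relative to a commonly recognized anchor, so each device can re-express the object in its own world frame. This is a minimal illustrative Python sketch, not Niantic's actual API; all names and the message format are hypothetical.]

```python
from dataclasses import dataclass
import math

@dataclass
class Pose:
    """A 2D pose: position (x, y) plus heading in radians."""
    x: float
    y: float
    yaw: float

def compose(a: Pose, b: Pose) -> Pose:
    # a then b: express pose b (given in a's frame) in the world frame
    c, s = math.cos(a.yaw), math.sin(a.yaw)
    return Pose(a.x + c * b.x - s * b.y,
                a.y + s * b.x + c * b.y,
                a.yaw + b.yaw)

def relative_to(anchor: Pose, world: Pose) -> Pose:
    # inverse(anchor) then world: world pose expressed in the anchor's frame
    c, s = math.cos(anchor.yaw), math.sin(anchor.yaw)
    dx, dy = world.x - anchor.x, world.y - anchor.y
    return Pose(c * dx + s * dy, -s * dx + c * dy, world.yaw - anchor.yaw)

# Device A places a ball two meters in front of a shared anchor and
# broadcasts only the anchor-relative pose, never its own world coordinates.
anchor_seen_by_a = Pose(10.0, 5.0, 0.0)
ball_world_a = Pose(12.0, 5.0, 0.0)
shared = relative_to(anchor_seen_by_a, ball_world_a)

# Device B localized the same anchor in its own, differently oriented map,
# yet recovers the same spot relative to the anchor.
anchor_seen_by_b = Pose(0.0, 0.0, math.pi / 2)
ball_world_b = compose(anchor_seen_by_b, shared)
print(ball_world_b)
```

Both devices agree on where the ball sits relative to the anchor, even though their internal world frames differ; that consistency is what a shared anchor provides.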

[00:35:09.328] Kent Bye: And I think when I did the demo last year at Augmented World Expo, there was also segmentation. I don't know if there was natural language processing or something, or if it was more about looking at different objects that were in the scene, being able to identify those objects, and then, based upon those objects, having different conditionals that are fired. Maybe you could talk about this semantic segmentation that's included as well, being able to identify objects. What does that give you in the context of an AR experience?

[00:35:37.071] Caitlin Lacey: Yeah, I think what you're speaking to is probably sky segmentation and ground segmentation, which includes water. So the A Realm example from Foundry 6, that game had actually been in John's head for a while. John's one of the main developers on the project, and he was at AR House when Niantic hosted it this last March. There's this great demo of his with these characters and this huge squid-octopus that's actually coming out of the pool. That's a great example of the segmentation of water from physical ground. And then there are some other demos where you'll see Captain Doty trying to shoot balls into the sky. So you have this virtual character who's shooting balls into a ring in the sky that ultimately unlocks the next path of the game. That's a great example of sky segmentation. And there's more coming on the sky segmentation side across our properties within the next few months. So more to come on that, and I'd be happy to connect you with one of our product leaders to talk a little bit more about the details there.
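[Editor's note: as a rough illustration of what sky segmentation buys a renderer, here is a toy Python sketch, not the ARDK's actual interface. A segmentation model labels each camera pixel, and virtual content is drawn only where the label permits, so a ball shot "into the sky" appears behind buildings instead of floating in front of them.]

```python
SKY = "sky"

def composite_over_sky(camera, virtual, labels):
    """Draw virtual pixels only where the segmentation mask says 'sky'.

    camera, virtual, labels are same-shaped 2D grids; virtual uses None
    for 'no virtual content at this pixel'.
    """
    out = []
    for cam_row, virt_row, lab_row in zip(camera, virtual, labels):
        out.append([v if lab == SKY and v is not None else c
                    for c, v, lab in zip(cam_row, virt_row, lab_row)])
    return out

# Tiny 2x2 "frame": b = building pixel, g = ground, * = virtual ball pixel
camera  = [["b", "b"], ["g", "g"]]
virtual = [["*", None], [None, "*"]]
labels  = [["sky", "sky"], ["ground", "ground"]]
print(composite_over_sky(camera, virtual, labels))
```

The ball pixel over the labeled sky region is drawn, while the one over labeled ground is occluded by the camera image, which is the per-pixel conditional Kent alludes to.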

[00:36:35.135] Kent Bye: Okay. And there's the real-time mapping, and then there's the visual positioning system. When I think about this, I think about how Niantic launched initially with six different cities and is now expanding into like 125 different cities. And so the visual positioning system, I understand as more accurate positional tracking of different objects in the context of a site-specific location. So what's the difference between that visual positioning system and something like real-time mapping? Does the real-time mapping not have a sense of what that location is, so you can do real-time mapping anywhere in the world, while the VPS is limited to these cities where you've launched that feature? Maybe you could talk a bit about the difference between what the visual positioning system is doing versus the real-time mapping feature.

[00:37:20.407] Caitlin Lacey: The way I understand it is that it's about being dynamic. So the real-time scanning feature is helping us map the world as it exists today. And with VPS, it's that centimeter-level precision that allows you to anchor a virtual object to a particular position in the world, which can then unlock a sense of persistent AR. So that's how I think about it. And I think the dynamic piece is the really interesting one, because if you think about the map that we're trying to create, it's pedestrian-centric, not car-centric. So you're actually unlocking more visibility into the physical world through that real-time mapping, which is through the scanning, which takes images of the physical world and inputs them into our map, so that we can then use machine learning and everything like that to actually understand what this physical location looks like and how we could then map it. And the VPS is only available in certain cities because, ultimately, there's just a ton of infrastructure that we need to build in order to map the entire world. So we're just at the very beginning of this journey. But I think long-term, the hope is that we can do this globally with the help of everyone in the industry.

[00:38:23.520] Kent Bye: Yeah, I know that at Google I/O back in 2015 or 2016, Google actually showed off a visual positioning system in the context of a Lowe's department store, where it had something that was very specific inside of a store. Because you have GPS, and GPS has a certain ability to say where you're at, but it doesn't get down to the type of positional accuracy that you'd need to have anchored and persistent objects within augmented reality. And I know that Google themselves have done all sorts of things, identifying different Wi-Fi networks and the strength of those networks to help position people, and doing these other techniques to give an increased level of positional accuracy, because you're talking about GPS satellites that are triangulating us, and there's only so much precision with which those can say for sure where you're at. There have been some amazing advancements, but I guess there's this other aspect of being able to take an actual scan of the world, maybe do computer vision on top of that based upon where you're at, and then get down to something even more specific. You said centimeter accuracy; I don't know if it's centimeter or millimeter. Maybe you could talk a bit about what you know of the other things that are happening to get you something that's better than, say, GPS or other techniques. What is this visual positioning system doing in order to get this level of accuracy in placing objects in space?

[00:39:47.444] Caitlin Lacey: I think we're still, I am still defining why that centimeter-level precision matters for myself. One of the demonstrations that we go back to a lot is one that was released at the Lightship Summit this last May. There was a physical anchor at a location in the Metreon in San Francisco, and they created a ship that would fly through the sky, again using that sky segmentation, and it would drop down its rope to this physical anchor in the physical world, and that rope would circle around the top of the anchor. How would it know to do that without that site-specific, centimeter-level documentation of where that physical object was? That is the power of VPS. And so for me, one of the use cases that I really see is historical relevance. If you think about all the historical sites in the world and the importance that those carry to certain people, what if you could create a dynamic map of a particular location, show its evolution over time, and create an experience that could sit on top of that, which could, one, educate you, two, show you the history, and then, three, provide the meaning for those folks who are finding meaning in that particular location? I haven't seen that created yet, but I know it probably exists, and I know that it's coming in the future. I just think there's so much that we rely on history books for that augmented reality could help bring to life. This is taking us a little bit off the topic of augmented reality, but you see this through the magic of a lot of the volumetric experiences that are being developed. There was a great story on PBS earlier this year or last year that was interviewing the last remaining survivors of the Holocaust with volumetric video, so that they could then, at some point in the future, sit in a classroom with students and actually convey their story and the importance of that history in a way that just does not translate through text or even through two-dimensional video. To your point earlier about that experience and the different elements that pull at you when you experience experiential marketing the right way, the emotional pulls, the physical pulls, the spatial audio, I just think there's really a lot of power in what's possible there.

[00:42:11.417] Kent Bye: Yeah, I wish I could have been at that Lightship Summit, because I remember seeing, I think it was Keiichi Matsuda, who's an architect, and he had the Liquid City piece that was overlaying an architectural take, using augmented reality to modify the space. He's famous for the Hyper-Reality demo that goes into more of a dystopic vision of that. But that video may have actually been part of the catalyst. I think I saw him speaking there, and it kind of inspired: okay, what could you do that was a more beneficial application of taking this augmented reality technology and overlaying an architectural take? So I don't know if it was Keiichi Matsuda who was doing that ship demo, but I know that he had that Liquid City demo there at the Lightship Summit. And yeah, I'd love to hear any other thoughts you have on that demo that was showing the potential of that type of VPS from an architectural perspective.

[00:43:04.053] Caitlin Lacey: Yeah, he did speak, and his keynote is on the Lightship AR YouTube channel, so I highly recommend, if you haven't watched it lately, going to check it out. He's phenomenal. What I loved about his vision was that it unlocks this idea of reality channels. I think part of the dystopian nature of what he had originally put together in his film was the idea that there would just be so much noise, right? And that happens in the physical world today. There's so much noise, we don't know what to look at, our attention spans are shorter than ever. But what if you could go in and out of the experiences that augmented reality could offer you? And what if, on top of that, the programming could just know what you need to see in that moment based on where you are in the physical world? One demo, and I don't know if it was in his talk or in another talk at Lightship Summit, was around the idea that you could use VPS to know where you are. It's the contextual computing conversation. It would know where you were in the world, it would know that that's your favorite coffee shop, and it would prompt you: do you want to order your favorite drink or your favorite bite to eat? I can do that for you, and here's how, here's what's available today, here's the menu. I did that a lot with vision setting at Meta too: what would the experience be that would really get people's attention? And I think the reality channels conversation that Liquid City brought up at Summit was the one that I was most excited about, because you can imagine this channel is going to have Pokémon GO, and I'm going to be able to see all of my friends from Pokémon GO and all of the Pokémon that are around me right now. And then, I don't want to do that right now, I need to go over here and be in business mode, and here are all the things around me that matter to me. Here's my calendar. Here are my messages, my emails. Now I'm going to go out of that. I'm going to be with my kids, and I'm going to be playing another game that Niantic has delivered. Pikmin Bloom is a great example. I play that with my kids, where we walk around and pick up Pikmin and all the different elements of that. That's great for my five-year-old. There are just all these different things that you could go in and out of depending on the context, and the contextual computing nature of that will be so powerful. Our head of product, Shel Blirander, talks a lot about this in regards to just navigating a city. He bikes to work. And wouldn't it be amazing if augmented reality could not only show him where to turn, but actually give him the contextual awareness of where he is? So instead of saying turn left or turn right in 500 meters, which, I don't know about you, but I'm terrible at that kind of metric, it could actually prompt you to say, hey, turn right at that blue awning on the bank. Rather than giving you a distance you may not be able to judge, augmented reality can offer you that visual perspective in addition to the navigational perspective.
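[Editor's note: the landmark-based navigation idea described here can be sketched in a few lines. This is a purely illustrative Python sketch under assumed data, not any real navigation API: given a hypothetical database of visually salient landmarks with positions, pick the one closest to the upcoming turn point and phrase the instruction around it instead of around a distance.]

```python
import math

# Hypothetical landmark database near a route: (description, x, y) in meters
LANDMARKS = [
    ("blue awning on the bank", 3.0, 1.0),
    ("red mailbox", 20.0, 5.0),
    ("oak tree", 8.0, -2.0),
]

def landmark_instruction(turn_x, turn_y, direction):
    """Replace 'turn right in 500 meters' with a landmark-anchored cue,
    choosing the landmark nearest to the turn point."""
    name, _, _ = min(
        LANDMARKS,
        key=lambda lm: math.hypot(lm[1] - turn_x, lm[2] - turn_y),
    )
    return f"Turn {direction} at the {name}"

print(landmark_instruction(2.0, 0.0, "right"))
```

A real system would also need to check that the landmark is actually visible from the rider's pose, which is exactly the kind of query a centimeter-accurate, pedestrian-centric map makes feasible.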

[00:45:39.711] Kent Bye: Yeah, certainly lots of different potential applications as we take the center of gravity of physical reality and overlay all these other contextual realities: the reality channels, as Keiichi Matsuda called them, the Magicverse, as Magic Leap called it, the AR Cloud, as Matt Miesnieks has called it, and the real-world metaverse, as John Hanke has called it. Whatever we end up calling this blending of the physical and the digital, there are going to be lots of different potential applications as we have both phone-based AR and head-mounted augmented reality applications. So as we start to wrap up, I'd love to hear any of your thoughts on what the ultimate potential of the blending of all these realities and augmented reality and the real-world metaverse might be, and what it might be able to enable.

[00:46:24.405] Caitlin Lacey: I am so excited about the potential, and I think we should all take to heart Tim Cook's comment a few weeks ago that there will be a time when we can't even fathom living without augmented reality and the benefits and values that it can bring. One thing that I've discovered in my last six months at Niantic is that there are incredible builders out there who are setting visions for a future that we can't even imagine today. A lot of them are building on Niantic technology, and we're excited to continue working with them, learning from them, and taking their feedback. And one thing that I'll just mention is that we're going to be on the road next year. We're going to be traveling to a lot of industry events and hosting hackathons. So I would invite anyone from the immersive community who is interested in learning about Niantic, or wants to try out the technology, to visit with us, and honestly, offer us your ideas too, because we are open for partnerships. We are open to helping bring this foundational technology to everyone and making it usable by everyone too. That's one of our biggest values.

[00:47:27.413] Kent Bye: Awesome. Is there anything else that's left unsaid that you'd like to say to the broader immersive community?

[00:47:32.483] Caitlin Lacey: Just thank you for listening and for the time, and help us make AR a consumer-desired experience together.

[00:47:41.940] Kent Bye: Awesome. Well, I'm looking forward to seeing how all the different developers are able to create stuff. I think SDKs like this are enabling new capabilities for different applications that weren't possible before, and the fact that Niantic is supporting this type of infrastructure is great in terms of innovation, pushing the limits of what's possible with these technologies. So we're starting with phone-based AR, and I imagine at some point we'll move into more headset-based AR. I'm just excited to see where this all goes in the future, and Niantic's right there at the center of it, with this site-specific orientation with the mapping and the cartography, but also the real-world metaverse and the blending of the physical and the virtual. I'd love to see how it all continues to evolve and unfold. So thanks again for joining me here on the podcast to help unpack it all.

[00:48:25.728] Caitlin Lacey: Absolutely. Thank you for having me.

[00:48:28.677] Kent Bye: So that was Caitlin Lacey. She's the Senior Director of Platform Marketing at Niantic. So, a number of different takeaways from this interview. First of all, just in thinking about VR and AR, with how many phones are out there in the world, billions of them, a lot of different aspects of spatial computing can start to be developed there first. I'm personally still a huge fan of virtual reality, but in terms of marketing and marketing budgets, it makes sense for a lot of companies to start to do these different campaigns that have these site-specific types of engagements. There's this whole idea of experiential marketing and the different ways that you can get people involved in an embodied and spatialized context, expressing their agency, and give them something novel that augments something in physical reality, which actually creates a reason for them to go into those locations. They have different aspects of social dynamics with shared objects, and they have the visual positioning system to get down to very specific locations. And, yeah, they're generally trying to make some of these tools available for other developers to build on. I don't know what their business model is going to be. It's free at the moment, but it may become a paid model, or there may be other aspects of advertising that come along later. I really didn't get a long-term look at how they're going to sustain this, beyond the fact that they already have a vibrant business with Pokémon GO and Ingress, so there's plenty of money to develop these different types of functionalities and get them into the hands of third-party developers to innovate on the platform. So keep an eye out on the Lightship Augmented Reality Developer Kit, the ARDK. They've got different videos and different examples of what different users are doing.
She had mentioned Pixelynx, which is a deadmau5 music ecosystem, and other ways of trying to create more engagement at these site-specific locations, where she says people are able to actually start to cultivate community. Also, I was really struck by the Interactive Advertising Bureau as an entity that is looking into some of the different metrics to say what constitutes a successful campaign or not, going beyond the existing 2D campaign metrics, because I think there are new aspects of augmented and virtual reality that are going to need new ways of measuring what success for some of these different campaigns might be. The goal for Caitlin is to move some of this budget from R&D budgets, which are able to experiment and prototype new technologies, toward funding some of these initiatives from the marketing budget. Yeah, very interesting just to hear a little bit more about this platform and where they're going, the legacy of how they were doing a lot of mapping applications, the future of trying to create this digital twin of physical reality, and the ways that they're going to actually support and sustain that. You can also look to see how they continue to expand out from their initial rollout of the Lightship visual positioning system. Back on November 10th, 2022, when I got this email, they said they were expanding from just six cities to 125-plus cities and 120,000 different locations. They have listed Singapore, Los Angeles, San Francisco, Tokyo, New York City, London, Paris, Chicago, Seattle, Washington, D.C., Nagoya, and Kyoto. Those are the ones that were listed online, and I don't see a comprehensive list of all 125 cities. But anyway, if you're in one of those areas, you can start to check out some of the different demos that might be utilizing this Lightship ARDK.
Just generally, I think it's interesting to see when companies start to release these different types of APIs or SDKs that provide the building blocks for what's going to enable new potential applications here in the future. I don't typically come across a lot of these different AR apps when I do, say, the different film festival circuits. So it sounds like Art Basel, which she had mentioned, and these other marketing campaigns are perhaps just direct-to-consumer at this point. Keep an eye out on their YouTube channel for these different case studies of different developers and what they're doing, just to see if some of the things being prototyped on the phone may or may not be suitable for a head-mounted display for augmented reality. So that's all I have for today. I just wanted to thank you for listening to the Voices of VR podcast, and if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash Voices of VR. Thanks for listening.
