#1461: AR Lens Genres, Spectacles Impressions, & SelfReflect VTubing App Using Snap’s Camera Kit with Brielle Garcia

I interviewed Brielle Garcia at the Snap Lens Fest about the Snap Spectacles and her SelfReflect VTubing app that uses Snap's Camera Kit. See more context in the rough transcript below.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So continuing my series of looking at different news and announcements around Snap Spectacles, as well as the Snap ecosystem, today's episode is with Brielle Garcia, who is an AR lens developer who has been working with all the different platforms, from Instagram to Snap to TikTok, and also doing lots of different motion graphics things. She's done lots of really experimental, avant-garde art with the musician Nick Lutsko, but is also working on her own VTuber application called Self Reflect that she was going to be launching at TwitchCon, but ran into some bugs. So I'm not sure when it's actually going to be launching, but hopefully within the next week or so. It's going to be a VTubing platform so that you could basically use a webcam along with Snap's Camera Kit, which is essentially their tool for integrating lots of the different facial recognition, body tracking, all these things that enable the Snapchat lens filters, into the context of independent third-party apps. It's kind of like their own secret sauce for what makes Snapchat really amazing, with all these different algorithms packaged into this Camera Kit that they can then use in other contexts: with enterprises, with the Super Bowl, with different sports stadiums, but also in independent applications like what Brielle is making, this type of VTubing app. And the Snap ecosystem is very friendly in working with different independent developers and really sustaining and supporting their ecosystem. And like I said earlier, I find that the Snap community has the most interesting, creative, and cutting-edge work when it comes to AR development. And so it was a real pleasure to be invited to not only the Snap Partner Summit, but also the Lens Fest, to have a chance to talk to some of the different developers there. Now, lots of the developers that were there were very busy, about to go into a hackathon. And so I did manage to do a number of different interviews throughout the course of my time there, but I would have done a lot more if they weren't busy trying to work as quickly as they could in order to build some of the first applications for the Snap Spectacles. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Brielle happened on Wednesday, September 18th, 2024. So with that, let's go ahead and dive right in.

[00:02:23.822] Brielle Garcia: Hi, I'm Brielle Garcia. I'm an AR designer and developer. I do animation, visual effects. Yeah, I do a lot of things with Snapchat and their platform and their development tools. And yeah, I'm a big advocate for the stuff they're doing.

[00:02:41.773] Kent Bye: Maybe you could give a bit more context as to your background and your journey into the space.

[00:02:45.615] Brielle Garcia: Absolutely. So my background is I come from animation and visual effects. I spent many years doing motion graphics, doing 3D animation, and doing visual effects for different types of videos, films, music videos, and such. What led to my work in AR was I started to see things being done on mobile devices in real time that would take hours to do in normal post-production. And I was like, why am I spending so much energy to do these types of effects in post-production when these devices and these apps are doing it in real time? And so that led to me experimenting with what these tools could do. At the time it was just ARKit, and then ARCore had just come out, and then I started doing more experiments, and Snap reached out to me with their first beta of Lens Studio and said, hey, do you want to test this new product we've been working on? And so it kind of took off from there. So I've been with Lens Studio from the beginning, and it's been amazing to watch how much it's grown over the years, but also how solid the foundation was from the beginning, and that Snap has had a great vision for what AR can do and where it could go.

[00:04:02.665] Kent Bye: And so I know in the past you've been doing a lot more of, like, building lenses in more of an, I guess, agency model, or building different branded experiences. How do you, I guess, self-describe in terms of the type of work that you have done in the past, and maybe some of the stuff that you may be moving into in the future?

[00:04:19.542] Brielle Garcia: So yeah, a lot of the stuff that normally pays the bills is building branded experiences. I work with a number of different agencies who then do work for movie studios and record labels and other companies like that. I've built a great relationship with a number of different agencies, and it's always a delight to work on new movies that are coming out, or things that I'm a fan of. It's just great to get to play in that world and get to explore new technology, finding how do we do things that haven't been done before that really enhance what the brand is trying to do. It's really fun when the studios give us a little freedom and they're like, well, what cool new things, what new capabilities are there, how can we make this stand out? So I stay on top as much as possible of what the new features are and how we can bring that in and make their experiences really unique. Yeah. I've also been doing a lot of work with Snap's Camera Kit SDK, which is the ability to embed Snap's AR runtime into your own native applications, whether iOS or Android. And I've worked on a number of different projects for different companies in the past using this SDK, but I've been working on a personal project over the last, I'd say, 18 months, off and on, because my background is in animation, and I've always had an eye towards how can I improve my animation work using AR, and then vice versa, how can I improve my AR through my animation skill set. So my holy grail has been to be able to replace my motion capture suit by using AR. I want to be able to do full-body, hand, and face tracking in real time, to make it as easy as opening a camera and capturing. And the quality of Snap's body tracking is just the best in the industry, bar none. They outperform Apple's ARKit on Apple's own hardware. And it's just a delight to work with. The data I get access to through Snap's SDK is wonderful. So I've been building tools for animators, for VTubers, for streamers to be able to drive characters in real time with extremely high fidelity with the ease of just a webcam. So I'm hoping to launch the public beta for my new app Self Reflect this week for TwitchCon. This app is geared towards VTubers specifically, to allow them to animate avatars and replace their suits, their trackers, their everything with just a single device and a single app.

[00:07:08.513] Kent Bye: You know, speaking to Aidan Wolf, he was saying that you posted some clips on Twitter and were able to have some things really go viral with this new application. It sounds like you've got a lot of interest from people that want to have some sort of self-contained solution with which they can modulate and augment their own identity and find new modes of self-expression. And so, yeah, I'd love to hear what that's been like in terms of getting that early response. It feels like you're in the middle of a zeitgeist of people who want to be a part of the stream, but not always be limited by the constraints of their own physical meatspace body, but to be able to augment it into all sorts of other new forms of identity and self-expression.

[00:07:47.850] Brielle Garcia: So I knew there was a large industry of VTubers. I wasn't quite sure how large it was, and I've been kind of overwhelmed with the response, and I'm extremely excited, because the clips I've posted of the app in motion have people just extremely excited. I posted a simple demo where people were able to test just the tracking, and the response to that was people saying that it was as good as or better than their current setup, and they were ready and willing to pay whatever they needed to replace it, which is always a great thing to hear from your customer base, that they're willing and ready to pay. So the trick was figuring out that I needed to not reinvent the wheel for everything, because for a while I was focused on developing professional-level tools for myself, and so I was building Unity and Unreal integrations, and it's a lot of work, and it's a lot to ask for people to change their entire setup to work with your solution. So my app uses the standard VMC protocol that most of the standard VTuber applications use. My goal is to get the best data quality to their computer as fast as possible and to work with the setup that they already have, so that it's as seamless as possible to replace what they're using. Yeah, several of the clips have gone pretty viral online. I posted a clip showing the app Self Reflect working with four different prominent VTuber applications, and I didn't have to do any changes to my app, because I'm sticking to the standard VMC protocol, so it just works out of the box with most of these programs. Yeah, it's extremely exciting. And I myself have always wanted to get into VTubing, but part of the hang-up has been the complexity of getting the fidelity that I've always wanted. My goal was always, I want this to be as easy to use as a camera. And I can do that now. I tested it for a few hours last week, and I just set it up and it instantly worked, and I was able to play games, hold a controller, move hand gestures around, and it just, yeah, it was everything that I've wanted. So it's exciting that people are responding to it and are excited for the beta.
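
[Editor's note: For readers unfamiliar with the VMC protocol Brielle mentions, it is an OSC-based motion-data protocol that originated with the VirtualMotionCapture project and is understood by most common VTuber programs: a sender streams bone transforms and blendshape values over UDP, and the receiving app applies them to whatever avatar is loaded. The sketch below is not Self Reflect's code, just a minimal illustration of a VMC sender written in Python with the python-osc library; the port number (39539 is a common default for VMC receivers), the bone and blendshape names, and the sample values are all assumptions for illustration.]

    # Hypothetical, minimal VMC sender sketch (not Self Reflect's actual code).
    # Assumes `pip install python-osc` and a VTuber app listening for VMC/OSC
    # messages on localhost port 39539.
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 39539)  # host/port of the receiving VTuber app

    def send_frame(bones, blendshapes, elapsed_seconds):
        # bones: dict of bone name -> (px, py, pz, qx, qy, qz, qw)
        # blendshapes: dict of blendshape name -> float in [0, 1]
        for name, (px, py, pz, qx, qy, qz, qw) in bones.items():
            client.send_message("/VMC/Ext/Bone/Pos", [name, px, py, pz, qx, qy, qz, qw])
        for name, value in blendshapes.items():
            client.send_message("/VMC/Ext/Blend/Val", [name, float(value)])
        client.send_message("/VMC/Ext/Blend/Apply", [])              # apply queued blendshape values
        client.send_message("/VMC/Ext/T", [float(elapsed_seconds)])  # relative time marker
        client.send_message("/VMC/Ext/OK", [1])                      # sender is available

    # Example: one neutral frame with the head bone and a single blendshape.
    send_frame(
        bones={"Head": (0.0, 1.6, 0.0, 0.0, 0.0, 0.0, 1.0)},
        blendshapes={"Blink": 0.0},
        elapsed_seconds=0.0,
    )

[In practice a sender streams frames like this continuously, often 30 to 60 times per second, which is what lets a new tracking app slot into an existing VTuber setup without the rest of the pipeline changing, as Brielle describes.]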

[00:10:06.297] Kent Bye: Now, we're here at the Snap Lens Fest, and yesterday was the Snap Partner Summit. And so you spoke about the Snap Camera Kit, which seems to be enabling a lot of this stuff. What's their pricing model for how they deal with that? Is that something they just give out for free and open source? Are you having to pay money back into Snap? And yeah, maybe just talk around the money flows, both for what you are planning on charging folks, but also whether Snap is going to be taking any revenue from this as well.

[00:10:31.966] Brielle Garcia: Yeah, so currently Camera Kit is still a beta product and they're trying to get feedback from developers and to figure out how to make this the best SDK that they can for what do developers need, what do they want to build, and how can they enable these new visions. I'm still working on my own pricing structure and I've been discussing with Snap and getting guidance on what they think might be best for that. But currently they don't charge for access. You do have to apply for camera kit to get access, but it's a pretty simple process. And as far as future pricing structure, I'm not in the know on that. So we'll have to see.

[00:11:09.040] Kent Bye: Okay. All right. So it sounds like Snap's interest in this is that they want to just push the technology forward, and they work very closely with developers, and you've had a long relationship of working with them. As I cover the XR industry, it seems like of all the different companies out there, Snap is really on the leading edge of being able to do these different types of collaborations with developers. Yeah.

[00:11:28.069] Brielle Garcia: Yeah, absolutely. Snap's just a wonderful company to partner with and work with. They've supported everything I've tried to do over the last seven years, and any crazy questions I have, I can email the team. I'm on a first-name basis with many of the engineers here, and it's great to build that kind of relationship. Any clips of mine that go viral help promote the Snap platform, and then they provide support back, so it's a great, mutually beneficial relationship. They're just a wonderful company to partner with.

[00:12:00.960] Kent Bye: Okay, and I saw that you were holding either a Switch or a Steam Deck with some camera peripherals. Is this the type of system that you could run in more of a mobile context? Or maybe talk about some of the other gear that you were holding yesterday at the Snap Partner Summit?

[00:12:16.570] Brielle Garcia: Yeah, so I was carrying around a ROG Ally. It's a handheld PC that was running a popular VTuber program, and it was paired with my phone running Self Reflect. So I was getting to demo what this workflow looks like and how easy it is to use, so I could show people that there's no calibration, there's no setup. Once it's connected to the PC and streaming, it's right there. We could switch between people: if I just moved the camera to a different person, they were immediately driving this high-quality avatar. So it was just showing a standard VTuber setup in a more portable form factor, but yeah.

[00:12:56.955] Kent Bye: So I guess that enables the ability to do on-site, context-specific reporting as a VTuber persona by having this mobile setup, it sounds like.

[00:13:06.621] Brielle Garcia: Yeah, that's pretty much the idea. And it was just an easy setup to be able to demo to people without needing to get a laptop and a bigger screen, but yeah.

[00:13:17.441] Kent Bye: OK, well, I'd love to get some of your reflections on the ecosystem of AR lenses, since there's been a number of big players out there. Meta for a long time had Spark AR, which they just recently shuttered within the last couple of weeks. You have Snap, and then you have TikTok. And so how do you see each of these different platforms? And what do you see as the different trade-offs of either getting feedback or getting money to be able to build on different platforms? So yeah, I'd just love to hear how you start to make sense of some of these platforms, and whether Meta is even going to have a platform now that they've shuttered and killed off Spark AR.

[00:13:52.034] Brielle Garcia: I will shed no tears for Spark AR. It gave me lots of grief over the years, because it was well behind the other players in the space, and anytime a project was coming down to the wire, Spark would be the one to have issues. In the final stretch, that was very frustrating. I suspect Meta will replace it with something. I'm honestly surprised that, if they are replacing it with something, they wouldn't have announced the new thing before canceling it, but that's Meta sometimes. But if they don't, I don't know what they're doing: to have a huge platform and clearly a base of developers that are very disappointed... Spark needed work, it needed a ground-up rewrite, but I just don't understand some of their decisions sometimes. But it shouldn't be an indication of the AR industry as a whole, because Snap is thriving, and TikTok with Effect House is thriving. Many of my clients had already been moving away from Spark anyway, because when they want to make an AR effect that works across platforms, we always run into the issue of, well, we can do this on Snapchat and TikTok, but on Spark we can't do this, or we're limited by that. So I would love to see something new from Meta that was more on par with where the industry is at and where it's going, that could really enable creators to have more freedom and flexibility than they previously had on Spark. Yeah.

[00:15:21.280] Kent Bye: So when you think about the different types of projects that you were doing for AR lenses, how do you start to think about those genres, in terms of whether it's a lens where people are augmenting their identity for self-reflection or to somehow embody the brand, versus things that are more external, out into the world? So how do you start to make sense of the different types of experiences that you've been a part of creating in this ecosystem?

[00:15:44.628] Brielle Garcia: Yeah. In the different genres of lenses and AR effects, there's really two primary ones: you have the face effects and the world-facing ones. In my experience, I've always guided clients that if you want to generate more user-generated content, you need something that interacts with their face, that allows them to be seen, because people like to be seen and to be involved in the scenes themselves. While I love world tracking, and there are a lot of great AR experiences that you can do with that, it's less likely to drive user-generated content, because it's harder to be in those environments. So that's the guidance I always give to brands: focus on what your goal is, what you want to achieve. Are you wanting to do a specific AR experience that requires world tracking? Or do you specifically just want user-generated content where they are interacting with your brand and sharing that out with the world? Then think of new creative ways to interact with the face and other AR effects.

[00:16:48.925] Kent Bye: I'm wondering how this has developed in terms of brands surrendering sovereignty over how people are modulating and using their brand in terms of their own identity or expression. There seems to be a certain amount of risk for a brand that is enabling or allowing something to happen and then setting that out into the world: who knows, someone may create something that goes viral but actually has a negative impact on the brand. So how do you navigate that?

[00:17:14.059] Brielle Garcia: So I believe brands are getting more educated about what these platforms can do for them and how people use them. I would say they were a bit more wary in the earlier days, wanting to control things. They were still coming from a specific advertising mindset of, this is how the user needs to interact with it, instead of giving the user the flexibility to create what they want with it. I liken AR effects more to a creative tool than specifically an advertisement. They're clearly promoting brands, they're clearly promoting movies or something, but you need to give users something to do with it, something fun. And I believe over the past several years brands have become wiser to how these things work and more open to experimenting and trying new things. I've certainly seen it with the brands I've worked with, that they've been more open to letting people drive characters, full-body characters, for some popular characters. And yeah, they've had a great response with it, and it's led to more work in that realm. So I think we've come a long way, and I'm excited to see brands be more willing to take some risks. The fans want to interact with the content, so let them.

[00:18:31.709] Kent Bye: Have you been a part of any filters that went viral, or any personal favorites that you want to mention here? Yeah.

[00:18:37.859] Brielle Garcia: I worked with Pretty Big Monster and Paramount on the lenses for Teenage Mutant Ninja Turtles last summer. As a kid of the 90s, it was just extremely exciting to work on an official Ninja Turtles project, and on that one in particular we got to really push the bounds of the face, hand, and body tracking that Snap brings. The ultimate experience of it was on Snap, because it had all the capabilities that could run all at once. And we let people embody these characters and bring them to life. It's just a delight to go through and watch videos that people have recorded of themselves doing dances or jumping into a pool as these characters and yelling cowabunga or whatever. It just thrills me to no end.

[00:19:23.632] Kent Bye: And I've also seen a number of clips that you've done over the years with a musician named Nick Lutsko. And I'm wondering if you could talk a bit about your collaboration with Nick as a musician, because it seems like you've got a lot of running inside jokes, but also you've done quite a lot of work on different music videos and other projects with him. So yeah, I'd love to hear that.

[00:19:40.454] Brielle Garcia: I'm going to shift my brain a bit. Getting to work with Nick Lutsko has just been a life-changing experience. He is an absolutely brilliant songwriter and a brilliant comedian, and I am just honored that I get to tag along and do funny things. He just writes the most incredible music, and it's absurd and weird and funny, and we almost kind of do improv on Twitter, where he'll post things in this absurdist character, and I'll just play along with it: oh, of course, in this universe, yeah, there's Gremlins 3, and you're working on it, and Hollywood is blacklisting you, and it's like, okay. And it's really fun to just bring some absurdity and some levity to the world, and if I could do nothing else in my life but make silly stuff with Nick, that would be it. Every project I get to work on with him pushes my skill set to be better. I get to experiment with things that other brands may be more wary to try. And any new technology, I'm like, oh, let's try this, and he's more than willing to say, yeah, let's go for it. Yeah, we've got some super exciting stuff in the works. But yeah, he's just an amazing collaborator, and I'm just honored to get to do anything with him.

[00:21:08.633] Kent Bye: Some of the clips that I've seen and remember have this very rough, experimental, avant-garde feel, where you're maybe doing a photogrammetry scan but it looks super weird or uncanny or janky. It's not super polished, but at the same time it's almost like this vaporwave retro aesthetic that is embracing the uncanniness and using it to humorous effect. So on the one hand you're pushing the technology forward, but on the other hand you're kind of embodying the awkwardness or uncanniness of whatever you're doing as a part of the joke.

[00:21:41.962] Brielle Garcia: We are very much students of Tim and Eric, of Tim Heidecker, and that surrealist style of humor. That's part of the charm: when I work on effects, you get to make things that look better than they should, but it's also weird and off-kilter, and you're not quite sure, like, is this real? And if I get people to just question reality for a moment, like, is this absurdist thing really possible? Is Warner adding Desmond into multiverses? No. But if I make a fake trailer for it that looks reasonably well done, who knows? The showrunner for the new Gremlins animated show is aware of Nick Lutsko and all of the humor, and they seem in great spirits about it, and they seem to have good fun with it. So yeah, I love the absurdity, and it brings me so much joy.

[00:22:40.270] Kent Bye: Yeah, I've definitely enjoyed seeing those clips over the last couple of years or so. So yeah, as you start to think about the Snap Spectacles that are launching here, do you have any plans for developing something? And what are your overall thoughts on where these AR devices are going? The Snap Spectacles version 5 sounds like it's going to be very much a developer device for experimenting and prototyping. So I'm just curious if there's anything that you want to try to build or experiment with, having access now to these glasses.

[00:23:08.021] Brielle Garcia: Yeah, I'm super excited about the new Spectacles. They seem to have improved in all of the ways that I needed for some of the ideas that I've wanted to build. I'm really excited to build experiences that only work with this form factor, because these are the only ones that are doing true standalone AR without any wires, any pucks, or anything. With experiences I've built in the past, like, I wore the last Spectacles roller skating, I wore them to the store, and they just look like normal sunglasses to most people. And I'm really excited to build more experiences that can get you out into the real world, in places where you couldn't wear headsets, specifically like rollerblading and things like that. I'm also excited to use some of the object tracking models, because I had an idea over the summer. I was DJing at an event, and it was a small setup that I was using off my phone with a controller. The phone worked well enough, but the screen was really small, and I was like, man, I really wish I had some good AR glasses so I could have all my information in front of me, but still see the audience and interact with the audience without that separation from a headset. And so the phone mirroring setup in Spectacles looks really promising for being able to do that. Yeah, I'm also excited to experiment with the marker tracking and things that can help you interact with tangible objects in the real world, and how we augment those, and yeah.

[00:24:45.228] Kent Bye: I know that Niantic was also announcing Scaniverse and Gaussian splats. And I know you've done some photogrammetry work and more of the world-based lenses. And I'm just curious if Gaussian splats are something that you've started to look into or get excited about, in terms of what they might enable: to take these physical objects, but have maybe a lightweight way of capturing them and integrating them into these virtual experiences.

[00:25:07.472] Brielle Garcia: Yeah, I actually put up two experiments a few months ago, because Lens Studio has supported Gaussian splats for the last few versions now. I was testing how they would look if you could wear them. And so I had a photogrammetry scan of a football helmet that I reprocessed as a Gaussian splat, and it looked fantastic, because it kept a lot of the reflection data, and it looked really high fidelity, and it looked great when you're wearing it. But the more interesting use case I found was actually a scan of Nick Lutsko's head, where I cropped out the face and kept his hair. So I had this Gaussian splat of hair that I could then put on the user, and the fidelity of it just looked really great, because Gaussian splats work really well with soft, flowy scans like plants or hair or things like that. And so I'm really excited to explore more with that and do some more scanning and see what that can do.

[00:26:07.188] Kent Bye: So the Snap Partner Summit was yesterday, and we're here at Lens Fest. And so there's always new announcements that are coming out. And I know that you also have a very tight relationship with the developer relations here, and you may actually get access to some stuff early. But I'm just curious to hear what you're personally really excited about: any other types of announcements or integrations that are coming whose potential you're excited to explore as you continue to build these types of experiences that are on the bleeding edge and provide new opportunities for users to experiment and be creative.

[00:26:37.694] Brielle Garcia: Yeah, I think I'm mostly just focused on these new Spectacles. I didn't have early access to them this time, so I'm still kind of taking everything in and processing and brainstorming about what I might be able to build with them. And I'm focused on my client work and my Camera Kit apps. So my brain is really focused and running wild right now with what I can build for Spectacles and what I might work on at the hackathon later today. Yeah, I've only gotten to spend a few minutes with my pair of Spectacles, and I'm itching to spend more time with them.

[00:27:10.863] Kent Bye: Great. And finally, what do you think the ultimate potential of spatial computing might be and what it might be able to enable?

[00:27:19.745] Brielle Garcia: I really like this concept of having spatial computing embedded in our world, but not taking us away from the people that we connect with. I love VR, I love my VR headset, but it is a very separating experience. It was honestly the demo yesterday, the council demo they had, where we got to do a co-location of 10 people in a space, and I could see what they could see, and I could see what they were interacting with, and it was honestly the best AR experience that I have had. In that moment, you didn't notice the field of view, you didn't notice the frames, you were lost in the moment of, oh, these things are actually here. And it was really fascinating to feel that part of my brain start to accept more of the augmented reality as the reality that was there. But sharing it with the other people in the space was just really, really cool.

[00:28:16.874] Kent Bye: Yeah, I had a chance to try that out, and it was definitely up there for me as well. There are some experiences from Magic Leap, with Tónandi I think being one that I really resonated with, and I did appreciate the social dynamics. I feel like in that experience it was part techno, part community ritual, but what I was really impressed with was how the technology can start to facilitate these types of emergent social dynamics and rituals. I'm very excited to see where developers start to take that in the future. And yeah, any other final thoughts, or anything else that's left unsaid that you'd like to say to the broader immersive community?

[00:28:48.058] Brielle Garcia: I would just encourage everyone to get out there, download Lens Studio, experiment, and have fun. The tools are better than they've ever been. This industry is still new, and the best ideas are still yet to be discovered. And yeah, just have fun.

[00:29:05.754] Kent Bye: Awesome. Well, Brielle, thanks so much for joining me here on the podcast to share a little bit more about your journey of working with Snap and these lenses and this whole ecosystem of AR over the last number of years. I've very much enjoyed watching the different clips you've posted, all these absurdist experiments and avant-garde stuff. And also, good luck with your VTuber app that you're going to be launching here next week. You've got a lot on your plate, and I'm very much looking forward to seeing where you're able to take it all here in the future. So thank you.

[00:29:29.467] Brielle Garcia: Yeah, thank you very much.

[00:29:31.360] Kent Bye: Thanks again for listening to this episode of the Voices of VR podcast, which is part of my larger series doing a deep dive into both the announcements around Snap Spectacles and the AR ecosystem at Snap. What I do here at the Voices of VR podcast is fairly unique. I really like to lean into oral history, to capture the stories of people who are on the front lines, but also to have my own experiences and to try to give a holistic picture of what's happening, not only with the company, but also the ecosystem of developers that they've been able to cultivate. And so for me, I find the most valuable information comes from the independent artists and creators and developers who are at the front lines of pushing the edges of what this technology can do, and from listening to what their dreams and aspirations are for where this technology is going to go in the future. So I feel like that's a little bit different approach than what anybody else is doing, but it also takes a lot of time and energy to go to these places, to do these interviews, and to put it all together in this type of production. So if you find value in that, then please do consider becoming a member of the Patreon. Just five dollars a month will go a long way toward helping me sustain this type of coverage, and if you could give more, 10 or 20 or 50 a month, that has also been a huge help for allowing me to continue to bring this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.
