I did an interview with prolific Horizon world builder Paige Dansinger at Meta Connect about the latest Meta Horizon news including the new Meta Horizon Studio as well as Meta Horizon Engine. She also was wearing some custom, hand-made pants that were inspired by virtual pants that she designed for her MetaversePaige avatar. See more context in the rough transcript below.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Rough Transcript
[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the structures and forms of immersive storytelling and the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So continuing my coverage of Meta Connect 2025, today's episode is with Paige Dansinger, who's been one of the very early alpha testers of Horizon Worlds. And so she's been creating and making worlds within Horizon Worlds. She says she's made over 900 worlds and published over 200. So she's someone who's very familiar with what's happening with all the different news from Meta Horizon. And so we are talking about some of the biggest news coming out of Meta Connect, which is the Meta Horizon Studio, as well as the new game engine, although we don't have any specific information as to what the technology underneath it is. But essentially there are a lot of generative AI features being promoted. What Meta is saying is that this kind of intersection between AI and XR is using generative AI to be able to create new worlds within Horizon Worlds. So they've created a new engine. I'm not sure what they're doing, but they're saying that their capacity is expanding up to four times, to have over 100 people, maybe up to 120 or 150, I don't know what the max is going to be, but over 100 users within Horizon Worlds, which has really been a bottleneck. So they're getting away from the Unity runtime and doing their own custom engine. They've also been focusing a lot on mobile apps and gaming. In fact, they had a whole poster that linked to a number of the different mobile Horizon games, with Super Strike Open Beta, Battle Ken, Shovel Up, Smash Golf Early Access, and Bumble Dudes. And so I'll include some links in the show notes for you to go check out some of these. So essentially Meta is wanting to create a Roblox clone. For the best that I can understand what Meta is doing, their strategy or why they're doing it, is that they're saying Roblox is really eating their lunch with a certain youth demographic. And they've wanted to translate what they've done with Meta Horizon Worlds and expand out into more and more people using and playing these different games. And so actually the Super Strike Open Beta, I tried to play it on my VR headset, and it was only for mobile. So they're creating mobile-only games that they're promoting. And so there's a broader strategy here for Meta, seeing that in order to grow the larger ecosystem, they have to go more towards mobile gaming and creating these opportunities for people to start to play some of these mobile games on their phones. Which is something that Matt Hargett, who's someone who I talked to, he actually worked at Roblox and had a lot of really specific thoughts on that strategy and what considerations you have to think about in terms of people using old, hand-me-down phones for that specific demographic. So anyway, it's sort of an interesting pivot that Meta is doing and really focusing on that. But also at the very beginning of their developer keynote, which, when I talked to Paige, we were about to go watch the developer keynote and dive more into what they were actually announcing, they start off right at the very beginning with promoting their generative AI features within the Meta Horizon Studio, which is going to be a PC way for creating these different mobile apps.
So I had a chance to talk to Paige about that, her use and experience of those tools, but also, as a part of the different programs she's a part of, how she's able to create her own custom avatars and her own style. And she had actually hand-made the pants that she was wearing, inspired by the virtual fashion design that she was creating. So if you look at the photo of this, you can see her both in physical reality and as her virtual avatar, in ways that she was starting to create virtual fashions that were also physical. So that kind of blend between the physical and the digital is something else that we talk about here in this conversation. So we're covering all that and more on today's episode of the Voices of VR podcast. So this interview with Paige happened on Thursday, September 18th, 2025 at the Meta Connect conference at Meta's headquarters in Menlo Park, California. So with that, let's go ahead and dive right in.
[00:03:56.846] Paige Dansinger: My name is Paige Dansinger, also known as MetaversePaige, and I'm the founding director of Better World Museum, Horizon Art Museum, Women in Horizon, as well as I'm on the leadership committee of XR Women, where I'm honored to be the director of the XR Women Museum.
[00:04:16.013] Kent Bye: Great. Maybe you could give a bit more context as to your background and your journey into this space.
[00:04:20.296] Paige Dansinger: Absolutely. My background is as a museum professional. I worked at the Minneapolis Institute of Arts for many years in the early 2000s. And in 2012, I opened my own museum called Better World Museum. It was a brick-and-mortar space occupying 8,000 square feet in downtown Minneapolis, where we created an indoor edible garden using IoT, as well as our signature project called the VR Garden.
[00:04:52.491] Kent Bye: Nice, and so we're here at Meta Connect, and I know we've had a previous conversation where we talked a lot about your journey into working with Horizon Worlds, and I know you've been a very active creator in that community. And so maybe just give me a little bit of an update from your perspective on some of the latest news that we heard around Meta Horizon Worlds, around the new studio program with generative AI features, but also the new engine. And so, yeah, just curious to hear some of your thoughts as to what's happening in Horizon Worlds.
[00:05:18.422] Paige Dansinger: Absolutely. Well, as you said, I've been in Horizon for a long time since 2019 as a pre-alpha creator. And so I've had the incredible opportunity to see and experience and be part of the evolution of this amazing platform. I'm so excited about the new Horizon Studio. First of all, I have to tell you, I love building worlds. If one clicks over my head in VR, they can see that I've built over 900 worlds. and published close to 200 of those worlds. And in that time, I've evolved from being an in-headset VR creator to using the desktop and the Gen AI features, which I absolutely am in love with. And now I'm excited to learn Horizon Studio because, for one, I think it's going to really elevate the playing field of what one can create in Horizon, as well as maximize the amount of attendees in a world. But also, the Gen AI will improve dramatically, so while the platform has higher creation features, the bar will also be lowered for new creators using the Gen AI tools. And so it levels the playing field, still keeping the ethos of world building as a democratic process.
[00:06:50.441] Kent Bye: Now, I heard rumblings that they were potentially deprecating some of the in-world collaborative worldbuilding. Is that true? Are they moving towards a path of only creating through the studio, or is it still possible to do collaborative worldbuilding in Horizon Worlds?
[00:07:04.052] Paige Dansinger: At this time, everything is possible. You're only limited by your imagination. I love building in VR collaboratively, but one can also build, in a sense, collaboratively via the desktop editor.
[00:07:18.755] Kent Bye: So is it still possible to build collaboratively in VR?
[00:07:22.298] Paige Dansinger: At this time, yes. And I do not know what the future holds, but I hope that all of these pathways are forever offered for all of our people. Because in-world building really allows artists like myself to have an intuitive and creative creating process. whereas another professional game-developing studio may really, really be thrilled to use that new Horizon Studio. I think there needs to be something for everybody.
[00:07:55.409] Kent Bye: If you create something in the studio, can you go in and edit stuff collaboratively, or are those pathways separate?
[00:08:03.294] Paige Dansinger: I've not yet used the new studio, however, at this time, when one uses the desktop editor, one is also able to pop into VR with zero friction.
[00:08:17.090] Kent Bye: Okay. I know that there's been a number of things with Horizon Worlds, with the Academy, and I think even before Meta Connect there was a gathering of a lot of the Horizon World builders. Maybe just give a sense of what type of activities were happening ahead of Meta Connect this year that were bringing together the Meta Horizon community.
[00:08:34.804] Paige Dansinger: Absolutely. I understand that there are several events happening. There is a Meta Start gathering prior to Connect, as well as the World Creator Academy, which I had the opportunity to participate in earlier this summer in New York City. And I was part of the Creator Summit, which was an incredible overview for creators of the possibilities of our tools and creating worlds for the future. So we learned a lot about some of the current initiatives to help us maximize our impact within creating our worlds, but they gave us a really interesting forecast for the future, and most of that talk surrounded the Creator Studio for Horizon.
[00:09:30.316] Kent Bye: Yeah, I saw on J. Dunn's social media that he and Joey Rain had won a Biggest Impact Award. It seems like they were giving awards to different creators for different contributions. Maybe just talk a little bit about some of the awards that were being handed out.
[00:09:44.126] Paige Dansinger: Oh, it was really exciting. I believe that everyone deserves an award, and big props to J. Dunn and Joy Rain. They are really bringing the metaverse to the people, so I applaud them in every way. I love our community, and if it was up to me, I would give everyone an award. It was very exciting. There were about seven awards shared with Horizon builders and community members, and I hope to just see that increase more and more.
[00:10:16.721] Kent Bye: Nice. And so there was mention of this new engine that's also going to be a part of running a lot of Meta Horizon Worlds and that is going to increase the instance capacity to over 100 people in one instance. What was the instance cap before? And have you been able to see some of this increased instance capacity?
[00:10:36.430] Paige Dansinger: I am so excited. Creators and event hosts in Horizon have been really waiting for an increase in capacity. Up till now, it has been 32, and I have already scheduled an event for next week for the Women in Horizon group to test out our capacity pushing. So there'll be a mobile mixer after Meta Connect, and I would love to see as many people there as possible. Let's try to break it. Let's try to push it for 150 and see what happens.
[00:11:13.823] Kent Bye: Is that live now to be able to have more people in instances?
[00:11:16.445] Paige Dansinger: I don't know, but hopefully so.
[00:11:19.308] Kent Bye: Nice. And so what do you know about the underlying engine that is new and different? Because they showed the demo of the new immersive home. Have you seen the new immersive home?
[00:11:32.498] Paige Dansinger: I have seen the demo of the new immersive home. I believe that the new engine is built in-house through Meta with the tools that will sustain Horizon building for the future. At this time, the current runtime Unity-based Horizon platform tools in the desktop editor will also run concurrently with the studio so that there does not need to be a grand changeover and one could decide to remake all of their worlds, but one may not have to for a while.
[00:12:16.442] Kent Bye: Okay, so we're out in the lobby and there's a developer keynote that's about to start. Perhaps they'll be sharing more information and context as to what they're going to be announcing, with more details. Do you have a sense of whether they're using something like React or React Native or some of these other tools?
[00:12:30.828] Paige Dansinger: I really have no idea. I wish I knew more about developer tools. Perhaps my future will include leveling up.
[00:12:43.216] Kent Bye: So have you had a chance to try out and play with some of the generative AI features for building worlds in Meta Horizon? Maybe just talk a bit about what you've been doing, since you are already quite prolific, maybe one of the most prolific world builders within the context of Meta Horizon Worlds.
[00:12:56.912] Paige Dansinger: I'm absolutely in love with the Gen AI tools, and I have been using them mainly to create Coral Collective, as well as several adjoining worlds, including Coral Quest Mobile, Deep Sea, and other ocean-based games. This year I've been taking a special focus to highlight Sustainable Development Goal 14, Life Below Water, and I've led a game design team to create several mobile games for the Meta Horizon competition. They've all been ocean-themed. And currently I'm really excited because, well, not only am I wearing an outfit head to toe that is a digital outfit one can purchase in the metaverse worlds inside of the Bubble Shop, but the impetus for this is to raise money and awareness to donate to ocean conservation. The threats to our sea life threaten the entire world, like the degradation of coral reefs is threatening our entire sustainable way of life on Earth. And I really care about uplifting and creating partnerships surrounding ocean conservation. And I have exciting news. Our Coral Collective is filled with AI-generated assets, audio, and NPCs, and the NPCs are now being connected to a large language model so that each one of our NPCs in the world represents a different coral and will be able to speak to the facts surrounding that individual coral. We are sending the entire Coral Collective to the moon. And I'm not talking about the metaverse moon, I'm talking about the moon right above our Earth. I'm so excited about this. Through BitBasil, our incredible partner, we have the opportunity to burn an image of our world onto a nickel disk. Our world's image, along with other artists' work featuring life below water, will be burned onto this disk, put onto a NASA payload, and go right up to the south pole of the moon, where it will be there for eternity.
[00:15:33.129] Kent Bye: Wow. OK. Nice. And so I want to go back to your outfit here, because you were talking about this idea of mixing the physical with the digital. And so you were mentioning to me that you were designing your avatar clothes within VR and then taking that design. And you said that before you came to Meta Connect, you actually made these pants that you're wearing, inspired by what you created in VR. I didn't realize that it was even possible to create your own clothes and designs within Meta Horizon. Is that something that's new?
[00:16:02.884] Paige Dansinger: It is not brand new, but it's been only available to the MetaHorizon Creator Program partners, which I have the honor of being a part of this smaller group. I believe that the clothing will be available for everybody very soon. And it's incredibly fun. One is able to create textures and add them to skin an already existing clothing mesh. And those clothing meshes are really sophisticated because they're able to adjust to any size or shape avatar at this time. And I'm really inspired by creating fashion for avatars. I believe that the way we connect and represent ourselves, whether it's non-human, fantastical, or humanoid ways of expressing our identity within virtual worlds is so important to living our authentic selves and whatever that looks like. So I created this outfit inspired by water and the ripples and light on water waves, as well as it has a bit of a mermaid pattern on the texture. And I just feel like, hmm, maybe that's a part of me that I can feel out in the outside world, not just in my avatar. So embodying my identity in and out of VR is really important to me.
[00:17:41.469] Kent Bye: You mentioned the non-player characters having large language models wired up to them. I know that going to Augmented World Expo and seeing some of the different demos, they had the ability to have something like inworld.ai, which would be like a front end where a narrative designer would be able to put bounds on what the facts were and steer it in a specific direction. Sometimes without that front end, having just a raw large language model can start to hallucinate. It can be very chatty. It can be off topic. And so what's the interface or the program that allows you to contain the knowledge or the facts, or to control some of those NPCs that are in Horizon Worlds?
[00:18:21.297] Paige Dansinger: That's such a great question Kent, thank you. I'm excited to learn more about the LLM and what actually activates them and where the information is actually sourced from, but I would love to circle around once I've experimented more and be able to share my insights with you.
[00:18:44.532] Kent Bye: So that's not something that's already been launched?
[00:18:47.341] Paige Dansinger: It has just been launched about a week ago. And honestly, I've been making this clothing. So I put the clothing first.
[00:18:56.425] Kent Bye: It's a good choice, I think. You'll have plenty of time to explore all those potentials. I know they have had NPCs in different worlds. I know the Bobber's Fisher world, where for a while there was a program they had where, if you go into these worlds, you can get points. And then if you get enough points, you get avatars. And I used that as an excuse to do a lot of exploration and playing, leveling up in different ways, but in the end getting a range of different avatars just to kind of explore what's happening in Horizon Worlds. But I did notice that there are some NPCs that are already out in worlds that have been there for a while, and more of the Meta-generated worlds, maybe as an early access. So just curious to hear some of your own experiences with the NPCs that are already there in Horizon Worlds.
[00:19:38.399] Paige Dansinger: Yes. Well, up until the new large language model activation a week ago, there have been NPCs that are available. There are already custom ones via the Meta assets, where you can put out a toaster or different kinds of robot-looking NPCs, but there are also NPCs that look just like avatars, and you're able to customize the way they look, what they're wearing, and also animate them with some creative ingenuity. I have used the avatars in several of my worlds. The NPCs that have been in Coral Collective, until I linked them with the LLM, for now have recorded dialogue that I used ElevenLabs to create audio voices and personas for. And so it is scripted at this point. But give me a week and they'll be able to converse freely about ocean conservation, different corals, and the importance of what you can do. So they'll help to prompt you to take action creatively, poetically, or within your community to participate. And anyone is actually able to participate in the Coral Collective now. One can create a 3D asset or a mobile game or a soundscape. Anything one creates, especially using the Gen AI, you can send it over to me, Metaverse Paige, and I'll add it to the Coral Collective. I want everyone to be able to say, hey, mom, dad, look, I have art on the moon, and to be able to share this with everyone.
[00:21:41.192] Kent Bye: Nice. What are some of the other features that are being announced for Meta Horizon Worlds that you're really excited to get launched and to be a part of the community?
[00:21:50.455] Paige Dansinger: The exciting parts that are being launched are really the initiatives of the creators. So each creator launches and creates something new to add into the community. And the platform's biggest launch itself is really the Horizon Studio, as well as maximizing the number of people in our worlds. And with the new studio, there'll be more texture opportunities for environmental design. For instance, there is a new feature that I'm really excited about that creates landscapes using the Gen AI. The grass and the trees are all procedurally generated, and they're beautiful. So that's something that I'm personally really looking forward to using. I'm able to use that already now on the desktop editor, but it will continue to improve and improve over the next months and years. So that's something that I think will be exciting. And with that environmental generator, let's say you would like a grassy hill with a museum. You can select how many trees you'll have, but you'll also be able to designate bare spots for putting some outdoor sculpture. So you're able to hone in on your world and personalize some of this Gen AI creation. Now, something else that I cannot do without is the TypeScript assistant. Now, I am not a great scripter. I put my heart into it. I try my best. I'm always happy to learn more. But some things are way more complicated to script than I would be able to on my own. I'm, you know, a crazy artist, so I might be working at 2 a.m. and not want to reach out and wake a member of my community to talk through a challenge. This way, the Gen AI assistant is always on hand with me to walk through how to script the next effect or experience that I'd like to implement. And I find it really exciting. It's like vibe coding, where one would add a prompt and work with it back and forth as partners to achieve the results that are desired.
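For readers unfamiliar with what that scripting looks like: Horizon's desktop editor exposes a TypeScript component API, and the Gen AI assistant Paige describes works against scripts in that shape. Below is a minimal sketch of the kind of component such a prompt might produce, for example "make this sculpture slowly rotate." The module path, PropTypes, and World.onUpdate event here are assumptions based on Meta's published Horizon TypeScript API and may not match the current SDK exactly.

import * as hz from 'horizon/core';

// Hypothetical example: a slow-spin behavior for an outdoor sculpture.
// API names (Component, propsDefinition, PropTypes, World.onUpdate,
// Quaternion.fromEuler) are assumed from Meta's published TypeScript docs.
class SlowSpin extends hz.Component<typeof SlowSpin> {
  // Exposed in the editor's properties panel; degrees of rotation per second.
  static propsDefinition = {
    degreesPerSecond: { type: hz.PropTypes.Number, default: 15 },
  };

  private angle = 0;

  start() {
    // Nudge the attached entity's yaw a little every frame.
    this.connectLocalBroadcastEvent(hz.World.onUpdate, (data: { deltaTime: number }) => {
      this.angle = (this.angle + this.props.degreesPerSecond * data.deltaTime) % 360;
      this.entity.rotation.set(hz.Quaternion.fromEuler(new hz.Vec3(0, this.angle, 0)));
    });
  }
}

hz.Component.register(SlowSpin);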
[00:24:34.181] Kent Bye: Now, I know that any time that you're building an immersive experience, you want to be able to preview it. But if you're on a PC in a 2D interface, then is there a way to, say, hook up the VR headset into this Horizon Studio program to maybe jump into VR and preview it? Because I imagine that reducing the friction of iterating through building something, seeing how it looks, and then tweaking something could be a pain point if you have to upload it and can't preview it, or don't have some sort of OpenXR extension to live jump into VR if you have a PC. So I'm just curious if that's possible or what that workflow is going to be like of rapidly iterating with this tool.
[00:25:14.335] Paige Dansinger: Yes, currently through the desktop editor it's really simple to pop into VR. There's a little tab right up on top on the right side of your screen that, when clicked, immediately switches you into VR. All you need to do is pop your headset on, and it's as easy as that. You're also able to select preview mode on your phone or on PC, and it's really simple. I understand the new Horizon Studio will not have the VR preview, and that's okay. One can evolve and just pop on their headset. They're really pushing towards mobile development at this time, and I hope that, with the voices of the community communicating how important VR creating is, they will always keep VR embedded within the creator tools. I was so bold yesterday or the day before during the Creator Summit and told them that I love building worlds. I don't want to be tied to my VR headset or a computer. Please, maybe consider creating a mobile app so that I'm able to build worlds wherever I am. So, we'll see. Maybe if they listen to all of our great ideas, which they often do, then new things can be implemented. It's always worth hoping, asking, and sharing why your ideas are important and create betterment for all.
[00:26:55.580] Kent Bye: Nice. I noticed here at Meta Connect there was actually a little station that had a number of different Meta Horizon worlds that were available for mobile gaming. They had Super Strike Open Beta, Battle Ken, Shovel Up, Smash Golf Early Access, and Bumble Dudes. They had QR codes for each of these worlds that you could take a snapshot of, and then you could go in and play on that. I'm just curious if you've had a chance to play through any of the mobile gaming interfaces, given Meta's push towards wanting to have more of a phone-based mobile experience for Meta Horizon.
[00:27:31.723] Paige Dansinger: Sure. I have played several mobile games, but none that you just listed. Those may have just been launched for this event. And I have really loved Kaiju City Showdown, Merge Donuts, and Merge Kitties, and several other games. My own Coral Quest mobile game is really exciting as well, and I loved developing it, and I've been able to share it here on my phone with many people. QR codes can sometimes be funny. I have put a QR code to the Bubble Shop right on this garment that I'm wearing, on my pants, and people have been able to scan it and purchase this outfit.
[00:28:18.409] Kent Bye: Nice. And so just one quick question on the mobile. Have you noticed a difference of people being in VR versus not being in VR? Because there's kind of a hierarchy of embodiment of some sorts. And just curious what kind of the social dynamics are emerging. We've seen this before on the VR chat platform when you have a PC VR versus Quest and the sort of different capabilities between the two. But just curious to hear some of what you've noticed in terms of the social experience of people in VR versus not in VR.
[00:28:45.965] Paige Dansinger: Sure. At first it was awkward. It was a little uncanny valley. You would see people kind of hunched over and looking down the wrong way, unsure of how to use their avatar in a small mobile space. And until one kind of feels a sense of embodiment through their mobile, it can be a little awkward. Oh, there are some differences. For instance, the mobile creators are able to select emojis that are really fun. They're able to do really cool dances and great gestures, and it's kind of adorable. And so when I'm in VR, where I have other capabilities like jumping and using my arms and legs in more expansive ways, sometimes I watch those people on mobile dancing and think, hey, hey now, right? So each has its pros, cons, and perks.
[00:29:48.588] Kent Bye: Great, and finally, what do you think the ultimate potential of virtual reality and all these immersive technologies might be and what they might be able to enable?
[00:29:57.853] Paige Dansinger: Kent, thank you for asking this question. It's the reason why I wake up every day and commit myself not only to VR, but to the way that Horizon is evolving. Because I believe that when one builds the worlds that you envision, dream, and deserve, you'll create the world that you deserve, dream, and envision outside of VR. And today, more than ever, it feels as though we need to create a beautiful vision of cooperation, empathy, and connection, and VR and Horizon and all creator tools, platforms, and ways to connect with each other via Meta Quest headsets could not be more necessary at this time, especially for young people. Young people deserve a beautiful, better world. And to empower young creators to build those worlds now gives me faith that in the future, our world can be run by leaders who are able to envision what collaboration can look like, what creating a vision means, and what the possibilities really are. And we're only limited by our imaginations. So if I were to ask for anything in the world, let's use our imaginations to create what a better world, by and for all, really looks like.
[00:31:38.721] Kent Bye: Nice. And is there anything else that's left unsaid that you'd like to say to the broader immersive community?
[00:31:42.967] Paige Dansinger: Well, I really appreciate you, Kent. Thank you for all the work that you have been doing to highlight creators, innovators, industry leaders, and the people who really believe. It's as if you've created like a rainbow connection, right? Going back to Kermit the Frog, who brought all the believers together, and when they struggled, they were there to share their stories, to highlight the highs, the lows, the wins, and the ways forward. And our community is golden, so thank you. Thanks for being part of it. I also wish to thank Mark Zuckerberg and Connect and all of the people here who are behind the scenes and in front of the scenes creating this event. I'd also like to thank the members of Women in Horizon and XR Women for believing that their voices are important, integral, and necessary in this space.
[00:32:45.208] Kent Bye: Awesome. Well, Paige, it was a real pleasure to get a chance to catch up with you again to hear all the things that you're working on and excited about. And it sounds like this Horizon Studio program is going to be a bit of a game changer in terms of where it's going to take the community here in the future. Rec Room's AI tools were cited as an example of something that wasn't launched properly and created a fracture in that community. But hopefully it won't be too disruptive for the existing communities that are already there, and that they'll be empowered to create even more incredible creations. Yeah, I know there is a tendency for creators and AI to be in opposition to each other. So I'm hoping that that isn't necessarily the case where everything ends up being just generated by AI, but that there's a fingerprint of the human connection and ways that people can still find a way to tell their stories through the environment or through this type of world building. So anyway, I just really appreciate your perspective and where things are at now and where they might be headed here in the future. So thanks again for joining me here on the podcast.
[00:33:39.193] Paige Dansinger: Thank you so much, Kent. And have a wonderful Meta Connect. Awesome.
[00:33:43.354] Kent Bye: Thanks. Thanks again for listening to this episode of the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.