#1524: HTC Viverse Leverages Open Web Tech in Aim to Become ‘YouTube of 3D Content’

HTC has launched Viverse, a platform that aims to become the "YouTube of 3D content." It hosts 3D content and worlds built on PlayCanvas as well as other open technologies like WebXR, WebGPU, and VRM. It is sort of a mix of VRChat-style social worlds and web-based, crypto-adjacent metaverse worlds, but with some enterprise embedding features as well. It's exciting to see this shift toward building out the open and interoperable metaverse, even if it comes in the context of a hybrid walled garden built on open web technology stacks. It is also a hybrid in another way: it is mostly consumer-facing, but it also has enterprise use cases like private embedding of 3D content.

I interviewed HTC's Andranik Aslanyan about the new VIVERSE platform, how they recruited over a hundred XR and WebXR developers to seed the content, and how VIVERSE fits into their overall strategy. I get some clarifications on the non-exclusive Android XR IP deal with Google, and on where HTC will be heading now that we're coming up on 10 years since the HTC Vive was announced on March 1, 2015.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. It's a podcast that looks at the structures and forms of immersive storytelling and the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So back on March 1st, 2015, HTC and Valve surprised and shocked the world with the announcement of the HTC Vive. And just a few days later, it was at GDC giving demos to a lot of game developers for the first time. Now, 10 years on, HTC has continued to be a big player within the XR industry. They've really carved out a space in the enterprise market. They've continued to develop and support headsets that have Lighthouse tracking, and they have their own standalone headsets. But they also just made a non-exclusive licensing agreement with Google around XR, and Google did a whole talent acquisition for $250 million. So there's a bit of a question of what's happening with HTC. Now they're announcing this brand new thing called Viverse, which is kind of a 3D social network platform. They're aiming to be like the YouTube of 3D content, kind of like this intersection between what VRChat is doing with independent creators creating these different immersive worlds, but a little bit more bite-sized, because it's on the web and it's not as extensive. It's all using WebXR technology, so it's using this whole open standards stack. But they've also got use cases for enterprise to build this out for integrating more and more 3D components into the broader web, where you have these public and private interface conceits where you could embed some of these 3D worlds on websites, kind of like how you have private YouTube videos. So it's kind of a mishmash of a lot of different things that are all thrown together. It's part web, it's part immersive. It's got elements that feel like what the crypto-based immersive metaverse folks have been doing, where there are more web-based 3D worlds happening, but there's also much more integration with WebXR through PlayCanvas, which is one of the JavaScript engines out there for integrating 3D content. So you can actually pull up Viverse on the Quest browser or any other immersive browser and start to dive in and have more of an immersive experience of some of these bite-sized worlds. They've been collaborating with a number of different creators from lots of different ecosystems, from VRChat to refugees from Mozilla Hubs. They've got folks who have been in the WebXR industry and different award winners from the Polys Awards. And so they've gathered about 100 different creators who have been fleshing out some of the initial content that's on Viverse. It's really quite interesting. You should go check it out and then pull it up in a browser and jump in. I found it optimized for 2D and mobile and PC, and then there are some of the more immersive things. It's not like going into VRChat and just having an immersive experience going to worlds. You're dealing with a lot of the frictions of WebXR and dealing with these 2D interfaces. But this is the sign of a completely new direction for where HTC is going. And it's something like what Meta has been building with Horizon Worlds. If Meta had decided to use an open source technology stack to do that, then I would be a lot more excited about where Meta is going with Horizon Worlds, but that's much more of a closed walled garden.
So you see this kind of battle that is emerging with folks that have been all in with Unity, Unreal Engine, and these proprietary walled garden models. And then HTC is saying, nope, we're just going to use all these open source technologies, pull in Gaussian splats, pull in all these VRM, open source, metaverse-y type of things. And so this is, in a lot of ways, the most viable metaverse platform that I've seen so far, one that is integrating all these different things in a way that is really cohesive. And you should actually go check it out because it's really quite exciting. So I had a chance to sit down with Andranik Aslanyan, the head of growth at HTC, to talk about how Viverse all came about. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Andranik happened on Monday, February 24th, 2025. So with that, let's go ahead and dive right in.

[00:03:53.692] Andranik Aslanyan: So my name is Andranik Aslanyan. I'm the head of growth here at HTC for Viverse. And Viverse is essentially the brand at HTC that's all about interconnected experiences on the internet. So one of the things that we're launching today is Viverse Worlds. And it's about the delivery of 3D content for creators, for businesses, for everyday gamers. Really just about making content delivery for interactive experiences just as easy as sharing video on YouTube.

[00:04:33.281] Kent Bye: Okay, awesome. And maybe you could give a bit more context as to your background and your journey into the space.

[00:04:39.027] Andranik Aslanyan: Sure. So I've been a game developer... for 16 years. And I started with working in indie studios. Then I had my own indie studio for 10 years. Then I ended up launching a product called Obscure, which was all about Twitch streamers and VTubers. And eventually I ended up here at HTC for the last six months running the growth team and really just making this product as practical and as useful to people on the internet as possible.

[00:05:16.564] Kent Bye: Okay, well, I know that HTC has a long history in terms of Vive and different iterations of hardware, and then there's been Viveport. Can you give a bit more context for whether this is a continuation of some of these other ventures or a completely new venture, just to help set a larger context for how this is coming out of HTC, and to help me get a sense of the landscape that we're talking about here?

[00:05:44.278] Andranik Aslanyan: Yeah, for sure. So HTC has been a leader in the VR space for quite some time. Actually, the OpenXR standard, we're a big contributor to it and a sponsor. And Viveport is actually part of the Viverse brand. It's just a different product compared to Worlds and Create. And really, Viveport is kind of a game app store, essentially, for VR content. And when we launched Viverse Create and now Worlds, this is more about generally being able to deliver 3D experiences, whether it's desktop, mobile, or VR, across the web for everyone. And really, I think the most useful piece of that is that you don't have to install anything. And when you build an experience, it works across all three platforms out of the gate. So you don't have to build a VR-only experience, or a mobile, or a desktop, or three different experiences; it's integrated and cross-platform. So I think that just makes it, for creators, like you build one piece of content and you reach basically the whole internet, no matter what device someone is on. And yeah.

[00:07:06.144] Kent Bye: Okay. And I know there was some recent news. I just want to see if you have any comment, because I know there was an acquisition of some of the IP, or it's a little unclear exactly what happened with Android XR and Google and HTC, and what is still remaining of HTC. So I'm just trying to get a sense of the context of the hardware division and software, and if it's split along those lines, or if they're still existing. Because I know there are trackers that HTC has been building for a long time. So can you give me a sense of what is different now that there was that acquisition? And yeah, I'd just love to hear some additional context from you.

[00:07:44.921] Andranik Aslanyan: So what was done was essentially a non-exclusive licensing deal with Google for our XR IP. So HTC will continue to develop XR hardware and software, and that includes the Vive product line. If anything, it just opens the door to more collaboration with Google on XR stuff. And really, what it was was a play to strengthen that relationship and help accelerate the XR space, and not so much an acquisition. We're still continuing to produce our hardware and make new stuff in the future.

[00:08:20.506] Kent Bye: Okay, that's helpful. And were there any employees that got transferred over as well, or not?

[00:08:25.528] Andranik Aslanyan: Yeah, I think a small portion to basically support the integration efforts were moved to Google.

[00:08:33.051] Kent Bye: Okay, all right. Thanks. That's helpful just to hear a little bit more context, because that's been some of the biggest news, and I think folks who are listeners appreciate any additional context on that. Okay, so let's go back to Viverse, which is what's being announced here on February 26th. Can you give me a sense of the target audiences and some of the use cases? Because what I'm seeing here is a lot of WebXR type of web interfaces, and you just walked me briefly through some of the different features, and you can upload models and have embeds on third-party sites. And so it's in some ways like you're having these 3D objects that are being hosted through Viverse but are spread out throughout the entirety of the internet, kind of this idea of: what if there was the YouTube of 3D content? And so you're encouraging this type of embed process across the web. Viveport, which is a part of the Viverse brand, was sort of like this curation of more app-based experiences on a kind of subscription model where you could explore a lot of these different worlds. But this seems to, in some ways, be trying to build out different core components of what we would think of as the metaverse, having these interconnected worlds, or at least the types of software and services to promote the infrastructure to have more and more 3D content across the web. So what was the catalyst, or what were some of the initial use cases, that really had HTC decide that you wanted to develop this and launch it as a product?

[00:10:01.451] Andranik Aslanyan: Yeah, for sure. So interoperability is definitely a core piece of it, which is what I think makes it seem like a metaverse system, because you make an avatar, and then your avatar exists in all of these different experiences, regardless of the experience type. But really, what we tried to do was actually build something that's super practical for use. We're providing creators with all of the difficult lift, like the distribution of the actual content and infrastructure, the multiplayer components, the kind of social pieces, and really it just allows the creator to actually focus on the content itself: the world, whatever interactions happen in those worlds. And I think what you'll see on there is some people build entire video games where there's an inventory, and you go and collect stuff, and there's a whole storyline. And other people do simple product visualizations, so a suitcase that you can open and close and embed onto your Shopify. And there's something in the middle, where people could use Luma Labs or Polycam to create Gaussian splats of real-world spaces, upload them to Viverse, and then share the link on their socials and have everyone go there and experience the same kinds of spaces. So there's this huge variety. And that variety is very akin to YouTube. You go to YouTube, you can see music videos, you can watch finance videos. I mean, there's everything on there. And we kind of see Viverse the same way. It's not for any particular person. It's really for the audience of the maker. And that means that we have tons of different genres of content. And the nice thing is, when you go to experience something, you don't have to make a huge commitment. There's no installation, download, whatever, paywall. And in that sense, it kind of makes 3D content a lot easier to consume.
Because everything else out there is either a really unified system where all the experiences are the same and it's for a particular audience, or it's like Steam, where it's an install base and you buy each piece of content, and all of those are sold individually, and they're really, really large and only work on certain devices. And I think this was a very lofty goal. In order to make it happen, we had to build proprietary technology. For instance, we have this thing called polygon streaming. When you go to load an experience, it essentially calculates where you're looking, and similar to video streaming, we stream the meshes at the density needed for the distance from you. Or you can make the comparison to, let's say, Nanite in Unreal Engine, in that it's like a mix of streaming technologies. So instead of downloading an entire package of a full game or world, you'll just get what you need for the view that you're at in the moment. And then you can have really high-density spaces work on an iPhone, for instance. Even if the space has 22 million polygons or something, you're only rendering what's needed right near you.
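
The view-dependent streaming described here can be sketched as a distance-based level-of-detail ladder, much like adaptive-bitrate video. This is a minimal illustration of the general idea, not VIVERSE's actual polygon-streaming implementation; the chunk shape, distances, and triangle budgets are invented for the example.

```javascript
// Sketch of view-dependent mesh streaming: pick a level of detail per
// scene chunk based on its distance from the camera, so only nearby
// geometry streams at full density (analogous to video bitrate ladders).
const LOD_LEVELS = [
  { maxDistance: 10, triangleBudget: 100000 },   // near: full detail
  { maxDistance: 50, triangleBudget: 10000 },    // mid: reduced detail
  { maxDistance: Infinity, triangleBudget: 500 } // far: coarse proxy
];

function lodForDistance(distance) {
  // Levels are ordered near-to-far, so the first match is the right one.
  return LOD_LEVELS.find(level => distance <= level.maxDistance);
}

function planStreaming(chunks, camera) {
  // chunks: [{ id, center: [x, y, z] }], camera: { position: [x, y, z] }
  return chunks.map(chunk => {
    const d = Math.hypot(
      chunk.center[0] - camera.position[0],
      chunk.center[1] - camera.position[1],
      chunk.center[2] - camera.position[2]
    );
    return { id: chunk.id, triangleBudget: lodForDistance(d).triangleBudget };
  });
}
```

A real system would re-run this plan as the camera moves and fetch only the mesh data whose budget increased, which is what lets a 22-million-polygon space load progressively on a phone.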

[00:13:24.690] Kent Bye: One of the interesting things around HTC, from my perspective, is that HTC has done a great job of going down both tracks of the enterprise market as well as the consumer-facing market, whereas Meta has really focused mostly on the consumer-facing side and not so much the enterprise side. In a lot of ways, they left the market open for HTC to come in and really be a dominant player in so many different location-based entertainment contexts, or even just enterprise applications that had certain requirements. So as I look at this Viverse project, the thing that is striking to me is that it also seems like a bit of a hybrid approach, where you do have a consumer-facing side, but you also have enterprise use cases. And so I'd love to hear how you see that breakdown in the beginning, if you feel like there are going to be more B2B types of applications, or if you see it more as a consumer-facing application that's going to be a big driver. I'd love to hear how you split those two different demographics for something like Viverse.

[00:14:21.821] Andranik Aslanyan: I think what will happen... So Viverse Worlds supports both publicly curated stuff and then unlisted, similar to how YouTube does your videos. You can upload videos and then keep them unlisted but still share them. I think what will happen is that most of the consumer-facing stuff will be publicly shared, and then all the enterprise people who use the platform will likely have it unlisted and embedded on their sites or whatever, because I think that's the most common business case. They don't necessarily want to drive people to our website, but they want the tech and the ability to distribute content quickly. I would probably say that it'll be like 70% consumer and then maybe 30% business use. It's a total guess. At the moment, it's majority consumer, because we have this creator program where we went out and tried to get as much of the WebXR development community to join our platform as possible. So we incentivized them, we paid them to build stuff. And we're even going to eventually launch a public version of that program. But for businesses, there are kind of different tiers. So the lowest tier would be that you just use the platform the way that consumers do. But we also offer custom integrations or entire custom deployments of the system. For instance, we have one in the Middle East where they have this metaverse world of living on Mars, what that's like. And it's entirely powered by the Viverse system, but it's a custom version of it. It has all their branding on Mars. And so that's kind of the ultimate enterprise level of using our system.
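
The unlisted-embed business case mentioned here amounts to dropping an iframe onto a company page, much like an unlisted YouTube embed. A rough sketch follows; the embed URL pattern (`worlds.example.com/embed/…`) is made up for illustration and is not documented VIVERSE markup, though `allow="xr-spatial-tracking"` is the standard Permissions Policy feature an iframe needs to launch WebXR content.

```javascript
// Hypothetical helper for embedding an unlisted 3D world on a site.
// The host and path are placeholder assumptions, not a real API.
function buildEmbed(worldId, { width = 800, height = 450 } = {}) {
  const src = `https://worlds.example.com/embed/${encodeURIComponent(worldId)}`;
  // xr-spatial-tracking lets the embedded page request a WebXR session;
  // fullscreen lets the player expand out of its frame.
  return `<iframe src="${src}" width="${width}" height="${height}" ` +
    `allow="xr-spatial-tracking; fullscreen"></iframe>`;
}
```

Because the world stays unlisted, the link never appears in public curation; only pages that carry the iframe can reach it, which matches the "we don't need traffic on our site, we need the distribution tech" framing.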

[00:16:10.147] Kent Bye: OK. And when I see what you've created so far, it feels like it's a content management system in some ways, where you're able to build some of these 3D worlds with these creator tools, but also have something that's on your own site. But you can also potentially develop a channel, kind of like the equivalent of what we imagine of people building up a brand across lots of different worlds. And that is a little bit more like VRChat, which HTC is an investor in, where that's all Unity-based and this is more WebXR-based. But I'd love to hear how you're kind of this hybrid between something like VRChat, where you're curating all these worlds, and this model of YouTube, which has some similarities and a lot of differences. One difference is that on YouTube, you could go viral. I don't think that there's necessarily an equivalent type of virality that we've seen so far within the context of VR. Maybe it's because you have to be embedded into the virtual world to really get it, and we don't have the penetration of as many devices for people to really fully understand it. And so that provides a design challenge for the creators: do you make this world look good on a mobile phone, even if it's maybe not as interesting when you're in VR? And then vice versa: sometimes things can be a lot more compelling if you're in VR but not look as good if you're just on mobile. So there are these differences when I'm looking at this space, where we haven't seen something like the equivalent of a YouTube for VR, mostly because most of the applications that have been built so far have been built on proprietary technologies like Unity. But using the open technologies, we have new capabilities. But I also see that there are these points of friction. You can send a video to people and there's no barrier to entry to fully experience it, versus a 3D immersive world.
If they don't have a headset, or if they're not fluent in how to navigate around, then it becomes more of a question of the tools not being easily accessible for people to fully enjoy them. So I'd love to hear just some of your initial thoughts on that.

[00:18:11.064] Andranik Aslanyan: Yeah, I mean, I think that's a great parallel. So I think when you look at VRChat and you want to make something go viral, you're basically looking at your existing user base and making a single piece of content go viral within VRChat. Because if you link to, say, a world in VRChat on socials and try to make it go viral, you have this really high barrier to entry, or commitment level, where someone has to go to VRChat, install the product, make the commitment to learn how to use it, and get set up. And I think even just the name VRChat, because it has the VR in it, means people don't even realize that it works on other platforms like mobile or desktop. In our scenario, if something is truly interesting, people can just share a link. And without even creating an account, you can click on the link and then just join in as a guest and experience it on your desktop, on your mobile, on your VR headset. It's basically zero barrier to entry. As long as you've got the person to click on the link, it's loading in their browser and it's going to function. And that will make it a lot easier to make something go viral, because people load web pages all the time, whether it's 3D or not. So yeah, I don't think there really is a parallel to what we've built out there, because it's a mix of those two things. I will say VRChat is a lot more developed in its VR social features, because they spent a ton of time and effort focusing on that one primary use case. And we really built more of a generic experience platform, and it's kind of more up to the creator of each world to add any additional functionality that they may want. And we'll continue to develop our VR and AR feature sets to get more and more featureful, like VRChat is. But we have similar capabilities. For instance, the avatar system: while we do have auto-generated avatars, we support VRMs.
So if you are a VTuber or you have a VRM that you use in VRChat, we can just import that directly into Viverse and use it in Viverse as well. As for Unity, we're working on Unity support for Viverse Worlds, which would be kind of like a plugin that you install in Unity and then one-click build and publish to Viverse. And that would also lower the barrier to entry on the creator side. Because right now, for our creators, if they're building really interactive experiences, we have them use PlayCanvas with our extension. PlayCanvas is like a WebXR game engine that's similar to Unity, but it's not one-to-one. And it has a little bit of, I guess, onboarding if you're a Unity developer. And we've built tools to make that easier. We have a tool that will take your Unity scene and then export it and bring it into PlayCanvas. But that doesn't include coding functionality. So I think we've done our best to make onboarding onto PlayCanvas as easy as possible. We even added no-code tools to PlayCanvas that don't generally exist, so that artists can build interactive stuff using PlayCanvas. But we're reaching the point where we feel like we need to support Unity, and we will in the near future, just to open the door to even more creators.
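
The VRM avatars mentioned here are glTF files that declare a VRM extension, which is how an importer can recognize one before routing it to an avatar pipeline. A minimal sketch of that detection step, assuming the glTF JSON chunk has already been parsed (the surrounding loader is left out):

```javascript
// A VRM file is a glTF whose JSON chunk lists the VRM extension:
// VRM 0.x uses the "VRM" extension name, VRM 1.0 uses "VRMC_vrm".
function isVrm(gltfJson) {
  const exts = gltfJson.extensionsUsed || [];
  return exts.includes("VRM") || exts.includes("VRMC_vrm");
}
```

This is why the same avatar file can move between VRChat, VTubing tools, and a WebXR platform: the humanoid rig and metadata travel inside a standard glTF container rather than an engine-specific format.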

[00:21:48.973] Kent Bye: Yeah, just to follow up on the social features, because I unfortunately did not get a chance to try out some of the demos myself yet. And I'll probably wait until the day of the launch and maybe jump in and see for myself how things are looking. But just in terms of the social features, I know in something like VRChat, you're able to have a friend network and then join your friends. That's something that is all centralized on the backend of VRChat's website. And I know that there have been some other people that have looked at open standards like PubSub and found ways to have more of a federated social network. I mean, obviously it's a little bit easier to have control of that social graph, and you have a little more leeway to push the innovation for what kind of features you want to develop, above and beyond any kind of open standard constraints and other use cases, where there isn't really any equivalent for that to have a dialectical development. At least right now, there are not a lot of other folks that are in the space of having your social network across the distributed metaverse. But in terms of the social features, can you elaborate a little bit about whether you are able to friend people and join your friends? I feel like the social component is kind of the magical aspect of what has been able to facilitate so much growth in VRChat, especially with groups and having other ways for folks to come together. So I'd love to hear about the first minimum viable product launch of Viverse and what kind of social features you have.

[00:23:17.081] Andranik Aslanyan: So we do have a friends system, so you can friend people. And we have voice chat and text chat in the spaces. You can also follow creators. So if you play an experience that you really like, you can go to their profile page, similar to a YouTube channel, and you can follow them. And then when they launch new experiences, you'll get notified. We're building some more of the complex systems that you would see in something like VRChat, like moderating who can speak to you and things like that. And we are also exploring things like Discord integration. So since our system is WebXR and web-based, we think that there are opportunities to branch even further. So when someone sends a link to a Viverse experience, maybe they don't even necessarily have to click the link. We can have it be an embedded player within Discord. Or we've actually done some experiments where when you go to post on X, it'll actually have the player directly within the feed. So just like how you would come across an X video, it would actually have a playable 3D experience right inside of the tweet itself. And yeah, I think that's just something that's super unique. I've never seen something like that. When's the last time you saw a video game come through your feed without you having to go somewhere else to experience it? So I think there's a lot of opportunity in the social feature space. I think it's a big focus for us at this point.

[00:25:00.782] Kent Bye: Yeah, in terms of the WebXR framework that you're using, PlayCanvas: there are a number of other JavaScript-based WebGL interfaces, as well as WebGPU functionality that's coming up and kind of replacing WebGL. But can you elaborate a little bit on the evaluation process of looking at something like Three.js, Babylon.js, and PlayCanvas? What was it about PlayCanvas that made you decide to go down that path versus these two other platforms that have also been very popular on the web with 3D content?

[00:25:33.304] Andranik Aslanyan: Yeah, it's actually interesting. So we do have an SDK to support Three.js. So if you're a Three.js developer, you can actually bring in our login and avatars and leaderboards and stuff, use all of that, and then publish to our platform. I think the nice thing about PlayCanvas is that because it offered a web-based editor, it made it obvious for us to just say, OK, well, there's already this editor; we'll just launch an extension that expands that editor. And that means people who have never done web-based content development before can just pick it up, because it's so similar to Unity. And I think that's really something that's helped a lot. Because if you told a Unity developer, now you're going to go do Three.js, they'd have no idea where to start. But we're also looking at other things. Like right now, we're looking at Wonderland Engine integration. It's another web-based engine for WebXR content, and they're working on porting some demos. Eventually, we'll have direct porting integration there as well. And we just want to be an open platform. It doesn't really matter where you produce your content. As long as it can run on the web, we should have integrations that will allow you to access our core systems, so that when people go to experience your stuff, they can still have all the friends and VoIP and chat and all of these things. But the backend that's actually running it shouldn't matter as much. Same with a Unity experience: it should run on our platform exactly the same way.

[00:27:07.771] Kent Bye: Nice. Well, one of the other features that is mentioned here is this integration with Sketchfab. And what that reminds me of is back in the day with Google Poly: they had Google Blocks, and then you could upload a bunch of those blocks to Google Poly. And then there were other companies like Wave VR, which is now Wave XR, that were integrating a lot of those Google Poly objects into their DJ events and music events. And then when Google announced that they were shutting it down, that was like a catalyst for Wave to be like, oh, we're not going to be able to replace all of that. I mean, there were other headwinds that they were facing, and they decided to get out of VR for a bit, and now they're just now getting back into the VR space. But this whole content delivery network of 3D objects has had a number of iterations before. And with Sketchfab, they have Creative Commons content, and they have content that you can buy. So I'm wondering if you can maybe elaborate on this integration with Sketchfab: if you see it as something like the Unity Asset Store, where you're buying premium content, or if there's Creative Commons content so that you can create more of a commons for getting access to 3D objects when you want to build out a scene without having to build every object. And is it to the point where there's a reliance upon Sketchfab as an intermediary content delivery network to serve some of the content into these scenes? I'd just love to hear a little bit more explication of what's happening with the Sketchfab integration.

[00:28:39.386] Andranik Aslanyan: So we have two options right now for how to create content. One is the PlayCanvas side. And then we have a web builder that's kind of at a version 1.0. In the web builder, when you click it, you get some templates of different spaces: a room or an empty space. And from there, if you choose to use the web builder, it's all within Viverse; we don't take you to any third party. And you enter this edit mode, which lets you upload your own 3D models or videos or images. You can even link to Twitch or YouTube streams in your space. And it's really for building hangout locations with zero technical skill. And one of the buttons is to open up a Sketchfab browser for assets. And it just brings up the entire free library. So you can't purchase anything from there. But it's nice for someone with no technical skill. You could just type in, I want a dog, and then you'll get 100 dog meshes. And you can just click on one, and it shows up in the world, and you can place it. And so it's just kind of nice to have for building your kind of space, I guess you could say, in this metaverse or world space. But it's really not for building the interactive experiences. And when you do import those meshes, they're not actually linked permanently to Sketchfab. Once you import it, it becomes a part of your world. So it's permanent; it's in your folder. So if Sketchfab ever does end up going away, it won't break your world. People just won't have access to the library to grab new stuff.

[00:30:31.937] Kent Bye: OK, well, that's good to hear that there's a way of downloading that content, and Viverse, at that point, is responsible for delivering it out there. In terms of the scripting language for the PlayCanvas side and making things that are interactive, is it JavaScript? I mean, how robust is it? Can you actually make a full-on equivalent of what you could do with C# in Unity? I'm just wondering if you could elaborate on what kind of interactive capability is there with PlayCanvas.

[00:31:02.523] Andranik Aslanyan: Yeah, it is JS, and you can pretty much do everything that you could do in Unity. I think we've also been opening up as much as we can for controlling our core services. For instance, with the avatar, we recently launched more SDK endpoints so that you can change the mesh of people. So once they join your world, even if they came in with their standard avatar, you can swap them out, and you can build like a prop hunt game, or change the way the camera works and their movement abilities, or play animations on them. I think that was really the original limitation of a system like this: if the avatar is locked behind a wall where you can do everything you want in the world but you can't touch the player, it makes it very hard for people to make super interactive spaces. But we've unlocked that recently. So I've seen people build really interactive single-player stuff and multiplayer. I've seen the prop hunt game, a space exploration with an inventory where you can pick stuff up. And yeah, I mean, you can build essentially a full video game in there. It just comes down to your skill level, I think.
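
The mesh-swapping idea behind a prop-hunt game can be sketched as below. The function name and data shapes are illustrative only, not the actual VIVERSE SDK endpoints being described: the point is simply that once a world script is allowed to touch the player's avatar, replacing its mesh per role becomes a few lines of game logic.

```javascript
// Hypothetical prop-hunt setup: each "hider" has their normal avatar
// mesh replaced with a prop from the world, while "seekers" keep theirs.
function assignPropAvatars(players, props) {
  // players: [{ id, role: "hider" | "seeker", mesh }], props: mesh names
  return players.map((player, i) => ({
    id: player.id,
    role: player.role,
    mesh: player.role === "hider" ? props[i % props.length] : player.mesh
  }));
}
```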

[00:32:27.005] Kent Bye: Okay, in terms of sharing out these different worlds: in the press release, it says that Viverse Create is a suite of tools that allows creators to build and share interactive multiplayer worlds on any device, via headset, mobile, PC, or Mac, without requiring code. So I'm imagining that you're sending over a URL. Getting the URL on, say, a mobile device and then actually transferring that into a headset, there's some friction there in terms of how to seamlessly jump in. I'm just wondering if you could elaborate on how you see that workflow, from receiving a link to a Viverse world to the least friction to be able to see it across all the different platforms. Especially VR, I think, is the one where it's the most isolated from this other ecosystem. I can imagine anywhere else you can very quickly pull up a web browser on a mobile phone, a PC, or a Mac. But the VR headset is the tricky one to seamlessly launch into some of these different worlds. So I'm just curious how you've handled that problem.

[00:33:28.427] Andranik Aslanyan: Yeah, sure. So I think the main thing is actually getting to the page on your headset. So if you have an all-in-one device, either Discord or through some sort of social network or something like that, as long as you can get to the web page. On our site, when you load an experience, there's just a little headset icon, and we auto-detect the device that's currently loading the page. So if you're in a VR headset, we'll know, and you can trigger the XR experience right away. There's no setup or install of any kind. But it is an interesting thing you raise. If someone texts you a link, how do you get that text message to your device? I'm not exactly sure how you solve that. But I think that's true of anything, actually. How do you actually get there? Maybe Facebook Messenger, I don't know, or Discord.
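The device auto-detection he mentions can be done with the standard WebXR API. A minimal sketch, assuming you just want a boolean for whether to show the headset button (the function name is mine; `navigator.xr.isSessionSupported` is the real entry point):

```javascript
// Show an "enter VR" button only when the browser can start an immersive
// session. navigator.xr exists only in WebXR-capable browsers, so this
// degrades gracefully on phones and desktops.
async function canEnterVR() {
  const xr = globalThis.navigator?.xr;
  if (!xr) return false; // no WebXR at all (e.g. a plain desktop browser)
  return xr.isSessionSupported('immersive-vr');
}
```

On a headset browser this resolves to `true` and the page can call `navigator.xr.requestSession('immersive-vr')` on a button click; everywhere else, the same URL just renders the 2D view.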

[00:34:22.301] Kent Bye: Yeah, I know that on Meta Quest, they have WhatsApp integration now, where you can more seamlessly click links from within the Quest and maybe jump in. Or sometimes you can send the link through the app, but that's a number of different steps that could be enough friction to stop you. But I think it's probably the biggest barrier, that jump from receiving the link to diving into the link. There's enough friction there to stop people from doing it. And I think that when you're jumping into a VR experience, sometimes it's like sitting down to read a book. Because when you sit down to read a book, you usually are there for at least 20 to 30 minutes, an hour. You're committing a certain amount of time. And I think VR has been like that. I feel like there is a space for this kind of bite-sized thing, where you put on a headset and you quickly see something. But with the friction of seamlessly getting those links in there, to be able to do that and have the habit of just jumping in for more like a minute or so, or even 30 seconds, something like clicking a video or an animated GIF, the fluency of seeing the 2D screen is so much quicker than that type of user behavior. So I feel like that's another challenge: the inertia of the commitment that the user has to have to want to go through the trouble to have the full immersive experience. And if they see a little bit of a taste on the phone, then what is the thing that's going to make them say, oh, this is going to be wholly insufficient for me to see on my phone, I need to jump into VR?
Because most of the existing ecosystem has been app-based, moving into the web-based approach you have new kinds of user flow challenges, but also new opportunities for getting out of the app store ecosystem, and also new challenges around what the business model is to support this web-based approach, one that goes above and beyond all the existing models, which are very much centered on creating these proprietary apps that are more of a closed walled garden. And so I'd love to hear some of your thoughts on what some of the innovative business model angles are for something like Viverse that can help escape the inertia of what we've seen so far in the XR industry, which is everything being boiled down to an app and all of the exchanges happening in the context of that app rather than on the open web.

[00:36:33.462] Andranik Aslanyan: Yeah, so we haven't quite launched a monetization model, but I can talk about the way that we're thinking of it. And it's basically entirely a web-based creator model, just like how YouTube and Twitch and all these platforms work. So part of supporting that will be advertising. As we're an immersive, interactive space, we also don't want that to be obstructive. So we're working on little banner ads and stuff that aren't in the way of content, but then also potentially allowing creators to do 3D embedded advertising. So automatic product placement, or posters and stuff that can dynamically change based on who the consumer is. Those are the kinds of things that we're thinking about and exploring. But we'll never move towards a subscription or a single-purchase model, because that's just not conducive to a healthy creator ecosystem. What we really want is creators to bring a lot of audience over and then basically benefit from bringing that audience there. And that can be things like subscriber-only content. So not subscribing to the platform, but, say, subscribing to an individual creator. And then that creator can have worlds locked behind walls. Or they can do in-app purchases if they're building a really big game, or even potentially donation-based stuff. So we're looking at everything that creator platforms do and then trying to think about, well, this is a really unique space, how do we take advantage of the fact that it's interactive and it's 3D, but it's also open to everybody? So guests and these kinds of things. We don't want to put a sign-in wall in front of anything. We want it to be: you made something really interesting, you want to get it to people, they should be able to just hop right in no matter where they are and experience it. And eventually, hopefully, as technologies progress, whether it's AI-generated stuff or easier no-code tools, things like that, we can see more and more of the general public making interactive content. It's not just for engineers or people who have tons of 3D experience.

[00:39:00.962] Kent Bye: Yeah, that makes total sense. And I think that there's obviously an existing model with YouTube, and I think all the things we were talking about in terms of the inertia are part of the challenges of why we haven't seen that so far. I think also most people so far have been very focused on more of the game engine approach with Unity, Unreal Engine, and even Godot, which is kind of an emerging game engine. So it's great to see that there's this push towards trying to really expand out the potentials of WebXR and what can be done on the web. So one other technical question I wanted to follow up on was the Gaussian splats, because when you have Gaussian splats, it's a completely new rendering pipeline, where it's point cloud-based and then neural rendering. It's a whole other way that goes above and beyond a mesh and polygons and existing rendering pipelines. And so how do Gaussian splats integrate with everything else that we've been talking about? Is it that if you want to have a Gaussian splat, that's all you have, and it has optimized rendering for that? Or are these Gaussian splats being converted into meshes that can be integrated with other scenes? So I'd love to hear some of your thoughts on how you're integrating with these splats.

[00:40:10.194] Andranik Aslanyan: Oh, you can totally mix the two rendering technologies. So actually, the Gaussian splat just comes in like any other asset. So you bring your PLY file in. Generally, we tell people, no matter where you produced your Gaussian splat, take it into this open source product called SuperSplat. Because if you bring it in there and then re-export it without doing anything to your Gaussian, it compresses it pretty well. And then you can bring that into PlayCanvas and put it in your scene for Viverse. We're also working on direct integration with the web builder for your PLY files. And you can just put it in your scene like you would put any other 3D asset, but it still renders like a point-cloud-based Gaussian splat. And then you can set up your hidden collision so you can actually walk around the space. And if you want to mix in 3D assets, you totally can. I mean, they might look a little different. You'll probably have to simulate where your lighting sources and stuff are, but there's no limitation in terms of choosing one or the other. You can totally mix the two. And that means you could potentially produce a super hyper-realistic shooter game or something on Viverse, because you can just record real-life spaces for the environment, and then you would just have to focus on the animated pieces, like your characters and guns and stuff, looking at least close enough to realism to match up. And now with WebGPU support, that's more and more possible. And there are also new Gaussian splatting technologies coming out for 4D captures, so like animated Gaussian splats. It's almost there. I think in the next few months, you'll start to see more and more of that. I've been seeing tech demos and research papers and stuff for animated Gaussians, but it hasn't really made it to a usable state yet. But I think it'll get there.
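Since splats arrive as PLY files, here is a small sketch of what reading one involves: a PLY file opens with a plain-text header, and the splat's point count sits in an `element vertex N` declaration before `end_header`. The header grammar is from the PLY format itself; the function name and the sample header are illustrative:

```javascript
// A PLY file starts with a plain-text header; a Gaussian splat stores one
// point per "vertex", declared as "element vertex <count>".
function plyVertexCount(headerText) {
  const match = headerText.match(/^element vertex (\d+)\s*$/m);
  if (!match) throw new Error('no vertex element found in PLY header');
  return Number(match[1]);
}

// Example header shaped like what splat tools commonly emit (illustrative).
const sampleHeader = [
  'ply',
  'format binary_little_endian 1.0',
  'element vertex 123456',
  'property float x',
  'end_header',
].join('\n');
```

The per-point payload that follows the header (positions, scales, colors, spherical-harmonic coefficients) is what tools like SuperSplat compress before the file goes into the scene.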

[00:42:08.311] Kent Bye: One of the key features of YouTube is that it's based upon user-generated content. So very little content is produced by YouTube itself, and it has all these creators that are making content. And so as you've been fleshing out Viverse, how have you been cultivating this creator community? Who did you reach out to? And maybe talk about the different types of people that you've been able to use as beta testers to build out what is going to be launching here in a few days.

[00:42:34.939] Andranik Aslanyan: So for our creator program, we have a couple of people on the growth team who went out to different WebXR festival winners. We went out to Reddit channels, all cold outreach. There was this product called Mozilla Hubs. That was a WebXR platform that shut down. So we also reached out to the biggest creators on that platform and said, hey, we know you were using this thing. We've launched something new that has similar feature sets but is more modern. And those people were super interested. So we signed on over 100 creators in about three months just through cold outreach and negotiating with them directly. And then we set up a creator Discord. And the community there has been great. They're all helping each other. They're sharing their progress. It's really interesting to see: even when people hit roadblocks that seem like potentially a platform limitation that we would have generally tried to unblock them from, they find ways to just get around it. Like, if they don't want a piece of UI, they'll write code that just goes in and removes the UI element. Or if there's a feature that's just not inherently there, someone will build the JS code for that and share it in our Discord. And we recently even open sourced our documentation. So it's hosted on GitHub. And if you're a creator and you run across something that you feel is under-documented, or there's an issue with the documentation, you can just open a pull request fixing the documentation, and we'll take a look at it. We're just trying to build as open a system as we can, because honestly, yeah, just like you said, YouTube's creators are what really built up that platform, and that's the same way we're approaching ours. Without the creators, it's basically nothing.

[00:44:27.864] Kent Bye: Yeah, you're in this really interesting space of having something that is essentially a centralized server platform, but at the same time using a lot of these open technologies. Have you had internal discussions about something similar to Mozilla Hubs, where they had an open source version so that if you wanted to self-host things, you could, but if you didn't want to have to deal with all that hosting, you could use the platform? They were coming from Firefox and Mozilla, so they have open source in their DNA to build something like that. This seems like it's more of a centralized model with Viverse, but have you had discussions around giving people the ability to export something and have it be self-hosted, since it's using something like PlayCanvas? Or are you at this point still focusing on building out your own centralized version of this using open technologies, rather than having a pathway towards open sourcing certain components or exporting components into self-hosted solutions?

[00:45:26.553] Andranik Aslanyan: I think we're very interested in open sourcing parts of the platform. Maybe not necessarily the server orchestration or infrastructure and stuff, because it's super complex, but the actual platform it's running on, and anything that's not proprietary algorithms and stuff. We see the situation in which maybe a creator really wants to do something that we're blocking them from doing, not on purpose, but because we just don't have the feature or whatever. Right now, they're kind of stuck waiting for us to solve that for them. But we really don't see the issue with doing it just like Unreal did, where they opened up their game engine source so people can make fixes and propose them. I mean, why not? That's just more of an expansion of this kind of creator ideology. I think we're going to try to do that as soon as we can, but it is a big undertaking, because going from closed source software to opening it up to the world... I mean, HTC is not really a startup. It's a huge corporation, and it's going to take some time to go through all the proper legal clearances and whatever it takes to make that happen. But we're going to try for sure.

[00:46:52.182] Kent Bye: Yeah, I can definitely see that. It's a very interesting hybrid of all the things that you have. It feels like a mix between some of the best parts of open source and the best parts of having a centralized control where you don't have to generalize out to all these different use cases, but just focusing on your own use case and building what you need. But then when you do that, it's hard to open source it because it's not generalized enough for other people to use. So I know there's different design decisions along the way that you have to make. Very cool. Well, this is all very exciting. I'm very excited to see that HTC is going in this direction with Viverse. And I'd love to have you answer the question I like to ask all my different interviewees, which is, what do you think the ultimate potential of XR and the future of the Metaverse might be? And what it might be able to enable?

[00:47:35.214] Andranik Aslanyan: Sure, yeah. Honestly, I could see a future in which every site has some 3D component to it. You go to Amazon: why doesn't Amazon have 3D versions of all of their products? Maybe you want to rotate the asset, you want to see it compared to other things for scale. I mean, it seems like not that big of a leap. I think the hardest part of all of that has been the actual production, going from 2D captures that were photos to the actual asset itself. Manually modeling hundreds of thousands of products is probably crazy to do. But things are changing now with generation of 3D assets. You could also see people producing interactive experiences with maybe close to the ease of mid-level Adobe Premiere use. So if you're able to shoot pretty good-looking video and then edit it well in Premiere, I don't think that technical strength is that much less than being able to produce potentially good interactive content, if we give you good no-code tools. And the newer LLM systems have really lowered the barrier to entry; they could probably hand-hold you through even coding in JS. I think we're even considering ways to do direct integrations of LLMs into PlayCanvas to make that even easier. So it's not like, oh, let me go find the button that it's telling me to click or whatever. Maybe we can automate some of it. So the ultimate future is probably what we're calling the 3D internet, but it's really just 3D intertwined in everything that you do. Why not? I think it is practical and actually helpful, and brings value to people. I think it's worth doing.

[00:49:37.320] Kent Bye: Nice. And is there anything else that's left unsaid that you'd like to say to the broader immersive community?

[00:49:42.206] Andranik Aslanyan: No. I'll just say on the 26th, if you have some time, go check out Viverse Worlds and play some experiences.

[00:49:50.714] Kent Bye: Yeah, one other final thought I wanted to add is that we're coming up on the 10-year anniversary of HTC Vive being announced and launched at the Mobile World Congress 10 years ago back in 2015. And so 10 years later now in 2025, we have Viverse that's launching. So I don't know if you have any comments on that with Mobile World Congress coming up and coming back to the place where they were a decade ago to announce this latest iteration of where this is all going.

[00:50:15.526] Andranik Aslanyan: Yeah, that's interesting. I actually didn't realize that it's exactly 10 years. But I think one thing that I would say is we're going to be at MWC. We have a bunch of demos there. And Viverse is the largest component in that booth. So HTC is definitely putting a lot of focus on this platform. And we're committed to it. And we're just going to keep building it up as fast as we can.

[00:50:39.911] Kent Bye: One other follow-up question I forgot to ask. In terms of the HTC Vive and all the different headsets, what's the browser that's internal? I know Meta has been working on their Quest browser and just wondering if it's a Chromium flavor of browser or what kind of browser you're using within the context of HTC.

[00:51:01.467] Andranik Aslanyan: I think we do support Chrome itself, but we have a Vive browser, and that is a Chromium-based browser in the background. So yeah, we do want all of our systems to work across all headsets equivalently. So Viverse experiences work very well on the Quest 3, for instance. We didn't do anything to favor one or the other. We just want everything to work super well everywhere.

[00:51:31.465] Kent Bye: Yeah, just another follow-on. I know it's taken a long time for Apple and Safari to come up and start supporting things like WebXR and other standards. How are things looking on Safari for Viverse?

[00:51:44.858] Andranik Aslanyan: We work fine on Safari. I think one challenge on Safari is when you go into landscape mode, they auto-add this top bar, which is terrible. And there's no way to hide it without clicking a bunch of buttons. Safari is behind in WebGPU support. It's hidden behind a feature flag that you have to go turn on in your settings, which is really annoying, because every other browser out there is just supporting WebGPU out of the box. I don't really know why that is. They've promised to support it by default for two years now. But I think it'll eventually happen, especially once more and more people use WebGPU. I mean, for us, we have some demo experiences using WebGPU. And honestly, it's pretty crazy. You can get dynamic global illumination and SSAO and all sorts of things that you're just used to in Unreal Engine, for instance. But now you're rendering it in the browser, something I don't think people ever thought was possible.
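Because of that feature flag, cross-browser WebGPU content still needs a detect-and-fall-back path. A minimal sketch using the standard `navigator.gpu` entry point (the function name and fallback label are mine):

```javascript
// Pick a renderer at startup: WebGPU when the browser exposes it and an
// adapter is actually available, otherwise fall back to WebGL2.
async function pickRenderer(nav = globalThis.navigator) {
  if (nav?.gpu) {
    // requestAdapter() resolves to null when no suitable GPU is available.
    const adapter = await nav.gpu.requestAdapter();
    if (adapter) return 'webgpu';
  }
  return 'webgl2';
}
```

On Safari without the flag enabled, `navigator.gpu` is simply absent, so the same page silently takes the WebGL2 path with no user-facing error.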

[00:52:48.755] Kent Bye: Yeah, for anyone that's been following from WebVR to WebXR, Apple has been holding back the entire 3D industry by not implementing all these different open standards as quickly as all the other companies have. So they've certainly been lagging anywhere from three to five years behind these other browsers. So it's not anything that's new, but it's also one of those things where the lack of cross-browser support has unfortunately slowed down initiatives and efforts like this. So it's really great to see HTC out there pushing the edge. And despite the lack of complete support across all the different browsers, I think we'll get there eventually. And hopefully we'll get more users encouraging Apple to pick up the pace a little bit, especially with Apple Vision Pro now a year on from launch. So yeah, I have hopes that we can start to see a higher pace of development across all the different platforms. So anyway, thanks again for joining me here on the podcast to help break down all this news around Viverse. And it's launching on February 26th, 2025 at Mobile World Congress. And yeah, just really appreciated this deep dive, hearing all about it and getting a little bit more context for how it came about and where it might be going here in the future. So thanks again. Well, thank you for having me. So that was Andranik Aslanyan. He's the head of growth at HTC. And we were talking about the launch of Viverse, which is like an open world 3D platform that's aiming to be the YouTube of 3D content. So I have a number of different takeaways from this interview. First of all, I really admire the ambition of HTC to go in on all these different open source technologies and to launch something like Viverse.
It's the most polished WebXR social platform that I've seen so far, and it's a little bit different than Mozilla Hubs, which was much more of a self-hosted solution for these immersive worlds. This is more of a way to browse a lot of different 3D worlds, this kind of bite-sized content. And they've recruited like 100 different creators from all these different types of immersive communities: VRChat creators, refugees from Mozilla Hubs, crypto metaverse type folks who were doing 3D content. But they're building it on PlayCanvas and WebXR and all these open source technology stacks. You have VRM, which is like taking a glTF model that has rigging. And I would say, across all the different platforms, if you're a creator, you want to have some way of having an off-ramp for whatever you're building. It's great to have these really polished VRChat worlds and Meta Horizon worlds and all these more Unity- and Unreal Engine-based platforms. But all of those are a certain mindset for how you're building these worlds. And doing stuff on the web is opening up all these new vectors. And I think we're finally at this point where we're starting to see a little bit more momentum for the open web version of all these technologies. And it's one step closer to the dreams of the metaverse than anything else that I've seen so far, because it's building on top of a lot of the core technologies that are going to be a key part of the future of the open metaverse on the open web, using all these interoperable open technologies. So there's the fact that they're using PlayCanvas, where, you know, you also have Three.js and Babylon.js. They had their own web content creator builders in the backend. They have all these integrations with Sketchfab. They have integrations with Gaussian splats and being able to do hybrid rendering.
They have like a polygon streaming solution, which is kind of like the Nanite of Unreal Engine, where there are a number of different fidelities of models being delivered based upon bandwidth and distance. You know, it's something that is integrating a lot of stuff together. It's like throwing in everything and the kitchen sink, and this is what you would come up with. You know, the challenge is that because it's so broad and covering so many things, even trying to do the social features, it's doing everything okay, but not doing everything in a cohesive way just yet. But they just launched. And so I expect over time, especially if they move into more of this open source model where they have the code available for folks to edit and have it self-hosted, then it's going to move a lot faster. I have some skepticism, in that having launched this in a proprietary context, it's very, very difficult to then translate into the open source model later. But hopefully they'll be able to figure that out, because I do think that will be a catalyst for them to continue to expand out. Obviously, something like YouTube doesn't open source all their technology. So there's value in having that centralization, because you can have tight control over everything. So it'll just be interesting to see how that continues to play out. They're already integrating with something like PlayCanvas, which is an open source project that you can go check out. So I'd say what Viverse is doing is this integration of all the different open source technologies that are out there for the open 3D web. And there is a lot of friction right now if you want to jump in with a headset. I found it not very intuitive, and there's not really an immersive interface. And so you end up going through the browser, and backing out and exiting wasn't very easy.
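The polygon streaming idea mentioned above, delivering different model fidelities based on bandwidth and distance, can be sketched in miniature. This is a hypothetical selection heuristic to show the general shape of such a system, not Viverse's actual algorithm, and every threshold here is invented:

```javascript
// Hypothetical LOD (level-of-detail) picker: farther objects and slower
// network links both push toward coarser mesh levels. lods[0] is the
// finest level, lods[3] the coarsest.
function pickLod(distanceMeters, bandwidthMbps, lods = [0, 1, 2, 3]) {
  const byDistance =
    distanceMeters < 5 ? 0 : distanceMeters < 20 ? 1 : distanceMeters < 50 ? 2 : 3;
  const byBandwidth =
    bandwidthMbps > 50 ? 0 : bandwidthMbps > 10 ? 1 : bandwidthMbps > 2 ? 2 : 3;
  // Take the coarser of the two constraints so neither GPU nor network chokes.
  return lods[Math.max(byDistance, byBandwidth)];
}
```

A streaming system like Nanite goes much further, picking detail per cluster of triangles rather than per object, but the coarse intuition is the same.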
And you end up having to push the menu button and reload on the Quest browser. And then sometimes it wouldn't properly load. And then when you're on the 2D web, you have more of a view optimized for third person as you're walking around these different worlds. And so it's a little bit like, is this world designed for people who are on mobile and PC, or is this designed for someone to actually be within the context of an immersive world? They don't have stick locomotion implemented yet, and so you end up having to teleport around. I personally prefer stick locomotion, and so I would rather be able to seamlessly locomote around. I don't know if that's a performance consideration, or if there are other reasons they don't have stick locomotion. So it ends up being a little bit more of a fragmented experience as I'm teleporting around these different immersive worlds. But it's sort of like, if you want to have these bite-sized experiences, are people going to jump in? I guess I'm sort of struggling with the initial use cases of something that's going to be really compelling. What I do see being compelling, at least on the content creator side, is all these creators from all these other platforms who have some level of dissatisfaction with what they've done on those platforms. And there is some overlap in the different types of 3D work that they've done on those other platforms that they're able to port over and bring into this platform. And there are a lot of worlds that actually look quite good in terms of the environmental design and pushing the edge of what the browser is able to do. Especially with WebGPU support now coming up, you're starting to see a lot more parity between what you might see on a native app and what you're able to do within the context of WebXR. That all said, there's still a lot of friction. It's not as good.
Sometimes you have dropped frames, and it's not as solid as something native, but this is a really solid first step. And I see it in the broader context of what we've traditionally had with the app ecosystem within the XR industry. This is a really positive, optimistic move into integrating all these open source technologies on a web stack, with WebXR, WebGPU, glTF, all these open standards being seamlessly woven together with some proprietary algorithms and features that they're also throwing in there. The other thing is that Meta has really abandoned the enterprise market for a number of years now, and they really haven't emphasized how XR is going to be integrated into the enterprise. And because Meta has skipped that step of integrating fully with the enterprise market, they have a lot fewer use cases, and they have a closed mindset in that they're not really collaborating with a lot of other people who are asking for this or that feature that would be useful across a broad set of different contexts. HTC has been doing that for a number of years, and that's been a big reason why they've been able to persist in the XR industry: they haven't abandoned that enterprise market. So the fact that there's also an enterprise component to this, I think people shouldn't sleep on. Even if you don't think that some of these bite-sized virtual worlds are all that compelling for what you're interested in, there could be a business use case for having something like this that is able to serve 3D content across the web, much like YouTube has been able to proliferate video content across the web. And that's probably one of the services that I use the most in terms of watching different videos across many different contexts. And so there's the fact that you're able to take the 3D content and just spread it out across the web for all sorts of different use cases, and make it public and private.
And this idea of the YouTube of 3D content is starting to get into much more of what the benefits of the open web are, versus what we've been seeing within this more closed, walled garden mindset of the XR industry for the last decade. So we're finally starting to get into this completely new phase of breaking down the walls and barriers of the mindset that has been the paradigm for the XR industry. And I'm excited just because we're starting to see a little bit more integration between what's happening on the open web and XR. This is probably the closest intersection point that we've seen between a major player like HTC and what's been happening in this kind of underground movement of web developers, people who really believe in openness and the open web and interoperability, the open metaverse. Meta has been saying that, but they haven't really been living up to it as much as what HTC has done with launching this. So I'm just super excited to see where they're going to be taking all this in the future, and don't bet against the open web, I'd say. But at the same time, they have to make something that is coherent and really nails the user experience. And right now there's a lot of friction and a lot of things that really need to be ironed out for specific use cases, for user stories: like, this is how they're going to use it, and what's the pathway for taking this link and seeing it in an immersive space. And if there are going to be 90% of people watching it on the 2D platforms, there's still quite a huge difference in the design decisions between whether you're seeing it in 2D or 3D. Most of the experiences that I'm seeing so far work better on the 2D web, on my laptop or on my phone, versus jumping into a VR experience.
The VR experience is great in that you actually have those immersive components, because what we saw in crypto metaverse land was that even though they were saying they were building the metaverse, they were really never integrating WebXR and the more XR components of that vision of the open metaverse. I feel like HTC is taking a lot of really great steps with Viverse. And I'm super excited to see where it goes, especially as content creators from these other platforms see that there are other ways to take their prior work and start to recontextualize it for this platform, and to have it on a platform that's honestly going to be a lot more sustainable for the long term. If you have content that you've created and you want people to be able to see it in 10 years, then putting it on an open web stack is the way to go, as against some of these other platforms that may or may not be around for that long. And, you know, you may be locked into their system. The fact that you could build on open technologies and prototype and have some people use it, that's going to be a much closer path to then taking that and self-hosting it on your own site. So that's in terms of what creators want. That's a big thing that we haven't really seen so far within the XR industry. And so it'll be super exciting to see what they do with the monetization strategies in the future.
But I see this as a really positive direction, especially looking at it 10 years on from when HTC and Valve launched the HTC Vive a decade ago at Mobile World Congress 2015 and shocked the world. Ten years on, now they've had this non-exclusive acquisition, so they still are going to be selling and developing their own hardware, but with a little bit smaller team after sending some folks off to Google. And now they're going to be potentially building out some of these different enterprise solutions and software stacks. And so, yeah, I'm excited to see where they're able to take all this here in the future. So that's all I have for today, and I just want to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon at patreon.com slash voicesofvr. Thanks for listening.
