#25: Philip Rosedale on High Fidelity, virtual body language, telepresence, 3D audio, distributed computing, Voxels, cryptocurrencies & the future of the Metaverse

Philip Rosedale is the creator of Second Life, and more recently High Fidelity. He talks about a lot of the things that he's doing differently in creating a virtual world for the second time around, including a focus on 3D audio, low latency, and the speed and texture of experience, as well as using a standard scripting language, JavaScript, rather than rolling their own.

He talks about virtual body language, and how 100ms of latency is the target threshold for a compelling telepresence experience that is indistinguishable from face-to-face interaction.

Philip talks about how High Fidelity wants to create a set of open standards and protocols so that people can host their own virtual worlds on their own servers. He also talks about their approach to distributed computing to help offload the computing power needed to run a complex and nuanced virtual world, and how mining a cryptocurrency could be a part of that process.

Finally, he talks about his vision for the future of the Metaverse, and how these virtual worlds will provide opportunities for participants to be more thoughtful, more open, and more creative than they can be in the real world. He doesn't see these worlds as necessarily escapist, since they can be as sophisticated, complex, navigable, and challenging as the real world. His experience with Second Life was that you have to be just as capable, smart, and entrepreneurial to succeed in virtual world environments.
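
The 3D audio mixing Philip describes (including the real-time concert example at 7:30) boils down to two per-source quantities: a distance-based gain rolloff and a propagation delay. Here is a toy sketch of that idea; the function name, the inverse-distance model, and the parameters are illustrative assumptions, not High Fidelity's actual mixer:

```python
import math

SPEED_OF_SOUND = 343.0  # meters per second, in air

def spatialize(source_pos, listener_pos, sample_gain=1.0):
    """Toy distance attenuation + propagation delay for one audio source.

    Illustrative sketch only -- not High Fidelity's mixer. Positions are
    (x, y, z) tuples in meters.
    """
    deltas = [s - l for s, l in zip(source_pos, listener_pos)]
    distance = math.sqrt(sum(d * d for d in deltas))
    # Inverse-distance rolloff, clamped so very near sources don't blow up.
    gain = sample_gain / max(distance, 1.0)
    # Real-world propagation delay in milliseconds: a network round trip
    # can beat this, which is why a virtual concert can have *less* latency
    # than standing the same distance away in the real world.
    delay_ms = distance / SPEED_OF_SOUND * 1000.0
    return gain, delay_ms

# A listener ~34 meters from the stage hears ~100 ms of acoustic delay.
gain, delay = spatialize((34.3, 0.0, 0.0), (0.0, 0.0, 0.0))
```

A server-side mixer would compute these two values for every (source, listener) pair and sum the delayed, attenuated streams, which is what makes the dynamically assembled multicast repeater worthwhile.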

Reddit discussion here.

Be sure to check out this blog post on High Fidelity’s system-level architecture for more details.


  • 0:00 – Intro – High Fidelity. New virtual world taking advantage of changes in technology
  • 0:32 – Motion sensors and Oculus Rift are driving changes in virtual worlds. A drive to interact naturally in 3D virtual spaces; the requirement of the learned skill of using a mouse and keyboard is going to end soon.
  • 1:33 – What types of interactions have you had within High Fidelity with these new tools? Body language, and seeing someone blink. Nodding head is important. Moving hands is remarkable. Got PrioVR working for full upper body animation. Group interactions and face-to-face interactions.
  • 2:47 – Facial capture with either a 3D camera or a webcam with Faceshift, and reconstruct it via 50 floating point numbers. Aiming to get less than 100ms latency to mimic 1-to-1 interactions
  • 3:48 – Using a VR HMD and facial capture at the same time. Can only get one at a time. Oculus is thinking about doing facial capture. Can use a 3D TV, and adjust the view, as an intermediary between a full VR HMD and a computer screen
  • 4:54 – Using High Fidelity as a telepresence tool. Use it with their distributed team, and cool to see others.
  • 5:35 – Good enough for enterprise use? Proof point of a recording telling the same story with the same avatar, and people can be identified even without sound
  • 6:20 – Distributed computation at High Fidelity. Limited by centralized hosting. Distributing computation across many small computers quickly, and using computers at home to offload some of the processing.
  • 7:30 – Dynamic multicasting with audio. Mixing it in 3D. Dynamically assembling a multicast repeater and can perform a concert in real-time with less latency than in the real world.
  • 8:47 – What is a voxel, and how are you using it? A way to organize space virtually. Represents what things look like at a distance, and enables seeing to infinite distance. See full mesh topology up close.
  • 10:06 – Hierarchical nesting of voxels for the decomposition of space with a “sparse voxel octree”, and then distributed computing with those. Can create infinitely complex city
  • 10:59 – Other things that you're doing differently from Second Life: audio processing, low latency, speed and texture of experience, using a standard scripting language, JavaScript, rather than rolling their own. People want to run their own services; it's a protocol and open-source standard rather than a world unto its own.
  • 11:59 – Cryptocurrency and paying people for helping run the virtual world.
  • 12:56 – How is identity different on High Fidelity? By default, you're anonymous, and using OAuth and SSL for authorization for certain secure sites, but also a lot of open worlds. Having a name floating over your head is not a great solution, because sharing your name is a choice and a form of greeting
  • 14:23 – Future of the Metaverse. Create a set of interconnected virtual worlds, where they’re living adjacent to each other. Instead of hypertext links, there will likely be doors. Virtual worlds of the future will be a set of interconnected spaces like the real world. There will be hidden servers that you can’t get to, just as there are private intranets.
  • 15:34 – What inspires you with what you want to see? How people are changed by virtual worlds for the better, more thoughtful, more open, more creative. Virtual worlds are our future. They will become a real added space, and it’ll be a profound expansion of the real world.
  • 16:35 – Are virtual worlds escapist? Technology is getting us the ability to create worlds that are just as sophisticated, complex, and navigable and challenging as the real world. Only escapist if you’re escaping from other people, or simplifying the world too much in a way that isn’t in our best interest. To be successful in Second Life you have to be capable, smart and entrepreneurial.
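
The "sparse voxel octree" from the 8:47 and 10:06 segments can be sketched roughly as follows: each node covers a cube of space and carries a coarse color for distant viewing, subdividing into up to eight children for close-up detail, so a renderer can stop descending once a node is small enough on screen. This is a minimal illustrative sketch under that assumption, not High Fidelity's implementation; the class and method names are hypothetical:

```python
class OctreeNode:
    """Minimal sparse voxel octree node. Sketch only, not High Fidelity's code."""

    def __init__(self, origin, size):
        self.origin = origin   # (x, y, z) corner of this cube
        self.size = size       # edge length of the cube
        self.color = None      # coarse appearance when viewed from afar
        self.children = {}     # octant index (0-7) -> OctreeNode; sparse!

    def octant(self, point):
        # Which of the 8 sub-cubes contains this point? One bit per axis.
        ox, oy, oz = self.origin
        half = self.size / 2
        x, y, z = point
        return (x >= ox + half) | ((y >= oy + half) << 1) | ((z >= oz + half) << 2)

    def insert(self, point, color, min_size=1.0):
        # Coarse LOD: the last-written color stands in for an averaged one.
        self.color = color
        if self.size <= min_size:
            return  # reached finest voxel resolution
        idx = self.octant(point)
        if idx not in self.children:
            half = self.size / 2
            ox, oy, oz = self.origin
            child_origin = (ox + half * (idx & 1),
                            oy + half * ((idx >> 1) & 1),
                            oz + half * ((idx >> 2) & 1))
            self.children[idx] = OctreeNode(child_origin, half)
        self.children[idx].insert(point, color, min_size)

# A 16-meter cube refined down to 1-meter voxels around one point.
root = OctreeNode((0.0, 0.0, 0.0), 16.0)
root.insert((3.0, 3.0, 3.0), "red")
```

Because only occupied octants allocate children, empty space costs nothing, and because every interior node carries its own coarse color, a distant viewer reads a few shallow nodes instead of the full mesh. Subtrees are also natural units to hand to different machines, which is the link to the distributed-computing discussion at 10:06.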

Theme music: “Fatality” by Tigoolio

Here’s a recent talk that Philip Rosedale gave about High Fidelity. Note that this is NOT within High Fidelity, but a government virtual world called MOSES, which is the “Military Open Simulator Enterprise Strategy.”