Ebbe Altberg, CEO of Linden Lab, talks about how Second Life is currently the biggest and most successful virtual world. He shares insights that he's gained over the years when it comes to identity, in-world economies, governance and culture.

He also talks a bit about Second Life's support for the Oculus Rift with their Second Life Project Oculus Rift Viewer.
The consumer virtual reality movement has added a lot of renewed interest and vitality to the idea of virtual worlds and the Metaverse. Ebbe is more cautious about the concept of interconnected but independent virtual worlds, because he sees a lot of challenges in making the overall experience easier and better first.

Some of the biggest open problems he sees with virtual worlds are:

  • having a stable economy
  • easy communication tools
  • social tools to stay connected
  • having great building tools
  • being able to easily enhance or modify your world
  • having discovery mechanisms to find the experiences and communities that will resonate with you

He sees that everything has to become easier before thinking about what data and information needs to be available to be exchanged with other systems.

He also talks about how the mouse and keyboard are not necessarily the best input devices, and that Linden Lab was investigating other input devices. This interview was at SVVRCon, and since that time, Ebbe Altberg has revealed that Linden Lab is looking to rewrite Second Life from scratch to make it better suited for virtual reality.

Ebbe told The Next Web that “With technology, market interest, hardware and software available, now is the time to give it another big shot. We have the experience to do it more than anyone else… We’re not going to constrain ourselves with backwards compatibility.”

Linden Lab certainly has a lot of lessons learned over the years for running a virtual world, and it’ll be interesting to watch to see if they’re able to innovate and adapt to all of the latest virtual reality technology and new input devices.

Reddit discussion here.


  • 0:00 – Key pioneer of virtual worlds
  • 0:41 – Immersive VR in Second Life. There’s a plug-in. Pushing for a real sense of presence. Want other device access with phone, tablet and PC.
  • 2:04 – Insights from SVVR. Not a lot of new discoveries because they're doing it every day. PR peak in 2006-2008. Renewed energy in the space. Lots of technological innovation since then. Hoping for more acceleration in this space
  • 3:27 – Experiences within Second Life as Ebbe Linden.
  • 4:27 – Experiences with VR within Second Life. Lots of work to change the user interface beyond the keyboard and mouse. New input methods needed.
  • 5:27 – Insights on identity. Should be up to the user depending on the context. Has an anonymous alt to be treated less as a Linden employee. Big part is being able to be someone else.
  • 6:26 – Closed aspects and walled garden? It belongs to the user and can be imported elsewhere. Haven’t figured out how to make it mass market. Lots of problems to solve. Compatibility and getting to work for lots of people first before thinking about data portability. Make it more approachable and easier to use for more people first.
  • 8:18 – Economy within Second Life. Hundreds of millions of dollars in GDP. Mostly people selling goods to each other. Non-trivial effort to have a stable economy and have exchanges around the world. People depend upon Second Life as their livelihood.
  • 9:46 – Rules, laws and governance within Second Life. Have rules & laws just like the real world. Try to be as open as possible and not limit people from expressing themselves. Proud of how open and free Second Life is.
  • 11:02 – Code as law, and then how to enforce violations beyond that. Can watch what's happening, but with freedom comes responsibility. Harassment, causing harm, being mean spirited. How do you enable all the good, but prevent the bad? With openness comes the risk that people will abuse their freedoms, but have gotten good at managing that balance. Give people control of the environment they're in.
  • 12:42 – Created artifacts and the range of different cultures from different communities and experiences. Arts, games, role-playing fantasies as vampires, being in a different time and place, experiencing the world through a different set of eyes. It's part of the freedom and diversity of experiences within Second Life.
  • 13:45 – The metaverse and Second Life's connection to that. Interconnected, but independent worlds should come later. First make it easy before making it interconnected. Then talk about what data and information should be interchangeable.
  • 15:56 – Potential of VR is unbounded. Go anywhere, be anywhere with anyone. Second Life is on the leading edge, and they’re way early. Networking, devices and software is getting there, and it’ll be an interesting journey.
  • 16:48 – Biggest open problems in virtual worlds: economy, communication tools, social tools, great building tools, enhancing and modifying the world, discovery mechanisms. Everything has to become easier. Needs to get easier to get in, navigate, communicate, find relevant experiences. Still early and geeky, and hasn't crossed the chasm to reach the early majority. Ease of use is the biggest issue to solve.

Theme music: “Fatality” by Tigoolio

UPDATE: A Linden Lab press rep reached out and shared this statement about their future plans:

Linden Lab is working on a next generation virtual world that will be in the spirit of Second Life, an open world where users have incredible power to create anything they can imagine and content creators are king. This is a significant focus for Linden Lab, and we are actively hiring to help with this ambitious effort. We believe that there is a massive opportunity ahead to carry on the spirit of Second Life while leveraging the significant technological advancements that have occurred since its creation, as well as our unparalleled experience as the provider of the most successful user-created virtual world ever.

The next generation virtual world will go far beyond what is possible with Second Life, and we don’t want to constrain our development by setting backward compatibility with Second Life as an absolute requirement from the start. That doesn’t mean you necessarily won’t be able to bring parts of your Second Life over, just that our priority in building the next generation platform is to create an incredible experience and enable stunningly high-quality creativity, rather than ensuring that everything could work seamlessly with everything created over Second Life’s 11 year history.

Does this mean we’re giving up on Second Life? Absolutely not. It is thanks to the Second Life community that our virtual world today is without question the best there is, and after 11 years we certainly have no intention of abandoning our users nor the virtual world they continually fill with their astounding creativity. Second Life has many years ahead of it, and in addition to improvements and new developments specifically for Second Life, we think that much of the work we do for the next generation project will also be beneficial for Second Life.

It’s still very early days for this new project, and as we forge ahead in creating the next generation virtual world, we’ll share as much as we can.

If we had one message to share with Second Life users about this new project at this point, it would be: don’t panic, get excited! Again, Second Life isn’t going away, nor are we ceasing our work to improve it. But, we’re also working on something that we think will truly fulfill the promise of virtual worlds that few people understand as well as Second Life users.

Stefan Pernar of Virtual Reality Ventures talks about his virtual reality fashion show. He's integrated the Marvelous Designer CAD program with Unity 3D, and created a pipeline so that designers can design fashion pieces and see how they fall and flow on a virtual model.

He and his partner Joel De Ross have also been able to create a lot of buzz for the potential of VR within the Marvelous Designer community, and to get artists such as Android Jones interested in providing pieces to be shown in their demos.

Fashion is largely a 2D design process, but rendering garments in 3D and seeing how they'll look and feel on a model of a specific size is something that VR is very well suited for. Stefan will be targeting a consumer experience that allows people to preview fashion before buying it, and there's even a feature that shows how well the clothing fits on your body through a heatmap of where it'd be too tight or too loose.
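As a rough illustration of how such a fit heatmap might work (this is a hypothetical sketch, not Marvelous Designer's actual algorithm), each point on the garment can be scored by how much it stretches relative to its rest size, then mapped to a color:

```python
# Hypothetical sketch of a garment-fit heatmap: score each measured point by
# how far the worn garment deviates from its rest dimensions, then map that
# strain to a color (red = too tight, blue = too loose, green = good fit).
# The tolerance value is an assumption for illustration only.

def strain(rest_length, worn_length):
    """Relative deviation from the rest length; 0.0 means a perfect fit."""
    return (worn_length - rest_length) / rest_length

def fit_color(s, tolerance=0.05):
    """Classify a strain value into a simple three-color heatmap."""
    if s > tolerance:      # stretched beyond tolerance -> too tight
        return "red"
    if s < -tolerance:     # slack beyond tolerance -> too loose
        return "blue"
    return "green"         # within tolerance -> good fit

# Example: a seam measured at 52 cm worn vs. 50 cm at rest is 4% stretched,
# which falls inside the 5% tolerance.
s = strain(50.0, 52.0)
print(round(s, 2), fit_color(s))  # 0.04 green
```

A real implementation would compute this per vertex from the cloth simulation and write the colors into a texture, but the mapping idea is the same.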

Stefan is also looking to see how clothing manufacturers could start to use VR to allow retailers to pre-visualize products before they’re actually produced to help in the decision as to whether or not they’re interested in pre-ordering it.

If VR is going to go mainstream, then Stefan believes that VR needs to be seen as a viable business tool, and so he co-founded the Australian Virtual Reality Industry Association. He hopes to connect businesses, developers and consumers together to see the potential of VR to help businesses.

Real estate previews with Matterport scans could enable clients to see 10 locations within a half hour, which would be physically impossible given how much time it takes to travel between physical spaces. Remote real estate inspection is another important potential application.

He also sees that data visualization will be a really big enterprise application, and that try-before-you-buy tourism like Experience Japan will be a big application as well.

Finally, he sees that we're still waiting for the big killer VR app, and he expects that VR will have a larger impact on society than most people are expecting.

Reddit discussion here. Also be sure to check out Road to VR’s coverage of this fashion demo.


  • 0:00 – Intro. Virtual Reality Ventures is a VR consultant, and is showing a VR fashion show.
  • 0:43 – Virtual Reality pipeline. Integrated Marvelous Designer CAD program with Unity 3d
  • 1:16 – Difference between 2D and fully immersive, 3D VR. Try on different sizes of virtual garments and see where it's a tight fit
  • 2:35 – Getting a photoscan into VR. Entering in measurement points
  • 2:59 – Target market. Not penetrated consumer market, and so creating a point-of-sales experience. Collaborating Android Jones and other Marvelous Designer artists. More at http://www.virtualrealityventures.com.au/fashion
  • 4:02 – Getting fashion design artists to collaborate. Joel De Ross has been evangelizing to the community
  • 4:58 – Similar to 3D printing revolution. Connect design and artists directly to the customers without going through the manufacturing giants
  • 5:45 – Where to get these fashion products actually produced
  • 6:17 – Marvelous Designer is like Blender, where you define the thickness of materials
  • 6:40 – What is Marvelous Designer? Design tool to design and pre-visualize fashion. Make patterns and define how they’re put together. Define materials and textures and produces a natural fall and flow. Developed pipeline to integrate with Unity3D.
  • 7:46 – Produce videos with virtual models.
  • 8:19 – Designing things within a 3D space? Fashion is a 2D process. Putting it together creates the 3D shape.
  • 9:17 – No real benefit to designing in 3D. It's more about seeing what it looks like in 3D.
  • 9:51 – Physics models of how materials works comes from Marvelous Designer
  • 10:11 – Better than looking at 2D cloth, and likely not 100% accurate, but good enough. Could be used for visualizing products before they're produced so that they can be pre-ordered
  • 11:23 – How’s the response to fashion at SVVR? Trying to make VR as a business tool. It’ll make it more mainstream.
  • 12:25 – Co-founder of the Australian Virtual Reality Industry Association. Tie businesses and developers and consumers together. Help start various meet-ups. Lots of potentials for real estate. Going to speak about VR to a CIO Summit. What would you show a CIO about business applications of VR? Matterport scan of the expo. Consulting people on VR strategy and for remote inspections for real estate.
  • 15:15 – Business applications of data visualization and enterprise software. Potentially network visualization tool for Oculus Rift. Humans are not designed to see data in spreadsheets.
  • 16:28 – Huge impact on society. Try-before-you-buy tourism like Experience Japan. No killer VR app yet, but most people are underestimating how huge VR will be.

Theme music: “Fatality” by Tigoolio

Nathan Burba of Survios talks about some of the custom VR hardware, software and games that they’re developing after recently receiving a $4 million dollar investment.

He talks about the Zombies on the Holodeck experience, and what they're creating in order to have a more untethered VR experience where the user doesn't feel constrained or limited by being in VR. Their goal is to create more natural user interactions within VR in order to create a deeper sense of immersion and presence.

Nathan also talks about his response to Ben Kuchera's article, "Let's put down the guns in virtual reality, and learn to pick up anything else." He sees VR as an opportunity to live out our action-filled fantasies of having light-saber battles or shooting zombies in the face.

He also makes the observation that you have to innovate one step at a time with the VR medium. For example, recreating the experience of going to the beach requires so much haptic feedback to make it feel real that it becomes one of the most challenging problems to solve. Shooting things in video games is a well-established game mechanic that is fairly easy to implement, and that's why they started there. Plus, zombies aren't real, and it's one of his action-filled fantasies that VR allows him to live out with their Zombies on the Holodeck experience.

He talks about his ideas of the mobile, PC, and IMAX location-based tiers within VR, and how Survios is going down a path of creating optimized hardware and software so that VR isn't limited by the computer systems that exist today. And finally, he shares his vision of the potential for VR as being able to allow us to return to our fundamental humanity of running around and playing games, exploring exciting virtual worlds, and expressing our full creative potential.

Reddit discussion here.


  • 0:00 – Intro to Survios, Hardware, Software and Game development
  • 0:33 – Recent $4 million round, new tech and new product to push VR forward
  • 1:07 – Zombies on the Holodeck. Feel like you’re in an 8x8ft space. Visceral experience where there’s natural interactions to have a tremendous amount of presence.
  • 1:54 – Tracking technology, wearable, server, optical camera, VR HMD, Hydras to give sense of presence and be free within a world
  • 2:30 – Going for an untethered experience where you don’t feel any limits or constraints
  • 2:53 – Safety issues with untethered VR. Pieces of tech to help out with that
  • 3:15 – Project Holodeck at USC doing Kinect research, and learned everything he knows about VR HMDs from Palmer Luckey. Take VR and make games with it rather than just art-based and academic research.
  • 4:39 – Ben Kuchera article talking about tired of shooting Zombies in the face. Don’t innovate on too many aspects at once, and so taking an easy to implement idea and starting there.
  • 5:59 – Response to violence in VR. Hard to simulate going to the beach because it’s a felt experience. Killing Zombies is an effective experience. VR is about living out our action-filled fantasies.
  • 7:37 – Nonny de la Pena’s immersive journalism and untethered VR experiences. Using VR for emotional appeal. Push the VR medium forward to make it easier to make this type of content
  • 8:49 – VR tiers: mobile, PC-based, and IMAX location based approach. Based upon computing systems that we have in society today, consumer PC and then super powerful, location-based computers. The best virtual reality experiences will be on nextgen computer systems that are tailor-made for VR — something that Survios is focusing on.
  • 10:10 – Don’t want to be limited by computer systems that exist today
  • 10:41 – VR will let you run around and let you feel human again, and run around, jump, play sports and play games. Games are fundamental, and VR will get us back to our fundamental humanity of running around playing games with each other, exploring worlds, and manifesting our creative potential.

Theme music: “Fatality” by Tigoolio

Jesse Joudrey of Jespionage Entertainment talks about the weekly VR Chat gatherings that have been happening, and how meeting people in virtual spaces compares to meeting them in real life. He talks about what types of body language cues translate, and how they'll be expanding that with hand gestures.

Jesse also talks about the process of stress testing gatherings in VR Chat, where the current bottleneck is when too many people speak at the same time. They're currently limited by what the uSpeak Unity plug-in provides, but they can also host additional servers to help out with the load. He also talks about how the VR Chat SDK provides the ability to customize your own avatar look with animations, as well as customized spaces. Gunter has also gathered an archive of past VR Chat meet-ups.
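Some back-of-envelope arithmetic shows why simultaneous speakers are the bottleneck. The sketch below assumes a naive server relay and an illustrative 24 kbps per voice stream (these numbers are assumptions, not uSpeak's actual codec settings): every simultaneous speaker's stream has to be forwarded to every other client, so the cost grows multiplicatively.

```python
# Back-of-envelope estimate of server upload bandwidth for relayed voice chat.
# Assumptions (illustrative only): each speaker produces one stream at
# stream_kbps, and the server relays it to every other connected client.

def relay_upload_kbps(clients, speakers, stream_kbps=24):
    """Approximate total server upload for a naive voice relay."""
    return speakers * (clients - 1) * stream_kbps

# 20 people in the room with 3 talking at once:
print(relay_upload_kbps(20, 3))  # 1368 kbps of upload for voice alone
```

Doubling the number of simultaneous speakers doubles the load, which is why adding relay servers (as Jesse describes) helps spread it out.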

Finally, he talks about his VR Game jam game of Snow Drift, which is an extreme sports game. He was surprised that it made a lot of people motion sick because he doesn’t suffer from any symptoms of simulator sickness, which he talks about as well.

Reddit discussion here.


  • 0:00 – Intro to VR Chat
  • 0:30 – The VR Chat experience. Almost like being in reality in that you’re meeting other real people from around the world.
  • 1:13 – Body language that translates. Where people are looking. Hand tracking is coming. Fidgety people also move around a lot in VR
  • 2:00 – VR Chat stress tests. How to measure bottlenecks. Simultaneous in-voice communication
  • 3:12 – Different challenges with audio. They use the uSpeak Unity plug-in. Can add servers to help with the load. Lots of data is being pushed when people speak simultaneously
  • 4:15 – VR Chat SDK – Import avatar into the chat. Create own animations, and their own meet-up spaces. Build environments to express themselves.
  • 5:30 – Can people host chats on their own server?
  • 5:56 – What it’s like meeting people face-to-face after communicating with them in VR Chat.
  • 6:38 – What’s next in VR Chat? More personalization.
  • 7:16 – Snow Drift, extreme sports VR Jam game.
  • 7:53 – Doesn’t get motion sickness.
  • 8:39 – Get more info at Snow Drift and VR Chat and Jespionage Entertainment

Theme music: “Fatality” by Tigoolio

I’m joined by the Kite and Lightning team including co-founders Cory Strassburger & Ikrima Elhassan as well as developer/VFX artist John Dewar.

They talk about the creative process behind converting the mini-opera song Senza Peso into a 2D motion graphics film and then into an immersive virtual reality experience, which created some impressive buzz within the VR community.

They also discuss a number of the reasons why they chose Unreal Engine 4 over Unity 3D, and how it enables them to more rapidly prototype the look and feel of their VR experiences. They also have more control by being able to change the source code. They also talk about the decision to record stereoscopic video of the characters rather than using motion-captured avatars.

Cory also talks about his background working on the sci-fi film Minority Report, and his interest in helping develop 3D user interfaces in VR as demonstrated in The Cave & The K&L Station experience.

Finally, everyone talks about some of the major take-aways and lessons learned from working on all of their VR experiences over the past year, where they see VR going, as well as how many exciting, open questions there are right now.

To keep up with all of the latest developments with Kite and Lightning, be sure to sign up for their newsletter listed at the bottom of their website here.

Reddit discussion here.


  • 0:00 – Intros
  • 0:51 – Backstory behind Senza Peso. Getting a DK1 changed everything. Switching to Unreal Engine
  • 2:56 – Comparing Unreal Engine to Unity, and what UE4 provides
  • 5:25 – Translating the story to a 2D motion graphics film, and then translating it into a cinematic VR experience
  • 9:35 – How they did the character capture with stereoscopic video
  • 11:06 – Programming challenges for creating this cinematic VR experience
  • 12:47 – Visual design considerations & working with the Unreal Engine 4 in contrast to what the workflow would’ve been with Unity.
  • 15:29 – Ikrima’s take-aways from working on this project, and Kite and Lightning’s
  • 17:14 – 3D user interface prototypes in the Cave & insights from working on sci-fi films like Minority Report
  • 21:51 – Other 3DUI interface insights from the VR community including Oliver Kreylos’ Virtual Reality User Interface (Vrui)
  • 25:56 – Tradeoffs between file sizes in using different motion capture techniques
  • 31:38 – Experimenting with experiences that are either on-rails, some triggers, completely open world
  • 35:17 – What type of innovations they’re working on in terms of motion capture and graphics. Optimizing their production pipeline processes.
  • 37:14 – Lessons learned for what works and doesn’t work within VR
  • 44:51 – The ultimate potential for what VR can provide
  • 52:35 – What’s next for Kite and Lightning

Theme music: “Fatality” by Tigoolio

Other related and recommended interviews:

Matt Bell of Matterport on their 3D camera for capturing physical spaces. The Matterport camera and software seems like a great solution if you wanted to quickly convert a physical space into a 3D model to use within a game context. Here’s the 3D scan that Paul Grasshoff from Matterport captured of the SVVRCon 2014 exhibition space, which was then imported into VR Chat.

At the moment, the primary use cases for the camera have been real estate, home improvement, remodeling, construction, insurance, documentation of spaces, and crime scene investigation. But since they provide a textured mesh in the form of an *.obj file, you can scan a room and within a half hour get the file and import it into a VR experience that you're developing.

Matterport is primarily a software company, and they use an iPad app to control their $4,500 professional camera. One thing to note is that they charge a monthly fee ranging from $49 to $149 to scan, process and host a number of different files, so there does appear to be a recurring fee to actually use their camera.

You can either host the 3D model with Matterport if you need to have other people look at it online, or you can just export the textured mesh and start manipulating it for your game.

You can also measure distances within these models, and they're dimensionally accurate to within about 1%, which is good enough for remodeling and furniture placement, and certainly good enough to quickly create 1:1 scale environments for VR. The current camera has a range limit of about 30 feet, but cameras with larger ranges will be coming in the future.
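Since the export is a plain *.obj textured mesh, one quick way to sanity-check that a scan really is at 1:1 scale before dropping it into a VR scene is to read the vertex lines and compute the model's bounding box. This is a minimal illustrative sketch; the toy data below stands in for a real Matterport export, and units-in-meters is an assumption:

```python
# Sketch: compute the bounding box of an *.obj mesh from its vertex records.
# OBJ geometry vertices are lines of the form "v x y z"; this ignores
# normals, texture coordinates, and faces, which aren't needed for sizing.

def obj_bounding_box(lines):
    """Return (min_xyz, max_xyz) over all 'v x y z' vertex lines."""
    verts = [tuple(map(float, ln.split()[1:4]))
             for ln in lines if ln.startswith("v ")]
    mins = tuple(min(v[i] for v in verts) for i in range(3))
    maxs = tuple(max(v[i] for v in verts) for i in range(3))
    return mins, maxs

# A toy 4m x 3m x 2.5m "room", with a vertex at each extreme corner:
sample = ["v 0 0 0", "v 4 0 0", "v 0 3 0", "v 0 0 2.5", "v 4 3 2.5"]
mins, maxs = obj_bounding_box(sample)
print([hi - lo for lo, hi in zip(mins, maxs)])  # [4.0, 3.0, 2.5]
```

If the extents roughly match a tape-measure reading of the real room, the scan can be imported at a 1:1 scale without any rescaling.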

Matt also talks about how Matterport is developing some apps for Google’s Project Tango & Intel’s RealSense mobile 3D cameras, as well as lighting considerations when doing a 3D scan of a physical space.

Reddit discussion here.


  • 0:00 – Intro of Matterport. Solving content creation problem of making 3D models of physical spaces. Place a Matterport camera in a space, take a number of shots, and it’ll build a model within ~30 minutes.
  • 0:42 – What are people using this for? Real estate, home improvement, remodeling, construction, insurance, documentation of spaces, crime scene investigation
  • 1:03 – What is the technology behind Matterport? Primesense 3D sensors that capture depth and color. Matterport software puts together the 3D puzzle pieces to create a coherent model of the space that’s dimensionally accurate.
  • 1:57 – Using an iPad for the software end to control the camera. Uses the CPU to align the pieces & GPU to display 3D models.
  • 2:25 – What’s the cost of a Matterport camera? Aimed at professionals at $4500. Writing apps for Google’s Project Tango and Intel’s RealSense, mobile 3D cameras. Built a demo for project Tango to scan a room
  • 3:21 – What’s the output from Matterport? Textured 3D mesh. They allocate the polygons to describe complex models and how they’re optimizing the 3D model.
  • 4:21 – What are some applications for how people are using Matterport? Scanning of famous monuments and world cultural treasures, and using the Oculus Rift to have an immersive experience of a space. Take spaces you care about and start to experiment with them. Make a model of your house, and you can play a game in your house or do remodeling with drag-and-dropping of furniture.
  • 5:48 – Measuring distances within these models. Dimensionally accurate down to around 1%, which is good enough for remodeling and furniture placement.
  • 6:24 – What type of file is it, and can you alter the file? Most people just leave the models on the Matterport platform and it’s embed code. You can download the model as a *.obj and then edit the textured mesh just like any other 3D file.
  • 7:15 – Considerations for lighting? We have the real world as our rendering engine. Generally, light the room so that it looks pleasing to you as you walk around it. What if you wanted to turn off a light later? Could get the geometry and paint it later
  • 8:28 – Have people used it to scan people and faces? Not the focus. More focused on spaces. Mobile app will be optimized for more use cases.
  • 9:15 – Is there a scanning pattern to avoid occlusion issues? Not really, just be sure that camera can see all parts of the room.
  • 9:49 – In a room with high ceilings? What is the range? ~30ft high is the current limit. There are nextgen sensors that have a greater range. Matterport is primarily a software company.
  • 10:30 – Matterport.com is where you can see more models and see their 3D cameras.

Theme music: “Fatality” by Tigoolio

Tony Davidson is the developer of the puzzle-based adventure game Ethereon. He talks about the process of creating a world that you can explore and interact with, where the puzzles are well-integrated into the environment and make sense. He liked to take things apart and put them back together as a kid, and wanted to create a slower-paced VR experience that appeals to this type of exploration.

He talks about his approach for optimizing his VR experience by using only 270k polygons and a handful of texture maps. He set his target platform to Android, and was very mindful of achieving frame rates ranging from 120 to 300 frames per second.
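Those frame-rate targets translate directly into per-frame time budgets. As a quick illustrative calculation (mine, not from the interview), everything in a frame, including rendering those 270k polygons, has to fit inside just a few milliseconds:

```python
# Convert a frames-per-second target into a per-frame time budget in
# milliseconds: 1000 ms divided by the target frame rate.

def frame_budget_ms(fps):
    """Milliseconds available per frame at a given fps target."""
    return 1000.0 / fps

for fps in (120, 300):
    print(fps, round(frame_budget_ms(fps), 2))
# 120 fps leaves about 8.33 ms per frame; 300 fps leaves only about 3.33 ms.
```

This is why keeping the polygon count and texture usage low matters so much when targeting mobile-class hardware.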

Tony also talks about the process of using RiftUp in order to upgrade the resolution of his Oculus Rift. He shares his thoughts on creating more immersion by avoiding photorealism and creating a less familiar, lucid dream type of environment where it’s easier to suspend your disbelief.

Finally, he talks about his vision for creating personal and solo experiences within VR, and his hopes that slower-paced adventure and puzzle-based games will have a resurgence within VR.

Reddit discussion here.


  • 0:00 – Intro. Ethereon puzzle game. Solo project. Exploration game. Worked on Riven. Has an adventure background. Learned 3D because of VR, and made a demo which got him a job. In promo mode now.
  • 1:46 – Myst and Riven were old-school, pre-rendered games. Explore a world and interact. You figure things out to explore more. Ethereon is a physics-based game, well integrated into its environment
  • 3:34 – What happens when people can't figure it out? Some people just give up. Other people really get it. Liked to take things apart as a kid, and people like that will appreciate it. VR mimics reality. Put them in a world where they have to figure it out.
  • 4:48 – How to balance moving people forward without giving too many spoilers. Community will share hints and spoilers. It’s a slower-paced game. People who like to run and gun are ones who have more trouble.
  • 6:35 – Slower pace, and dealing with VR simulator sickness. Had a hard time with it.
  • 7:46 – RiftUp kit details to get higher definition within Oculus Rift
  • 9:59 – Does it account for the barrel distortion?
  • 10:37 – AMOLED screens will have low persistence relative to LCD.
  • 11:13 – Frame rates and adapting for higher resolution. Only 270k polygons. Targeting Android as a minimum platform. Getting 120-300fps
  • 13:02 – Photorealism vs. more immersion in less photo-real, low-poly world. Reflections and other effects help trick people in believing it. Non-photo real is less familiar and it’s easier to suspend your disbelief. Reality and Lucid dream.
  • 15:54 – High contrast working better in VR. Setting it in space.
  • 16:43 – Potential of VR. Not a big fan of the metaverse, and more about creating a personal and solo experience. Adventure, puzzle-based games and slower-paced games to have a resurgence.

Theme music: “Fatality” by Tigoolio

Caitlyn Meeks is one of the creators and the current manager of the Unity Asset Store, a marketplace that is changing game development for both game developers and content creators. She describes how Unity has built an extensible framework where you can extend its functionality through the Asset Store, and so its functionality isn't fixed to the features that come from official releases.

The Asset Store innovates in many different areas, and Unity goes through a slow and methodical process to eventually integrate some of those features into its core engine.

Caitlyn talks about what differentiates Unity from other game engines, and how Unity is responding to recent pressures in the marketplace from Unreal Engine 4. They’re staying the course with their plan and roadmap, and see that they’re focusing on creating a streamlined user experience for game developers. She sees Unity as an unstoppable tortoise who may not always be first to market with all of the new features, but that they’re implemented well in a methodical fashion and with love.

She mentions some of the VR specific plug-ins on the Asset Store including SDKs for Sixense and Leap Motion, Cast AR, CAVE projection systems, and DIS HLA interfaces. She also talks about the free Unity Multipurpose Avatar (UMA) plug-in, which is a Unity-sponsored, avatar creation tool.

Finally, she talks about her vision for how VR will change the humanities and expression in a way that makes us more human and grateful to be alive.

Reddit discussion here.


  • 0:00 – Unity Asset Store. Provides content to game and VR developers: 3D audio, textures and music, and scripts and tools that extend Unity. Exponential growth since 2010. 750k active users forming a community of content producers. It's an ecosystem where people are helping each other. People can make a living off of selling assets. One of the top 3 reasons for using Unity. Bedroom artists are making $10k-$100k. A senior artist from Ubisoft is making more from Unity Asset Store sales. It's changing game development for developers and content creators.
  • 3:08 – How does it differentiate from other engines? Unity is extensible. Asset store can be a stopgap for providing new features. Unity isn’t a static product. New functionality is coming through the asset store. Unity feature list doesn’t include all of the functionality available through the asset store. You don’t need access to the source code to create your dream tool in Unity.
  • 4:45 – What are some popular VR plug-ins? First submission from Palmer Luckey two years ago. SDKs for Sixense and Leap Motion, Cast AR, CAVE projection systems, and DIS HLA interfaces. UMA is the Unity Multipurpose Avatar, a Unity-sponsored avatar creation tool.
  • 6:42 – Binaural audio plug-ins to enable positional audio. New audio implementation coming in Unity 5.
  • 7:20 – Lots of excitement for UE4 and EVE: Valkyrie moving from Unity to UE4, so what is Unity’s approach for counteracting this? Staying the course with a solid product and an unparalleled workflow. Lots of new features coming in 5.0 that have been on the roadmap. Not going to change anything drastically. Always been a bit behind, but they implement features well, methodically and with love. They’re an unstoppable tortoise.
  • 8:55 – Your vision for what you want to see happen in VR. Fan of Cyberpunk and Snow Crash and Second Life. Coming from an artist’s perspective, and VR will be one of the most significant developments in the humanities and human expression. So many different possibilities for creating worlds and experiences that are beautiful, horrifying, mechanical, alien, etc. Goal is to see beautiful morphologies emerge. See worlds, spaces and scenarios that make us more grateful to be alive and things that make us more human.

Theme music: “Fatality” by Tigoolio

Mike Sutherland of YEI Technology talks about the PrioVR immersive body suit, which aims to immerse your whole body into VR experiences. YEI Technology is bringing this technology that they’ve been developing for the military into the consumer gaming market.

They’re hoping to provide the first consumer-grade motion capture suit with their Pro version for $429, and also have a Core full-body option for casual gamers at $369, as well as an upper-body-only Lite version for $289.

Mike talks about succeeding with the PrioVR Kickstarter the second time around, their custom motion controllers, game development plans, interest in finger tracking, target time for suiting up, and more details about their 3-Space Sensor technology.

Their website describes these sensors as “miniature, high-precision, high-reliability, Attitude and Heading Reference Systems (AHRS) / Inertial Measurement Units (IMU). Each YEI 3-Space Sensor uses triaxial gyroscope, accelerometer, and compass sensors in conjunction with advanced processing and on-board quaternion-based Kalman filtering algorithms to determine orientation relative to an absolute reference in real-time.”
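To make the quoted description a bit more concrete, here is a minimal sketch of the core step in quaternion-based orientation tracking: integrating a gyroscope’s angular rate into an orientation quaternion. This is not YEI’s actual algorithm (their on-board Kalman filter also fuses accelerometer and compass data to anchor orientation to an absolute reference); all names and values here are hypothetical.

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def normalize(q):
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def integrate_gyro(q, gyro, dt):
    """One update step: integrate body-frame angular rate (rad/s)
    via q' = q + 0.5 * dt * (q x omega), then renormalize."""
    dq = quat_mul(q, (0.0,) + tuple(gyro))
    return normalize(tuple(qi + 0.5 * dt * dqi for qi, dqi in zip(q, dq)))

# Example: spin at 90 deg/s about the z axis for one second in 1 ms steps.
q = (1.0, 0.0, 0.0, 0.0)
for _ in range(1000):
    q = integrate_gyro(q, (0.0, 0.0, math.pi / 2), 0.001)
# q now approximates a 90-degree rotation about z.
```

Gyro integration alone drifts over time, which is exactly why the real sensors fuse in accelerometer and compass readings (via the Kalman filtering the quote mentions) to keep orientation referenced to gravity and magnetic north.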

For one of the most comprehensive reviews of this technology, I’d recommend checking out this epic review of SVVRCon gear by Oliver “Doc_Ok” Kreylos.

Reddit discussion here.


  • 0:00 – Intro. PrioVR is the first consumer-level, fully immersive gaming suit. Bringing VR into the next stage by bringing your body into VR.
  • 0:39 – Low-cost motion capture suit. Worked with motion sensors for the military, and now bringing that technology to the consumer level. Core suit configuration. Pro version for motion capture puts AAA game toolsets into the hands of indie developers. Also an upper-body-only option for seated VR.
  • 2:07 – Arm controls. Currently have an aftermarket controller, and they’ll be shipping with custom controllers.
  • 2:52 – Kickstarter history. Launched unsuccessful Kickstarter. Learned lessons, and focused on improving the suit, and did better marketing.
  • 3:44 – Using PrioVR for motion capture. Pro suit with 17 sensors costs around $400. Get rich character animations. They have an existing motion capture studio, and have tooling around that.
  • 4:36 – Game titles that will be available. Have an in-house dev team, and getting it into the hands of developers as quickly as possible. Some partnerships developing.
  • 5:32 – What types of user interactions are possible? Doing full-joint reconstruction rather than inverse kinematics. Keyboards and mice don’t work in 3D. Use your hands and reach out; it’s more intuitive for non-gamers. Don’t have to remember buttons.
  • 6:56 – Finger tracking plans? Just focusing on the suit for now, but interested in it.
  • 7:19 – Talk about the types of sensors and fusion system that you’re using. Untethered experience and range.
  • 8:07 – How long does it take to suit up? Working with design companies to make it easier to get on and off than the alpha suit. Target is 15 seconds.
  • 8:44 – What type of latency can you get?
  • 9:26 – What’s the roadmap for when these will be available? Later this year for dev kits.
  • 10:08 – Price points for the different products. $289 upper body. $369 core suit. $429 pro suit. First time a sub-$1,000 motion capture suit will be available. More info at PrioVR.com. Available for pre-order now.

Theme music: “Fatality” by Tigoolio

Philip Rosedale is the creator of Second Life and, more recently, High Fidelity. He talks about a lot of the things he’s doing differently in creating a virtual world for the second time around, including a focus on 3D audio, low latency, and the speed and texture of the experience, as well as using a standard scripting language, JavaScript, rather than rolling their own.

He talks about virtual body language and how a target of 100ms of latency is the threshold for a compelling telepresence experience that is indistinguishable from face-to-face interaction.

Philip talks about how High Fidelity wants to create a set of open standards and protocols so that people can host their own virtual worlds on their own servers. He also talks about their approach to distributed computing to help offload the computing power needed to run a complex and nuanced virtual world, and how mining a cryptocurrency could be part of that process.

Finally, he talks about his vision for the future of the Metaverse, and how these virtual worlds will provide opportunities for participants to be more thoughtful, more open, and more creative than they can be in the real world. He doesn’t see these worlds as necessarily escapist, since they can be as sophisticated, complex, navigable and challenging as the real world. His experience with Second Life was that you have to be just as capable, smart and entrepreneurial to succeed in virtual world environments.

Reddit discussion here.

Be sure to check out this blog post on High Fidelity’s system-level architecture for more details.


  • 0:00 – Intro – High Fidelity. New virtual world taking advantage of changes in technology
  • 0:32 – Motion sensors and the Oculus Rift are driving changes in virtual worlds. They’re driving us to interact naturally in 3D virtual spaces, and the requirement to have the learned skill of using a mouse and keyboard is going to end soon.
  • 1:33 – What types of interactions have you had within High Fidelity with these new tools? Body language, and seeing someone blink. Nodding head is important. Moving hands is remarkable. Got PrioVR working for full upper body animation. Group interactions and face-to-face interactions.
  • 2:47 – Facial capture with either a 3D camera or a webcam with Faceshift, and reconstruct it via 50 floating point numbers. Aiming to get less than 100ms latency to mimic 1-to-1 interactions
  • 3:48 – Using a VR HMD and facial capture at the same time. Can only get one at a time. Oculus is thinking about doing facial capture. Can use a 3D TV and adjust the view as an intermediary between a full VR HMD and a computer screen.
  • 4:54 – Using High Fidelity as a telepresence tool. Use it with their distributed team, and cool to see others.
  • 5:35 – Good enough for enterprise use? Proof point of a recording of someone telling the same story with the same avatar; people can be identified even without sound.
  • 6:20 – Distributed computation at High Fidelity. Limited by centralized hosting. Small computers are spreading quickly, and computers at home can be used to offload some of the processing.
  • 7:30 – Dynamic multicasting with audio. Mixing it in 3D. Dynamically assembling a multicast repeater and can perform a concert in real-time with less latency than in the real world.
  • 8:47 – What is a voxel, and how are you using it? A way to organize space virtually. Represents what things look like at a distance, and enables seeing out to effectively infinite distances. See full mesh topology up close.
  • 10:06 – Hierarchical nesting of voxels for the decomposition of space with a “sparse voxel octree”, and then distributed computing with those. Can create infinitely complex city
  • 10:59 – Other things they’re doing differently from Second Life: audio processing, low latency, speed and texture of the experience, and using a standard scripting language, JavaScript, rather than rolling their own. People want to run their own servers, so it’s a protocol and open-source standard rather than a world unto itself.
  • 11:59 – Cryptocurrency and paying people for helping run the virtual world.
  • 12:56 – How is identity different on High Fidelity? By default, you’re anonymous, using OAuth and SSL for authorization on certain secure sites, but there are also a lot of open worlds. Having a name floating over your head is not a great solution, because sharing your name is a choice and a form of greeting.
  • 14:23 – Future of the Metaverse. Create a set of interconnected virtual worlds, where they’re living adjacent to each other. Instead of hypertext links, there will likely be doors. Virtual worlds of the future will be a set of interconnected spaces like the real world. There will be hidden servers that you can’t get to, just as there are private intranets.
  • 15:34 – What inspires you with what you want to see? How people are changed by virtual worlds for the better, more thoughtful, more open, more creative. Virtual worlds are our future. They will become a real added space, and it’ll be a profound expansion of the real world.
  • 16:35 – Are virtual worlds escapist? Technology is giving us the ability to create worlds that are just as sophisticated, complex, navigable and challenging as the real world. They’re only escapist if you’re escaping from other people, or simplifying the world too much in a way that isn’t in our best interest. To be successful in Second Life you have to be capable, smart and entrepreneurial.
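The “sparse voxel octree” mentioned at 8:47–10:06 can be sketched as follows. This is a hypothetical minimal illustration, not High Fidelity’s implementation: each occupied region is recursively split into eight octants, empty octants are simply absent, and the shallower levels of the tree give the coarse at-a-distance representation Philip describes.

```python
class SparseVoxelOctree:
    """Minimal sparse voxel octree over the cube [0, size)^3.
    Only occupied octants get child nodes; empty space costs nothing."""

    def __init__(self, size, max_depth):
        self.size = float(size)
        self.max_depth = max_depth
        self.root = {}  # each node is a dict: octant index (0-7) -> child

    def insert(self, x, y, z):
        """Mark the leaf voxel containing point (x, y, z) as occupied."""
        node, half = self.root, self.size / 2
        ox = oy = oz = 0.0  # origin of the current cell
        for _ in range(self.max_depth):
            i = ((x - ox >= half)
                 | ((y - oy >= half) << 1)
                 | ((z - oz >= half) << 2))
            if i & 1: ox += half
            if i & 2: oy += half
            if i & 4: oz += half
            node = node.setdefault(i, {})
            half /= 2

    def count_nodes(self, node=None):
        node = self.root if node is None else node
        return 1 + sum(self.count_nodes(c) for c in node.values())

    def nodes_at_depth(self, depth):
        """Occupied cells at a given level: a coarse level-of-detail view,
        usable for rendering distant regions without full detail."""
        level = [self.root]
        for _ in range(depth):
            level = [child for n in level for child in n.values()]
        return len(level)

svo = SparseVoxelOctree(size=8, max_depth=3)
svo.insert(1, 1, 1)
svo.insert(7, 7, 7)
```

Because only occupied branches are stored, vast empty regions are free, and a renderer can stop descending the tree at whatever depth matches an object’s size on screen, which is one way to get the “infinitely complex city” behavior the interview alludes to.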

Theme music: “Fatality” by Tigoolio

Here’s a recent talk that Philip Rosedale gave about High Fidelity. Note that this is NOT within High Fidelity, but a government virtual world called MOSES, which is the “Military Open Simulator Enterprise Strategy.”