Philip Rosedale has a lot of brilliant dreams and visions for how to create a sustainable and open metaverse with High Fidelity. As the founder of Second Life, he talked with us last year about some of the lessons he learned there and how he’s approaching creating a more sustainable and scalable model of interconnected virtual worlds.

This year his focus was on how important it’s going to be to create hyperlinks between multiple virtual worlds. He says that the history of the Internet provides a valuable lesson in the fate of walled-garden platforms like AOL and CompuServe. Even though the content on these platforms was of much higher quality, over time the links between websites with a more rudimentary and less polished look and feel ultimately won out. Philip cites Metcalfe’s law, which states that the value of a telecommunications network is proportional to the square of the number of connected users of the system. As each virtual world links off through portals to other virtual worlds, it becomes that much more valuable and compelling.
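Metcalfe’s law is easy to make concrete. Here’s a minimal sketch (the `metcalfe_value` helper is purely illustrative, not anything High Fidelity ships) comparing two isolated 100-user worlds to one linked 200-user network:

```python
def metcalfe_value(n: int) -> int:
    """Metcalfe's law: a network's value grows with the square of its
    number of connected users (n * (n - 1) pairwise links, commonly
    approximated as n**2)."""
    return n * n

# Two walled gardens of 100 users each vs. one linked network of 200 users:
isolated = metcalfe_value(100) + metcalfe_value(100)  # 10000 + 10000 = 20000
linked = metcalfe_value(200)                          # 40000
print(linked / isolated)  # linking doubles the total value: 2.0
```

By this measure, linking worlds together is worth more than the sum of the isolated worlds, which is the crux of Philip’s argument against walled gardens.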

This insight seems to be at the core of High Fidelity’s approach of using open-source-licensed technology so that people can stand up their own hosted servers containing the code for their virtual worlds. In fact, part of Philip’s long-term vision is to harness the distributed computing power of many desktop computers and mobile phones to create a virtual world equivalent to the square footage of the entire earth, one that could concurrently host virtual experiences for all 7 billion people on the planet. This vision is what keeps Philip going to work every day to build the technological backbone to make it happen.

Currently, linking between virtual worlds is a bit of an open problem in terms of how to actually pull it off in a seamless fashion. Text on a website can unobtrusively link to other websites, and there are metadata cues that add contextual information about where links will lead. There are no equivalent standards within a VR environment with 3D objects; the closest metaphor is using a door, a portal, or a completely different building to navigate between virtual worlds. There are potential perceptual hacks that could be exploited, but Philip cautions that there may be very physical limitations on how we navigate virtual worlds, the equivalent of the disorienting effects of providing contradictory information to our perceptual system and thereby causing simulator sickness.

Philip was also really excited to have created some shared physics simulations that enable games like air hockey to be played in VR. This adds a level of physical reality that could add to the coherence of the virtual experience, and it also provides a lot of opportunities for engaging in fun and playful activities with other people within High Fidelity environments. A consistent theme amongst social VR applications from AltSpaceVR to Convrge to VR Chat is that all of them have been adding more social gaming experiences within their apps.

If I were to bet who has the most viable and sustainable approach for creating the metaverse, then my money would be on High Fidelity’s strategy and open source technology stack. I’m really excited to see how Philip Rosedale and the rest of High Fidelity continue to evolve over time.

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

First Contact! from High Fidelity on Vimeo.

Graham Gaylor & Jesse Joudrey have been working on VR Chat for well over a year now, and they’ve been consistently providing social spaces for some early experimentation with social VR. They talk about some of the latest features that they’ve been implementing in their platform, and what they’re doing to help make their dreams of the metaverse a reality.

Jesse says that social interaction within a VR experience is a feature, not the final destination of a VR environment. More and more people will want to be doing or seeing something specific within the VR experience, and that’s part of the reason why VR Chat has created an SDK with a set of Unity scripts to help easily add social features to your VR experience.

They’re also interested in linking virtual experiences together, and in helping to do some early experiments in building out the metaverse using tools like Unity that are capable of creating highly performant and interactive immersive environments.

One of the requests that people have had with VR Chat was to be able to know when their friends were online, and so Jesse and Graham have been implementing user accounts so that people can have friends lists. They also have a whole system for creating rooms within VR Chat, and whoever creates a room has a set of moderator privileges over whether it’s hidden, open, or restricted to a whitelist of people or friends of the moderator. They’re starting to implement the technological foundation for the type of chat room environments described in the novel Ready Player One. I think this is going to be really powerful: you can invite people over to your personalized VR chat environment, much like you might invite people to come hang out at your home.
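The room permission model described above can be sketched in a few lines. The field and method names here are my assumptions for illustration, not VR Chat’s actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Room:
    """Hypothetical sketch of a VR Chat-style room: the creator is the
    moderator, and access is either open or limited to a whitelist."""
    owner: str
    hidden: bool = False                         # hidden rooms aren't listed
    whitelist: set = field(default_factory=set)  # empty set means open to all

    def can_join(self, user: str) -> bool:
        # The owner/moderator can always join their own room.
        if user == self.owner:
            return True
        # Otherwise the room is open, or the user must be whitelisted.
        return not self.whitelist or user in self.whitelist

party = Room(owner="graham", hidden=True, whitelist={"jesse"})
print(party.can_join("jesse"), party.can_join("stranger"))  # True False
```

The nice property of this shape is that an empty whitelist naturally means an open room, so "open" and "friends-only" are the same code path.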

Some of the other features that VR Chat has implemented enable different types of games and interactions while hanging out socially. For example, Jesse created a race track where you can ride around in a car with four other people while racing against others. These physical proximity constraints start to replicate taking a road trip with friends, but in a way that’s super silly and a bit absurd. Jesse says that one of the biggest lessons from creating experiences like this is that people tend to laugh a lot more together.

In the future, I foresee that virtual reality will allow us to play in a way that we don’t tend to in real life. It’s going to break us out of socialized patterns for how we normally connect and relate to each other, and I think there are a lot of rich opportunities for finding new and fun ways for people to play and spend time together, whether they live across the country or just decide to meet within a room-scale VR environment or VR arcade.

For more information, be sure to check out VRChat and drop by one of their upcoming events.


When Shawn Whiting and Hayden Lee decided to create a social VR application, they decided to create the simplest minimum viable product. It was literally just a cube and a name tied to head movements, but they held a dance party where over 50 people showed up for at least a half hour each. Even though the avatars were simple blocks, knowing that they were tied to the movements of a real human being on the other side proved to be extremely compelling.

That’s a perfect story encapsulating the iterative approach and rapid feedback that have been the driving philosophy for Convrge. Known for its minimalist floating heads, Convrge has adopted a low-poly aesthetic that has proven to be lightweight and super compelling.

Shawn and Hayden have found a number of ways to keep users engaged, with social activities ranging from dance parties, VR developer talks with notable speakers from the VR circuit, virtual campfire hangout sessions, and YouTube watching parties to livestreams of VR meetups and of breaking news and press conferences from VR companies.

As a particularly salient example, here’s a reaction video of over 73 people in VR watching the livestream of the Oculus CV1 press conference where it was announced that the Rift would be shipping with Xbox controllers. This was clearly disappointing on some level for the VR enthusiasts who really wanted to see 6DOF controllers ship with the Rift, but later in the video Palmer announces the Oculus Touch, and there’s clearly a lot of excitement that this is something coming in the future. Note that the echo in the sound is due to a recording error caused by having two tabs open.

Shawn and Hayden were even livestreaming the VR mixer afterparty onto the theater screen in Convrge while at the same time projecting that scene onto a screen at the party. Here’s an excerpt of a little dancing that I did at that party, which Hayden mentions within the interview.

Convrge has also recently added the ability for users to play games with each other in VR, which was by request. They talk about some of their future plans and what some of the most popular feature requests have been.

Finally, Shawn advises anyone creating a social VR application to get people involved as quickly as possible so that you can start to get their feedback about what they want. In other words, don’t spend a year and a half developing a polished social experience without ever once having people use it.

I’m looking forward to seeing how the community and the Convrge platform continue to evolve, and I encourage anyone who hasn’t made time to check it out to drop by sometime. It’s a very welcoming and open community.


Hallie McConlogue has been waiting for virtual reality to go mainstream for over 20 years now. She used to spend many consecutive hours within a VR design and drawing tool called HoloSketch at Sun Microsystems’ VR lab, where she did interface design, modeling, art direction, and animation for 7 years. She worked on the first claymation for real-time playback within a head-tracked virtual reality HMD.

I talked with Hallie right after she had tried out the Crescent Bay demo for the first time at SVVRCon. She was marveling at how well the head tracking from the system she used over 20 years ago has stood the test of time, and even the framerates and resolution were comparable in her mind. Of course, computer graphics have improved exponentially, and the Sun system that she was using cost over $53,000.

She talks a bit more about using HoloSketch, which is a program developed in 1994 by Michael Deering. It sounds like the metaphoric grandfather of an interactive painting and design program like Tilt Brush.

One of the things that Hallie is interested in doing is applying the insights that she’s gained from improv acting and comedy over the years. She argues that improv theater actors have been on the vanguard of interactive media for as long as improv has been around. She thinks that improv has a lot of lessons to teach VR designers for how to create a compelling and engaging interactive experience.

Hallie also advocates for the importance of facial tracking, especially being able to track the eyes. Where the eyes are looking can give so much information about how to navigate social situations, and she says that the eyes are a huge feedback mechanism for how improv actors communicate with each other.

Hallie also talks about getting more women involved in virtual reality, and provides some feedback about what types of experiences she’s interested in. She says that romance novels are like porn for women, and that there’s a romantic and emotional component worth exploring through the medium of virtual reality. And again, a lot of these types of romantic experiences come back to being able to track facial expressions and to have a strong sense of eye contact with either the audience or characters within a story.

Incidentally, Maria Korolov, the editor of Hypergrid Business, has started a Women in Virtual Reality website aggregating women speakers and professionals who are working with VR.


Nick Donaldson is a Senior Designer at Epic Games, and I caught up with him at Oculus Connect to talk about the design process behind The Showdown Demo. This demo was meant to be the highest-fidelity VR experience with AAA-game polish up to that point in time. They ended up re-using a lot of assets that were designed for other experiences, but in the end The Showdown Demo was an epic slow-motion walk through a war scene unfolding as you slowly move down a road towards a giant robot who leans down to scream in your face. Here’s a video of what was shown within the demo.

Nick talks about some of the easter eggs and design process that they went through in order to create this experience. At Oculus Connect, he and Nick Whiting talked about some of the optimizations that they had to make within Unreal Engine in order to get the Showdown Demo operating smoothly. Here’s a video of that talk:

One of the fun anecdotes that Nick shares in this interview is the first social VR interaction that he had with Nick Whiting while working on the Couch Knights demo. They were still in the early debugging phases of creating a multiplayer experience, and he talks about how much body language and information they were able to communicate with each other non-verbally. Nick Whiting looked over at him and just froze. Nick Donaldson shared a gut-level response of “Sup?” with a head nod, which Nick replied to with his own head nod and “Sup?” It sounds like a simple interaction, but until you’ve had your first social VR experience like this, it can be quite profound knowing that there’s a human being on the other side of all of the movements of an electronic avatar.

Nick says that Epic Games is very interested in continuing to explore the medium of virtual reality, and that they’re really excited about the potential for creating more games and other VR experiences moving forward.

For more information, be sure to check out the Road To VR write-up of their Unreal Engine optimization talk or the summary from the Unreal Engine blog.


Ryan McMahan is an Assistant Professor of Computer Science at the University of Texas at Dallas, where his research focuses on the effects of system fidelity in virtual reality (VR) applications and systems. He has an interest in VR training applications and was presenting at IEEE VR about a realistic ladder-climbing technique called “march-and-reach.”

Climbing a ladder in VR sounds like it’d be a fairly straightforward problem, yet this interview shows all of the various nuances and design decisions that had to be made in order to both accurately replicate the feeling of virtually climbing a ladder and do it in a way that could teach ladder safety.

Previous ladder-climbing techniques used either purely hand-based or purely feet-based methods for controlling vertical locomotion. Ryan decided to use the feet for controlling vertical locomotion, and to require at least two points of contact at all times, otherwise the person would fall off the ladder.

Part of why VR training can be so effective is that you can show people what a failure condition looks and feels like without putting anyone’s physical safety in danger. They can make people fall off a ladder in VR and show that they’d break their legs, and this level of realistic fear can actually help create episodic memories that help people more effectively remember to always keep two points of contact while climbing a ladder.

There were other nuanced changes that they had to make in order to make the simulation more realistic. While you’re climbing a real ladder, you’re holding on to the rungs and leaning backwards, so that when you look down you can see your feet. Yet if you’re walking in place and look down to see where your feet are, then you actually have to lean forward if the feet are tracked accurately, which would make people lose their balance and potentially fall down in real life. So Ryan had to offset the feet by 10 degrees forward so that people would be able to see their feet in VR while still maintaining their balance.
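As a rough sketch of that correction, the tracked foot position can be rotated forward in the sagittal plane around a pivot such as the hip. The pivot choice and the math below are my assumptions for illustration; the interview only specifies the 10-degree forward offset:

```python
import math

def offset_foot_forward(foot, pivot, degrees=10.0):
    """Rotate a tracked foot position (z = forward, y = up) forward
    around a pivot point by the given angle, so the virtual feet stay
    visible when the user looks down, as on a real ladder lean."""
    theta = math.radians(degrees)
    dz = foot[0] - pivot[0]
    dy = foot[1] - pivot[1]
    # Standard 2D rotation in the sagittal (forward/up) plane.
    rz = dz * math.cos(theta) - dy * math.sin(theta)
    ry = dz * math.sin(theta) + dy * math.cos(theta)
    return (pivot[0] + rz, pivot[1] + ry)

# A foot 1 m directly below the hip ends up ~17 cm forward of the hip.
z, y = offset_foot_forward(foot=(0.0, 0.0), pivot=(0.0, 1.0))
```

The key point is that the rotation happens in rendering only: the user still marches in place upright, but their virtual feet appear where they would be on a real ladder lean.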

Here’s more information on the march-and-reach paper that Ryan presented at IEEE VR.

March-and-Reach: A realistic ladder climbing technique

In most 3D applications, travel is limited to horizontal movement. A few 3D travel techniques allow for vertical travel, but most of them rely on “magic” abilities, such as flying. We sought to develop a realistic vertical travel technique for climbing ladders. We have developed March-and-Reach, with which the user marches in place to virtually step on lower ladder rungs while reaching to virtually grab higher rungs. We conducted a within-subject study to compare March-and-Reach to two prior ladder-climbing techniques. Results indicate that users consider and treat March-and-Reach as the most realistic ladder climbing technique.


Luca Marchetti is the CEO of Studio Evil, and he talks about developing a serious game called Relive. Relive is a sci-fi adventure game set on Mars that is designed to teach people CPR. There were demos taking place in the hallways at Oculus Connect with people wearing an Oculus Rift while pressing on a Mini-Virtual Reality Enhanced Mannequin that was being tracked by a Kinect.


Here’s a video of the Mini-Virtual Reality Enhanced Mannequin and Kinect integration, which was being worked on before Palmer launched the Oculus Rift Kickstarter in 2012.

Here’s a description of the mannequin:

The project involves the development of a serious game and a self-learning software specifically dedicated to quality cardiopulmonary resuscitation.
We developed Mini-VREM (Mini-Virtual Reality Enhanced Mannequin), a CPR feedback device with a new motion detection technology including a Kinect sensor and software specifically designed to analyse chest compression performance and provides real-time feedback in a cardiac arrest simulation training setting.


Yon Visell is an assistant professor at Drexel University focusing on haptic display engineering at the RE Touch Lab. His research tries to figure out the neuroscientific and physical basis of human tactile sensation and perception. It’s in the process of trying to model physical sensations in the hands that he started to look at stochastic physics models, which have a random probability distribution pattern that may be analyzed statistically but not predicted precisely.

At IEEE VR, Yon was presenting a long paper titled “Fast physically accurate rendering of multimodal signatures of distributed fracture in heterogeneous materials.” In essence, he’s able to model what it looks and sounds like when wood breaks, at a sampling frequency of 48 kHz. There are thousands of fibers that could be modeled individually, but that’d be too computationally intensive for a real-time physics simulation. However, there’s a pretty consistent probability distribution for how these fibers fail that can be approximated with remarkable accuracy.
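The core trick the abstract names is the inverse transform method of random number sampling: draw a uniform random number and push it through the inverse CDF of the fiber-strength distribution, so individual fibers never need to be simulated one by one. The Weibull distribution below is a common model for brittle-fiber strength, used here as an illustrative assumption rather than the paper’s exact choice:

```python
import math
import random

def weibull_inverse_cdf(u, shape=2.0, scale=1.0):
    """Inverse CDF of a Weibull distribution, a common model for the
    strength of brittle fibers: F^-1(u) = scale * (-ln(1 - u))**(1/shape)."""
    return scale * (-math.log(1.0 - u)) ** (1.0 / shape)

def sample_fiber_strengths(n, seed=42):
    """Inverse transform sampling: map Uniform(0, 1) draws through the
    inverse CDF to get n random fiber-failure thresholds."""
    rng = random.Random(seed)
    return [weibull_inverse_cdf(rng.random()) for _ in range(n)]

strengths = sample_fiber_strengths(10_000)
# All strengths are non-negative, and the sample mean approaches the
# Weibull mean, scale * gamma(1 + 1/shape), about 0.886 for these parameters.
```

Because each draw is a single logarithm and power, tens of thousands of failure events per second are cheap enough to drive audio-rate (48 kHz) synthesis in real time.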

Here’s the abstract of his paper:

Abstract: This paper proposes a fast, physically accurate method for synthesizing multimodal, acoustic and haptic, signatures of distributed fracture in quasi-brittle heterogeneous materials, such as wood, granular media, or other fiber composites. Fracture processes in these materials are challenging to simulate with existing methods, due to the prevalence of large numbers of disordered, quasi-random spatial degrees of freedom, representing the complex physical state of a sample over the geometric volume of interest. Here, I develop an algorithm for simulating such processes, building on a class of statistical lattice models of fracture that have been widely investigated in the physics literature. This algorithm is enabled through a recently published mathematical construction based on the inverse transform method of random number sampling. It yields a purely time domain stochastic jump process representing stress fluctuations in the medium. The latter can be readily extended by a mean field approximation that captures the averaged constitutive (stress-strain) behavior of the material. Numerical simulations and interactive examples demonstrate the ability of these algorithms to generate physically plausible acoustic and haptic signatures of fracture in complex, natural materials interactively at audio sampling rates.

Yon also talks about the latest and most cutting-edge haptics research, including what the best research-grade machines are capable of, such as simulating force feedback at a sampling frequency of 5,000 to 10,000 samples per second with a desired latency of 1 sample.

He also discusses the frequency bandwidth requirements of other haptic systems, including the detection of temperature changes, pressure changes, and our sense of touch.

Here’s an excerpt from Yon’s RE Touch Lab describing the overall research focus:

What do we feel when we touch and manipulate objects in the world?

How do mechanical signatures of contact elicit conscious percepts of touch?

How is touch perception enabled through structural and functional specializations in the body?

In our lab, we study mechanisms underlying haptic perception and action, and the neuroscientific and biomechanical basis of touch. Our long-term goal is to uncover the biological (neural and mechanical) computations that enable haptic interaction, when movement-dependent sensory signals are concurrently available via multiple perceptual channels.

We conduct theoretical and behavioral studies of haptic perception to illuminate contributions of different mechanical cues and motor behaviors. We aim to develop novel hypotheses about how real haptic objects are perceived, and how they can be simulated with new technologies.

Yon says that the two biggest haptics conferences in the US are the IEEE World Haptics Conference, which just happened from June 22-26 in Illinois, and the IEEE Haptics Symposium 2016, which is coming up next year.

After talking to Yon and other researchers at the IEEE VR conference, it’s pretty clear to me that haptics is going to be very use-case specific. A generalized VR haptic solution that’s affordable to consumers feels a long way out, perhaps 5-15 years, especially if you imagine trying to achieve a 5-10 kHz sampling rate combined with less than 100 microseconds of latency. In the meantime, we can get the most out of task-specific haptic devices that provide just enough haptic feedback to give that extra amount of presence and immersion.


At the time of this interview, Rob Swatton was the Research & Development Manager at Earth & Sky Ltd, a New Zealand-based astronomy organization. Earth & Sky does public outreach and education as well as scientific research into discovering exoplanets through a technique called microlensing.

In his R&D role, Rob was brainstorming different ways that virtual reality could be used to help visitors get a better understanding of our universe, and also potentially help with the process of discovering new exoplanets through distributed pattern recognition, like SETI@home but with people’s brains.

He sees that VR could help us visualize processes that happen on a grand time scale spanning billions of years. There are visualizations of scale, distance, and time that transcend our metaphors and our ability to describe them to people. Most existing visualizations are either really esoteric or rely upon complex mathematical models that are difficult for the public to fully comprehend. Rob speculates that VR could show the process of a galaxy forming, how a nebula creates new stars, or what a black hole would look like. It’d be like a timelapse visualization spanning billions of years.

Rob also imagines that some of the VR experiences would benefit from having an interactive guide to help explain difficult concepts. A visitor would be able to see a concept within VR, but also have an expert on hand to be able to ask questions about it as the experience unfolded.

One of the more speculative ideas that Rob had was thinking about how VR could make the process of exoplanet discovery more interactive with crowd-sourced pattern recognition tasks that people could do at home. He imagines something along the lines of what SETI@home is doing with distributed computing, but doing it with people’s brains.

They would likely have to do some type of filtering or symbolic translation of the raw data for the public to understand the concepts and what they’re looking for. So it’d be a balance between making it conceptually accessible to more people and maintaining the integrity of the data.

There are a lot of unanswered questions about how something like this would actually play out and be implemented in practice, but it’s an interesting idea to crowd-source pattern recognition in order to help with different scientific research endeavors. One example where this is already happening is the Foldit game, which has gamified the process of protein folding while allowing people to contribute to scientific research.

Rob says that you wouldn’t necessarily need VR to do the data analysis and number crunching on the raw data of how gravitational effects bend light in the microlensing process of exoplanet discovery. But VR could make the process more compelling to participate in.

Learn more about some of the ideas for how VR could be applied for education and scientific research by listening to this interview with Rob.


Diðrik Steinsson has a vision of a future where virtual reality headsets can provide a more ideal work environment. MureVR is creating VR workplace environments that are designed to eliminate stress and increase productivity with the help of environmental psychologists.

There’s been a trend within the tech industry to replace isolated offices with more open environments. MureVR’s website cautions that, “the benefits of an open collaborative workspace are paid with concentration difficulty and less privacy as well as increased level of stress. This is the problem we aim to solve.”

Diðrik talks about collaborating with environmental psychologists to create VR environments that reduce stress, use the best colors for the task at hand, and provide tailored spaces that address specific needs and aesthetics.

One open question is how much the social stigma and perceived isolation of wearing virtual reality goggles will play out within the corporate environment. On the one hand, it may actually decrease unnecessary disruptions and allow people to be more focused and productive. On the other hand, there are potential negative perceptions and impacts of using virtual reality headsets within an environment and context in which you’re expected to be social and interact with your co-workers as part of helping each other and sharing knowledge.

That said, anyone who works in an open office can attest to how often headphones are used to socially isolate yourself from the chaos of your surroundings, and so perhaps wearing a VR headset would just make that isolation that much more explicit.

Listen in to our conversation as we explore using VR to take breaks and increase productivity within your virtual workplace environment.
