The new JavaScript SDK from AltspaceVR is going to allow front-end web developers to quickly and easily create social VR applications. The AltspaceVR SDK is based upon three.js, which means that developers can create a WebVR-enabled application on the web, and then bring up that experience from a Chromium browser within AltspaceVR and have a fully immersive VR experience that is aware of the other people within Altspace.

WebVR still suffers from a lack of the optimization needed to hit target latency specifications, so Altspace provides the user with a native Unity application that is performant enough to run at the desired framerate and latency. The available APIs also allow for the exchange of information with the web application, including “natural social interactions, synchronized multiplayer capabilities, networking, VOIP and immersive virtual environments.”
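The cross-context idea above, the same three.js app running either in a plain browser or inside the AltspaceVR client, can be sketched as a simple capability check. The global `altspace` object and its `inClient` flag are assumptions for illustration, not a confirmed part of the SDK's API; consult the AltspaceVR developer documentation for the real interface.

```javascript
// Sketch: pick a rendering path depending on where the app is running.
// The `altspace` global and `inClient` flag are hypothetical names used
// here for illustration only.
function pickRenderer(globalScope) {
  if (globalScope.altspace && globalScope.altspace.inClient) {
    return 'altspace'; // inside the AltspaceVR client: native rendering
  }
  return 'webgl';      // plain browser: fall back to a WebGL renderer
}

// In Node (or any non-Altspace context) there is no `altspace` global:
console.log(pickRenderer(globalThis)); // → "webgl"
```

The appeal of this pattern is that the same three.js scene graph drives both paths; only the renderer differs.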

I talked with AltspaceVR co-founder and director of engineering Gavan Wilhite about this new SDK, what it will enable front-end developers to do, and some of the implications of a cross-platform VR environment that supports the Vive, Oculus Rift, and Gear VR headsets.

Gavan also talked about the new live coding capabilities and CodePen integration, which will enable some really interesting interactive and social construction of VR experiences. Typing blind is still a barrier to entry here, but it will likely be a useful skill to develop in order to quickly and easily experiment with different VR experiences while in VR. And as Gavan noted, often it’s the accidents and glitches that end up being some of the most entertaining and fun things to happen within VR.

Gavan also mentioned that AltspaceVR is offering grants of up to $150,000 to developers to kickstart these types of multiplayer, open-web VR apps that can be used within AltspaceVR.

You can visit the AltspaceVR Developer Portal to download the new JavaScript SDK or apply for the AltspaceVR Developer Initiative Program.

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Frank Steinicke is a professor of Human-Computer Interaction in the Department of Informatics at the University of Hamburg. His research into VR strives to understand the limitations of human perceptual, cognitive and motor abilities in order to reform 3D user interactions within computer-mediated realities.

Frank is the outgoing chairman of the 3D User Interface (3DUI) conference, which happens just before the IEEE VR conference each year. He talks about 3DUI, human perception, and some of his research in using multi-touch screens in conjunction with depth cameras in order to create more non-fatiguing user interactions within virtual environments.


Madis Vasser is a psychology student at the University of Tartu’s Virtual Neuroscience Lab, and he collaborated with the computer science department to create a VR toolbox for doing experimental psychology research. He was showing off a demo of a change blindness experiment that he created within Unity at the IEEE VR conference.

Change blindness is the effect whereby it is difficult to notice changes in your environment. It’s not easy to make objects disappear in physical reality, and so we’ve evolved to not really notice subtle changes in our surroundings. But in VR, it’s very easy to make small changes to your environment that are really difficult to perceive.

The VR experience called Sightline really exploits this psychological phenomenon to great effect in order to cut between scenes in VR. So even though I’m intellectually familiar with the change blindness concept, when I tested my own perceptual acuity in a controlled experiment I still found the effect way more difficult to perceive than I would have expected. Here are more results from Madis’ change blindness research.
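A common way to exploit change blindness in VR is to alter the scene only while an object is outside the viewer’s field of view. The following is a minimal sketch of that rule; the function names and FOV threshold are illustrative, not taken from Madis’ Unity toolbox.

```javascript
// Angle between two 3D direction vectors, in radians.
function angleBetween(a, b) {
  const dot = a.x * b.x + a.y * b.y + a.z * b.z;
  const la = Math.hypot(a.x, a.y, a.z);
  const lb = Math.hypot(b.x, b.y, b.z);
  return Math.acos(dot / (la * lb));
}

// A swap is "safe" (likely unnoticed) when the object lies outside
// half the horizontal field of view.
function safeToSwap(viewDir, toObject, fovRadians) {
  return angleBetween(viewDir, toObject) > fovRadians / 2;
}

const forward = { x: 0, y: 0, z: -1 };
console.log(safeToSwap(forward, { x: 0, y: 0, z: -1 }, Math.PI / 2)); // → false (directly ahead)
console.log(safeToSwap(forward, { x: 0, y: 0, z: 1 }, Math.PI / 2));  // → true (behind the viewer)
```

In practice experiments also trigger swaps during blinks or saccades, when even in-view changes can go unnoticed.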

Madis talks about creating a generalized VR toolkit for non-programming psychology students to do other research projects, as well as some of the ethical implications of replicating controversial psychological experiments like the Milgram obedience experiment. While no ethics review board would allow psychology students to replicate this study, there’s nothing preventing people from recreating it as a VR experience. There are many open ethical questions here, and it will be interesting to see how they play out both in more controlled research environments and in the free market of consumer VR experiences.


Sébastien Kuntz is the founder of MiddleVR, which is a middleware solution that allows you to connect to a wide range of consumer and industrial VR peripheral devices. He talks about some of the other VR locomotion and 3DUI features that are included within the MiddleVR Unity integration.

Sébastien was on a couple of different panel discussions at the IEEE VR conference including talking about open source vs closed source solutions in VR, as well as the future of projection-based VR solutions in the consumer market. One interesting point that Sébastien made is that there will always be projection-based environments because you can keep adding projectors to increase the number of pixels and resolution fidelity in these CAVE environments.

Sébastien also makes some interesting distinctions between the consumer and academic communities. While the consumer VR market is worried about providing solutions to problems right now, the academic IEEE VR community is more focused on the problems that are 5-10 years down the road. But the IEEE VR community is also really focused on asking the right questions, and so it’s more about leaving the conference with more questions and research trajectories than it is necessarily about finding answers to an immediate problem.


Max Pfeiffer was at IEEE VR showing a demo where he’s experimenting with off-the-shelf electrical muscle stimulation (EMS) massage devices in order to provide haptic feedback for pointing with your hand within 3D virtual environments. He tracks the hands with a depth-sensor camera to detect finger pointing, and then stimulates the arm muscles to provide subtle haptic feedback. He did a Fitts’ Law analysis of the efficiency of this technique, and found that “results demonstrate that both EMS and vibration provide reasonable addition to visual feedback. We also found good user acceptance for both technologies.”
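For context, a Fitts’ Law analysis models how long it takes to acquire a target as a function of its distance and size. A small sketch of the standard Shannon formulation follows; the constants `a` and `b` below are placeholders, since in a real study they are fit by linear regression on the measured movement times.

```javascript
// Fitts' law (Shannon formulation): the index of difficulty is
// ID = log2(D/W + 1), where D is the distance to the target and W its width.
function indexOfDifficulty(distance, width) {
  return Math.log2(distance / width + 1);
}

// Predicted movement time MT = a + b * ID.
// a and b are placeholder regression constants, in seconds.
function predictedMovementTime(distance, width, a = 0.1, b = 0.2) {
  return a + b * indexOfDifficulty(distance, width);
}

// A target twice as far away (same size) is harder to hit:
console.log(indexOfDifficulty(0.5, 0.1) < indexOfDifficulty(1.0, 0.1)); // → true
```

Comparing the fitted `b` (and derived throughput) across feedback conditions is how studies like this one compare EMS against vibration.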


There are some open questions within the data visualization community as to what benefits the third dimension might add when visualizing information that doesn’t have an inherent spatial component. Ragaad Al-Tarawneh was at the 3DUI conference presenting a paper called “Utilization of Variation in Stereoscopic Depth for Encoding Aspects of Non-spatial Data.”

Ragaad recently completed her Ph.D. from the Computer Graphics and HCI group at the University of Kaiserslautern. She studied how to use different graph visualization techniques in order to help users better understand the behavior of complex data sets. She talks about a project collaboration with the Fraunhofer Institute about visualizing the failure mechanisms of complex embedded systems.

She talks about one of the useful applications of stereoscopic visualization techniques that she discovered in her research, which is called ExpanD. ExpanD is a “technique for expanding or contracting the nodes in order to align graph nodes in the 3D space with minimal occlusion.” She found that “stereoscopic depth can be used to encode data aspects of graphs under certain circumstances.”
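The core idea of encoding a non-spatial attribute as stereoscopic depth can be sketched as a linear mapping of a node value into a comfortable depth band around the screen plane. The depth range and node fields below are illustrative assumptions, not taken from the paper.

```javascript
// Map a non-spatial value into a stereoscopic depth band (in meters
// relative to the screen plane). The [-0.5, 0.5] band is an arbitrary
// illustrative choice.
function encodeDepth(value, vMin, vMax, depthMin = -0.5, depthMax = 0.5) {
  const t = (value - vMin) / (vMax - vMin);      // normalize to [0, 1]
  return depthMin + t * (depthMax - depthMin);
}

// Hypothetical graph nodes with an "importance" attribute to encode:
const nodes = [{ importance: 0 }, { importance: 5 }, { importance: 10 }];
for (const n of nodes) n.z = encodeDepth(n.importance, 0, 10);
console.log(nodes.map(n => n.z)); // → [ -0.5, 0, 0.5 ]
```

Her finding that depth encoding only works “under certain circumstances” suggests the usable band is narrow: too much separation causes fusion problems, too little is imperceptible.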


Quentin Parent is a sales engineer at Haption, and he talks about some of their force-feedback haptic devices that are used in different industry applications. These haptic feedback systems are used to train astronauts, for nuclear training, as well as in other medical training applications. Some of these devices cost nearly $100k, so they’re not really aiming for the consumer market. But it’s interesting to hear how precisely these Haption devices can simulate rigid-body interactions for industrial applications.


Patrick Harris is the lead game designer at Minority Media, where they’re developing a sci-fi adventure game called Time Machine VR. The premise of Time Machine VR is that humanity is facing a deadly plague from ancient times, and the best way to eradicate the disease is to travel back in time to find the cure within the DNA of underwater dinosaurs.

Time Machine VR takes place within a submarine probe, and part of the reason for this is that Minority Media discovered that this is a great way to move around within a VR space that isn’t as nauseating as moving around on the ground. Part of it could be our expectations of what it feels like to move underwater from the perspective of our vestibular system. Or it could be that there are fewer objects on the ground moving past you quickly, so floating around in the water presents less perceived motion.

Patrick talks about how they’re prioritizing Time Machine VR as a game experience, while also trying to be as accurate as they can with the dinosaurs by consulting with a paleontologist. He also discusses the different game mechanics available for interacting with the dinosaurs, including baiting, scoping, tagging, and a fun time freeze that allows you to swim inside of creatures for a closer look.

They also have an interesting approach where there will be two ways to experience any given level. The first will be the more linear and story-driven narrative mode. Then, after gaining more tools, you can revisit levels in a more exploratory context to gather more clues from the environment in order to solve the higher-level puzzles.

Overall, the graphics are pretty stunning and the underwater movement was fairly comfortable, but it wasn’t 100% free of motion sickness for me. I’m fairly sensitive to sim sickness, and so the yaw rotation was still a bit nauseating, and it varied with how far away I was from the ocean walls.

Time Machine VR is available via Steam as early access, and so it’s worth checking out what they have so far if you’re into the idea of interacting with underwater dinosaurs, especially big ones with really sharp and pointy teeth.

Be sure to check out Patrick at Oculus Connect 2, where he’ll be talking about “Game Design Un-Rules: Examining Design Failures in Time Machine VR.”


Michael Block of Culture Shock Games talks about his interactive documentary VR game experience called We Are Chicago, about gang violence on Chicago’s south side. It has a single ending to the story, but there are many different branching paths that depend upon your actions and choices throughout the experience.

Michael wanted to tell the story of the dangers and challenges of living amidst gangs and gang violence. They started by collecting a wide range of interviews with residents of dangerous neighborhoods in Chicago, and hired a writer from the area to bring in personal experiences and tie the story together. They wanted to create an experience where you can witness and empathize with the hardships of dealing with friends and family members who are involved in gangs.

Michael talks about the inspiration for creating this experience, some of the technical considerations, lessons learned for developing a 2D and immersive 3D version, and what he hopes to accomplish with the project.

Michael hopes to see a lot more of these types of interactive documentaries, especially once the technology improves to the point where it’s a lot easier to create a more narrative experience with fully motion tracked movements with tracked facial expressions. VR has the power to put you into someone else’s shoes, and this is a really interesting experience that explores the consequences of making tradeoffs and choices around gang violence in the south side of Chicago. You can learn more about this project on the We Are Chicago website.


Pollen is a first-person exploration game set in space by Helsinki-based Mindfield Games. I had a chance to catch up with Project Lead Olli Sinerma at PAX Prime to talk about the story behind the game as well as some of the lessons learned from working on Pollen for the past two years.

The graphics in Pollen look amazing, and there is a very detailed blog post on Unity’s site talking about how they’ve been able to achieve high-end visuals within Unity 5. Road to VR also did a pretty comprehensive hands-on write-up on the experience that you can find here.

Olli talks about some of the considerations that Mindfield Games took into account for VR locomotion, and how they’ve been approaching locomotion within a room-scale Vive environment. They considered using the type of Blink locomotion system that Cloudhead Games is using, but for now are opting for a combination of the thumbstick controller and walking around in a room-scale environment. Time will tell whether this is a viable approach in the long run, as I suspect that it will not be 100% nausea-free in the way that the Blink approach promises to be. VR locomotion is still largely an open problem within the VR exploration genre, so I expect this will be an area of VR that continues to see different approaches and potential solutions.
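For readers unfamiliar with blink-style teleportation: the view fades to black, the camera jumps while the screen is dark, and the view fades back in, which sidesteps the vestibular mismatch of smooth motion. Here is a minimal sketch of that fade timeline; the durations are illustrative assumptions, and this is not Cloudhead Games’ actual implementation.

```javascript
// Opacity of a black fade overlay at time t (seconds) after a blink
// teleport is triggered. 0 = fully visible, 1 = fully black.
// fadeOut/hold/fadeIn durations are illustrative placeholders.
function blinkOpacity(t, fadeOut = 0.1, hold = 0.05, fadeIn = 0.1) {
  if (t < fadeOut) return t / fadeOut;            // ramp to black
  if (t < fadeOut + hold) return 1;               // fully dark: move the camera now
  if (t < fadeOut + hold + fadeIn)
    return 1 - (t - fadeOut - hold) / fadeIn;     // ramp back to visible
  return 0;                                       // blink complete
}

console.log(blinkOpacity(0));     // → 0
console.log(blinkOpacity(0.12));  // → 1 (camera can jump during this window)
console.log(blinkOpacity(0.5));   // → 0
```

The key comfort property is that the camera only translates while opacity is 1, so the user never sees continuous motion.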

Olli also talks about designing the first-person control mechanisms for the gamepad, mouse and keyboard, as well as the Vive and the Half Moon Oculus Touch controllers. They’re considering experimenting with some of the gesture capabilities made available by the Touch, but those are likely not going to be a vital part of the gameplay, considering that they wouldn’t be possible if people were playing with a gamepad or Vive controllers. Olli describes how they’re trying to make the interactions as natural as possible whether using a gamepad, mouse, or 6DOF hand controllers.

They’re also designing a 2D version of the game, and he mentioned that there will be things that you can do in VR that you can’t do in the 2D version, like look underneath a bed. There are definitely design restrictions and considerations like this when trying to develop a game for multiple platforms, so it’ll be interesting to see whether it’s worth it for VR game developers to make 2D versions available and how much a 2D version limits the VR experience.

Olli is excited to see more medical applications of VR, and he’s really grateful for the amount of community and information that’s available on /r/oculus. You can get more information on Pollen on their website here.
