The new JavaScript SDK from AltspaceVR allows front-end web developers to quickly and easily create social VR applications. The AltspaceVR SDK is based on three.js, which means that developers can create a WebVR-enabled application on the web, then open that experience in a Chromium browser within AltspaceVR and have a fully immersive VR experience that is aware of the other people within Altspace.
WebVR still suffers from a lack of optimization needed to hit target latency specifications, so Altspace provides the user with a native Unity application that is performant enough to run at the desired framerate and latency. The available APIs also allow the web application to exchange information with Altspace, including “natural social interactions, synchronized multiplayer capabilities, networking, VOIP and immersive virtual environments.”
I talked with AltspaceVR co-founder and director of engineering Gavan Wilhite about this new SDK, what it will enable front-end developers to do, and some of the implications of a cross-platform VR environment that supports the Vive, Oculus Rift, and Samsung Gear VR headsets.
Gavan also talked about the new live coding capabilities and the integration of CodePen, which will enable some really interesting interactive and social construction of VR experiences. Typing without seeing the keyboard is still a barrier to entry here, but it will likely become a useful skill for quickly and easily experimenting with different VR experiences while in VR. And as Gavan noted, it’s often the accidents and glitches that end up being some of the most entertaining and fun things to happen within VR.
Gavan also mentioned that AltspaceVR is offering grants of up to $150,000 to developers to kickstart these types of multiplayer, open-web VR apps that can be used within AltspaceVR.
You can visit the AltspaceVR Developer Portal to download the new JavaScript SDK or apply for the AltspaceVR Developer Initiative Program.
Theme music: “Fatality” by Tigoolio

Madis Vasser is a psychology student at the University of Tartu’s Virtual Neuroscience Lab, where he collaborated with the computer science department to create a VR toolbox for experimental psychology research. At the IEEE VR conference, he was showing off a demo of a change blindness experiment that he created within Unity.
Max Pfeiffer was at IEEE VR showing a demo in which he’s experimenting with off-the-shelf electrical muscle stimulation (EMS) massage devices in order to provide haptic feedback for pointing with your hand within 3D virtual environments. He’s tracking the hands with a depth-sensor camera to detect finger pointing, and then providing subtle haptic feedback on the arms. He did a Fitts’ Law analysis of the efficiency of this technique, and he found that “results demonstrate that both EMS and vibration provide reasonable addition to visual feedback. We also found good user acceptance for both technologies.”
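For readers unfamiliar with Fitts’ Law, the analysis mentioned above rests on a simple model: the harder a target is to hit (farther away, or smaller), the longer the pointing movement takes. A minimal sketch of the standard Shannon formulation in JavaScript, with illustrative function names and constants (these are not from Pfeiffer’s paper):

```javascript
// Index of difficulty (in bits) for a target of width w at distance d,
// using the Shannon formulation: ID = log2(d / w + 1).
function indexOfDifficulty(d, w) {
  return Math.log2(d / w + 1);
}

// Predicted movement time: MT = a + b * ID, where a and b are
// empirically fitted constants for a given input technique
// (e.g., one pair of constants for EMS feedback, another for vibration).
function movementTime(d, w, a, b) {
  return a + b * indexOfDifficulty(d, w);
}

// Example: a target 10 cm away and 2 cm wide has ID = log2(6) ≈ 2.58 bits.
console.log(indexOfDifficulty(10, 2).toFixed(2)); // "2.58"
```

Comparing the fitted `a` and `b` constants across feedback conditions is how such a study can quantify whether a haptic technique actually speeds up pointing.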

There are some open questions within the data visualization community as to what benefits the third dimension might add to visualizing information that doesn’t have an inherent spatial component. Ragaad Al Tarawneh was at the 3DUI conference presenting a paper called “Utilization of Variation in Stereoscopic Depth for Encoding Aspects of Non-spatial Data.”
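The core idea of encoding a non-spatial attribute in stereoscopic depth can be sketched very simply: map the data range linearly onto a depth range in the scene. This is a hypothetical illustration of the general technique, not code from the paper; all names and ranges are assumptions.

```javascript
// Encode a non-spatial data value (e.g., importance or magnitude) as
// stereoscopic depth by linearly mapping the data range [dataMin, dataMax]
// onto a depth range [depthNear, depthFar] in scene units.
function depthForValue(value, dataMin, dataMax, depthNear, depthFar) {
  const t = (value - dataMin) / (dataMax - dataMin); // normalize to [0, 1]
  return depthNear + t * (depthFar - depthNear);     // interpolate depth
}

// Example: values 0..100 mapped onto depths 0.5..5.0 (arbitrary units).
console.log(depthForValue(50, 0, 100, 0.5, 5.0)); // 2.75, the midpoint
```

The open research question is then perceptual rather than computational: how many depth levels can viewers reliably distinguish, and does the added dimension help or hinder reading the data.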
Quentin Parent is a sales engineer at 



