Amir Rubin is the CEO of Sixense, distributor of the Razer Hydra controller and manufacturer of the wireless STEM controllers, which have been described as enabling a 3D multi-touch interface and could become the first “mouse of VR.”

Amir talks about how the intention of Sixense is to create presence in VR by bringing your body into VR in an intuitive fashion. He talks about the differences between a physical-based solution like the STEM controllers versus using gesture controls with a camera-based system.

He talks about the history and timing of the Razer Hydra, and how it was a tough sell to convince developers that this paradigm shift was needed for interacting in 3D spaces — especially before virtual reality came along. He talks about the challenges faced with shortages of the Hydra and how they’ve been supporting exciting VR projects.

He talks about the modular design of the STEM controllers, and how developers will be able to bring all sorts of different peripheral objects into VR by using the STEM pack component.

Amir then goes into what first inspired the creation of Sixense, and how he wanted to be able to provide VR training experiences where the technology didn’t get in the way, so that people could have as much immersion and presence as possible in order to tap into their deeper intuition. He believes that almost every experience in reality could be improved with virtual reality, and he wants to help bring that sense of immersion and presence to people through Sixense technologies.

Finally, he talks about the lessons learned from the failed VR revolution of the 90s, and how that has influenced the collaborative mindset that Sixense has as well as others within the VR community. He’d like to help make the VR revolution happen this time around, and help provide incredible educational and training experiences as well as allowing people to step out of reality and have boundless entertainment experiences where they can live out their wildest fantasies.

Reddit discussion here.

TOPICS

  • 0:00 – Intro. Sixense providing presence in virtual reality. Presence requires bringing your body into VR. Partnering with Razer on the Hydra. Working on the wireless STEM controllers.
  • 1:36 – Timing of the Razer Hydra and being out of stock. Not manufactured by Sixense. Providing Hydras to exciting VR development projects. The Hydra was designed to enable people working with 3D applications as “3D multi-touch” technology. It was nice to have, but hard to convince developers. Valve was supportive, but the Hydra was hard technology to justify adopting before VR. Carmack’s contribution to the resurgence of VR. The Hydra became the mouse of VR. Creating a wireless version with STEM, and the stoppage of manufacturing of the Hydra.
  • 5:29 – Modular design of the STEM System controllers – three components: the STEM base; STEM controllers with buttons and a joystick; and the STEM pack that can be attached to other objects to get 6 DOF control with them. Trying to make it as simple for developers as possible.
  • 8:04 – How do you see the differences between camera-based gesture input and controllers with physical buttons? Camera-based gestures can be great given line of sight and an intuitive use case, but simulating the use of objects in VR doesn’t work as well because it’s not natural. Sixense builds sensors and input devices so you can use your intuition rather than learning a new sign language. The power of bringing hand-eye coordination into VR. Sony & Oculus using a combination of cameras and physical controllers. The Sixense SDK supports other 6 DOF controllers like the Sony Move and the Intel RealSense camera for perceptual computing.
  • 11:28 – What was the inspiration for the creation of Sixense Entertainment? Amir was inspired by VR in the 90s. Saw how much gear military soldiers had to wear for their VR training exercises. A general wanted soldiers to have a VR experience without mentioning all of the technology that was getting in the way. Creating a system that’s 100% experience delivery, where the transition between reality and virtual reality is seamless and fast. Focusing on the consumer VR version, and wanting to create VR training experiences where you forget about the VR tech, easily step out of reality into a fantasy world, and have any experience you want. Don’t worry about the constraints of technology, because that breaks VR immersion.
  • 15:01 – What is driving you now? Is it a business desire, or is there something in VR that you want to experience? Reality isn’t as good as it should be. VR can improve on nearly everything we have in reality. Step into entertainment experiences, transform into a character, and live out your fantasies. Explore resort vacations with his family. Go anywhere together with your family. Everything that you need can come to you digitally.
  • 17:11 – How have you seen the VR industry evolve since you’ve been involved with it? Learn from the failures of the 90s. The technology was not there yet, but the business execution was also lacking from too much competition and not enough collaboration. He’s seeing a lot more collaboration this time around, from Oculus to the open collaborative mindset of Sixense. Can’t yet support professional applications; focusing on the mass market at the moment. Integrating with the Sony SDK. Sixense wants to collaborate with other hardware manufacturers and distributors.
  • 19:57 – What is the ultimate potential for virtual reality? Every use case of reality can be improved by virtual reality. Education and training implications are huge.

Theme music: “Fatality” by Tigoolio

Vlad Vukićević is a technology innovator at Mozilla trying to figure out how virtual reality fits into the web. He’s the inventor of WebGL, and he’s been working with Unity & Unreal Engine 4 to be able to create plug-in free gaming and VR experiences on the web.

Vlad sees that we’re on the cusp of a VR revolution, and he wants to be sure the web is treated as a first-class citizen. He talks about JavaScript performance, asm.js, libraries like three.js, Unity’s IL2CPP, and the challenge of working with compiled libraries on the web.

We talk about the process of giving web developers the data and output capabilities that are necessary for VR on the web. Vlad then talks about the implications of responsive web design when adding an additional dimension and the ability to provide a fully immersive experience.
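
Vlad describes asm.js as a subset of JavaScript that engines can compile ahead of time. As a rough illustration (this module is a hypothetical example, not code from the interview), an asm.js-style function annotates its types with coercions like `|0`, while still running as plain JavaScript in any browser:

```javascript
// Minimal asm.js-style module (hypothetical example).
// The "use asm" pragma plus the `|0` coercions tell a supporting engine
// that a and b are 32-bit integers, so the function can be compiled
// ahead of time; engines without asm.js support run it as ordinary JS.
function AsmAdder(stdlib) {
  "use asm";
  function add(a, b) {
    a = a | 0;          // coerce argument to int32
    b = b | 0;          // coerce argument to int32
    return (a + b) | 0; // coerce result to int32
  }
  return { add: add };
}

var adder = AsmAdder(globalThis);
// adder.add(2, 3) → 5; adder.add(2.7, 3) → 5, since 2.7|0 truncates to 2
```

In practice, developers rarely write asm.js by hand; compilers like Emscripten emit it from C/C++, which is what makes porting engines like Unity and UE4 to the web feasible.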

Reddit discussion here.

TOPICS

  • 0:00 – How VR fits into the web. WebGL and enabling high performance games on the web via Unity & UE4, and making sure the web is a first-class citizen in this upcoming VR revolution
  • 0:57 – What is the importance of OpenGL and how did that lead to WebGL? Making sure that WebGL runs as well on mobile as it does on desktop PCs.
  • 1:55 – Does three.js use WebGL for the 3D rendering functionality that it provides? All of the browsers provide access to WebGL
  • 2:28 – Is JavaScript fast enough to be equivalent to natively compiled code? asm.js. A lot of the work in graphically intense content happens on the graphics card, and Mozilla believes that JavaScript is up to the task
  • 3:39 – What is asm.js? A subset of JavaScript designed to be highly optimizable.
  • 4:28 – The meaning behind the 1.4x slower metric. The goal is to get as close to 1x parity as possible.
  • 4:58 – Other cases in which asm.js can make the application faster?
  • 5:26 – What is IL2CPP for Unity, and how does that enable Unity to get to the web
  • 6:30 – Unity’s embedded player and moving to a plug-in free experience for the Web
  • 7:21 – Moving away from walled gardens
  • 7:38 – Custom binaries as being a blocker for putting VR content onto the web.
  • 8:49 – Being aware of decisions for your VR experience that may prevent it from being put on the web
  • 9:37 – What’s the potential for the web and VR? Presenting content in VR via the web. Giving web developers access to the data and output capabilities that are necessary for VR.
  • 10:39 – The future of responsive web design after adding an additional dimension
  • 11:42 – What makes you excited about virtual reality

Theme music: “Fatality” by Tigoolio

William Provancher is the founder of Tactical Haptics, which produces the Reactive Grip controller. The Reactive Grip controller simulates force feedback haptics through its two components: kinesthetic feedback (muscle forces) and tactile feedback (surface frictional forces).

William explains the genesis of his project, and some technical details for how it works. He also talks about why he’s on a quest to bring his academic research to a wider audience, despite the fact that a number of investors are telling him that he’s too early. He admits that he’s more of an academic than a businessman, but he’s been able to live in both worlds by choosing the right framing to create devices for different types of perceptual research.

He also talks about how at the IEEE VR conference, Henry Fuchs identified haptics as the next big problem to solve with VR. William was able to get some recognition for his work by winning the best demo at the IEEE VR conference.

He also talks about the upcoming SDK for the Reactive Grip, where you will be able to translate the physics engine data from Unity into haptic feedback, along with a number of canned interactions and any combination of mass, spring, and damper.
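
The mass-spring-damper combinations mentioned above boil down to a standard force law that a physics engine can feed into a haptic renderer. A minimal sketch of that calculation (the function name and structure are illustrative assumptions, not the actual Reactive Grip SDK API):

```javascript
// Hedged sketch of the mass-spring-damper force calculation a haptics
// SDK might translate into controller feedback each frame.
// springDamperForce is an illustrative name, not the real API.
function springDamperForce(stiffness, damping, displacement, velocity) {
  // Hooke's law restoring force plus viscous damping: F = -k*x - c*v
  return -stiffness * displacement - damping * velocity;
}

// Example: k = 10 N/m, c = 2 N·s/m, x = 0.5 m, v = 1 m/s → F = -7 N
```

Varying the stiffness and damping terms is what lets one device portray different sensations, from stiff contact to soft, deformable objects.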

William is very candid about what did and did not work with his unsuccessful Kickstarter campaign, but is very determined to stay the course and help bring this technology to the world.

As far as the future of VR, William wonders how long the collegial and collaborative nature of the VR community will remain once the consumer market starts to take shape.

Reddit discussion here.

TOPICS

  • 0:00 – Intro & Tactical Haptics’ Reactive Grip controller, which mimics the friction or shear forces that you’d feel when grasping an object
  • 0:33 – How are you creating this haptic force reaction? Sliding plates represent frictional forces in a way that mimics force feedback.
  • 1:15 – Force feedback is any external force applied to you, which has two components: the kinesthetic component, forces received through the muscles, and the tactile component, felt by the hand. You provide half of the kinesthetic forces & the tactile forces are mimicked.
  • 2:08 – In VR, body tracking, the head-tracked visual component, and haptics are the key components. Matching your expectations in VR is what makes it more real.
  • 3:03 – How did you get started with haptics? What’s driving you now? Doing research for 15 years. It worked so well that it became a quest to bring it into the world. And it’s simple enough that it could just work. Early user testing for what people want from haptic feedback in immersive games to add to the experience of the game.
  • 4:45 – What are some of the interactions that you have with your SDK? Two modes of interaction: direct calculation of the physical forces that come out of your game engine, starting with Unity & eventually UE4, and scripts that simulate motions like a gunshot.
  • 6:24 – What are the canned interactions that you’ll be providing? Have a set of demos that can portray the sense of contact and resistance after contact, portray sense of inertia, kickback torque, and having elastic, multi-dimensional deformable objects. In addition, there’s all sorts of combinations of mass, spring and damper.
  • 7:55 – What are you using for positional tracking? Using the Razer Hydra at the moment. Others know how to do tracking, so they will use solutions from others. Their goal is to provide feedback that’s more compelling and sophisticated than rumble, but cheaper than force feedback.
  • 8:23 – Are there any safety concerns with haptic feedback injuring people? Not sure, but usually people who complain about that were gripping the controller too tightly.
  • 10:10 – What lessons did you learn from your unsuccessful Kickstarter?
  • 12:26 – Will you be doing another crowdfunding effort or planning any other fundraising efforts?
  • 13:28 – The Razer Hydra seemed to come a few years too early, before the demand from VR was there. Is there enough of a viable consumer market to support this type of haptic feedback? Conditional venture capital interest and doubts about viability, and then changing perspectives post-Facebook acquisition. We’ll be interested if others are interested.
  • 15:07 – Talk about winning the demo competition at IEEE VR with the Reactive Grip controller. Henry Fuchs’ keynote about how the Facebook acquisition was the best thing to ever happen to VR. It’s a hard problem, and their solution isn’t perfect, but it’s really good. Haptics is the next big challenge for VR, and the academic research community sees the value.
  • 17:27 – How have you dealt with the culture clash between an academic mindset and the more enterprising start-up mindset? Make devices and study perception with them. The next things that need to be done aren’t always framed in an academic way, but they can be. It’s all a matter of framing, and he’s been able to find the intersection between framing what you want to do and what the research needs are. Needs to pass the KISS principle to be viable in the consumer market.

Theme music: “Fatality” by Tigoolio

James Blaha had never been able to see in 3D before. He has strabismus, more commonly known as crossed eyes, which has prevented him from being able to have each eye look at the same point in space. This can lead to lazy eye, or amblyopia, and cause a loss of depth perception.

It used to be common knowledge that these conditions needed to be successfully treated within a critical period between the ages of 8 and 10; otherwise, it was believed, the visual system would be hard-wired for the rest of a person’s life. However, a number of neuroplasticity studies over the past couple of years have indicated that the brain is more malleable than we previously thought and that effective treatments for lazy eye are possible.

James had been following this neuroplasticity research and decided to start a virtual reality side project with the Oculus Rift to put some of these studies into practice. He created some scenes that increased the intensity to his lazy eye and decreased the intensity to his good eye, and he was blown away at being able to see in 3D for the first time in his life. Encouraged by his success, he continued to develop a series of games that he played for over 20 hours over the course of 3 weeks, until he was able to see in 3D in the real world for the first time in his life.
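
The intensity balancing James describes can be thought of as a simple per-eye weighting. A hedged sketch of the idea (the 0-to-1 scale, the 0.5 baseline, and the function itself are illustrative assumptions, not Diplopia’s actual code):

```javascript
// Illustrative per-eye intensity balancing for this kind of training:
// boost the signal shown to the weak (lazy) eye and dim the signal
// shown to the strong eye by the same amount around a 0.5 baseline,
// clamping both to the 0..1 range. Not Diplopia's implementation.
function eyeIntensities(weakEyeBoost) {
  return {
    weakEye: Math.min(1, 0.5 + weakEyeBoost),   // brighter for the lazy eye
    strongEye: Math.max(0, 0.5 - weakEyeBoost), // dimmer for the good eye
  };
}
```

The stereo rendering of an HMD is what makes this possible at all: each eye gets its own image, so the game can rebalance them independently.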

James tells his story of creating his Diplopia virtual reality game and what type of research interest he’s receiving. He also had a very successful Indiegogo campaign, which enabled him to have other people try Diplopia as well. He talks about other people who have had similar success with being able to see in 3D for the first time, and that there are a number of controlled research studies under way to verify these results.

James says that what gets him excited about the future applications of virtual reality is that for the first time ever, we’re able to finely control the input given to the brain. This means that we’re going to be able to control human perception on a precise level and that it will have more applications than anyone imagines.

It makes me wonder: with the neuroplasticity of our brains and the principle of spaced repetition of controlled perceptual input through virtual reality, what other types of nascent capabilities will we be able to unlock for ourselves? There are already implications for Virtual Reality Exposure Therapy for PTSD, cerebral palsy, autism, pain management and beyond.

I agree with James that we’re just starting to scratch the surface for what’s possible with the medical & psychological applications for virtual reality. There’s been a lot of research happening in these areas, but I have a feeling that it will become a lot more popular now that there will be affordable VR devices available to consumers for the first time.

Reddit discussion here.

TOPICS

  • 0:00 – Intro & developing Diplopia to help with lazy eye, and being able to see in 3D for the first time
  • 1:00 – Being able to see in 3D in the real world after 20 hours of playing across 3 weeks
  • 1:33 – Who’s heard of this breakthrough so far?
  • 2:02 – What do you have to do to maintain 3D vision in the real world? Similar to a weak muscle
  • 2:35 – Did you imagine that you would be able to get to this point in the beginning? Reading neuroplasticity research studies related to lazy eye.
  • 3:09 – Have there been others who have been able to successfully see in 3D? Example of eliminating double vision. Testimonial video from Dillon
  • 3:47 – Spaced repetition bringing improvement in vision.
  • 3:58 – What is the optimal time for learning using this? 15 minutes every other day.
  • 4:26 – What was the previous “common knowledge” thinking about the critical age at which the visual system would be hard-wired? It used to be 8-10 years old until neuroplasticity studies started coming out. Traditional methods of treating lazy eye don’t really work.
  • 5:20 – What other research interest has Diplopia received?
  • 5:55 – More details about the game and the game design, and how you make it fun. Take advantage of the brain’s goal and reward system. The goal of the game is to improve your vision.
  • 6:37 – Two miniature games at the moment, and building out more.
  • 7:00 – What gets you excited about the potential for Virtual Reality? Able to finely control the input to the brain, and being able to control perception on a fine level will have more applications than anyone imagines.
  • 7:49 – What’s next for Diplopia? Get it out and help as many people as they can.

Theme music: “Fatality” by Tigoolio

Paul Mlyniec has been involved in computer graphics for virtual reality for over 30 years & has been chasing his dream of creating an immersive, 3D painting medium. He started his career at Silicon Graphics before joining MultiGen, where he developed what became the de facto standard in modeling for flight simulation. He’s now with Sixense Entertainment as the Head Developer on MakeVR, continuing his career of making highly interactive, immersive applications.

Paul provides a great piece of historical framing of where we are today. Whereas the space program in the 1960s drove a lot of innovations in technology, the space program driving technology today is immersive gaming within virtual reality.

Paul also talks about some of the user interface lessons for manipulating objects within 3D. For example, he talks about how the 2-handed interface is like 3D multi-touch & how it helps prevent motion sickness. He also shares the surprising insight that turning VR knobs while pressing a button with the same hand turns out to be a straining action. And he also talks about the differences between using physical input devices with a button versus when it makes more sense to use gesture controls with camera-based, visual input.

Finally, he talks about some of the lessons learned from the MakeVR Kickstarter and their plans moving forward.

Reddit discussion here.

TOPICS

  • 0:00 – Intro and 30-year background in VR & computer graphics
  • 1:37 – LEGO VR demo at SIGGRAPH 1996 with MultiGen that was head-mounted & hand tracked
  • 3:00 – Space program of the 60s was to get to the moon, and today’s space program is gaming in virtual reality
  • 4:46 – What benchmarks do you look at when moving computer graphics forward? How easy it is to create 3D graphics, with the speed of creation & fluid flow.
  • 6:39 – What are the differences in using 3D input devices vs 2D mouse and keyboard & how that changes the 3D graphics creation process?
  • 7:58 – Who would be the target demographic for something like MakeVR? Would you see professionals who use Maya using this tool? How well rapid prototyping works with MakeVR
  • 9:24 – How are you evolving 3DUI with the two-handed, 3D user interface in a fully immersive experience? Making the most out of limited real estate, and 2D buttons vs. 3D cube widgets.
  • 11:19 – 3DUI experiments that failed? Twist control being a straining activity for your wrist. Pressing a button on the opposite hand seems to be a best practice.
  • 12:38 – What are things that work with 3DUI? Two-handed, 3D user interface is like 3D multi-touch and how it avoids motion sickness when driving a viewpoint through the world.
  • 14:18 – Physical controls vs camera-based motion controls in 3D. Physical controls meet the professional demands for speed and precision where you can have rapid input without physical strain.
  • 16:13 – MakeVR Kickstarter lessons learned and plans moving forward. Too high of a price point, no early access, misgauged the target market.
  • 17:33 – Best way to track progress of MakeVR

Theme music: “Fatality” by Tigoolio

Cymatic Bruce Wooden is a VR enthusiast who went from tracking the Oculus Rift project on the Meant to be Seen forums, to backing it on Kickstarter on the first day, to making a first impression video and other demo videos, to starting a weekly livestream, to co-founding the Silicon Valley Virtual Reality meetup, to eventually landing a full-time VR position at Qualia3D.

He talks about some of his most memorable VR experiences, things to avoid in VR development, some of his current and future VR projects, and where he’d like to see VR go in the future.

Cymatic Bruce’s enthusiastic evangelism of virtual reality has inspired many people to get into VR game development. In fact, it was his Top 20 Oculus Rift VR experiences of 2013 that convinced me to jump into getting a Rift.

Congratulations to Cymatic Bruce for chasing your VR dream and landing a full-time VR gig, and may you continue to inspire many more future VR devs with your presence and energy.

Reddit discussion here.

TOPICS

  • 0:00 – Intro & how he first got into virtual reality, participating in Meant To Be Seen forums, supporting Oculus Kickstarter, doing first impression videos, reviewing demos, starting a weekly livestream to getting a full-time VR job.
  • 2:36 – Why he started reviewing videos and became a VR enthusiast
  • 3:49 – Cymatic Bruce’s Top 20 Oculus Rift VR experiences of 2013 being an inspiration & Titans of Space being #4
  • 5:10 – What VR experiences have stuck out after doing 500 demos: Shadow Projection, Spectre, Minecrift Mod, & Crashland
  • 6:28 – Things to avoid when doing VR development & what to include more of
  • 7:53 – Intention behind Cymatic Bruce’s Hitbox livestream show & future plans
  • 9:21 – Some of Cymatic Bruce’s development projects
  • 10:13 – Getting a full-time VR position at Qualia3D
  • 11:09 – Vision for VR and what he’d like to see happen in the VR community & Ready Player One

Theme music: “Fatality” by Tigoolio

Aaron Davies is the Head of Developer Relations at Oculus VR, and he talks about how the interest in VR development has exploded since the Facebook acquisition. He discusses some of the applications that are in development, ranging from VR surgery training, education, film & media, enterprise apps, data visualization, telepresence, and social apps to virtual reading experiences and comic books in VR.

Aaron also shares his initial experiences with the Rift and his reactions to simulator sickness. He also talks about one of the most compelling experiences he’s had in VR, within the Minecrift mod with some of his co-workers at Oculus.

He also talks about what will change in terms of the resources that can be provided to independent developers since the Facebook acquisition, and what other types of issues Facebook will be helping Oculus solve.

Finally, he talks about the importance of the community at this early stage of exploring the VR medium, and what he sees as the ultimate potential of VR and what the Metaverse can provide.

Reddit discussion here.

TOPICS

  • 0:00 – Intro & how the VR development community is branching out beyond gaming since the Facebook acquisition
  • 0:50 – What will change in terms of what types of resources can be provided by Oculus VR for independent VR developers + what specific competencies that the Facebook acquisition will provide to the Oculus VR team.
  • 2:12 – What industries beyond gaming are doing with VR development for the Oculus Rift including education, academia, history, film & media, and virtual comic books & virtual reading experiences
  • 3:46 – Aaron’s first experiences with the Rift, motion sickness, joining Oculus, and the most compelling experience that he’s had in VR
  • 5:38 – Aaron’s 60-second pitch at SVVRCon & the importance of the VR community
  • 6:33 – The potential of what VR can bring to society & vision of the Metaverse

Theme music: “Fatality” by Tigoolio

Having believable characters within virtual reality can be one of the most challenging but most important components to get right within a virtual reality experience. Stefano Corazza and Mixamo.com have been working on a set of tools to make it easier to create & animate 3D characters within your virtual worlds.

Stefano talks about some of Mixamo’s services like Fuse to create characters, their auto-rigging functionality, their Decimator to optimize the polygon count, and how they have over 10,000 different animations available in different packs. They also have developed Face Plus to be able to motion capture the facial expressions and emotions for a character, which is the technology that High Fidelity is using in order to translate what you’re doing with your face onto a VR avatar.

Finally, he talks about the principle of the uncanny valley, and how Mixamo avoids it — as well as the importance of the expression of emotions within VR.

Mixamo also recently updated their Fuse product to make it easier to create 3D characters by making it possible to upload your own content that can be customized with their tools.

They have a couple of pricing options from pay as you go to a $1499/year professional account where you get access to all of Mixamo’s services. Fuse is also available on Steam for $99.

Reddit discussion here.

Topics

  • 0:00 – Intro
  • 0:34 – Emotional importance of 3D characters – Face Plus facial animation, Fuse character creator
  • 1:56 – Optimizing high polygon count characters with Decimator & transforming Kinect scans into rigged 3D characters
  • 2:53 – 10,000 animations in pre-set packs and importing into game engine like Unity
  • 3:33 – Face Plus motion capture of facial expressions, Unity plug-in & variety of applications of Face Plus
  • 4:30 – Mapping emotions to a face via abstraction process
  • 5:34 – How Mixamo avoids the Uncanny Valley via stylized characters
  • 6:47 – Creating efficient game characters
  • 7:11 – How people like Nonny de la Peña are using Mixamo
  • 7:48 – High Fidelity & their use of Face Plus & using facial animation in game play
  • 8:41 – Pricing structure for Mixamo
  • 9:10 – Future of Virtual Reality & importance of emotions
  • 10:02 – Representing yourself in virtual reality

Theme music: “Fatality” by Tigoolio

This interview with Nonny de la Peña was by far my favorite discussion from the Silicon Valley Virtual Reality Conference & Expo. It’s moving to hear about the type of emotional reactions that she’s receiving from her human rights-centered, immersive journalism pieces that are experienced within virtual reality. She has some amazing insights into VR storytelling, virtual identity, and the importance of bringing more diversity into VR.

Nonny has been working on VR storytelling since creating a virtual Guantanamo Bay prison cell in Second Life in 2009. She started working with VR HMDs before the Oculus Rift existed, and in fact was a part of USC’s Institute for Creative Technologies when Palmer Luckey was there. Luckey even provided Nonny with a pre-Oculus Rift HMD for her 2012 Sundance showing of “Hunger in LA.”

She’s also worked with Mel Slater, who has explored a lot of interesting effects of positive illusions of self in immersive virtual reality.

Nonny has a ton of insights on the components of creating a compelling VR experience, from starting with great audio to creating believable virtual humans. I also found her vision of a tiered VR future, from the untethered IMAX-like experience to the Oculus Rift home experience to mobile VR, to be a compelling distinction for the different levels of VR immersion and associated technologies.

For more information on the service that Nonny uses to create her virtual humans, be sure to check out this interview with the founder of Mixamo.

Reddit discussion is here.

Topics

  • 0:00 – Intro to Immersive Journalism & how it got started
  • 1:29 – Recreating a scene of a Guantanamo Bay prison cell in Second Life
  • 3:30 – Taking control of somebody’s Second Life avatar, and the type of reactions to going through a virtual re-enactment of being a Guantanamo Bay prisoner
  • 4:29 – How people identified with their avatar being bound
  • 5:14 – What were some of your first immersive journalism stories that used a fully immersive, virtual reality head-mounted display? Identifying with a VR body in a stress position
  • 7:12 – Institute for Creative Technologies, Mark Bolas, and her connection to Palmer Luckey
  • 8:02 – Immersive VR piece on “Hunger in Los Angeles” & starting with audio
  • 9:20 – Palmer Luckey & pre-Oculus Rift, VR HMD prototype for Sundance January 2012, and audience reactions
  • 11:42 – Commissioned VR piece on Syrian refugees shown at the World Economic Forum
  • 13:21 – Witnessing a border patrol tasing death
  • 13:56 – Next projects and the potential of immersive storytelling
  • 15:20 – What are some key components of storytelling within an immersive VR environment?
  • 17:32 – Why is the reaction of empathy so much stronger in immersive VR?
  • 18:38 – What are the risks of putting people into a traumatic VR scene and triggering PTSD?
  • 19:47 – How do you direct attention within an immersive VR story?
  • 20:55 – Are your immersive journalism pieces interactive at all?
  • 21:30 – How else are people using this immersive VR medium to tell unique stories?
  • 22:47 – What type of software and hardware are you using for your virtual humans in your immersive VR pieces?
  • 21:15 – Being the only woman panelist at SVVR and importance of diversity to VR’s resurgence.
  • 26:36 – Bringing more diversity into VR storytelling
  • 28:19 – The tiers of VR experiences of IMAX, home and mobile.
  • 29:20 – Location-based, untethered VR experiences being equivalent to going to an IMAX movie.

Theme music: “Fatality” by Tigoolio

This is the first of 44 interviews that I conducted at the Silicon Valley Virtual Reality Conference & Expo. I was able to capture over 11.5 hours worth of material from 3/4 of the speakers, 2/3 of the exhibitors and 11% of all attendees, and I’ll be doing a daily podcast over the next month and a half (and maybe beyond). A full list of interviewees is below.

Palmer Luckey is the founder of Oculus VR, and I had an opportunity to conduct a brief interview with him. I noticed that Palmer’s bio mentions that he founded the ModRetro Forums, and it turns out that there are some interesting connections between ModRetro and the founding of Oculus VR.

He talks about some of his first and forgettable VR experiences, and the process of starting what he claims is the world’s largest private VR HMD collection. He also covers his connection to the /r/oculus reddit community, the reaction to the Facebook acquisition announcement on Reddit, as well as what he sees as the future of VR.

Reddit Discussion here. And be sure to check out the Rev VR Ubercast featuring an in-depth discussion with Palmer here.

Topics

  • 0:00 – What led to founding the ModRetro forums & how has the console modding scene evolved?
  • 2:29 – How did VR evolve out of being involved with the console modding scene?
  • 3:38 – What do you remember about your first VR experience?
  • 4:20 – When did you start your VR HMD collection and what were some of the first VR headsets that you got?
  • 4:56 – Did you end up fixing a lot of these VR HMDs?
  • 5:17 – What have been some of your favorite VR experiences?
  • 5:47 – How often do you read the Oculus subreddit community?
  • 6:25 – How did you react to the Reddit community’s downvoting and skepticism of the Facebook acquisition?
  • 7:14 – What were some of your takeaways from the panel about the next five years of VR?
  • 8:11 – What has been your experience of this first consumer VR conference & what it means?

Full list of interviews from the 1st Silicon Valley Virtual Reality Conference

  1. Aaron Davies, Head of Developer Relations at Oculus VR
  2. Aaron Lemke, Unello Design founder & Eden River game
  3. Amir Rubin, Sixense Entertainment CEO
  4. Ben Lang, Road to VR founder & executive editor
  5. Bernhard Drax, Draxtor.com Second Life documentarian
  6. Blair Renaud, IRIS VR Co-Founder & Lead Designer of Technolust
  7. Caitlyn Meeks, Unity Asset Store Manager
  8. Cosmo Scharf, Founder of VRLA
  9. Cris Miranda, EnterVR podcast
  10. “Cymatic” Bruce Wooden, VR evangelist
  11. David Holz, Leap Motion CTO and Co-founder
  12. Denny Unger, Cloudhead Games President & Creative Director of “The Gallery: Six Elements” game
  13. Ebbe Altberg, Linden Lab CEO (Second Life)
  14. Edward Mason, GameFace Labs Founder and CEO
  15. Eric Greenbaum, Jema VR Founder
  16. George Burger, Infinadeck founder
  17. James Blaha, Diplopia game for lazy eye
  18. Jan Goetgeluk, Virtuix CEO & developer of the Virtuix Omni
  19. Jesse Joudrey, Jespionage Entertainment founder & creator of VRChat
  20. John Murray, Seebright founder & CEO
  21. Josh Farkas, Cubicle Ninjas CEO
  22. Matt Bell, Matterport founder & CEO
  23. Matt “Stompz” Carrell, Stompz founder & co-host of PodVR podcast
  24. Max Geiger, producer at Wemo Lab
  25. Mike Sutherland, PrioVR & VP of technology at YEI Technology
  26. Nathan Burba, Survios CEO
  27. Nick Lebesis, Network Flo
  28. Nonny de la Peña, Immersive Journalism founder & Annenberg Fellow at USC School of Cinematic Arts
  29. OlivierJT, Synthesis Universe creator
  30. Palmer Luckey, Oculus VR Founder
  31. Paul Mlyniec, MakeVR Head of Development at Sixense Entertainment
  32. Peter Sassaman, Gauntl33t Project Haptic Feedback Glove
  33. Philip Rosedale, Founder of High Fidelity & Second Life
  34. Reverend Kyle Riesenbeck, Rev VR Podcast & Road to VR contributor
  35. Scott Phillips, VR Walker Project
  36. Sean Edwards, Director of Development Lucid VR & Shovsoft & Lunar Flight & ZVR
  37. Simon Solotko, All Future Parties Founder
  38. Stefan Pernar, Virtual Reality Ventures founder & Virtual Reality Fashion project
  39. Stefano Corazza, Mixamo CEO & Co-founder
  40. Tony Davidson, Innervision VR & Ethereon game
  41. Tony Parisi, Vizi Founder, co-creator of VRML & co-chair of the San Francisco WebGL Meetup
  42. Vladimir Vukicevic, Gaming director at Mozilla & inventor of WebGL
  43. Walter Greenleaf, Stanford University, MediaX Program & medical applications of VR
  44. William Provancher, Tactical Haptics founder & Reactive Grip™ touch feedback

Theme music: “Fatality” by Tigoolio