Cosmo Scharf is a film student and co-founder of VRLA. He talks about some of the challenges of VR storytelling and the differences between a free-will VR environment and a traditional 2D film. Film has a series of tools to direct attention, such as depth of field, camera direction, cues to look left or right, and contrasting colors or movement in the frame. Some of these translate over to VR, but you can't use all of them since there's no camera movement, focus, or framing in VR.

He talks about the process of starting a meetup in Los Angeles called VRLA, and how a lot of people in the film industry are seeing VR as the future of storytelling and entertainment.

Cosmo also sees VR experiences as a spectrum ranging from completely interactive like a video game, to semi-interactive cinematic experiences, to completely passive. Not a whole lot of people are looking into completely passive experiences yet, but that's what he's interested in exploring as a film student.

He strongly believes it's the future of storytelling, video games, and computing, as well as of disseminating information in general, and he's very excited to get more involved in the VR industry.

Reddit discussion here.

TOPICS

  • 0:00 – Co-founder of VRLA. Initially interested in VR after hearing about Valve's involvement. He was reading /r/oculus and listening to podcasts when he heard one about starting a meetup.
  • 1:44 – Film industry's involvement and how many attendees were new to VR. The first meetup came just weeks after the Facebook acquisition, and over 200 people came out.
  • 2:34 – What type of feedback did you receive? A lot of people in the movie industry are seeing VR as the future of storytelling. Cosmo wants to provide emotionally-engaging experiences.
  • 3:22 – What types of stories are interesting to you? Not a lot of storytelling is happening in VR yet; VR is early. Differences between film and VR. Filmmaking has had 100 years to develop rules and practices for the frame, but VR is a new medium. How do you effectively tell a story without relying upon the same filmmaking techniques?
  • 4:36 – What are some of the open problems in VR storytelling? How to direct someone's attention. With filmmaking you can use depth of field, camera direction, cues to look left or right, and colors or movement in the frame. Some of the cues in real life are whether others are looking in a direction and which direction sounds are coming from. Passive vs. interactive VR: completely interactive like a video game, semi-interactive cinematic experiences, and the completely passive. Not a whole lot of people are looking at the third way.
  • 6:22 – Familiar with Nonny de la Peña's work; she attended VRLA #1. It's interesting in terms of VR as a tool for building empathy. When you're placed virtually in someone else's shoes, you feel what they're feeling. You can connect to people via a 2D screen, but you know that there's a distance. With VR, you're in it and completely immersed and engaged.
  • 7:31 – What about choose-your-own-adventure vs. a linear film? A VR experience similar to Mass Effect, where you have different options for what to say to characters, and your response changes how the story unfolds. He would like to see natural feedback between you and the AI characters.
  • 8:22 – Where would he like to see VR going? We're still very early with VR; no consumer product is out yet, and that will determine if VR is a real thing. Strongly believes it's the future of storytelling, video games, computing & disseminating information in general. HMDs will be portable. Convergence of augmented reality with VR. Hard to determine how the industry will evolve even within the next month, but it's the most exciting industry to be a part of. All of the leaders of the consumer VR space are at SVVRCon.

Theme music: “Fatality” by Tigoolio

Ben Lang is a futurist and avid gamer who recognized back in October 2011 that all of the pieces for a virtual reality revolution were in place. He started RoadToVR.com as an outlet for his desire to learn about VR by reading and writing about it.

He discovered that some of the best discussions were happening on the Meant to be Seen 3D (MTBS3D) forums, and he had been following the progression of the Oculus Rift before its successful Kickstarter launched on August 1, 2012.

He talks about his inspirations for getting involved with tracking the evolution and road towards virtual reality as well as what types of experiences he’s really looking forward to having in VR, including some of the more physical aspects of virtual reality.

Ben talks about the intention and focus of Road to VR in that they’re trying to cover the latest technological developments, but also new and exciting VR experiences that are getting a lot of buzz. He describes some of his favorite VR experiences so far, and what types of VR experiences he’s looking forward to having in the future.

He gives an overview of the state of the VR HMD space as well as which of the peripheral VR input devices are worth tracking more closely.

Finally, Ben talks about some of the non-gaming potential applications that get him excited, as well as what types of social experiences he’d find really compelling.

Reddit discussion here.

TOPICS

  • 0:00 – Founded RoadToVR.com in Oct 2011, which allowed him to follow his passion for VR. He needed an outlet to write about it in order to learn about it more. He could see that the pieces were there to bring VR to fruition, but at that time you had to be a hacker/maker to enjoy VR because all of the VR HMDs were DIY. The Oculus Rift Kickstarter launched in Aug 2012, and he'd been tracking it on the MTBS3D forums. The Rift was originally going to be a DIY kit, and then Carmack got involved. There's been an amazing amount of growth since then. Kudos to Ben for having the futurist vision to see the potential of the VR space.
  • 3:17 – How did you come across MTBS3D forums? Part of the research process to discover that’s where the leading thinking was. People were excited about the Sony HMZ. Forums are a great place to take a deep dive and learn about a topic.
  • 4:46 – What was your inspiration for getting excited about VR? Mostly the technological evolution and its impact on society. It'll be a powerful medium going forward because it taps into something deep in your subconscious. It's not going away any time soon. No specific movie or book served as a singular inspiration; The Matrix is the example he uses as a metaphor for others. But he had a lot of thought experiments about what would be possible – like a physics simulation where you slow down the speed of light.
  • 6:29 – What do you want to experience in VR? Studio Ghibli and other films or games where you want to be a part of that world, so that you can go inside the stories that you love and be a part of them. Also excited about the adrenaline-rushing, heart-pounding physical interactions with an emotional component made possible by technologies like the Virtuix Omni. Also looking forward to having his entire self within VR and being able to recall intense moments with friends.
  • 9:24 – Are you coming from a gaming background then? More of a technology futurist. Was carrying around mobile computers before mobile phones. Then tablets and mobile phones came out and that space got boring. Also like to stay on the cutting edge, and VR feels like it’ll have legs and be around for quite a while.
  • 11:04 – There’s lots of technological components to this topic, what is Road to VR’s focus and intention with their coverage? They get excited about people doing cool stuff with VR. Trying to pick out the new and awesome hardware as well as innovative VR experiences. The key is what is truly new and exciting. Crash Land is compelling demo. Looking to help document the cool and new VR experiences.
  • 12:50 – What are some of the VR demos that are the most compelling? Half-Life 2 and the Half-Life VR mod that enables Hydra controls, which add an additional component of immersion. Valve did a great job of creating an engaging world with a great story, characters, interesting enemies, and different types of weapons, and they did a great job of adapting it to VR. Lots of demos and short experiences weren't built from the ground up for VR.
  • 14:30 – What are some of the top VR products out there that get you excited? Obviously, the Oculus Rift consumer version, and he can't wait for the DK2. Sony Morpheus. Microsoft is likely coming out with something soon, perhaps at E3. Incumbent game companies were a bit too jaded by the previous failures of VR, though, which is why it had to come from a grassroots group who could afford to take a risk.
  • 17:38 – What sticks out to you at SVVRCon? Everything at SVVRCon worth checking out. Infinadeck omnidirectional treadmill prototype proof of concept. Sandbox Augmented Reality adapter for the Rift. Would like to see new demos for Morpheus.
  • 19:44 – What about the Avegant Glyph display technology? There's a difference between a VR headset and a head-mounted display: the Glyph is an HMD, not a VR headset. 45-degree FOV. No screen-door effect. The Glyph creators are keeping an eye on VR, and it'd be good to have competition in displays.
  • 22:39 – What do you see as the potential of VR beyond gaming? Immersing burn patients within another world while their bandages are being changed. Looking forward to gaming experiences where you can hang out and play poker with friends.

Theme music: “Fatality” by Tigoolio

Walter Greenleaf, Ph.D. is an early pioneer in the medical application of virtual environment technology, and is viewed as one of the founders of the field. His focus has been on the use of virtual reality and telemedicine technology to treat Post Traumatic Stress Disorder (PTSD), addictions, Autism, and other difficult problems in behavioral medicine.

He talks about the types of medical applications of VR that have been studied and verified by research, and what the potential is for implementing these applications for the first time. He also talks about the impact of the isolation that happens within elderly and disabled populations, and the social implications of having a way to interact online beyond using the keyboard and mouse.

Reddit discussion here.

Theme music: “Fatality” by Tigoolio

Amir Rubin is the CEO of Sixense, the distributor of the Razer Hydra controller and manufacturer of the wireless STEM controllers, which have been described as enabling a 3D multi-touch interface and which could become the first “mouse of VR.”

Amir talks about how the intention of Sixense is to create presence in VR by bringing your body into VR in an intuitive fashion. He talks about the differences between a physical-based solution like the STEM controllers versus using gesture controls with a camera-based system.

He talks about the history and timing of the Razer Hydra, and how it was a tough sell to convince developers that this paradigm shift was needed for interacting in 3D spaces — especially before virtual reality came along. He talks about the challenges faced with shortages of the Hydra and how they’ve been supporting exciting VR projects.

He talks about the modular design of the STEM controllers, and how developers will be able to bring all sorts of different peripheral objects into VR by using the STEM pack component.
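
As a rough illustration of what that modular tracking enables, the sketch below applies a 6 DOF pose (position plus orientation) reported by a tracked pack to a virtual stand-in object each frame. The names are hypothetical and this is not the Sixense SDK.

```typescript
// Hypothetical sketch: mirror the 6 DOF pose of a tracked physical prop
// (sword, racket, camera rig) onto its virtual counterpart. Not the
// actual Sixense STEM SDK.
interface Vec3 { x: number; y: number; z: number; }
interface Quat { x: number; y: number; z: number; w: number; }

interface TrackedPose {
  position: Vec3;    // meters, relative to the STEM base station
  orientation: Quat; // rotation of the physical object the pack is attached to
}

interface VirtualObject {
  position: Vec3;
  orientation: Quat;
}

// Called once per frame; a real integration would also apply a fixed
// mounting offset between the pack and the prop it is strapped to.
function applyPose(pose: TrackedPose, obj: VirtualObject): void {
  obj.position = { ...pose.position };
  obj.orientation = { ...pose.orientation };
}

// Example usage with a made-up pose sample:
const sword: VirtualObject = {
  position: { x: 0, y: 0, z: 0 },
  orientation: { x: 0, y: 0, z: 0, w: 1 },
};
applyPose(
  { position: { x: 0.1, y: 1.2, z: -0.3 }, orientation: { x: 0, y: 0, z: 0, w: 1 } },
  sword
);
```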

Amir then goes into what first inspired the creation of Sixense, and how he wanted to be able to provide VR training experiences where the technology didn't get in the way, and where people could have as much immersion and presence as possible in order to tap into their deeper intuition. He believes that almost every experience in reality could be improved with virtual reality, and he wants to help bring that sense of immersion and presence to people through Sixense technologies.

Finally, he talks about the lessons learned from the failed VR revolution of the 90s, and how that has influenced the collaborative mindset that Sixense has as well as others within the VR community. He’d like to help make the VR revolution happen this time around, and help provide incredible educational and training experiences as well as allowing people to step out of reality and have boundless entertainment experiences where they can live out their wildest fantasies.

Reddit discussion here.

TOPICS

  • 0:00 – Intro. Sixense is providing presence in virtual reality, and presence requires bringing your body into VR. Partnering with Razer on the Hydra. Working on the wireless STEM controllers.
  • 1:36 – Timing of the Razer Hydra and being out of stock; it's not manufactured by Sixense. Providing Hydras to exciting VR development projects. The Hydra was designed to enable people working with 3D applications as a “3D multi-touch” technology. It was nice to have, but hard to convince developers. Valve was supportive, but the Hydra was hard technology to justify adopting before VR. Carmack's contribution to the resurgence of VR. The Hydra became the mouse of VR. Creating a wireless version with STEM, and the end of Hydra manufacturing.
  • 5:29 – Modular design of the STEM System controllers – Three components: STEM base. STEM controllers with lots of buttons and joystick controller. STEM pack that can be attached to other objects to get 6 DOF control with them. Trying to make it as simple for developers as possible.
  • 8:04 – How do you see the difference between camera-based gestures and controls with physical buttons for VR input? Camera-based gestures can be great if you have line of sight and an intuitive use case, but simulating the use of objects in VR doesn't work as well because it's not natural. Sixense builds sensors and input devices so that you can use your intuition rather than learning a new sign language. The power of bringing hand-eye coordination into VR. Sony & Oculus are using a combination of cameras and physical controllers. The Sixense SDK supports other 6 DOF controllers like the Sony Move and the Intel RealSense camera for perceptual computing.
  • 11:28 – What was the inspiration for the creation of Sixense Entertainment? Amir was inspired by VR in the 90s. He saw how much gear military soldiers had to wear for their VR training exercises; a general wanted soldiers to have a VR experience without all of the technology getting in the way. Creating a system that's 100% experience delivery, where the transition between reality and virtual reality is seamless and fast. Focusing on the consumer VR version, and wanting to create VR training experiences where you forget about the VR tech, easily forget reality, step into a fantasy world, and have all of the experiences you want. Don't worry about the constraints of technology, because that breaks VR immersion.
  • 15:01 – What is driving you now? Is it a business desire, or is there something in VR that you want to experience? Reality isn't as good as it should be, and VR can improve on nearly everything we have in reality. Step into entertainment experiences, transform into a character, and live out your fantasies. Explore resort vacations with his family. Go anywhere together with your family. Everything that you need can come to you digitally.
  • 17:11 – How have you seen the VR industry evolve since you've been involved with it? Learn from the failures of the 90s: the technology was not there yet, but the business execution was also lacking from too much competition and not enough collaboration. He's seeing a lot more collaboration this time around, from Oculus to the open, collaborative mindset of Sixense. They can't support professional applications yet and are focusing on the mass market at the moment. Integrating with the Sony SDK. Sixense wants to collaborate with other hardware manufacturers and distributors.
  • 19:57 – What is the ultimate potential for virtual reality? Every use case of reality can be improved by virtual reality. Education and training implications are huge.

Theme music: “Fatality” by Tigoolio

Vlad Vukićević is a technology innovator at Mozilla trying to figure out how virtual reality fits into the web. He’s the inventor of WebGL, and he’s been working with Unity & Unreal Engine 4 to be able to create plug-in free gaming and VR experiences on the web.

Vlad sees that we're on the cusp of a VR revolution, and he wants to be sure the web is treated as a first-class citizen. He talks about JavaScript performance, asm.js, libraries like three.js, Unity's IL2CPP, and the challenge of working with compiled libraries on the web.

We talk about the process of giving web developers the data and output capabilities that are necessary for VR on the web, and then the implications of responsive web design when adding an additional dimension and the ability to provide a fully immersive experience.
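
To make the plug-in-free rendering path concrete, here is a minimal three.js scene of the kind that WebGL enables directly in the browser. This is a generic sketch rather than code from Mozilla's VR work, and it leaves out the VR-specific device and display APIs that Vlad discusses.

```typescript
import * as THREE from 'three';

// Minimal WebGL scene rendered through three.js: a spinning cube,
// no browser plug-in required.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  75,                                     // vertical field of view in degrees
  window.innerWidth / window.innerHeight, // aspect ratio
  0.1,                                    // near clipping plane
  1000                                    // far clipping plane
);
camera.position.z = 3;

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const cube = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial()
);
scene.add(cube);

function animate(): void {
  requestAnimationFrame(animate);
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
}
animate();
```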

Reddit discussion here.

TOPICS

  • 0:00 – How VR fits into the web. WebGL and enabling high performance games on the web via Unity & UE4, and making sure the web is a first-class citizen in this upcoming VR revolution
  • 0:57 – What is the importance of OpenGL, and how did that lead to WebGL? Making sure WebGL runs as well on mobile as it does on desktop PCs.
  • 1:55 – Does three.js use WebGL to do the 3D rendering functionality that it provides? All of the browsers provide access to WebGL.
  • 2:28 – Is JavaScript fast enough to be equivalent to natively compiled code? asm.js. A lot of the graphically intense work is handled by the graphics card, and Mozilla believes that JavaScript is up to the task.
  • 3:39 – What is asm.js? A subset of JavaScript designed to be highly optimizable (see the sketch after this list).
  • 4:28 – The meaning behind the 1.4x-slower metric. The goal is to get as close to 1x parity as possible.
  • 4:58 – Other cases in which asm.js can make the application faster?
  • 5:26 – What is IL2CPP for Unity, and how does that enable Unity to get to the web
  • 6:30 – Unity’s embedded player and moving to a plug-in free experience for the Web
  • 7:21 – Moving away from walled gardens
  • 7:38 – Custom binaries as a blocker for putting VR content onto the web.
  • 8:49 – Being aware of decisions for your VR experience that may prevent it from running on the web.
  • 9:37 – What’s the potential for the web and VR. Presenting content in VR via the web. Giving web developers to the data and output capabilities that are necessary for VR.
  • 10:39 – The future of responsive web design after adding an additional dimension
  • 11:42 – What makes you excited about virtual reality
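
For a sense of what that subset looks like, here is a small hand-written module in the asm.js style. In practice this kind of code is emitted by compilers such as Emscripten rather than written by hand, and the module below is only an illustrative sketch (shown without TypeScript type annotations so that it stays within the asm.js subset).

```typescript
// Hand-written illustration of the asm.js style: integer-only arithmetic
// over a typed-array heap, with "| 0" coercions telling the engine that
// every value is a 32-bit integer it can compile ahead of time.
function SumModule(stdlib, foreign, heap) {
  "use asm";
  var HEAP32 = new stdlib.Int32Array(heap);

  function sum(n) {
    n = n | 0;                  // coerce the argument to an int
    var total = 0;
    var i = 0;
    for (i = 0; (i | 0) < (n | 0); i = (i + 1) | 0) {
      // (i << 2) is the byte offset; >> 2 turns it back into an
      // Int32Array index -- the canonical asm.js heap-access idiom.
      total = (total + (HEAP32[(i << 2) >> 2] | 0)) | 0;
    }
    return total | 0;
  }

  return { sum: sum };
}

// Link the module against the global standard library and a 64 KB heap.
const heap = new ArrayBuffer(64 * 1024);
new Int32Array(heap).set([1, 2, 3, 4]);
const mod = SumModule(globalThis, null, heap);
console.log(mod.sum(4)); // 10
```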

Theme music: “Fatality” by Tigoolio

William Provancher is the founder of Tactical Haptics, which produces the Reactive Grip controller. The Reactive Grip controller simulates force feedback through two components: kinesthetic feedback (muscle forces) and tactile feedback (surface frictional forces).

William explains the genesis of his project and some technical details of how it works. He also talks about why he's on a quest to bring his academic research to a wider audience, despite the fact that a number of investors are telling him that he's too early. He admits that he's more of an academic than a businessman, but he's been able to live in both worlds by choosing the right framing to create devices for different types of perceptual research.

He also talks about how, at the IEEE VR conference, Henry Fuchs identified haptics as the next big problem to solve in VR. William was able to get some recognition for his work by winning best demo at that same conference.

He also talks about the upcoming SDK for the Reactive Grip, which will let you translate physics engine data from Unity into haptic feedback and which includes a number of canned interactions as well as any combination of mass, spring, and damper.
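
As a rough sketch of what mapping physics state to a haptic command can look like, the snippet below renders a classic mass-spring-damper model as an opposing force. It is a generic illustration with invented names, not Tactical Haptics' actual SDK API.

```typescript
// Generic mass-spring-damper force model of the kind a haptics SDK can
// drive from a game engine's physics state. All names are illustrative.
interface SpringDamperParams {
  stiffness: number; // k, in N/m   (spring)
  damping: number;   // c, in N*s/m (damper)
  mass: number;      // m, in kg    (inertia of the virtual object)
}

// Render the virtual object as an opposing force: the spring resists
// displacement, the damper resists velocity, and the mass term resists
// acceleration (felt as inertia).
function hapticForce(
  displacement: number,  // x: offset of the grip from its rest pose (m)
  velocity: number,      // v: how fast the grip is moving (m/s)
  acceleration: number,  // a: how fast that velocity is changing (m/s^2)
  p: SpringDamperParams
): number {
  return -(p.stiffness * displacement + p.damping * velocity + p.mass * acceleration);
}

// Example: a stiff spring with light damping, displaced by 2 cm.
const force = hapticForce(0.02, 0.1, 0, { stiffness: 400, damping: 5, mass: 0.2 });
console.log(force.toFixed(2), 'N'); // -8.50 N
```

In an actual integration, values like displacement and velocity would come from the game engine's physics step (Unity, in Tactical Haptics' case), and the resulting force would be translated into the sliding-plate motion of the Reactive Grip.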

William is very candid about what did and did not work with his unsuccessful Kickstarter campaign, but is very determined to stay the course and help bring this technology to the world.

As far as the future of VR, William wonders how long the collegial and collaborative nature of the VR community will remain once the consumer market starts to take shape.

Reddit discussion here.

TOPICS

  • 0:00 – Intro & how the Tactical Haptics Reactive Grip controller mimics the friction or shear forces that you'd feel when you're grasping onto an object
  • 0:33 – How are you creating this haptic force reaction? Sliding plates in the grip represent frictional forces to mimic force feedback.
  • 1:15 – Force feedback is any external force applied to you, and it has two components: the kinesthetic component received through the muscles, and the tactile component felt by the hand. You provide half of the kinesthetic force, and the tactile forces are mimicked.
  • 2:08 – In VR, there’s body tracking & head tracking visual component, and haptics being the key components. Matching your expectations in VR is what makes makes it more real.
  • 3:03 – How did you get started with haptics? What's driving you now? He's been doing this research for 15 years. It worked so well that it became a quest to bring it into the world, and it's simple enough that it could just work. Early user testing on what people want from haptic feedback in immersive games to add to the experience.
  • 4:45 – What are some of the interactions that your SDK supports? Two modes of interaction: direct calculations of the physical forces that come out of your game engine, starting with Unity and eventually UE4, and scripts that simulate a gunshot type of motion.
  • 6:24 – What are the canned interactions that you'll be providing? A set of demos that portray the sense of contact and resistance after contact, a sense of inertia, kickback torque, and elastic, multi-dimensional deformable objects. In addition, there are all sorts of combinations of mass, spring, and damper.
  • 7:55 – What are you using for positional tracking? Using the Razer Hydra at the moment. Others know how to do tracking, so they will use solutions from others. Their goal is to provide feedback that's more compelling and sophisticated than rumble, but cheaper than force feedback.
  • 8:23 – Are there any safety concerns with haptic feedback injuring people? Not sure, but usually people who complain about that were gripping the controller too tightly.
  • 10:10 – What lessons did you learn from your Kickstarter that was not successful.
  • 12:26 – Will you be doing another crowdfunding campaign or planning any other fundraising efforts?
  • 13:28 – The Razer Hydra seemed to come a few years too early, before the demand from VR was there. Is there enough of a viable consumer market to support this type of haptic feedback? Conditional venture capital interest and doubts about viability, and then changing perspectives post-Facebook acquisition. We'll be interested if others are interested.
  • 15:07 – Talking about winning the demo competition at IEEE VR with the Reactive Grip controller. Henry Fuchs' keynote about how the Facebook acquisition was the best thing to ever happen to VR. It's a hard problem, and their solution isn't perfect, but it's really good. Haptics is the next big challenge for VR, and the academic research community sees the value.
  • 17:27 – How have you dealt with the culture clash between an academic mindset and the more enterprising start-up mindset? Make devices and study perception with them. The next things that need to be done aren't always framed in an academic way, but they can be. It's all a matter of framing, and he's been able to find the intersection between what he wants to do and what the research needs are. It needs to pass the KISS principle to be viable in the consumer market.
  • 19:26 – What do you see in the future of VR? Wondering how much of the collegial, collaborative vibe will remain once market forces start to divide people going after a limited pool of resources.

Theme music: “Fatality” by Tigoolio

James Blaha had never been able to see in 3D before. He has strabismus, more commonly known as crossed eyes, which has prevented him from being able to point both eyes at the same point in space. This can lead to lazy eye, or amblyopia, and cause a loss of depth perception.

It used to be common knowledge that there was a critical period for these conditions, and that they needed to be successfully treated by the age of 8-10 years old; otherwise, it was believed, the visual system would be hard-wired for the rest of a person's life. However, a number of neuroplasticity studies over the past couple of years have indicated that the brain is more malleable than we previously thought and that effective treatments for lazy eye are possible.

James had been following this neuroplasticity research and decided to start a virtual reality side project with the Oculus Rift to put some of these studies into practice. He created some scenes that increased the intensity shown to his lazy eye and decreased the intensity shown to his good eye, and he was blown away at being able to see in 3D for the first time in his life. Encouraged by this success, he continued to develop a series of games that he played for over 20 hours over the course of 3 weeks, until he was able to see in 3D in the real world for the first time in his life.
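
A minimal sketch of that per-eye balancing idea, assuming a stereo renderer that exposes a brightness or contrast multiplier for each eye (all names below are invented for illustration and are not Diplopia's actual code):

```typescript
// Hypothetical per-eye intensity balancing: give the weaker (amblyopic)
// eye the full-strength image and attenuate the dominant eye, so the
// brain is nudged into using input from both eyes.
type Eye = 'left' | 'right';

interface EyeIntensity {
  left: number;  // brightness/contrast multiplier for the left eye, 0..1
  right: number; // brightness/contrast multiplier for the right eye, 0..1
}

function balanceForWeakEye(weakEye: Eye, dominantEyeLevel: number): EyeIntensity {
  const weak = 1.0;                                            // weak eye sees full intensity
  const dominant = Math.min(Math.max(dominantEyeLevel, 0), 1); // clamp to 0..1
  return weakEye === 'left'
    ? { left: weak, right: dominant }
    : { left: dominant, right: weak };
}

// As training progresses, the dominant eye's level can be raised toward 1.0.
console.log(balanceForWeakEye('right', 0.4)); // { left: 0.4, right: 1 }
```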

James tells his story of creating his Diplopia virtual reality game and what type of research interest he’s receiving. He also had a very successful Indiegogo campaign, which enabled him to have other people try Diplopia as well. He talks about other people who have had similar success with being able to see in 3D for the first time, and that there are a number of controlled research studies under way to verify these results.

James says that what gets him excited about the future applications of virtual reality is that for the first time ever, we’re able to finely control the input given to the brain. This means that we’re going to be able to control human perception on a precise level and that it will have more applications than anyone imagines.

It makes me wonder: with the neuroplasticity of our brains and the principle of spaced repetition of controlled perceptual input through virtual reality, what other types of nascent capabilities will we be able to unlock for ourselves? There are already implications for Virtual Reality Exposure Therapy for PTSD, cerebral palsy, autism, pain management, and beyond.

I agree with James that we’re just starting to scratch the surface for what’s possible with the medical & psychological applications for virtual reality. There’s been a lot of research happening in these areas, but I have a feeling that it will become a lot more popular now that there will be affordable VR devices available to consumers for the first time.

Reddit discussion here.

TOPICS

  • 0:00 – Intro & developing Diplopia to help with lazy eye, and being able to see in 3D for the first time
  • 1:00 – Being able to see in 3D in the real world after 20 hours of playing across 3 weeks
  • 1:33 – Who’s heard of this breakthrough so far?
  • 2:02 – What do you have to do to maintain 3D vision in the real world? Similar to a weak muscle
  • 2:35 – Did you imagine that you would be able to get to this point in the beginning? Reading neuroplasticity research studies related to lazy eye.
  • 3:09 – Have there been others who have been able to successfully see in 3D? Example of eliminating double vision. Testimonial video from Dillon
  • 3:47 – Spaced repetition bringing improvement in vision.
  • 3:58 – What is the optimal time for learning using this? 15 minutes every other day.
  • 4:26 – What was the previous “common knowledge” thinking about the critical age after which the visual system would be hard-wired? It used to be 8-10 years old, until neuroplasticity studies started coming out. Traditional methods of treating lazy eye don't really work.
  • 5:20 – What other research interest has Diplopia received?
  • 5:55 – More details about the game and the game design, and how you make it fun. Take advantage of the brain’s goal and reward system. The goal of the game is to improve your vision.
  • 6:37 – Two miniature games at the moment, and building out more.
  • 7:00 – What gets you excited about the potential for Virtual Reality? Able to finely control the input to the brain, and being able to control perception on a fine level will have more applications than anyone imagines.
  • 7:49 – What’s next for Diplopia? Get it out and help as many people as they can.

Theme music: “Fatality” by Tigoolio

Paul Mlyniec has been involved in computer graphics for virtual reality for over 30 years & has been chasing his dream of creating an immersive, 3D painting medium. He started his career at Silicon Graphics before joining MultiGen to develop what became the de facto standard in modeling for flight simulation. He's now with Sixense Entertainment as the head developer on MakeVR, continuing his career of making highly interactive, immersive applications.

Paul provides a great piece of historical framing of where we are today. Whereas the space program in the 1960s drove a lot of innovations in technology, the space program driving technology today is immersive gaming within virtual reality.

Paul also talks about some of the user interface lessons for manipulating objects within 3D. For example, he talks about how the 2-handed interface is like 3D multi-touch & how it helps prevent motion sickness. He also shares the surprising insight that turning VR knobs while pressing a button with the same hand turns out to be a straining action. And he also talks about the differences between using physical input devices with a button versus when it makes more sense to use gesture controls with camera-based, visual input.
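
To make the 3D multi-touch analogy concrete, here is a hedged sketch of how a two-handed grab can drive translation and uniform scale the way pinch-zoom does on a 2D touchscreen. It is a generic illustration, not MakeVR's implementation.

```typescript
// Generic two-handed "3D multi-touch" manipulation: the midpoint between
// the hands drives translation and the change in hand separation drives
// uniform scale, mirroring 2D pinch-zoom. Illustrative only.
interface Vec3 { x: number; y: number; z: number; }

const midpoint = (a: Vec3, b: Vec3): Vec3 => ({
  x: (a.x + b.x) / 2, y: (a.y + b.y) / 2, z: (a.z + b.z) / 2,
});

const distance = (a: Vec3, b: Vec3): number =>
  Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);

interface TwoHandDelta {
  translation: Vec3;   // how far to move the grabbed object (or the world)
  scaleFactor: number; // > 1 when the hands spread apart, < 1 when they pinch in
}

function twoHandDelta(
  prevLeft: Vec3, prevRight: Vec3,
  currLeft: Vec3, currRight: Vec3
): TwoHandDelta {
  const prevMid = midpoint(prevLeft, prevRight);
  const currMid = midpoint(currLeft, currRight);
  return {
    translation: {
      x: currMid.x - prevMid.x,
      y: currMid.y - prevMid.y,
      z: currMid.z - prevMid.z,
    },
    scaleFactor: distance(currLeft, currRight) / distance(prevLeft, prevRight),
  };
}
```

One design rationale, paraphrasing Paul's point about motion sickness, is that the viewpoint moves only as far and as fast as your own hands do, which tends to be gentler than joystick-driven movement through the world.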

Finally, he talks about some of the lessons learned from the MakeVR Kickstarter and their plans moving forward.

Reddit discussion here.

TOPICS

  • 0:00 – Intro and 30-year background in VR & computer graphics
  • 1:37 – LEGO VR demo at SIGGRAPH 1996 with MultiGen that was head-mounted & hand tracked
  • 3:00 – Space program of the 60s was to get to the moon, and today’s space program is gaming in virtual reality
  • 4:46 – What benchmarks do you look at when moving computer graphics forward? How easy it is to create 3D graphics, along with the speed of creation & fluid flow.
  • 6:39 – What are the differences in using 3D input devices vs 2D mouse and keyboard & how that changes the 3D graphics creation process?
  • 7:58 – Who would be the target demographic for something like MakeVR? Would you see professionals who use Maya using this tool? How well rapid prototyping works with MakeVR
  • 9:24 – How you’re evolving 3DUI with the two-handed, 3D user interface w/ fully immersive experience? Making the most out of limited real estate, and 2D buttons vs. 3D cube widgets.
  • 11:19 – 3DUI experiments that failed? Twist control being a straining activity for your wrist. Pressing a button on the opposite hand seems to be a best practice.
  • 12:38 – What are things that work with 3DUI? Two-handed, 3D user interface is like 3D multi-touch and how it avoids motion sickness when driving a viewpoint through the world.
  • 14:18 – Physical controls vs camera-based motion controls in 3D. Physical controls meet the professional demands for speed and precision where you can have rapid input without physical strain.
  • 16:13 – MakeVR Kickstarter lessons learned and plans moving forward. Too high of a price point, no early access, misgauged the target market.
  • 17:33 – Best way to track progress of MakeVR

Theme music: “Fatality” by Tigoolio

Cymatic Bruce Wooden is a VR enthusiast who went from tracking the Oculus Rift project on the Meant to be Seen forums, to backing it on Kickstarter on the first day, to making a first impressions video and other demo videos, to starting a weekly livestream, to co-founding the Silicon Valley Virtual Reality meetup, to eventually landing a full-time VR position at Qualia3D.

He talks about some of his most memorable VR experiences, things to avoid in VR development, some of his current and future VR projects, and where he'd like to see VR go in the future.

Cymatic Bruce’s enthusiastic evangelism of virtual reality has inspired many people to get into VR game development. In fact, it was his Top 20 Oculus Rift VR experiences of 2013 that convinced me to jump into getting a Rift.

Congratulations to Cymatic Bruce for chasing your VR dream and landing a full-time VR gig, and may you continue to inspire many more future VR devs with your presence and energy.

Reddit discussion here.

TOPICS

  • 0:00 – Intro & how he first got into virtual reality, participating in Meant To Be Seen forums, supporting Oculus Kickstarter, doing first impression videos, reviewing demos, starting a weekly livestream to getting a full-time VR job.
  • 2:36 – Why he started reviewing videos and became a VR enthusiast
  • 3:49 – Cymatic Bruce’s Top 20 Oculus Rift VR experiences of 2013 being an inspiration & Titans of Space being #4
  • 5:10 – What VR experiences have stuck out after doing 500 demos: Shadow Projection, Spectre, Minecrift Mod, & Crashland
  • 6:28 – Things to avoid when doing VR development & what to include more of
  • 7:53 – Intention behind Cymatic Bruce’s Hitbox livestream show & future plans
  • 9:21 – Some of Cymatic Bruce’s development projects
  • 10:13 – Getting a full-time VR position at Qualia3D
  • 11:09 – Vision for VR, what he'd like to see happen in the VR community, & Ready Player One

Theme music: “Fatality” by Tigoolio

Aaron Davies is the Head of Developer Relations at Oculus VR, and he talks about how the interest in VR development has exploded since the Facebook acquisition. He discusses some of the applications in development, ranging from VR surgery training, education, film & media, enterprise apps, data visualization, telepresence, and social apps to virtual reading experiences and comic books in VR.

Aaron also shares his initial experiences with the Rift and his reactions to simulator sickness. He also talks about one of the most compelling experiences he's had in VR within the Minecrift mod with some of his co-workers at Oculus.

He also talks about how the types of resources that can be provided to independent developers will change since the Facebook acquisition, and what other types of issues Facebook will be helping Oculus solve.

Finally, he talks about the importance of the community at this early stage of exploring the VR medium, and what he sees as the ultimate potential of VR and what the Metaverse can provide.

Reddit discussion here.

TOPICS

  • 0:00 – Intro & how the VR development community is branching out beyond gaming since the Facebook acquisition
  • 0:50 – What will change in terms of what types of resources can be provided by Oculus VR for independent VR developers + what specific competencies the Facebook acquisition will provide to the Oculus VR team.
  • 2:12 – What industries beyond gaming are doing with VR development for the Oculus Rift including education, academia, history, film & media, and virtual comic books & virtual reading experiences
  • 3:46 – Aaron’s first experiences with the Rift, motion sickness, joining Oculus, and the most compelling experience that he’s had in VR
  • 5:38 – Aaron’s 60-second pitch at SVVRCon & the importance of the VR community
  • 6:33 – The potential of what VR can bring to society & vision of the Metaverse

Theme music: “Fatality” by Tigoolio