James Blaha had never been able to see in 3D before. He has strabismus, more commonly known as crossed eyes, which has prevented his eyes from pointing at the same point in space. This can lead to amblyopia, or lazy eye, and cause a loss of depth perception.

It used to be common knowledge that there was a critical period for treating these conditions: they had to be addressed between the ages of 8 and 10, or else the visual system was believed to be hard-wired for the rest of a person’s life. However, a number of neuroplasticity studies over the past couple of years have indicated that the brain is more malleable than we previously thought, and that effective treatments for lazy eye are possible.

James had been following this neuroplasticity research and decided to start a virtual reality side project with the Oculus Rift to put some of these studies into practice. He created some scenes that increased the intensity of the image shown to his lazy eye and decreased the intensity shown to his good eye, and he was blown away at being able to see in 3D for the first time in his life. Encouraged by this success, he continued to develop a series of games, and after playing them for over 20 hours across 3 weeks he was able to see in 3D in the real world as well.
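
To make the underlying idea more concrete, here is a minimal Python sketch of this kind of dichoptic training loop. This is not James’ actual Diplopia code, and the gain values and rebalancing rule are purely illustrative assumptions: the weak eye’s image is boosted and the strong eye’s image is dimmed, then the imbalance is slowly reduced as in-game performance improves.

```python
# A minimal sketch (not the actual Diplopia code) of the dichoptic idea:
# render the scene brighter to the weak eye and dimmer to the strong eye,
# then gradually rebalance as the player's binocular fusion improves.

def dichoptic_gains(weak_eye_boost):
    """Return (strong_eye_gain, weak_eye_gain) brightness multipliers.

    weak_eye_boost ranges from 0.0 (no imbalance) to 1.0 (maximum handicap
    on the strong eye).  The constants here are illustrative, not tuned.
    """
    boost = max(0.0, min(1.0, weak_eye_boost))
    strong_gain = 1.0 - 0.7 * boost   # dim the dominant eye's image
    weak_gain = 1.0 + 0.5 * boost     # brighten the suppressed eye's image
    return strong_gain, weak_gain


def rebalance(weak_eye_boost, hit_rate):
    """Nudge the imbalance down when in-game performance (hit_rate) is good,
    so both eyes converge toward equal input over many short sessions."""
    step = 0.02 if hit_rate > 0.8 else -0.01
    return max(0.0, min(1.0, weak_eye_boost - step))


if __name__ == "__main__":
    boost = 0.8
    for session in range(5):
        strong, weak = dichoptic_gains(boost)
        print(f"session {session}: strong-eye gain {strong:.2f}, weak-eye gain {weak:.2f}")
        boost = rebalance(boost, hit_rate=0.85)  # pretend the player did well
```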

James tells the story of creating his Diplopia virtual reality game and describes the kind of research interest he’s receiving. He also ran a very successful Indiegogo campaign, which enabled other people to try Diplopia as well. He talks about others who have had similar success in seeing in 3D for the first time, and notes that a number of controlled research studies are under way to verify these results.

James says that what gets him excited about the future applications of virtual reality is that for the first time ever, we’re able to finely control the input given to the brain. This means that we’re going to be able to control human perception on a precise level and that it will have more applications than anyone imagines.

It makes me wonder: given the neuroplasticity of our brains and the principle of spaced repetition applied to controlled perceptual input through virtual reality, what other nascent capabilities will we be able to unlock for ourselves? There are already implications for Virtual Reality Exposure Therapy for PTSD, as well as applications for cerebral palsy, autism, pain management and beyond.

I agree with James that we’re just starting to scratch the surface of what’s possible with the medical & psychological applications of virtual reality. There’s been a lot of research happening in these areas, but I have a feeling it will become a lot more popular now that affordable VR devices will be available to consumers for the first time.

Reddit discussion here.

TOPICS

  • 0:00 – Intro & developing Diplopia to help with lazy eye, and being able to see in 3D for the first time
  • 1:00 – Being able to see in 3D in the real world after 20 hours of playing across 3 weeks
  • 1:33 – Who’s heard of this breakthrough so far?
  • 2:02 – What do you have to do to maintain 3D vision in the real world? Similar to a weak muscle
  • 2:35 – Did you imagine that you would be able to get to this point in the beginning? Reading neuroplasticity research studies related to lazy eye.
  • 3:09 – Have there been others who have been able to successfully see in 3D? Example of eliminating double vision. Testimonial video from Dillon
  • 3:47 – Spaced repetition bringing improvement in vision.
  • 3:58 – What is the optimal time for learning using this? 15 minutes every other day.
  • 4:26 – What was the previous “common knowledge” thinking about the critical age when the visual system would become hard-wired? It used to be 8-10 years old until neuroplasticity studies started coming out. Traditional methods of treating lazy eye don’t really work.
  • 5:20 – What other research interest has Diplopia received?
  • 5:55 – More details about the game and the game design, and how you make it fun. Take advantage of the brain’s goal and reward system. The goal of the game is to improve your vision.
  • 6:37 – Two miniature games at the moment, and building out more.
  • 7:00 – What gets you excited about the potential for Virtual Reality? Able to finely control the input to the brain, and being able to control perception on a fine level will have more applications than anyone imagines.
  • 7:49 – What’s next for Diplopia? Get it out and help as many people as they can.

Theme music: “Fatality” by Tigoolio

Paul Mlyniec has been involved in computer graphics for virtual reality for over 30 years & has been chasing his dream of creating an immersive, 3D painting medium. He started his career at Silicon Graphics before joining MultiGen to develop what became the de facto standard in modeling for flight simulation. He’s now with Sixense Entertainment as the Head Developer on MakeVR, continuing his career of making highly interactive, immersive applications.

Paul provides a great piece of historical framing of where we are today. Whereas the space program in the 1960s drove a lot of innovations in technology, the space program driving technology today is immersive gaming within virtual reality.

Paul also talks about some of the user interface lessons for manipulating objects within 3D. For example, he talks about how the two-handed interface is like 3D multi-touch & how it helps prevent motion sickness. He also shares the surprising insight that turning VR knobs while pressing a button with the same hand turns out to be a straining action, and he discusses the differences between using physical input devices with buttons versus when it makes more sense to use gesture controls with camera-based, visual input.
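
To make the “3D multi-touch” analogy concrete, here is a rough Python sketch of the kind of two-handed manipulation Paul describes: the segment between the two tracked hands drives pan, uniform scale, and yaw of the grabbed object, much like two-finger pan, pinch, and twist on a touchscreen. This is not MakeVR’s implementation; the function and the math are my own illustration of the concept.

```python
import math

def two_handed_transform(prev_left, prev_right, left, right):
    """Given previous and current positions of two tracked hands (x, y, z
    tuples), return (translation, scale_factor, yaw_delta) to apply to the
    grabbed object -- the 3D analogue of two-finger pan/pinch/rotate."""
    def midpoint(a, b):
        return tuple((a[i] + b[i]) / 2.0 for i in range(3))

    def span(a, b):
        return tuple(b[i] - a[i] for i in range(3))

    def length(v):
        return math.sqrt(sum(c * c for c in v))

    prev_mid, cur_mid = midpoint(prev_left, prev_right), midpoint(left, right)
    translation = tuple(cur_mid[i] - prev_mid[i] for i in range(3))  # pan

    prev_span, cur_span = span(prev_left, prev_right), span(left, right)
    scale = length(cur_span) / max(length(prev_span), 1e-6)          # pinch

    # yaw: change in the hands' heading within the horizontal (x, z) plane
    prev_yaw = math.atan2(prev_span[2], prev_span[0])
    cur_yaw = math.atan2(cur_span[2], cur_span[0])
    yaw_delta = cur_yaw - prev_yaw                                   # twist

    return translation, scale, yaw_delta


# Example frame: the hands move apart and rotate slightly around vertical.
t, s, yaw = two_handed_transform((-0.2, 1.0, 0.0), (0.2, 1.0, 0.0),
                                 (-0.3, 1.0, 0.05), (0.3, 1.0, -0.05))
print(t, s, math.degrees(yaw))
```

Because the object is moved relative to the hands rather than by flying the viewpoint through the world, the camera itself stays still, which is one plausible reading of why this style of interaction is gentler on motion sickness.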

Finally, he talks about some of the lessons learned from the MakeVR Kickstarter and their plans moving forward.

Reddit discussion here.

TOPICS

  • 0:00 – Intro and 30-year background in VR & computer graphics
  • 1:37 – LEGO VR demo at SIGGRAPH 1996 with MultiGen that was head-mounted & hand tracked
  • 3:00 – Space program of the 60s was to get to the moon, and today’s space program is gaming in virtual reality
  • 4:46 – What benchmarks do you look at when moving computer graphics forward? How easy it is to create 3D graphics, with the speed of creation & fluid flow.
  • 6:39 – What are the differences in using 3D input devices vs 2D mouse and keyboard & how that changes the 3D graphics creation process?
  • 7:58 – Who would be the target demographic for something like MakeVR? Would you see professionals who use Maya using this tool? How well rapid prototyping works with MakeVR
  • 9:24 – How are you evolving 3DUI with the two-handed, 3D user interface within a fully immersive experience? Making the most out of limited real estate, and 2D buttons vs. 3D cube widgets.
  • 11:19 – 3DUI experiments that failed? Twist control being straining activity for your wrist. Pressing button on the opposite hand seems to be a best practice.
  • 12:38 – What are things that work with 3DUI? Two-handed, 3D user interface is like 3D multi-touch and how it avoids motion sickness when driving a viewpoint through the world.
  • 14:18 – Physical controls vs camera-based motion controls in 3D. Physical controls meet the professional demands for speed and precision where you can have rapid input without physical strain.
  • 16:13 – MakeVR Kickstarter lessons learned and plans moving forward. Too high of a price point, no early access, misgauged the target market.
  • 17:33 – Best way to track progress of MakeVR

Theme music: “Fatality” by Tigoolio

Cymatic Bruce Wooden is a VR enthusiast who went from tracking the Oculus Rift project on the Meant to be Seen forums to backing it on Kickstarter on the first day to making a first impression video and other demo videos to starting a weekly livestream to co-founding the Silicon Valley Virtual Reality meetup to eventually landing a full-time VR position at Qualia3D.

He talks about some of his most memorable VR experiences, things to avoid in VR development, some of his current and future VR projects, and where he’d like to see VR go in the future.

Cymatic Bruce’s enthusiastic evangelism of virtual reality has inspired many people to get into VR game development. In fact, it was his Top 20 Oculus Rift VR experiences of 2013 that convinced me to jump into getting a Rift.

Congratulations to Cymatic Bruce for chasing his VR dream and landing a full-time VR gig, and may he continue to inspire many more future VR devs with his presence and energy.

Reddit discussion here.

TOPICS

  • 0:00 – Intro & how he first got into virtual reality, participating in Meant To Be Seen forums, supporting Oculus Kickstarter, doing first impression videos, reviewing demos, starting a weekly livestream to getting a full-time VR job.
  • 2:36 – Why he started reviewing videos and becoming a VR enthusiast
  • 3:49 – Cymatic Bruce’s Top 20 Oculus Rift VR experiences of 2013 being an inspiration & Titans of Space being #4
  • 5:10 – What VR experiences have stuck out after doing 500 demos: Shadow Projection, Spectre, Minecrift Mod, & Crashland
  • 6:28 – Things to avoid when doing VR development & what to include more of
  • 7:53 – Intention behind Cymatic Bruce’s Hitbox livestream show & future plans
  • 9:21 – Some of Cymatic Bruce’s development projects
  • 10:13 – Getting a full-time VR position at Qualia3D
  • 11:09 – Vision for VR and what he’d like to see happen in the VR community & Ready Player One

Theme music: “Fatality” by Tigoolio

Aaron Davies is the Head of Developer Relations at Oculus VR, and he talks about how interest in VR development has exploded since the Facebook acquisition. He discusses some of the applications that are in development, ranging from VR surgery training, education, film & media, enterprise apps, data visualization, telepresence, and social apps to virtual reading experiences and comic books in VR.

Aaron also shares his initial experiences with the Rift and his reactions to simulator sickness. He also talks about one of the most compelling experiences he’s had in VR, within the Minecrift mod with some of his co-workers at Oculus.

He also talks about how the types of resources that can be provided to independent developers will change since the Facebook acquisition, and what other issues Facebook will be helping Oculus solve.

Finally, he talks about the importance of the community at this early stage of exploring the VR medium, and what he sees as the ultimate potential of VR and what the Metaverse can provide.

Reddit discussion here.

TOPICS

  • 0:00 – Intro & how the VR development community is branching out beyond gaming since the Facebook acquisition
  • 0:50 – What will change in terms of the types of resources that Oculus VR can provide to independent VR developers + what specific competencies the Facebook acquisition will bring to the Oculus VR team.
  • 2:12 – What industries beyond gaming are doing with VR development for the Oculus Rift including education, academia, history, film & media, and virtual comic books & virtual reading experiences
  • 3:46 – Aaron’s first experiences with the Rift, motion sickness, joining Oculus, and the most compelling experience that he’s had in VR
  • 5:38 – Aaron’s 60-second pitch at SVVRCon & the importance of the VR community
  • 6:33 – The potential of what VR can bring to society & vision of the Metaverse

Theme music: “Fatality” by Tigoolio

Having believable characters within virtual reality can be one of the most challenging but most important components to get right within a virtual reality experience. Stefano Corazza and Mixamo.com have been working on a set of tools to make it easier to create & animate 3D characters within your virtual worlds.

Stefano talks about some of Mixamo’s services like Fuse to create characters, their auto-rigging functionality, their Decimator to optimize the polygon count, and how they have over 10,000 different animations available in different packs. They have also developed Face Plus to motion capture the facial expressions and emotions for a character, which is the technology that High Fidelity is using to translate what you’re doing with your face onto a VR avatar.

Finally, he talks about the principle of the uncanny valley, and how Mixamo avoids it — as well as the importance of the expression of emotions within VR.

Mixamo also recently updated their Fuse product to make it easier to create 3D characters by making it possible to upload your own content that can be customized with their tools.

They have a couple of pricing options from pay as you go to a $1499/year professional account where you get access to all of Mixamo’s services. Fuse is also available on Steam for $99.

Reddit discussion here.

Topics

  • 0:00 – Intro
  • 0:34 – Emotional importance of 3D characters – Face Plus facial animation, Fuse character creator
  • 1:56 – Optimizing high polygon count characters with Decimator & transforming Kinect scans into rigged 3D characters
  • 2:53 – 10,000 animations in pre-set packs and importing into game engine like Unity
  • 3:33 – Face Plus motion capture of facial expressions, Unity plug-in & variety of applications of Face Plus
  • 4:30 – Mapping emotions to a face via abstraction process
  • 5:34 – How Mixamo avoids the Uncanny Valley via stylized characters
  • 6:47 – Creating efficient game characters
  • 7:11 – How people like Nonny de la Peña are using Mixamo
  • 7:48 – High Fidelity & their use of Face Plus & using facial animation in game play
  • 8:41 – Pricing structure for Mixamo
  • 9:10 – Future of Virtual Reality & importance of emotions
  • 10:02 – Representing yourself in virtual reality

Theme music: “Fatality” by Tigoolio

This interview with Nonny de la Peña was by far my favorite discussion from the Silicon Valley Virtual Reality Conference & Expo. It’s moving to hear about the type of emotional reactions that she’s receiving from her human rights-centered, immersive journalism pieces that are experienced within virtual reality. She has some amazing insights into VR storytelling, virtual identity, and the importance of bringing in more diversity into VR.


Nonny has been working on VR storytelling since creating a virtual Guantanamo Bay prison cell in Second Life in 2009. She started working with VR HMDs before the Oculus Rift existed, and in fact was a part of USC’s Institute for Creative Technologies when Palmer Luckey was there. Luckey even provided Nonny with a pre-Oculus Rift HMD for her 2012 Sundance showing of “Hunger in LA.”

She’s also worked with Mel Slater, who has explored a lot of interesting effects of Positive Illusions of Self in Immersive Virtual Reality.

Nonny has a ton of insights on the components of creating a compelling VR experience, from starting with great audio to creating believable virtual humans. I also found her vision of a tiered VR future, ranging from untethered IMAX-like experiences to the Oculus Rift home experience and finally mobile VR, to be a compelling way of distinguishing the different levels of VR immersion and their associated technologies.

For more information on the service that Nonny uses to create her virtual humans, be sure to check out this interview with the founder of Mixamo.

Reddit discussion is here.

Topics

  • 0:00 – Intro to Immersive Journalism & how it got started
  • 1:29 – Recreating a scene of a Guantanamo Bay prison cell in Second Life
  • 3:30 – Taking control of somebody’s Second Life avatar, and the types of reactions people had going through a virtual re-enactment of being a Guantanamo Bay prisoner
  • 4:29 – How people identified with their avatar being bound
  • 5:14 – What were some of your first immersive journalism stories that used a fully immersive, virtual reality head-mounted display? Identifying with a VR body in a stress position
  • 7:12 – Institute for Creative Technologies, Mark Bolas, and her connection to Palmer Luckey
  • 8:02 – Immersive VR piece on “Hunger in Los Angeles” & starting with audio
  • 9:20 – Palmer Luckey & pre-Oculus Rift, VR HMD prototype for Sundance January 2012, and audience reactions
  • 11:42 – Commissioned VR piece on Syrian refugees shown at the World Economic Forum
  • 13:21 – Witnessing a border patrol tasing death
  • 13:56 – Next projects and the potential of immersive storytelling
  • 15:20 – What are some key components of storytelling within an immersive VR environment?
  • 17:32 – Why is the reaction of empathy so much stronger in immersive VR?
  • 18:38 – What are the risks of putting people into a traumatic VR scene and triggering PTSD?
  • 19:47 – How do you direct attention within an immersive VR story?
  • 20:55 – Are your immersive journalism pieces interactive at all?
  • 21:30 – How else are people using this immersive VR medium to tell unique stories?
  • 22:47 – What type of software and hardware are you using for your virtual humans in your immersive VR pieces?
  • 21:15 – Being the only woman panelist at SVVR and importance of diversity to VR’s resurgence.
  • 26:36 – Bringing more diversity into VR storytelling
  • 28:19 – The tiers of VR experiences of IMAX, home and mobile.
  • 29:20 – Location-based, untethered VR experiences being equivalent to going to an IMAX movie.

Theme music: “Fatality” by Tigoolio

This is the first of 44 interviews that I conducted at the Silicon Valley Virtual Reality Conference & Expo. I was able to capture over 11.5 hours worth of material from 3/4 of the speakers, 2/3 of the exhibitors and 11% of all attendees, and I’ll be doing a daily podcast over the next month and a half (and maybe beyond). A full list of interviewees is listed down below.

Palmer Luckey is the founder of Oculus VR, and I had an opportunity to conduct a brief interview with him. I noticed that Palmer’s bio mentions that he founded the ModRetro Forums, and it turns out that there are some interesting connections between ModRetro and the founding of Oculus VR.

He talks about some of his first and forgettable VR experiences, and the process of starting what he claims is the world’s largest private VR HMD collection. He also covers his connection to the /r/oculus reddit community, the reaction to the Facebook acquisition announcement on Reddit, as well as what he sees as the future of VR.

Reddit Discussion here. And be sure to check out the Rev VR Ubercast featuring an in-depth discussion with Palmer here.

Topics

  • 0:00 – What led to founding the ModRetro forums & how has the console modding scene evolved?
  • 2:29 – How did VR evolve out of being involved with the console modding scene?
  • 3:38 – What do you remember about your first VR experience?
  • 4:20 – When did you start your VR HMD collection and what were some of the first VR headsets that you got?
  • 4:56 – Did you end up fixing a lot of these VR HMDs?
  • 5:17 – What have been some of your favorite VR experiences?
  • 5:47 – How often do you read the Oculus subreddit community?
  • 6:25 – How did you react to the Reddit community’s downvoting and skepticism of the Facebook acquisition?
  • 7:14 – What were some of your takeaways from the panel about the next five years of VR?
  • 8:11 – What has been your experience of this first consumer VR conference & what it means?

Full list of interviews from the 1st Silicon Valley Virtual Reality Conference

  1. Aaron Davies, Head of Developer Relations at Oculus VR
  2. Aaron Lemke, Unello Design founder & Eden River game
  3. Amir Rubin, Sixense Entertainment CEO
  4. Ben Lang, Road to VR founder & executive editor
  5. Bernhard Drax, Draxtor.com Second Life documentarian
  6. Blair Renaud, IRIS VR Co-Founder & Lead Designer of Technolust
  7. Caitlyn Meeks, Unity Asset Store Manager
  8. Cosmo Scharf, Founder of VRLA
  9. Cris Miranda, EnterVR podcast
  10. “Cymatic” Bruce Wooden, VR evangelist
  11. David Holz, Leap Motion CTO and Co-founder
  12. Denny Unger, Cloudhead Games President & Creative Director of “The Gallery: Six Elements” game
  13. Ebbe Altberg, Linden Lab CEO (Second Life)
  14. Edward Mason, GameFace Labs Founder and CEO
  15. Eric Greenbaum, Jema VR Founder
  16. George Burger, Infinadeck founder
  17. James Blaha, Diplopia game for lazy eye
  18. Jan Goetgeluk, Virtuix CEO & developer of the Virtuix Omni
  19. Jesse Joudrey, Jespionage Entertainment founder & creator of VRChat
  20. John Murray, Seebright founder & CEO
  21. Josh Farkas, Cubicle Ninjas CEO
  22. Matt Bell, Matterport founder & CEO
  23. Matt “Stompz” Carrell, Stompz founder & co-host of PodVR podcast
  24. Max Geiger, producer at Wemo Lab
  25. Mike Sutherland, PrioVR & VP of technology at YEI Technology
  26. Nathan Burba, Survios CEO
  27. Nick Lebesis, Network Flo
  28. Nonny de la Peña, Immersive Journalism founder & Annenberg Fellow at USC School of Cinematic Arts
  29. OlivierJT, Synthesis Universe creator
  30. Palmer Luckey, Oculus VR Founder
  31. Paul Mlyniec, MakeVR Head of Development at Sixense Entertainment
  32. Peter Sassaman, Gauntl33t Project Haptic Feedback Glove
  33. Philip Rosedale, Founder of High Fidelity & Second Life
  34. Reverend Kyle Riesenbeck, Rev VR Podcast & Road to VR contributor
  35. Scott Phillips, VR Walker Project
  36. Sean Edwards, Director of Development Lucid VR & Shovsoft & Lunar Flight & ZVR
  37. Simon Solotko, All Future Parties Founder
  38. Stefan Pernar, Virtual Reality Ventures founder & Virtual Reality Fashion project
  39. Stefano Corazza, Mixamo CEO & Co-founder
  40. Tony Davidson, Innervision VR & Ethereon game
  41. Tony Parisi, Vizi Founder, & co-creator of the VRML & co-chair of the San Francisco WebGL Meetup
  42. Vladimir Vukicevic, Gaming director at Mozilla & inventor of WebGL
  43. Walter Greenleaf, Stanford University, MediaX Program & medical applications of VR
  44. William Provancher, Tactical Haptics founder & Reactive Grip™ touch feedback

Theme music: “Fatality” by Tigoolio

Jason Jerald of NextGen Interactions has been involved with creating computer graphics and next-generation 3D computer interfaces for 20 years. His virtual reality consulting client list ranges from Oculus VR, Valve and Sixense Entertainment to NASA Ames Research Center, Battelle Pacific Northwest National Laboratories, Naval Research Laboratories, HRL Laboratories, DARPA & NIH.

jason-jerald

We talk about some of his research and thoughts on VR latency, simulator sickness, presence and VR input devices & 3D user interface constructs. We also cover highlights from the IEEE VR, 3DUI, SIGGRAPH & Neurogaming conferences.

Topics

  • 0:00 Intro
  • 1:58 Consulting work with Oculus VR
  • 2:46 Jason’s Ph.D work in reducing latency leading to work with Valve & Oculus VR
  • 4:08 The 20ms latency threshold target
  • 5:41 Research process for measuring VR latency
  • 7:37 Other VR user studies comparing 3D user interface tasks with 2D equivalents
  • 9:00 3D User Interface (3DUI) conference contest
  • 10:46 The importance of VR hand input, point-to-fly UIs, & going beyond 2D menu constructs
  • 12:43 VR input options of vision-based systems, physical based devices and data gloves
  • 15:01 Comparing and contrasting the strengths and weaknesses of VR input devices
  • 16:19 IEEE VR highlights including the Head-Mounted Display panel that Jason moderated
  • 19:07 IEEE VR perspective on Facebook acquisition, and Henry Fuchs’ inspirational keynote.
  • 20:24 The biases towards low-risk dissertations that prevented academia from making a VR breakthrough
  • 22:25 IEEE VR Unity 3D workshop, MiddleVR, Virtual Human Toolkit, and AutoVerb binaural audio plug-in
  • 25:27 Adoption of Unity in Academia
  • 27:04 Academic VR frameworks & toolkits and UE4
  • 28:04 Unity Asset Store and the Impulsonic AutoVerb Unity Plug-in for binaural audio
  • 28:54 SIGGRAPH computer graphics conference and its connection to Virtual Reality
  • 30:27 Jason’s background in real-time 3D graphics
  • 31:24 Neurogaming conference impressions
  • 32:34 Tradeoffs in consumer EEG interfaces: ease of use vs. more powerful EEG signals with more electrodes & paste.
  • 33:48 Using palm sweat and heart rate to measure VR presence
  • 36:34 Quantitative and qualitative measures for researching simulator sickness
  • 37:39 Sixense’s serious game grant for “Motion-Controlled Gaming for Neuroscience Education”
  • 39:39 Potential of getting a VR dream job in academia
  • 42:28 Keenly interested in the open problems of 3D user interfaces, researching simulator sickness best practices & moving towards higher-level VR problems rather than implementation
  • 44:50 Wrap up and conclusion

Music: “Fatality” by Tigoolio

Oliver “Doc_Ok” Kreylos is a research scientist / computer scientist who develops virtual reality applications for scientific research, specifically immersive 3D visualizations for the department of geology at the W.M. Keck Center for Active Visualization in the Earth Sciences (KeckCAVES).

He is an active participant in the Oculus subreddit, posting as “Doc_Ok,” where he has been gathering a lot of attention for his innovative data visualizations as well as for using an array of Kinects to create a “pseudo-holographic” avatar within virtual reality.

Oliver has been making virtual reality apps since 1998, including the Virtual Reality User Interface (aka Vrui) for navigating and interacting with immersive 3D visualizations of scientific data. He has a wealth of knowledge about VR, and has been providing a lot of insightful commentary on his blog at Doc-Ok.org.


TOPICS

  • 0:00 Intro & enabling scientific VR visualization
  • 1:57 Remote collaboration with scientific visualizations
  • 4:49 Developing the Virtual Reality User Interface (Vrui) toolkit
  • 8:01 3D visualizations that are impossible in 2D
  • 10:37 Converting 2D CAT scan slices into full 3D medical visualizations
  • 14:12 Future of Kinect-enabled telepresence collaboration
  • 16:32 Hardware & software stack for setting up a calibrated Kinect-array for telepresence
  • 19:07 Speculation on hacking Kinect V2
  • 21:37 How Kinect VR avatars can provide a sense of presence
  • 24:12 The importance of implementing positional head tracking for presence
  • 25:32 Importance of using 6DOF Hydra & STEM controllers with Vrui & data visualization
  • 27:29 Importance of supporting VR input devices in a unified manner to avoid previous VR mistakes
  • 28:32 Prophetic feedback on the Oculus DK1 that has been integrated into DK2
  • 31:33 Find Oliver online at Doc-Ok.org, @okreylos & Doc_Ok on Reddit. Oliver’s future projects


Music: “Fatality” by Tigoolio

Professor Eric Hodgson explores some of the psychological impacts of virtual reality when it comes to spatial perception & memory, and effects like change blindness. He also talks about how to walk through infinite VR spaces within limited physical spaces using a technique called redirected walking. Finally, he shares insights from the IEEE VR community on the consumer VR revolution, compares the split between the old VR community and the new VR community, and discusses 3D user interfaces, data visualizations, and some examples of how corporations and the military are using VR.

Eric Hodgson is the director of the Smale Visualization Center & also works with the Armstrong Institute for Interactive Media Studies at Miami University of Ohio. He runs the HIVE (Huge Immersive Virtual Environment), which is a gym-sized VR lab. For more information on Eric, here are some of his VR research publications.
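
As a rough illustration of the redirected-walking idea Eric describes, here is a small Python sketch; the gain constants and function are my own illustrative assumptions, not the HIVE lab’s code. The virtual viewpoint is rotated slightly more than the user’s actual head turn and is gently curved while they walk, ideally below the threshold of perception, so a straight virtual path maps onto a curved physical path that stays inside the tracked space.

```python
import math

# Rough, illustrative gain values; real systems tune these from perceptual
# threshold studies so users don't notice the manipulation.
ROTATION_GAIN = 1.2       # amplify physical head turns by 20% in VR
CURVATURE_RADIUS_M = 7.5  # bend straight virtual walks onto ~7.5 m circles

def redirect(physical_yaw_delta_rad, step_distance_m):
    """Return the extra virtual yaw (radians) to inject this frame.

    Combines a rotation gain applied to the user's own head turn with a
    curvature gain applied while they walk forward.
    """
    from_turning = physical_yaw_delta_rad * (ROTATION_GAIN - 1.0)
    from_walking = step_distance_m / CURVATURE_RADIUS_M
    return from_turning + from_walking


# Example frame: the user turns 2 degrees and takes a 2 cm step forward.
extra = redirect(math.radians(2.0), 0.02)
print(f"inject {math.degrees(extra):.3f} extra degrees of virtual yaw")
```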

Topics

  • 0:00 Intro
  • 1:04 Redirected Walking
  • 3:50 Factors that go into spatial perception & visual dominance
  • 5:26 Why was the HIVE lab built?
  • 8:10 How our spatial memory works
  • 10:49 Change blindness for changes that are out of the field of view
  • 12:12 Navigating geometrically impossible spaces & how we mentally fill in gaps
  • 13:29 Movement speeds in virtual environments – natural vs. VR movement speeds
  • 14:38 Why VR movement speeds are 2x normal walking speeds. Field-of-view?
  • 15:42 IEEE VR community perspective on the rise of consumer VR
  • 17:20 The divide between Old VR vs New VR communities
  • 19:03 Future of Professional VR companies & high-end VR markets
  • 20:41 How the military is using VR
  • 22:42 How corporations are using VR
  • 25:04 Why now? The timing of this VR revolution. Previous low-end VR headsets
  • 27:01 Cutting-edge demos and prototypes at IEEE VR conference
  • 30:03 3DUI conference, and user interface research in VR environments
  • 30:58 Highlights of the future of telepresence keynote from the IEEE VR conference
  • 34:03 Where VR is headed within five years & what gets you excited about what’s to come
  • 35:41 Data visualization within 3D VR environments


Music: “Fatality” by Tigoolio