John Murray talks about the iPhone-powered, mixed reality VR/AR HMD made by his company Seebright. He started Seebright in 2012 as an affordable way to explore augmented and virtual reality experiences, and to enable interactions that go beyond what you could do with a tablet or smartphone alone.
Seebright uses a reflective surface that lets you see both the real world and an augmented reality, stereoscopic image produced by dropping your iPhone into their head-mounted display. They’re able to create a mixed-reality AR experience because the optics reflect the iPhone’s display into your field of view.
Finally, John talks a bit about latency on an iPhone and Seebright’s future plans, including a developer program, a possible crowdfunding campaign, and a motion controller to serve as an input device.
- 0:00 – Founded Seebright in 2012 because computers are becoming more personal. Envisioned Seebright as an affordable way to explore 3D AR/VR environments beyond what a tablet or smartphone alone can offer
- 0:56 – VR and AR are on a continuum where you’re either more or less engaged. AR displays occupy less of your field of view, but allow you to see more of the real world. The optics reflect the display into your field of view, so you can combine virtual imagery with the real world.
- 2:24 – Latencies vary widely across implementations of 9-axis motion sensors. Latency matters more as the display fills more of your field of view. Their target device is an iPhone.
- 3:10 – The magic number is 20ms; how good is it on an iPhone? Around 20-30ms on an iPhone 5, which is larger than the 20ms target and not close to dedicated hardware.
- 3:53 – Announcing a developer program later this year. They will continue to explore a low-cost AR/VR display, which will also include a motion controller.
Theme music: “Fatality” by Tigoolio