#42: John Murray on Seebright’s iPhone-powered, mixed reality VR/AR HMD

John Murray talks about the iPhone-powered, mixed reality VR/AR HMD made by his company Seebright. He founded Seebright in 2012 as an affordable way to explore augmented and virtual reality experiences and to enable interactions that go beyond what you can do with a tablet or smartphone alone.

Seebright uses a reflective surface that lets you see the real world while also viewing an augmented reality, stereoscopic image produced by dropping your iPhone into their head-mounted display. They're able to deliver a mixed-reality AR experience because the optics reflect the iPhone's display into your field of view.

Finally, John talks a bit about latency on an iPhone and Seebright's future plans, including a developer program, a possible crowdfunding campaign, and a motion controller to serve as an input device.

TOPICS

  • 0:00 – Founded Seebright in 2012 because computers are becoming more personal. Envisioned Seebright as an affordable way to explore 3D AR/VR environments beyond what a tablet or smartphone alone can do
  • 0:56 – VR and AR are on a continuum where you're either more or less engaged with the real world. The display occupies a smaller part of your field of view, but allows you to see more of the real world. The optics reflect the image into your field of view, where you can combine virtual imagery with the real world.
  • 2:24 – Smartphones vary widely in their implementation of 9-axis motion sensors. Latency becomes more important the wider your field of view. The target device is the iPhone.
  • 3:10 – The magic number is 20ms; how good is it on an iPhone? Around 20–36ms on an iPhone 5, definitely larger than 20ms. Not close to dedicated hardware.
  • 3:53 – Announcing a developer program later this year. Will continue to explore a low-cost AR/VR display that will also include a motion controller.

Theme music: “Fatality” by Tigoolio

Rough Transcript

[00:00:05.412] Kent Bye: The Voices of VR Podcast.

[00:00:11.941] John Murray: My name is John Murray. I am founder and CEO of Seebright, which was founded in 2012. It came out of my interest in how computers were becoming increasingly personal, evolving from these giant mainframes that held bits of our data, to devices in our pocket that accompanied us from appointment to appointment, to eventually the most intimate of screens that hovered in front of us and accompanied us as we observed our environment or perhaps took us to other environments. So I envisioned Seebright as a system that could be really affordable and accessible, something at a price point that didn't have to justify a $300 killer app in order for people to experiment with it, to see how they can create value, to essentially create new types of applications that you can't create with a tablet or a smartphone.

[00:00:55.652] Kent Bye: And so you have a sort of see-through augmented reality display using mobile phones. Maybe you could just talk about your system and all the things that it encompasses.

[00:01:04.841] John Murray: Sure. So it's interesting that everyone's latching onto these terms of VR and AR. They're really sort of on this continuum of head-mounted displays where you're less or more engaged with what's in front of you in the real world. All of them take advantage of our sense of this environment: that if we tilt our head around, we see different things. So we tried to create a display that may occupy a smaller portion of your field of view, but which in turn allows you to have both see-through and non-see-through lenses that attach to it. In the VR case, it allows you to get as absorbed in what you see as you would in a movie or in a PC game, while still having that sense of visceral movement, that if you could just look a little to the left or a little to the right, you could see something that you wouldn't otherwise see. That sort of sense of immersion you get from wanting the camera to move a little bit left in a favorite movie and knowing that there's something interesting over there. Because the system uses a set of optics and reflects the image into your field of view, it also affords a see-through display, since the last surface your eye sees can itself be see-through. That allows you to combine virtual imagery on top of the real world, especially when you take advantage of the smartphone's existing camera capabilities. Taking the two together, you can create displays and interfaces that take advantage of physical objects and that can be seen from different perspectives by people in the same room.
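To make the drop-in-smartphone idea concrete, here's a minimal sketch of side-by-side stereo rendering on iOS with SceneKit, the kind of split-screen image an HMD's optics would then reflect into each eye. The class name, eye separation, and scene contents are illustrative assumptions, not Seebright's actual renderer.

```swift
import UIKit
import SceneKit

// Minimal sketch: render one scene from two horizontally offset cameras
// into the left and right halves of a landscape phone screen. Values such
// as the ~63 mm eye separation are assumptions for illustration.
final class StereoViewController: UIViewController {
    private let scene = SCNScene()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Something to look at: a small box one meter in front of the viewer.
        let box = SCNNode(geometry: SCNBox(width: 0.2, height: 0.2,
                                           length: 0.2, chamferRadius: 0))
        box.position = SCNVector3(0, 0, -1)
        scene.rootNode.addChildNode(box)

        // One camera per eye, offset by half the interpupillary distance.
        let ipd: Float = 0.063
        let half = view.bounds.width / 2
        let eyes: [(offset: Float, frame: CGRect)] = [
            (-ipd / 2, CGRect(x: 0, y: 0, width: half, height: view.bounds.height)),
            ( ipd / 2, CGRect(x: half, y: 0, width: half, height: view.bounds.height)),
        ]
        for eye in eyes {
            let camNode = SCNNode()
            camNode.camera = SCNCamera()
            camNode.position = SCNVector3(eye.offset, 0, 0)
            scene.rootNode.addChildNode(camNode)

            // Both views share the scene but render from their own camera.
            let scnView = SCNView(frame: eye.frame)
            scnView.scene = scene
            scnView.pointOfView = camNode
            view.addSubview(scnView)
        }
    }
}
```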

[00:02:24.385] Kent Bye: And can you talk a bit about the latencies that you're experiencing in terms of, you know, moving your head around and how you're able to measure that?

[00:02:29.773] John Murray: Sure, so smartphones vary widely in their implementation of 9-axis motion sensors. One of the interesting things is that when you have a limited field of view, you're less sensitive to motion and latency than you would be if it were in your peripheral view. A lot of your peripheral vision is directly tied to your inner ear and your sense of motion, so latency becomes even more important the wider your field of view is. Now, our current target display, the iPhone, has some pretty good sensors, but they will definitely improve in the future, and we suspect that you can augment them with increasingly cheap and better-performing motion sensors that send their data to the smartphone, or perhaps even to special-purpose devices, to decrease the latency and increase the performance.
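For context on what an app actually sees from those 9-axis sensors, here's a small sketch of polling iOS's fused orientation through Core Motion. The 100 Hz rate and reference frame are assumptions for illustration; this is not Seebright's tracking code.

```swift
import CoreMotion

// Minimal sketch: poll fused 9-axis orientation on iOS with Core Motion.
let motion = CMMotionManager()

func startHeadTracking() {
    guard motion.isDeviceMotionAvailable else { return }
    motion.deviceMotionUpdateInterval = 1.0 / 100.0  // request ~100 Hz updates

    // .xMagneticNorthZVertical asks Core Motion to fuse gyroscope,
    // accelerometer, and magnetometer (all 9 axes) into one attitude.
    motion.startDeviceMotionUpdates(using: .xMagneticNorthZVertical,
                                    to: .main) { data, error in
        guard let d = data else { return }
        let q = d.attitude.quaternion  // head orientation as a quaternion
        // A renderer would feed q into its per-eye camera transforms here.
        _ = q
    }
}
```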

[00:03:10.302] Kent Bye: And so, one of the magic numbers that Oculus has been aiming for in their hardware is 20 milliseconds. I'm just curious about what type of latency you're seeing on an iPhone.

[00:03:17.650] John Murray: So by latency, you mean the time from the sensor input to the change of the pixel, I assume, which I believe they're calling time to photon, right? I think it's around 20 to 36 milliseconds for an iPhone 5, at approximately 60 frames per second. But more importantly, the latency from the gyroscope and accelerometer to that is in the same realm. It is not down at four, five, or ten milliseconds; it's definitely larger than 20 milliseconds. So I would not say it's close to the dedicated hardware. But I think the range of applications goes far beyond those that require one-to-one immersion in the real world.
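As a rough way to observe the sensor-to-vsync slice of that "time to photon" on a phone, one can compare Core Motion sample timestamps against CADisplayLink's frame timestamps, since both use the same time-since-boot clock. This sketch only bounds part of the pipeline (it ignores render time and the panel's pixel response), and all names in it are illustrative assumptions, not a method from the interview.

```swift
import CoreMotion
import QuartzCore

// Rough sketch: measure how old the newest fused motion sample is at each
// vsync. This bounds only the sensor-to-vsync slice of motion-to-photon
// latency; render time and pixel response are not captured.
final class LatencyProbe: NSObject {
    private let motion = CMMotionManager()
    private var lastSampleTime: TimeInterval = 0
    private var link: CADisplayLink?

    func start() {
        motion.deviceMotionUpdateInterval = 1.0 / 100.0  // request ~100 Hz
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            // CMDeviceMotion.timestamp and CADisplayLink.timestamp share
            // the same time-since-boot clock, so they compare directly.
            if let t = data?.timestamp { self?.lastSampleTime = t }
        }
        link = CADisplayLink(target: self, selector: #selector(onFrame(_:)))
        link?.add(to: .main, forMode: .common)
    }

    @objc private func onFrame(_ link: CADisplayLink) {
        guard lastSampleTime > 0 else { return }
        let ageMs = (link.timestamp - lastSampleTime) * 1000
        print(String(format: "newest sensor sample is %.1f ms old at vsync", ageMs))
    }
}
```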

[00:03:52.370] Kent Bye: Great. And so what's next for Seebright?

[00:03:54.811] John Murray: So we'd like to announce a developer program later this year, possibly an earlier Alpha 1 preceding a Kickstarter or some other crowdfunding. And we'd like to continue to explore the types of applications that a low-cost, affordable, yet versatile head-mounted display that includes a motion sensor controller can afford. And we're really curious what developers will come up with. So we want to get it in their hands as quickly as possible.

[00:04:19.379] Kent Bye: Great. Well, thank you. No, thank you.
