#142: Yuta Itoh on HMD calibration techniques for eye-tracking and augmented reality

Anyone who's gone through a calibration process to get eye tracking working within an HMD knows how annoying it can be. Yuta Itoh is working on a number of different techniques to automate this process, and he's a leader in this area.

Yuta is a Ph.D. Fellow at the Chair for Computer Aided Medical Procedures & Augmented Reality in Munich, Germany. He specializes in augmented reality, where calibration is much more important because you're overlaying virtual objects within a mixed reality context through an optical see-through, head-mounted display.

Academic VR researchers submit their latest research to present at the IEEE VR conference, and if it sufficiently advances the field, it's published in the IEEE Transactions on Visualization and Computer Graphics (TVCG) journal. Each year TVCG publishes the accepted long papers within a special Proceedings Virtual Reality edition.

Yuta was able to have three co-authored papers accepted as TVCG journal papers, which is quite an accomplishment. Here are his three papers:

Light-Field Correction for Spatial Calibration of Optical See-Through Head-Mounted Displays

Corneal-Imaging Calibration for Optical See-Through Head-Mounted Displays

Subjective Evaluation of a Semi-Automatic Optical See-Through Head-Mounted Display Calibration Technique

I found it interesting that Yuta and many other AR researchers draw a lot of inspiration from the 3D user interfaces shown in science fiction blockbusters like Iron Man and Minority Report.

This calibration work that Yuta is doing could help make eye tracking within VR applications a lot more user friendly, and more resilient to shifts in the position of the VR HMD. One of the complaints that Mark Schramm had about FOVE in a recent podcast discussion is that if the HMD moves at all, it ruins the eye-tracking calibration. Some of the light-field correction and corneal-imaging calibration techniques that Yuta is working on could provide a way to automatically adjust the calibration for any movement of the HMD or for any new user.
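To make that sensitivity concrete, here's a toy sketch (not from Yuta's papers; the simple pinhole projection model and all the numbers are illustrative assumptions) of how even a few millimeters of HMD slip shifts where a virtual overlay needs to be drawn:

```python
# Toy illustration of why OST-HMD calibration breaks when the headset moves.
# To overlay a virtual point on a real 3D point, the renderer needs the eye's
# position relative to the display's virtual image plane. We model this with
# a simple pinhole projection; all coordinates are in meters.

def project(point, eye, focal=1.0):
    """Project a 3D point onto a virtual image plane `focal` meters
    in front of the given eye position (eye-centered pinhole model)."""
    x, y, z = (p - e for p, e in zip(point, eye))
    return (focal * x / z, focal * y / z)

real_point = (0.1, 0.0, 2.0)      # a real object 2 m ahead, 10 cm to the right
calibrated_eye = (0.0, 0.0, 0.0)  # eye position the system was calibrated for
shifted_eye = (0.005, 0.0, 0.0)   # the HMD slips 5 mm sideways on the nose

u0, v0 = project(real_point, calibrated_eye)
u1, v1 = project(real_point, shifted_eye)
print(abs(u1 - u0))  # horizontal overlay error on the virtual image plane
```

With these hypothetical numbers, a 5 mm slip shifts the correct overlay position by 2.5 mm on an image plane 1 m away, roughly a 0.14 degree angular error, which is already a visible misalignment between the real and virtual object. This is the kind of drift that an automatic, continuous calibration would correct without re-running a manual procedure.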


Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Rough Transcript

[00:00:05.412] Kent Bye: The Voices of VR Podcast.

[00:00:12.113] Yuta Itoh: I'm Yuta Itoh from TU Munich. I'm presenting my research on head-mounted displays, but not like Oculus. It's more like a Google Glass, so you see the real world through the display. What I want to do is like Iron Man. You know, you wear an HMD and you see the perfect overlay of the virtual thing, like a fancy robot over the real robot, right? But our reality is not like this science fiction, so we really need to align the virtual object and the real object to your field of view, so to your eye. So whenever you want to realize such a cool, fancy sci-fi future, you really have to calibrate the display and your eyeball. That means your real physical 3D eyeball position and the display position have to be computed and aligned automatically and dynamically. That's a really difficult problem, because you first have to track your eyeball, and then you have to track the display, which is semi-transparent, like a screen just floating in mid-air. So that's what I'm working on, and this time I was talking a bit more about the display. Because the display is like glasses, it has optical distortion and the image is distorted, so whatever image you show is not matched to the distorted view of the user. So I computed this distortion by using cameras and so on, and then I solved this thing. So that's what I did.

[00:01:39.452] Kent Bye: So it sounds like, you know, at IEEE VR, there are a number of papers that are submitted, and some of them get into, like, the Transactions on Visualization and Computer Graphics. And it sounds like some of these HMD calibration techniques that you're working on were pushing the field forward enough to be included in some of these publications. So maybe you could talk a bit about, you know, what kind of innovations you're bringing to the field in terms of some of these calibration techniques.

[00:02:04.104] Yuta Itoh: So I believe this kind of automatic eye-position display calibration technique is fundamental for any AR application which uses those displays. I'm sure those big companies, Facebook or Google, are, I think, probably working on this kind of eye tracking and display calibration, but at least in academia we haven't done this before. So that's why we presented these fundamentals.

[00:02:27.545] Yuta Itoh: Not applications, but fundamentals, so that everyone can use our platform. That's why we're also trying to distribute our source code and so on, and hopefully some datasets and so on.

Kent Bye: I see. And one of the papers that you were also a part of seemed like it was using the reflection of a checkerboard off an eyeball. Maybe you could talk a bit about, you know, some of these other techniques for how you're actually physically tracking the eye and calibrating everything.

[00:02:50.917] Yuta Itoh: Oh yeah, that's a good question. Actually, that's something one of my colleagues is working on. So I was working on this display modeling, and my colleague is working on the eye tracking. The display, the HMD, has an image on the screen, right? And this image is reflected on your eye. So the cool idea of his work is that you can track this reflected image on your eye's surface, the cornea it's called, and that gives you much more precise registration than just trying to track your iris or the circle or some points on your eye.

[00:03:25.740] Kent Bye: And so it seems like with AR glasses you also have to track the space that you're in if you want to actually put the objects into the space. Is that something that would also require very precise calibration, from the eyeball to the glasses, and then from the virtual objects into the 3D space?

[00:03:42.167] Yuta Itoh: Yes, yes it is. So basically the point is how you fuse the virtual world into the real world and how you make them consistent. In our case the consistency is spatial, the position of the real 3D object and the virtual object. But whenever it comes to tracking, there is also temporal consistency, which means the delay or latency. You also have to shorten the rendering time and the tracking time so that the user does not perceive this difference. That's also a really important field.

[00:04:08.590] Kent Bye: Yeah, and in terms of, like, virtual reality head-mounted displays, the frame rates that they're shooting for are anywhere from 90 to 120 Hz. For augmented reality, is it in the same realm, or does it need to be even higher fidelity?

[00:04:21.061] Yuta Itoh: Oh, it has to be higher fidelity, I think. Like, there is one paper about a projection-based camera system, and they noticed that at 10 milliseconds people wouldn't notice, but above 10 milliseconds people would notice the delay.

[00:04:35.133] Kent Bye: So it's really hard. And so where are things at now in terms of the latency and the frame rates?

[00:04:42.425] Yuta Itoh: Oh, that's a hard question. I don't know. Our system hasn't considered this latency, but I haven't seen any augmented reality system where the delay is really imperceptible, where I don't perceive it. So I have no idea, but that's a really challenging problem.

[00:05:00.773] Kent Bye: Yeah, I mean, it seems like the virtual reality head-mounted displays will be hitting the consumer version perhaps in 2015, certainly in 2016, but in terms of augmented reality, like a consumer version of AR, how far out do you think we are until we start to see something that is ready for consumers?

[00:05:19.026] Yuta Itoh: Perhaps five years. I mean, now a good big wave of VR is coming in, and people are also shifting to investigating a lot of these augmented reality HMDs, not like Oculus. So yeah, let's see. That's why I'm also working in this field, and let's see in a decade what big companies or Kickstarter companies do. The other good thing now is rapid prototyping. For example, at VR this time, there are also some demos where they built their own HMD by 3D printing. This will push us toward the realization of AR displays.

[00:05:50.987] Kent Bye: You know, in terms of augmented reality glasses displays, are there any devices that you've seen? I guess I'm just curious what is the best one that you've seen so far.

[00:06:00.874] Yuta Itoh: Oh, the best one depends on what you mean by the best. There are some expensive professional displays, like from some Israeli companies; they produce quite wide field-of-view displays, but at a cost. On the other hand, there are consumer products in Japan; they produce quite cheap, inexpensive ones, but with a limited field of view and so on. So it's a trade-off; it's really up to how much you can spend. But in terms of the consumer market, who would pay ten thousand or some thousands of dollars for your display? No one would pay that. So you really need to think about who you will reach, what kind of consumers we have to reach.

[00:06:40.595] Kent Bye: What's the range of fields of view that you've seen? I've tried the Meta AR glasses and they seem to be very small, maybe 20 or 30 degrees field of view. But I'm curious about the range of glasses that you've seen, from the consumer to the professional range.

[00:06:57.667] Yuta Itoh: So as you said, Meta, they are actually using an Epson OEM display panel, which is only like 20 degrees. And the Israeli company I mentioned is about 40 or 60. And recently in academia, I mean in the research area, NVIDIA produced a special concept display which has like 100 degrees. So these are the kinds of existing technology, but I think you need, like Oculus always said, over 100 degrees. Yeah.

[00:07:26.472] Kent Bye: Yeah, and you know there's HoloLens that's on the threshold of potentially coming out soon, and then also Magic Leap, who's been very stealthy. They recently released a concept video which, you know, looks to me to be pre-rendered and not actually through the glasses, or I guess they're shooting photons into the eyes with their digital light field. But I'm curious, from your take, in terms of evaluating some of these concept videos, how many leaps of the academic world would they have to make in order to actually do that? Or what's your sense of how viable what they're showing is so far?

[00:08:00.948] Yuta Itoh: Yeah, those promotion videos, I mean, I always get disappointed, of course, because they are mostly pre-rendered. So I'm a bit worried about this situation, because, you know, I understand that as a company you have to promote your product and attract people. But once customers buy the real product, the expectation is already quite high, and what we can bring is quite low. So then they won't buy the product again, and the VR wave will go away somewhere. So I think we should always show the real current demo or something, not a very fancy vision. Or you have to really label it: this is a demonstration of virtual rendering, computer graphics, and this is our current stage, and then please support us; we will try to push this to that level.

[00:08:47.026] Kent Bye: And so what type of experiences do you want to have in augmented reality? And what's sort of motivating you to keep involved in this field?

[00:08:53.635] Yuta Itoh: Oh yeah, lots of science fiction, starting from Back to the Future and the recent Iron Man. Those science fictions are our motivation to realize the future. I really want to make an indistinguishable augmented reality, where you naturally see those virtual objects in the real scene, because it's cool. I think I'm a geek. But those visions drive us to move the actual field forward. It's also the truth that there are many steps toward that. So I think it's always quite important to have a vision, but don't sell the vision as if you already have it. That's my point.

[00:09:29.221] Kent Bye: Great. And finally, what do you see as the ultimate potential once these augmented reality technologies are available? What will they be able to enable?

[00:09:38.594] Yuta Itoh: Yeah, like the smartphone: now it has come into the market, it's among the people, and people use tablets and smartphones every day, and you get a really convenient life compared to before. Or before the internet versus now: with the internet you can search everything, right? But on the other hand, those new devices also produce many problems. Maybe people just don't learn by themselves, they search on devices, and you don't talk to each other, you just chat every day. Like that, once the display comes into our real life, we will find a lot. But we should also prepare and discuss the potential issues, and then maybe we can find the correct direction, or we will notice it after it's launched in the real market. So we have to wait somehow and prepare.

[00:10:21.905] Kent Bye: Yeah, I guess that's the danger: you know, we're already looking at our cell phones so much that if it's just in our eyes all the time, then, you know, will we actually be present to what's happening in the world? So, yeah, how do you see that sort of trade-off, of how to start to point people in the right direction, I guess? Or, you know, is it just a matter of kind of speaking about both the good and the bad?

[00:10:42.112] Yuta Itoh: Yeah, I see. Well, perhaps it's just about how communication styles change, how interaction between people changes with those new devices. With an HMD, we will get a personal display nearby, combined with those sensing systems or, you know, cloud systems. So perhaps you will need to change your lifestyle to talk to people. Maybe you never write letters nowadays, right? And you always talk to your friend via the display. Who knows?

Kent Bye: OK, great. Well, thank you.

Yuta Itoh: Thank you very much.

[00:11:14.161] Kent Bye: See you. And thank you for listening. If you'd like to support the Voices of VR podcast, then please consider becoming a patron at patreon.com slash Voices of VR.
