Here’s my interview with Karl Guttag of the KGOnTech blog, an independent analyst covering consumer XR display devices and systems, conducted on Thursday, June 1, 2023 at Augmented World Expo in Santa Clara, CA. See more context in the rough transcript below.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Rough Transcript
[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the structures and forms of immersive storytelling and the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So continuing my coverage of AWE past and present, today's interview is with Karl Guttag, an interview that I did with him a couple of years ago at Augmented World Expo 2023. So Karl's quite a fascinating character who has really made a name for himself doing these technical deep dives into augmented reality and virtual reality displays, specifically liquid crystal on silicon displays, LCOS, but also a lot of the next generation of AR. He does these super detailed technical deep dives on his blog, KGOnTech. So he's a blogger, but also an independent analyst, and he's been involved in different startups. He's had an illustrious career working as an electrical engineer at Texas Instruments, with some 150 patents, quite a successful engineer in his own right. But now he's really quite obsessed with analyzing XR technology and display technology. So he does lots of really deep technical breakdowns, and he can be quite contrarian on different topics, like not being a fan of video pass-through as something that's going to be viable. And yeah, he just has a lot of really strong opinions, I guess you could say, but they're backed by his technical background, and over time he's been proven right in calling different things out in his judgments on various technologies. And so the context of this conversation was that we were just a few days out from when Apple was going to announce the Apple Vision Pro on Monday, June 5th, 2023. I was talking to Karl on that previous Thursday.
And so we were just a few days out, and everybody at AWE knew that the announcement was coming, with all the reporting and other things happening. We all sensed that it was coming. And he's continued to go out and do all sorts of analysis on that. But I think his real sweet spot is in looking at augmented reality displays. And after you listen to this conversation, you might think, oh, well, it's going to be totally outdated. But there are still a lot of these deeper principles around the physics of these different devices and these different trends. He's got all these aphorisms and sayings that I hear him repeat in his other talks, and it's definitely worth watching some of his talks from AWE, because he is very information dense and just has so much that he wants to get into and talk about, way more than he ever has time for. His blog is really a format for him to go deep into these months-long research projects, these really long reads digging into every single nuance and detail. And so this is a little bit longer conversation, to get an overview of everything. And, you know, for me, I'm not personally all that interested in the hardware side of things anymore; I'm looking more at the experiential aspects. It's not necessarily my wheelhouse to try out all the hardware and have opinions on this or that. To me, it feels like when I went to CES in 2017 and there were hundreds of different devices and companies, and the ones that were really successful ended up getting purchased by the big companies. So I guess my approach is a little bit more along the lines of: once it gets to the point where the technology is being distributed by these big major companies, then I'll take a look at it. For me, it's not as much of an advantage to have strong opinions around the quality of various technologies before they're
ready for the mainstream. Although it is very interesting at Augmented World Expo; it's a great opportunity to try out some of these and form some of your own opinions on them. However, I choose to focus most of my time on having conversations like this and sharing them out to the larger community. Just because, again, I think the way the industry develops is that eventually the big stuff will go from this more enterprise space into the consumer space. And once it gets closer to that, then I'll certainly be trying out a lot more of the different technologies. In the meantime, I look to people like Karl Guttag to evaluate what's happening with these more cutting-edge augmented reality devices. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Karl happened on Thursday, June 1st, 2023 at the Augmented World Expo in Santa Clara, California. So with that, let's go ahead and dive right in.
[00:04:19.352] Karl Guttag: I'm Karl Guttag. I'm actually an electrical engineer by profession, and somehow, accidentally, around 2011, I started a blog called KGOnTech, and the website is called kguttag.com. So, unfortunately, they're not the same, and trying to reconcile that is a big pain. So I write a blog covering mostly see-through AR, display devices, optics devices. It actually started out to be more about projectors. I'd worked on pico projectors. I'd been working on building LCOS devices. I was the CTO of a startup, seven years in a startup that I helped found. I wanted to keep my hand in and just keep busy and, you know, see what's going on out there. So I started the blog, and it started as a hobby. It still is more or less a hobby. Before that, in my real profession, I've really had two full careers. The first 20 years I worked for Texas Instruments, so I have the big-company experience for 20 years. I was the youngest Fellow in the history of the company. I developed the original sprite chip, the one that everybody else copied. That was my chip, the TMS9918, used in the ColecoVision and the TI home computer. After that, I worked on a couple of 16-bit microprocessors, and followed that up with the TMS340 graphics processor. So I worked on the very first fully programmable graphics processor, and did that from about '82 or '83 up until around '90. We had a moderately successful chip. We were the leading graphics accelerator for, in those days, CAD and Adobe Photoshop. Later, I worked on an image processor, the TMS320C80, the MVP, which was a very early image processor. It had four DSPs and a RISC processor all on one die. So we were already replicating CPUs on a die back then. And then not very long after that, I left TI and was recruited by Jerry Rogers, the guy who founded Cyrix.
He asked me if I wanted to be VP of a startup working on an LCOS near-eye display. And I thought, well, there were a lot of politics going on at TI at the time, and if somebody wants me to be a CTO of a startup, I'd saved my pennies, I guess I can go off into the startup world. So since '98, I've been in the startup world. That's where my second career began, and I got into displays. That's where I learned about LCOS. I didn't really know anything about optics back then; I've kind of picked up the optics on the go. At that startup, my timing was impeccable. As you may or may not know, I left in '98, and then the dot-com bubble burst. So you couldn't raise spit, and particularly spit for a see-through display of any type. There was no money to be raised in the early 2000s, particularly in the Dallas area, where I was at that point. And so after that fell apart, I started developing my own LCOS design with a small company. And that led to me getting some interest from some big companies, because I was an IC designer. See, I really got into this from my IC design. I designed CPUs. In my time at TI, I was always designing CPUs. And I was also involved in the video RAM. I'm actually probably most famous for the video RAM, which led to the graphics RAM and also led to the synchronous DRAM. So I got involved in all that stuff back in my TI days. But now I was working on building the display device, the backplane, for driving liquid crystal on silicon. I did that, and I formed a company called Syndiant, where I was CTO and co-founder, and I was there for about seven years. Good experience, but it just wasn't going anywhere. We worked on pico projectors. One of the things I learned from pico projectors is that the customers can be very wrong, because every cell phone guy said they were going to put a pico projector in their cell phone.
And the specs kept changing, they kept asking for more, and then eventually it just kind of disappeared, even though every cell phone company said they wanted a pico projector. So we were developing LCOS for pico projectors, after we had initially developed a chip going after rear-projection television, which went away just about the time we got that chip together. So my timing was kind of impeccable in the startup world. Anyway, I did that for seven years. The pico projectors weren't selling, so I decided to get out into the world. And some of those same LCOS chips were also being used in AR displays. So I started writing my blog about projectors and technology, and I was actually going to talk more about my early work in video game chips and processors and stuff, but somehow it kept drifting more and more towards AR and augmented reality. Sorry, that's kind of a long story in there, and you can edit that down as much as you want, but that's the story of how I got into this. And then I've off and on consulted and joined companies and startups, mostly in the startup world, since '98. I've been CTO of like three companies. I've been in four or five of them, at various levels and stages, about three of them since I formed KGOnTech. So I've kept KGOnTech going, which has led to a little bit of work, you know, consulting and stuff like that, but that's about it. So anyway, that gets you up to where I am with KGOnTech.

Kent Bye: Okay, that's a lot of really helpful context on your journey into this space, and a couple of follow-up questions. First of all, are you currently working with any AR companies, or are you more a consultant?

Karl Guttag: Right now, I just do presentations. I give them general advice and stuff, but I don't have any full-time consulting gigs right now. There's no full-time work I'm doing right now.
[00:10:02.608] Kent Bye: So you're not working for any AR company in any capacity, and you're not even doing full-time consulting, yeah.
[00:10:07.733] Karl Guttag: No, I have no full-time consulting. Mostly I just give presentations, mostly state of the market. I sometimes give feedback. I have on occasion had an investor, a VC or something, come to me for some due diligence and things like that. So I get these kind of random gigs. Mostly what people come to me for is an overall appraisal of the market. You know, what's the market really like? I'm kind of well known for not sugarcoating stuff. So, you know, for most people in this industry, most people everywhere, there's a lot more money to be had by telling a guy, give me a bag of money and I'll solve all your problems. That's not what I do. I tend to give them as straight a story as I can on what's really going on in the market, what's happening, what people are doing. Thanks to my blog, I have access to almost everything that's going on in the AR and VR world, mostly AR though, where I focus.
[00:11:00.612] Kent Bye: Is that because you've cultivated a readership of folks that are in the industry, and they reach out to you, or they want to be in conversation?
[00:11:06.056] Karl Guttag: It kind of happened incrementally. I worked on LCOS. LCOS started getting designed into a lot of AR headsets. I started figuring out how the optics work. My first big thing on my blog that really went kind of crazy was with MicroVision. MicroVision kind of got my blog going in some ways, because I knew they were not telling the truth about lasers and some of their laser scanning stuff, so I wrote stuff about it, and I got into one of their SEC filings. They actually had to file an SEC filing denying what I said. What I said was true, but they were calling me a soothsayer, and so I ran a whole series. I said, well, if you're going to call me a soothsayer, I'll take that as a mantle, and I wrote a whole series of articles about the soothsayer. You know, that was not a good move on their part, but I guess they didn't want to use my name, so they did that. Another big one was when I figured out, I knew what Himax's LCOS looked like, and I recognized the Himax LCOS chip in a Google Glass, just from seeing the chip. And so I reported that Himax was working with Google on Google Glass. And I think it caused Himax stock to go up like $200 million in one day. And I didn't own a dime of it. I had no investment in it. I had no interest in it. If I had known, I would have bought stock. But anyway, the market cap went up $200 million in one day just on that rumor. I knew Himax. Himax was really a display driver company. I had no reason to believe that that information would actually affect their stock at all, because the LCOS was a teeny tiny part of the entire Himax business. I knew about them because I had tried to sell my design to them when I was off on my own, before I formed Syndiant. So I knew kind of what the company was. So that was that.
Then I heard about this company called Magic Leap that got, you know, all these billions of dollars. And a lot of times what I end up doing, certainly in the early days, was somebody would ask a question like, what are they doing? And I'm like, there's this company getting billions of dollars? What the hell? In AR? Doing this? So I dug in, and it turned out they were doing LCOS. And I figured it out. Like two years before they came out, I was able to predict almost exactly what they did. And that's documented, you know. So I wasn't real impressed. I could tell what they were doing. What always gets me is when guys start fibbing. I give small little startups maybe a little more reason to be optimistic, but bigger guys who've got billions of dollars shouldn't be fibbing. And they were telling some whoppers that I would call them out for, and they got mad at me, which, you know, actually caused me to do more. Because if they're getting mad, well, I was telling the truth about them. I think I get some level of respect, because when I do criticize people, I criticize them for things that are measurable, tangible. Like the cord. One of my big things is I did a mock-up. I showed that, based on what they were doing, the field of view was going to be a problem. They were going to block a lot of light. For an AR headset, they're really very VR-ish. They block your peripheral vision off. They give you a tunnel-vision view. And like I say, the original device blocked about 75% of the light, and the new design blocks 80% of the light. That's a lot of light blockage. You really should block no more than 50%, and you'd really like to be at least 80% transparent. But one of the big things I did, I even did a 3D-printed mock-up of their thing, based on pictures and stuff I had. And this cord thing, I said, was going to be a disaster, a snag hazard.
They had to wire it in on both ends, because they had so many wires and so much power going through that cable that they couldn't afford connectors on either end. And I thought that was just terrible, that they wired it in. I think it'll still be interesting when we see Apple. The rumors are Apple's going to have one that's magnetically connected. I'll be really curious, because it's really hard to make a cable that will disconnect where the force to disconnect it is not a complete annoyance. If you have too little force to disconnect it, it'll be a constant annoyance, falling off all the time. And if it's held in too tightly, well, then it could rip your head off. Or that cord could get caught in a machine or something, particularly if you're in an industrial application, and pull your head into the machine, which is not a very good thing to happen. So I'm not a big fan of cords dangling down out of headsets in the AR world. I tend to believe they should be self-contained. There's an exception to that, but I tend to believe you're better off. I've been very critical, and after I did Magic Leap, I guess the next big one was HoloLens. I did a very big series showing how bad the image quality of HoloLens 2 was. HoloLens 2, going back to the very early days of my blog, used MicroVision-style laser beam scanning, although they wouldn't really admit it; it was obvious. And I'm not a big fan of laser beam scanning. People think I'm just anti because I'm in LCOS. Well, I haven't been in LCOS since 2011. I don't have a reason to want LCOS or any of these technologies to necessarily win. It's just that laser beam scanning is an absolutely terrible way to do displays. It's an interesting way to do some of this LiDAR, and there are other companies out there doing laser beam stuff for sensing and all, which does make some sense. But for displays, laser beam scanning is just a terrible way to go.
And I've outlined on my blog why it's terrible. So all that knowledge from my very early days of the blog all of a sudden became applicable when HoloLens 2 came out. I did kind of a shootout when HoloLens 1 and Magic Leap 1 came out, and then later, when HoloLens 2 came out, I did a complete analysis of its image quality, or lack thereof. Because it's absolutely the most abysmal image quality. It seemed like they hired a bunch of people from MicroVision, and I think, to protect their stock options, they encouraged the company to go with laser scanning. But it was foolish. It was absolutely foolish. They spent a ton of money to make a really terrible image. And all you have to do, because I have both headsets side by side, is take the HoloLens 2 off and put the HoloLens 1 on, and you say, oh my god, it's clear. It's like you can see again. You don't even appreciate it; you kind of get trained to how bad that image is. And when you put the HoloLens 1 on, it looks better, because the image is sharper and clearer and whatnot. They wasted a ton of money on that laser stuff, and it never gets there. They've been trying to make laser scanning displays work since, well, Furness, back in the early '90s or maybe even late '80s. I know Furness is a big laser scanning fan, and he and I disagree on that, because I just don't think it works. There's an issue if you try to direct-scan into the eye, so-called focus-free: well, then you have no pupil.
You have no eye box, so there's zero way of seeing it. If you go into a waveguide, you get all kinds of other problems, as HoloLens proved. So neither was a good way to go, so I'm not a fan of laser scanning. Anyway, that gets you to HoloLens. Since then, I did kind of a spread on Meta, because of all this activity going on with pancake optics and Meta buying up companies. SadlyItsBradley and I have had some back and forth; that's how I met SadlyItsBradley, Brad Lynch. There was this company I knew from my Syndiant days, I probably met him in 2005 or something, I think it was ImagineOptix or something like that. Anyway, for some reason, by accident, I had a reason to look at their website, and when I went back to find something else from that same website, it had gone. The website had disappeared. So I got curious and started searching around, and that's how I found Bradley, because Bradley had reported on that optics company being used by Valve. So I then went to somebody I knew who was a founder. I knew him personally, and I went to him, and he said he couldn't talk. That just gets you more interested, right? The website disappears, the founder can't talk. So finally, I talked with Bradley some more, and found out there had been a lawsuit, a lawsuit between Valve and the company, because Valve basically paid to get some stuff done. They thought they had bought exclusivity or something, and they didn't get it, something like that. I don't know all the ins and outs of the lawsuit; Bradley was following it. But that's how Bradley and I got connected, because I was chasing that down. And then one day I had to tell Brad I had another source, because I'd been feeling around with my other sources, asking, who bought them? What happened to them? And the word came out; one of my sources said it was Meta that bought them.
And I told Brad, and Brad, he does not agree with this, but I swear to God, it was like the scene from Star Wars where Darth Vader says, I am your father. It was like that. I told him, Meta bought them. He was like, no. Because, you know, Meta is almost the Darth Vader to Valve. I'm a bit more agnostic about it. I try to be fair to people. And so what happened was, I got into the Meta Quest Pro because I was starting to look at pass-through AR stuff more. And on the Meta Quest Pro, clearly, the pass-through was an afterthought. It looked like a couple of engineers said, well, we've got the cameras in the wrong place, because we've got one color camera in the middle and two IR cameras. Can we possibly make this work as pass-through? Well, we can do something so you can find your mouse and whatnot. And I think marketing ran off with it, saying, we've got the best, clearest pass-through. They were talking like this was really working, and it was absolutely terrible. So I'm going to be talking tomorrow, actually, about optical versus pass-through AR. I think people are making them out to be the same, when the overlap between them is actually not very good. I always like to say, when you're doing optical AR, the important thing is seeing the real world. That's the number one thing. When you're doing video pass-through AR, the importance is on the virtual world. And, you know, it's sort of a grass-is-greener thing. People look at optical and say, this is really hard to do, and it is. Optical is much, much harder to do than pass-through or VR. One guy said, it's not twice as hard, it's like 10x as hard.
Because you have to preserve the real world while you're doing it. You can't put the display in front of you, so you've got to put the display someplace it doesn't naturally want to be optically. So you've got to figure out all kinds of complex optics to route it around to your eye. Whereas with VR, you plaster it right in front of the eye. You can go with a moderately big display, whereas you've got to be really tiny if you're trying to build optical stuff that's fairly lightweight. But I oftentimes say that basically everything that's easy with optical AR is hard in pass-through AR, and vice versa. So people get really hung up, I think, on occlusion. I believe it violates the laws of physics to try to do hard-edge occlusion optically, where you try to stop a pixel, to block those pixels of light. We know Magic Leap 2 did what we call soft-edge occlusion, but a single light-blocking element, I believe, blocks about 6,000 pixels. It slightly dims 6,000 pixels. Trying to block one pixel, you block about 6,000. So it's very crude, it's very soft. You get kind of a dark halo around edges and stuff, because they can't block accurately, and I believe it violates the laws of physics to do that accurately. Whereas in the case of pass-through AR, it's a piece of cake. You just swap a pixel. You literally take the camera feed and decide which pixel you take. On the other hand, when you do optical AR, I get infinite depth, everything works the way your eye behaves, the scaling's perfect, everything in the real world behaves like it's supposed to behave. Whereas when I do pass-through AR, or pass-through mixed reality, it's a flat depth, or it's got funky things, and it doesn't behave right, and the scaling's never perfect.
And when you look at your hands, there's a little bit of a scale ratio, and maybe it doesn't behave well in perspective, so as you move your hands in and out, everything's a little bit off, and your brain is sending you messages: this is screwed up. And if you do that for long enough, you won't feel too good. The visual system has a way of getting its revenge if you send it a bunch of conflicting information, in the form of nausea, headaches, tiredness, and so forth. Anyway, that kind of gets you the entire history, brought up to today, including a little bit of my presentation from tomorrow.
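The "you just swap a pixel" point about pass-through occlusion can be sketched in a few lines. This is an illustrative toy compositor, not code from any shipping headset; all names and values here are made up for the example:

```python
# Toy sketch of hard-edge occlusion in video pass-through: compositing
# is a per-pixel choice between the camera feed and the virtual layer.
# (In optical AR there is no such per-pixel choice, which is the point.)
def composite_passthrough(camera, virtual, mask):
    """Where mask is True, take the virtual pixel; elsewhere, the camera pixel."""
    return [
        [v if m else c for c, v, m in zip(cam_row, virt_row, mask_row)]
        for cam_row, virt_row, mask_row in zip(camera, virtual, mask)
    ]

# Toy 4x4 single-channel frames.
camera = [[100] * 4 for _ in range(4)]   # "real world" feed
virtual = [[255] * 4 for _ in range(4)]  # rendered content
# Occlude only the center 2x2 region, with a perfectly hard edge.
mask = [[1 <= r <= 2 and 1 <= c <= 2 for c in range(4)] for r in range(4)]

out = composite_passthrough(camera, virtual, mask)
```

The hard edge costs nothing here, whereas the optical soft-edge occluder described above dims on the order of 6,000 display pixels per blocking element.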
[00:24:05.278] Kent Bye: Yeah, and you had mentioned the acronym LCOS. What does LCOS stand for, and what is that technology?
[00:24:10.963] Karl Guttag: Yeah, okay, very good. It stands for liquid crystal on silicon. You know, most people are familiar with liquid crystal displays, LCDs. Well, with liquid crystal on silicon, we take a semiconductor chip, an almost standard CMOS semiconductor chip, and the top layer of metal has to be really flat. That's the only difference between normal CMOS and LCOS on the silicon side: we put a very, very flat layer of metal on top. It's very thin, too. It turns out the thinner you make aluminum, the shinier it is; if you make aluminum thicker, it tends to crystallize and get rough. So you put a very thin layer of metal on the top that's really reflective. There's a lot of complexity to it, but it works a little bit like an LCD. However, with liquid crystal, and this is true of almost all of it, whether we're talking liquid crystal shutters, displays, whatnot, the response time of a given family of liquid crystal is proportional to the square of the thickness. So if you halve the thickness, it goes four times faster. Because we're reflective, and we're also building these small chips, we can go with very thin layers of liquid crystal. So the LCOS switches much, much faster than typical LCDs, allowing us to do field sequential color. Now, I hear there's a rumor somebody's talking about doing field sequential color transmissive. They've tried that many times, and the problem they always have is that square-law thing. They have to be twice as thick, because with a reflective pass, the liquid crystal twists the light halfway on the way in and halfway on the way out. So you only need half the thickness of liquid crystal to get the same twisting effect. People think that LCDs block light. They don't. They change the polarization of light. They are illuminated with polarized light, and what you're doing with the liquid crystal is changing the polarization of it.
And by changing the polarization, you then feed it to some optic, some polarizer that we call an analyzer, in the output path, that will either reject it, mirror-reflect it, or absorb it. Normally, for bright stuff, it's a mirror. If it's really dim, you can sometimes absorb it, but if you absorb it, you've got that heat. So normally, most of the time, you end up with a reflective polarizer. So LCOS works a lot like an LCD, only it's a mirror reflection. What happens is, because it's that fast, most LCOS does field sequential color. That means we go red, green, blue, red, green, blue. DLP is most famous for doing field sequential color; their so-called single chip, not what they use in movie theaters, but what they used in data projectors and televisions for a while, was field sequential color. With LCOS, typically, particularly when we're talking about these small chips doing field sequential color, you now only need one element, one mirror, really an electrode mirror, because you've got a chip under it that's controlling that electrode that's switching the liquid crystal. Because you only have one of those per pixel, you now do red, green, and blue all with that one element, and switch the light really fast.

Kent Bye: What's the frequency of that switch, typically?

Karl Guttag: Usually you'd like to be in the high hundreds of hertz to in the thousands. You typically would like to have at least three cycles. So if you're doing 60 hertz and you've got three fields, you'd like to be 120 hertz times three. So you'd like to be at least 360; 360 is kind of the bare minimum, and you'd like to get to 540 or even higher on really high-end stuff. For a while, I'd heard that Compound Photonics was supposedly trying to get to over a thousand.
But you'd like it to be up there, and the reason why is, you may notice with LCOS headsets, for example the original HoloLens, if you move your head around... the HoloLens 1 used a pretty slow, old LCOS; it was not one of the best LCOS devices even in its day. But if you move your head around, you will see a breakup of colors. Some people call that the rainbow effect, though now other people are calling other things the rainbow effect; field sequential color breakup is the technical term for it. DLP has the same issue with a single chip. The faster you can switch, the less it tends to break up. There are other tricks you can do, things you can do in the algorithms. For example, I know Tilt Five does some motion stuff, so they're actually updating the color fields. If you keep the color fields updated, you can prevent some of that, or reduce the effect. So there are some digital tricks you can do to reduce the effect of color breakup as well.
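The two rules of thumb above, the square-law for liquid crystal response and the field-sequential rate arithmetic, can be sketched as a toy calculation. The numbers are the ones quoted in the conversation, not specs from any real device:

```python
# Toy arithmetic for the two LCOS rules of thumb described above.

def relative_response_time(thickness_ratio):
    """Liquid crystal response time scales with the square of the cell gap."""
    return thickness_ratio ** 2

# A reflective LCOS cell needs roughly half the liquid crystal thickness
# of a transmissive cell for the same twist, so it switches ~4x faster.
speedup = 1 / relative_response_time(0.5)

def field_rate(frame_hz, color_fields=3, repeats_per_field=2):
    """Field-sequential switching rate: frames x color fields x repeats."""
    return frame_hz * color_fields * repeats_per_field

bare_minimum = field_rate(60, repeats_per_field=2)  # 60 Hz x 3 fields x 2
preferred = field_rate(60, repeats_per_field=3)     # the higher-end target
```

This reproduces the figures quoted: halving the cell gap gives a 4x speedup, and 60 Hz frames with three color fields shown two or three times each land at 360 Hz minimum and 540 Hz preferred.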
[00:28:44.332] Kent Bye: Yeah, and so as we're here at Augmented World Expo 2023, I know that some folks are showing, like, AntVR, and then there's XREAL, and so I'd love to hear some of your hot takes on some of the different stuff that you've seen on the floor at Augmented World Expo.
[00:28:59.184] Karl Guttag: Okay. Yeah, well, XREAL's shipping a lot of product. They claim they're shipping, I think, about 100,000 units a year or more now, which, if true, is probably more than half the market. I don't know anybody else shipping anything close to those numbers. I don't think the rest of the market put together is shipping 100,000 units, the best I can find. You're just not seeing them everywhere, but they're shipping a lot. The issue for them is that right now they're shipping a birdbath design. I've not seen their new stuff. I'm hopefully going to meet with them tomorrow, so I don't know what their new and future stuff is. But their current stuff is a birdbath, and the birdbath is a very old design. It goes back to something very similar to what Osterhout Design Group, ODG, did back in the mid-2010s, maybe 2015 to 2017. They were doing that kind of design. The good and the bad of the birdbath is that it works well with OLED. You get a really good image out of an OLED. You can also do it with LCDs, and you can also do it with LCOS. But you get a really good, contrasty image out of it. The problem is that it's not particularly bright. It's not all that efficient, and the see-through is maybe 25%. So the birdbath kind of wins the sprint. And you see, there must be two dozen guys doing birdbath designs now. Nreal kind of started the wave, really, with a design that was very similar to what ODG did several years before them. And now you've got lots and lots of similar designs with the birdbath. But it also runs out of steam, because they're all going to be about 25% transparency. They're all going to be about 100, 200, or 300 nits. Basically, you get about 10 to 15% of the nits out that the display puts out, and that's it. That is, unless you go to LCOS, which can put out huge numbers of nits. There's a bit of physics in there, which would take forever to explain, particularly on radio.
But LCOS starts with a really tiny LED that illuminates it. And when you're trying to get into a waveguide, you know, these really thin pieces of glass, basically, if the light source is bigger than the hole you're going through, you're going to throw away most of the light. Well, on an OLED, the light source is the whole display, so it's really big. Whereas with LCOS, it turns out that the étendue of the light is set by the illumination. The LCOS acts like a mirror and does not affect that étendue, so you can expand the light up, illuminate the LCOS, and then crunch it back down and feed it into a waveguide. That's why you see LCOS in almost all the waveguide designs now. DLP was in there, but I've seen almost everybody move away from DLP: due to cost, resolution, the fact that there are multiple LCOS suppliers, just a lot of factors. It's also less power hungry. DLP has advantages at really bright displays, theater projectors and stuff like that, but it feels like it's out of its element in the near-eye display world. You've seen less and less of it. The only guy I know still using it is Snap, and Snap's basically still using about a five-year-old WaveOptics design. The Snap Spectacles that were downstairs, that's like a five-year-old design, really old. And before they were bought by Snap, they were talking about moving to LCOS too. Almost everybody else I know of who was using DLP has switched to LCOS for anything involving waveguides or the like. So anyway, that kind of hits the birdbath. So there's a lot of birdbaths out there. Lenovo's got one. Oh, God, there's just so many out there, you can't count them all. HTC has one. And they're all about the same performance class. Like I say, they win the sprint in having really good image quality. They have OLED-like image quality, but they're kind of set at 50 degrees. They're also about an inch thick. If you want a bigger field of view, they get thicker.
On a smaller field of view, you can make it smaller. That's basically what the Nreal Light did: they made the field of view a little smaller, and they made the thing a little bit smaller and lighter. But if you want a big field of view in a birdbath, you've got to make it bigger. The efficiency is kind of locked in because of the physics of how you have to move the light around and the beam splitter and all that. So it kind of wins the sprint, but then what do you do next? It never gets you to this kind of lightweight glasses form factor. It's never bright enough. For example, the Nreal — or Xreal now — I don't know what the latest ones are, but I measured the one they used to have at about 120 nits. Whereas Lumus, with a 50-degree field of view, is doing like 3,000 and 4,000 nits with LCOS. That's through a waveguide, which is not particularly efficient. So it's just orders of magnitude difference in the brightness. And that's for about the same power. Put about a watt into each one: in a birdbath with OLEDs, you're going to get hundreds of nits out; with LCOS and a waveguide, thousands. And by the way, DigiLens at 30 degrees is also getting 2,000 or 3,000 nits out — a little smaller. The Lumus reflective waveguides are a bit more efficient; they can do a bigger field of view for the same power and the same nits. You've got to bring everybody to the same table. You've got to talk power, field of view, and brightness. You've got to lock them down on all three. You can't just talk about two of those, because then they can fudge the other one — the other one could be way out. But anyway, the waveguide guys using LCOS are talking thousands of nits, whereas the birdbath guys are talking hundreds. I don't believe that with 100 nits, or even in the few-hundred-nit range, you're good for outdoor. Outdoor has a very high dynamic range of brightness, light to dark.
So even if you darkened it — well, you'd have to darken them so much that you could barely see. If you went into a shadow, you couldn't see what's going on. You'd be wearing welder's goggles. So really, you have to have, I believe, at least 1,000, preferably 3,000 or 4,000 nits in the display. Even if you're going to do dimming or anything else, you need to have thousands of nits to play with in order to do outdoor at all. And so, anyway, that's kind of how they get in a box. Ant Reality, actually — they gave me a shout-out at the show because I had done an article on them. But they're basically a double birdbath, to a degree. They have two displays feeding into a prism that kind of looks like a birdbath. It's kind of like a prism thing that merges the two birdbaths together.
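The brightness arithmetic behind this birdbath-versus-waveguide comparison can be sketched in a few lines. All of the display-nit and efficiency figures below are illustrative ballparks drawn from the rough numbers quoted in this conversation, not measured specs for any particular product:

```python
# Rough nits-to-the-eye arithmetic for the two optical architectures
# discussed above. Figures are illustrative ballpark numbers from the
# conversation, not measurements.

def nits_at_eye(display_nits: float, optical_efficiency: float) -> float:
    """Luminance reaching the eye after losses in the combiner optics."""
    return display_nits * optical_efficiency

# Birdbath + OLED: the micro-OLED panel tops out around a few thousand
# nits, and only ~10-15% of that survives the beam-splitter path.
birdbath = nits_at_eye(display_nits=3_000, optical_efficiency=0.12)

# Waveguide + LCOS: the LED-illuminated panel can emit on the order of
# a million nits; even at well under 1% waveguide efficiency, thousands
# of nits still reach the eye.
waveguide = nits_at_eye(display_nits=1_000_000, optical_efficiency=0.003)

print(f"birdbath+OLED:  ~{birdbath:.0f} nits at the eye")   # hundreds
print(f"waveguide+LCOS: ~{waveguide:.0f} nits at the eye")  # thousands
```

This is why, as Karl says, you have to pin down power, field of view, and brightness together: any one of them can be traded against the others.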
[00:35:30.572] Kent Bye: Which gives it a little bit of a wider field of view, right?
[00:35:33.235] Karl Guttag: Yeah, it gets you to a 100- to 120-degree field of view. I mean, that's pretty impressive. And you'd expect to see a seam in the center. And it is a little tricky on your eye. If your eye is not centered really well, you might see the seam. It's a little bit like Fresnel lenses. Have you ever seen Fresnel lenses in VR? If your eye gets off center a little bit — you know, the biggest advantage of pancake optics, I think, the god rays are one thing, but the biggest thing is you get a much bigger sweet spot for the eye. The sweet spot on a Quest 2 is like nothing. You just get a little bit off and your eye ends up going through one of the Fresnel rings, whereas pancake optics have a big sweet spot. So they're a little critical: if you're not exactly lined up right, you might see the separation. But when you look at it on paper, you'd say, how is that going to work? It actually looks better than you'd think. So Ant Reality's got there, but they're also back a little bit like the birdbath — same problem. They've got a lot of similar issues with respect to throughput of the display to the eye and from the real world in. They block a lot of light, like a birdbath. So it has a lot of the same drawbacks as a birdbath. But they are able to get to 120 degrees. They're the only guy doing anything see-through that I know of that can get to 120 degrees. Although I do believe they're fudging a little bit — I think what they're actually showing on the floor may be closer to 100, but it's still remarkable. They're also doing 80 degrees, I believe, with just a single display. So they're getting some pretty interesting stuff up there. Let's see what else. Like I say, my personal favorite at the show, two years running now, is Tilt Five. I really love Jeri Ellsworth's dedication to it. She's an inspiration to the whole industry. And it really works.
I mean, it's not the best image quality by far. It's using, golly, probably — I hate to say it — five- to seven-year-old LCOS technology. I mean, it's really old stuff, due to all the history of what it took her to get it done. But it has some of the best feedback to the human, of the visual response. You really feel the 3D depth really well. I bought one myself, by the way, with my own money — I wasn't given one, I bought one myself. But you really do feel the 3D depth. There are some tricks in the way it works, and you could do a better job than what they're doing, too. I mean, I kind of understand the physics of it now, but it works really well. And most people — I call it the "oh, wow," because I bring people to see it and they always come away saying, oh, wow. It has a really good feedback loop with your brain, I think, in the way that things behave right, like if you put a real object there. Now, they do require that special reflective mat, and that's a big drawback for a lot of these guys. But for the whole application set — what I call sand tables, like military, or if you're playing war games and stuff like that, where you want to have a tabletop thing, or all kinds of tabletop games — it does work best when you're at, give or take, a 30- to 45-degree angle. You've got to have some angle to the table. It doesn't work straight on. So it's really a tabletop-type design. That gets into all the physics of it. But that's one of my favorite things, always, here at the show. Let's see what else is here. Porotech — I've been very interested in them for a while. I visited them over in the U.K. a couple of times. This gets back, by the way, to what we talked about with field sequential color and LCOS. I don't believe anybody who's building a micro-display, a near-eye display, can get away with spatial color.
Like when you look at a television set, you have a red, green, and blue dot, or on your phone you have a red, green, and blue dot — a lot of times two greens and a red, you know, there are different patterns they do. But on a micro-display, the pixels are so small you can't afford to have a red, green, and blue dot. You almost have to emit from a single point. And Porotech is one of the companies doing that — there are actually a few. And this is classic: I'll put on my blog that they're the only guy doing it, and of course I immediately find out three more people are. As I oftentimes say, nobody will volunteer information, but everybody will correct you. So it's best to be positive. I find you actually learn more by being positive about it and then correcting — you know, pulling out your eraser. Because it turned out, and I'd forgotten this, that Ostendo had actually done some work on a single emitter, although they do stack — MIT and a few others, you know. The problem with stacking — I mean, you do see people doing stacked designs, which sounds good, where you stack three emitters — is that the red's got to be at the bottom, and its light has got to pass through the blue and the green. Red's typically the hardest one to get bright. And so, whatever your circuitry and the diode and everything, the light has got to make it through the blue and the green, so you lose a lot of light through that stacking. Typically, the stacked micro LEDs are talking hundreds of thousands of nits; the non-stacked guys are talking millions of nits. So it's like an order of magnitude different. We'll see how that all works out in the power equation. But I do believe if you're going to do a micro-display for AR-type stuff, it's going to have to be a single emitter. LCOS is doing field sequential color; these would also be field sequential.
They could do — basically, the difference is, you know, we talked about microseconds or so, maybe thousand-hertz-type rates for LCOS. These are talking like megahertz or gigahertz — I mean, the LEDs can switch in nanoseconds. It's really tricky to control, though, because they only have two wires. That's another neat thing — they only have two wires — but the control is very complicated, because the color is controlled by current and the brightness is controlled by duty cycle. So it's a very complicated control scheme. The problem — and I try to warn people about micro LEDs — is that micro LEDs are likely the future, but that future could be 10 or 20 years away. They've got a long way to go with micro LEDs. And I'm not a fan of — you know, people talk micro LEDs — I'm not a fan of quantum dots for micro-displays. First of all, it's spatial color: because you've got a red, a green, and a blue, you're going to do spatial color. The efficiency is not there. You're probably going to burn up the quantum dots if you go very bright. I could see quantum dots in televisions, or even maybe watches and bigger things. But, you know, people don't realize it: with an LCOS pixel, we're looking at 3 to 6 microns as typical, whereas a watch pixel is 20 to 30 microns. You're talking like a 50- to 100-times difference in area sometimes between an LCOS pixel and the kind of pixels we usually see. As I sometimes say, you could almost fit an entire micro-display inside one pixel of a big-screen TV. It's crazy that way. So the sizes — and I think size affects the technology. What you've got to do at the microscopic level is things like flipping a whole set of LEDs at a time, whereas when they do watches or cell phones, they'll be individually picking and placing individual LEDs.
You can then pick them, test them, and sort them so they're matched. The problem we have right now is with flip-chip: they take a whole array of LEDs and flip them over onto a CMOS device. When you do that, you get the whole bag of parts — they're all different brightnesses. So you have a problem that you don't know what you're going to get. And what you find is that almost every micro LED display I've looked at has pixel problems — dead pixels, dim pixels. The uniformity across the displays is not great yet. It's really early-phase technology. It doesn't mean they can't do interesting things, but it's not ready for prime time. It's not even close to competitive with what OLEDs can do today in image quality. It's so bad that you would never put up a picture of a person's face. By the way, I have criteria I sometimes set out. I say, well, we have monochrome, which is like green only or whatever — monocolor. Then we have what I call full color, which means it's got red, green, and blue. And then there's what I call true color. True color is you can put up a person's face. With full color, you can tell red from green and green from blue and stuff like that. With true color, you hit the colors you mean to hit — you can do a person's face. And micro LEDs aren't even close to that right now. Whereas, of course, OLEDs — even the micro OLEDs — are in that range. The process is much more mature; they spent 20 years getting to where they are now. And you've got big boys out there: Sony, and BOE out of China, are the big boys in the OLED space. And then, of course, eMagin was just bought by Samsung, so everyone's interested in what will happen there.
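The pixel-size gap Karl describes compounds quickly, because pixel area scales with the square of the pixel pitch. A quick sketch, using only the micron ranges quoted in the conversation:

```python
# Pixel area scales as the square of the pixel pitch, so modest
# differences in pitch turn into large differences in area.

def area_ratio(large_pitch_um: float, small_pitch_um: float) -> float:
    """How many times larger in area the bigger pixel is."""
    return (large_pitch_um / small_pitch_um) ** 2

# Watch-class pixels (20-30 um) vs typical LCOS pixels (3-6 um),
# the ranges quoted above.
print(area_ratio(20, 6))   # low end of the gap: roughly 11x
print(area_ratio(30, 3))   # high end of the gap: 100x
```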
[00:44:22.785] Kent Bye: Yeah, I think when I see the different demos on the floor, I agree that Tilt Five, for me, gives so much more of a deep, experiential, immersive experience. And when I see these other AR devices, the field of view is often very narrow, and, as you're saying, they're not very bright. And so there hasn't been as much of the AR experience that I feel is super compelling, aside from what I'm seeing with Tilt Five. There's something about the experiential component for me that's coming more from the VR world, so I haven't been as interested in some of the AR stuff, but I'm very drawn to what Jeri's doing.
[00:44:57.698] Karl Guttag: I think it's a big misnomer, and that's a big part of my theme tomorrow. I think AR and VR are more different than they are alike in many ways. People have somehow — because VR happened first, it was easier to do. You know, you take basically a cheap cell phone display, at the time, and slap a dollar's worth of optics per eye in front of it — that's basically the Oculus Rift, okay? That's the hardware for doing the optics. A cheap display you could probably have for 20 bucks, and $2 worth of optics. And therefore you saw Google Cardboard, you saw all these other guys doing VR with cheap stuff. Now you can do it a little better — it's gotten better — but still, you can do a halfway decent display with not a lot of money. AR isn't that way. Like I say, it's not twice as hard, it's 10x as hard. AR has got to be more where the real world is your primary concern. If you're really trying to be immersive — people who come from VR are used to immersion. They're thinking, I want absolute image quality, I want this great image quality, which means you've got to have good black. Well, you're not going to have good black. As I say, when you look out through an AR headset, that's your black: when you turn the display off and look out, that's your black. You can only add light to it. Now, you can try to do some dimming, like Magic Leap 2 has done, but it still never equals what you can get by just blacking out the world. Because in VR you can put the display in an easier place — you don't have to combine with the real world, and you lose a lot of optics quality doing all that stuff. So it's just a much harder job to do AR. So what's important in AR, I always say, is that a lot of the time it's hands-free, it's seeing the real world.
You kind of look at an application and say, that's going to be AR — it's not going to be done with pass-through. You're not going to go walking in the street wearing a VR headset. You could get killed; it would be dangerous. I don't see people working on a factory floor with a VR headset unless it's got guardrails and stuff up. You've got to be able to see the world. You need much more openness. I'm even concerned that — I don't particularly like the way Magic Leap 2 is done; it's way too confining. I really think, image-quality-wise, there's a lot about the HoloLens 2 to hate, but they did a pretty good job ergonomically. It's very open; you can see around it. They thought through a lot of the other problems well. They got a lot of other things right, but they really screwed up on the image quality. But you don't need great image quality — as I sometimes say, HoloLens 2 proves you don't need great image quality for the applications it serves. By the way, I should mention DigiLens. DigiLens kind of took some advice from me at CES. They said, we've got enough eye relief to do glasses; we have these things that look like glasses. And I said, well, why don't you do something more like a HoloLens? You say you're going somewhat after the HoloLens market. Why don't you do it as a headband and let me wear my glasses? You have enough eye relief for glasses — why not go all the way? And at this show, they just showed their new thing where they're putting a headband on it. So if you can't be really glasses, you should own what you are. Almost anything that tries to do it all ends up putting way too much weight on the front, making it front-heavy, putting too much weight on the nose. If you can't build real glasses — and they're trying to do a lot; I mean, they're doing most of the things that HoloLens 2 does.
You know, they're doing some tracking, they're doing some stuff to integrate even better tracking, so they're doing most of the things HoloLens does. So why don't you just own it and say, I'll build something that kind of looks like a HoloLens, rather than trying to keep it looking like glasses? Because they're still not really glasses: they've got a big wraparound, they have to have a big nose pad, they sit out from your face a little bit — and the further it sits out, the worse it is. That's another problem with Nreal: nobody wants to wear Nreal glasses for hours and hours a day. Because the birdbaths are so thick, they tend to push a lot of weight out further onto the nose. You really want the weight centered. I mean, go back and look at HoloLens 2. They put that bustle on — they moved the processor, the battery, all of that to the back, and that balanced the whole thing. That's another important thing. But they also don't try to grab onto your face anywhere. They realize that all you have in your nose is cartilage, and all you have in your ears is cartilage. It doesn't support any weight. And if you put weight on your nose or ears for any length of time, when you take it off, it hurts. And so you've got to find other ways to mount stuff up there that don't involve putting on much weight. You can do a little bit, maybe, just to keep it from moving around, but you can't put much weight on your nose or your ears.
[00:49:46.348] Kent Bye: Well, I find it really remarkable that you've taken all this on as a hobby, but yet you're so technically detailed and being able to have the discernment of understanding what all these optical systems are doing technically, but also experientially. And so I'm curious what it is about augmented reality that has captured your imagination to dive so deep into it.
[00:50:05.055] Karl Guttag: Well, I sometimes say I'm handcuffed to it. Way back when at Texas Instruments, I went on a retreat for management. A TI Fellow is basically like a vice president of the company — you get treated the same way, with offices and parking and all that stuff — and they send you on these management retreats. And I did one of these analyses — I don't know if it was Myers-Briggs or one of those — and I was in a room full of engineers, because most of the managers were engineers, and I was off the scale analytic compared to them. So I like analyzing. I guess, compared to the average bear, I like analyzing. It comes from my electrical engineering background. I think what makes my blog work is that I'm an electrical engineer, not an optics expert, so I have an understanding of what a technical person might need to know to understand the optics. I don't start from the premise that they understand étendue or Maxwell's equations or anything like that. I start from what a technically sophisticated but not optics-expert person might know, and I write from that angle. I'm somewhat handcuffed to it now. I just turned 69, so it's not like I'm going to learn another field right now. I'm a little bit free-spirited. I sometimes say I've got a bit of the Ricky Gervais, where he says he doesn't care. Well, at 69, I don't care so much. I've always kind of called things the way I see them — which, good or bad, is bad in corporate politics. If some program's going to work or not, I say whether it's going to work or not. I don't tend to play the politics on it. I make an engineering judgment, and based on that engineering judgment, I say whether I think it's going to be good or bad.
[00:51:50.424] Kent Bye: I see, and so I guess as we start to wrap up, I'm curious what you think the ultimate potential of augmented reality or mixed reality XR might be and what it might be able to enable.
[00:52:01.788] Karl Guttag: Yeah, well, of course, everybody around here is wondering what Apple's going to do. We're a few days away from the expected Apple announcement. Just to hit that subject real quick: I think people are going to get a sugar high on it. They're going to say, oh, good. But as I said earlier, I think AR and VR are really different beasts. So what Apple's doing — which is mostly VR, really VR with a little bit of what I call augmented VR; they're augmenting the VR a little bit — I don't think is going to change AR very much. AR has got a solid market. It just doesn't get you to cell phones. The problem we have is that people think they're going to get to replace the cell phone. I think the AR market is quite real and big, and that's what people are finding. You look at HoloLens — I think they lost interest when they realized this isn't going to be like the cell phones. There are a lot of other small companies that can make a lot of money servicing markets, because selling AR is really easy. The elevator pitch is basically: you put this headset on, and I can make your engineer or your technician or your factory worker or the guy in the shop pulling stuff 10% more effective. If I make him 10% more effective, the headset will pay for itself in a month. It's a real easy sell for the AR world. The VR world is still a little more games-oriented, and this is why I think even at Meta — and somewhat at Apple; Apple's kind of schizophrenic a bit right now when it comes to AR and VR — VR tends to be seen as a market of young males in their basements. It's a pejorative way of saying that it's basically mostly male-dominated, mostly people who want to play games, and they don't see how it gets out of that. It's a solid market, a real market, but they don't see how it gets bigger.
What I don't understand — I've never understood — is how AR got endowed with this idea that it would be as big as cell phones. I think that's quite a ways off. And that puts me in a minority. I have lots of friends in this industry who I'd love to see do well, but I cannot bring myself, as an engineer, to reconcile that. I think AR definitely has an industrial market. I don't see how you sell pass-through VR for industrial, other than, I mean, simulator training. Yeah, there are real markets. If you look at the military: if I've got a guy running around in the field, it's got to be AR. But if I'm in a classroom or I'm simulating, then VR might be a better way to go. So you can kind of see how it segregates. You can almost look at what you're trying to do: what's the environment going to be? Are you moving around? If you're moving around, you're not wearing a VR headset. If you're simulating, and you can recover and you're not going to hurt yourself, that's one thing. One of my new sayings is that in the VR world, there's never a boundary that's both small enough to keep you from getting hurt and big enough not to be annoying. You know, I've set those boundaries in my office, and finally you just set the boundary as big as you can, because it's driving you crazy, and you just hope you don't run into something. These so-called safety boundaries are ridiculous. Besides which, your hand flies through the boundary. When you're doing something and you make some fast move, it goes through the boundary faster than it can warn you that you're in trouble. You've already hit it by the time it tells you not to.
[00:55:35.322] Kent Bye: Yeah, some of the buzz that I've heard, I don't know, we'll all see what happens on Monday, is that part of the use case may be a screen replacement. If you have 4K per eye, then maybe the compelling use case is that it's a productivity thing where you are using it at a computer, which there's also an AR version of a laptop that's showing here. But yeah, I don't know if you have any thoughts on the utility of using some of these devices as a screen replacement.
[00:55:59.537] Karl Guttag: I'm not a believer in that. Look, I'm going to talk about some of this tomorrow in my presentation, but I use this example: we've had televisions that go on your eyes since the mid-90s. Sony had a thing called the Glasstron. They even had a Glasstron with an LCD shutter — sort of a little bit like Magic Leap — in '98. And the problem is that people would rather look at a tablet. They would rather look at a phone.
[00:56:27.595] Kent Bye: It's like the vergence accommodation conflict that's part of it?
[00:56:30.598] Karl Guttag: It's part of it. The vergence-accommodation conflict is the most obvious one we talk about. But even if you address that — when you look at how the human visual system works, the eyes are constantly moving. They're not staying stationary; they're snapshotting. They don't totally blank, but they partially blank when they move. The problem is you've got this thing really up close to your eye, and you've got this sensor that's jittering back and forth, sensing it. And I think your visual system says, as the eye jitters around, focus should be going here, there, and everywhere, and the display can't keep up with it. You're lagging a little bit. You've got the motion-to-photon delay. You've got all these things, and the combination of all that says it's just not a great experience. But look at all these years we've had these televisions. I call it the airplane test. The reason why is that I remember getting on an airplane and seeing an iGo, which was one of these television headsets — a little computer monitor slash television that you plugged into your PC and could use on the plane — and they actually had a kiosk at the airport. In all my years of flying, I have never seen anybody wear a VR headset or glasses on an airplane. This was a few weeks after the iPad came out — Apple had just announced they'd shipped a million units — and you got on the airplane and there were iPads everywhere. The headsets could have had a much bigger image, a much bigger field of view and all that. People get a little too wrapped up over the field of view. That's another big mistake and a big problem in the industry. It just drives me crazy when I see these people who talk like they know something, when they sit there and say, oh, I'm not going to buy anything less than a 120-degree field of view — these AR fields of view are nothing. They're not looking at the same application.
The reason for a big field of view is to get super immersion. By the way, there's a company down there, Hypervision, I think it's called. They've got like a 240-degree field of view. Once again, it works pretty well. It's actually two displays combined together with pancake optics. And it's pretty remarkably good for what they're doing. I was rather surprised. I'd seen their earlier version, which was not using pancake optics — it was using Fresnel optics and stuff — and it did not look nearly as good. It's dramatically better now. But yeah, you can overdo the field of view thing. The bottom line is, at home, when I'm doing my editing and my photographs and putting together the blog and stuff like that, I have a 34-inch LG widescreen plus a 4K-by-4K next to it, kind of mounted up, and I use it more or less as three monitors — two on the widescreen and one on the 4K-by-4K. You're just not going to do that in a headset. The human factors don't work right — the way your head's used to working when you turn your head. I know they're trying to simulate that; I saw the thing today. But they've got a big old keyboard, and I think what you're going to find is that the person would rather have a 13-inch tablet with a small keyboard. You know, they're nice people and all that. I just don't see, as an engineer, how that's a workable solution. I call it the ice bucket challenge. I've seen a lot of reviewers say, I did it for a week, I did it for a month. I didn't hear one of them say, and at the end of that time, I threw away my monitor, I threw away my laptop, this is where I'm going. No, they said, I tolerated it for a month. Well, yeah, you survived. It's one thing to survive; it's a different thing to really say, this is the future of what I'm doing. And I don't think it's just a question of — I think some of these problems are what I call 90-90-90 problems.
There's the famous thing that takes you 90% of the effort to get to where you think you're 90% of the way there, and then it takes another 90% to get the last 10%. Well, sometimes it's 90-90-90, which means you think you're there at 90%, and then you spend another 90% effort, and you find out you're still... Not there. So you keep spending these 90% efforts, and what's happening is you find out that it gets exponentially difficult to get the problem down to zero.
[01:00:31.053] Kent Bye: Yeah, I also wasn't impressed with that AR laptop that was showing here. It felt like the small field of view was cutting off a little bit, and it didn't feel like, for my workflow, for what I do for editing audio files, I just felt like my existing system of laptop screen was much better.
[01:00:45.085] Karl Guttag: The subtle things you do, like you reach in... Yeah, they are doing the lean-in. You know, you do a lot of lean-in, but you turn your head and it's going to lag a little bit. You may touch the screen. You may actually point at something. You don't realize all the little subtle things that a human does, and it just doesn't mimic all that. And I don't think it will. I don't believe Apple will get there with that. I've had a number of people tell me that and say, look at that thing downstairs. And I can't get myself to believe it. Like I say, I've watched for 30-something years. When I joined that little startup, my very first LCOS startup, the reason why I knew about the Glasstron, and I think Olympus was maybe the other company that had one, is we had bought those to look at, you know, as our kind of first pass: what are these guys going to do? So I've known they've had those things for decades and decades. They have failed so miserably that nobody knows they ever existed. And I think it's a human factors thing. Because, yeah, you could argue, why isn't everyone in Tokyo wearing one? You've got guys commuting. Some people in Tokyo commute two hours a day. Why aren't the guys in Tokyo wearing them? Why aren't guys on airplanes wearing them? We've had the capability. Now, I know that the Meta Quest Pro is a very poor implementation. In theory, the Apple, if everything holds true, is going to be at the angular resolution that... I mean, the Meta Quest Pro was never meant for business. The angular resolution, they only had 20 pixels per degree, and that's not enough. You need to be at least 40 to do business work. You can't read efficiently. This is something people think, OK, make the text bigger. Well, if you make the text bigger, you read slower. I want to look at a spreadsheet. And remember, your vision's only really sharp in the center. So if I'm going to do a spreadsheet, I need to get the pixels into the center of my vision.
I want as good a pixel as the center of my vision can resolve. I'll turn my head and look around, but I need to get a certain pixel size. I think if you're below about 40 pixels per degree, you become very inefficient in reading text, and if you're trying to do business work, you're dead. So the Meta Quest Pro is dead on arrival. The Apple is at 40, which is right at the edge. You mean the Apple device? The proposed Apple pass-through AR. Based on the specs that have been reported, you mean? Yeah, based on the specs reported. They're going to have double the pixels with about the same angular view, so they're going to be at about 40 pixels per degree. That's right at the margin of acceptable. People will tell you that human vision is 60 pixels per degree. It's not; really sharp-eyed human vision is more than 60. But 60 is also probably better than you really need. At about 40 or so it starts to get acceptable. Most of the AR stuff we see is 40 to 50 pixels per degree. AR, by the way, has the opposite problem. See, when you're doing VR, you have this big display. Normally they start with a direct-view display, put optics in front of it, and cram this thing in front of your face. With AR, we're usually starting with a microdisplay. It has really tiny pixels, so they have the opposite problem: they're trying to make these pixels bigger. So people who are complaining about the field of view of AR, it's because when you're doing a big display, it's easy to get a wide field of view. When you have a tiny display, it takes incredible optics to get a big field of view. So when you look at AR, almost all AR displays that are using microdisplays are in the 40 to 50, 60 pixels per degree range, whereas almost all VR displays today are at 15 to 20 pixels per degree. Now, what Apple's doing, and there's lots of interesting things with Apple, but they're doing, by all reports, a Sony 4K.
Well, their pixel size is going to be a lot smaller than it was for the Meta Quest Pro. The Meta Quest Pro has enough eye relief to wear glasses because they have a moderately big display, whereas the reports are Apple is going to use a microdisplay, which gets you, I think, 6 or 7 or 8 microns per pixel, whereas the Meta Quest Pro is about a 19-micron-per-pixel display. So with the Meta Quest Pro, you have enough eye relief that you can wear your glasses. Apple, by all reports, is going to need inserts. I think what's happening is, because you've got these smaller pixels, your optics are forcing you to move this stuff in more, and so you've got less eye relief and stuff. So it's kind of funny how the displays force you. But people don't seem to... I mean, they don't have to appreciate the problem when they're on Reddit and all these forums. But it is kind of funny when guys are saying, oh, the AR guys, you know, they don't have these wide, big... you know, they need 120. If you need a 120-degree field of view, yeah, Ant can do it, but you're going to make a lot of compromises with Ant. If you want to do these waveguide glasses-like displays, I tend to say once you get beyond even 50 degrees... really, 30 degrees is about all you can fit in anything that'll look like glasses when you're done. You may start with glasses, but then you say, oh, I've got to put the cameras in there, I've got to put the battery in there, I've got to do some more sensing, I want better SLAM, I want this, I want that. And before you're done, you're HoloLens. That's why I say people start with Ray-Bans and end up with HoloLens. But I do think we get people from the VR world looking at the AR world with a totally different set of requirements. They're trying to turn an AR into a VR. And they're really, I think, quite different. Like I say, when I look at a VR headset and think about it in an AR application, I think it's stupid.
I think it's dangerous. I think it's stupid beyond all measure. Similarly, if I were going to build a VR simulator, there's no way I'd start with AR optics. It's ridiculously expensive, and the image quality won't be as good, because you've had to make all these other compromises to make it see-through. So I think they're less alike than people think. Yeah, they're both things that sit on your head, but there are a lot of differences after you get past that.
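The angular-resolution arithmetic Karl walks through in this stretch can be sketched in a few lines. This is a back-of-the-envelope illustration, not from the interview itself: the per-eye resolutions, fields of view, and panel widths below are rough assumptions standing in for the "Quest Pro class" and "rumored Sony 4K micro-OLED class" numbers he cites.

```python
def pixels_per_degree(h_pixels: int, h_fov_deg: float) -> float:
    """Average horizontal angular resolution: pixel count divided by field of view."""
    return h_pixels / h_fov_deg

def pixel_pitch_um(panel_width_mm: float, h_pixels: int) -> float:
    """Physical pixel pitch in microns for a panel of the given active width."""
    return panel_width_mm * 1000.0 / h_pixels

# Assumed, illustrative per-eye numbers (not official specs):
quest_pro_ppd = pixels_per_degree(1800, 95)    # direct-view LCD class: ~19 ppd
sony_4k_ppd = pixels_per_degree(3660, 90)      # 4K micro-OLED class: ~41 ppd

quest_pro_pitch = pixel_pitch_um(34.0, 1800)   # big panel: ~19-micron pixels
micro_oled_pitch = pixel_pitch_um(27.0, 3660)  # microdisplay: ~7-micron pixels

print(f"~{quest_pro_ppd:.0f} ppd vs ~{sony_4k_ppd:.0f} ppd "
      f"(Karl's readability floor is ~40 ppd)")
print(f"~{quest_pro_pitch:.0f} um vs ~{micro_oled_pitch:.1f} um pixel pitch")
```

The point of the arithmetic: doubling the pixel count at roughly the same field of view doubles the pixels per degree, which is the roughly-20-to-roughly-40-ppd jump Karl describes, while the much smaller pixel pitch is what pushes the optics (and the eye relief) in closer.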
[01:06:40.361] Kent Bye: Well, I wanted to ask you the ultimate potential question one more time because your first answer was Apple, but I just wanted to make sure that I got clear what you think sort of the most exalted potentials of like AR, mixed reality, or what you hope to see where this might go.
[01:06:55.152] Karl Guttag: Well, you know... It's hard to see how we get to the really mass market. I'm kind of curmudgeonly there. Maybe people think I'm a Luddite on it, but I've been through this. I was involved in the early days of video games and the early days of graphics processors and computers and PCs and all. So I think I've seen this before, and back in the early days, we knew the roadmap. I knew when I was working on graphics processors back in the 80s that we would eventually be doing what Pixar was doing, but, you know, in real time, as opposed to in the movies, okay? I knew people at Pixar; they were actually a big user of video RAM, which is something I helped develop. They were a hardware company when Steve Jobs bought them. But you could see how we could get there. You could see how, just by churning along at the rate at which semiconductors were progressing, we would get there. It's not as clear here, because we're more up against optics boundaries. There are more optics issues. I kind of tend to have my favorites. I think, for example, right now in terms of waveguides, the Lumus waveguides look much better than the diffractive waveguides. However, Lumus has not demonstrated yet that they can make them at cost. They make a lot of arguments that they can hit the cost, and they have the best image quality; I think most people who actually look at it objectively agree with that, but they don't have the cost right. Right now, DigiLens has kind of an interesting product. Their image quality is not very good, but they're going after markets where HoloLens was. Their image quality is better than most HoloLenses; you know, they've passed that bar. But these things can get you to millions of units. I could see, like I say, in all kinds of things where you work with your hands, I could see AR going in for everything from surgeons to auto mechanics to guys working in factories.
But that only gets you to like a million units. That doesn't get you to mom, pop, grandkids, adults, you know, seniors doing it the way a cell phone is. I think the mistake, to a degree, has been that people have expectations that AR will replace the cell phone. And that's, you know, a multi-billion-dollar market, and that's hard to live up to. So I think that's hard. But I think the squeeze has been, and of course a lot of people here are feeling it, because of the AI... there's all this AI talk now; generative AI is now the buzzword soaking up the money. Now, there is a lot of good talk, and some of it's whistling past the graveyard, but some of it's true: you can see that what we will have is a lot of input. I'm a big believer that what we're going to see is a massive increase in input coming into whatever it is you're wearing. A lot of stuff going in. So you can see AR would be the output for all that input. You know, you could have glasses up here that are taking in all kinds of stuff and using generative AI to figure stuff out: figure out where you are, what's going on, all kinds of stuff helping you out. So there's a lot of that. And you might say, well, if you had that, then I might want a display to convey some of that information and whatnot. So that can make it go pretty big. But we're a long way from... like I say, as soon as you go beyond about 30 degrees... you can see some designs at 30 degrees, you'll see them, but even at 30 degrees, they're starting to get kind of big and bulky. Like I say, when you round it out, even DigiLens, they rounded out their 30-degree design, and it's not looking so much like glasses, but more like chunky somethings that started as glasses. So it's hard to get there. So it's hard to see how we get to the...
You know, the expectations back when this thing went crazy, when Magic Leap was getting billions of dollars invested and Google Glass and all that, they were talking like this would replace the cell phone. And I never did believe that. You can go back and look at my blog. When Google Glass came out, I said this wasn't going to be a big thing. I said that. And you cannot believe how many people told me, but it's Google. Well, now, of course, when I say this, they're all going to say, but it's Apple. Well, you know... everybody slips occasionally. It's not clear what Apple's doing this time. There are so many rumors coming out about it. We'll just have to see what they did. You know, I'm trying to get one to evaluate as soon as it comes out. It may be, as they say, a long line, because you've got to believe that every lab in the country is going to want one. So they're going to sell a lot just to people who want to tear it apart and find out how it works and run it through its paces. Somebody said they were hoping to sell 300,000. They might get through the 300,000 just on people tearing it apart or evaluating it. But then the question comes: who buys it after that? Does the real consumer have a need for it? As I've said before, I don't see it replacing the computer monitor. I think the thing that is real is, like I say, all this military, medical, industrial stuff; that stuff's real. You can make the elevator speech that makes it go. The bad news is it probably gets you to a million units a year. I haven't done a scientific or detailed analysis of it, but it feels to me like it's a million-unit, maybe a few-million-unit-a-year market. It could be quite tidy, a nice market. It does not look like cell phones. And the problem is... you know, Tim Cook and Zuckerberg, they wake up in the morning and they've shipped that many units by breakfast. So they don't tend to get excited about a market that size.
So I don't quite get what Apple's trying to do, other than it seems like people were somewhat chasing their own tails. The big guys were chasing around each other, not risking being outdone. Zuckerberg's a true believer. You look at what he's said and the way he makes comments; he's definitely educated on the subject. Tim Cook is more, I'm reading a script. You get the feeling that Zuckerberg has sat down with the technical people and really understands it, while Tim Cook is being told, this is the thing to say, this is the next big thing. There are a lot of rumors saying this is Tim Cook's way out, that he's got to go out on something. There are a lot of rumors that way, that he's wanting to go out on something. I wouldn't be picking this as his way out, because I think it's going to take a lot longer. Unless he's planning on sticking around a long time, it's going to take a while to mature. We'll see. I mean, it's Apple, you know. But see, I'm also old enough, I have been around. When I left college, the question was DEC, Digital Equipment Corporation, which you may not have heard of, but they were the second-largest computer company in the world next to IBM. Nobody could see how IBM wasn't going to be the largest computer company in the world. That was unimaginable. AT&T was the largest company; they were going to dominate. It was unimaginable that IBM wouldn't be the largest thing in the world. And then, you know, basically they broke up AT&T, which kind of scared IBM away from being too aggressive antitrust-wise, because they were afraid they would get broken up. They played it a little soft, played Mr. Nice, and they let people get established in the PC world. And by the time they tried to reel it back in with the PS/2, it was too late. People forget there was a breakup of AT&T that caused IBM to go soft on the clones, that allowed the clones to get established and wrench that PC market away from them.
If not for that, today you would still be using... I believe if it wasn't for AT&T getting broken up, IBM would probably own everything today. They would have told Microsoft, you can only sell to us. They would have told Intel, I want your chip, or I won't even use your chip, I'll use my own chip. They would have been very, very different in their behavior. And there wouldn't be an Apple today. They would have crushed them. IBM had the patents to crush them all. They had a patent portfolio that could crush everybody. But I think the breakup of AT&T put the fear of God in them and let the other guys get established. Maybe a lesson for today, because in some ways the companies we have today are more monolithic and bigger than IBM ever was. But then there's DEC. DEC missed the PC market. All the engineers, if you came out of college like I did in '77, everybody kind of thought DEC was going to own it. DEC or HP, those were the two guys. Because, you know, HP had those calculators that were really nifty, and DEC was kind of the minicomputer company; they had the minicomputers where IBM was mainframes. So everybody thought, well, the next step down from minicomputers would be PCs, you know, home computers. And DEC blew it because Olsen had founder's disease. The founder didn't see it, and they reacted too late. IBM kind of did that Boca Raton thing and let Bill Gates own the operating system rather than controlling it. They were too generous. Look what's going on today in our market here: you see Meta buying up all these companies. I mean, one of the biggest problems these guys downstairs have is they don't know what their supply chain is going to be, because Meta may buy their supplier. There's a guy making this... like, Snap bought WaveOptics. I think the only reason they bought WaveOptics is because Meta had been buying up companies, and they were afraid their supply chain would get bought out from under them.
If this were IBM today, IBM would have bought up the supply chain, and none of these guys could have bought chips from Intel. Imagine if IBM had behaved the way Meta is behaving: there is no clone business, because nobody could buy chips from Intel anymore. Nobody could buy from Motorola. They would lock them out. So that's a lesson for today, anyway.
[01:16:05.124] Kent Bye: Is there anything else that's left unsaid that you'd like to say to the broader immersive community?
[01:16:09.014] Karl Guttag: I don't think so. You know, people know how to find my blog. And, you know, you can find me on LinkedIn if you have something somewhat serious to talk about. But I'm out there. I'm on the blog and whatnot. And I'm hoping to do a little more video in the future. I can spend sometimes weeks or months writing one article. Writing articles is really hard. I am trying to move to more video where I can. Like when I did that thing with SadlyItsBradley: I prepped for two days, you know, getting images together, and then I went through 20 companies in one day with him. So it was basically a three-day effort. If I wrote up all 20 companies, it would take me like all year. So you can't do that. So I'm trying to find more effective ways to do stuff. Video might be a way. It lacks some of the research, because one of the things I do when I write an article is research; I'm an electrical engineer by trade, so I tend to work that way. I'm also looking at some ways to monetize. In the past, I've always had other jobs to do. I'm 69 now; I guess this is my job. So I am kind of looking at maybe putting out some paid content as well. We'll see if that works out. I've also got a relationship these days with Display Daily. Display Daily has been around for a long time. A guy named Jon Peddie, of Jon Peddie Associates... I've known him; he knew of me when I was designing graphics chips. So he and I go back to the mid-1980s. And he ended up buying Display Daily. So we've kind of got a loose relationship, and I'm working with them a little bit too.
[01:17:40.473] Kent Bye: Well, you've helped me make sense of what's happening here at Augmented World Expo, all the displays, because I come from more of a VR background. You have a brilliant way of breaking things down and analyzing them, with the quantitative and qualitative metrics of the light and the power and the field of view, all these things that help me make sense of what I've seen. And yeah, it's been really helpful just to hear you talk about all of this, and I really appreciate the time. So yeah, thank you.
[01:18:07.864] Karl Guttag: Yeah, well, thank you very much, and I appreciate getting to talk to you. I've been listening. I don't listen to every one, because I'm not really a VR guy; I mostly follow AR. But I do listen to the occasional episode, and that's how we got, I guess, linked up, which led to this interview. So thanks very much.
[01:18:23.973] Kent Bye: Thanks again for listening to this episode of the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.