Photo of The Beast Prototype courtesy of Rony Abovitz
I had a chance to sit down with Magic Leap founder Rony Abovitz for two hours to unpack some of the many threads behind Magic Leap’s origin story, including some of the underlying philosophical questions and aspirations towards what he calls “Neurologically-True Reality.” Considering the many tens of billions of dollars of XR R&D that companies like Meta, Apple, and Microsoft have been spending, the multiple billions of venture capital raised by Magic Leap seem like a relatively efficient and tight budget.
We reflect upon the potential communication strategy missteps of Magic Leap, the changing media landscape from 2014-2020, how Magic Leap was a part of the conversation within a broader open source XR movement with Valve, unlocking the magic of digital lightfields in their Beast prototype, their three-phased portablization process, the role of sci-fi and story, the persistent dual track of enterprise apps, a brief history of XR, and why Abovitz thought VR was not going to get them to the mountaintop of neurologically-true reality.
A comprehensive history of Magic Leap would require a lot more interviews in order to get multiple perspectives and compare them to the public record of events. I’d recommend checking out this Clubhouse conversation with former Magic Leap employees aired as episode #989, my wrap-up from 2018 Leap Con, and 7 years of my Tweets about Magic Leap.
Hopefully this extended oral history interview with Abovitz will help to provide some additional context for the underlying philosophical motivations and inspirations behind Magic Leap, as well as some of the broader media context and the tradeoff challenges in his journey up until the point where he had to lay off 700 people in April 2020, leading to his own departure.
ILMxLAB co-founder John Gaeta worked with Abovitz as a senior Vice President at Magic Leap, and calls Abovitz a “real deep thinker and incredibly instinctual and intuitive on these [XR] things.” He was certainly tapping into a broader zeitgeist in the early 2010s as Sony, Microsoft, Oculus, and Valve were all independently working on similar ideas and technologies, and hopefully this conversation fills in some of the gaps for how Magic Leap came out of stealth from nowhere with a $542M Series B led by Google Inc. that was announced on October 21, 2014.
Full transcript is below.
LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST
This is a listener-supported podcast through the Voices of VR Patreon.
ABOVITZ: Hi, I’m Rony Abovitz, and I’m the founder of Magic Leap.
BYE: Yeah, maybe you could talk a bit about your journey into augmented reality.
ABOVITZ: So that’s a great open-ended question. By the way, Kent, thanks for having me. The journey to augmented reality. It really did not start out with this deliberate thought of there’s this field called augmented reality, and I want to go into it because I’m not even sure what the name of the thing was that I wanted to make yet.
But I started asking the question, How does our brain make the pictures that we experience as we move through the everyday world? So I was comparing the idea of like if you look at an Apple II or your current computer, your MacBook, your phone, there’s a display and you know, it’s got glass, it’s OLED or LCD, there’s televisions and movie screens or there’s an external display and you compute stuff to it.
And then when we walk around, we’re not looking at a display. We just have this display that seems to work, which seems like a really weird, obvious question. You wake up. But we don’t really ask the question like, how does that display work? I did biomedical engineering in college and grad school, so I’m sort of wired to ask those questions.
So I was thinking, “Well, how is it that we see all of this?” And like, “Do we see it or are we model building in what’s going on?” So I started asking those questions, which is how does the brain render this rich, detailed volumetric, amazing, you know, retina-resolution world? It seems to be head mounted because we all seem to have a head and whatever’s working, that display is mounted in there.
So I was thinking like, there’s that thing and could that ever be used as a display for computing rather than the external display? So it was more that question than going down, like maybe a lot of people today where they think there’s this field called AR and the field called VR or XR, you know, a lot of different definitions flying around or mixed reality or spatial computing or extended reality. They all kind of sort of mean the same thing, but not always.
But I was asking a more fundamental question, which is: how do you visualize the world when you open up your eyes? Like, do you see it? Is the brain a rendering engine? How does it work? And is it one day accessible? Could that be the thing that computers plug into?
Anyway, so that was like maybe a first step into asking this almost Zen-like question, like, it’s the best display — I think I actually wrote this down in an early notebook, like an early pre-Magic Leap notebook — like, “Is the best display a no display?” That was sort of one Zen question. I was like writing these like Zen koans, I guess. “Is the best display the no display?” Because it seemed to actually be the best display. No matter how good of a 4K thing you buy from anywhere, when you throw it away and you just walk on a beach or go to California and, like, you know, just wander along the cliffs somewhere, or go to a desert — it’s always better than anything you could ever experience in a movie theater, anything you could ever buy. And you’re like, “Well, that’s pretty damn good. What’s going on there?”
And then I asked the second question, like, if we know how that works, do we ever see the outside? Or is it all happening inside? So is the inside = outside and the outside = inside? It was like another one of those Zen questions. But like — By the way, I hope that’s a very short answer to — because I could spend two days talking about it, but about like, how did I get into AR? I wasn’t thinking about the term AR or, you know, I want to make an app for an iPhone or an Android phone. It was like asking that fundamental question and trying to work back from there. Hopefully that makes some sense.
BYE: Yeah. I’m wondering if you could take me back to the moment that you actually decided to start Magic Leap, or to start a company, or to really seriously pursue this? If it was like 2010 timeframe or what was that turning point when you decided, okay, I’m going to go from what you were doing previously, which was biomedical engineering, and then into actually pursuing this idea of the no display or chasing down answering these questions that you’re asking.
ABOVITZ: So it’s not very linear in the way that people would want it to be, in the sense of — if you go back to probably 2010, I had founded a company previously called Mako Surgical that was actually having some real success in robotic surgery. We went public in 2008 and I wanted to see what life was like in a public company.
You know, I started Mako Surgical literally in a dorm-room apartment on the campus of the University of Miami — my now-wife’s apartment, then sort of pre-fiancée. This $300 a month apartment on campus, and we start this company. And you know, I had no idea like how to even start a company or anything like that. We had no money. We’re living on these 29 cent, 19 cent burritos from Taco Bell, because we’re vegetarian.

So we’re eating these cheap burritos. And I’m dreaming about building a company that uses robots for surgery because I was inspired by Star Wars. So I figured I’ll just go do that. I have no idea how much money it’s going to take. I’m just going to go wander off and try to figure that out. [A] very long story not for this podcast, but in 2006 we did our first surgery.
So that was like a big milestone. FDA cleared, the first of its type in the world. We built this really sophisticated robot that used — and this will relate back to your question — it was a haptic robot. So haptics, if you’re not familiar, is this idea of basically digitizing force feedback. So if you think about our senses, there’s the visual — and the physics term for the whole visual world and its complete mathematical description would really be the idea of a full light field. And then the sonic world would be sound fields. And then touch could be like haptic fields.
So in surgery we were using normal displays to visualize three dimensional imaging like CAT scan and MRI and x rays that we were reconstructing, using the techniques from video games and computer graphic companies. So I’m going to SIGGRAPHs and hanging out with animation people and video game people and bringing that technology to surgery, which at the time was super weird and foreign and alien. We were kind of inventing computer-assisted surgery.
I was thinking it was like video games for surgeons, because I grew up with computers and video games and noticed this was absent in medicine. Let’s bring it in there. Star Wars said we should do it, therefore Lucas, therefore do this. I was sort of thinking that way. And then robots, of course. But we didn’t just make a normal robot, we made a cooperative, haptic robot that would guide you — almost impossible to describe here. This is, like, even worse than the TV-on-a-Radio Problem. Like no one believed anything we did at Mako, until they came in and put their hand on the robot and felt invisible force fields guiding them into the right spot of surgery.
It really felt like total magic, and it worked. It’s used all over the world. We came out, went public, had great success. The thing that was confounding me is that you could feel these shapes. It’s almost like if you had a pencil and there were glass tubes guiding your pencil to not make a mistake, and the shape of the tube would constantly change so you’d always do the right thing.
BYE: Are these co-located robots? Or are these telepresence robots?
ABOVITZ: They’re in the room. And the surgeon would hold the robot, and the robot would act almost like a master surgeon guiding an apprentice to do the right thing. So this weird thing of a human and a machine working together — we were one of the first to actually show that, and in this really elegant way. So this idea of like an AI robot with amazing A.I. and computer vision and sensing and this three-dimensional reconstruction of the surgery, but not doing it autonomously, doing it in this cooperative way, which took a long time for people to understand what that meant.
And it’s actually been one of the most successful implementations of robotics in medicine, period. And there was another company called Intuitive Surgical, which is a giant public company. We got acquired for about $1.65 billion in 2013. Intuitive is still public. But I kind of came from that world into the normal computing world. So I brought computing and AI and game engine stuff and all these things into medicine, which is this whole fight.
Because I thought it could be really incredibly helpful to surgeons. And the haptics where you have these force fields guiding you. You couldn’t see them, you could only feel them. So it was literally like, “Use the Force, Luke.” It was totally awesome. I actually got to say that to heads of surgery at Harvard, and all kinds of things. It was really fun. And it’s used all over the world today.
I mean, people have Mako implants in them. And then the robots, they have birthday parties today — like a team will celebrate its first surgery, its 50th, its hundredth, its 5000th. So they befriended the robot. My vision was “Could you have this, like, R2-D2-like friend that would help you?” Not HAL from 2001. Not the Terminator, but a totally different vision of how machines and people can work together.
And one of the things that was seeding in my mind was “Why could we feel this?” Like, we really did have like virtual reality of objects in space, but you couldn’t see them. So that began an exploration of trying all kinds of — I think I can safely say this at this point — super big and expensive and horribly bad heads-up displays and VR devices that either made me puke or were gigantic and horrible.
I was just like, “What’s going on here? Why does it suck?” But I could make force fields work. Like we had these elegant, beautiful, force fields. You could do sound fields really well. What’s going on with, like, “Why is it such a nightmare?”
So that question of, like, walking on the beach and asking, “Why do we have this great display? How does it work?” was also seeded by being in operating rooms. This was maybe one of the earlier seedings, asking the question, “Why can’t we relate that question to seeing data, seeing it visually — like right where you needed it to be? Not on a screen, but unlocked from the screen, and it’s there, at the right size, at the right opacity, perfectly registered in space.”
What’s kind of cool is there’s a company in Germany called Brain Lab, who I competed with at Mako, and now I’m friends with the CEO. He became a really early enthusiast of what we did at Magic Leap. And now he’s got a few hundred sites around the world using Magic Leap in the original health care space that I was in at Mako — he’s using them for brain surgery and spine surgery. It’s actually kind of amazing.
So it’s funny to see that loop closed, which was one of the original questions. What was originally a big competitor of mine, now a really good friend at Brain Lab, he’s doing that thing. He’s actually implementing that last gap, which I wasn’t able to solve at the time of Mako, but we did solve at Magic Leap, and Brain Lab is using it. So anyway, that’s a bit of a messy answer to your question. I don’t know if that makes any sense.
BYE: Yeah, well, that helps set the context of the time. I guess, when you think about the founding story of Magic Leap, when was the moment where your co-founders came together? And was there a moment where you went from full-time at your previous job at Mako (and then it got acquired, you said, around 2013) to Magic Leap? What was the founding story of Magic Leap, the way that you tell it? Given that context of all the stuff that was happening, bringing all the different pieces together, when did you start to focus more full-time on actually trying to bring out this augmented reality headset?
ABOVITZ: So I think of like the zero-hour — like, I turned my garage at my house and one of the rooms of my house into a recording studio. So I’m going to take you there — it’s got a few messy threads that integrate together. So we’ll go down this road. So I met this producer in London, his name is Mick Glossop, and he had done records with like King Crimson, Van Morrison, Public Image Ltd, like really cool stuff.
He’s a British producer. And I sent him a tape of my weird, grungy band. We were playing late nights — like the CBGBs of South Florida is this place called Churchills. We’d play there. You know, I’d be in robotic surgeries in operating rooms during the day and playing with my kind of crappy band at midnight, getting bottles thrown at us.
And I had kind of a homemade recording studio in part of the garage and a bedroom, but it wasn’t really well set up. It sounded kind of grungy. And I sent a tape of stuff we were doing to Mick, and he actually thought it was cool. There’s a whole story of how I met him, but let’s just leave it at that.
He was like, “That’s really cool. I’d like to help you guys. I want to produce your next album.” I was like, “Oh my God! This is great.” So you’re like, “How does this have anything to do with Magic Leap?” But it loops back. So Mick comes down from London and he hangs out with us for a few weeks. And he says, “All right, before I can record, we’ve got to fix this up.”
So we cleared out the bedroom. Cleared out the garage. And he’s like, “I know an amazing sound architect. Guys that build studios, like top studios, like Criteria and others.” He’s like, “Before I record, we’re going to turn part of your house into a recording studio, like a proper room.” So at recording studios, you might have like an A room, a B, and a C. So he’s like, “This is gonna be like a B room or C room — good quality, but not quite like a full A room, which is gigantic, like at Abbey Road. But maybe a B or C room, like full-on everything.”
He calls up one of his friends who had designed mixing boards. This guy, I think Malcolm Toft. He’s like, “We’re going to get this 32-channel mixing board.” Now this is — what’s cool is this is not long after Mako went public. So we put most of the money away. But I was like, “I’m going to do a couple of fun things, post-IPO.” And building the recording studio was one of them.
So he comes in, he helps build it, set it up. And I set up the control room with this 32-channel analog board. Get this cool gear. Upgrade some guitars. Basically every milestone at Mako, my main mental reward was “I’m going to get a guitar.” Like that was the main splurge. So I have a whole bunch of guitars in the house.
I don’t really need cars or anything fancy, but like cool guitars. Yes. So that was like every time we got FDA clearance for surgery, tenth surgery, this thing happened, went public — that was an excuse to get a guitar. So I needed to name the thing. So I named it “Magic Leap Studios.” And that was kind of the founding of — So that studio was called Magic Leap, and it was like painted on the door. My mom is a painter. She studied art at Kent State. She was there during the shootings, like, you know, the classic ones that you hear about in the Crosby, Stills, Nash song. She was walking on campus at the time. So she paints this Magic Leap Studios sign. I tack it on to the door. That’s Magic Leap Studios. And I was working in there, in the control room and in the garage, thinking about what I’m going to do next.
So I have the studio, we’re working on music. And I’ve got a bunch of ideas all at the same time: comic books, gonna make a film, going to do something really cool with technology, going to try to solve that problem, “Why can’t we see stuff?” You know, that whole visualization thing. And I’m kind of thinking about all of it at the same time. That’s my incubation space.
You know, I had no co-founders — like, that was the beginning of Magic Leap. I call it the wandering in the desert. But I brought in, I’d say, “Fellowship of the Ring” members. You know, when you think about co-founding a company, it’s like you’re hanging out, and here’s the idea for the company. Boom! Like Larry and Sergey — actually, there’s a whole story there, which we won’t get into, but like Scott, who was one of our investors, he was actually like a real co-founder of Google, too.

But I think of a co-founder as someone that actually really is there with you at that moment of inspiration, like Lennon and McCartney writing the songs. So I start this thing called Magic Leap Studios, with the idea of multiple threads, and I start wandering, looking for others who want to go on the journey. The early members, I think of them as “Fellowship of the Ring” members. You know, they become that fellowship. That doesn’t diminish their status; I think it’s just as key. But there wasn’t this moment where we’re hanging out going, “Let’s go start this company called ‘Magic Leap.’” It’s a little like me in a garage with the door painted “Magic Leap Studios,” because I had just built a studio. And that’s how it all got started.
BYE: Well, when I talked to Tom Furness, he mentioned the work he had done on virtual reality headsets for the Air Force throughout the sixties, seventies, and eighties. And then he eventually went off and started the lab at the University of Washington. And at some point he created a –

ABOVITZ: The HIT Lab.

BYE: Yeah, he created the HIT Lab, but also created a virtual retina display patent that then expired at some point. So at what point did you come across the core technology pieces to start to put together some of these digital light field ideas and augmented reality headsets?
ABOVITZ: So I have a friend who is kind of a physics genius. He went to Caltech. We were friends for a long time, and I begin talking to him about how do we unravel how the brain is decoding this stuff? So I’m thinking about it from a neuroscience perspective, and he’s thinking about it from a physics perspective. There’s a whole long story there; I’ll leave it at that. We come up with an idea that we actually think is viable and good. And then we get scared. “Is there anyone else who’s close to that idea thread?” Like, how this works? What’s this mix of physics and neuroscience that would really unlock the key to this? Which — forget about how difficult the engineering might be — like, what’s the theory behind it?
So we start to do all this research. We regroup for lunch about a week later, and we both go, “There’s a lab at the University of Washington. And there’s a guy there, Brian Schowengerdt, who was a scientist doing some work that was adjacently close to what we were thinking.” And we ended up — Brian actually got the title “Technical Co-Founder.” Because what I did was I flew out to Seattle, sat down with him, said, “We’re going to start this crazy company. I’d love for you to be part of it. You’re one of the few people on the planet who’s even headed in the same direction.” It’s like you’re heading West and you look out on the plains of Nebraska, and there’s like nobody there. And you’re walking — and suddenly you see, “Wait, who’s over there? There’s a guy walking there.”
And I’m like, “Do you want to join our party? We’re going West. We’re going to Oregon –” Like, we thought, “We’re on the Oregon Trail. It’s going to be intense. It’s very early.” And Brian was in this lab from Tom Furness. So they were looking at virtual retina displays. And there are pieces, but our theory was like — our theory did not negate some of the work they were doing.
It had a couple unlocks that they had not done, but it was going to be incredibly important to work with them. So Brian became our chief scientist. He’s officially a technical co-founder. So it was basically me, then Brian, and I would say the “Fellowship of the Ring” people — like Sam, Richard Taylor, Randall, and others who came along on the journey.
But then we did an important deal with the University of Washington where we licensed a bunch of IP. And we had some IP from NASA, original IP, and UW. And that really created almost like the kernel — like the yeast and the dough. We’re like, “We have this systemic set of thoughts with a few original things that no one had ever done, just like totally new things.”
But we’re like, there’s this whole thread at UW which is too close. And then we brought Brian in, and he was awesome. And then we brought in the IP there. And there was this stuff from NASA. And then we found this brilliant engineer, Sam. A guy I had worked with at Mako Surgical, Randall, who is now working at a supercomputing center for the Department of Defense.
So he came in. I think the order might have been, like — I don’t know which one came first. I’ve got to go back and look. But they were all arriving around the same time, and one signed first and then the other. But that was kind of a very early picture of that group. So you have this brilliant software guy, Randall. This NASA systems engineer, Sam. You’ve got the physics guy, Graham. And then you’ve got Brian coming in from UW. And it’s almost like, you know, they’d be the middle of film one of the trilogy, if that makes sense.
BYE: Yeah. And I know that when people recount the public history of virtual and augmented reality, Ivan Sutherland is often cited as one of the first people to put this stuff together. But I know that Tom Furness was working on it at the same time, though it was more of a secret track — some of the stuff that he was working on may not even be declassified yet. But some of the stuff that he worked on in terms of the virtual retina displays was the idea of essentially shooting photons directly into your eyeball.
And so when I talked to Tom, I made the connection that some of the stuff that he had been doing there was connected to the stuff that Magic Leap was doing. And so I know that eventually you have the creation of the big prototype that — well, when I first got into the field, it was January 1st, 2014, when I got my headset.
At that time, Oculus had already been out for a year. And I think it was later that year, in 2014, when there was that funding announcement that put you on the map. So it was in this mix before the HTC Vive had been announced at GDC the following spring, but after Facebook’s acquisition of Oculus, that you came out from stealth to say that you’d been working on this for a number of years.
So help me understand what was happening before you came out and became public, and whether you had heard of all these other strands. Because Sony was working on PlayStation VR, you have Microsoft working on the HoloLens, you have Palmer Luckey working on Oculus, and you have folks at Valve starting to work on stuff with Jeri Ellsworth.
And so, were you aware of all these other things that were happening? And maybe you can contextualize what Magic Leap was doing, because you were all independently, with these different strands, coming up with the same type of either virtual or augmented reality. So yeah, I’d just love to hear a little bit about what was happening in that time period when you were still in stealth.
ABOVITZ: So Kent, I gave you two postings, which I posted in public, which you can read later. One of them — and I’m just going to use that — when you talk about the history: I got really deep into the history of it all, XR and reality, as I kept going. I went, okay, there’s Tom Furness, and there’s Jaron Lanier, and there’s Ivan Sutherland’s lab.
And I just kept going. And there’s like the stereoscope, and phantasmagorias. And it just got weirder and weirder, and deeper and deeper. And you go back like thousands of years. And I’m like, “Whoa!” So I realized that whatever we were doing was going to be part of a multi-thousand-year thread — not somebody in the 90s to us, or someone in the 70s to us.
But it goes back a really long way. Like there’s a thread which I think I’m going to do a documentary film about – we could talk about that one day. I wrote this thing “The Brief and Incomplete History of XR,” but it goes back. You can really go back thousands of years to this line of thinking that leads to a lot of these things and branches. And one branch turns into cinema. And one branch turns into like the View-Master. And another branch turns into like surgeons wearing x-rays on their head that look just like VR & Oculuses, but they’re beaming fluoroscope x-rays in their head, to a branch that turns into holography in the early 1900s, and MIT plays a really big role.
So as you keep following that, I was like, I want to go to the root of the tree, and go back and back and back. And end up like shamans and Buddhists and Hindu religions and like the idea of Maya, which is like “all reality is illusion.”
We actually brought Tibetan monks to the building at some point because I was like chasing — I was going all the way trying to figure out like “What is really going on here?” So it wasn’t “We want to build a VR system.” It wasn’t “We want to build an AR system–” And we’re like, we’re trying to understand the nature of reality with this early group.
And this was pre- any real investor coming in. This was like, “I’m angel funding it.” I got a little bit of capital I could put in post-Mako. My wife made me put almost all the rest of it away. So we had this finite pool to do these really weird, out-there, almost philosophy-meets-phase-one-research-grant kind of stuff. So the early time was super fun. Like we’d have whiteboards with things like, “We need a black hole that floats over here to create dark matter.” Things any investor would not be happy with, but it was pre all of that.
And I think that early exploration was important because we were trying to trace back the whole history. So we became hyper-aware of all of the different things, what their failings were, and where they went astray. For example, we became obsessed with holography. Like, I actually felt that stereoscopes were a dead end, meaning that if I take two flat planes and try to convince my brain that that’s the real world, no matter how high the resolution — I’m missing some fundamental things.
So then we went to holography, where like holography is capturing the totality of light field physics on film. But it’s frozen in time. And we’re like, “Well, that’s not it either, because the real world is not frozen in time. It’s dynamic. There’s a feedback loop with your brain. It’s rich. It’s got all the parameters of light fields.” And we’re just going around trying to go, “How do we actually unlock all of this?”
So one thing that’s now — you know, we’ve published patents, and I wrote a small white paper on it, so I can talk a little bit about it. We really were going after what I call “Neurologically-True Reality.” Like, I was at SIGGRAPH when I saw Jaron Lanier speak — he’s a genius — about the whole possibility of virtual reality. Read all the SNOW CRASHes, RAINBOW’S END, all of that kind of stuff.
But I wanted to chase like, “What’s the end game look like?” Saw The Matrix. I never thought I’d work with people like [John] Gaeta or Neal [Stephenson], which was kind of insane. But the idea of like, “Could the magic of the brain — how it displays — could that be unlocked?” And the first thought was — To do that, you had to recreate the entirety of a lightfield.
Now I think this is where there was like confusion both internally and externally, because you have so many people in this field who are hardware engineers, optics engineers, display engineers, and that’s all they think about. And then the other side of it is like, “Well, how does the eye-brain system process these signals?”
So I’ll describe it this way, if you give me a minute, because I think it’s probably the root of the whole idea of why I started the company. And it’s still something to be chased, because it’s like an impossibly-difficult problem. And we did make a lot of progress in that direction, but it’s one that I think still needs to be chased, because I think it is the end game.
And it’s basically this: I didn’t think the eye sees; I thought the eye was the filter. And I thought the retina acted like a CCD. And what it would do is take the full complexity of the universe, like the physics of lightfields coming at us — there’s also sound and touch, which we can get into later. But just focus on light fields. What’s outside of us is that full, dynamic, photonic wavefront definition of light energy. And it’s everywhere. And it’s all over the universe. And it travels at the speed of light. And it’s super complex. And it has all this information.
And we came up with this idea that the brain does not need all of it. So it was the idea that the brain could not possibly intake all of the lightfield physics. So then we reasoned through this notion of evolutionary biology, which is like, “Well, we evolved to survive. So we’re here. So somehow we took in enough of what we needed to survive.”
And there’s something about the human body, our design, how we do things, even the distance of our fingertips to the eyeball, which tells us clues. If you start to look at how the brain works, there’s like little weird footprints of human evolution to deal with the fact that we took in what we needed to survive.
Another thing I haven’t talked about in public, but it’s worth time on here is — We thought about self-driving cars, and how they work. And if you think about a self-driving car — like, Sebastian Thrun, a friend of mine, won the DARPA race across the desert. He was the first to win that race. And it was kind of an amazing thing. And it opened up the door for the whole self-driving car world. So I don’t remember the exact year he did it, but it wasn’t that far away from the time when we were starting up Magic Leap.
So we started to think about his car. And this is a bit of a conceptual, philosophical leap for a second. You’re a car driving through the desert. Now the desert is beautiful. It’s got the golden light of the sun. You see all the mountains, the cactuses, or maybe a fox running around. And it’s just beautiful. The car doesn’t see that. The car’s covered with sensors. And it takes in sparse samples of that desert scene to add to the 3D map it has of the world, which it inherited from, like, other scans. And all it’s doing is driving in a line.
So that poor car never sees the beauty of the desert. It never sees that light. It never sees any of it. It’s just like in its Plato’s Cave. But it’s evolved through our engineering to do what it needs to do, which is get across the desert and win the race.
Then we had this idea that “We are just like that car.” Like, human beings are in Plato’s Cave. Our eyes and our ears, our sensors, are taking in and throwing out most of the information, meaning we take in just enough to have our brain build upon an existing model of the world. So we had a theory that there is an existing model of the world, #1. It comes in our brain, we’ve inherited it, it’s passed on through every generation. And we keep improving it.
So somehow our brain structure has a model of the world. And that model of the world gets reinforced when you’re waking up as a baby. You’re touching things. You’re seeing things — almost like machine learning, which is based on the human brain to an extent. We keep adding to that model and shaping it, which makes it person-specific, human-specific.
But you have this like built-in model of the world. And we thought if that was true, that could explain why we only needed sparse data. And we kept looking through the structure of the eye, the structure of the brain, and where the visual cortex is, and sort of the density of parts of the brain, and how it does things.
And we were running all these interesting ophthalmologic experiments on how the brain might take data in, and throw out what it doesn’t need. And the idea was — We took in only what we needed to survive as human beings. So it was this collision of, like, the evolutionary biology of people versus this very intense, exotic, lightfield physics.
And then if you think about it for a while, it does match up. There’s this nice harmony between what the human brain evolved into — over many, many generations, you know, millions of years, tens of thousands of years, whatever number of cycles — and this infinitely-older lightfield physics of the universe. And we’ve adapted and evolved into this. And this is like a stable system of equilibrium. And I was thinking, if you insert anything into that that’s going to stress out the human brain, that’s going to be a problem.
Like, we needed to do something that would be inert and then eventually find its way to be biomimetic. So before we thought about a device, it was like, what’s the theory going on there? And if you then look at the whole history of all these attempts at augmented and virtual things, every time you insert one of them and they break that equilibrium, there’s some kind of stress: there’s nausea, there’s dizziness, there’s something. There’s a lack of visual quality.
So we started to build a model of parameters on everything we would need to do to ultimately get that equilibrium. So I’m going to stop there for a second. Because that was just a lot of stuff. But does that make any sense? Because that was an opening theory to why we were even doing this thing.
BYE: Yeah, I think that helps set the broader context. And I guess part of the question that I was asking in terms of Magic Leap is — You’re doing this in stealth, but there are also other people doing this in stealth. And I’m just curious at what point you became aware of some of these other projects, whether it’s Microsoft HoloLens, or Valve with the AR lab that Jeri Ellsworth was setting up, or Palmer Luckey with Oculus, which eventually had the Kickstarter in August of 2012 — a pretty public thing that happened within that month that put VR on the map. But there were all these other AR projects that both Magic Leap and Microsoft were working on. And so I’m just trying to get a sense of this period, because it’s really an interesting turning point in the history of XR. All of these things were independently happening at the same time, even with Sony and their PlayStation VR. They were all pushing the technology forward, incrementally. And I’m just wondering how you came across any of these other projects, or if you had any connection with them before Magic Leap came out?
ABOVITZ: Yeah. I’m going to tell you about our intersection with Valve, and almost-intersection with Oculus, which is fun. But also, you might get a kick out of this — I kick myself over it. If you look at our designs, like 2010, 2011 — if you’ve seen those renderings of what people are projecting the Apple VR system to be — we had done all these systems of, like, “Well, should we do VR or AR?” And then we had designed this thing which was VR with video passthrough.
We actually filed a bunch of IP on that. And I have all these drawings that are sitting in Magic Leap, owned by the company, of video passthrough, and all these sophisticated optics to do that. And it was like a thread. And I thought, in the end — and you could say, “I was stupid” — in the end, I thought that thread was not the way to the mountaintop.
It was the way up the mountain. You can go up the mountain quickly. Maybe you should have done that, because it was probably a multi-billion dollar fast path. But I was more of a theoretical purist on trying to solve this physics-meets-the-brain problem. And I’m like, “We could do that to cheat to get somewhere, but you won’t get to the mountaintop.”
And the goal of Magic Leap was, we were going to figure out how to get to the mountaintop and try to climb up that mountain on the right path, the one that really takes you there. And I knew the other one would be this quick hit — I called it “drinking seawater.”
And literally, all the stuff you’re seeing come out now, in ’22 — VR with video passthrough displays. I have this whole bunch of renderings of those things, and they would have worked. We could’ve been very early in that. I know Valve did some things a couple of years ago on that. But I felt like it wouldn’t ultimately meet the goal of actually using the brain as it evolved and as it was designed through its evolution.
I kind of felt like there was something beautiful about the way the brain lives in the organic world, and it’s perfect in how it images with the physics of the real world. It’s just amazing. And I felt if we did this other thing, we would break that equilibrium. One’s billions of years old. One’s tens of thousands and millions, and like, why would we break that? That would be like arrogant of us. So I wanted to do this, like, “Can we somehow find a way to slip in between those things?” Which is like a much harder path, like an insanely difficult path we took.
So anyway, we decide that’s our path. We hear about Oculus. And we saw all this stuff going on. And you know, Brian was living in Seattle — Brian Schowengerdt. So we’re on our way to San Francisco. We’re going to Sand Hill Road, meeting with investors — I think doing our first significant round of funding after me angel-funding the company for a bit. Like, we needed more. And I had hit a limit: if I went any further, everything I had put away from Mako would be gone and I would be like a poor college student again. And I think my family was like, “You’re not going to do that twice. You bet everything on it. It worked. That’s put away so you can have a secure life, but you’re not touching that anymore. You can use this much to get the company going. Now you can go see if other people believe in the idea.”
So we go out to the Bay Area, and while we’re there we get invited to go meet Gabe at Valve. And I’m like a huge Valve fan — love Valve. Michael Abrash was there, Carmack was collaborating with Palmer Luckey. And we’re hearing about — you know, we’re reading everything, and friends in the network are talking about this amazing open source VR community that’s being formed.
And what I knew about what was happening with Oculus at the time was there’s this group of people working on it: Carmack from id Software, Abrash from Valve, Gabe kind of like the Yoda pulling everyone together. You know, there was this whole super commune feeling to it — it felt to me like the Homebrew Computer Club, like everything’s coming together. We’re going to reinvent computing. Very idealistic. You know, Gabe in particular was open and idealistic.
I thought it was amazing that id Software’s collaborating with Valve and all these open source people. And no one cares about IP or copyrights or anything. I’m like, “It’s just going against the grain of how the whole world operates.” And it felt incredibly utopian. I was super excited by this. Like, maybe this is the Dawn — this is the Age of Aquarius, Part Two. Because I was really into the idea of the Homebrew Computer Club and this idea of open computing. I thought that was awesome.
So we get invited up there. I remember hanging out with — I think it was Brian, Sam, a couple of the other guys in San Francisco. We’re at this hotel near Stanford, and we just had this late night talking, like, “Well, let’s just think this through for a second. There’s this whole open source community. And I think they wanted us to be the augmented guys.”
Gabe is like, “You guys come up, and do this.” You know, Brian and UW had a great reputation. They sort of knew that we were partnering up, and it was like, “You guys solve that problem. We’ve got this amazing game engine. You know, the Oculus guys have VR handled. We’re just going to change computing. We’re just going to catch everybody by surprise.”
And, you know, if we could go back in time, like in a Doctor Strange movie, and make that reality happen — one where they did not sell to Facebook, everyone hung out with Gabe — I think that future would have actually been kind of awesome. Like, that would have been this great benevolent XR, and everyone in one nice big open platform. You know, it was very Web3. It was very decentralized. It was very super cool. This is like 2012 maybe, you know, before Facebook bought them.
And then, in the end, we decided — We had the Spidey Sense Alarm. Something did not feel right about this utopian notion, because we’re like, “No one’s signing any agreements. Everything’s wide open. Like, has the world really gone into this, like, post-capitalistic, hippie commune, techno, solar punk world?” You know what I mean? Like, everyone wants that to be the case. And I was like, “Well, yes, let’s do it. We’re all very idealistic.”
And it turned out that the Spidey Sense was right. Because not long after that, Oculus was acquired by Facebook. And we’re like, “Oh my God! Valve’s going to go nuts.” And so, I think one of us called Abrash and we’re like, “Oh my God! You know, what happened?” And he’s like, “I’m joining — I’m joining Facebook.” And we’re like, “What?!” It was like this — you know, is that a bad thing? Or a good thing?
But it just felt weird that this utopian dream that was hovering around Valve and Gabe was kind of shattering. And this new element came in — was it Jedi or Sith? I’m not going to say that right now. You can form your own opinion. But it felt like the Jedi Alliance was getting slurped into something else.
And then we heard John Carmack went there. And Gabe stopped talking to people for a while. This is my view of it. You can have them on your podcast. They could say, “I missed it.” But it felt like this moment in time where we were all going to hover around Valve as the center, as the new utopian capital for this new form of computing — Gabe, by the way, had the scale. He had the credibility. You know, he had the funding to probably pull it all together.
Like, again, if I had a time machine, I think that would be such an awesome future. Everyone would be really happy about it. And then that didn’t happen, and it went one way. The device was named properly — a great “rift” happened. And one group of people went that way. And then others scattered — like, Jeri scattered from Valve into forming her own company –
BYE: Well, she was — She was fired from Valve, more accurately. It wasn’t that she just “scattered,” that she was actually forced out because of some of the things that were happening with Abrash. I mean, I have her telling that story, but –
ABOVITZ: For the record, I’m a huge fan of Jeri — super genius, one of the great hackers in computing. She’s underrated. On the record, somebody should give her a lot of funding. So I’m putting it on the air there. I think the fact that she’s made it so far — it’s been too difficult for her relative to other players. And I think she should get her due, because I think she’s got a view of how to do it which is different from others. And I think it sits in the ecosystem. So anyway, that’s my plug for Jeri. She should get her due. She’s great.
BYE: Yeah. I had a chance to do an interview with her at AWE 2021 where she told her version of the whole history and story. And one of the things that she was saying was that her perspective was that “AR happens before VR.” Because what Gabe’s initial vision was, was to have people at home sitting around the dinner table, all playing games together.
And the technique that she has with Tilt Five, with the retroreflective material, works perfectly for that — in terms of having, like you were saying, a real experience of the physics of the light that you see. But in that context, it’s very fixed and static, on a tabletop scale, and mostly for gaming. And so for that context, it works great.
But when you’re moving about the world, I feel like there were probably, in hindsight, a lot of innovations that had to happen with computer vision and artificial intelligence to get to the point of having it fully mobile and portabilized. All of these core technologies are, in a lot of ways, what Magic Leap — as I look at the company — was trying to innovate on all at once: the optics, and the digital lightfields, and the operating system, and the whole portable mobile computer that you have on your head.
Really ahead of your time in some sense. But at the same time, there was a complex of exponentially difficult problems that all had to come together, all at once. And I think at this point, the technology is maturing to the point of being able to really pull it off at scale. But back when you were doing it, VR had shown that some of those foundational bits and pieces needed to be in place to really grow out the ecosystem, like the Quest tracking and the computer vision aspects.
So you had your own track of all that stuff, but I think there’s a very Hegelian, dialectical process of looking at history and seeing how things develop and how they play off of each other. And it did go towards having these companies with a lot of access to capital. And I think, you know, it’s hard to say whether or not that open source utopian vision would have been able to get to the scale we’re at now.
It may have been super small and still kind of a hobbyist thing, not really on the radar of people talking about the Metaverse in the mainstream. I think the history that we are living with in this version of the multiverse is that it did go the corporate route, and it was highly capitalized with billions and billions and billions of dollars –
ABOVITZ: Tens — Tens of Billions –
BYE: Tens of billions. –
ABOVITZ: Easily, tens — By the way, Kent, I got to talk to you about the tabletop gaming, because if you go back to the very beginning of it — again, I got introduced to these two comic book writers, Anthony Williams and Andy Lanning, who had done all these great comics for Marvel and DC. They knew a friend of mine, Richard Taylor, who co-founded WETA Workshop, the guys that did The Hobbit, Lord of the Rings, all of that. Richard, by the way, was a founding board member with me at Magic Leap. So he was kind of one of the early fellowship. You can think of him as Aragorn or something, you know, maybe our Gandalf.
So he introduced me to Anthony and Andy, and they’re the guys who created comics on projects I was doing out of Magic Leap Studios. Like, we made comic books and we were brainstorming around them. I think I had a friend who was a film director who might have been part of this in Los Angeles. And one of our first visions for what Magic Leap would do was a bunch of people sitting around a table at home playing a game we called Monster Battle.
The idea was like — We had this rendering. This is what we showed early investors. You’d have this pair of glasses, and each person would have their own monster. So I’d have, like, a Godzilla, you’d have a King Kong. So it’s like all the classic movie monsters. You dropped them on the table and they would fight on the table. But the thing we wanted to do at Magic Leap was, like — it got really serious. If you wanted to level up, you would take it outside. You’d grab your little Godzilla. I’d grab my King Kong. We’d go out to the playground. You’d drop it on the floor and suddenly it gets full-sized, like 20 stories tall. You know, King Kong goes to full-scale size. And with what we envisioned, you would still see it.
So we wanted to have this idea that you could start with, like, the Star Wars kind of chess game of, you know, famous movie monsters — King Kong, Godzilla, Mothra, and all of that. And what we did was we reinvented all of them into classic characters for what we called Monster Battle. We ended up using those characters as all of our optics tests for every version of Magic Leap from the very first prototype onward. We named one Gerald, one Al. Like, Gerald might have been the King Kong one, and Al was the Godzilla — I might have them mixed up.
And they were a little bit more Pixarized. But anyone who came by early would see these things there. But the dream was you’d have — kind of like Jeri is doing with Tilt Five — these tabletop-sized creatures in a Star Wars chess kind of thing, which I thought would be awesome. And they’d battle with each other.
The leveling up was our big dream, that you could somehow take it — and the whole world would be digitized — and you could drop, let’s say, your Godzilla outside of your house. And it would just scale to, you know, 30 stories tall — way past any video game, any board game. But it would also know where the whole world was. And we designed a system that would ultimately get to human field of view, while seeing the real world directly. And you would be able to see this Godzilla running around, and it could even run away — like, it might run from your city and you might find it in Chicago three years later. And you could look for it on Google Maps or in a satellite view, and it would be there running.
And if you went there and you had your system, it would be in that place. So we were envisioning what I called at our Leap Con, the Magicverse. We’re envisioning that like in 2010, 2011, this idea of an entirely scanned world where every single object could be known and co-located. And the whole world would be like one big storyboard, one big game board, so that you can go from a table to the inside of a house to the outside of the world.
We realized that was a very ambitious program. It’s a bit like SpaceX’s Occupy Mars. But we had this very big idea that you’d have infinite layers on the world, objects could be of any size, and an infinite number of players, which had all these complex computing problems. But it’s on the same scale as, like, “How do you go occupy Mars?” It’s going to be a really hard problem.
But you’ve got to go step by step. You have to go Mercury, you have to go Gemini, you have to go Apollo, and then keep going. So that was the view. But we were intersecting with what was happening at conferences and through friends. And I think at times we felt — We kind of looked at ourselves: “Are we making a huge mistake?” Because the VR path was quick, and much easier, because you could use off-the-shelf optics. You didn’t have to solve all the cockamamie problems that I’m describing. You could literally just whip the thing together and go. I mean, probably with a 50th of the problem sets we needed, you could just build a decent working VR system. It was just a whole different bag of problems — a much more consolidated problem — and I think that’s why it came earlier.
But I call that “The Early VR.” If you want to go to the ultimate version where you put something on and it really is indistinguishable from our actual reality, you have to solve all the problems Magic Leap tried to solve in what I called “Neurologically-True Reality.” So our view was you would play the long game, the really long game, which could take into the 2030s. Then you get AR, you get VR, you get everything, you win the whole game.
So I think of it as a very long race where there are winners in different spurts. I still am not sure what happens when we get to the finish line, when it all gets resolved. My guess is around 2035. You’ll see the really big winners, and you’ll probably hit what I call “Neurologically-True Reality” in augmented mode, in virtual mode. People will be like, “Oh my God! This is Ready Player One, Matrix level.” I think we get there by then.
But that’s like SpaceX finally setting up a base on Mars and achieving the Occupy Mars that they wear on their sweatshirts. So I don’t know if that makes sense, but that was like our mentality at the time.
BYE: And as I hear you speak, some of the themes that come back again and again are the focus that you have on these entertainment and immersive experiences, being inspired by sci-fi, the storytelling. When I went to Leap Con — I have to say that I had been going to Microsoft’s HoloLens demos at Microsoft’s Build. And a lot of the demos there were, like, nothing to write home about. They were all just about how to use the AR technology in an enterprise context, which wasn’t really a consumer product. But when I went to Leap Con, there was a lot more focus on immersive experiences, and Meow Wolf had a whole mech robot there, and different entertainment experiences.
And as I see all the different focus on Alternate Reality Games, ARGs — hiring Sean Stewart, you know, one of the creators of ARGs, and also Neal Stephenson, you know, the author of Snow Crash, a science fiction author — I’d love to have you just maybe take a moment to reflect on not only how the stories and IP like Star Wars have influenced your thinking on this vision of the future of Neurologically-Real Computing and augmented reality, but also how all the other sci-fi may have been inspiring this vision that you were going down.
ABOVITZ: Well, that’s great. And I also want to like unpack and maybe correct a couple of things that are just general misperceptions in the market. So there was no doubt a deep love of creativity and storytelling and games and film and music and comics, not only from me, but from a lot of the early folks who joined. It was like, absolutely no doubt.
And we didn’t have a problem letting that shine. I call that “The Yellow Submarine.” You know, like the Beatles’ Yellow Submarine. There was also the aircraft carrier of Magic Leap. And the part that people have a hard time characterizing is: “Could you be left-brained and right-brained simultaneously?” And I think that was not a clean story in the news. Like, they just couldn’t grasp that idea. And most people are not both at the same time.
But I had this idea that we could be both. And what I mean by that — from graduate school right through to starting Magic Leap, I had been in an FDA-controlled environment, doing robotic surgery, which is highly regulated, incredibly safety-oriented — you know, the same spaceflight level of risk. You know, people die if you don’t do things right. So I had just come out of that. I was trying to inject all these imaginative game engine and animation things into that field, and it worked. So some of that followed me immediately into Magic Leap. So the outward part that some people saw was what you described. But, like, the day I started the company, health care companies immediately started to work with us.
Like, a friend of mine, who was a big competitor of mine at Mako, named Stefan — CEO of a company called Brainlab. They’re like the most advanced medtech, computer software company of their era. And they’re in Munich. I’ve been to visit them. Stefan’s kind of like a Steve Jobs type figure in Germany. He’s a real interesting character. If you go to the Brainlab site — he ended up becoming a very early partner.
He loved what we were doing. And my view is in enterprise stuff, we were going to find strategic partners like this. Now those partners were not going to be broadcasting their work to consumers. They were going to be working with us and building an advantage in their own businesses. So we had a huge array of like health care, automotive, industrial, defense, all kinds of things that we couldn’t really talk about too loud.
Some of it would appear once in a while — like, I know there was a YouTube video of Navy soldiers running around with Magic Leap doing something. I can’t comment on any of it other than what’s on that YouTube video. And they put that out. We didn’t put that out.
So we had, I would say, an equally-weighted effort where lots of companies and entities would reach out to us, many of them quietly — some of them universities, some of them large, large automotive players, aerospace, CAD companies, math, physics, all kinds of stuff — all interested, and not on that storytelling side.
And my view is — If we build a platform, we’ll let everyone build on it. I had a love for the storytelling part and the creative part. Like, I envisioned the company as kind of a weird mash-up of Disney and Apple. But I also had this idea — and you might have grown up in a similar era of computing — my computer did not care if it was running a spreadsheet or a video game or an adventure game, then going back to watching a video stream, then going back to CAD. Like, it did all of it.
So I had a thing: the first thing we would do would be like a stem cell. It would test all those areas. It wasn’t meant for one field or the other. We wanted to see — What’s going to make this group happy and that group happy? What did everyone want out of it? So we put out something that we thought was like a first version to get all of these different entities trying it.
So we had so many cool things. Like, I remember being in my office, sitting in a folding chair, and a vehicle appeared around me — a full-size vehicle. This was done with an automotive player. And you’re sitting in the car, and the parts work. You could change the color and change things. And a CAD engineer somewhere else could make shape changes — could you stretch this out? They would stretch it. And multiple people could be in that car at the same time, people in the back seat, people in the front seat.
And it was really awesome. I’m like, this might be the future of how you sell cars, or how you design cars. And we had people that did sculpting and CAD and all sorts of prototypes and pilots, because I realized it wasn’t only consumers, it wasn’t only health care, it wasn’t only enterprise. It’s all these sectors that were all going to explore computing.
I think the ones that the media picked up the most were the shinier objects, you know, the cool shiny-looking things, because sometimes industrial CAD stuff or a boring enterprise thing is not sexy for an article. It doesn’t make for a good video roll, and doesn’t bring eyeballs. I don’t know if that makes sense, but that was my experience of being in the company: there was a lot of non-game, non-story stuff going on, but it was with people who wanted to keep it close to the vest. It was their company, their work, not something they wanted to show their competitors yet. But we had hundreds of them — thousands, actually.
BYE: Yeah, that makes sense. And I’d love to have you maybe comment on the science fiction and the other ARG elements because, you know, hiring in Neal Stephenson, a science fiction author to come work at Magic Leap. And I’m just curious to hear a little bit of the other stories or science fiction influences that had an impact on you in the creation of Magic Leap.
ABOVITZ: It was weird, because I’m still friends with Neal. He’s awesome. It’s one of those weird things in life where someone is a mythical figure, and you end up signing their expense report for lunch. You know, I was like, this isn’t really happening. And it added to the surreal nature of Magic Leap as a company. But Neal joined early on, and one of the things to say about him is he’s not just a brilliant author, he’s also a scientist and engineer. And he likes to be one of the guys rolling up his sleeves, almost in a very blue collar way. Give him a hammer, give him some tools, and he wants to get in there and build stuff. Which is amazing. Like, he’s much more of a get-in-there engineer than anyone maybe fully realizes, and actually practical and smart on so many levels.
So that was just like — you know when they say, “Don’t meet your heroes.” In the case of Neal, it was “It was great to meet him. He’s amazing, and a friend and cooler than you’d imagine.” So that was all. I met a lot of other people and that bubble was burst and I don’t want to mention who they were, but it was like, I can’t watch their movie, listen to their music, or read their books anymore. But Neal, the opposite.
But there was this interesting thing where “Where was Neal going to go?” At the beginning, like was he going to go to the O company — or the F company, which might have been super lucrative, like, you know, “Here’s a big check. Come join us.” A lot of big names are going there, but for a variety of reasons he joined our thing. Because I think he liked the purity of the theory and the rebel alliance kind of nature. We were a little bit left of center.
And I created something called “SCEU: Self-Contained Existence Unit.” That was going to be Neal’s team. It was going to have 12 people in it, The Dirty Dozen. And he was going to be Lee Marvin. Everything had to have a story. So Neal was Lee Marvin, and it was The Dirty Dozen.
The name of the team was SCEU. My job was to shield Neal and the team from any bureaucracies that would sprout as the company got bigger. And they would do the most extreme edge of ARG exploration of storytelling, which was awesome. Like, take everything we’re building, you know, bring in the smartest co-writers and builders that you know. And let’s figure out: how do you tell stories? What does that future look like? And some of the public stuff that they did, like the goats, had a bit of a sense of humor. But also, if you actually look at what the goats were — I don’t know, did you ever see the goats, Kent?
BYE: I saw the demo that was shown at LeapCon. So there were like virtual pet AI bots that were maybe emergently existing within a space, and there were mixed reality interactions with them.
ABOVITZ: Well, there were like two big projects. There was the genius-level, full-unfolding Neal story, the World Project, which I think is somewhat still confidential and secret. I have to check with Neal on that, which was just awesome. And then there was goats, which was the public-facing, look-over-here project, but it served its purpose.
And the funny thing with goats — I'll tell you about one example of it. We took a room that was completely digitized, so we had a full digital twin of the room. And you could actually walk into the room with VR, take off the VR, put on the Magic Leap. And we were just going, "We have a VR version of this room with everything in place. When you feel it, take off that, put on the headset. Okay, now it's the real room." And we were trying to see, could the AR and VR elements commingle?
But goats was — Could you create sentient creatures that knew the world? Could find their way around objects and interact? And it was a step towards, if you're going to tell stories one day across the whole world, well, you need that. You're going to need all kinds of sentient things running around, and they need to know where the world is. They need to go under tables, over tables, on things, respond to you. And there was a version of goats that actually was pretty damn amazing. It was just, like, very next level. Something's going on behind a couch, on the couch, jumping on a table, jumping off a table, and there it is. And then you punch a hole in the wall and you see, like, a window. And with the team's sense of humor, there's like a 300-foot goat out in the field, and then they go to jump in through the window, into your room, onto a chair. So they were sort of exploring the mechanics of sentient creatures in both AR and VR environments and how they would blend back and forth.
And then we wanted to open source the whole thing, make it a big open platform, and have lots of devs. You know, they created this, like, Goat SDK, and the idea is they'd have lots of devs remix it for any device all over the place. So Neal was going to be like the punk Seattle indie band version of our dev kits, open sourcing goats and trying to create this idea of story tools for everywhere. That was sort of the goats mission.
And then the behind-the-scenes one was full, novelistic storytelling, using every technology trick we had learned. I'll say this about it, and then maybe one of these days you'll convince him to come on and see what he'd tell you. Because I don't want to mess with that piece, because maybe it gets unlocked one day and Neal will do something.
But we rented this building in Seattle, and I think it was a third or fourth floor. And we digitized, I think, several miles around the building. So you have this digital twin. And the idea was you'd sort of find your way somehow, like you'd find a way into this building, and you'd unlock all this stuff. And when you look out the window, all these things would be happening. And he actually made that work. It was just next level. It was really cool to look out a window and see real people and real traffic, and then other things that really weren't there, knowing all the traffic and people, and jumping around over stuff at this wide scale.
And if you can imagine that, knowing something like the Magic Leap 2 now exists: if you imagine you're Neal, you know what's coming after the Magic Leap 2, multiple generations out, and you're designing for that. So he wasn't thinking there's only going to be a Magic Leap 1. He knew multiple generations of where we were going and what these end state things would look like.
So he's like, as an author and a science fiction storyteller, how awesome would it be to tell stories across the world using the best versions of these things, where we want it to feel seamless, the device to get lighter and lighter, and less like you have anything on at all, and still have amazing resolution, wide field of view, the sound and the subtlety. So you just forget there is technology. And if that was true, the storytelling you could do with that was amazing and next level.
So we had Sean Stewart, just a next-level, brilliant guy. I hope that what they learned and prototyped there makes its way out, either in that form or in some other manifestation, because it was just incredible. It was one of the things I'm actually biting my tongue about and hate, because I wanted millions of people to see it. If the pandemic had not happened, we were probably going to be doing something with one of the major studios, taking a lot that was maybe 40 or 50 acres, an entire giant studio lot, digitally twinning the whole thing, dropping Neal's story on top of all of it, and bringing tens of thousands of people a month to beta test that next level.
And then if that worked, we were going to unlock it across, like, a region of the United States. And if that worked, we were going to unlock it across the whole U.S. So, like, publishing a story from Neal and his team across the whole country. And that's the kind of thinking Neal was doing with his SCEU team. It's totally next-level awesome.
I really hope somehow, despite the blip of the pandemic, it'll find its way there, because it was just cool. And it's not like, "Oh, there's a little creature on the sidewalk, click it, get ten gold coins." It's like, "No, it's a full, novelistic way of telling a story across the whole country," which I thought was just sick and amazing and super cool. And the world needs to see that one day.
BYE: They also had an ARG, or at least a novel that Sean Stewart, Neal Stephenson, and one other coauthor wrote. I think it was from a project that had started maybe at Magic Leap, and they released an audiobook version of some aspects of it. And I don't know if that was separate, but it was some sort of innovation that had come out of their collaboration there that is out and available for people to listen to. I haven't had a chance to check it out yet, so I don't know if that's connected to all of that.
ABOVITZ: Well, it's connected to the SCEU stew, you know, like the stew of fun things they were brewing. That novel escaped out of their lab and made its way to an Amazon partnership. I think Neal's had a good relationship with the Amazon team. And it became an audiobook.
By the way, there were a lot of things seeded in the world by that team that nobody fully realizes. It’s out there and soaking. So if they ever get to activate the whole thing, it’s there. But I won’t go more than that. But it was really cool. Sean is clever and Neal’s clever and Austin and those guys, they just — So you know, you could drop things and it’s soaking and soaking and you’re like, “Wait a minute.” And then they can activate the whole thing. But you know, it could have a long game. Like maybe that whole story will come together. I hope it does.
BYE: Yeah. ARG people discovering things and the story unfolding there. So it sounds like there’s a lot of stuff yet to be unfolded there.
ABOVITZ: I hope so.
BYE: But there was a metaphor that you'd used where you're talking about the different iterations you have, the Mercury, the Gemini, and the Apollo, where it sounds like you were having to have these multiple tracks of innovation. Because of the big lore from the people that I was able to glean some information from — there's, you know, lots of NDAs and secrecy around Magic Leap, so people weren't even supposed to really acknowledge that they had seen any demo. But in the course of talking to enough people in the XR industry, and people who have had the privilege of going down and seeing the demo, there was a big, massive refrigerator demo. But then eventually you had to portablize it into what eventually became the Magic Leap 1, and then Magic Leap 2, and then further iterations.
But you had these multiple tracks. As much as you can tell me, what were people seeing? What was the demo and the content if they would come down? Because you were flying in all sorts of different investors, celebrities, people from around the world coming to see this magical demo. And by most accounts that I heard, it was pretty mind-blowing. I even did an interview with Paul Reynolds, who told me about an experience that had this synesthesia effect, where he was seeing a light field but was able to feel some haptics. And so you get sort of this phantom touch type of phenomenon that you're able to evoke using this digital light field technology. But what can you tell me in terms of this state-of-the-art demo that you were able to put together at Magic Leap, that has been shrouded in secrecy for years and years?
ABOVITZ: Yeah. So that — that's great. Because there was that era where we left my garage into, I call it somewhere between a strip mall and a warehouse complex. And we were in this innocuous-looking place with, like, a glass door. And I think there was an accounting firm nearby. And there was this gun range. There were shots all the time. It was grandfathered in, like it shouldn't have been there. But like, in Florida, this gun range got grandfathered in. It's been there since, like, the '30s. And it's like a half mile away from the parking lot. So you'd hear these pops all the time. It was very grungy.
It really felt like an X-Files or Men in Black kind of place. Like you'd never, ever expect any of the stuff we were doing to be happening in that place. You'd expect it to be, like, a laundromat, maybe a bad diner, a low-grade accountant, and then, like, a Magic Leap Tourist Agency or something. It really felt like we were completely out of sorts. The kind of tech company we were would make sense in Palo Alto. It would make no sense in Hollywood, Florida. That place was where I grew up, in Hollywood, and it was like there's zero chance of a tech company being there.
So first of all, we were really safe because no one had any idea what we were doing. So I think there was also this notion of coming to Florida, coming to Hollywood, having no expectations, looking at this super modest, grungy strip mall warehouse thing, walking in through a door. And you’d expect that to be like a tourist agency trying to sell you a ticket to, you know, go to the Bahamas.
And then you go to the back room. And there’s like this contraption that literally looks like something out of Brazil or, you know, the movie Brainstorm. It was really gigantic. We made this thing, and I’ll tell you the purpose. I won’t tell you all the details because there’s still IP and trademarks that Magic Leap will use, and patents on it.
But the purpose was we wanted to test our theory of modifying digital lightfields. So I separated the idea of the pure analog lightfield — and I used to say this in public, which I did not think you could ever reproduce — from coming up with a way of digitizing lightfields. And like, "What is a digital light field? And how do you make it?" So we built a machine that made digital lightfields. It was big, and it had all kinds of weird knobs and computing things, with all these parameters that we could play with.
And the general notion was, I mean, very simplistically, think about your original PCs. You had the computer, and you hooked it up to a monitor. Well, here, there is no monitor. We're going to hook it up into us. So when you put yourself in this rig, you actually had this thing that mounted on your head, and I'm like, "You're now completing the circuit. You're plugging in. And you're the monitor." So I was like, "Can we build a computer with no monitor? And we're the monitor." That was the experiment. And we wanted to do that without jacking in like The Matrix, where you'd actually have to wire into your visual cortex.
And remember, I came out of computer surgery, where I actually saw surgeons stick electrodes into brains for Parkinson's treatment and things like that. So I did have a notion that maybe our thing could not work, and you'd have to do that, like, you know, electrodes in the visual cortex, which I did not want to do at Magic Leap. I thought that's horrible and invasive, and we shouldn't do that. The only thing you should do that for is someone that has Parkinson's, dystonia, a real disease, and then you do brain surgery to help them. But don't take healthy people and throw wires in there. I thought that was just awful and bad.
So the idea there was we wanted to prove we could get in — I called it "Knocking on the Front Door." And I'll try to explain it this way. The idea was the retina is a doorway, a keypad, and it had a programming language — that was the theory. And that if you could unravel that, that would give you access to the brain's GPU and display system in the natural way. That if we could figure that out, that would be the right way up the mountain. That would be the ultimate.
So instead of warping the interface between your brain and whatever signal is coming in, you send the brain the signal that it wants, the right one. And there's two questions about what the right one is. There's the raw analog signal, and then there's the digital signal that we were chasing. And the theory was there would be a digital signal that would cause no stress, no nausea, no nothing. It would just be like sipping water, or just like standing in a forest in Oregon. It would just be perfect. And if you got that perfect, you could use it all day, every day, forever. Everyone could use that. You'd have no stress on the brain. None of the things that people still have to tolerate with all these devices.
We were hunting for that, and that machine was like our SETI experiment, or like going to one of those CERN labs where you have all this complex stuff, to try to find that set of parameters that unlocked it. And I think the moment of breakthrough was we actually came to a point where we felt like we had unlocked some things. That was the point where we started to bring people by. And we're like, "Are you also seeing this?" And we brought by hundreds of people. And then thousands of people. So it wasn't just myself and, you know, the original Fellowship of the Ring folks.
We brought by all kinds of partners and investors — and of every type. Like, I can't even mention all of them. But it was pretty incredible. Especially when you walked into this innocuous place, you saw this machine, and then you saw this very special thing happen.
The weird part was, there's no way to take a picture of it. This was not something that wanted to be photographed, because the CCD of the camera would not respond to our signal the way your brain would. The signal was only meant for the human brain. Not for a cat, not for a dog, not for a camera, not for an iPhone. So when we would try to take photos of it, it wasn't the thing that we saw. It immediately lost all of its magic.
The way to explain it: if you go to the beach, or go to the woods, there's this beautiful spatial perfection of reality. And then you look at every other thing, and it's just not that reality. And I think for a moment there, on that first device, people were seeing that maybe this is the way. We were opening up, "How does the brain really generate or render reality?" And that's what made that machine magic.
That’s what we showed, you know, Larry and Sergey. They thought it was super cool. Now, that machine was gigantic. But could we have taken a shortcut and not done that and built a VR system? Yes, you’d avoid all of that. But I think at the beginning of the company, I was chasing this like almost Platonic Ideal of “How does the brain render reality?”
And I think we touched it for a moment. You know, it was this brief moment where we go, "That's it! That's the math. Those are the parameters." And it was a lot of folks all working together at the same time. We had these brilliant optics engineers — I think Jan, Ethan, Randall, Sam, Brian — just a lot of brilliant folks all together.
And at some point working very selflessly to make this thing happen. And many, many failures. The coolest part about that early period was we felt like we were the Wright brothers. We crashed 700 times, but once we had that first 50-foot flight, we were like, "Did you just see that?" And then the next person would come running into the room like, "Holy crap!"
It felt like something new and magical had happened. And the part we were all freaking out about was, "We should probably publish this. This is an interesting moment in engineering." And we're like, "Are we building a company? Or are we doing a Scientific American article?" So we did bring enough people by who saw it, who know it was real. But it took on this lore, because instead of publishing it, we kept it quiet, because I was worried about alerting every single big company in the world.
It turns out my worries were real, because they did all come after what we were doing. And they did all come after this field full force, with tens of billions of dollars. So, I mean, I was worried that this would happen, because if you unlock this possibility in computing, it could dislodge everybody. It could unseat the biggest companies in the world if computing goes that way. So we thought this was a glimmer into the next decades.
It was really awesome — I wish I could bring you back in time to that machine, because it was really cool. And it almost felt like a one-off Stradivarius. It was so tuned, so taped together. Everything was just hanging in this moment. It wasn't like tuning a six-string guitar. It was like tuning an 18,000-string sitar. And we just got it right, and it was like, "Whoa!"
But then we're like, "How do we take this 18,000-string sitar and make it into a compact, manufacturable, six-string guitar?" I don't know if that metaphor makes sense. But then the problem became going from the theory to: how do you scale and manufacture this whole thing? And how do you still keep that magic in something smaller, at a much different price point, that can't be the size of a room? All those kinds of problems.
BYE: Yeah, well, not for lack of trying, I tried for three years, from 2015 to 2018, to get permission to come down and see it. But I was unsuccessful.
ABOVITZ: By the way, Kent, knowing you now, I will apologize to you from my former self. And now I would have brought you by. I apologize for whatever parameters did not let you go there and other things.
BYE: Well, I mean, maybe that's a good transition. There's a number of hooks there that you just mentioned. You know, I did an interview with Paul Reynolds in 2015, because he was a listener of the podcast and he wanted to get more information about what you were doing there, to just encourage people to come to Magic Leap to work on whatever you were working on — because there wasn't a lot of information that was out at the time.
And at the same time, I feel like, as you started to come public — so I got my DK1 in 2014, and you came out later that fall with the funding from Google Ventures [CORRECTION: The funding was actually from Google Inc., and not Google Ventures.], another big funding round where you were really coming out of stealth. And then there was this period leading into GDC 2015, where the HTC Vive was shown to all the game developers, having just been announced at Mobile World Congress a few days earlier in March of 2015.
And so it was this period, from 2015 up until LeapCon in 2018, where you were still in a pretty stealth mode. And there was this, what I would say, challenge around the secrecy that you were talking about, in terms of not wanting to give away all the stuff that you were working on and have all these big companies just scoop it up, with billions of dollars to pour into it to leapfrog whatever you were working on.
But also this challenge of — if you go back to 2010 and 2012, you were on this independent strand of thinking about the future of computing, this "Neurologically-True Reality" as you call it. But VR on the other end was putting forth visions of, like you said, the Homebrew Computer Club, with Oculus representing this open source movement with dev kits going everywhere, flooding the market with the innovation that happened from 2013 to 2014 and onward, where really the key turning point was having those dev kits in the hands of developers.
And, you know, I just wanted to make a comment on this thing that happens within VR, which is the vergence-accommodation conflict, which means that you have a flat screen that you're looking at, but our vision is actually dependent upon looking at different depths and focusing at different depths. And there's something about the virtual retinal display technologies, and probably the stuff you were doing with digital light fields, that avoids this vergence-accommodation conflict. So you avoid the eyestrain that happens when you use these different technologies. So you're trying –
ABOVITZ: Well, that's one of the issues, not the only one. The vergence-accommodation conflict is one of the well-published ones. But there's a lot more. And one of the things that we unpacked over the years was how many other things you need to get right. The one that was most public was that one. But then think about all the errors that cause strain, nausea, discomfort, all these things. There's actually a lot of them.
And we had this brilliant team that was measuring clinical and scientific impact. We worked with all sorts of neuro-ophthalmologists and institutions to try to understand what was safe. First of all, there's "What's safe and healthy?" That's like one border. And then, "What's, like, perfect?"
So we realized the playing field was, first you have to be safe. I wanted to make sure everything we did was safe and would not harm any user. So there were VR systems — I think fewer of them do it now — but they were aggressive, and they were actually pushing this kind of diopter mismatch to a point of harm. And we actually studied that. I think we even did some papers. But there's this whole thing of, you get that too high and there's harm. So I think there's a place where you get that good enough that you're safe: not going to cause harm, a bit of discomfort, but not harm. And then there's, like, neurologically true.
So I wanted to be safe, and then get to neurologically true. And our first machine showed neurologically true is possible. And then I realized how Mercury, Gemini, Apollo would go — You'd have to be tiny, small, safe, with all this other stuff: computer vision and AI and a wearable computer, all this other stuff that had nothing to do with the optics anymore. But you couldn't be not safe. You always needed to be safe. And then how could you go from safe to neurologically true? And then from neurologically true to perfect?
So that was the idea of Mercury, Gemini, Apollo. If you were on the engineering team at the time, or in the leadership, you kind of knew that was our goal. And it was this constant trade-off where I'm always pushing for neurologically true, and the engineers are pushing back. It's like, "Can we live with safe right now? Because we've got to pack ten other startup companies' worth of stuff into this device." You know what I mean?
Like you mentioned at the early part of the broadcast, we had to do computer vision that sensed the world in real time. We had to make pixels stick without wiggles. We had to do sound fields. We had to make things light. You had all the thermal issues about something so small generating so much heat if you put so much computing power in it. So it was issue after issue after issue. And then you've got frame rate that needs to be really high, which creates more heat, so then you needed a bigger computer, but you want to make it smaller.
So there's this endlessly maddening set of opposing forces. And I'm in there screaming all the time for Neurologically-True Reality. And I think at times some of our engineers were pulling out all their hair, because they're like, "How are we going to make all of this happen?"
And if you talk to some of the folks who were there in the heat of the time when we were shrinking the big machine to the gen one, and then to what's now public as the Magic Leap 2: it was one of the hardest engineering projects any of them have ever worked on in their life. And some of these folks came from the biggest companies in the world. Some folks came from NASA, some folks came from Apple, Microsoft, Google. I think we were the hardest thing they'd ever worked on, ever.
BYE: Yeah. So it sounds like you had, in your secret lair, this neurologically-true, big, giant, refrigerator-sized machine that you needed to split up into the Mercury, Gemini, and Apollo — the Magic Leap 1, Magic Leap 2, and future iterations — where you're going from the safe and aspiring towards that neurologically-true vision. And at the same time, coming in cold in terms of, when you came out in 2014, VR was still emerging, but AR wasn't really on the radar as much. I don't think the HoloLens had even been announced yet. That came a little later. But then –
ABOVITZ: We didn't even know that they were up to something. I just kept hearing rumors about this thing called Natal, I think. And we were just, you know, verging on paranoia about what Microsoft was up to. Because we're running, and they're trying to nab some of our people, and we don't know what it is. And I remember I was in the Bay Area with a couple of our team, and then we hear about HoloLens getting announced — the HoloLens 1. And we had the Magic Leap One PEQs already built in our building — a PEQ is a Production Equivalent.
We're like, "Oh crap!" It felt like when the Russians got into orbit first, you know, when they put the first man in orbit. We thought of Microsoft as our arch enemy in that race. And when they put out the HoloLens 1, it was like, "Oh no! Now we've got to scramble. You know, we've got to get our first mission into space."
That mentality was so — It felt like a race on technical proofs. So our thing was that we were going to have a bigger field of view and this and this and this on our first one. But they beat us to the punch. I have to give them credit. They're also a thousand times bigger than we were. But, you know, give the team credit. They got out there with the HoloLens 1.
But we didn't know about each other, really. That's the interesting thing. You don't really know who's doing what, because they were in real secrecy too. And who the heck knew what was going on at Apple? You know, it was like cars racing in the dark against whoever you think might be there. It was a very interesting period.
BYE: So as you started to come out — I guess from my perspective as someone in the XR industry, going and talking to lots of people, there was this challenge that I see from your side: you were having this experience of approaching this neurologically-true experience, but yet you have to sell it and explain it to the world as to why this is even interesting.
So you have what I see as a lot of aspirational visions of what types of experiences might be possible, these renders or at least visions. Because, like you said before, with the TV-on-radio problem, you can't even really necessarily capture what's happening and what the experience is. You know, you can't capture and recreate the full immersive experience of what we see, and what it means — the Place Illusion and Plausibility Illusion from Slater — to really feel like you're in another place.
But here you're grounded in reality. So it's more like you are already in another place, but you're giving the plausibility illusion of all the stuff that's coming in, believing that there are these other layers of reality. So I feel like the big thing that I saw, at least from Magic Leap, was these aspirational visions of different renders. And I remember going to IEEE VR in France in 2015, and people were reacting to the video that had been released.
ABOVITZ: Was that the whale?
BYE: They had to make a little disclaimer that said whether this was a true capture or whatnot. And people were like, "No, this is a render. There's no way that whatever they're doing, it looks like that." So there was a distrust, I think, that had started to build up around the promise of what Magic Leap could be, versus the dialectic that you were talking about: the idealism of the neurologically true versus the pragmatic reality of what the first iterations, that Mercury, could achieve, what the direct embodied experience of that was, versus what was possible on a long trajectory of things.
I think that's, from my perspective, the kind of mismatch between what was being said would be possible versus the actual reality of what the experience was. And I don't know what your reaction is to that, but that's sort of, from my perspective in the VR industry, what the narrative and buzz was about Magic Leap.
ABOVITZ: So let me pick a couple of specific examples and unpack them. One that has entered like a low-grade myth was this whale in the gym. And I'll just give you the background on that. So we were working with Lucasfilm, with ILMxLAB. We had announced that partnership, and John Gaeta was there, and then he joined Magic Leap. John Gaeta won the Academy Award for The Matrix, a visual effects genius. He worked on just crazy movies, including The Matrix and all kinds of stuff. Just a genius guy to work with. It was super exciting.
So Lucasfilm did a concept video of the whale in the gym, with the idea that we were going to actually make it on the Magic Leap. And I had this naive assumption that, like with film, you could do a trailer and tease people, with the idea that you were going to actually make it. And just to close the loop: at the University of Miami in the fall of 2018, not long after we did LeapCon, we took over the UM campus. Thousands of people came by. We took over the University of Miami gym, and we had a full-sized whale jump in the gym. And when it splashed, we had these fans that would throw air on you. And it was incredibly awesome. And we had these giant speakers for when it hit the floor and the water came up. And John Gaeta and the same ILM team did do that. So we had a chip on our shoulder where every single thing that we made a concept for, we were going to make real.
Another one that we did was the Dr. G Invaders, from Richard Taylor's five-time Academy Award-winning team down in Wellington. They put out a concept trailer of this thing we were going to build. And every version of Magic Leap prototype, including the very first giant Beast prototype, had elements of Dr. G on it.
They were like one of our best test objects, because they were so high fidelity, so well crafted. Like Gimbal, the robot, looked like shiny metal with propellers and smoke. So it would push everything we were doing, to go, "How real? How neurologically real is Gimbal?" In fact, we came up with something called the Dr. G Gimbal Test. And a very famous film director, I'm not going to say who, but one of the top three of all time in the world, came by, and we used him to grade where we were on a neurologically-true reality test.
So a perfect Gimbal was a "1". And he'd come by and say, "What level of the Gimbal are we?" So is it a 0.7, 0.8? We created this unit of trueness to neurologically-true reality. And maybe one day this film director will let me say who he was out loud. But it was really cool, because the Dr. G Gimbal was the basis of that reality test. And we called it The Gimbal Test. But we actually built and shipped that one. And the actual Dr. G, I thought, was much cooler than the video.
But I'm going to talk about the one that Lucasfilm did, which I thought was the most astounding. On the video, which we and Lucasfilm/ILM released, it said, "Shot through Magic Leap." Because we realized the world needed that — basically, that my idea of film trailers wasn't going to work. They wanted to know what was shot through and what was concept. So we then started to label stuff. So we put "Shot through." And the one I thought was most exciting has C-3PO and R2 looking shiny, rendered solid. People thought that wasn't real, but that was shot through, and it was incredible. The ILM folks thought it was incredible.
And then R2 sprays — what'd you think of that? — the classic Star Wars hologram on a table with like holographic X-Wings. And people thought there were two real metal robots, like built by the model shop, and then R2 spraying something. R2 was actually a neurologically true reality droid: solid, shiny, metallic, incredible. And then we shot through it, and it was like one of the very best things in terms of fidelity we ever did.
But I think there was that co-mingling of, how do you communicate this stuff to the world? Like, can you send a trailer? That was the very early days — the whale and the robot. And we knew we were going to release this to the world, but the world didn't know that any of this stuff was actually real.
So we had this kind of — you're right, we had this friction with people not believing. And then I'd have all these people fly into the building. Investors, partners, athletes, movie stars, because everyone was basically demanding their way in. I mean, the people I turned away, it was just insane. Like, I'd have senators call me up: "I'm demanding to come in. I need to see this." I'm like, "Sir, I can't let you in." But we had enough people come by with this "This is not real" attitude. And they'd come in and we'd show them all kinds of stuff. And they'd walk away going, "Oh, that was pretty awesome."
You're right though, that 2015 through 2018 was a weird time. Because in the building, we had developed many more mature prototypes, so people would see all these phases. We had generations past Magic Leap One happening, so people could see the next-level stuff. And it was really cool. It's like walking into an Imagineering, early-Apple-on-steroids kind of place.
Again, I apologize. We did not bring you there. I'm so sorry about that, because it sucks that you didn't — that was such a fun time to have visited. And I'm sorry that you did not go there, which is my fault, probably.
BYE: Well, I was in contact with the PR rep that I worked with. And yeah, just, everything was shut down. There were some people that made the trip out, like an MIT journalist, but for a while, yeah, I was always below the radar for, I guess, the strategy of whoever was running the comms and PR teams. That was one of my other complaints: how much the trade journalists like UploadVR or Road to VR or myself felt like, yeah, I'd see people from Rolling Stone get access. But yeah, it just felt like the people that were really the closest to the industry weren't able to come in and vet, or be able to say what their perspective was, beyond whatever the mainstream media, or WIRED, or other people that would get access to come down and see stuff.
ABOVITZ: I'd say — I'll give you the sum total of my experience in media before starting Magic Leap: I was a cartoonist and a writer for my college paper, which is whatever that is. And then at Mako, we had minimal interactions, because the tech world was occasionally interested, but it wasn't like Big Tech, like capital "T" Tech. And I had no idea what the tech world really was when I started Magic Leap, and, after Google invested, what that would become.
So I was kind of clueless about all of it. So that was a learning experience. And I don't know if I was always surrounded by the right folks — I had really brilliant tech people. But if you think about, "Did we have the right folks on how to interact with tech media and all that?" I don't think we did the best job. I think we could have done better.
We had some good relationships, like, you know, Steven Levy would come by. Kevin Kelly was brilliant. These people were legends I knew about. I think it was Rachel from MIT — you know, we had some of that. But I think we didn't quite understand how to work with the media — and particularly me, because my experience was super limited, coming out of Mako Surgical. And basically being a college newspaper cartoonist does not prep you for growing up in the explosion of how social media changed media.
So the weird thing about Magic Leap, when I look in hindsight, 2014 to the 2020s, is that the tech media world changed. It went from this positive, optimistic view of the world to something else. And also social media and the business models of journalism changed. And we were trying to hack reality and all that. It was this weird, weird thing of colliding with that.
And I had this naive view that all tech journalists were like, you know, Kevin Kelly and Steve Levy, these like tech-forward, positive optimists about the world. And some of it became more like political media.
You know, Kent, one thing you said earlier, I want to be able to address. We had this challenge of like wanting to reach partners and developers and great new team members as we’re growing up, but also not revealing to very large, difficult-to-compete-with companies because they’re so big, they have so much capital. Not all of them follow any sense of ethics at all. So we’re like, how do we send smoke signals to the people we want to hire, the awesome developers we want to work with without tipping off the biggest predators in the jungle? And that was a really complicated thing.
I don't know. You probably felt some of it from whatever you were observing. But now, knowing how much capital these guys are spending — I was getting data points, like, people are spending five times what we were, ten times what we were. Now it's public. And you realize they've spent maybe 30, 40, 50 billion. What we spent over our whole decade of life as a company, they spend every three months.
So that was intense. I mean, you know you're going up against really ferocious, tough competitors who are, by the way, raiding your team. You know, as a startup, you can offer a certain amount of pay, a certain amount of equity. They would offer some of my best engineers — guys and women — like five times what we could pay, you know, NBA-athlete-salary kind of thing. Here's a $2 million signing bonus, $3 million of RSUs kind of stuff. And it was just really difficult.
You know, the playing field as a startup trying to change computing is not level when you're going up against — like, we weren't competing with other startups, we were competing with companies that had half-a-billion to multi-trillion dollar market caps. And, you know, I kept feeling that as a startup, there should be companies like us, like Tilt Five, and others. We should be able to exist. We shouldn't be squished by the biggest giants in the world. There needs to be a way for us to come forward and exist. But they make it really hard. They make it almost impossible.
You're fighting every day at all levels, and they don't fight in the way that you'd think. It's not only — you go, if I patent, if I invent first, if I'm fast, if I've got really smart people. But then you're like, "Wait, they do all these other things to compete with you." It's tough. But look, that's the free market. You have to be really tough to fight it. We did a lot. We did some amazing things.
BYE: So yeah, as you were saying that, reflecting upon that — because there's been a lot of talk around how many billions of dollars you had raised. But when you compare it to how much Microsoft and how much Facebook and who knows how much Apple has been spending, it's on a whole different scale, because you're really talking about the future of computing here. And then, like you said, trying to do this as a startup.
I'd say one of the frustrations I had from the outside was seeing how successful it was for Oculus to make a dev kit available to all the different developers, and observing how much that was really a catalyst to innovation for the VR ecosystem, and how much even AR had benefited from that — and then to see how much more locked down the approach for Magic Leap was. Even people who had early access to the headsets had to go through all sorts of secrecy procedures. When I went to Leap Con, my impression was that there had been a long, long history of that secrecy, and that the company was trying to overcome some of it to really cultivate the openness, knowledge sharing, and cultivation of those communities.
And I ran into that personally because of the need for secrecy — not even having me go to the opening night party because there were fears that I would overhear something. So there was this locked-down nature of the company, versus trying to transition into that more open posture, but also trying to make the hardware available to cultivate the ecosystem.
And with the first Magic Leap One, there were debates about whether or not — you were saying at some points that you weren't going to have a developer kit. But then, you know, the Magic Leap One was kind of like the Creator Edition. It was kind of unclear whether or not this was a consumer device. And then, if it was a consumer device, it'd be thousands of dollars.
As I was watching it, I was seeing what happened with Oculus, and I was wishing for as much of that as possible — to get these into the hands of as many developers as possible. But given the price dynamics, it may not even have been feasible to do that. So yeah, that was just from my outside-looking-in perspective of wanting to see a broad, robust, diverse ecosystem of people tinkering and pushing it forward, but also focusing on what would be the thing to really catalyze augmented reality as a movement.
And in hindsight, not knowing whether or not it was just a matter of VR needing to come first, and whether that same model didn't have the same economies of scale because of all the different innovations and the nature of being a startup, versus what Oculus was doing — eventually getting acquired by Facebook — and how much they would have been able to sustain that type of model as a startup.
So anyway, those are some of the reflections I had as a journalist covering it. And I know there’s a lot of tradeoffs and decisions that you had to make in order to navigate this as a company, taking all that into account, the secrecy, and the concerns from all these big players, and how things have continued to play out.
ABOVITZ: Well, if you unpack a little bit the complexity of building a VR system of that era, it's a lot of just off-the-shelf stuff. And stuff that was basically benefiting from the mass availability of mobile phone components. So it's more like, can I make a not-super-proprietary piece of hardware, put a lot of software on it, get a lot of people hacking on it — that's a strategy. And it is not a bad strategy to create a whole community of VR developers, because you're not worried about that hardware.
They weren't really pushing the envelope that much on that hardware. It was about the software development environment, just getting them something. But actually, there's zero way for a company to subsidize that. As a startup, you go out of business right away doing what they did. It was super important that they got acquired, because if you don't have a parent like that, every one of those units is a massive loss, hundreds of dollars apiece. So you just go out of business very, very quickly, no matter how much money you're raising, at the scale that it ultimately became. It's all at a massive loss. You can sort of see those losses now with Facebook.
But you ultimately have to have that parent, and I think it's very hard to see what that world would look like without the parent. The thing I've got to give Mark a lot of credit for: he is subsidizing, to the tune of tens of billions of dollars at massive losses per system, the creation of that whole ecosystem of developers.
By the way, I wanted to do the same thing. I had an idea to give out 50,000 systems of the Magic Leap One Creator Edition. I just wanted to seed the world and see what all kinds of people would do. Now, when you don't have a parent company with someone as ambitious, who also sees the vision of where all this is going, à la Mark — and instead you've got this multitude of institutional and strategic investors of all types who all have different opinions — they didn't really like that idea.
I actually think, if I could go back in time, I should have fought harder for that. Because I think it would have been kind of awesome to deploy: "Here's 50,000 Magic Leap One Creator Editions. Boom!" I think that was the right idea. That was my gut. And you know, when you're shot down, you can't always fight back against people that are multi-trillion-dollar funds, you know? So I feel like —
By the way, could we have been under a parent? There were a number of opportunities where we could have rolled up under a company. You could ask, should that have happened? Like, is it better to develop like this as an independent startup, optimally going public or something? Or to be under the wing?
You know, I think if you're under the wing of someone like Facebook — I'm not going to make any comments on the morals or ethics of their business model or any of that. But you have to give Mark credit for his commitment, the long-term thinking, and the amount of capital. He understands that. I've met him over the years. He's one of the few people in the industry that totally gets how much capital you've got to put in, and the duration and the intensity to do it.
I also understood that, but I had to bring in all these different small, medium, and large-sized partners and institutions who were not always aligned around this vision. And the nice thing about Oculus is that they had this one guy they had to convince, and they convinced him early, who then from that point forward goes, "I'm going to do this, period." And then they have this unabated flow of capital. Nobody has to run around getting funding anymore. So that is a plus on their side, in that they understand the space race and they're going to push for it.
I don't know if that makes sense, but I really would have loved that. That was my original idea: drop 50,000 of these, get a lot of learnings, and then the Gen 2 would have been the thing to really go commercial with.
BYE: Yeah, and I guess to start to bring this conversation to a close with a few more questions, I'd love to hear anything you can share in terms of the end of your time working directly with Magic Leap. Because the big event for me, at least from the outside, was having to lay off half the staff, and I can only imagine how difficult it would have been to have to make a decision like that and then continue to move on with the company.
But what can you say in terms of what happened? And how do you wrap up this phase of the story? You know, we've been talking about all the constraints and limitations of trying to operate in this realm as a startup against all these big companies, and the whole economics of it. But what was it like for you, as a person, as the leader of this company, to reach that point of making this shift — stepping away and passing the baton to others to carry that forward — and to go through that waning aspect of your time at Magic Leap?
ABOVITZ: I'll say a couple of things, and I'll talk a little bit about that. But — one, the company is there and very much alive and kicking and thriving, and putting out a system that many, many people at Magic Leap pre-pandemic worked on for many years, which is the Gen 2, the Gemini. And that same team also worked on Apollo, which hopefully the world gets to see one day. Super proud of that.
So I want to clear the air: the company was never not going to do the Gen 2, and then, you know, hopefully the next thing. So one, I think it's important — I think everyone who is either there now or was there is probably super proud of it.
But the pandemic — I just can't overstress this — was such a fracture: not just the world shutting down, but the financial markets crashing. So I'll say a couple of things, which I can say with the rest of the company still operating. I helped recruit Peggy Johnson from Microsoft because I realized the immediate direction was going to be enterprise. We had tested it for years, by the way — I just want to make that point. We had tested enterprise with hundreds of partners, and we knew that was the near-term revenue.
We were actually announcing enterprise in 2019. If you go back and look at the news media, we launched the Magic Leap One Enterprise Edition in 2019. So it wasn't a new change of direction. We knew it was enterprise, and then it was going to be consumer. And if you look at the Magic Leap 2 design and performance, it was for professionals and enterprise. We realized it was not quite yet a consumer form factor, but good enough and sleek enough for many use cases — healthcare, industrial, all sorts of things. So it was designed with that in mind; it wasn't an accident that it came out that way.
I'd say this — one, it was incredibly painful, because you spend a decade building up an amazing team. I think just before the pandemic, we probably had the world's best team in place. Just the smartest, best people, who had learned how to work together. And we were humming on all cylinders. The Magic Leap 2 was coming off the factory floor in early 2020. It's amazing. I had our chief product officer on Bloomberg talking about this in early 2020. We were all super excited.
And then we’re in the middle of a significant capital raise so that we can launch the Gen 2, and scale the company. And then fuel the Apollo, the Gen 3, which we also had in the basement. And we had generations after that too. So we’re like humming on every cylinder. All kinds of cool things are going on.
So the pandemic shutdown was significant in that it impacted people's ability to go to the building and work together. You know, this is such a complex system; everyone having to go home, and very few people being able to be in the factory, was a nightmare.
But then the financial markets went through just a complete catastrophic crash. And many friends of mine who were running companies — those companies are gone. They went bankrupt. They all died in that period. Some of them were just utterly gone. Many of them survived in mutated ways. If you're a very big company, or a division with a parent protecting you — if you're inside a Facebook or an Apple or a Google or Microsoft with $100 billion or $50 billion in cash reserves — you can kind of sail through that unaffected.
So that's not us. We were an independent company in the middle of a capital raise. But what I can't complain about is that the company survived. What did break for me as a leader: I had spent a decade building the team, building everything, and ultimately — we did not cut half the team. We cut 700 out of 1,700. So we ended up going to about a thousand, to keep enough critical mass to keep the Gen 2 going and everything else that needed to happen.
But that was such a psychically difficult thing for me to do. I knew that, you know — I talked with the board. I needed to bring someone else in. If the pandemic hadn't happened, if the crash hadn't happened, we wouldn't have done that — I would have made that change. But in order to save the company, if I had to exit all those people who were my friends, who had all built this thing together with me for years, I was going to go too. I couldn't do that to them and not do it to myself. I don't know if that makes any sense, but it was like —
One era was ending, and I wanted to bring in someone who could lead the next phase, which was: just go after enterprise. Maybe all my, you know, Yellow Submarine and Aircraft Carrier together — that was maybe too much in the post-pandemic world, but it was fully supported by all of our investors up until that point.
They liked the dual track. We were going to win enterprise and we were going to win consumer. That was the support we were getting: go do both, go take on the biggest companies in the world. That whole thing came to an end because the markets crashed. The world changed in many ways.
And there was good fortune, actually, that Peggy, who was like the right hand of Satya — when she joined Magic Leap, think about this: a company I started in my garage, and she was like the number two at a $1.65 trillion company, came in to be my successor. So I told her, "I just pitched the first seven innings. You get the save. I'll get the win. I'll do everything I can to help you. And you know what? We're going to have to fight Microsoft and everyone else in the enterprise. And I know you understand that."
She was at Qualcomm. She was an engineer. She's going to bring in that understanding. And she's not going to have that same feeling that I have of having to let all these brilliant people go — who are my friends. To still stay there was not going to be possible for me. But if I hadn't done that, there would be no more company. I had to scale it down in order to get the funding, in order to get the momentum going again in the post-pandemic world. So that sucked, but it had to be done.
BYE: Yeah. And I guess finally, what do you think the ultimate potential of augmented reality and Neurologically-True Reality might be, and what it might be able to enable?
ABOVITZ: I think, Kent — and by the way, I'd love to do a part two or three with you at some point in the future, because I know we've got to wrap up real soon — but I'd say this: I feel like both AR and VR are converging into the same thing in one system. Like, the ML 2 is native AR that turns into VR if you want it to, through this kind of segmented electronic dimming that's super cool and very novel.
And the Cambria from Facebook is VR that lets you see the world through passthrough, which gets you AR. So we're now approaching the mountain from two directions, but we're waving at each other. And I'm pretty sure whatever Apple does probably feels a little Cambria-ish, or maybe sits in between both. But I think now the two sides of the coin — like chocolate and vanilla — are coming closer.
So I feel like, knowing what I know about not just what's happening now, but where we were looking — the next ten years; we were seeing all the way into the 2030s with our R&D — I sincerely think it's going to be awesome. Some of the hardest stuff is behind us. We've solved some of those difficult problems. The commitment to investments happened. I think we're at an interesting tipping point. There's so much commitment from the Magic Leap team, with its ability to do that, but also from the Facebook/Meta team, Apple, and others. You're just going to see some amazing stuff coming out.
So I feel like there's amazing potential. It takes a little bit of time for it to scale into the world. So I feel that the technology will be ahead of people. I think by the 2030s, we'll have a billion-plus users. And near the end of this decade, I think we'll be in the hundreds of millions of users. But it will take time and dedicated patience, step by step. Because we're moving from an addiction to the phone and television to ultimately something that really does look like what you and I have [i.e. glasses], but performs at the ML 2 level and better, which is really incredible. If you get to see the ML 2 at some point this year, and you realize that thing, in the form of what you're wearing, is not far away — it's very exciting.
BYE: Nice. And we're both wearing glasses, for anyone who's listening. So is there anything else that's left unsaid that you'd like to say to the broader immersive community?
ABOVITZ: No, look. I really appreciate every developer who put in time and energy and effort. We had many thousands of developers working with us, and they were the most awesome people, making some of the most inventive stuff. The OG team of Magic Leap — amazing folks. They sweated blood and tears into making a lot of technology really come out of nowhere and work. And then I really want to wish the new team, the new group at Magic Leap, to just completely hit a home run with the ML 2. It's amazing. So you guys have the ball, and we want you to rock it. So we'll leave it at that.
BYE: Awesome. Well, Rony, thanks so much for all the work that you've done with Magic Leap over the years — coining the term spatial computing, helping to push forward this vision of "Neurologically-True Reality," and where we're moving with the future of digital light fields and everything else. And yeah, I look forward to seeing how all these things converge. And I appreciate you coming on today to tell a little bit more of your story and some of your thoughts about where things have been and where we might be going in the future.
ABOVITZ: Yeah, if we do another one of these, we should talk all things Metaverse, which is a whole other topic — you know, I've been spending a lot of time with Neal trying to figure out how to actually build those. It'd be fun to introspect on that one day.
BYE: For sure, awesome. Well, thanks again.
ABOVITZ: Awesome. Thank you, Kent.
LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST
This is a listener-supported podcast through the Voices of VR Patreon.