HoloLens development kits started shipping last week following Microsoft’s Build conference, and a number of NDAs were lifted, allowing some companies to start talking about their HoloLens projects for the first time. One of those companies was Portland’s Object Theory, founded in July 2015 after HoloLens engineer Michael Hoffman left Microsoft Studios to co-found the company with serial entrepreneur Raven Zachary, who had sold his previous iPhone app development company to Walmart Labs.
I had a chance to do 90 minutes’ worth of HoloLens demos on Friday, and then talk with Raven about Object Theory’s mixed reality collaboration service and their early HoloLens client work with CDM Smith. We talked about AR vs VR, designing around the HoloLens’ relatively small field of view, and why he decided to focus exclusively on developing enterprise applications for the HoloLens.
LISTEN TO THE VOICES OF VR PODCAST
One of the biggest complaints that people have about the HoloLens is its relatively small field of view. Raven claims that after extensive use of the HoloLens, the brain eventually adapts to the small field of view and forgets about it. This may be true if developers design around the field of view limitations, but I found myself frustrated with the limited field of view after going through about 90 minutes’ worth of the latest HoloLens demos that shipped last week. I found that I had to continually pan my head around a scene in order to see everything, or constantly take a step backwards and reposition myself so that I wasn’t too close to an augmented character or object of interest. It not only breaks immersion, but it can be physically fatiguing.
Raven said that early AR developers will have to design around these limitations until the field of view improves. Object Theory has learned that using a miniaturized tabletop scale or placing objects at an adequate distance can help minimize the limitations of the field of view. They claim that most of the complaints come from the tech press or experienced VR developers, while most new immersive tech users don’t seem to be too bothered by it.
It’s also inevitable that the field of view will improve over time, so this is just one of the initial limitations of the first iteration of a completely untethered mobile computing platform. The HoloLens’ ability to both do positional tracking and keep track of your relative position within a room is extremely impressive, especially when you start to look at its world-locking features, which allow you to completely walk around holograms that you place in the room. Being able to orient yourself around virtual objects within your physical surroundings is compelling in itself, and it will have countless applications once the computer vision technology gets to the point of being able to identify and augment specific objects. Raven says that he expects the HoloLens to become the primary user interface to all of your Internet-of-Things connected devices.
The HoloLens is also able to detect and keep track of multiple rooms, which starts to open up features and functionality that go beyond Vive’s room-scale capabilities. You don’t have to worry about completely clearing out your space, but you can do a high-resolution scan of all of the furniture and obstacles and the content will adapt to your environment. It was liberating to be able to roam around untethered in an entire room beyond the Chaperone constraints of the Vive, and still have a social connection to other people within the room.
While most of the initial demos were focused on scanning a single room, the HoloLens does have the capability to recognize which room you’re in, meaning that at some point you’ll be able to walk through your entire house and interact with augmented reality stories and games.
The most impressive demo that I saw was Fragments, which is one of the first mixed reality experiences shipping with the HoloLens development kits. It’s a story-driven mystery game where clues from recovered memories are displayed overlaid on your room after you do a high-resolution scan. I found myself immersed in finding the hidden objects, and then applying a number of different filters that helped me isolate the victim’s location across a number of different cities. The mixed reality experience was novel and interesting by itself, and the game component of having a story to piece together helped maintain my interest longer than the other tech demos initially available.
The positional tracking and head tracking had low enough latency that there were a number of times when the mixed reality content did trick my subconscious brain into believing it was real. But the vast majority of the time, my rational brain knew that I was in an augmented reality experience. I imagine that it’ll be a lot more difficult for AR experiences to cultivate the same level of presence as VR because the latency and fidelity requirements are so much higher, but I don’t think presence is necessarily the primary goal or motivation with AR. It’s more about adding a digital layer of value to your existing world rather than trying to completely transport you to another world.
Anyone who appreciates the level of immersion that virtual reality can provide is likely to find the HoloLens pretty disappointing. But there are quite a number of unique affordances of AR that transcend the capabilities of VR and make the HoloLens a compelling platform.
Many analysts agree there will be many more market opportunities for augmented reality than virtual reality. There are many use cases where AR makes a lot more sense than VR. And while most of the innovation with virtual reality is coming from entrepreneurial start-ups and gaming, Raven argues that most of the innovation in AR will likely come from established Fortune 500 companies because there are a lot more immediately obvious business applications for AR than VR right now.
The HoloLens dev kits currently support two gestures: an air-tap finger click to select, and the exit “bloom” motion of turning your palm upward and spreading your fingers apart. I found the finger-clicking gesture really fatiguing after doing 90 minutes of demos, and I’m glad to see that the HoloLens development kit is shipping with a remote clicker button. While a lot of people dream about doing Minority Report-style gestures as their primary user interface, the reality is that these motions can be quite fatiguing beyond short bursts.
There were a number of HoloLens demos where the primary method of manipulation was using the movements of your head to move objects around. While this may work in a short tech demo, I found it fatiguing and frustrating because of how imprecise head movement is, and I would have been much more efficient doing the same 3DUI tasks with a tracked six-degree-of-freedom controller with buttons in my hands.
Raven did say that you will be able to connect 6-DOF controllers to the HoloLens via Bluetooth, and I expect that in the long run most professional applications will use controllers with physical buttons rather than rely upon gestures. The voice interface can be quite nice, but I could also see how it might be difficult to use in shared office environments.
Finally, Raven says that there’s a lot of prototyping and development you can start to do even if you don’t want to spend $3,000 on a HoloLens development kit. Microsoft just announced the HoloLens emulator, which allows you to build a Unity project and start previewing your mixed reality application. A lot of new documentation and videos were also released after last week’s Build conference.
If you’re in the Portland, OR area, then be sure to check out the monthly HoloLens Meetup that Raven founded, and you can keep up with Object Theory’s latest news by following their blog.
Here’s an overview of Object Theory’s presentation on their Mixed Reality Collaboration Service:
Become a Patron! Support The Voices of VR Podcast Patreon
Theme music: “Fatality” by Tigoolio
Subscribe to the Voices of VR podcast.
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast.
[00:00:11.896] Raven Zachary: I'm Raven Zachary. I'm one of the founding members of a company called Object Theory. We're 100% focused on the Microsoft HoloLens. My co-founder, Michael Hoffman, was on the Microsoft HoloLens team and left Microsoft to help form this company with me in Portland. And you and I know each other through the Portland VR AR scene, which is great. So it's good to finally come on the show opposed to just being a listener.
[00:00:36.133] Kent Bye: Yeah, we started the VR meet up here in Portland and then at some point you split off and decided to go the AR route. Maybe you could talk about your journey into VR and then into eventually AR with the HoloLens.
[00:00:49.060] Raven Zachary: Yeah, so I'm a serial entrepreneur in tech as much as I loathe that term. I was looking at what I was going to do as my next company. I started a mobile company working with large brands. I built the Starbucks iPhone app and helped Obama's campaign in 2008 build their iPhone app. So a lot of work in the mobile industry, and I sold that company and was getting ready to leave the acquirer and start a company. The day that Facebook bought Oculus, I said, OK, my next company is going to be VR. And back when I was in high school in the early 90s, I was very inspired by Jaron Lanier's work in VPL and got really excited in the early 90s, pre-internet, to do VR. But just the time was not right. As you know, it went kind of in the academic and defense circles for 25 years and has now come back. And we have Oculus and Vive and everything else going on. So I looked really strongly at VR. You and I were getting together regularly to talk about VR and what kind of business opportunities exist in VR. And then out of the blue, Microsoft announced HoloLens in January of 2015, just over a year ago. And the light bulb went off for me and immediately I saw potential enterprise opportunities for business. I was really struggling with the VR role for business. Great for entertainment, great for therapy and travel and these types of things. But in terms of long, active proliferation within the business community, I was really struggling for the fit for VR in its current incarnation. And so right away, HoloLens got me super excited. And just by coincidence, a mutual acquaintance updated their LinkedIn profile that day and said, hey, I've been working secretly on this project at Microsoft for the last 18 months. And we started having weekend breakfast when he came down from Redmond, since he lives in Portland. And over the course of several months, decided to form this company focused on looking at enterprise use cases. So we're doing custom software. 
We're hired by Fortune 500. We come in, we build applications and deploy them. Right now, it's kind of in a pilot phase. We've announced our first client, CDM Smith, which is a global environmental engineering firm. So we're doing a lot of stuff this week. The announcement that we made at Microsoft Build Conference was around collaboration for architects, engineers, and construction foremen to review models together and annotate them and discuss problem solving in 3D.
[00:03:00.205] Kent Bye: What do you see as the major strengths of the HoloLens and AR versus something in the Oculus Rift or the HTC Vive?
[00:03:08.099] Raven Zachary: Sure, so for augmented reality and what Microsoft has termed mixed reality is the term that they're using, or volumetric computing is another term I've heard to describe this space, is the notion of additive, right? It's an additive technology. So you're taking an existing environment, you're adding to it a set of digital objects to augment that experience in a way that brings computing and the cloud services kind of into your real physical world. In terms of business use, AR, especially this kind of true 3D depth mixed reality like Microsoft HoloLens, I think is very complementary to our existing environments, our offices, our workplaces, our homes. You can have immersive experiences in Microsoft HoloLens, not perhaps quite as immersive as what you have with a Vive or an Oculus experience. It's harder to adapt a working experience to a fully immersive world with VR. Where AR, you're still having physical interactions with your co-workers, you're going into a conference room, you're having an additive experience where data, visualizations, information are being added to an existing environment. And I think that's going to probably be the longer term opportunity, at least in the business world. I think the obvious opportunities right away are things where you're augmenting a physical space with digital information. I mean, IoT is a big one. You have a set of sensors in a facility. Those sensors don't have a user interface or a screen attached to them. You walk through, it senses they're there. A UI pops up. You can look at maintenance history and performance data in real time in a facility. So I think for things like IoT, fantastic fit, because the HoloLens essentially is the front-end user interface to a headless set of back-end technology. I think entertainment is very strong on VR. I think there are some vertical market opportunities in VR that are also very, very strong as well. So I'm as bullish on VR as I am on AR. 
I just think they have different roles in society and different opportunities. I mean, for me, what really excites me is the last 50 years we've been looking at rectangles, televisions, computer screens, mobile phones. And we're having to take this digital world and flatten it into two dimensions and stick it into a rectangle. where, as a species, we've been operating in 3D environments since the beginning of the human experience. And now augmented reality and mixed reality allows us to bring digital information and artifacts into our physical spaces in a way that's much more natural to our human physiology, which is really exciting.
[00:05:28.793] Kent Bye: Yeah, and I just had a chance to do about 90 minutes worth of demos on the HoloLens, and my initial reaction is that there are some things that are very clearly made for AR. You know, I have an HTC Vive and you have a room scale, but the room scale ends up being a little constrained in terms of, like, you're limited by the cord length and also just the amount of clear space that you can have. And I was able to scan my entire living room and play an experience called Fragments in AR, which used the entire space and created this mixed reality layer on top of this room. And that to me was a unique experience, having little characters and other things move around the space based upon what it's actually scanning in the room. On the other hand, there's other experiences that I thought would have been way better in virtual reality. For example, one of the things about the HoloLens is it's all gesture based at this point. So doing like 3D object manipulation, like I kind of wanted to have like tracked controllers, like with the Vive, which would have been way easier, faster, and less fatiguing than moving my head around. There seems to be quite a bit of reliance upon the head and moving your head, which is also fatiguing because the field of view is still small enough that I felt like I kind of have to paint over a space to really get a sense of seeing the full picture. And so I kind of found myself moving my head around more than I would have normally had it been something that was completely in my field of view.
[00:06:54.899] Raven Zachary: So three things I want to cover that you just went through. One of them is controllers, one of them is field of view, and the other is immersion. So let's start with controllers. We've actually experimented at our company with using other devices, Bluetooth and Wi-Fi enabled devices, as input controllers. So we have a six degrees of freedom controller running on a smartphone that you actually hold in your hand. It can do a lot of really interesting things, opposed to just doing the air tap or the bloom gesture, which comes with the HoloLens. So I think there's going to be a whole opportunity for third-party accessories and integrations. Microsoft, as they're bringing HoloLens to market, they're telling a story about it being a self-contained unit, which it very much is. I think it's the third-party developer ecosystem that looks at what's the role of HoloLens in a larger technology solution stack. So we want to have a controller here and a back-end server here, and they all work together. So I think you will see controllers. You know, I won't speculate about what Microsoft's going to do because I don't know, but I know for us, we already have a controller system, at least in the lab working. So field of view, I think what we've learned from a user experience point of view is that you design experiences that take that into consideration. So we tend to do smaller objects on a tabletop. Or we project large, full-sized characters at a distance. So you're seeing that full experience. And I think there's clever user experience solutions with that in a way that doesn't in any way feel like a limitation. And then the third one was immersion. What I found fascinating watching you go through 90 minutes of demos was you were quietest in Fragments. You were having a conversation with me with the other five or six apps. When you went into Fragments, you were really immersed, I think, in the story. 
in a way where you and I weren't actually even chatting. You were so engaged. And to some extent, that's a level of immersion in AR because you kind of tuned out that I was in your living room.
[00:08:39.672] Kent Bye: Yeah. And one thing that you've mentioned to me before was regarding the field of view that you said it's kind of like a pair of glasses that once you wear for a long period of time, you kind of forget that the field of view is a limitation. There's something in your brain that clicks and that you just accept that there's this layer of virtual objects. I didn't personally have that exact experience because maybe it's just getting frustrated with having to move my head around. You know, if there's a character talking to me, I can't even see his whole body unless I take a step back and try to be at a distance so I could see the full scene. And if I get closer, then things get cut off. So I did find myself trying to step back as much as I could in order to see more of the space and kind of deal with that frustration of like, ah, I wish I didn't have to do so much extra movement just to get a sense of the scene.
[00:09:32.992] Raven Zachary: Yes, the field of view conversation comes up time and time again, but it tends to only come up in conversations with the press and with people who have never used a HoloLens or have used a HoloLens less than a couple of hours. For those of us who have logged dozens of hours on the device, we don't have conversations on field of view because we've adapted and we've built experiences that optimize for the great experience opportunities in that field of view. So it's a topic that we're fascinated by, but it tends to be most pronounced among people who haven't logged the time to grow used to and build ways of interacting with that environment, in the same way that if I had no experience in VR, I probably would feel uncomfortable doing certain things with controllers or moving my head or getting out of my chair. And then once you've got it, it's just a natural experience. So for me, we design with field of view in mind, but it's not something that we have found to be in any way a limitation to our ability to build great experiences.
[00:10:27.720] Kent Bye: Yeah, the one experience that comes to mind specifically for that is the Galaxy Explorer, which was built very quickly over the course of six weeks as an exploration of how much you can get done in that time. And again, I found myself wanting to stand as far back as I could, and as I started to walk around objects, I started to get that windowing effect. How do you think that should be changed? Do you feel like it should just be smaller? In terms of trying to design for the HoloLens, what type of design principles are you using in order to really work within the limitations of the field of view?
[00:11:00.984] Raven Zachary: So the great news, you mentioned the six week sprint from Microsoft on the Share Your Idea project for Galaxy Explorer. What's great is not only is that available on the HoloLens now, but it's an open source project. So anyone can contribute to that. My guess is that in a six week project, you don't have a lot of time to do optimization on things like frame rate and user experience. I think they got as much done as they could with a small team in six weeks. You will see evolutions of that product. So it doesn't surprise me that of the six experiences you went through, you struggled most with the one that had the least amount of polish, as is to be expected in application development. So we've learned a lot about how to do the right amount of object placement in a scene in a way that augments an experience, provides value, doesn't overload the user. You tend to do things with smaller polygon counts. You tend to decimate things down so that it can work efficiently on what essentially is kind of a mobile technology base. It's a wearable, it's fully untethered. This isn't like VR where you're plugging into an NVIDIA 900 series GPU. Think of it along the lines of a high-end tablet and the capabilities that you would have with a high-end tablet and optimize for that. There's some great materials actually on the HoloLens developer site around things like optimization and how to maximize frame rate and things to have the best possible experience. But going back to an earlier point, I think, think of smaller objects on a tabletop or larger objects at a distance to really kind of optimize the immersion experience. And getting a good spatial map, as you did when you walked around for Fragments, you did this kind of high resolution scan of your room. Having good room scan spatial maps really helps with the experience for things like occlusion and for things like planes and surfaces to be able to place objects properly.
[00:12:48.005] Kent Bye: I think one of the unique things that I experienced with the HoloLens was to be able to place a hologram in a specific spot and start to walk around it. And maybe you could talk a bit about like, how is that unique within what HoloLens is doing and how they're able to do that.
[00:13:01.806] Raven Zachary: Yeah, I think one of the strengths of HoloLens is this notion of world locking. So the ability to place an object in an environment, because it's built a spatial map, it knows what planes and surfaces exist in your room, to place an object, and then based on essentially detection for that object to stay locked in its position. So you can do a 360 degree turn around, you can turn back, leave the room and come back, that object persists in that space. And in fact, we were laughing as you came back into the room and you removed some holograms you had placed, to kind of tidy up your living area. And that's an experience I've had too, where myself or my colleagues will place various holograms around and, because the spatial mapping is so good, it will remember locations. My colleague had placed the letter R for my name above my desk on one of the HoloLenses. And so I put it on, I was like, hey, look, you've given everyone a title above their workspace as well. So yeah, I think world locking is exceptional. And I think that's one of the strengths of having a fully untethered device is that you can walk from room to room to room and build up essentially a library of spatial maps for your home or your workplace. And it will remember what's there, how to interact with that environment.
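The per-room persistence Raven describes — holograms remembered in place when you leave and come back — can be pictured as a small store keyed by a room's spatial-map ID. This is a toy Python sketch of the idea only, not the actual HoloLens spatial anchor API; all names and the data layout here are hypothetical.

```python
import json

class HologramStore:
    """Toy model of world-locked hologram persistence, keyed by room.

    Each room's spatial map gets an ID; holograms are saved with a pose
    relative to that map, so they reappear in place when you return.
    (Illustrative only -- real HoloLens apps use spatial anchor APIs.)
    """
    def __init__(self):
        self._rooms = {}  # room_id -> {hologram name -> position}

    def place(self, room_id, name, position):
        self._rooms.setdefault(room_id, {})[name] = position

    def remove(self, room_id, name):
        self._rooms.get(room_id, {}).pop(name, None)

    def holograms_for(self, room_id):
        # Return a snapshot for the renderer when the user enters a room.
        return dict(self._rooms.get(room_id, {}))

    def serialize(self):
        # Persist across sessions, as the device does with its spatial maps.
        return json.dumps(self._rooms)
```

The key design point is that poses are stored relative to a recognized room, which is why a hologram placed above a desk can survive a walk through the whole house.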
[00:14:09.683] Kent Bye: And so with this initial experience that you just announced at the build conference this past week, it seems to have a collaborative element where you have these avatars and you can see their hands. Are you actually tracking the hands? Or maybe you could talk a bit about like, it seems like you're tracking gaze and head position. And so to create a social virtual shared space within an augmented reality, mixed reality experience, talk a bit about all of what you had to do.
[00:14:37.136] Raven Zachary: Sure, so this week at the Microsoft Build Conference, we announced and gave three talks on something we're calling the Object Theory Mixed Reality Collaboration Service. And that collaboration service essentially is a developer engine. It's a set of components that we've built that we can build custom software for clients around. Three main components: avatars, 3D models, and annotations. So on the avatar side, what we've done is created essentially representations in full size of a human being being remoted in from another HoloLens, or a PC, or a tablet, or a smartphone into that experience. We are tracking head position. For motion, we're doing interpolation so that as you move from point A to point B to point C, we're estimating direction and speed and these types of things to be able to create a natural movement. We are able to turn the torso, we're able to move the head. We're not doing any hand tracking, so those hands are basically along the sides of the body. I think we could do those types of things in future iterations. And then we do customization, colors, name tags, and flair. You can have a hat or glasses or a pencil behind the ear, which helps to distinguish between who is who in a scene. For 3D models, you can bring in a standard 3D model, from tabletop size up to full scale. We support the ability to take a building, let's say it's a water treatment facility, and start at the table size and then expand it to full scale and then walk around that facility, as long as you have a space large enough to be able to do that. And then for annotations, we support drawing in 3D, similar to Tilt Brush, the experience in the Vive, where you can draw in 3D space with your finger in different colors. You can leave a text note or a voice note as well, and it will remember that from session to session and have persistence around that annotation. So that's what we announced this week. 
We gave three talks down in San Francisco at the Microsoft Build Conference, and those slides, which include our developer and user experience learnings from that development project, are up on our website. So you can look at the slide deck if you're interested in this notion of what it was like for us to spend several months digging into what an avatar collaboration experience would look like in mixed reality.
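The motion interpolation Raven mentions — smoothing a remote user's movement between network updates of head position — can be sketched in a few lines. This is an illustrative Python example under my own assumptions, not Object Theory's implementation; the function name and signature are hypothetical.

```python
def interpolate_position(last_pos, last_time, target_pos, target_time, now):
    """Linearly interpolate an avatar's position between two network updates.

    Given the previous known position and the latest one, estimate where the
    remote user is at time `now`, so the avatar glides instead of teleporting.
    Positions are (x, y, z) tuples; times are seconds.
    """
    span = target_time - last_time
    if span <= 0:
        return target_pos
    # Clamp t to [0, 1] so the avatar never overshoots the latest update.
    t = max(0.0, min(1.0, (now - last_time) / span))
    return tuple(a + (b - a) * t for a, b in zip(last_pos, target_pos))
```

A fuller version would also estimate velocity to extrapolate briefly past the last update and interpolate orientation for the head and torso turns he describes, but the clamped lerp above is the core of creating natural-looking movement from sparse updates.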
[00:16:37.840] Kent Bye: Are you able to tell the avatars apart just from body language?
[00:16:41.982] Raven Zachary: A little bit. Definitely. Some people are more fidgety with their heads than other people. We were just joking about that earlier today: one of my colleagues moves his head around a lot more than another colleague, and I knew who that was without even looking at the nameplate, just by their head motions.
[00:16:55.048] Kent Bye: Absolutely. And, uh, I guess in an experience like that, what are some of the things for which you think AR with the HoloLens is much better suited than VR?
[00:17:06.220] Raven Zachary: Well, I think for a lot of vertical market enterprise solutions, whether that be pharmaceuticals or architecture or oil and gas or manufacturing, there's so many of these where you need to be in a physical space for safety, for collaboration and communication. And so augmenting your physical world with digital objects is a better fit than going immersive into an office with a tethered VR headset. Again, that's not to say that there aren't some great examples. My Vive's on its way. I'm super excited. I've got an Oculus coming as well. I've got a DK2 now and a Gear VR that I use regularly. So definitely bullish on VR in addition to AR. But I think what you're going to see in these early days is VR is going to be predominantly entertainment, sports, therapy, and experimental art, where I think right away out of the gate, HoloLens, and this is how Microsoft is largely positioning it with the partnerships they've announced, right out of the gate with enterprise use cases. So you're going to see this first wave of adoption being driven by business, which is Microsoft's historical strong market. And then you're going to see over time, as devices become more available and they roll it out, much more of a consumer bent. They've shown off a Minecraft version. You got to play some games today. They've shown other entertainment titles in their videos.
[00:18:20.153] Kent Bye: And so it seems like for the architecture industry, there's going to be, I think, clear use cases for both AR and VR, where I think some of the pre-visualization stuff may actually be better in VR to actually be immersed into this space. But yet, at different stages, I feel like there would be different times and different phases of a project. So from you working with an architectural firm, maybe you could talk a bit about how they see this design pipeline and where AR and VR kind of fits into that.
[00:18:49.765] Raven Zachary: Yeah, I think for an individual engineer or architect or construction person, VR, where you're kind of in your own design element and you're at your workstation and you want to have a fully immersive experience and focus on the details might be the best possible fit for an individual contributor. The moment you want to start having design reviews where you're in a conference room with a whiteboard and a speakerphone and a big monitor, the notion of being able to see a tabletop version of what was created, to annotate it, discuss it, point at things, take notes on a whiteboard, to me, I think you shift from VR perhaps being best suited for the individual contributor role in design with group conversation and collaboration being driven primarily by mixed reality. I do think HoloLens will be used by individual contributors as well. But just to go back to your point that there is a sweet spot for VR, and I think I struggle myself with the role of VR in collaborative use cases in a business setting, where I think there's been good examples of chat and social experiences in VR. It's harder to bring that into the workplace.
[00:19:56.462] Kent Bye: And so you've been working on this HoloLens stuff for a number of months. And within this past week, there's been a lot of restrictions and NDAs that have been lifted. And so what are some of the things that you can tell us about the HoloLens that you may have not been able to over the last couple of months?
[00:20:13.114] Raven Zachary: Well, the big one is you got to use it today. So today was the first day we could sit down together and have this conversation openly and have the demo. That was the first. You know, we're part of this program called the Microsoft HoloLens Agency Readiness Program, which gave us early access to Microsoft and learnings and mentoring and these types of things to build experiences for our clients. That's another part that we're able to share today. Our client, CDM Smith, we're able to share today. The learnings from building avatars we're able to share today and put up on the web. You know, other than that, I would probably have to refer you to Microsoft. It's their technology. So I'm here mostly just to talk about the ecosystem from a third party developers point of view, but I'm really excited about the technology.
[00:20:52.961] Kent Bye: And so you've started the HoloLens meetup here in Portland and talk a bit about some of the people that you've had talk there already.
[00:20:59.686] Raven Zachary: Yeah, so we've met a couple of times. The first meeting was Clackamas Community College, which was one of the grant winners with Microsoft around the academic grants. And so they're going to be working on vocational training using the HoloLens for engine repair in their community college. And we were part of the grant team that brought that here along with Intel and others, Oregon Storyboard being key as well. The second meetup, we had Garen Gardner from Autodesk. He was the lead on the Autodesk Fusion 360 partnership with Microsoft for HoloLens. And then our third meeting, we took a break for a few months due to the NDAs and hard to get speakers during the winter. And then we're meeting again on Monday, April 4th, where we're going to be doing basically a recap of what was announced the prior week at Build, and then talk some about our Q&A learnings with the dev community, which we can do for the first time. Also this week, Microsoft launched the emulator. So if you don't have access to a HoloLens, you can still download Unity and Visual Studio and all of the toolkits and run an emulator on your PC with Windows 10 to at least experience what your Unity app would look like or work within a HoloLens experience.
[00:22:06.312] Kent Bye: And so would there be a VR component where you put on the VR headset and see an equivalent of that? Or I'm just trying to think of how are you actually mimicking these immersive environments with 2D screens?
[00:22:17.402] Raven Zachary: Yeah, great question. Definitely encourage you to check out the emulator and I'd love to hear your feedback on that. We actually did a lot of prototyping in the early days on VR, specifically Gear VR. Because it is untethered, we did some tricks where we used the passthrough camera and actually tried to augment Unity technology through the passthrough camera, which is kind of the poor man's HoloLens. And that actually may be used by some people out there who want to do early HoloLens development, but don't have access to hardware, is this combination of hacking a Gear VR for the portability and walkthrough, plus the emulator experience on a desktop PC.
[00:22:52.617] Kent Bye: And so at this Build Conference, I imagine that there's a lot of different companies that were publicly coming out for the first time as being involved with Microsoft and being some of the earlier developers. Who were some of the other companies that were there doing some stuff that you thought was either interesting or interesting names that you saw there at the Build Conference being announced?
[00:23:11.577] Raven Zachary: So the most exciting one, I think, for most people was Destination Mars, which was the next partnership between NASA and Microsoft for HoloLens. They're going to be putting in a museum experience at Kennedy Space Center in Florida, narrated by a 3D version of Buzz Aldrin. You go into a room, an empty room, and then magically you're transported to the surface of Mars and walking around with Buzz Aldrin and looking at rocks and experiencing the history of Mars. We were on a slide in the keynote on Wednesday with probably about 15 or 20 other companies that are early partners with Microsoft. Some you've heard of, Trimble and Volvo and others, but there were new ones up there, even new to me, Japan Airlines, Saab, Boeing. Not all of these companies came out and showcased the work that they're doing, but some of these are in the keynote that were announced on Wednesday. And then there were other digital agencies like ours that had things to show as well. And they had talks there. And I believe those talks are going to come up onto the web in the next week that you can watch. And I think the HoloLens Reddit group will probably have links to those talks.
[00:24:11.241] Kent Bye: So going back to the first time that you actually saw some of these HoloLens demos, maybe we could talk about what you experienced and perhaps that moment when you really decided to go all in with AR.
[00:24:22.108] Raven Zachary: Yes. And there's this thing I will often draw on my whiteboard for people who have never used a HoloLens, you know, it's kind of enthusiasm or excitement on one axis and time on the other. And you generally have a lot of expectations coming in around what you want it to be, because we've had the holodeck from Star Trek. We've had movies like Minority Report. We've had augmented mapping in movies like Prometheus and Avatar. And we have holographic communications from Star Wars, with R2-D2 projecting Princess Leia onto Obi-Wan Kenobi's table. So there's all of this preconceived notion around what an AR experience, an immersive mixed reality experience, is like. And so you generally have that expectation, and you kind of reset your expectations as you adjust to having this thing on your head and learning all of these gestures and going through the setup. And then there's this, again, this kind of gradual increase again towards kind of the top of the enthusiasm chart. So I think my first time using a HoloLens, I spent, you know, about two and a half hours in a variety of experiences. And on my drive back to Portland from Redmond, all four hours, I was just like, oh my God, what about this? And what about this? And you can do this. And so it opens you up to think about other things. It's an intense experience because we don't have a lot of reference to what it's like relative to other things that we've done. And especially if you're not a VR person, if you're someone who's never used VR and suddenly you're thrown into a HoloLens, it's a big eye opener for the first time. And I am as enthusiastic now, I think, as I ever have been. And because I've worked with this now for so many months, I have pragmatically approached how to solve certain kinds of problems, how to optimize for certain things here and there, what not to do, how to do things better. I'm super excited, very bullish on the HoloLens.
And, you know, I think this is a very exciting time. I mean, we were joking before the interview that in the next week, we're getting shipments of HoloLens, Vive and Oculus, and they're all coming at kind of one moment in time. And it's pretty profound. There's a lot of transformation going on in this technology. And I think the bigger story, regardless of whether it's AR or VR, is we're shifting from rectangles, 2D, to depth environments, 3D. And it's pretty intense existential stuff. I mean, when you think about it, we're taking technology and we're bringing it into our physical environments in a way that has never happened before.
[00:26:47.480] Kent Bye: Yeah, the thing that I think was probably some of the most impressive stuff with the HoloLens was being able to scan the room. And it was actually remembering where everything was as I was walking around, and it was able to place these digital objects. And, you know, as we're talking about some of these different applications, I know there was Microsoft showing kind of like these virtual personal assistants, where someone who's a subject matter expert may be guiding someone who's in the field doing some sort of repair job, using these overlays. And at this point, it seems pretty simple, it's just giving a scan of the room, but yet I could imagine in the future it actually being able to scan and identify objects and start to classify things, and then be able to perhaps, you know, do digital overlays over that. So where do you see that going with the HoloLens? Do you see that as something that Microsoft would be providing at an API level, some of this computer vision object recognition technology?
[00:27:42.391] Raven Zachary: It wouldn't surprise me. Microsoft Research has published a number of really compelling videos on YouTube in the last two months that have shown things like real-time holoportation with scanning and decimating video feeds into meshes. They've shown image recognition. They had that app they put out a few months ago where it would guess your age. Remember, there was this social media campaign to do that. So I think you're going to see all of this stuff come together over time. I'm very excited about it. I mean, this is the beginning of a multi-year journey. And I think you will see these things all kind of fold into the ecosystem over time.
[00:28:15.698] Kent Bye: And so for you, what makes you the most excited about this?
[00:28:18.759] Raven Zachary: What makes me the most excited about this is seeing, and you can chuckle as I say this, but, you know, large Fortune 100 companies that tend to be slow adopters of new technology actually getting really excited about mixed reality because they trust Microsoft. They've had a long-term relationship with Microsoft technology in their company. It's passed certifications for security and all of the things that are really important to Fortune 100 and Fortune 500 companies. And to be able to work with the leaders in these companies on actually solving real-world problems using innovative new technologies, where I think in the VR space there has tended to be more innovation in the entrepreneurial space and in the gaming space, where I think in HoloLens you're going to see large industries actually innovating and leading in this space.
[00:29:07.476] Kent Bye: And finally, what do you see as kind of the ultimate potential of augmented and virtual reality and what it might be able to enable?
[00:29:14.650] Raven Zachary: Well, I've often joked that by the year 2020, we're going to all be schizophrenics wearing glasses. And what's behind that statement is this notion of you're bringing these things in your life and in your world that you treasure and have a profound personal connection to into your day-to-day life. And so I think where I want to see this going, and what I'm really excited about doing, is to jump ahead five to ten years in this notion of bringing advisors with you throughout your day. Those could be artificial intelligence, or they could be the personas of people that you have a personal relationship with. It could be an artificial dog that has an emotional response to you. This notion of kind of the Greek chorus following you through your journey in a private way, where only you can perceive it, but then if you and I meet, maybe we can share those Greek choruses and those advisors, and they can have conversations together, enabled by machine learning. And this notion of augmenting my human experience with the power of computing versus computing replacing the human experience. So I don't see AI and these topics as disruptive to the human condition, I see them enhancing the human condition, because ultimately they're going to be there to empower our experiences and allow us to share in ways we've never shared before. And that can also come in VR, not just in AR, but I think the notion of being able to travel physically, you know, from the home to an office to a cafe. I think we're in this period, and you and I have talked a lot about this, this notion of this divergence period where AR and VR are on very clear, distinct paths, but at some point there will be a period of convergence where AR and VR come back together. And, you know, once we do things like black shutter pixel HMDs, where we can go from fully immersive to transparent, we're not going to think of AR and VR separately. We may call the whole space virtual reality.
That term is sticking more than AR or mixed reality, so all of this just may be called virtual reality. And it may be as portable as a HoloLens is today, and you can go fully immersive or see-through, partially transparent, and you have both sets of experiences. But I'm really excited about this notion of bringing the private and personal into the physical world, and then new kinds of sharing between human beings.
[00:31:24.185] Kent Bye: Yeah, just to kind of elaborate on one specific point there: on the HoloLens, you can't show black, because black is transparent. And so at this point, you can only lay things on top of other things. And talking to Steve Feiner at IEEE VR, he's someone who's been doing AR for the last 25-plus years, and he talks about the mixed reality spectrum: reality, then augmented reality, then augmented virtuality, which would be adding real objects into a virtual environment, and then virtual reality. It's a full spectrum, and you could claim that that full spectrum is, in some sense, augmented, because virtual reality could be a subset of augmented reality. So to me, I definitely see that the long-term market opportunities of AR are going to be way larger than VR, just because we're human beings that live in a world that we need to actually exist in.
[00:32:18.242] Raven Zachary: Agreed, agreed to all points. Yeah, it's fascinating.
[00:32:21.871] Kent Bye: It's a fascinating thing. Is there anything else that's left unsaid that you'd like to say?
[00:32:26.367] Raven Zachary: Yeah, you don't need to get a HoloLens, a $3,000 dev kit, to start working in this environment. You can go to HoloLens.com. You can download the dev kit and the emulator and start working now on building AR experiences. And if you've already been building in VR and you know C# and Unity, you have a tremendous advantage in terms of getting those learnings early on. And so much of those learnings in VR can be brought over to the HoloLens. So the one thing that's most important that I want your listeners, if there's one thing that they take away from this entire interview, it's this: support Kent Bye's Patreon. And I'm serious about that, because he's bringing some fantastic content, and he did not ask me to say this. This is my own plug as a supporter, a patron of Kent, and we need more interviews like this. And I'm excited to listen to your other interviews that you're bringing back from these conferences that you've been at recently.
[00:33:18.993] Kent Bye: Awesome. Well, thank you so much, Raven.
[00:33:20.653] Raven Zachary: Thank you.
[00:33:22.113] Kent Bye: And thank you for listening. If you'd like to support the Voices of VR podcast, then please consider becoming a patron at patreon.com slash voicesofvr.