#762: HoloLens 2 & Azure Kinect with Microsoft Developer Relations Jesse McCulloch

I got to do a hands-on demo with the HoloLens 2 at Microsoft Build, and then talk about my impressions with Microsoft Developer Relations Program Manager Jesse McCulloch. We talked about where HoloLens is at, more details on the 4th generation Azure Kinect camera, the recent MRDevDays gathering, and the process of cultivating a developer ecosystem for Mixed Reality.


This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.412] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. So just this past week was the Microsoft Build Conference in Seattle, Washington, where I had a chance to attend the first couple of days. I talked to about six different people about what's happening in the mixed reality ecosystem for Microsoft, and it was a little bit more of a quiet year for mixed reality and augmented reality at Microsoft. They had made their big announcement at Mobile World Congress on February 24th, 2019, where they announced the HoloLens 2 and the Azure Kinect camera. The HoloLens 2 has double the field of view, it's a lot more comfortable, it's got eye tracking, and it has gesture detection. So that was in Barcelona, and I didn't have a chance to go to that. I was in San Francisco for the Immersive Design Summit. But I figured that I'd be able to go up to Microsoft Build to get hands-on with a demo of the HoloLens 2 and to kind of catch up with the latest and greatest of what was happening with Microsoft as well as with augmented reality. So it sounds like they're still getting ready to launch it at some point. I don't know if they've even announced a launch date for it yet. They're still trying to get development units into the hands of developers. They actually just last week had a whole MR Dev Days, so Mixed Reality Dev Days, where they brought in over 400 different developers from around the world and did a whole workshop. I still like to go to Microsoft Build to be able to try out all the latest demos. There were really only a couple of HoloLens demos across the entire expo floor, which was vastly expanded this year. But I had a chance to try out the HoloLens 2 and I was super impressed with it. It's got eye tracking, so you can be reading text and it automatically scrolls down.
Being able to actually have hands-on interactions with the holograms I think is a huge step, and I think there's still a lot of tweaking and tuning that has to happen. So it's still not production ready. It's not going to be what it is when it's finally launching. They're still kind of working through a number of different optimizations and whatnot. So I had a chance to go through the demos and talk to Jesse McCulloch. He's a program manager for Mixed Reality. He's on the developer relations team. He's been cultivating a community within the Mixed Reality ecosystem for a long, long time and just finally took a job with Microsoft last September. But I had a chance just to kind of talk about all the different latest news and announcements that have happened. He kind of ran me through both the HoloLens 2 and the Azure Kinect camera and, you know, just kind of gave a sense of what you need to know in order to start developing for the HoloLens 2. So we're covering all that and more on today's episode of the Voices of VR podcast. So this interview with Jesse happened on Monday, May 6th, 2019 at the Microsoft Build Conference in Seattle, Washington. So with that, let's go ahead and dive right in.

[00:02:50.661] Jesse McCulloch: My name's Jesse McCulloch. I'm a program manager with Microsoft Mixed Reality on the developer relations team. So we're here at Build doing some demos of HoloLens 2 and getting people some time with the new device.

[00:03:01.690] Kent Bye: So maybe you could tell me a bit about your background and journey into Mixed Reality.

[00:03:05.727] Jesse McCulloch: Yeah, so I was a developer who saw the original HoloLens announced on stage and I was like, oh my god, I gotta get into this. So I applied for the developer program there and was on the first wave of people to get HoloLens 1, started developing for it, started working really close with Microsoft. I built a community on Slack around mixed reality development. That grew really big and kind of got me noticed by Microsoft. I started working with them on throwing some events. And then back in September, I actually got hired to do dev relations full-time.

[00:03:38.238] Kent Bye: So you were doing a lot of cultivation of a community of mixed reality developers, doing that in your own spare time, until eventually now you're doing it full-time.

[00:03:44.680] Jesse McCulloch: Yeah, it's super awesome. It's like my dream job. I was doing it for fun, and now I'm getting paid to do it. And not a lot's changed.

Kent Bye: What is it about that that you enjoy?

Jesse McCulloch: Just, you know, there's such passion in the people who are doing this and it's such cutting-edge technology and people really just are drawn to it. The people who are in it are really drawn to it and really focused on it, and so it's really fun to cultivate that and see them grow and be able to put kind of the resources of Microsoft behind them and elevate the community and make sure that, you know, we're communicating very well back and forth, because that hasn't always been the case.

[00:04:19.244] Kent Bye: I think I did an interview with you a couple years ago where it was before you were looking at Microsoft and there seemed to be a number of different tiers of either you were in with Microsoft and had access to everything or you were on the outside and didn't have very many resources at all. But it sounds like now that you're on the inside, you're trying to change that in some ways.

[00:04:37.395] Jesse McCulloch: Yeah, I mean, my job really is to try and be the voice of Microsoft to the development community and the voice of the community back to Microsoft and make sure that there's a good feedback loop going there. Regardless of whether you're an indie developer or a small shop or one of our big partners that, you know, we get a lot of hands-on time with. But to me, a developer is a developer. And right now, the people who are indie developers who are just kind of learning this, in two or three years when this takes off even bigger, they're going to be working for those partners. And making sure we cultivate that relationship early is super important to me.

[00:05:08.422] Kent Bye: So yeah, just this past week on Thursday and Friday, there was a big MR Dev Day, so the Mixed Reality Developer Days. Maybe you could tell me a bit about, like, is this the first gathering, and a little bit the back story of how that came about, and then kind of what happened there at that gathering.

[00:05:22.198] Jesse McCulloch: Yeah, so actually it's kind of been building for the last couple of years. So about two years ago I ran a community event called the MR Dev Summit, where I actually brought a bunch of community people in. We had it hosted at Microsoft and got some of the resources with devices and stuff. And it was about 20, 25 people and we spent a week on Microsoft's campus. That was super successful and Microsoft saw some value in that. And so last year at Build we did what was called the MR Jam. And again, this was me, as an outside community member, trying to help, you know, organize that. We did kind of this mini conference inside Build called MR Jam. We had about 100 or 150 people come in and do a whole bunch of mixed reality-specific sessions. You had to have applied for it and you got a special band. So not everybody from Build was able to come into those sessions. And that was, again, equally successful. And we started seeing a lot of value in that. And so this year, we decided we were going to do our whole own mini event. So last week on campus, we had 400 developers from around the world. I think somewhere around 60 or 65 countries were represented. So bringing a whole bunch of just really hardcore developers in to get content directly from the engineers and the program managers that are building the devices and services that they're using.

[00:06:31.935] Kent Bye: I know last year I was here at Microsoft Build and I was able to sneak in a few times to the MR Jam just to see the community and the people that were talking there. Super interesting. And this year, the MR Dev Days was before Microsoft Build, and there doesn't seem to be a lot of announcements about mixed reality on the stage. There's a few things like the Spatial demo and the teasing of Minecraft AR that's coming out with more details on May 17th. I know last year there was a lot about Azure, a lot more about spatial computing and sort of the whole vision for spatial computing and all the different things tied together. And this year there wasn't as much of an emphasis. Why wasn't there as much emphasis on mixed reality and virtual reality, and why so few demos on the floor here this year relative to the previous two years?

[00:07:17.559] Jesse McCulloch: Yeah, so, I mean, kind of our big moment this year was at Mobile World Congress, where we announced the new HoloLens 2, the Azure Spatial Anchors service, and a couple other services and devices, like the Azure Kinect DK. So that was kind of our big moment back in February. Now we're in the process of actually rolling out those launches for HoloLens and Azure Kinect. The Spatial Anchors service is in public preview, and so there are some sessions around that. But right now we're really working through this engineering phase where we're trying to get the devices ready to be shipped out when that time comes. And so it has been a little bit of a lower presence here just because we are limited in the number of devices that we have, making sure that the devices that we do have are of good quality for doing demo purposes. And then just as the year rolls out, we'll probably see a few more devices available for developers to play with.

[00:08:04.800] Kent Bye: Yeah, I was in San Francisco at the Immersive Design Summit during Mobile World Congress, and I had to make a decision as to which I wanted to go to, and so I ended up covering the more immersive theater storytelling artists that were really pushing the edge of what's possible with immersive storytelling, because I feel like that's going to be a big part of the future of spatial computing, whether it's in mixed reality, augmented reality, or virtual reality. So I missed out on the opportunity to go to Mobile World Congress. But maybe you could fill me in a little bit as to the big announcements, a little bit more details in terms of the spatial anchors and the SDK and other things that were being announced there at Mobile World Congress.

[00:08:40.446] Jesse McCulloch: Yeah, so three main things got announced. The HoloLens 2: we increased the field of view to double the area, and it's got integrated hand and eye tracking, so it lets you do direct interaction, which is really, really fun to play around with and see people figure out how that works and kind of light up. The Azure Kinect DK, which is the fourth generation of the Kinect sensor, the third generation having been in the first HoloLens. This fourth generation is in the new HoloLens as well as in this Azure Kinect sensor. It also has a 4K camera, an accelerometer, and a microphone array. So that's kind of a standalone device that works with our Azure services to do computer vision type stuff. And then our Azure Spatial Anchors service, which is a cross-platform anchoring service that works across HoloLens, HoloLens 2, Android and iOS phones, and lets you place anchors in space and have them persist through time, so people can come back with your app later on and have models or holograms pop back into that same space across different apps.

[00:09:40.254] Kent Bye: Now, when you say it's cross-platform with its anchors, you know, one of the interesting things that I see about Microsoft is that the Windows Phone never really took off on mobile, and so in some ways, both Microsoft and Facebook missed the mobile revolution in a lot of ways, but they're kind of leapfrogging into both augmented reality and virtual reality. But there's a little bit more of a decentralized, open strategy, especially with Microsoft, of making sure that you're really compatible with all the other systems that are out there. I'm really struck by just listening to both the keynotes and all the things that are announced of how much you're taking this whole decentralized, open approach. And so the fact that the spatial anchors would be compatible on both ARKit and ARCore, I think is really interesting. I'm not sure if there's anybody else that's talking about that or doing something that is trying to be compatible with all of the various different augmented reality systems. But what does it mean that it's compatible? Is it using their own native systems on ARCore and ARKit, or is it your own system that Microsoft has developed that happens to be an open enough protocol for it to just work on both ARKit and ARCore?

[00:10:45.453] Jesse McCulloch: Yeah, so currently it's a separate DLL that you drag into your project depending on what you're using. But like you said, we're really pushing this open ecosystem, you know, going where the developers are, on the devices that they're working on. And that just allows people to not have to think about, oh, am I building for this or am I building for that? And is this going to work across them? So currently we support ARCore and we have ARKit, and then again for the HoloLens, which is x86, and the HoloLens 2, which is on ARM architecture, we've got plugins as well to make sure that we work across everything.
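The "one API, per-platform plugin" pattern Jesse describes can be sketched in a few lines. This is an illustrative sketch only: the class and method names below are invented for the example and are not the actual Azure Spatial Anchors SDK surface, and a real cloud store replaces the in-memory dictionary.

```python
class AnchorBackend:
    """Platform-specific tracking layer (ARKit, ARCore, HoloLens...)."""
    def create_native_anchor(self, position):
        raise NotImplementedError


class ARKitBackend(AnchorBackend):
    def create_native_anchor(self, position):
        return {"platform": "arkit", "position": position}


class ARCoreBackend(AnchorBackend):
    def create_native_anchor(self, position):
        return {"platform": "arcore", "position": position}


class CloudAnchorSession:
    """App code talks only to this class; the backend plugin varies per build."""
    def __init__(self, backend):
        self.backend = backend
        self._cloud_store = {}   # stands in for the shared cloud service
        self._next_id = 0

    def save_anchor(self, position):
        # Create the platform-native anchor, then persist it under a
        # platform-neutral id that any device can resolve later.
        self.backend.create_native_anchor(position)
        anchor_id = f"anchor-{self._next_id}"
        self._next_id += 1
        self._cloud_store[anchor_id] = position
        return anchor_id

    def locate_anchor(self, anchor_id):
        # A later session (possibly on a different platform) resolves the
        # same id back to a position in the shared space.
        return self._cloud_store.get(anchor_id)
```

The point of the design is that app code never branches on platform: it constructs a session with whichever backend the build target provides and calls the same `save_anchor`/`locate_anchor` methods everywhere.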

[00:11:20.323] Kent Bye: And maybe you could expand a little bit more on this fourth generation of the Kinect camera. Because when I first got into virtual reality, the Kinect version 2 had just come out. And I think it was really being targeted towards gaming, but it never really either was supported or took off. It seemed like it kind of died. But then a lot of the developers of the Kinect cameras started working on the HoloLens for the third generation to be in the HoloLens 1. And yeah, I was told that the fourth generation is essentially the same tracking technology that's in both the HoloLens 2 and this fourth generation of the Azure Kinect camera. And so maybe you could just talk a bit about what types of things are made available in the SDK and what types of use cases you expect to see this Azure Kinect camera be used for.

[00:12:04.498] Jesse McCulloch: Yeah, so as you mentioned with the Kinect 2, it was built for Xbox as an Xbox accessory, but people were taking and connecting it to their computers and doing some pretty awesome computer vision stuff, depth sensing stuff with it. And when that project was basically end of life, we heard this kind of big uprising of people who were like, hey, this was super useful for us, not in a gaming context, but we're using it in work and in research and whatnot. And so we kind of went back to the drawing board and said, OK, well, what does it look like if we have a device that has this technology in it that's not built around gaming, that's built actually for people to use within a workplace or whatnot? And how will people use it? And that's always a big question. And Microsoft tries to do a really good job of listening to its users and seeing what they're doing with stuff, and then iterating on that feedback to make their products better. And so with the Azure Kinect DK, really we wanted to take a bunch of different sensors, put them together in a package, add an SDK that kind of pulls it all together, and then see what people do with it. That way we can gather that feedback and see whether having all these sensors is really necessary, if people are just looking for just the depth camera and the color camera, or just the depth camera and an IMU, and what they're going to do with it. So that's kind of the idea behind this Azure Kinect DK, is to get it in the hands of developers and see what feedback we get and how they want to use it, as well as, you know, how enterprises want to use it. So along with kind of the Azure edge scenario, the edge and cloud story actually, being able to take these devices that have all these sensors and send that data back up to the cloud and then iterate on it and analyze it and then send data back down is really kind of a key component of that story.
So this is just one more edge device that is sending data back, and people can send up the color camera feed or the depth camera feed or the audio, and some of our Cognitive Services speech services work really well on it. But really we just want to see what people want to do with it and how they're using it, and then we iterate on it from there.

[00:14:00.270] Kent Bye: And one of the other trends that I'm seeing is the adoption and embracing of containers, like Kubernetes or Docker containers, and how some of the Microsoft Cognitive Services can be taken off of Azure, put into a container, and then onto an edge compute device. And so is it possible to put a container with some computer vision algorithm that's all managed by either Kubernetes or Docker onto an Azure Kinect device and have everything kind of run locally, in terms of doing computer vision or other types of AI detection that would only be happening on that edge compute device?

[00:14:35.091] Jesse McCulloch: Yeah, so the Azure Kinect DK actually doesn't have any onboard compute. So it requires that you have it connected up to a computer. But we've got it working on things as small as an Intel NUC. And you can absolutely put containers on the machine that it's connected to and run those services locally on that machine.
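The edge setup Jesse describes can be sketched as a simple host-side loop: the Azure Kinect DK has no onboard compute, so frames flow over USB to the attached machine, where a locally hosted service (e.g. a containerized vision model) processes them, and only compact results go up to the cloud. The sensor and model below are stand-ins invented for the sketch, not the real Azure Kinect SDK.

```python
class AttachedSensor:
    """Stand-in for the depth/color camera attached over USB."""
    def __init__(self, frames):
        self._frames = iter(frames)

    def next_frame(self):
        # Returns the next captured frame, or None when the stream ends.
        return next(self._frames, None)


def local_vision_service(frame):
    """Stand-in for a containerized model running on the host machine."""
    return {"objects_detected": len(frame["depth_points"])}


def run_edge_loop(sensor, service, upload):
    """Process every frame locally; send only small results upstream."""
    results = []
    while (frame := sensor.next_frame()) is not None:
        result = service(frame)    # inference happens on the edge host
        upload(result)             # compact JSON to the cloud, not raw frames
        results.append(result)
    return results
```

The design choice worth noting is the one Jesse hints at: raw depth and color streams are large, so the container runs next to the sensor and the cloud sees only derived data, which is what makes an Intel NUC a viable host.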

[00:14:52.336] Kent Bye: I see. And so last year, I talked a little bit about the MRTK. I know there's the VRTK within the larger community, and then there's the MRTK at Microsoft. And so maybe you could tell me a bit about what is this mixed reality toolkit?

[00:15:04.572] Jesse McCulloch: Yeah, so the Mixed Reality Toolkit, or MRTK, really started with the first HoloLens, and at that point it was called the HoloToolkit. At that point it was just kind of a collection of scripts that made life a little bit easier for doing development with the HoloLens. We moved from Windows Holographic to Windows Mixed Reality when we added the VR headsets in. And so at that point, the toolkit kind of expanded a little bit. But it's been built over time, and it's kind of just been this hodgepodge of stuff that keeps getting added on. So about a year ago, we went back to the drawing board with it with some community members who looked at it and said, let's actually build this. Because we were getting a lot of feedback like, oh, you have to bring the whole thing in. It's kind of monolithic. It bloats your project. It's not built with performance in mind, because it's all just built on top of each other. So we brought it back in and said, OK, we're going to rework this. We're going to start with it, make it modular so you can pull in just pieces if you need to, make it performant, and then make it work with HoloLens 2. So that's what we've been working on for about the past year, making that transition to MRTK v2, and that just went into release candidate mode about two or three weeks ago. But it's kind of the best way right now to build for HoloLens or our mixed reality headsets. But again, with that same cross-platform idea in mind, we've built it in a way that if you are building for other platforms, you can have it as part of your project and it won't cause any issues, with the idea that we'll be able to add support for other platforms as we grow with it.

[00:16:31.312] Kent Bye: Is that all written in Unity, or is there also a version of MRTK which is using Epic Games' Unreal Blueprints?

[00:16:37.094] Jesse McCulloch: Yeah, so right now it's Unity. We're kind of waiting for our Unreal story to kind of emerge a little bit from Microsoft, and then we can start building a toolkit that will work there too.

[00:16:47.270] Kent Bye: Well, it was a little unfortunate at the beginning of the keynote because there was supposed to be a big HoloLens demo that I think had some technical difficulties, but I know Dean from VentureBeat had reported that there were going to be some Epic Games demos being shown. Can you give me any information in terms of what was being shown, or what this partnership with Epic Games is at all?

[00:17:07.563] Jesse McCulloch: So I don't know what was being shown. I was actually driving up from Portland this morning, listening to the keynote, and saw kind of the same thing happen somewhat live with you. But at Mobile World Congress, talking about our open platform, we announced that Unreal support is coming. That'll work with HoloLens. Actually, we've already had Unreal support for the VR headsets for about six months now, and it was kind of something that was just quietly released. But yeah, now we're working on kind of what that Unreal story is with Epic, and moving into this open ecosystem where our developers will really have choice in where they want to develop.

[00:17:41.914] Kent Bye: Yeah, there was a moment where they had mentioned streaming to the headset that Epic has worked on. I don't know if that was pixel streaming or if you have any more information in terms of what they meant in terms of streaming onto the headset.

[00:17:54.840] Jesse McCulloch: Yeah, so one of the things we announced at Mobile World Congress that's in really limited private preview is our Azure Remote Rendering service. So it actually lets you render high-density, high-poly models in the cloud and then stream them down to the device. And so I think that's what they were talking about.

[00:18:09.043] Kent Bye: OK, so you can do off-board rendering and then have low enough latency so as you move your head around, be able to do rendering in the cloud, essentially.

[00:18:17.250] Jesse McCulloch: Yep, exactly. And we've had this running, you know, inside Microsoft on HoloLens 1, being able to render complete cars, with details down to individual screws, super high polygon counts, and keeping it at 60 frames per second as you move around to look at it. And it's super awesome. And again, right now it's in a very limited private preview, and then eventually we'll be able to open it up into a public preview and then launch it as a service.
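A back-of-envelope latency budget shows why the remote rendering scheme described above is plausible. Every number below is an illustrative assumption for the sketch, not a Microsoft figure.

```python
FRAME_RATE_HZ = 60
frame_budget_ms = 1000 / FRAME_RATE_HZ   # ~16.7 ms per displayed frame

# Hypothetical pipeline stages for one remotely rendered frame:
budget = {
    "cloud_render": 8.0,      # GPU renders the high-poly scene
    "encode": 4.0,            # video-encode the rendered frame
    "network_rtt": 20.0,      # round trip to a nearby Azure region
    "decode_display": 4.0,    # decode and present on the headset
}
total_ms = sum(budget.values())          # 36 ms end to end

# 36 ms is well above the ~10-20 ms motion-to-photon target Kent mentions,
# which is why cloud rendering schemes generally pair streaming with
# late-stage reprojection: the device re-warps the last received frame
# against fresh head-pose data just before display, so perceived latency
# tracks the local display loop rather than the network.
frames_behind = total_ms / frame_budget_ms   # ~2.2 frames of pipeline depth
```

The takeaway of the arithmetic: even an optimistic cloud round trip costs a couple of display frames, so the local re-warp step is doing real work in keeping head motion feeling one-to-one.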

[00:18:43.700] Kent Bye: So one of the things that I was really struck by when I was watching the presentation from Mobile World Congress announcing the HoloLens 2 was a slide where you had all of the different software partners that you've had, and an ecosystem that you've been developing on the HoloLens for the last three to four years or so. And so there seemed to be a little bit of an emphasis of, hey, don't worry, you're not going to have to build your own app, because there are lots of different people that are going to be providing these different services to you. If you want to start using it right away, you'll have the opportunity to do that. Which I thought was interesting, because it's not about getting the HoloLens into the hands of the customers and then expecting them to develop everything from scratch and learn everything they need to about spatial computing and how to build an optimized spatial platform. But maybe you could talk a bit about that as a strategy, in terms of this wider ecosystem that's been cultivated, and taking the burden off of some of these companies to do everything themselves, by having a big, wide ecosystem of offerings that they may be able to buy into, so they can dive directly into starting to use the HoloLens with a lot of apps already.

[00:19:49.745] Jesse McCulloch: Yeah, so with the first HoloLens, you know, kind of one of the burdens, like you said, is you get the device, you put it in the hands of the developer, and now you have to wait three or six months or however long it takes, and you do a proof of concept, and then you go through a bunch of testing. And really it was taking companies a long time to roll apps out and realize the value of having this device. So, you know, we've got some internal offerings with our Dynamics 365 apps, Remote Assist, Guides, and Layout, where companies will be able to buy these devices and immediately be able to put them to use without having to build out a proof of concept or anything. And then allow them at the same time to build out more specific apps, either build them themselves or use one of our ISV partners who will come in and build custom solutions for these companies. So that time to value is super important, because it was taking a long time. Companies were putting a large investment into trying to get an app off the ground. And so this was one of the key pillars. And again, we have a very rich ecosystem of partners and enterprises that are using it. We're really focused on that enterprise space right now because that's where we see companies that gain a lot of value. If you're in a manufacturing line and that line goes down, you know, you can be talking millions of dollars a day that these companies are down. So for them, if they have a tool that can let them see problems ahead of time or improve those processes, it's really, really easy for them to realize the value in those devices.

[00:21:13.547] Kent Bye: Yeah, and today at Microsoft Build was my first opportunity to have hands-on demos on HoloLens 2, both on the Spatial app very briefly, and then a little bit more involved demo that was being shown here that you helped show me through, actually. And I thought the eye tracking was super impressive to be able to have a response of be able to look at things and to speak and then have it react to whatever I was looking at. I think that's going to be a huge application. I was surprised to see the latency of interacting with holograms. First of all, there's a certain amount of fluency of learning how to interact with holograms with the little box, just like when you're resizing things in a Photoshop application and there's a certain understanding of all the different ways to do that and so knowing where to grab and how to grab to be able to either rotate or to make it larger that was not immediately intuitive and I think that people will eventually figure out and they'll probably be standards but I've also found that there was more latency than I was expecting that when I would grab onto the edge of a hologram and kind of move it around there was probably a 30 millisecond latency where, you know, the motion to photon latency wants to be, like, less than 10 milliseconds or less than a couple of milliseconds, and that's pretty imperceptible that there's a shift, but there is a definite perceptible latency for me moving the hologram down and how long it took for the hologram to actually move down. It wasn't, like, one-to-one. And I've had demos within, like, say, Leap Motion was tracking my hands and I was moving around virtual objects, and that was, like, the lowest latency interactions I've seen with with holograms. And so maybe you could talk a bit about that latency. 
Is that a known issue in terms of like having specific numbers for what that latency is and trying to get it down to like a specific threshold where it's like imperceptible that you're moving and being able to interact live in real time with holograms?

[00:22:55.789] Jesse McCulloch: Yeah, so this is pre-release hardware. So we're still working through a lot of things with the displays and the firmware and everything. So what you experienced today is not the final shipping version. But also there's always this balance on a mobile device, where you're wearing the whole thing on your head, you know, those trade-offs of where do you spend your battery power? Where do you spend your display power and whatnot? So those are all things that we balance and try and work through and experiment with. And we do a lot of user research on what feels good. And like I said, we're still kind of working through a lot of this stuff. So I would expect that that gets better over time as our models get better and whatnot. But again, it's always a trade-off with battery. Things like Leap Motion, where you're tethered to a computer and you've got the full strength of a tethered computer and GPU and everything, allow you to have a little bit better refresh rate on some of that stuff than we can do in a mobile device.

[00:23:46.748] Kent Bye: What's the refresh rate on the HoloLens 2?

[00:23:49.630] Jesse McCulloch: On the displays, it's 120 hertz. I'm not sure on the hand tracking.
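Some quick arithmetic connects the refresh rate Jesse cites to the lag Kent describes. These are simple derived numbers, not measured figures for the device.

```python
display_hz = 120
frame_period_ms = 1000 / display_hz   # ~8.33 ms between display frames

perceived_latency_ms = 30             # Kent's rough estimate above
frames_of_lag = perceived_latency_ms / frame_period_ms   # 3.6 display frames

# So a ~30 ms lag means the hand-tracking result driving the hologram is
# several display frames old - enough to be visible when dragging an object,
# even though consecutive frames are only ~8.33 ms apart. The bottleneck is
# the tracking pipeline, not the display refresh itself.
```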

[00:23:53.792] Kent Bye: OK. Because I imagine there's got to be some computer vision stuff that's going on there as well to be able to detect and segment and kind of make sense of what's happening with these different gestures that also requires a certain amount of processing power, I imagine.

[00:24:06.003] Jesse McCulloch: Yeah, so just like with the first HoloLens, we had the holographic processing unit, which was some custom silicon, an ASIC chip, basically, that took all that sensor data and kind of made sense of it for the HoloLens. We've got the HPU 2.0 on this one, which does some deep neural network stuff, which lets us improve the hand tracking, but it's taking all that sensor data and making it so that we can read it and start making use of it.

[00:24:29.097] Kent Bye: Now, the other mechanic that I thought was new and interesting was that you have these little holographic icons that you're expected to push your finger out and poke and touch. And I ended up having to push through further than I imagined I would need to. And I guess my concern with these types of spatial interfaces is that anybody that looks at, say, the Minority Report type of UI, people will say that it looks really flashy and amazing on the screen, but to actually do that for an extended period of time is extremely fatiguing. And so I could imagine that there is a certain satisfaction to pushing buttons and skeuomorphic types of design, but I could also imagine that it could be very fatiguing if I was expected to do that for six to eight hours a day. And so I don't know if that means moving to tablet interfaces, or doing some sort of controller-based input where you're able to push buttons. And there's a certain amount of visceral connectedness in getting your hands into the experience and moving the holograms around; there's a certain amount of embodied cognition that is activated there. But you have to trade that off with the interfaces and the types of UI that are going to be able to be sustained over long periods of time. Again, there's these different trade-offs there, but I'm just curious how you start to think about those types of UX decisions and how to trade off how it feels to do it versus the fatiguing impact of different gestures.

[00:25:48.049] Jesse McCulloch: Yeah, so multimodal input is obviously going to be super important in this area, being able to give people the option of how they want to interact with things, whether it's voice commands or reaching out and pushing through a button. But also for developers, it's really on them to be cognizant of what they're asking their users to do. And you don't want to have something where you have your arm up for a long period of time trying to poke through a bunch of buttons, because that does get fatiguing. So these kind of quick interactions where it's something where you just reach out and poke a button, like if you were changing a radio station in your car: you reach out, you press the button, and you're done. So being able to take some of those same, for lack of a better word, instinctual interactions for how we do things every day in our normal life and pull those in is going to be the way that we foresee people building interfaces. And this whole 3D UI and UX is still a very new field for all of us, Microsoft and VR and AR in general. So I think it's an experiment in progress for everybody as we try and figure out what a good UI is for 3D. And we're as interested as everybody else in seeing where that ends up.

[00:26:51.682] Kent Bye: Yeah, and just last week, it looked like there was some announcements that were made during the MR Dev Days. There was a blog post on Unity talking about if you're a mixed reality developer and you want to get started into developing for the HoloLens, then maybe you could talk a bit about what was announced there on that Unity blog post.

[00:27:07.872] Jesse McCulloch: Yeah, so when we announced in February at Mobile World Congress, we announced basically the commercial edition of the HoloLens 2, which I don't know the details of off the top of my head as far as what it all included. I believe it included a seat of Dynamics 365, and it was really geared towards commercial enterprise customers. So this week, or last week at Dev Days, we announced a developer SKU, which is more focused on getting devices into developers' hands. There's a payment plan option. It includes some Azure credit, as well as, I think, a three-month Unity Pro license and three months of the Unity PiXYZ plugin. But we're trying to figure out how we lower the barrier of entry for people coming in. Because with the original HoloLens, you had to front $3,500 to get in and start playing in this space. So we really wanted to make it a lot easier for developers to get in and get started with us.

[00:28:00.211] Kent Bye: Yeah, in that post I saw that you could pay $3,500 up front to get the unit outright, but there's an option to pay $99 a month. And is that like a forever fee that they would be paying, like a hardware subscription? Or is that something that is more like a loan that has interest, and that you eventually pay off, but with more payments? I'm just curious if Microsoft is thinking about this model of a hardware subscription, where you would get a subscription, and I'd imagine if companies were getting it, they would always kind of get the latest version, almost like leasing the hardware rather than owning it. And rather than having this one-time payment of $3,500 or $5,000, whatever it ends up being, you could have a lower recurring monthly subscription fee, just to give more access to different people, which may work better in their budgeting.

[00:28:48.797] Jesse McCulloch: Yeah, so actually I don't have the details of exactly how that's going to work. I'm still waiting for our sales and marketing teams to come back to us on that. But as you mentioned, it's really about lowering the barrier of entry, getting hardware into people's hands, and getting them started. The more people we have in the ecosystem, the faster it'll grow, and the faster we'll figure out things like UI design, UX, and other use cases that we haven't thought of.

[00:29:11.808] Kent Bye: Were there any mind-blowing demos that you saw at the MR Dev Days or just the use cases that you're seeing in terms of what people are doing with these headsets?

[00:29:20.534] Jesse McCulloch: So for most of the people that were there, it was their first time getting their hands on the HoloLens 2. And one of the things we had was this hands-on lab area where people could go, and we had some tutorials available. But they could also try and bring their own app over or build something brand new, which was really exciting: to get 400 people in there building, and just seeing the excitement as they were able to move their apps over, see them work on HoloLens 2, and kind of add a little bit of the new interaction model. So that was the really super exciting thing for me, just seeing people playing with the device and trying to figure it out and learn with us.

[00:29:52.525] Kent Bye: So what are some of the either biggest open questions that you're trying to answer or open problems that you're trying to solve?

[00:30:00.448] Jesse McCulloch: Yeah, so I mean, the open questions are always: what are developers looking for, and what can we do to serve them a little bit better, whether it's services, or adding things to the Mixed Reality Toolkit, or opening a developer SKU to get the device in their hands. So we're always looking for feedback pretty much across the board, whether they're having software problems, hardware problems, or everything's working great and they just want to tell us that. We're really making sure we have an open line of communication to them, which we do through Slack or Twitter, or they can email us directly. We've got devrelations@microsoft.com as an email address people can reach us at. So we're really gathering that feedback. And even though sometimes it doesn't feel like we're listening, or like we're not moving fast enough for them, we are. And HoloLens 2, I think, is kind of a testament to that. We heard a lot of feedback about the field of view, so we improved that. We heard a lot of feedback about how the air tap gesture was really hard to teach people, so we've added this direct interaction model. A lot of feedback about people wanting to use eye tracking, so we added eye tracking. People saying that when you're in a convention center and you're trying to demo and you're trying to use voice commands, it can't hear you. We made that better with some beamforming mics. We've turned up the volume on the device so you can hear it in louder environments. So all this feedback we've gotten through our enterprise customers and through our developers goes back into improving the device. And we get there. The feedback does come in, and we mull through it and make sure that we're doing the right thing. But we do act on that feedback.

[00:31:27.205] Kent Bye: Great. And finally, what do you think the ultimate potential of mixed reality is, and what it might be able to enable?

[00:31:35.127] Jesse McCulloch: Yeah, so I always have fun talking about this with, like, my mom, because she's the bare minimum tech user. She has an iPad and her iPhone, and that's about the only tech she works with. And she asks me, you know, like, what am I ever going to use this for? And, in general, she's just spatially challenged. Like, every time she drives to the airport, she has to pull up, you know, the GPS on her phone to get her there. And I was like, imagine if you were able to just wear a pair of glasses or something like that, and then when you say, map me to the airport, you see just a line down the road that you follow to get there. And that just, like, blows her mind, the idea that it would just be built in, something that she has access to and doesn't have to think about. And that's where, you know, in my sky-high dreams, I see this going: that it is just part of our everyday lives, used to augment what we're doing, almost to the point that you don't realize it's there anymore. And that'll be the magic moment, when we get to the point where it's just there and we don't realize it, and it's just helping us out.

[00:32:32.543] Kent Bye: Is there anything else that's left unsaid that you'd like to say to the immersive community?

[00:32:37.604] Jesse McCulloch: Come join us. We really want you guys on board. We want everybody to be able to come create with us and help move this platform along and really raise the level for everybody so that we move everybody up faster and make this as big as we can.

[00:32:50.388] Kent Bye: Awesome. Great. Well, thank you so much. Thanks, Kent. So that was Jesse McCulloch. He's a program manager for Windows Mixed Reality on the developer relations team at Microsoft. So I have a number of different takeaways about this interview. First of all, I was really impressed with my demos of the HoloLens 2. I think there are a lot of things that are still a little bit rough around the edges, and the interactions aren't as low-latency as some of the virtual reality demos that I've seen. But those demos I was comparing it to were also on a PC that was tethered. So to see what they're able to do within the constraints of the head-mounted display, it's a huge leap forward from the HoloLens 1. The comfort's a lot better, and the field of view, I didn't really notice it as being an issue the way it was before. I mean, it was, like, super tiny before. Now it's at about the same level as the Magic Leap, which is a good sweet spot. I think, you know, over time it's going to get bigger and better as processing power gets better. And also, just in talking to Jesse, there's a lot of deep learning and artificial intelligence involved. In a lot of ways, the advances that we've been seeing in deep learning and AI are going to continue to slowly filter into a lot of these augmented reality technologies, just to be able to detect the hands and do the types of interactions you can do, which I think was a much harder problem just a number of years ago. There have been so many radical innovations in AI that are going to continue to be filtered into these different devices. So, you know, I think the thing that's really striking to me is just the ecosystem that Microsoft has been cultivating, and that's a real differentiating factor from any other company.
I mean, you go to Microsoft Build and the culture there is super interesting, just because they're really trying to be a little bit more agnostic about their own products and to just try to empower developers and go where developers are already going. I think that came from the response of just not having a seat at the table when it comes to a mobile operating system. Windows Phone never really took off, and so they've had to kind of pivot and see that developers still needed to create mobile apps and applications, and to put them on both Mac and PC and on iPhone and iOS. And so I think the way they've been architecting things is this much more open and decentralized model, and I think the CEO, Satya Nadella, has really helped bring about a lot of those shifts. I've just been constantly impressed with Microsoft and the cultural shifts that I've been seeing in the company, and it's really embodied within their conferences, which are becoming much more distributed. I mean, they still had talks, but they took over, like, the entire expo floor. As much as I could see, it was basically a lot of stations where people could go ask questions and directly engage with employees at Microsoft. And they had a number of different companies that are usually represented as partners, but still, they had so many different sections that were just Microsoft employees directly engaging with the developer communities. It was interesting, just because if you had specific problems or questions, you were able to go up to many different Microsoft employees, and it's a little bit different from a lot of the other developer conferences that I go to. There were so many Microsoft employees there staffing the different booths, answering questions, and engaging directly with the developers.
In terms of the traditional model of conferences, where it's this broadcast model and you go listen to lectures, Microsoft's developer conference is really embodying how much more of an open and decentralized company they've been becoming over time. Along with that, they have edge compute, and they're kind of looking at the HoloLens as an edge compute device. And so a lot of the strategy that I think Microsoft is taking is to look at all these other services on the back end, with Microsoft Cognitive Services and the different AI offerings. And then there was the announcement that was a bit of an unfortunate botched demo from Epic Games. They were supposed to have this big six-to-eight-minute demo about Apollo 11. They're coming up on the 50th anniversary of the July 20th, 1969 landing. There were a lot of movies about Apollo 11 at Sundance, where I saw an amazing documentary that's going to be coming out, and there are a lot of people talking about Apollo. They had done this whole augmented reality spatial story with a couple of people from ILMxLAB and an author who's been tracking the history of Apollo. They were going to, like, walk through and tell all these really interesting parts of the history of the landing, while actually pointing at the spatial representation of the Apollo spacecraft as it was flying through the air, and all the different dynamic aspects of that. And it was using Epic's pixel streaming, so you're able to do this high-quality visual representation that can be pixel-streamed down. So rather than dealing with the limitations of the HoloLens, which I think is about a hundred thousand polygons at the moment, they're able to do, like, a hundred million polygons. That pixel streaming service is something that, you know, Epic Games and Tim Sweeney were at Mobile World Congress announcing, saying that Epic was going to start to have these collaborations with HoloLens.
And so they had this big, huge, flashy demo that was supposed to kick off the keynote for Microsoft Build, but the demo didn't work. And so there was this really super awkward moment where they basically had to say that doing a live demo is sometimes harder than landing a rocket on the moon. Anyway, it was a bit of a sad beginning to Microsoft Build, but fortunately they were able to show the video of what was supposed to happen. I'd highly recommend people check it out, because it's a really compelling use of AR as a form of storytelling. It's like when you get a tour by someone, like if you're on a museum tour and you have this docent, and you're able to walk around and have them point out different things. It just felt like a bit of a guided tour of this whole journey of the Apollo 11 mission. It looked great, but they didn't have any actual demos where you could check it out. I'm excited to see eventually what that looks like, just because all the demos we've seen so far have been, again, with super limited numbers of polygons and complexity in the scenes. To have access to be able to push that level of complexity in a scene, I think, is also going to be a really big aspect in the future. To do that off-device rendering and then send it down in real time, to have this low-latency off-board rendering and be able to receive these highly complex scenes, I think is going to be a huge thing as well. So there wasn't a lot of emphasis during this actual Microsoft Build on the HoloLens, but they've still continued to focus on producing the hardware, getting it into the hands of the developers, and continuing to build a much more robust ecosystem. At the Mobile World Congress, they showed, like, hundreds and hundreds of different logos of their different partners.
And so rather than having people buy a HoloLens without any expertise in how to create a spatial computing application, there are a lot of companies out there that are going to have different software and services made available. So I think that Microsoft has been focusing on the enterprise, and the enterprise isn't necessarily the most interesting thing for a lot of the people within the XR industry to either talk about or cover, but I think it's actually going to sustain a lot of the growth of augmented reality, and also virtual reality as well. And so I tend to want to go to these conferences just to see what kind of developers are there and what they're working on. Unfortunately, there weren't as many different demos there really showcasing the power of augmented reality. I did also get a chance to try out the Spatial demo and talk to the CEO there. Yeah, I also ran into Tipatat Chennavasin, and he's seeing now that this is a huge turning point, and he's looking more seriously at investing in different companies that are creating HoloLens applications for the enterprise. A lot of the hesitation in investing in companies that were building for HoloLens came down to, you know, how feasible is this to get deployed? Now there are new leasing options, I think, for the hardware, so that if people don't want to pay all the costs of buying the HoloLens up front, they can pay per device per month, more of a leasing option, or something that is maybe a little bit more of a payment plan. I think it's just going to increase the accessibility of these devices, so more people are able to work with them in different ways. So I have a lot more interviews with different people talking about their impressions of HoloLens, a little bit of a sampling of different folks that I ran into there at Microsoft Build. But I think overall, Microsoft is far beyond anyone else in head-mounted augmented reality displays.
Number two, obviously, is Magic Leap, and Magic Leap has been focusing much more on entertainment. There was a whole military contract that came down to either Microsoft or Magic Leap, and Microsoft ended up getting that, like, $480 million military contract, and Magic Leap was going after that as well. But because they've had so much more focus on the entertainment side, I think that Microsoft just had a little bit more experience in the enterprise, but also just created a device that was a little bit more well suited to some of the needs of enterprise applications. Whereas I do see that Magic Leap was trying to be a little bit more of a consumer device, but the ecosystem is still so early and young. So my biggest question around Magic Leap is: are they going to be able to make decent inroads into the enterprise market in order to bootstrap the entire company and industry, before we're at the point where we're ready to have augmented reality head-mounted displays that are $3,000 to $5,000 for consumers to be buying and using? I think there are a lot of compelling applications in phone-based AR. And so it's all still yet to be seen: the different storytelling and other ways of telling spatial stories, the affordances of the medium, and the whole AR cloud, or what Magic Leap calls the Magicverse, and how that is going to eventually take off. I think, you know, over the next five, six years, we're going to slowly get to the point of reaching some sort of mass ubiquity for these immersive technologies, and I think it's going to continue step by step: conversational interfaces, AI, phone-based AR. Google I/O was happening at the same time as Microsoft Build. So usually I try to go to both, but this year I just decided to go to Microsoft Build, hoping that I'd be able to see a lot more of what's happening in augmented reality.
Google I/O also had a lot more about what's happening with phone-based AR, and adding, like, 3D objects to search. So when you search, you might be able to pull in an augmented reality result, pull out a 3D model, and start to put that into the room. So there was quite a bit of emphasis on augmented reality from Google. And because they're really trying to organize all the world's information, they have things like integrations of AR into Google Maps, so that when you are walking around, you'll be able to get directions based upon all the different ways that they're correlating the photospheres they've been taking with Google Street View with where you're at GPS-wise, to be able to dynamically detect the world around you and then give you directions. So it's those types of applications where I think we're going to start to see people using augmented reality a lot more. So there's a lot that's going on right now at this time, and I'm going to be at the Augmented World Expo, hopefully able to catch up with a lot more folks there within the AR industry, and also go through my backlog a little bit on a lot of the AR interviews that I've been doing at Microsoft Build for the last three years or so. There's a lot of content that I've been recording there, tracking the evolution of AR, and I think it's time to get a lot of those interviews out of my backlog as well. So, that's all that I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. And, you know, this is a listener-supported podcast, and so I rely upon donations from listeners like yourself in order to continue to do this coverage. And so I just wanted to send out a thank you to all my patrons; I wouldn't be able to do this without your support.
And if you'd like to support me in this effort to continue to try to do this real-time oral history and to spread the word about what's happening in the realm of augmented and mixed reality and virtual reality, then please do become a member and donate today at patreon.com slash Voices of VR. Thanks for listening.
