#913: All of Microsoft’s AR & VR Announcements from Microsoft Build with Director Greg Sullivan

There were not a lot of AR or VR announcements during the main keynotes of Microsoft Build 2020, but there were a number of announcements that I covered as comprehensively as I could in real-time in this Twitter thread. Azure Spatial Anchors went into general availability, Azure Remote Rendering entered public preview, and Azure Digital Twins expanded its IoT offerings. There were some major software updates to HoloLens 2, the number of regions where it’s available expanded from 10 to 25, and today they’re announcing that the HoloLens 2 will be generally available online and at Microsoft stores starting later this summer.

I had a chance to talk with Microsoft Director Greg Sullivan about all of their AR announcements from Build as well as some updates on other mixed reality and virtual reality products including the Azure Kinect camera, WebXR, their collaboration with HP and Valve on the HP Reverb G2, Minecraft Earth, MRTK, AltspaceVR, and their Mixed Reality Dev Days which is happening later this week.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye and welcome to the Voices of VR Podcast. So on May 19th and 20th of 2020 is the Microsoft Build Conference, and Build is happening virtually this year. So all the sessions are happening online. Usually it's an opportunity for Microsoft to make announcements about a number of their different products. Sometimes those announcements come to the main stage and other times they come through other channels. And so this year there wasn't a lot of new Microsoft HoloLens news for either AR or VR that happened during the keynotes. So I had a chance to talk to a director at Microsoft, Greg Sullivan, who went over all the different things that they actually are announcing for the HoloLens. And so we go through each of their announcements through expanding to new regions, the inclusion of the spatial anchors, as well as the remote rendering. There's a digital twins announcement as well. It's actually going to be released in the summer for consumers generally, which is something they didn't announce on the first day, but they're announcing on the second day, the same day that this podcast is going to be coming out. And they also had a new software update. But we also talk about a little bit of the VR stuff, what's happening with Altspace, some of their new headsets that they're working on with HP and Valve. So just some brief information about what else is happening in VR as well. So we're covering all that and more on today's episode of the Voices of VR podcast. So this interview with Greg happened on Tuesday, May 19th, 2020, where I was in Portland, Oregon, and he was in Seattle, Washington. So with that, let's go ahead and dive right in.

[00:01:43.184] Greg Sullivan: My name is Greg Sullivan and I'm a director for mixed reality at Microsoft. I work on part of a team that works across with the engineering and marketing and sales organizations to help folks understand what our mixed reality technologies are and how we're going to market.

[00:01:59.628] Kent Bye: Great. So today is Microsoft Build, and I know that usually there are different announcements that come out, and I usually like to watch the keynote to see what's announced to the entire community, but there wasn't too much that was announced to everybody in the major announcements. But it sounds like that you, for the past week, have been going around and sharing the big news for the HoloLens. So maybe you could catch me up to date as to what's being announced with the HoloLens and mixed reality there at Microsoft.

[00:02:24.235] Greg Sullivan: Yeah, you bet. We have a medium size, I'll call it, mixed reality update at this year's Build. As we were putting this together, we realized, boy, there's a whole bunch of momentum here and good progress happening in a bunch of dimensions. We're seeing customer momentum. HoloLens 2 launched back in November, became available. We announced it, of course, last year at Mobile World Congress, and we started shipping it right at the end of 2019. And in the few months that it's been available, we've seen tremendous response from customers. Folks are using us in a whole bunch of exciting ways across a range of industries, in a whole host of use cases and scenarios, some of which we hadn't even anticipated. So it's really exciting to see the momentum behind that. And so as part of a response to that, one of the things that we announced at Build was that we are expanding the geographies, the countries where HoloLens is available. I'm not going to rattle them off. We're adding a whole bunch of additional countries, as I think you pointed out. And so that's exciting because later this fall, we'll begin selling in those countries. So we're going to give a heads up to the markets where we're coming. And then this fall, as we continue to ramp up our production, we'll make HoloLens 2 available in those countries. And so that's one of the exciting things that we talked about. The other was another expansion in terms of the channels of availability. I think folks have been excited to get their hands on a HoloLens 2. And we announced that this summer, we're going to begin selling HoloLens 2 in the Microsoft stores. So you'll be able to either go in person or online to a Microsoft store and purchase a HoloLens 2. Now we are continuing to position and talk about HoloLens as a commercial device. We're not trying to sell this to your son to play games. In fact, I had a buddy of mine ask me that very question.
And I said, you know, you may want to move down the list because HoloLens is a commercial device. And it's designed for businesses. And the return on investment is so profound, it's a little bit more challenging to achieve that return on investment if you're playing a game in it. It is, as folks have pointed out, a great device to play a game in. But that's really not its sole purpose. The value of HoloLens as a commercial device is pretty profound. So we're excited that that will expand the reach of HoloLens in terms of the channels of distribution and get it in the hands of more folks, including developers who have been excited to get their hands on a device so that they can start playing with it themselves. So that was part of the announcement we made, just kind of expansion of market availability and expansion of distribution channels. We also talked about momentum and progress in terms of the device itself. We shipped a software update last week to HoloLens 2 that added a host of new capability and fixes and improvements across the board. But there's really a couple of broad themes of improvement that highlight this release. The first is around management and security. As we mentioned, HoloLens is a commercial device designed to be deployed in an environment that has some device management infrastructure. And we've enhanced the capabilities of HoloLens 2 in that regard. A few years ago, we announced a program called Windows Autopilot. And that was designed really for our original equipment manufacturer, our PC manufacturer and ODM partners to pre-configure devices before they get deployed in a corporate environment so that they can be provisioned with the proper packages and security and authentication. Then you can just roll devices right out into your environment. Well, we realized that's a great thing to do for PCs and all kinds of devices, and it also makes sense to enable that capability for HoloLens as well.
Windows Autopilot is now available in HoloLens 2. In fact, this year as we manufacture the devices, they will have a barcode on the side of the box that can be used by our partners to pre-provision and configure those devices. So we're pretty excited about that. You know, mobile device management is another area where we have expanded capabilities of HoloLens 2, just enabling customers to configure and manage these devices throughout their lifecycle, including now support for FIDO2 security so that we can more easily support the secure and authenticated use among multiple serial users. These days, that's a little trickier. You're definitely wiping down that device real thoroughly between uses, but there are environments certainly where we have multiple people using a single device, and so we want to enable that scenario very easily. Then there's a host of other provisioning package and device management capabilities that have been added. There's really a whole lot here in this update for enterprises that are managing these devices, getting them provisioned, getting them rolled out, keeping them updated, and just managing their overall lifecycle. The other bucket of functionality, I'd say, in this update is really around just a whole bunch of kind of user experience improvements. One of the things that, Kent, I know you know we were excited about with HoloLens 2 is the ability for it to do this fully articulated hand tracking. You don't need to use the air tap gesture just to interact with holograms. You can literally reach out and grab these three-dimensional digital objects with your hands.
Well, we've done a lot of learning in the last few months, and there were a couple of cases where, one example is if your hand is in front of your face and your fingers are curved and your index finger is occluded by your palm or the back of your hand, we made some guesses. We had a whole bunch of machine-trained systems that learned how to predict where your fingertip would be and made some assumptions about that when you're trying to do input on a soft keyboard, for example. But we weren't 100% right, and so we made some tweaks to that, for example, to make it easier to type and to not have false invocation of a select motion or an air tap. Another interesting thing that happens when you're doing hand tracking is if you have multiple users kind of working, say you're working at a bench side by side and your hands are near my hands, and, you know, the depth camera is spraying photons down there and getting that information back to determine exactly whose hands are whose and where the hands are. And it was possible that sometimes we could get into a situation called hand stealing, for example, where the system would look at the other person's hands and get slightly confused. So we've made improvements to make that much less likely. So those are just some examples of some of the kinds of things that we've done throughout the system to improve the user experience. Another thing we've enabled, I myself have switched to dark mode in Windows 10, and I'm really loving it, dark mode in my applications. And so we've enabled that, and in fact, turned it on as the default scheme in HoloLens 2. So just little user experience improvements across the board in addition to some of the performance and other enhancements. We've also added the capability to support Ethernet over USB connections, and that'll enable things like 5G tethering and other scenarios.
So a lot of new capability in this release, and as we looked across it left to right, we realized there's a whole bunch of new stuff in here. In addition to the customer momentum, the market momentum, new regions, new channels, new features in the update, we were also excited to talk about a couple of our mixed reality services. I know you've been following this as well, Kent. The Azure Spatial Anchors mixed reality cloud service is transitioning from public preview to general availability. It is now available for use. We're really excited that this is the shared coordinate system that underpins Minecraft Earth, for example, as well as many other applications and scenarios. But it's the ability to have these precise points of contact stored and shared among these devices in a cross-platform way. So not just HoloLens, but iOS and Android devices can participate in that as well. So we're excited about that cloud service being generally available. At the same time, we also transitioned the Azure Remote Rendering cloud service from private preview to public preview. Progress in terms of the expansion and availability of these cloud services, whether it's going from private to public or public to generally available, is just another example of the progress and momentum that we're making. We wrapped these up and looked across and said, boy, there's a whole bunch of stuff going on here. There's some really exciting customer examples as well, and we highlighted some of those at Build. Some folks like the National Health Service in the UK, they put out a call to an industry consortium asking for help in building ventilators, which of course is a key need right now. And so we were happy to participate and provide HoloLens devices to the NHS. And they're using that in a couple of interesting ways. They're using them to train workers how to manufacture ventilators.
As you know, they're complicated devices, and this is one of the areas where augmented reality plays a crucial role in simplifying the task of training workers how to do complex tasks. And when you have a three-dimensional digital overlay of a thing showing you the right way to do it, it's almost impossible to make a mistake. And so we're seeing that as one of the ways that they're using HoloLens. And then we're even seeing hospitals use HoloLens to minimize the interaction of front-line healthcare workers with infected patients, to be able to do treatments and to be able to have access to critical health care data, but also to do so in a way that doesn't require as much direct interaction. So we highlighted some of those at this year's Build as well. And so we're pretty excited about not just the ways that customers are using HoloLens, but the way that we're able to help in this crazy time that we're living in right now.
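Sullivan's description of Azure Spatial Anchors as a cross-platform shared coordinate system boils down to a simple idea: each device locates the same physical anchor in its own map, and content stored relative to that anchor can be recovered in any other device's frame. Here's a rough 2D sketch of that idea (purely illustrative math, not the actual Spatial Anchors SDK):

```python
# Illustrative sketch of a shared spatial anchor (2D, translation + rotation).
# Not the Azure Spatial Anchors API; the real service matches anchors via
# cloud-stored visual features and works with full 3D poses.
import math

def make_pose(x, y, theta):
    """Pose of a frame: position (x, y) and heading theta in radians."""
    return (x, y, theta)

def world_to_local(pose, point):
    """Express a world-frame point in the frame described by `pose`."""
    px, py, theta = pose
    dx, dy = point[0] - px, point[1] - py
    c, s = math.cos(-theta), math.sin(-theta)
    return (c * dx - s * dy, s * dx + c * dy)

def local_to_world(pose, point):
    """Express a point given in `pose`'s frame back in the world frame."""
    px, py, theta = pose
    c, s = math.cos(theta), math.sin(theta)
    x, y = point
    return (px + c * x - s * y, py + s * x + c * y)

# Two devices each locate the SAME physical anchor in their own maps,
# which have different origins and orientations.
anchor_in_device_a = make_pose(2.0, 1.0, 0.0)           # device A's map
anchor_in_device_b = make_pose(-1.0, 3.0, math.pi / 2)  # device B's map

# Device A places a hologram and stores it RELATIVE to the anchor.
hologram_in_a = (3.0, 2.0)
hologram_rel_anchor = world_to_local(anchor_in_device_a, hologram_in_a)

# Device B recovers the hologram in its own map via the shared anchor,
# so both devices agree on the same physical location.
hologram_in_b = local_to_world(anchor_in_device_b, hologram_rel_anchor)
print(hologram_in_b)
```

The real service does this matching with calibrated 3D poses, but the frame arithmetic, storing content relative to a mutually recognized anchor, is the core of how HoloLens, iOS, and Android devices can share one coordinate system.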

[00:12:12.325] Kent Bye: Yeah, so great. That was a great overview of all the news that you had today. When I have been traveling around for the last year and talking to different developers, I think the one thing that I hear is that people want the device, they just can't get a hold of it. So I don't know if there have been coronavirus and COVID-19 delays in production, or if you've just been slowly rolling it out. Maybe you could reiterate what you said about it being generally available, or this being available at the stores, and then maybe kind of lay out what the process is now to be able to go through it, like what kind of hoops you have to go through in terms of qualifying to even get access to the HoloLens 2.

[00:12:49.850] Greg Sullivan: Sure. Well, I'll back up a little bit. When we announced HoloLens 2 last February, at the time, we put up a website that enabled us to capture interest from folks who wanted to. It wasn't actually a pre-order, but we provided the ability to express interest in HoloLens. And so we captured a whole bunch of interest there. And then, as I say, we began shipping HoloLens 2 November 7th of last year. So near the end of last year, we started shipping. And in the intervening several months, we had captured a whole bunch of interest, including from some of our existing large HoloLens customers who had projects well underway, who were really excited to take the next step and move forward with these projects. We've been busy fulfilling those orders and ramping up production. What we announced at Build is that this summer, we will make HoloLens 2 available in Microsoft stores, both in our physical retail establishments as well as online. Now, we will continue to position it as a commercial device, as it is, but this is a way for enthusiasts, developers, and other interested parties to get their hands on HoloLens 2, and that'll start this summer. Now, in the meantime, we encourage folks who are interested to go to hololens.com, and I know that not everybody has a Microsoft enterprise account rep, especially smaller development shops, and you may not have a relationship with Microsoft. But we have a way now, again, to express interest and place those orders online. As we continue to ramp up production, we expect to be able to fulfill all of those orders.

[00:14:26.305] Kent Bye: OK, and one of the other pieces of feedback that I heard from some developers is that one of the reasons why they're still developing on the Magic Leap is because there's a six-degree-of-freedom controller. I think that, you know, having hand tracking and pinch controls is great for some use cases, but in other use cases, you just really want to have a physical button and you want to be able to have it tracked in six degrees of freedom. And so is that something that is either on the development map for the future, or is there going to be potentially some way to have third-party six-degree-of-freedom controllers be added in? So I'm just curious if there's any additional thought or progress or experimentation when it comes to additional controllers for the HoloLens 2.

[00:15:07.106] Greg Sullivan: Well, we have nothing to announce today, I'll share with you, but it is true that the underlying platform certainly has support for 6DoF controllers. One of the things that we encourage folks to do is to take a look at our Mixed Reality Toolkit, which supports multiple input mechanisms, hand tracking in addition to controller input. So it is possible to use that to develop applications. Yeah, you're asking about a specific hardware support that we don't have. Now, in the original HoloLens, we had the clicker, which was essentially a 3DoF input device. With the advent of the fully articulated hand tracking in HoloLens 2, we're really focused on that as a natural input mechanism. For the scenarios that we're targeting this device for, it is very often, if not almost always, the case where folks require kind of this hands-free, heads-up interaction. They're typically doing something else, and so we want them to be able to continue to do that thing that we're helping them do with HoloLens, and that often precludes requiring them to hold a device in their hands. We have heard this request, and it is true that the underlying platform, obviously, in Windows Mixed Reality supports that method of input. It's something that we'll continue to look at.
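The Mixed Reality Toolkit point here is that applications code against abstract input events, so hand tracking, the old clicker, or a hypothetical 6DoF controller can all drive the same interaction logic. A toy sketch of that pattern (illustrative names only, not MRTK's actual classes):

```python
# Toy sketch of input abstraction as found in toolkits like MRTK: the app
# listens for a generic "select" event, so any input device can fire it.
# Class and field names here are made up for illustration.
from dataclasses import dataclass

@dataclass
class SelectEvent:
    source: str        # e.g. "hand", "clicker", "6dof_controller"
    position: tuple    # where the selection ray or fingertip landed

class Hologram:
    def __init__(self, name):
        self.name = name
        self.selected = False

    def on_select(self, event: SelectEvent):
        # App logic sees only the generic event, never device details.
        self.selected = True
        return f"{self.name} selected via {event.source}"

cube = Hologram("cube")
# The same handler works regardless of which input device fired the event.
print(cube.on_select(SelectEvent("hand", (0.1, 0.2, 0.5))))
print(cube.on_select(SelectEvent("6dof_controller", (0.1, 0.2, 0.5))))
```

This is why Sullivan can say the platform "has support for 6DoF controllers" even without shipping one: apps written against the abstraction don't care which device produced the event.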

[00:16:22.798] Kent Bye: Well, one of the other complaints that I've heard from developers is just that the future trajectory of the roadmap for HoloLens 2 is a little bit opaque. It's hard to know where it's at and where it's going. Has Microsoft published a roadmap for HoloLens anywhere in terms of where this product is going in the future?

[00:16:41.769] Greg Sullivan: We haven't published a roadmap, and I think that would be, you know, in my time at Microsoft anyway, that would be kind of a first if we gave that kind of detail out in advance. And the challenge that we've had with that, frankly, is it's just too easy to be wrong. And what happens is if we have a plan that we think is the right plan today, and we articulate that to customers and partners, and then regardless of how much we can qualify that plan based on variable inputs and conditions changing, the plans often change. And then we end up disappointing people who have placed bets on our previous plan of record. So it's kind of a no-win situation, in some respects, to go into too much detail with respect to your product roadmap. Because, as I say, it's just way too easy to be wrong and to have people make bets on the thing that you said. And then, you know, there are even, frankly, liability concerns with respect to having people plan their businesses around such things. So what we do try to do, though, is give more of a vision of where we're going in terms of the overarching technology. And I think, you know, in the public discussions that Alex Kipman has, he particularly does that. Well, one of the things that I think, if you asked Alex today what's the next HoloLens going to be like, he would probably reiterate some of the key things that we tried to do with HoloLens 2. I think in HoloLens 2, we wanted to make the device more immersive, and, you know, there's several different dimensions to that. The field of view of the holographic frame is part of it. We more than doubled it in HoloLens 2. But there's more to immersion than just the size of the field of view. It's how close you can get to these digital objects. How do you interact with them? And I think the instinctual interaction model that we have in HoloLens 2, for my money, is what makes it much more immersive. Of course, having a much greater field of view doesn't hurt.
But that combined with the ability to get close to those holograms is what makes it more immersive. I think Alex would tell you that for the next version, we're going to try to make it even more immersive. The second thing that we tried to do with HoloLens 2, of course, was make it more comfortable. This is a fun one because it turns out that when you try to make something more immersive, you're talking about things like increasing the field of view, increasing the power and the capability of the device, making the displays more powerful and better and higher resolution. All of those things equate pretty directly to a device that is heavier and hotter and you know, more expensive. And so the challenge really is how do you accomplish the goals of improving the immersion and improving the comfort at the same time? And I think we did a real good job with HoloLens 2. The team did an amazing job of both expanding the immersion and profoundly improving, I'll argue, the comfort of the device. The new fit system and the balanced weight distribution just make it a much, much different experience to wear for extended periods of time. So I think that's the second thing Alex would tell you as well, is that we're going to make the next HoloLens more, continue to make it more comfortable, and that has form factor implications and so forth, but that'll continue to be an evergreen goal. And then the third thing we talked about was, you know, we talked about it in terms of time to value. You can distill that even more, I would argue, just to value, but value right out of the box. We wanted to make it easier for people to just take HoloLens out of the box and start using it without having to necessarily write a whole bunch of code or spend a bunch of time doing that. 
Some of the inbox applications and the third-party applications as well have enabled us to deliver on that promise of making it more valuable right out of the box, and then effectively having a price cut as well, something that helped with the value proposition. Those dimensions are the dimensions that we'll continue to invest in. We believe strongly in the underlying principle of mixed reality, the idea of liberating the digital world from the two-dimensional screens it's been in forever, for decades. And we do see, and we're continuing to see in some of these customer examples with HoloLens 2, just how valuable and how profound an effect interacting with three-dimensional digital objects in the real world can have. Case Western Reserve University in Ohio has been a longtime partner of ours. And they planned a while ago to do these holographic anatomy classes. They said, look, anatomy has been taught at medical schools for a long time. And we've been using cadaver dissection and Gray's Anatomy textbooks. And they decided to supplement those learning approaches with a holographic anatomy class. And with the students, it was very, very well received. In fact, they even architected a new classroom and built a new classroom designed for shared collaboration in mixed reality. Now, they're still using cadavers and still using anatomy textbooks, but this was a way for them to take that learning process into the future. And one of the things that has happened as a result of the pandemic is that Case Western took advantage of their investments in these holographic anatomy classes and said, boy, we can do this whole thing online and remotely. And so they gathered up the 185 devices that they had, and they shipped them off to their medical students. And they donned the HoloLens and participated in a remote holographic anatomy class. And it was very, very successful.
So I think, you know, we believe enough in the underlying principles at play here, the value of bringing the digital world and the real world together in a meaningful way. So we're going to continue to invest in our product roadmap while we understand the frustration that there's a degree of opacity there. Like I said before, it's just too easy to be wrong and have folks end up being more disappointed because things change. So we try to strike that balance between articulating a vision, describing the broad investments that we're making, but not over-promising on an exact path that we're going to take.

[00:22:48.057] Kent Bye: Well, one of the things that I saw today that really caught my attention was an interview that you did with Scott Stein of CNET, where he was asking about the potential of expanding out into more consumer-friendly versions of the HoloLens. And you had mentioned that you were considering potentially doing something similar to what you did with Windows Mixed Reality, the virtual reality headsets, where you created a licensing scheme for other third-party OEMs to be able to start to generate their own VR headsets. Is that something that you're seriously considering, to do something similar with AR?

[00:23:20.991] Greg Sullivan: Well, I think I was using that as an example of something that we've done. I mean, we took literally the 6DoF positional tracking, the SLAM algorithms, all of the work that we did to enable HoloLens, the original HoloLens, to do the inside-out 6DoF tracking that it did. And we effectively put that into Windows Mixed Reality and licensed it to our OEM partners and said, here, we invented this for our first-party device, but we would love you to be able to make partner devices in this case that are immersive for that platform. And the general principle certainly could apply to an AR device. Again, we're not announcing anything, but I think what I was giving was an example of this idea that we're gonna invest to solve some of the most challenging computer science and engineering problems to create a device like HoloLens 2 that represents what we think of as the high watermark of our ability to achieve that experience. And then we will take our learnings from building such a device and we will partner with our ecosystem of hardware manufacturers and software providers, software developers, and enable them to scale that to a degree that we wouldn't have if it was a first-party-only approach. And I was using the positional tracking in Windows Mixed Reality HMDs as an example where that has already taken place. We have a technology that exists because of the work that we did to create HoloLens, and then it just became part of what our ecosystem is able to take advantage of in Windows Mixed Reality. I guess this is a first-party example, but we're seeing a lot of interest in the Azure Kinect Developer Kit, which is effectively just the depth camera from HoloLens 2. It's really the fourth generation of Microsoft Kinect, with a bunch of other sensors and intelligence, obviously, too. But that camera, that Azure Kinect kit, is another example of something that was effectively developed for HoloLens, but can see life in another way.
In this case, as a development kit from us, but we can envision that over the long term being either a first- or third-party piece of hardware that serves a range of purposes. So the general idea is the one that I was articulating: that we're going to continue to invest in what we think are some of the hardest problems to solve and some of the most meaningful challenges to overcome, and then take what we learn from solving those problems and enable our partner ecosystem to scale to a degree that we would not have otherwise. So that's the general philosophy, and I was giving examples of where that's already true.

[00:26:06.150] Kent Bye: Great. And, uh, I wanted to ask a few VR-related questions, because I know back in March there was an announcement that Valve, Microsoft, and HP are working on a next-generation SteamVR headset. And so it's saying it's going to be a SteamVR headset, which I'm assuming is going to be using Lighthouse for tracking rather than something that is maybe the Windows Mixed Reality tracking, but I'm sort of unclear on that. It's called the HP Reverb G2. So what can you say about this new headset?

[00:26:31.330] Greg Sullivan: Well, I can't steal thunder from my friends at HP, so I can't say a whole lot, but I think there are certainly, they've acknowledged and we've acknowledged that it is a collaboration, as you say, between Valve and HP and Microsoft. Let's see, how should I say this? I think one of the things I would say is that, as we've been discussing, the inside-out 6DoF tracking of Windows Mixed Reality is one of the things that we bring to the table from a platform standpoint. So it'll be interesting to see what they say about that and what improvements are being made there. So I think that's an example of that partner ecosystem that I talked about. We're really excited to be working with HP. They've just got a great vision and understanding of this space and are really excited about it. So we're super excited to be partnering with them.

[00:27:20.760] Kent Bye: And I think, uh, probably a lot of people either forget or don't realize that AltspaceVR is actually owned by Microsoft as well. So I'm just wondering if you could give a bit of an update with Altspace, 'cause I know there's the Mixed Reality Dev Days that's coming up at the end of the week that's going to be in Altspace. And maybe, since we're all sheltered at home to some extent in lockdown and quarantine, some of the increased interest that you may have had for different types of events and other ways of meeting, and how you've seen the growth and change in AltspaceVR.

[00:27:49.802] Greg Sullivan: Yeah, it's been pretty phenomenal. It's interesting to see how well positioned it was, really, for what we're dealing with right now. And what's happening at Mixed Reality Dev Days is a great example. We're going to have an Altspace event that we think will probably break some records. It's going to be really exciting to see. And it's just another indication that the time really is right for this technology. I think somebody was asking me not long ago, has the pandemic changed any plans with respect to how we think these technologies will be rolled out? I'm sure there are specific cases where that's true, but I think one of the things that it's done is really highlighted the value and the benefit of having a shared collaborative virtual experience. If I can effectively remote my senses over the network and collaborate with you and feel as if we had a joint experience and shared something where we weren't physically together, then boy, I can think of no better tool in an environment where we're all having to stay home. So we're really excited about the fact that we were able to pivot Build to a virtual event, and that we're able to hold our Mixed Reality Dev Days event at the end of this week and have record turnout and have it be fully virtual, fully online, including in Altspace. And I think we're going to have some folks talking about exactly that at our MR Dev Days adjacent events over the next few days. So again, without stealing their thunder, I'll say the time is right and the technology is here. And I think this is one of the things that may change. You know, we've been talking about, boy, maybe that meeting could have been an email and that business trip could have been a Teams call or a Skype call. That kind of thinking, I think, is brought on by the current situation. But I also think it's one of the things that'll remove some of the barriers that maybe we have seen to doing this more proactively.
So we're really excited to see what the team is doing at MR Dev Days with the AltSpace event. And I myself have been using VR more and more during the last few months of isolation. And it's just been really great. I had a really fun experience just going to a couple of musical events in AltSpace and hearing some performers that I wouldn't have otherwise been exposed to. And then interacting with other virtual participants there, it was really pretty cool. And it just felt like, yeah, this is the thing that we need right now in these times. And so I was really happy that we were able to participate in that.

[00:30:28.271] Kent Bye: So previously, there was a big announcement about the Kinect camera, but there wasn't any new news announced about it. I'm just curious if you could give a bit of an update as to what's been happening with the Kinect camera, because I know that's been such a big part of this larger movement towards spatial computing.

[00:30:45.044] Greg Sullivan: Yeah, exactly. And our goal with the Azure Kinect developer kit was to start seeding that thought process in the developer community. It's part of this broader architectural approach that we have, where we talk about the Intelligent Cloud and the Intelligent Edge. Part of our strategy here is to think about world-scale computing in the cloud, which for us obviously means Azure and Microsoft 365 and so forth, and Xbox Live. But in this Intelligent Cloud, Intelligent Edge world, the compute framework is inherently distributed, to the cloud and then across these increasingly intelligent edge devices. And one of the exciting things is that we're really at the cusp of a couple of key technology trends. Number one, the network capacity: the capability of the networks to have increased bandwidth, lower latency, and all of the throughput required to enable some really exciting scenarios at the edge. That's combined with devices at the edge of the network that themselves have new capabilities they didn't have just a couple of years ago, capabilities like Azure Kinect giving IoT devices effectively the ability to perceive their environments in sensory ways that we couldn't have until now. The state that we're in is that we've been shipping the Azure Kinect DK for almost a year now, and we're starting to see developers do some amazing things. Now, the thing about this kit is that it is a developer kit. It's not designed to be a final, deployed-in-production product, because it doesn't have the intelligence that a final IoT intelligent edge device would have. It's effectively just the sensors connected to your PC. We rely on the PC that's connected to the Azure Kinect DK to actually do the processing of the sensor data.
Now, long-term, when this becomes something that will be deployed in real environments (and it's still undetermined, frankly, whether that is a Microsoft product or a partner product), we'll have that device, and it'll be more of a standalone device that has the onboard compute as well as all the sensors.

[00:33:04.231] Kent Bye: Well, one question that came up was whether or not any of the APIs from WebXR are going to be supported on the HoloLens. And just to contextualize that a little bit: as of right now, creating a Unity or Unreal Engine binary is usually how you're deploying applications to the HoloLens. But if you wanted to create, say, a website, it would use web standards that will still be working 10, 20, or 30 years from now. There are some institutions that are really thinking at those long-term timescales, and they would prefer something that maybe moves slower but is going to still be working a long time from now. So what can you say about the future of WebXR and the support for WebXR APIs on the HoloLens?

[00:33:52.817] Greg Sullivan: Well, I am not the subject matter expert there. I'm going to be quickly out of my depth, so I'm probably not the right person to ask. But I know that, as a general principle, we are members of OpenXR and we are continuing to invest in what we call our hashtag-open approach in general. We want to have, you know, open platforms, open stores, and open browsing. And one of the things that we're excited about is that Mozilla, for example, announced that they're bringing the Firefox Reality browser to HoloLens. So the idea of open and industry standards is something that we're strongly embracing. I don't have the latest on the WebXR API support, though.

[00:34:34.022] Kent Bye: OK, well, I know that Edge has moved over to Chromium, which means that Chromium itself has support for WebXR built into it. So I guess to abstract that question a little bit: Firefox is going to have Firefox Reality on HoloLens, but is there going to be Microsoft Edge support on the HoloLens?

[00:34:51.432] Greg Sullivan: Yeah, that is Chromium-based Edge. We haven't announced anything with respect to that yet, but stay tuned.
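For context on what that support would mean for developers, here's a minimal sketch of WebXR feature detection. The `navigator.xr` entry point, `isSessionSupported`, and the "immersive-ar" session mode all come from the W3C WebXR Device API standard, not from anything Microsoft has announced for Edge on HoloLens, so treat this as illustrative of the standard surface rather than any shipping browser:

```javascript
// Sketch of standard WebXR Device API feature detection.
// navigator.xr is only present in browsers that implement WebXR.

function describeXrSupport(nav) {
  // Returns a short summary of whether the WebXR entry point exists.
  if (!nav || !nav.xr) {
    return "WebXR not available";
  }
  return "WebXR available";
}

async function startArSession(nav) {
  // Guarded session request: only attempt if the API is present
  // and the browser reports support for immersive AR.
  if (!nav || !nav.xr) return null;
  const supported = await nav.xr.isSessionSupported("immersive-ar");
  if (!supported) return null;
  // In a real page, requestSession must be called from a user
  // gesture handler (e.g. a button click).
  return nav.xr.requestSession("immersive-ar");
}

console.log(describeXrSupport(globalThis.navigator));
```

In a page running on an AR-capable browser, `startArSession` would be wired to a button click, since browsers require a user gesture before granting an immersive session.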

[00:35:00.958] Kent Bye: Okay. That's probably a yes then. So 5G was one announcement that was in the release notes: you can have a USB dongle. What do you think 5G is going to be able to enable on a device like the HoloLens?

[00:35:15.340] Greg Sullivan: Well, I think in some respects, HoloLens, or mixed reality computing, is kind of a killer app for 5G. And that, I guess, begs the question: why didn't we have it integrated? Well, there are a couple of reasons for that. Folks have pointed out that we're using a Qualcomm SoC, but we are not using the one with the cellular modem capabilities; it's their compute platform. And this was because the infrastructure and the customer demand for fully integrated 5G, as we built HoloLens 2, just did not rise to the top of the list of priorities for us. It is not a free thing to just add that capability. It's not significant in terms of the SoC delta, but there are thermal, testing, and power considerations within the budget that we have to operate under. If we look at all of this, the overwhelming majority of the customer scenarios that we've been dealing with have enterprise Wi-Fi on the sites where these devices are being used. In the cases where they don't, we do support Wi-Fi tethering. So it wasn't a super high priority from customers to have integrated LTE or integrated 5G in HoloLens 2. Certainly, though, it's a scenario that we think is going to be greatly enhanced by 5G, and we're excited to look at what's coming. It's one of the reasons we're excited to announce the USB Ethernet support for additional scenarios where you could use 5G dongles. It's part of that overall architecture that I discussed, where in an intelligent cloud, intelligent edge world, low-latency, high-throughput networks are a requirement. And the latency and quality-of-service characteristics of 5G are in some ways custom-made for mixed reality and these kinds of scenarios. So we're super excited about that going forward. We think it'll become increasingly important. In the development of HoloLens 2, it didn't rise in the customer requirement stack.
It didn't rise to the top to become worth the investment in terms of the power and thermals and the other things. But it is a scenario that we think is absolutely going to be tailor-made for mixed reality.

[00:37:31.700] Kent Bye: Great. Now I wanted to unpack a little bit the three Azure services, some of them going into general availability, some into preview. I'll just list them with my take on them. So for Spatial Anchors, maybe you could expand a little bit on Minecraft Earth and how Minecraft Earth was able to use spatial anchors. Then we have Azure Remote Rendering, and I know at the beginning of Build last year there was a bit of a demo with Epic Games, showing different aspects of remote rendering, that didn't quite go off as it should have. Then at SIGGRAPH, I actually had a chance to see that demo, so I saw that remote rendering functionality. And then finally, there's Azure Digital Twins, this Internet of Things way of being able to replicate a building and then keep track of devices relative to these digital twins. So maybe you could go through those three services from Azure and give a little bit more context for each of them.

[00:38:23.340] Greg Sullivan: Yeah, yeah. And you know, one of the exciting things about Azure Spatial Anchors is that, as you mentioned, Minecraft Earth was an internal customer, obviously, but an important customer nonetheless. One of the requirements that they had, as did others, was that this service allowed them to very precisely and accurately maintain the world-locked position of these digital objects, and to have these precise points of interest mapped and able to be experienced in a way that was repeatable, could scale, and, importantly, was available across platforms. And so we had to make something that worked for iOS and Android, for the devices that folks have in their pockets. With ARCore and ARKit, you have two frameworks that support their own anchoring systems and enable sharing of them. And so what we were able to do was leverage the existing anchoring systems of those popular platforms, incorporate those with our own, and provide a substrate in the cloud, a common shared coordinate system for those precise points of interest, and enable Minecraft Earth to do what it does in a way that was inherently cross-platform. The customers that we're highlighting at Build, for example, had the same needs. They said, boy, we need this to be accurate, precise, and cross-platform. And I think those are some of the attributes of Azure Spatial Anchors that differentiate it from some alternative available services, and among the reasons why customers are choosing it. Transitioning from public preview to general availability was a watershed moment, and we're pretty excited that we've gotten to that. In terms of Remote Rendering, that project started a little bit later, so phase-wise it's a step or so behind: it was in preview when Spatial Anchors was in public preview, and now that Spatial Anchors is going to general availability, Azure Remote Rendering is going into public preview.
As you say, this is a way for us to address the fact that the Qualcomm SoC in the HoloLens 2 is not dissimilar from the one in your cell phone. While that can do some pretty amazing things, it's not going to render a 100-million-polygon CAD file in any quick amount of time. For the scenarios that we're talking about, in particular with HoloLens (but again, this service is cross-platform and supports iOS and Android devices as well), you don't want to have to rely on the GPU capability of the edge device to do all of the rendering. As long as you have effectively unlimited GPU farms in the cloud that you can scale up and rely on, rendering that 100-million-polygon CAD file happens a lot more quickly on those GPU farms in the sky, and the resultant data can be streamed in a very efficient way down to the endpoint device. Again, the endpoint device being either a HoloLens or an iOS or Android device was important here. It's another reason why customers that are choosing Azure Remote Rendering are choosing it: because of the performance, but also because of that cross-platform aspect. I think these are a couple more examples that pay off on what Alex Kipman talked about last year at Mobile World Congress when he said, look, we have this hashtag-open approach. We have an approach that says we're not trying to create walled gardens. We're not going to stovepipe everything and have, you know, the secret handshake for entry. We're going to say, look, let's make stuff that people can use where they are, on the device that they have, and let's take a broad, open approach. And I think that's more evidence of exactly that philosophy.

[00:42:11.578] Kent Bye: And did you talk about the digital twins as well, the Azure digital twins?

[00:42:15.321] Greg Sullivan: I didn't talk about that one, and I have not been as close to that particular one, so I'm not the expert on that service. But I know that they're excited to be transitioning and coming to market, and to have people use that as well for the scenario that you just described. You know, we have these devices at the edge of the network that have the ability to understand their environments: depth cameras that can create machine learning models based on much, much less data, because the nature of the data that you're collecting with the depth camera is just so much more meaningful. You have the depth, you have that Z order, and so you have the ability to train these machine learning models on less data. So you have devices at the edge of the network that understand their environments, and then you can create these digital twins of devices in your environment that you need to monitor and track. It's all part of this broader philosophy that we talked about of intelligent cloud and edge. How do we basically make the world a computer and have all of the devices that have at least a little bit of electricity and some sense of smarts participate in that framework? That's the model that we're moving towards.

[00:43:25.450] Kent Bye: Well, I just wanted to wrap up here. Usually I ask about the ultimate potential of virtual and augmented reality, but I also wanted to get your thoughts in the context of some major newspapers like the New York Times evaluating VR and not really being all that excited about where it's at, with people expecting that, you know, VR should have been a lot bigger during this quarantine. A lot of the people saying that don't actually have any of the data, and the companies that do, companies like Microsoft or Oculus or Valve or Sony, aren't necessarily giving us any hard data that spatial computing is on the trajectory of what I think is going to be the next computing platform. So I'm just curious what you could say in terms of the future of spatial computing, where you think this is all going, and why Microsoft is committed to this.

[00:44:16.025] Greg Sullivan: Yeah, absolutely. I don't know that I'm going to be able to satisfy the request for hard data. What I can say is this, which I think is at least as important: we are continuing to invest heavily in this space. We've been investing for quite some time now, and we are continuing to do so because of the returns that we're seeing and the response that we're seeing from customers. The reason Microsoft exists as a company is to build things that empower people. The mission statement is not just a poster on the break room wall around here. It really does impact how we prioritize our investment. And so when we say that we want to empower every person and every organization on the planet to achieve more, we take that seriously. When we see people doing things with mixed reality that they literally could not do before, when we see Case Western holding those anatomy classes that would otherwise have been Zoom calls or canceled, when we see customers able to do things that they only dreamed of, that's the payoff for us. We understand that we are on the right path. And so from our standpoint, maybe the most important data point or metric is our continued investment and our continued support of these technologies. And we're doubling down. We've got some really exciting momentum around HoloLens 2 that we've been talking about at Build, and we're supplementing these devices with mixed reality cloud services that are inherently inclusive of this notion of the digital world and the physical world coexisting in a meaningful way in three dimensions. And so it is so profoundly linked to what we think the future of computing is that we frankly don't pay as much attention to the day-to-day ups and downs and stories about who's saying it's over and who's saying it's the next big thing. Because we see the response from customers and what they're doing with this.
And then we also can imagine what we're going to be able to help them do with the investments that we're just now making. So we're very, very bullish and continuing to invest in these spaces for those reasons.

[00:46:20.464] Kent Bye: Awesome. Well, Greg, I just wanted to thank you for joining me on the podcast today. So thank you.

[00:46:26.033] Greg Sullivan: My pleasure, Kent.

[00:46:26.715] Kent Bye: Good to talk with you. So that was Greg Sullivan. He's a director at Microsoft. So I have a number of different takeaways from this interview. First of all, it sounds like there's been a huge demand for the HoloLens, such a big demand, if anything, that Microsoft has not been doing a terribly great job of communicating it to the broader audience, or even making the devices available. So it's really good to hear that later this summer it's going to be generally available for folks to buy, either online or at Microsoft stores. There is quite a huge demand, especially in this pandemic, and so I'm just excited to see that they're opening up a little bit more and not just having these big companies have access to it, because there are lots of people who want to do stuff with the HoloLens. And, you know, the Oculus Rift DK1 was a platform that was open enough for anybody to get it, and since Microsoft is so focused on this hashtag-open strategy, it'd just be nice to have the HoloLens more generally available, which it looks like is going to be happening later this summer. As for the different announcements, there are a number of different Azure things coming into general availability: the spatial anchors, which were used a lot in Minecraft Earth to have these persistent anchors. They also had a session on that last night that I had a chance to listen to, and there'll be a couple more sessions, going through the MRTK, the Mixed Reality Toolkit, as well as some of the best practices for how to get people to move around. There's also going to be the Mixed Reality Dev Days, which I think will have some more sessions happening later this week in AltSpace.
There's also the remote rendering, which was premiered last year with Epic Games, and it is entering into a public preview, eventually to go into general availability as well. The digital twins we didn't talk too much about, but there's this whole idea of being able to capture a 3D spatial model of something and overlay different IoT devices on top of it, to be able to understand what's happening on a spatial level. And yeah, just overall, a big part of Microsoft's strategy seems to be creating interfaces for these edge compute devices, like the Kinect camera or whatever that ends up being, with the Azure cloud as a pretty significant part of the strategy within Microsoft's future. We didn't really talk too much about the Microsoft Cognitive Services, just because they weren't explicitly tying anything new into what was happening with the HoloLens, but that is also a huge part of the experiential design: these different ways of doing voice commands, being able to query natural language processing, and just expanding the capability that is possible when you're able to speak with your voice and use the Microsoft Cognitive Services on the back end as well. On the VR side, it doesn't sound like there's too much new information in terms of the HP Reverb. I imagine that they actually are going to be using a lot of the
Windows Mixed Reality tracking, and that it's not going to be solely relying upon the tracking from Valve; if it were, it would just be a collaboration between Valve and HP, and a lot of what Microsoft has to offer is a bit more of that inside-out tracking. That's what I expect, but again, there are no new details yet, so we just have to kind of wait and see what happens with the HP Reverb G2. On WebXR, it does actually seem, from their documentation, that they already have documentation about using JavaScript to interface with the HoloLens. However, their documentation is generally pretty outdated; it's about a year old for a lot of their stuff, and as I was going through different sessions, the different developers were commenting, like, hey, are you going to update some of this documentation? But there is documentation for WebXR and the JavaScript ways of creating applications, although in their teaching, they mainly talk about Unity and Unreal Engine. And then AltSpace is their social VR platform that they picked up, and it has actually been seeing a lot of increased use as well, so I'm excited to see what happens at the Mixed Reality Dev Days. We also talked a little bit about the other new software features, but because I don't have a HoloLens and can't test it out, it's a little bit harder for me to give my direct embodied experience beyond what the release notes are saying. And yeah, the final point is just that, you know, Microsoft seems pretty committed to this whole field of spatial computing, both on the virtual reality side, but especially on the augmented reality side,
and that they're really interfacing with a lot of what's happening in the enterprises. They're listening to customers and what they need, they're seeing that the customers have huge demands, and they're seeing really big returns on what is essentially a new paradigm of spatial computing. They really feel committed to that, and even though they're not giving us any hard empirical evidence, this is really something that they believe in and are investing in; you just have to look at their actions to see that. This is something that they're continuing to invest in, doing the R&D research that's necessary, as well as just making a really amazing product that is going to enable all sorts of new things within the future of spatial computing. So that's all that I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. If you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself. I've just spent the last couple of days really doing a deep dive into what's happening, doing a lot of research and doing this interview just to make it available for you. If you enjoy this, then please do consider becoming a supporting member of this podcast. You can become a member today at patreon.com slash voicesofvr. Thanks for listening.
