#578: Advancing Immersive Computing with Intel’s Virtual Reality Center for Excellence

Intel is investing in the future of immersive computing through their Virtual Reality Center for Excellence. They're pushing the boundaries of high-end VR gaming experiences, pursuing initiatives to help VR reach critical mass, and exploring how RealSense depth-sensor cameras and WiGig wireless technologies fit into the VR ecosystem. I was able to try an early prototype of an HTC Vive game rendered on a PC and transmitted wirelessly to a mobile headset, part of a research project exploring additional market opportunities for high-end PCs to drive immersive experiences.

I was able to sit down with Kim Pallister, the director of Intel's VR Center for Excellence, to talk about their various initiatives to advance immersive computing, their WiGig wireless technology, RealSense and Project Alloy, and some of the experiential differences between their lower-end and higher-end CPUs. He predicts that immersive gaming markets may mirror the differences among the mobile, console, and PC markets, and that there will be a spectrum of experiences with tradeoffs between price, performance, and power consumption. Intel is initially focusing on pushing the high end of VR gaming experiences, but they believe in the future of immersive computing and are looking at how to support the full spectrum of virtual reality experiences.

LISTEN TO THE VOICES OF VR PODCAST

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye and welcome to the Voices of VR Podcast. So virtual reality to me is a completely new communications medium that represents this new paradigm of immersive computing. And whether this level of immersive computing with AR and VR technologies is going to happen within the next couple of years, or by 2025 or 2050, I feel like we're on a trajectory towards it. It's a matter of when, not necessarily if, to me. Just looking at this year, at how much phone-based AR is starting to come into a lot of the latest phones and is about to hit the gaming market, it's looking like phone-based AR is going to reach mass scale before virtual reality does. So in this interview today, I had a chance to sit down with Kim Pallister of Intel. He's the director of the VR Center for Excellence at Intel, and there are a couple of major things that they're looking at. One is to focus on driving the highest-end virtual reality experience that's possible with their Intel chips. They're also doing some interesting stuff with 3D depth-sensor cameras with RealSense and Project Alloy, as well as WiGig wireless technology in order to render scenes on a PC and then wirelessly transmit that to a mobile phone. And then finally, they're just looking at how they can help virtual reality reach a more critical mass. So in talking to Intel, I get the sense that they're really committed to virtual reality as a sign of where all this computing is going: towards deeper and deeper immersion, more and more interactivity, and much more natural, intuitive user interfaces with these types of immersive technologies. And listening to an interview like this gets me even more convinced that this is a technological roadmap that we're all headed towards, regardless of when that critical mass of VR is going to hit. So we'll be covering all that and more on today's episode of the Voices of VR podcast. So this interview with Kim happened at Intel's offices in Hillsboro, Oregon on May 24th, 2017. So with that, let's go ahead and dive right in.

[00:02:23.445] Kim Pallister: My name is Kim Pallister, and I am the Director of the Virtual Reality Center of Excellence at Intel in the Client Computing Group here. And we are looking at ways that we can improve the VR experience and deliver it efficiently on Intel-based platforms.

[00:02:38.390] Kent Bye: Great. So my understanding is that Intel wants to show the highest capabilities of what's possible in PC gaming; that's one strand of the initiatives that you've had. So maybe you could tell me a bit about the differences that you see between, like, the Intel i5 and the Intel i7 processor in terms of what they allow you to do within a VR gaming experience.

[00:03:01.383] Kim Pallister: Sure, and maybe as a preface to that, there are kind of three different domains that we see our VR activities in. One is in pushing forward the high end of the experience. We own a lot of technologies that go into the highest-performance consumer computing platforms today, and we want to continue to help that go forward together with our partners. We also believe that it's essential for VR to succeed that we help VR reach a larger mainstream audience, right? It has to get to kind of critical mass for developers to be able to succeed. So we're not only looking at the high end, but also at the mainstream. And then we are looking at a whole range of technologies that Intel has within its walls that we think can help improve the VR experience, everything from 3D depth cameras with our RealSense technology, to wireless technologies like our WiGig technology, that we think can make the experience better or give it new capabilities. So on that first one, on the high end, what we've seen is that the first impression out of the gate that a lot of people had was that VR was a very graphics-heavy usage. That's certainly true. But in talking to developers, talking to the users, and playing around with the stuff ourselves, we realized that the expectation people have in VR is that there's a much increased level of interactivity in the experience, over and above what a standard 3D game has, right? A great example being Job Simulator, or the Rick and Morty title from the guys at Owlchemy Labs, where every single object in the game you're expected to be able to pick up and utilize; it has to be functional, you have to be able to do stuff. And that's even in a relatively cartoony, simple type of title. And so we've worked with developers, with people that work on benchmarks, and with our own lab efforts internally to kind of show, here's how that extra computing power, when you're on a top-of-the-line Core i7 platform, can be used to extend the experience, right? So we did some work with the guys that did Arizona Sunshine, where there are extra features and workloads that will automatically get lit up when you have the performance available to you, or you can choose to switch them on yourself. If you have a lower-performance system, you don't want to suffer the frame rate drop, but you can try it out. They'll do things like extra particle effects and ragdoll effects on the zombie bodies beyond the default, right? It's things like limbs that separate and fly off, stuff like that. We did some similar stuff with the Star Trek: Bridge Crew title where, you know, watching space battles, there are extra particles, and ships break apart, and you have physical destruction of objects. So some of that's kind of standard eye candy, or physics-effects eye candy, that you would put in game titles, but being applied in the VR space. We're also looking at things like 3D audio and higher degrees of interaction, like when you pick objects up, if you actually have articulated fingers and do physics on the contacts with them. All of these, we think, are things that in VR you're really going to get a palpable sense for how they manifest, and people will pick up on the behavioral differences readily.
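As a concrete illustration of the pattern Kim describes, here is a minimal C++ sketch of capability-gated eye candy: probe the CPU at startup and enable the optional effects only when there is headroom to spare. Everything in it (the struct, the tier thresholds, the thread-count heuristic) is hypothetical; a shipping title like Arizona Sunshine would benchmark rather than just count hardware threads, and would wire this into its engine's own settings system.

```cpp
// Minimal sketch (not Intel's or any studio's actual code) of
// capability-gated effects: probe the CPU once and light up optional
// workloads only when there is headroom. All names are hypothetical.
#include <cstdio>
#include <thread>

struct EffectSettings {
    int  maxParticles;
    bool ragdollDismemberment;  // e.g. limbs separating on zombies
    bool extraDebrisPhysics;
};

EffectSettings chooseEffects() {
    // Hardware thread count as a crude stand-in for CPU tier; a real
    // title would run a short benchmark instead of counting threads.
    unsigned threads = std::thread::hardware_concurrency();
    if (threads >= 8) return {20000, true, true};   // i7-class: everything on
    if (threads >= 4) return {5000, true, false};   // mid-tier: some extras
    return {1000, false, false};                    // minimum spec: defaults only
}

int main() {
    EffectSettings fx = chooseEffects();
    std::printf("particles=%d ragdoll=%d debris=%d\n",
                fx.maxParticles, fx.ragdollDismemberment, fx.extraDebrisPhysics);
}
```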

[00:06:14.376] Kent Bye: Yeah, and it seems like there's the CPU and the GPU. The CPU is doing a lot of serial processing, but also a lot of the physics engine and multiplayer, as well as just running the operating system of the computer, and other things like streaming, for example. So I'm just curious to hear your perspective on the differences between the type of stuff that's being handled by the GPU versus what specific things in VR are being handled by the CPU.

[00:06:39.880] Kim Pallister: Sure. So at the highest level, you could think of it as the GPU handling two things. There is the rendering of the graphics workload for the game. It's heavier than a standard game: it's a stereo image pair, so you've got additional pixels to render there; you've got a very wide field of view; and you want to render it at a very high resolution. So there's a graphics demand there. And then there's this additional stage, which you can think of as a post-processing stage, which is to take that stereo rendered output and prepare it for display in the HMD. I think over time those may not necessarily remain separate stages; people will work on making that more efficient. Right now you can think of it as being done in a very brute-force manner. And a lot of what we've been doing, for example, with Microsoft on bringing that VR experience down to mainstream systems with the Windows MR headsets that they're working on, has been a focus on how do we do a really efficient version of that for kind of mainstream usages that'll run on a notebook with integrated graphics. On the CPU side, you're absolutely right that the standard 3D pipeline stuff of character animation and IK and other kinds of rigid-body physics and dynamic destruction of objects, all the stuff you do in high-end 3D games, applies in VR, in gaming or in non-gaming VR applications. So some of that is just being dialed up. There are other areas where we think users will begin to demand more than just a basic level. So 3D audio is a good example, right? It used to be that doing stereo audio was kind of good enough, and maybe you had a bit of attenuation with distance, and you might do binaural audio to kind of mimic the sound coming from behind you. In a VR environment, when you have a sound emitter in a room and an occluder gets between you and the sound, people will pick up on that. They're like, I just put that thing in a box and it should be quiet now, right? And so we think that that'll be one of the next areas where people are really differentiating themselves in VR. And, like many things in 3D simulations, it's a pretty deep rabbit hole you can go down. If you start doing things like mimicking reflections of sound off of objects of different materials (is that a cement floor, or is it carpeted?), we think that people are going to pick up on that stuff.
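To make the occlusion point concrete, here is a minimal sketch assuming a toy audio model: inverse-distance attenuation plus a flat occlusion penalty. The raycast is stubbed out (a real engine would query its physics scene), and every name here is hypothetical rather than any particular audio API.

```cpp
// A minimal sketch, assuming a toy audio model: distance attenuation
// plus an occlusion term, the behavior Kim says users will notice
// ("I just put that thing in a box and it should be quiet now").
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

float distance(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Hypothetical stand-in for a physics raycast from listener to emitter.
bool occluded(const Vec3& listener, const Vec3& emitter) {
    (void)listener; (void)emitter;
    return true;  // pretend a box sits between them
}

float computeGain(const Vec3& listener, const Vec3& emitter) {
    float d = distance(listener, emitter);
    float gain = 1.0f / (1.0f + d);   // inverse-distance falloff
    if (occluded(listener, emitter))
        gain *= 0.2f;                 // crude stand-in for low-pass/attenuation
    return gain;
}

int main() {
    Vec3 listener{0, 0, 0}, emitter{3, 0, 0};
    std::printf("gain = %.3f\n", computeGain(listener, emitter));
}
```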

[00:08:54.515] Kent Bye: Yeah, and it also seems like, overall in the VR industry, there's the high-end VR, but there's also mobile VR. So we're starting to see a lot of different systems-on-chip that have everything integrated, from the CPU to the GPU, as well as the operating system. And so is that something that Intel has been getting into, in terms of, like, a system-on-chip for mobile? Or are there other things that you're getting involved with in terms of mobile VR?

[00:09:20.063] Kim Pallister: Sure. I'll give you a slightly different way that we like to view it, which is that, you know, for any of these kinds of consumer computing usages, there's a spectrum of offerings available, right? And you could divide those any number of ways. You could say by price, by form factor, by performance, by power consumption, right? And the way that the early entrants in this wave of VR have shaped up is that one set of people, like HTC and Valve, like Oculus, et cetera, have said, let's go take the highest-performance class of computing platform that a consumer could buy, which happens to be a PC, and let's go do the best possible VR experience we can do. And we'll worry about making that reach a broader audience later, or be more affordable, or whatever. We're going to start by doing the best thing we can do, even if it's a kind of small market. And then there are people that started at the other end of the spectrum, with Google Cardboard probably being your best example, saying, let me take this really widely distributed, ubiquitous platform and do whatever VR I can get away with right now, and I'll worry about making that better over time. And as you mentioned, a bunch of the SoC vendors in that space have been saying, okay, how do we get better at doing computer vision, or higher frame rates, or higher-resolution displays, et cetera. While the people at the high end have been coming down in terms of delivering that experience at better price points or in better form factors: you know, it fits in a laptop now, and maybe in an ultra-thin notebook at some point. And so what you're seeing is that spectrum begin to fill out, and it'll look very much like a lot of consumer computing spectrums, where you have a wide range of products available and you can pick and choose what it is you want to do. And at any point in that spectrum, people will look at the SoCs there and say, how do I make them more optimal for VR? So that's something that we started with early on, saying, okay, as we develop products, we need to look at VR as one of the things people are going to do with those products. And what does it mean for the requirements of what we do? What do we need to add graphics-wise? What do we need to do to our audio, et cetera? So I think that it's not only the low-power mobile phone SoC folks that are looking at this space, right? Everybody that's interested is doing that. And it'll be interesting to see how that market plays out. And I think that 3D gaming constitutes a pretty good analogy for seeing how that might play out. Like, you know, mobile phone games got good and became, over time, a $20 billion market. PC games are still a $20-plus billion market as well. And guess what, console games are kind of somewhere on that spectrum in between, and they're also a lucrative market. There's loads of room for a lot of different types of VR.

[00:12:07.097] Kent Bye: Yeah, and it also seems like some of the initiatives that you have are trying to bridge this gap between the PC and the mobile phone. And maybe you could just talk a bit about what you're trying to achieve there in terms of perhaps using the computing resources of a PC to do various things on mobile computing.

[00:12:23.826] Kim Pallister: Sure. So I think that, again, viewing it as lots of different solutions means there will be different solutions for different people. What you were alluding to a minute ago, which we showed you, was just a proof of concept, with no product coming out of it anytime soon, but just showing that, hey, there may be a model where you render high-performance 3D games on a PC, but you use a smartphone-based, or smartphone-class, HMD as the HMD that the consumer wears to then go experience that. So if you own a smartphone, you put it in a phone holder, and it can talk to your PC and act as the HMD for that class of application. And so you're not held down to only, kind of, you know, the Android Daydream class of games, which are great, but are a different class of experience. You know, we've kind of seen that, yeah, that might be a model that works; you know, there's loads of work to do between proof of concept and actually delivering the full experience. But we believe doing experiments like that lets us learn whether there are different variations in these different paths to market.
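Reading between the lines, the proof of concept implies a split pipeline: the phone-class HMD sends head poses upstream, the PC renders and encodes, and compressed frames travel back down. Here is a conceptual sketch of that loop; every function is a hypothetical stub, since the interview does not describe Intel's actual implementation.

```cpp
// A conceptual sketch (no real networking or rendering) of a split
// PC-to-phone-HMD pipeline like the proof of concept described above.
// Every function here is a hypothetical stub.
#include <cstdio>

struct Pose  { float yaw, pitch, roll; };
struct Frame { int id; };

Pose  readPoseFromHmd()             { return {0.1f, 0.0f, 0.0f}; }  // sensor sample sent over the link
Frame renderOnPc(const Pose& p)     { (void)p; static int n; return {n++}; }
Frame encode(const Frame& f)        { return f; }  // hardware video encode in practice
void  transmitToHmd(const Frame& f) { std::printf("sent frame %d\n", f.id); }

int main() {
    // The whole round trip has to fit inside the motion-to-photon
    // budget (commonly cited as roughly 20 ms), which is why latency,
    // not just bandwidth, is the hard part of this design.
    for (int i = 0; i < 3; ++i) {
        Pose pose = readPoseFromHmd();
        transmitToHmd(encode(renderOnPc(pose)));
    }
}
```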

[00:13:28.651] Kent Bye: Well, one of the things that I've seen already on the market is some of these wireless VR solutions, like TPCast, which at CES had a system showing at the Vive booth as well as on the floor, where they're able to use some technology to render things on the computer but send it to the headset wirelessly. And so maybe you could talk a bit about your WiGig technologies and the vision that you see for where that's going.

[00:13:52.623] Kim Pallister: Sure. So we've demonstrated publicly a technology demo of an HTC Vive with a unit added onto it that we built, which is similar to the TPCast in terms of the experience that it promises: delivering the high-performance-PC, high-resolution, full-quality, you know, high-frame-rate VR experience, and delivering that in a very low-latency fashion off of that PC and onto the HMD. And it's an obvious problem that a lot of people look at. They're like, I love the experiences delivered by the high-end PC, but how do I get this tail off the back of my head, right? And so we think there'll be a lot of people taking a good look at that. We believe that with the 60 GHz WiGig technology that we've been working on for some time now, over time, both the bandwidth requirements for distributing that level of quality, as well as the quality of experience requiring that you not be in a crowded spectrum and have conflicts with other wireless signals (like, you don't want to have glitches in the experience), those kinds of factors are going to lead to WiGig being very competitive in that kind of space.
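A back-of-the-envelope calculation suggests why 60 GHz spectrum matters here. Assuming the original Vive's published panel specs (two 1080x1200 panels at 90 Hz, 24-bit color), an uncompressed stream needs roughly 5.6 Gb/s, beyond typical 5 GHz Wi-Fi but within the roughly 7 Gb/s peak of 802.11ad WiGig:

```cpp
// Back-of-envelope bandwidth check, assuming the original Vive's
// published panel specs: two 1080x1200 panels at 90 Hz, 24-bit color.
#include <cstdio>

int main() {
    double pixelsPerFrame = 2.0 * 1080 * 1200;  // stereo pair
    double bitsPerSecond  = pixelsPerFrame * 90 /* Hz */ * 24 /* bits per pixel */;
    std::printf("uncompressed: %.1f Gb/s\n", bitsPerSecond / 1e9);  // ~5.6 Gb/s
    // 802.11ad WiGig peaks around 7 Gb/s, so this fits with light or no
    // compression; typical 5 GHz Wi-Fi links do not, which is one reason
    // wireless VR solutions pair 60 GHz radios with low-latency codecs.
}
```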

[00:15:02.654] Kent Bye: And maybe you can go into a little bit more detail about some of the benchmarks that you were able to prove out in order to show the differences between the different levels of CPUs that you have, specifically the i5 versus the i7. What kind of ways were you able to quantify the differences that having those extra cores and processing capabilities makes when it comes to what you're able to do in VR?

[00:15:23.324] Kim Pallister: Sure. So it really came down to four different ways in which the performance manifested itself. The first was applications that chose to do deliberate things to say, oh, you've got a high-end processor, I'm going to throw more eye candy at you. I gave you examples of those already with Arizona Sunshine and Star Trek: Bridge Crew and Warhammer: Vermintide. The second thing is, we said, hey, we had seen in traditional 3D games that having a certain amount of performance overhead was good for providing a more fluid experience; otherwise you get occasional glitches in frame rate, right? And we said, we think those probably apply here as well. We went and did measurements of some applications and actually did some user testing to try it out. And yeah, even on CPUs that meet the minimum requirements that the HMD vendors state on their websites and packaging and such, they'll run at full frame rate, but depending on what workloads are running, they'll occasionally drop frames, or numbers of frames. And that actually manifests itself in a way that is uncomfortable to you in a VR headset, right? So we were able to demonstrate, hey, having that performance overhead gives you a kind of rock-solid smooth thing. The third area: we did, I wouldn't say a very thorough, but a kind of quick canvass of the applications that were releasing on SteamVR and on the Oculus store, and had a look at some of them. And we saw that there are some of these things where, depending on the settings the user chooses or the type of workload that's going on in the app, they actually dial up the requirements and the workload quite a bit from the default state of the application, right? So I'll give you a couple of examples. Raw Data was one where we saw that if you host a game that's got a lot of players in it, there are sections in that environment where it gets pretty choked down on a lower-end system, and you'll visibly feel it, and you can measure it and plot a frame rate graph and things like that. Another good example was Universe Sandbox 2, where I had that application installed on my PC at home. I actually didn't know much about it; my son was using it for a school project in a non-VR way. And I had looked at it in VR and said, okay, yeah, I'll smash the moon into the earth and do these things. And then he showed me a school project where he dialed up a workload simulating planetary formation that took like 12 hours to render out, because it was rendering at, like, minutes per frame. And then I realized, oh, you can actually dial stuff up quite a bit in this app. What happens in the VR experience? And we were able to show the same kind of thing: for certain workloads, it's well-threaded, like it's a really thoroughly built application, and it will eat all the cores you can throw at it. And so you get that same kind of effect, that a higher-end processor gives you more overhead for that. And then the fourth area where we're seeing that performance manifest itself is in what our marketing guys like to refer to as mega-tasking usages, so, multitasking with performance-heavy workloads. And so that can be things like running a traditional 3D game while you're also doing some video streaming: real-time encode and streaming.
And probably the best example of something like that in VR is either some of these level-editor types of applications, like using the Unreal level editor in VR, or, I'm spacing on the name, but Adobe Premiere has the real-time stitching, edit-your-video stitching that you do in a VR environment. Or, more recently, we've been playing around with and talking to the guys that developed the MixCast green-screening solution. And we're looking at that as being a usage that, number one, we think has real legs in communicating what the VR experience is, not just for things like game trailers, but for things like eSports and for game streamers. And then when you look at things like doing a real-time encode of a high-definition stream while you're running a VR application, while that VR application has to render a third camera viewpoint, while you might be using a depth camera, like a RealSense camera, and then doing processing on that depth data to add lighting effects or to do depth sorting, it starts to add up to a lot more work than just the traditional VR workload itself. So between all those things, we think there's years of performance demand to go fulfill here and improve the experience, and a pretty good story for why people should buy a high-end Intel processor with a VR rig.
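On the performance-overhead point above, here is a minimal sketch of the kind of measurement Kim describes: time each frame's CPU work against the 90 Hz budget of about 11.1 ms and count the frames that miss it. The spin-loop workload is a placeholder for real game and physics work, not a benchmark of anything in particular.

```cpp
// A minimal sketch of per-frame headroom measurement: log CPU time per
// frame against the 90 Hz budget (~11.1 ms) and count misses, which in
// a headset would show up as reprojection or judder.
#include <chrono>
#include <cstdio>

int main() {
    using clock = std::chrono::steady_clock;
    const double budgetMs = 1000.0 / 90.0;  // 11.1 ms per frame at 90 Hz
    int dropped = 0;

    for (int frame = 0; frame < 900; ++frame) {  // ~10 seconds of frames
        auto start = clock::now();
        volatile double sink = 0;                 // placeholder game/physics work
        for (int i = 0; i < 100000; ++i) sink = sink + i;
        double ms = std::chrono::duration<double, std::milli>(clock::now() - start).count();
        if (ms > budgetMs) ++dropped;
    }
    std::printf("frames over budget: %d / 900\n", dropped);
}
```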

[00:19:50.128] Kent Bye: It seems like streaming is a use case that I'm starting to see more and more of, and a lot of people who are really hardcore streamers have multiple machines. So do you foresee that that's still going to be the case for people who want to maintain that frame rate and still broadcast? Or are you able to do a high-end VR experience at 90 frames per second and still be able to broadcast from the same machine?

[00:20:15.157] Kim Pallister: Yeah, I think that if I told you a minute ago that we can take a high-end machine and find some apps that bring it to its knees, then clearly there is a case where, at least for that application, doing the green screening on top of it, you're not there yet, right? And that'll be performance that maybe a processor months or years down the road will be able to deliver. So I think that, just like with things like video editing or video encoding, you will find some people that are really at the very high end of the spectrum who say, I really want another dedicated system for doing that. That's great; I'd love for them to buy two Intel PCs. It's fantastic, right? I think, though, that what we will aim to do is work with the people that are providing these solutions. Like, I love that there are guys doing solutions like the MixCast solution that make it easier and more integrated to do that, so that it goes beyond the very top-end studios to being something that a hobbyist or a consumer can go do. And then making sure that those usages, when you choose to do them on one machine, are a pretty seamless experience and have enough performance room to go do without choking on frame rate.

[00:21:22.477] Kent Bye: And I was at the Experiential Technology Conference and had a chance to catch up with Android Jones. And he was saying that he was preparing a big dome experience, which I think Intel might have been involved with, that was showing at Coachella. So I'm just curious if you have any more information as to what that was, and about showing an immersive experience on a huge dome at a music festival.

[00:21:43.685] Kim Pallister: So I have no idea, and I'm a little envious I didn't get to that; I don't know what that's about. I'd like to check it out. I will say that we're talking to a lot of people in the different creative industries who are looking at either VR or new ways to take technology and use it to manifest in a different art installation or a different performance installation. Like, we did the thing, jeez, it must be like a year and a half ago now, or about a year ago now, where Björk did an album release where a bunch of the videos were done as VR experiences. And we did a thing sending down some of our marketing people and a bunch of systems so that they could do these city-by-city tours with like 28 Vives in an installation, all running on Intel PCs. So I think that kind of stuff is good for reaching new audiences, for just letting creative people go discover the stuff that we're never gonna think of, new ways to take it. But I'm not familiar with the example you brought up just there.

[00:22:43.659] Kent Bye: Okay, well, also, when I'm watching like an NBA game, I see these little freeze frames when someone's slam dunking, and then they'll spin around and show this volumetric view, sponsored by the Intel RealSense camera. Can you speak a little bit as to the strategy for RealSense when it comes to VR? Because, you know, I've seen the Leap Motion, and with Microsoft they have the Kinect, but the latest iterations of that have been going into the HoloLens, and so I haven't seen a lot of other volumetric depth-sensor cameras. I'm just curious to hear what's happening with RealSense and Intel.

[00:23:18.914] Kim Pallister: So I'll talk a little bit about what RealSense is doing, and I can also touch on some of the sports stuff that you mentioned there. So we have a team called the Perceptual Computing Group, which is kind of a sister organization to mine, and I work with them a lot. And they've been taking that technology to a whole bunch of different segments: in robotics, in drones; like, they've got drones now that will avoid the branches on trees when they follow you through the forest to film you on your mountain bike, things like that. And one of the areas where they see application for this is in VR. And so a lot of what initially spawned the idea around Project Alloy was, hey, what if we push the envelope on what's possible in an all-in-one VR platform and do a kind of proof-of-concept vehicle that integrates a couple of RealSense cameras to basically do real-time depth capture of the environment around you, both for doing six-degree-of-freedom inside-out tracking, but also for things like bringing your hands into the experience, so that you either don't need a controller, or, if you have a controller, it's not just floating in empty space in front of you. You know, they did that vehicle to prove out that that technology could add value to VR, and they're talking to a lot of people about how that might find its way into products, right? The example you mentioned early on, of real-time viewing of voxelized data of sports and things like that: we have a whole group now that's looking at the whole domain of sports as an entertainment medium, a thing that people are passionate about and do. And it's all going digital in a whole myriad of different ways, right? Not just in VR; when people watch sports on TV, they're also consuming it on a digital device, and there is a lot of data capture happening around the event. And so we've got a division that's been looking at that. A couple of things that they've been doing are pertinent to VR. One is around the acquisition of Voke. The product name that they've assigned to the technology is True VR, where they're doing real-time stereo 360 capture and broadcast from sports events in a way that lets you actually move your viewpoint to one of a select number of viewpoints in the event. Back at CES, at the press conference we did there, they brought 250 reporters real-time to a basketball game and let them all select where they were going to view the game from. It was actually a real demonstration that it was real-time: at the time it came up in the keynote, they went and it was half-time, there was nothing going on. So they came back and did it again later. It was some proof in the pudding. So that's one example of something that is happening because of the way sports is getting consumed, and it'll have implications across a wide variety of Intel platforms, but it certainly is pertinent to VR and one of the ways in which sports gets consumed.
We also have the freeD technology, which is a little further out but is really quite ambitious when you think of how it functions. They place an array of high-definition cameras around the outside of a stadium, all pointed at the event that's going on, and they do a real-time capture and data crunch of that whole sports event and turn it into essentially a 3D voxel map. And that lets you, instead of selecting one of a number of viewpoints, basically put the camera anywhere in the stadium, and rewind and fast-forward and all that. So that'll be a little slower to deploy in VR, but it's arguably, long-term, what you need to really let you consume an event like that and say, I want to view it from the point of view of the quarterback, or I want to sit on the sidelines next to the coach, or I want to get the blimp shot, you know, any of those things.

[00:27:03.511] Kent Bye: Yeah, and at CES there was a showing of Project Alloy, and I didn't get a chance to try it out yet, but I have seen some of the early commercials of it. It sort of has this, what I imagine would be, kind of like a VR HMD with people walking around a social space. There have been a lot of semantic battles around what to call, you know, virtual reality, augmented reality, and the full mixed reality spectrum, or the immersive computing spectrum. And this seemed to be a little bit like what some people may call augmented virtual reality, where you're in a VR HMD but you're bringing in components of the real world. I think what Intel is calling it is merged reality. But from your perspective, I'm just curious to hear what some of the use cases for Project Alloy might be, where you may be completely occluded within a VR HMD but yet still have some level of mixed reality or augmented reality experience.

[00:27:56.235] Kim Pallister: Sure, so maybe let me tackle the merged reality thing first, because I think it informs the rest of the discussion. So all the semantic battles people are having about MR and, you know, where are you on that spectrum, all that aside, our long-term vision is that, whether it's through RealSense cameras or through other new types of sensors, or other data sets, or the stuff we're doing with sports, you should be able to capture and digitize the real world and hand it to the creators to let them manipulate it every bit as much as they're doing the virtual world, right? And all those experiences across that whole spectrum just strive towards that vision long-term, right? In the meantime, there'll be lots of point examples along the way. Getting back to Project Alloy and what we demonstrated with that, with the initial kind of proof of concept around it: it's an occluded headset. It has display panels, just like you'd have in a Vive or in a Rift, but it adds this 3D depth-sensing capability. And so you might just choose to use that for, like, six-degree-of-freedom tracking. You might just do VR without the use of external, you know, lighthouses or IR cameras or what have you. But as long as you've got those cameras there and you're capturing a depth map, you can start to do a little bit more. So maybe the idea, which I alluded to earlier, of bringing in your hands and capturing those, and providing not just avatar-like rendered hands, but your actual hands. So if you've got a wedding ring on, you'll see it there. If you've got a watch, you can actually check the time. Those kinds of things, you know, fall somewhere along that spectrum. But I think that ultimately, just giving the developers the capability to say, you decide how much of both you want to mix together and then go run with it, is, you know, what we'd like to unleash in the hands of creators.

[00:29:45.730] Kent Bye: Yeah, I'm curious to hear your thoughts on what I see as this trend of everything moving more and more towards mobile, whether it's the mobile headsets getting more and more of, like, six degrees of freedom and more of the features of PC gaming, and then the PC gaming going wireless and then perhaps being untethered and becoming more and more like a mobile experience. And, you know, there are all sorts of cloud computing things that are going to be going on, and maybe pulling in artificial intelligence types of things. But when it comes down to actually rendering out an experience, I'd imagine that it's either going to happen through a PC that's disconnected, or, if you're completely untethered out in the wide world, it's going to be happening on the chip, as, like, a system-on-chip. So from Intel's perspective, are you also trying to make some moves towards developing your own system-on-chip so that you could be a part of that mobile market? Or are you really just focusing on this world where we have dedicated PCs that are wirelessly streaming high-resolution experiences, maybe in a private context at home or work?

[00:30:48.180] Kim Pallister: It's interesting to hear you phrase it that way; I'll give you a slightly different spin on things. I mean, ultimately, we do SoCs today, right? So we do system-on-a-chip; like, if you buy a high-performance PC today, especially if you buy a notebook, like the one sitting next to me here, you know, that Core i5 processor has got everything on it, right? It's got audio on it, it's got graphics; it's basically a full SoC. And so what you start to look at then is, okay, what is the trade-off I'm making in terms of cost and power consumption, which either means I need to be plugged into a wall socket or I need to have a different-size battery, and what can I deliver as an experience there? There's no question that the mobile-class SoCs that are used in today's phones are able to do decent levels of 3D rendering for their size. But anywhere upwards from there, you can deliver a higher-quality experience. So we think that there are going to be a range of demands from users in terms of what they want delivered. And just like we've seen with 3D games, you can do an adventure game on a mobile phone that's got, like, a decent screenshot to it, but it doesn't have the open world of Skyrim, right? And so that range of experiences will exist. Now, just because we're saying, you know, we want to get rid of the wire and make it a wireless HMD that takes all that power of a high-performance PC and makes it more comfortable to use, that's not necessarily a move towards mobile. That's just making VR a better experience no matter where the user chooses to go along the spectrum, right? So as that spectrum fills out, we have a roadmap, and we think we'll compete across a wide range of those platforms. But for the near term, just because of our heritage and where our products are playing right now, our first order of business is really high-quality VR experiences on PCs.

[00:32:31.689] Kent Bye: Talking to Jules Urbach from OTOY at CES, he told me about PowerVR and how they're doing these application-specific integrated circuits in order to do light field rendering, in order to basically render out light fields in a way that's a lot faster than other techniques that are out there, whether on a GPU or a CPU. So I'm curious to hear your perspective on some of the work that you may be looking into when it comes to volumetric video and specific chips that are customized to do light field rendering.

[00:33:02.047] Kim Pallister: Sure. So I think there's going to be, or there already is, a lot of research going on in display technologies: everything from very high-density technologies as follow-ons to OLED or LCD, to light field displays, to, you know, multi-planar kinds of compromises in between that give you some level of depth of field. There'll be a lot of efforts in those directions. And if and when any of them show promise, there will be people looking at what it takes to then go feed them with a graphics chip, either on an SoC or as a discrete part. And so we ship very high volumes of processors, and so we're looking at all of these things, but we need to balance that against, okay, where do we expect the mainstream solutions are going to end up, and where do we need to focus? But I think one of the advantages, or the key strengths, that we have in building upon the PC ecosystem is that it really is the only true open ecosystem, top to bottom, for people to go innovate on, right? So the fact is, if somebody comes up with, like, a follow-on to light field displays that's way better and that no one ever thought of, and somebody wants to go tackle that, they can do that and they can ship it on PC, right? People should keep in mind that all the stuff that Oculus has done with the Rift, from the original DK1 and DK2, and what the guys at HTC and Valve did, they did on the PC ecosystem, and they did it without Microsoft or Intel or any other large companies giving them permission. It's an open platform; they can go do it, right? So as things like that emerge, I kind of feel like we're going to see a lot of it emerge in this ecosystem first, right? And then we can partner with those people. We can choose to see which ones are going to get enough traction that we want to put them on our roadmap. So that's kind of how we look at that space.

[00:34:47.839] Kent Bye: So for Intel, it seems like I see this change and shift toward these immersive computing platforms. So I'm just curious to hear your perspective on how you think about that, and whether you do see this paradigm coming.

[00:35:03.496] Kim Pallister: Oh, absolutely. I think if VR were only about, like, a niche within gaming or something like that, we wouldn't be showing the level of interest that we are and investing at the level that we are. I think we really do see it as a fundamental shift in the way that people interact with technology, in a way that both senses the environment around them and senses what they're doing and feeling, so that it can feed that into whatever compute is going on and present it back to them in a way that's just natural and intuitive. And right now that's all starting in VR. It's going to play out across this whole spectrum, and it's going to affect a lot of what we do. You know, it's going to require a lot of high-horsepower computing, both in the cloud and in the client devices, happening under the hood. Like, it's going to be really hard to make it look seamless and transparent, but that's part of what's exciting about it.

[00:35:57.503] Kent Bye: Great. And finally, what do you think is kind of the ultimate potential of virtual or augmented reality and what it might be able to enable?

[00:36:06.480] Kim Pallister: Sure. So I just gave a talk at GamesBeat a week or two ago, where I paired up with Austin Grossman, who's a sci-fi author and actually works at Magic Leap as his day job, and we did a little back-and-forth on what visions sci-fi has been giving us about this. And I think, rather than view it as this kind of converged thing, whether or not there's a converged device, there really are two bookend kinds of uses, right? One is, like, the full holodeck, right? Like, take you and your friends and transport you into Skyrim and really make you feel like you're off having this adventure, whether it be, you know, a Dungeons-and-Dragons-style adventure, or driving a Formula One race car, or what have you. There's the full-immersion, full magical experience like that. And then the other extreme is this kind of futurist vision of an augmented reality experience, where you have a device that is ubiquitous in that it's an easily wearable, lightweight pair of glasses or some technology like that, that helps inform you in whatever way you want about the world around you, or changes the world around you to some degree so that you can layer a game on top of it. I think both those futures are gonna exist. I think both of them are really far out, much further than people anticipate, but that's good. We're good at long-term problems, and it's an exciting set of problems to go work on.

[00:37:33.737] Kent Bye: Awesome, well, thank you so much. All right. Thank you. So that was Kim Pallister. He's the director of the Virtual Reality Center for Excellence at Intel. So I have a number of different takeaways from this interview. First of all, one of the most striking things to me was how Kim was talking about how the 3D game market has evolved and developed, because there are basically three different types of platforms out there: there's the mobile phones and tablets, there's the console, and then there's the PC. There's a company called Newzoo that does a global games market report, and their latest report was projecting that in 2017, tablet and smartphone gaming was going to be around $46 billion, console gaming around $34 billion, and PC gaming around $29 billion. And what's interesting to me is that the VR market will probably be on the same order in terms of scale: tablet and smartphone on top, then console gaming, and then PC gaming, but with different kinds of gaming experiences on each of these platforms. As you get to PC gaming, you get the highest level of fidelity and immersion, while mobile games tend to be casual games. Now, I personally don't know whether people will be playing casual VR games in quite the same way, just because you need a certain amount of situational awareness of your surrounding environment. If anything, I think phone-based AR mobile games will probably be a lot more popular than phone-based VR games. But in the long run, I think the highest-end immersion that you can get with VR experiences is going to be way more immersive than any of the augmented reality games that are out there and available. So it feels like this is how this ecosystem and spectrum are going to play out. Now that AR is being pushed out to a lot of these phones, you're going to see a lot more of these depth-sensor cameras, and you're going to start to see six-degree-of-freedom capabilities. And then at the highest end of the PC, you're going to start to have things like getting rid of the tether and going wireless. And so I personally see that there's this convergence where high-end VR is going to become more and more like mobile VR, and mobile VR is going to become more and more like high-end VR, with initially 3DoF controllers and maybe eventually 6DoF controllers, and at some point there's maybe going to be this convergence on the optimal amount of immersion that you need, whether it's being able to walk around an environment with six-degree-of-freedom controllers or having a high enough frame rate. But there are always going to be these trade-offs between price, performance, and power consumption, and those are some of the ways that Intel is looking at how to differentiate among all these various markets. And Intel has their own strategy for how they're going to address each of them. But when we look at the high-end PC and some of the differences that you get from the higher-end Intel Core processors, it seems like they're on a trajectory where there's just going to be more and more demand for workload.
You can get a PC with the lowest-end processor and still meet the minimum specifications, but I expect that over time there's just going to be more and more demand, whether it's from real-time audio or multiplayer experiences, or from starting to add all these different levels of eye candy and real-time interactions of physics, particle effects, and destruction of objects. And so there are the four areas that Kim identified as differentiating the lower-end and higher-end Intel processors. And since this interview was recorded, they've actually come out with a whole other Intel processor: they've released the i9 now. So the four things that Kim said are, first, that applications are able to dial up the eye candy with a high-end processor. These are kind of nice-to-have things: you can still have the experience, but if you want to dial up the fidelity or the particle effects or the ragdoll effects, there are all sorts of different dials that you can set within specific games, and some of those games just automatically turn them on if they detect that you have hardware that's able to handle it. Second, there's having a little extra performance overhead, so that when you get to a part of the game that has a high workload, it just doesn't drop frames or glitch out in any way. Third, there's the use case of high-end applications where you can really dial things up and push the limits of what the processor is able to do, whether it's that universe simulation or Raw Data, where the application is really pushing the edge, and so you're able to go further than you would with a lower-end processor. And finally, there are the mega-tasking workloads, where you start to do streaming in VR, like MixCast mixed reality studios, or using a level editor like Unity or Unreal, being in virtual reality while actually using an application that's pushing the limits of the processor in order to create a VR experience, or doing something like real-time rendering for 360 video. So I think Kim made some really fascinating points: that the PC is an open platform, that some of the leading-edge innovation has been happening on the PC, and that the Oculus Rift and HTC Vive happened without these companies having to go and ask for permission. It's an open platform, and they were able to innovate in that way. And so I think there's going to be continuing innovation when it comes to the next generation of displays, whether it's light field displays or high-density OLEDs or even things beyond that, where you start to get away from some of the limitations of OLED screens and start to get into these digital light fields. And once those display technologies come about, there may be more application-specific integrated circuits driving those applications, like Jules Urbach said PowerVR is doing with a specific ASIC to drive light field rendering. And finally, it was striking to me to hear Kim and Intel being really committed to virtual reality, in the sense that this is where their trajectory is going in terms of the future of immersive computing, whether it's with augmented reality or virtual reality. This confluence of technologies is really pushing the edge of innovation and driving technology forward.
And with that, it's pushing us towards more and more interaction, more and more immersion, and more and more natural, intuitive interfaces that put our bodies within the experience. And so I think the timeline of when this critical mass is going to happen is still probably the biggest open question within the virtual reality industry. But from my perspective, looking at what the technology companies are doing, whether it's Apple or Facebook or Google or Intel or Microsoft, these companies are moving towards this world where an immersive computing revolution is on the horizon. We're already seeing the early signs of what that looks like, and I think it's just going to continue to move forward. So that's all that I have for today. I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, so I rely upon your gracious support to be able to continue to bring you this type of coverage. So you can become a member today at patreon.com/voicesofvr. Thanks for listening.