#102: Ben Lang’s GDC Recap & Impressions: Lighthouse, HTC Vive, Sony Morpheus, Oculus Crescent Bay, & eye tracking

Ben Lang is the Executive Editor at RoadtoVR.com, and he shares his impressions from the many VR announcements and news coming out of GDC. Ben has attended GDC for the past three years exclusively to cover the VR industry, and he says that this year was by far the most exciting.

Ben is also holding a Reddit Ask Me Anything thread today, so go ask him a question here.

Here’s all of the news that we discuss coming out of GDC:

  • 0:52 – The Lighthouse tracking system, Valve's HTC Vive, and the SteamVR controllers. Valve is planning to release Lighthouse to whoever wants to use it. It's the best positional tracking for VR in large spaces that we've seen so far, and the flexibility of the Lighthouse system allows for tracking multiple items.
  • 3:41 – More details on how the Lighthouse tracking system works with an X & Y sweep. It's 100Hz tracking that gets fused with other sensor data
  • 5:06 – Laser sensors are also on the HMD, and more about how the laser sensor timings work
  • 6:20 – Tracking multiple people and potential occlusion issues. Can always add more laser base stations
  • 7:27 – More details about the HTC Vive (here's Ben's hands-on write-up about it). Its tracking is better than anything else out there, and using a 15×15-foot space is something totally new. The Vive is visually very comparable to Crescent Bay.
  • 9:37 – Cultivating a sense of presence: being immersed in another place plus being able to act within a VR scene. The Vive takes these two components of presence to the next level with its tracking and input solutions, adding the natural movements humans actually make instead of an abstracted button press. Get immersed by not worrying about your surroundings.
  • 12:50 – Three tiers of VR design: mobile experiences ranging from Cardboard to Gear VR, sit-down experiences like Morpheus and Oculus, and stand-up experiences like the Vive and Survios. Devs carry around Gear VR because it's easier to demo VR experiences, and then when you come home you have positional tracking. Will people start to have a dedicated VR room or office?
  • 16:53 – Either people have to rearrange their rooms, or dedicated VR arcades will pop up with Lighthouse systems set up
  • 18:03 – Progressive enhancement ideas and how they may apply to VR design: from mobile with no input, to sit-down experiences with positional tracking, on up to walkable VR with two-handed interactions. Progressive enhancement design is more difficult in VR; it'll be easier to design for once the solutions become more standardized.
  • 21:05 – Difficult to do two-handed interactions without the controllers.
  • 23:07 – The Sony Move controllers on Morpheus were not as performant as the SteamVR controllers. Sony's VR narrative storytelling had Ben standing up and crouching, but not walking around. It worked the vast majority of the time and was really fun.
  • 27:53 – Q2 2016 release date for Morpheus, and the Vive coming out in Fall 2015. Oculus showed new Crescent Bay demos but made few announcements beyond an Audio SDK and the Mobile Game Jam. They had a big physical presence, but they were focusing primarily on developers, and they published some new blog posts about timewarp.
  • 30:09 – Developers need time to integrate input controls, so the Oculus launch looks like it won't have an input solution. A lot of the demos were very passive. Is Oculus pivoting toward more cinematic, passive VR? Ben thinks they're still focusing on the gaming market. They're working on an input solution, and devs need time to know what they're working with. Perhaps Lighthouse will become a standard solution.
  • 32:59 – A tour of the history of VR hardware development at Valve. Oculus is looking for people with optical experience. There aren't a lot of extra wires with the Lighthouse solution. Oculus likely knew about Valve's laser-scanning solution.
  • 34:59 – It was striking to see how much VR was happening at GDC; over 22 different booths had a VR HMD. HapTech is working on electric haptic feedback for guns, using STEM to track the gun.
  • 36:47 – Haptics are a key component of immersion, and the same actuators can be abstracted out into other haptic devices. Tactical Haptics and their Reactive Grip controller
  • 39:03 – Low-poly scenes for VR and the uncanny valley problem; using stylized art to avoid the uncanny valley
  • 40:57 – Lucky's Tale uses a diorama approach with a third-person perspective that hits the sweet spot of VR's stereoscopic effects. Some things work better in VR, like body language for telecommunication. Google Earth data was being mapped in the SteamVR demo, and it'll help visual learners.
  • 43:42 – John Dewar's educational demo of airplanes; Oculus demos used scale a lot. Other cool indie demos included ConVRge, which broadcast a livestream of the party into the VR space and then broadcast the VR scene onto a screen at the party. The WebVR experiences that Mozilla was displaying
  • 45:52 – Mozilla's WebVR experiences are really exciting; they're rearchitecting the browser to be more optimized for VR and for ephemeral experiences. The web is great for quickly navigating information without having to download a lot of data
  • 48:10 – Google Cardboard experiences and ecosystem for ephemeral photos and videos, and using Gear VR to show people 360 videos very quickly
  • 49:46 – Eye tracking at Tobii and FOVE. Eye tracking can add a lot of useful things for VR, like adding depth of field, knowing where the user is looking for selection, doing better chromatic aberration correction, and doing foveated rendering for more efficient rendering (see the sketch after this list).
  • 52:49 – Augmented reality like Project Tango, Qualcomm's Vuforia, and Meta's AR hackathon. Microsoft and Magic Leap are the two big AR players at the moment. AR isn't there yet, and it needs sub-millimeter tracking
  • 54:32 – Meta's AR hackathon: a small field of view, about a 60ms delay, rudimentary demos not really interacting with the environment, and headsets more tethered to a computer. They have $23 million in funding and some interesting team members. AR is still in its really early days, and computer vision is not a solved problem; convincingly placing virtual things in the room is difficult. VR is a lot further along in terms of the experience.
  • 57:15 – Magic Leap is looking to the VR space for innovation. The OSVR and Razer booth, having a unified SDK, and Unity's integration of VR input as well. As long as OSVR's system just works, it doesn't matter as much whether Oculus, Sony, or Valve is involved or not. It'll allow third-party manufacturers to collaborate.
  • 1:00:12 – The Khronos Group's Vulkan announcements and the collaboration that's happening there. Developers need to pay attention to performance rather than just throwing GPUs at the issue; expect a lot of focus on speed and reducing latency in the future.
  • 1:01:45 – Unity, Unreal Engine, and Source 2 are now free, and that's huge. VR has been a grassroots movement with a lot of experimentation from a lot of small upstarts. Ben Lang really wants a virtual pinball game.
  • 1:04:25 – AAA shops are being really cautious with VR, and it's a great time for indies to jump in and experiment. Lots of open problems in narrative storytelling for VR
  • 1:05:34 – Other highlights were the Sony Morpheus narrative experiences, and the Weta experience from Epic was pretty memorable
  • 1:07:03 – Unreal Engine for cinematic VR with Oculus Story Studio and interactive games from Unity
  • 1:07:40 – A lot of important announcements from GDC for the future of VR
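
Of the eye-tracking applications above, foveated rendering is the most performance-relevant: only the small foveal patch of the retina sees full detail, so a renderer that knows the gaze point can spend fewer pixels on the periphery. Here's a toy sketch of that idea; the ring thresholds and scales are illustrative assumptions, not values from any shipping eye tracker.

```typescript
// Toy foveated-rendering heuristic: full resolution where the user is looking,
// progressively cheaper shading farther out into the periphery.

interface GazeSample { x: number; y: number } // normalized [0, 1] screen coords

// Distance of a screen point from the gaze point, in normalized screen units.
function eccentricity(px: number, py: number, gaze: GazeSample): number {
  return Math.hypot(px - gaze.x, py - gaze.y);
}

// Map eccentricity to a resolution scale for that region of the frame.
// The thresholds below are made-up illustrative values.
function resolutionScale(ecc: number): number {
  if (ecc < 0.1) return 1.0;   // foveal region: render at full resolution
  if (ecc < 0.25) return 0.5;  // near periphery: half resolution
  return 0.25;                 // far periphery: cheapest
}

// Example: shading rate for a screen corner while the user looks at the center.
console.log(resolutionScale(eccentricity(1, 1, { x: 0.5, y: 0.5 })));
```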

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

[00:00:12.414] Ben Lang: I am Ben Lang, the executive editor at RoadtoVR.com, and GDC this year was hands down the most exciting year for virtual reality at this conference. This was my third year at GDC, exclusively watching the virtual reality industry. And it was just tremendously exciting. So much amazing stuff happening.

[00:00:32.512] Kent Bye: Yeah. And I think just to start off, you know, I think the biggest news coming out of GDC is probably Valve's announcement that they're coming out with not only a virtual reality HMD, but also a whole tracking system. Maybe you could talk about some of your impressions and what you saw and where you think this impacts the overall VR industry.

[00:00:51.837] Ben Lang: Absolutely. So Valve came out of the gate with not only the tracking system, but, you know, SteamVR. It's kind of confusing, the terminology they're using right now, because SteamVR will probably be what they call their software at some point, but right now they're kind of using that term to refer to their platform, their VR platform, which is the Lighthouse tracking system; a headset, which happens to be the HTC Vive as the first one (this was developed by Valve, but HTC is going to manufacture it); and then their SteamVR controllers, which we can talk more about, but which are also very interesting. So the company says that they are planning to kind of release Lighthouse to whoever wants to use it. And this could potentially be huge, because it is the best system for doing positional tracking in a large space that we have seen yet at the consumer level. It is highly practical, and it is very elegant in its solution and its flexibility. The Lighthouse system is really interesting because it is essentially a dumb system, if you will. So if you think about a lamp, a lamp illuminates a room, right? And it does so for everybody. You don't need to be connected to some special thing to see what that lamp is illuminating. Lighthouse is similar in that it enables any space to basically have the necessities for high-performance positional tracking. And it can track anyone's object. It doesn't hook up to a computer, I think, is the easiest way to say this. It literally just plugs into a power outlet. And that's, you know, why I make the comparison to a lamp, because you don't need to hook this up by Wi-Fi. You don't need to plug it into a computer and run a cable all the way across the room. You just place these base stations in the corners of your room, and that space now becomes ready for high-precision tracking for any object, really. So this is why it's such an interesting solution. You know, you can do it with a headset, you can do it with controllers, you can attach something to a peripheral to track it easily, and it can be anybody's object, like I mentioned. So, you know, I could have a headset and walk in there and it's tracking from my computer, and then you could have a headset and walk in there and it would be tracking on your computer. There's no hookup between the system and the items that it is tracking. The items need to be hooked up to a computer, and they do all the processing. But this means it's really interesting in the way that, you know, you could add a third base station for a more robust tracking volume with less chance of occlusion. You could string these bases together into a corridor and walk through the corridor. So it's just a super interesting solution for turning any existing space into a space that is ready for really good positional tracking, which is so important not just to the head-mounted display, but also to the controllers and anything else you want to interact with.

[00:03:41.062] Kent Bye: So yeah, let me just repeat some of that to make sure I understand it. It seems like they have one laser that's sweeping in, let's say, the X direction and then another one sweeping in the Y direction. And you have these controllers that have these little receptors on them, and based upon those X and Y lasers, they're able to figure out the orientation of what you're holding in your hand. And then from there, they do, I presume, inverse kinematics to figure out where your hands are at. Is that correct?

[00:04:15.002] Ben Lang: Yeah, that's pretty close. So each base station, right, has an X and a Y laser sweep, essentially. They have spinning lasers that basically wipe a line across the room in the X direction, then another one in the Y direction; each station has that. The reason to add more stations is purely to prevent occlusion. And then as the laser sweeps across the sensors, there's some complicated math that goes into figuring out exactly where the device is and what its orientation is, but that is how it works. And to my knowledge, this is a hundred Hertz tracking at this point. And then it's combined, of course, with accelerometers, IMUs in the controllers and in headsets and whatever else, if you want, to add that additional precision and rotational tracking at up to, what they're doing right now is, a thousand Hertz.
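
To make that rate mismatch concrete, here's a minimal sensor-fusion sketch, not Valve's actual code: a complementary filter on a single rotation axis, where ~1000Hz IMU samples dead-reckon between ~100Hz absolute fixes from the laser sweeps. The rates and blend factor are illustrative assumptions.

```typescript
// Minimal sensor-fusion sketch: a complementary filter on one rotation axis.
// The IMU integrates angular velocity at ~1000 Hz (fast but drifts);
// the Lighthouse sweep supplies an absolute angle at ~100 Hz (slower but drift-free).

class ComplementaryFilter {
  private angle = 0; // current estimate, in radians

  // Called at IMU rate (~1000 Hz): dead-reckon from the gyro's angular velocity.
  onImuSample(angularVelocity: number, dt: number): void {
    this.angle += angularVelocity * dt;
  }

  // Called at sweep rate (~100 Hz): pull the estimate toward the absolute fix,
  // bleeding off the drift accumulated by integration.
  onOpticalFix(absoluteAngle: number, blend = 0.05): void {
    this.angle += blend * (absoluteAngle - this.angle);
  }

  get estimate(): number {
    return this.angle;
  }
}

// Usage: ten IMU samples arrive between successive optical fixes.
const filter = new ComplementaryFilter();
for (let i = 0; i < 10; i++) filter.onImuSample(0.5 /* rad/s */, 0.001 /* s */);
filter.onOpticalFix(0.0049); // absolute angle recovered from the laser sweep
console.log(filter.estimate);
```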

[00:05:03.788] Kent Bye: Now, are they also having receptors on the actual Vive HMD?

[00:05:09.251] Ben Lang: Yes. So there are sensors all over it. And these are not cameras, like you might think. This is not a computer vision system, which is actually what Oculus is using. They have a webcam that looks at the DK2 or the Crescent Bay, and it picks up lights that are on the headset, and then it sends that image to the computer. The computer takes that image and processes it: OK, where are these lights? And we know the pattern they should be arranged in. And that gives us the position of the headset. The Lighthouse system actually sweeps these lasers across the sensors. And the sensors use highly accurate timing to say, OK, I know that the X laser is coming. And as it's coming across, it has hit this sensor at time 0, this sensor at time 1, and this sensor at time 3. And using those timings, the computer can work out, OK, the system must be in this orientation in order for those to have been the times that the laser came across, because we know the speed that the laser is crossing, and we know when the X laser is coming and when the Y laser is coming.
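
As a rough illustration of the timing math Ben describes (a sketch under assumed numbers, not Valve's implementation): if a laser sweeps the room at a known angular rate, the time at which it strikes a sensor directly encodes the angle from the base station to that sensor. Two perpendicular sweeps give two angles, which define a full ray direction; the hit timings across several sensors with known positions on the headset are what constrain its pose.

```typescript
// Sketch: recover the direction from a base station to one sensor from sweep timings.
// Assumes each rotor spins at 60 Hz and a sync event marks the start of each sweep.

const SWEEP_HZ = 60; // assumed rotor speed
const OMEGA = 2 * Math.PI * SWEEP_HZ; // angular rate of the sweeping laser, rad/s

// Time since the sweep's sync pulse -> angle of the laser plane when it hit the sensor.
function hitTimeToAngle(secondsSinceSync: number): number {
  return OMEGA * secondsSinceSync;
}

// Two perpendicular sweeps (horizontal and vertical) give two angles, which
// together define a ray from the base station out to the sensor.
function anglesToRay(azimuth: number, elevation: number): [number, number, number] {
  return [
    Math.cos(elevation) * Math.sin(azimuth),
    Math.sin(elevation),
    Math.cos(elevation) * Math.cos(azimuth),
  ];
}

// Each sensor the station sees becomes one ray; with several sensors whose
// relative positions on the headset are known, a solver fits the single
// rigid-body pose that lines all the rays up (the "complicated math" above).
const az = hitTimeToAngle(0.002); // sensor hit 2 ms after the X-sweep sync
const el = hitTimeToAngle(0.0015); // sensor hit 1.5 ms after the Y-sweep sync
console.log(anglesToRay(az, el));
```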

[00:06:18.934] Kent Bye: Oh yeah, that's really brilliant. I mean, it's really elegant if you think about it. And also, I guess a question would be: let's say you have three or four people in the same room. Can they just use one Lighthouse system and each have their own independent system? Or are there going to be occlusion issues that may come up?

[00:06:36.618] Ben Lang: Yeah. Well, so that's one of the very interesting, flexible, and, like you said, elegant things about the system. Anybody could use the space. The space becomes enabled as soon as you add these Lighthouse units. Yes, there will be potential occlusion issues, because, you know, if a person is ducked down behind you and the laser sweeps across and you block the laser, then yes, you'll have occlusion problems. So for many people it may not be great, or maybe you'll have to add, you know, ten Lighthouse base stations to be pretty confident if you're going to want to have like four people in there. But for one person and two controllers, it's quite good. Maybe for two people, each with controllers, it can be pretty good. But the nice thing is you can just say, OK, I'm going to get another base station and plug it in over here. And now I have that additional angle covered for better avoidance of occlusion issues.

[00:07:28.153] Kent Bye: Yeah. One of the interesting things about this year at GDC with Valve is that, you know, they set up at least seven stations per half hour for developers to go try this out, but they needed quite a large space to be able to do that. And so they ended up having to turn down over a thousand people trying to get in and experience it. So it was certainly the hottest ticket at GDC. And, you know, I actually did not get in despite my pleading, but you were lucky enough to actually try it out. So I'm curious what some of your thoughts were.

[00:07:58.492] Ben Lang: Yeah, I'm working right now on a pretty lengthy article about all of these thoughts, which will pop up soon. But the short of it, and I'm going to position this the way most people ask me: the first question is, how is it? And the second question is, is it better than the other stuff? And the short answer is, yes, it's better than the other stuff that we've seen so far from Sony, from Oculus, and from others. But the reason for that, I need to clarify. Crescent Bay is probably the closest competitor to what's going on right now, because the resolution is similar, the field of view is similar, and visually, when you just look through the two headsets, if you're standing in one spot and moving your head around and spinning, they are basically the same, or at least very comparable. So visually, through the headset, there's no major advantage to the Vive, let's say, over Crescent Bay. Where the advantage comes is suddenly you have a 15-foot by 15-foot space in which you can walk around in SteamVR, using that same high-quality tracking that you're used to on Crescent Bay. And then on top of that, you have an input system that is just as accurate in the same volume. And the controllers, furthermore, have some pretty interesting haptics on them and are being used in very unique ways that I think are going to be very beneficial for virtual reality. But yeah, so the short of it is: visually and tracking-wise, these systems are comparable, but the addition of the space that you can use and the controllers means that at this point, SteamVR is ahead of the game.

[00:09:36.297] Kent Bye: Yeah. I think that in my own personal experience, one of the more immersive virtual reality experiences I've done has been the Sixense lightsaber demo, just because you are engaging both your hands with a little bit of subtle haptic feedback. And having that hand tracking and positional tracking, in combination with just a DK2, was, to me, one of the most immersive experiences for feeling that sense of being present, and feeling like whatever I was doing in the VR scene was actually having an impact. And I think that the Crescent Bay demos this year didn't have any input, and so they were such a passive experience. And so the Vive, when people are doing that, it's taking the level of presence up to the next level. I think it was giving people that sense of being in the VR scene, something where they really hadn't experienced that much immersion before, or that much presence. And so Sébastien Kuntz, when we were talking about this back at Oculus Connect, talked about Mel Slater's place illusion and plausibility illusion being the two key components of being present. And I think, from my observation of the Vive, people are able to actually experience both of those at the same time: both being in that place and being able to have their actions have an impact. And that's why I think you get such a huge reaction of, like, oh my God, this is so different. So I don't know what some of your thoughts on that are.

[00:11:01.401] Ben Lang: Yeah, no, I totally agree. Being able to interact with the space is extremely important. Anybody who's been following along in this virtual reality development world will know that the input question has been huge. It's kind of at the top of everybody's minds, because Oculus has never publicly shown any of their headsets with anything other than a keyboard or a controller. And being able to interact naturally with the world is massive for virtual reality, just a huge improvement in immersion and interactive ability. You know, it's one thing to say, hold X to reload, which is what we've been used to for years. It's another thing to say, okay, I need to reload: I'm holding my hands up because I have motion-tracked controllers, and I'm going to have to press one button to drop the clip out, then reach back behind me to get a new clip and put that in the gun. You know, that turns a single button press into a whole mini-game, basically. And that adds so much to the experience, because it is what you would do in real life, you know? It removes an abstract system, the controller or the keyboard, and turns it into something that is just: okay, here are objects, interact with them like you know how to do, as any human does. And so you're totally right, having that interactability and the huge space is definitely the reason why people are being blown away by SteamVR. Because you become so present in the space that even though you're confined by walls in the real world, when they put a virtual environment around you that is visually just completely expansive, you just kind of believe it. You're just like, OK, I don't see the walls. I trust that I'm not going to run into the walls. And therefore, I can put them out of my mind and just be here in this space. And it is a tremendous improvement over just standing and looking, not doing anything.

[00:12:49.251] Kent Bye: Yeah, and just walking around the floor of GDC and seeing the different people having Gear VRs there, some people had DK2s, it seemed to be all over. But one of the things that I really think about now is these different tiers. In my mind, I think of three tiers. The first is designing for a mobile experience, everything from Google Cardboard to the Samsung Gear VR. The second is something that's designed for sitting down and facing forward, and it seems like maybe Morpheus and Oculus would be in that realm of, you know, maybe not walking around the room. And then the third one would be the Vive, completely dedicating an entire room for you to walk around in. And something like Survios is also working on this sort of out-of-home experience: either you go out of your home to do this as a digital out-of-home experience, or you have a dedicated room in your home to be able to do this. So, in terms of designing VR experiences, those seem to be three distinct tiers to me from a VR design perspective, but they also depend on what type of use cases you have: everything from the Gear VR, which is very mobile and you can just throw it on people, to the DK2, where you have to have maybe a higher-end computer to set it up and a dedicated space to sit down with a camera for positional tracking, to the Vive, where you have to actually set up a Lighthouse system in the room, and where it may be harder to just take it out to a demo place in the streets because you have to have more of a dedicated space. So yeah, I don't know how you think about these tiers and use cases, if it's similar, if you have any thoughts on that.

[00:14:26.135] Ben Lang: Yeah, they're definitely there, and it's obvious to see, because before the Gear VR came out, developers who were wanting to demo things to people would take their DK2 around, and usually they'd leave the camera at home because it was kind of a pain. And they would just use the DK2 without the camera, because it has an IMU in there. Or actually, this may have been the older HD prototype. But one of these systems they would just take around, because it was the easiest thing that they could do at the time. And then when Gear VR came out, now everybody's toting those around, and it's so much easier to show stuff. The interesting thing, going back to Lighthouse, is that Gear VR, for instance, could be exactly what it is today, but have the sensors on it for Lighthouse. And it could be that you're out and about, and you're not anywhere near the base stations, and you just use it like Gear VR is today, which is no positional tracking, just orientation. And then when you come home, you can just take that same headset and walk right into that space. And now you have that high-level positional tracking that really makes a VR experience. That level of flexibility, I think, is going to be crucial to making people say, OK, there's a good reason to have a permanent setup in my house for Lighthouse. But to the point of these different spaces: do I want to have a VR office where I can sit at my desk and work? Yes, that's going to be desirable. Lighthouse might not be best there, in a messy space with a small, confined area; Oculus' single-camera solution may be best there. And then you might leave the office and say, okay, I want to really explore a VR world, and do that in your dedicated VR room. And this argument that, oh, well, Lighthouse requires an entire room? I find that to be really silly when you take a step back. I mean, look at the television. How exciting is a television experience compared to VR? You're just watching a little screen there. If people dedicate a room in their home to the TV, and you think about what that experience is compared to VR, why would they not do the same for a system like SteamVR? I think that they will. I will. It just doesn't make sense not to. I mean, geez, sacrifice your TV room, if anything; the experience is so much more amazing than watching something on a small flat screen. Even if it's a 60-inch HDTV, it feels small compared to virtual reality.

[00:16:53.290] Kent Bye: Yeah. I know if people have kids, they may have toys and stuff lying around, and, you know, if there's a table or something, you have to move it. So yeah, it would either require people to kind of rearchitect and really dedicate that space to be able to walk around without obstruction, or, like I said, there may be places that pop up where you go to a VR arcade to experience it, without having to reorganize your living room, or if you don't have enough space, or whatever.

[00:17:19.418] Ben Lang: Yeah. Well, so the Lighthouse system, and the things that they're saying about how they want this to just go out there to whoever can use it, that could be huge for that sort of thing. You know, it'd be painfully easy; you don't have to come up with a custom system. You can just have your VR arcade filled with the laser base stations and go from there, with whatever headset. People could even bring their own if they wanted to, because of the fact that it's kind of a dumb system. So yeah, that would be excellent. And then in a VR arcade, you could have lots of different peripherals or systems and games specific to what you've got there, like maybe a chair with a bunch of buttons and levers for a specific, you know, mech game or what have you. Lots of different possibilities.

[00:18:02.965] Kent Bye: Yeah. And one of the things that I saw on the floor that was really interesting was Mozilla's WebVR, which is starting to get more content that you can view through the browser. And one of the reasons why I bring that up is because in the web world, they have something called progressive enhancement: if the browser has certain capabilities, like JavaScript turned on, the site takes advantage of them, but if they're turned off, it should still work. If the capability is there, you use it; if it isn't, the site still works. And I think that when you think about VR design, I'm seeing a similar thing happen to what the web world has had to deal with with progressive enhancement, where as a VR developer, do you design something to work on mobile with no input? And then do you add input as a sit-down experience? And then do you add even more interactivity if you're able to walk around? So I see these decisions that VR developers are going to have to make: am I going to start by requiring absolutely everything, with the walking, and take that to the extreme? Or are they going to start at the very lowest bottom and say, oh, I have a VR experience that I want to be just for mobile with no input, and I still want it to be a good experience?

[00:19:14.156] Ben Lang: Yeah, it's definitely going to be a challenging design world right now, because there are so many different systems at this point, like we just talked about. On the web, the progressive enhancement thing is definitely an important part of it. But it is difficult to do in virtual reality, because on the web, you know, you want everything to basically function similarly. You don't want to have one website that scrolls left to right instead of up to down. You try to make these things consistent for intuitive purposes, because most of the web is about information. If you try to apply the same concepts to a virtual reality experience, you kind of end up with a lot of sameness. And I don't know that that is as easy to do, because every game developer wants to do something very different. But I agree that there will be instances, particularly just going from, let's say, Gear VR to desktop: you're going to have people who create a game on Gear VR that they know will have improvements if they go to desktop, where desktop has positional tracking and, you know, potentially a full keyboard or controller for input. But I think covering the entire span, from mobile with just rotational tracking to SteamVR with a 15-foot space and input, will be very difficult. I think that the answer is SteamVR: having a space, having the controllers, and having the headset. I think that's going to become kind of the standard VR platform going forward. And then you might have your mobile as well. But I think that we will slowly slide away from having so many different types, and it'll be easier to develop across these solutions later, because eventually we'll settle into what is the best and most practical thing. And I think SteamVR is approaching that.
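
As a toy illustration of what tiered, progressively enhanced VR design might look like in code (all names here are hypothetical, not from any shipping SDK): an app could probe the hardware's capabilities at startup and map the same game action onto the richest input available, falling back to gaze-and-tap on mobile.

```typescript
// Hypothetical capability probe for tiered VR design; no real SDK is assumed.
type InteractionTier = "gaze" | "seated-positional" | "room-scale-hands";

interface Capabilities {
  positionalTracking: boolean; // e.g. a DK2 camera or Lighthouse base stations
  trackedControllers: number;  // e.g. 0 on Gear VR, 2 on SteamVR
  roomScale: boolean;          // a walkable tracked volume
}

function pickTier(caps: Capabilities): InteractionTier {
  if (caps.roomScale && caps.trackedControllers >= 2) return "room-scale-hands";
  if (caps.positionalTracking) return "seated-positional";
  return "gaze"; // lowest common denominator: orientation-only plus a tap
}

// The same "reload" action, expressed at each tier's level of richness.
function bindReloadAction(tier: InteractionTier): string {
  switch (tier) {
    case "room-scale-hands":
      return "reach behind you, grab a clip, and slot it into the gun";
    case "seated-positional":
      return "lean toward the ammo box and press the grip button";
    case "gaze":
      return "look at the gun and tap the touchpad";
  }
}

console.log(bindReloadAction(pickTier({
  positionalTracking: true,
  trackedControllers: 2,
  roomScale: true,
})));
```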

[00:21:02.353] Kent Bye: Yeah, I agree that you'll probably have to choose one platform and then really, at this point, explore the extent of the user interactions that are even possible with all the best things, without having to worry about that progressive enhancement. Because when I talked to Alex Schwartz from Owlchemy Labs (they had one of the demos in the Vive demo reel, which was a half hour long), they were really trying to figure out, well, what's really interesting to do with two-handed interactions? This is something new that we haven't had before: having both of your hands doing something, where you may be able to use them together. And as they were thinking about it, they realized, oh, well, whenever you're using two hands, you're doing all these different types of jobs. So why don't we create this sci-fi future where all the robots are doing all the jobs, and you're doing this training simulator of going back in time and figuring out what it'd be like to be a chef in a kitchen, and all these other different things. But some of the design things that they did with those two-handed interactions would never be possible without these controllers, if you went all the way back to having it be more of a cinematic VR experience.

[00:22:08.384] Ben Lang: Yeah, you're right. So their game, which is actually a lot of fun, is kind of cartoony and cutesy. And you're doing this job simulator where you're a chef and you basically look on the board behind you and it says what ingredients you need to put into the pot. So you look around the room and you can like, you know, physically turn around and walk over to the refrigerator, grab stuff out, walk back to the pot, drop it in, or pop open the microwave and put something in there. And it's really quite a lot of fun. But if you were to try to take that experience back to something like Gear VR, it would really consist of, OK, you stay in the middle of the room. And you look at the object you want. And then you touch the touch pad to pick it up. And then you hover it with your mind capabilities, I guess, over to the pot. And then tap again to drop it. And yeah, you could do it that way. But it really kind of breaks a lot of the neatness of that experience, a lot of the interactivity. And again, it adds that abstract method of, okay, I'm going to tap a touch pad to do something instead of just reach out and grab it and do something with it.

[00:23:07.273] Kent Bye: Yeah. It seems like, you know, with the new Vive, it's going to be sort of on the forefront of all these new interactions in virtual reality. And also, the Morpheus has the Move controllers, and they had some new demos that were very private. They weren't public, and so not everybody got a chance to try them, but you did. So I'm curious what you saw and what your reaction is to having your hands in the game with the Move controllers, and how well that works.

[00:23:32.001] Ben Lang: Yeah, the Move controllers are good, but I would say at this point, I didn't see them performing as well as the SteamVR controllers. So I got to try the latest Morpheus prototype, the 2015 prototype that they just revealed at the show. And I got to do this experience called The London Heist, which was a lot of fun and really highlights Sony's network of impressive game developers. It's kind of two experiences in one. First of all, you start out sitting in this interrogation scene, where it's a dingy warehouse around you, and there's this big, bulky guy there who's talking to you, and he smokes a cigarette and blows smoke in your face. And he tries to be real intimidating, and it's all mo-capped and, you know, voice-acted very well. And there's some interactivity there: if you try to look at the door, where there's this big exit sign where you think, okay, I might be able to get out there, he will be like, no, you're not going anywhere. And then he gets out a gun and shoots the exit sign. And it's a really cool scene, especially as it comes up to the end, where he gets this blowtorch out and he's like, if you don't tell me what I need to know, I'm going to kill you with a blowtorch. And it comes right to the climax of his anger, and then you hear this cell phone ringtone, this cute little, like, do-do-do-do-do-do, or something, totally not befitting this big guy. It's kind of a moment of a little bit of comic relief. He gets his cell phone out of his pocket and talks on it for a second, and then he says, all right, someone wants to talk to you. And he tells you to stand up. And you actually do physically stand up in the environment. And he tells you to take the phone. So you reach out with the Move controller and grab it. And at first the voice sounds muddled, like you can't really hear it. And then you just kind of naturally put it up to your ear, because that's what you do with a phone. And Sony has this really good 3D positional sound, so now the phone sounds like it's right up to your ear. That was really cool. And then you can hear it very clearly, and there's a guy on the other end asking you what happened last night. You still don't really know why this guy's interrogating you, but this guy's asking you, he says, walk me through what happened last night. So then you flash back to the scene where you are in this room, this very fancy decorated space, and you're behind a desk, and there's somebody coming over the intercom or a headset telling you, OK, find so-and-so, there might be a key around here. So you're behind this desk, and you're opening drawers and stuff with the Move controllers, and you do find a key. And then there is another drawer that has a keyhole, and you put the key in there, and there's this big diamond inside. As soon as you grab it, the alarm goes off, and these guards come in. Then you open up the center drawer, and you find a pistol in there, and there's clips. And so you can fire the pistol with the Move controllers, and then you can grab one of the clips with the other hand and put it in there. And it's really just a lot of fun, an extremely polished experience. The developers clearly spent a lot of time on it.
You know, you're crouched down behind the desk because there are bullets whizzing over your head, and they're hitting the desk and there's wood splintering off. And you can reach your gun up and just kind of blind-fire up over your head and try to hit one of these guys. Or you can wait until they're done with their volley and pop up and, you know, kill them. Really great experience. Although, so, I had some trouble with the Move controllers, especially when I was ducked down low. And, you know, this may have been a problem of camera placement, but it is also fundamentally a problem that the tracked space is significantly smaller than something like SteamVR at this point. So they were able to ask me to stand up from a sitting position, which is cool, and it worked into the game very well. But there is no sense, really, of being able to walk. You might step once to your left or once to your right, but there's no sense of walking around this space. And because of the fundamental limitation of that tracked space, when I got down low, I believe the problem was that my Move controllers were falling out of that tracked area. And this gave me some trouble with putting the clips in the gun. But all things considered, it worked the vast majority of the time, and the experience was a whole lot of fun. So even if it only ever worked as well as I saw it, which was with some issues, I still think it is a completely compelling experience. But in the end, I think that they will get better at designing so that you're not accidentally leaving the tracked space.

[00:27:54.275] Kent Bye: Right. Yeah. And the other big news with Sony is that they actually announced that in Q2 of 2016 they're planning on coming out with a consumer version of Morpheus, or whatever they change the name to. I don't know if it's still called Project Morpheus; everyone seems to know it by that. But then also the Vive, they say, is going to be coming out sometime in the fall of 2015, which I think in some ways puts some pressure on Oculus. So Oculus was pretty tight-lipped. They weren't really talking to press. They just had some new demos that they were showing, a scene from The Hobbit with the big dragon, Smaug. They also had Crescent Bay units at the Crytek booth, where they were showing another dinosaur experience. So yeah, I don't know if you have any thoughts about Oculus here at GDC, what you sense from them, and where they're at moving forward.

[00:28:42.809] Ben Lang: Yeah, it's interesting. So when we reached out to them initially, they said they weren't doing any press, and that was kind of strange for them. You know, usually they have something to show, especially at GDC. GDC was where they first showed the Oculus Rift DK1, and where they first had the Oculus Rift DK2 last year, and so this year we were expecting something. I guess they may have known that the Valve news was coming up and just kind of wanted to leave them the spotlight. I doubt that they don't have any developments to show; you know, they're always working on a lot of stuff. But yeah, it was a little bit different than usual. They had a big physical presence. They had two booths, lots of Gear VRs. They were showing Crescent Bay all over the place. And they also showed the one new demo, the Weta experience from The Hobbit, which was extremely cool. But they said that they were focusing on developers primarily. So yeah, they didn't do a whole lot in terms of announcements, although they did release some interesting articles on their official blog about reducing latency. They had several sessions about what they're doing with their SDK and how to work with their new audio solution. So it did very much seem developer-focused. And I think that that is a move toward them saving up for Connect, which is their own official Oculus conference, and that'll come in September. You know, they've used trade shows in the past as vehicles for communicating big announcements, but now that they have Connect, and this will be the second year coming up, I think that they will use that as their place to make those sorts of announcements.

[00:30:18.285] Kent Bye: Yeah, I was really hoping to see some type of input solution, because if you think about it, if you want to have something launching this year, you need to give developers some time to actually develop experiences using input controllers. So if they do launch something, it's going to be, I imagine, something that's very passive or that requires people to have Xbox controllers. All the demos that were being shown, like the Crytek one, had maybe three or four different ways to interact, just by the way that you're ducking and moving your head. The Hobbit experience was very on-rails, passive; you kind of just sit there and watch it and experience it. And I got a chance to see the Lost experience as well, which is their Oculus Story Studio experience, and which, again, is very much on rails; you kind of just experience it. And I don't know, do you think that Oculus is pivoting toward doing more cinematic reality, more toward film and entertainment and passive experiences, without a lot of input controls? Or do you think that they will also keep going after the gaming market with some sort of input controller?

[00:31:23.507] Ben Lang: No, they'll definitely continue to go after the gaming market. It makes sense to start there, primarily because game developers are the ones that have experience in creating real-time spaces, especially interactive ones. It's basically the market that is the most advanced at doing that. They're definitely sticking with the gaming market. The Story Studio stuff is interesting, but I think that's just one of many branches of what virtual reality will touch. I do think that they're working on an input solution, but yeah, it is a good point that developers do need time. If Oculus wants any launch titles, developers need time to know exactly what they're working with. And so far, Oculus has said that the DK2 is basically, you know, the base feature set that you can expect, but better, from the launch version of the headset. But there's been absolutely no word on an input solution, aside from the fact that they're going to do one. You know, it's an interesting possibility to think that if Lighthouse goes out there, and Valve really does open it up for whoever wants to use it, maybe that just solves the tracking problem right there, and then it really just comes down to the controller design. Do you want some haptics on there? What do you want to do with that? And I wonder if Oculus is considering that, or I wonder if they would not want to do that for pride's sake, or maybe they think they have something better coming up. But I think that in the end, the Lighthouse solution covers so many bases that it would almost be a shame not to make it the standard for tracking.

[00:32:58.159] Kent Bye: Yeah, I know that behind the scenes of the Valve demo room, I actually got a little tour of some of the hardware and history there. And, you know, they were taking motors and putting lasers on them; it was kind of do-it-yourself LiDAR scanners, basically, which is what they've created with Lighthouse. And if you look at the timeline, they were prototyping and developing that, and I don't know if it was before the Facebook acquisition, but you can kind of presume that Oculus had their hands on some of that stuff. But yeah, it is kind of interesting to think about. And, you know, looking at the job descriptions, I know Robert McGregor was saying that Oculus was looking for people with optical experience with lasers. So of all the solutions I've seen out there, it probably is the most consistent, cheapest, and most effective, I'd say.

[00:33:46.197] Ben Lang: Yeah, it does seem to be turning that way. I'm not a hundred percent certain on the cost, but it does seem like it should be fairly inexpensive, because the base stations consist of, you know, a nice rotating motor, the ability to generate the lasers, and some LEDs. But other than that: no wireless, no processor, no display. And then you just need a power outlet plug and the casing for it. So that is very interesting. You can keep all of the complicated stuff in the headset, the thing that is already going to cost a fair amount. Whether or not Oculus is actually working with the laser stuff at this point, they very well could be, because they were very close with the company. And not to mention, Michael Abrash, who is now their chief scientist, used to work at Valve, and he almost definitely knew of the laser work that they were doing, because he didn't move over to Oculus until, I believe, either late 2013 or 2014. And the initial laser prototypes happened earlier than that, if I remember correctly. So there's almost no way that he doesn't know about that system, and he may or may not have seen what it could become with more development. And they could be working on that at Oculus as well.

[00:35:00.027] Kent Bye: Yeah. And, you know, I had a chance to walk the floor as well, and there was a lot of different stuff happening. I probably did around 30 different demos, both hardware and software. And I'm curious, if you had a chance to walk around and experience anything, whether there was anything that was really striking for you.

[00:35:17.218] Ben Lang: Around the floor, I think the thing that was striking was the number of VR things happening there. One friend of mine said he had counted some 22 different booths that had some form of virtual reality, whether it was a Gear VR just as a promotional tool, or a DK2 for making a game. And I bet there were even more than that in some of the indie gaming areas. And that is such massive growth. And it's not just the quantity; it's the quality of these different players: Oculus, Valve, Sony. It's crazy. I mean, and then we have Razer working on their OSVR stuff. It's growing tremendously fast. One cool thing that I'll call out, kind of just one of those show-floor things, was Striker VR, as they used to be called; they're now called HapTech. They're working on an electric haptic feedback system for guns and different things. And what they have right now is extremely good. So they have a gun mockup, and they had it hooked up to the DK2, and it's tracked; they're actually using STEM to track it for their demo. And the feedback that it gives when you pull the trigger is really powerful and very immediate. And it's a whole lot of fun to use. And I was totally surprised to learn that they thought they could hit a $60 price point for a complete gun controller with this really great kick that could be useful for virtual reality, and I was stoked about that. I was like, sign me up.

[00:36:42.819] Kent Bye: Yeah, I did have a chance to try that demo as well. And, you know, they actually had STEM controllers on the head and on the gun, and so the gun was actually being tracked. So I don't know if that would be an additional cost, to actually track the gun in the VR experience with a STEM controller, but that made all the difference in the world, and it was extremely immersive with that kick. And they have a mil-spec version of that. I did an interview with the founder, Kyle Monte, and he said that if they made it at that full spec, it wouldn't be any fun to play at all. You know, it would just be too much. And so they dialed it down a lot for the consumer version. And they're also abstracting out these linear motors to be able to create baseball bats and tennis rackets and stuff like that, which I think will be really compelling for giving that type of force feedback, which you can really feel, and your body really accepts it. Even the subtle feedback of the rumble within the Sixense demo with their STEM controllers is enough; it doesn't have to be super precise in the actual force. If a Star Wars droid was actually shooting lasers at you, I'm sure it would be a little bit more forceful. But even that little rumble makes a world of difference. And I think their approach to that force feedback is going to be really compelling for creating that sense of immersion in VR.

[00:38:01.564] Ben Lang: Yeah, haptics are extremely important. It's one thing to be able to reach out and see as though you're touching something, but when you get that feedback, that little rumble in the Sixense demo, let's say, when the laser strikes the lightsaber, it connects the visuals to the tactile sensation. And one company that's doing really interesting stuff in this space, I don't know if you got to see them at GDC, is Tactical Haptics. They have this really interesting thing they call the Reactive Grip controller, which has these sliders on it that can move up and down to simulate, say, the heft of a sword in your hand, or hitting something, or stabbing something. And it's really quite compelling. It creates a much different sensation than a mere rumble; it can create a sense of direction in the forces that you're feeling. And even for simple things, like reaching out and grabbing a lever and pulling it, with that nice clack, clack, clack to it, it adds so much to see it, hear the sound, and feel each clack as you pull that lever. And I'm really excited to see where that kind of technology goes.

[00:39:04.330] Kent Bye: Yeah. And my experience of the Tactical Haptics demo was that the scene that gave me the most immersion was the low-poly one, just throwing cubes around. And there was something about the abstract nature of it that allowed me to accept the haptics that they're using a little bit more. The more realistic it was, the more there started to be a little bit of a disconnect. And I think this is true for all VR: the more realistic the scene is, the more realistic you expect all of the haptics and everything to be. And so there was a little bit of a disconnect in my mind, of not quite fully believing it, but when they abstracted it into this low-poly scene, I totally believed in it. It was really super compelling to throw these cubes and blocks around.

[00:39:42.503] Ben Lang: Yeah. I think that at the moment, you know, we still have the uncanny valley problem. So although you can get some really amazing, near-photorealistic imagery out of today's computers, and that can do just incredible things for immersion, once you start reaching out and touching, if the physics aren't there, if it doesn't feel like you can naturally grab something, it starts to break down. And actually, I went to a session by a Sony studios artist who had worked on that London Heist demo, and he had suggested that you should go kind of the arted-up route a little bit: the less lifelike, you know, your Pixar style, where the rendering is real and the materials look real, but the people are not necessarily human-shaped, and the environment around you doesn't necessarily look exactly like you would expect real life to look. He suggested that route for a number of reasons, but I think one of the big ones is avoiding that uncanny valley space where you expect all of these perfectly real things. Whereas if you don't go that route, you can have a kind of suspension of disbelief in terms of the exact interactions you would expect from a space or an experience like that.

[00:40:57.108] Kent Bye: Yeah, totally. I totally agree, and that's been my experience as well. And Lucky's Tale, which I got to experience at GDC for the first time, is probably the demo that takes that approach: low-poly, kind of cartoonish. But they're also doing this really fascinating thing with the third-person perspective. And one of the things that Paul Bettner said is that within the distance from the end of your outstretched hand to your face, you have more neurons to process information that's that close to your body. And Lucky's Tale is really taking that phenomenon and exploiting it. It's like this diorama that you're looking at, but it's moving and flowing, and all the gameplay is happening there, and it's all really dynamic, with all the different depth that you can see. And it's just the same with the Crescent Bay demos: there's one with the paper town, this small diorama that you're looking around, and that has the biggest stereoscopic effect when you move your head around. It's like this little toy, but it's alive. And, you know, people are recreating real life in VR, but something like Lucky's Tale is starting to be like, oh, this is something that you can't do anywhere else other than VR. It's completely unique to VR, and to me, it's really starting to define the language of what the platform can do.

[00:42:10.151] Ben Lang: Yeah. And there are so many different things that are really only going to work well in VR. I mean, I've found already that I prefer doing telecommunication in VR rather than Skype, where you're looking at a flat person on a flat screen. Seeing them in a virtual world and watching their head move, you get this body language that you can't get through a computer. And it just adds so much to the experience. And I think that, going forward, there are going to be so many places beyond just gaming where virtual reality ends up playing a role. For instance, in the SteamVR demos, one of the things they were showing was a Google Earth demo where you're standing, and you feel like you're a giant, and it's mapping the Google Earth data, the 3D data of a city, right at your feet. And you can lean in and look at all these buildings that are mapped in 3D. And I would just love to be able to say, all right, I've got a big trip, I've got a lot of stops, I'm going to use these controllers to input my destinations, and I want to see a visualization of that at my feet. And I can lean down and maybe grab the route and adjust it to go somewhere else that I might need to go. And I just can't wait for those kinds of solutions, where it's not even a game, but it is a more immersive way to see something. Because people learn in many different ways. I happen to be a very visual and audio learner. And to be able to get down on my knees and say, OK, that's what the entrance to that building looks like when I need to find it, that sticks in my brain in such a different way than reading about what address number I should stop at, or seeing even just a little picture of it on a 2D screen.

[00:43:43.587] Kent Bye: Yeah. And one of the more compelling VR experience demos that I saw by an indie developer there was by John Dewar, who used to be at Kite & Lightning and has now started his own firm; I think it's called Transcendent Media. He had an experience where you see a history of all the different planes, from the Wright Brothers on, and you can see a plane fly right by your face going at supersonic speeds. Being able to see the scale of things is really compelling. And, you know, if you look at all the Oculus demos that they're showing, they really play with scale, like having these huge creatures or huge robots or dinosaurs coming up on you. And feeling that sense of scale is something that's really unique to VR. But I don't know if you saw any other indie demos or stuff that really stuck out, from developers that were showing on Gear VR, or at the VR mixer, where there were dozens of different indie VR developers showing their latest games.

[00:44:38.674] Ben Lang: Well, for one, I actually saw ConVRge, which you may know is a social virtual reality platform. They were doing something really cool with the mixer. They had a projector projecting a window into the ConVRge virtual world, where you could see virtual people walking around. And then they had a camera at the VR mixer that was looking at all the people there and projecting that into the virtual space. So it was like this window into that virtual world, which I thought was super interesting. You could wave to people and they could wave back, you know, virtual to real. That was really cool. Also, one of the very neat things that I saw, not indie but very much in development right now, was some of the WebVR stuff. The folks at Mozilla on their WebVR team are making really amazing progress on VR-enabling the web and making virtual reality a really simple-to-use experience on the web, one that can add a lot to content that we already enjoy. And they're doing really cool stuff to make it responsive, such that you could have a Wikipedia page that just looks like a Wikipedia page, but when you throw on a headset, it transforms into something that is much more immersive.
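
To make that "responsive" idea concrete, here is a minimal sketch of the pattern, written against the WebVR API as it was later standardized (navigator.getVRDisplays); the render functions are hypothetical stand-ins for whatever a given page actually does in each mode, not anything from Mozilla's implementation.

```typescript
// Sketch of "responsive VR": same content, two presentations.
// renderFlatPage / renderImmersiveScene are hypothetical stand-ins.
function renderFlatPage(): void {
  /* ordinary 2D article layout */
}
function renderImmersiveScene(display: unknown): void {
  /* WebGL render loop driven by the headset's pose */
}

async function chooseRenderMode(canvas: HTMLCanvasElement): Promise<void> {
  // Feature-detect: navigator.getVRDisplays() is the WebVR 1.0 entry
  // point; browsers without it just get the normal flat page.
  const nav = navigator as any;
  if (typeof nav.getVRDisplays !== "function") {
    renderFlatPage();
    return;
  }

  const displays: any[] = await nav.getVRDisplays();
  if (displays.length === 0) {
    renderFlatPage(); // WebVR supported, but no headset connected
    return;
  }

  // Presenting requires a user gesture in practice, so assume this is
  // called from a click handler on an "Enter VR" button.
  await displays[0].requestPresent([{ source: canvas }]);
  renderImmersiveScene(displays[0]);
}
```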

[00:45:51.216] Kent Bye: Yeah, I agree. With the Mozilla stuff, I was actually really surprised and stunned to see how performant it was. I was expecting it to be way more laggy, with more latency, but some of the experiences they're doing are super compelling, and a lot of the user interfaces and stuff that they'll be dropping soon, I think, are going to be pretty huge, especially when you think about the internet as kind of a 2D metaverse and how they're putting the tools in place to really turn it into a 3D metaverse. It's super, super exciting. I did interviews both with Josh Carpenter and with one of the leads on Chrome, Brandon Jones. Some of the stuff that they're doing, well, they're actually going to be re-architecting the browser to make it more efficient, because there's a lot of stuff in terms of DOM processing that you just don't need to do when you're doing real-time interactive 3D. Going at the guts of these browsers and starting to re-architect them so that you really get that latency down to an acceptable level is, to me, one of the most exciting things that I saw there, just because of what the implications will be. Especially if you think about things like cat videos or GIFs. These are not things that you want to download to your hard drive. And that's the thing right now: if you want to experience anything in VR, you have to ask, do I have room for this 500-megabyte video file that I may watch once and then want to delete? It'd be much easier to just stream that experience and then send people a link, without having to worry about filling up your hard drive.

[00:47:20.354] Ben Lang: Yeah, absolutely. There are lots of experiences on the web; you know, the web exists today for that reason. We want to be able to quickly navigate from one webpage to the next, not say, okay, I'm going to download this page that I want to read, then I'm going to download that page. It's really as quick as clicking and seeing. And there are lots of small experiences where, you're right, you may want to just watch them once, and maybe it's not cool enough that you want to spend your time downloading a file in order to do that. There are neat little things on the web, like little web games that probably aren't good enough to bother downloading, but if you can just click and play in five seconds, then that's something you might do, whereas if you knew you had to spend 10 or 15 minutes downloading it and then figuring out how to get it to work with your headset, you might not. It opens up this new space for experiences that wouldn't otherwise get much play.

[00:48:09.090] Kent Bye: Yeah, and I did an interview with one of the original team members of the Google Cardboard and was asking about straps, and he basically said that the Cardboard was designed to just be put up to your face and held there. They don't want you putting a strap on, because if you put a strap on a Cardboard, then you're going to see how unperformant the phone is, and it's going to be a bad experience. So there's something about holding it there with your hands that acts as a governor, so that you're not swinging your head around. But stuff like the Cardboard and the Gear VR, I think, is going to be super great for these kinds of ephemeral web experiences, where maybe you just put it up to your face and experience it, but it's not a fully immersive thing where you have to have all this gear to dive into VR.

[00:48:52.356] Ben Lang: Yeah, Gear VR is already great for that. One of the things that I've found is a particularly great experience for passing around is 360 photos. You can actually take the straps off of Gear VR really easily and turn it into exactly that, just a thing that you can hold. And you can walk that around and say, hey, check out this awesome 360 photo, and just have somebody else take it and put it up to their face. The photo will come up, they can spin around and go, whoa, and then walk over to another person and say, you have to look at this one. The pass-off is five seconds, and then you're looking at it again, versus: sit down, look straight ahead, let me get this strap over your head, let me put your headphones on, let me calibrate for your height or your IPD, et cetera. It's just a quick experience. Certain experiences fit in that quick-sharing niche, where you don't want to hook up a mocap suit to your body just to see something quickly like that.

[00:49:45.383] Kent Bye: Right, totally. Some of the other tech that was on the floor, but not necessarily in any virtual reality HMDs yet, is the eye tracking from Tobii. I had a chance to talk to somebody there, and they said that they're working with virtual reality HMD manufacturers and are probably going to be announcing some more stuff in Q2 of this year. Eye tracking in VR is going to be, I think, pretty huge. Also, Fove was part of the River incubator, and they were showing a little demo at the VR mixer, in the demo room in the basement. So I don't know if you had a chance to check out Fove at all, and what some of your thoughts on it are.

[00:50:21.595] Ben Lang: Yeah, I've seen Fove, actually back at CES. The thing about eye tracking is that if it can be done well, then with that one feature you can add a bunch of things that are going to be useful for virtual reality. Let's just run down a quick list here. Now, I'm not so enthusiastic about some of the demos that Fove is showing, like looking at someone and that shoots them. That's just silly to me, but it is just an example at this point, and that's not why eye tracking is going to be useful. It's going to be useful for things like adding depth of field, where when you have an object that's really close to your face and you look at it, it's in focus and everything else is blurry, and when you look past it, it becomes blurry and everything else comes into focus. That is how the eyes actually work, and that's what you expect to see, and you'd be able to add that sort of thing if you have very fast eye tracking. On top of that, knowing where the user is looking is very useful for a developer. It could be for interactive purposes, like looking at the object or the menu item you want to select. And it can be used invisibly, so a developer can say, okay, when the user looks here, I want this thing to happen so that they don't miss it. On top of that, you could do better chromatic aberration correction if you know where the user is looking. Right now, the chromatic aberration correction, which corrects for the colors that shift as they come through the lens, is really only designed for when you're looking straight through the lens. If you turn your eye a little bit within the lens, it doesn't work quite as well, and you could adjust for that to make it more accurate. And probably the biggest thing at the technical level is the potential to do foveated rendering, which is basically dynamically reducing the quality of the image outside of the very center of your current view. We only see very sharply in a tiny, tiny cone in the center of our vision; everything else is actually quite blurry, and so much of the information that you know is there is just inferred and constructed by your brain and doesn't actually rely on your eyes that much. So being able to do foveated rendering is expected to really reduce the processing cost of running these simulations at the high frame rates that the headset manufacturers are asking for. You do eye tracking well, and you can get all of that out of it. That is why it is super exciting.
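
As a rough illustration of the foveated rendering idea, here's a small sketch that maps each screen tile's angular distance from the tracked gaze point to a render-resolution scale. All the numbers here, the tile size, falloff angles, and scale factors, are made-up illustrative values, not parameters from any real eye-tracked headset.

```typescript
// Sketch of gaze-driven foveated rendering: tiles near the gaze point
// render at full resolution; tiles further out render progressively
// cheaper. Thresholds are illustrative, not from any shipping system.

interface GazeSample {
  x: number; // gaze position on the panel, in pixels
  y: number;
}

// Degrees of visual angle per pixel; depends on the panel and optics,
// assumed constant here for simplicity.
const DEG_PER_PIXEL = 0.02;

/** Resolution scale (1.0 = full res) for a tile centered at (cx, cy). */
function tileResolutionScale(gaze: GazeSample, cx: number, cy: number): number {
  const eccentricityDeg =
    Math.hypot(cx - gaze.x, cy - gaze.y) * DEG_PER_PIXEL;

  // The fovea covers only a couple of degrees; acuity falls off fast.
  if (eccentricityDeg < 2) return 1.0; // foveal: full resolution
  if (eccentricityDeg < 8) return 0.5; // parafoveal: half resolution
  return 0.25;                         // periphery: quarter resolution
}

// Example: split a hypothetical 2160x1200 panel into 120px tiles and
// compute per-tile scales each frame from the latest tracker sample.
function planFrame(gaze: GazeSample): number[][] {
  const tiles: number[][] = [];
  for (let ty = 0; ty < 1200 / 120; ty++) {
    const row: number[] = [];
    for (let tx = 0; tx < 2160 / 120; tx++) {
      row.push(tileResolutionScale(gaze, tx * 120 + 60, ty * 120 + 60));
    }
    tiles.push(row);
  }
  return tiles;
}
```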

[00:52:49.804] Kent Bye: Yeah, I wasn't such a big fan of looking at things to shoot them either. But once they get the latency down and do all those other things, I think it's going to be huge within VR as well. So in terms of augmented reality, there were a few things there. Google was showing some stuff with Project Tango. Qualcomm had Vuforia. And then, while they weren't actually at GDC, Meta had a whole AR hackathon that I dropped by and got a chance to check out as well. But I'm curious what you were seeing in terms of augmented reality there on the floor, and if there was anything else that you saw pop up.

[00:53:23.456] Ben Lang: I didn't see much that was too compelling. We're still kind of waiting to see what the big guys are doing, like HoloLens and Magic Leap, if they ever reveal what's going on there. I didn't really see a lot of compelling augmented reality stuff on the floor. Project Tango is certainly very interesting. If you have something like Tango built into Gear VR and phones in the future, as Google hopes it will be, that opens the door for positional tracking on there, and a standard way to do it. But to my understanding, at the moment Tango just isn't quite as accurate as it needs to be. Sub-millimeter is kind of the buzzword you'll hear when it comes to tracking, and that means tracking capabilities that are accurate to less than a millimeter. That's what you need to have a really solid augmented reality experience, especially on your head, because if the tracking is bumping around by a millimeter when you're not actually moving, you're going to start to get tiny but noticeable movement on your head that you're not actually making, and you always want to avoid that kind of thing. But I didn't have a chance to check out the Meta hackathon, so maybe you can talk a little bit about that.

[00:54:20.777] Kent Bye: Yeah, they had probably about 80 different developers there. I just caught the very beginning and then I had to fly out, but I got a chance to put it on. The field of view is very small, maybe 15 or 20 degrees; it was just sort of overlaying a field in front of you. As for the latency, I had a chance to talk to Soren, the chief product officer there at Meta, and it's probably around 60 milliseconds or so. They're trying to get it lower, but there was a delay when you'd stick your fingers out to press buttons and interact with scenes. A lot of the early demos were very rudimentary, without great graphics, and they didn't seem to really integrate with the actual environment at all. It was just kind of projecting a screen out in front of you and letting you interact with your hands, as opposed to engaging with the environment around you. So it seems like they're going to be aiming more towards the business use case. It's not something you're going to be wandering around with; it's going to be tethered to a computer, and maybe you're in a meeting room using it to do more data analysis and stuff like that. It's not a use case of wandering around like Google Glass, maybe eventually, but at first it's more enterprise-focused in that way. The thing about the Meta team that I think is really interesting is that they've raised something like $23 million. They have Steve Mann, who is like the father of wearable computing and was doing it back in the '70s, and he's with them. They've got a couple of perceptual psychologists on leave from academia who are also pretty big in the field. So they've got a great team. But the tech didn't seem to be where I would want it to be relative to virtual reality. It still feels like early, early, early days there.

[00:56:16.509] Ben Lang: Yeah, it has been for a long time. I think the fundamental reason is that so much of what they're trying to do is based on computer vision, and computer vision is just not a solved problem. It's not even a problem where I've seen a solution that works most of the time. It depends upon the application; sure, there are simple applications that work most of the time, but when you want the really advanced stuff, like, okay, we need to know the shape of this room, what does it look like, how can we put augmented stuff into it, that is such a challenging problem. Lots of different companies are working on it, but like you said, I haven't seen something that is quite to my satisfaction yet, where I would want it to be. It does feel like VR, of course, is further along at this point in terms of the experience. The hope, of course, is that they'll get there and VR and AR will kind of meld together. But at this point, VR seems like it is ready to take a leap into the consumer market, whereas AR is still in the R&D phase.

[00:57:12.379] Kent Bye: Yeah, and I had a chance to do an interview with the creative technical director of Magic Leap. At this point, they're looking to the virtual reality space for where the innovation is going to happen with all this user interaction. So it does feel like a lot of that stuff is going to be figured out in VR and then transferred over to AR. The other big presence I feel is worth calling out is OSVR, because they had everything from VisiSonics, the audio solution that Oculus has licensed, to different HMD makers like VR Union. I had a chance to talk to Yuval, and the whole idea is having sort of one SDK to rule them all: you integrate one SDK, and then you don't have to worry about integrating each input device and HMD individually. Creating one binary is the vision, and I think that's exciting to hear about. I also heard rumblings that Unity is actually doing something very similar. I don't know if it's in conjunction with OSVR, but in the upcoming Unity release, I think they're also going to be doing a lot of these low-level integrations with different input devices and have it baked right into the engine. But yeah, OSVR was certainly a big presence there at the Razer booth.

[00:58:28.476] Ben Lang: Yeah, it is totally a great vision: write once, run everywhere. I think that people are saying, oh, well, Oculus hasn't jumped in, or Valve hasn't jumped in, or Sony hasn't jumped in. I don't think that's necessarily a problem, as long as OSVR does their job right and creates a system that just works, is easy to use, and remains performant. Because developers may say, okay, I'm going to write for Steam, or I'm going to write for Oculus, and then adapting to 15 or 50 other things is as easy as writing for one other SDK. So you have these smaller platforms that get the benefit of being together collectively with OSVR, such that they can all be supported, rather than a developer saying, okay, I'm going to support the top three or four headsets and the top three or four input devices, and everybody else kind of gets cut out just because they don't have the market size. If a developer says, okay, I'm going to take the top three or four, and OSVR collectively is part of that in terms of deployment size, that makes it a pretty easy decision for the developer. And it means that people with hardware that may be less widespread still get the benefit of support for the particular thing that they have, which I think is great. It would be good to see this all coalesce into a single thing, just for ease of use for developers. But obviously, Oculus and others are still working on their own SDK stuff, and I don't think they want to be slowed down. I think they want to be able to continue to experiment and have full control over that for the time being. But once they really finalize things, it may become a more compelling idea to become part of that consortium.
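
For a sense of what write once, run everywhere means in code, here's a hypothetical sketch of the pattern: a thin device-abstraction layer that a game writes against, with per-vendor plugins adapting real SDKs behind it. None of these interfaces are OSVR's actual API; they're just an illustration of why one binary can end up supporting hardware it was never explicitly written for.

```typescript
// Hypothetical device-abstraction layer in the spirit of OSVR.
// The game codes against small interfaces; vendor plugins register
// concrete devices. These names are illustrative, not OSVR's real API.

interface Pose {
  position: [number, number, number];             // meters
  orientation: [number, number, number, number];  // quaternion (x, y, z, w)
}

interface HeadMountedDisplay {
  name: string;
  getHeadPose(): Pose;
}

interface TrackedController {
  name: string;
  getPose(): Pose;
  isTriggerPressed(): boolean;
}

// Vendor plugins add their devices here at startup; the game never
// needs to know which SDK is underneath.
const registry = {
  hmds: [] as HeadMountedDisplay[],
  controllers: [] as TrackedController[],
};

// The game loop touches only the interfaces, so supporting a new
// headset means writing one plugin, not patching every game.
function gameTick(render: (head: Pose) => void): void {
  const hmd = registry.hmds[0];
  if (!hmd) return;
  for (const pad of registry.controllers) {
    if (pad.isTriggerPressed()) {
      // react to input using pad.getPose(), regardless of vendor
    }
  }
  render(hmd.getHeadPose());
}
```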

[01:00:10.984] Kent Bye: Yeah. And the other big thing that was happening this week was the Khronos Group, which is this cross-industry collaboration trying to come up with standards at both the hardware and software API level. There were announcements from the Immersive Technology Alliance, and the Khronos Group announced Vulkan, which is sort of the next iteration of OpenGL. That was exciting to see, just in terms of the collaboration happening across the industry to allow that innovation to happen, but also to start to build standards that everybody can work around, so that they're getting everything from the silicon layer all the way up to the software layer with these systems. There seems to be a lot of cooperation happening that I see represented by the Khronos Group.

[01:00:56.125] Ben Lang: Yeah, and it's a good thing in the end, because they're focusing on high-performance computing, and that's really what we need for virtual reality. With this reemergence of VR, I think the industry as a whole is going to have to pay better attention to performance and optimization, and not just throw extra GPUs at the problem, because that's not really an option anymore: throwing extra GPUs at it potentially adds latency to the experience, and we can't really have that. Not many years ago, all the new developments were about, okay, look at how much better this experience can look now that we have more power. I think a lot of the conversation will shift to: look at how much faster this experience can be now that we have more power.

[01:01:41.746] Kent Bye: Right, yeah. And the final big announcement that I would say came out of GDC is Unity, Unreal Engine 4, and Source 2 all going to a free model. Each one had different conditions on what "free" means, but just having indie developers get access to the tools to make virtual reality experiences, I think, is going to be a huge deal for the future of VR moving forward.

[01:02:06.285] Ben Lang: Absolutely. Because if we've seen anything from the virtual reality industry, it's the fact that it is a grassroots movement of people who believe in this technology that made it happen from the get-go. There were big companies, many of which are now working in the virtual reality space, that could have done exactly what Oculus did from the start, but they were jaded by the failure of virtual reality early on. And it took people like you and me and backers of the Oculus Kickstarter to come together collectively and say, we want this to happen, we believe in this happening, and there you go. Oculus in the early days didn't have any special invention or magic innovation that enabled this stuff. They just combined the right hardware from off-the-shelf parts that already existed, a screen from a tablet, a tracker that was already out there, in their very early prototypes. There was no magic innovation. It was just the recognition that the time is now and no one has done it, so let's do it. The thing is, that experimentation and that belief in something new comes from people and organizations that are small and agile, not from massive AAA game developers, who aren't going to spend their time and money doing something that's just an experiment. You need people who can say, I'm going to spend two weeks or a month just exploring this prototype idea and seeing how it works in virtual reality. And I think the indies, now with access to these game engines, which natively support Oculus and other VR platforms like Gear VR, are going to be the ones coming up with totally unique ideas. It really opens the door and lowers the barrier to entry to do that. And it's such an exciting thing, because these are really powerful game engines. If I weren't so busy doing what I'm doing, I would totally download them and just start playing around, and I'm not a developer. But for anybody out there who's listening to this now: if I were to do that, I would probably start by trying to make a virtual reality pinball simulator game. So somebody do that. The idea is there. Take it.

[01:04:08.337] Kent Bye: Yeah, I actually saw somebody who had an interface for pinball.

[01:04:12.399] Ben Lang: Yeah, I want exactly something like that. There's a little thing that just has the buttons that give a nice click, you know, a nice click of the bumper, and then just the spring plunger to pull back and let go. You just need that part, and then you can make the rest virtual.

[01:04:24.443] Kent Bye: Yeah, I do agree. I get the sense that AAA shops are going to take a step back, because it's a big decision to invest that much money into their pipeline to develop VR games. So the VR space is going to be ruled by indies for quite a while, and now's the time to jump in and experiment and really start to figure out the language of VR. Even in the sense of VR storytelling, with something like Faceshift to do facial animations and Mixamo for characters, feeding into Unity, all the tools are there to basically start creating your own cinematic and interactive experiences and start playing around. I did an interview with the narrative director who worked as a contractor with nDreams on The Assembly, and just hearing him talk about the experiments of storytelling in VR, there's such an open space for people to experiment with. It doesn't take million-dollar budgets or even Oculus Story Studio levels of visual fidelity; it can be very simple and low-poly. Now's the time; all the seeds have been planted for you to harvest and grow the future of VR. And are there any other things that you saw behind closed doors or in closed meetings? I know at GDC there's a lot of stuff happening off the showroom floor, but I don't know if there's anything else that was really striking to you in those terms.

[01:05:46.429] Ben Lang: Well, there was really Morpheus, which we talked about, which I thought was just a really fun, awesome, interactive experience. Game designers are going to have a blast figuring out really interesting new ways to tell stories and make players part of those stories. And then there was the Weta experience, the Hobbit experience, which was shown a bit less, but was really quite phenomenal. Epic Games worked with Weta in Unreal Engine 4 to create this experience, which is basically a scene straight out of the movie, where Smaug is revealed. And it is just beautiful; the graphics are incredible. I went to a session to learn about how they did optimization on this experience, and the tricks they pull and the techniques they came up with to optimize what they wanted to do just blow me away. It's creative, it's technical, and it's really quite impressive. In the end, the experience is something really special: getting up close to this massive dragon that was terrifying in the film and is terrifying there in the experience, and feeling like you're part of that very cherished universe, is really cool.

[01:06:59.499] Kent Bye: Yeah, definitely Unreal Engine for that visual fidelity, with all the different optimizations they made, for sure. And Unity is great for more entry-level stuff. But I definitely see that the Oculus Story Studio, and Ikrima from Kite and Lightning, created whole scenes in Unreal Engine, and they created a plug-in to output that as just a movie file to put onto the Gear VR. So creating these cinematic VR experiences in Unreal Engine seems to be where Oculus is putting a lot of its energy. And then, in terms of games and interactivity, I'd imagine that Unity is where a lot of that innovation is going to be happening. Well, Ben, thank you so much for joining me here today and debriefing. It was fun to go over all the stuff that we saw there. To me, this GDC is a key moment in the history of virtual reality, with so many announcements and so much news and technology revealed. I'm just really excited for the future of VR.

[01:07:59.329] Ben Lang: Yeah, I can't wait for the gaming world at large to get their hands on something like SteamVR. Because GDC is this world of developers, and people outside who haven't had a chance to see this may be looking in and wondering, hmm, that seems like it'd be cool. But I think that for your average gamer, it's really going to blow them away.

Kent Bye: Awesome. Well, thanks so much.

Ben Lang: Yeah, thanks for having me, Kent.
