#425: Highlights from the SIGGRAPH VR Village with Denise Quesnel

There was a VR Village again this year at the SIGGRAPH conference, where a lot of experimental interactive technologies were on display. Virtual reality was the common thread, but there were other immersive technologies like prototype haptic suits and devices, along with a number of collaborative social games and demos including redirected walking, eye tracking, facial retargeting, and augmented reality. There was also an entire VR Story Lab with different narrative experiences being shown. I had a chance to talk to program chair Denise Quesnel about some of the content themes and technologies at this year's SIGGRAPH VR Village, as well as the different technology trends that were emerging.


Here’s the call for submissions for SIGGRAPH 2017.

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to The Voices of VR Podcast. So this year was the first year that I was able to make it out to the SIGGRAPH conference. It's the Special Interest Group on Computer Graphics and Interactive Techniques, and SIGGRAPH is one of those conferences that's been going on since 1974. I've met a lot of people who've been involved with VR for some time who had some of their first VR experiences at SIGGRAPH in the 90s, and so SIGGRAPH is a conference that is always kind of on the bleeding edge of what is possible with computer graphics. Last year was the first year that they had an entire VR Village. The VR Village returned this year, and I had a chance to talk to the program chair, Denise Quesnel, about some of the different things that she's really looking for in some of these forward-looking interactive technologies. And so we'll be talking about the VR Village and what was included, as well as some of the deeper intentions for what types of experiences and technologies they wanted to include within the VR Village. So that's what we'll be covering on today's episode of the Voices of VR podcast. But first, a quick word from our sponsor. This is a paid sponsored ad by the Intel Core i7 processor. If you're going to be playing the best VR experiences, then you're going to need a high-end PC. So Intel asked me to talk about my process for why I decided to go with the Intel Core i7 processor. I figured that the computational resources needed for VR are only going to get bigger. I researched online, compared CPU benchmark scores, and read reviews over at Amazon and Newegg. What I found is that the i7 is the best of what's out there today. So future-proof your VR PC and go with the Intel Core i7 processor. So this interview with Denise happened on the last day of SIGGRAPH, on July 28th in Anaheim, California.
And apologies for all the background noise, as they were starting to break down the show. So with that, let's go ahead and dive right in.

[00:02:12.550] Denise Quesnel: So I'm Denise Quesnel, and I'm the VR Village program chair for 2016 and 2017. I also did this in 2015, so that makes three years in a row. What VR Village does at SIGGRAPH is basically what it sounds like: it brings virtual and immersive realities to the convention hall here. So what we aren't is an exhibit or a trade show. What we do try and do is give the content creators who work on these projects a chance to be with the attendees and explain one-on-one what is going on in that experience. And we try and bring in experiences that actually are rather unique, so things that have haptics or technologies that haven't been released yet, and likewise a lot of content. A main focus of this year's conference was on narrative.

[00:02:56.720] Kent Bye: Great. So I've been to IEEE VR a couple of times, and this is my first SIGGRAPH. Both conferences have demos with a lot of cutting-edge prototype technologies. So how do you tell the difference between the types of demos shown here and those at other academic conferences? What makes the SIGGRAPH aspect unique?

[00:03:15.940] Denise Quesnel: SIGGRAPH is a funny one because it starts to blend together industry with academia and also artists. I mean, at everything I've gone to that's been academic in nature, sometimes the VR demos are super cutting-edge, and those contributors can have a really difficult time explaining in layman's terms to attendees what exactly is happening. A lot of the time it's presented almost like a technical paper, but wrapped up into a demo for people to try for themselves. And here we try and go for a high production quality, something similar to what you would see at the Computer Animation Festival's Electronic Theatre over at SIGGRAPH, so best in the world. But also things that include student work or research work, as well as industry work, so studios participate, and artist work. So things are usually brought into the show through the jury process. When we did VR Village last year, it was all curated, every single piece. But this year we went through a submission process. So what that meant was there was a call for submissions, and people had an opportunity to show us their work. And we had two juries, actually. We had a general jury review everything just to see, yep, this is definitely VR, or this is augmented reality, and it belongs in VR Village, but we need to have a finer look at it. Let's get a second jury on it. And so we had a second group of jurors review it for things like the potential for motion sickness, or the immersive qualities, the innovation, the creativity. And they actually tried quite a few of these experiences themselves, except for the ones that were logistically difficult or location-specific.

[00:04:42.812] Kent Bye: Yeah, so just in terms of the overall VR industry, there's a couple of trends that I saw with projects here. Probably the most experimental prototype experiences that were not available commercially were the haptic devices, whether haptic suits or Oculus Research showing their HapticWave. So maybe you could talk about how haptics fits into SIGGRAPH here in the VR Village.

[00:05:04.581] Denise Quesnel: I think haptics fits very well into embodiment. In terms of how haptics can work, in terms of its appropriateness or even effectiveness, it's in very early stages, so that's hard to evaluate. I think a lot of the haptics isn't quite effective yet, if the goal is to aid in presence, or to aid in that overall sense that yes, you are there, you're embodied, your body is part of this experience, and what you do has a ramification on the story or on the outcome of that experience. At the same time, it's come so far. We've moved quite a ways away from the kind of cheesy haptic stuff, like 4D theatre, where you've got those sticks and jets of air that blow on your legs and poke into your back. I think most people have probably experienced something like that, and that's not what we're talking about. We're talking about diving much more into subtle haptics. So one of the pieces is the Synesthesia Suit, which has 24 haptic actuators placed all over the legs, arms, and torso, and what happens is you get to play a game. In this case, I played Rez. They had another experience, so I can't speak to that one, but Rez was really cool with the suit, because at one point, this monster that I was targeting actually jumped behind me, and I would have never known. Even though, of course, it's 360, it's really hard to cue someone; it's really hard to know where to look. Even though maybe my attention should have followed the monster, it didn't. I was too concentrated on looking in the direction I was going. What happened was the actuators started going off in the back of my legs and my torso, so I actually turned my body around to look, and then, oh, it's behind me, and I should probably focus over there.
So I think it can be used in a really, really interesting way, not just for driving attention around a scene. We've also seen things like enhancing the audio with haptics, like a SubPac, and a guided meditation experience using those subwoofer waves to provide low-frequency sound waves that have actually been shown to aid in reaching mental states of calmness. So lots of unique things here, as well as haptic controllers. The controller has been a really, really interesting thing to see emerging over the past few years, and there's a few examples of those here.

[00:07:05.743] Kent Bye: Yeah, I think that for me, haptics is still a little bit in the uncanny valley, where it's not quite realistic and still pretty low fidelity. I actually really enjoyed the Synesthesia Suit, even though it was pretty contrived in the sense that I don't think it was actually mimicking how I would feel in real life, but it was in a very stylized virtual environment. The game was actually a lot of fun with Rez Infinite, and I found myself really being motivated to try to win the game so that I could get my entire body buzzed. It was just sort of a thrill and a reward that once I did something in the game, it would actually stimulate my entire body.

[00:07:43.117] Denise Quesnel: It's interesting that you use the term reward, because this morning we had a presentation, two presentations actually. One was by Nationwide Children's Hospital together with Ohio State University, who presented on the reward system as well as different gamification techniques you can use for acute pain and chronic pain disorders. Things like having blood drawn from children: children can develop this horrific PTSD, especially hemophiliac children, and they introduced game concepts that they know act both as a distractor and as an aid, providing everything from breathing techniques that help the child, and even adults, calm down, to the ability to distract them. And then after that presentation, Dr. Walter Greenleaf, who I think you may have actually interviewed, he's a long, long, long time pioneer with 30-plus years of research in neuroscience and virtual reality, talked quite at length about the reward system and how task-driven, or cognitive task-driven, virtual reality is going to be the next thing. That is going to be the thing that brings us to where virtual reality isn't just in our lives, but is actually enhancing them, helping us be better people for ourselves, to one another, and also for our own well-being. So the presentations we've been having have been really, really enjoyable in terms of having these contributors talk about what their motivation was behind creating these and where we're actually looking five, ten years from now. So that's been really great.

[00:09:12.711] Kent Bye: Yeah, one of the highlights for me in terms of the demos was the redirected walking demo, where it looks like you're walking on a straight plank, but you have your hand against a wall that's actually curved, and the radius is small enough that I could still feel the curvature of the wall. When I did The Void, that was an example with a bigger radius where I could put my hand on the wall, a very similar effect of actually touching a physical wall while walking in a semicircle, but still my mind was thinking that I was walking straight. And so from a technical perspective, that's probably something that someone may not be able to do within the constraints of their room-scale space at home; you need a lot bigger space. But there's just this idea that you could be touching something physical and use redirected walking techniques to kind of walk forever in VR.

[00:09:58.536] Denise Quesnel: I think that's really fascinating, and it's really indicative of how the research that's been done in academia is starting to filter its way into application. What I mean by that is there's psychologists, neuroscience majors, people who are working in everything from game design to research and development, getting together, and they're taking all this research, all this work. I'm talking early pioneers like Dr. Walter Greenleaf, Jacki Morie, the folks over at USC's ICT, NASA. I was talking with JPL, and they were saying, you know, a bunch of our research is actually starting to show up: we're not just seeing it referenced in papers, we're actually seeing it implemented inside of people's designs, which was absolutely amazing. So this is a turning point. We're actually at this really interesting place where the research has become not just iterative, in terms of yes, we're writing about it, we're continuing to research and add on to the research that has been done, but we're seeing it being created inside of virtual and augmented reality experiences, not just the hardware or the design, but in terms of how do you navigate a space? How do you move within a space? If you're in a small space, which, to be fairly honest, is where most people will be doing virtual reality in today's world, we need to figure out ways for them not to get lost in their navigational system, to be able to explore a space without needing that actual structural physical environment and still feel like they are there at the same time.

[00:11:27.357] Kent Bye: Yeah, talking to different people, historically it seems like SIGGRAPH was showing technologies that may be two to five or ten years out, but the iteration seems to be a lot faster now. There's still a lot of prototype demos, but the turnaround to the point where some of these get productized is, I think, going to be perhaps a lot faster. I know, for example, NVIDIA was showing a foveated rendering pipeline with SMI technology to be able to actually use eye tracking and then change the rendering pipeline to implement foveated rendering. It's kind of a funny demo to see because you can't actually tell the difference that much between when it's turned on or off, but then they stop it and show, okay, look, the GPU processing load was this much lower, and they could also freeze it so you could see the differences. FOVE is another example of eye-tracking technology. So it seems like eye tracking is going to be kind of the next frontier in terms of the second-generation headsets. I'd expect that the Oculus and Vive will have these things more built in and integrated, beyond some of these stopgap kits with SMI eye-tracking technology and some of the other techniques for integrating the eyes within the workflow of VR.

[00:12:35.633] Denise Quesnel: It's really, really cool that you said it was not overtly apparent when they switched foveated rendering on and off. And I think that goes back to what I was saying about the haptics: it's subtle. We don't want it to be cheesy, you know, we don't want it to be overt; we want to build something here. And I think it's like any cognitive learning process or any body awareness: you have to start gently and slowly. You can't just hop into something and expect an immediate, yes, this is working for me. And I emphasize that because a lot of people are waiting for that aha moment in VR. They're waiting for that thing, whatever that thing is. And there's not going to be a thing. There's not going to be one brilliant, this-is-the-application or this-is-the-tool that's going to change for all of us how we use virtual reality. It's going to start gently and slowly. We're going to just be using virtual and augmented reality on a daily basis without realizing it. I like this conference in particular. One of the reasons I volunteer here is the inspiration, and being around people who can inspire you back. So the idea of being five, ten years ahead, and having it be subtle, and having these conversations. There is no major product launch inside of my hall, at least the hall where I am. I can't speak to the exhibits, because there usually is some sort of hardware or software announcement around SIGGRAPH, but here in the Experiences Hall, it really is aimed at the attendees, to inspire them as to what is coming up and what they can create. I was talking to someone at The New York Times who helped to launch, and I think he was on the forefront of launching, the VR initiative there at The New York Times. This was Graham, and he said, hey, you know what actually got me thinking about virtual reality in terms of its potential
for narrative? It was having been at SIGGRAPH in 2014 in Vancouver. And that year was particularly significant because all the meetup groups were there. I believe we had Silicon Valley VR up there, and we had the Vancouver VR meetup, which I'm one of the co-founders of. There's about five of us total who started a little Vancouver hackspace, and we blew up into a thousand people. Likewise, the same thing happened at VRLA, and they were up there basically bringing these brand-new ideas and inspiration into very casual meetings and sessions at SIGGRAPH, and we had amazing discussions and dialogues. So we just wanted to carry that momentum going into 2015 and 2016. I think it's grown so big it's hard to believe.

[00:14:59.320] Kent Bye: Yeah, and speaking of narrative, you had a whole story lab area with a number of different narrative experiences, including the Oculus Story Studio. And the highlight for me was seeing Google Spotlight Stories' Pearl by Patrick Osborne, which I thought was just a really beautiful experience; it premiered at Tribeca a couple of months ago. But yeah, maybe talk a bit about the inclusion of some of these more narrative experiences here.

[00:15:23.458] Denise Quesnel: The best way to explain the VR Story Lab is as a work in progress, and that's why we called it a lab. It was an interesting thing. I mean, last year we had a lot of film and game content, not so much console-style gaming but more mobile gaming, and this year we didn't want to really push that too much, knowing that Tribeca and Sundance already have that ground well covered. And I'm very, very familiar with their work and the people who organize those. And yet we kept getting a lot of requests: hey, is there a way to have a look at narrative? And I'm thinking, well, let's define what narrative really means. It's not simply the film-centric story that we know today. A narrative is anything; it's basically telling a process or a story in general. And so each of the pieces that was part of not just the VR Story Lab but all of VR Village has its own unique narrative that couldn't be discounted or disregarded as, no, no, no, this is not a story. They do. The HD1 has a story. The VR Story Lab in particular was meant for the things that are seated or standing, so not moving. You don't need to wear anything. You don't need to have any overcomplicated interaction. It's basically a much more casual atmosphere. And it was also a place to talk to the creators. So you mentioned Google Spotlight Stories' Pearl. We had the creators of Pearl, everyone from the technical directors, and at one point Patrick stopped by. We had the audio engineers, the creators of the music; everybody was part of the demonstration at some point. And I think there's some value there. In the early crowd, having access to the creators is huge. That's where that inspiration I was talking about really comes from; that's where it's generated. And then they get this moment of, aha, the suturing moment of, this is where I was trying to go with my story, or, I've had this idea and now I've seen it actually made tangible.
What we're trying to do with that space is make it part of the Computer Animation Festival. For those who have been to the Computer Animation Festival, they know it comprises several different elements. We've got the Electronic Theatre, which is the big screening of the world's best short films and trailer content, and we give out an award for best in show, which is actually an Academy Award qualifier for that team if they do go on to be nominated. Then we've also got the Daytime Selects, which showcase even more short films throughout the day, and we've got the Real-Time Live show. So in talking to next year's Computer Animation Festival chair, Pol Jeremias, we figured, well, we need to integrate immersive storytelling somehow. How are we going to do that? So this VR Story Lab was really the first shot in the dark to see what people want. We had them fill out a qualitative survey: what did you want? Would you mind waiting in line? Do you need to pre-book? Are you satisfied with your experience? What kind of environment did you want to experience these shorts in? And we'll go from there. We've actually had people all week long inputting the results, and it's absolutely stunning. I'm blown away, not just by the volume of people who went through there. We completed about 400 surveys every 90 minutes. No joke, that was steady throughout the five days. I just couldn't believe it. It's a lot of surveys to input, but we'll be using that data to figure out what people need to see here at SIGGRAPH specifically and how to incorporate that into the next programs.

[00:18:42.597] Kent Bye: Yeah, you mentioned Real-Time Live, and last year I was asked by Jason Jerald to be on the jury of Real-Time Live to look at the different experiences there, but this year I just had a chance to see a number of the different contestants. One of the winners used Unreal Engine with a number of different visual effects studios to do real-time, dynamic special effects. And I think that the emphasis on real-time graphics is a trend: there are things that you can render offline to get the highest quality, but VR seems to be pushing the entire industry towards doing things in real time. And there were a number of different VR experiences that also were participating in the Real-Time Live demos there. So I'm just curious how the VR Village and Real-Time Live play a part together in the whole show.

[00:19:26.404] Denise Quesnel: I remember you on the jury. I remember that. I was on the jury too, and I remember seeing you on the call, I believe. Your comments were fantastic, by the way. But this year, I was also on the jury. And you're right, VR is absolutely being pushed in the direction of real time. That's because it's an immersive experience, and really, how we experience reality is always real time. Real-Time Live actually had a pretty awesome collaboration with VR Village this year, and I think we're also sowing initial seeds for next year, because Real-Time Live managed to put a couple of their demonstrations here, and likewise we sent a couple of things that way that were more appropriate for the Real-Time Live show. One of those was the one that you mentioned: Epic, with Ninja Theory and Cubic Motion, that team, amazing. I think for the general population it's hard to understand from the video trailer exactly what's going on in there, but if I could summarize it, essentially what they're doing is facial capturing, body capturing, and also audio capturing an actor in real time, again, real time, and they're retargeting her onto a rig using IKinema, which is also a product that you can buy, and recording multiple takes. So the scene that they showed at Real-Time Live was an actor acting with herself over multiple takes, all real-time, and the audio element had to be delayed to incorporate the latency of how the retargeting process works when you lay it onto a rig. All the moving parts of trying to accomplish that, and I didn't even mention the graphics. The next step is going to be putting on a virtual reality headset, and for people to do just what you saw at the Real-Time Live show, but instead of an actor recording with herself, it's going to be actors on stages all over the world in VR, being able to record with one another and retarget onto rigs. We're actually really close. We're nearly there.
All the tools that Epic showed in that demo were, in fact, off the shelf. Not a single one was some sort of hard-to-get product, with the exception of Unreal Engine, of course. But again, that's available, and you can do it on your own if you really want. It's absolutely accessible.

[00:21:32.153] Kent Bye: Wow, that's really exciting. I talked to Charlie Hughes at IEEE VR about some of the stuff he's doing at the University of Central Florida, doing kind of Wizard of Oz type experiences where you're in a room with like 20 different virtual characters and there's an improv actor who's jumping between the different ones doing live improv. But it sounds like with a system like this you could start to do almost like live theater with actors from around the world, and I guess the biggest bottleneck would probably be the network latency, being able to get the data sent in a way that's actually interactive. But it sounds like what you're saying is that you could do live theater with actors around the world and see these kinds of virtual reality shows, putting them into all sorts of different special effects.

[00:22:15.545] Denise Quesnel: That's exactly it. I think it's coming really soon. I mean, I'm only speaking to my experience, and I think there's a few people who are working on it. The place where I volunteer a lot of my time came about from the roots of being an artist collective, and now it's turning into a studio called the Sawmill in Vancouver, and we're actually working on a virtual... Al McLeod, who was one of the co-founders of that space, he's been saying this for about two, two and a half years now. His dream is to have a virtual game of footy with somebody around the world and actually be able to achieve that with the network and with the realism of the virtual scene. So we've been doing a lot of demos, and we built a virtual camera, and we've tried to build a lot of our own equipment at that studio to achieve this. And it just shows it absolutely is possible to do this. And it's definitely coming up really, really fast.

[00:23:07.861] Kent Bye: One of the technologies that I wanted to ask you about is the one with these spinning wheels and a strobe light, where you take a little ball and it's changing colors, or it's able to have these different textures, and they have these cutouts. And when you see it, it looks like this 3D, vibrant texture that's dynamically changing in front of your eyes. It's really quite wild, but it's these spinning wheels. And I'm just curious how that fits into SIGGRAPH or graphics or VR, since it seems like very physical reality. What kind of principles are those showing, and how does that fit into the whole ecosystem?

[00:23:44.473] Denise Quesnel: Yeah, I think that was an emerging technology. I actually didn't lay my hands on it, although I very well recall going through the jury process when that came through. It blew everybody's minds, because here we're talking about blending elements of your, like, this is your floor, this is your structural environment, this is the walls, this is the table, and we're bringing it into the reality. So a lot of this harkens back a little bit to when people say, right, I don't want to get into virtual reality because I don't need to escape my reality any more than I already am. Why would I do this more? Well, demos like that are actually a really, really good example of this is enhancing, or this is adding a layer or something else to the reality as you know it, for a very, very tangible or physical purpose. So who knows what a lot of these applications are going to end up doing. Who knows how many will penetrate the market and actually be developed into a product. Maybe they'll end up in a lab and be repurposed for something else, but just the actual idea that something is possible, I really, really like that. I like the idea of playing. People call it mixed realities, but it is reality. We are in realities. Anytime that you bring in virtual or augmented, it's not really mixed. We're still in reality. It's just another version of it. If we think about it like the layering of a cake, you're not taking anything away from your experience. You're adding to it. But it's all about balance, right? You can't give somebody absolute cognitive overload. You can't give them too many tasks, too much stimuli, too many distractions all at once. We're at that delicate balancing point of trying to figure out, well, what is too much? What just becomes overwhelming? And what is making a lot of sense for us as users of the technology?

[00:25:34.205] Kent Bye: Cool, and as we're wrapping up, I just wanted to recount some of the things that I saw on the floor, just in terms of covering the different themes that I saw emerge. There were some cyber archaeologists who were doing photogrammetry, using VR to assist archaeologists in doing their work or doing collaborative work. There were some social games where you have a VR headset and at some point you're seeing four different people's views at the same time, and it was a game where you were trying to chase other people around, so you could actually see where other people were based upon what they were seeing. That was kind of a fun social game, messing with what type of perceptual inputs you're having. There were a couple of motion rigs where you're moving around and able to actually get a motion platform experience. Some things with Gear VRs motion-tracked within a collaborative environment that they were showing here. And some drones that were flying up in the air and doing projection from a drone, trying to stabilize the signal to do stabilized projection mapping from a flying drone. And some facial retargeting, being able to dynamically track your face in real time and then take your face and start to puppet and animate Arnold Schwarzenegger's face. And some other training applications, like holding a small pole but seeing it longer in the VR application, so you're doing this training task. Those, for me, were some of the highlights. There were a lot of other different types of experiences, but I just wanted to see if there was anything that wasn't mentioned, or big themes that you thought were emerging this year.

[00:27:06.445] Denise Quesnel: That was pretty damn good. I mean, what you essentially just described was a lot of mad scientists here on the floor. That totally sounds insane, hearing it said back to you like that, but it was a pretty good explanation of what was happening. I'm laughing because it literally sounds insane, because with half of those things you're going, well, where are we going with this and why? And then you try it, and you're like, well, why not? This was so cool. It's about having it be an experience. And a few of those that you mentioned were absolutely applications that are used day to day, like the archaeological site. How many times are you going to be able to access an archaeological site as an archaeologist and work on it, or even for tourism purposes, how many times can you have people come into that place, before the site begins to degrade? Every layer that they uncover, every bit of sand that they brush away is fragile. It's not going to be there forever. And there's something about what that group is doing. And that's a university group, by the way. They're developing an application so that you can, as an archaeologist, explore the site, the dig, and make field notes, and be able to plan. So things like that. What's cool is they were using a combination of 3D scanning, so LiDAR, and they also used 360-degree capture to get the site itself. And then they used all of it in a VR headset. And the collaborative initiative, I mean, you mentioned Parallel Eyes, which is the four people at once who get to run around obstructions in the corners, and one person has a plastic sword and they have to play a game of tag. That's basically what it is. And again, you might be going, but why? And then people walk out and they go, but why not? I got to see the point of view of these other people. And people go, it was trippy, that was so crazy.
And then one of my favorite collaborative or educational experiences was called Aquarium Earth. That was put on by FaceSpace along with a couple of other contributors who did the graphics, and they've been around SIGGRAPH for a long time. So they pitched me this idea: we really feel compelled to create meaningful content, we'd like to cover coral reef bleaching and the impact that our waste and our lifestyles are having on the oceans. Is there anything that we can do? And I was like, well, you know, you've got this space. Let's see what you create. So they created an educational experience where multiple people, up to 10 or 12 individuals, can walk around a space at once. You see yourself as a little submarine vehicle, and you don't bump into the other submarine vehicles. Actually, I was standing there for hours and I didn't see a single person bump into anyone. One guy was spinning in circles and doing swimming miming motions. I saw him come back three more times. So I asked him, I was like, I have to ask, what's going on here? And he said, I just love it. He goes, the first few times I was educated, now I'm just taking it all in. It means something to me. Likewise, a couple of our films played on that same empathy factor, the care, the meaningful content, where you begin to care about, whether it's sustainability, your body, or other people and their social situations. Injustice, which was a short film created by some Carnegie Mellon students, has you witness a police brutality case right in front of you. And people walked away with their arms in the air going, my God, I actually really care. If I watched that on the news, I might just go, oh yeah, another of these unfortunate cases, this is really awful. But here they experienced it from the point of view of somebody who witnesses that event.
I mean, what's cool is they had a microphone so you could actually speak directly into it and change the course of the story, depending on your response. So for me, those were a few highlights. I think you covered all the other ones: the crazy, the elaborate, the benign, and also the eccentric. So it was pretty cool.

[00:30:54.988] Kent Bye: Yeah, a couple of other ones, just from what you said there. There's one where you have a LiDAR, like a radar thing, on your helmet, and they did this thing from a first-person perspective, so it's right on your head, and you get to see this perspective of kind of abstracted lines, and then they switched the perspective to way over your shoulder. It was kind of eerie to go from that first-person to the third-person perspective, and to have this camera that was way up high. And there's another one, a telerobot, where you're able to actually move your head and see its perspective. And a couple of augmented reality and mixed reality experiences as well, like an augmented reality experience where you're able to do some prediction of where a ball was bouncing. But in terms of augmented reality, I think of anything, that's the one area where I'd imagine that in the future there are going to be a lot more of those types of experiences at SIGGRAPH. At this point, though, there's just kind of the HoloLens and some pretty much commercial off-the-shelf virtual reality HMDs, but not a lot of cutting-edge augmented reality headsets or technology yet. But I imagine that over the next one to five years, that's going to be the one area where we're probably going to see a lot more.

[00:32:00.273] Denise Quesnel: Yeah, and the fixation right now is on the hardware itself. Everyone's looking at the devices: Microsoft with the HoloLens, FaceSpace with their augmented reality, and also Meta with their augmented reality, and of course Magic Leap, can't forget them. But the fixation is still on the hardware, kind of similar to where it was with VR two or three years ago when the original development kits were coming out. So we're not seeing very compelling content yet. That's going to change; it's starting. But the experiences we do see often still have a lot of flaws or issues built in. And there's just such a fixation on the hardware. Everyone keeps talking about the field of view of this and that. It's like, you know what? We're in the earliest stages. Nobody remembers with cars the way you had to crank the engine, right? That was how you got around. And we're kind of there now with the hardware: we're essentially cranking the hardware engine and trying to find something that can give us what we actually want, but we're still not quite there with the content. So I'm really looking forward to it. I don't own anything AR yet, and I think I'm just waiting to see when it becomes a bit more financially feasible for me to try. I don't like the comparison between AR and VR, which is better, that sort of thing, because at the end of the day these are immersive realities, and that's what that is. They do similar things, they do different things, and I think there's a lot of potential with both, but we kind of have to get over this hardware fixation to begin with. I don't think that'll really happen until some more development kits end up in people's hands.

[00:33:34.097] Kent Bye: Great. And finally, what do you see as kind of the ultimate potential of virtual reality and what it might be able to enable?

[00:33:42.720] Denise Quesnel: I was totally inspired by today's presentation that we had. We call it an experience presentation here at SIGGRAPH, because what that means is somebody gets to come up on stage and tell you not just about how they built the CG or the interaction paradigm in the presentation, which is great, but what these speakers did was they put people in the headset and said, give it a shot, let's see what you feel like you're going to do. And people just go and they explore; that's what they do. And likewise, there was a quote by Dr. Greenleaf. He said, it's been gaming and entertainment that have been driving VR. Yes, we know that, and that's very, very good. But what's going to change our lives is going to be the integration of VR for telepresence, for health care, for different applications to help with dementia, cognitive behavioral changes, you name it. We've only just dipped our toe into the proverbial VR lake. It's barely even there. Just the sheer potential of what's being talked about, I can't believe it. And the fact of the matter is, anybody can now create VR. Here at SIGGRAPH, we had Unity and Unreal workshops on how to actually build scenes. It's free. I just want to reiterate that: it's free. Yes, you might have to pay for your assets or cameras if you're going to capture content, but still, you can create VR for free. I can't believe I'm saying that, because even eight years ago we were still building CAVEs because we couldn't get our hands on any virtual reality hardware. So just the fact that, yes, okay, we are still dipping our toes in this area, but we can do it, and we can do it for free. Unbelievable. That's the potential. As people from all these different industries start experimenting, and they are, that is going to change everything. Bringing it outside games and entertainment, as Dr. Greenleaf said, and more into the applications of other industries and artists' hands, we're going to be in such a different place even four years from now in 2020.

[00:35:41.106] Kent Bye: Awesome. Is there anything else that's left unsaid that you'd like to say?

[00:35:45.407] Denise Quesnel: Come next year, we're back in LA in 2017. We'd love to see everybody and have VR Village there again. It's going to happen. We're going to open up a call for submissions sometime, probably by December, so watch for that deadline. You can follow us at s2017.siggraph.org. Follow that site, because that's where the call happens. The call happens through that website. We want your projects. We want to see everybody here of all types, and we'll do our very best to bring the essence of VR back into SIGGRAPH next summer.

[00:36:20.626] Kent Bye: Awesome. Well, thank you so much Denise.

[00:36:22.467] Denise Quesnel: Thank you very much Kent for having me.

[00:36:24.800] Kent Bye: So that was Denise Quesnel. She's the program chair for the VR Village here at SIGGRAPH. And so I had a number of different takeaways from this interview. First of all, I think that haptics was probably the one thing that I really got a chance to try out here at SIGGRAPH. Because when you look at virtual reality, I think the visual fidelity is something that is pretty well solidified at this point. There are going to be some future breakthroughs, perhaps with digital light fields as well as virtual retinal displays like Magic Leap, and so there's still some work to be done on visual fidelity, but at this point it's pretty good and pretty convincing. I think audio is going to be the next frontier in terms of how to create a really immersive experience. There actually weren't a lot of audio demos that I saw here at SIGGRAPH, which I thought was pretty interesting. And so haptics is, in my mind, the next frontier in terms of different peripherals and different types of things you can wear or do to make it more of an immersive experience, to make you feel more embodied within your actual physical body and your sense of sensory presence. Again, I think a lot of these different haptic approaches are going to be very specific for a very defined use case, but the Synesthesia Suit actually was an interesting low-fidelity type of experience, because it's buzzing across 24 different parts of your body. By doing different combinations and buzzing your entire body, you get this extra thrill of things happening in the game being correlated with sensory stimulation that's happening all over your body. It was quite enjoyable and fun. Again, I don't know how feasible of a product that would be, how much it would cost, or how you would keep it powered.
So a lot of these things that were being shown at SIGGRAPH, I think, were a little bit more to show you what's possible, rather than something that's fully developed and ready to come to market. That's what I really appreciated about SIGGRAPH: there are a lot of these experimental demos, and they were actually a little bit more polished than what I've seen at some other academic conferences. I think Denise is right. When I've been to IEEE VR, there's been more academic research into issues that may be five to ten years out. It's not something that can be immediately productized, and so it's just an opportunity for you to go in and have all these different crazy, wacky experiences, including ones that were a little bit more polished and done with a little bit more game design. The one where you're wearing a VR headset and getting these four different views was actually a lot of fun, because you're playing it with other human beings and you're able to see what they're seeing. I was surprised, actually, that I didn't get motion sick, because normally you'd expect problems whenever you get visual input that isn't from you actually moving around. But each view was so small that it wasn't really filling up the periphery, and there were four different squares that you saw at the same time. Just one of those squares was your view, and the three other views were what other people were seeing, so you had to have this dual awareness of where you were, but also where other people were within the environment. It was a pretty trippy and surreal experience. And there were other things there that I didn't get a chance to do, like getting on top of this 20-foot robot, where as you move around, you physically move around, and it feels like a mech-like robot experience. But it took about 15 minutes to go through, and by the time the doors were opening every morning there was already a line that would take an hour or two to wait through, so I didn't get an opportunity to do that. But there were things within the VR Village and other demos like that which were very unique experiences, where either you did them or you might never get a chance to experience them again in your life. So it's an interesting thing to experience, but also a little difficult to cover. I'm glad that I had a chance to talk to Denise and talk a bit about my experiences, because I didn't actually do a lot of other interviews with these creators, again, because these are crazy, wild ideas and prototypes. I think the thing that's probably the most applicable is some of these redirected walking techniques, where you could have something that's a little bit larger than room scale, maybe not as big as the amount of space that you need for The Void, but something beyond room scale. One demo had physical walls arranged kind of like two D shapes back to back, a system where you could put your hand on the wall to walk around a curve while it looked like you were walking straight, with opportunities to make different 90-degree turns, which allowed you to walk around an area and then continue to walk straight. So essentially what it was doing was using a very constrained space and using redirected walking to walk you in circles, guided by the wall, to give you the experience of infinitely walking. Now, I will say, my own visceral reaction to that was that I was having a little bit of vestibular disconnect.
I started to feel, not necessarily motion sick, but I could feel kind of a cognitive load, where my mind was actually aware that I wasn't walking straight, that I was being tricked. And so I think in the long term, something like that may not actually be a good solution for everybody. With something like The Void, you're unaware that your mind is being tricked; when you're walking around a quarter circle whose radius is a lot larger, it's a lot less perceptible. They had another demo where you were looking at an A and a B, and the visuals you were seeing made it look like you were walking straight towards the B, but it turned out that you were walking way off course towards the A. So it was a demonstration of how much your visual system can override what your body is feeling, and how it can redirect you in different directions just by slightly changing the visual feedback that you're getting. So that was another demonstration of redirected walking. The other thing that I wanted to point out was the Unreal Engine winner at the Real-Time Live demo, which is something that's really quite fascinating. What Denise is saying is that it's going to get to the point where people will eventually be able to do live theater acting with actors around the world, being able to go into a virtual reality experience and see a live performance with people interacting with each other. That, I think, is in a lot of ways the future of storytelling, of what the potential of VR is going to enable you to do. Something like that, where people are able to be in a space together and interact and act with each other. Maybe in the future just about anybody will be able to participate in these types of interactive narratives.
I think right now there's kind of a requirement to have a lot of extra physical gear that's doing motion tracking on your face and everything, so it's not something that you'd be able to recreate with consumer VR equipment. But I think that's the direction it's going, where you're going to be able to do these kinds of acting performances. The other big trend that I wanted to point out is eye tracking. I do have an interview with SMI to talk a little bit more about what they're doing, but there were a few demos there with NVIDIA using SMI eye tracking to do foveated rendering. They pretty much got it working, and I think it's just a matter of time before some of the second generation of headsets start to integrate some of this technology directly. I'd expect eye tracking to be built into both the Vive and the Oculus Rift, and perhaps the Sony PlayStation VR. We'll see. I think that we'll probably start to hear about some of these second-generation headsets coming up. Maybe, maybe not, I don't know. There's Oculus Connect 3, where they're probably going to be doing the big launch of the Oculus Touch. Will they be showing any cutting-edge prototypes for the next iteration, a prototype version leading up to the second release of the consumer version of the Rift? I don't know what the interval is going to be between these different headset generations. I'd imagine it's going to be longer than the cell phone iteration cycle, which tends to be about every year. I'd imagine that the Oculus Rift will probably be good for at least 18 months to two years.
Steam Dev Days is also coming up here in October, and I expect that Valve is going to be making a bunch of new hardware or other technology announcements and demos at Steam Dev Days. From some of the interviews that I've done before with Valve employees, they alluded to there being a lot of capabilities built into the hardware that haven't been unlocked yet with software updates. So maybe it'll be along those lines, with software updates that unlock new capabilities. And I think it's timed in a way that they're kind of due to make some new announcements. There were some announcements about being able to track multiple objects in a royalty-free way. I think Valve is taking this open-source approach where they're allowing people to make all sorts of third-party objects that can be tracked within the Lighthouse technology stack. Who knows if Oculus is going to come out with something similar, to be able to say, hey, here's how you could do their kind of Constellation tracking with the optical tracking cameras, rather than something like the peripherals for the Lighthouse. So we'll see. I think there's been some discussion over the last couple of days.
There's an article from Road to VR quoting Jason Rubin talking about how the Oculus isn't really necessarily designed to be room scale, so maybe it's always going to be kind of a sit-down experience. When you compare the two, I think that the Vive is really set up to do room scale and beyond, and to get your full body immersed into the experience. So it'll be interesting to see if there are more discussions about that, whether or not Oculus is going to try to do some of these experiences where you're really trying to create a full room-scale type of experience, or if they will continue with the recommendation they have now, which is to just put the two optical tracking cameras front-facing, so there's a bit of an occlusion problem that happens if you turn around. That was kind of my experience with some of the demos with Oculus Medium, trying to turn all the way around and do sculpting behind me. It was not really meant to be a Tilt Brush kind of application; it was really meant for you to be facing one direction and sculpting something in front of you. For something that's doing fully immersive 360, walking around a full room, it looks like at this point that Valve and the Lighthouse are doing this leapfrog. They just came out and said, nope, we're just going to go fully immersive. This is what I heard from other Valve employees: when they were developing this, they were finding what was making the developers within Valve leave their desktops and go into the lab to actually experience some of these VR experiences. I think that's part of the reason why they decided very early to go with full room scale, rather than doing this intermediary step of sitting down. So, just to bring it back to SIGGRAPH and to wrap things up here, there are a lot of really out-there experiments where you look at one and you're like, this is kind of cool.
I have no idea where this would be used, but it just starts to open your mind as to what might even be possible. And I think, over the years, there have been people, like Denise said, someone from the New York Times coming to SIGGRAPH and getting inspired, who then started the whole VR initiative that's happening within journalism at the New York Times. As for the story lab that was here, maybe they'll have to figure out some way to take what they're doing in the story lab and bring it into more theaters and shows. Actually doing public screenings of VR experiences has a lot of really difficult throughput issues. In some ways, they might be better off creating theaters with set times for different experiences, where you actually have a ticket to go see it. You go in there, you see the experience, everybody sees it at once, and then you move on. Right now, there's this kind of waiting game that goes on with seeing public demos of virtual reality experiences, especially narrative experiences, where you go in and you just end up having to do a lot of waiting. So it's worth thinking about ways to do that completely differently. I think Sundance is likely going to do something completely different, because this past year they couldn't really handle all the tons of people that were coming through and trying to see these experiences. People would go up and want to see something and then be told to come back three to six hours later. Well, at a conference like that, you have no idea where you're going to be in three to six hours. So you kind of want to know: this is the time that I'm going to see it. So, moving away from this open model of seeing any of the 30 different demos at once, maybe moving more towards timed screenings where you have an infrastructure that's set up.
So you just go in and you watch a VR experience, and then you have more of a film festival type of experience, rather than this open demo experience. There have been other approaches, like VRLA. For example, they have this two-day approach where you're essentially able to see everything that's on the expo floor, but it's kind of like exclusive VIP access: you pay more to get the freedom to just walk up and experience everything. That's another approach. And then the next day, when they invite 6,000 people in, you essentially just end up waiting a lot to see some of the other experiences if you didn't get that VIP access. Anyway, there are a lot of different ways to try to structure that, and Denise was thinking about how to incorporate the VR within more of the animation screenings and other more structured ways of showing some of the latest VR narratives there. When I was there, though, one thing that happened was that the line for Pearl was really long. I hadn't really heard of it, so I just thought, hey, there's got to be some wisdom of the crowd here, with all these people wanting to see this. And I saw it, and it's pretty much one of the best VR narrative experiences that I've had a chance to see. And then kind of the same thing was happening at VRLA. So for me, there's also a benefit to just not planning things and seeing what the buzz is, based upon what people are really trying to experience and the word of mouth that's already spread. So that's all that I have for today. I wanted to thank you for listening, and if you'd like to support the podcast, then spread the word, tell your friends. You can leave a review on iTunes, and you can support the podcast directly by sending me a tip. Go to patreon.com slash voices of VR.
