Ela Darling is an adult entertainer who is collaborating on holographic 3D VR porn with a couple of technologists she met on Reddit. Porn and VR are often cited as an inevitable pairing, but I suspect that the ultimate strengths and direction of VR porn are likely going to surprise a lot of people. Specifically, Ela sees that one of the strengths of VR porn will be interactive experiences that are more about cultivating intimacy and emotional immersion.

You can check out some of the actual video-on-demand VR experiences on Ela's VRTube.xxx (NSFW). They did some early experiments with 360-degree video, but found that it felt a bit creepy and that the entire world around you wasn't all that interesting to look at. So they decided to use the Kinect v2 to capture depth data and create a 3D hologram from that information. You can check out this WIRED article about VR porn to learn more about Ela and her partners.
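To make that concrete, here's a minimal sketch (in Python, and not their actual pipeline) of the basic idea behind turning a Kinect v2 depth frame into a 3D point cloud that could be rendered as a volumetric hologram. The camera intrinsics below are rough approximations and would normally come from calibration.

```python
import numpy as np

# Approximate Kinect v2 depth-camera intrinsics (512x424 sensor);
# real values vary per device and should come from calibration.
FX, FY = 365.0, 365.0      # focal lengths in pixels (assumed)
CX, CY = 256.0, 212.0      # principal point (assumed)

def depth_to_point_cloud(depth_mm: np.ndarray) -> np.ndarray:
    """Back-project a 512x424 depth frame (millimeters) into an Nx3
    point cloud in camera space, dropping invalid (zero-depth) pixels."""
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm.astype(np.float32) / 1000.0          # meters
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]                   # keep valid samples

# Example: a synthetic flat wall 2 meters away
frame = np.full((424, 512), 2000, dtype=np.uint16)
cloud = depth_to_point_cloud(frame)
print(cloud.shape)  # (217088, 3)
```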

Porn and VR are often cited as a marriage made in heaven, and somewhat of an inevitability. Perhaps it will be inevitable, but after talking to Ela about the economics of the porn industry, I don't think it's necessarily a sure bet. Adult entertainers have suffered from the rise of torrent piracy and tube sites, where a cabal of large-scale porn producers uses the tube sites to promote their paid sites while turning a blind eye toward pirated material. This has made it harder to make a sustainable living within the adult film industry, and it has cultivated an environment where a lot of people in the target VR demographic have been conditioned to never have to pay for porn.

Ela cites the cam world, interactive experiences with adult entertainers through webcams, as a viable way to make a living in the industry. This type of interaction tends to focus more on emotional intimacy and interactive engagement than on passively watching sex acts. It's from this experience that Ela has found a real desire and need for intimacy within the porn industry, and she sees that VR porn has the potential to provide interactive and gamified experiences that go beyond skipping right to the sex.

She talks about her dating simulation experiment, where the user has to court and seduce the performer by choosing among four different options for what to say. If you're too forward, then it's game over. But if you successfully navigate this interactive portion, you'll learn more about the performer and be treated to a sexy scene at the end. The type of immersion that VR provides opens the door to these more engaging experiences that go beyond what you'd be able to experience within a 2D film.
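As a rough illustration of that kind of branching structure, here's a tiny dialogue-tree sketch. The nodes, replies, and outcomes are hypothetical placeholders rather than Ela's actual script; the point is just how a "too forward" choice terminates the scene while better choices advance it.

```python
# Each node offers four replies; a too-forward choice ends the scene,
# while better choices move toward the reward scene. All text here is
# hypothetical placeholder content, not the actual experience.
DIALOGUE = {
    "start": {
        "prompt": "She smiles and asks what brought you here.",
        "choices": [
            ("Ask about her interests", "getting_to_know"),
            ("Compliment her sincerely", "getting_to_know"),
            ("Make an awkward joke", "start"),
            ("Be way too forward", "game_over"),
        ],
    },
    "getting_to_know": {
        "prompt": "The conversation warms up.",
        "choices": [
            ("Listen and ask a follow-up", "reward_scene"),
            ("Share something about yourself", "reward_scene"),
            ("Change the subject abruptly", "getting_to_know"),
            ("Be way too forward", "game_over"),
        ],
    },
}

def step(node: str, choice_index: int) -> str:
    """Advance the scene given the index (0-3) of the chosen reply."""
    _, next_node = DIALOGUE[node]["choices"][choice_index]
    return next_node

print(step("start", 0))  # -> "getting_to_know"
print(step("start", 3))  # -> "game_over"
```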

Ela and I also discuss some of the implications of VR porn, and what it might mean for relationships when men have interactive VR cam experiences with adult performers. Given that VR experiences can feel more like actual memories, will there be new lines in terms of what's considered "cheating?" Ela says that this is a conversation that people need to have within their relationships in order to explore what's acceptable and what's not. She says that porn is often treated as the scapegoat for failed marriages, but that porn is merely a symptom of a relationship dynamic where there's a fundamental lack of clear communication about sex. If the existence of VR porn inspires a lot of couples to have a candid conversation about sex and porn and their agreements around them, then that's going to be a lot better than treating porn as a taboo topic that needs to be hidden and be ashamed of.

There are a lot of interesting new technologies in teledildonics, and possibilities for how technology could enhance the physical experience of both one-on-one intimate and erotic encounters between long-distance lovers and interactive porn experiences. People often cite porn as a technological innovator, pointing to the history of VHS vs. Betamax, the rise of streaming video on the Internet, and the online payment systems built so people could pay for porn sites. But Ela actually agrees with Palmer Luckey that porn has lost that competitive edge to gaming. It's not because of porn that we have VR headsets; it's because of gaming. If anything, porn is going to follow the market leaders in which VR gaming platform is the most compelling rather than driving the market decisions.

If anything, porn may suffer from being shut out of some of the walled-garden platforms, and there may need to be technological workarounds like Mark Schramm's SideLoadVR signature injector in order to get porn applications running on devices like the Gear VR.

One area that will be interesting to watch in the realm of VR porn is how kink communities adopt these technologies and how much haptic feedback might be included within various sensation-play experiences. There's a large psychological component to kink that plays with power dynamics, and it doesn't necessarily even need to involve physical sex. It's these types of more social interactions that could lend themselves really well to playing out various fantasies and kink scenes within a virtual environment.

Finally, Ela talks about some of the future applications of VR in terms of education and medical experiences. She also sees that VR porn could actually be seen as a therapeutic application of VR that helps provide more intimacy and reduce levels of isolation and loneliness.

VR porn will certainly continue to be part of the VR landscape, and I personally will be interested to see what types of surprising insights VR porn will provide. You can't take a 2D film or 2D game, do a straight port to VR, and expect it to be a compelling experience; instead, the best practice is to look at the strengths of the VR medium and build something from the ground up. It sounds like Ela and her VRTube.xxx team are early pioneers on that journey, and I'd expect that there are going to be a lot of interesting and surprising results from it. For example, if you had told me that VR porn would result in the cultivation of deeper intimacy and perhaps better sex in real life, then that would've been really surprising for me to hear. And yet, that seems to be the direction that Ela and her team are going.

Of course, the world of pornography and adult entertainment is vast. It's likely that there will be a whole range of new and different experiences that people create that are genre-busting and truly innovative. One of the best places to track the latest trends and developments in this realm is probably over at the OculusNSFW subreddit.

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Alexx Henry is a photographer who has built an array of dozens of cameras to create extremely high-resolution captures and avatars of people. He's based in LA and has a number of clients from the movie and entertainment industry, but his vision is to democratize this process for independent game and virtual reality developers with his xxArray project.

Alexx talks about how this is really a two-step process: first the capture using his xxArray photogrammetry rig, and then the creation of optimized avatars, either at film quality or as a much lower-poly, optimized version for virtual reality. From a photographer's perspective, you always want to capture at the highest quality and then downsample from there if you're producing a low-resolution version for the web. In the same way, he advocates having an extremely high-resolution capture (like 22 gigapixels of texture data) on hand in case you need that additional resolution later.
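The "capture high, derive low" workflow he describes is essentially the same downsampling photographers already do. Here's a minimal numpy sketch of box-filtering a source texture down to a VR-friendly resolution; the sizes are arbitrary examples, not xxArray's actual numbers.

```python
import numpy as np

def downsample_texture(texture: np.ndarray, factor: int) -> np.ndarray:
    """Box-filter a (H, W, 3) texture down by an integer factor.
    Assumes H and W are divisible by the factor."""
    h, w, c = texture.shape
    reshaped = texture.reshape(h // factor, factor, w // factor, factor, c)
    return reshaped.mean(axis=(1, 3)).astype(texture.dtype)

# e.g. a 2048x2048 source texture reduced to 512x512 for a lightweight avatar
source = np.random.randint(0, 255, (2048, 2048, 3), dtype=np.uint8)
vr_ready = downsample_texture(source, 4)
print(vr_ready.shape)  # (512, 512, 3)
```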

He talks about some of his visions for putting yourself into a virtual reality game or experience, but also some of the implications for identity and self-esteem of being able to have a more objectified experience of your own body. He also describes how one of his friends' self-image changed after experiencing his high-resolution xxArray avatar within virtual reality.

One of the big debates we have in this interview is over the tradeoffs of photorealistic and hyperreal avatars in VR. It sounds like it'd be amazing, but hyperrealism has the potential to send you into the uncanny valley if you don't have an equal amount of fidelity in the social behaviors and cues, interpersonal interactions, eye gaze, and overall believable movements. If anything is off, then it can look creepy or uncanny. My interview with Richard Skarbez is probably the most comprehensive one I've done on the uncanny valley; he argues that the uncanny valley is n-dimensional.

Alexx is a clear advocate for high-fidelity avatars and believes that there's a lot of FUD and BS around our concepts and understanding of the uncanny valley. It shouldn't be seen as an unapproachable boogeyman, and during the interview he showed me the following example of how believable an avatar within a virtual environment can be.

Jimmy's Avatar Gets Angry from alexxhenry on Vimeo.

At the end of the day, I'm glad there are people like Alexx who are bravely challenging the status quo and providing a technology stack for people to get a super-high-resolution capture scan and avatar of themselves. I think there are a lot of really interesting possibilities for what could be done with self-image and identity, especially as the technological hurdles around the uncanny valley are slowly figured out and solved.

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Philip Rosedale has a lot of really brilliant dreams and visions for how to create a sustainable and open metaverse with High Fidelity. Since he's the founder of Second Life, we talked last year about some of the lessons he learned there and how he's approaching a more sustainable and scalable model of interconnected virtual worlds.

This year his focus was on how important it's going to be to create hyperlinks between multiple virtual worlds. He says that the history of the Internet provides a really valuable lesson in the fate of the walled-garden platforms of AOL and CompuServe. Even though the content on these platforms was of much higher quality, over time the open web of interlinked websites with a more rudimentary and less polished look and feel ultimately won out. Philip cites Metcalfe's law, which states that the value of a telecommunications network is proportional to the square of the number of connected users of the system. As each virtual world links off through portals to other virtual worlds, it becomes that much more valuable and compelling.
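A quick back-of-envelope with Metcalfe's law shows why interlinking matters so much here (the user counts are made up purely for illustration):

```python
# Metcalfe's law as cited above: a network's value is proportional to the
# square of its connected users. Rough illustration of why linking worlds
# matters: 100 isolated worlds of 100 users each versus the same users in
# one interlinked network.
def metcalfe_value(users: int) -> int:
    return users ** 2

isolated = 100 * metcalfe_value(100)       # 100 separate worlds: 1,000,000
interlinked = metcalfe_value(100 * 100)    # one linked network:  100,000,000
print(interlinked / isolated)              # 100.0 -- linking is worth ~100x here
```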

This insight seems to be at the core of High Fidelity's approach of using open-source-licensed technology so that people can stand up their own hosted servers containing their virtual worlds. In fact, part of Philip's long-term vision is to use the distributed computing power of the world's desktop computers and mobile phones to create a virtual world equivalent to the square footage of the entire Earth, one that could concurrently serve virtual experiences to all 7 billion people on the planet. This vision, and building the technological backbone to make it happen, is what keeps Philip going to work every day.
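As a rough sanity check on the scale of that vision, using the commonly cited figure of roughly 510 million square kilometers for the Earth's total surface area:

```python
# Back-of-envelope for the vision above: a virtual world with the surface
# area of the Earth, shared by 7 billion concurrent users.
EARTH_SURFACE_KM2 = 510_000_000        # ~510 million km^2 (land + ocean)
USERS = 7_000_000_000

m2_total = EARTH_SURFACE_KM2 * 1_000_000
m2_per_user = m2_total / USERS
print(f"{m2_per_user:,.0f} m^2 per concurrent user")   # ~72,857 m^2 each
```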

Currently, how to link between virtual worlds in a seamless fashion is a bit of an open problem. Text on a website can unobtrusively link to other websites, and there are metadata cues that add contextual information about where links will lead. There are no equivalent standards for 3D objects within a VR environment; the closest metaphor is using a door, a portal, or a completely different building to navigate between virtual worlds. There are potential perceptual hacks that could be exploited, but Philip cautions that there may be real physical limitations on how we navigate virtual worlds, since the equivalent of providing contradictory information to our perceptual system has disorienting effects and causes simulator sickness.
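Since no such standard exists yet, here's a purely hypothetical sketch of the kind of metadata a "portal" between worlds might carry, loosely analogous to a hyperlink's destination and anchor text. None of this is an actual High Fidelity or web specification; the address format is an assumption.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Portal:
    """A hypothetical 'hyperlink between worlds': the kind of contextual
    metadata the paragraph above notes is still missing for 3D links.
    Not an existing High Fidelity or web standard."""
    destination: str                       # address of the target world (format assumed)
    label: str                             # human-readable hint about where it leads
    preview_image: Optional[str] = None    # optional thumbnail shown before traversal
    spawn_position: Tuple[float, float, float] = (0.0, 0.0, 0.0)

door_to_market = Portal(
    destination="hifi://example-market/12,0,-4",   # hypothetical address
    label="Open-air market world",
)
print(door_to_market.label)
```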

Philip was also really excited to have created shared physics simulations so that games like air hockey can be played in VR. This adds a level of physical reality that contributes to the coherence of the virtual experience, and it provides a lot of opportunities for engaging in fun and playful activities with other people within High Fidelity environments. Adding more social gaming experiences has been a consistent theme across social VR applications like AltspaceVR, Convrge, and VR Chat.

If I were to bet on who has the most viable and sustainable approach for creating the metaverse, then my money would be on High Fidelity's strategy and open-source technology stack. I'm really excited to see how Philip Rosedale and the rest of High Fidelity continue to evolve over time.

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

First Contact! from High Fidelity on Vimeo.

Graham Gaylor & Jesse Joudrey have been working on VR Chat for well over a year now, and they’ve been consistently providing social spaces for some early experimentation with Social VR experiences. They talk about some of the latest features that they’ve been implementing in their platform, and what they’re doing to help make their dreams of the metaverse a reality.

Jesse says that social interaction within VR is a feature, not the final destination of a VR environment. More and more people will want to be doing or seeing something specific within a VR experience, which is part of the reason why VR Chat has created an SDK with a set of Unity scripts to help easily add a social layer to your VR experience.

They're also interested in linking virtual experiences together, and in doing some early experiments in building out the metaverse using tools like Unity that are capable of creating highly performant, interactive immersive environments.

One of the requests that people have had for VR Chat was to be able to know when their friends are online, and so Jesse and Graham have been implementing user accounts so that people can have friends lists. They also have a whole system for creating rooms within VR Chat, and whoever creates the room has a set of moderator privileges in terms of whether it's hidden, open, or accessible only to a whitelist of people or friends of the moderator. They're starting to implement the technological foundation for the type of chat room environments that were described in the novel Ready Player One. I think it's going to be really, really powerful to be able to invite people over to your personalized VR chat environment, much like you might invite people to come hang out at your home.
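Here's a small sketch of the kind of room and moderator logic described above. It's illustrative only, not VR Chat's actual data model or API.

```python
class Room:
    """Sketch of the room model described above (illustrative only, not
    VR Chat's actual data model): the creator is the moderator and can
    keep the room hidden from listings and/or restrict entry to a
    whitelist or to their friends."""

    def __init__(self, name, creator, hidden=False, whitelist_only=False):
        self.name = name
        self.moderator = creator
        self.hidden = hidden                  # not shown in public room listings
        self.whitelist_only = whitelist_only  # entry restricted if True
        self.whitelist = set()
        self.moderator_friends = set()

    def can_join(self, user):
        if user == self.moderator or not self.whitelist_only:
            return True
        return user in self.whitelist or user in self.moderator_friends

room = Room("my_hangout", creator="graham", hidden=True, whitelist_only=True)
room.whitelist.add("jesse")
print(room.can_join("jesse"), room.can_join("stranger"))  # True False
```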

Some of the other features that VR Chat has implemented involve different types of games and interactions while hanging out socially. For example, Jesse created a race track where you can ride around in a car with four other people while racing against others. These types of physical proximity constraints start to replicate taking a road trip with friends, but in a way that's super silly and a bit absurd. Jesse says that one of the biggest lessons from creating experiences like this is that people tend to laugh a lot more together.

In the future, I foresee that virtual reality has the capability to allow us to play in ways that we don't tend to in real life. It's going to break us out of socialized patterns for how we normally connect and relate to each other, and I think there are a lot of rich opportunities for finding new and fun ways for people to play and spend time together, whether they live across the country from each other or whether it's just something people decide to do within a room-scale VR environment or VR arcade.

For more information, be sure to check out VRChat and drop by one of their upcoming events.

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

When Shawn Whiting and Hayden Lee set out to create a social VR application, they decided to build the simplest minimum viable product. It was literally just a cube and a name tied to head movements, but they held a dance party where over 50 people showed up for at least a half hour each. Even though the avatars were simple blocks, knowing that they were tied to the movements of a real human being on the other side proved to be extremely compelling.
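For a sense of just how minimal that MVP is, here's a sketch of the core of it: broadcasting a head pose over the network so a remote peer can apply it to a cube. The port, packet layout, and transport choice are assumptions for illustration, not Convrge's actual protocol.

```python
import socket
import struct

# The port and packet layout here are assumptions for illustration,
# not Convrge's actual protocol.
POSE_FORMAT = "6f"   # x, y, z position plus yaw, pitch, roll
PORT = 9999

def send_head_pose(sock, addr, pose):
    """Broadcast this client's head pose so peers can move its cube."""
    sock.sendto(struct.pack(POSE_FORMAT, *pose), addr)

def receive_head_pose(sock):
    """Read one peer's head pose and return it as a 6-tuple of floats."""
    data, _ = sock.recvfrom(struct.calcsize(POSE_FORMAT))
    return struct.unpack(POSE_FORMAT, data)

# Sender side: ship the local head pose to a (hypothetical) peer.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_head_pose(tx, ("127.0.0.1", PORT), (0.0, 1.7, 0.0, 0.1, 0.0, 0.0))
tx.close()
```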

That is a perfect story that encapsulates the iterative approach and rapid feedback that have been the driving philosophy for Convrge. Known for its minimalist floating heads, Convrge has adopted a low-poly aesthetic that has proven to be lightweight and super compelling.

Shawn and Hayden have found a number of ways to keep users engaged, with social events ranging from dance parties, VR developer talks featuring notable speakers from the VR circuit, virtual campfire hangout sessions, and YouTube watching parties to livestreams of VR meetups and of breaking news and press conferences from VR companies.

As a particularly salient example, here's a reaction video of more than 73 people in VR watching the livestream of the Oculus CV1 press conference where it was announced that the Rift would be shipping with Xbox controllers. This was clearly disappointing on some level for the VR enthusiasts who really wanted to see 6DOF controllers ship with the Rift, but later in the video Palmer announces the Oculus Touch, and there's clearly a lot of excitement that this is something coming in the future. Note that the echo in the sound is due to a recording error from having two tabs open.

Shawn and Hayden even livestreamed the VR mixer afterparty onto the theater screen in Convrge while simultaneously projecting the Convrge scene onto a screen at the party. Here's an excerpt of a little dancing that I did at that party, which Hayden mentions within the interview.

Convrge has also recently added the ability for users to play games with each other in VR, a feature that was added by user request. They talk about some of their future plans, and what some of the most popular feature requests have been.

Finally, Shawn gives the advice to anyone creating a social VR application to get people involved as quickly as possible so that you can start to get their feedback for what they want. In other words, don’t spend a year and a half developing a polished social experience without ever once having people involved with using it.

I’m looking forward to seeing how the community and Convrge platform continues to evolve, and encourage anyone who hasn’t made time to check it out to definitely drop by sometime. It’s a very welcoming and open community.

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Hallie McConlogue has been waiting for virtual reality to go mainstream for over 20 years now. She used to spend many consecutive hours within a VR design and drawing tool called HoloSketch at Sun Microsystems' VR lab, where she did interface design, modeling, art direction, and animation for 7 years. She worked on the first claymation made for real-time playback within a head-tracked virtual reality HMD.

I talked with Hallie right after she had tried out the Crescent Bay demo for the first time at SVVRCon. She was marveling at how well the head tracking from the system that she used over 20 years ago has stood the test of time; even the framerates and resolution were comparable in her mind. Of course, computer graphics have improved exponentially, and the Sun system she was using cost over $53,000.

She talks a bit more about using HoloSketch, which is a program developed in 1994 by Michael Deering. It sounds like the metaphorical grandfather of an interactive painting and design program like Tilt Brush.

One of the things that Hallie is interested in doing is applying the insights that she’s gained from improv acting and comedy over the years. She argues that improv theater actors have been on the vanguard of interactive media for as long as improv has been around. She thinks that improv has a lot of lessons to teach VR designers for how to create a compelling and engaging interactive experience.

Hallie also advocates for the importance of facial tracking, especially being able to track the eyes. Where the eyes are looking can give so much information about how to navigate social situations, and she says that the eyes are a huge feedback mechanism for how improv actors communicate with each other.

Hallie also talks about getting more women involved in virtual reality, and provides some feedback about what types of experiences she's interested in. She says that romance novels are like porn for women, and that there's a romantic and emotional component that is worth exploring through the medium of virtual reality. And again, a lot of these types of romantic experiences come back to being able to track facial expressions and to have a strong sense of eye contact between the audience and characters within a story.

Incidentally, Maria Korolov, the editor of Hypergrid Business, has started a Women in Virtual Reality website aggregating women speakers and professionals who are working with VR.

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Nick Donaldson is a Senior Designer at Epic Games, and I caught up with him at Oculus Connect to talk about the design process behind the Showdown demo. This demo was meant to be the highest-fidelity VR experience with AAA-game polish up to that point in time. They ended up re-using a lot of assets that were designed for other experiences, but in the end the Showdown demo was an epic slow-motion walk through a war scene unfolding as you move down a road toward a giant robot that leans down to scream in your face. Here's a video of what was shown within the demo.



Nick talks about some of the easter eggs and the design process that they went through in order to create this experience. At Oculus Connect, he and Nick Whiting talked about some of the optimizations that they had to make within Unreal Engine in order to get the Showdown demo running smoothly. Here's a video of that talk:

One of the fun anecdotes that Nick shares in this interview is the first VR social interaction that he had with Nick Whiting while working on the Couch Knights demo. They were still early in the debugging phase of creating a multiplayer experience, and he talks about how much body language and information they were able to communicate with each other non-verbally. Nick Whiting looked over at him and just froze. Nick Donaldson shared a gut-level response of "Sup?" with a head nod, which Nick Whiting replied to with his own head nod and "Sup?" It sounds like a simple interaction, but until you've had your first social VR experience like this, it can be quite profound to know that there's a human being on the other side of all of the movements of an electronic avatar.

Nick says that Epic Games is very interested in continuing to explore the medium of virtual reality, and that they’re really excited about the potential for creating more games and other VR experiences moving forward.

For more information, be sure to check out the Road to VR write-up of their Unreal Engine optimization talk or the summary from the Unreal Engine blog.

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio
Subscribe to the Voices of VR podcast.

Ryan McMahan is an Assistant Professor of Computer Science at the University of Texas at Dallas, where his research focuses on the effects of system fidelity for virtual reality (VR) applications and systems. He has an interest in VR training applications and was presenting at IEEE VR on a realistic ladder-climbing technique called "March-and-Reach."

Climbing a ladder in VR sounds like it'd be a fairly straightforward problem, yet this interview shows all of the nuances and design decisions that had to be made in order to both accurately replicate the feeling of virtually climbing a ladder and do it in a way that could teach ladder safety.

Previous ladder-climbing techniques used either purely hand-based or purely feet-based methods for controlling vertical locomotion. Ryan decided to use the feet to control vertical locomotion, and to require at least two points of contact at all times; otherwise the person falls off the ladder.

Part of why VR training can be so effective is that you can show people what a failure condition looks and feels like without putting their physical safety in danger. They can make people fall off a ladder in VR and show that they'd break their legs, and this level of realistic fear can help create episodic memories that help people more effectively remember to always keep two points of contact while climbing a ladder.

There were other nuanced changes they had to make to make the simulation more realistic. While you're climbing a real ladder, you're holding on to the rungs and leaning backwards, so when you look down you can see your feet. But if you're marching in place in VR and look down to see where your feet are, then with accurately tracked feet you'd actually have to lean forward, which would make people lose their balance and potentially fall down in real life. So Ryan had to offset the feet by 10 degrees forward so that people could actually see their feet in VR while still maintaining their balance.
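Here's a small sketch of those two rules: the two-points-of-contact check and the roughly 10-degree forward foot offset. The details (coordinate conventions, rotating about the hip) are my own assumptions for illustration, not the paper's actual implementation.

```python
import math

FOOT_OFFSET_DEG = 10.0   # forward offset applied to the rendered feet

def has_two_points_of_contact(hands_on_rungs, feet_on_rungs):
    """Falling is triggered whenever fewer than two limbs touch the ladder."""
    return (hands_on_rungs + feet_on_rungs) >= 2

def rendered_foot_position(hip, tracked_foot):
    """Rotate the tracked foot ~10 degrees forward about the hip in the
    sagittal plane (y up, z forward assumed) so it stays visible when
    the user looks down while leaning back."""
    theta = math.radians(FOOT_OFFSET_DEG)
    dy = tracked_foot[1] - hip[1]
    dz = tracked_foot[2] - hip[2]
    y = hip[1] + dy * math.cos(theta) + dz * math.sin(theta)
    z = hip[2] - dy * math.sin(theta) + dz * math.cos(theta)
    return (tracked_foot[0], y, z)

print(has_two_points_of_contact(hands_on_rungs=1, feet_on_rungs=0))  # False -> fall
print(rendered_foot_position(hip=(0.0, 1.0, 0.0), tracked_foot=(0.0, 0.1, 0.0)))
```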

Here’s more information on the march-and-reach paper that Ryan presented at IEEE VR.

March-and-Reach: A realistic ladder climbing technique

In most 3D applications, travel is limited to horizontal movement. A few 3D travel techniques allow for vertical travel, but most of them rely on “magic” abilities, such as flying. We sought to develop a realistic vertical travel technique for climbing ladders. We have developed March-and-Reach, with which the user marches in place to virtually step on lower ladder rungs while reaching to virtually grab higher rungs. We conducted a within-subject study to compare March-and-Reach to two prior ladder-climbing techniques. Results indicate that users consider and treat March-and-Reach as the most realistic ladder climbing technique.

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio
Subscribe to the Voices of VR podcast.

Luca Marchetti is the CEO of Studio Evil, and he talks about developing a serious game called Relive, a sci-fi adventure game set on Mars that is designed to teach people CPR. There were demos taking place in the hallways at Oculus Connect with people wearing an Oculus Rift while pressing on a Mini-Virtual Reality Enhanced Mannequin that was being tracked by a Kinect.


Here’s a video of the Mini-Virtual Reality Enhanced Mannequin and Kinect integration, which was being worked on before Palmer launched the Oculus Rift Kickstarter in 2012.

Here’s a description of the mannequin:

The project involves the development of a serious game and a self-learning software specifically dedicated to quality cardiopulmonary resuscitation.
We developed Mini-VREM (Mini-Virtual Reality Enhanced Mannequin), a CPR feedback device with a new motion detection technology including a Kinect sensor and software specifically designed to analyse chest compression performance and provides real-time feedback in a cardiac arrest simulation training setting.
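For a flavor of the kind of analysis such a feedback device performs, here's a rough sketch that estimates compression depth and rate from a tracked chest-height signal. It's illustrative only, not the Mini-VREM software; the depth and rate targets in the comments echo common CPR guidelines.

```python
import numpy as np

# Illustrative only: estimate compression depth and rate from a tracked
# chest height signal (meters). Common CPR guidance calls for roughly
# 5 cm compressions at 100-120 per minute.
def analyze_compressions(chest_height_m: np.ndarray, fps: float) -> dict:
    baseline = np.max(chest_height_m)                 # chest at rest
    depths = baseline - chest_height_m                # displacement per frame
    threshold = 0.025                                 # count crossings of half the target depth
    pressed = depths > threshold
    compressions = np.count_nonzero(pressed[1:] & ~pressed[:-1])
    duration_min = len(chest_height_m) / fps / 60.0
    return {
        "peak_depth_cm": float(np.max(depths) * 100),
        "rate_per_min": compressions / duration_min,
    }

# Synthetic 10-second signal at 30 fps: ~110 compressions/min, 5 cm deep
fps = 30.0
t = np.arange(0, 10, 1 / fps)
chest = 1.0 - 0.025 * (1 - np.cos(2 * np.pi * (110 / 60.0) * t))
print(analyze_compressions(chest, fps))
```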

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Yon Visell is an assistant professor at Drexel University focusing on haptic display engineering at the RE Touch Lab. His research tries to figure out the neuroscientific and physical basis of human tactile sensation and perception. It's in the process of trying to model physical sensations in the hands that he started to look at stochastic physics models, which have random probability distributions that can be analyzed statistically but not predicted precisely.

At IEEE VR, Yon was presenting a long paper titled "Fast physically accurate rendering of multimodal signatures of distributed fracture in heterogeneous materials." In essence, he's able to model what it sounds and feels like when wood breaks, at a sampling frequency of 48 kHz. There are thousands of fibers that could be modeled individually, but that would be too computationally intensive for a real-time physics simulation. However, there's a fairly consistent probability distribution for how these fibers fail that can be approximated with remarkable accuracy.

Here’s the abstract of his paper:

Abstract: This paper proposes a fast, physically accurate method for synthesizing multimodal, acoustic and haptic, signatures of distributed fracture in quasi-brittle heterogeneous materials, such as wood, granular media, or other fiber composites. Fracture processes in these materials are challenging to simulate with existing methods, due to the prevalence of large numbers of disordered, quasi-random spatial degrees of freedom, representing the complex physical state of a sample over the geometric volume of interest. Here, I develop an algorithm for simulating such processes, building on a class of statistical lattice models of fracture that have been widely investigated in the physics literature. This algorithm is enabled through a recently published mathematical construction based on the inverse transform method of random number sampling. It yields a purely time domain stochastic jump process representing stress fluctuations in the medium. The latter can be readily extended by a mean field approximation that captures the averaged constitutive (stress-strain) behavior of the material. Numerical simulations and interactive examples demonstrate the ability of these algorithms to generate physically plausible acoustic and haptic signatures of fracture in complex, natural materials interactively at audio sampling rates.
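To give a flavor of the inverse transform method the abstract mentions, here's a toy sketch that samples power-law-distributed jump sizes by inverting a CDF and scatters them across a 48 kHz timeline to form a time-domain stochastic jump process. The distribution, exponent, and event rate are placeholder assumptions, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

FS = 48_000            # audio/haptic sampling rate, Hz
DURATION = 0.5         # seconds of synthesized crackling
ALPHA = 2.0            # assumed power-law exponent for jump sizes
S_MIN = 1e-3           # minimum jump size

def sample_jump_sizes(n: int) -> np.ndarray:
    """Inverse-transform sampling of a Pareto-type distribution:
    if U ~ Uniform(0,1), then s = s_min * U**(-1/(alpha-1)) follows
    a power law with exponent alpha."""
    u = rng.random(n)
    return S_MIN * u ** (-1.0 / (ALPHA - 1.0))

num_samples = int(FS * DURATION)
num_events = 2000                                  # assumed event count
signal = np.zeros(num_samples)
event_times = rng.integers(0, num_samples, num_events)
np.add.at(signal, event_times, sample_jump_sizes(num_events))

stress = np.cumsum(signal)   # a simple time-domain stochastic jump process
print(stress[-1])
```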

Yon also talks about the latest and most cutting-edge haptics research, including what the best research-grade machines are capable of, such as simulating force feedback at a sampling frequency of 5,000 to 10,000 samples per second with a desired latency of one sample.
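That latency budget is worth spelling out: one sample of delay at those rates is a fraction of a millisecond.

```python
# Quick arithmetic on the numbers above: at 5,000-10,000 force-feedback
# updates per second, a "one sample" latency budget works out to
# 0.1-0.2 milliseconds.
for rate_hz in (5_000, 10_000):
    latency_ms = 1_000 / rate_hz
    print(f"{rate_hz} Hz -> {latency_ms:.2f} ms per sample")
# 5000 Hz -> 0.20 ms, 10000 Hz -> 0.10 ms (i.e. 100 microseconds)
```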

He also discusses the frequency and bandwidth requirements of other haptic channels, including the detection of temperature changes, pressure changes, and our sense of touch.

Here’s an excerpt from Yon’s RE Touch Lab describing the overall research focus:

What do we feel when we touch and manipulate objects in the world?

How do mechanical signatures of contact elicit conscious percepts of touch?

How is touch perception enabled through structural and functional specializations in the body?

In our lab, we study mechanisms underlying haptic perception and action, and the neuroscientific and biomechanical basis of touch. Our long-term goal is to uncover the biological (neural and mechanical) computations that enable haptic interaction, when movement-dependent sensory signals are concurrently available via multiple perceptual channels.

We conduct theoretical and behavioral studies of haptic perception to illuminate contributions of different mechanical cues and motor behaviors. We aim to develop novel hypotheses about how real haptic objects are perceived, and how they can be simulated with new technologies.

Yon says that the two biggest haptics conferences in the US are the IEEE World Haptics Conference, which just happened June 22-26 in Illinois, and the IEEE Haptics Symposium 2016, which is coming up next year.

After talking to Yon and other researchers at the IEEE VR conference, it's pretty clear to me that haptics is going to be very use-case specific. A generalized VR haptic solution that's affordable to consumers feels like it's a long way out, perhaps 5-15 years, especially if you imagine trying to achieve a 5-10 kHz sampling rate combined with less than 100 microseconds of latency. In the meantime, we can get the most out of task-specific haptic devices that provide just enough haptic feedback to give that extra amount of presence and immersion.

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.