Nathan Burba of Survios talks about some of the custom VR hardware, software, and games that they're developing after recently receiving a $4 million investment.

He talks about the Zombies on the Holodeck experience, and what they're creating in order to have a more untethered VR experience where the user doesn't feel constrained or limited by being in VR. Their goal is to create more natural user interaction within VR in order to create a deeper sense of immersion and presence.

Nathan also talks about his response to Ben Kuchera's article, "Let's put down the guns in virtual reality, and learn to pick up anything else." He sees VR as an opportunity to live out our action-filled fantasies of having lightsaber battles or shooting zombies in the face.

He also makes the observation that you have to innovate one step at a time with the VR medium. For example, recreating the experience of going to the beach requires so much haptic feedback to feel real that it becomes one of the most challenging problems to solve. Shooting things in video games is a well-established game mechanic that is fairly easy to implement, and that's why they started there. Plus, zombies aren't real, and it's one of his action-filled fantasies that VR allows him to live out with their Zombies on the Holodeck experience.

He talks about his ideas of the mobile, PC, and IMAX location-based tiers within VR, and how Survios is going down the path of creating optimized hardware and software so that VR isn't limited by the computer systems that exist today. And finally, he shares his vision of the potential of VR to allow us to return to our fundamental humanity: running around and playing games, exploring exciting virtual worlds, and expressing our full creative potential.

Reddit discussion here.

TOPICS

  • 0:00 – Intro to Survios, Hardware, Software and Game development
  • 0:33 – Recent $4 million round, new tech and new product to push VR forward
  • 1:07 – Zombies on the Holodeck. Feel like you’re in an 8x8ft space. Visceral experience where there’s natural interactions to have a tremendous amount of presence.
  • 1:54 – Tracking technology, wearable, server, optical camera, VR HMD, Hydras to give sense of presence and be free within a world
  • 2:30 – Going for an untethered experience where you don’t feel any limits or constraints
  • 2:53 – Safety issues with untethered VR. Pieces of tech to help out with that
  • 3:15 – Project Holodeck at USC doing Kinect research; learned everything he knows about VR HMDs from Palmer Luckey. Wanted to take VR and make games with it rather than just art-based and academic research.
  • 4:39 – Ben Kuchera's article about being tired of shooting zombies in the face. Don't innovate on too many aspects at once, so they took an easy-to-implement idea and started there.
  • 5:59 – Response to violence in VR. Hard to simulate going to the beach because it's a felt experience. Killing zombies is an effective experience. VR is about living out our action-filled fantasies.
  • 7:37 – Nonny de la Pena’s immersive journalism and untethered VR experiences. Using VR for emotional appeal. Push the VR medium forward to make it easier to make this type of content
  • 8:49 – VR tiers: mobile, PC-based, and IMAX location-based approach. Based upon the computing systems that we have in society today: consumer PCs and then super powerful, location-based computers. The best virtual reality experiences will be on next-gen computer systems that are tailor-made for VR, something that Survios is focusing on.
  • 10:10 – Don’t want to be limited by computer systems that exist today
  • 10:41 – VR will let you feel human again: run around, jump, play sports and play games. Games are fundamental, and VR will get us back to our fundamental humanity of playing games with each other, exploring worlds, and manifesting our creative potential.

Theme music: “Fatality” by Tigoolio

Jesse Joudrey of Jespionage Entertainment talks about the weekly VR Chat gatherings that have been happening, and how meeting people in virtual spaces compares to meeting them in real life. He talks about what types of body language cues translate, and how they'll be expanding that with hand gestures.

Jesse also talks about the process of stress testing gatherings in VR Chat, where the current bottleneck is too many people speaking at the same time. They're currently limited by what the uSpeak Unity plug-in provides, but they can also host additional servers to help with the load. He also talks about how the VR Chat SDK provides the ability to customize your own avatar's look and animations as well as create customized spaces. Gunter has also gathered an archive of past VR Chat meet-ups.
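
As a rough illustration of why simultaneous speakers become the bottleneck, here's a back-of-the-envelope sketch. The 16 kbps per-stream rate and the central-relay topology are my own assumptions for illustration, not uSpeak's actual figures:

```python
# Back-of-the-envelope look at why simultaneous speakers are the bottleneck
# for a central voice relay. The 16 kbps per-stream rate and the relay
# topology are illustrative assumptions, not uSpeak's actual numbers.

def relay_upload_kbps(speakers, listeners, stream_kbps=16.0):
    """A relay forwards each active speaker's stream to every other
    participant, so upload bandwidth grows with speakers * listeners."""
    return speakers * (listeners - 1) * stream_kbps

for n in (2, 10, 30):
    # worst case: everyone in an n-person gathering talks at once
    print(f"{n:>2} people all speaking -> {relay_upload_kbps(n, n):8,.0f} kbps upload")
```

With one or two speakers the load is trivial, but when everyone in a large room talks at once it grows roughly quadratically, which is why adding servers helps.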

Finally, he talks about his VR Jam game Snow Drift, an extreme sports game. He was surprised that it made a lot of people motion sick, since he doesn't suffer from any symptoms of simulator sickness himself, which he talks about as well.

Reddit discussion here.

TOPICS

  • 0:00 – Intro to VR Chat
  • 0:30 – The VR Chat experience. Almost like being in reality in that you’re meeting other real people from around the world.
  • 1:13 – Body language that translates. Where people are looking. Hand tracking is coming. Fidgety people also move around a lot in VR
  • 2:00 – VR Chat stress tests. How to measure bottlenecks. Simultaneous in-voice communication
  • 3:12 – Different challenges with audio. They use the uSpeak Unity plug-in. Can add servers to help with the load. Lots of data is being pushed when people speak simultaneously
  • 4:15 – VR Chat SDK – Import avatar into the chat. Create own animations, and their own meet-up spaces. Build environments to express themselves.
  • 5:30 – Can people host chats on their own server?
  • 5:56 – What it’s like meeting people face-to-face after communicating with them in VR Chat.
  • 6:38 – What’s next in VR Chat? More personalization.
  • 7:16 – Snow Drift, extreme sports VR Jam game.
  • 7:53 – Doesn’t get motion sickness.
  • 8:39 – Get more info at Snow Drift and VR Chat and Jespionage Entertainment

Theme music: “Fatality” by Tigoolio

I’m joined by the Kite and Lightning team including co-founders Cory Strassburger & Ikrima Elhassan as well as developer/VFX artist John Dewar.

They talk about the creative process behind converting the mini-opera Senza Peso into a 2D motion graphics film and then into an immersive virtual reality experience, which created some impressive buzz within the VR community.

They also discuss a number of the reasons why they chose Unreal Engine 4 over Unity 3D, including how it enables them to more rapidly prototype the look and feel of their VR experiences, and the added control that comes from being able to change the engine's source code. They also talk about the decision to record stereoscopic video of the characters rather than using motion-captured avatars.

Cory also talks about his background working on the sci-fi film Minority Report, and his interest in helping develop 3D user interfaces in VR as demonstrated in The Cave & The K&L Station experiences.

Finally, everyone talks about some of the major takeaways and lessons learned from working on all of their VR experiences over the past year, where they see VR going, as well as how many exciting, open questions there are right now.

To keep up with all of the latest developments with Kite and Lightning, be sure to sign up for their newsletter at the bottom of their website here.

Reddit discussion here.

TOPICS

  • 0:00 – Intros
  • 0:51 – Backstory behind Senza Peso. Getting a DK1 changed everything. Switching to Unreal Engine
  • 2:56 – Comparing Unreal Engine to Unity, and what UE4 provides
  • 5:25 – Translating the story to a 2D motion graphics film, and then translating it into a cinematic VR experience
  • 9:35 – How they did the character capture with stereoscopic video
  • 11:06 – Programming challenges for creating this cinematic VR experience
  • 12:47 – Visual design considerations & working with the Unreal Engine 4 in contrast to what the workflow would’ve been with Unity.
  • 15:29 – Ikrima's takeaways from working on this project, and Kite and Lightning's
  • 17:14 – 3D user interface prototypes in the Cave & insights from working on sci-fi films like Minority Report
  • 21:51 – Other 3DUI interface insights from the VR community including Oliver Kreylos’ Virtual Reality User Interface (Vrui)
  • 25:56 – Tradeoffs between file sizes in using different motion capture techniques
  • 31:38 – Experimenting with experiences that are on-rails, trigger-based, or completely open-world
  • 35:17 – What type of innovations they’re working on in terms of motion capture and graphics. Optimizing their production pipeline processes.
  • 37:14 – Lessons learned for what works and doesn’t work within VR
  • 44:51 – The ultimate potential for what VR can provide
  • 52:35 – What’s next for Kite and Lightning

Theme music: “Fatality” by Tigoolio

Other related and recommended interviews:

Matt Bell of Matterport on their 3D camera for capturing physical spaces. The Matterport camera and software seem like a great solution if you want to quickly convert a physical space into a 3D model to use within a game context. Here's the 3D scan that Paul Grasshoff from Matterport captured of the SVVRCon 2014 exhibition space, which was then imported into VR Chat.

At the moment, the primary use cases for the camera have been real estate, home improvement, remodeling, construction, insurance, documentation of spaces, and crime scene investigation. But since they provide a textured mesh in the form of an *.obj file, you can scan a room and within half an hour get the file and import it into a VR experience that you're developing.

Matterport is primarily a software company, and they use an iPad app to control their $4500 professional camera. One thing to note is that they charge a monthly fee ranging from $49 to $149 to scan, process, and host a number of different files, so there is a recurring fee to actually use the camera.

You can either host the 3D model with Matterport if you need other people to view it online, or you can just export the textured mesh and start manipulating it for your game.
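
For a sense of what that *.obj export contains, here's a minimal sketch of reading the geometry out of it. The file name is hypothetical, and a real Matterport export typically also references an .mtl material file and texture images, which this skips:

```python
# Minimal sketch of pulling geometry out of a Wavefront .obj file like the
# one Matterport exports. A real export typically also references an .mtl
# material file and texture images; this reads only positions, UVs, faces.

def load_obj(path):
    vertices, uvs, faces = [], [], []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == "v":      # vertex position: v x y z
                vertices.append(tuple(map(float, parts[1:4])))
            elif parts[0] == "vt":   # texture coordinate: vt u v
                uvs.append(tuple(map(float, parts[1:3])))
            elif parts[0] == "f":    # face: f v/vt/vn ... (1-based indices)
                faces.append([int(p.split("/")[0]) - 1 for p in parts[1:]])
    return vertices, uvs, faces

# "scanned_room.obj" is a hypothetical file name for a Matterport export
verts, uvs, faces = load_obj("scanned_room.obj")
print(f"{len(verts)} vertices, {len(faces)} faces")
```

Since it's a plain textured mesh, any engine or DCC tool that reads *.obj can take it from there.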

It can also measure distances within these models, and it's dimensionally accurate to around 1%, which is good enough for remodeling and furniture placement, and certainly good enough to quickly create 1:1 scale environments for VR. The current camera has a range limit of about 30 feet, but cameras with larger ranges will be coming in the future.

Matt also talks about how Matterport is developing some apps for Google’s Project Tango & Intel’s RealSense mobile 3D cameras, as well as lighting considerations when doing a 3D scan of a physical space.

Reddit discussion here.

TOPICS

  • 0:00 – Intro of Matterport. Solving content creation problem of making 3D models of physical spaces. Place a Matterport camera in a space, take a number of shots, and it’ll build a model within ~30 minutes.
  • 0:42 – What are people using this for? Real estate, home improvement, remodeling, construction, insurance, documentation of spaces, crime scene investigation
  • 1:03 – What is the technology behind Matterport? PrimeSense 3D sensors that capture depth and color. Matterport software puts together the 3D puzzle pieces to create a coherent model of the space that's dimensionally accurate.
  • 1:57 – Using an iPad for the software end to control the camera. Uses the CPU to align the pieces & GPU to display 3D models.
  • 2:25 – What's the cost of a Matterport camera? Aimed at professionals at $4500. Writing apps for Google's Project Tango and Intel's RealSense mobile 3D cameras. Built a demo for Project Tango to scan a room.
  • 3:21 – What’s the output from Matterport? Textured 3D mesh. They allocate the polygons to describe complex models and how they’re optimizing the 3D model.
  • 4:21 – What are some applications for how people are using Matterport? Scanning famous monuments and world cultural treasures, and using the Oculus Rift to have an immersive experience of a space. Take spaces you care about and start to experiment with them. Make a model of your house, and you can play a game in your house or do remodeling with drag-and-drop furniture placement.
  • 5:48 – Measuring distances within these models. Dimensionally accurate down to around 1%, which is good enough for remodeling and furniture placement.
  • 6:24 – What type of file is it, and can you alter the file? Most people just leave the models on the Matterport platform and use its embed code. You can download the model as a *.obj and then edit the textured mesh just like any other 3D file.
  • 7:15 – Considerations for lighting? We have the real world as our rendering engine. Generally light the room so it looks pleasing to you as you walk around it. What if you wanted to turn off a light later? You could take the geometry and repaint it later.
  • 8:28 – Have people used it to scan people and faces? Not the focus. More focused on spaces. Mobile app will be optimized for more use cases.
  • 9:15 – Is there a scanning pattern to avoid occlusion issues? Not really; just be sure the camera can see all parts of the room.
  • 9:49 – In a room with high ceilings? What is the range? ~30ft high is the current limit. There are nextgen sensors that have a greater range. Matterport is primarily a software company.
  • 10:30 – Matterport.com is where you can see more models and see their 3D cameras.

Theme music: “Fatality” by Tigoolio

Tony Davidson is the developer of the puzzle-based adventure game Ethereon. He talks about the process of creating a world that you can explore and interact with, where the puzzles are well integrated into the environment and make sense. He liked to take things apart and put them back together as a kid, and wanted to create a slower-paced VR experience that appeals to this type of exploration.

He talks about his approach to optimizing his VR experience by using only 270k polygons and a handful of texture maps. He set his target platform to Android, and was very mindful of getting frame rates ranging from 120 to 300 frames per second.
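
Some quick arithmetic shows why those targets are demanding: the per-frame time budget shrinks quickly as the frame rate climbs, which is why the tight polygon cap matters.

```python
# Per-frame time budget at various frame-rate targets. The 60 fps row is
# added just as a familiar baseline for comparison.
for fps in (60, 120, 300):
    print(f"{fps:>3} fps -> {1000.0 / fps:5.2f} ms per frame")
```

At 300 fps the whole scene has to render in about 3.3 ms.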

Tony also talks about the process of using RiftUp to upgrade the resolution of his Oculus Rift. He shares his thoughts on creating more immersion by avoiding photorealism and creating a less familiar, lucid-dream type of environment where it's easier to suspend your disbelief.

Finally, he talks about his vision for creating personal and solo experiences within VR, and his hope that slower-paced adventure and puzzle-based games will have a resurgence within VR.

Reddit discussion here.

TOPICS

  • 0:00 – Intro. Ethereon puzzle game. Solo project. Exploration game. Worked on Riven. Has an adventure background. Learned 3D because of VR, and made a demo which got him a job. In promo mode now.
  • 1:46 – Myst and Riven were old-school, pre-rendered games. Explore a world and interact. You figure things out to explore more. Ethereon is a physics-based game with puzzles well integrated into the environment.
  • 3:34 – What happens when people can't figure it out? Some people just give up. Other people really get it. He liked to take things apart as a kid, and people like that will appreciate it. VR mimics reality. Put them in a world where they have to figure it out.
  • 4:48 – How to balance moving people forward without giving too many spoilers. The community will share hints and spoilers. It's a slower-paced game. People who like to run and gun are the ones who have more trouble.
  • 6:35 – Slower pace, and dealing with VR simulator sickness. Had a hard time with it.
  • 7:46 – RiftUp kit details to get higher definition within Oculus Rift
  • 9:59 – Does it account for the barrel distortion?
  • 10:37 – AMOLED screens will have low persistence relative to LCD.
  • 11:13 – Frame rates and adapting for higher resolution. Only 270k polygons. Targeting Android as a minimum platform. Getting 120-300fps
  • 13:02 – Photorealism vs. more immersion in a less photo-real, low-poly world. Reflections and other effects help trick people into believing it. Non-photo-real is less familiar, and it's easier to suspend your disbelief. Reality and lucid dreams.
  • 15:54 – High contrast working better in VR. Setting it in space.
  • 16:43 – Potential of VR. Not a big fan of the metaverse, and more about creating a personal and solo experience. Adventure, puzzle-based games and slower-paced games to have a resurgence.

Theme music: “Fatality” by Tigoolio

Caitlyn Meeks is one of the creators and the current manager of the Unity Asset Store, a marketplace that is changing game development for both game developers and content creators. She describes how Unity has built an extensible framework where you can extend its functionality through the Asset Store, so its feature set isn't fixed by what ships in official releases.

The Asset Store innovates in many different areas, and Unity goes through a slow and methodical process to eventually integrate some of those features into its core engine.

Caitlyn talks about what differentiates Unity from other game engines, and how Unity is responding to recent pressure in the marketplace from Unreal Engine 4. They're staying the course with their plan and roadmap, focusing on creating a streamlined user experience for game developers. She sees Unity as an unstoppable tortoise that may not always be first to market with new features, but implements them well, methodically, and with love.

She mentions some of the VR-specific plug-ins on the Asset Store, including SDKs for Sixense and Leap Motion, Cast AR, CAVE projection systems, and DIS HLA interfaces. She also talks about the free Unity Multipurpose Avatar (UMA) plug-in, a Unity-sponsored avatar creation tool.

Finally, she talks about her vision for how VR will change the humanities and expression in a way that makes us more human and grateful to be alive.

Reddit discussion here.

TOPICS

  • 0:00 – Unity Asset Store. Provides content to game and VR developers: 3D audio, textures and music, and scripts and tools that extend Unity. Exponential growth since 2010. 750k active users forming a community of content producers. It's an ecosystem where people are helping each other. People can make a living off of selling assets; it's one of the top 3 reasons for using Unity. Bedroom artists are making $10k-$100k, and a senior artist from Ubisoft is making more from Unity Asset Store sales. It's changing game development for developers and content creators.
  • 3:08 – How does it differentiate from other engines? Unity is extensible. Asset store can be a stopgap for providing new features. Unity isn’t a static product. New functionality is coming through the asset store. Unity feature list doesn’t include all of the functionality available through the asset store. You don’t need access to the source code to create your dream tool in Unity.
  • 4:45 – What are some popular VR plug-ins? First submission from Palmer Luckey two years ago. SDK for Sixense and Leap Motion, Cast AR, CAVE projection systems, and DIS HLA interfaces. UMA is the Unity Multipurpose Avatar, which is a Unity-sponsored, avatar creation tool.
  • 6:42 – Binaural audio plug-ins to enable positional audio. New audio implementation coming in Unity 5.
  • 7:20 – Lots of excitement for UE4 and EVE: Valkyrie moving from Unity to UE4, and what is Unity’s approach for counteracting this? Staying the course with a solid product with an unparalleled workflow. Lots of new features coming in 5.0 that have been on the roadmap. Not going to change anything drastically. Always been a bit behind, but they do it well, methodically and with love. They’re an unstoppable tortoise.
  • 8:55 – Your vision for what you want to see happen in VR. Fan of Cyberpunk and Snow Crash and Second Life. Coming from an artist’s perspective, and VR will be one of the most significant developments in the humanities and human expression. So many different possibilities for creating worlds and experiences that are beautiful, horrifying, mechanical, alien, etc. Goal is to see beautiful morphologies emerge. See worlds, spaces and scenarios that make us more grateful to be alive and things that make us more human.

Theme music: “Fatality” by Tigoolio

Mike Sutherland of YEI Technology talks about the PrioVR immersive body suit, which aims to immerse your whole body into VR experiences. YEI Technology is bringing this technology that they’ve been developing for the military into the consumer gaming market.

They're hoping to provide the first consumer-grade motion capture suit with their Pro version for $429, and also have a Core full-body option for casual gamers at $369, as well as an upper-body-only Lite version for $289.

Mike talks about succeeding with the PrioVR Kickstarter the second time around, their custom motion controllers, game development plans, interest in finger tracking, target time for suiting up, and more details about their 3-Space Sensor technology.

Their website describes these sensors as “miniature, high-precision, high-reliability, Attitude and Heading Reference Systems (AHRS) / Inertial Measurement Units (IMU). Each YEI 3-Space Sensor uses triaxial gyroscope, accelerometer, and compass sensors in conjunction with advanced processing and on-board quaternion-based Kalman filtering algorithms to determine orientation relative to an absolute reference in real-time.”
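
The 3-Space sensors do quaternion-based Kalman filtering on-board; as a much simpler stand-in that conveys the same fusion idea, here's a one-axis complementary filter. The sample readings and update rate are made up for illustration, and this is not YEI's algorithm:

```python
# Complementary-filter sketch of inertial orientation fusion -- a simpler
# stand-in for the quaternion-based Kalman filtering the 3-Space sensors
# run on-board. 1-D pitch only; the sample readings below are made up.

def fuse_pitch(pitch_deg, gyro_dps, accel_pitch_deg, dt, alpha=0.98):
    """Integrate the gyro for responsive short-term tracking, then blend in
    the accelerometer's gravity-derived pitch to cancel gyro drift."""
    return alpha * (pitch_deg + gyro_dps * dt) + (1.0 - alpha) * accel_pitch_deg

pitch = 0.0
readings = [(10.0, 0.5), (9.5, 1.4), (0.2, 2.1), (-0.1, 2.0)]  # (gyro, accel)
for gyro, accel in readings:
    pitch = fuse_pitch(pitch, gyro, accel, dt=0.01)  # 100 Hz update rate
print(f"fused pitch estimate: {pitch:.3f} degrees")
```

The gyro gives fast, smooth updates but drifts over time, while gravity (and the compass, for heading) provides an absolute reference; the filter blends the two, which is the same job the Kalman filter does with better statistical weighting.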

For one of the most comprehensive reviews of this technology, I'd recommend checking out this epic review of SVVRCon gear by Oliver "Doc_Ok" Kreylos.

Reddit discussion here.

TOPICS

  • 0:00 – Intro. PrioVR, the first consumer-level, fully immersive gaming suit. Bringing VR into the next stage by bringing your body into VR.
  • 0:39 – Low-cost motion capture suit. Working with motion sensors for the military, and bringing that technology to the consumer level. Core suit configuration. The Pro version for motion capture puts AAA game toolsets into the hands of independent devs. Also an upper-body-only option for seated VR.
  • 2:07 – Arm controls. Currently have an aftermarket controller, and they’ll be shipping with custom controllers.
  • 2:52 – Kickstarter history. Launched an unsuccessful Kickstarter. Learned lessons, focused on improving the suit, and did better marketing the second time.
  • 3:44 – Using PrioVR for motion capture. The Pro suit with 17 sensors costs around $400. Get rich character animations. They have an existing motion capture studio, and have tooling around that.
  • 4:36 – Game titles that will be available. Have an in-house dev team, and getting it into the hands of developers as quickly as possible. Some partnerships developing.
  • 5:32 – What types of user interactions are possible? Doing full-joint reconstruction rather than inverse kinematics. Keyboards and mice don't work in 3D. Use your hands and reach out; it's more intuitive for non-gamers. Don't have to remember buttons.
  • 6:56 – Finger tracking plans? Just focusing on the suit for now, but interested in it.
  • 7:19 – Talk about the types of sensors and fusion system that you’re using. Untethered experience and range.
  • 8:07 – How long does it take to suit up? Working with design companies to make it easier to get on and off than the alpha suit. Target is 15 seconds.
  • 8:44 – What type of latency can you get?
  • 9:26 – What's the roadmap for when these will be available? Later this year for dev kits.
  • 10:08 – Price points for the different products: $289 upper body, $369 Core suit, $429 Pro suit. The first sub-$1000 motion capture suit to be available. More info at PrioVR.com. Available for pre-order now.

Theme music: “Fatality” by Tigoolio

Philip Rosedale is the creator of Second Life, and more recently High Fidelity. He talks about a lot of the things that he's doing differently in creating a virtual world the second time around, including a focus on 3D audio, low latency, and the speed and texture of the experience, as well as using a standard scripting language, JavaScript, rather than rolling their own.

He talks about virtual body language and how 100ms of latency is the target threshold for a compelling telepresence experience that is indistinguishable from face-to-face interaction.
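
To make that 100ms threshold concrete, here's an illustrative end-to-end budget. Every stage timing below is an assumption for illustration, not a measured High Fidelity number:

```python
# Illustrative end-to-end latency budget against the ~100 ms telepresence
# threshold Philip cites. All stage timings are assumptions, not measured
# High Fidelity numbers.

budget_ms = {
    "face/motion capture": 16,       # one camera frame at 60 Hz
    "encode + packetize": 5,
    "network (one way)": 40,
    "decode + apply to avatar": 5,
    "render + display": 20,
}

total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:<26}{ms:>4} ms")
print(f"{'total':<26}{total:>4} ms ({'under' if total < 100 else 'over'} the 100 ms threshold)")
```

The point is how little slack there is: a single camera frame plus a cross-country network hop already consumes more than half the budget, which is why latency shapes every design decision.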

Philip talks about how High Fidelity wants to create a set of open standards and protocols so that people can host their own virtual worlds on their own servers. He also talks about their approach to distributed computing to help offload the computing power needed to run a complex and nuanced virtual world, and how mining a cryptocurrency could be a part of that process.

Finally, he talks about his vision for the future of the Metaverse, and how these virtual worlds will provide opportunities for participants to be more thoughtful, more open, and more creative than they can be in the real world. He doesn’t see that these worlds are necessarily escapist since they can be as sophisticated, complex, navigable and challenging as the real world. His experience with Second Life was that you have to be just as capable, smart and entrepreneurial to succeed in virtual world environments.

Reddit discussion here.

Be sure to check out this blog post on High Fidelity’s system-level architecture for more details.

TOPICS

  • 0:00 – Intro – High Fidelity. New virtual world taking advantage of changes in technology
  • 0:32 – Motion sensors and the Oculus Rift are driving changes in virtual worlds. The drive is to interact naturally in 3D virtual spaces, and the requirement to learn the skill of using a mouse and keyboard is going to end soon.
  • 1:33 – What types of interactions have you had within High Fidelity with these new tools? Body language, and seeing someone blink. Nodding head is important. Moving hands is remarkable. Got PrioVR working for full upper body animation. Group interactions and face-to-face interactions.
  • 2:47 – Facial capture with either a 3D camera or a webcam with Faceshift, and reconstruct it via 50 floating point numbers. Aiming to get less than 100ms latency to mimic 1-to-1 interactions
  • 3:48 – Using a VR HMD and facial capture at the same time: can only get one at a time. Oculus is thinking about doing facial capture. Can use a 3D TV and adjust the view as an intermediary between a full VR HMD and a computer screen.
  • 4:54 – Using High Fidelity as a telepresence tool. Use it with their distributed team, and cool to see others.
  • 5:35 – Good enough for enterprise use? Proof point: in a recording of the same story told with the same avatar, you can identify people even without sound.
  • 6:20 – Distributed computation at High Fidelity. Limited by centralized hosting. Distribute to many small computers quickly, using computers at home to offload some of the processing.
  • 7:30 – Dynamic multicasting with audio. Mixing it in 3D. Dynamically assembling a multicast repeater and can perform a concert in real-time with less latency than in the real world.
  • 8:47 – What is a voxel, and how are you using it? A way to organize space virtually. Represents what things look like at a distance, which enables seeing to infinite distance. See full mesh topology up close.
  • 10:06 – Hierarchical nesting of voxels for the decomposition of space with a "sparse voxel octree" (see the sketch after this list), and then distributed computing with those. Can create an infinitely complex city.
  • 10:59 – Other things they're doing differently from Second Life: audio processing, low latency, speed and texture of experience, and using a standard scripting language, JavaScript, rather than rolling their own. People want to run their own servers, so it's a protocol and open-source standard rather than a world unto itself.
  • 11:59 – Cryptocurrency and paying people for helping run the virtual world.
  • 12:56 – How is identity different on High Fidelity? By default, you're anonymous, using OAuth and SSL for authorization on certain secure sites, but there are also a lot of open worlds. Having a name floating over your head is not a great solution, because sharing your name is a choice and a form of greeting.
  • 14:23 – Future of the Metaverse. Create a set of interconnected virtual worlds, where they’re living adjacent to each other. Instead of hypertext links, there will likely be doors. Virtual worlds of the future will be a set of interconnected spaces like the real world. There will be hidden servers that you can’t get to, just as there are private intranets.
  • 15:34 – What inspires you about what you want to see? How people are changed by virtual worlds for the better: more thoughtful, more open, more creative. Virtual worlds are our future. They will become a real added space, and it'll be a profound expansion of the real world.
  • 16:35 – Are virtual worlds escapist? Technology is giving us the ability to create worlds that are just as sophisticated, complex, navigable, and challenging as the real world. They're only escapist if you're escaping from other people, or simplifying the world too much in a way that isn't in our best interest. To be successful in Second Life you have to be capable, smart, and entrepreneurial.
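
For the curious, here's a toy version of the sparse voxel octree mentioned above: each node stores a coarse summary of its region, so a distant viewer can stop descending early while a nearby viewer recurses to full detail. This is a generic sketch of the data structure, not High Fidelity's implementation:

```python
# Toy sparse voxel octree. Each node keeps a coarse summary (here, a color)
# of its region, so distant geometry can be drawn from a shallow level while
# nearby space recurses to full detail. A generic sketch, not High
# Fidelity's implementation.

class OctreeNode:
    def __init__(self):
        self.children = [None] * 8  # sparse: only occupied octants allocated
        self.color = None           # coarse summary used for distant views

def _octant(x, y, z, half):
    # Pack the three "which half?" tests into a 0-7 child index.
    return (x >= half) | ((y >= half) << 1) | ((z >= half) << 2)

def insert(node, x, y, z, color, size=1.0, depth=8):
    """Insert a colored point; coords are relative to the node's corner."""
    node.color = color  # crude summary; a real tree would average children
    if depth == 0:
        return
    half = size / 2.0
    i = _octant(x, y, z, half)
    if node.children[i] is None:
        node.children[i] = OctreeNode()
    insert(node.children[i], x % half, y % half, z % half, color, half, depth - 1)

def sample(node, x, y, z, max_depth, size=1.0):
    """Read back a color, descending at most max_depth levels -- i.e.,
    distant geometry is drawn from a shallow, coarse level of the tree."""
    half = size / 2.0
    child = node.children[_octant(x, y, z, half)]
    if max_depth == 0 or child is None:
        return node.color
    return sample(child, x % half, y % half, z % half, max_depth - 1, half)

root = OctreeNode()
insert(root, 0.3, 0.7, 0.1, color=(200, 180, 90))
print(sample(root, 0.3, 0.7, 0.1, max_depth=1))  # coarse, "far away" view
print(sample(root, 0.3, 0.7, 0.1, max_depth=8))  # full-detail view
```

Because empty octants are never allocated, the structure stays sparse, and because every interior node carries a summary, level of detail falls naturally out of how deep you choose to descend.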

Theme music: “Fatality” by Tigoolio

Here’s a recent talk that Philip Rosedale gave about High Fidelity. Note that this is NOT within High Fidelity, but a government virtual world called MOSES, which is the “Military Open Simulator Enterprise Strategy.”

Four and a half years ago, George Burger was watching his son play Call of Duty, and thought that it'd be great if he could run around instead of just sitting while playing video games. When he discovered that there weren't any omnidirectional treadmills on the market, he decided to build one himself.

This resulted in the InfinAdeck, his second prototype of the omnidirectional treadmill, which was a surprise hit at SVVRCon. George describes it as being like a tank tread, with each tread being its own Y-direction treadmill running 90 degrees perpendicular to the main, X-direction treadmill.

George talks about the process of designing and building this second prototype, and what’s next in terms of designing an automatic control mechanism. He had shown earlier prototypes on the Meant to be Seen 3D forums, and had even been in contact with Palmer Luckey about his progress.
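
Since the automatic control mechanism is still being designed, here's one plausible scheme sketched out: drive the X and Y belts in proportion to the user's offset from center. This is a generic P-controller illustration, not InfinAdeck's actual design:

```python
# A plausible auto-centering scheme for an omnidirectional treadmill: drive
# the X and Y belts in proportion to the user's offset from center. George's
# control mechanism wasn't finished at interview time, so this P-controller
# is a generic illustration, not InfinAdeck's actual design.

def belt_speeds(offset_x_m, offset_y_m, gain=1.5, max_speed=1.6):
    """Belt velocity opposes the walker's displacement so they drift back to
    center; max_speed of ~1.6 m/s is roughly the 3.5 mph jogging figure."""
    def clamp(v):
        return max(-max_speed, min(max_speed, v))
    return clamp(-gain * offset_x_m), clamp(-gain * offset_y_m)

# user has wandered 0.4 m forward and 0.2 m to the right of center
vx, vy = belt_speeds(0.4, 0.2)
print(f"belt command: x = {vx:.2f} m/s, y = {vy:.2f} m/s")
```

A real controller would need damping and prediction so the deck doesn't lurch underfoot, which is presumably the hard part Doc_Ok offered to help with.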

Finally, he talks about how InfinAdeck compares to the Virtuix Omni, as well as his plans for the next iteration.

Reddit discussion here.

TOPICS

  • 0:00 – Intro & the InfinAdeck omnidirectional treadmill project, started 4.5 years ago. Inspired by watching his son play Call of Duty. Worked on it for a year and made an original prototype. Designed the 2nd prototype and started building it in October 2013. It's a prototype, and it's overdesigned and too big, but now he knows what it takes to build the next iteration.
  • 1:42 – Describe what's happening with the two conveyor belts. Difficult to describe, but it's like a tank tread where each tread is its own treadmill running at 90 degrees.
  • 2:26 – Able to turn while walking, and how does the control mechanism work? Two motors: one for X and a transverse Y motor. Two omniwheels, geared differently. Uses noisy parts for now, which will change. The motors are driven by a motor controller, and the control mechanism isn't finished yet.
  • 4:08 – There are manual controls that he's using at the moment.
  • 4:47 – Has anyone fallen down? No. No one has ever fallen down.
  • 5:15 – What did you learn from this prototype? Too big. Too heavy. Too overdesigned. Will design the next version with a computer.
  • 6:04 – At what point did you realize that this would be great for virtual reality? Envisioned that it’d be like a CAVE, but with televisions. Got onto Meant to be Seen, and showed prototype, but people weren’t interested. Decided to lay low. Couldn’t get engineering help from others, and decided to be quiet and develop it.
  • 7:39 – What was Palmer Luckey’s reaction to it? He loved it. Had control suggestions, and wants to buy one. Doc_Ok wants to help with the control mechanism.
  • 8:24 – Has anyone used VR with it yet? Not yet. Need to have the control mechanism in place.
  • 8:58 – Who is the target demographic for this? Everyone. It was designed for mass production. Perhaps military, medical, police officers, and even gamers. Need to know how big it should be and how fast it should go. Never as portable as the Virtuix Omni, but they're not competing against them; they're at opposite ends of the spectrum.
  • 10:07 – Sounds like this would be great for a higher tier where you have a dedicated space for VR. Don’t need a full room, just enough space for the InfinAdeck.
  • 11:16 – What are the dimensions? 7′ x 7′, with a 60-inch space in the middle. Need to keep users in the center.
  • 12:01 – What's the speed limit? Could make it go 10 mph in the next iteration. This one was meant for jogging at around 3.5 mph. Depends on the technology for how fast it'll go. The purpose of this prototype was to figure out what would be needed for the next iteration; that purpose was fulfilled before SVVRCon, and getting to show it was a bonus.

Theme music: “Fatality” by Tigoolio

I had a chance to catch up with Reverend Kyle of the Rev VR podcast at the end of day one of SVVRCon, right before the Ubercast and his interview with Palmer Luckey.

We talk about everything from being a VR personality and his favorite VR experiences to social VR, adding value, being topical, and fostering community. He also talks about the importance of VR social experiences for coping with living in the Midwest, where there's not a thriving tech scene.

More details about the discussion are below.

Reddit discussion here.

TOPICS

  • 0:00 – Intro & how he got started into Virtual Reality. Got it in June. Had heard an earlier podcast, but it had stopped. Did a couple and got a good response. Eventually joined the Road to VR team.
  • 1:16 – What’s your intention with the podcast? Fill in the gaps between news from the VR HMD manufacturers. Also to give developers a voice. Wants to do episode 500 eventually. There’s no formal marketing for VR, and there’s room for personalities to get involved.
  • 2:55 – Most compelling VR experience? Played Time Rifters for 2.5 hours. Riftmax Theater and VR Chat. Karaoke night.
  • 4:00 – Social aspects of VR chat and body language. Not meant to replace face-to-face, but there’s enough body language to feel presence with other people.
  • 5:13 – Being located in Cincinnati, OH. Struggles with the lack of a tech scene there. Virtual meet-ups provide opportunities to connect with other VR geeks.
  • 6:34 – Impressions of the SVVRCon exhibit floor – DK2. Sony Morpheus. InfinAdeck. In VR Overload.
  • 7:34 – Working with the Hydra. Looking for next level of VR input controller with the STEM.
  • 8:17 – Rev VR podcast guests that stuck out? Taylor Roach, the karaoke host who is 17 years old. Cast AR. Tactical Haptics. Post-Facebook-acquisition discussion.
  • 10:04 – Participating in the Oculus subreddit and adding value to the VR community, and working together as a community to help make VR happen. Be active and topical.
  • 11:47 – SVVRCon and the future of VR. About networking and meeting face-to-face for the first time. Lots of community feeling. A giant VR love-in.

Theme music: “Fatality” by Tigoolio