Mike Arevalo talks about the process of creating the VR Typing Trainer, which was built as part of the Orange County Virtual Reality Meetup’s 48-hour Educational Game Jam.

Mike talks about the process of developing the game and the structure of the game jam. His day job is creating educational applications, and his team is always discussing how to immerse students in environments that help them learn more effectively.

Mike says that studies have found that gaming can stimulate a student’s brain in a way that static presentations never can, and that immersive VR can be a powerful way to unlock the parts of the brain that make it easier to learn new things. His advice to other game developers is to focus on getting the immersion right in your experience; your other goals and learning objectives are then more likely to fall into place.

I had a chance to play the VR Typing Trainer at Immersion 2014, and it is a very immersive and fun way to improve your typing. Having the words flying towards your face does create a certain amount of pressure and tension that makes the ordinarily dull process of typing much more engaging and fun. I could see how playing this game could help to cultivate some useful typing skills for when you’re in VR, and it’s definitely worth checking out — especially for an experience that was created in 48 hours.

TOPICS

  • 0:00 – VR Typing Trainer – An educational game that brings typing into VR so you can learn to type without looking at the keyboard.
  • 0:36 – Sitting in a Tron-like world; targets fly at you and you have to type the word that’s printed on each one. It’s an endless runner.
  • 1:00 – It’s a simple core game mechanic. Uses object pooling to reuse existing objects and move them towards the player, with an algorithm that scales the difficulty based on how long you’ve been playing (see the sketch after this list).
  • 1:37 – Creating the Tron environment because the game needed to be something more interesting
  • 2:06 – Educational Hackathon
  • 2:26 – Ideas were pitched, and then participants broke up into groups
  • 2:53 – Saw the Tuscany demo, and knew he needed to get into VR
  • 3:09 – Use the VR typing trainer to learn how to use keyboards more efficiently.
  • 3:47 – Hard to work with 7 programmers with different skill sets who were not all Unity users. There were also a lot of other art exhibits there.
  • 4:31 – A lot of planning required to coordinate.
  • 4:56 – Potential for VR. Mike is an educational app developer. How to immerse students to learn at a more effective rate, and gaming stimulates a student’s brain in a way that static presentations never can. Immersive VR can unlock parts of the brain to make it easier to learn new things.
  • 5:48 – Advice to other VR developers to make an educational experience. Immersing the player into a place where they’d never be able to be otherwise. If you get the immersion down, then everything else will fall into place.
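As a rough illustration of the object pooling and time-based difficulty scaling described at 1:00, here is a minimal Unity C# sketch. It is not the jam team’s actual code: the prefab, pool size, difficulty curve, and the SetSpeed message it sends are all assumptions made for illustration.

    using System.Collections.Generic;
    using UnityEngine;

    // Hypothetical sketch of the pool-and-scale idea, not the jam team's code.
    public class WordTargetSpawner : MonoBehaviour
    {
        public GameObject targetPrefab;        // assumed prefab showing a word to type
        public int poolSize = 20;
        public float baseSpawnInterval = 2.0f; // seconds between spawns at the start
        public float baseSpeed = 3.0f;         // initial speed of targets toward the player

        private readonly Queue<GameObject> pool = new Queue<GameObject>();
        private float elapsed;
        private float nextSpawnTime;

        void Start()
        {
            // Object pooling: create every target once up front and recycle them,
            // instead of instantiating and destroying objects during play.
            for (int i = 0; i < poolSize; i++)
            {
                GameObject target = Instantiate(targetPrefab);
                target.SetActive(false);
                pool.Enqueue(target);
            }
        }

        void Update()
        {
            elapsed += Time.deltaTime;

            // Simple difficulty curve: roughly one extra "level" per minute of play,
            // which shortens the spawn interval and speeds up the targets.
            float difficulty = 1f + elapsed / 60f;
            float spawnInterval = baseSpawnInterval / difficulty;

            if (elapsed >= nextSpawnTime && pool.Count > 0)
            {
                nextSpawnTime = elapsed + spawnInterval;
                GameObject target = pool.Dequeue();
                target.transform.position = transform.position + Random.insideUnitSphere * 2f;
                target.SetActive(true);
                // A mover component (not shown, hypothetical) would read this speed
                // and push the target toward the player until it is typed or missed.
                target.SendMessage("SetSpeed", baseSpeed * difficulty,
                                   SendMessageOptions.DontRequireReceiver);
            }
        }

        // Return a target to the pool when its word is typed or it passes the player.
        public void Recycle(GameObject target)
        {
            target.SetActive(false);
            pool.Enqueue(target);
        }
    }

Pre-allocating the targets once and recycling them avoids per-spawn instantiation hitches, which matters when objects are constantly streaming toward the player in VR.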

Theme music: “Fatality” by Tigoolio

Kieran Nolan is a network administrator who has been creating different elearning applications with immersive technologies. He’s 3D printing objects that students either create or modify from Thingiverse within Google SketchUp. He’ll take a digital photograph of their objects, and then upload it to a virtual art gallery that can be viewed with an Oculus Rift and networked to another school system. He’s also been teaching classes in Minecraft, and even had his students collaborate on building a working QR code.

Kieran also talks about how he sees cryptocurrencies like Bitcoin playing a larger part in the future infrastructure that’s going to enable all sorts of things that we haven’t even thought of. He sees Bitcoin as a protocol that will enable all different types of decentralization of our infrastructure. One example that he provides is Namecoin, which is like decentralized DNS and a “decentralized open source information registration and transfer system based on the Bitcoin cryptocurrency.”

He says that there’s a lot of potential for using immersive technologies in education, and he sees that it’s going to bring in a whole new curriculum because it’s so engaging and compelling for students.

TOPICS

  • 0:00 – E-learning and using the Oculus Rift with the virtual arcade. Has a 3D printing network set up with another school. Design a 3D object, 3D print it, take a picture
  • 1:14 – Workflow. Using SketchUp to design objects. Eventually wants to use Minecraft for designing objects. 3D print and take photos, hashtag and upload to the virtual arcade to be viewed. Built a QR code in Minecraft. Lots of collaboration with Minecraft. Kids adapt to the Oculus Rift pretty quickly.
  • 3:57 – 3D printing and then putting virtual images within it, using BTSync. Enigma portal to get schools to work together and get older students mentoring younger students. Use QR codes to move between places. Using Titans of Space with students with Aspergers. Most interested in interschool 3D printing
  • 7:04 – Immersive education keys for engagement. Downloading 3D objects from Thingiverse, and changing it. Each student took photo, and then took turns walking through virtual art gallery to see their work.
  • 8:52 – Potential for using immersive technologies. Going to bring in a whole new curriculum. Running classes in Minecraft to do math and English.
  • 10:16 – Excited for Bitcoin in education. Wanted to use Bitcoin as an incentive for learning. Using the BitGigs model to do tasks to learn, and get paid in Bitcoin to do small jobs. It’d teach kids about money and cryptocurrencies.
  • 12:03 – Bitcoin and the future of virtual worlds. Bitcoin is a protocol like TCP/IP that you can build on top of. Namecoin is like decentralized DNS. It’s a “decentralized open source information registration and transfer system based on the Bitcoin cryptocurrency.” It’ll revolutionize things, and it’ll play a big part in decentralizing everything.

Theme music: “Fatality” by Tigoolio

Philip Lunn is the CEO of Nurulize, a company created by the collision of VFX and video gaming for virtual reality. Nurulize co-founder Scott Metzger has developed a process for capturing the world in a high-resolution, photorealistic way at framerates ranging from 100 to 200 frames per second.

In their VR demo called Rise, they combine FARO LIDAR scans, HDR photography, and xxArray character captures in order to create photorealistic environments and people within VR. He talks about the mostly manual process that they go through to capture the entire environment in a point cloud with sub-millimeter accuracy, build a 3D mesh from the point-cloud data and project the HDR photos onto it, and then use real-time shaders to get framerates as high as 100-200 fps.
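As a rough illustration of the projection step in that pipeline, here is a minimal Unity C# sketch that assigns each mesh vertex the texture coordinate at which a posed camera’s photo sees it. This is not Nurulize’s actual implementation; it ignores occlusion and multi-photo blending, and the component and field names are hypothetical.

    using UnityEngine;

    // Hypothetical sketch of projecting a photo onto scanned geometry:
    // each vertex of the reconstructed mesh gets the UV at which a posed
    // camera's photo sees it. Occlusion testing and photo blending are omitted.
    public class PhotoProjector : MonoBehaviour
    {
        public Camera photoCamera;   // posed to match where the HDR photo was shot
        public MeshFilter scanMesh;  // mesh reconstructed from the LIDAR point cloud

        void Start()
        {
            Mesh mesh = scanMesh.mesh;
            Vector3[] vertices = mesh.vertices;
            Vector2[] uvs = new Vector2[vertices.Length];

            for (int i = 0; i < vertices.Length; i++)
            {
                // World-space position of this vertex.
                Vector3 worldPos = scanMesh.transform.TransformPoint(vertices[i]);

                // Viewport coordinates: x and y fall in [0,1] when the vertex is
                // inside the photo's frame, and z is its distance from the camera.
                Vector3 viewportPos = photoCamera.WorldToViewportPoint(worldPos);

                // Vertices behind the camera or outside the frame get a zero UV here;
                // a real pipeline would fall back to another photo or mark them unlit.
                bool visible = viewportPos.z > 0f &&
                               viewportPos.x >= 0f && viewportPos.x <= 1f &&
                               viewportPos.y >= 0f && viewportPos.y <= 1f;
                uvs[i] = visible ? new Vector2(viewportPos.x, viewportPos.y) : Vector2.zero;
            }

            mesh.uv = uvs;  // the photo can now be applied as this mesh's texture
        }
    }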

Philip talks about their plans to use their process to help capture retail locations, film trailers and high-value objects that you can’t get close to.

He sees VR as the biggest breakthrough in computing in the past 25 years, believes that virtual reality goggles will eventually replace our computer monitors, and says that Nurulize wants to help populate those virtual workspaces with idealized, exotic, 3D-scanned environments.

TOPICS

  • 0:00 – Intro – CEO of Nurulize. Developed a process to capture the world in high-resolution, photorealistic and with a very high framerate. Creating VR experiences for the Rift
  • 0:32 – Rise demo that has a laser-scanned warehouse. Scott Metzger developed this process: shooting high-resolution photography at multiple exposures, using FARO laser scanners to capture the entire environment in a point cloud with sub-millimeter accuracy, building a 3D mesh from the point cloud, projecting the photos onto the mesh, and developing real-time shaders that run at 100-200fps.
  • 1:53 – Dealing with occlusion issues. Created a narrative around this. It’s a full environment without occlusion.
  • 2:54 – The FARO LIDAR scanner is commercially available, and they use 3-4 tools to process the scans
  • 3:23 – Reverse photogrammetry process
  • 3:45 – Commercial business that is doing service work to do captures
  • 4:05 – Special effects shops moving from film to VR. Have enough hardware processing power
  • 4:47 – Target markets: Retail. Film Trailers and High-value objects that you can’t get close to
  • 5:09 – How did you get into VR. Been in computer graphics for 20 years with real-time ray tracing. VR is the biggest breakthrough in computing that there’s been in the past 25 years.
  • 5:45 – Where do you see VR going. Ready Player One is a good roadmap. VR HMDs will replace your monitor, and Nurulize wants to help fill that space with 3D-scanned environments and let you be in dream environments
  • 7:02 – Travel to exotic locations and capturing exotic unattainable things
  • 7:30 – Won’t be interested in creating things that don’t exist in reality. More interested in capturing real-world places.

Theme music: “Fatality” by Tigoolio

Daniel Green is the Co-Chairman of the Mid-America Chapter of the Immersive Education Initiative, and has been involved in teaching coding skills with immersive technologies. He points to a lot of educational resources at code.org that they use, including curricula built around MIT’s 2D drag-and-drop platform Scratch, the 3D platform Alice, Greenfoot for teaching introductory Java programming, and programming mods within Minecraft. There’s also MinecraftEDU, which has a community of educators who share their programs with each other.

Theme music: “Fatality” by Tigoolio

Ka Chun Yu is the Curator of Space Science at the Denver Museum of Nature and Science. He’s in charge of the digital planetarium there, and has been studying the effectiveness of using immersive dome environments in teaching different astronomical principles.

His research has shown that immersive technologies are more effective at teaching certain astronomical principles, such as why there are different seasons and how the sun rises and sets in different places throughout the year as well as at different places on the planet. That may not be a total surprise, but what was interesting was that transforming and distorting immersive, 3D visualizations to work within a 2D projected context may actually be worse than telling people about the topic without using any visualizations at all. There are certain topics where being able to see objects fly around you is a critical part of understanding how the world works.

Ka Chun also talks about how he’s been using immersive technologies to facilitate group discussions with experts on various ecological issues facing us today. The technology can help provide a holistic picture on topics like the complete water cycle and the limited sources of fresh water. Having data visualizations and immersive experiences can make dry topics more compelling and engaging, and provide a solid foundation and context for having deep discussion around challenging abstract issues that we face as a society. He’s found that using immersive technologies like a digital planetarium can provide an experience to a large audience that is both very effective and compelling.

TOPICS

  • 0:00 – Intro – Curator of space science and working with digital planetariums and studying the effectiveness of using immersive dome technology to teach astronomy and for telling stories about planet earth and regional ecological issues
  • 0:39 – Effectiveness of a fully immersive dome for teaching astronomical principles compared to the same visual content projected onto a 2D wall, as well as a control group with no visual content at all. For astronomical seasons, the immersive dome students had far superior results, and students who saw the content projected onto a 2D wall did worse. Showing immersive visuals on a 2D screen introduces distortions, and it’s a much inferior experience compared to immersive visuals.
  • 2:19 – Other astronomical principles better described in a dome. Seasons require you to look around and be able to watch the objects move across the sky. Other results are not as straightforward when you don’t need visuals that move around you.
  • 3:06 – The sun rising and setting in different parts of the sky; able to show the direction of the sunrise and that it changes during the time of year and depending on where you live. You can even show how the sun rises and sets on Mars. They have the universe in their simulation and can travel through the entire universe, travel through time, and take different viewpoints and perspectives to look at various issues, which is more effective
  • 4:38 – Lectures and dialogues about ecological issues on planet earth. Have Google Maps type of capability. Connect people to issues of global change and how to be a part of the solution
  • 5:37 – Other applications. 100% of California is under drought conditions. Help understand about water issues and where fresh water comes from. Fly around and show where water originates. Show them the water cycle process, rainfall and drought data, and connect them to global issues. Easier to show data in immersive dome environments than have someone just tell you about it in an abstract way.
  • 7:38 – Trying to come up with a model so that audiences can have a group discussion with experts about these global issues, from water conservation to sustainable agriculture. Need to educate the public and do it in a compelling way; visual storytelling that looms overhead is very powerful, compelling, and immersive. Having these discussions with immersive technologies laying the context and foundation for the discussion is a very powerful and effective approach.

Theme music: “Fatality” by Tigoolio

Ross Mead studies Human-Robot Interactions to make robots and virtual human non-player characters (NPCs) more realistic to engage with. There’s a lot of overlap between designing body language for physical robots and for NPCs since they use the same principles.

Non-verbal communication is a fundamental building block of social interactions, and he talks about principles like spacing, socially-appropriate eye gaze, gestures, using and understanding pointing behaviors, modulating speaking voices to be louder/softer or faster/slower, head nodding, and taking turns when communicating.

He talks about how humans are always broadcasting information with everything that they do, whether it’s speaking or not speaking, moving or not moving. Any reaction or lack of reaction communicates something about whether you’re interested and engaged or disinterested and not fully connecting.

Body language can tell you the nature of the relationship with someone, and being able to identify open and closed body language cues can add another layer of depth and realism to interactions with NPCs within virtual environments.

Ross says that there are a couple of ways to measure how believable your social interactions are, whether with robots or virtual avatars. There are physiological measures that can come from looking at heart rate, galvanic skin responses, respiration rate, and general activity like the speed and frequency of motion. But there are also traditional psychological surveys that can measure how believable or comfortable the interaction was subjectively perceived to be.

He sees the top two body language cues to implement with virtual humans as adaptive positioning and automated co-verbal behaviors, gestures coordinated with speech, so that the character doesn’t feel like a robot or a zombie.
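As a rough illustration of what adaptive positioning could look like in an engine, here is a minimal Unity C# sketch that keeps an NPC at a comfortable conversational distance and orients its gaze smoothly toward the player. This is not Ross’s actual system; the proxemics thresholds and smoothing values are assumed.

    using UnityEngine;

    // Hypothetical sketch of adaptive positioning and socially appropriate gaze.
    public class SocialPositioning : MonoBehaviour
    {
        public Transform player;
        public float personalSpace = 1.2f;   // closer than this: back away (assumed value)
        public float socialDistance = 2.5f;  // farther than this: step closer (assumed value)
        public float moveSpeed = 1.0f;       // meters per second
        public float turnSpeed = 2.0f;       // how quickly the gaze reorients

        void Update()
        {
            Vector3 toPlayer = player.position - transform.position;
            toPlayer.y = 0f;  // keep movement and gaze on the ground plane
            float distance = toPlayer.magnitude;
            if (distance < 0.001f) return;

            // Adaptive positioning: maintain a socially comfortable distance.
            if (distance < personalSpace)
                transform.position -= toPlayer.normalized * moveSpeed * Time.deltaTime;
            else if (distance > socialDistance)
                transform.position += toPlayer.normalized * moveSpeed * Time.deltaTime;

            // Socially appropriate gaze: rotate toward the player gradually
            // rather than snapping, which reads as robotic.
            Quaternion lookRotation = Quaternion.LookRotation(toPlayer);
            transform.rotation = Quaternion.Slerp(transform.rotation, lookRotation,
                                                  turnSpeed * Time.deltaTime);
        }
    }

Automated co-verbal gestures, the second cue he mentions, would layer on top of this by triggering gesture animations in sync with the character’s speech.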

Finally, Ross talks about the different cues for open vs. closed body language, the importance of mimicking for building rapport, and some of the ways that these techniques could be applied to provide a safe escape that’s fun and improves people’s lives. Stay tuned for more information about his company named Semio.

TOPICS

  • 0:00 – Intro – Ross studies Human-Robot Interaction and is presenting work on getting robots to use body language and understand non-verbal communication, the building blocks of social interaction like spacing, socially-appropriate eye gaze, gestures, using and understanding pointing behaviors, modulating the voice to be louder/softer and faster/slower, head nods, and taking turns when communicating.
  • 1:12 – Applies to both robots and avatars. Robots are physically co-present NPC. Could be applied to virtual worlds to make characters more engaging. Working on making characters more engaging by using body language.
  • 2:04 – Eye gaze can feel weird if it isn’t implemented in a way that feels natural. It broadcasts info, tells others what you can observe, and is connected to privacy and the nature of the relationship. Continued eye contact means “I want to see more.” Too much eye contact violates the amount of intimacy that people are comfortable with. We compensate by averting our gaze, adding more spacing, changing the frequency and duration of direct eye gazes, crossing our arms, or using pacifying self-touching behaviors
  • 3:31 – Measuring the psychological impact of implemented body language? Two ways. Use physiological measures like heart rate, galvanic skin responses, respiration rate, general activity and speed of motion. Or use psychological surveys with Likert scales: How intelligent was the NPC? Was it violating your personal space? Use these to figure out how people react to these behaviors
  • 4:43 – Top behaviors to implement with NPCs. Positioning is the first thing to get correct, and the NPC will be more engaging if you adaptively use positioning. Second would be automating co-verbal behaviors, gestures coordinated with speech, so that it’s not a robot or zombie. Eye gaze. Pointing. Immersive and engaging.
  • 6:26 – Pointing behaviors like pick “that” up or talk to “her,” which is a referencing behavior that’s fundamental to human communication
  • 6:58 – Body language for engagement like a forward lean, increased eye contact, increased rate of speech. The opposite signals disengagement, like leaning back or attention focused elsewhere. Can’t look at these in isolation; look for combinations and clusters of behaviors because any single cue can have other explanations
  • 8:12 – Open body language: arms not crossed, revealing the front of the body, open eyes, eyebrows up, and a smile. “Resting bitch face” happens when an idle pose scrunches up, and you have to consciously counter it. Humans are broadcasting 24/7 and need to be aware of what they’re putting out.
  • 9:34 – Mimicking body language is a fundamental component to building rapport. USC’s ICT is looking at virtual humans.
  • 10:24 – These technologies will make our lives more fun. Seen as an outlet and relief from challenges we face during the day. A safe escape, but also for someone who has a disability and wants to improve their life. Focusing on helping people with special needs and making the world a better place.

Theme music: “Fatality” by Tigoolio

Melissa Carrillo is the Director of New Media Technology for the Smithsonian Latino Center and the Smithsonian Latino Virtual Museum. She’s been a pioneer in using immersive technologies for the Smithsonian.

Because the Smithsonian Latino Center does not have any physical spaces, Melissa has had to embrace the digital revolution and start to challenge a lot of the traditional curatorial mindset of institutions like the Smithsonian. She’s been a pioneer in using virtual world environments like Second Life to hold virtual cultural heritage events like the Smithsonian Día de los Muertos (Day of the Dead).

She talks about the keys for creating an open and collaborative environment within a virtual world, and how it’s more about creating a transmedia hub for all different types of media to be synthesized and shared in virtual spaces, yet also shared back to the outside world through social media channels. She also goes into more detail about the challenges that she faced along the way, including what types of virtual world environments work the best for cultivating community and sharing cultural identity. They fell into the pitfall that a lot of museum curators and educators do within VR by recreating buildings within virtual worlds that lecture at people and merely show 2D representations of the art within a 3D world.

Audiences want to be able to interact with the world, discover information that they find interesting, and be surprised and delighted through authentic experiences that are backed by curatorial scholarship and integrity.

Finally, Melissa talks about other initiatives where the Smithsonian is embracing the digital revolution, and how she sees the use of immersive technologies like virtual reality will be used by museums in the future.

TOPICS

  • 0:00 – Intro – Director of New Media Technology for the Smithsonian Latino Center. Uses virtual worlds, gaming and simulations to reach out to audiences in a new way in order to communicate cultural identity. Smithsonian Día de los Muertos (Day of the Dead) events within Second Life. The University of Texas at El Paso is their Second Life partner. They have a town square and cemetery environment that provide different cultural contexts for sharing cultural heritage experiences within a virtual world environment.
  • 2:05 – Second Life events. Important to have live programming and live streaming within Second Life to help recreate events within virtual worlds through collaborative outreach events. In virtual space, you’re able to create new experiences and not just replicate them.
  • 3:30 – Expressing cultural identity. It’s challenging to represent cultural heritage and identity as authentically as they can. Ensure authenticity in the representation and presentation of artifacts, and have rigorous scholarship to do that from a cultural heritage perspective to preserve traditions. Make sure that it reflects the story that they’re trying to tell. It’s a collaboration with the participating community, and allowed audience visitors to share stories and build altars
  • 6:10 – How to invite collaborators and hold space for that. It was challenging within Second Life since the Smithsonian is used to being in complete control. Took a few years of experimentation, and they needed to figure out what they can and cannot do. Used social media in coordination with Twitter, Facebook, Tumblr, and Instagram to allow the audience to share their Day of the Dead tattoos and the food connected to that cultural event. Let the audience build their own altars and share their creations via social media. Tell their story in one space and spread it through the social media channels.
  • 9:35 – Creating rooms and spaces that work best for sharing cultural heritage within Second Life. Fell into the same pitfalls in creating spaces in virtual worlds: creating buildings and rooms with 2D representations of art, and replicating a museum. The most successful part of their space was their town square and plaza, because that’s where people meet and have events. Using Unity3D and looking outside of the box. Simulated an excavation site to get information and clues about the objects that you find, where you’re free to explore and learn. You role-play an archaeologist, and you get an immersive experience with different ways of experiencing the objects. The virtual museum is seen as a transmedia hub.
  • 13:00 – Key learnings from working in virtual spaces. The Smithsonian Institution found that people want authentic experiences. Need to be grounded in scholarship, maintain the integrity of the work, and accurately recreate it in 3D space. All of the information that you interact with is coming from the curatorial team to ensure that it’s not being misrepresented and that the audience can have an authentic experience.
  • 14:32 – Why do people like to be surprised? People don’t want to be talked down to and simply told what’s true. They want to discover things on their own and have a sense of wonder and awe. That goes against what institutions are used to. The digital revolution put the power of discovery back into the hands of the audience. Need to think strategically about what audiences want, and so look to social media to learn about that. Science museums have been creating interactive experiences like this for a long time. Audio tours are the old way; create virtual experiences that are more interactive
  • 16:55 – The digital revolution has upset the power structure and previous paradigm of cultural institutions like the Smithsonian. It’s challenged traditional curatorial practices and traditional storytelling practices. It’s all transformed and changed how they think creatively. A lot of different stakeholders are at the table at the same time. Everyone can play a part in telling these stories in collaboration with the public. Virtualization and digitization have shattered the foundation of how these institutions do business and communicate to their audiences. The arts and culture council wants to share their lessons learned. How digital artifacts are used, and the permissions around them, are new challenges around access. How much content can be made available is limited by copyright. Can it then be changed, adapted, and used further? It challenges the traditional infrastructure of how these institutions have worked in the past. Audiences are demanding more access.
  • 21:05 – Saw the power of immersive technologies back in 2007, when the Smithsonian was trying to understand Facebook and how to deal with social media. What will 10 years from now look like? How about right now? Everything was shifting in 2007, and she advocated embracing change. The Smithsonian Latino Center doesn’t have a physical space, and so social media and these virtual world technologies would be crucial for their mandate. Ran the digital revolution underground for the longest time. Met Aaron in 2008 and saw that they needed to collaborate with other technology companies and innovators. The Art and Culture summit needs to be on the same page about how to tell stories and stay authentic. Audiences want surprise. Audiences preferred to go to Wikipedia rather than the Smithsonian website, and now they’re collaborating with each other. How physical installations are exploding in virtual spaces.
  • 25:37 – Virtual worlds and creating spaces, and the female perspective. Audiences use these video games. How they tell their stories, and they don’t need to use violence. Need to ensure authenticity, create meaningful experiences, and make sure they’re contributing to these stories. Rely on and respond to what the audiences are asking for. Second Life had its own subculture, and they can’t completely screen their presence from the violence and all that happens there. Set security parameters, but can’t completely shield themselves. Need to act responsibly.
  • 28:48 – There’s so much potential. Don’t put it all out there. Balance in how it’s used. The opportunity is enormous. There’s a new layer of storytelling and experience. It’s augmenting that experience. Virtual gaming as a museum collection, and embracing the digital revolution.

Theme music: “Fatality” by Tigoolio

Bryan Carter has been using virtual worlds for education since 1997, and he talks about the lessons learned from his Virtual Harlem project to immerse students into the literature and music from the 1920s. He talks about some of the resistance that he received from his peers in Africana studies, and how his students are already immersed in technology and that using virtual worlds is a way to create more engaging and potent learning experiences.

Harlem in the 1920s was improvisational and edgy, and Bryan is attempting to recreate this feeling with his virtual recreation. It was one of the first extended recreated environments for African Americans, and so it attracted role playing, entertainment, live performances, lectures, poetry slams, DJ performances, businesses, educators, and a range of different classes being taught within Virtual Harlem.

He talks about diversifying his presence from Second Life to OpenSim as well as some of his future plans with Unity and experiencing Virtual Harlem within fully immersive virtual reality with the Oculus Rift.

TOPICS

  • 0:00 – Intro. African American literature of the 20th Century and Digital Humanities. Virtual Harlem as it existed in the 1920s jazz age. Wanted students to experience some of the literature from the 1920s
  • 1:05 – Started into VR in 1997. Some of the Silicon Graphics and CAVE technology used back then
  • 2:04 – Used Quake for lower-cost multi-player networking, then VRML, and then eventually Second Life
  • 2:54 – Virtual Harlem is focused on African American life and culture. Attracted role playing, entertainment, live performances, lectures, poetry slams, DJ performances, businesses, educators, and classes being taught. Then other platforms opened up. Linden Lab briefly eliminated non-profit pricing.
  • 4:05 – Educational lessons from Virtual Harlem. Have a short-term, medium-term, and long-term plan. How to teach in these environments. Other African American scholars have more of a wait-and-see mindset, and there’s a resistance towards things that are new and technological, and to games and virtual worlds. Engagement and success will help convince others of its worth.
  • 6:27 – Bringing diversity to Second Life. The field of Africana studies came from activist roots, and some question why work with something that’s not real when there are so many other “real” problems happening. Some don’t see the relevance of virtual worlds. Perceived as a distraction, hobby or a toy.
  • 7:23 – Create an educational period piece and collaborating with businesses. Cities have diversity from education, commerce, entertainment and other media as well. OpenSim, Second Life and Unity.
  • 8:23 – Immersing people into the music of the time. Jazz was edgy and improvisational. Immerse them within the environment.
  • 9:23 – Be prepared for your technology to fail. There are different levels of Internet connectivity. You don’t have control over the entire ecosystem. Tech failures can frustrate students.
  • 10:32 – Change in discount pricing in Second Life. Migrating towards open source OpenSim version of Virtual Harlem. Funding is more difficult, but the community is in Second Life. Hopes to have a presence in a number of different diverse places.
  • 11:42 – Future of the metaverse. Working with Virtual World Web company, and they’re creating Curio. WebGL is also a possibility. Open communication channels up between these worlds.
  • 13:28 – Using fully immersive VR within Virtual Harlem.
  • 14:37 – Future of education with immersive technologies. Many new tools in this new toolkit, and they all need to work more seamlessly together and connect to these disparate worlds.

Theme music: “Fatality” by Tigoolio

Aaron Walsh talks about his journey into virtual reality, and how Jaron Lanier and the Lawnmower Man eventually led him to starting the Immersive Education Initiative. He taught the first college course that took place within a virtual world back in 1995, and has been exploring how to use immersive technologies within an educational context ever since.

He held the first Immersive Education Summit in 2006, and at this year’s Immersion 2014 they’ve expanded beyond education to include business and entertainment speakers as well. They offer a number of different iED certifications and resources to help show how immersive technologies could be used across the human experience.

In this interview, Aaron talks a bit about some of the strengths of immersive education, some principles of what to do and not to do when designing an immersive educational experience as well as how to cultivate serendipity and surprise to keep students engaged and excited to participate.

I was able to conduct 21 interviews at Immersion 2014, and I’ll be releasing these over the next three weeks here on the Voices of VR podcast. Here’s a preview of some of the upcoming topics:

  • Aaron Walsh – Best practices for Immersive Education
  • Richard Gilbert – Psychological connections to virtual world avatars
  • Melissa Carrillo – Smithsonian’s approach to sharing cultural heritage through virtual worlds
  • Morris May – The movement of Hollywood special effects into VR
  • Saadia Khan – Power of Avatars in Educational Virtual Worlds
  • Ross Mead – Body language for virtual avatars
  • Jackie Morie – History of VR & USC’s Institute for Creative Technologies
  • Bryan Carter – Immersing Learners in Harlem, NY during the 1920s Renaissance/Jazz Age
  • John Dionisio – Wearable Computing and the Reversal of Virtual Reality
  • Isabel Meyer – Smithsonian’s digital asset management & future of public domain access to digital artifacts
  • Kieran Nolan – Using virtual worlds for education
  • Mike Arevalo – VR Typing Trainer – Game Jam winner for Educational VR Hackathon
  • Daniel Green – Using Minecraft & other immersive software for education
  • Jane Crayton – Fully Immersive Dome Entertainment
  • Ryan Pulliam – VR for marketing
  • Ivan Blaustein – Orange County VR Meetup
  • Inarra Saarinen – Ballet Pixelle virtual world dance company
  • Philip Lunn – Nurulize’s approach to new forms of immersive entertainment in VR
  • Michael Licht – Immersive Journalism
  • Ka Chun Yu – “Full Dome” Video Virtual Reality (VR) Theaters: Exploiting Extreme Fields of View for The Benefit of Students
  • Terry Beaubois – Architecture in VR & Preparing for the Golden Age of Immersion

More details about the interview with Aaron are down below.

Reddit discussion here.

TOPICS

  • 0:00 – Immersive Education Initiative. Learned how to program, and students were interested in the game developer in the classroom. Wanted to create virtual worlds and virtual scenes. Wanted to share experiences from Boston with his family in Colorado. In 1989, saw Jaron Lanier speak about VR at the Institute of Contemporary Art. He showed a working VR system where he was a lobster. Realized that VR could be the mechanism for sharing experiences with his family. Wrote down all of the software and hardware that he needed to learn in order to build a VR system. Succeeded in building an actual VR prototype system. Saw Lawnmower Man, and realized that experiencing information is much more powerful than reading it, which started him on the path towards immersive education. Got involved with the VRML standards committee. Around 1995-1997, got tired of coming onto campus for teaching. Pitched to his dean to teach a class in a virtual world, and got permission to start the first immersive education class in a virtual world at Boston College. The first Immersive Education Summit was held in 2006.
  • 8:36 – Strengths of Immersive Education. Lots of different technologies. Virtual Worlds, VR, AR, immersive learning games. Depends on technology and what you’re trying to teach. Have the ability to visit places either in a group context or individually in an immersive environment where there is a lot higher level of engagement and participation. Much better than lecturing
  • 11:35 – Moving away from the broadcast lecturing model of learning, and more towards self-driven interactive learning. Put best of self into an immersive experience, and that could be more effective than the authentic version. Recorded an authentic presentation the first year, and second year he played back a pre-recorded version of the lectures. There’s a psychological barrier about what’s really authentic and what’s a real experience.
  • 14:39 – Most powerful immersive educational experiences. Charlie was a war veteran who experienced a lot of trauma and had doubts about participating in a virtual world classroom environment. Charlie was fully mobile and was more engaged than in the physical place. No physical indications of severe damage. Only needed his voice. He would have been lost in the traditional educational system.
  • 16:55 – Design principles of what to do and not to do. Don’t make a classroom or physical location. Pick a comfortable environment where everyone is happy to be there. Reconsider the setting of your education. Be comfortable to navigate and talk within a virtual world. Takes time and experience. Immersive Educational Initiative has a number of different certifications. Don’t just stand there. Keep moving around the virtual world. Learn to walk backwards and lead students through journeys.
  • 19:39 – Other locations for teaching? No standard ones. Important thing is to change locations for every class. Use rolling environments to build excitement and anticipation, and they want to come and be there. Explore them and share your favorites. Let them choose.
  • 21:38 – Every class is a field trip. How to cultivate serendipity and surprise? Happens naturally in synthetic environments if there are objects to interact with. You can make your gatherings open and public as well. You can script and let them happen. Construct environments specific to the lessons like space travel. Virtual watershed and entire ecosystem.
  • 23:29 – The Initiative is a non-profit collaboration among businesses, teachers, educators, and entertainers, and a public training initiative that includes anyone who is interested in investigating how immersive technologies could be used across the human experience. Broader range of target demographics, and starting to have more specific summit gatherings designed for academics for research and teaching techniques, arts and culture to preserve and convey culture, business, and entertainment. There are a lot of free resources.
  • 25:48 – Immersive education tools moving into the mainstream. When the dotcom bubble burst, a lot of the VR initiatives evaporated. Thought around 98-99 that it was going to happen. Development has continued. Kickstarter helped re-catalyze the excitement from the 90s. Computers and graphics techniques have made graphics much better, along with mobile technology and broadband network infrastructure to deliver the data. It didn’t disappear, it just went underground. Same visions and concepts, and the technology caught up.
  • 29:30 – New tools for game development and comparing the new to the old. Technology happens in generations and is a continuum. Their time will come and go. Traditional VR will be less and less, and then new consumer round will come around. Eventually the current VR tech will be phased out by the next generation of tech — most likely involving neural implants.
  • 31:46 – Potential of VR is all about human connection. Shorten the distance between people you care about and people you’re about to meet. Technologies will help us connect deeper and in ways that you can’t do today.

Theme music: “Fatality” by Tigoolio

Kevin Williams has recently published a book, The Out-of-Home Immersive Entertainment Frontier: Expanding Interactive Boundaries in Leisure Facilities.

In this extensive interview, Kevin gives a comprehensive overview of the Digital Out-of-Home Entertainment (DOE) sector and what VR developers can learn from amusement parks and what type of opportunities there are to provide immersive experiences to large groups of people.

Kevin also provides a lot of insight into the history of how AR and VR developed out of military simulations, and describes what he sees as three different types of immersive entertainment experiences: ones that are designed for audiences and shared group experiences, individual experiences, and finally educational applications and experiences.

Here’s the Venn diagram that’s discussed at the end of the show that maps out the DOE landscape.

[Image: Venn diagram of the operating Digital Out-of-Home Entertainment landscape]

It’s a rich interview filled with a lot of unique and interesting insights, and a more detailed listing can be found down below.

If you have any questions for Kevin, then feel free to reach out to him at: kwp@thestingerreport.com

Reddit discussion here.

TOPICS

  • 0:00 – Intro KWP Consultancy that focuses on Digital Out-of-Home Entertainment. Founding chairman of DNA Association. Co-author of The Out-of-Home Immersive Entertainment Frontier: Expanding Interactive Boundaries in Leisure Facilities
  • 1:00 – Comprehensive overview of Digital Out-of-Home Entertainment. Held first conference in 2011. Created an association. Tired of explaining the market to others, and got a lot of industry leaders to participate. What it has to offer.
  • 2:55 – How Digital Out-of-Home Entertainment is relevant to VR. Arcade golden age from 80s-90s, but then became placid and compliant. Public and audience moved on. From 90s onward would visit out-of-home entertainment facilities. Arcade companies. Crossover for VR. Where VR first hit consumers with Virtuality and Alternative Worlds was the arcade machine. VR continued in simulation and military applications, but had floundered up until Palmer Luckey came along with Oculus VR. It’ll be a second dawn for VR and DOE
  • 6:31 – DOE used to be a bridge from research to the mainstream. Mobile, PC, and fully immersive room as 3 tiers of VR. Future of VR arcades and location-based entertainment. VR is in its 4th phase. 1st: Ivan Sutherland in 1968. 2nd: VPL took NASA tech and created disposable tech. 3rd: Virtuality, ending with the Virtual Boy. 4th: Oculus VR and the resurgence of consumer VR. Used to be beating swords into plowshares, taking military tech and making it available to consumers. It’s been a bit of a reversal of roles with VR now coming from mobile technology. Focused on immersive entertainment with DOE and DNA.
  • 10:53 – History of the military’s involvement with VR. Translating military technology and making it available to consumers. Simulations were huge in the 70s and 80s for defense purposes. Couldn’t build as many tanks as needed, and so trained people via simulation. Flight and commercial jet training simulations in VR. Do 80% of training before doing it for real. The simulation industry is real, and VR is used by military, law enforcement, railroads, transportation, etc. Was able to try out $2-4 million VR flight simulators. VR came to its fruition with simulation. Detailed simulation was needed for the Apache simulator, and so it required a head-mounted display and moving beyond the CAVE screen projection. Apache helicopter tech led to AR and VR.
  • 16:51 – Head-tracking. Difficult to nail down from 1968-1990 for how VR was used in the military. Synthetic visual worlds being represented to military pilots. Now create immersive synthetic environments that are believable, immersive and entertaining.
  • 18:07 – History of VR from 1968 to 1990. Big VR developments happening within the military. Military simulators have had a lot of money from US, UK and Israel defense agencies.
  • 19:13 – Car simulation. VR and simulation are viable. There’s no difference between VR and simulation. Speed boat and professional racing drivers learn the tracks through simulation. Professional simulators are made available for consumers.
  • 21:21 – Connection between amusement parks and innovations for digital out-of-home entertainment. Large audiences looking for immersive entertainment experiences. Star Tours was one of the first immersive experiences for amusement parks. Theme parks look for the next big high, and immersive entertainment provides that. Disney Quest was built in 1998 and is still open today; it’s the longest-running VR entertainment experience. The Aladdin magic carpet ride and Ride the Comix, a comic book sword fighting application. Had hundreds of thousands of people experience VR. Spent over $90 million to develop it. Consumer tech is cost-effective and immersive, and they throw as much money as possible at making compelling immersive experiences that encourage repeat visits.
  • 25:12 – Use of mobile and tablet tech at Amusement Parks and future of VR/AR HMDs in public spaces at Amusement Parks. Augmented Reality devices are personal, but DOE wants to control the device and provide special devices that are more unique. Three tiers of Amusement Park experiences: 1.) Pay the price and experience the theme park attractions in meat space. 2.) Do augmented reality experiences while at the park. 3.) Participate remotely with others via virtual reality.
  • 28:24 – Museums and Libraries and Edutainment applications with AR/VR – Facilities using AR viewing stations to show additional info. If the weather is bad, then show the best view. Boosting the experience. AR tablets that superimpose additional info about objects in a museum. At a gallery, tablets will show specific aspects of pieces of art. Taking military situational awareness simulation technology and recontextualizing it for consumers. Education will be a huge part of AR and VR. In the leisure sector, there’s the gamification of exercise
  • 32:36 – Developers could create their own AR experiences at amusement parks. Big corporations are doing this. MagiQuest at Pigeon Forge, TN is a combination of VR, AR and immersive technology where people role-play being a magician. Doesn’t get a lot of attention from the media.
  • 34:53 – Augmenting Laser Tag – Laser Tag is military technology for combat simulations. Laser tag has gone through different waves of popularity. Seeing next-gen laser tag with digital natives who are used to mobile phones and console gaming. Using AR HMDs within laser tag environments as well as digital projection mapping within the environments.
  • 38:08 – 4DX theaters have opened within the US. Porting experiences to incorporate the 4DX theater. Cinemas started at amusement parks; then the Lumière brothers thought to project film onto walls. But films emerged from amusement parks. Passive film experiences, and increasing the immersive experience with physical effects. Drive towards interactivity and 7D films by Triotech where the audience can interact and compete with each other. Dark Ride experiences where you shoot at the screen. Transition from passive to interactive narratives. DOE likes to deal with audiences because of economies of scale. Immersive dome experiences
  • 43:26 – Opportunities for independent VR developers to port content into DOE experiences. Looking for new opportunities to apply their skills
  • 44:45 – Marketing applications of immersive technologies. Promotional and marketing impacts of DOE. Initially thought they’d take over billboards with digital billboards, but realized that they need to create interactive narratives to draw people in. AR bus shelter done by Pepsi. The marketing industry wants to draw people into experiences to build brands. Creating VR fashion events and virtual fashion shows. DreamWorks’ How to Train Your Dragon using VR experiences to promote the movie, and Game of Thrones using VR to promote the show.
  • 49:20 – ImmersiON announces strategic fusion with VRelia. TDVision has done a lot of military simulations. Decided to take on VR HMDs, used for the simulation and training sector. A dedicated, off-the-shelf system for training, simulation, research and development that can also be ruggedized and placed into public spaces for the Digital Out-Of-Home Entertainment sector, with access to the latest technologies.
  • 51:35 – Going towards the Holodeck. NASA and JPL using autonomous astronaut machines. JPL demonstrations using CAVEs to recreate another world. A content source for augmented reality, also used for VR content. Sony’s Project Morpheus had Mars Rover data. Simulation has been a huge part of space exploration. For every hour of moonwalk, there were 8 hours of simulation. Quad drones capturing data for virtual tourism. GoPros on submarines, space ships and perhaps even on the Mars rover.
  • 55:45 – Venn diagram that maps out the landscape of Digital Out-Of-Home Entertainment: the amusement and pay-for-play sector, theme park sector, retail and hospitality sector, and edutainment & leisure sector. They overlap, but digital gambling and video games are two different, self-contained industries.
  • 59:50 – Potential of VR. Experiential technology using force feedback to make it feel like you’re on Mars, a microscopic entity, or another person. Three types of immersive entertainment experiences: audiences and shared group experiences, individual experiences, and educational applications and experiences.
  • 1:02:30 – Contact Kevin via kwp@thestingerreport.com

Theme music: “Fatality” by Tigoolio