Richard Gilbert is a Professor of Psychology at LMU who founded the P.R.O.S.E. Project, which conducts Psychological Research on Synthetic Environments. He’s been researching the psychological impacts of virtual worlds like Second Life, and has found that people feel psychologically immersed in another environment. There is also a lot of idealism in terms of presenting your ideal self, but overall people experience these virtual worlds as being real, albeit an idealized version of reality.

Richard has studied a number of different aspects of how people are using avatars in virtual worlds, including everything from sexuality, friendships, relationships, intimate relationships, marriage, role playing of children, role playing of families, addictive issues, identity issues and constructing a new identity in the virtual world, changing genders, and changing of physical appearance.

He explains that some people play with identity to have a corrective experience from their childhood by playing a child or a parent of a child. There are also people who change their avatar’s gender in order to explore alternative sexualities: relationships with someone of the same physical sex, conducted through a heterosexual pairing of avatars. He found that 11% of Second Life users operate as a different gender than their own, and that over 90% of that 11% were males.

When it comes to exploring identity in a virtual world, Richard says that everyone wants to be different, and that it can be liberating to be the person you always wanted to be.

Richard says that immersing students in a learning environment makes them less passive and gives them more opportunities to be creative, which ultimately provides a more active and deeply engaged education where people learn and remember more.

Finally, he sees the potential of virtual reality as the creation of a metaverse that will provide a parallel context for human culture. Our senses will be engaged at all levels there, and it’s where education, entertainment, and all dimensions of reality are headed. In the end, he believes that this alternative world will provoke psychological and emotional reactions that are indistinguishable from reality.

TOPICS

  • 0:00 – Intro. Professor of Psychology at LMU. PROSE Project is the Psychological Research on Synthetic Environments. Interested in user’s experience and how it affects them and society.
  • 0:55 – Have studied the following issues in virtual environments: sexuality, friendships, relationships, intimate relationships, marriage, role playing of children, role playing of families, addictive issues, identity issues and constructing a new identity in the virtual world, changing genders, and changing of physical appearance
  • 2:31 – People don’t see the virtual world as a game. A large majority feel psychologically immersed in another environment. A lot of idealism in terms of showing your ideal self. People experience it as real, but it’s also an idealized world.
  • 3:39 – Trying to remediate trauma. Studying this issue. Role playing a child and seeking new parents to have a corrective experience. The parents could also be trying to do the same thing from a new perspective.
  • 5:20 – Playing with identity. Everyone wants to be different, and being the person you always wanted to be can be liberating. Role players have multiple avatars. Multiple Personality Order.
  • 7:00 – Surprises? People are happier in their virtual relationships, with better communication and more intimacy than in a physical context. All you can do in virtual worlds is communicate, so people can develop intimate connections very quickly
  • 7:45 – Getting audio in Second Life and getting additional context
  • 8:40 – Modulating their voice with role playing. Some people switching genders to experiment with a gay lifestyle.
  • 9:40 – Audio masking is getting better
  • 10:10 – 11% operate as a different gender in Second Life, and of that 11%, over 90% were males switching to a female avatar. Rare for a woman to change to a male avatar. Female avatars get more gifts, but users are also experimenting to empathize with women or to explore same-sex relationships through a heteronormative context
  • 11:24 – History of the metaverse. It began in literary forms. Lord of the Rings inspired people to create technology for Dungeons & Dragons-like experiences. Future of the metaverse. Need a transfer protocol to move through the 3D web with all of your 3D assets and not have walled gardens
  • 13:56 – Sees a multidimensional Internet. In Second Life, he can pull up 2D content on screens, including websites, email, and Pandora music. Merger between the 1D Internet, 2D social networking, 3D physical reality & the 3D space of Second Life. Motion capture and VR HMDs will help drive that merger.
  • 15:41 – Education improved by immersion. Immersed in a learning environment, students are less passive and can be more creative. It’s a more active and deeply engaged education. People learn and remember more.
  • 16:40 – Creating a metaverse that’s going to be a parallel context for human culture. Our senses are engaged at all levels there. It’s where education, entertainment, and all dimensions of reality are going: an alternative world whose psychological and emotional reactions are harder to distinguish from reality.

Theme music: “Fatality” by Tigoolio

Saadia Khan is an adjunct assistant professor of psychology and education at Teachers College, Columbia University. Her research looks at how using avatars in virtual worlds can improve learning and how they can make you feel better.

Saadia explains how embodiment theory shows that experiencing things with more than one sense can improve learning, and that virtual world avatars can also provide that type of multimodal learning. Avatars can increase interest, focus, motivation, and engagement, as well as creating a more emotional connection to characters and periods in history.

She describes some of her research in using virtual worlds for education, and the importance of identifying with your avatar in order to have a self-image in the virtual world, which can provide a stronger sense of embodiment. There is a lot of potential for using virtual worlds and avatars for education, and Saadia is definitely on the cutting edge of researching this field.

Theme music: “Fatality” by Tigoolio

Dr. Jacquelyn Ford Morie was a co-founder of USC’s Institute for Creative Technologies (ICT) where she spent 13 years as a Sr. Research Scientist. She is also the founder and chief scientist of All These Worlds LLC.

Jacquelyn has been working in virtual reality for over 25 years, since 1989. She comes from an art background, where she found her medium in creating emotionally evocative virtual reality environments. Interestingly, she found that a majority of the VR experiences created before 2007 were created by women.

She covers a wide range of the history of virtual reality, starting with Ivan Sutherland in the late ’60s, through the military simulations of the ’80s and ’90s, and up to today. She has done a number of VR experiments, and helped create a language for emotionally mapping a VR experience based upon biometric feedback. She also invented a scent collar in order to bring the emotional power of smell into immersive experiences.

Jacquelyn talks about a number of the different military training simulators that she’s worked on as well as some of the recent research into mindfulness training in virtual worlds and using social experiences with AI within VR to minimize the isolation of NASA astronauts on extended missions like going to Mars.

Overall, Jacquelyn has a wealth of information about virtual reality and a unique perspective on the history of VR, considering that she’s been a part of it for the past 25 years. There are some interesting connections to inspirations from sci-fi literature, including why the military wanted USC’s ICT to be located in Hollywood and its connection to Star Trek and the creation of the Holodeck.

Theme music: “Fatality” by Tigoolio

Morris May is the CEO of Specular Theory, and he talks about transitioning from the Hollywood special effects industry into making virtual reality experiences. The toolchain is very similar, and he’s excited to start exploring new ways of telling stories with this new immersive medium.

He does see that there will be a bit of a gold rush into VR, and that it’s still very early in figuring out the best way to tell interactive stories in VR. He sees that a lot of the initial experiences will be more like watching a movie, but expects that this will evolve to be a lot more interactive by triggering actions from where you’re looking, or even using biometric data like your heart rate as a passive input to alter your VR experience.

Morris says that he’s getting interest from producers in the horror film genre to do VR experiences, because they have a lot of experience in creating first-person perspective narratives.

He also predicts that VR will change almost every industry over the next 5 years, and that it’s impossible to predict every creative application that people will think of.

Specular Theory is a digital agency specializing in creating VR experiences for clients, but they’re also doing some of their own R&D and game prototype development, like their Rift Park VR experience where you’re riding different amusement park rides, but you can dial it down if it’s too intense.

Overall, Morris is excited to be able to provide an experience of awe to others like he experienced when he first saw Star Wars. He’s looking forward to exploring the new ways of telling stories and immersing people within another world.

TOPICS

  • 0:00 – Intro. Coming from Hollywood special effects
  • 0:45 – Hollywood experience
  • 1:07 – Transitioning from Hollywood to VR. Inspired by Star Wars and wanted to share that excitement with other people. Got desensitized to special effects, but VR storytelling potential is huge and exciting. Presence and feeling like you’re there
  • 2:21 – Differences in storytelling in interactive VR. Figuring it out. Doing pure game engine and a hybrid of capture. Working with horror film genre directors who use first-person perspective. The first TV shows were recorded video shows, and the first VR experiences are very much like movies
  • 3:42 – Creating spaces and environments for people to explore. Guiding people through a scene in an open world is a challenge. Technological challenges force telling stories from one perspective
  • 5:30 – Immersive theater like Sleep No More and seeing what other audience members are paying attention to. Looping interactive narrative. The most social platform. You’ll be in the movie with your friends and take cues from other people within the film.
  • 6:51 – The Life of Pi special effects shop won an Oscar, but went out of business. Love making content, but not always the best businessmen, which creates a challenging work environment. Whether special effects shops will move from movies to VR. Work-for-hire business model vs. making your own content. Opens new realms of possibilities for new streams of revenue and telling stories
  • 8:36 – A bit of a gold rush time and migrating into VR. Whether special effects shops will migrate from film to VR
  • 9:46 – Offshoring of special effects in Hollywood
  • 10:08 – What do you want to experience? First-person narrative. Made the Rift Park experience where you can dial back the experience if it’s too intense. Working on horror experiences and being able to trigger things by where you look or your heart rate
  • 11:28 – Using biometric feedback to alter a VR experience
  • 11:41 – VR starts with gaming. Capturing and sharing a theater experience. Name any industry that won’t be completely transformed by virtual reality. Watching the Indianapolis 500 from within the car. Presence where you’re immersed in another world. Education will be huge in VR. Judging the OCVR Educational Hackathon. Getting a sense of scale that you can’t get elsewhere.
  • 14:10 – Haptics and getting a massage in VR. Restaurants
  • 15:00 – Internet and Mobile cell phone industry and how much it’s changed society. Impossible to predict how VR will be used
  • 15:55 – Being a digital agency and creating some of their own content and doing some R&D.
  • 16:34 – Toolchain from Hollywood being applied to VR. Using game engines. Uses the same modeling, rendering, and compositing as films. Same post-production process. Everything is much harder, but essentially the same tools.
  • 17:22 – Motion capture with a Kinect and do simple modeling. Special effects industry has made a lot of innovations that will be applied to VR
  • 18:18 – Using Kinects for motion capture. Keeping it simple. Tons of energy to do a LIDAR scan in Hollywood.
  • 19:33 – Using a Kinect for motion capture. Build a character with the Maya character generator, then drive that character in MotionBuilder and use the Zigfu ZDK for Unity. Characterize it and eventually export to FBX to import into any game engine
  • 21:14 – How does it work? No one really knows, but it does skeletal tracking and will map a skeleton onto your movements. Not perfect, but cheap, so you can’t complain
  • 22:10 – Telling stories with VR and wanting to help immerse people and tell stories in new ways.

Theme music: “Fatality” by Tigoolio

John Dionisio is an associate professor of computer science at Loyola Marymount University, and he talks about moving towards omniscience, omnipresence, and omnipotence in virtual and augmented realities. He sees augmented reality as the reversal of virtual reality: where VR immerses humans in synthetic environments, AR brings synthetic objects into physical reality.

John talks about the evolution of privacy and identity through technology, and the open question as to whether there are latent generational differences or if technology is an active participant in evolving that relationship.

He also talks about the spectrum from reality to augmented reality to virtual reality, and sees that there is a multi-dimensional nature to how presence, communication and our economic capabilities are influenced within each type of reality.

In terms of education, he sees immersive technologies as merely a means to an end of ultimately producing competent producers, users and thinkers in a specific domain. It’s more about improving yourself and not just being enamored by technology for technology’s sake.

Finally, he sees that virtual and augmented reality technologies have the potential to produce a society that’s completely comfortable with increased capabilities when it comes to manifesting Omniscience, Omnipresence, & Omnipotence, and that technology could help us feel less limited and more empowered. There are open questions as to the digital divide and existing inequalities, but those are more political and cultural issues to be resolved, and are less technological in nature.

TOPICS

  • 0:00 – John Dionisio studies interaction design
  • 0:28 – Omniscience – channel information to you, Omnipresence – extend your presence elsewhere, & Omnipotence in Virtual Reality where arbitrary content creation is possible
  • 1:53 – The reversal of Virtual Reality. Virtual environments are where a human is immersed in a synthetic world; Augmented Reality brings synthetic objects into reality. Moving towards achieving Omniscience and Omnipresence in AR, and potentially Omnipotence. Which will be more compelling, AR or VR?
  • 4:17 – AR brings technology into people’s lives, perhaps against the will of others. It’s an open question which will be more compelling. Will it be accepted?
  • 5:40 – Balance of surveillance and privacy. Sun’s Scott McNealy on Privacy. “You have zero privacy anyway. Get over it.” You never had it. In reference to credit card and financial transactions. Some could argue that it was never there for anyone who had enough access. It’s an open debate. There are generational differences. Difficult cultural landscape that’s hard to know how that will evolve. Weakest link to privacy is more sociological than technological. Technology may bridge the gap with biometric security.
  • 8:52 – Evolution of identity within virtual and augmented realities. Layer of identity that’s beyond your control. Identity mapping project. Not sure how it’ll play out. Multiple personality ORDER. Project identity out and filter out parts of ourselves. Don’t know where identity will go. Just started to catalog how identity looks in different mediums.
  • 11:03 – Filtering and mediating your identity online. Identity in Virtual Worlds. Lots of factors, including generational differences. Younger folks are aware of false limitations of identity expression.
  • 12:38 – Spectrum of realities from reality to AR to VR. How true is linear progression of presence in VR vs AR. Not bound by physical constraints in VR. Spectrum will have more dimensions than just presence including communication and economics.
  • 15:00 – Financial differences in different realities and moving things of value from virtual to augmented to real world. How does economy look and how will the realities mix
  • 16:36 – Using immersion in an educational context. These technologies are just a means to an end. Ultimate goal of education is to produce competent producers, users and thinkers in a specific domain. Make sure that you’re taking yourself to a new place, and not just being enamored by technology for technology’s sake
  • 17:56 – Potential of VR. Back to manifesting Omniscience, Omnipresence, & Omnipotence in AR and VR regardless of limitations. Feel as unlimited as they can get via technology. Get to point where people feel empowered by technology.
  • 19:27 – Digital divide and the haves and have-nots. Technology evolves and gets cheaper over time. If technology isn’t the limitation, then is it educational or another cultural factor? It should be watched, but he doesn’t know of any specific action to take.

Theme music: “Fatality” by Tigoolio

Ryan Pulliam is the Chief Marketing Officer of Specular Theory, a digital agency focused on creating augmented and virtual reality experiences that she co-founded with Morris May. She talks about how virtual reality is starting to be used in marketing and advertising campaigns, and the potential for telling engaging stories and immersing the audience in a unique and otherwise impossible experience.

Virtual reality will enable brands to create experiences that allow the audience to play a role in the story, whether that’s being a professional race car driver, a professional athlete, or a rock star on a music stage. Interactive stories up to this point have been seen through a screen where the audience feels more like a spectator, but VR can immerse someone within an experience.

Ryan talks about the lessons from the Game of Thrones Ascend the Wall VR experience by Framestore, and the Top Shop campaign during London Fashion Week. She also mentions the Rift Coaster and Dumpy as inspiring VR experiences that show what’s possible.

Finally, she talks about connecting a VR experience to a brand and the future of using VR as a try-before-you-buy experience for things like Ikea furniture or driving a car. Immersive technologies can provide new ways to emotionally connect an audience to your brand’s story. In the end, it’s less about the VR technology and more about providing a fully immersive experience that goes beyond what’s possible when observing experiences through a 2D screen.

TOPICS

  • 0:00 – CMO & co-founded Specular Theory with Morris May. Marketing and storytelling and reaching people in new ways with emerging technologies
  • 0:34 – Marketing is about storytelling and reaching your audience in a new way. Sharing content through a screen means they’re spectating. Experiential campaigns are limited by not fully being a part of a brand’s story. Making the impossible possible. Allowing them to play a role and not just be inspired by it, like being a professional race car driver, a sports star, or a rock star on stage at a music festival.
  • 1:57 – Which industries get virtual reality. Car companies stay up with technology and make interactive stories. Brands with a lot of marketing budget. Game of Thrones VR experience.
  • 2:46 – Connecting a VR experience to a brand. Depends on the brand. Brands who sponsor an event, like food and drink brands sponsoring music festivals. Try not to be gimmicky, but if you go that route, then give the best experience possible. Be a part of a brand. Try-before-you-buy shopping experiences, either with furniture or a car, will be pretty big. Gloves to pick up objects. Shopping Ikea virtually to avoid driving there. Cool to bring really awesome things to your fans. It may be an extreme experience, but as long as it’s connected to your brand of adventure, enthusiasm, or sports
  • 4:55 – VR demos that provide some inspiration for a marketing context. Top Shop campaign during London Fashion Week where they did a contest to use a Rift during a fashion show. Been impressed by a lot of experiences on the Oculus Share site. Connecting the dots
  • 6:58 – Rift Coaster. Dumpy. Would be great to experience Dinosaurs
  • 7:52 – Game of Thrones demo. Fully immersed within a scene.
  • 8:31 – Scale and huge wall in the demo. People experiencing vertigo. Had a museum exhibit while waiting for the #GOTExhibit experience
  • 9:30 – Future of VR and marketing. Less about marketing and advertising and more about the story and the narrative. Less about the technology, and more about the experience. Immersion and being a part of a story and being emotionally connected to a story, and to be wowed. It’s about trying to be awesome and effective, and can now actually provide an experience rather than just seeing an experience

Theme music: “Fatality” by Tigoolio

Michael Licht is a professor of level design at USC’s Viterbi School of Engineering who has been involved in video games for over 15 years and has an architecture background. He paired up with Nonny de la Peña in creating immersive journalism pieces because he wanted to use his skills beyond just making war simulations like Call of Duty.

He talks about the importance of creating real and believable environments, because our psyche will think an environment is fake unless it’s based upon real physics and has sound architecture. He also talks about the importance of being able to freely roam around in an untethered VR experience, and how that can create a profound sense of presence.

With immersive journalism, there’s not a lot of freedom to deviate from the source material because it starts to become fantasy and not a documentary of actual events. And he talks about the importance of creating virtual human characters through motion capture and facial capture to create an emotional resonance.

Michael is looking forward to continuing to collaborate with Nonny on immersive journalism pieces, but is also interested in creating an untethered VR game prototype to see if it’s something that people would enjoy as digital out-of-home entertainment. He sees that people are willing to take the red pill with virtual reality, and to be taken to a new place and have novel experiences there that will really blow them away. He’s looking forward to seeing the medium evolve and thanks Oculus for creating an open platform where people can experiment and innovate with the VR medium.

TOPICS

  • 0:00 – Intro. Game level designer and architect for 15 years. Looking for new applications beyond war simulations with the VR pipeline, and found Nonny & Immersive Journalism
  • 1:00 – Importance of creating environment and spaces. Creating immersion. Real environment that behaves in a real way. Architecture needs to be sound otherwise it’ll feel fake. Knowing how things are built plays a part of our psyche. Base it upon real physics and real life. Second part of immersion is that the space is plausible. It needs to seem realistic. Recreate environments based upon photos
  • 3:07 – Incorporating full body tracking. Use their own VR HMD system to create an untethered experience. Using a 20ft x 20ft space to freely move around. Creates profound sense of presence
  • 4:30 – Level design of games vs. immersive journalism. Forbidden from deviating from the event that actually happened. Use the audio to match it 1-to-1.
  • 5:35 – Use of omniscient narration and whether that breaks immersion or adds more context
  • 6:31 – Virtual humans, and focusing on motion capture and emotional expression. Play actual audio for the mo-cap actors to match to actual events as much as possible.
  • 7:47 – Use of Force VR piece about an immigrant who was beaten to death. Recreated the scene from cell phone camera footage
  • 9:30 – Where you’d like to go in the future with VR? Love to see more gaming applications in physical spaces and motion tracking within a large open space.
  • 10:30 – Faculty on level design
  • 10:50 – VR vs 2D screen. People want to take the red pill and be taken to a new place. Facebook acquisition got people’s attention
  • 11:30 – Insights from Immersion 2014. Technology is so young, and there’s a lot of experimentation and lots of cool energy and looking forward to seeing what people do with VR.

Theme music: “Fatality” by Tigoolio

Jane Crayton is an immersive educator at the ARTS Lab at the University of New Mexico who teaches and creates immersive dome experiences. She’s collaborated with Charles Veasey from The Digital Dome at the Institute of American Indian Arts in creating the vDome open source software, which is multi-channel projection software that provides real-time warping and slicing of content designed for immersive domes.

Jane describes how you could take content developed in Unity and project it onto a 20ft dome with one computer and a TripleHead2Go to drive three projectors. Producing content for domes used to require a lot of rendering time, but it can now be done in real time using vDome or Blendy Dome VJ.
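
To make the real-time warping idea a bit more concrete, here is a minimal, illustrative Python/NumPy sketch of the core mapping behind domemaster-style fisheye frames. This is not vDome’s actual code; the function name, resolution, and 180-degree field of view are assumptions. Conceptually, warping software computes a view direction on the dome for every pixel of the square output frame, resamples source imagery (such as a cubemap or equirectangular render from Unity) along those directions, and then slices the result across the projectors.

```python
import numpy as np

def domemaster_directions(size=512, fov_deg=180.0):
    """Return a (size, size, 3) array of unit view directions for each pixel
    of a square domemaster frame; NaN outside the fisheye circle."""
    # Normalized pixel coordinates in [-1, 1] across the square frame
    u, v = np.meshgrid(np.linspace(-1.0, 1.0, size), np.linspace(-1.0, 1.0, size))
    r = np.sqrt(u ** 2 + v ** 2)              # radial distance from the frame center
    theta = r * np.radians(fov_deg) / 2.0     # angular fisheye: radius maps linearly to zenith angle
    phi = np.arctan2(v, u)                    # azimuth around the dome's vertical axis

    # Convert spherical angles to unit direction vectors (dome "up" along +z)
    x = np.sin(theta) * np.cos(phi)
    y = np.sin(theta) * np.sin(phi)
    z = np.cos(theta)
    dirs = np.stack([x, y, z], axis=-1)
    dirs[r > 1.0] = np.nan                    # pixels outside the projected dome circle
    return dirs

if __name__ == "__main__":
    dirs = domemaster_directions(256)
    print(dirs.shape)  # (256, 256, 3)
```

A lookup table like this only needs to be computed once per dome configuration, which helps explain why this style of warping can run in real time.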

The desire to do live VJ performances in an immersive dome is what catalyzed some of these technological breakthroughs, with two other groups working on this: the Société des arts technologiques (SAT) in Montreal and Recursive Function Immersive Dome (RFID) in the UK.

Jane talks about some of the educational uses of immersive domes, including how she’s using them to recreate archaeological sites. Domes also allow for collective experiences that can be shared in groups, and she expects to see Unity playing a bigger role in producing content for domes moving forward. She sees that fully immersive domes have the potential to change your perspective and alter your frame of reference, since you leave behind your point of view, which allows you to understand material in a new way.

TOPICS

  • 0:00 – Intro – Works in a fully immersive dome. Teaching digital production for a full dome environment using technologies like spherical photography, photogrammetry, and building up 3D environments to be fully immersed in the dome environment and interact with it. At the University of New Mexico ARTS Lab, and got a grant to develop a curriculum to best produce the multi-projection, full dome format. Creating a 4000px x 4000px format. Blending photography with virtual objects with textures. Focusing on creating new and interactive tools within the full dome. Technology has been innovating to change how multiple-projection digital planetariums are produced. vDome open source software written by Charles Veasey, which provides real-time warping and slicing for domemaster input. Developed it in order to do live VJ performances and to bring contemporary club culture into immersive domes. Being able to build out virtual places that you can explore and interact with each other in. vDome transformed how they use the dome, since content doesn’t have to be pre-rendered and they can see it immediately on the dome. It’ll change how dome content is produced. Still in the R&D phase. Other groups creating dome software include the Société des arts technologiques (SAT) in Montreal, Recursive Function Immersive Dome (RFID) in the UK, and Blendy Dome VJ in Brazil. All of the groups were motivated by wanting to do live VJ in immersive domes.
  • 7:55 – Immersive dome vs. immersive VR in an HMD. Some domes are 360 degrees and others are 180 or 270 degrees. It allows you to look around and see out of your peripheral vision. You can engage the audience with surround-sound audio. Use sound as an instigator for what to pay attention to. Engaging emotionally and physically, and doing it with a live audience. You can sit in different perspectives within the dome. Consider how the audience will be seated and how they’ll be looking at the dome
  • 10:35 – Educational component to domes. First experience within a dome was in a planetarium, and it got her interested in science, optics and computers. Slide projectors used within the dome. It’s not just about astronomy in the dome any more. Teaching photography and videography from a different perspective. Dome offers a lot to students and teachers to engage with each other. Your perspective changes when you’re immersed
  • 13:11 – Content beyond astronomy. Cartoons. Film. Working on an NSF grant to document archaeological sites and building out virtual archaeological sites to be experienced in an immersive dome. Looking at applications beyond astronomy. Teaching photography, videography and 3D skills
  • 15:40 – What one would need to set up a dome. Download vDome software. A 20ft dome would require 3 projectors. Need a computer. Would need a TripleHead2Go to drive three projectors.
  • 16:53 – Digital planetariums used to use a $5K computer per projector, times seven. Today it’s a lot easier. A computer with two video cards could drive up to six projectors with two TripleHead2Go devices.
  • 18:50 – How does the Unity game engine fit in? Can pipe Unity environments onto immersive dome environments. Movement can be difficult, since moving too quickly will make the audience sick. Unity is an up-and-coming platform for the dome
  • 20:20 – What to avoid to minimize motion sickness. There’s a sweet spot on the dome where the audience’s eyes naturally rest. Take everything a bit slower and watch what you’re producing in the dome. Use slow pans, animations, and moves, since it can be easy to get sick. A Trojan commercial with pigs on a roller coaster made people sick.
  • 23:00 – Spherical video solutions to bring video into an immersive dome. High learning curve on these technologies. 360Heros is probably the most affordable solution. Uses a similar software pipeline.
  • 25:48 – Full dome has the potential to change your perspective and alter your frame of reference, leave behind your point of view and understand material in a new way.

Theme music: “Fatality” by Tigoolio

Inarra Saarinen is the founder, artistic director, and choreographer of Ballet Pixelle, which does virtual dance performances in Second Life. She talks about the process of blending physical and virtual realities, and pushing the boundaries in creating a new form of dance.

It’s not just about replicating physical reality in a virtual world, but integrating all of the things that are impossible in the real world, including hovering, flying, moving your limbs beyond body-joint movement, becoming an object, animal, or dragon, and being able to change your skin color, gender, and age.

She talks about some of the limitations of only having 28 bones to work with in Second Life, and her process of scripting out segments of animation sequences while allowing each dancer to be responsible for the timing of the execution and leaving room to improvise.

Inarra doesn’t want to create an automated experience that’s the same every time, but rather to capture the vibrancy and vitality that comes with the imperfections and character of live performances. She also talks about how a lot of the participants aren’t physically able to be in a professional dance troupe, and that by participating in Ballet Pixelle they’re able to have the kinesthetic experience of feeling like they’re performing dance on stage.

It’s interesting to hear all of the insights that Inarra has from doing Ballet Pixelle since 2006, and I imagine that the blending of physical and virtual during live performances will be an area of rich exploration over the next decade. From the VR community perspective, the Riftmax Theater’s Karaoke night starts to explore this blending of realities during live performances, and it’s a bit of an open question as far as what will be considered the most compelling and beautiful experiences within this new spectrum of mixed realities.

TOPICS

  • 0:00 – Intro. Founder, artistic director, and choreographer of Ballet Pixelle, which does virtual dance performances in Second Life.
  • 0:40 – How is movement controlled? Creates animations and chunks of movement in scripts and puts them into Second Life where each dancer is in control of their avatar’s performance.
  • 1:28 – Creates animations in 3 ways. Individual keyframes at 30 fps imported into Second Life. Also uses a motion capture suit, but Second Life only allows 28 of the 206 bones. Also uses a Kinect system to put the animations into a coherent sequence.
  • 2:31 – How does the dance troupe keep in sync? They keep the beat with one another and are in charge of triggering the actions with their keyboard & mouse.
  • 3:03 – What’s been the reaction? Lots of powerful emotional reactions.
  • 3:35 – What is the audience connecting to? It’s a combination of telling a story with set, lighting, and movement. Movement is a universal language, and if you put it together correctly, then you get an emotional resonance.
  • 3:57 – Sleep No More dance performance of Macbeth. Any dialog? There’s a playbill that tells the story of the ballet, just like in any other live performance. The story should tell itself, but there’s a bit of help provided
  • 4:42 – What’s motivating the performers in your dance troupe? Has had people in the troupe since 2006. They really get the kinesthetic experience of performing, and a lot of them have physical or other limitations where they’ve never been able to do that before. They feel like they’re on stage giving a dance performance.
  • 5:45 – The human synchronization and not being driven by a robot. A movie is the same every time, but live theater is not. Trying to create an experience that’s vibrant. It’s art, not automation. She wants those human imperfections. She choreographs ballets that allow the dancers to deliberately go out of sync and make order out of chaos.
  • 6:54 – Are there auditions? Lots of things are different. Transform, hover, fly and move beyond body limits. But lots of similarities and universals of working with other people. Some of the things they look for.
  • 7:44 – Coordinating across many different time zones for live performances. You can teleport in Second Life, but you still have time zones. Have both a European and North American dance troupe. But it can be difficult.
  • 8:35 – Other considerations for broadcasting music and clearing rights. Been very copyright sensitive from the very beginning. Made sure that everything is copyright cleared, and have clearance from everyone involved.
  • 9:42 – Rights for each performance and image releases for avatars
  • 10:18 – What keeps you engaged? It’s creative and at outer bounds of being creative. It’s a new form of dance. It’s not just adding something. It’s an exploration of physical and virtual movement and blending of realities, which is a different form. What do we find beautiful about virtual dance? Developing a language for virtual dance.
  • 11:34 – Things you can do in virtual dance: hover, fly, move beyond body-joint movement. Become an object, animal, or dragon. Change your skin color or gender. Become a child.
  • 12:00 – Use all of these components in all of her ballets.
  • 12:17 – Pushing the limits of what’s physically possible and expanding the audience for dance. Gives dancers a chance to experience performances. Teaches the history of ballet and technique.
  • 13:04 – Immersive VR with the Oculus Rift, and the future of limb tracking with dancing in VR. Not as interested in translating your movements into the virtual world, because the animations are doing things that you couldn’t be doing. Not interested in replicating the real world, and can’t go out and hire real professional dancers
  • 14:36 – Ultimate potential for virtual environments. We’ll eventually live in virtual worlds.
  • 15:03 – Next open problem to solve with virtual dance. Limited by the 28 bones that are allowed by Second Life out of the 206 bones. On a world audition tour to do choreographic studies and do motion capture of dancers to study the movement of professional dancers.

Theme music: “Fatality” by Tigoolio

Isabel Meyer is the branch manager for the Smithsonian’s Digital Asset Management System (DAMS), and she talks about the process of digitizing different collections within the Smithsonian to better support its mission of “increase and diffusion of knowledge.”

There are over 157 million objects in the Smithsonian’s overall collection, with over 5 million of them having been digitized within their DAMS. This accounts for just over 3% of their total collection, and they’re in the process of prioritizing digitization and making those assets more widely available.

She mentions the Smithsonian Collections Search EDU site, which has over 8.6 million catalog records of museum objects and library & archives materials, about 15% of which have images.

There’s also the Smithsonian X 3D site, currently in an early beta, which contains over 20 3D-scanned objects available for download for non-commercial, personal, or educational uses according to their Terms of Use. One particularly interesting example is this 3D laser scan of a Woolly Mammoth.

Isabel says that this is an expensive process, and they’re trying to get more funding to make these objects available. Hopefully at some point, VR developers will have greater access and the ability to create immersive experiences that include authentic artifacts of our digitized cultural heritage.

TOPICS

  • 0:00 – Intro – Digital Asset Management System branch manager at the Smithsonian. Digital representations of all of their collections. Capturing more and more objects. Currently at 5 million digital assets. Being used by all 19 museums, 9 libraries, and the zoo.
  • 1:43 – Total Objects in Smithsonian is 157 million objects, but doesn’t include event photography and other objects. Probably around less than 3% of it has been digitized. In process of prioritizing what should be digitized.
  • 2:44 – Getting access to digital objects. How do you collaborate or get access to some of these objects? Their DAMS is behind a firewall. Determining what should be made publicly available. Greatly expanding this portion. There was a lot of reluctance at first. Have expanded tools. Smithsonian Search site at http://collections.si.edu/search/. The Sketchbot robot draws images in the sand, and they want to make that code available.
  • 5:30 – Copyright may have expired, but does the Smithsonian own copyright of the digital version, and can it be made available? Making high-res scans available according to their terms of use and clearing them for distribution.
  • 6:44 – Tracking metadata within their digital objects. Different categories of metadata, and their DAMS is integrated with their collection management systems. Metadata is embedded within the asset.
  • 8:00 – Announcement of museums that will be releasing objects. Have an existing 3D site with 20+ objects available. It’s an expensive process, and trying to get more funding to make these objects available at Smithsonian X 3D. There’s a rapid capture initiative.
  • 10:04 – What would you hope would happen with this cultural heritage? Don’t know what the possibilities are yet. Researchers, educators, and creating new artwork.
  • 10:55 – Potential to collaborate with Smithsonian. Would need to go through the Public Affairs office.

Theme music: “Fatality” by Tigoolio