Isabel Meyer is the branch manager for the Smithsonian’s Digital Asset Management System (DAMS), and she talks about the process of digitizing different collections within the Smithsonian to better support its mission of the “increase and diffusion of knowledge.”

There are over 157 million objects in the Smithsonian’s overall collection, with over 5 million of them digitized within the DAMS. This accounts for just over 3% of the total collection, and they’re in the process of prioritizing digitization and making those assets more widely available.

She mentions the Smithsonian Collections Search site (collections.si.edu), which has over 8.6 million catalog records of museum objects and library & archives materials, about 15% of which have images.

There’s also the Smithsonian X 3D site, currently in early beta, which contains over 20 3D-scanned objects available for download for non-commercial, personal, or educational uses according to their Terms of Use. One particularly interesting example is this 3D laser scan of a Woolly Mammoth.

Isabel says that this is an expensive process, and they’re trying to get more funding to make these objects available. Hopefully at some point, VR developers will have greater access and the ability to create immersive experiences that include authentic artifacts from our digitized cultural heritage.

TOPICS

  • 0:00 – Intro – Digital Asset Management System branch manager at the Smithsonian. Digital representations of all of their collections. Capturing more and more objects. Currently at 5 million digital assets. Being used by all 19 museums, 9 libraries and the zoo.
  • 1:43 – Total objects in the Smithsonian: 157 million, but that doesn’t include event photography and other objects. Probably less than 3% of it has been digitized. In the process of prioritizing what should be digitized.
  • 2:44 – Getting access to digital objects. How do you collaborate or get access to some of these objects? Their DAMS is behind a firewall. Determining what should be made publicly available. Greatly expanding this portion. There was a lot of reluctance at first. Have expanded tools. Smithsonian Search site at http://collections.si.edu/search/. Sketchbot robot that draws images in the sand, and they want to make that code available.
  • 5:30 – Copyright may have expired on the original, but the Smithsonian owns the copyright of the digital version. Will they make it available? Making high-res scans available according to their terms of use once cleared for distribution.
  • 6:44 – Tracking metadata within their digital objects. Different categories of metadata, and their DAMS is integrated with their collection management systems. Metadata is embedded within the asset.
  • 8:00 – Announcement of museums that will be releasing objects. Have an existing 3D site with 20+ objects available. It’s an expensive process, and trying to get more funding to make these objects available at Smithsonian X 3D. There’s a rapid capture initiative.
  • 10:04 – What would you hope would happen with this cultural heritage? Don’t know what the possibilities are yet. Researchers, educators and creating new artwork.
  • 10:55 – Potential to collaborate with Smithsonian. Would need to go through the Public Affairs office.

Theme music: “Fatality” by Tigoolio

Terry Beaubois is the director of Montana State University’s Creative Research Lab, and he talks about how he used Second Life to teach architecture classes and the different limitations he faced from having an imprecise physics model in the virtual world.

He talks about the other potential uses for architecture within virtual reality, as well as starting to think about how a physical space can interact with you through the Internet of Things, and the implications of living in a smart home that is aware of who you are, where you are, and your behavioral patterns.

Terry also talks about the different VR projects he’s been working on since the early ’80s, including telepresence applications for NASA so that astronauts could control robots through a virtual reality interface.

TOPICS

  • 0:00 – Teaching at Stanford and talking about lessons on VR. Been doing VR since the 80s with NASA doing robotic telepresence. Motorcycle helmet with CRT monitors and wires. Data glove. Involved with VRML and early days of Second Life. Going to be experimenting with Terf VR program, which is a follow-up to Croquet & Qwaq.
  • 3:09 – Seems like a natural fit for architecture. Difference between building a house in VR vs. designing a house for how it’ll actually be built. Second Life and VR programs need to have accurate physics models in order to have a 1:1 mapping of reality and to do actual architectural design. Currently have to do workarounds, which isn’t teaching real architecture.
  • 5:55 – Would love to see accurate physics models within a VR engine for architectural purposes.
  • 7:05 – Importance of spaces and design principles for architecture. Creates a context that blends in with reality. Architecture needs to have sensory awareness and be plugged into the Internet of Things. Entering an age of enormous amounts of information being shared. Architecture could be a participant in people’s lives through sensors and detecting your identity and patterns of living. Not a lot of imagination yet for what a smart building would mean.
  • 10:00 – Entering a golden age where everything will communicate with everything. Track medical biometrics and share them with relevant parties. The Singularity will be a non-event because we still need people to help interpret the meaning. CERN is generating an enormous amount of data, and it still requires humans to look at it.
  • 12:04 – History of VR since the 1980s. A human’s connection to a virtual avatar could make us more cognizant of our physical bodies, because we’re not always connected to what our human life form is in charge of maintaining. VR can help us deal with who we are. VR will enable helping people deal with phobias. He meets the most creative and fun people in virtual worlds. VR will be a tool that will develop and evolve over time. Lots of uses for training. The physics engines will get there eventually and become more relevant for architecture.
  • 16:08 – VR and architecture business engagements: VR can be used to preview something before it’s built. Perhaps VR to 3D printing, with lots of iterations.
  • 17:25 – Being able to experience an architecturally designed space in VR before it’s created.
  • 18:14 – Dealing with the Wild West with no rules in Second Life and adult content.
  • 18:47 – Future of VR. Thought we’d be where we are with VR back in 1985. Good thing we don’t know how long things will take, otherwise we may not start them. Humans are hopeful and generally optimistic about how long things take.

Theme music: “Fatality” by Tigoolio

Kevin Joyce is the editor-in-chief at VRFocus, and he talks about how they’re covering everything to do with virtual reality gaming and entertainment at VRFocus. He talks about how it was founded and funded by nDreams CEO Patrick O’Luanaigh, who is working on a number of VR experiences and noticed that there wasn’t a site in the UK covering VR in a comprehensive way.

At the moment VRFocus is just Kevin and Jamie Feltham, who has been tracking a lot of the online communities and breaking news in the VR space. VRFocus does a lot of excerpting from other articles to pull out the newsworthy bits of information, as well as a lot of original reporting, live blogs at conferences, and video interviews.

He talks about some of the things that need to happen for VR to go mainstream, and how VRFocus is trying to help communicate what’s happening in this space to the wider video gaming community. He says that VR needs to make incremental steps toward going mainstream, and sees that one day VR experiences will be prolific and the standard norm for people. There are so many things that VR can do, and we’re only starting to scratch the surface.

TOPICS

  • 0:00 – Intro. Worked in video games journalism, and VRFocus funded by nDreams’ Patrick O’Luanaigh. No VR website in UK, and started a site where he has full editorial control. Launched in February 2014. Focusing on VR as entertainment
  • 1:00 – VRFocus as the beat reporter of the VR space. Aim to cover video gaming and entertainment and how VR is changing video gaming.
  • 1:39 – SVVRCon coverage. Did liveblog coverage of VR gaming. Conducted 28 video interviews, released over time.
  • 2:27 – What got you excited about VR? Only touched briefly on VR before getting the job at VRFocus. Independent game developers are driving a lot of VR innovation and showing what the power of VR is.
  • 3:10 – VRFocus’ Jamie Feltham tracks a lot of the online communities and breaking new stories.
  • 3:58 – Just Kevin and Jamie putting out 12-14 articles a day
  • 4:15 – Pulling out news bits from existing content. Aimed at a non-VR audience: push beyond your normal audience and share what’s going on in a way that’s consumable.
  • 5:30 – Measuring the response. Doing a lot better within the VR community than the larger video gaming community. Trying to let people know about what VR is
  • 6:04 – Reaching out to new audiences. Finding the middle ground, and the big projects excite a lot of people. Cover Sony because it’s closer to the larger audience
  • 6:50 – Events to cover for VR. Meet-ups and conferences like SVVRCon, GDC, & E3.
  • 7:44 – What types of games he’s personally experienced. VirtualReality.IO isn’t a game, but was a compelling experience to show the seamless interface to be able to go from game to game without leaving VR. VR needs something like this to go mainstream.
  • 8:38 – People projecting what they’d expect would be a great VR experience, but they find out it’s not as great as they expect. VR needs to be incremental to minimize simulator sickness.
  • 9:48 – Most surprising is to see the general public’s reaction to VR without ever hearing or knowing anything about it.
  • 10:24 – There will be a time where VR is the norm, and it’ll be standard. So many things VR can do, and we’re just only starting to scratch the surface.

Theme music: “Fatality” by Tigoolio

Ivan Blaustein is a co-founder of the Orange County VR meetup, which happens to be in the same area as the headquarters of Oculus VR. Their first meetup had 180 people, and they had five meetings within their first month, including a couple of game jam hackathons.

Ivan talks about fostering community through the process of getting together to create VR experiences, rather than just talking about it or demonstrating existing VR experiences. Leading up to the Immersive Education Initiative’s Immersion 2014 gathering, OCVR held a couple of educational game jams and demonstrated the winners of those hackathons.

He talks about how the VR Classroom, VR Typing Trainer, and PVRamid demos were much more compelling in VR than in 2D, covered topics that haven’t been well-explored in the past, and were really polished experiences for having been created within 48 hours. You can check out these and the other demos on the OCVR site here.

TOPICS

  • 0:00 – Intro – co-founder of Orange County Virtual Reality, a meetup group for VR developers near the Oculus VR headquarters. Had 180 people show up to try 12 demos including a DK2 demo from Oculus. Had 5 events in the first month teaching how to create VR experiences and to foster community.
  • 0:56 – Dove in head first. Other events that they’ve held. Had hackathons for the past couple of weekends. Developers split into teams to develop educational experiences to have demos to show at the Immersive Education Initiative’s Immersion 2014 gathering.
  • 1:35 – In partnership with UC Irvine and collaborated with them on a hackathon. VR Classroom won that VR competition. Had another group the following weekend and a couple of guest judges. Showing the two top prizes from the hackathons: VR Typing Trainer and PVRamid.
  • 2:45 – VR Classroom developed by someone who had never used VR. Takes the traditional classroom and turns it on its head. Each classroom has a VR twist to it. History room that has a small-scale model of the Roman Colosseum. Approach it, and the room walls fall down and you’re in the middle of the Roman Colosseum.
  • 4:07 – VR Typing Trainer like Mavis Beacon. Can’t see your fingers and forces you to learn the keys. Has a TRON style. Fun and exciting and difficult to cheat
  • 5:03 – PVRamid, done in UE4. An on-rails experience exploring the pyramids.
  • 5:55 – Design principles of the educational demos – can only be done in VR and wouldn’t be as compelling in 2D. Look for things that haven’t been well-explored in the past. And create a polished experience within 48 hours. Another Wiimote and Google Maps integrated experience didn’t have as much polish.
  • 7:19 – Forming community through hackathon projects. Future plans? Really amazed by the support from the community. Not the first VR meetup group, but actually getting together to make things. Talked to the Smithsonian about possibly working with 3D-scanned objects to see what they can do with them. Talked with Eric Greenbaum about doing a fitness game jam.
  • 8:50 – First development experience at the Portland Game Jam. Having time-boxed constraints to make something real in 48 hours. First time getting hands dirty with Unity. What can get done in 48 hours. Get to see what’s possible. Any time you get together and bounce ideas off each other, it’s an exciting creative environment.
  • 10:14 – Where could VR go? Everywhere. Scared of a Facebook metaverse. The positive potential is making the world a better place: doing great things, living healthier lives, and learning new things.

Theme music: “Fatality” by Tigoolio

Mike Arevalo talks about the process of creating the VR Typing Trainer, which was created as part of the Orange County Virtual Reality Meetup’s 48-hour Educational Game Jam.

Mike talks about the process of developing the game, and the structure of the game jam. His day job is to create educational applications, and they are always talking about how to immerse students in environments to help them learn more effectively.

Mike says that studies have found that gaming can stimulate a student’s brain in a way that static presentations never can, and that immersive VR can be a powerful way to unlock the parts of your brain that make it easier to learn new things. His advice to other game developers is to focus on getting the immersion right in your experience, and your other goals and learning objectives are more likely to fall into place.

I had a chance to play the VR Typing Trainer at Immersion 2014, and it is a very immersive and fun way to improve your typing. Having the words flying towards your face does create a certain amount of pressure and tension that makes the ordinarily dull process of typing much more engaging and fun. I could see how playing this game could help to cultivate some useful typing skills for when you’re in VR, and it’s definitely worth checking out — especially for an experience that was created in 48 hours.

TOPICS

  • 0:00 – VR Typing Trainer – Educational game to bring typing into VR to type without looking at the keyboard.
  • 0:36 – Sitting in a Tron-like world, targets fly at you, and you have to type the word that’s printed on each one. It’s an endless runner.
  • 1:00 – It’s a simple core game mechanic. Uses object pooling to recycle existing objects and move them toward the player (see the sketch after this list). Has an algorithm to determine the difficulty depending on how long you’ve been playing the game.
  • 1:37 – Creating the Tron environment because it needed to be something more interesting.
  • 2:06 – Educational Hackathon
  • 2:26 – Ideas were pitched, and then broke up into groups
  • 2:53 – Saw Tuscany demo, and needed to get into VR
  • 3:09 – Use the VR Typing Trainer to learn how to use keyboards more efficiently.
  • 3:47 – Hard to work with 7 programmers with different skill sets, not all of them Unity users. A lot of other art exhibits were there. It had a bit more
  • 4:31 – A lot of planning required to coordinate.
  • 4:56 – Potential for VR. Mike is an educational app developer. How to immerse students to learn at a more effective rate, and gaming stimulates a student’s brain in a way that static presentations never can. Immersive VR can unlock parts of brain to make it easier to learn new things.
  • 5:48 – Advice to other VR developers to make an educational experience. Immersing the player into a place where they’d never be able to be otherwise. If you get the immersion down, then everything else will fall into place.

Theme music: “Fatality” by Tigoolio

Kieran Nolan is a network administrator who has been creating different elearning applications with immersive technologies. He’s 3D printing objects that students either create or modify from Thingiverse within Google SketchUp. He’ll take a digital photograph of their objects, and then upload it to a virtual art gallery that can be viewed with an Oculus Rift and networked to another school system. He’s also been teaching classes in Minecraft, and even had his students collaborate on building a working QR code.
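
As a rough illustration of the QR-code exercise, the dark/light module grid of a working QR code can be computed with the Python qrcode library and read off as block coordinates for students to place. The episode doesn’t describe how his class actually did it, so this sketch, the function name, and the example URL are purely hypothetical.

```python
import qrcode  # pip install qrcode


def qr_block_layout(text):
    """Return (column, row) coordinates of every dark module in the code."""
    qr = qrcode.QRCode()
    qr.add_data(text)
    qr.make(fit=True)
    matrix = qr.get_matrix()  # rows of booleans; True = dark module
    return [(x, y)
            for y, row in enumerate(matrix)
            for x, cell in enumerate(row)
            if cell]


# One dark block per coordinate on a flat wall yields a scannable code.
blocks = qr_block_layout("https://example.org")
print(f"Place {len(blocks)} dark blocks.")
```
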

Kieran also talks about how he sees cryptocurrencies like Bitcoin playing a larger part of the future infrastructure that’s going to enable all sorts of things that we haven’t even thought of. He sees Bitcoin as a protocol that will enable all different types of decentralization of our infrastructure. One example that he provides is Namecoin, which is like decentralized DNS and a “decentralized open source information registration and transfer system based on the Bitcoin cryptocurrency.”

He says that there’s a lot of potential for using immersive technologies in education, and he sees that it’s going to bring in a whole new curriculum because it’s so engaging and compelling for students.

TOPICS

  • 0:00 – E-learning and using the Oculus Rift with the virtual arcade. Have a 3D printing network set up with another school. Design a 3D object, 3D print it, take a picture.
  • 1:14 – Workflow. Using SketchUp to design objects. Eventually wants to use Minecraft for designing objects. 3D print and take photos, hashtag and upload to the virtual arcade to be viewed. Built a QR code in Minecraft. Lots of collaboration with Minecraft. Kids adapt to the Oculus Rift pretty quickly.
  • 3:57 – 3D printing and then putting the virtual images within it, using BTSync. Enigma portal to get schools to work together and get older students mentoring younger students. Use QR codes to move between places. Using Titans of Space with students with Asperger’s. Most interested in interschool 3D printing.
  • 7:04 – Immersive education keys for engagement. Downloading 3D objects from Thingiverse, and changing it. Each student took photo, and then took turns walking through virtual art gallery to see their work.
  • 8:52 – Potential for using immersive technologies. Going to bring in a whole new curriculum. Running classes in Minecraft to do math and English.
  • 10:16 – Excited for Bitcoin in education. Wanted to use Bitcoin as an incentive for learning. Using the BitGigs model to do tasks to learn, and get paid in Bitcoin to do small jobs. It’d teach kids about money and cryptocurrencies.
  • 12:03 – Bitcoin and the future of virtual worlds. Bitcoin is a protocol like TCP/IP that you can build on top of. Namecoin is like decentralized DNS. It’s a “decentralized open source information registration and transfer system based on the Bitcoin cryptocurrency.” It’ll revolutionize things, and it’ll play a big part in decentralizing everything.

Theme music: “Fatality” by Tigoolio

Philip Lunn is the CEO of Nurulize, an entity created by the collision of VFX and video gaming for virtual reality. Nurulize’s co-founder has developed a process to capture the world in a high-resolution, photorealistic way at framerates ranging from 100-200 frames per second.

In their VR demo called Rise, they combine FARO LIDAR scans, HDR photography, and xxArray character captures in order to create photorealistic environments and people within VR. He talks about the mostly manual process that they go through in order to capture the entire environment in a point-cloud with sub-millimeter accuracy, build a 3D mesh from the point-cloud data and project the HDR photos onto it, and then use real-time shaders to get framerates as high as 100-200 fps.
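
The interview doesn’t name the processing tools beyond the FARO scanners, but the core point-cloud-to-mesh step can be sketched with the open-source Open3D library. Treat this as a minimal stand-in for Nurulize’s far more involved manual pipeline; the file names and reconstruction parameters are assumptions.

```python
import open3d as o3d

# Load a registered LIDAR point cloud (assumed filename).
pcd = o3d.io.read_point_cloud("warehouse_scan.ply")

# Surface reconstruction needs per-point normals.
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.05, max_nn=30)
)

# Poisson reconstruction turns the point cloud into a triangle mesh;
# a higher depth keeps more detail at the cost of memory and time.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=10
)

# Projecting the HDR photos onto the mesh and writing real-time shaders,
# as described above, would happen downstream in other tools.
o3d.io.write_triangle_mesh("warehouse_mesh.ply", mesh)
```
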

Philip talks about their plans to use their process to help capture retail locations, film trailers and high-value objects that you can’t get close to.

He sees VR as the biggest breakthrough in computing in the past 25 years, believes that virtual reality goggles will eventually replace our computer monitors, and says that Nurulize wants to help populate those virtual workspaces with idealized and exotic, 3D-scanned environments.

TOPICS

  • 0:00 – Intro – CEO of Nurulize. Developed a process to capture the world in high-resolution, photorealistic and with a very high framerate. Creating VR experiences for the Rift
  • 0:32 – Rise demo that has laser-scanned warehouse. Scott Metzger has developed this process using high-resolution photography at multiple exposures, and then using FARO laser scanners to capture the entire environment in a point-cloud with sub-millimeter accuracy, build a 3D mesh from the point-cloud and project the photos onto the mesh, and developed real-time shaders up to 100-200fps.
  • 1:53 – Dealing with occlusion issues. Created a narrative around this. It’s a full environment without occlusion.
  • 2:54 – The FARO LIDAR scanner is commercially available, and they then use 3-4 tools to process the scans.
  • 3:23 – Reverse photogrammetry process
  • 3:45 – Commercial business that is doing service work to do captures
  • 4:05 – Special effects shops moving from film to VR. Have enough hardware processing power
  • 4:47 – Target markets: Retail. Film Trailers and High-value objects that you can’t get close to
  • 5:09 – How did you get into VR. Been in computer graphics for 20 years with real-time ray tracing. VR is the biggest breakthrough in computing that there’s been in the past 25 years.
  • 5:45 – Where do you see VR going? Ready Player One is a good roadmap. VR HMDs will replace your monitor, and Nurulize wants to help fill that with 3D-scanned environments so you can be in dream environments.
  • 7:02 – Travel to exotic locations and capturing exotic unattainable things
  • 7:30 – Won’t be interested in creating things that don’t exist in reality. More interested in capturing real-world places.

Theme music: “Fatality” by Tigoolio

Daniel Green is the Co-Chairman of the Mid-America Chapter of the Immersive Education Initiative, and has been involved in teaching coding skills with immersive technologies. He points to a lot of educational resources at code.org that they use including curriculums using MIT’s 2D, drag-and-drop gaming platform Scratch, the 3D platform of Alice, Greenfoot for teaching introductory Java programming, and then programming mods within Minecraft. There’s also MinecraftEDU, which has a community of educators who share their programs with each other.

Theme music: “Fatality” by Tigoolio

Ka Chun Yu is the Curator of Space Science at the Denver Museum of Nature and Science. He’s in charge of the digital planetarium there, and has been studying the effectiveness of using immersive dome environments in teaching different astronomical principles.

His research has shown that immersive technologies are more effective at teaching certain astronomical principles, such as why there are different seasons and how the sun rises and sets in different places throughout the year as well as at different places on the planet. That may not be a total surprise, but what was interesting was that the process of transforming and distorting immersive, 3D visualizations in order to work within a 2D projected context may actually be worse than telling people about it without using any visualizations at all. There are certain topics where being able to see objects fly around you is a critical part of understanding how the world works.

Ka Chun also talks about how he’s been using immersive technologies to facilitate group discussions with experts on various ecological issues facing us today. The technology can help provide a holistic picture on topics like the complete water cycle and the limited sources of fresh water. Having data visualizations and immersive experiences can make dry topics more compelling and engaging, and provide a solid foundation and context for having deep discussion around challenging abstract issues that we face as a society. He’s found that using immersive technologies like a digital planetarium can provide an experience to a large audience that is both very effective and compelling.

TOPICS

  • 0:00 – Intro – Curator of space science and working with digital planetariums and studying the effectiveness of using immersive dome technology to teach astronomy and for telling stories about planet earth and regional ecological issues
  • 0:39 – Effectiveness of a fully-immersive dome for teaching astronomical principles, compared to the same visual content projected onto a 2D wall as well as a control group with no visual content at all. For astronomical seasons, the immersive dome students had far superior results, while students who saw it projected onto a 2D wall did worse. Showing immersive visuals on a 2D screen introduces distortions, and it’s a much inferior experience when compared to immersive visuals.
  • 2:19 – Other astronomical principles better described in a dome. Seasons require you to look around and be able to watch the objects move across the sky. Other results are not as straightforward when you don’t need visuals that move around you.
  • 3:06 – The sun rising and setting in different parts of the sky; able to show the direction of rising and that it changes during the time of year and depending on where you live. You can even show how the sun rises and sets on Mars. Have the universe in their simulation and can travel through the entire universe, travel through time, and take different viewpoints and perspectives to look at various issues, which is more effective.
  • 4:38 – Lectures and dialogues about ecological issues on planet earth. Have Google Maps type of capability. Connect people to issues of global change and how to be a part of the solution
  • 5:37 – Other applications. 100% of California is under drought conditions. Help understand about water issues and where fresh water comes from. Fly around and show where water originates. Show them the water cycle process, rainfall and drought data, and connect them to global issues. Easier to show data in immersive dome environments than have someone just tell you about it in an abstract way.
  • 7:38 – Trying to come up with a model so that audiences can have a group discussion with experts talking about these global issues, from water conservation to sustainable agriculture. Need to educate the public and do it in a compelling way; visual storytelling is very powerful and compelling since it looms overhead and is much more effective and immersive. Having these discussions with immersive technologies laying the context and foundation is a very powerful and effective approach.

Theme music: “Fatality” by Tigoolio

Ross Mead studies Human-Robot Interactions to make robots and virtual human non-player characters (NPCs) more realistic to engage with. There’s a lot of overlap between designing body language for physical robots and for NPCs since they use the same principles.

Non-verbal communication is a fundamental building block of social interactions, and he talks about principles like spacing, socially-appropriate eye gaze, gestures, using and understanding pointing behaviors, modulating speaking voices to be louder/softer or faster/slower, head nodding, and taking turns when communicating.

He talks about how humans are always broadcasting information with everything that they do, whether it’s speaking or not speaking, moving or not moving. Any reaction or lack of reaction communicates some form of meaning about whether you’re interested and engaged or disinterested and not fully connecting.

Body language can tell you the nature of the relationship with someone, and being able to identify open and closed body language cues can add another layer of depth and realism to interactions with NPCs within virtual environments.

Ross says that there are a couple of ways to measure how believable social interactions are with either robots or virtual avatars. There are physiological measures that can come from looking at heart rate, galvanic skin responses, respiration rate, and general activity like the speed and frequency of motion. But there are also traditional psychological surveys that can measure how believable or comfortable the interaction was subjectively perceived to be.

He sees that the top two body language cues to implement with virtual humans would be adaptive positioning and automating co-verbal behaviors of gestures that are coordinated with speech so that it doesn’t feel like a robot or zombie.
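
As a concrete illustration of adaptive positioning, one common starting point is Hall’s proxemic zones, with the NPC slowly shifting its preferred standing distance toward whatever distance the user keeps re-establishing. This is a minimal sketch of that idea, not Ross’s actual system; the zone thresholds and adaptation rate are assumptions.

```python
# Hall's proxemic zones, approximate outer bounds in meters.
PROXEMIC_ZONES = {
    "intimate": 0.45,
    "personal": 1.2,
    "social": 3.6,
    "public": 7.6,
}


def initial_distance(relationship="social"):
    """Start the NPC at the outer edge of the zone matching the relationship."""
    return PROXEMIC_ZONES[relationship]


def adapt_distance(preferred, observed, rate=0.1):
    """Nudge the NPC's preferred distance toward the distance the user keeps
    choosing; if they repeatedly step back, the NPC learns to stand farther
    away (a simple exponential moving average)."""
    return preferred + rate * (observed - preferred)


# Example: the NPC starts at 3.6 m, but the user keeps settling around 2.0 m.
distance = initial_distance("social")
for _ in range(10):
    distance = adapt_distance(distance, observed=2.0)
print(f"NPC now prefers to stand {distance:.2f} m away.")
```
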

Finally, Ross talks about the different cues for open vs. closed body language, the importance of mimicking for building rapport, and some of the ways that these techniques could be applied to provide a safe escape that’s fun and improves people’s lives. Stay tuned for more information about his company named Semio.

TOPICS

  • 0:00 – Intro – Ross studies Human-Robot Interaction, presenting work on getting robots to use body language and understand non-verbal communication: the building blocks of social interaction like spacing, socially-appropriate eye gaze, gestures, using and understanding pointing behaviors, modulating the voice louder/softer and faster/slower, head nods, and taking turns when communicating.
  • 1:12 – Applies to both robots and avatars. Robots are like physically co-present NPCs. Could be applied to virtual worlds to make characters more engaging. Working on making characters more engaging by using body language.
  • 2:04 – Eye gaze can feel weird if implemented in a way that doesn’t feel natural. It broadcasts info, tells others what you can observe, and is connected to privacy and the nature of the relationship. Continued eye contact means “I want to see more.” Too much eye contact violates the amount of intimacy that people are comfortable with. We’ll compensate by averting our gaze, adding more spacing, changing the frequency and duration of direct eye gazes, or perhaps crossing arms or pacifier behaviors of self-touching.
  • 3:31 – Measuring the psychological impact of implemented body language? Two ways. Use physiological measures like heart rate, galvanic skin responses, respiration rate, general activity and speed of motion. Can use psychological surveys with Likert scales: How intelligent was the NPC? Was it violating your personal space? Use these to figure out how people react.
  • 4:43 – Top behaviors to implement with NPCs. Positioning is the first thing to get correct, and NPCs will be more engaging if you adaptively use positioning. Second would be automating co-verbal behaviors of gestures that are coordinated with speech so that it’s not a robot or zombie. Eye gaze. Pointing. Immersive and engaging.
  • 6:26 – Pointing behaviors like pick “that” up or talk to “her,” which are referencing behaviors that are fundamental to human communication.
  • 6:58 – Body language for engagement, like a forward lean, increased eye contact, increased rate of speech. The opposite signals disengagement, like leaning back or attention focused elsewhere. Can’t look at these in isolation; look for combinations and clusters of behaviors because there can be other reasons.
  • 8:12 – Open body language: arms not crossed, revealing the front of the body. Open eyes, eyebrows up and a smile. “Resting bitch face” when an idle pose scrunches up, and you have to consciously counter this. Humans are broadcasting 24/7 and need to be aware of what they’re putting out.
  • 9:34 – Mimicking body language is a fundamental component to building rapport. USC’s ICT is looking at virtual humans.
  • 10:24 – These technologies will make our lives more fun. Seen as an outlet and relief to challenges we face during the day. Safe escape, but also if someone has a disability and want to improve their lives. Focusing in on helping people with special needs and make the world a better place.

Theme music: “Fatality” by Tigoolio