Jane Crayton is an immersive educator at the ARTS Lab at the University of New Mexico who teaches and creates immersive dome experiences. She’s collaborated with Charles Veasey from The Digital Dome at the Institute of American Indian Arts in creating vDome, open source multi-channel projection software that provides real-time warping and slicing of content designed for immersive domes.
Jane describes how you could take content developed in Unity and project it onto a 20ft dome with a single computer and a TripleHead2Go driving three projectors. Producing content for domes used to require a lot of rendering time, but it can now be done in real time using vDome or Blendy Dome VJ.
The desire to do live VJ performances in an immersive dome is what catalyzed some of these technological breakthroughs. Two other groups working on similar software are the Société des arts technologiques (SAT) in Montreal and Recursive Function Immersive Dome (RFID) in the UK.
Jane talks about some of the educational uses of immersive domes, including how she’s using them to recreate archaeological sites. Domes also allow for collective experiences that can be shared in groups, and she expects to see Unity playing a bigger role in producing content for domes moving forward. She sees that fully immersive domes have the potential to change your perspective and alter your frame of reference, since you leave behind your point of view, which allows you to understand material in a new way.
- 0:00 – Intro – Work in fully immersive domes. Teaching digital production for a full dome environment using technologies like spherical photography, photogrammetry, and building up 3D environments to be fully immersed in the dome environment and interact with it. At the University of New Mexico ARTS Lab, and got a grant to develop a curriculum for how to best produce the multi-projection, full dome format. Creating a 4000px x 4000px format. Blending photography with virtual objects with textures. Focusing on creating new and interactive tools within the full dome. Technology has been innovating to change how multiple-projection digital planetariums are produced. vDome open source software written by Charles Veasey, which provides real-time warping and slicing for domemaster input. Developed it in order to do live VJ performances, and bringing contemporary club culture into the immersive domes. Being able to build out virtual places that you can explore and interact with each other. vDome transformed how they use the dome since content doesn’t have to be pre-rendered, so they can see it immediately on the dome. It’ll change how dome content is produced. Still in the R&D phase. Other groups creating dome software include the Société des arts technologiques (SAT) in Montreal, Recursive Function Immersive Dome (RFID) in the UK, and Blendy Dome VJ in Brazil. All of the groups were motivated by wanting to do live VJ in immersive domes.
- 7:55 – Immersive dome vs. immersive VR in an HMD. Some domes are 360 degrees and others are 180 or 270 degrees. It allows you to look around and see out of your peripheral vision. You can engage the audience with surround-sound audio, using sound as an instigator for what to pay attention to. Engaging emotionally and physically, and doing it with a live audience. You can sit in different perspectives within the dome. Consider how the audience will be seated and how they’ll be looking at the dome.
- 10:35 – Educational component to domes. First experience within a dome was in a planetarium, and it got her interested in science, optics and computers. Slide projectors used within the dome. It’s not just about astronomy in the dome any more. Teaching photography and videography from a different perspective. Dome offers a lot to students and teachers to engage with each other. Your perspective changes when you’re immersed
- 13:11 – Content beyond astronomy. Cartoons. Film. Working on an NSF grant to document archaeological sites and building out virtual archaeological sites to be experienced in an immersive dome. Looking at applications beyond astronomy. Teaching photography, videography and 3D skills.
- 15:40 – What one would need to set up a dome. Download the vDome software. A 20ft dome would require 3 projectors, a computer, and a TripleHead2Go to drive the three projectors.
- 16:53 – Digital planetariums used to use a $5k computer per projector, times seven projectors. Today it’s a lot easier: a computer with two video cards could drive up to six projectors with two TripleHead2Go devices.
- 18:50 – How does the Unity game engine fit in? Can pipe Unity environments onto immersive dome displays. Movement can be difficult since moving too quickly will make the audience sick. Unity is an up-and-coming platform for the dome.
- 20:20 – What to avoid to minimize motion sickness. There’s a sweet spot on the dome where the audience’s eyes naturally rest. Take everything a bit slower and watch what you’re producing in the dome. Use slow pans, animations and moves, since it can be easy to get sick. A Trojan commercial with pigs on a roller coaster made people sick.
- 23:00 – Spherical video solutions to bring video into an immersive dome. High learning curve on these technologies. 360Heros is probably the most affordable solution. Uses a similar software pipeline.
- 25:48 – Full dome has the potential to change your perspective and alter your frame of reference, leave behind your point of view and understand material in a new way.
Theme music: “Fatality” by Tigoolio
Inarra Saarinen is the founder, artistic director and choreographer of Ballet Pixelle, which performs virtual dance in Second Life. She talks about the process of blending physical and virtual realities, and pushing the boundaries in creating a new form of dance.
It’s not just about replicating physical reality in a virtual world, but integrating all of the things that are impossible in the real world, including hovering, flying, moving your limbs beyond body-joint movement, becoming an object, animal or dragon, and being able to change your skin color, gender, and age.
She talks about some of the limitations of only having 28 bones to work with in Second Life, and her process of scripting out segments of animation sequences, but allowing each dancer to be responsible for the timing of the execution while having the room to improvise.
Inarra doesn’t want to create an automated experience that’s the same every time, but rather capture the vibrancy and vitality that comes with the imperfections and character of live performances. She also talks about how a lot of the participants aren’t physically able to be in a professional dance troupe, and that by participating in Ballet Pixelle they’re able to have a kinesthetic experience of feeling like they’re performing dance on stage.
It’s interesting to hear all of the insights that Inarra has from doing Ballet Pixelle since 2006, and I imagine that the blending of physical and virtual during live performances will be an area of rich exploration over the next decade. From the VR community perspective, the Riftmax Theater’s Karaoke night starts to explore this blending of realities during live performances, and it’s a bit of an open question as far as what will be considered the most compelling and beautiful experiences within this new spectrum of mixed realities.
- 0:00 – Intro. Founder, artistic director and choreographer of Ballet Pixelle, does virtual dance performances in Second Life.
- 0:40 – How is movement controlled? Creates animations and chunks of movement in scripts and puts them into Second Life where each dancer is in control of their avatar’s performance.
- 1:28 – Creates animations in 3 ways: individual keyframes at 30 fps imported into Second Life; a motion capture suit, though Second Life only allows 28 out of the body’s 206 bones; and a Kinect system to put the animations into a coherent sequence.
- 2:31 – How does the dance troupe keep in sync? They keep the beat with one another and are in charge of triggering the actions with their keyboard & mouse.
- 3:03 – What’s been the reaction? Lots of powerful emotional reactions.
- 3:35 – What is the audience connecting to? It’s a combination of telling a story with set, lighting and movement. Movement is a universal language, and if you put it together correctly, then you get an emotional resonance.
- 3:57 – Sleep No More dance performance of Macbeth. Any dialog? There’s a playbill that tells the story of the ballet just like you would have in any other live performance. The story should tell itself, but there’s a bit of help provided.
- 4:42 – What’s motivating the performers in your dance troupe? Has had people in the troupe since 2006. They really get the kinesthetic experience of performing, and a lot of them have physical or other limitations where they’ve never been able to do that before. They feel like they’re on stage giving a dance performance.
- 5:45 – The human synchronization and not being driven by a robot. A movie is the same every time, but live theater is not. Trying to create an experience that’s vibrant. It’s art, not automation. She wants those human imperfections. Choreographs ballets that allow the dancers to deliberately go out of sync and then make order out of chaos.
- 6:54 – Are there auditions? Lots of things are different. Transform, hover, fly and move beyond body limits. But lots of similarities and universals of working with other people. Some of the things they look for.
- 7:44 – Coordinating across many different time zones for live performances. You can teleport in Second Life, but you still have time zones. Have both a European and North American dance troupe. But it can be difficult.
- 8:35 – Other considerations for broadcasting music and clearing rights. Been very copyright sensitive from the very beginning. Made sure that everything is copyright cleared, and have clearance from everyone involved.
- 9:42 – Rights for each performance and image releases for avatars
- 10:18 – What keeps you engaged? It’s creative and at outer bounds of being creative. It’s a new form of dance. It’s not just adding something. It’s an exploration of physical and virtual movement and blending of realities, which is a different form. What do we find beautiful about virtual dance? Developing a language for virtual dance.
- 11:34 – Things you can do in virtual dance: hover, fly, move beyond body-joint movement; become an object, animal or dragon; change your skin color or gender; become a child.
- 12:00 – Use all of these components in all of her ballets.
- 12:17 – Pushing the limits of what’s physically possible and expanding the audience for dance. Gives dancers a chance to experience performances. Teaches the history of ballet and technique.
- 13:04 – Immersive VR with the Oculus Rift, and the future of limb tracking with dancing in VR. Not as interested in translating your movements into the virtual world, because the animations are doing things that you couldn’t be doing. Not interested in replicating the real world, and can’t go out and hire real professional dancers.
- 14:36 – Ultimate potential for virtual environments. We’ll eventually live in virtual worlds.
- 15:03 – Next open problem to solve with virtual dance. Limited by the 28 bones that are allowed by Second Life out of the 206 bones. On a world audition tour to do choreographic studies and do motion capture of dancers to study the movement of professional dancers.
Theme music: “Fatality” by Tigoolio
Isabel Meyer is the branch manager for the Smithsonian’s Digital Asset Management System (DAMS), and she talks about the process of digitizing different collections within the Smithsonian to better support its mission of “increase and diffusion of knowledge.”
There are over 157 million objects in the Smithsonian’s overall collection, with over 5 million of them having been digitized within their DAMS. This accounts for just over 3% of their total collection, and they’re in the process of prioritizing the digitization process and making those assets more widely available.
She mentions the Smithsonian Collections Search EDU site that has over 8.6 million catalog records of museum objects and library & archives materials, about 15% of which have images.
Isabel says that this is an expensive process, and they’re trying to get more funding to make these objects available. Hopefully at some point, VR developers will have greater access and the ability to create immersive experiences that include authentic artifacts of our digitized cultural heritage.
- 0:00 – Intro – Digital Asset Management System manager at the Smithsonian. Digital representations of all of their collections. Capturing more and more objects. Currently at 5 million digital assets. Being used by all 19 museums, 9 libraries and the zoo.
- 1:43 – Total objects in the Smithsonian: 157 million, though that doesn’t include event photography and other objects. Probably less than 3% of it has been digitized. In the process of prioritizing what should be digitized.
- 2:44 – Getting access to digital objects. How do you collaborate or get access to some of these objects? Their DAMS is behind a firewall. Determining what should be made publicly available. Greatly expanding this portion. There was a lot of reluctance at first. Have expanded tools. Smithsonian Search site at http://collections.si.edu/search/ Sketchbot robot that draws images in the sand, and they want to make that code available.
- 6:44 – Tracking metadata within their digital objects. Different categories of metadata, and their DAMS is integrated with their collection management systems. Metadata is embedded within the asset.
- 8:00 – Announcement of museums that will be releasing objects. Have an existing 3D site with 20+ objects available. It’s an expensive process, and trying to get more funding to make these objects available at Smithsonian X 3D. There’s a rapid capture initiative.
- 10:04 – What would you hope would happen with this cultural heritage. Don’t know what the possibilities are yet. Researchers, educators and creating new artwork.
- 10:55 – Potential to collaborate with Smithsonian. Would need to go through the Public Affairs office.
Theme music: “Fatality” by Tigoolio
Terry Beaubois is the director of Montana State University’s Creative Research Lab, and he talks about how he used Second Life to teach architecture classes and the different limitations he faced from having an imprecise physics model in the virtual world.
He talks about other potential uses for architecture within virtual reality, as well as starting to think about how a physical space can interact with you through the Internet of Things, and the implications of living in a smart home that is aware of who you are, where you’re at, and your behavioral patterns.
Terry also talks about the different VR projects he’s been working on since the early 80s, including telepresence applications for NASA so that astronauts could control robots through a virtual reality interface.
- 0:00 – Teaching at Stanford and talking about lessons on VR. Been doing VR since the 80s with NASA doing robotic telepresence. Motorcycle helmet with CRT monitors and wires. Data glove. Involved with VRML and early days of Second Life. Going to be experimenting with Terf VR program, which is a follow-up to Croquet & Qwaq.
- 3:09 – Seems like a natural fit for architecture. There’s a difference between building a house in VR vs. designing a house for how it’ll actually be built. Second Life and VR programs need to have accurate physics models in order to have a 1:1 mapping of reality and to do actual architectural design. Currently have to do workarounds, which isn’t teaching real architecture.
- 5:55 – Would love to see accurate physics models within a VR engine for architectural purposes.
- 7:05 – Importance of spaces and design principles for architecture. Creates a context that blends in with reality. Architecture needs to have sensory awareness and be plugged into the Internet of Things. Entering an age of an enormous amount of information being shared. Architecture could be a participant in people’s lives through sensors, detecting your identity and patterns of living. Not a lot of imagination yet for what a smart building would mean.
- 10:00 – Entering a golden age where everything will communicate with everything. Track medical biometrics and share them with relevant parties. The Singularity will be a non-event because we still need people to help interpret the meaning: CERN is generating an enormous amount of data, and it still requires humans to look at it.
- 12:04 – History of VR since the 1980s. A human’s connection to a virtual avatar could make us more cognizant of our physical bodies, because we’re often not connected to what our human life form is in charge of maintaining. VR can help us deal with who we are, and will enable helping people deal with phobias. He meets the most creative and fun people in virtual worlds. VR will be a tool that develops and evolves over time. Lots of uses for training. The physics engine will get there eventually to be more relevant for architecture.
- 16:08 – VR and architecture business engagements; VR can be used to build something and preview it beforehand. Perhaps go from VR to 3D printing and have lots of iterations.
- 17:25 – Being able to experience an architecturally designed space in VR before it’s created.
- 18:14 – Dealing with the Wild West with no rules in Second Life and adult content.
- 18:47 – Future of VR. Thought we’d be where we are with VR back in 1985. It’s a good thing we don’t know how long things will take, otherwise we may not start them. Humans are hopeful and generally optimistic about how long things take.
Theme music: “Fatality” by Tigoolio
Kevin Joyce is the editor-in-chief at VRFocus, and he talks about how they’re covering everything to do with virtual reality gaming and entertainment at VRFocus. He talks about how it was founded and funded by nDreams CEO Patrick O’Luanaigh, who is working on a number of VR experiences and noticed that there wasn’t a site in the UK covering VR in a comprehensive way.
At the moment VRFocus is just Kevin and Jamie Feltham, who has been tracking a lot of the online communities and breaking news in the VR space. VRFocus does a lot of excerpting from other articles to pull out the newsworthy bits of information, as well as a lot of original reporting, live blogs at conferences, and video interviews.
He talks about some of the things that need to happen for VR to go mainstream, and how VRFocus is trying to help communicate what’s happening in this space to the wider video gaming community. He says that VR needs to make incremental steps towards going mainstream, and sees that one day VR experiences will be prolific and the standard norm for people. There are so many things that VR can do, and we’re only starting to scratch the surface.
- 0:00 – Intro. Worked in video games journalism; VRFocus is funded by nDreams’ Patrick O’Luanaigh. There was no VR website in the UK, so they started a site where Kevin has full editorial control. Launched in February 2014. Focusing on VR as entertainment.
- 1:00 – VRFocus as the beat reporter of the VR space. Aims to cover video gaming and entertainment and how VR is changing video gaming.
- 1:39 – SVVRCon coverage. Did liveblog coverage about VR gaming. Conducted 28 video interviews and released over time.
- 2:27 – What got you excited about VR? Had only touched briefly on VR before getting the job at VRFocus. Independent game developers are driving a lot of VR innovation and showing what the power of VR is.
- 3:10 – VRFocus’ Jamie Feltham tracks a lot of the online communities and breaking news stories.
- 3:58 – Just Kevin and Jamie putting out 12-14 articles a day
- 4:15 – Pulling out news bits from existing content. Aimed at a non-VR audience, pushing beyond your normal audience and sharing what’s going on in a way that’s consumable.
- 5:30 – Measuring the response. Doing a lot better within the VR community than the larger video gaming community. Trying to let people know about what VR is
- 6:04 – Reaching out to new audiences. Finding the middle ground, and the big projects excite a lot of people. Cover Sony because it’s closer to the larger audience
- 6:50 – Events to cover for VR. Meet-ups and conferences like SVVRCon, GDC, & E3.
- 7:44 – What types of games he’s personally experienced. VirtualReality.IO isn’t a game, but was a compelling experience showing a seamless interface for going from game to game without leaving VR. VR needs something like this to go mainstream.
- 8:38 – People project what they’d expect would be a great VR experience, but find out it’s not as great as they expected. VR needs to be incremental to minimize simulator sickness.
- 9:48 – Most surprising is to see the general public’s reaction to VR without ever hearing or knowing anything about it.
- 10:24 – There will be a time when VR is the norm, and it’ll be standard. There are so many things VR can do, and we’re only starting to scratch the surface.
Theme music: “Fatality” by Tigoolio
Ivan Blaustein is a co-founder of the Orange County VR meetup, which happens to be in the same location as the headquarters of Oculus VR. Their first meetup had 180 people, and they held five events within their first month, including a couple of game jam hackathons.
Ivan talks about fostering community through the process of getting together to create VR experiences, rather than just talking about it or demonstrating existing VR experiences. Leading up to the Immersive Education Initiative’s Immersion 2014 gathering, OCVR held a couple of educational game jams and demonstrated the winners of those hackathons.
He talks about how the VR Classroom, VR Typing Trainer and PVRamid demos were much more compelling in VR than in 2D, covered topics that haven’t been well-explored in the past, and were really polished experiences for having been created within 48 hours. You can check out these and the other demos on the OCVR site.
- 0:00 – Intro – co-founder of Orange County Virtual Reality, a meetup group for VR developers near the Oculus VR headquarters. Had 180 people show up to try 12 demos, including a DK2 demo from Oculus. Held 5 events in the first month teaching how to create VR experiences and fostering community.
- 0:56 – Dove in head first. Other events that they’ve held. Had hackathons for the past couple of weekends, with developers splitting into teams to develop educational experiences to demo at the Immersive Education Initiative’s Immersion 2014 gathering.
- 1:35 – Partnered with UC Irvine and collaborated with them on a hackathon. VR Classroom won that VR competition. Had another group the following weekend and a couple of guest judges. Showing the two top prizes from the hackathons: VR Typing Trainer and PVRamids.
- 2:45 – VR Classroom was developed by someone who had never tried VR. It takes a traditional classroom and twists it on its head; each classroom has a VR twist to it. A history room has a small-scale model of the Roman Colosseum: approach it, and the room walls fall down and you’re in the middle of the Roman Colosseum.
- 4:07 – VR Typing Trainer is like Mavis Beacon. You can’t see your fingers, which forces you to learn the keys. Has a TRON style. Fun and exciting, and difficult to cheat.
- 5:03 – PVRamids, done in UE4. An on-rails experience exploring the pyramids.
- 5:55 – Design principles of the educational demos: they can only be done in VR and wouldn’t be as compelling in 2D; they look at things that haven’t been well-explored in the past; and they create a polished experience within 48 hours. Another experience that integrated a Wii remote and Google Maps didn’t have as much polish.
- 7:19 – Forming community through hackathon projects. Future plans? Really amazed by the support from the community. Not the first VR meetup group, but actually getting together to make things. Talked to the Smithsonian about possibly working with 3D-scanned objects to see what they can do with them. Talked with Eric Greenbaum about doing a fitness game jam.
- 8:50 – First development experience was at the Portland Game Jam. Having time-boxed constraints to make something real in 48 hours. First time getting hands dirty with Unity. You get to see what’s possible and what can get done in 48 hours. Any time you get together and bounce ideas off each other, it’s an exciting creative environment.
- 10:14 – Where could VR go? Everywhere. Scared of a Facebook metaverse. The positive potential is making the world a better place: doing great things, living healthier lives and learning new things.
Theme music: “Fatality” by Tigoolio
Mike Arevalo talks about the process of creating the VR Typing Trainer, which was created as part of the Orange County Virtual Reality Meetup’s 48-hour Educational Game Jam.
Mike talks about the process of developing the game, and the structure of the game jam. His day job is to create educational applications, and they are always talking about how to immerse students in environments to help them learn more effectively.
Mike says that studies have found that gaming can stimulate a student’s brain in a way that static presentations never can, and that immersive VR can be a powerful way to unlock the parts of your brain to make it easier to learn new things. His advice to other game developers is to focus on getting the immersion right in your experience, and your other goals and learning objectives are more likely to fall into place.
I had a chance to play the VR Typing Trainer at Immersion 2014, and it is a very immersive and fun way to improve your typing. Having the words flying towards your face does create a certain amount of pressure and tension that makes the ordinarily dull process of typing much more engaging and fun. I could see how playing this game could help to cultivate some useful typing skills for when you’re in VR, and it’s definitely worth checking out — especially for an experience that was created in 48 hours.
- 0:00 – VR Typing Trainer – Educational game to bring typing into VR to type without looking at the keyboard.
- 0:36 – Sitting in a Tron-like world, targets fly at you and you have to type the word that’s printed on them. It’s an endless runner.
- 1:00 – It’s a simple core game mechanic. Uses object pooling to take existing objects and move them towards the player. Has an algorithm to determine the difficulty depending on how long you’ve been playing the game.
- 1:37 – Created the Tron environment because it needed to be something more interesting.
- 2:06 – Educational Hackathon
- 2:26 – Ideas were pitched, and then broke up into groups
- 2:53 – Saw the Tuscany demo, and needed to get into VR.
- 3:09 – Use VR Typing Trainer to learn how to use keyboards more efficiently.
- 3:47 – Hard to work with 7 programmers with different skill sets who were not all Unity users. A lot of other art exhibits were there.
- 4:31 – A lot of planning required to coordinate.
- 4:56 – Potential for VR. Mike is an educational app developer. How to immerse students to learn at a more effective rate, and gaming stimulates a student’s brain in a way that static presentations never can. Immersive VR can unlock parts of brain to make it easier to learn new things.
- 5:48 – Advice to other VR developers to make an educational experience. Immersing the player into a place where they’d never be able to be otherwise. If you get the immersion down, then everything else will fall into place.
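The object pooling and time-based difficulty mechanic described in the notes above can be sketched in a few lines. This is a hypothetical Python illustration rather than the game's actual Unity code; the class names, word list, and speed formula are all invented for the example:

```python
import random

# Hypothetical word list; the real game draws words to type from its own data.
WORDS = ["tron", "grid", "disc", "cycle", "portal"]

class WordTarget:
    """A reusable word target that flies toward the player."""
    def __init__(self):
        self.active = False
        self.word = ""
        self.speed = 0.0

class TargetPool:
    """Object pool: reuse a fixed set of targets instead of allocating new ones."""
    def __init__(self, size):
        self.targets = [WordTarget() for _ in range(size)]

    def spawn(self, elapsed_seconds):
        # Difficulty scales with time played: targets fly faster
        # (here, +0.1 speed per second of play) the longer you survive.
        for t in self.targets:
            if not t.active:
                t.active = True
                t.word = random.choice(WORDS)
                t.speed = 1.0 + elapsed_seconds / 10
                return t
        return None  # pool exhausted: no mid-game allocation

pool = TargetPool(3)
first = pool.spawn(elapsed_seconds=0)
later = pool.spawn(elapsed_seconds=60)
print(first.speed, later.speed)  # 1.0 7.0
```

The pooling pattern matters in VR because allocating and garbage-collecting objects mid-frame can cause frame-time hitches that break immersion; reusing a fixed pool keeps frame times steady.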
Theme music: “Fatality” by Tigoolio
Kieran Nolan is a network administrator who has been creating different elearning applications with immersive technologies. He’s 3D printing objects that students either create in Google SketchUp or modify from Thingiverse. He’ll take digital photographs of their objects, and then upload them to a virtual art gallery that can be viewed with an Oculus Rift and networked to another school system. He’s also been teaching classes in Minecraft, and even had his students collaborate on building a working QR code.
Kieran also talks about how he sees cryptocurrencies like Bitcoin playing a larger part of the future infrastructure that’s going to enable all sorts of things that we haven’t even thought of. He sees Bitcoin as a protocol that will enable all different types of decentralization of our infrastructure. One example that he provides is Namecoin, which is like decentralized DNS and a “decentralized open source information registration and transfer system based on the Bitcoin cryptocurrency.”
He says that there’s a lot of potential for using immersive technologies in education, and he sees that it’s going to bring in a whole new curriculum because it’s so engaging and compelling for students.
- 0:00 – E-learning and using the Oculus Rift with the virtual arcade. Has a 3D printing network set up with another school. Design a 3D object, 3D print it, take a picture.
- 1:14 – Workflow. Using SketchUp to design objects. Eventually wants to use Minecraft for designing objects. 3D print and take photos, hashtag and upload to the virtual arcade to be viewed. Built a QR code in Minecraft. Lots of collaboration with Minecraft. Kids adapt to the Oculus Rift pretty quickly.
- 3:57 – 3D printing and then putting virtual images within it, using BTSync. Enigma portal to get schools to work together and get older students mentoring younger students. Uses QR codes to move between places. Using Titans of Space with students with Asperger’s. Most interested in interschool 3D printing.
- 7:04 – Immersive education keys for engagement. Downloading 3D objects from Thingiverse and changing them. Each student took a photo, and then took turns walking through the virtual art gallery to see their work.
- 8:52 – Potential for using immersive technologies. Going to bring in a whole new curriculum. Running classes in Minecraft to do math and English.
- 10:16 – Excited for Bitcoin in education. Wanted to use Bitcoin as an incentive for learning. Using the BitGigs model of doing tasks to learn and getting paid in Bitcoin for small jobs. It’d teach kids about money and cryptocurrencies.
- 12:03 – Bitcoin and the future of virtual worlds. Bitcoin is a protocol like TCP/IP that you can build on top of. Namecoin is like decentralized DNS. It’s a “decentralized open source information registration and transfer system based on the Bitcoin cryptocurrency.” It’ll revolutionize things, and it’ll play a big part in decentralizing everything.
Theme music: “Fatality” by Tigoolio
Philip Lunn is the CEO of Nurulize, an entity created by the collision of VFX and video gaming for virtual reality. A co-founder of Nurulize has developed a process for capturing the world in a high-resolution, photorealistic way at framerates ranging from 100-200 frames per second.
In their VR demo called Rise, they combine FARO LIDAR scans, HDR photography, and xxArray character captures in order to create photorealistic environments and people within VR. He talks about the mostly manual process that they go through in order to capture the entire environment in a point-cloud with sub-millimeter accuracy, build a 3D mesh from the point-cloud data and project the HDR photos onto it, and then use real-time shaders to get framerates as high as 100-200 fps.
Philip talks about their plans to use their process to help capture retail locations, film trailers and high-value objects that you can’t get close to.
He sees VR as the biggest breakthrough in computing in the past 25 years, and believes that virtual reality goggles will eventually replace our computer monitors, with Nurulize helping to populate those virtual work spaces with idealized and exotic, 3D-scanned environments.
- 0:00 – Intro – CEO of Nurulize. Developed a process to capture the world in high-resolution, photorealistic and with a very high framerate. Creating VR experiences for the Rift
- 0:32 – Rise demo with a laser-scanned warehouse. Scott Metzger developed this process: high-resolution photography at multiple exposures, FARO laser scanners to capture the entire environment in a point cloud with sub-millimeter accuracy, building a 3D mesh from the point cloud and projecting the photos onto the mesh, and real-time shaders that run at up to 100-200fps.
- 1:53 – Dealing with occlusion issues. Created a narrative around this. It’s a full environment without occlusion.
- 2:54 – The FARO LIDAR scanner is commercially available, and then they use 3-4 tools to process the scan.
- 3:23 – Reverse photogrammetry process
- 3:45 – Commercial business that is doing service work to do captures
- 4:05 – Special effects shops moving from film to VR. There’s now enough hardware processing power.
- 4:47 – Target markets: retail, film trailers, and high-value objects that you can’t get close to.
- 5:09 – How did you get into VR. Been in computer graphics for 20 years with real-time ray tracing. VR is the biggest breakthrough in computing that there’s been in the past 25 years.
- 5:45 – Where do you see VR going? Ready Player One is a good roadmap. VR HMDs will replace your monitor, and Nurulize wants to help fill that space with 3D-scanned environments and let you be in dream environments.
- 7:02 – Travel to exotic locations and capturing exotic unattainable things
- 7:30 – Not interested in creating things that don’t exist in reality; more interested in capturing real-world places.
Theme music: “Fatality” by Tigoolio
Daniel Green is the Co-Chairman of the Mid-America Chapter of the Immersive Education Initiative, and has been involved in teaching coding skills with immersive technologies. He points to a lot of educational resources at code.org that they use, including curriculums using MIT’s 2D drag-and-drop gaming platform Scratch, the 3D platform Alice, Greenfoot for teaching introductory Java programming, and programming mods within Minecraft. There’s also MinecraftEDU, which has a community of educators who share their programs with each other.
Theme music: “Fatality” by Tigoolio