#59: Philip Lunn on Nurulize and photorealistic, real-time capture with FARO LIDAR scans, HDR photography, and xxArray character captures

Philip Lunn is the CEO of Nurulize, a company created by the collision of VFX and video gaming for virtual reality. Nurulize co-founder Scott Metzger has developed a process to capture the world in a high-resolution, photorealistic way at framerates ranging from 100 to 200 frames per second.

In their VR demo, Rise, they combine FARO LIDAR scans, HDR photography, and xxArray character captures to create photorealistic environments and people within VR. He talks about the mostly manual process they go through to capture the entire environment in a point cloud with sub-millimeter accuracy, build a 3D mesh from the point-cloud data and project the HDR photos onto it, and then use real-time shaders to reach framerates as high as 100-200 fps.
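The projection step in that pipeline, mapping calibrated photographs onto the reconstructed mesh, can be sketched with a standard pinhole camera model. This is a simplified, hypothetical illustration (the function name, calibration inputs, and nearest-pixel sampling are assumptions for the example, not Nurulize's actual tooling):

```python
import numpy as np

def project_texture(vertices, K, R, t, image):
    """Sample per-vertex colors by projecting mesh vertices into one calibrated photo.

    vertices: (N, 3) world-space points; K: 3x3 intrinsics;
    R, t: world-to-camera rotation and translation; image: (H, W[, C]) photo.
    """
    cam = vertices @ R.T + t            # world -> camera coordinates
    uv_h = cam @ K.T                    # homogeneous pixel coordinates
    uv = uv_h[:, :2] / uv_h[:, 2:3]     # perspective divide
    h, w = image.shape[:2]
    px = np.clip(np.round(uv).astype(int), 0, [w - 1, h - 1])
    return image[px[:, 1], px[:, 0]]    # nearest-pixel color lookup
```

A production pipeline would additionally run visibility/occlusion tests and blend several overlapping photos per vertex; this sketch only shows the core projection.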

Philip talks about their plans to use this process to capture retail locations, film trailers, and high-value objects that you can't get close to.

He sees VR as the biggest breakthrough in computing in the past 25 years, believes that virtual reality headsets will eventually replace our computer monitors, and says that Nurulize wants to help populate those virtual workspaces with idealized, exotic, 3D-scanned environments.


  • 0:00 – Intro – CEO of Nurulize. Developed a process to capture the world in high-resolution, photorealistic and with a very high framerate. Creating VR experiences for the Rift
  • 0:32 – Rise demo set in a laser-scanned warehouse. Scott Metzger developed this process: high-resolution photography at multiple exposures, FARO laser scanners to capture the entire environment in a point cloud with sub-millimeter accuracy, a 3D mesh built from the point cloud with the photos projected onto it, and real-time shaders that run at up to 100-200 fps.
  • 1:53 – Dealing with occlusion issues. Created a narrative around this. It’s a full environment without occlusion.
  • 2:54 – The FARO LIDAR scanner is commercially available; they use 3-4 tools to process the data
  • 3:23 – Reverse photogrammetry process
  • 3:45 – Commercial business that is doing service work to do captures
  • 4:05 – Special effects shops moving from film to VR. Have enough hardware processing power
  • 4:47 – Target markets: Retail. Film Trailers and High-value objects that you can’t get close to
  • 5:09 – How did you get into VR. Been in computer graphics for 20 years with real-time ray tracing. VR is the biggest breakthrough in computing that there’s been in the past 25 years.
  • 5:45 – Where do you see VR going? Ready Player One is a good roadmap. VR HMDs will replace your monitor, and Nurulize wants to help fill those virtual spaces with 3D-scanned dream environments
  • 7:02 – Travel to exotic locations and capturing exotic unattainable things
  • 7:30 – Less interested in creating things that don’t exist in reality; more interested in capturing real-world places.

Theme music: “Fatality” by Tigoolio

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

[00:00:11.974] Philip Lunn: My name is Philip Lunn. I'm the CEO of Nurulize, a startup company. We've been in business six months now. What we're doing with virtual reality is we've developed a process to capture the world in photographic realism in real time at very high frame rate. That's really amazing in the virtual reality headset, Oculus Rift at the moment.

[00:00:32.458] Kent Bye: Yeah, so you just were showing a demo here called Rise where you were in a warehouse, a big factory floor, and you get a really big sense of the scale of the space and you have actors that are frozen and you're kind of cutting between those scenes. But what did you do to actually create that environment and put it into virtual reality?

[00:00:54.304] Philip Lunn: It's actually quite a long and developed process. It's been developed by my co-founder, a guy named Scott Metzger. Scott's been in the visual effects industry for the last 15 years and developed this process from a number of feature films that he's worked on. And the way it is done is we use high-resolution photography, so we go through a scene and capture every square inch, basically, in high-resolution photography at multiple exposures. And then we use a laser scanner, a FARO laser scanner, to capture the entire environment in a point cloud. So it's used in the manufacturing industry where it captures to sub-millimeter accuracy every single point that the laser hits in a 360 degree circle. And then we take the point cloud and build a 3D mesh on it and then project the photographs onto that mesh. And then we've developed some very specific shaders in real time to allow the whole thing to be run very, very fast in real time, up to 100, maybe even 200 frames per second.
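The multi-exposure capture Lunn describes is, in its simplest form, a weighted merge of bracketed exposures into one HDR radiance map. A minimal sketch, assuming a linear sensor response and pixel values normalized to [0, 1] (the function and its hat-shaped weighting are illustrative, not the actual Nurulize process):

```python
import numpy as np

def merge_exposures(images, times):
    """Merge bracketed LDR exposures into one HDR radiance map.

    images: list of same-shaped arrays with values in [0, 1];
    times: matching list of exposure times in seconds.
    """
    images = np.asarray(images, dtype=float)
    times = np.asarray(times, dtype=float).reshape(-1, 1, 1)
    # Hat weighting: trust mid-tones, distrust clipped shadows/highlights.
    w = np.clip(1.0 - np.abs(2.0 * images - 1.0), 1e-6, None)
    radiance = images / times           # per-exposure radiance estimate
    return (w * radiance).sum(axis=0) / w.sum(axis=0)
```

Real pipelines (e.g. Debevec-style merging) also recover the camera's nonlinear response curve before merging; the scheme above skips that step for clarity.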

[00:01:53.610] Kent Bye: Now, one of the issues that you come across when you are putting one camera in one place and capturing it with the laser camera is that you have occlusion issues of things that are being hidden behind those objects. And so in this demo, I saw a camera that was in one position, kind of locked, and you can look around. But how are you dealing with the occlusion issues? Are you able to actually create a full model of this 3D space and move around in it?

[00:02:18.362] Philip Lunn: Yeah, actually the demo you saw, Rise, was actually, we tried to create a narrative, have a story associated with it. So we actually put you in those camera positions that you moved through. The environment itself is actually not occluded. It is a full environment from a 360 degree space. So you can actually, with the hit of a key, move around in it as well. We just had it in that demo mode at that moment. So, if you want to see the interactive version using a PlayStation controller where you can move freely, we can do that as well. So, we did actually capture the environment in detail and using the laser scanner, we made sure we had no occlusion anywhere that we captured.

[00:02:53.850] Kent Bye: And is this a LiDAR laser scanner?

[00:02:56.033] Philip Lunn: Yeah, it's a LiDAR laser scanner, the FARO technology.

[00:02:59.122] Kent Bye: And is this sort of a commercial off-the-shelf version of this laser scanner that you're just using special software processes then?

[00:03:06.025] Philip Lunn: Yeah, it's a commercially available off-the-shelf laser scanner, and we use I think three or four different tools in order to process all that data. And each tool has its particular strengths that we like, so it takes skill and a number of applications to be able to process the way we want.

[00:03:21.174] Kent Bye: And so what is this process called when you're stitching everything together like that?

[00:03:25.055] Philip Lunn: It's called photogrammetry, I think is the term that's generally used, although photogrammetry almost always applies to an automation of this process. We're sort of doing a reverse photogrammetry process where we're doing it manually because we're using very specific high dynamic range images, high resolution images, and building the geometry ourselves. So we don't use the capture to build the model.

[00:03:46.095] Kent Bye: And is this a process that you plan on commercializing and making more widely available for people to do the same?

[00:03:51.358] Philip Lunn: Well, we're a commercial business, right? So our interest actually is, at the moment, we're taking service work on using this manual process. And in the process of that, we have a development team that's building an application that will be able to do this in a much more efficient way.

[00:04:06.106] Kent Bye: And so coming from the special effects industry, I'm curious if you see these special effects shops moving from film and more into virtual reality.

[00:04:16.480] Philip Lunn: Well, this is something that certainly if you ask anybody at Oculus, it's, yeah, of course, they're all going to do that. And it certainly seems that there is a logical collision of real time and film. And that's really only now that there's enough hardware processing power and enough technology, enough skills, enough knowledge for this merger of real time and film visual effects quality assets. So it's pretty apparent that there's going to be quite a movement in this direction. And we were hoping to lead that and build a software tool that will be available to everyone who wants to do this.

[00:04:46.907] Kent Bye: And so who are some of the target markets that you're going after with using this specific application of virtual reality?

[00:04:53.870] Philip Lunn: Well, certainly retail is one, and film trailers will be another one. And a third one would be objects that are very difficult to get to, what we call high-value objects that you can't get close to, like high-value large museum pieces, for instance.

[00:05:09.176] Kent Bye: I see. What are some of the things that drove you to get into virtual reality development?

[00:05:15.268] Philip Lunn: Well, you know, I've been in the computer graphics business for 20-something years. Before Nurulize, I was CEO of a company I founded back in 2003 called Bunkspeed, and we were focused on high-end real-time ray tracing and real-time rendering for the automotive and industrial design world. And the focus of my company then was to make it very simple. And after seeing the Oculus for the first time, it was obvious to me that this is the biggest breakthrough in computing that there has been in 25 years. So to me, it's the biggest thing that's happened in computing.

[00:05:45.655] Kent Bye: Great. And finally, where do you see virtual reality going? What do you see coming next?

[00:05:50.406] Philip Lunn: Well, if you've read the book Ready Player One, it pretty much lays out a roadmap for where virtual reality is going to head. I fully believe that for many people, the Oculus headsets like this, virtual reality headsets, will replace your monitor for day-to-day work. And if it's replacing your monitor, then you're going to have a very large space that you're going to want to fill. And what you fill that with is what we want to help with. And how you fill it is really where our interest lies. So for many, many millions of people, I believe that it will replace your desktop monitor. And if you're a stock trader, you could have 20 monitors with live feeds all surrounding you, or if you want to have an IMAX theater behind you and have your desktop in front of you and a 3D scan of your girlfriend or wife or friend or whoever you want, your kids sitting next to you or your dog even, a 3D scan of your dog sitting next to you on your desk. You could basically recreate an environment and even if you actually work in an 8x8 windowless box, you could be in a penthouse apartment in Soho, New York overlooking the Hudson. So that's what's cool about it to me. It really allows you to create an environment that is a dream environment that you may never be able to afford, but yet you can actually have this super high resolution, super rich, immersive place where you could live and work.

[00:07:04.410] Kent Bye: Are there plans to travel around the world and take all sorts of exotic locations and put them in as well?

[00:07:09.733] Philip Lunn: Interestingly, we just got back from Japan capturing a very, very high-end retail location, which is very exciting, and we can't wait for it to be shown so we can talk about it. But yeah, we think that there'll be lots of travel to exotic places, and capturing really exotic, unobtainable things is obviously just a really great part of the job.

[00:07:30.645] Kent Bye: Yeah, and there also seems a component of creating things that are impossible, things that don't exist in reality. Are you guys going to start to kind of create new worlds like that as well?

[00:07:38.751] Philip Lunn: Well, you know, for us and our process, that's something that is more on the fantasy level. Obviously, with our experience in computer graphics in general and in film visual effects and film rendering, we certainly could create any kind of world that looks very realistic. But mostly we're focused on capturing this reality, you know, capturing what's already existing. A new piece of architecture that goes up? Let's go capture that. Let's go capture the inside of the Guggenheim so you can actually walk in that Guggenheim Museum and see the pieces as they are. So, in some sense, you know, the reality capture is part of what we're after, but we can certainly do anything fantasy as well, so. Great. Well, thank you so much. Thank you very much.
