#105: Sarah Amsellem on using Faceshift for capturing facial animations

Sarah Amsellem is a lead engineer at Faceshift, which uses a 3D camera to capture facial animations. She talks about the workflow for getting Faceshift up and running, and says that having a good 3D model is key.

Faceshift is being used for High Fidelity, as well as throughout the animation and gaming industry, to bring more human emotion and expression into characters through facial animation. At the moment though, it’s difficult to do full facial tracking while wearing a virtual reality HMD. Sarah suggests that eventually they might be able to fuse eye tracking with other facial tracking technologies in order to bring more human expression into real-time social interactions in virtual environments.

For more information on Faceshift, check out this product page.

Here’s a demo reel that was running at GDC 2015:

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Rough Transcript

[00:00:05.412] Kent Bye: The Voices of VR Podcast.

[00:00:12.079] Sarah Amsellem: My name is Sarah Amsellem, I'm a lead engineer at Faceshift, and what we do is software that makes it easy for you to animate 3D models. So basically we use a 3D camera like the Kinect or PrimeSense, and now we also support the Intel RealSense camera. With that, you can train a profile of yourself with the application. It's about a 15-minute setup, tops, very easy to do. And then you can already be live tracking and animating your avatar.
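As a concrete illustration of the loop Sarah describes, here is a minimal Python sketch: a calibrated tracker emits one normalized weight per blendshape channel each frame, and those weights drive a rigged avatar. The `FakeTracker` and `Avatar` classes and the channel names are illustrative stand-ins, not Faceshift's actual API.

```python
import math

CHANNELS = ["jawOpen", "mouthSmile_L", "mouthSmile_R", "browUp"]

class FakeTracker:
    """Stand-in for a calibrated depth-camera tracker: yields one
    normalized weight per blendshape channel for each frame."""
    def __init__(self, frames, fps=30.0):
        self.frames, self.fps = frames, fps

    def read(self):
        for i in range(self.frames):
            t = i / self.fps
            # A fake oscillating "smile", just to have data to animate.
            s = 0.5 + 0.5 * math.sin(2 * math.pi * 0.5 * t)
            yield {"jawOpen": 0.1, "mouthSmile_L": s,
                   "mouthSmile_R": s, "browUp": 0.0}

class Avatar:
    """Toy avatar: holds the current weight of each blendshape channel."""
    def __init__(self, channels):
        self.weights = dict.fromkeys(channels, 0.0)

    def apply(self, frame):
        for name, w in frame.items():
            if name in self.weights:
                # Blendshape weights are conventionally clamped to [0, 1].
                self.weights[name] = max(0.0, min(1.0, w))

if __name__ == "__main__":
    avatar = Avatar(CHANNELS)
    for frame in FakeTracker(frames=90).read():
        avatar.apply(frame)   # in a real app this would deform the mesh
    print(avatar.weights)
```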

[00:00:39.758] Kent Bye: I see. And so what kind of rigging does it require to be able to do that type of nuanced facial animation? Or how would you suggest people go about actually integrating the Faceshift data into one of their models?

[00:00:52.690] Sarah Amsellem: So first of all, it's very important to have a very good model. You need to build your own rig, you need to build your own model using Maya or any other software, and within Faceshift we support 50 blend shapes. So what is important is to do the retargeting from your own rig to our template model. And that's why the quality of your rig is very important, because it ultimately determines the output quality of the tracking.
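One way to picture the retargeting step is as a linear map: the tracker outputs weights on the template's blendshape channels, and a per-rig mapping re-expresses them on your own rig's channels. The matrix values and channel names below are made up for illustration, assuming a simple hypothetical custom rig; this is a sketch of the idea, not Faceshift's actual solver.

```python
import numpy as np

template_channels = ["jawOpen", "mouthSmile_L", "mouthSmile_R"]
rig_channels = ["mouth_open", "smile"]  # a simpler hypothetical custom rig

# M[i, j] = contribution of template channel j to rig channel i.
M = np.array([
    [1.0, 0.0, 0.0],   # mouth_open <- jawOpen
    [0.0, 0.5, 0.5],   # smile      <- average of left/right smile
])

def retarget(template_weights):
    w = np.array([template_weights[c] for c in template_channels])
    out = np.clip(M @ w, 0.0, 1.0)
    return {c: float(v) for c, v in zip(rig_channels, out)}

print(retarget({"jawOpen": 0.2, "mouthSmile_L": 0.8, "mouthSmile_R": 0.6}))
# -> mouth_open = 0.2, smile ≈ 0.7
```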

[00:01:18.611] Kent Bye: I see. And so if I'm not a 3D modeler, or some of our listeners are not 3D modelers, what would you recommend as a good place or resource to get a 3D model that may already be set up to use with it?

[00:01:30.049] Sarah Amsellem: I'm not very familiar with those things because I'm more on the engineering side than the artistic side, but I know there are quite a few companies, like maybe Mixamo, where you can easily create models, and then you can import them into Faceshift, do the retargeting there, and do the animation that way.

[00:01:49.517] Kent Bye: I see. So, tell me a little bit about the process for how people are actually using this. What types of use cases have you seen Faceshift most often being used for?

[00:01:58.922] Sarah Amsellem: Well, mostly it's used in games, and it's also used in movies. We have a lot of independent people, a lot of freelance people, but also big studios. We were working a lot with DreamWorks lately, and they did little animation clips for YouTube where they actually used Faceshift. So the typical setup for them is they have a mocap stage, they have an actor, they have a helmet with a 3D sensor on it, and they record the facial animation. And then after having done all those recordings, they post-process the clips, animators clean them up a bit, and then they use that as a basis for the animation. There are also a lot of independent people using it for all kinds of projects.

[00:02:40.728] Kent Bye: And so, yeah, maybe tell me a little bit about the difference between, say, mounting a RealSense or a PrimeSense camera on top of your laptop and using that to capture your face versus having this helmet. What are the use cases for each?

[00:02:56.351] Sarah Amsellem: Well, with the helmet you're much more flexible, so you could also combine it with a full-body mocap system, and that's what most of the big studios like to use, because they can synchronize the body with the face using timecode and that kind of thing. And then I guess it's just flexibility: you can move more easily and have much more freedom in the animation that way than just sitting in front of your laptop, but that will work as well, depending on what you're actually trying to achieve.
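As a rough sketch of the timecode synchronization she mentions, the snippet below converts "HH:MM:SS:FF" timecodes to seconds and finds which face-capture frame lines up with a given body-mocap frame. The timecode format and frame rates are illustrative assumptions, not Faceshift specifics.

```python
def tc_to_seconds(tc, fps):
    """'HH:MM:SS:FF' non-drop-frame timecode -> seconds."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return (h * 60 + m) * 60 + s + f / fps

def face_frame_for_body_frame(body_frame, body_start_tc, body_fps,
                              face_start_tc, face_fps):
    # Absolute time of the body frame, then re-expressed as a face frame.
    t = tc_to_seconds(body_start_tc, body_fps) + body_frame / body_fps
    return round((t - tc_to_seconds(face_start_tc, face_fps)) * face_fps)

# Body take at 120 fps starting at 01:00:10:00; face take at 30 fps
# starting at 01:00:08:00. Body frame 240 is 2 s into the body take,
# i.e. 4 s after the face take started, so face frame 120.
print(face_frame_for_body_frame(240, "01:00:10:00", 120.0,
                                "01:00:08:00", 30.0))  # -> 120
```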

[00:03:24.235] Kent Bye: And so, tell me a little bit more about, is there a Unity plug-in, or how do people sort of integrate this with their existing game engines?

[00:03:31.138] Sarah Amsellem: So we provide free plug-ins for Unity, Maya, and MotionBuilder, for example. You can stream from Faceshift Studio to those plug-ins. You can also import clips there, you can play back clips that you recorded with Faceshift, and you can export from Faceshift to various formats like FBX.
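To give a feel for what the receiving end of such a stream could look like, here is a hedged Python sketch that reads framed packets of blendshape coefficients over TCP. The packet layout (a little-endian uint32 count followed by that many float32 weights) and the port number are made-up stand-ins, not Faceshift's actual wire format or its plugin API.

```python
import socket
import struct

HOST, PORT = "127.0.0.1", 33433  # port chosen for illustration only

def read_exact(sock, n):
    """Read exactly n bytes, or raise if the stream closes early."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed")
        buf += chunk
    return buf

def stream_frames(host=HOST, port=PORT):
    """Yield one tuple of blendshape weights per received frame."""
    with socket.create_connection((host, port)) as sock:
        while True:
            (count,) = struct.unpack("<I", read_exact(sock, 4))
            yield struct.unpack(f"<{count}f", read_exact(sock, 4 * count))

# for weights in stream_frames():
#     apply_to_avatar(weights)  # hypothetical engine-side hook
```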

[00:03:53.324] Kent Bye: I see. And so would someone ever want to record the raw data that's coming from Faceshift without a model and then sort of form the model around that? Or do you really need to have a model that's already set up? I'm just trying to see if you're trying to build it from scratch by capturing the data, or if it actually needs to be correlated to an existing 3D model.

[00:04:17.033] Sarah Amsellem: When you start using Faceshift, you already have a project in mind, so normally you start with a nice 3D model and then start recording clips with it. It's very important to have a nice model, otherwise you're not going to see the output correctly. But of course you can still reuse clips that you've recorded previously and try to use them on another model. So it's really flexible that way: you can save the clips, keep them, and still map them to other models later.

[00:04:46.700] Kent Bye: And, yeah, we were talking a little bit earlier about wearing a virtual reality head-mounted display and how, if you're actually using this within VR, it may be difficult to track the eyes. Maybe talk about why it's challenging to be within virtual reality but also track your face with Faceshift.

[00:05:04.362] Sarah Amsellem: Well, the major problem is that you actually have a headset on your face, so we don't see your whole face. That's a big problem. And what you would want in a VR environment is to have good eye contact with the virtual characters. So the ideal setup would be to have an eye tracker within the headset to track the eyes of the person, then have another camera tracking the rest of the face, and then combine that information to get nice tracking within the VR environment.
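The fusion she describes can be pictured as a simple merge: an in-headset eye tracker owns the eye and blink channels, while the external camera drives the rest of the face. The channel names below are illustrative, not any real device's output.

```python
# Channels the eye tracker is allowed to override (illustrative names).
EYE_CHANNELS = {"eyeLookLeft", "eyeLookRight", "eyeLookUp",
                "eyeLookDown", "blink_L", "blink_R"}

def fuse(face_weights, eye_weights):
    """Face camera provides a full frame; eye tracker wins on eye channels."""
    fused = dict(face_weights)
    for name, w in eye_weights.items():
        if name in EYE_CHANNELS:
            fused[name] = w
    return fused

face = {"jawOpen": 0.3, "blink_L": 0.0, "eyeLookLeft": 0.0}  # eyes occluded
eyes = {"blink_L": 1.0, "eyeLookLeft": 0.6}
print(fuse(face, eyes))
# -> {'jawOpen': 0.3, 'blink_L': 1.0, 'eyeLookLeft': 0.6}
```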

[00:05:35.691] Kent Bye: Cool. And so what's coming next for Faceshift? What can we see on the horizon?

[00:05:41.015] Sarah Amsellem: So now that we support the Intel RealSense camera, and now that it's going to be integrated into laptops, we're also trying to get into the consumer market, and hopefully we're going to release some nice applications that anybody can use.

[00:05:55.764] Kent Bye: And is there a free version and a pro version, or what's the pricing model for Faceshift?

[00:06:00.272] Sarah Amsellem: So there is Faceshift Studio, and we're going to work on Faceshift Studio Pro, which is more targeted at studios that want to work more with mocap systems. But Faceshift Studio can be downloaded, and there is a free 30-day trial. All you need is a sensor.

[00:06:16.682] Kent Bye: And what's the price point for Faceshift?

[00:06:19.504] Sarah Amsellem: So Faceshift Studio is 1,500 a year as a subscription, and Faceshift Studio Pro is going to be 3,500.

[00:06:28.023] Kent Bye: And do you also sell the helmets that people might use to mount these cameras, so they can use it in a more flexible way?

[00:06:35.996] Sarah Amsellem: So the helmet is still a prototype, we're not selling it yet, but we're working on it, and we're also going to propose some DIY solutions, like a DIY helmet that could be pretty cheap, actually, using bicycle helmets and GoPro mounts. And we also now have a Snorricam vest that could be pretty useful for some studios.

[00:06:59.232] Kent Bye: Awesome, thank you.

[00:07:00.393] Sarah Amsellem: Thank you.
