Option to load FaceCap data in as FACS animation?
With the addition of FACS elements in Genesis 8.1 morph sets, I found myself wondering how difficult it would be to add proper support for loading facial mocap animation from FaceCap (iOS facial capture app for devices with Face ID) text export.
I’ve already cobbled together a FaceCap text format reader and have had success mapping and animating the DazFacs property set. The next step for this code demo is to connect the eye angles and the head position/angle. I’m sure more than a little finesse work and UI would be needed beyond that, since I’m relatively new to the Blender animation workflow. For instance, I’m not quite sure how to retain the facial-capture transfer capability after adding the Rigify or MHX rig.
Meanwhile, I’ve attached my still-evolving prototype code; you are welcome to re-use it, modify it to taste, or disregard it in whole or in part. Down the road I’ve also been looking at the source code for Blender Motion Capture, since it can also export ARKit body motion sequences.
This sounds very interesting. Do you know where one can find a spec for the FaceCap and ARKit formats, and better still some sample files to test with?
Here’s a short clip I just recorded, quickly sampling a range of motion. This is the csv-like text format that the provided FaceCapReader pulls in. The FBX equivalent is considerably larger but includes geometry.
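For anyone curious, the reader mostly boils down to splitting that csv-like text. A minimal sketch; the exact column layout assumed here (timestamp, head position, head and eye angles in degrees, then the blendshape coefficients in header order) is my guess and should be checked against a real export:

```python
# Minimal sketch of a reader for FaceCap's csv-like text export.
# Assumed layout (verify against a real file): a header line listing
# the blendshape names, then one frame per line.

def parse_facecap_text(text):
    """Return (names, frames); each frame is a dict of parsed fields."""
    lines = [ln.strip() for ln in text.splitlines() if ln.strip()]
    names = [n.strip() for n in lines[0].split(",")]
    frames = []
    for line in lines[1:]:
        vals = [float(v) for v in line.split(",")]
        frames.append({
            "time": vals[0],
            "head_pos": vals[1:4],    # assumed centimeters
            "head_rot": vals[4:7],    # assumed euler degrees
            "eye_l": vals[7:9],       # assumed yaw/pitch, degrees
            "eye_r": vals[9:11],
            "weights": dict(zip(names, vals[11:])),
        })
    return names, frames
```

Everything past the fixed transform columns is zipped against the header, so the reader doesn’t hard-code the 52 ARKit coefficient names.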
Here’s the FBX of the same sample. It’s not very useful in Blender, since it’s ASCII FBX, which Blender won’t import.
Here’s their webpage: https://www.bannaflak.com/face-cap/ .
I got the eye motion to work last night by making sure to Add Extra Face Bones before loading FACS, then animating the lEye and rEye pose bones. Note that the angles appear to be in degrees.
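In case it helps anyone reproducing this: since Blender pose bones want radians, the transfer is essentially a degrees-to-radians conversion before assigning the euler. A sketch, with the Blender-side assignment shown in comments (the rig object name and frame variable are placeholders):

```python
import math

def deg_to_rad_euler(angles_deg):
    """Convert an (x, y, z) euler triple from degrees to radians."""
    return tuple(math.radians(a) for a in angles_deg)

# Inside Blender this would be applied roughly like so
# (object name and frame are hypothetical):
#
#   import bpy
#   rig = bpy.data.objects["Genesis 8.1 Female"]
#   for name, angles in (("lEye", eye_l_deg), ("rEye", eye_r_deg)):
#       pb = rig.pose.bones[name]
#       pb.rotation_mode = 'XYZ'
#       pb.rotation_euler = deg_to_rad_euler(angles)
#       pb.keyframe_insert("rotation_euler", frame=frame)
```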
However, those pose bones also appear to be driving the eyelid shapes at the same time. Normally that would be preferred, but it may already be covered by the eyelid FACS (i.e. I just want to drive the actual eye itself). The eyelash geometry also doesn’t follow correctly, though I’m not sure whether that’s a separate issue with the FACS import and/or the extra face bones.
Updated script with addition of lEye/rEye euler angle transfer.
Converted (via modo) the ASCII FBX to a binary 2018 format that Blender can load, since this may be useful for side-by-side reference.
Mostly for the curious, the list (with illustrations) of the FACS blendshape coefficients that ARKit provides and FaceCap records: https://developer.apple.com/documentation/arkit/arfaceanchor/blendshapelocation
Unreal’s Live Link Face may be an equivalent option for facial mocap. Maxon has its own face/body ARKit capture app (Moves by Maxon), though I haven’t seen non-Cinema 4D interfaces for it. Blender Motion Capture (face/body) on GitHub is fully open source and includes a Blender plugin, but it uses a facial control rig whose bone motion coordinates it records before transferring the animation.
Thanks for this info. I made a quick importer, which is found at the top of the FACS Units panel. It seems to give rather sensible output, but I haven’t compared it with the fbx file.
This is looking fantastic - thanks! A few things I noticed:
In lines 209-210 of facecap.py:parse, I did this and the motion matched much better:
Regarding the eyelash displacement: this looks like it may be strictly the FACS import plus Merge Rigs resulting in a progressive displacement of the eyelash geometry. It’s most apparent with Eye Look Up (Left/Right), but others show it to a lesser degree. I hadn’t realized how complex things had become with Gen8 and swappable facial hair. Possibly this should be a separate ticket, assuming I’m not doing something incorrect/silly merging the eyelash rig into the primary rig?
The "dislocated eyelashes" seem to come along with some of the FACS controls. I’ve seen this both with the original Gen8 transmapped lashes and with fibermesh-based ones. Still scratching my head...
I’ve re-read your blog entries, and following up on them is on my homework list.
I didn’t see the problems with the eyes, because they are all white in the viewport so I didn’t see the irises. The problem is most probably that the euler angles are specified in world space, but I just plugged them in as local rotations. This may sort of work for the head, whose local coordinates are close to the global coordinates (if Y is up in the facecap file), but some axes would be flipped for the eyes. The correct way to handle this involves the bone matrix in rest pose, which is not that difficult but may take some time to get right.
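A sketch of what I mean, using plain 3x3 lists rather than mathutils: with B the rotation part of the bone’s rest matrix (pose_bone.bone.matrix_local in Blender), a world-space rotation becomes local via R_local = B⁻¹ · R_world · B, and for a pure rotation the inverse is just the transpose.

```python
def mat_mul(a, b):
    """3x3 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(m):
    """3x3 transpose; equals the inverse for a pure rotation."""
    return [list(row) for row in zip(*m)]

def world_to_bone_local(r_world, b_rest):
    """Express a world-space rotation in a bone's rest frame:
    R_local = B^T @ R_world @ B."""
    bt = transpose(b_rest)
    return mat_mul(mat_mul(bt, r_world), b_rest)
```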
Since the head and eye bones are almost perfectly lined up with the coordinate axes, it was easier to just flip axes. I created a test file with simple bone rotations, and the bones rotate correctly provided that FaceCap uses a coordinate system with Y up.
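For reference, the axis flip amounts to something like the following; the exact sign and axis choices here are an assumption that has to be verified against the test file.

```python
def yup_to_zup(v):
    """Map a vector from a Y-up frame (assumed for FaceCap) to
    Blender's Z-up frame: X stays, old Z becomes -Y, old Y becomes Z.
    The signs are an assumption to check against the test file."""
    x, y, z = v
    return (x, -z, y)
```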
The head location, if you use that, actually moves the hip. Moving the head relative to the neck does not make much sense, it just compresses the neck in a very strange manner.
Test file updated with location tests. The hip bone now moves in the correct direction.
I would say that this works now, cf. http://diffeomorphic.blogspot.com/2021/02/facecap-test.html
I’m assuming the iPhone dot-matrix facial recording is at 30p? I only ask because Apple seems to target 30/60/120 fps for its video recording.
I would hope there will be a Diffeomorphic 30 fps FACS conversion option in the future. I animate all my stuff at 30 fps to 60 fps.
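Such a conversion would essentially be resampling the captured timestamps onto a fixed frame rate. A minimal linear-interpolation sketch for a single channel; this is not the importer’s actual code, just an illustration:

```python
def resample(times, values, fps):
    """Linearly resample a sampled channel onto a fixed frame rate.

    times  -- increasing timestamps in seconds (as in the capture export)
    values -- one float per timestamp
    fps    -- target frame rate, e.g. 30
    Returns one value per output frame from t=0 to the last timestamp.
    """
    out = []
    n_frames = int(times[-1] * fps) + 1
    j = 0
    for f in range(n_frames):
        t = f / fps
        # advance to the capture segment containing t
        while j + 1 < len(times) and times[j + 1] < t:
            j += 1
        if j + 1 == len(times):
            out.append(values[-1])
            continue
        t0, t1 = times[j], times[j + 1]
        w = 0.0 if t1 == t0 else (t - t0) / (t1 - t0)
        out.append(values[j] * (1 - w) + values[j + 1] * w)
    return out
```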
Is Unreal’s Live Link an option with Diffeomorphic?
Lashes and peach fuzz are unparented ("not moving with skin") when the cheeks, mouth, and brows move the face, but not with blinking. This happens even without using motion capture.