Option to load FaceCap data in as FACS animation?

Issue #353 resolved
R. Keoni Garcia created an issue

With the addition of FACS elements in the Genesis 8.1 morph sets, I found myself wondering how difficult it would be to add proper support for loading facial mocap animation from the text export of FaceCap (an iOS facial-capture app for devices with Face ID).

I’ve already cobbled together a FaceCap text-format reader and have had success mapping and animating the DazFacs property set. The next step for this code demo is to connect the eye angles and the head position/angle. I’m sure more than a little finesse work and some UI elements would be needed beyond that, since I’m relatively new to the Blender animation workflow. For instance, I’m not quite sure how one would retain the facial-capture transfer capability while also adding the Rigify or MHX rig.
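A minimal sketch of how such a reader can look (this is not the attached prototype; the "bs"/"k" line prefixes, millisecond timestamps, and column layout here are my reading of the csv-like export rather than an official spec):

    # Hypothetical minimal FaceCap TXT reader -- column layout is a guess
    # based on what the export looks like, not an official spec.
    def parse_facecap(path):
        names, frames = [], []
        with open(path) as fp:
            for line in fp:
                words = line.strip().split(",")
                if not words or not words[0]:
                    continue
                if words[0] == "bs":
                    names = words[1:]                 # 52 ARKit blendshape names
                elif words[0] == "k":
                    t = float(words[1]) / 1000.0      # ms -> seconds
                    head_pos = [float(w) for w in words[2:5]]
                    head_rot = [float(w) for w in words[5:8]]    # degrees
                    eye_l = [float(w) for w in words[8:10]]      # degrees
                    eye_r = [float(w) for w in words[10:12]]
                    weights = [float(w) for w in words[12:]]     # 0..1 coefficients
                    frames.append((t, head_pos, head_rot, eye_l, eye_r, weights))
        return names, frames

Each weight can then be keyed onto the matching DazFacs property.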

Meanwhile, I’ve attached my still-evolving prototype code; you are welcome to re-use it, modify it to taste, or disregard it in whole or in part. Down the road I’ve also been looking at the source code for Blender Motion Capture, since it can also export ARKit body-motion sequences.

Comments (18)

  1. Thomas Larsson repo owner

    This sounds very interesting. Do you know where one can find a spec for the FaceCap and ARKit formats, and better still, some sample files to test?

  2. R. Keoni Garcia reporter

    Attached is a short clip I just recorded, a quick random sampler attempting range of motion. This is the csv-like text format that the provided FaceCapReader pulls in. The FBX equivalent is considerably larger but includes geometry.

  3. R. Keoni Garcia reporter

    Here’s their webpage: https://www.bannaflak.com/face-cap/ .

    I got the eye motion to work last night by making sure to Add Extra Face Bones before loading FACS, then animating the lEye and rEye pose bones. Note that the angles appear to be in degrees.

    However, those pose bones appear to also be driving the eyelid shapes at the same time. Normally that would be preferred, but it may already be covered by the eyelid FACS (i.e., I just want to drive the actual eye itself). The eyelash geometry also doesn’t follow correctly, though I’m not sure whether that’s a separate issue with the FACS import and/or the extra face bones.
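    For reference, the eye keying boils down to something like this minimal sketch (assuming the armature is the active object and the extra face bones exist; the degree-to-radian conversion is the main gotcha):

        import math
        import bpy
        from mathutils import Euler

        # Sketch: keyframe the lEye/rEye pose bones from FaceCap's degree
        # angles (rx, ry) at a given frame.
        def key_eyes(rig, frame, leye_deg, reye_deg):
            for name, (rx, ry) in (("lEye", leye_deg), ("rEye", reye_deg)):
                pb = rig.pose.bones[name]
                pb.rotation_mode = 'XYZ'
                pb.rotation_euler = Euler((math.radians(rx), math.radians(ry), 0.0))
                pb.keyframe_insert("rotation_euler", frame=frame)

        key_eyes(bpy.context.object, 1, (5.0, -2.0), (5.0, -2.0))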

  4. R. Keoni Garcia reporter

    Mostly for the curious, here is the list (with illustrations) of the FACS blendshape coefficients that ARKit provides and FaceCap records: https://developer.apple.com/documentation/arkit/arfaceanchor/blendshapelocation

    Unreal’s Live Link Face may be an equivalent option for facial mocap. Maxon has its own face/body ARKit capture app (Moves by Maxon) that I haven’t seen non-Cinema4D interfaces for; and Blender Motion Capture (face/body) on GitHub is fully open source and ships a Blender plugin, but it records bone-motion coordinates for its own facial control rig before transferring the animation.

  5. Thomas Larsson repo owner

    Thanks for this info. I made a quick importer, which is found at the top of the FACS Units panel. It seems to give rather sensible output, but I haven’t compared it with the FBX file.

  6. R. Keoni Garcia reporter

    This is looking fantastic - thanks! A few things I noticed:

    • After comparing the import to the original, the default FPS should probably be 24? Though this may be an artifact of the FBX conversion and import, I suppose. (See the frame-mapping sketch at the end of this comment.)
    • The head rotations weren’t apparent, if they were happening. I’ll poke around a bit more on that. I’m betting I have to make the rest of the DAZ rig posable for that to work.
    • Most important: the eye angles were reversed (X ↔ Y), and one of them had to be negated, in order to match the reference FBX behavior.

    In lines 209-210 of facecap.py:parse, I did this and the motion matched much better:

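                        # X and Y swapped, with the new Y negated, to match the
                        # reference FBX (D being the degree-to-radian factor)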
                        leyekeys[t] = Euler((D*float(words[9]), -D*float(words[8]), 0.0))
                        reyekeys[t] = Euler((D*float(words[11]), -D*float(words[10]), 0.0))
    

    Regarding the eyelash displacement, this looks like it may be strictly the FACS import plus Merge Rigs resulting in a progressive displacement of the eyelash geometry. It’s most apparent with Eye Look Up (Left/Right), but others also show it to a lesser degree. I hadn’t realized how complex things had become with Gen8 and swappable facial hair. This possibly should be a separate ticket, assuming I’m not doing something incorrect/silly merging the eyelash rig into the primary rig?
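    On the FPS point above, retiming is cheap since the timestamps are absolute. A sketch of one possible timestamp-to-frame mapping (assuming millisecond timestamps, which is my reading of the export):

        # Sketch: place each FaceCap keyframe on the nearest scene frame.
        # Assumes timestamps in milliseconds; fps is whatever the scene uses.
        def timestamp_to_frame(t_ms, fps=24.0, start=1):
            return start + round(t_ms / 1000.0 * fps)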

  7. R. Keoni Garcia reporter

    The "dislocated eyelashes" issue that seems to come along with some of the FACS controls. I’ve seen this both with the original Gen8 transmapped lashes and with fibermesh-based ones. Still scratching my head...

  8. Thomas Larsson repo owner

    I didn’t see the problems with the eyes, because they are all white in the viewport, so I couldn’t make out the irises. The problem is most probably that the euler angles are specified in world space, but I just plugged them in as local rotations. This may sort of work for the head, whose local coordinates are close to the global coordinates (if Y is up in the FaceCap file), but some axes would be flipped for the eyes. The correct way to handle this involves the bone matrix in rest pose, which is not that difficult but may take some time to get right.
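    Something along these lines, perhaps (an untested sketch of the rest-matrix route; pb is the pose bone):

        from mathutils import Euler

        # Sketch: express a world/armature-space euler as a local pose-bone
        # rotation by conjugating with the bone's rest-pose matrix.
        def world_to_local_euler(pb, world_euler):
            rest = pb.bone.matrix_local.to_3x3()    # rest orientation
            mat = rest.inverted() @ world_euler.to_matrix() @ rest
            return mat.to_euler('XYZ')

        # e.g. pb.rotation_euler = world_to_local_euler(pb, Euler((0.1, 0.0, 0.0)))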

  9. Thomas Larsson repo owner

    Since the head and eye bones are almost perfectly lined up with the coordinate axes, it was easier to just flip axes. I created a test file with simple bone rotations, and the bones rotate correctly provided that FaceCap uses a coordinate system with Y up.
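    The flips amount to the usual right-handed Y-up to Z-up remap (a sketch; the exact signs depend on how each bone lines up with the axes):

        # Sketch: remap a right-handed Y-up vector into Blender's Z-up frame.
        def yup_to_zup(x, y, z):
            return (x, -z, y)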

    The head location, if you use that, actually moves the hip. Moving the head relative to the neck does not make much sense; it just compresses the neck in a very strange manner.

  10. CookItOff

    I’m assuming the iPhone dot-matrix facial recording is at 30p? I only ask since Apple seems to target 30/60/120 fps for its video recording.

    I would hope there will be a Diffeomorphic 30 fps FACS conversion option in the future. I animate all my stuff at 30 to 60 fps.

    Is Unreal’s Live Link an option with Diffeomorphic?

    Lashes and peach fuzz come unparented ("not moving with skin") when the cheeks, mouth, and brow move the face, but not with blinking. This happens even without using motion capture.

    Thanks,

    Tim
