Transferring Shape Keys & Technical Debt in the Codebase

Issue #2209 open
Midnight Arrow created an issue

I’ve spent a few hours trying to get some code to work properly.

Prior to calling the following function, my code re-opened every object’s DSON file to extract the content_type property (see #2185). This allows me to sort objects programmatically. For my initial tests, I focused on facial attachments. The following function takes the context from the operator, an object with the "Actor" content_type (i.e. a Genesis 8 figure), a list of objects with the "Follower/Attachment/Head/Face" content_type (eyelashes and eyebrows, in this case), and finally the body part as an enum string ('Face').

import bpy
from bpy.types import Context, Object

import import_daz


def transfer_shape_keys(context:Context, source:Object, followers:list[Object], bodypart:str) -> None:

    # Override the context so the operator behaves as if the Genesis figure
    # were active and the followers were selected, regardless of the actual
    # viewport selection.
    actor_context:dict = context.copy()
    actor_context["object"] = source
    actor_context["active_object"] = source
    actor_context["selected_objects"] = [ source, *followers ]

    # classifyShapekeys() is an undocumented internal function that maps
    # each shape key name to the body part it affects (see #2186).
    mapped_morphs:dict = import_daz.selector.classifyShapekeys(source, source.data.shape_keys)

    # Keep only the morphs that belong to the requested body part.
    morphs:list = list()
    for key in mapped_morphs:
        if mapped_morphs[key] == bodypart:
            morphs.append(key)

    import_daz.set_selection(morphs)
    with context.temp_override(**actor_context):
        bpy.ops.daz.transfer_shapekeys(
            bodypart = bodypart,
            needsTarget = False,
        )

    return

However, this code does not work.

Leaving aside the problem of having to extract the bodypart property from an undocumented internal function (which I touched on in #2186), Diffeomorphic’s code has several issues.

Blender allows you to specify a context override (actor_context, above). Essentially, you take the current state of the program and substitute new data for an operator to operate on. In the case of the preceding code, even though nothing is selected in the viewport, the overridden context tells the program to act as if the Genesis figure is the active_object and the follower objects are the selected_objects. This is a documented part of Blender’s Python API, so this functionality should ideally be supported.

https://docs.blender.org/api/current/bpy.ops.html#overriding-context
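For what it’s worth, the documented pattern works as expected with built-in operators. Here is a minimal, self-contained sketch (it assumes the default Cube exists, purely for illustration) that deletes an object without touching the viewport selection:

import bpy

# Hypothetical example object; assumes the default "Cube" exists.
obj = bpy.data.objects["Cube"]

override = bpy.context.copy()
override["object"] = obj
override["active_object"] = obj
override["selected_objects"] = [obj]

# object.delete() acts on selected_objects, so only "Cube" is removed.
with bpy.context.temp_override(**override):
    bpy.ops.object.delete()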

However, the Diffeomorphic importer creates a copy of the Genesis mesh, named “_TRIHUMAN”, and adds it to the scene so that it can enter Edit mode. Looking through the code, it seems that after the transfer_shapekeys operator is called, the code does some global selecting/unhiding. Whatever this does, it appears to circumvent the context override that I created, and the operator fails with RuntimeError: Operator bpy.ops.mesh.reveal.poll() failed, context is incorrect.

If I remove the context override (with context.temp_override(**actor_context)), then it works. However, that requires me to select the objects in the viewport like normal, which defeats the purpose of automating the exporter with the content_type property.

Furthermore, I’m not sure what the purpose of the _TRIHUMAN mesh is (there are no comments, for starters). It seems to be used to detect which faces to extract shape keys from. But Blender has a built-in data structure for that: the BVHTree.

https://docs.blender.org/api/current/mathutils.bvhtree.html

All you need to create a BVHTree is a set of vertices and the vertex indices that constitute its faces. You can take the shapekey data directly from the base mesh, build a BVH in Python, and use it to detect locations on a mesh, without creating a disposable _TRIHUMAN object that has to be added to and deleted from the scene tree and that interferes with context overrides.
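A minimal sketch of what I mean, assuming source is the Genesis figure object (FromPolygons accepts quads and n-gons directly, so no triangulated copy is needed):

from mathutils.bvhtree import BVHTree

mesh = source.data
basis = mesh.shape_keys.key_blocks["Basis"]

# Vertex positions come from the basis shape key, faces from the base mesh.
vertices = [point.co.copy() for point in basis.data]
polygons = [list(poly.vertices) for poly in mesh.polygons]

bvh = BVHTree.FromPolygons(vertices, polygons)

# find_nearest() returns (position, normal, face_index, distance);
# "follower_co" is a placeholder for a follower vertex position.
position, normal, face_index, distance = bvh.find_nearest(follower_co)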

Currently, the codebase carries some major technical debt due to its age (Thomas said it’s ten years old) and its departure from current best practices. Every time I try to follow what it’s doing, the lack of comments and the use of multiple inheritance leaves me confused. So what can be done to clean up the codebase and make it more modular and robust (thereby allowing content_type automation)? Or would it be easier to restart development from scratch and design it with automation in mind?

Comments (6)

  1. Midnight Arrow reporter

    To clarify a little what I mean by “technical debt” and “best/latest practices”: the Blender developer manual touches on this.

    There are two typical, related code quality issues with operators:

    • Abusing operator callbacks as functions: Often, operator exec() and invoke() callbacks contain a bunch of logic at mixed levels of abstraction. For example, iterators, complex condition checking, bit-flag operations, calls to other functions, ... This indicates that the operators deal with business logic themselves, rather than letting the model handle it -- a violation of the Model, View, Controller design that has consequences. Operators should just use high-level API functions of well defined modules. These should be unit tested and may be used by other parts of Blender, like the Python API. The operator then just puts a few pieces together to perform an action through the UI.

    https://developer.blender.org/docs/features/interface/operators/#differentiate-between-functions-and-operators

    Handling absolutely every piece of logic inside insanely complicated operator classes with multiple inheritance, like the exporter does currently, is expressly discouraged by current Blender coding standards.
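    To illustrate the pattern the manual describes, here is a hypothetical sketch (all names are made up): the business logic lives in a plain, testable function, and the operator is only a thin wrapper that the UI and the Python API can share.

    import bpy

    def transfer_bodypart_morphs(source, followers, bodypart):
        """Business logic: no bpy.ops, no UI state, unit-testable."""
        ...

    class DAZ_OT_transfer_morphs(bpy.types.Operator):
        """Thin controller: gathers inputs from the context and delegates."""
        bl_idname = "daz.transfer_morphs_example"
        bl_label = "Transfer Morphs (Example)"

        bodypart: bpy.props.StringProperty()

        def execute(self, context):
            followers = [ob for ob in context.selected_objects if ob is not context.active_object]
            transfer_bodypart_morphs(context.active_object, followers, self.bodypart)
            return {'FINISHED'}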

  2. Alessandro Padovani

    My personal opinion.

    I know the Blender API very little; you seem to be an expert on it, so your comments are very interesting to me, thank you. However, refactoring the code to follow best practices as you’re suggesting seems a daunting task to me; consider that Thomas is a single developer, though extremely talented.

    As for the lack of comments in the code, that’s something I miss myself. Not that I’d be able to understand the code if comments were there, as Thomas is far more advanced in Python than I am, but at least it would be easier to get what each chunk of code is doing. Obviously Thomas focused on getting things done rather than on making the code readable to others, which again is understandable for a single developer with limited time.

    Also, the addon targets Blender versions from 2.9 to 4.2 and aims to be compatible with all of them. This adds complexity to the code. A different approach could be to abandon the old version and start a new one whenever the Blender API changes, essentially as Blender itself does with LTS releases, where the API is frozen. We’d end up with a separate addon version for every major Blender release. Of course, this would also require splitting the docs, again as Blender does.

    I’m not sure how much Thomas is interested in all of this, as the daz importer was born as his personal tool which he likes to share, and it has probably grown into a little monster of complexity through adding new features for every user request imaginable.

  3. Thomas Larsson repo owner

    Shapekey transfer needs a triangles-only mesh, because the value at a target vertex is a weighted average of the three source vertices that make up a triangle. There should be a way to get the triangles from a general mesh without having to build it, because Blender is probably only drawing triangles, both when rendering and in the viewport. I didn’t know about the BVHTree module before, but I’m not sure if it can be used. According to the docs, the find_nearest function returns an index, typically a face index. If that index refers to the original quad mesh it is not useful, because the transfer tool needs only triangles.

    As for comments, that can be a two-edged sword. Comments are useful if they are correct, but it can happen that code is updated and comments are not. Wrong comments are more confusing than no comments at all. You should also remember that this is not professional software but my hobby project, and that I am better at math than at programming.

    The tools are mainly intended to be used from the viewport. They are exposed in a Python API, but that is not how I use them myself, and it is not something that was a major design consideration.

    Starting development from scratch is of course an option, but it is not something that will involve me.

  4. Midnight Arrow reporter

    However, refactoring the code to follow best practices as you’re suggesting seems a daunting task to me; consider that Thomas is a single developer, though extremely talented.

    I’d be glad to submit pull requests and contribute to a refactor of the codebase.

    Also, the addon targets Blender versions from 2.9 to 4.2 and aims to be compatible with all of them. This adds complexity to the code.

    I believe this is part of the problem with containing all logic inside monolithic Operator objects, as the exporter does now. If the functionality is instead broken up into functions behind an abstraction layer, it should be far more robust in the face of API changes.

    def do_something(mesh: Mesh, data: dict) -> None:
        something1(mesh, data["some_data"])
        something2(mesh, data["other_data"])
        return
    

    There aren’t many ways an API change can bring down something like this, unless Blender (unwisely) decides to remove the Mesh type.
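    As a hypothetical sketch of such an abstraction layer (make_override is a name I made up), the version-specific details live in one helper, so an API change touches a single function rather than every operator:

    import bpy

    def make_override(context, active, selected):
        # Return a context manager that overrides the selection.
        # Context.temp_override() exists in Blender 3.2+; anything
        # version-specific stays inside this one helper.
        if bpy.app.version >= (3, 2, 0):
            return context.temp_override(
                active_object = active,
                selected_objects = selected,
            )
        # Older releases passed an override dict to the operator call
        # instead; that branch would go here.
        raise NotImplementedError("unsupported Blender version")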

    I didn’t know about the BVHTree module before, but I’m not sure if it can be used. According to the docs, the find_nearest function returns an index, typically a face index. If that index refers to the original quad mesh it is not useful, because the transfer tool needs only triangles.

    The other half of the equation is mathutils.interpolate.poly_3d_calc.

    https://docs.blender.org/api/current/mathutils.interpolate.html#mathutils.interpolate.poly_3d_calc

    BVHTree.find_nearest() returns a tuple: (position: Vector, normal: Vector, face_index: int, distance: float).

    You take face_index, use that to extract the vertex coordinates from the mesh, and feed that and the position into poly_3d_calc. It returns an array containing the barycentric weights for each vertex on the face. Since the indices don’t change, you can use face_index to extract both the unmorphed and the morphed vertex positions, do some vector/matrix math, and calculate the new offset.
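    A rough sketch of the idea, continuing the BVHTree example from the issue body (morphed is assumed to hold the shape key’s vertex positions, and target_co a follower vertex position):

    from mathutils import Vector
    from mathutils.interpolate import poly_3d_calc

    position, normal, face_index, distance = bvh.find_nearest(target_co)

    # Barycentric weights of the hit position within the face.
    face = mesh.polygons[face_index]
    corners = [vertices[i] for i in face.vertices]
    weights = poly_3d_calc(corners, position)

    # Interpolate the morph offset across the face with the same weights.
    offset = Vector((0.0, 0.0, 0.0))
    for weight, index in zip(weights, face.vertices):
        offset += weight * (morphed[index] - vertices[index])

    new_target_co = target_co + offset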

    I’m not sure why the transfer tool needs to be limited to triangles. If you have the face’s vertex positions/normals (from the BVHTree) and the barycentric weights, then surely you can calculate everything you need?
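    And even if triangles really are required, Blender can expose a triangulated view of a mesh without creating a throwaway object (I haven’t tested this for the transfer tool, but the API exists):

    mesh = source.data
    mesh.calc_loop_triangles()

    for tri in mesh.loop_triangles:
        # Three vertex indices, plus the index of the original polygon
        # the triangle was derived from.
        print(tri.vertices[:], tri.polygon_index)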

    As for comments, that can be a two-edged sword. Comments are useful if they are correct, but it can happen that code is updated and comments are not. Wrong comments are more confusing than no comments at all.

    Again, this is solved by breaking functionality down into modular pieces. Once a function is tested and works well, the comments shouldn’t need to be changed. Plus, smaller functions are easier to parse, so there’s less need for comments in the first place.

    The tools are mainly intended to be used from the viewport. They are exposed in a Python API, but that is not how I use them myself, and it is not something that was a major design consideration.

    For my purposes, I sometimes need to create crowd scenes (say, thirty characters) with custom scripting (handling textures, etc.). This isn’t feasible to do manually from the viewport, so those design considerations are interfering with what I need it to do.

  5. Alessandro Padovani

    I’d be glad to submit pull requests and contribute to a refactor of the codebase.

    Thank you very much for sharing your expertise on the Blender API. If Thomas is interested, that would help make the code more readable and robust. Probably best done after we have a stable 4.2.1, though.

  6. Thomas Larsson repo owner

    I suppose it could work if you work on your own fork. There have been a few pull requests in the past, which I didn't handle properly, but that should be possible to learn. But I don't want to add any major changes before the next stable release.

    However, I am concerned about losing control over the code. Other people have contributed pieces of code that I didn’t fully understand, so if that code contains bugs or some prerequisite changes (e.g. a new Genesis generation), I can’t fix the issues myself. My own coding style may be lacking in many respects, but it is rather consistent, and I can fix bugs that appear years later.

    But we shall see. If you fork the plugin I can look at the pull requests after the next stable release.
