possible speedup for the HD exporter ** includes HD exporter by Donald Dade **

Issue #191 open
Alessandro Padovani created an issue

I do not know the daz script language nor the daz or blender apis. I come from the old-fashioned C language, having worked as a professional programmer some time ago. So this is a guess.

I noticed that in the main loop the script writes to file for every single vertex. Now windows uses a file cache of course, but I suspect that fp.writeLine() could be quite slow. If we could use a string to contain all the vertices of a mesh and then write the string to file with a single fp.write() call, I suspect this may be much faster. Of course this will use more ram, but that shouldn’t be a big issue if we go for one figure at a time.

Please Thomas let me know what you think about this.

edit. Of course the same would apply for the writeUVs(), writeNormals() and writeFaces() functions.

function writeVertices(mesh, geom, hd)
{    
    var nv = mesh.getNumVertices();
    var ne = mesh.getNumEdges();
    var nf = mesh.getNumFacets();

    fp.writeLine("        \"num " + hd + "verts\": " + nv + ",");
    fp.writeLine("        \"num " + hd + "edges\": " + ne + ",");
    fp.writeLine("        \"num " + hd + "faces\": " + nf + ",");
    fp.writeLine("        \"" + hd + "vertices\": [" );

    var c = ","
    for (var i = 0; i < nv; i++)
    {
        var v = geom.getVertex(i);
        if (i == nv-1) c = "";
        fp.writeLine("            [" + v.x + ", " + v.y + ", " + v.z + "]" + c)
    }    
}

Comments (99)

  1. Alessandro Padovani reporter

    Never mind. I just tested it and it doesn’t work; it takes the same time. It means the file cache is working fine and fp.writeLine() doesn’t slow things down. I’ll leave the issue open for a while in case someone has ideas for speeding up the HD exporter.

    Meanwhile TheMysteryIsThePoint from the daz forum may work on a c++ version as he gets some time, so there’s hope.

    function writeVertices(mesh, geom, hd)
    {    
        var nv = mesh.getNumVertices();
        var ne = mesh.getNumEdges();
        var nf = mesh.getNumFacets();
    
        fp.writeLine("        \"num " + hd + "verts\": " + nv + ",");
        fp.writeLine("        \"num " + hd + "edges\": " + ne + ",");
        fp.writeLine("        \"num " + hd + "faces\": " + nf + ",");
        fp.writeLine("        \"" + hd + "vertices\": [" );
    
        var vcache = "";
        var c = ",\r\n";
        for (var i = 0; i < nv; i++)
        {
            var v = geom.getVertex(i);
            if (i == nv-1) c = "";
            vcache = vcache + "            [" + v.x + ", " + v.y + ", " + v.z + "]" + c;
        }
        fp.writeLine(vcache);
    }
    

  2. Thomas Larsson repo owner

    OK, implemented now. There is some speedup, but not very much. Exporting my test character at subdiv level 3 went from 161 seconds to 142 seconds. Not negligible, but less than one could have hoped for.

    I made a bad commit before reading your text to the end, where I put the entire file in a string buffer before writing it to file. As you pointed out, that could be a problem if you don’t have enough ram. Doing one function at a time in ram seems like a good compromise. Doing the entire file as a single string did not reduce the time any further.

    You use “\r\n” to break lines. I use a single “\n” and that works on windows 7. It should work on mac and unix too, because IIRC the extra \r is a windows thing.

  3. Alessandro Padovani reporter

    Yes, unfortunately there’s not much gain; I was hoping for something more. Usually optimizing the inner loops works well enough. I’ll leave this issue open for a while to see if someone else has some clever ideas.

  4. Alessandro Padovani reporter

    Thomas, I took a look at the commit code and I believe there’s a (very) minor issue in your implementation. The extra newline at the end of the vertex list is not really needed and it generates a “hole” in the file. This doesn’t seem to affect the importer though, so it’s just a “style” exercise.

    Below is the correct code; at least it works fine here.

    if (i == nv-1) c = "";

  5. Alessandro Padovani reporter

    I got another one and this seems to work better. In my test the export time goes from 30 to 20 seconds.

    I noticed that the export script exports normals too. This is not necessary since normals are rebuilt by blender on mesh creation and update during animation, based on the object shade and autosmooth properties. So exporting normals doesn’t really make much sense. Without normals we get about a 30% speed gain.

    Below is the Muta HD character that I exported without normals. She looks the same because normals are rebuilt by blender anyway. The base mesh figure looks fine too, both with and without subdivision.

  6. Thomas Larsson repo owner

    When we started with HD export, it was an experiment and it seemed a good idea to export everything that could possibly be useful. But of course we don’t want to export unnecessary things in production.

    Anyway, quite a nice speedup. From 161 to 142 to 109 seconds in a few days on my benchmark.

  7. Donald Dade

    Hello Alessandro, Thomas. TheMysteryIsThePoint here. I looked at the export script, and from just eyeballing it I am not sure how much the speedup will be with C++ because, as with the OBJ file format, most of the time is spent converting floating point values to ASCII. I’ll just do it and we will see, but I’m not sure how much time I’ll have Friday and over the weekend. Luckily, the script API and SDK API seem analogous.

  8. Donald Dade

    Question: If a C++ version works well (and is desired), why not use a binary representation that the C++ writes, and the Python reads? It is wasteful to convert everything from binary to ASCII just to convert it right back to binary in Blender.
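
    Just to illustrate the idea on the Python side, reading a raw float32 dump needs no text parsing at all. This is only a sketch: the layout (a uint32 vertex count followed by x, y, z triples) and the file name are assumptions, not a format anyone has agreed on.

    import struct
    from array import array

    # Hypothetical layout: uint32 vertex count, then little-endian float32 x, y, z triples.
    with open("verts.bin", "rb") as f:
        (nverts,) = struct.unpack("<I", f.read(4))
        coords = array("f")                # float32
        coords.fromfile(f, 3 * nverts)     # read all coordinates in one call
    vertices = [tuple(coords[3*i : 3*i+3]) for i in range(nverts)]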

  9. Alessandro Padovani reporter

    The idea is not to change the importer code but just the exporter. The daz duf files are ascii as well, so I guess this makes sense. That’s just my idea though; Thomas may very well decide otherwise if there’s a speedup.

    My hope is that the inner loops in c++ will be just much faster.

  10. Donald Dade

    Understood. I thought as much. It is just that the fact that the Daz SDK represents the data internally in a format already *PERFECT* for streaming to disk with no transformation at all is a very juicy benefit. It literally provides a pointer to the data that one could pass directly to the write() function. One need only calculate the length.

  11. Alessandro Padovani reporter

    That sounds interesting. I wonder if Thomas evaluated this possibility to write and read data back.

  12. Thomas Larsson repo owner

    Hi Donald,

    A dbz file, like the duf and dsf files, is a gzipped json file. This can easily be handled in python because python has built-in libraries for ungzipping and for reading json. If the data is stored in some other format, it will be difficult to read it on the Blender side. Perhaps it would work if the output were compatible with the pickle module, https://docs.python.org/3/library/pickle.html.
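
    For example, reading a dbz on the Blender side is essentially just this (a minimal sketch, with a made-up path; the actual importer does more bookkeeping):

    import gzip, json

    # dbz files are gzipped json, so the standard library is enough to load them
    with gzip.open("/path/to/scene.dbz", "rt", encoding="utf-8") as fp:
        data = json.load(fp)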

  13. Donald Dade

    I certainly see your point, Thomas. But because Daz lays out the necessary data in such a streamable format, all methods will probably naturally group themselves into two kinds: those that impose a protocol, and those that don’t. And those that impose a protocol will all be significantly slower than those that don’t.

    I am sure you are right about the Blender side, so perhaps we should just see how the C++ version turns out.

  14. Donald Dade

    Another question: If we are primarily concerned with speed, should we consider not compressing the dbz file?

  15. Alessandro Padovani reporter

    The time needed to compress the HD file is just a few seconds, so it doesn’t add much. What we’re going after is reducing the time needed to generate and write the HD data, which with the script version can easily take several minutes.

    edit. I’m attaching a basic g8f at subd 3 as example. With my ryzen 2200 she takes about 78 seconds to export.

  16. Thomas Larsson repo owner

    The files are compressed. DzGZFile creates a gzipped file. To create an uncompressed text file, create the file with DzFile instead. That used to be the case until some months ago; uncompressed files are about three times bigger. The import script reads both unzipped and zipped files.
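
    Handling both cases on the import side can be as simple as checking the gzip magic bytes. A sketch (not the actual add-on code):

    import gzip, json

    def load_dbz(path):
        # gzipped files start with the magic bytes 0x1f 0x8b
        with open(path, "rb") as f:
            magic = f.read(2)
        opener = gzip.open if magic == b"\x1f\x8b" else open
        with opener(path, "rt", encoding="utf-8") as fp:
            return json.load(fp)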

  17. Thomas Larsson repo owner

    Sorry, I misread your comment. Here is a benchmark:

    unzipped: 98 s, 281 MB
    zipped:  114 s, 75 MB

  18. Donald Dade

    OK. More questions:

    I am completely ignorant of how HD export works, so can someone explain the rationale? doMesh() seems to export at high resolution and the current subd level, and then export the subd cage. I can accept that without understanding why. 🙂

    But at high resolution we have

    writeVertices(geom, geom, "hd ");

    and at base res we have

    writeVertices(mesh, geom, "");

    In the C++ SDK, the geometry and the mesh are subclasses of each other, essentially the same thing. I don’t understand why there is the need to pass them both. Yet, in writeVertices(), it gets the vertex, edge, and face counts from the passed-in mesh object, but it gets the actual vertices from the geom object. Again, this is a point of confusion.

    In doMesh(), around line 261, there are lines without terminating semicolons. The code seems to run fine; does ECMAScript not require them?

    And in doMesh() there also seem to be various execution paths that do not set the LOD level and subd level back to their original values. The conditions that would cause this are improbable, but is this an oversight?

    The doMesh() business is the meat of the script, so after I understand what is going on, the rest is just formatting the various geometry information.

    And does anyone know if the ECMAScript is somehow synchronized? Because if the C++ SDK changed the LOD level or subd level, it would not be sufficient to just call update() and finalize(); one has to explicitly turn on interactive updates and call QApplication::processEvents() in order for Daz Studio to apply all the modifiers and not return until they are complete. In Daz Scripting, does the call to forceCacheUpdate() do all of that, so that when it returns one can already call getCachedGeom() and be sure that all the modifiers have finished?

    Thanks, guys…

  19. Thomas Larsson repo owner

    Apparently ECMAScript does not need the semicolons, at least not always.

    That the lod level is not always restored is an oversight. Probably it has never happened.

    That writeVertices() takes its data from two objects is a remnant from earlier versions of the base script (not the HD version). In those days the lod level was not changed, so geom was a subdivided mesh with morphs applied, and mesh was the mesh at base resolution without morphs. We want the verts of the unsubdivided mesh with morphs, which are the same as the first nv verts in the subdivided mesh.

    Now that the lod value is set to zero, we can probably take all data from geom. Also, it is not necessary to export the faces and uvs of the base mesh, since that info is taken from the duf files. OTOH, the time to do that is probably negligible.

  20. Alessandro Padovani reporter

    Thank you Donald for taking the effort to actually understand and maybe improve the exporter. Maybe this could be useful for sagan too.

    As for the more technical code details Thomas can give the best answers. What I know is that the importer needs both the base resolution and HD data to rebuild the figure; that’s why the exporter exports both of them, and this is what the “hd” selector is for.

    As for the semicolons, I noticed this too, so I guess the daz script doesn’t have a strict syntax rule for that. If I did that in c the compiler would kick my ass instantly 😅.

  21. Donald Dade

    That’s interesting. It never occurred to me that the mesh gotten from the object might be different from the mesh gotten from the current shape. I’m still fuzzy on those concepts.

    But if I am understanding you, I can export the HD geometry from the current shape when the LOD is at its current level, and export the subd cage info also from the current shape, as long as the LOD is set to Base Resolution.

    I’ve worked out how to reliably switch/restore LODs and subD with the Alembic exporter.

    I’m going to go to bed now, but I’ll revisit this tomorrow.

    Thank you for the explanation.

  22. Donald Dade

    No problem, Alessandro, as you know, without this project I would probably not be in character animation at all…

    And I was as shocked as you were about the semicolons… that’s just wrong 🙂

  23. Alessandro Padovani reporter

    The base mesh gotten from the object is different from the current shape because subd zero is not the same as the base mesh. That is, subd zero has no subdivision but it has the smooth applied. As for the HD exporter it doesn’t need the base mesh shape because the rebuild subdivisions tool will make its own version anyway.

  24. Alessandro Padovani reporter

    edit. This is corrected in another post below. Actually blender multires is the same as daz HD.

    Please note that rebuild subdivisions doesn’t use catmull clark. That is, the generated base mesh will work “inside out”, while the standard subd works “outside in”. That means that with rebuild subdivisions the multires base mesh is not the same as the daz base mesh. But for the purpose of using a HD figure this is fine anyway.

    Below an example with a simple subdivided cube. On the left we have the standard subd base mesh, and on the right the multires base mesh by rebuild subdivisions. The final subdivided mesh is the same but the base mesh is not. Test scene included.

    https://docs.blender.org/manual/en/latest/modeling/modifiers/generate/multiresolution.html

  25. Donald Dade

    Another question:

        "hd faces": [
            [[1594,32924,16556,32927],[1823,34700,18332,34703],[1594,32924,16556,32927],[6404,65764,65767,6405],0,0,0],
    

    What comes back from the script calling

    geom.getFacet(i);

    is an opaque structure, and ECMAScript knows how to print it. But I don’t know what these values are. I suspect that these are the edge, normal, uv, and vertex indices (but in what order?), followed by the cage, face group, and material indices (but in what order?).

    Can anyone advise? I’m not fluent enough in Python to see how the values are being parsed back in.

  26. Thomas Larsson repo owner

    It seems that the importer only cares about three entries:

    [(vertex numbers), (uv vert numbers), ?, ?, material number, ?, ?]

    The rest can be anything, but yes, the last two are the polygroup number and the cage face number.

    Faces are always quads, and the last vertex is negative for tris. But the import script would work even if there were only three corners, even though the DS api never generates such faces. In fact, it would work even with more than four corners.
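
    So on the Python side each face entry can be unpacked roughly like this (a sketch with made-up names, keeping only the fields the importer cares about; trimming the uv list the same way as the vertex list is an assumption):

    def parse_facet(f):
        # f looks like [(vertex numbers), (uv vert numbers), ?, ?, material number, ?, ?]
        vert_ids, uv_ids, mat_id = list(f[0]), list(f[1]), f[4]
        # faces are stored as quads; a negative last vertex index marks a triangle
        if vert_ids[-1] < 0:
            vert_ids = vert_ids[:-1]
            uv_ids = uv_ids[:len(vert_ids)]
        return vert_ids, uv_ids, mat_id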

  27. Alessandro Padovani reporter

    Thomas, Donald, please also take care, if possible, not to modify the vertex order, since keeping it may allow using reshape rather than rebuild subdivisions. It seems that @engetudouiti may have found that it works, if I understand correctly what he means. I’m also going to test it myself. See #198.

  28. Donald Dade

    Thanks, Thomas. Gotcha.

    And Alessandro, rest assured… changing the order of the elements in an array is not the kind of thing that happens spontaneously or by accident 🙂 I cannot think of a reason why I would ever change the order. If you need that guarantee for something else, consider it guaranteed.

  29. engetudouiti

    Unfortunately, after all, daz changes the vertex order, so reshape cannot work.

    I know maya or some other high cost app can re-arrange vertex order, but I do not know how.

    Maybe even though daz and blender use the same sub-D technology, the way they keep vertex order changes. (I do not know if there is an application that transfers vertex order; I remember maya offered one as a tool.)

  30. Donald Dade

    The initial port is done. There will be issues because of differences in floating point formatting, which I am fixing to the degree possible, and I also don’t know whether the matrices are right-handed or left-handed.

    I also may have discovered an oversight in my own code in that I only turn on interactive updates for the Mesh Smooth Modifier, but should probably turn it on for all modifiers.

    In any case, I’m about to “turn it on” for the first time to see what comes out the other end, and compare it to a baseline.

  31. Donald Dade

    With a naked Alistair HD at subd level 3 and Orson Hair, I’m seeing things go from 17 seconds down to less than 3, but the test is a little unfair. The dsa script prints vertices to an unreasonable number of decimal places given that the values are just floats, not doubles, and it also unnecessarily converts the other data in the facet that is not being used.

    There are still some things to correct, but I think we’re still in for a bigger speedup than I would have expected.

  32. Thomas Larsson repo owner

    Donald, this sounds great.

    Does the code run on all platforms, or do we have to provide different versions for Windows and Mac?

  33. Thomas Larsson repo owner

    Unfair or not, the dsa script does what it does, so the speedup is what the user will experience.

  34. Donald Dade

    It would definitely require different versions. The code itself should readily compile, but the process for building a Daz Studio plugin differs from Windows to Mac. Unfortunately, I don’t own a Mac nor have any experience with that platform. I am sure, though, that it would be trivial for an actual Mac programmer.

  35. Donald Dade

    Hi Guys, sorry but I have limited time for fun stuff during the week 🙂

    I’m getting close, though, just fixing typos and differences in printing formats between ECMAScript and Qt, etc…

  36. Thomas Larsson repo owner

    Well, I have an unusual amount of free time this year, working two days a week with 90% salary because of corona 🙂 This will end at new year, unfortunately.

    When you are at it, maybe you could make a C++ version of the other export script too? It essentially amounts to stripping everything HD-related.

  37. Donald Dade

    We were wondering on the Daz forums how you were so incredibly fast with fixes and features…

    Sure, when I get the first exact match with the HD script, I will pull that functionality into its own plugin (right now I just added a button to the Sagan dialog) and probably have separate buttons for HD and normal export.

  38. Donald Dade

    The precision to which the script prints all the float values is having a tremendous effect on the execution time. It is printing values to 16 decimal digits of precision when single precision floating point values only have about 7. When, as an experiment, I changed the C++ code to print 16 digits instead of 7, it doubled the execution time from 2 to 4 seconds. From this I gather that if the script only printed 7, it might be twice as fast.

    And it also underlines that a binary format where there is no conversion to/from ASCII at all might really be worth it in terms of speed.

    This is to say nothing of the precision that is lost converting to ASCII and even more precision lost when the values are converted back. With only 7 decimal digits in the first place, I don’t know if we should be fuzzing out the data by converting and re-converting it.
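
    To make the precision point concrete, a quick Python check (not part of any exporter) shows how little of a 16-digit string survives a round trip through a single precision float:

    import struct

    x = 0.1234567890123456                               # 16 significant digits, as the script prints
    x32 = struct.unpack("<f", struct.pack("<f", x))[0]   # round-trip through float32
    print(f"{x32:.16f}")                                 # 0.1234567910432816 -> only ~7 digits match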

  39. Donald Dade

    In any case, I just need to make sure I’m converting to/from high resolution correctly. I stole that code from Sagan, which seems to do it properly, but I need to test it a little more before I say “it’s done”. I hope it’s okay to leave it in Sagan for now, before putting it into its own plugin. This is only because creating a new plugin project in Visual Studio and making everything right would probably take me the better part of a day, because I really don’t know Visual Studio all that well.

  40. Alessandro Padovani reporter

    I can test it in sagan; thank you Donald for taking the time to do this. Please let us know when you upload something; at the moment I only see sagan 1.14.

    https://drive.google.com/drive/folders/1rd5wjf58A9E5pQfV2Kp2E7jA6FjcDZGD?usp=sharing

    As for the precision speedup, I did a quick test and it seems it doesn’t work. That is, even if I limit the digits in the string the export time is the same or even worse. I guess most of the time for the script may be spent by the runtime parser executing it, while in c++ most of the time may be spent writing to disk. So they have different bottlenecks.

    var buf = "";
    var c = ",\n";
    for (var i = 0; i < nv; i++)
    {
        var v = geom.getVertex(i);
        if (i == nv-1) c = "";
        buf += ("            [%1, %2, %3]".argDec(v.x,0,'f',3).argDec(v.y,0,'f',3).argDec(v.z,0,'f',3) + c);
    }
    fp.writeLine(buf);
    

  41. Alessandro Padovani reporter

    IMPORTANT. I have to correct myself as for the multires base mesh issue that I reported above. That is, if we export a subdivided figure to HD, as for example a subdivided G8F, then the issue does apply. But if we export an actual HD figure to HD then the issue does not apply. This is because the daz HD mesh is subdivided “inside out” the same as the multires modifier does in blender. So we’re lucky and don’t have to fix anything in this case.

    Below an example with the muta character at subd 0 then 2. We see that the subdivision is “inside out”. That is, the base mesh is smaller and inside the HD mesh, the same as the multires modifier does in blender.

    Please note that the base mesh issue still applies if we export a subdivided figure to HD. But this is not a real case since a subdivided figure will be exported as subdivided, not as HD.

    edit. This is a wireframe in blender where I imported the base and hd models so we can see it better. I did it in blender because in daz studio I can’t change the wire color. Also I corrected the first image to use base resolution instead of subd zero, which was an oversight of mine; thanks to engetudouiti for reporting this.

  42. Donald Dade

    No problem at all, Alessandro. I feel privileged to be able to work with you and Thomas. Floating point precision aside, I think the files would match perfectly, so I just need to do a little research to make sure that all the modifiers finish before requesting the finalized geometry, because that is completely undocumented and I am still not 100% sure that even Sagan is absolutely correct. I don’t want to pass my own doubts and bugs on to Thomas. It shouldn’t take long, and then I’ll publish a special version for us to test.

  43. Alessandro Padovani reporter

    Hi Donald, I tried your first beta release “sagan-1.13.0.0padone.zip” but it seems it doesn’t work. I just exported a base G8F and when I import her in blender I get the error below. Maybe Thomas can help figure out what it may be.

    https://drive.google.com/drive/folders/1rd5wjf58A9E5pQfV2Kp2E7jA6FjcDZGD

    For Thomas: to test Donald’s plugin you have to copy sagan.dll into the daz studio plugins folder “C:\Program Files\DAZ 3D\DAZStudio4\plugins”. Then in daz studio you use the edit > sagan menu to run the plugin. Within the sagan exporter there’s an “Export Diffeo HD” button.

    After the error the HD figure seems to be imported fine though. Also the speed gain is amazing: I get 2 seconds vs 20 seconds with the script version.

  44. Alessandro Padovani reporter

    WAIT. I get the same error with the diffeo HD exporter so it seems it’s a bug in the latest commit bb376ed. This is not on sagan.

  45. Thomas Larsson repo owner

    The bug should be gone now. However, all morphs are gone from both meshes, although I can see them in the dbz file. Investigating further.

  46. Thomas Larsson repo owner

    The HD mesh seems to be exported correctly, but the base mesh is the same as the HD mesh, which is not what the importer expects. This makes dbz fitting fail, with a message in the console like this:

    Mismatch Genesis8Male, Genesis8Male: 1037754 != 16384. (OK for hair)

    As things are now, the import script then completely ignores the info in the dbz file.

  47. Alessandro Padovani reporter

    As for commit 6c6fee5 I don’t seem to get errors anymore. And the sagan HD exporter seems to work fine.

    As for sagan not exporting the base mesh, I did a test with a simple subdivided cube and it seems to work fine. That is, it exports 8 vertices for the base mesh and 26 vertices for the HD mesh at subd 1. I can count the vertices in the dbz file with the notepad. Test scene included.

    I also did a test exporting a subdivided G8F, and though I can’t count the vertices exactly there, by looking at the dbz file with the notepad line numbers, I estimate roughly 65k vertices for the HD mesh and 16k vertices for the base mesh. So again the base mesh seems to be exported correctly, or at least it is certainly not the same as the HD mesh.

    So Thomas, are you sure that sagan exports the base mesh the same as the HD mesh? That doesn’t seem to be what I get here.

    edit. Also tested the muta HD character at subd 1 and again I get about 65k for HD and 16k for the base mesh, so sagan seems to work fine there too.

  48. Thomas Larsson repo owner

    It has something to do with the geometry editor. If I start a new DS session, both the base and HD meshes are exported correctly. But if I enter the geometry editor, to make geografts work, the base mesh becomes the HD mesh. And even if I choose another tool, it remains the HD mesh until I restart DS.

    Frankly I don’t understand what the geometry editor does and why it is necessary to export geografts. Black magic.

  49. Thomas Larsson repo owner

    Output from Blender.

    Cube exported from fresh DS:

    Fitting objects with dbz file...
    Highdef pCube(11cg000)1 1 26
    Preprocessing...
    Building objects...
    Build HD mesh for Cube: 26 verts, 24 faces
    HD mesh geometry_HD built
    Info: 1 new levels rebuilt
    

    Cube export with geometry editor active:

    Fitting objects with dbz file...
    Highdef pCube(11cg000)1 1 26
    Mismatch pCube(11cg000)1, shape: 26 != 8. (OK for hair)
    Preprocessing...
    Building objects...
    

  50. Donald Dade

    I’ve noticed that the Script API and SDK API are not precisely 1:1 and perhaps the Script API is doing something that the SDK does not.

    Thomas, can you provide a test sequence for me to follow, to duplicate that behavior?

  51. Thomas Larsson repo owner
    1. Start DS
    2. Open Alessandro’s cube.duf
    3. Tools > Geometry Editor
    4. Select cube (something flashes)
    5. Edit > Sagan > Export Diffeo HD

  52. Alessandro Padovani reporter

    I can confirm that sagan doesn’t work fine with the geometry editor. And I can’t find any way around it other than restarting DS, the same as Thomas did. Nevertheless, as for geografts we can always use the original idea by Krampus: export the HD mesh without geografts, then merge in blender. This may be a workaround if Donald can’t get the SDK API to work with the geometry editor.

    edit. As for why to use the geometry editor for geografts it is because it reverts the geometry to its “basic” state, somewhat like the edit mode in blender, thus separating the geograft geometry.

  53. Donald Dade

    OK, as I said on the Daz forums, I found the problem. But it made me think about another issue. I haven’t investigated it, but it seems reasonable:

    The script dumps the HD mesh and then the cage mesh node by node. But because certain objects' modifiers interact with other objects, smoothing for example, shouldn’t we set all objects to base resolution before dumping them all, instead of just the current object being dumped, one at a time? What currently gets dumped is geometry that does not naturally exist in the scene, no?

  54. Alessandro Padovani reporter

    Hi Donald, I’m not sure I get the difference. Could you give a practical example, maybe with pictures, to help understand the issue?

  55. Donald Dade

    Hi Alessandro, I don’t have a practical example, it was just something that occurred to me as I was fixing Sagan to wait for all the modifiers to finish, not just smoothing, and was porting the solution to the C++ Diffeo HD exporter. But I seem to be making sense, don’t I? I mean, Alembic’s whole deal is that it is vertex exact, and so if I let the modifiers run while all the objects are subdivided except for the one currently being dumped, the result of the modifier stack having run could be slightly different from the normal case when the stack runs when all objects are either subdivided or not. Currently, this is what the script does: it reverts each object to the SubD cage one at a time. I admit the difference is probably very, very slight, and the non-neurotic among us probably wouldn’t care, but after working so much with Alembic, any difference at all that I can preclude is worth it to me.

  56. Alessandro Padovani reporter

    Hi Donald, as for geografts they get merged so probably any difference is fixed there. But, if you have any better code to propose I guess Thomas will check and commit it if it’s good. Also in this case Thomas is a better reference than me since he knows what the exporter does in its guts 😉.

    Thomas, any comment ?

  57. Thomas Larsson repo owner

    At least in the dazscript version, the nodes are done one at a time because both the HD mesh and the cage must be exported, and once the resolution has been changed to base the HD geometry is lost. So the HD geometry must be saved before the resolution is changed. If dazscript were based on python I would know how to keep the HD geometry in memory, but I’m not so good at javascript type languages.

    So the script does this, for each figure:

    1. write the HD mesh
    2. set resolution to base
    3. write the base mesh

    This won’t work if you want to set resolution to base for all figures at once.

  58. Donald Dade

    Hi Guys, sorry this is taking so long, but tonight I was able to leverage some Sagan code and I verified that the problem was indeed a race condition. Alessandro’s cube comes out with 8 vertices, even with the Geometry Editor tool active. I just need to fix one simple thing, and then look for any differences in the script since I ported it. But it did work on the cube, which for all intents and purposes is no different from the most complex scene.

    As for switching from HD to base, does the order of the meshes matter? Could we export all the objects' vertex data at the current subd level, set them all to base, and then export all the objects' vertex data at base res? I haven’t tried it on a non-trivial scene, but I can’t imagine that calling QApplication::processEvents() twice, and having all the modifiers run for every object, wouldn’t slow us down.

    And I’ve got you beat, Thomas… I’m not a competent Javascript programmer OR a Python programmer either :)

  59. Alessandro Padovani reporter

    That’s great news Donald, thank you so much for taking your time to do this.

    As for switching HD to base, I guess you can do whatever works best for the c++ version. You can always fix any reported bug later, if there is any.

  60. Donald Dade

    Hi Alessandro, I ported export_highdef_to_blender.dsa from this commit:

    commit d385549b51d3c3f8ba328bf9c1b420dc4712e93a
    Author: Thomas Comsol <thomas_larsson_01@hotmail.com>
    Date:   Sun Dec 27 20:56:25 2020 +0100

    After realizing that I had not given the code to anyone but you, I thought I’d update it and formally hand over the source, but have there really been no changes to this script since Christmas time? Is the export procedure still used?

  61. Donald Dade

    I also have a Mac Mini now, and so I can create a MacOS plugin whenever Daz Studio should run on it.

  62. Alessandro Padovani reporter

    Hi Donald welcome back !

    Sure, the c++ HD exporter is still useful to have. If you can also add mac and linux versions that will be great. The geometry editor is no longer needed since we can now import HD geografts by removing loose vertices, see #223. I tested your plugin with 1.5.1 and 1.6.0 and it works fine here for HD figures. But it doesn’t seem to work with geografts and shells. I believe it is fine though to just report this as a known limitation, until you find some time for improvements. I added a readme.txt with basic instructions in the distribution file above, but of course feel free to change it as you wish.

    Some minor suggestions:

    1. Change the label in the edit menu to something more distinctive, since we now have multiple export options, also with the daz bridge; something like “Diffeomorphic HD Exporter 1.0 c++ by Donald Dade” may be fine. Possibly with the version number if you want to maintain versions.
    2. In the export panel be sure the extension is .dbz, otherwise the .duf extension may overwrite the daz scene.

  63. Donald Dade

    Like all DLLs, the plugin runs in DS’s process space, and so a Linux version is not possible because there is no Linux port of DS. But I think it should work with the current means of running DS on Linux via WINE. I know it already works in VMWare, as that is how I developed it.

    I found some interesting and promising looking entries in the DSON spec having to do with geografts, what geometry they attach to, and what faces should be hidden, but then I would have to parse JSON. I think I just made a tremendously bad choice of JSON parser, because everyone else’s parsing seems to be much, much faster than mine. I’ll investigate these things more.

    1. Sure, I will change the label to be able to distinguish things, and call it version 1.0
    2. Yes, I added code to ensure that the .duf is never overwritten, but the logic is slightly incorrect; it still warns that the .duf already exists, even when it will never overwrite it.

  64. Donald Dade

    Alessandro, Thomas,

    I am testing the minor changes suggested by Alessandro to the C++ plugin now and will include the source as well as a binary. There may be minor issues with paths in Visual Studio for anyone who tries to build it, as the code has never been built by anyone besides me, on my systems, and I may not have configured it “generally” enough.

    But to be honest, I am MUCH more excited about Blender 2.93’s support for Python 3.9, which supports shared memory. This will allow the import scripts in Blender to read directly from the Daz Studio process’s virtual memory space. The dbz file will not be necessary anymore and it should be virtually instantaneous, independent of the model’s size. I’m working on a simple API for the scripts to use and will present it for discussion when it’s done, to see if it is of interest.
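
    As a rough idea of what the Blender side could look like (a sketch only; the block name, the layout, and the whole API are placeholders still to be defined):

    from multiprocessing import shared_memory
    import struct

    # Attach to a block that the Daz Studio plugin would have created and filled.
    # Assumed layout: uint32 vertex count, then float32 x, y, z triples.
    shm = shared_memory.SharedMemory(name="diffeo_hd_verts", create=False)
    try:
        (nverts,) = struct.unpack_from("<I", shm.buf, 0)
        verts = struct.unpack_from("<%df" % (3 * nverts), shm.buf, 4)
    finally:
        shm.close()   # detach; the exporter side owns and unlinks the block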

  65. Alessandro Padovani reporter

    Thank you Donald for the update and your work on this. Shared memory sounds scary to me, especially for large scenes, since this is a HD exporter. I mean, having to keep both daz studio and blender open, while essentially duplicating the whole HD scene, may be heavy on system resources. But sure the technology is exciting.

    Thomas is on vacation and will be back soon.

  66. Donald Dade

    Version 1.0.0 with diffeo.dll binary

    After installing it, in DS, go to Window|Workspace|Customize, and find C++ Diffeo Daz Exporter under Miscellaneous and move it to wherever you’d like it to be in the menu, e.g. under Exports.

  67. Donald Dade

    Yes, that’s true. I develop on a pretty hefty dev box and so I don’t think about such considerations when I probably should. I don’t know how much memory Daz or Blender keeps resident with a model loaded.

    But even so, it is not necessarily the vertex/face count that makes them heavy; I think they are heavy to process, but not necessarily that heavy to merely hold in memory… even with millions of vertices we are not talking about a lot of memory. I’ll try some experiments in VMWare with 16 or even 8 gigs of RAM allocated. I hope it turns out not to be a big deal, because the idea of a livelink has always fascinated me.

  68. Alessandro Padovani reporter

    Tested with 1.6.0 and it works great. One minor caveat is that I get 0.9.0.0 in the daz panel while the distribution file is named 1.0.0. As for managing the code here, I don’t know what you mean; I’m not a coder myself. I see Thomas uses one Bitbucket repository for each project.

  69. Donald Dade

    I’ll fix the version.

    I just meant if the code can be hosted somewhere adjacent to the Diffeo project, to be found easier, maybe Thomas could create a repository for it.

  70. Alessandro Padovani reporter

    That’s for Thomas to decide, but I don’t think so. I mean, I see that Thomas uses one repository for each project. For example I asked to keep mhx together with diffeo as a distribution, because it’s easier this way. But it seems it’s not possible with bitbucket. Also I guess the repository gets an owner that should be the coder himself. So maybe you have to set up your own repository.

    But again I’m not a coder so @Thomas Larsson may have other options.

  71. Thomas Larsson repo owner

    The problem with putting several Blender add-ons in the same repo is that you cannot download the repo as a zip file and then install the add-ons from the zip file. A repo with several addons would presumably consist of multiple folders, say import_daz, mhx_rts, and retarget_bvh. The zip file then contains a single folder, and the addon folders are subfolders, but Blender won’t find the add-ons because they are one level down. So if you do that, the user has to manually unzip the file and place the individual folders in a place where Blender finds them. This would not be a problem with a stable release, because then the individual folders can be placed at the top level of the zip file.

    However, it is already necessary to move the Daz scripts to a Daz directory manually, so adding Donald’s script there would not make things more complicated for the user than they already are. Perhaps I can add Donald as a developer, or maybe it could be handled with a fork and pull requests.

  72. Thomas Larsson repo owner

    Hello Donald,

    If you are still interested I could include your plugin in the upcoming 1.6.0 release. It is not possible to put several add-ons in the same Bitbucket repo because of the way that Bitbucket generates zip files, but for the release I make the zip file myself, and can put several directories at the top level.

  73. Donald Dade

    Hi Thomas. Yes, include it if you find it useful. I don’t think I fixed the things Alessandro suggested, nor made the standard resolution version. I’ll get on those two tasks. When will you put together the 1.6.0 release, i.e. how much time do I have? :)

  74. Thomas Larsson repo owner

    I’m currently writing docs, which means that I have to go over everything. That will probably take another two weeks, but it depends on whether bug reports keep coming in.

  75. Donald Dade

    Hi Guys, I’m working on the base resolution exporter. I believe I remember either Thomas or Alessandro saying that for base res I just don’t dump the hd mesh, but I’ve noted some other differences between the two scripts, so I’m hoping for the best. I’m about to test, so we’ll see.

    I wonder if the two scripts can be harmonized, somehow, so that there is truly only one version to maintain/port?

  76. Alessandro Padovani reporter

    Donald, the base resolution exporter is not needed since it’s very fast already, so if it takes too much of your time you may skip it. The c++ version is useful to speed up the HD export.

    As for the script, that’s for Thomas to decide. Since we now get a dialog for the HD uvs, it should be possible to add another option for the HD mesh and make a single script, I guess.

  77. Thomas Larsson repo owner

    I tested the existing plugin briefly over the weekend and it seems to work fine as it is. Skipping the UV block isn’t necessary IMO, since the exporter is so fast already. I didn’t test with geografts, though.

  78. Donald Dade

    Oh, I would have sworn that someone asked me to do base res, too. In any case, I did it just to be consistent and just not dumping the HD mesh seems to work. The plugin presents a simple dialog where the user can request HD or base res.

    I’ll return to trying to research why geografts would not work. I suspect that, like SBH, there is nothing in the public API to support it and the necessary data is in the JSON.

  79. Mel Massadian

    Thanks! I missed the addon separation update! The addon is much more performant than the last time I used it!

  80. Valentine Khanmamedov

    Hi, how do I use this plugin after installation? I can’t find it as mentioned under Window|Workspace|Customize, under Miscellaneous as C++ Diffeo Daz Exporter. Are more detailed instructions available?

  81. Ogechukwu Ugama

    I have copied the diffeo.dll plugin into the daz studio plugins folder but I can’t find it in Daz3d.

    In edit > diffeomorphic daz importer, I can’t find it. Please help.

  82. Alessandro Padovani reporter

    I can confirm that diffeo-HD-090.zip doesn’t show up for some reason; I will report it to Donald. You can use diffeo-HD.zip, then update and merge menus; the plugin will show up under edit as usual.

    note. Please note that the c++ version doesn’t work for geografts and shells, so it’s more useful for props and scenes than for figures.

    note. There’s news from Donald. The 090 version needs to be manually placed. Go to window > workspace > customize and you will find the c++ exporter under miscellaneous.
