- edited description
- changed title to "It is possible to read .dhdm files directly and generate the corresponding HD .obj morphs by using OpenSubdiv"
Blender addon to generate rigged HD meshes, HD shape keys, normal maps and vector displacement textures from daz HD morph files
[ Continued here: https://bitbucket.org/Diffeomorphic/import_daz/issues/814/blender-addon-to-generate-rigged-hd-meshes ]
I adapted the code from https://github.com/edolstra/dhdm to compile a .dll which can be called from Python.
The source is available in the following repository: https://gitlab.com/x190/daz-hd-morphs
-
reporter - edited description
-
Xin, if needed we can ask Sam of the daz team about the dhdm format, see #355. Then it would be nice to also get true HD figures with HD morphs; they won't fit animation but can be good for single pictures, as for comics.
-
reporter There is no need anymore. The format can be read already. It’s a matter of deciding what to do with it now. You can even bake normal maps automatically now.
-
reporter One issue I noticed: the vertex order of the mesh subdivided in the .dll with OpenSubdiv is the same as in the HD mesh exported from daz when exporting as a raw .obj. This is good, this is the standard vertex order that follows from using OpenSubdiv.
But this addon’s HD exporter script seems to do things differently, so the vertex order doesn’t match with OpenSubdiv for some reason. This seems unnecessary considering that daz already has the correct OpenSubdiv compliant vertex order when exporting as .obj.
I will look into what this addon's HD export script is doing later, but to make things easier, this addon should try to match daz's .obj export vertex order (the OpenSubdiv vertex order).
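As an aside for anyone who wants to verify this themselves: vertex-order equality between two exports can be checked outside Blender by comparing the `v` lines of the .obj files positionally. A minimal illustrative sketch, not part of the addon:

```python
# Compare the vertex order of two .obj exports by walking their "v" lines
# in file order. If the lists match element by element, morph loading /
# "join as shapes" will line up; a reordered file fails immediately.

def read_obj_vertices(text):
    """Return the list of (x, y, z) vertex tuples in file order."""
    verts = []
    for line in text.splitlines():
        parts = line.split()
        if len(parts) >= 4 and parts[0] == "v":
            verts.append(tuple(float(c) for c in parts[1:4]))
    return verts

def same_vertex_order(text_a, text_b, tol=1e-5):
    """True if both .obj texts list the same vertices in the same order."""
    va, vb = read_obj_vertices(text_a), read_obj_vertices(text_b)
    if len(va) != len(vb):
        return False
    return all(abs(a[i] - b[i]) <= tol
               for a, b in zip(va, vb) for i in range(3))
```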
-
That's odd… I can import the obj from daz and use it as a morph for the HD mesh, so the vertex order should be the same. See #300.
-
reporter That's weird. Could you describe the steps and settings you use in daz before exporting with this addon? Try exporting G8M as .obj and then with this addon (both times with 2 subdivision levels) and see if the vertex order matches in Blender. In my case, they don't.
-
reporter In Blender, you can see the vertex order directly in Edit mode by enabling Developer Extras in the preferences and the Indices overlay:
-
Just tested G8M at subd 2 and he works fine. I can import the portly fbm over the HD figure.
- export the HD figure
- export the HD morph as obj
- import the HD figure with no multires, that’s in the global settings (otherwise the HD mesh is converted to base + multires)
- import the obj morph
- join as shape
-
reporter Oh, thanks for checking Alessandro. The problem was on my end, with the .obj import settings in Blender, which were mangling vertex order. Once you set the importer to keep vertex order, all meshes match: the .dll one, the .obj one, and the one imported by this addon.
I will look into setting up automatic normal baking later.
-
reporter I created a script that bakes normal maps from the .objs generated by the .dll by reading .dhdm files. Currently, it has two baking options: using the multires to unsubdivide the HD mesh which enables multires baking (what this addon does once you import manually); and “selected to active” baking from the HD mesh to a base mesh (which is slower, might give some artifacts and is harder to set up right).
The “selected to active” baking, if done right (a big if), is the more accurate option since the base mesh in that case is the real base mesh, while the multires base mesh is an approximation of the real base mesh (a very close approximation for baking purposes nonetheless).
A nice property is that the vertex order matches the mesh imported by this addon with multires disabled. But Blender's multires seems to mangle vertex order for some reason once you use the "unsubdivide" operation (the base mesh vertex order matches fine, but the HD vertex order is different). Not a big deal; we don't want multires if we want to import HD Shape Keys anyway.
Later I will try to do the following:
- Take the body mesh imported by this addon after the user has tweaked its parameters, but before finishing.
- Copy that mesh and export it as .obj at its base subdivision levels.
- Call the .dll to load .dhdm files (hd morphs) on that .obj. The .dll will subdivide the base mesh accordingly to properly load those files.
- Take the output from the .dll, an HD .obj, and run the second script described above to bake normal maps automatically from it.
If everything goes right, this means the user can select several HD morphs in one go and then the script could bake all corresponding normal maps automatically. If we combine this with the feature Thomas implemented recently which lets you have driven Value nodes in the shaders, this means we can reproduce a lot of the HD look with non-HD meshes, which would be far better for animation than what daz offers by throwing a massive mesh at you.
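The planned flow above can be pictured as a thin orchestration layer. A hedged sketch only: the function names (`generate_hd_obj`, `bake_normal_maps`) are hypothetical stand-ins, not the actual .dll interface.

```python
# Hypothetical orchestration of the planned pipeline: export the base .obj,
# let the .dll subdivide it and apply the .dhdm deltas, then hand the HD
# .obj to the normal-map baker. All callables here are assumed stand-ins.

def run_hd_pipeline(dll, baker, base_obj_path, dhdm_paths, subdiv_level):
    hd_obj_path = base_obj_path.replace(".obj", "_hd.obj")
    # The .dll subdivides the base mesh and applies the .dhdm morph deltas.
    dll.generate_hd_obj(base_obj_path, dhdm_paths, subdiv_level, hd_obj_path)
    # One bake pass over the resulting HD mesh produces the normal maps.
    baker.bake_normal_maps(hd_obj_path)
    return hd_obj_path
```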
-
Xin, that sounds amazing.
I believe normal maps will be mainly useful for HD expressions, where the HD morph is used to sculpt little bumps. There are cases though where the HD morph is used to sculpt body features such as muscles or joint deformations. In that case I believe we need the true HD mesh with HD morphs, because the normal map alone will not be able to deliver true displacement, and this will be visible in the geometry silhouette.
Below is an example with Mutation Morphs HD for G8F where the HD mesh is used to add body features beyond bump details. In this case the silhouette on the bent elbow, that is, the HD jcm, could not be faked with normal maps.
https://www.daz3d.com/mutation-morphs-hd-for-genesis-8-female
As for the base mesh we have to distinguish two cases.
1. We have a subdivided geometry in daz studio, I mean catmark. In this case the subdivision matches fine with catmull-clark in blender, especially with blender 2.91 where we can uncheck "use limit surface". See #288. Below is an example with a cube; please note that the subdivided mesh is inside the base mesh.
2. We have a HD geometry in daz studio, I mean a true HD figure. In this case the subdivision matches fine with multires in blender, where we unsubdivide the HD mesh to get the base mesh. The difference is that both daz HD and blender multires are subdivided outside the base mesh. See #191.
Below an example with a cube, please note that the HD mesh is outside the base mesh.
-
reporter Are you sure you can't get a good result with normal maps too, if you use the base JCM and on top of it the normal map? Be aware that daz's HD .dsf files have Shape Keys for the base mesh in them. Only the smaller scale HD data is contained in .dhdm files. So this is something we could try:
- Load the “base HD” JCM Shape Key on the monster (from the base morph contained in the hd .dsf file).
- Bake normal textures from the true HD with the HD JCM. Then make those normals follow the same formula as the base JCM above, but in the shader.
I believe this would give quite decent results. Dhdm files affect displacements of subdivided vertices only, so they will always have a smaller scale than the regular morphs you use on base meshes. I think a lot of the contribution in your HD morphs comes from the “base HD” JCM (contained in the public .dsf), not the .dhdm. I will eventually test this figure too and compare baked JCM normals with the real HD JCM Shape Keys imported with the .dll.
In any case, loading HD morphs on a real HD mesh in Blender can already be done with the .dll since it can read .dhdm files and vertex order matches. The driver formulas were always public, so the addon would have no problem setting drivers for the HD Shape Keys.
Basically, by reading .dhdm files, the .dll would allow this plugin to fully support true HD meshes. But note that Blender can’t load HD Shape Keys on meshes with multires, the Shape Keys only apply to the base. So for it to work, you need to import a real HD mesh in Blender (or apply a subsurf with max subdivisions). Then, through the .dll, you could import HD Shape Keys for it. At that point, it’s basically the same you do with non-HD meshes, but slower since you are dealing with a lot of vertices and faces, and Shape Key evaluation is more expensive.
I will first try to finish automatic normal baking before testing .dll calls more closely with this addon, so Thomas could have a better idea of how to implement it. As I said, dealing with HD morphs on true HD meshes in Blender can already be done since .dhdm files can be read, it’s just a matter of deicing how to implement it with this addon.
By the way, if you visit the first link I posted, you can see how you can get accurate displacement through textures too, by using Blender's Vector Displacement node (note the difference with a simple Displacement node: this node allows for arbitrary displacement in any direction, per pixel). It has its limitations though, and still requires subdivision to work properly (it can work with subsurf/multires modifiers), but it could be an alternative to very heavy HD Shape Keys. This should only be considered later; I'm just mentioning it so you are aware of this alternative.
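To make that distinction concrete: a plain Displacement node can only push a point along its normal by a scalar height, while vector displacement applies an arbitrary per-pixel 3D offset. A toy illustration in plain math, not Blender API code:

```python
# Scalar vs. vector displacement, per surface point.

def scalar_displace(p, n, h):
    """Move point p by height h along unit normal n (plain Displacement)."""
    return tuple(pi + h * ni for pi, ni in zip(p, n))

def vector_displace(p, d):
    """Move point p by an arbitrary 3D offset d (one VDM texel)."""
    return tuple(pi + di for pi, di in zip(p, d))
```

A sideways offset, such as one produced by an HD morph sliding skin across the surface, is simply unreachable with the scalar form.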
-
The link is interesting indeed, thank you.
Also true HD would have the benefit of working with eevee, which displacement can't. Then I agree true HD doesn't fit animation, but it's good for comics or pictures.
-
reporter - edited description
-
reporter I will test true HD in the context of this addon after I finish testing automatic normal baking. Then we can ask Thomas what to implement in more specific terms.
-
I do not mean to add a useless comment, but if this works it is a huge development, one I have hoped for ever since I found a topic introducing vector displacement and tested it with some files. Unfortunately I had no 3D paint tool which could generate vector displacement correctly, so I could only see how it works with a supplied displacement map. So the idea of using it for daz HD morphs offers a real advantage; this topic really surprised me. (And we can easily disable the maps in the node setup, so we can activate them only when we render. I prefer cycles over eevee for final renders, so if it actually works with these options it is massively useful, and I would be perfectly content with the quality, unlike with normal map baking.)
I hope you could take the time, after finishing this, to write a step-by-step workflow for users who want to test it. There are many documents for HD, but I still have not tested much with the current addon's HD improvements. I hope to test some HD morph G3 monsters using the vector displacement bake workflow Xin now seems to offer.
-
For users who may not clearly understand the difference between vector displacement and a default displacement map:
http://wiki.polycount.com/wiki/Vector_displacement_map
Many users who see those pictures might ask why iray does not offer nodes which can apply it (though I do not know about recent MDL versions).
At least daz still seems to offer no way to do it in iray:
https://www.daz3d.com/forums/discussion/434187/iray-vector-displacement-shader
The one difference is that an HD morph still works correctly even when it is used for JCMs etc. I do not think a shader node slider can show exactly the same effect (changing the effect like a morph): e.g. when we pose a bone and an HD JCM kicks in, a driver needs to be set on the shader node property. (Though I do not know how recent versions tweak shader nodes to morph them, so maybe it is already improved, or not.)
-
Anyway, after Xin finishes this work (as you hoped), please teach me the steps (what I need to do before I can test it, or whether I can just test it as-is, etc.) with a real daz HD product (G3 or GM), and how I can see it as vector displacement… if you could spare some time for me, pleeeease…
-
Xin, please add a button to the plugin to generate vector displacement maps for HD morphs (if the original already offered it).
I read the dll and the txt, and I am far more interested in creating vector displacement textures than in importing HD morphs as obj. I do not know why a user would skip that but would want to import them as real HD obj shape keys instead. (Of course that is one way, and it may show exactly the same thing daz does, but if we can manage without losing much quality, generating vector displacement textures and using them to show the HD morph effect is more useful, I believe… I do not know if there is any user uninterested in that, if they hope to see HD detail.)
You wrote:
"I removed all the OpenGL stuff from the original since we are not interested in creating vector displacement textures for now.
If you are, you will still need to make some changes to the github version by edolstra, or you will likely experience crashes/bugs like I did. I can provide the fixed version too."
So I hope you improve the dll and add a new button which generates a vector displacement map from a selected HD file. Manually making the node group is OK for me (though if it auto-set up those nodes, even left unconnected, I could easily attach them to the current shader nodes).
In my case, I will mainly use OpenSubdiv without multires, so when I want to see the HD effect I can turn on the vector displacement map.
As a first step, I will not import any pose-related JCM or MCM HD morphs, but simply hope to import HD character morphs (creatures) for G2M, then apply them in blender as vector displacement. If you have done that already, I really hope to use it. (I do not want to import those morphs as obj, and do not want to use normal maps to represent the HD morph effect. Of course that is better than nothing, but normal maps have a clear weak point as the viewing angle changes, so they are not good for capturing detail like HD morph detail if the user really hopes to render from many angles.)
-
reporter Ok, whenever I find time, I will create a small addon with 3 options for testing purposes:
- Create normal maps from HD morphs.
- Create vector displacement maps from HD morphs.
- Create real HD mesh with HD Shape Key from HD morph.
-
Thanks Xin for taking the time. If it works, I really hope to test the vector displacement option. At the moment I am searching for a way to generate (bake) vector displacement in blender, without Mari or Zbrush.
Some youtube videos show a way to bake vector displacement in blender: they seem to use emission baking, converting the source (high resolution) geometry position to an RGB color with shader nodes, then baking onto the non-morphed shape (keeping the same sub-D while baking, of course). But the addon you introduced does not need to bake with a render engine; it directly converts the geometry delta data to vector displacement.
If it can be combined with Thomas's addon, I believe it is a huge development. (As Thomas already said he is concentrating on removing bugs, we have time to test your addon; maybe we will find the best settings and how to use it with the already offered maps (bump or normal).)
"At least as of Blender 2.83, the "Normal Map" node does not appear to take displacements into account, so the resulting normals will be as if the displacements are not there (even though the geometry is displaced). This has the effect of largely obscuring the effects of (small) displacements, depending on the lighting conditions."
I hope to test what she mentioned. (If so, it seems to be a blender-side bug, and the document predates the recent 2.9 versions, so I hope to confirm it. But in daz studio most HD characters (iray) seem to use only a bump map (not a normal map) + HD morph, so it does not seem critical, as long as bump with vector displacement works without problems.)
There also seems to be a custom build which lets us use a vector displacement map (tangent space) as a displacement modifier texture, for the case where we need not use shader nodes. But it is a custom build and I do not know if it works with multi-tile textures, so I cannot expect much from it. (Maybe when the displacement modifier works with tangent-space RGB vector displacement maps in a future blender version, it will become a more daz-like way, I suppose.)
If you offer your custom addon as a more user friendly one, it should be more useful than baking. Thanks!
-
For more user friendly use of HD expressions in non-animated renders, I wish I could just import the HD figures without multires to get all the HD expression morphs on the figure. Getting good face expressions is often a lot of trial and error. It's not useful if I must bake normal or displacement maps for each face expression or mix of face expressions.
-
Jochen, as I understand it, the plugin will load the baked HD maps with drivers. So you will need to load the HD expressions only once, as you do with base expressions, then you can mix and match in blender.
Then I agree that an option to load the full HD morphs on a full HD figure will be useful.
-
reporter I finished the vector displacement map operator and compiled the .dll with opengl. And it works fine.
So now I will focus on the real HD creation and import of HD shape keys. One thing I will try is to create the HD entirely in the .dll from the base mesh in Blender, so there would be no need to import the HD mesh from daz, saving time and making it easier for the .dll too.
The normal maps operator should be quite easy later since most of the work was already done (Thomas in particular did the work of handling UDIMs for baking purposes).
I suppose I could finish by the weekend depending on how much free time I get.
-
Jochen, so you mean still renders, where you pose and then render a single frame? If so, I suppose you need to import not only the HD morphs at high resolution (sub-D applied) but every morph file (JCM, MCM) at high resolution, because blender offers no way to import shape keys at a different resolution (only daz offers that, I suppose).
If you mean you make the expression in daz and then just render in blender, I suppose you need not import any HD morph at all (just export at high resolution, and all HD morphs will be baked into that frame as the zero pose). But I suppose you mean importing a high resolution figure which can still be posed; then every morph needed to make expressions must be imported at high resolution.
You could not import daz base resolution morph files any more, until they are converted to high resolution for blender.
I am not against making the addon work as you mention, though I cannot imagine such huge shape key data with drivers staying posable on my PC: if we import HD morphs at high resolution, all shape key data needs to be imported at high resolution, I suppose. If I am wrong, forgive me; I may be misunderstanding something important about what you want.
At the same time, importing all morphs as shape keys can be managed easily (adding drivers) with a rig or controller; only the shape key data needs to be changed to sub-D applied. I can add a driver for the vector displacement shader node, but cannot confirm it can represent daz's complex ERC well. (So even if we import MCM or JCM HD expressions, they may not show the same as in daz in some cases.)
-
And Xin, thank you for taking the time. I will test it with body shapes (no pose or expression MCM support) first, then maybe test MCM, JCM or expression HD morphs later. (Every expression should be a usual morph + additional HD detail, so I may need to think about how to mix the base shape VDM with the added pose VDM in the shader nodes. If it can simply be represented per RGB component with a color mix node, maybe it is not so complex.)
At the same time, I will test how importing HD morphs (jcm, mcm) as high-res shape keys works, in the most simple case… (But I still do not clearly understand how you handle the base expression morphs (the non-high-res ones); we cannot bake them at import, so they need to be converted to sub-D shape keys, right?)
-
And I think that even if I could import some expression HD morphs as VDMs, it never means I can use them the same as the currently imported figure with driven morphs.
Because for it to work, I need a separate VDM for each expression. I cannot imagine attaching all the VDM textures to each UV tile material's shader nodes. (Of course a one-by-one process may work, but it cannot work the same as a controller; for each expression I would need to change the VDM texture nodes. Or the addon would need to generate all VDM texture nodes first, then add drivers in the shader nodes. I cannot imagine such node groups, but without them I cannot mix expressions at all.)
But as for my request, I am not thinking of that seriously; my hope is to keep base resolution but import only the HD morph of the base character shape as a VDM, so I can get the shape detail while keeping all the functions the addon already offers. I hope to see how Thomas improves baking normal maps from HD morphs, per Xin's request. Anyway, once Xin finishes the work, I will be glad to test each option. (For me, importing HD morphs as VDMs is amazing enough.)
-
Then Alessandro, what actual difference do you mean between HD morphs and HD expressions? Are those two offered in different formats? (I did not think so.) You have tested HD much more than I have, so what do you think of those two kinds of HD morphs? (I think there may be no difference in what data is included in the file.)
===
Ah, ok, so you are saying each HD file (expression and body) is meant to be used together with a bump map (or normal map). I will follow the blender docs when I use VDM:
Displacement and Bump
Both methods can be combined to use actual displacement for the bigger displacement and bump for the finer details. This can provide a good balance to reduce memory usage. Once you subdivide the mesh very finely, it is better to use only actual displacement. Keeping bump maps will then only increase memory usage and slow down renders.
(But we may need to use this option for the final render, because daz keeps the bump map along with the HD morph, so we cannot remove it, I suppose.)
I actually feel that if we use VDM, we may not want to bake HD to a normal map: even with a normal map, we cannot expect it to work the same as a shape key controller (we would need to swap the normal map for each expression). And if only one normal map is used, VDM definitely offers more quality.
-
Alessandro, that sounds very good. Does this also work with all HD face expressions? Also Unique Smiles, for example? Until now I optimized the HD expressions with height displacement, not with normal maps. Will this plugin also make displacement maps?
Another question: is there a plan for Blender to implement a way to import shape keys without changing resolution, like Poser and DAZ? As I understand it, Blender is open source too, so maybe it's possible to implement this in the future?
-
engetudouiti Yep, once Xin and Thomas can get it to work, I’d go with HD maps for expressions and HD morphs for jcms. Then personally I’m not sure if base-res morphs + HD maps can do the same as HD morphs, as Xin is suggesting. We’ll have to do tests.
Jochen As I understand it, the HD maps will be blended with drivers, so you can mix HD expressions the same as base expressions. As for blender, what we miss is HD support for multires, that is, the ability to store HD morphs. Actually multires only stores base-res morphs. But, from what I understand by Xin, it is possible to transfer the rig to the HD mesh, so we can get a workaround this way.
Then of course the simple way to get HD figures is to pose in daz then import in blender, this will always work, and could also be enough for single pictures. So the hard work Xin is doing is more targeted to animation I guess, but it also has the benefit to allow true HD in blender.
-
I still do not understand how you can go with HD maps for expressions and HD morphs for jcms. Of course, if you keep using a sub-D applied mesh you can do it, but then I suppose you would not need HD maps at all (you would just use everything as HD shape keys and set drivers for the shape keys).
Also, I cannot see how HD maps would be blended with drivers. Yes, every shader node value can have a driver, and it can be adjusted like we change a shape key value. But when we make a JCM or MCM driver, all shape key data needs to be imported first (each mesh holds its shape keys as a collection of key data).
So let's think: smile needs smile HD, angry needs angry HD, laugh needs laugh HD (if offered). You need all the VDM maps first, and they need to be included in one shader node tree for each UV set.
I have not counted how many HD morph files are used for expressions, but if daz offers 6 HD morphs for an expression, we need to generate 6 VDM textures as blender texture nodes first, and only then can we adjust or mix them. Can you imagine that working well? At the least, our shader nodes would become really messy, I suppose.
Blender shape keys already offer this: we set a value for each shape key, they are automatically added together according to their values, and the current shape is shown.
If we do it as texture data, we need node groups which accumulate all the VDM maps with their current values. To make it work, the addon needs to generate a custom node group which includes every VDM image as a texture node, multiplies each by strength = controller value, and mixes them inside the node group into the final VDM (RGB). (I do not know whether this has already been made for normal maps, nor whether it actually works when we control them from the UI controller.)
One thing I can confirm: "base-res morphs + HD maps" can do the same as HD morphs. Yes, that is how ds works with HD morphs, and many of the HD characters in products I already own are made that way.
In daz studio:
We set the sub-D for HD to work in DS first, then add the base morphs (they only describe base resolution vertex deltas; the other sub-D vertices are interpolated, i.e. auto-adjusted), then finally add the HD morphs which move all sub-D vertices at the current level. You can confirm that an HD character controller often moves base morphs + HD morphs; daz separates them, then applies them at once with one character controller.
So generating a VDM from the HD morph file at that sub-D level does the same thing in blender. Then we do the same thing:
1. set sub-D (not applied, we need not apply it)
2. set the base shape key values with the addon's UI controller (the base shape morphs are already baked when we export at base resolution, so I do not need this step to import an HD character; I mis-wrote)
3. finally add the VDM map, which moves all sub-D vertices as deltas from 2. (How to mix them when many HD morphs are used at the same time is a different problem, and I am not considering that usage at present; I only plan to use it for a character FBM or PBM represented by 1 or 2 HD morphs.)
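For what it's worth, the node group being discussed would compute, per pixel, a strength-weighted sum of the individual VDM texels, which mirrors how shape key values add up. A minimal sketch of that mixing math, not actual node code:

```python
# Mix several VDM texels the way driven shape keys mix: the final
# displacement is the sum of each morph's vector times its driver value.

def mix_vdm(samples, strengths):
    """samples: list of (x, y, z) VDM texels; strengths: driver values."""
    out = [0.0, 0.0, 0.0]
    for vec, w in zip(samples, strengths):
        for i in range(3):
            out[i] += w * vec[i]
    return tuple(out)
```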
-
engetudouiti Thank you for the nice explanation, indeed I may be confused about drivers. As for true HD, as I understand it HD maps will be normal maps. So I believe base-res morphs + HD maps can’t do the same as HD morphs, because normal maps can’t do true displacement. If HD maps will be true displacement maps then yes, it may work with some approximation, depending on the map resolution and geometry resolution, because pixels will be interpolated to the vertex geometry.
So what I mean is we have to do real case tests to see if HD normal maps can be enough to get a good approximation of the daz HD figures. And personally I’m expecting them to be good with expressions and fail with jcms and body features. While true HD, if we can get it, will always work fine of course.
-
@Alessandro
Yes, I agree perfectly with what you said. Anyway, we will be getting some new options that Xin kindly offers, so it may depend on each user what we want to use: e.g. normal maps for performance, or VDM with a high sub-D level, or actually getting HD shape keys. We will need to consider each procedure and test it in a real user workflow. (Some users may not need to control drivers for JCM/MCM VDMs, I suppose; I do not know.)
I am excited that generating VDMs adds a new option for something we could not solve before. Xin knows clearly how it will be generated, so I might ask and test for my purpose. (If it generates the VDM for the non-morphed base character, I need to change my import settings; if it generates it for the base morphed character, I only need to import the morphed shape as baked. After all, I need an actual test.)
-
reporter I finished the HD shape keys operator. You don't need to import an HD mesh; the .dll can generate the HD mesh itself from the base mesh, and once it is imported back into Blender (this is quite fast), you can transfer materials and vertex weights to it. The slowest part by far is the vertex weights transfer to rig the HD mesh, which is done with Blender's Data Transfer modifier. Overall, generating the rigged HD with the .dll is no slower than importing the unrigged HD with this addon. Once you generate that rigged HD mesh, you can import HD shape keys to it with another operator (it also supports importing shape keys from regular .dsf files, so base morphs can also be applied to the HD mesh). Be aware that I'm not bothering with setting up Drivers like this addon does; it would make no sense to re-do what Thomas has already done, and that can be implemented later.
Now only the normal maps operator remains, which should be quick.
Later, once I post the first version for testing, I will try to create a simple reduced .collada exporter/importer (.obj doesn’t support vertex weights), which would allow the .dll to “paint” the vertex weights on the HD itself, as it subdivides it. This would be a considerable speed boost I’m quite sure. Even better, Blender’s .collada importer is also implemented in C++. So getting rid of the need to use Blender’s data transfer to transfer vertex weights shouldn’t be that hard. Everything else is fairly fast.
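To illustrate the vertex-weight transfer step mentioned above in its simplest form (this is only the principle; Blender's Data Transfer modifier interpolates far more carefully): each HD vertex can inherit the weights of its nearest base-mesh vertex.

```python
# Naive nearest-vertex weight transfer from a base mesh to an HD mesh.
# Purely illustrative; real transfers blend weights across nearby faces.

def transfer_weights(base_verts, base_weights, hd_verts):
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    hd_weights = []
    for v in hd_verts:
        nearest = min(range(len(base_verts)),
                      key=lambda i: dist2(base_verts[i], v))
        hd_weights.append(base_weights[nearest])
    return hd_weights
```

Doing this "painting" inside the .dll while it subdivides, as proposed, avoids the O(base × HD) search entirely, since each new vertex already knows which face it came from.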
-
I do want it bad
-
I have some questions about the workflow for importing HD morphs as real shape keys, but it seems better to test a real case first.
At the same time, I wonder if we can convert a default-imported, already finished character to an HD sub-D one, and after that use Xin's HD morph importer.
That means:
- import the character as a base mesh, then import all morphs first, without HD morphs
- by script, apply sub-D to the meshes, subdividing all shape key data at the same time while keeping the morph drivers
- finally import only the HD morphs with Xin's addon, so we can easily set drivers (though if there are many JCMs or MCMs it is not easy for the user)
The difficult thing is step 2: there are some scripts which try to apply sub-D to all shape keys, but they are usually not meant for rigged characters (armature modifier).
So it needs a purpose-built script, though I do not know whether Xin's script already covers this.
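The tricky step 2 boils down to one invariant: the base mesh and every shape key must be subdivided with the same scheme, so the per-vertex deltas stay consistent. A toy 2D polyline version with midpoint subdivision (real meshes would need Catmull-Clark, and the armature binding handled separately):

```python
# Subdivide a polyline and all of its shape keys with the same operator,
# so old and new vertices carry consistent deltas afterwards.

def midpoint_subdivide(points):
    """Insert the midpoint of every segment; keep original points."""
    out = []
    for a, b in zip(points, points[1:]):
        mid = tuple((ai + bi) / 2.0 for ai, bi in zip(a, b))
        out.extend([a, mid])
    out.append(points[-1])
    return out

def subdivide_with_keys(base, shape_keys):
    """Apply the identical scheme to the base and each shape key."""
    return midpoint_subdivide(base), {n: midpoint_subdivide(k)
                                      for n, k in shape_keys.items()}
```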
-
reporter I have finished all operators.
Now I am testing them and exposing more options. For example, I believe the following option is interesting: you can use real subdivision up to a level X, and then use maps only for the details from level X+1 onwards, which are small enough that you don't need vertices for them and can rely on normals/displacement. This is particularly useful because textures can only capture a finite depth range, unlike Shape Keys. That is not a problem for facial expressions and human-like deformations, but it comes in handy for heavily distorted figures like monsters and animals.
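That split can be pictured as dividing each HD delta into a coarse part, kept as real geometry up to level X, and a small residual that fits a texture's finite depth range. A schematic sketch with made-up numbers:

```python
# Split a full HD delta into "real subdiv" and "baked map" parts.

def split_delta(full_delta, coarse_delta):
    """residual = what the map must carry once the real subdiv is applied."""
    return tuple(f - c for f, c in zip(full_delta, coarse_delta))

def fits_depth_range(delta, max_depth):
    """Textures only store a finite depth range, unlike Shape Keys."""
    return all(abs(c) <= max_depth for c in delta)
```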
By the way, I tested the monster thing from Alessandro, and that one’s top level has subdiv 3, which has over 1 million faces, so it becomes very heavy for Blender to handle if you want to use a real HD mesh (it still works though). On the other hand, the vector displacement + subdiv works quite well for it (yes, even while having a quite wide range of depth). The monster’s materials already come with normals so there is no need to bake them for this example.
Also, I want to write somewhat detailed step-by-step examples later, since I don't think it will be obvious how to use the operators effectively without further explanation, especially now that I have added more settings. The ancient gpu I have doesn't help, so I have to rely on cpu rendering for testing.
So I hope to finish everything in the next 3 days or so.
Also I’m not adding any Driver set up at all, since this addon already does that, so it can be easily added later. Right now, the test addon doesn’t call any functions from this addon (it reuses some of its code like the handling of UDIMs though), to keep it simple for initial testing.
-
reporter I attached the test addon.
-
reporter Read the “example.html” example in the directory “notes” first.
-
I'm on it Xin, this sounds revolutionary for HD figures. Thank you so much for your time on this.
-
Thanks Xin, take as much time as you need; I really look forward to testing VDM for HD morphs.
-
Xin, I suppose I am making some mistake, but I tried to test by following your attached HTML.
After importing the HD demon at base resolution with the Daz Importer, I set the working directory etc. and kept the VDM settings at their defaults. Then I generated the VDM with the console open, and now I see these error messages:
mesh name: Genesis_2_Male_Mesh_Genesis2Male
Error in DLL: invalid face: f 431/616 8801/615 8809/625
Python: Traceback (most recent call last):
File "C:\Users\TAKE\AppData\Roaming\Blender Foundation\Blender\2.91\scripts\addons\daz_hd_moprh_test\operator_common.py", line 16, in invoke
return self.execute(context)
File "C:\Users\TAKE\AppData\Roaming\Blender Foundation\Blender\2.91\scripts\addons\daz_hd_moprh_test\operator_vector_disp.py", line 23, in execute
if not self.generate_maps():
File "C:\Users\TAKE\AppData\Roaming\Blender Foundation\Blender\2.91\scripts\addons\daz_hd_moprh_test\operator_vector_disp.py", line 46, in generate_maps
self.morph_base_until )
File "C:\Users\TAKE\AppData\Roaming\Blender Foundation\Blender\2.91\scripts\addons\daz_hd_moprh_test\dll_wrapper.py", line 118, in execute_in_new_thread
r = future.result()
File "C:\myprograms\blender-2.91.2-windows64\2.91\python\lib\concurrent\futures\_base.py", line 435, in result
return self.__get_result()
File "C:\myprograms\blender-2.91.2-windows64\2.91\python\lib\concurrent\futures\_base.py", line 384, in __get_result
raise self._exception
File "C:\myprograms\blender-2.91.2-windows64\2.91\python\lib\concurrent\futures\thread.py", line 57, in run
result = self.fn(*self.args, **self.kwargs)
File "C:\Users\TAKE\AppData\Roaming\Blender Foundation\Blender\2.91\scripts\addons\daz_hd_moprh_test\dll_wrapper.py", line 105, in generate_disp_morphs
r = w.generate_disp_morphs( *args )
File "C:\Users\TAKE\AppData\Roaming\Blender Foundation\Blender\2.91\scripts\addons\daz_hd_moprh_test\dll_wrapper.py", line 76, in generate_disp_morphs
raise RuntimeError("Function \"{0}\" of DLL \"{1}\" failed.".format("generate_disp_morphs()", self.dll_path))
RuntimeError: Function "generate_disp_morphs()" of DLL "C:\Users\TAKE\AppData\Roaming\Blender Foundation\Blender\2.91\scripts\addons\daz_hd_moprh_test\dll_dir\dhdm_dll.dll" failed.
location: <unknown location>:-1
Can you point out what may cause the error? About the dll, I have no knowledge of what to do if it shows an error.
-
reporter That looks like a mesh with a triangle. Make sure it only has quads. I haven’t yet adapted it to work with triangles.
-
My test demon HD loads a horn obj too (but it is not a daz grafted item; it simply loads as an obj, the same as a prop etc.),
so I do not think the HD morph affects the horn mesh, and when I load the actor, the horn loads as a separate mesh (so I simply leave it off; I may parent it later if needed).
If the cause is that the saved duf includes the horn geometry, I may test later without the horn.
-
OK, I will check it (I am afraid the genesis2 male may include triangles ^^;).
Yes, I could find many triangles in the genesis2 male (I only use it as a creature). But never mind, I will test with G3 HD morphs later.
-
reporter I don't think it's hard to make it work with triangles, but I don't have any older morphs to test with.
-
If you happen to get a G2F or G2M HD product, it should show the same issue (because the genesis2 mesh actually has some triangles at base resolution). But I understand G2 is an old figure to play with in recent add-ons. ^^;
-
Xin, can I simply import the VDM without generating a new scene, just adding the VDM by selecting the file?
From your guide (though I only read the VDM section carefully), it seems I may not need to import an unmerged mesh.
Or, to generate the VDM correctly, does the add-on still need to export the base mesh unmerged?
===
OK, never mind, I easily succeeded in generating the G3 mouth realism VDM with my character.
I am excited to see how it will show now... Yes, it works as I expected!!!!
Yes, I could confirm that even though it is a detail-of-mouth-only morph, it already works well while keeping adaptive sub-D.
With no displacement (VDM scale 0) the teeth are smoothed out, as you can see. (I had already made the morph, but had not applied it to this character yet ><) With more sub-D, the teeth gaps become clearer.
Then I set the VDM scale to 1.2 * 0.01.
Actually, keeping the 0.01 scale does a good job (it is the correct scale value, I think); you can clearly see the detail added to the teeth and tongue at the same sub-D value (I use adaptive sub-D).
The mouth realism HD morph also seems to hide the teeth gaps, thanks. (I just generated the VDM from the G3 PHMmouthrealismHD duf, then applied it to the mouth and teeth only.)
Thanks Xin, this already achieves what I have wanted for a long time: using those HD detail morphs without importing a high-resolution mesh. I supposed I could do it with a displacement map, but the true power of Xin's add-on and dll is that it achieves it with VDM and can generate it in a few simple clicks. Blender users could never generate VDMs before. (I was really disappointed that Blender does not offer a tool to bake VDMs, but instead recommends using other 3D sculpting tools for them.)
And the setup is not difficult at all. The generated map names clearly show which UV tile we need to apply each VDM to. And I think it took a lot of time to offer such a helpful HTML guide. Thanks Xin.
=====
As a future request, if you can: it is now clear to me that when there is vertex delta data, that data can be converted into a VDM. So, if I sculpt and make a morph for a high-resolution mesh (simply subdivided 2 or 3 times), then export the mesh as an obj file, could you convert it into a VDM for the base mesh? ^^;
If you enhance this dll and add-on like that, it means we could generate VDMs with Blender sculpting for our imported characters (HD morph VDMs) as we like... (without ray-cast baking).
-
Forgive me, I only tested the non-HD shape key options, because my PC is not powerful enough to manage many HD shape keys, I believe. So if Xin plans to enhance this add-on, I hope for these things:
-
Ideally, I really hope the add-on will work with Genesis 2 meshes (because I have many Genesis 2 products for males; actually I have no HD male character usable for a G3 man. My interest in male figures is usually creatures, and they are Genesis 2 male based).
-
I am now thinking about how to merge 2 or 3 HD morphs generated as VDMs for the same tile.
There should be some way. I suppose: could you generate one VDM from 2 or 3 dhdm files? Daz may not allow us to generate a new dhdm file from product dhdms, but I suppose, from the offered documents about dhdm, you can pick up the necessary data to generate one VDM from multiple dhdm files.
If not, we need to generate as many VDMs as the dhdm files the user used for that UV part.
If I mimic daz morph ERC, generating one VDM per dhdm is most reliable, I think. But there may be cases where the user hopes to generate only one VDM from several dhdms (the same as generating a new shape key by mixing other shape keys that each represent one morph).
If one VDM is generated per dhdm (the add-on only supports this at the moment), I hope Thomas's add-on will generate shader nodes which mix the VDMs for the displacement input socket.
The VDM texture output (R,G,B) is a vector, so we may need to subtract (0.5, 0.5, 0.5) (the midpoint value set in the vector displacement node) from the VDM RGB first; then it represents the transform vector of the pixel.
E.g., say I have 3 dhdms used for the current character, each set to a value like this:
dhdm1 (value 0.3), dhdm2 (value 0.5), dhdm3 (value 0.8) in daz studio, and they are used for the same UV tile verts.
If I represent it in Blender with mix color (or vector math) nodes, it should be converted like this (I will test it later):
final vector (R,G,B) =
(0.5, 0.5, 0.5) + 0.3 * (vdm1R - 0.5, vdm1G - 0.5, vdm1B - 0.5) + 0.5 * (vdm2R - 0.5, vdm2G - 0.5, vdm2B - 0.5) + 0.8 * (vdm3R - 0.5, vdm3G - 0.5, vdm3B - 0.5)
Yes, if the Daz Importer generates those 3 dhdm controllers the same as other morphs, I can set drivers in the shader nodes as above (0.3 = dhdm1 value, 0.5 = dhdm2 value, 0.8 = dhdm3 value, so I may only need to set drivers for those values; the driver target is the UI controller value of each dhdm).
The final vector (RGB) is then connected to one vector displacement map node, keeping midlevel 0.5. (Of course we can still adjust the global scale while keeping the morph ratio.)
But I do not have a good dhdm to test these with ^^; (without a dhdm file I have no way to generate a VDM; I may try with a procedural texture and a plane, though).
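The mixing formula above can be sketched in plain Python. This is only an illustration of the arithmetic, not addon code; `mix_vdm_pixels` and the sample pixel values are hypothetical, and a real implementation would run per pixel in shader nodes or numpy.

```python
# Sketch of mixing several VDM pixels into one, following the formula in
# the comment above. Each VDM stores a displacement vector encoded as RGB
# around a 0.5 midpoint. Names and sample values are hypothetical.

def mix_vdm_pixels(pixels, weights, midpoint=0.5):
    """pixels: (R, G, B) tuples from each VDM at the same UV position.
    weights: the daz morph values (e.g. 0.3, 0.5, 0.8)."""
    mixed = [midpoint, midpoint, midpoint]
    for (r, g, b), w in zip(pixels, weights):
        # subtract the midpoint to recover the displacement vector,
        # weight it, and accumulate onto the shared midpoint
        mixed[0] += (r - midpoint) * w
        mixed[1] += (g - midpoint) * w
        mixed[2] += (b - midpoint) * w
    return tuple(mixed)

# Three VDMs with the weights from the comment above:
result = mix_vdm_pixels(
    [(0.6, 0.5, 0.5), (0.5, 0.7, 0.5), (0.5, 0.5, 0.4)],
    [0.3, 0.5, 0.8])
```

Note that subtracting the midpoint before weighting is what keeps a weight of 0 from contributing any displacement.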
-
-
update. I have successfully completed the vector displacement part and everything worked fine. Though I used angry instead of triumph, since triumph is not available in the basic pack. It would be nice for the addon to set a driver for the vector map instead of using it statically, and to generate the displacement material instead of leaving it to the user, if Thomas may help with this.
Overall it’s an amazing work so far.
-
reporter You can already pass several morphs (and weighting factors if desired) to the .dll function and all the morphs would be considered for the final texture according to their weights. It’s just not implemented in the addon’s UI since I don’t know how to expose that in a clean manner.
I will look into generating the vector displacement textures for sculpted meshes (or any HD deformation) in Blender. That should be easy since the .dll reads arbitrary .obj files already. The information needed is the edited/sculpted HD mesh, the base mesh, and the type of subdivision which will be used with the vector displacement texture once generated (whether “limit surface” will be on or not since that would affect the look of the generated texture a little).
I will see if I can get a genesis 2 HD morph to make it work with triangles. It shouldn’t be hard.
-
Xin, yes the way you described (gathering HD morphs with individual weights and generating one VDM) is exactly what I mean.
For those baked-shape VDMs that way seems best, because they do not need to be driven in the Blender UI, the same way we do not generate Victoria 7 or Aiko 8 shape keys (they are baked when we export).
Thomas already offers an advanced option to make character JCMs work correctly: we manually input those morph values so the imported JCMs are multiplied by the weight (e.g. for an Aiko 7 shape at 0.3, we need to multiply the JCMxxxx_aiko effect when a bone rotates).
So I suppose you could add a UI prop which is auto-multiplied with the dhdm (whose path we set in the UI). But a prop where the user inputs multiple paths and weights is actually complex (ideally a kind of dynamic UI prop where the user can add entries as they like: (HD path name 1, weight), (HD path name 2, weight)).
At the same time, I wonder if this way is better:
When importing a character.duf, the duf describes all currently used morph values for the zero-pose shape. So if your HD importer connects with Thomas's daz importer, I suspect you could use those values (all HD morph paths and weight values) as temporary variables, and then use them to generate one mixed VDM
(I may call that VDM the base shape delta VDM), whose value we do not change for the character (though we could still globally change the effect with the VDM node scale).
If it works that way, your add-on could auto-generate a final shape delta VDM for the imported base without the user setting anything. (It is just my thinking on how it could work, and you already do a lot of amazing work.)
About HD morph VDMs (JCMs or MCMs etc., which need to be driven by bone or other morph values without the user setting them), I will ask Thomas, when he finds time to test your add-on, how to set drivers for shader nodes correctly.
And if you succeed in generating VDMs from a sculpted mesh and base in Blender,
that would be amazing for me. (In the future: select 2 meshes, base and subD-applied (keeping vertex order may be needed?), auto-export objs temporarily, and generate the VDM; that is the easiest user workflow, I feel. Actually it could sell in any Blender add-on marketplace, so if you make and sell it I would simply buy it and say thank you, the same as I always feel about Thomas's daz importer add-on.)
-
Xin
I have some questions about the add-on's UI VDM scale setting (it seems key to generating the VDM and setting up the nodes correctly for heavily sculpted HD morphs):
- when I keep the scale at the default = 1.0, does that mean the VDM can represent a max HD morph delta of 1 cm?
- can the console output the min and max per RGB component, so that it can show e.g. 389? (Of course we cannot actually generate such a texture (max = 255), but to adjust the scale we need the min and max of the data, not of the actually generated texture.)
- if I set the scale setting to 0.5, does it mean the VDM can represent a max delta of 1 * 0.5 or of 1 * 1/0.5?
I believe you understand clearly what I mean... (These settings must be needed when we generate a VDM for a hard-delta HD morph. Though I do not know the daz HD morph range, if it is a true morph there may be no actual limit; we may not make a 1 m delta HD morph, but I suppose there are cases of a 3 cm delta etc. for creature deforms.)
Then, to generate it as a VDM, we need to adjust it with a conversion scale for the VDM; after that, we set the displacement map node scale (which I can calculate from the VDM setting scale, though I may adjust it as I like to show variation).
-
update. Tested the normal map option; it's excellent with eevee. Below is G8F with the HD angry expression in eevee. In this case a strength of 2 in the normal map node seems to mix fine with the bump map already in the face material.
Everything seems to work fine so far.
-
reporter I will look into trying to set up something like https://docs.blender.org/api/current/bpy.types.UIList.html later. It’s the same interface element used by Shape Keys and Materials.
As for the scale, the .dll works in daz’s dimensions, so every export from Blender is scaled up by 100.0. The scale setting is applied to the displacements inside the .dll. The formula is in “…/dll_source/diff.cc”:
red channel = [difference of X coordinates in the scale of daz’s coordinates system] * (scale setting) + 0.5
I guess it would be easier to just print values between 0 and 1 and anything outside that range would be out of bounds. I will change that later too.
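The per-channel formula above can be sketched in Python. This is an illustration only; the real implementation is in the .dll ("…/dll_source/diff.cc"), `encode_channel`/`decode_channel` are hypothetical names, and it assumes the same formula applies to all three channels.

```python
# Sketch of the per-channel encoding described above (assumption: the same
# formula applies to all three channels). Displacements are in daz units
# (cm), which is why Blender exports are scaled up by 100 first.
# Function names are hypothetical, for illustration only.

def encode_channel(delta_cm, scale):
    """Map a per-axis displacement (daz cm) to a [0, 1] pixel value."""
    return delta_cm * scale + 0.5

def decode_channel(pixel, scale):
    """Invert the encoding to recover the displacement in cm."""
    return (pixel - 0.5) / scale

# With scale = 1.0 the texture covers deltas in [-0.5 cm, 0.5 cm]:
assert encode_channel(0.5, 1.0) == 1.0
assert encode_channel(-0.5, 1.0) == 0.0
# Halving the scale doubles the representable range:
assert encode_channel(1.0, 0.5) == 1.0
```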
-
update. Finally the true HD test. It works fine too. Below the G8F angry expression as true HD geometry.
Please note that with the true HD geometry we get issues with deformations. Below is an example where I bent the G8F leg and strange ripples appear in the geometry. This is the same issue that happens in the daz bridge, since the bridge exports true HD geometry too, instead of multires as diffeomorphic does.
In this case we can help the armature with a smoothing modifier, so the ripples go away. Please note that this will affect the whole HD geometry unless we use a weight map. So a map is needed to avoid losing details across the whole figure.
Xin this is really amazing work. It is missing drivers to trigger the morphs, that is, HD jcms. But everything seems to work fine.
-
reporter Using the Smooth Vertex Weights operator on the entire mesh (for all the vertex groups) gives better results (with around 16 iterations, which takes a few seconds). This won’t affect geometry itself but the vertex weights only, so the smoothing doesn’t get rid of the mesh details. I might add this option later.
I believe that subdividing the vertex weights in the .dll would also get rid of these artifacts, since there is an option in OpenSubdiv to subdivide data smoothly. What is happening here, I believe, is that the vertex weights are being subdivided linearly (the “peaks” are the edge loops of the base mesh).
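The weight-smoothing idea above can be illustrated with a small Laplacian-style sketch in Python. This is not the Blender operator itself; the function, the neighbour lists, and the factor/iteration defaults are hypothetical stand-ins for what the Smooth Vertex Weights operator does.

```python
# Sketch of Laplacian-style smoothing of vertex-group weights (the idea
# behind Blender's Smooth Vertex Weights operator): each iteration blends
# every vertex weight toward the average of its neighbours. Only the
# weights change, never the geometry, so HD detail is preserved.
# All names and defaults here are hypothetical, for illustration only.

def smooth_weights(weights, neighbors, factor=0.5, iterations=16):
    """weights: one float per vertex; neighbors: adjacency index lists."""
    w = list(weights)
    for _ in range(iterations):
        nxt = []
        for i, nbrs in enumerate(neighbors):
            if nbrs:
                avg = sum(w[j] for j in nbrs) / len(nbrs)
                nxt.append(w[i] + factor * (avg - w[i]))
            else:
                nxt.append(w[i])
        w = nxt
    return w

# A hard step in the weights along a 5-vertex chain gets softened,
# which is what removes the "peaks" at the base mesh's edge loops:
chain = [[1], [0, 2], [1, 3], [2, 4], [3]]
smoothed = smooth_weights([0.0, 0.0, 1.0, 1.0, 1.0], chain, iterations=4)
```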
-
Xin
Yes, I see: the daz X,Y,Z morph deltas are object-space values, so the dll converts them to a UV tangent vector for each vert, as (U, V, normal). I was only thinking of the vector after conversion to UV tangent space.
If the add-on outputs the min and max float values over all converted vector components, I think I can decide the scale setting from just those 2 floats. Whether it is U, V or normal (R, G or B) does not matter for setting the scale.
E.g., say the converted vector components have min = -0.8 and max = 2.7. Comparing absolute values, in this case we need to represent 2.7 within the VDM RGB range (0 to 1) (then the min is automatically within range).
From the conversion formula: [2.7 * (setting scale) + 0.5] <= 1, so 2.7 * (setting scale) <= 0.5.
So I need to set the setting scale <= 0.5/2.7.
That means we need 1/scale >= 2.7/0.5 (= 5.4).
I recommend using the smallest such integer for 1/scale; in this case it should be 6.
That means the generated VDM can then represent deltas in the range -3 to 3 in the texture.
To return it to actual unit values, we need to set the shader node scale to 0.01 * 6 to get the same effect as the daz-unit delta.
Or I may use a color mix node set to multiply: VDM output vector * user input scale (= 1/setting scale), then connect it to the VDM shader node vector.
I would request these things:
A. When generating the VDM, output to the console one pair of min and max float values over all converted vector components in the dhdm.
(Only one min/max float pair is needed to derive the setting scale; actually we only need one float value, the larger absolute value of min and max, as the delta max.)
B. Change the user input value for the VDM scale: not the dll setting scale, but 1/setting scale (labelled as scale, same as before).
C. (If you make the add-on auto-set the scale value:) user input scale (integer) >= delta max / 0.5; then pick the smallest such integer as the VDM scale value (in the dll, 1/scale = setting scale),
and set the default VDM shader node scale from 0.01 to 0.01 * scale (integer) (0.01 being the conversion from 1 daz unit, cm, to 1 Blender unit, m), or multiply the scale (integer) with the VDM texture node output vector, then connect it to the VDM node vector.
For me, A and B are the most important ^^; then, about your comment:
I guess it would be easier to just print values between 0 and 1 and anything outside that range would be out of bounds. I will change that later too.
I approve of the console output showing the min and max per vector component (U, V, normal) as floats (not 0 to 255). But we actually need the values outside the range, at the default scale 1 setting (the actual morph delta vector values in UV tangent coordinates).
If it is just shown as out of bounds, or only shows values within the 0 to 1 range, we would have to guess and change the scale until the console stops showing out of bounds.
========
I missed something in the formula, so let me edit the formula for setting the shader node scale.
(The shader node global scale needs to be set to 0.01 * (user input integer scale value). The multiplication is needed so that when scale = 6, we can show the range -3 cm to 3 cm in Blender (meter units) as displacement.)
Because, by default (scale setting = 1.0), the VDM texture vector is in the range -0.5 to 0.5.
At the same time, to adjust the strength we may need to use the shader node scale, so I may add a multiply (math) node and generate the final shader scale like this:
(0.01 * user input (or auto-set) scale) * adjustment strength value = final VDM shader node scale value. Or use a color mix node set to multiply. (I do not have a strong view on how to set up these shaders, but ideally auto-setting the default shader properties so that morph strength 1 shows the same effect (delta) as daz studio in Blender, with the user only changing other parameters to adjust the morph strength, would be usable.)
-
And I understand I should not request many things at the same time ^^; (sorry), and I need to actually test that the settings formula shows the deform delta at a reasonable size, until you enhance it.
I would ask anyway: show the min and max values outside the range, as vector values. (Then I can test with a real HD morph which needs the scale setting changed. I do not think I currently have a product that needs a scale setting beyond -0.5 cm to 0.5 cm for G3 or G8.)
One thing I clearly understand: by default, the VDM texture can only represent the range -0.5 (cm) to 0.5 (cm) (in UV tangent space) if we use a scale setting of 1.0; larger deltas (like 0.6 or 0.65) are simply represented as a 0.5 displacement.
If we set a smaller value in the current add-on's scale setting, a wider delta range can be included in the VDM; at the same time, when we change the scale setting, we need to adjust the shader node scale accordingly.
E.g., if we set the scale setting to 0.25, the VDM can now represent the full -2 cm to 2 cm delta range (in daz) (instead of -0.5 cm to 0.5 cm). But if we keep the VDM shader node scale at 0.01, it will still be shown as a -0.5 cm to 0.5 cm delta range in Blender (so we need to multiply by 1/0.25 to restore the actual delta lengths in Blender).
Given those things, how to improve or what to add I leave entirely to Xin; (actually I only hope you offer the min and max values converted to UV tangent space, as floats).
I can simply use my current guess formula to get the setting scale value and shader scale, then test. If it is confirmed to work (showing all the deltas we see in daz, at a reasonable size), you might later plan auto-setting etc., if you are interested.
-
reporter I will try to set up the generation of vector displacement textures for arbitrary .objs first. Also a few small changes.
Then I will look into the scale. The scale doesn’t change from object space to tangent space. Only the components of the difference between the two positions change, but not the length of that difference. The textures save values per component, so it doesn’t matter that much what the length is, only the largest component, at least if we aren’t doing any auto-set.
-
Xin, yes please smoothing the weight maps would be a major improvement for posing the HD mesh. Thank you for the nice tip.
-
reporter I will add the smoothing with the Vertex Weights Smooth operator first.
The smoothing in the .dll will take longer, since I need to make at least a simple .collada exporter/importer first, since .obj doesn’t support vertex weights.
-
Xin, I have an issue with mutation for G8F and maybe I'm doing something wrong. My purpose is to first import mutation with diffeo to get the armature and the base mesh, then get the HD version with your addon. What happens is that the HD morph seems applied twice, or maybe out of scale.
steps
- import mutation in blender as base mesh with diffeo
- generate the HD mesh with the HD addon
- import the mutation HD morph with the HD addon
Below it’s the base mesh imported with diffeo, I need it for the armature since it’s a different shape from G8F, this is not just a face expression.
Then below is the generated HD mesh with the HD morph applied. My guess is that the HD morph is applied twice since we already have the base mesh with diffeo, so maybe in this case we need an option to only import the dhdm file.
-
reporter Yes, that’s mentioned in the notes I think. What you want to do in that case, since you already have the base morph applied, is to select the .dhdm file as the morph file when importing the HD shape key. The addon can read both .dsf and .dhdm files (.dhdm files only contain data for subdiv 1+, while using the .dsf means using both base morphs + the referenced .dhdm subdiv 1+ morphs).
The “Generate HD mesh” operator doesn’t apply any morphs. So you use that one first as usual. Then you use “Import HD morph to HD mesh” with the generated HD as the “HD mesh” and the .dhdm file as the “Morph file” (the .dhdm files are in the same directory as the .dsf files, so you should find them quickly). This figure has subdiv 3 details, so you can set the “Morph subdivisions” to 3 or to whatever you used when Generating the HD mesh.
For very high res figures which have more than 2 subdivisions, there is another alternative. You use the real HD operators assuming the max detail is at 2 subdivisions, then you use the “Starting subdiv level” option in the texture operators to generate maps that only capture subdiv 3+ details, so you can combine both options.
-
reporter Also, I finished making it work with triangles now, so it works on genesis 2 figures.
-
Xin, can you attach the new test version zip please? I really hope to test your new version, to check the hard-deform VDM scale settings with the g2 monster HD morphs.
========
Alessandro, I may need to actually test it, but as Xin mentioned, when there is one HD morph controller in daz studio, there are usually two HD-labelled files in data/morphs: (XXXHD.dsf = base verts delta only; at the same time it describes the dhdm file path for the high-resolution version), (XXXHD.dhdm = which describes all sub-D deltas in sub-D vertex order).
So if you apply all controllers for HD muta (when you load the character) and then export it as base, it already includes the HD dsf delta. You may need to select the dhdm file to generate the shape key (then it will not include the HD dsf delta).
One thing I cannot confirm: if I set base resolution, daz should use XXXHD.dsf and add the delta. But when I set high resolution (sub-divide set to 3, 4 etc.), I do not know whether daz still adds the XXXHD.dsf delta too. (I supposed the delta was still added to the base, but if daz auto-swaps it (not adding the HD.dsf delta any more when sub-D > 1 at high resolution, but adding HD.dhdm instead), it may show some unexpected effects, I suppose.)
-
reporter Here it is, keep the old version around too in case you find bugs.
-
reporter - attached test2.7z
Can handle triangles and quads (no N-gons though).
-
Xin, thanks ^^. I'll keep both as you said.
-
Xin, thanks, it now generates the HD demon (g2male HD). And I actually caught a case which overflows with the setting scale at 1.0: when I generate the vector disp from the dhdm file, the console now shows this:
Maximum displacement length: 1.7492835040878418
Largest pixel component value: 1.9614374603966263
Smallest pixel component value: -0.6917768042326324
I suppose that means we need to change the setting scale. If you did not change how it works, I need to set the setting scale to 1/4 to get a -2 to 2 range, I suppose. Do you think the same?
Because we need to keep the largest component under 1 and the smallest component over 0 (0 = min, 1 = max). In this case, by absolute value 1.96 > 0.69, so only 1.96 matters.
[1.96 * (setting scale) + 0.5] <= 1, so 1.96 * (setting scale) <= 0.5, i.e. setting scale <= 0.5/1.96 (≈ 1/3.92).
(Then I actually changed the setting scale to 1/4 and re-generated; now the console shows this, so at least this VDM captures all the deltas:
Maximum displacement length: 1.7492835040878418
Largest pixel component value: 0.8653593650991566
Smallest pixel component value: 0.2020557989418419
Rendering tile 0...
Wow, I do not remember whether the genesis2 UV was UDIM type or not ^^; I need to retry (maybe I need to set up a UDIM UV to generate the VDM correctly for all parts, I suppose).
-
reporter If that’s the case, then you have, with scale 1:
- M + 0.5 == 1.96
- m + 0.5 == -0.69
But you need a scale “s” such that:
- M * s + 0.5 < 1.0
- m * s + 0.5 > 0.0
So, if you solve it, you need: (s < 0.34) and (s < 0.42), so any scale below 0.34 would capture the range. But you lose precision the smaller you make it, so you should try to make it as large as possible, in this case 0.32 or so.
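The solve above can be written out as a small Python helper. A sketch only; `max_usable_scale` and the safety margin are hypothetical, not part of the addon.

```python
# Sketch of choosing the largest usable VDM scale setting, following the
# inequalities above. Inputs are the pixel component extremes reported by
# the console at scale 1 (i.e. displacement + 0.5). The function name and
# the safety margin are assumptions for illustration.

def max_usable_scale(max_pixel, min_pixel, margin=0.95):
    """Largest scale s with M*s + 0.5 < 1 and m*s + 0.5 > 0,
    where M = max_pixel - 0.5 and m = min_pixel - 0.5."""
    M = max_pixel - 0.5   # e.g. 1.96 - 0.5 = 1.46
    m = min_pixel - 0.5   # e.g. -0.69 - 0.5 = -1.19
    bounds = []
    if M > 0:
        bounds.append(0.5 / M)      # from M*s + 0.5 < 1
    if m < 0:
        bounds.append(0.5 / -m)     # from m*s + 0.5 > 0
    # back off slightly from the exact bound to stay strictly in range
    return margin * min(bounds)

s = max_usable_scale(1.96, -0.69)   # about 0.325, near the 0.32 above
```

As the comment above notes, the smaller the scale, the more precision is lost, which is why the helper stays as close to the bound as the margin allows.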
-
reporter And yes, the Genesis 2 I tried didn’t have any UDIMs so everything overlapped. You need to separate the islands so they don’t overlap. Only in the horizontal direction (with the first tile on the left being the original tile), the .dll can’t deal with vertical UDIMs right now.
This shouldn’t affect the pixel components values though.
-
Xin, thanks, your formula makes things clear to me now. I did not know the 0.5 had to be counted in the formula.
And if the conversion I did had been correct, the 2nd VDM console output (max) should have been closer to the bound; but it showed 0.86, so I believed something was wrong.
Then, do you think, if I set s = 0.34, we need to change the VDM node scale value from 0.01 to 0.01 * 1/0.34? (I think at least we need to change the node scale to get the same deform delta in Blender units. If it works that way, I can easily set the setting scale and VDM scale (as default values for morph strength 1.00), and maybe in the future your add-on can auto-set it, so the user need not adjust the default value.)
UDIM is OK, I separated them manually ^^; (it must be needed, I understand).
(I may ask Thomas ^^; if he forgives me: auto-set UDIM tiles for non-UDIM figures using the template number, because daz already sets template groups as UV tiles, so we can check it in the surfaces, for Genesis 1 and 2. (But I do not know if the template groups of the materials are shown in the duf ^^;))
-
OK, I got it: https://sharecg.com/v/88140/view/21/DAZ-Studio/UDIM-Conversions-for-Genesis-Genesis-2
I hope when I import with Thomas's add-on it shows the UDIM UV for genesis2 with this UV set (basically it should work, I suppose).
(Thanks Vascania ^^)
-
Thank you Xin and Engetudouiti, I didn’t get that the HD addon can load dhdm files too. Now everything works fine, also I made it with the new test2 version.
As a side note here’s a little comparison as for speed on my pc:
- the diffeo plugin takes about 20 seconds to export mutation at subd 2
- the HD exporter by Donald takes about 3 seconds to export mutation at subd 2, see #191
- the HD addon actually takes about 85 seconds to generate the HD mesh at subd 2, then importing the dhdm takes about 10 seconds
-
Yes, I think it works well. (And when the setting scale = x, changing the VDM node scale to 0.01/x seems correct to me; at least it is more reasonable than keeping it at 0.01, to restore the detail deform.)
There are some ways to adjust the VDM strength as a morph strength value, e.g. use a vector math node and multiply with the VDM color (vector); but then we would need to change the midpoint of the VDM node, so I do not recommend it.
To keep things simple, use a multiply node so the user only changes the morph strength:
(morph strength) * 0.01 / (setting scale)
and do not otherwise touch the VDM scale (only change 0.01 to 0.01/setting scale).
I really hope now that Thomas's daz importer will auto-apply the shader nodes, by working with Xin's add-on.
(My G2 monster uses a usual displacement map too, but after all every generated value is a vector, so I suppose I can simply add the VDM + DM node output vectors and connect them to the Displacement input of the Material (Cycles).)
non VDM
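The scale bookkeeping from this thread can be put together numerically as a sketch; `node_scale` is a hypothetical helper, and 0.01 is the daz-cm to Blender-m conversion mentioned above.

```python
# Sketch of the node-scale bookkeeping discussed above: a VDM baked with a
# given setting scale must have its displacement-node scale compensated by
# 1/setting_scale to restore the original delta in Blender meters.
# Names are hypothetical, for illustration only.

DAZ_CM_TO_BLENDER_M = 0.01  # daz works in cm, Blender in m

def node_scale(setting_scale, morph_strength=1.0):
    """Vector displacement node scale that restores daz-unit deltas."""
    return morph_strength * DAZ_CM_TO_BLENDER_M / setting_scale

# Default bake (setting scale 1.0) keeps the usual 0.01 node scale:
assert node_scale(1.0) == 0.01
# A bake at setting scale 0.25 (range -2 cm .. 2 cm) needs 0.01 * 4:
assert node_scale(0.25) == 0.04
```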
-
reporter Most of that time is spent transferring the rig, which the other exporters don’t do, at least without breaking vertex order which makes the meshes useless for morphs (this is also the reason why the official daz exporter is relatively slow too). Also, this addon is exporting and importing twice, from Blender to .dll and from .dll to Blender. You can’t avoid this since you need to perform the subdivision in the .dll.
-
So I hope Alessandro will try the VDM for muta ^^; and show me how close it looks to daz.
(I am a little exhausted from setting up nodes right now ^^;)
-
Thanks Xin, that is really helpful information for users.
-
Xin, yes the HD plugin is essentially different as for the HD mesh because it uses true HD instead of multires, I’m aware of that. The speed comparison is just to get the idea of what to expect. Personally I’m excited and I believe it’s a great job you’re doing here. Thank you so much.
I can’t wait to test the new version with the weight smoothing.
-
reporter I believe that the rigged mesh from the official daz exporter would work fine with the other HD operators in this addon, since it has the right vertex order too. You can use that one as an alternative, but not the others (at least if you want a rig), since they are either unrigged or rigged but with the wrong vertex order so you can’t import morphs to them.
To get a better idea of speed difference, try exporting an HD mesh from daz with whatever method you choose, then add the import time in Blender. Then compare with this addon when not transferring rigs. Still not the same since this addon still has to export/import twice and execute stuff in the .dll, but will give you a better idea of what speeds are possible.
-
reporter Also Alessandro, is the HD addon transferring materials to the HD mesh? because it should, but in your screenshots it’s gray. The materials should be the same ones used in the base mesh. If they are being duplicated by the HD addon that’s also wrong. Check also if rig properties that drive bones work fine too. All such rig properties from the base rig should be in the HD rig too.
-
Xin, the smoothing works great. Below is G8F generated as an HD mesh with smoothing; no more ripples there. As for materials, I can confirm the HD mesh shares the same materials as the base mesh. I'm using gray pictures just because they show the geometry better.
As for rig properties, in my tests face expressions are transferred fine, while jcms are not. Below is an example where I imported the elbow jcms for G8F. In the base mesh the elbow bends fine and jcms are applied. In the HD mesh jcms are not applied.
As a side note, it seems to me that applying multires to the diffeo HD version gets a rigged HD mesh, the same as generating a new HD mesh in the HD addon, with the difference that diffeo is faster. The applied multires gets the same issues too, that is, we need weight smoothing to avoid ripples. Unfortunately we can't apply multires if the base mesh has morphs, so this idea is not very useful.
-
reporter That’s fine, the Shape Keys aren’t supposed to be transferred automatically, although it could be done later. Right now, you have to use the operator “HD Shape Key from base mesh shape”. Or, alternatively, just select the JCM .dsf file with the “Import HD morph to HD mesh” (after setting Morph subdivisions to match the subdivisions of the HD mesh). Probably the best option.
Just in case, to use “HD Shape Key from base mesh shape”:
Leave the base mesh in its rest pose (otherwise the pose will be baked into a Shape Key too), and dial the shape key you want to export, leaving all the others at 0. This can be done quickly with the pin icon in the shape keys panel. Then use the operator and it should work.
The problem with applying the multires is that the vertex order of the resulting HD mesh doesn't match daz's HD, so morphs don't work. That's the main problem here and why you need to either generate the HD mesh with the .dll and transfer the rig with data transfer, or export the rigged HD mesh with daz's official exporter. The other exporters export either an unrigged HD mesh or a rigged one with the incorrect vertex order (since they use the multires to get to the base shape and rig the base, but the multires breaks vertex order).
Just in case, see if you can find an alternative way to match the vertex order of the mesh generated by the .dll while still ending up with the vertex groups/rig (the daz bridge satisfies these conditions, but it’s not faster in my experience).
In the end, a .collada exporter might still be the best solution, but it will take longer to implement.
-
Xin, thank you for the nice explanation. I can import the dsf morphs for the elbow, but they will not get a driver to be applied with the elbow rotation. But this is the same for maps too, which need drivers to be applied. So far so good then; everything works fine at this stage.
Sure, I'll keep thinking and let you know if I get something. I'm not that smart, but you never know, I may be lucky.
-
Thomas, please let us know what you think of this work by Xin, and if/how do you see an integration with the plugin.
-
What I would like to confirm with daz is how daz actually applies the two HD morph files (xxxxhd.dsf + xxxxhd.dhdm): to the base mesh, or to the sub-D-applied mesh.
If the dsf deltas always deform the mesh (with no relation to sub-D), it makes things simple. (With drivers set, it may not be so simple though.)
But I am afraid daz might apply only the dhdm deltas once sub-D is applied. If it works that way, I would need to remove the dsf deltas (subtract them) when sub-D is on, so the VDM or shape key built from the dhdm deltas can be added correctly.
When I check xxxxHD.dsf, as I mentioned, it describes the dhdm file path like this:
"hd_url" : "/data/DAZ%203D/Genesis%202/Male/Morphs…../XXXXXXXX_HD.dhdm"
So when we apply sub-D, if daz simply switches the morph path from dsf to dhdm, the dsf version (which only describes base deltas) would be ignored.
And I suppose the dhdm in the end describes deltas for all sub-D verts (including the original base verts, though the vertex count changes).
So it is not clear to me whether the sub-D version still adds the xxxxxHD.dsf deltas or not. The two possibilities:
- daz applies XXXXhd.dsf to the base; then when we set sub-D, ds subdivides the mesh (which already includes the dsf deltas) and adds the xxxxxhd.dhdm deltas for all sub-D verts on top.
- daz applies XXXXXhd.dsf to the base (so even while we keep the base, the HD morph controller clearly deforms it), but when we set sub-D, ds does not count the dsf deltas anymore and only uses the xxxxhd.dhdm deltas.
I cannot check this from the ds controller, because we only get one HD morph controller, which relates to two files describing different deltas (and vert counts).
If I really wanted to confirm it, I could edit the dsf version to have empty deltas; if the shape stays the same after applying sub-D, that would mean ds ignores the dsf when sub-D is set (only the dhdm would be calculated).
The difference can easily cause issues if we use it for JCM or MCM. (Even though I only use HD as full morphs, it affects how morphs mix correctly (e.g. when I mix creature HD) once sub-D is set.)
It may matter a lot if Thomas or Xin or we manually add drivers for VDM or shape keys. (I actually do not care much myself, but users may expect the same result as daz studio, and it can drastically change how the mesh deforms. Usually MCM and JCM are for detail adjustment, so if they work wrongly it causes serious problems; though I actually have not checked HD MCM or JCM, nor HD expressions, which may often be controlled by other morphs or bones.)
-
reporter The .dhdm ONLY has information for subdiv 1+. Nothing for base deformation (level 0), that’s all done by the .dsf. The formula controls both at the same time, using the same output to control their influence. That formula is the formula contained in the .dsf file.
The .dsf is ALWAYS applied, because it's by far the most influential deformation, done at level 0. The .dhdm deltas are finer in scale, since they operate on the smaller faces of subdiv 1+.
So daz doesn’t ignore the .dsf later. The .dsf is always applied when the .dhdm is applied. The .dhdm has no influence on level 0 data.
-
Xin, yes, as you say, dhdm only describes information for the subdiv 1+ mesh, not deltas for the base mesh verts.
But what I meant is that it could include all vert deltas for the current sub-D. E.g. at base we may have verts 1, 2, 3, 4 for one polygon (so the dsf describes 4 vert deltas).
Then when we apply sub-D the vert count changes to 9; I meant that all 9 vert deltas could be described in the dhdm (rather than the dhdm describing only the new verts' deltas).
But I understand what you mean; basically I suppose it is as you said.
I may actually test it with a no-delta dsf version (swap it in, and check how the shape changes with sub-D + HD morph value at 1.0).
-
reporter The deltas are added over the deltas of the previous level. They are deltas relative to the previous level, not to the base. So thinking about deltas relative to base is wrong. You always need the base deltas (of the .dsf) whenever you use .dhdm data or the final deformation will be wrong.
final vertex position = rest position + d0 + d1 + d2 + d4 + …
- d0 is in the .dsf.
- d1, d2, …. are all in the .dhdm (the .dhdm can skip levels here, you don’t need deltas for all levels).
You can’t add d1, without d0. You can’t add d2 without d0 and d1. You can’t add d4 without d0, d1 and d2. Conclusion: you always need to apply .dsf deltas whenever you apply .dhdm.
This part is in “…/dll_source/subdivide.cc”.
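The level-by-level accumulation above can be sketched in Python. This is a toy illustration only: in the real format each level's deltas live on that level's own vertex set, and `apply_hd_deltas` plus the sample numbers are hypothetical, not the actual code from subdivide.cc.

```python
# Toy sketch of the formula above: final = rest + d0 + d1 + d2 + ...
# d0 comes from the .dsf (level 0); higher levels come from the .dhdm.

def apply_hd_deltas(rest_position, dsf_delta, dhdm_deltas, target_level):
    """Sum the base (.dsf) delta and every .dhdm level up to target_level.

    dhdm_deltas maps subdiv level (1+) to a delta vector; levels may be
    skipped, but every stored level at or below the target is applied.
    """
    x, y, z = rest_position
    dx, dy, dz = dsf_delta  # level 0 delta, always applied
    for level in sorted(dhdm_deltas):
        if level > target_level:
            break  # deltas above the current subdiv level are ignored
        lx, ly, lz = dhdm_deltas[level]
        dx, dy, dz = dx + lx, dy + ly, dz + lz
    return (x + dx, y + dy, z + dz)

# Example: base delta plus levels 1 and 2; a level-4 delta is ignored at subdiv 2.
pos = apply_hd_deltas(
    rest_position=(1.0, 0.0, 0.0),
    dsf_delta=(0.1, 0.0, 0.0),
    dhdm_deltas={1: (0.01, 0.0, 0.0), 2: (0.001, 0.0, 0.0), 4: (9.0, 9.0, 9.0)},
    target_level=2,
)
```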
-
Xin, I understand what you mean, but I did not think of the morph deltas as relative to the base (and did not say so). I meant morph delta = all vert deltas of the current sub-D mesh (with the dsf only describing the verts of the un-subdivided mesh).
Then I think, if it were made that way, the dhdm could describe all the information needed to get the same creature shape at a given sub-D level, even without using the dsf.
But I understand you are the one offering this good tool, and you have already confirmed it with your own checks, so I understand you are right.
Why I considered case 2: my creature product uses 3 morph files to represent the creature HD:
xxxxxLD.dsf + xxxxxHD.dsf + xxxxxxHD.dhdm.
When I apply the creature morph, it shows 2 controllers with value 1.0 (xxxxxxLD = 1.0, xxxxxxHD = 1.0) as visible properties in daz.
Actually I can easily modify it so xxxxxHD.dsf includes the xxxxxxLD.dsf deltas too; that means I could also modify xxxxxxLD.dsf to include xxxxxxHD.dsf, in which case we would not need xxxxxxHD.dsf at all (though daz needs xxxxxxHD.dsf to describe the dhdm path).
So I just supposed there might be a case where daz uses only xxxxxLD.dsf and xxxxxHD.dhdm when sub-D is set above 1. Though actually I do not think so; as you said, daz may use all files with sub-D > 1. But I cannot say that daz actually does.
-
reporter Is “xxxxxHD.dsf” an empty .dsf file without morphs, just holding a reference to the .dhdm?
I'm assuming what is happening is that the regular .dsf file was split in 2 for some reason: xxxxxLD.dsf contains the base deformation, while xxxxxHD.dsf contains no deformations but a reference to xxxxxxHD.dhdm. If that's the case, then xxxxxHD.dsf is basically operating like the .dhdm.
In any case, loading “xxxxxLD.dsf” is not a problem, it’s just a regular morph. It’s not an HD morph. So you load the two .dsf with the import_daz addon, then use the .dhdm for HD detail. It’s the same as usual.
.dsf files are governed by their formulas.
.dhdm files are governed by the corresponding formulas in the .dsf files that reference them, AND the current subdivision level. This last part doesn’t need special code like .dsf formulas, deltas for higher levels than the current level are naturally ignored when using the opensubdiv api to perform subdivision.
-
For the product (the G2 creature I posted a pic of for the VDM test), xxxxLD.dsf and xxxxHD.dsf both actually describe all vert deltas for the base. So the vendor would not have needed to include the LD.dsf deltas in HD.dsf, and I just considered the possibility of such a case (though I did not believe it; only if daz swapped dsf for dhdm when sub-D > 1 is applied).
At the same time, I perfectly understand what you mean:
base + f0 (base) (f0 described in the dsf)
subd(base + f0) + f1 (subd 1)
subd(subd(base + f0) + f1) + f2 (subd 2)
So the dhdm only needs to describe f1 and f2 (and I think daz actually does so).
What I meant is that even without describing f0 for the base, if we modified f1 and f2, the dhdm could describe everything for all sub-D levels.
E.g. subd(base + f0) + f1 could be changed to subd(base) + f1a (for sub-D 1), because all the delta information for the currently subdivided verts can be described in f1a.
So I was asking whether daz does it that way… (But I think the main reason is that recently I often watch BBC dramas on Netflix, and maybe I just wanted to talk like a detective ^^;)
-
reporter Oh I think I understand better what you meant now.
Try using the following setting in the vector displacement operator: “Starting subdiv level = -1”. Then select a .dsf file (with HD morph or not) as the morph file. You will get the base deformation in the vector displacement map too, on top of the .dhdm deformations if any. This is not recommended because the base deformations are usually bigger in scale so scaling issues arise.
-
I actually tested 2 cases (generating from dsf, and from dhdm).
Case 1: generate the VDM from the dhdm file. I keep the values (xxxxHD and xxxxLD) at 1.00, export as base, then add the VDM.
Case 2: generate the VDM from the dsf. I remove xxxxHD (set it to 0.0) for the base, import it with the Daz importer, and set it as the imported mesh for HD import (to generate the VDM).
They should show a difference at base level, but when I apply sub-D (usually just adaptive sub-D) they show reasonably the same shape.
I would prefer the dhdm version, and it basically works better I suppose (because the dsf deltas are already baked as real mesh deformation without displacement, for such a creature FBM).
But when I use HD JCMs or HD expressions and set drivers, I may need to consider which is better, because these are not FBMs, so the value must change conditionally. If I generate the VDM from the dhdm, I need to import HDmorph.dsf with Thomas's add-on, then set the same driver on both the HD morph controller and the VDM node scale.
If I use case 2 (though it may not offer exactly the same quality as case 1), I should not import the dsf controller (because all deltas (dsf + dhdm) are already included in the VDM); I only set a driver for the VDM node scale value.
These things may need consideration when Thomas adds a function that uses Xin's HD importer (VDM), generates the nodes, and sets the drivers. I suppose the user should decide which workflow is good (or Thomas needs to consider both cases).
My preference is case 1 (import the HD dsf as a shape key, generate the controller labelled HD) + generate the dhdm deltas (as VDM), then always set drivers for both (as a Daz importer option).
===
I would request this as an option for importing HD morphs with VDM (Cycles only) in the DAZ importer:
- when I import an HD morph (MCM or JCM type) with the option, it imports the HD morph (dsf only) as a shape key with a driver, same as before
- it auto-generates the VDM from the dhdm only (with a setting scale that covers all deltas, for the best result), and generates a displacement map node with the scale set for a strength of 1.00 (I may offer a node group image which can easily be added with the current material options; the user would only tweak the strength value (or it would be driven by another prop), with the same driver set for the mesh)
- at the same time it changes all material settings to (bump + displacement, or displacement only)
- if there is no sub-D modifier, ideally it would auto-apply one too
Then the user would only need to turn adaptive sub-D on and set the render feature set to experimental.
The only thing I worry about with VDM: I still have not tested it with materials that use a normal map for the base ^^; (which includes my favorite G3 character), so I am worried about it… (I still hope to keep the normal map effect; I have actually used normal (tangent) RGB maps rather than bump (B&W) maps for custom materials etc.)
Ideally it would be better to offer a switch so the user can easily disable the current VDM effect (material settings etc.) via a node switch to regain performance, and only activate it when we want to check the dhdm effect. ^^;
(I only describe the VDM options, so other users may need to request their own options. I suppose that to use VDM as an HD morph, we need not import as HD (so no multires, only a sub-D modifier).)
-
And I just confirmed whether VDM node scale = 1.00 means 1 m of delta or not (though I supposed it should).
So the default is 0.01 (when we use VDM scale setting = 1). Then, when Xin's importer shows the console value for the VDM generation (call it mscale):
VDM node scale = 0.01/mscale is the logically correct value (to represent the HD morph at 1.00, for the dhdm deltas).
What I did not know: I simply thought R = U (UV space), G = V (UV space), B = normal for the tangent-space VD node, but at least with a test plane (subd smooth) I could confirm G = normal delta for Blender's tangent VD.
(I do not know what R and B represent then; they do not show what I expected ^^; but I just trust that the add-on already converts the morph delta vectors from daz object space into Blender tangent-space VDM vectors correctly.)
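As a minimal sketch of the arithmetic above (the helper name `vdm_node_scale` is mine, and `mscale` stands for the value the add-on prints to the console; both are illustrative assumptions):

```python
# Sketch of the Vector Displacement node scale arithmetic discussed above.
# Deltas are expressed in daz units (cm); Blender works in meters, hence 0.01.

def vdm_node_scale(mscale, morph_strength=1.0):
    """Node scale that reproduces the morph at the given strength."""
    return morph_strength * 0.01 / mscale

# With the default setting scale of 1.0 the node scale is simply 0.01;
# if the addon recommended mscale = 0.3, the node scale becomes 0.01/0.3.
default_scale = vdm_node_scale(1.0)
rescaled = vdm_node_scale(0.3)
half_strength = vdm_node_scale(0.3, morph_strength=0.5)
```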
-
reporter The last version (test4b above) already shows in the console the final scale you should use in the Vector Displacement node, based on the selected scale (0.01/scale). It also shows the “optimal” scale that would maximize the range and give you the most precision (with a midlevel of 0.5).
-
Thanks. So maybe Thomas can pick up those values if the add-on auto-generates the nodes (though I do not know when Thomas plans to make the Daz importer work with Xin's HD exporter).
(And I understand that, at least for now, the add-on needs to generate the HD map with the default scale first to get mscale; then the user sets it accordingly.)
Looking at Alessandro's muta HD images and testing with the G2 creature, I felt we may often need to change the scale setting to capture all vectors (when it is an FBM HD morph). Vendors may want to show a clear effect when sub-D is added to HD content, so deltas in the dhdm larger than 0.5 cm seem usual for creature FBM HD (maybe not for characters though).
Anyway, to make things work automatically needs discussion between Xin and Thomas, so I will just add requests and some user-side reports when Thomas plans it. For a while I plan to just add a simple script which auto-sets the material to displacement when there is a displacement map; currently the daz importer seems to set bump only, even when there is displacement (not VDM) in the G2 creature materials, which I could confirm.
(The G2 base has more than 15 materials I think, so changing the material setting one by one, and manually adding a disp node for every material I use, is double trouble ^^;)
-
repo owner I haven’t followed this discussion, but I tried to use test4b today. I keep getting runtime errors, though, but that is because I don’t really know what I’m expected to do and try things at random. A short step-by-step tutorial would be appreciated.
-
Xin, the vector displacement doesn't seem to work fine for mutation. In the first attempt I was warned by the plugin that the maps were out of bounds and to rerun with 0.3 scale, so I did. The suggested scale for the vector displacement node was then 0.03, so I used that. Below are the renderings at subd 2: first iray for reference, then cycles without vdm, then cycles with vdm. I removed the diffuse textures to better show the geometry surface.
Please note that cycles without vdm, that is, with only the base mesh morph and bump maps, may be good enough if we don't need strong accuracy.
I imported only the dhdm to generate the vector map, since I understand this is the right thing to do given that the base mesh morph is already baked in.
In the vdm version there are bumps in the torso that are not present in the iray version; also the "bones" in the back don't have the same shape. It looks like the vdm may be misplaced, but I used tangent space with the uv map vector.
-
Thomas, get test0.7z for a nice guide included, then test4b.7z for the last update. Also, don’t hide anything in the scene while using the addon, or it may not work.
-
Did you set the material displacement method in the material properties to displacement and bump, set adaptive sub-D on the sub-D modifier, and change the render feature set to experimental (it works with CPU and GPU)?
(I would recommend first setting Material > Settings > Displacement to "Displacement Only", removing bump in daz and iray, then comparing.)
I will test the same thing with the G2 creature, using a white material only, and see whether it shows a difference for me or not.
-
Thank you engetudouiti for your help.
I set cycles to displacement only, since I understand bump maps should not contribute in iray. Likewise, no adaptive subdivision, since iray doesn't get any. The mesh is at subd 2, the same as iray. Please note that displacement does work: I can change the scale and midlevel and see it affecting the mesh. It just doesn't get the same result as iray.
Going to try various options to see if it gets better.
edit. Tried starting subd -1 for the vector map, and "displacement and bump" for cycles, but no noticeable improvements. Then midlevel 0 instead of 0.5 seems better, but it's not the same as iray either.
-
Actually, I approve of your (Alessandro's) test with muta and VDM compared against the iray version (that is why I requested it), because muta shows more clearly how VDM works with large deformations.
At least I can confirm that the scale option (in VDM) and the add-on's recommended setting for the generation scale are correct, and the midlevel should be 0.5 (keep it there when you compare).
If it does not work, it is a matter of how the VDM is generated from the dhdm morph; you should not change those values on the vector disp node. I may test again with a white material only.
(And the first attempt showing "out of bounds" is a good thing (we may all experience it): it tells us we need to change the setting scale from 1.0 to the recommended one, to catch all deltas and pack them as RGB color (0 to 255).
With 1.0, it only catches deltas from -0.5 cm to 0.5 cm as 0 to 1.0 RGB. So we scale the morph deltas, pack them into the VDM RGB, then undo the scaling with the VDM node scale.)
-
Xin, I’m also wondering if we should use catmull-clark or simple for the mesh subdivision with vector maps. I’d say catmull since it is what iray uses, but may be this also depends on how the vdm are baked.
As a side note, if we could use simple it would be better for speed, especially at high subd.
-
reporter Alessandro, that looks like your textures are low res or you misplaced your textures. Either Blender is clamping your texture due to some setting or you baked at low res or you are not plugging them correctly. Try with at least a size of 2048.
Also make sure you are plugging the textures correctly. The right tiles must be in the right material. Also make sure your UVs are spread like UDIMs.
Here is what I get with 3 subdivisions with 2048 textures:
Left is the generated HD, right is the base mesh with vector displacement maps.
CORRECTION: the left one is the generated HD with displacement maps on top, so it's a bad comparison. The correct comparison is below.
-
reporter Also Daz matches Blender’s catmull-clark’s subsurf with limit surface disabled perfectly.
-
reporter Actually I did a mistake, the one on the left above was using displacements too, on top of it being true HD.
Here is the correct comparison. The match at 3 subdivisions is great with only 2k textures and no adaptive subdivision (same as above, left is true HD, right is base with vector displacements):
-
reporter Thomas, as Alessandro said, download the test0.7z and read the “notes” subdirectory. It contains a walkthrough.
-
Thank you Xin, that was it. I had to spread the udims to make it work, because the mutation uvs are collapsed onto tile one. I don't understand why we have to do this though. I mean, collapsed uvs should work fine for vdm as for any other texture.
Plus, if spreading the udims is required for vector maps to work, then it may be better for the addon to warn the user if collapsed uvs are found, and/or to add some notes to the docs about this.
edit. Never mind, there is a note in the docs about udims. Though there's no mention of the "udim from textures" tool that would be useful. And I still don't get why collapsed uvs shouldn't work the same. But I admit I have a limited comprehension of vdms.
-
reporter Oh I completely forgot about issue
. I used that feature that Thomas implemented recently, but I forgot that it was for this mesh. Keep in mind, you also need to move the eyes there, which are overlapping the face. Move them to an empty tile (or a tile with things you don’t care about) on the right.#378Here are some tips to make this quicker: use face selection in UV mode (press 3 in the UV editor), then select an island by hovering over it and pressing L, and finally move it by pressing G + X + [number of tiles to move it].
You have to do this for anything that involves baking. For example, you also need to do this for normal maps, or, if you ever used a game engine, for so called “light maps”. The reason is that when rendering the “bakes” onto textures, if the UVs overlap, the information overlaps. Think of it like projecting a ray from the 3d space to the surface of the model: you need to map a position in 3d space to a texture. If the UV coordinates of the surface overlap, multiple 3d space positions on the surface of the mesh are mapped to the same pixel in texture space, so the information from the ray is recorded on top of previous information.
-
That's the part I don't understand. Why does the information overlap? The uv space is the same but the vertices are different. That is, each "overlaid" uv map gets its own vertices to be applied to. That's how uvs always worked before udims.
If we were to bake all the vertices into the same texture, I understand they would overlap. But we can bake to different textures even without udims, just looking at the materials for reference. Oh… I get that it can be hard to understand what the separate textures are without udims, just looking at the materials. So yes, udims come in handy for baking. It is just not so nice that we need to change the daz uv map, but I guess it's a dirty job someone has to do.
-
reporter Yes, vertices are different, but how do you go from 3d space to 2d texture space? what you do is look at the vertex’s UV coordinates, then record the information on a texture at those coordinates. See the problem? if multiple vertices map to the same UVs, their information overlaps.
The reason why materials work (UDIMs or not), is because you are sampling different textures.
Example:
Suppose you are baking onto a texture T of size 2048 x 2048, and you want to go from vertex A to texture space. Vertex A has UVs (u1, v1), so you put the information on the pixel at (2048 * u1, 2048 * v1).
Now suppose vertex B also has UVs (u1, v1). What happens? you end up overwriting the pixel from vertex A.
Now this is what happens when rendering: material A, which contains vertex A, samples texture TA with UVs (u1, v1) at the position of vertex A, so you get the pixel at (2048 * u1, 2048 * v1) from TA. Now material B, which contains vertex B, samples a different texture TB with the same UV coordinates, but the pixel is different because the textures are different.
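A toy version of this in Python (a hypothetical `bake` helper, not the addon's code) makes the overwrite concrete:

```python
# Toy bake illustrating why overlapping UVs clobber each other: two vertices
# with the same UV coordinates write to the same pixel of one texture.

def bake(vertices, size=2048):
    """vertices: list of ((u, v), value). Returns {pixel: value}."""
    texture = {}
    for (u, v), value in vertices:
        pixel = (int(size * u), int(size * v))
        texture[pixel] = value  # a repeated pixel overwrites earlier data
    return texture

# Vertex A and vertex B share UV (0.25, 0.5): B overwrites A in a single bake.
tex = bake([((0.25, 0.5), "A"), ((0.25, 0.5), "B")])
# Baking to a separate texture per material avoids the collision, which is
# why materials render correctly even with overlapping UVs.
tex_a = bake([((0.25, 0.5), "A")])
tex_b = bake([((0.25, 0.5), "B")])
```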
-
I suspected a UDIM issue ^^; because the image Alessandro posted is far from the quality I saw with the creature.
But at the same time I should note: if we change the setting scale to 0.3, the detail precision drops by a factor of 0.3 as well, and the generated VDM may not work well on surfaces with hard angles.
E.g. the bone lumps on muta's back, and my creature's shoulder lumps with hard angles, do not look quite the same as daz with VDM.
The reason I only test with adaptive sub-D: I think daz iray HD uses the render sub-D when rendering with iray. So even if you set sub-D to 2, the rendered image uses the render sub-D (which you may have set to 3 or 4 etc.).
But if I remove the VDM (scale = 0.00) I lose so much detail from the mesh. So I cannot live without VDM (if I use HD morphs without importing a high-resolution mesh).
with VDM scale set to 0 (dhdm deltas removed) but the same resolution (adaptive sub-D)
with scale set to default scale / VDM setting scale
I can find some weakened deformation (mostly for negative deltas, I feel), but it still shows a reasonable dhdm effect.
And I know that if I really want to check all the detail, I had better export the high-reso baked mesh and compare it with the render.
-
About the most deformed parts (near the max delta of the current setting): I feel they may cause jaggy areas (and they seem to show up around the texture center point, I feel) at full strength.
But with this pic you can confirm the quality of the VDM. I suppose that if I set the sub-D in blender exactly the same as the render sub-D I currently use in daz, it would show an even more similar shape, but I do not want to set sub-D above 4 in blender (actually I never use it, I feel). So for me, keeping adaptive sub-D is the only practical option for working with VDM in blender.
-
reporter Try with non-adaptive subdivision and a subsurf modifier, and disable limit surface.
How does the base mesh without subdivisions look?
For the second one, check if there is a triangle in those spots (in the base mesh).
Also try with 4k textures. At least for 1 tile.
-
Xin, also we need to select the uvmap to be baked. In my first test I was baking with "base female", which is selected by default. So I got separate textures but with the wrong mapping. Then if we select "mutomorph female uv" without spreading udims, we get a single texture with overlapped information, as you explained.
It may be useful to add to the docs that the right uvmap has to be selected before baking.
-
reporter Yes, that’s the same for Blender’s .obj export, which only exports 1 UV map, the selected one.
Keep a list of stuff to add to the notes in a file somewhere, so whenever we update the notes we don’t forget anything.
-
About my pic: yes, I should use a 4K map if I really want to capture all deltas with that scale setting (for this image I needed to use 0.3, so I suppose a 4K map would be better). But the more 4K maps I add (as RGB 16-bit png), the longer the render takes, so I may not test it until the nodes are auto-generated, I suppose.
The base mesh is perfectly the same for me when I remove the scale and set blender sub-D to 0 (as it should be).
About the jaggy center point of the nose: actually those are not triangles, but as you supposed, the mesh flow is not good in the base (which already includes the HD dsf deltas). Some polygons are almost overlapping (burying other polygons) and the nose line is not mirrored (so the center line is slightly corrupted).
That may have an effect when generating the VDM from the deltas of the base mesh, I suppose. (But to fix it, I would need to edit a base mesh that is perfectly the same as the daz base with the HD morph applied.)
-
Here is another example that may not be entirely obvious.
We can use the displacement maps as geometry displacement; this way it works both with cycles and eevee. Below is an example with eevee where I applied the displacement map to the base mesh.
It is also worth noting that, since we bake from the dhdm, which is essentially vertex based, we can also remap the geometry and bake onto our own custom uv map. As for udims, we don't necessarily need multiple tiles; we can use a single tile as long as the involved geometry doesn't overlap on the uv map. That becomes a single displacement texture for the whole figure.
-
reporter Also around the weekend I will focus on finishing the operator to generate vector displacement maps for arbitrary meshes (not just daz morphs).
Also, I will look into this: https://docs.blender.org/api/current/bpy.types.bpy_struct.html#bpy.types.bpy_struct.as_pointer
This might get rid of the need to import/export stuff and just interface with Blender directly. But I have to read more and it will take time to implement.
Alessandro, if you are using Vector Displacements in the Displacement modifier, be careful, those won’t work right for cases where the vector is not along the normal. For the Displacement modifier, you want the Grayscale displacements that you can generate with the Normals operator (select the option “Grayscale displacements” instead of the default “Normals”). These aren’t as accurate as vector displacements, but there is no modifier that can handle tangent vector displacements right now (non-tangent ones aren't good for rigged stuff).
-
Yep I’m using the grayscale displacement. Thank you so much for the new update.
As for memory sharing, I'm not sure I see the advantage over exporting a file. I mean, for very large assets exporting a file may be better to save memory. Also, as for the exported files: for diffeo, Thomas used obj and dae at first, then he made the dbz exporter to get just the data he needed, and it was simple and much faster. I don't know if a similar approach would work for you.
-
reporter - attached test4d.7z
Reuploading with a small fix to handle zero displacement.
Vector displacement operator now takes a "midlevel" input too.
Now the console output will suggest an optimal Midlevel and Scale.
To get old behavior, just leave midlevel at 0.5 and only read the "recommended scale when midlevel is 0.5" from the console output.
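The console suggestion can be understood as a simple remapping problem. Assuming Blender's usual decoding `displacement = (pixel - midlevel) * scale`, the optimal settings squeeze the delta range exactly into the [0, 1] pixel range; a small sketch (the function names are illustrative, not the addon's):

```python
def optimal_midlevel_scale(d_min, d_max):
    """Choose Scale and Midlevel so the delta range exactly fills [0, 1].

    Blender decodes a displacement pixel as (pixel - midlevel) * scale,
    so we encode as pixel = delta / scale + midlevel."""
    scale = d_max - d_min
    midlevel = -d_min / scale
    return midlevel, scale

def encode(delta, midlevel, scale):
    return delta / scale + midlevel

def decode(pixel, midlevel, scale):
    return (pixel - midlevel) * scale

# Example from a comment below: deltas in [-0.2, 0.7].
mid, sc = optimal_midlevel_scale(-0.2, 0.7)
print(round(sc, 4), round(mid, 4))  # 0.9 0.2222
# The extreme deltas now map to the ends of the pixel range:
print(round(encode(-0.2, mid, sc), 6), round(encode(0.7, mid, sc), 6))  # 0.0 1.0
```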
-
Xin, this is a really important update, thanks. I was actually thinking of requesting it, because we may see min = -0.2 but max = 0.7, etc. If we keep the RGB midlevel at 0.5 we need to change the scale, but if we change the midlevel, the scale can stay at 1.00. So now the VDM can use the RGB range more effectively.
Then, to get more quality, as a long-term request: it would be nice if the settings could be set per UV tile. I suppose there should be a clear difference between UV tile parts (max and min delta), like the face part and the body part. So ideally: get the min and max delta for each UV tile's VDM, adjust the scale setting for each tile >> generate the VDM >> change the VDM node scale for each tile.
That seems to offer the best quality (unless it causes seams where UV tile parts connect). But I don't have a clear view of how the user would control this from the UI. ^^;
And if there will be an option to choose OpenEXR, I would like to test it.
-
update. Krampus of the daz forum pointed out that we can use vector maps instead of grayscale maps with the displace modifier. I didn’t notice the “rgb to xyz” option.
-
I noticed it, but I could not get the same effect with the test .exr ear that was offered (it actually generates a real ear from just a plane; it is a Maya test VDM file, and it works in Blender without problems, though the midpoint needs to be set to black).
You set the space to Local. Is that the right setting for a tangent-space VDM? I don't know if Krampus could show a real case comparing a VDM in the shader with the Displace modifier, but at the moment I don't think the conversion is the same as a VDM. So you may get deformation, but it may not be the same as the VDM in the shader. (I suppose that if we edit the texture with vector math nodes and save it as a new image, there may be a way to manage it.)
By default it doesn't work for me.
SubD already applied + SubD 4 with VDM (midlevel = 0, scale = 0.25).
Remove the displacement from the node setup and apply it in the modifier.
The midlevel should be 0.0 (because that keeps the plane position), and I set the same scale value. If I use a normal displacement map it shows the same effect, so I think the VDM does not work in displace mode.
-
reporter RGB to XYZ in the modifier doesn’t work because it’s a fixed space, in this case LOCAL (similar to global). This doesn’t work because the generated maps are in TANGENT space. The vertices of a rigged mesh are constantly being transformed, so you need a space that adapts to the pose: the tangent space, the same space used by normal maps.
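A minimal numpy sketch of the difference (the TBN construction here is illustrative, not the addon's code): a tangent-space delta is expanded through the per-vertex tangent/bitangent/normal basis, which follows the deformed surface, while a fixed Local space does not.

```python
import numpy as np

def tangent_to_object(stored_delta, tangent, bitangent, normal):
    """Expand a tangent-space displacement (as stored in the map) into
    object space using the per-vertex TBN basis. The basis follows the
    deformed surface, which a fixed LOCAL space cannot do."""
    tbn = np.column_stack([tangent, bitangent, normal])  # 3x3 basis matrix
    return tbn @ np.asarray(stored_delta, dtype=float)

# With an axis-aligned, rest-pose basis the mapping is the identity:
d_rest = tangent_to_object([0.1, 0.0, 0.2], [1, 0, 0], [0, 1, 0], [0, 0, 1])
# Once the surface rotates (e.g. posed 90 degrees about X), the same
# stored pixel must point somewhere else in object space:
d_posed = tangent_to_object([0.1, 0.0, 0.2], [1, 0, 0], [0, 0, 1], [0, -1, 0])
print(d_rest.tolist())   # [0.1, 0.0, 0.2]
print(d_posed.tolist())  # [0.1, -0.2, 0.0]
```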
-
reporter An image to explain why that doesn’t work:
-
Yes, what I supposed is: Blender's displacement output node. I suppose Blender's VDM displacement node actually converts the texture RGB (the tangent-space vector: U, V and normal, though I only confirmed that G = normal for VDM textures), and then that vector is converted to a different coordinate system and used for the displacement.
So if we could bake the shader (VDM node) output vector to a texture and use that for displacement, I expect it would work (without having clearly confirmed it).
But I actually don't know how the Blender Displace modifier space works (RGB >> XYZ while keeping the UV coordinates). The modifier doesn't offer any way to customize the RGB components, so we can only input a real texture directly, or we may need to use a procedural texture in the texture properties. I may check with red, blue, and green textures on more distinctive polygons, with the RGB >> XYZ option plus coordinates set to UV. But basically, by default it never works.
-
Yes, I see. We cannot directly use the shader's converted vector for the modifier.
-
reporter - edited description
-
Thank you Xin and Engetudouiti for the tests and explanations. I did a test myself and didn’t go well either. I’ll update the daz forum though there’s already a link here.
https://www.daz3d.com/forums/discussion/474511/hear-ye-hear-ye-hd-morphs-for-diffeo
-
Thanks Xin, it adds a good option for PBMs (for now I only consider using VDMs for that purpose, not going further, e.g. using them as HD JCMs).
I found that the old G2 creature morphs seem to include a few HD morphs, which can be mixed with HD morphs from other creature products, so I hope to use that to get variation. (I may either generate a few controllers to mix in Blender, or generate one baked HD after Thomas adds a shader node for VDM HD.)
-
Most users may not need to consider this, but I remembered something important: why we may be better off not setting the HD .dsf path when generating a VDM (which would then include both .dsf and .dhdm deltas). For that to work, we need to export at base resolution without the HD morph (.dsf and .dhdm) applied (i.e. set the HD morph to 0).
But most HD .dsf files include bone transform deltas too, so at the moment (and in the long term) we will never get those to work by changing the morph value in Blender. So we hope to bake them at import time.
So in the end: we set all HD morphs in Daz Studio at base resolution (so the deltas of the HD .dsf are applied too, with adjusted bone positions), then in Blender we import only the .dhdm delta as a VDM for the subdivided mesh.
I will test with the G2 Creature HD morphs (mixing two creature HD morphs for head and body), so I need to mix 4 VDMs and set their strengths. (I would prefer to always generate the VDM at strength 1, then multiply by the strength ratio of each morph from Daz Studio.)
Setting up all the shader nodes manually may cause mistakes, I'm afraid. And in this case, if each part's VDM is included as a Blender UDIM texture set, I feel it is complex to manage them as I need, so I may only merge materials which use the same tile and node groups to ease my work, and not use a Blender UDIM texture set for the generated VDMs.
-
reporter The VDM operator already ignores base .dsf deltas when “Starting Subdiv Level” is 0 (the default). So it doesn’t matter whether you use a .dsf or a .dhdm, as long as you leave “Starting Subdiv Level” at 0. Hover over the parameter to read its description.
For the VDM operator to bake base deltas too, you need to set Starting Subdiv Level to -1 (and use a .dsf of course).
-
Thanks Xin, I did not notice that.
So when I use the default Starting Subdiv Level of 0 to generate a VDM, we need to select the mesh which will use the generated VDM.
I can manage with FBMs, where the HD morph's .dsf is already baked, so I only need the .dhdm delta as a VDM.
But if we import an HD expression .dsf later (like HD smile, etc.), the base mesh imported by the addon does not include the HD .dsf delta.
In that case, do we need to use Starting Subdiv Level = -1, and select the .dsf as the VDM path?
Or, if we set the HD morph controller in Blender to 1.0 (which adds the HD morph's .dsf delta to the current mesh), then select the deformed mesh when generating the VDM and keep Starting Subdiv Level at 0, will it generate only the .dhdm delta (without the .dsf delta)?
I mean: when we select a mesh for VDM generation, if the mesh is deformed by a shape key, does that change the generated VDM, or does it have no effect?
-
If we need to first apply the HD expression .dsf in Daz Studio and then generate the .dhdm delta as a VDM, that approach won't work for HD expression morphs or HD JCM morphs whose values we change in Blender, I think.
We can still control the .dhdm delta via the VDM strength, but the base mesh would already include the HD expression .dsf delta (like smile HD, etc.), so it could not be used as an expression control.
The complicated thing for me: we should not include the expression .dsf delta when importing the character mesh to Blender as the base.
Then we use Starting Subdiv Level = -1, set the HD expression .dsf path, and generate a VDM that includes both .dsf and .dhdm deltas. In that case, we avoid importing the HD expression .dsf morph when the addon imports expression morphs (or even if the addon imports them as shape keys, we should not touch them, and only use the VDM strength, which includes .dsf + .dhdm), I think.
What I hope is (if it works):
- I do not apply the .dsf delta in Daz Studio, then import the base with the addon.
- I import the HD smile .dsf delta as a shape key (via the addon's Import Morphs).
- I generate a VDM that includes only the .dhdm delta, with Starting Subdiv Level 0 and the imported mesh selected (it does not include the smile .dsf delta).
- To get the final HD expression, I set the morph controller value to 1.0 and the VDM strength to 1.0.
If the current addon works like this without problems, I have no issue. (I hope it does.)
Just to confirm: in Blender, does it make a difference to the generated VDM image whether the selected base mesh includes the .dsf delta or not? (And if we apply the smile HD shape key at 1.00 and then select the mesh, does that change the generated VDM?)
I may test both cases later, with HD smile on and off in Blender, and with the imported mesh including HD smile or not, to see whether the generated VDM textures differ (using only the default Starting Subdiv Level of 0).
This may be relevant if Thomas plans to use VDMs to import HD morphs, I suppose. (Though Xin may know how it works, I need to know it too, or I may misuse the VDM and report wrong results when we seriously test the effect.)
-
I now tested with the G3 Mouth Realism HD morph (it is not a JCM; it is usually auto-applied to the default Genesis 3 as a hidden morph; I just use it as an HD expression).
- In Daz Studio, I set MouthRealismHD to 0 (it deforms badly at base resolution).
- Import the non-HD-.dsf version with the Daz Importer.
- Import MouthRealismHD with the Daz Importer's Import Morphs, so now I have the controller for the MouthRealismHD .dsf delta as a shape key.
When I generate the VDM (with the button), the HD importer exports the base mesh as an .obj (if an .obj with the same name already exists, it is overwritten).
But it does not export the .obj with the currently applied shape key deltas, so the exported .obj is the same as the imported character. It always exports the same mesh (regardless of shape key values), and the generated VDM is exactly the same. (So shape key values do not matter; shape key deltas are ignored for the VDM.)
I actually set a different working directory name and compared the exported .objs with the shape key on and off: the addon kept exporting the same base mesh (without shape key deltas), I suppose.
To test this clearly, I duplicated the mesh with the shape key applied (using another addon option), so the new mesh includes the shape key delta. Setting that new mesh as the export mesh and generating the VDM, the exported .obj now shows the shape key delta, and the VDM actually shows a difference when I compare the two in a 2D tool.
From these tests I am almost sure: if we want to import HD morphs for expressions or JCMs (which need to change value with a UI controller or be driven by another property), we need to generate a VDM that includes both .dsf and .dhdm deltas. Then, if we use the .dhdm as an expression (or JCM) HD morph, we should not import the HD .dsf with the Daz Importer as a shape key, and only control it by the VDM strength. (The Daz Importer may then only generate a UI controller which drives the VDM strength; I suppose it will work that way in the future.)
If the VDM quality is reduced when we include the .dsf delta (for the base .dsf delta, a shape key should give exactly the same delta as Daz, and it is a larger deformation than the .dhdm), I may request an option when generating the VDM: export the .obj with the shape key deltas currently applied to the selected mesh.
I suppose the VDM generator works like this:
1. Export the base mesh (without shape keys).
2. Generate the VDM from the selected .dhdm (and .dsf) and the exported .obj in the working directory.
So if the exported base mesh could include shape key deltas, it would offer more variations for the VDM. (We set each .dsf controller value to 1.0 in Blender, then generate a .dhdm-only VDM.)
Though I still think that automating all of this in Thomas's addon's morph import options is actually not so easy (it needs to decide whether to import the shape key or not, and how to add a driver for the UI controller of the HD morph (.dsf + .dhdm), for HD expressions or HD JCMs).
-
Yes, I think the option to generate the VDM from the mesh with its current shape keys applied in Blender (exported as the base .obj) is useful, not only for JCM, MCM, or expression HD, but also for mixing a few simple PBM morphs.
If you make it so, we can import HD morphs later as needed, then set shape key values to generate a .dhdm-only VDM as we need (otherwise I would need to apply each shape key delta to the base mesh one by one in Daz Studio before exporting, I think). Of course it doesn't matter if we use a VDM which includes both .dsf and .dhdm deltas; I just worry about the quality (the .dsf includes a larger deformation delta than the .dhdm, so the scale may need more adjustment).
-
reporter This is how it should work:
What I hope is (if it works):
1. I do not apply the .dsf delta in Daz Studio, then import the base with the addon.
2. I import the HD smile .dsf delta as a shape key (via the addon's Import Morphs).
3. I generate a VDM that includes only the .dhdm delta, with Starting Subdiv Level 0 and the imported mesh selected (it does not include the smile .dsf delta).
4. To get the final HD expression, I set the morph controller value to 1.0 and the VDM strength to 1.0.
Adding the Shape Key to the imported mesh in the dll shouldn’t make a big difference (or any), because these are displacements:
Exported Base Mesh 0 = Base + [level 0 shape key]
Exported Base Mesh 1 = Base
Displacements = [Exported Base Mesh 0/1 + subdivisions + dhdm deformations] - [Exported Base Mesh 0/1 + subdivisions]
I guess depending on the influence of subdivision you could see a difference. Adding an option to include the current shape or not is easy; I will add it.
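The subtraction above can be sketched in a few lines (a toy illustration; the real computation happens per subdivided vertex inside the .dll): any offset applied identically to both meshes cancels out of the difference.

```python
import numpy as np

def hd_displacements(subdivided_verts, subdivided_plus_dhdm_verts):
    """Per-vertex displacements, following the formula above:
    [base + subdivisions + dhdm] - [base + subdivisions]."""
    return np.asarray(subdivided_plus_dhdm_verts) - np.asarray(subdivided_verts)

# Toy example: two vertices, with some "dhdm" detail on top.
sub = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
hd = sub + np.array([[0.0, 0.0, 0.1], [0.0, 0.0, 0.2]])
# A shape-key offset applied to BOTH exported meshes cancels out:
offset = np.array([0.5, 0.0, 0.0])
same = hd_displacements(sub, hd)
shifted = hd_displacements(sub + offset, hd + offset)
print(np.allclose(same, shifted))  # True
```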
I also finished making the VDM operator for an arbitrary mesh (not dhdm data, but sculpting done inside Blender for example).
-
reporter - attached test4f.7z
- Generation of VDM for Blender meshes with multires (you must keep a copy of the clean base to use it for now, could be changed later). To use: select as base mesh the clean base, select as HD mesh of the operator “Blender vector displacements textures” the copy of the base mesh that has a multires modifier on it (possibly sculpted).
- Options to affect base mesh export to the .dll.
- "Pin" button in the list of morphs to only use the pinned morph, all others in the list will be ignored.
- Fixed small problem with Blender 2.92.
-
Thanks for the update, Xin. I may use the pin option if I do not include the HD .dsf delta in the VDM but import the HD .dsf as shape keys.
At the same time, I am thinking that HD JCMs or HD expressions may not work the same as Daz HD morphs, because the VDM (shader) displaces the mesh including all modifier and shape key deltas, and the delta direction (tangent space) changes with the current mesh shape (posed or shape-key deformed).
A shape key (or Daz morph) moves vertices in object space, so each vertex's delta direction never changes with the current shape. Vendors make JCM or MCM morphs while posing, then import the delta as a morph.
Actually it doesn't matter much for me, because I only plan to add the same detail effect as FBMs or PBMs, and don't expect it to show exactly the same effect when posing as in Daz. We still see the delta follow the current mesh's tangent space in Blender, so unless the mesh is already deformed strongly by bone weights, as FBM detail it may not have much effect.
And thank you for adding the new option to generate VDMs (from multires).
The only thing I want to ask: if I sculpt with multires at subdiv level 4 and then generate a VDM, I understand I could use it without multires, but will it show a small difference if I use the VDM with a Subdivision Surface modifier?
(Generating the VDM from multires is really logical, I know, so we can sculpt easily without keeping the same subdiv level; I just want to confirm whether it may show a small difference when I apply the VDM with a subdiv modifier at the same sculpt level.)
-
reporter Shape key (or daz morph) move verts with object space
The Shape Key is applied before the armature, Then it’s posed.
Tangent space is needed for texture maps because they are applied after armature/posing on the shader.
.dhdm files are applied in tangent space too at the time of subdivision, which happens after armature/posing. In fact, .dhdm files describe deltas in tangent space. They are not regular shape keys.
Using tangent space after pose vs shape keys are two ways of expressing the same final transform. They are not different in the final result. You might be confusing the order in which shape keys are applied.
For the Blender VDM operator, it should work too with a mesh sculpted after a subdivision modifier was applied to it. You can choose to apply a subsurf on a copy of the base mesh first and then sculpt on it. If you use that one as HD, it should work too.
Even if you managed to add details to a mesh with the subsurf modifier still on, and you used that one as HD, it should work too.
The addon searches the HD object for subdivision modifiers and applies them, then applies the same modifier (either multires or subsurf) to the base to match it, and exports both. Alternatively, it checks the vertex count of the HD object to see if it’s HD, and if it is, it uses a subsurf on the base mesh to match the HD vertex count, then exports both.
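The vertex/face-count check could look roughly like this (a sketch assuming an all-quad base mesh, where each Catmull-Clark level multiplies the face count by 4; the addon's actual heuristic may differ):

```python
def infer_subdiv_level(base_face_count, hd_face_count, max_level=6):
    """Guess how many Catmull-Clark levels separate an HD mesh from its base.

    Each subdivision level splits every quad into 4, so for an all-quad
    base (Genesis meshes are quads) the face count grows by 4x per level."""
    faces = base_face_count
    for level in range(max_level + 1):
        if faces == hd_face_count:
            return level
        faces *= 4
    return None  # counts don't match any plausible level

print(infer_subdiv_level(16000, 256000))  # 2
```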
-
reporter So I modified the .dll to give another option: create vector displacement textures in Object Space. Will upload this version later.
This allows you to use these displacements with the modifier in Eevee and the viewport, but you have to re-order the modifiers (by the way, you can add Drivers to the parameters of the modifiers). Example:
-
Xin, that option for the object space seems great for the displace modifier. One question, if I understand it correctly, we can then use it for static scenes, but not for animated figures, because animation will change the figure shape. Or can we use it both for scenes and figures ?
-
reporter It can still be used for animation, because the displacement in object space takes place before the armature, but since the subdivision now comes before the armature, you get those weird artifacts you would get without vertex weight smoothing in the true HD.
You could maybe use another subdivision after the armature, since there is no “vertex weights smooth” modifier. Maybe something with the VertexWeightEdit modifier could also work.
-
Xin, first of all thanks for the update. I expect the new option (object-space VDM) to work well. I may often misunderstand or confuse things, but as you mentioned, the shape key applies its delta before the modifiers, so it doesn't matter that a VDM shows a different effect from a shape key when posing. (Of course the order is important when we make a shape key for a mesh posed by the armature, as designed. In Blender and Daz Studio we have options to generate a shape key (or import it as a morph) from the current posed shape: Daz uses a Morph Loader option, and in Blender we use the Corrective Shape Keys addon.)
After all, a shape key delta only works in object space for each vertex, which then deforms with the armature, so one shape key's delta vector direction for a vertex never changes with posing. But with a VDM, the delta vector direction changes with each pose. If we remove the VDM and describe one polygon's three tangent vectors (U, V, normal), then set a pose, those vectors change in object space.
That means we could generate a VDM for one posed mesh (as the source), and it may show the same effect as a shape key for that same posed mesh in Daz. But when I next change the pose, deform the mesh, or add a new shape key from a morph, the polygons' tangent spaces change, and the VDM deforms along the current mesh's tangent space; it is not the same shape as Daz with a shape key morph.
Of course it still works with the armature, and in fact the VDM seems more natural to me, because it recalculates the current face tangent space in real time. But if we expect the same effect as a Daz JCM when changing the pose or deforming the mesh with other shape keys, it cannot show the same delta deformation as Daz. (Wrong information, sorry.)
-
Ah no, Xin, sorry to take your time. I actually misunderstood; it does not affect the final result. As you said, the shape key delta is added first, and after that it deforms with the pose, so the shape key delta vector direction changes along with the current pose or other deformations.
So if we generate a VDM in tangent space at zero pose, when posed, the polygons move and the VDM's U, V, and normal vectors change with them, but the difference is almost the same as how a shape key vector moves when posing or when moved by other shape keys. The object-space VDM is still useful, but in that case we must order the modifiers as you described:
object-space VDM via Displace modifier >> armature deform = same as Daz.
armature deform + tangent-space VDM (in the shader) = same as Daz.
-
The addon searches the HD object for subdivision modifiers and applies them, then applies the same modifier (either multires or subsurf) to the base to match it, and exports both. Alternatively, it checks the vertex count of the HD object to see if it’s HD, and if it is, it uses a subsurf on the base mesh to match the HD vertex count, then exports both.
And thanks for the new sculpt VDM options. I am really excited to test them.
Actually I am planning to sculpt the high-subdiv mesh in 3d coat ^^; then import the high-resolution mesh and generate the VDM by selecting the original (in Blender).
-
reporter - attached test4g.7z
I changed the Blender Vector Displacement options, since I believe letting the user decide which subdivision method to use for the base is better than trying to infer it, as you said earlier.
This version also lets you generate vector displacements in Object space (note that midlevel and scale are not the same in Tangent space and Object space; they vary a little). I haven’t tested this one that much, so I will leave the previous version up for a day or two, until I can test it.
About using 3d coat, make sure it doesn’t change vertex order, either when importing or exporting. Or if it does, do the same with the base mesh. The operator can accept a subdivided base mesh with the same vertex count as the hd mesh, then it exports both without subdividing and just calculates the deltas.
-
Thanks Xin, it has been a long-standing dream of mine to sculpt and generate VDMs; now I feel it has been achieved.
If you don't mind, I hope you keep both options for generating VDMs (tangent space and object space) in future stable versions; I think it is case by case.
With the new sculpt + VDM options, your addon is no longer only for Daz dhdm files. I don't have a strong view on which should be the default (though I understand they differ in midlevel and scale), but I hope we can choose Object or Tangent space when generating the VDM.
-
reporter Yes, it already has both options for both the Blender operator and the dhdm operator.
-
3d coat, make sure it doesn’t change vertex order, either when importing or exporting.
Yes, that actually needs confirming. (3d coat has often shown problems with vertex order, I remember; I had talked about it with the developer of the 3d coat Blender addon. The problem is that 3d coat doesn't seriously consider use for morphing, so the FBX importer/exporter has often shown the problem: in some versions it was corrected, in some versions it returned ^^;.) I need to confirm it, and request a fix if needed.
-
Thanks, I am working hard now ^^; to keep some hobby time for testing with your addon. Thanks!!
-
reporter You can always just export the base mesh to 3d coat too, and then immediately import it back. Unless 3d coat does something very weird, it should have the same vertex order as the sculpted one, even if that vertex order doesn’t match the original in Blender.
-
Ah, I see now. After importing to 3d coat, I export the same mesh twice: the high-res one without sculpting, and the sculpted one, from 3d coat to Blender. They should keep the same vertex order, so generating the VDM should work against the Blender original, using the same UVs. So at least for generating the VDM it shouldn't matter? Anyway, I'll test with a plane first. Thanks.
-
Xin, great work, it is more than I expected.
sculpt version
vdm version
I actually did not believe it when I test-rendered (I was afraid I had missed something and just rendered the same plane, so I double-checked), but it actually is just a plane (though I applied the subdivision really aggressively ^^; to test without using a modifier).
The settings from the console output must be applied to generate the map correctly, both the VDM settings (in the addon panel) and the node settings. The effect is really great. (I tested sculpting in Blender; I won't discuss how other applications work here ^^;.)
It seems like a game changer for Blender users, I feel.
-
I don’t use sculpting myself. But seeing this in action, how it bakes sculpting to vector maps is impressive.
-
It already works really well; the only thing the user must consider is to correctly set up the shader and the suggested setting values from the output. I felt we may be better off keeping the base mesh already subdivided and dense before generating the VDM; that seems to work better. (E.g. load a cube, subdivide it 3 or 4 times, apply that first, and use it as the base; then duplicate it and add multires, setting the subdiv level as you need. Of course this only matters if you sculpt from a default primitive. If you sculpt with a Daz figure you don't need it; just add multires and set the subdivision level for sculpting, since it already has a usable base resolution without subdivision.)
Then I tested object mode to check with the Displace modifier, but it caused a vertex count mismatch error. I supposed that once I set the meshes and generate the VDM, I only need to change the mode and the addon will generate the object-space VDM under a different name; or do I need some special setting? Or do I need to apply the same subdiv count to generate the object-space base mesh?
If I set it to tangent with the same settings, the addon works without problems. No, it seems I had hidden (deactivated ^^;) the sculpt mesh's multires modifier to check the base mesh, and that caused the problem. Now I can confirm it works really well.
-
reporter I added 32-bit OpenEXR support too. The operator is even faster than when writing PNGs, but the files are bigger. I would use this option only for heavy deformations with Blender. For face details, 16-bit PNGs are more efficient in terms of size (and have better compression).
Also, with 32-bit OpenEXR you don’t need to deal with either Scale or Midlevel settings (Scale is always 1 (0.01 in Blender) and Midlevel is always 0), because 32-bits match the width of a float, so there is no need to remap deltas, you can save them directly on the image.
I will add this version later.
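The trade-off can be illustrated with a quick round-trip (a sketch of the quantization involved, not the addon's actual encoder): 16 bits introduce a small quantization error after remapping, while a 32-bit float EXR can store the raw delta directly.

```python
import numpy as np

def roundtrip_16bit(delta, midlevel, scale):
    """Quantize a delta into a 16-bit channel and decode it again,
    as a 16-bit PNG pipeline effectively does."""
    pixel = np.clip(delta / scale + midlevel, 0.0, 1.0)
    stored = np.round(pixel * 65535.0) / 65535.0   # 16-bit quantization
    return (stored - midlevel) * scale

delta = 0.123456789
# 16-bit introduces a small quantization error after remapping...
err16 = abs(roundtrip_16bit(delta, midlevel=0.5, scale=1.0) - delta)
# ...while a 32-bit float stores the delta directly (scale 1, midlevel 0).
err32 = abs(np.float32(delta) - delta)
print(err16 < 1e-4, err32 < 1e-7)  # True True
```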
-
Yes, I may mainly use EXR only for creature FBMs. (And I confirmed now that your 16-bit PNG option still works well, when the settings and node scale are set correctly.)
As I requested in another topic, if you could add a “Limit on” option when importing dhdm and generating VDMs, I would really welcome it for adaptive subdivision; I plan to mainly use adaptive subdivision with VDMs. (To make poses etc. I may not need to keep the VDM, so in the end I hide the VDM in the nodes and turn Limit off again; that's my current plan.)
-
Xin, as for OpenEXR, I guess the advantage of a fixed scale and midlevel can make it easier to integrate vector maps into the new morph system. Especially to avoid the out of bound issue.
-
reporter I was considering adding an auto-scale auto-midlevel option to the png option. The program already knows the values when it outputs them to the console, so the png option could be made easier too.
I will also check if limit surface subdivision in the dll works the same way as in Blender. If it does, then it should be easy to add that option too.
-
Xin, I would really appreciate it if you added the auto-setting option.
(I suppose that when clicking “generate VDM”, the addon could generate the VDM once as a test, and if it is out of bounds, automatically set the scale and mid value, then generate and overwrite the VDM.) That should be useful, so I would only need to add the VDM nodes with the node setting values from the console output.
It may also be needed if Thomas plans to generate VDM nodes from the Daz Importer's morph UI options.
-
About the limit surface option, I actually do not know if you can make it work. (I just supposed you could do the same thing as Blender's sculpt option.)
-
reporter Regarding the limit surface, I will check that later. But are you sure you are setting up adaptive subdivision correctly, with the proper parameters? The difference between limit surface enabled vs disabled is tiny (from the point of view of displacement), so I don’t see how you would notice much of a difference. I think your issue is that adaptive subdivision is being too aggressive, but that is not related to the limit surface; it’s a matter of tweaking the adaptive subdivision parameters.
-
reporter - attached test4h.7z
- 32-bit OpenEXR option for vector displacement textures (these don't need special scale or midlevel settings).
- Auto scale and auto midlevel option for 16-bit PNG vector displacement textures.
-
You already said to use Limit off. If it doesn't matter, you mean there is almost no difference when we apply the subdivision, whichever option I used.
Of course adaptive subdivision is applied automatically with the view, but even if it goes further, it doesn't have much effect, because it just subdivides an already-dense mesh more. I just keep the defaults, so it cannot be too aggressive.
-
What I think is that generating the VDM from a dhdm may not work the same way as your Blender sculpt option does.
I supposed we can get the delta values by comparing two meshes (the sculpted high-res mesh, and the subdivided mesh with Limit on or off); your sculpted-mesh option does this, I suppose. So if the dhdm option worked like this:
1. Generate the high-res sculpted mesh from subdivision (no limit) + the dhdm delta, as an obj (the real shape).
2. Generate the subdivided mesh (Limit on or off) as an obj.
3. Get the new delta >> convert to RGB with the current UVs.
then I feel I could use “Limit on” as an option, and it should show a difference relative to the subdivided mesh. (Adaptive subdivision changes automatically with distance, but it would show almost the same shape as the real shape.)
And what we clearly see: if we use the “Limit on” option in Blender (adaptive subdivision seems to work with Limit on only, or the option is ignored), we need to modify the delta values (described in the dhdm), because the dhdm's deltas are calculated against the Limit-off subdivided mesh.
So to get the same final shape with subdivision, we may need to adjust the delta by the difference between Limit on and off, I suppose. (I think your sculpt option calculates the delta by comparing two meshes, though I don't know how you actually generate or select the two meshes for the calculation.)
-
Then Xin, you may not need to make the dhdm option work that way. Thanks for all the effort; adaptive subdivision is just my chosen option, which you may not recommend (you recommend Limit off to get almost the same shape as in Daz).
And that is reasonable, because when vendors generate a dhdm, they effectively use Limit off (in Daz Studio), get the delta, and generate the dhdm (with the vendor HD morph import tool, I suppose).
So Limit on not showing the same shape is expected. You happened to offer the option for the sculpt version, so I became greedier (the kinder you are, the more I expect ^^;).
-
Xin, I’m testing the new ad5b4cc commit by Thomas for the normal maps option. I see that your addon always generates all the udim tiles when I select a dhdm for a face expression. In the new commit I can select a material to add the normal map to, so it’s a single udim tile we need.
Maybe you could add an option to specify a material, or a UDIM tile number/set, for which to generate the textures?
-
reporter I’m not sure I understand, I have yet to test what Thomas added. I will check it later to see if I understand.
In case it’s useful right now: the textures generated have their tile number in their filename. You can get the tile corresponding to a material by looking at the name of their daz textures (daz textures have the tile number in their filenames too).
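Extracting the tile number from a filename is a one-liner (the filename pattern here is an assumption for illustration; both the generated textures and daz's own maps carry the tile number in their names):

```python
import re

def tile_from_filename(filename):
    """Pull a UDIM tile number (1001-1999) out of a texture filename."""
    m = re.search(r'(1\d{3})', filename)
    return int(m.group(1)) if m else None

print(tile_from_filename("morph_vdisp_1001.png"))  # 1001
print(tile_from_filename("no_tile_here.png"))      # None
```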
-
Xin, yep your addon works fine, I know we can get the tile number we need. It is just that it is unnecessary and time consuming to generate all the tiles when we only need 1001 for the face material for example, so if we could specify the tiles to generate it will be much faster.
But sure take your time to check the work by Thomas, no need to rush.
-
reporter Oh ok, that’s already implemented in the code. It isn’t exposed yet to the user, so it shouldn’t be hard to expose it in the UI. Maybe another list like the one used for morph files?
-
reporter - attached test4i.7z
Option to select specific UV tiles for vector displacement textures and normal/grayscale displacement textures.
-
Xin, 4i works great we can generate only the tiles we need.
I noticed that it is not possible to select multiple files though. That is, if we have to generate 1001 for angry and afraid, we have to select the file and generate, then select again and generate .. That becomes tedious fast for a number of expressions. It would be nice to select multiple dhdm files without exiting the file dialog, the same as Thomas does to select morphs for example.
-
reporter I already added that, but also another thing.
I added a grouping feature to the morphs list. So all morphs in a group are considered together according to their weights. You can now call the operators that take morphs with a list of morph groups and produce all the textures/shape keys (one set for each group) in one call. A text file with the settings to use in Blender, for each group, is generated for vector displacement textures.
I will add this version later.
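Conceptually, the grouping feature described above amounts to a weighted sum of per-vertex offsets within each group. A minimal sketch (the data layout here is invented for illustration, not the addon's actual internals):

```python
# Hypothetical sketch: morphs in one group combine as a weighted sum of their
# per-vertex offsets, and the summed deltas are applied to the base together.
def combine_group(base, morphs):
    """base: list of (x, y, z) vertex positions.
    morphs: list of (weight, offsets) pairs, offsets parallel to base."""
    result = [list(v) for v in base]
    for weight, offsets in morphs:
        if weight == 0.0:
            continue  # a zero-weighted morph contributes nothing
        for vi, delta in enumerate(offsets):
            for axis in range(3):
                result[vi][axis] += weight * delta[axis]
    return [tuple(v) for v in result]
```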
-
Thank you Xin, this will be great to simplify the user work.
-
reporter - attached test5c.7z
Reuploading since the last one had a few bugs.
-
You can now group morphs in the list of morphs, and one set of outputs will be produced for each non-zero weighted group after calling the operator, as if you had run an older version of the operator once for each group.
-
The addon now generates a .json file alongside the textures, with the filepaths and weights (and the midlevel and scale to use in Blender for vector displacements) of the morphs used for each set of textures (one set for each group).
-
Better operator descriptions.
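To illustrate the sidecar .json idea mentioned above, a purely hypothetical layout is sketched below; the field names are invented for illustration and are not the addon's actual schema:

```python
import json

# Invented example of what a per-group settings sidecar could carry:
# the morph files and weights used, the textures produced, and the
# midlevel/scale values to plug into Blender's displacement setup.
settings = {
    "groups": [
        {
            "morphs": [{"file": "morph_a.dsf", "weight": 1.0}],
            "textures": ["vdisp_1001.exr"],
            "blender": {"midlevel": 0.5, "scale": 1.0},
        }
    ]
}
text = json.dumps(settings, indent=2)
```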
-
As for 5c it works great we can now select both the tiles and multiple maps. Thank you Xin for the nice improvement.
-
reporter So I looked at Blender’s internal structs and using as_pointer() to interface directly with Blender wasn’t hard. The next version will give the option to not use .objs, and this results in better performance (notably, Generate rigged HD at subdiv 2 takes around 20 seconds and the vertex weights are correctly interpolated during subdivision instead of relying on smoothing, and the Blender vector displacement operator, which had to export 2 HD meshes, is also considerably faster).
When dealing with vertex weights, the bottleneck is now creating the vertex groups for all the vertices in the HD mesh (not as bad as the bottleneck caused by the Data Transfer before). Right now it’s done by creating a lot of placeholders for all vertex groups since the .dll can’t allocate memory for Blender structs, it can only change values (Blender does some memory bookkeeping, so it would be dangerous to sidestep it).
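A rough sketch of the bookkeeping involved (the data layout is hypothetical): the subdivision step naturally produces, per vertex, a list of (group index, weight) pairs, while Blender's vertex groups want the transposed view, one (vertex, weight) list per group, and building that group by group is what makes creation slow:

```python
# Hypothetical layout: per_vertex[vi] is a list of (group_index, weight) pairs
# for vertex vi. Transpose it into one (vertex_index, weight) list per group,
# the shape Blender's vertex groups ultimately need.
def regroup_weights(per_vertex):
    per_group = {}
    for vi, pairs in enumerate(per_vertex):
        for gi, w in pairs:
            per_group.setdefault(gi, []).append((vi, w))
    return per_group
```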
-
This sounds great. I didn’t realize that the obj was the main bottleneck, I had the impression that generating the maps was.
-
reporter - attached test6b.7z
- Option to not use .objs and directly interface with Blender, which speeds up all operators to various degrees. With this option enabled (the default), the generated HD mesh is rigged properly without relying on smoothing or blender’s slow Data Transfer operator. You can still use the old version by selecting “Through .objs” in the main settings.
I will try to find a way to fill the vertex groups more efficiently since pretty much all the time is spent waiting for Blender to fill all the vertex groups when generating the rigged HD mesh. The .dll side is already as fast as it can be now that it doesn’t need to import/export .objs.
I think a .collada export from the .dll to Blender when dealing with vertex weights would be a possibility, since the .collada importer creates vertex groups more efficiently than Blender’s operators.
-
Xin, generating the HD mesh goes from 85 to 60 seconds here, so I can confirm there’s a noticeable gain. When I read “the generated HD mesh is rigged properly” I hoped that jcms were transferred, but I see it’s not so.
How do we transfer jcms to the HD mesh?
-
reporter The same as before: with the “HD shape key from morph” operator, where you select the JCMs you want and make sure they are in different groups, which is the default setting when selecting several morphs from the file browser. This operator is considerably faster now too.
Another option is to use “HD shape key from base shape” and make sure that the base mesh has the shape key active (the pin button in the shape keys panel can be useful here). Also make sure that the “Base export modifier” setting is set to “Only shape keys”.
-
Xin, what I mean is if we can transfer jcms so they do work, that is, with drivers. Could you do that when transferring the rig? Or is it something that Thomas needs to do as an added tool?
-
reporter The rig is duplicated, so all its drivers will be there. I think it would be a matter of just importing the shape keys for the HD mesh and then creating the drivers for them knowing that the rig they are parented to already has the drivers set up (so the shape keys should be driven by the (fin) properties of the rig).
One issue is the naming convention: how do we match the imported shape keys with the correct (fin) property? If that match can be made easier, then Thomas could make a button that creates the drivers for the shape keys. Or maybe I can do it on this addon’s end. I will look into it later.
-
As for naming conventions, I understand Thomas uses the id inside the dsf file, which is what’s used in daz expressions. For example, pJCMForeArmFwd_135_L.dsf gets pJCMForeArmFwdL as id.
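Since .dsf files are DSON, i.e. JSON text (usually gzip-compressed), that id can be read directly. A minimal sketch, assuming the morph sits in the standard "modifier_library" section (worth verifying against actual files):

```python
import gzip
import json

# Sketch: read the modifier id from a .dsf file. Assumes the DSON layout has a
# top-level "modifier_library" list; a real implementation would handle more
# layouts and multiple modifiers per file.
def read_dsf_id(path):
    try:
        with gzip.open(path, "rt", encoding="utf-8") as f:
            data = json.load(f)
    except OSError:
        # not gzip-compressed: plain JSON text
        with open(path, "r", encoding="utf-8") as f:
            data = json.load(f)
    mods = data.get("modifier_library", [])
    return mods[0]["id"] if mods else None
```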
-
reporter Oh ok thanks, that should make it easier then. At least for a first version. I will look into it later.
-
reporter So the collada importer resulted in a significant speed gain for the generation of the rigged HD mesh. Less than 5 seconds. I was considering trying another alternative where a buffer would be shared between python and c++ to generate vertex groups more efficiently, but I don’t think it will be needed since the collada importer is already quite fast. I will upload this version later.
I will look into the shape key drivers next.
Also Alessandro, what kind of cpu do you have? I will try to change the compiler optimization settings in case the optimization level is too high and causing issues on cpus other than mine.
-
The speedup you’re reporting sounds amazing since right now generating the HD is a bit slow here. Also shape key drivers will finally bring a working HD mesh ready for the user to play with.
I can test with a ryzen 2200G that’s quite weak nowadays. I got it for the viewport so I can dedicate the 1060 to rendering. This setup is cheap but overall works fine enough for rendering, though it’s weak for the blender simulation that’s cpu based. It would be nice if blender could use OpenCL instead but I guess that’s not gonna happen ever.
-
reporter That’s better than mine, so it’s weird you are getting much worse results even if the compiler is optimizing for my cpu. I will look into changing some compilation settings.
-
If it can be useful I can confirm here it takes about 60 seconds to generate a subd 2 HD mesh. I didn’t import any jcms or anything on the rig, it’s just a basic G8F. Then I have a Crucial BX500 SSD that’s not too fast, don’t know if this may affect anything.
edit. I’m timing with a clock from when I press the generate button to when I see the HD figure appear on the viewport. For test purposes maybe it could be useful for the addon to report the conversion time.
steps:
- import a basic G8F
- generate a subd 2 HD mesh with rig transfer
-
reporter Yes, that’s what I test too. And I don’t even have an SSD, although these latest versions don’t depend on large files.
I will give a few dlls with different compilation settings next time so we can see if that’s the issue or if it’s something else.
-
update. Also if it can be useful, below is the cpu load during conversion. The addon doesn’t seem to use 100% load on all four cores, so I guess there’s something else it does other than calculations. Unless it’s not designed for multithreading and only uses one core?
edit. For comparison, below is blender when it does simulations. There the cpu is fully loaded.
-
reporter - attached test6c.7z
- Collada exporter to significantly speed up the generation of rigged HD meshes.
-
reporter Next version I will try to connect the rig properties with the shape keys. Also, Alessandro, check how long it takes for you now.
-
Xin, I can’t believe it, this is light speed. Now it only takes a bunch of seconds (about 3) to generate the subd 2 HD mesh and transfer the rig. I feel HD meshes are now close to being usable on par with the other options.
You’re really doing an amazing job thank you so much to be here with us.
-
reporter - attached test6d.7z
- Small bug when deleting temporary materials
- Typo in the name of the addon
-
A minor suggestion for
#438, and to optimize the interface a bit. You may replace “HD mesh from base” with “HD mesh from base and morph”. Then when there’s no morph it does the base, so you join two menus together. Then add two options to generate the rig, the actual “transfer rig to HD mesh” and the new “transfer rig to multires”.
p.s. I’d also have a fancy name for the plugin, “Xin’s HD Booster”, since it “boosts up” the diffeomorphic HD section. If you like it. It’s not mine, I copied it from the LightWave “IK Booster” idea.
-
reporter Yes, that would be cleaner. Those two operators are too similar now.
-
reporter - attached test7a.7z
- Try to create drivers for shape keys (you must have the respective driven base shape key imported on the base mesh and then use the operator to create HD shape keys).
- Changed operators and panels. Also new option to create multires HD (you can’t import HD shape keys to it though, it only takes into account the HD morphs you select when using the operator).
- Faster subdivision in several cases.
-
reporter - attached test_0_16.7z
- Another bugfix related to UV layers. The user must set active the UV layer they wish to have in the generated mesh.
-
reporter - attached v0_17.7z
- Some small UI changes and a few more options.
-
reporter - attached v0_18.7z
- New alternative to using collada to import rigged hd meshes that might be faster for computers with slow hard drives.
-
reporter - changed title to Blender addon to generate rigged hd meshes, hd shape keys, normals and vector displacement textures from daz HD morph files
-
reporter Next thing I was gonna try is to generate normals for the FACS units. Does the face control rig (
#477) for FACS units work? I don’t have Blender here right now. -
@Thomas Larsson , as you get some time, it would be really nice to make the functions for Xin to add automatic HD jcms, see
#438. This is all is needed to complete the addon and make it easy to use. -
Xin, as for commit 4903c1d the FACS controls seem to work fine here. If I understand it correctly the dsf HD morphs work fine as well since the new rig system imports them together with the facs units. Then dhdm requires manual work as before.
-
reporter Oh thanks, but are the face controls that didn’t work before working now too? Or is Thomas still trying to fix them? I wanted to test setting up driven normals in the materials so the face control rig also controls them. I could still try to test the ones that work for now though.
-
repo owner I will have a look at this tomorrow.
-
repo owner The first step towards integrating Xin’s addon with mine has been taken.
When a morph is loaded, the path to the dhdm file is stored.
The Advanced Setup > HD Morphs > Bake DHDM Maps tool lists the available morphs with dhdms, and also displays the most important of Xin’s arguments. The idea is now to bake maps for the selected morphs, but it fails because I didn’t manage to add files. Specifically, I have a list of absolute file paths, but don’t understand how to add them using the bpy.ops.dazmorphtest.morph_files_op_add operator. I tried with bpy.ops.dazmorphtest.morph_files_op_add(files=list_of_paths) but it didn’t work.
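For what it’s worth, a common Blender convention (not verified against this particular operator, which may be defined differently) is that a files CollectionProperty is passed to bpy.ops as a list of dicts keyed by the element’s property names, with "name" holding just the file name and the directory passed separately:

```python
import os

# Assumption: the operator's "files" collection uses a "name" property per
# element, like Blender's own OperatorFileListElement; adjust if it differs.
def to_operator_args(paths):
    directory = os.path.dirname(paths[0]) if paths else ""
    files = [{"name": os.path.basename(p)} for p in paths]
    return directory, files

# Hypothetical call shape:
# d, f = to_operator_args(list_of_paths)
# bpy.ops.dazmorphtest.morph_files_op_add(directory=d, files=f)
```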
-
reporter I didn’t want to make you do that, since it will be harder and more time-consuming for you to maintain.
I was thinking more about you creating a mini API so I can call it from this addon. A few functions that would be needed for that:
* List JCMs files (even non-HD ones), to re-import them into the generated HD mesh.
* List (but not bake, since that’s this addon’s responsibility) the morphs with dhdm files and their filepaths. This addon would then populate the list and call the relevant operator.
* This for later: a function that, given a normal/displacement texture, a material and a driven property, chains it into the shader and creates the driver.
Maybe you can think of similar smaller functions like these, that are easier for you to maintain long-term. The idea is not to make you do all the work and to keep your addon independent.
-
Another useful function if I may, and please correct me if I’m wrong, would be one to get the custom morphs folder. That is, the plugin by Xin opens in the documents folder when we have to add dsf and dhdm files, so the user has to navigate deep in the daz library searching for the figure folders. While diffeomorphic knows exactly where to look when we press the “import custom morphs” button. And this is the same path needed for dsf dhdm files.
* Custom Morphs Path()
-
Then again, and I’m completely out of my domain here so correct me if I’m wrong. It seems to me that some tasks may be better done by diffeomorphic, as for example adding dhdm textures to materials, which we started in #406. So I don’t know if some functions from Xin to diffeomorphic could be useful too. Perhaps at a later time, when and if they’ll be needed.
-
repo owner @Xin. OK, the new tool creates a list of absolute paths to dhdm files, which can then be read from the outside with the function get_dhdm_files():
from import_daz import get_dhdm_files
paths = get_dhdm_files()
The paths remain in place until the next time the tool is invoked. It also sets the working directory and the base mesh. If you prefer to do that yourself, the base mesh is the active mesh and the working directory is the “textures” subdirectory of the blend file directory (the blend must be saved).
-
repo owner If we are only interested in getting the file paths the button is unnecessary. Paths to dhdm and jcm files are now stored in the meshes when the morphs are loaded, or transferred if a vendor morph is found. The functions get_dhdm_files() and get_jcm_files() return a list of file paths. Note that these functions return the filepaths for the active object, so the result is different if you select another mesh.
-
reporter Thanks Thomas. I will check it later.
-
repo owner The interface functions now take an optional object argument. If omitted the active object is used.
-
Hello
Is it now possible to use HD morphs with the DAZ Importer?
-
Yep it kinda works already but you have to set up everything by hand, see
#438. That’s why Thomas did the functions for Xin, so he can automate it a little when he gets some time.
Thank you. That sounds great. It seems a little complicated, but I will try it soon. Maybe I will have some questions.
-
I’m stuck because I don’t understand how to install this HD-Morph addon. Usually addons are ZIP files, but this is a 7z file which Blender does not recognize. Is there a description of how to install it?
-
Just unzip and place “daz_hd_morph_test” into the blender addons folder, then you’ll find the addon in the testing category. I guess Xin didn’t provide a proper installer because this is still a beta.
-
reporter Sorry I haven’t been able to add the automatic set up for drivers since my hard drive broke, and now I’m using an old hard drive which is semi-broken (only good for reading and browsing), so I can’t install daz studio or Blender as they would be quite slow and hang a lot, and they would also risk completely breaking this hard drive too.
Eventually I will try to get a new hard drive, but it will probably take a while.
-
Xin, could you set up a patreon please? I’m sure people will help. I will.
-
I’ve installed and tested the HD-Morph addon with the displacement and normal textures. It works very well. My question is: is it possible to make the scale of the Vector Displacement driven by the value of the expressions?
-
see
#406 -
Thank you, I’ve figured it out. I can use “Add Driven Value Nodes”.
-
reporter Thanks for the offer Alessandro, but I already got a new hard drive.
Thomas, I thought about it again and I think there is a better solution. Instead of having your addon list all dhdm paths, you could write a single function that returns the directories for all the morphs loaded on the character. This addon can then search for dhdms or dsfs to load, based on the shape keys' names (since the shape keys' names are the IDs from the files).
That way, only directories are needed, not full paths. Also, this would allow this addon to load all shape keys from the base mesh, not just jcms. This way the base mesh state can be replicated in the hd mesh (while also loading dhdm files if they exist, but this is already handled by this addon, you don’t need to do that).
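The lookup described above could work roughly as follows (a sketch; the function and the stem-based matching are hypothetical, and since the id inside a file can differ from its file name, a real implementation would read the id rather than rely on the stem):

```python
import os

# Hypothetical: walk the directories returned by import_daz and keep any
# .dsf/.dhdm whose file stem matches one of the shape key names.
def find_morph_files(directories, shapekey_names, exts=(".dsf", ".dhdm")):
    wanted = set(shapekey_names)
    found = {}
    for d in directories:
        for root, _dirs, files in os.walk(d):
            for fn in files:
                stem, ext = os.path.splitext(fn)
                if ext.lower() in exts and stem in wanted:
                    found.setdefault(stem, []).append(os.path.join(root, fn))
    return found
```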
-
Xin it’s nice to have you back.
-
repo owner There are now two new interface functions:
get_dhdm_directories(): Return directories with dhdm files.
get_morph_directories(): Return directories with morph files, not just jcms.
The functions above return all directories that have been loaded. There are still functions that return the individual files if the user only wants to convert some: get_dhdm_files() and get_morph_files().
All functions take an optional object argument. If not provided the active object is used.
-
reporter - attached v0_19.7z
- Automatic migration of base mesh morphs (and drivers) when generating rigged HD meshes (and also potential import of associated HD morphs of those base mesh morphs).
- Addon is no longer in the Testing category of Blender addons.
Keep in mind that this does not apply to “Unrigged” HD mesh generation (it wouldn’t make much sense there) or “Multires” HD mesh generation. The reason for the last one is that it would be a bad workflow even if the transfer can be performed by this addon. If the user wants to use a Multires mesh, which doesn’t allow importing HD shape keys but only “baking” HD details upon generation, then the user should generate the Multires version of the base mesh at the very beginning (just like you could also do by importing the mesh with the HD script from import_daz, although that is slower). Then, the import_daz addon can handle that mesh like any other mesh, so this addon would no longer be needed.
-
Thank you Xin for working on this.
Unless I miss something that doesn’t seem to work fine though. There's something odd with the HD mesh, that is, when I bend the arm the whole body goes from HD to base resolution for some reason. Below an example with mutation for G8F.
daz studio 4.15.0.2, blender 2.93.0, diffeomorphic 1.6.0.0396, hd addon 0.19
steps:
- import mutation at base resolution
- import jcms
- select “with morphed base” to get the dhdm, then generate the HD mesh
- bend the arm
As a minor note when I select the dhdm the file dialog doesn’t open in the morphs folder. So I have to copy and paste from “import custom morphs“. It would be nice if the addon could open in the right folder.
-
reporter I found what caused that problem and I already fixed it, but in the process I realized some other function could be optimized more, so I will do that before next version.
Also Thomas, the get_morph_files() and get_dhdm_files() functions don’t seem to work. I didn’t use them here so they don’t matter in this case (this addon only uses get_morph_directories()).
-
reporter Ok, I finished the changes and now I’m testing them (I will upload the new version tomorrow maybe). There is a drastic gain in performance when generating shape keys or morphs in general when calling any operator with a lot of morph files (more precisely, with a lot of “groups”, as the addon calls them), as subdivision in the dll now reuses a lot of data structures instead of creating new ones for each subsequent subdivision. This means performance gains across the board for operators that can take a lot of “groups” as morph inputs (like the operators that generate vector displacements, normals, or shape keys).
As an example, this means that generating all JCMs shape keys for the rigged HD mesh takes around 14 seconds including the HD mesh generation and the shape keys' drivers transfer. Before, this took like 200 seconds.
I’m thinking of later creating a separate .dll to try loading morph files usually loaded by the import_daz addon with Python to see if that gives any considerable speed up on the base mesh only too.
-
That sounds excellent can’t wait to test it. Thank you so much again for working on this.
-
reporter - attached v0_20.7z
- Considerably faster execution when calling operators that take several "groups" as input (operators for generation of shape keys, vector displacements, normals; and migration of base shape keys to the generated HD mesh).
- Fixed problem with HD shape keys not importing correctly on morphed HD meshes (which had "baked" HD morphs upon generation).
- Option to not set up drivers automatically when generating shape keys for an HD mesh.
-
Xin, unless I miss something version 20 doesn’t seem to work fine. Procedure same as above with mutation for G8F. The HD version doesn’t seem to apply the jcms correctly. That is, the jcms are applied but they turn out with bad deformations.
Below it’s daz, then the base resolution version with jcms, then the HD version.
-
reporter - attached v0_21.7z
- Fixed small bug where base deformations weren’t being properly considered when importing shape keys.
Thanks for testing it Alessandro.
-
Version 21 works great thank you Xin for the quick fix.
If it is possible it would be nice to fix the file browser. That is, when I select the dhdm the file browser opens at the blender folder, and I have to copy and paste the path from “import custom morphs“, that’s the right folder for the figure custom morphs.
-
reporter I would need Thomas to expose those directories with a function first too. I don’t think duplicating all the set-up done by import_daz would make sense, so Thomas creating those simple functions would be better.
Also, Thomas, I suggest you create a .py file exclusively for “api” functions, so even if the implementation changes later, the api function names remain the same. Also, it’s cleaner to have a file you can look at so you can see what kind of functions you can access from import_daz from another addon. I know you once tried to create some sort of api, but this time the api is supposed to be simpler so it should never contain complex functions.
-
repo owner The function get_default_morph_directories(ob) now returns a list of morph directories for the given object. The list may contain zero, one, or several paths, depending on what directories exist under the various root directories.
Functions intended for external scripting are now collected in the new file api.py.
-
reporter Thanks Thomas.
Also, before the next version, I will try to create an operator to automatically set up all facs hd morphs as driven normal nodes so they respond to the face control rig.
-
reporter I tried to generate normal maps from facs’ HD details (associated .dhdm files) that come with daz, but those barely have any HD details (no proper wrinkles), so they are basically useless. Don’t know why daz even bothers with them.
I will now try to create a set of facs' HD details myself, and store them in files as vertex offsets so they can be re-generated for other genesis figures at least. This will take longer though, but once created they should be able to be re-used on genesis figures with the imported daz face control rig that drives facs shape keys, and this time there should be proper HD details through normals (or displacements). Can’t say how long this will take.
-
reporter - attached v0_22.7z
- Fixed small bug when importing shape keys in some cases.
- Adding morph files now tries to open the file dialogue in the morphs directory (needs import_daz addon to be enabled).
-
I’ve tried this addon recently. I use it to generate an HD mesh with Triumph HD files.
When import_daz’s unit scale in global settings is 1 and blender’s scene unit scale is 0.01, the details in the .dhdm file aren’t imported correctly.
Then I tried the default settings, where import_daz’s unit scale in global settings is 0.01 and blender’s scene unit scale is 1, and the details in the .dhdm file import correctly.
Is there any way to make the imported .dhdm details match the scene unit scale settings?
-
Xin, it seems with 0.22 the jcm issue is back, that was fixed in 0.21. The file dialog works great, now it opens in the morphs folder.
-
Can this addon generate all UV layers of the base mesh for the HD mesh? Right now it just generates the active UV layer of the base mesh for the HD mesh.
Sometimes I use a G8 character’s skin materials and G8.1 eye materials on a G8.1 character. In this situation, the base mesh has two UV layers.
If the HD mesh only has one UV layer, some materials will not render correctly.
-
It seems there are issues with multiple uv maps at least with multires. See
#529. -
repo owner The HD script exports the geometry (mesh and uvs) used for rendering in DS. So the HD uvs are a mix of the uv layers, depending on material. HD uvs that are not used by any active material are not available and cannot be exported, because the DS renderer does not use them.
-
reporter Alessandro, I can’t reproduce the error, can you tell me which steps you followed? Be aware that I changed the directory name of the addon, in case you have two versions installed.
For the UV layers, you can still import them through a two-step process as a workaround until multiple UV layers get implemented in this addon. Just change the active layer and generate two meshes, then transfer the UVs from the second to the first, then delete the second mesh. In Blender, to transfer UVs, go to Object menu → Link/Transfer Data → Copy UV Maps (this copies the active UV layer on the active object to the active UV layer on the other, non-active, selected object, or creates a new layer if the selected object has no UV layers yet). For the second mesh, you can generate an unmorphed mesh so it’s faster (the morphs don’t affect UV subdivision so don’t bother with them for the second mesh). But still make sure to use collada if you used collada for the first mesh, since collada import creates a different face order (vertex order is still the same, otherwise morphs wouldn’t work).
As a side note, it’s weird Blender’s UV Copy relies on the face order as opposed to only using vertex order. A tool could be written to transfer UVs only based on vertex matching.
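Such a vertex-matching transfer could look roughly like this (a sketch with a simplified data layout; it only works where UVs are continuous, since a vertex on a UV seam carries more than one UV and would need per-face handling):

```python
# Sketch: map per-loop UVs from a source mesh to a target mesh that shares the
# same vertex order but has a different face/loop order.
def transfer_uvs_by_vertex(src_loop_verts, src_loop_uvs, dst_loop_verts):
    """src_loop_verts/dst_loop_verts: vertex index of each loop (face corner).
    src_loop_uvs: (u, v) of each source loop."""
    uv_of_vertex = {}
    for vi, uv in zip(src_loop_verts, src_loop_uvs):
        uv_of_vertex[vi] = uv  # seam vertices would overwrite each other here
    return [uv_of_vertex[vi] for vi in dst_loop_verts]
```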
For the scale settings, I will create a global scale parameter since the .dll right now is hard coded to use 0.01 to change the dimensions from daz to Blender.
-
Xin, it is my fault someway I didn’t load the jcms .. version 0.22 works great !
As for “copy uv maps” in
#529, I suspect the uv maps are slightly different where the multires geometry is different from the base mesh. That is, we know that multires is not exactly the same as the base mesh. But this doesn’t explain why multiple uv maps don’t work with multires. That is, it seems we have to merge them all into a single uv map for them to work with multires. Do you have any idea?
repo owner Alessandro, there is only one uv map because the hd exporter exports the cached geometry, which DS uses for rendering. The cached uv map is unique - each face covers a single area in uv space at render time. This cached uv map is created somehow from the uv maps for the base mesh, but the renderer does not need to know how that was done.
-
reporter I think these are different issues. What 东方夕惕 meant is that this addon, not import_daz, doesn’t create the other UV layers besides the active one on the generated HD mesh. For that, you can use “Copy UV Maps” as explained above.
-
Xin is right. I know that import_daz generates the HD mesh with one mixed UV layer. But xin’s addon does not, it just copies the active UV layer of the base mesh.
-
reporter - attached v0_24.7z
- Scale setting to correctly handle scales when using a scale other than 0.01 with import_daz.
-
Thanks Xin! Now I can import details in dhdm files when import_daz’s global unit scale is 1 and blender scene unit scale is 0.01.
Just set the unit scale values of HD Morphs addon same as import_daz’s unit scale.
-
reporter - attached v0_28.7z
- Fixed problem with the collada exporter being unable to handle vertex groups' names with spaces in them.
- Addon now only accepts .dsf files (.dhdm files are no longer accepted directly). You can specify whether you want to use the HD morphs associated with a .dsf file in the morphs list panel.
- Initial support for being able to handle meshes after they were merged with a geograft mesh (only a single geograft for now) with import_daz. For this purpose you need to use the new Geograft operator which will generate a file in the working directory.
-
reporter To use the new geograft operator:
- Create a copy of the body mesh and geograft mesh just before merge.
- Merge one set of body-geograft meshes with import_daz, as usual. Assume the merged mesh is called “merged”.
- Generate file with the new Geograft operator: select the body mesh, then the geograft mesh and finally the “merged” mesh, which should be the active one. Click the button. A .json file will be generated in the working directory.
- If you ever want to use the “merged” mesh as a “base mesh” with this addon (for example, to create a rigged HD merged mesh from it), you need to also specify the generated file above in this addon’s main settings for the morphs to work properly.
In principle, it should be possible to morph the “merged” mesh with either original body morphs or original geograft morphs, although I only tested it with body morphs since I don’t have morphs for geografts (especially HD).
-
reporter - attached v0_4.7z
- Addon now handles multiple UV layers. As a result, there is a new option to specify the UV layer in the operators that bake textures. For hd meshes, all UV layers from the base mesh will be created in the hd mesh.
-
repo owner Xin, are you still using the functions that return individual files, i.e. get_dhdm_files(ob) and get_morph_files(ob), or do you only use the functions that return the directories? Today I was documenting the tools needed to specify the files, but if you don’t use them I would like to remove them before the 1.6 release.
-
repo owner Also, does your add-on have a homepage? I referred to this thread in the docs, but it would be better to link to a github directory or so.
-
reporter The only functions that are used are “get_morph_directories()” and “get_default_morph_directories()”.
As for the homepage, it doesn’t have any. I will try to write at least some documentation in a pdf and post it here eventually. Maybe then it could also be moved to a repository.
-
repo owner The api functions have been removed, as well as the tools needed to select the files.
If you like I could include the latest version of your add-on in the 1.6.0 release, with due credit of course. It is not possible to put several add-ons in the same Bitbucket repo because of the way that Bitbucket generates zip files, but for the release I make the zip file myself, and can put several directories at the top level.
-
reporter Ok, use “v0_4.7z”.
-
Xin, awesome work. Really appreciate this add-on, in addition to the work done by everyone else.
When I import base G8, apply all standard morphs, generate an HD mesh, then import JCMs, I get an error message. I’ve narrowed it down to pJCMNeckTwist_Reverse.dsf. The error message pop-up is no big deal, but when it happens, none of the JCMs will work. Is there a way to skip JCMs that cause this error so the others will still work? I’ve tried this on other imported JCMs and random morphs; almost all of them have something that causes this error, so I have to import them one at a time until I find the specific .dsf that is causing the issue.
Also, when importing .dsf files, is it possible to exclude the .dhdm files from the file listing? It takes quite a while to click through a directory selecting only the .dsf files. It would be a big time saver (I tried saving all the .dsf files in their own folder, but it seems the .dhdm files have to be there too, so it didn’t work).
-
reporter - attached v0_41.7z
- File browser for adding morphs now filters .dsf files (as explained above, .dhdm files are no longer accepted directly; use the "morph type" option in the morph list to exclusively use the HD portion of morphs).
- Addon will now report invalid morphs in the console and ignore them instead of throwing an exception.
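The second change can be sketched as follows. This is not the addon’s actual code, just a minimal illustration of the behavior: a morph entry that carries no vertex deltas (a slider-only morph such as pJCMNeckTwist_Reverse) cannot become a shape key, so it is reported to the console and skipped instead of aborting the whole import with an exception.

```python
# Minimal sketch (not the addon's real code) of "report invalid
# morphs and ignore them". morph_entries is assumed to be a list of
# (name, deltas) pairs, where deltas is empty for slider-only morphs.
def load_morphs(morph_entries):
    loaded, skipped = [], []
    for name, deltas in morph_entries:
        if not deltas:
            # e.g. pJCMNeckTwist_Reverse: drives other morphs but
            # carries no geometry of its own
            print(f"Ignoring invalid morph (no geometry): {name}")
            skipped.append(name)
            continue
        loaded.append(name)
    return loaded, skipped
```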
-
reporter Those errors are due to those morphs not really being morphs but sliders for other morphs. They contain no information about geometry, hence they aren’t shape keys in Blender’s context.
Even though I made the changes above, Krampus, you shouldn’t be loading JCMs the way you are. There is a better way.
This addon is intended to work with the import_daz addon, so a better way is to load all JCMs on the base figure with import_daz, as usual.
Then generate the HD mesh with this addon, while having “Copy base shape keys” checked. This will import, to the HD mesh, the base shape keys and also the HD details of any .dhdm files associated with them. Finally, it will create the drivers for those shape keys like in the base mesh.
This way, you don’t need to manually select all JCMs and create a huge list of morphs. You can use import_daz for that step, and then the HD mesh generation just migrates the shape keys automatically.
You should also get all the rig sliders that import_daz creates on the armature, and they should work fine on the HD mesh too. For example, an expression slider on the rig of the generated HD mesh should pose the rig’s face bones and also drive any associated HD shape key.
Of course, the option “Copy drivers from base” when using the operator to import HD shape keys does something similar, but it’s less convenient when trying to import a lot of morphs since you first need to add them to the morphs list panel.
-
Thanks for the update and explanation. I must be doing something wrong, so I’ll keep working at it. Everything you described works, except that the generated HD mesh does not have the JCMs applied automatically. That is why I was importing them. The JCMs are not listed in the morphs panel in import_daz, and the shape key panel is empty. There is no place that shows if a JCM is active on the HD mesh, so I did a comparison on the screenshot below to confirm.
When I manually import the morphs to the HD mesh (after importing them to the base mesh), it does copy all drivers and the shape key panel is populated with the JCMs and morphs I import. It’s a beautiful thing.
-
reporter That’s weird. Are you importing the morphs on the base mesh before generating the HD mesh? What does the console output say? Also, can you give a few examples of which morphs those are?
Make sure you have the latest version of import_daz installed and enabled, since upon generation this addon relies on import_daz to tell it where to find the morphs already imported on the base mesh. If this addon can’t find that function (“get_morph_directories()”), it will skip the entire morph migration process.
When manually importing shape keys with the HD shape key operator, the pairing is done based on shape keys' names (which are the same as the ids in their .dsf files), so it doesn’t rely on import_daz at all. That’s probably why this is working.
If you have an older version of import_daz, you will need to download the latest version, start a new blend file, and re-import the morphs to the base mesh for it to work.
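The name-based pairing described above can be sketched in plain Python. Note the simplification: this sketch approximates each morph’s .dsf id by its filename, whereas the thread says the id is stored inside the .dsf file; the function and variable names are illustrative, not the operator’s real code.

```python
import os

# Sketch of pairing HD shape keys to morph files purely by name,
# with no import_daz involvement. The real addon reads the id from
# inside each .dsf; here the basename stands in for that id.
def pair_shapekeys_to_morph_files(shapekey_names, morph_files):
    by_id = {os.path.splitext(os.path.basename(path))[0]: path
             for path in morph_files}
    # keep only the shape keys that have a matching morph file
    return {name: by_id[name]
            for name in shapekey_names if name in by_id}
```

This is why the manual HD shape key operator works even with an older import_daz: the match depends only on the names agreeing, not on any API call.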
-
Ok, I think I got it. Initial tests with all standard morphs works as you described.
The console gave an error saying “import_daz” not found. I made sure your add-on was in the same directory as import_daz (it wasn’t initially because of previous manual installs).
More importantly though, installing the Diffeomorphic 1.6 stable version creates the import_daz folder, while installing a development version names the folder something else; in my case, ‘Diffeomorphic-import_daz-179952114d49’. So I guess it’s necessary to at least have a copy of 1.6 stable installed along with any development versions, or to rename the development version’s folder.
Thanks for your help!
-
reporter Neither the directory nor its name matters, as long as Blender recognizes the addon (in the addon list, for example). If you want, you can also just rename the directory of the development version to “import_daz”, and it will still work.
What I think was happening to you is that you had installed and enabled an older import_daz version, but the function that this addon needs was added relatively recently, so even though import_daz was there, it didn’t have the relevant function.