DNACalib SetVertexPositionsCommand is not efficient! #31
DNACalib can change neutral joint positions, but it does not seem to change neutral mesh positions.
This code does not change the mesh positions in the DNA file:
from dnacalib import CommandSequence, SetVertexPositionsCommand, VectorOperation_Add

# Apply per-vertex deltas to the neutral mesh of the given mesh index
new_neutral_mesh = SetVertexPositionsCommand(
    mesh_index, deltas, VectorOperation_Add
)
commands = CommandSequence()
commands.add(new_neutral_mesh)
commands.run(calibrated)
Only run_joints_command takes effect.
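Here is a minimal end-to-end sketch of how the command can be used, modelled on the load/save pattern in dnacalib_demo.py; the file paths, the mesh_index value and the example deltas are placeholders, not values from the original script. A common cause of the change not showing up is that the modified DNACalibDNAReader is never written back out to a new DNA file.

from dna import BinaryStreamReader, BinaryStreamWriter, DataLayer_All, FileStream
from dnacalib import (
    CommandSequence,
    DNACalibDNAReader,
    SetVertexPositionsCommand,
    VectorOperation_Add,
)

def load_dna(path):
    # Read the whole DNA file into memory
    stream = FileStream(path, FileStream.AccessMode_Read, FileStream.OpenMode_Binary)
    reader = BinaryStreamReader(stream, DataLayer_All)
    reader.read()
    return reader

def save_dna(reader, path):
    # Serialize the (possibly modified) reader into a new DNA file
    stream = FileStream(path, FileStream.AccessMode_Write, FileStream.OpenMode_Binary)
    writer = BinaryStreamWriter(stream)
    writer.setFrom(reader)
    writer.write()

reader = load_dna("original.dna")        # placeholder path
calibrated = DNACalibDNAReader(reader)   # copy that the commands will modify

mesh_index = 0  # head_lod0_mesh in a stock MetaHuman DNA
# Example deltas: move every vertex of that mesh up by 1 cm on Y.
# Depending on the bindings, dnacalib.Vector3 elements may be needed instead of plain lists.
vertex_count = calibrated.getVertexPositionCount(mesh_index)
deltas = [[0.0, 1.0, 0.0]] * vertex_count

commands = CommandSequence()
commands.add(SetVertexPositionsCommand(mesh_index, deltas, VectorOperation_Add))
commands.run(calibrated)

# Without this step the change only exists in memory
save_dna(calibrated, "modified.dna")     # placeholder path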
Hello,
It is difficult to say from this code snippet what the problem could be, but I would advise you to check what you have in the deltas list. Those values should be the difference between the new and old vertex positions. The other thing is mesh_index; make sure it has the correct value.
Yes, I have checked the deltas list. They are the difference between the new and old vertex positions. I focus on mesh_index 0, head_lod0_mesh.
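If in doubt about the mesh index, the DNA reader itself can list the meshes it contains. A tiny sketch, assuming a reader loaded the way the bundled examples do:

# Print every mesh index and name in the DNA to confirm which one is head_lod0_mesh
for i in range(reader.getMeshCount()):
    print(i, reader.getMeshName(i))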
I have tested all the example code that uses SetVertexPositionsCommand: dna_viewer_grab_changes_from_scene_and_propagate_to_dna.py, dnacalib_demo.py and dnacalib_neutral_mesh_subtract.py.
I cannot change the head mesh with SetVertexPositionsCommand. I also tried build_meshes and build_rig after using SetVertexPositionsCommand.
Did I miss some steps? Is there any special step to using SetVertexPositionsCommand? Has anyone changed the head mesh successfully with this command?
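One way to rule out the Maya viewer and build steps is to compare vertex positions straight from the DNA files. A small sketch, reusing the hypothetical load_dna helper from the earlier snippet, with placeholder file names:

# Compare the first vertex of mesh 0 before and after calibration
before = load_dna("original.dna")
after = load_dna("modified.dna")
mesh_index = 0
print("before:", before.getVertexPositionXs(mesh_index)[0],
      before.getVertexPositionYs(mesh_index)[0],
      before.getVertexPositionZs(mesh_index)[0])
print("after: ", after.getVertexPositionXs(mesh_index)[0],
      after.getVertexPositionYs(mesh_index)[0],
      after.getVertexPositionZs(mesh_index)[0])

If the values are identical, the command (or the save step) never modified the DNA; if they differ, the problem is in how the new DNA is brought back into Maya or UE.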
Can you give us the full script that you are using, so we can take a look at it?
This is my code. It is changed from dna_viewer_grab_changes_from_scene_and_propagate_to_dna.py.
I read some MetaHumans' DNA vertex positions and save them to new DNA files.
The face SkeletalMesh of the MetaHuman was not changed after I imported these new DNA files.
We want to generate a new DNA file by modifying the head vertex positions directly. Could you give us a tested example that reads vertex positions from a DNA file and saves them to a new DNA file?
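An untested sketch of that idea: copy the neutral mesh from one DNA file into another by turning the position difference into deltas for SetVertexPositionsCommand. The file names, the mesh index and the plain-list delta format are assumptions, it reuses the hypothetical load_dna / save_dna helpers from the earlier snippet, and both DNAs must share the same topology.

from dnacalib import CommandSequence, DNACalibDNAReader, SetVertexPositionsCommand, VectorOperation_Add

def copy_neutral_mesh(source, target, mesh_index):
    # Neutral vertex positions of the source and target meshes
    sx = source.getVertexPositionXs(mesh_index)
    sy = source.getVertexPositionYs(mesh_index)
    sz = source.getVertexPositionZs(mesh_index)
    tx = target.getVertexPositionXs(mesh_index)
    ty = target.getVertexPositionYs(mesh_index)
    tz = target.getVertexPositionZs(mesh_index)
    # Per-vertex difference, so that target + delta == source
    deltas = [[sx[i] - tx[i], sy[i] - ty[i], sz[i] - tz[i]] for i in range(len(sx))]
    commands = CommandSequence()
    commands.add(SetVertexPositionsCommand(mesh_index, deltas, VectorOperation_Add))
    commands.run(target)

source = load_dna("sculpted_head.dna")                # DNA holding the wanted positions
target = DNACalibDNAReader(load_dna("original.dna"))  # DNA to be modified
copy_neutral_mesh(source, target, mesh_index=0)
save_dna(target, "new_head.dna")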
EDIT START 1 13 09 2023
Finally discovered how to do this.
discussion thread
You can see the documentation on my git
documented process
EDIT END 1 13 09 2023
Hi marijavik,
Is there a code snippet or an example file that shows how to use a blend shape to swap the original MH head with a new shape (same topology) and then run set_vertices or other commands? I feel a lot of us (perhaps even he1chenglong) are struggling to understand that workflow.
As of now the grab_changes_from_scene script allows us to tweak a mesh using joint displacement, but most of us are looking to swap the head out with a custom one with a few tweaks. The process / methodology for that is unclear.
Could the team please elaborate on that?
Thanking you,
b
Hello,
Usually I generate a mesh in Maya from the DNA file and, in that scene, import another mesh containing the changes I want. Then I run run_vertices_command and save the changes in a new DNA file. To validate, I generate a mesh from the new DNA file, import the model mesh again and compare. Something like this:
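A simplified sketch of that workflow: the mesh names and DNA paths are placeholders, load_dna / save_dna are the hypothetical helpers from the earlier snippet, and it ignores any unit or axis conversion the bundled Maya scripts may perform.

from maya import cmds
from dnacalib import CommandSequence, DNACalibDNAReader, SetVertexPositionsCommand, VectorOperation_Add

ORIGINAL_MESH = "head_lod0_mesh"       # mesh generated from the DNA
MODIFIED_MESH = "modified_head_mesh"   # imported mesh with the wanted changes

def get_object_space_positions(mesh_name):
    # Slow but simple: query every vertex position of the mesh in object space
    count = cmds.polyEvaluate(mesh_name, vertex=True)
    return [
        cmds.xform(f"{mesh_name}.vtx[{i}]", query=True, objectSpace=True, translation=True)
        for i in range(count)
    ]

old_positions = get_object_space_positions(ORIGINAL_MESH)
new_positions = get_object_space_positions(MODIFIED_MESH)

# Per-vertex deltas between the modified mesh and the mesh generated from the DNA
deltas = [[n[0] - o[0], n[1] - o[1], n[2] - o[2]]
          for n, o in zip(new_positions, old_positions)]

calibrated = DNACalibDNAReader(load_dna("original.dna"))
commands = CommandSequence()
commands.add(SetVertexPositionsCommand(0, deltas, VectorOperation_Add))
commands.run(calibrated)
save_dna(calibrated, "modified.dna")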
One important note - once you're happy with your changes in Maya, do not forget to generate FBX files. It won't be enough to just replace the DNA files in UE; you need to reimport the FBX files as well.
Thanks for both answers, MariJavik.
I'm glad you brought the FBX export part up.
I think there is a problem with just exporting the modified DNA scene FBX to Unreal Engine.
There is a bone structure mismatch, at least with the results of the "grab_changes_from_scene.py" example file.
Is there some prep required before we export it?
Some tutorials show how to do that by renaming the spine bones to DHI:root and removing a few joints from the body.
For example, this is the MetaPipe tool's prep process.
prepare for export
Ideally we'd like to have the "grab_changes_from_scene.py" script assemble the head rig with the correct naming and bone structure so we could send it to UE. If you could share a code snippet around that it would be golden.
I'm learning so much from you.
Thanking you abundantly
b
Hi,
For FBX generation we need some body joints, and those can be found here. Normally, we would take an existing MH body and delete the meshes and extra joints so that we get a joint chain like in those files. No need to rename anything. After the body is prepared, use the script for FBX file generation.
wow !
Can't wait to try this out and make it work inside UE.
Cheers,
b
Hello MariJ,
I followed the procedure you outlined above, but my FBX is looking deformed. I disabled the skin cluster to see if the body joints used from the data/body folder are at a different height from the MH I downloaded from Bridge.
deformed face
The default MH file from Quixel has the head bones at a different height (as seen in the comparison pic below).
original MH vs DNA modified
Should I unbind the skin, then reposition the mesh and re-bind it? I'm just worried about the head portion not meeting the body seam in UE if I do this.
Any idea how I should be going about fixing this ?
EDIT START 1 18 09 2023
On further inspection, the offset is being caused because in the modified DNA result file the head comes in 2 units below the normal MH file from Quixel. This is because of the flip-flops, which are approximately 2 units tall.
modified dna head coming in 2 units below the quixel mh
https://pasteboard.co/SDEz7NvmoDyB.png
EDIT END 1 18 09 2023
Thanks
Hi @booomji
Just make sure that you take the DNA and body file from the same MetaHuman, and do prepare the skeleton file so it contains the same joints we have in the body folder. In that case you should have corresponding heights for both body and head. Do not take any of the predefined skeletons, as most likely they won't fit; those are skeletons that fit the Ada and Taro DNA files and can be used with those.
Thank you mariJ.
Just to be sure I understood you correctly.
Thanks for the clarification
b
Yes, that is correct. In the body folder we prepared bodies for Ada and Taro, and I guess yours is different, so you need to prepare the one you downloaded from Bridge.
Thank you MariJ.
Finally, do we have the ability to add blend shapes via the DNACalib commands?
I know there is a SetBlendShapeTargetDeltasCommand. I am assuming this will only change an existing blend shape and not add another blend shape to the list.
For example, if I want to animate a Pinocchio nose in the sequencer (or BP), I could
Is this possible to do procedurally (DNA toolset)?
Right now I'm adding them via the steps shown in the tutorials below, but it would be nice to have this functionality in the DNA suite of tools.
Adding blend shapes & interactively modifying them in the editor
Adding blend shapes to metahumans
Thanking you,
b
Hello @booomji,
Unfortunately, we do not have an option to add new blend shapes, only to update existing ones with SetBlendShapeTargetDeltasCommand. So you are right.
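To find the mesh and blend shape target indices that SetBlendShapeTargetDeltasCommand expects, the existing targets can be enumerated from the DNA reader. A small sketch, reusing the hypothetical load_dna helper from earlier and a placeholder path:

reader = load_dna("original.dna")

# List every blend shape target per mesh, with the channel name it drives
for mesh_index in range(reader.getMeshCount()):
    for target_index in range(reader.getBlendShapeTargetCount(mesh_index)):
        channel_index = reader.getBlendShapeChannelIndex(mesh_index, target_index)
        print(reader.getMeshName(mesh_index),
              target_index,
              reader.getBlendShapeChannelName(channel_index))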
Thank you Marij. You have been super helpful.
I hope Epic brings you on their channel for a deep dive on this topic for all the devs out there.
Cheers,
b
If you mean at UnrealFest for the talk MetaHuman: DNA Calibration Deep Dive - I will be there :)
WOW this is fantastic news !
I hope the talk will be recorded. It will be an invaluable resource.
Cheers,
b
Hello MariJ.
Was your session recorded?
Very eager to view it.
Cheers,
b
Hello @booomji,
It has been recorded and we expect it soon to be available on YT.
Hello Mari J.
The video has still not been uploaded. Could you please ask the team to upload it soon?
Thanks so much.
b
Hello @booomji,
The video has been uploaded today - link
@marijavik Sorry, I couldn't find a better place to post this; if there is, please let me know and I will move it.
I want to generate random MetaHumans programmatically. I think I can do this by editing MetaHuman DNA files, but I'm not sure how to go about it. I saw your talk at Unreal Fest and read through the examples, GitHub docs and code, but couldn't find a way to do it.
My working assumption right now is that if I can find the vertices corresponding to different parts of the face, like the mouth, nose and eyes, I can scale them randomly (within reason) to create an entirely new MetaHuman, BUT I'm not sure how to find the relevant vertices for each part of the face.
Any help would be appreciated
omg omg this is AMAZING !
Thank you so much.
b