
Creating a new MMD Plugin

DarkChris

Hi, I'm DarkChris - a Blender animator and programmer with a lot of experience converting MMD animation using Python scripts.

While I'm very impressed with VAM's abilities, I've noticed the MMD options don't support facial expressions, and that simply won't do for me. A few years ago CraftyMoment put together a decent Python script to translate MMD animation:


Using his approach as a starting point, I've already corrected a few joint rotations, and tweaked the physics. (I've posted 2 of my test conversions in Free Scenes to demonstrate the progress)

And now I've found the import scripts we used for Blender to get facial expressions:


Progress updates and more demos to be added shortly.
 
Greetings, CuteSvetlana - the pleasure is mine.

I had to stop late last night to do some paid work, but I've got some time this week I can use to improve what I have so far for the VMD code. I was able to find the encoding for the VMD animation files, which should help me work out the byte layout for the facial expression data:
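For any fellow programmers curious about that layout: the morph (facial) keyframes sit right after the bone block in a VMD file, so a reader can skip straight to them. Here's a minimal sketch based on the community-documented format (the 111-byte bone record and 23-byte morph record sizes come from those docs, not from this thread, so verify against your own files):

```python
import struct
from io import BytesIO

def read_vmd_morph_frames(data: bytes):
    """Parse the facial-morph keyframes out of a raw VMD blob.

    Community-documented layout:
      header: 30-byte magic "Vocaloid Motion Data 0002" + 20-byte model name
      bone frames: uint32 count, then 111 bytes per record
      morph frames: uint32 count, then 23 bytes per record
        (15-byte Shift-JIS morph name, uint32 frame number, float weight)
    """
    buf = BytesIO(data)
    buf.read(30 + 20)                      # skip magic + model name
    (bone_count,) = struct.unpack('<I', buf.read(4))
    buf.read(bone_count * 111)             # skip bone keyframes entirely
    (morph_count,) = struct.unpack('<I', buf.read(4))
    frames = []
    for _ in range(morph_count):
        raw_name = buf.read(15).split(b'\x00')[0]
        name = raw_name.decode('shift_jis', errors='replace')
        frame_no, weight = struct.unpack('<If', buf.read(8))
        frames.append((name, frame_no, weight))
    return frames
```

The Shift-JIS decoding is the language-barrier part mentioned later in this thread: morph names come out in Japanese and still need mapping onto VAM's morph names by hand or by lookup table.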


While there's still a lot of work to be done, if you have any dances you'd like included in the test runs, let me know. I also have a thread going in the plugins section with more links/sources.

No issue. I can be patient. :)
There is this iconic dance that I've watched so many times 😁 . I believe the music & dance would fit nicely with VaM's spirit.
I was contemplating doing it the Crafty Moment way before you popped up on the Hub. It might be good for your tests, as the video shows face and eye movement.

I have a copy of the VMD files should you need them for your tests.

I look forward to checking whether my system can handle up to 3 models dancing at the same time. Welcome to VaM!
 
That's a very good animation - I'll use that one in my next batch of tests tonight. I'm currently going over GitHub libraries to find all of the encoding parameters needed to upgrade my current workflow for facial animations. Some good finds so far for any fellow programmers following along:



I should have some more tests to upload later this evening.
 
OK, so it's definitely possible. I've already taken half a dozen expressions and automated their import into VAM. But there are a lot more to go, and then I'll still be tweaking them for a while.

The biggest issue I see so far is that while VAM's engine can definitely handle it, VAM's interface cannot. VAM's trigger menu for scene animation will lock up hard once you put in several hundred to a few thousand animation triggers, and that's what it's taking so far.
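For anyone wanting to cut that trigger count, a standard trick (my suggestion, not something already in the converter) is to thin the keyframes before they ever become triggers: drop any morph key that plain linear interpolation between its neighbours would reproduce anyway. A rough sketch, where the tolerance is an arbitrary value to tune by eye:

```python
def decimate_keyframes(frames, tol=0.02):
    """Greedy keyframe thinning for one morph channel.

    `frames` is a sorted list of (frame_number, weight) pairs with distinct
    frame numbers. A middle keyframe is dropped when linear interpolation
    between the last kept key and its successor reproduces it within `tol`,
    cutting the trigger count without visibly changing the curve.
    """
    if len(frames) <= 2:
        return list(frames)
    kept = [frames[0]]
    for i in range(1, len(frames) - 1):
        (f0, w0), (f1, w1), (f2, w2) = kept[-1], frames[i], frames[i + 1]
        t = (f1 - f0) / (f2 - f0)
        predicted = w0 + t * (w2 - w0)
        if abs(predicted - w1) > tol:
            kept.append(frames[i])
    kept.append(frames[-1])
    return kept
```

MMD exports tend to bake a key on every frame, so even a loose tolerance can remove the bulk of them on slow-moving channels.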

I've only had VAM for a week, and I've spent far more time coding and learning to mod it than actually playing around with it. So I'm sure this process will be refined as I go.

One more night of coding and hopefully I should be able to crank out a new demo with facial animation.
 
Thank you Chris for the regular updates. It looks very promising.
Feel free to hit me up when you need functional testers. I have quite a few dances at hand.

In the event you hit a roadblock with the coding, there are a good number of people who may be able to help, so don't be shy about asking on the Hub (but not me!).

Cheers :)
 
It's been an interesting process testing how to import new forms of animation into VAM. I'm posting a new test scene now that includes facial animation import. Afterwards, I'll find a decent copy of Gentleman and run it through the same process. The animation in the YouTube link didn't appear to have any facial animation, and it may not be a part of the original. But, since the process I've made adds the facial expression separately, I can probably slipstream the facial animation from another .VMD file to give it some additional kick.

So far, the results are raw and unpolished but show a lot of promise. It definitely works as a proof of concept that it can be done, and it delivers what I wanted, which was a much richer pageant of facial animation. After learning more about VAM and how the plugin structure works, I can refine my workflow into something that can be redistributed.
 
This is some very exciting work you're doing @DarkChris :D Thank you so much
My pleasure. I'm still learning VAM, but I used to do all this stuff manually in Blender so I know the structure for animation and rigging. I also work in tech and have a good familiarity with the JSON format VAM uses for save files. I've been using my moderate programming skills to correct a few things about Crafty Moment's code (which was brilliant, but unfinished) and add in new code to address facial animation.

I can see why so few have attempted it - the language barrier is daunting enough on its own. And testing for the differences in the animation is slow going. To mitigate this, I'll be sharing some of the testing results with the community as I go. I want the end result to be able to save animation directly into the save file, like my examples, to give creators the range to save animations that won't require a special plugin to play.
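To give an idea of the shape of that save-file step: a VAM scene save is plain JSON, so appending keys programmatically is straightforward. Note the field names below ("triggers", "receiver", "morph", etc.) are simplified stand-ins I'm using for illustration, not VAM's real save schema; inspect an actual .json save for the true structure:

```python
import json

def inject_morph_triggers(scene_json: str, person_id: str, frames, fps=30.0):
    """Append morph keyframes into a scene save as animation triggers.

    `frames` is a list of (morph_name, frame_number, weight). The trigger
    schema used here is a placeholder, not VaM's actual format; VMD frame
    numbers are converted to seconds assuming MMD's 30 fps timebase.
    """
    scene = json.loads(scene_json)
    for atom in scene.get('atoms', []):
        if atom.get('id') != person_id:
            continue
        triggers = atom.setdefault('triggers', [])
        for name, frame_no, weight in frames:
            triggers.append({
                'startTime': frame_no / fps,   # VMD animates at 30 fps
                'receiver': 'geometry',
                'morph': name,
                'value': weight,
            })
    return json.dumps(scene, indent=2)
```

Because the output is a normal save file, a scene built this way plays back with no plugin installed, which is the whole point above.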
 
I've made an attempt at converting Gentleman, and posted the file under Free Scenes on the Hub.
 
Hi Chris,
Downloaded and played. Below are my comments (not criticism!) on what struck me the most. Hope you find the information useful.
  • When playing the animation manually, the model is positioned ~2m up in the air for about the first two seconds
  • When watching, the movements that seemed weirdest to me were the shoulders, which go higher than seems possible for a human body
So I don't know anything about body armatures except as a concept. However, we know the MMD body is very petite. Could some of the discrepancies come from the bones themselves not having the same dimension ratios (e.g. length) when comparing the two body armatures?
 

The size of the model is a large part of it, because currently the only positional markers used by the Crafty Moment method are the center of the body and the feet. That means the bulk of the animation is being translated via rotational vectors alone.
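The scale issue in one line of maths: joint rotations transfer between rigs as-is, but positional channels (body center, feet) are in model units, so a petite MMD rig's translations have to be rescaled by the height ratio or the VAM model floats or sinks. A trivial sketch, with illustrative heights in metres:

```python
def retarget_root_position(mmd_pos, mmd_height, vam_height):
    """Rescale a root/feet position keyframe between differently sized rigs.

    mmd_pos is an (x, y, z) tuple in the source model's units; the two
    heights set the uniform scale factor. Rotation channels need no such
    correction, which is why rotation-only retargeting mostly works.
    """
    scale = vam_height / mmd_height
    x, y, z = mmd_pos
    return (x * scale, y * scale, z * scale)
```

This only fixes gross placement; per-bone length mismatches (like the shoulders noted above) still need per-joint correction, since a uniform scale can't account for different limb proportions.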

My first attempts at this have been using that method with a few corrections I've added. Then, I've written my own new code to grab the facial animation data and inject that into the same scene file.

I've mainly used this method as a relatively quick way of proving that it can be done at all, since I haven't seen anything on the Hub that does that yet. On my next run I'll try to integrate it better (like with Timeline) so the imported data is less buggy.

Once I've established how I want the data to be imported into the scene, I'd like to start over on the import code using the Blender tools I'm already much more familiar with. I've had much more accurate results with those because it matches the scale of the model so it can use position data for all of the limbs; this also allows it to use the interpolation data built into the MMD file.
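For reference, that interpolation data is a per-channel cubic Bezier with control points stored as bytes in 0..127 and endpoints pinned at (0,0) and (127,127), per the community VMD docs. A sketch of evaluating one channel by solving x(t) = u with bisection, then reading off y(t):

```python
def vmd_bezier(x1, y1, x2, y2, u, iterations=24):
    """Evaluate one VMD interpolation channel at normalised time u in [0, 1].

    (x1, y1) and (x2, y2) are the stored 0..127 control points. Because x(t)
    is monotonic for valid controls, bisection on x(t) = u converges, and
    y(t) at that t gives the eased blend factor between two keyframes.
    """
    p1x, p1y, p2x, p2y = x1 / 127.0, y1 / 127.0, x2 / 127.0, y2 / 127.0

    def bez(a, b, t):
        # cubic Bezier in one dimension with endpoints fixed at 0 and 1
        return 3 * (1 - t) ** 2 * t * a + 3 * (1 - t) * t ** 2 * b + t ** 3

    lo, hi = 0.0, 1.0
    for _ in range(iterations):
        mid = (lo + hi) / 2
        if bez(p1x, p2x, mid) < u:
            lo = mid
        else:
            hi = mid
    t = (lo + hi) / 2
    return bez(p1y, p2y, t)
```

Honouring these curves instead of linearly interpolating is what keeps the converted motion from feeling robotic between keyframes.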

This is a spare time project, so it might be a while, but I'll still keep posting test scenes as I go to showcase progress that gets made.
 
Fun fact: PSY paid copyright fees in order to use the "arrogant dance" (the hip-swaying part) from Brown Eyed Girls "Abracadabra". Gain from Brown Eyed Girls even appears in the music video for "Gentleman"! I found an MMD video of "Abracadabra" here:


It might be another one to consider using, since it's also in the vein of "sexy dances".
 