We don't really need this; it's very easy to create mocap animations in VAM using any VR headset, an Oculus Quest 2 for example.
Hint: you don't have to record the whole body at once. Record one hand, rewind, record the other hand, rewind, then record hips, head, and so on. Very easy and fun!
Yes, you can use the built-in VAM animation recording, or you can use Timeline recording. It works more or less the same, but Timeline is probably better because you can have multiple animations at the same time.

Whoa, that's news to me, bigly!!!! Any direction you can point me in for more info on how to do this? I've got a Quest 2 and had no idea it was capable of mocap.
I just posted about this subject in another thread where I'm having trouble with the .bvh files both Deep Motion and Plask.ai generate. Neither play with BVH Player in VaM although I have other .bvh files downloaded elsewhere that play fine.
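One quick way to narrow down why some .bvh files play in BVH Player and others don't is to compare the joint names declared in a file that works against one that doesn't. This is a hypothetical little script, not anything official from VaM or BVH Player; it just assumes the .bvh is a normal text file with ROOT/JOINT declarations in its HIERARCHY section:

```python
# List the joint names declared in a .bvh file's HIERARCHY section so you
# can compare them against a file that *does* play in VaM's BVH Player.
import re

def list_bvh_joints(bvh_text: str) -> list[str]:
    """Return joint names from ROOT/JOINT declarations, in file order."""
    return re.findall(r"\b(?:ROOT|JOINT)\s+(\S+)", bvh_text)

# Tiny inline sample; in practice read the text from your exported .bvh file.
sample = """HIERARCHY
ROOT Hips
  JOINT Spine
  JOINT LeftUpLeg
"""
print(list_bvh_joints(sample))  # ['Hips', 'Spine', 'LeftUpLeg']
```

Run it on a Deep Motion or Plask export and on a known-good file; if the names (or the number of joints) look completely different, that mismatch is a likely suspect.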
But I can say that Deep Motion could be a very useful tool for making poses if we can get it to work. You get 30 free seconds/credits per month. I would not use it for mocap of a 2D video to a 3D figure; I think Plask.ai is better for that. But Deep Motion lets you import a 2D photo and turn it into a 3D figure, and this works pretty well. It only costs one credit/second per photo, so you can do 30 poses a month for free. A good source for full body model poses is www.fineart.sk
Plask.ai is what I've been playing with for 2D video to 3D figure mocap. You only get a limited number of free uses per month, I think it's around 30 seconds, but if you do this a lot it might be worth it. A good source for 2D full body videos with a solid color background (white, black, or green) is stock footage sites like Shutterstock. You can download their preview videos and import them into Plask; the watermark on the file doesn't seem to impact the final animation. The final 3D mocap appears to be good quality, although like I said I have not been able to check this with a VaM model using the BVH Player plugin. YouTube is also a good source for videos to convert to mocap, such as dance videos.
I also did not know you could create mocaps with the Quest 2. I'm still not clear on the process. Are you supposed to possess a model and move them around as you would in real life, recording it? In that case what are you doing, making fucking motions into the air... or another person?
I wish someone could make a tutorial video on how to use the Quest 2 for mocap in VaM.
Have you tried bringing in a 2D porn video into 3D?
The resulting BVH file will most likely not be using Genesis 2 skeleton/bone names, so you would have to retarget it to the Genesis 2 body for it to work with VAM.
Has anyone successfully created a VAM mocap generated from a traditional 2D video?
That must be the reason I can't get them to work with BVH player in VaM. That's a shame because I think they would be useful. I tried one with a 30 second clip of a hip hop dancer and it came out very well. Just can't use it in VaM.
My rule of thumb: if you have to mess around converting files and ticking stuff like Unity or DAZ boxes, then mess some more with code and tick boxes, then it's best to wait until someone makes a pipeline plugin that does it all for you.
I felt the same way. Have been trying several methods and the steps you need to go through are quite time consuming. So far Deep Motion seems to take the least amount of time so they're my first choice at the moment.
There are some people online who use the Kinect method, but I have not tried it yet. Seems like I might get a similar result just using the built-in possess mocap feature of VaM. I heard Sony is coming out with a cheap mocap solution in 2023, so that's something to keep an eye on.
Maybe we are too lazy to learn to use Acid Bubbles' excellent Timeline properly. I'm sure if we did, we'd get similar results, and there's none of that brain-painful filling out of forty check boxes before you export from one app/game engine to another. My god, I once exported a model from Daz into Unity for lip sync, then set up the lip sync for it! What a performance! A whole week of watching 14 videos, plus an exchange of how-to PMs with the lip sync developer, etc. Adamant's lip sync in VAM does exactly the same with none of the stupid faff!
I'd love to make clothing using Marvelous Designer, good female clothes for futa models, but I simply can't be bothered to import from MD to Daz, then to Blender, then to VAM. It's not worth the payoff.
Are you suggesting... to manually edit a timeline animation in sync with a video? Or are you saying that timeline somehow has the ability to capture motion from a 2d source?