Over the holidays I may get access to a professional mocap system, the kind with a couple dozen sensors on your body tracking your full skeleton, including fingers. This is the type used by professional animators, video game developers, etc.
This has real-time mocap output, and they say they have plugins for Unity, though I'm not sure of all the details yet. I'm wondering if it would be possible to use their live mocap plugin with VAM, possibly with some custom VAM plugin code, or whether support would have to be written into VAM itself.
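For context on what I mean by "custom VAM plugin code": one common way to feed live mocap into an application without source access is a small network bridge, where the mocap software streams bone transforms (e.g. over UDP) and a plugin inside the target application parses them each frame and applies them to the character's controllers. This is just a sketch of the parsing side, assuming an entirely hypothetical packet format of (bone id, position xyz, rotation quaternion) as little-endian floats; the real plugin's wire format would be whatever the vendor defines:

```python
import struct

# Hypothetical wire format: one bone per record.
# uint16 bone_id, then 7 little-endian float32s: pos x/y/z, quat x/y/z/w.
RECORD = struct.Struct("<H7f")

def parse_packet(data: bytes):
    """Split a UDP payload into (bone_id, position, quaternion) tuples."""
    bones = []
    for offset in range(0, len(data), RECORD.size):
        bone_id, px, py, pz, qx, qy, qz, qw = RECORD.unpack_from(data, offset)
        bones.append((bone_id, (px, py, pz), (qx, qy, qz, qw)))
    return bones

# Pack one sample record (bone 0 at head height, identity rotation)
# and parse it back, as the receiving plugin would each frame.
sample = RECORD.pack(0, 0.0, 1.6, 0.0, 0.0, 0.0, 0.0, 1.0)
bones = parse_packet(sample)
```

Inside VAM the receiving end would be a C# plugin doing the same unpacking in its update loop, then writing the poses onto the character's control nodes, but I don't know yet whether VAM's plugin API exposes enough of the skeleton for that.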
I've seen examples of this setup being used to live animate characters in Unity's editor, but I'm not sure how big a deal it would be to get it to drive VAM without the source code.
Just to be clear, I know I can record mocap with this and import it into VAM. What I want to do is animate the character in real time, like you can with VIVE trackers, so I could live stream the output, talk to people through a VAM character, etc.
I'm a coder but I've mostly worked with Unreal Engine, no experience with Unity (yet).
So is this possible? How would I set it up?