
Plugin to convert FUNSCRIPT Data into motion, is it possible?

Cley

Hello smart People,
I have done a simple project using the funscript data of a porn video to create the following animation: Cowgirl audio sync test (newgrounds.com)
The animation isn't really fancy at all; I just used the different timestamps from the funscript editor and moved the z axis accordingly.
I used Paid Plugins - RenderToMovie(MMDEditorPack) | Virt-A-Mate Hub (virtamate.com) afterwards to get a precise framerate of the animation.

The animation is not good, but the sync with the audio from the funscript data is solid. My idea would be to implement a plugin that can take the funscript data and transform it into motion automatically. The only things the player would have to determine are the atom, the direction or rotation the funscript should drive, and the min and max positions.
I guess it would be a challenge to implement it so that funscript data for rotation and direction could work with the same atom.
I just started my programming journey with C++ and don't even know where to start.

My question would be, is it possible?
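For reference, a funscript is just a JSON file with an "actions" array of millisecond timestamps ("at") and 0-100 positions ("pos"). A rough Python sketch of the core mapping such a plugin would need; the file name and axis range below are placeholders, not anything from an actual plugin:

```python
import json

def load_funscript(path):
    """Read a funscript and return its list of {at, pos} actions."""
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)["actions"]  # e.g. [{"at": 1000, "pos": 80}, ...]

def pos_to_axis(pos, axis_min, axis_max):
    """Map a funscript position (0-100) onto a user-chosen axis range."""
    return axis_min + (pos / 100.0) * (axis_max - axis_min)

# Placeholder file name and axis range, just to show the idea.
for action in load_funscript("example.funscript"):
    z = pos_to_axis(action["pos"], 0.0, 0.15)
    print(f'{action["at"]} ms -> z = {z:.3f}')
```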
 
I for one would definitely love to see this and would support it, even as a single axis. Extra points if it can talk directly with VAMSync. This would allow us to take our favorite scripted human performances and animate our favorite VaM characters with them, with full and accurate interactive toy (OSR + Handy) support.
 
I spent a LONG time on this and had a breakthrough. I am figuring out how to monetize this (I have been trying to come up with a small revenue stream from this hobby).

Do you think there's an audience that would pay to send me a funscript and get back a plugin with a timeline / animation pattern in return?

I've tested it on the Handy, and it works perfectly and is very customizable.
 

I think if you want to create a business model, you might want to improve the clarity of your explanations :p
What are you talking about here?
How would people sending a funscript and getting an animation pattern for VAM in return be interesting? An animation pattern for what? What is the goal of your thing... I don't understand (and I'm trying very hard).
 
Hey Hazmox. Hah, sorry I was less than clear.

I've developed a workflow to go from a recorded funscript file to a Timeline animation file. The Timeline animation file can easily be loaded into the Timeline plugin.
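Roughly, that conversion boils down to rescaling the funscript's millisecond / 0-100 pairs into seconds / 0-1 keys. A hedged Python sketch; the output structure here is only an illustrative stand-in, not Timeline's actual import schema, and the paths are placeholders:

```python
import json

def funscript_to_keyframes(funscript_path):
    """Turn funscript actions (ms timestamps, 0-100 positions)
    into normalized keyframes (seconds, 0-1 values)."""
    with open(funscript_path, "r", encoding="utf-8") as f:
        actions = json.load(f)["actions"]
    return [{"time": a["at"] / 1000.0, "value": a["pos"] / 100.0} for a in actions]

# Placeholder paths; the output structure is only a stand-in for
# whatever the Timeline import actually expects.
keys = funscript_to_keyframes("example.funscript")
with open("example_timeline.json", "w", encoding="utf-8") as f:
    json.dump({"target": "VariableTrigger value", "keys": keys}, f, indent=2)
```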

The scene is set up such that the Person's controller is driven by an AnimationPattern together with a trigger object.

Inspired in part by reverse engineering how CMC does some of their scene work.
 
So, funscript to Timeline animation, which would, in the end, animate a character. Right?
Or are you talking about a timeline animation driving a toy?

I'm asking the below questions from a "potential customer perspective":

If it's the first, this is a true question: how would anybody be interested in that?
If it's the second one: this is also a true question: how would anybody be interested in that? : 'D

To be more transparent:
  • If it's the first: I don't see ANY benefits of using a funscript to create an animation. There are HUNDREDS of free animations you can grab/strip from the hub to create your scene.
  • If it's the second one: I don't see the appeal nor the benefits of bringing a random external funscript to VAM. On the other hand, having a plugin/app that could create a timeline animation to control a toy, THAT would be interesting. If toy-powered scenes could control toys properly, that would be interesting, in the "VAM animation > dynamic funscript > controls toy" order.
 
Ah! I guess I forgot the most obvious part. All of this is connected to a Handy.

And you may or may not have noticed that last year I asked Blazedust to update "ToySerialController+VAMLaunch" so that the settings are more directly exposed. From there, I built a subscene that includes a handy (no pun intended) panel to take full control in real time over all the scene plugin controls.

So, there is currently no way to take a funscript and use those same encoded movements (which are essentially simple, linear mocaps) in VAM. Instead we all have to rely on the current relative dearth (compared to funscripts) of high-quality mocap scenes published by folks like CMC, KM, C&G, etc.

Meanwhile, there is an entire world of funscripts out there from fast to slow, music beats, etc.

It's also amazingly simple to generate a quick funscript from a favorite video once you have the environment set up.

But there's no way to do that in VAM.

Until now!

I'm basically bringing the 2D funscript experience into the VAM environment.

For me, I have been using funscripts for years. I have a whole library of custom-made ones against favorite 2D videos. There are also websites like faptap that have whole libraries of these.

Started out with the Launch, which was discontinued, and now I have the Handy, which has been mostly great. I use ScriptPlayer sometimes, Faptap sometimes, and VAMSync in VR sometimes.

I now have a workflow that lets me bring my favorite 2D funscripts into a timeline environment.

So, for me it's been an enjoyable experiment to set up a scene that essentially acts like watching a 2D video but within a VAM environment. The handy works exactly the same.

Favorite scripts that get revisited a lot can now be put into the VAM environment and fully customized.

I even played with the idea of having the original video playing alongside the person atoms, but the refresh rate wasn't great, and syncing was impossible.

In any case, that's the use-case. People have funscripts whose motion they really enjoy, and those can now be used within VAM.

If this doesn't seem interesting to anyone that's totally fine. I've been enjoying creating all of this for myself :)
 
I do know the funscript community but I still can't really see myself trying to use VAM as a "surrogate" / "intermediate" for funscripts.
Again, I'm trying to "project" myself into the boots of an end-user enjoying toys.

The only reason why I would use "funscripts" through VAM is if it could control a toy FROM a scene animation but without a predictive behavior.

For flat 2D content or even VR content, there are a bunch of alternatives that have far less friction than VAM and already work without needing to understand the software and a bunch of plugins... just like you said, faptap or even apps like Whirligig.

That said, that's what I'm imagining and assuming. There are already so many constraints to mastering VAM that I can only imagine an ultra, ultra small niche that would like a concept like this.
 
What did you mean by "but without a predictive behavior"?

You would set the motion source in the Handy to the AnimationPattern. This would "control a toy from the scene animation" as you say above.

It wouldn't be making predictions because it's wired directly to the animation pattern.

The trick is this -- the timeline has one target: trigger value, from a VariableTrigger object.

The data within the timeline animation are keyframes holding corresponding 0-1 values for the variable trigger.

The VariableTrigger is pointed to the Animation Pattern's currentTime.

In this way, playing the timeline animation (which is attached to an empty object) controls the variabletrigger value, which in turn drives the animation pattern, which finally drives whatever part of the person atom you want (e.g. hip).
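In plain terms, the chain is just two rescalings: the timeline's 0-1 value scrubs the AnimationPattern's currentTime, and currentTime interpolates the attached controller between the two steps. A rough sketch of that arithmetic (plain Python, not the VaM API; the durations and step positions are made up):

```python
def trigger_to_pattern_time(trigger_value, pattern_duration):
    """The relay step: a 0-1 trigger value scrubs the AnimationPattern's currentTime."""
    clamped = max(0.0, min(1.0, trigger_value))
    return clamped * pattern_duration

def pattern_time_to_position(current_time, pattern_duration, step1, step2):
    """With a two-step pattern, currentTime interpolates the attached
    controller (e.g. a hip control) between the two step positions."""
    f = current_time / pattern_duration
    return tuple(a + f * (b - a) for a, b in zip(step1, step2))

# Made-up numbers: a 1-second pattern and two steps 0.2 units apart on y.
t = trigger_to_pattern_time(0.5, 1.0)
print(pattern_time_to_position(t, 1.0, (0.0, 0.9, 0.0), (0.0, 1.1, 0.0)))
```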

My value add would be to take someone's funscript and essentially build a scene with that motion in there for use with the handy, similar to any other mocap scene.
 
That was not targeted at your project : )

The toy system right now is only predictive, which is glitchy as hell. So I'm just saying that the only reason I would like to use a VAM toy system is if it had a plugin that would drive a toy through absolute animations, not predictive ones.

Btw, this is only a friendly discussion around the potential interest, I'm not here to discourage you. All resources and R&D/creative use of VAM is welcome and wanted!
 
Hey thanks for that. I've never been really comfortable talking about all of this to begin with so I appreciated that clarification.

I agree 100%; it's really crazy that we have all of this amazing tech and somehow we're stymied/hampered by the industry. I remember reading a bit of the history of teledildonics and how the patents were somehow blocked, slowing down progress.

I did NOT realize that even with the AnimationPattern it was predictive. I thought that was absolutely direct data transfer.

So what needs to happen for direct control to be here?
 
>> it's really crazy that we have all of this amazing tech and somehow we're stymied/hampered by the industry. I remember reading a bit of the history of teledildonics and how the patents were somehow blocked, slowing down progress.

What did you expect? :p

The companies behind the toys are selling that shit for over 150 bucks. It would be a problem for them if independent devs like us put the "native" apps to shame in a matter of weeks. You seem to know the Handy: are you not shocked that their own platform does not handle Bluetooth mode? You can only use their tools IF you're using Wi-Fi mode. And then, in a couple of months, faptap.net arrives with a proper Wi-Fi + Bluetooth system.

This is a conscious thing. They do not want to lose potential control over their hardware. And eventually keep selling services : )


>> I did NOT realize that even with the AnimationPattern it was predictive. I thought that was absolutely direct data transfer. So what needs to happen for direct control to be here?

Sorry! All the modes BUT animation patterns are predictive. From what I gathered in the code (since I initially repacked the VAMLaunch plugin), the animation pattern is an absolute control.

The problem is, APs are not interesting from a scene perspective.

If you ask me: Timeline control, but "simple". I dug into that months ago and contacted Acid to find a potential lead, but the system was so annoying that I dropped the ball. Timeline does not let you know what the next key is. This is only an assumption on my end, but I'm pretty sure funscript works this way:
  • When hitting a keyframe, the system looks up the next frame and computes the duration and position.
  • It animates the movement.
  • When hitting the next frame, it looks up the one after that... etc.

I tried to think of a way to emulate that in Timeline, without success: using a pure, simple "float" track that would let you author a parallel timeline alongside the character animation, mirroring the motion... with potentially more detail like stutters/vibration (which are handled quite well by the Handy, for instance).
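The stepper described in the bullets above could be sketched like this; plain Python, just my reading of how funscript players behave, and the sample actions are made up:

```python
def funscript_stepper(actions):
    """Walk the script the way the bullets describe: at each keyframe,
    look up the next one and emit a (duration, target position) move."""
    for current, nxt in zip(actions, actions[1:]):
        duration_ms = nxt["at"] - current["at"]
        target = nxt["pos"] / 100.0  # normalize 0-100 to 0-1
        yield duration_ms, target

# Made-up sample actions.
sample = [{"at": 0, "pos": 10}, {"at": 400, "pos": 90}, {"at": 750, "pos": 20}]
for duration_ms, target in funscript_stepper(sample):
    print(f"move to {target:.2f} over {duration_ms} ms")
```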

If you were able to pull that off, then it's starting to look like something far more interesting.

Creators would have to do the character animation AND the toy animation, but the final user experience would be far more compelling.
 

You made me laugh a few times there, thanks for that. And to be clear about my own abilities -- I can't consider myself a real dev. I'm self-taught, have a lot of holes in my knowledge, and I go through phases of learning how to code followed by periods where I don't touch it. It would be more accurate to say I'm a PseudoDev. If I had the ability to work with a REAL dev I could move mountains. At least I like to pretend that :)

The history lesson was appreciated, as was the perspective. It all makes sense, and all I can respond with is I agree 100%.

Just like in the world of video games, I suppose, the "mod community" and open source community is where the real fun begins.

Onwards: you said this, "The problem is, APs are not interesting from a scene perspective."

What makes any mocap interesting to me -- whether it's VAM's built-in animation capture, or the fancier technologies that the mocap creators use, or funscripts as I've been playing with -- is the authentic, organic actions. The fingerprint of a human nervous system moving muscles, the intentional movements, the accidental movements, the ones that happen without intent.

THAT is what makes a mocap dataset interesting. And while I agree that a simple, two-dimensional encoding (as seen in a simple funscript file) is not as interesting on its own, it's when you combine it with other things that -- IMHO -- it becomes compelling.

I've experimented with having a set of random 'cowgirl' funscripts (I took 8 of them) from faptap. All named "cg/1", "cg/2", etc.

Combined with the ability to place the person in different poses each with associated different idle motions, varied expression generators, etc., it's been pretty amazing.

So -- given that the APs are *not* predictive, but absolute... does that maybe change your mind (out of curiosity)?

>> I tried to think of a way to emulate that in Timeline, without success: using a pure, simple "float" track that would let you author a parallel timeline alongside the character animation, mirroring the motion... with potentially more detail like stutters/vibration (which are handled quite well by the Handy, for instance).

For this, I *think* this describes what I've set up: the VariableTrigger value driving the AP directly, which in turn drives the character animation. The stutters are all there, and another nice thing you can do is create a set of 3 custom sliders: one for each of the two AP steps, so you can tweak the up and down separately, and a third that lets you move both up and down with the same values.

Imagine layering onto that a randomized range of values for those sliders, and you can basically create a completely organic, unpredictable (and therefore exciting) infinite experience that falls within a desired range of user parameters.
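As a toy example of that randomization idea (plain Python, nothing VaM-specific; the jitter amount is arbitrary):

```python
import random

def randomized_stroke(user_min, user_max, jitter=0.15):
    """Pick per-cycle 'down' and 'up' endpoints inside the user's allowed
    range, with a little random variation on each cycle."""
    span = user_max - user_min
    down = user_min + random.uniform(0.0, jitter) * span
    up = user_max - random.uniform(0.0, jitter) * span
    return down, up

# A few example cycles within a 0-1 range.
for _ in range(3):
    down, up = randomized_stroke(0.0, 1.0)
    print(f"cycle endpoints: down={down:.2f}, up={up:.2f}")
```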
 

It can change my mind IF I could have the ability to drive this through a normal timeline : )
Just like a funscript does if you prefer, but on a timeline track. Simple, efficient. Key here. Height value. Next key here, height value... and so on.

Most of my content is very "narrative" / "controlled". And I like to craft very specific scenes. The idea of randomization is interesting but this is not my way of producing content. I could see myself testing it, for a scene or two, but I would fall back on wanting to create a very "absolute" (or static) animation like a funscript is.

But knowing that people around the hub like randomization, this might appeal to some of them tho!
 
>> I tried to think of a way to emulate that in Timeline, without success: using a pure, simple "float" track that would let you author a parallel timeline alongside the character animation, mirroring the motion...
If I remember correctly, CuddleMocap does it with a very basic linear-motion animation pattern for the toy, and the actual animation is done in Timeline by animating the playback position of the animation pattern through a relay atom.
 
So! The only scene I have access to for CuddleMocap is the free one (the cruise one). And they're using a standard animation pattern.
But like any other day in the life of a scripter/coder, I tried something and thought through several aspects of Timeline... and it works : )

So the setup is (a rough sketch of the playhead keys follows the list):
  • Animation pattern with 3 steps
  • Timeline animation on... well, whatever you want.
  • The timeline animation controls the AP playhead with a "constant" curve type and oscillates between a 0 value and a 1 value.
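A rough sketch of what generating those constant-curve playhead keys could look like; plain Python, illustrative only, and the key structure is a stand-in rather than Timeline's actual format:

```python
def playhead_keys(stroke_times_s):
    """Build alternating 0/1 keys, intended for a 'constant' curve,
    so the AP playhead flips at each stroke boundary."""
    keys, value = [], 0.0
    for t in stroke_times_s:
        keys.append({"time": t, "value": value, "curve": "constant"})
        value = 1.0 - value  # flip 0 <-> 1 on every stroke
    return keys

# Made-up stroke boundaries in seconds.
print(playhead_keys([0.0, 0.4, 0.75, 1.2]))
```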
Why didn't I think of that sooner? oO

See @ajtrader23 at least our discussion and your ideas pushed me further :p
Thanks for that!
 
>> It can change my mind IF I could have the ability to drive this through a normal timeline : ) Just like a funscript does, if you prefer, but on a timeline track. Simple, efficient. Key here. Height value. Next key here, height value... and so on.

Okay then maybe I will change your mind!

1. Attach timeline to any atom (I use an empty, but there's no reason it couldn't be a person or thing).
2. Be sure there is a VariableTrigger in the scene appropriately named.
3. Import one of my converted funscript-to-timeline animations.
4. The timeline animation will have a single control, which is the VariableTrigger value, ranging from 0 to 1. Simple, efficient: a key here with a height value from 0-1, next key, next height, etc.
5. There's an AnimationPattern in the scene, with 2 steps. You can attach the AP to anything, such as a HipControl.
6. You position the two steps however you want. Step 1 goes by the pelvis, step 2 by the chest, for example.

I could set up a simple demo scene with all of this in place already, and the user (e.g. you) can import one of my converted funscripts as a simple animation into that timeline.

Now you have to just make sure that the toy controller scene plugin is set to the animationpattern. Make your adjustments of choice there, and go press play in that timeline animation.
 
>> But like any other day in the life of a scripter/coder, I tried something and thought through several aspects of Timeline... and it works : ) [...]

Awesome! Glad I sparked something here.

Now the question is - how do you encode the motion? Where does it come from?
 
>> Okay then maybe I will change your mind! [...] Now you have to just make sure that the toy controller scene plugin is set to the animationpattern. Make your adjustments of choice there, and go press play in that timeline animation.

Seems simple and straightforward enough, even for someone who has almost no VAM background, unlike me : )


>> Now the question is - how do you encode the motion? Where does it come from?

As I was saying, I'm a very "scripted/cinematic/controlled" creator : )
I'm animating my character (without any concerns besides having a proper animation). And I have two additional tracks with the animation pattern: time and speed. After the animations are done, I'm then creating the pattern commands to control the toy.

One thing though: you said you had the vibrations (aka stutters, which is something the Handy handles very well). I can't seem to figure out how to do that. I'm even thinking that the VAMSync / Bluetooth mode can't handle commands under 50 ms for a cool vibration effect.
 
Ok, so I dug a bit further. From what I could find, the fastest command interval is 7.5 ms over Bluetooth, which, if you account for the whole pipeline latency, might explain what I'm getting: I couldn't get an accurate vibration command under 30 ms.

That is far different from the vibration commands you can get over the Wi-Fi network with the Handy.

So unless I'm missing something, I'd be curious to know what "vibration" effect you can get, because it seems impossible to reach the speed I have in mind : ) (as a frame of reference, this is the type of vibration I'm referring to, obviously NSFW)
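A quick back-of-the-envelope check on that: one full up-down cycle needs at least two position commands, so the command interval caps the achievable vibration frequency roughly like this:

```python
def max_vibration_hz(command_interval_ms):
    """Rough upper bound: a full up-down cycle needs at least two
    position commands, so frequency tops out near 1 / (2 * interval)."""
    return 1000.0 / (2.0 * command_interval_ms)

for interval_ms in (7.5, 30.0, 50.0):
    print(f"{interval_ms:>5} ms interval -> about {max_vibration_hz(interval_ms):.1f} Hz max")
```

At a 30 ms interval that works out to roughly 16-17 Hz at best, versus around 66 Hz at 7.5 ms, which lines up with the gap being described between the Bluetooth and Wi-Fi paths.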
 