ToySerialController+VAMLaunch


Blazedust submitted a new resource:

ToySerialController+VAMLaunch - VAMLaunch controlled by ToySerialController with auto motion sources as a session plugin.

Plugin for Launch owners to feel the action in scenes without having to set up anything specific other than a session plugin.

It works by using VAMLaunch controlled by ToySerialController with auto motion source detection and tracking as a session plugin.

View attachment 79743

Thanks to:
Zengineer for the VAMLaunch and hazmhox for the VAMLaunch Repack
[URL...

Read more about this resource...
 
Woah ok, so theoretically, this could pick up motion not just from multiple atoms (hand, mouth, feet, whatever) on ONE person atom, but also work with threesome scenes where two female person atoms trade off?!
 
Nice work! I wonder if this could work with the Kiiroo Keon?

Whatever VAMServer and VAMLaunch support, I guess - see https://github.com/ZengineerVAM/VAMLaunch:
"VAMLaunch now supports any device that is supported by Buttplug that accepts linear commands:
Launch, Kiiroo Onyx 1/2/2.1, Titan 1, RealTouch"

I have only tested this plugin with the Launch, nothing else, and I won't be testing any other devices either!

--
Woah ok, so theoretically, this could pick up motion not just from multiple atoms (hand, mouth, feet, whatever) on ONE person atom, but also work with threesome scenes where two female person atoms trade off?!

Yes, it works in the following scenes, among others:
 
I can confirm - the Keon works fine. The issue with it - and this is a Kiiroo problem, not a dev problem - is that its slowest programmed speed isn't as slow as the Launch's.
 
@Blazedust this really is impressive, thanks for your contribution. Question: You write about "the zone". Similar to the original plugin where the user could visualize the zone to more easily position it - how easy would it be to put a toggle in there to show/hide the zone being tracked? It would make it easier to adjust the settings for the scenes that don't have the simpler up/down motions.
 
@Blazedust this really is impressive, thanks for your contribution. Question: You write about "the zone". Similar to the original plugin where the user could visualize the zone to more easily position it - how easy would it be to put a toggle in there to show/hide the zone being tracked? It would make it easier to adjust the settings for the scenes that don't have the simpler up/down motions.
This is how I debug/visualize:
  1. Have one or two females in a scene with static poses.
  2. Place a dildo on an animation track going up/down.
  3. Set motion source to "Dildo + Female". The Dildo will be the reference instead of the male's penis.
    • Tip: Turn physics off on the dildo so you can move around the animation pattern freely through the female without having to worry about collision.
  4. Enable "Debug" in the plugin (there's an option for that under the Debug group)
    • Enabling debug visualizes the reference zone and the motion target point by drawing them as cylinders.
    • Depending on how far the reference area is inside the motion target point you will see the Current Axis Input value move along it.
    • The Current Axis Input drives a "virtual zone" that in turn drives the VAMLaunch plugin logic; it is simply a range from 0 to 1 (see the rough sketch after this list). What you want to watch are the reference zone and the motion target point.
    • To change the reference area's size, use the following settings: Reference Length/Radius and Target Point End/Start:
      View attachment 79990
  5. You can use the GlassPlugin to turn the female invisible so you can see the debug zones being drawn inside the female.
    • You can then use some clothing to get the outline of the female body. Set the clothing's alpha and then change the draw priority to 4000 or 5000 and you will easily see all the reference zones and motion target points.
  6. Drag around the animation pattern with the dildo to see all the different zones visualized.
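
To make the "virtual zone" idea above a bit more concrete, here is a rough sketch (not the plugin's actual code; the names are made up) of how a depth measured along the reference zone could be turned into that 0 to 1 Current Axis Input:

```python
# Rough sketch only - illustrates the 0..1 "Current Axis Input" idea,
# not the plugin's real implementation. All names here are hypothetical.

def current_axis_input(depth, reference_length, target_start=0.0, target_end=1.0):
    """Map how far the target point sits inside the reference zone to 0..1."""
    if reference_length <= 0:
        return 0.0
    t = max(0.0, min(1.0, depth / reference_length))   # clamp to the zone
    # remap into the configured Target Point Start/End range
    return target_start + t * (target_end - target_start)

# Halfway into a 0.15 m long reference zone -> axis input of 0.5
print(current_axis_input(depth=0.075, reference_length=0.15))
```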
 
Nice work, this is huge! I can confirm that this also works with VAMSync 3.0 (forked from VAMLaunch) and therefore, the Handy.
Also, would it be feasible to add an M+M Motion Source, as the male model might be constrained / not have the necessary targets?
 
Nice work, this is huge! I can confirm that this also works with VAMSync 3.0 (forked from VAMLaunch) and therefore, the Handy.
Also, would it be feasible to add an M+M Motion Source, as the male model might be constrained / not have the necessary targets?
With "M+M", do you mean Male + Male? I looked into it anyway and it wasn't too much work to get it working. I need some days to test it before releasing an update.
 
Ooo, one other side note, Blazedust! When trying to get this to work with Assets + Female, I find that the list includes A LOT of irrelevant stuff, and I can't scroll down to the end of it because the page cuts off. In other words, it includes anything and everything when choosing an asset. That said, the easy workaround is just to have the female selected so it picks up on whatever her hand is doing, I guess.
 
Ooo, one other side note, Blazedust! When trying to get this to work with Assets + Female, I find that the list includes A LOT of irrelevant stuff, and I can't scroll down to the end of it because the page cuts off. In other words, it includes anything and everything when choosing an asset. That said, the easy workaround is just to have the female selected so it picks up on whatever her hand is doing, I guess.
I will limit those asset and component dropdown list controls to a max height that doesn't extend past the plugin interface.
 
This is how I debug/visualize: [...]

Thanks for the detailed walkthrough. And I didn't know GlassPlugin until now, so thanks!

I was thinking about the L0 vs. R0 settings. I do a little dev work myself in Unity with VR and understand some of this. Shouldn't there be a way to easily translate the X axis movements (L-R, R0) into Y axis movements (U-D, L0)? In other words, a logic that says "if it moves forward 1 inch, let that represent moving down one inch". And you can replace the L0 and R0 with a simpler "LR0". I'm not sure if I'm being clear, let me know if I need to draw a diagram :)
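
Just to illustrate what I mean (a hypothetical sketch, not anything from the plugin), the "LR0" idea could be as simple as using whichever axis moved the most each frame as the stroke input:

```python
# Hypothetical "LR0" idea - not from the plugin: let forward/back (R0) motion
# stand in for up/down (L0) motion by using whichever axis moved the most.

def lr0_delta(delta_l0, delta_r0):
    """Return the dominant per-frame movement so a thrusting motion still strokes."""
    return delta_l0 if abs(delta_l0) >= abs(delta_r0) else delta_r0

# "Moved forward 1 inch" gets treated like "moved down 1 inch"
print(lr0_delta(delta_l0=0.0, delta_r0=-1.0))   # -1.0
```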
 
Just my two cents, because this has nothing to do with the review or the quality of the plugin :)

I still won't jump back on the teledildo train for VAM with this. Even though the plugin is great... the prediction mechanic still kinda "stinks" (but this is not your fault, this is just the way the logic works). It makes any implementation of the plugin in VAM pretty limiting. As soon as you have a scene that changes pace drastically, the toy is lost. The other aspect is that the sync is actually pretty "behind" what you see.

As long as we don't have something that can read "in front" of what is happening in game (with some kind of timeline tool like the funscripts system), it will always be lagging behind or doing odd movements as soon as the scene has a bit more randomness.

That said! As I've said in my review, really nice job.
 
I still won't jump back on the teledildo train for VAM with this. Even though the plugin is great... the prediction mechanic still kinda "stinks" [...] As soon as you have a scene that changes pace drastically, the toy is lost. The other aspect is that the sync is actually pretty "behind" what you see. [...]

Is the issue that the underlying code doesn't expose the real-time coordinates of the various controllers? It's crazy how we can't implement this in a straightforward way. The AnimationPattern or CycleForce works perfectly. Is it because we can push coordinates to the controller and just not get them back directly?

I agree we need the equivalent of the funscript system here.

You said "won't jump back on... for VAM"... what are you working on in this front instead?
 
Is the issue that the underlying code doesn't expose the real-time coordinates of the various controllers? It's crazy how we can't implement this in a straightforward way. The AnimationPattern or CycleForce works perfectly. Is it because we can push coordinates to the controller and just not get them back directly?

I'm not sure exactly how the API works behind the scenes; I just initially redid a pack of the original plugin. Code interacting with electronics is something I've never done, and it is most certainly way beyond my coding skills.

In the case of AnimationPattern and CycleForce, it is pretty easy because you can do the maths in advance.

I assume that, for the "dynamic version" (the zone one), the way it is implemented in VaM is an approximation of the motion. It does not know exactly where it is nor what it's gonna do, which results in an algorithm trying to emulate and anticipate the next motion from the previous ones. You really notice that when you have extreme speed changes: it takes a few cycles for the toy to actually reach the pace of what you see.

This is something that will always be like that as soon as you use prediction... there is absolutely no information for the code to know that after a few back and forth, the motion will suddenly stop for 2 seconds or go twice as fast / slow.

The only perfect way is to have a system that directly feeds the destination in advance with the exact pace, so yes, a "funscript-like" track on Timeline, for instance, would be flawless : )
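
A toy example of the difference (nothing from these plugins, just to illustrate the point): prediction can only extrapolate from strokes it has already seen, while a funscript-style track knows the next position ahead of time:

```python
# Toy illustration (nothing from VAMLaunch / ToySerialController): why a
# predictive approach lags behind pace changes while a scripted track does not.

# 1) Prediction: guess the next stroke period from the last few observed ones.
recent_periods = [1.0, 1.0, 1.0, 0.5, 0.5]          # seconds; the scene just sped up
predicted = sum(recent_periods) / len(recent_periods)
print(predicted)   # 0.8 - still too slow, it takes more cycles to catch up

# 2) Scripted, funscript-style: timestamps and positions are known in advance,
# so the next target can be sent to the device before it is needed.
actions = [{"at": 0, "pos": 0}, {"at": 500, "pos": 100}, {"at": 750, "pos": 0}]  # ms, 0..100
now_ms = 400
upcoming = next(a for a in actions if a["at"] > now_ms)
print(upcoming)    # {'at': 500, 'pos': 100} - no guessing involved
```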

You said "won't jump back on... for VAM"... what are you working on in this front instead?

Nothing! I won't make new content / scenes for teledildos. 2069 was the first and the last of the big scenes and content. It is way too much work and way too unpredictable, with the way I'm making my scenes, to actually be enjoyable. I have several scenes that had teledildonics, and Djinn also has some experiments, but I threw them away pretty quickly after seeing the results.

That said, I don't say that to discourage people from creating content for this plugin. I think that creating scenes at slow / average speeds and most importantly with a pace that changes over a few seconds should do the trick. But this is not the kind of content I wanna create... and I'm way too demanding on my own work to release something that doesn't work perfectly.

I might do a demo scene to support @Blazedust and give a few hints to people who want to create content for it... but after that, I'll be waiting for a future (potential) "funscript" solution before considering spending time on teledildos :)
 
That said, I don't say that to discourage people from creating content for this plugin. I think that creating scenes at slow / average speeds and most importantly with a pace that changes over a few seconds should do the trick. But this is not the kind of content I wanna create... and I'm way too demanding on my own work to release something that doesn't work perfectly.

I might do a demo scene to support @Blazedust and give a few hints to people who want to create content for it... but after that, I'll be waiting for a future (potential) "funscript" solution before considering spending time on teledildos :)


Thanks for the explanation, makes good sense. I look forward to your demo!
 
Blazedust updated ToySerialController+VAMLaunch with a new update entry:

ToySerialController+VAMLaunch (update) v.2

* You can now include males in female targets for motion sources with females. See options under "Motion Source". You can now have two males interacting with this plugin. After toggling the option, you might have to press "Refresh".
* You can now have a fixed penis reference length for males. See options under "Motion Source". Some animations animate the penis tip, causing the length to vary up to 10% (as the ToySerialController uses the collision...

Read the rest of this update entry...
 
Thanks for the detailed walkthrough. And I didn't know GlassPlugin until now, so thanks!

I was thinking about the L0 vs. R0 settings. I do a little dev work myself in Unity with VR and understand some of this. Shouldn't there be a way to easily translate the X axis movements (L-R, R0) into Y axis movements (U-D, L0)? In other words, a logic that says "if it moves forward 1 inch, let that represent moving down one inch". And you can replace the L0 and R0 with a simpler "LR0". I'm not sure if I'm being clear, let me know if I need to draw a diagram :)
That's surely possible, but not something I will do. My intent was to merge the ToySerialController and VAMLaunch plugins - which is now done. You could instead try changing the input axis to L-R or R0 for scenes where that applies, as that's possible with the axis input selector.

the prediction mechanic still kinda "stinks" [...]. It makes any implementation of the plugin in VAM pretty limiting. As soon as you have a scene that changes pace drastically, the toy is lost. The other aspect is that the sync is actually pretty "behind" what you see.

As long as we don't have something that can read "in front" of what is happening in game (with some kind of timeline tool like the funscripts system), it will always be lagging behind or doing odd movements as soon as the scene has a bit more randomness.

That's totally the case. The prediction will always be behind because it needs to sample for some period of time.

When the VAMLaunch plugin got released I experimented with a basic keyframe implementation myself. It worked by having a JSON file with stored positions and timestamps for the Launch, synced to an animation. This could predict where the Launch should move ahead of time and compensate for animation speed, time scales and fps/lag. It worked really well. I also made it work the other way around, so that the keyframes could drive an animation pattern.
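
Roughly, the idea was along these lines (a simplified sketch, not the code from that experiment): interpolate between stored keyframes, but sample a little ahead of the current animation time to hide the device latency:

```python
# Simplified sketch of the keyframe idea described above - not the actual code.
# Stored (time, position) pairs are sampled a bit ahead of the animation time
# so the command reaches the device when the motion actually happens.

keyframes = [(0.0, 0.0), (0.5, 1.0), (1.0, 0.0)]     # (seconds, position 0..1), a 1 s loop

def sample(t, frames, loop_length):
    t %= loop_length
    for (t0, p0), (t1, p1) in zip(frames, frames[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return p0 + f * (p1 - p0)                # linear interpolation
    return frames[-1][1]

device_latency = 0.1                                  # assumed latency in seconds
animation_time = 0.3
target = sample(animation_time + device_latency, keyframes, loop_length=1.0)
print(round(target, 3))                               # 0.8 - where the toy should be in 0.1 s
```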

However, editing the keyframes manually, even for a 20 second long loop, was super tedious. And this approach was also limited to animations baked into the timeline of the scene.

One way of doing this would be to record a scene to get the position sample data, then do some cleanup on the data and normalize it, then bake it into the scene and enable playback.
In theory, it could be done with further development of this plugin.
Pass 1: Record the scene and get the sample data.
Pass 2: Clean up the data and normalize it (and maybe do some manual edits).
Pass 3: Enable playback with perfect ahead-of-time prediction.
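
As a rough illustration of what Pass 2 could look like (just a sketch, nothing that exists in the plugin today): scale the recorded positions to the full 0 to 1 stroke range and smooth out physics jitter before baking:

```python
# Sketch of a possible Pass 2 (cleanup/normalize) - purely illustrative.

recorded = [0.02, 0.05, 0.30, 0.28, 0.06, 0.03]       # raw positions sampled in Pass 1

# Normalize to the full 0..1 stroke range so a small recorded motion still
# uses the device's whole travel.
lo, hi = min(recorded), max(recorded)
normalized = [(p - lo) / (hi - lo) for p in recorded]

# Simple 3-sample moving average to smooth physics jitter before the data
# gets baked into the scene for Pass 3 playback.
smoothed = [sum(normalized[max(0, i - 1):i + 2]) / len(normalized[max(0, i - 1):i + 2])
            for i in range(len(normalized))]

print([round(p, 2) for p in normalized])
print([round(p, 2) for p in smoothed])
```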

As of now I consider this plugin done. My goal was to merge the two plugins ToySerialController and VAMLaunch Repack :)

I will leave it up to someone else to keep developing this plugin if they want to.
 
Which available "Toys" work Premium with that .
I am sure people are interested ?
 
Blazedust updated ToySerialController+VAMLaunch with a new update entry:

Important fix

* Fixed a problem introduced in the last update (related to the 5-unit minimum distance to move before predicting a stroke) where strokes at certain speeds could unintentionally be delayed to 0.4 seconds depending on plugin settings. For fast-moving scenes where strokes were faster than 0.4 seconds, this meant strokes could be omitted entirely :(
** Translation: Update the plugin to version 3 for a much better experience! Version 2 is kinda bad! Sorry!

Read the rest of this update entry...
 
If you have version 2 - update to version 3 as it contains an important fix!
 
Hmmm. I followed all the instructions and installed everything, but VAMServer just can't find my Lovense Gush.
I can only give you some advice.
Check device support here https://github.com/ZengineerVAM/VAMLaunch - I don't know about the Lovense Gush.
Does your device show up in Windows' list of connected Bluetooth devices? You might have to pair it with Windows first, I guess.
 
I can only give you some advice.
Check device support here https://github.com/ZengineerVAM/VAMLaunch - I don't know about the Lovense Gush.
Does your device show up in Windows' list of connected Bluetooth devices? You might have to pair it with Windows first, I guess.

Yes. The Gush also works with Intiface, so it should also work with VAMLaunch, no? Since that means it has buttplug.io support. :unsure:
 