Connect

MacGruber

MacGruber submitted a new resource:

Connect - Controlling E-Stim devices and virtual self-bondage

Connect is a plugin that allows you to control audio-driven E-Stim devices. The package also includes VirtualLock, a plugin for virtual self-bondage in VR. In combination that means, if you do not behave, your dominatrix will punish you...in real life!

Screenshots


Content

  • Proof-of-concept demo scene...

Read more about this resource...
 
MacGruber updated Connect with a new update entry:

Version 2

Changelog for version 2
  • VirtualLock: Improved breach behavior. You can now set a timer for how long until it resets.
  • VirtualLock: Produced a little overview showing the states and events, so you know what needs to be hooked up to what. See screenshots.
  • VirtualLock: Rearranged the plugin UI a bit to be more logical.
  • Demo: Slightly improved light setup, which was made possible by yesterday's fixes to SkyMagic.
  • Demo: New SkyMagic...

Read the rest of this update entry...
 
MacGruber updated Connect with a new update entry:

Version 3

Changelog
  • Made the Connect plugin work again under the new VaM security restrictions. Thanks @ brycerer for reporting the issue; I had only tested after the big 1.20.77.6 update, but ironically it broke after Meshed did another minor update after that. Anyway, it works with 1.20.77.9 now.
  • Updated demo scene with the new technical possibilities of IdlePoser 7. Much more animation...

Read the rest of this update entry...
 
This Connect plugin is very promising. Is there a way I can have it emulate 'stroking' (via the A and B channels), synced up to a female's movements?
 
@itsgus The main problem is placing electrodes (2 per channel), plus a lot of experimenting to make it feel like stroking. But you can certainly set up something like that. For example, you could easily use the Life thrust demo scene and rig the ThrustTrigger sub-plugin to emit triggers at the right time. Triggers could also come from other sources like an AnimationPattern, Timeline, etc.
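Since the E-Stim signal is ordinary audio, a "stroking" sensation is typically approximated by modulating the A and B channels with amplitude envelopes that are 180° out of phase, so the intensity sweeps back and forth between the two electrode pairs. A rough sketch of the envelope math (plain Python, illustrative only; the actual signal shaping is done by the plugin):

```python
import math

def stroking_envelopes(duration_s=2.0, stroke_hz=1.0, rate=1000):
    """Amplitude envelopes for channels A and B, 180 degrees out of
    phase: when channel A peaks, channel B is silent, so the sensation
    sweeps between the two electrode pairs at stroke_hz."""
    n = int(duration_s * rate)
    env_a, env_b = [], []
    for i in range(n):
        t = i / rate
        s = math.sin(2 * math.pi * stroke_hz * t)
        env_a.append(0.5 * (1 + s))  # channel A envelope, 0..1
        env_b.append(0.5 * (1 - s))  # channel B envelope, inverted
    return env_a, env_b

a, b = stroking_envelopes()
```

Each envelope would then multiply a carrier tone on its channel; the stroke rate could be driven by the triggers mentioned above.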
 
This works alright with a DG Labs Coyote after modifying the pulse width: feed the sound to the Coyote by connecting the PC's speaker out to the phone's microphone in. This may require a 3.5mm mic/speaker splitter, depending on the phone.
 
MacGruber updated Connect with a new update entry:

Version 4

Added ability to connect to a Raspberry PI and control its GPIO pins.

Changelog
  • Connect: Implemented UI allowing you to specify a custom hostname and port to connect to.
  • Connect: Improved error handling.
  • Connect: Improved DNS resolve.
  • ConnectAudio.App repackaged to include the source code. For some reason the source code was not included in previous versions.
  • ConnectAudio...

Read the rest of this update entry...
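The thread does not spell out the wire format Connect uses to reach the Raspberry PI, but the Pi-side idea can be sketched generically: listen on a TCP port, parse simple text commands, and drive GPIO pins. A minimal, hypothetical command parser (the `SET <pin> <state>` syntax is invented for illustration, not Connect's actual protocol):

```python
def parse_gpio_command(line):
    """Parse a hypothetical text command like 'SET 17 1' into a
    (pin, state) tuple. The actual Connect wire protocol is not
    documented in this thread; this only illustrates the Pi-side idea."""
    parts = line.strip().split()
    if len(parts) != 3 or parts[0] != "SET":
        raise ValueError(f"unrecognized command: {line!r}")
    pin, state = int(parts[1]), int(parts[2])
    if state not in (0, 1):
        raise ValueError("state must be 0 or 1")
    return pin, state

pin, state = parse_gpio_command("SET 17 1")
```

On a real Pi, the parsed result would be handed to a GPIO library, e.g. RPi.GPIO's `GPIO.output(pin, state)`.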
 
It would be very awesome if your Connect plugin (or a similar one) were able to retrieve information from the Raspberry PI and let that data work with VaM plugins. There would be so many more possibilities coming to the VaM world: merging our physical world with the VaM virtual world, with both influencing each other.

Even amazing things that do not necessarily have to be sexual. Think, for example, about smart objects in our world transferring their data to VaM and the VaM world responding to that: if your room is too cold or too hot, an atom could make you aware of this, lol. If your room is too hot, the atom might want to have less clothes on, haha. If you turn on a cooling or heating fan in real life, then a fan in the VaM world would also turn on. Or you press a button attached to the Raspberry PI in the physical world and it would have the same effect as a VaM virtual button. There are so many things I could come up with.

I know these things are possible; it is just a matter of someone making it happen, whether me, someone else, or a community project. All it needs is at least a very basic way for the Raspberry PI to send data to VaM, and for VaM to allow that data to be used with other plugins, for example your Logic Bricks. I would even become your Patreon supporter if you come up with a working VaM plugin that allows data from the Raspberry PI 3/4 to be sent to VaM plugins. It would be so much fun and lead to much more functionality for both VaM and the Raspberry PI.
 
A TCP connection is always bidirectional; I'm just not sending data through it at the moment. Basically, you would need to replicate the data retrieval code and package reassembly from the ConnectAudio app (because that's C#) in the VaM plugin. For outgoing triggers: same as VaM, LogicBricks only supports action and float triggers at the moment. So it's kind of hard to do this in a flexible, general-purpose way, unless you want a messy UI. You would need coding on the Raspberry side anyway to deal with your sensors, etc., unless it's just relaying GPIO inputs. So it's not general-purpose on that side either.

However, I would estimate the target group for Connect in its current state to be fewer than 20 people from this community. This addition would attract even fewer people, as it requires coding skills. It doesn't really make sense to do if I don't need it myself.
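The "package reassembly" mentioned above is the generic problem of cutting a TCP byte stream back into discrete messages; length-prefix framing is one common way to do it. A minimal sketch in Python (the framing shown is generic, not necessarily what ConnectAudio uses):

```python
import struct

def frame(payload: bytes) -> bytes:
    """Prefix a payload with its 4-byte big-endian length."""
    return struct.pack(">I", len(payload)) + payload

def reassemble(stream: bytes):
    """Split a received byte stream back into framed payloads.
    Illustrates 'package reassembly' generically; a real receiver
    would keep leftover bytes and resume when more data arrives."""
    msgs, i = [], 0
    while i + 4 <= len(stream):
        (length,) = struct.unpack_from(">I", stream, i)
        if i + 4 + length > len(stream):
            break  # incomplete message, wait for more data
        msgs.append(stream[i + 4 : i + 4 + length])
        i += 4 + length
    return msgs

stream = frame(b"temp=21.5") + frame(b"button=1")
```

In the VaM plugin this logic would live on the receiving end of the existing TCP connection, with the Pi side doing the framing.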
 
MacGruber updated Connect with a new update entry:

"The New Remote" demo scene released

Just released a new scene specifically made for "Connect" and "VirtualLock":
Your mistress has a mysterious new remote she would like to try...and that's just a condom, right? Maybe it isn't...o_O


(Watch with sound!)



Read the rest of this update entry...
 
Would it be possible to integrate something like a Bhaptics vest for touch, whips, thrusts, and any other kind of haptic feedback? They have open source code for developers and even audio feedback. I would be glad to donate for something like that. I was told not to ask you, because it's uncommon and you wouldn't help with something like that, but these devices are uncommon too...so I figured, why not at least ask.

 
As you said yourself, these are pretty uncommon and I don't do commissions anyway. I don't even have enough time for my own crazy ideas...let alone other people's. However, they seem to have an SDK for C# on their website. You would only need to modify the ConnectAudio app (which is just a simple C# windows app, source is here on the Hub) to send signals to their SDK?

 
Update: Sending the output to XToys via a sound output that isn't the headset works. There's a little bit of inconsistency in the signals because the FFT XToys is doing is not matched perfectly to the generated audio, but the only way to improve that would be writing a direct websocket adapter. The scene works great after adjusting the output parameters to feel appropriate on my Coyote :D
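The inconsistency described here is spectral leakage: when the FFT analysis window does not contain a whole number of cycles of the generated tone, the tone's energy smears across neighboring frequency bins instead of landing in exactly one. A tiny naive-DFT demonstration (plain Python, illustrative only):

```python
import cmath, math

def dft_magnitudes(samples):
    """Naive DFT magnitude spectrum; fine for tiny examples."""
    n = len(samples)
    return [abs(sum(samples[k] * cmath.exp(-2j * math.pi * i * k / n)
                    for k in range(n))) for i in range(n // 2)]

def tone(freq_bins, n=64):
    """Sine whose frequency is freq_bins cycles per analysis window."""
    return [math.sin(2 * math.pi * freq_bins * k / n) for k in range(n)]

aligned = dft_magnitudes(tone(8.0))  # exactly on bin 8: one clean peak
leaky = dft_magnitudes(tone(8.5))    # between bins: energy smears
```

This is why a direct websocket adapter (sending values instead of analyzing audio) would give cleaner signals than the FFT path.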
 
MacGruber updated Connect with a new update entry:

Guide: Switching things with Raspberry PI + Relay Board

Just posted a guide on how you could use a Raspberry PI and a Relay board to switch devices like vibrators and E-Stim from VaM. Also philosophizing about the holy grail of VR adult software...detecting your real life (pre-)orgasms to trigger things in VaM:

Read the rest of this update entry...
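One Pi-side detail worth knowing when following a relay-board guide (generic, not necessarily from the guide itself): many cheap relay boards are active-low, i.e. driving the GPIO pin LOW energizes the relay, so the desired device state has to be inverted before writing the pin:

```python
def relay_gpio_level(device_on, active_low=True):
    """Map the desired device state to the GPIO level to write.
    Many cheap relay boards are active-low: driving the pin LOW
    energizes the relay (device ON). Generic sketch, not from the guide."""
    return int(device_on == (not active_low))

# Active-low board: turning the device ON means writing level 0.
on_level = relay_gpio_level(True)
```

On a real Pi this level would be passed to something like RPi.GPIO's `GPIO.output(pin, level)`.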
 
MacGruber updated Connect with a new update entry:

Version 6 (free)

Major update for VirtualLock, solving some old issues and allowing you to better define where the player is allowed to look. You can now define a whole focus area, not just a single point. The reasoning here is that staring at a small point (e.g. the face of your mistress) for a couple of minutes in VR can cause your eyes to "defocus". You can actually lose the "3D" effect, despite being in VR. Allowing the player to look around a bit is essential.

Changelog
  • ...

Read the rest of this update entry...
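A focus area like the one described can be modeled as a cone around a center direction: the player passes the check as long as the angle between their gaze direction and the cone's axis stays under a threshold. An illustrative sketch (not VirtualLock's actual code):

```python
import math

def gaze_within_focus(gaze_dir, center_dir, max_angle_deg):
    """Return True if the angle between the player's gaze direction
    and the focus-area center direction is within max_angle_deg.
    A focus *area* is a cone rather than a single point."""
    dot = sum(g * c for g, c in zip(gaze_dir, center_dir))
    norm = (math.sqrt(sum(g * g for g in gaze_dir))
            * math.sqrt(sum(c * c for c in center_dir)))
    # Clamp before acos to guard against floating-point drift.
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= max_angle_deg

# Looking straight at the center passes; looking 90 degrees away fails.
```

The wider the cone, the more the player can glance around without triggering a breach, which addresses the eye-defocus problem above.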
 
Don't know if this is an oversight or if it's just me, but I can't route the audio from the ConnectAudio.app anywhere but the default sound device. I can tell Windows to switch the device through the default interface, use an app, or try routing through a virtual soundboard, and it always continues to output to the default device. This makes this part of the plugin functionally useless, because it means the main sound output will always be present in the E-Stim channel and vice versa.
 
Well, the assumption is that you use it with VR. In the Oculus software you can set which audio device to use for regular VaM sound, so you can set it to something different than whatever your Windows default is.

If you want to be able to change the audio device within ConnectAudio, you can add something like this to the StartAudio() method in MainWindow.xaml.cs. Obviously you could also build a UI and whatnot...but this is the quick and simple version:
C#:
private void StartAudio()
{
    StopAudio();

    int deviceNumber = -1;   // -1 = default Windows audio device
    for (int i = -1; i < WaveOut.DeviceCount; i++)
    {
        var caps = WaveOut.GetCapabilities(i);
        if (caps.ProductName == "Speakers (Realtek(R) Audio)")   // <--- run once to get the device names, then put your favorite device here and recompile
        {
            deviceNumber = i;
            break;
        }
    }

    Log.Message("Available audio devices:");
    for (int i = -1; i < WaveOut.DeviceCount; i++)
    {
        var caps = WaveOut.GetCapabilities(i);
        if (i == deviceNumber)
            Log.Message($"{i}: {caps.ProductName} <= CHOSEN");
        else
            Log.Message($"{i}: {caps.ProductName}");
    }

    WaveProvider waveProvider = new WaveProvider(myWaveManager);
    myAudio = new WaveOutEvent();
    myAudio.NumberOfBuffers = 2;
    myAudio.DesiredLatency = 60;
    myAudio.DeviceNumber = deviceNumber;
    myAudio.Init(waveProvider);
    myAudio.Play();
}
 
I'll second senorgif2's request that choosing the audio output is an important feature. I wanted to try this, and had to spend a while trying random things before coming here. I'll try the code change and see if that works.

Edit: Update for anyone else trying this out: The code MacGruber posted works and I was able to set it up with my preferred output device.
 