Voxta (Virt-A-Mate Plugin)

I'm trying to self-host Voxta, and I've come pretty far (I can talk to her and get answers in the desktop app, and that's already amazing), but when I try to use Voxta inside VaM I can't get it running; I just always get the same Voxta error message. Where could I get help?
 
I finally got the program working after two days of trying on an RTX 4070. Ironically, I had to use ChatGPT to get my PC and Voxta onto the right settings, but it does work now, which I can deal with. You must set the models to use the CPU instead of the GPU.
 
No tutorial? Shit's very confusing. I have everything loaded (chat session connected, speech set to listen, etc.), and all the modules in the Voxta server are working. But how do I talk to the character in the game? It's set to 'automatically send user speech', but it doesn't pick anything up. It did once, but I can't get it to respond again; it's just sitting on idle. The chat itself works too if I go into the chat on the app. But I'm very confused about how to talk or send commands through Virt-A-Mate.
 
I love Voxta but am not currently subscribed for the same reason. There needs to be a user-friendly YouTube tutorial video that I can follow to set up Voxta to work properly in VaM. Otherwise I'm not able to make it work for its ultimate intended use. Make the video and promote it on VaM. I guarantee you will multiply your membership numbers!
 
I've read through most of the posts on the hub and looked through the website. I've also looked through the discord. I am still a bit confused as to what exactly Voxta with the VaM plug-in will do.

- Are you only able to use it with specific scenes?

- Can I use this in any scene including self created ones?

- Does/can Voxta also control the movement of the character in VaM?

- If I want to run this locally do I have access to all of the features?

- Do I set personality and other settings outside of VaM for the individual character? Does it save or do I need to change it every time?

- Is the VaM character able to be aware of their in-game environment?

- Does the VaM character respond to touch or interaction within the game environment?

- Can Voxta only be applied to one character in a scene?

- Does Voxta with VaM only use speech, or can it also provide sounds like moaning?

- Instructions on setup and use are lacking (or I couldn't find them).

If you could answer these questions or point me to where the answers are, that would be great. I've been wanting to build a VaM plug-in with these features, but it is a huge undertaking and I opted not to. So I very much appreciate you taking this on.

I'd be happy to support you on your adventure with AI. I just want a better understanding of what you have and are working on.

Thank you
 
Acid Bubbles updated Voxta (Virt-A-Mate Plugin) with a new update entry:

Voxta Plugin v1.2.0

🎛 New: Quick-Access HUD Menu


A new Voxta button on the VaM toolbar opens a quick-access floating panel on the main HUD with a Voxta Active toggle. The plugin's Active state stays in sync between the quick menu, the main plugin UI, and any external triggers.

📡 New storables — last message metadata


Five new read-only JSONStorableString parameters expose info about...

Read the rest of this update entry...
 
Hey! Here are your answers!

1. Are you only able to use it with specific scenes?
No. The plugin is just an MVRScript you add to any Person atom via `Voxta.cslist`. Demo scenes are a convenience, not a requirement.

2. Can I use this in any scene including self-created ones?
Yes. Load your scene, go into Edit mode, select a Person atom -> Plugins -> add `Voxta.cslist`. Open Custom UI, pick a character from the dropdown, and it'll connect to your Voxta server.

3. Does/can Voxta control the movement of the character in VaM?
Not directly - Voxta doesn't animate bones itself. What it does expose:
- A State storable that flips between `idle`, `listening`, `thinking`, `speaking`.
- An On State Changed trigger and an On Action trigger.
- A CurrentAction string whenever the AI decides to invoke a custom action you've defined (jump, wave, blush, etc.).

You wire those triggers into Timeline (for animation clips) or Scripter (for scripted logic) - that's the pattern shown in the README. VaM's built-in Auto Behaviors -> Lip Sync handles mouth movement from the TTS audio. So movement is "AI decides -> your scene executes."
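As a sketch of that "AI decides -> your scene executes" split in plain JavaScript (the function names, the state-to-clip map, and the clip titles are all made up for illustration; in VaM the equivalent wiring goes through the plugin's On State Changed / On Action triggers into Timeline or Scripter, not through code like this):

```javascript
// Hypothetical state-to-animation mapping: Voxta reports the state,
// the scene decides which Timeline clip to play for it.
const animationForState = {
  idle: "Idle Loop",
  listening: "Attentive Pose",
  thinking: "Pondering",
  speaking: "Talk Gesture",
};

function onStateChanged(state, playClip) {
  // Unknown states fall back to the idle clip.
  const clip = animationForState[state] || animationForState.idle;
  playClip(clip);
  return clip;
}

function onAction(action, actions) {
  // CurrentAction carries a custom action name the AI chose (jump, wave, ...).
  // Dispatch to a scene-defined handler if one exists.
  const handler = actions[action];
  if (handler) handler();
  return Boolean(handler);
}
```

The point of the pattern is that the plugin only announces *what* the AI decided; every visible effect is something your scene chose to attach to that announcement.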

4. If I run locally, do I have access to all of the features?
Yes. The Voxta server runs on your machine (default `http://127.0.0.1:5384`). It supports local LLMs (KoboldCpp, Oobabooga, LLamaSharp, ExLlamaV2/v3, etc.), local TTS (XTTS, Piper, Coqui, etc.), and local STT (Whisper). The VaM plugin doesn't care where the AI lives - same SignalR connection either way. The quality of the experience depends on the local models you pick, but no feature gets locked off.

5. Do I set personality outside of VaM? Does it save?
Yes. Characters (name, personality, scenario, system prompt, voice choice, example dialogue, memories) are created in the Voxta server's web UI. They persist server-side. In VaM you just select a character from the dropdown and the plugin stores that selection in your scene/preset - you don't redo it every time.

6. Is the VaM character aware of their in-game environment?
Partially, and it's up to the scene to feed it. The plugin exposes Context slots - string fields. E.g. your scene can write "User is in the bedroom. Lights are dim. It's 2 AM." into a context slot and the AI sees it as part of its prompt. There's also a Flags system (set/read flags) for AI-visible state and conditional actions. So awareness is as rich as your scene logic makes it - nothing automatic.
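A minimal sketch of feeding a context slot (the `sceneState` object and its field names are invented for this example; a real scene would read live atom and scene values via Scripter, then write the resulting string into one of the plugin's Context slots):

```javascript
// Build a short natural-language context string from scene state.
// Whatever ends up in the context slot is seen by the AI as part of its prompt.
function buildContext(sceneState) {
  const parts = [];
  parts.push("User is in the " + sceneState.room + ".");
  if (sceneState.lightsDim) parts.push("Lights are dim.");
  parts.push("It's " + sceneState.time + ".");
  return parts.join(" ");
}
```

For example, `buildContext({ room: "bedroom", lightsDim: true, time: "2 AM" })` produces exactly the context string from the answer above.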

7. Does the character respond to touch or interaction?
Not out of the box, but it's designed for exactly this. You hook VaM's collision triggers (or Scripter) into Voxta's TriggerMessage storable (injects a user message) or RequestCharacterSpeech (makes the character say something). You can also set flags or update context when touched. Example: a collision on a breast trigger -> Scripter writes "User is touching your chest" into Context, AI reacts accordingly. Again, plugin is the bridge; scene provides the inputs.
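One practical detail: collision triggers can fire many times per second, so you usually want a cooldown before injecting another message. A plain-JS sketch of that debounce (the `send`, `cooldownMs`, and `now` parameters are stand-ins for this illustration; in a real scene `send` would write to TriggerMessage or update a Context slot):

```javascript
// Wrap a message sender so repeated touch events within the cooldown
// window are ignored instead of flooding the AI with user messages.
function makeTouchHandler(send, cooldownMs, now) {
  let lastSent = -Infinity;
  return function onTouch(region) {
    const t = now();
    if (t - lastSent < cooldownMs) return false; // still cooling down, drop it
    lastSent = t;
    send("User is touching your " + region + ".");
    return true;
  };
}
```

Injecting the `now` clock as a parameter keeps the sketch testable; a Scripter version would just read the engine's time directly.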

8. Voxta can only be applied to one character per scene?
No. A single plugin instance supports up to 3 characters (character 1, 2, 3 with separate roles and names). And you can add the plugin to multiple Person atoms if you want truly independent brains. Multi-character conversations are supported server-side.

9. Speech only, or can it also do moans/sighs/etc.?
Depends on the TTS engine you pick in Voxta, not the plugin. Expressive models (XTTS, ElevenLabs, some fine-tuned voices) will vocalize moans, sighs, laughter, gasps when the LLM emits them - either as inline action tags or via audio-tag syntax the TTS supports. Basic TTS engines won't. So: prompt your character to use those cues + pick a capable voice = yes, it works. It's emergent from LLM output + TTS capability, not a dedicated plugin feature.
 
Acid Bubbles updated Voxta (Virt-A-Mate Plugin) with a new update entry:

Voxta Plugin v1.3.0

v1.3.0 — VaM package #87

Highlights

A UI overhaul of the plugin's Custom UI plus VR HUD fixes and smaller polish.

New

  • Reorganized plugin Custom UI into 6 focused tabs with color-coded tabs and primary/destructive action buttons
  • "Open Plugin UI" HUD button — open the full plugin UI directly from the HUD
  • Start/Stop Chat button on the Chat tab, with label...

Read the rest of this update entry...
 
Acid Bubbles updated Voxta (Virt-A-Mate Plugin) with a new update entry:

Voxta Plugin v1.6.0

Highlights

A floating chat input bar so you can type to the character (great for VR, clutch when voice isn't working), native subtitles with no VAMOverlays dependency (now with a distinct color for /event, /note, and /story narration), and screen-share vision so the character can actually see what's on your monitor.

New — Chat input bar​

  • Floating chat input bar at the bottom of the view. Type a message, hit Enter...

Read the rest of this update entry...
 
Acid Bubbles updated Voxta (Virt-A-Mate Plugin) with a new update entry:

Voxta Plugin v1.7.0

v1.7.0 — VaM package #91

Highlights


Character POV vision, a full debug toolkit for tuning it, and live preview panels right in the Settings tab so you can actually see what the AI sees — no more guessing.

New — Eyes vision (POV)

  • Let Character See Through Eyes — new independent toggle. A POV camera parented to the character's head bone captures what they see, complementing the existing...

Read the rest of this update entry...
 
Acid Bubbles updated Voxta (Virt-A-Mate Plugin) with a new update entry:

Voxta Plugin v1.8.0

v1.8.0 — VaM package #94

Highlights


The AI now sees your scene cleanly — no more red dots, blue lines, or floating bulbs polluting its view. Vision toggles remember what you set them to. Chat input works on the first try when you trigger it from VaM buttons. A bunch of "should just work" stuff that wasn't quite working — now does.

New — Clean vision captures

  • Editor handles and gizmos are hidden from...

Read the rest of this update entry...
 