
[VaM 1.x] Using My Phone as a Motion Controller for VaM (Live Motion + OSR Sync)


gesmax · New member · Joined: Nov 4, 2025 · Messages: 1 · Reactions: 0
Hi guys,

If you’ve ever tried to push “full immersion” with VaM + a stroker robot, you’ve probably hit the same wall I did:
the setup and workflow often feel more like tinkering than playing — taping trackers/controllers, chaining extra tools, fighting connections, and still not getting the exact motion you want.
On top of that, even if you already own a multi-axis OSR (SR6/OSR2+), finding motion content that really uses those axes (and matches your taste) can be surprisingly hard.

So I asked a different question:
What if we could create motion for VaM in real time — like using a motion-aware controller — and sync that same motion to an OSR device at the same time?

The idea

Use your phone as a lightweight motion-capture controller:
  • Your phone captures motion (gyro + touch)
  • It streams motion via UDP to VaM (for character control)
  • And it can also drive an OSR device in sync
This is exactly what I’ve been building into an app I’m developing called MiraPlay AiO. I’ll keep the feature list short here—this post is mainly about the workflow and why it’s fun in VaM.
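To make the pipeline concrete, here is a minimal sketch of the "phone streams motion via UDP" step. The packet format (a small JSON datagram with pitch/roll/twist/stroke fields) and the listener address are illustrative assumptions for this post, not MiraPlay AiO's actual wire protocol:

```python
import json
import socket

# Assumed listener address for a VaM-side plugin accepting UDP motion input.
VAM_HOST, VAM_PORT = "127.0.0.1", 54321

def encode_sample(pitch: float, roll: float, twist: float, stroke: float) -> bytes:
    """Pack one motion sample as a JSON datagram (hypothetical format)."""
    return json.dumps({
        "pitch": pitch,    # degrees, from the phone's gyro
        "roll": roll,      # degrees
        "twist": twist,    # degrees
        "stroke": stroke,  # 0.0-1.0 linear position
    }).encode("utf-8")

def send_sample(sock: socket.socket, sample: bytes) -> None:
    """Fire-and-forget UDP send; no connection or handshake needed."""
    sock.sendto(sample, (VAM_HOST, VAM_PORT))

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_sample(sock, encode_sample(12.5, -3.0, 0.0, 0.5))
```

UDP is a natural fit here because motion samples are sent at a high rate and a dropped packet is immediately superseded by the next one, so no retransmission is wanted.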

Demo video

I recorded a single demo where VaM character motion + OSR motion are controlled together, and it shows all three Gesture Sync modes:

Gesture Sync Demo Video — Auto / 4x / 6x
What “Gesture Sync” actually controls

- Auto Mode (one-hand, casual)
  • Phone tilt controls: Pitch + Roll
  • Phone twist controls: Twist
  • Stroke: either
    1. auto speed based on phone movement intensity, or
    2. fixed stroke speed via a slider
  • Best for relaxed, quick control. Trade-off: stroke speed isn’t fully “drawn” in real time.
- 4x Mode
  • Pitch / Roll / Twist = same as Auto (phone motion)
  • Stroke = finger swipe on a touch area (you draw the stroke path live)
- 6x Mode (full control, two-hand, advanced)
  • Everything in 4x, plus:
  • Second touch area controls Surge + Sway (lateral axes)
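The three modes above can be summarized as a simple input-to-axis table. This is just an illustrative sketch of the mapping described in this post (axis names follow common multi-axis stroker conventions; the structure and function names are mine, not MiraPlay AiO's API):

```python
# Which input drives which axis in each Gesture Sync mode (illustrative).
PHONE_MOTION = ("pitch", "roll", "twist")  # gyro-driven in every mode

MODE_INPUTS = {
    "auto": {
        **{a: "gyro" for a in PHONE_MOTION},
        "stroke": "auto speed or fixed slider",  # not drawn live
    },
    "4x": {
        **{a: "gyro" for a in PHONE_MOTION},
        "stroke": "touch area 1 (swipe)",        # drawn live
    },
    "6x": {
        **{a: "gyro" for a in PHONE_MOTION},
        "stroke": "touch area 1 (swipe)",
        "surge": "touch area 2",                 # lateral axes,
        "sway": "touch area 2",                  # second hand
    },
}

def axes_controlled(mode: str) -> set:
    """Return the set of OSR axes a given mode drives."""
    return set(MODE_INPUTS[mode])
```

So Auto and 4x both drive four axes (differing only in how stroke is generated), while 6x adds Surge and Sway for the full six.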

Why VaM users might care

This isn’t trying to replace VR full-body tracking or existing animation workflows.
It’s a different tool for a different kind of fun:
  • Live, improvisational character motion (not just “play an animation”)
  • A practical way to use extra OSR axes even when content doesn’t provide them
  • A lightweight alternative to “strap hardware / chain tools / troubleshoot for 30 minutes” workflows

Coming soon

Within ~2 weeks, I’m planning to ship a real-time motion recording + playback feature:
  • Perform a Gesture Sync “mocap” once
  • Record it
  • Next time, just play it back like a motion clip
This may become a major update that significantly reduces the “multi-axis funscript shortage” by making multi-axis motion creation far more accessible.
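The record-and-replay idea boils down to storing timestamped multi-axis samples and emitting them again in order. A minimal sketch of that concept (class and method names are my assumptions; the actual feature is not released yet):

```python
import time

class MotionRecorder:
    """Capture timestamped multi-axis samples, then replay them like a clip."""

    def __init__(self):
        self.samples = []   # list of (seconds_since_start, axes_dict)
        self._t0 = None

    def record(self, axes: dict, t: float = None) -> None:
        """Store one sample, timestamped relative to the first sample."""
        now = time.monotonic() if t is None else t
        if self._t0 is None:
            self._t0 = now
        self.samples.append((now - self._t0, dict(axes)))

    def playback(self):
        """Yield (timestamp, axes) in order; a real player would sleep
        until each timestamp before sending the sample to the device."""
        for t, axes in self.samples:
            yield t, axes
```

Recording once and replaying many times is what would turn a live Gesture Sync performance into reusable multi-axis content.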

About MiraPlay AiO

  • Currently free for all VaM and OSR users (I want more people to experiment and give feedback)
  • Designed to be simple to set up and quick to use
  • I’m adding and iterating features continuously

If you want setup notes or more details, I’ve put everything here:
 