Hi guys,
If you’ve ever tried to push “full immersion” with VaM + a stroker robot, you’ve probably hit the same wall I did:
the setup and workflow often feel more like tinkering than playing — taping trackers/controllers, chaining extra tools, fighting connections, and still not getting the exact motion you want.
On top of that, even if you already own a multi-axis OSR (SR6/OSR2+), finding motion that really uses those axes (and matches your taste) can be surprisingly limiting.
So I asked a different question:
What if we could create motion for VaM in real time — like using a motion-aware controller — and sync that same motion to an OSR device at the same time?
The idea
Use your phone as a lightweight motion-capture controller:
- Your phone captures motion (gyro + touch)
- It streams motion via UDP to VaM (for character control)
- And it can also drive an OSR device in sync
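For anyone curious what that data flow looks like in practice, here's a rough Python sketch of the general pattern: read a few normalized axis values on the phone side and stream them as small UDP datagrams to the PC running VaM. The packet layout, address, and update rate below are placeholders for illustration, not the app's exact wire format.

```python
# Illustrative sketch of the phone -> PC streaming pattern.
# The 4-float packet layout, address, and 60 Hz rate are placeholders,
# not the exact format the app uses on the wire.
import socket
import struct
import time

VAM_ADDR = ("192.168.1.50", 9000)   # example: PC running VaM + a UDP receiver plugin
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_motion(pitch, roll, twist, stroke):
    """Pack four normalized axis values (0.0-1.0) into one datagram and send it."""
    sock.sendto(struct.pack("<4f", pitch, roll, twist, stroke), VAM_ADDR)

# ~60 Hz loop; replace the constants with live gyro / touch readings
while True:
    send_motion(0.5, 0.5, 0.5, 0.5)
    time.sleep(1 / 60)
```

The same stream can be fanned out to the OSR side, which is what keeps the character and the device in sync.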
Demo video
I recorded a single demo where VaM character motion + OSR motion are controlled together, and it shows all three Gesture Sync modes:
▶ Gesture Sync Demo Video — Auto / 4x / 6x
What “Gesture Sync” actually controls
- Auto Mode (one-hand, casual)
  - Phone tilt controls: Pitch + Roll
  - Phone twist controls: Twist
  - Stroke: either
    - auto speed based on phone movement intensity, or
    - fixed stroke speed via a slider
  - Best for relaxed, quick control. Trade-off: stroke speed isn’t fully “drawn” in real time.
- 4x Mode
  - Pitch / Roll / Twist = same as Auto (phone motion)
  - Stroke = finger swipe on a touch area (you draw the stroke path live)
- 6x Mode
  - Everything in 4x, plus:
    - Second touch area controls Surge + Sway (lateral axes)
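On the OSR side, SR6/OSR2+ firmware typically speaks TCode, so the mapping above boils down to turning those gesture values into axis commands. The sketch below only shows the shape of that translation; the scaling, update interval, and serial port are assumptions, not the app's actual implementation.

```python
# Hedged sketch: turn the gesture values described above into TCode axis
# commands (the protocol OSR2+/SR6 firmware commonly accepts). Scaling,
# interval, and port are assumptions, not the app's actual code.
import serial  # pyserial

AXES = {
    "stroke": "L0",  # up/down (drawn by swipe in 4x/6x, auto in Auto Mode)
    "surge":  "L1",  # forward/back (second touch area, 6x)
    "sway":   "L2",  # left/right (second touch area, 6x)
    "twist":  "R0",  # phone twist
    "roll":   "R1",  # phone tilt
    "pitch":  "R2",  # phone tilt
}

def tcode_line(values, interval_ms=50):
    """Build one TCode line from normalized axis values (0.0-1.0).

    Each axis becomes e.g. 'L05000I50': move to position 5000/9999
    over 50 ms, so a steady stream of these produces smooth motion.
    """
    parts = []
    for name, value in values.items():
        target = int(max(0.0, min(1.0, value)) * 9999)
        parts.append(f"{AXES[name]}{target:04d}I{interval_ms}")
    return " ".join(parts) + "\n"

# Example 6x-style frame: tilt -> pitch/roll, twist -> twist,
# swipe -> stroke, second touch area -> surge/sway.
osr = serial.Serial("COM3", 115200)  # assumed OSR serial port
osr.write(tcode_line({
    "pitch": 0.6, "roll": 0.4, "twist": 0.5,
    "stroke": 0.8, "surge": 0.5, "sway": 0.5,
}).encode("ascii"))
```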
Why VaM users might care
This isn’t trying to replace VR full-body tracking or existing animation workflows. It’s a different tool for a different kind of fun:
- Live, improvisational character motion (not just “play an animation”)
- A practical way to use extra OSR axes even when content doesn’t provide them
- A lightweight alternative to “strap hardware / chain tools / troubleshoot for 30 minutes” workflows
Coming soon
Within ~2 weeks, I’m planning to ship a real-time motion recording + playback feature:
- Perform a Gesture Sync “mocap” once
- Record it
- Next time, just play it back like a motion clip
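Under the hood the idea is straightforward: capture timestamped axis frames while you perform, then replay them on the same clock. Here's a rough Python sketch of that pattern (illustrative only; the shipped feature will live inside the app, not as a script, and `sample_axes` / `send_motion` stand in for the app's live gesture reader and its VaM/OSR output path):

```python
# Rough sketch of the record/replay pattern, not the shipped feature.
# `sample_axes` and `send_motion` are placeholders for the app's live
# gesture reader and its VaM/OSR output path.
import json
import time

def record(sample_axes, duration_s, path="gesture_clip.json"):
    """Capture timestamped axis frames (~60 Hz) from a live session."""
    start = time.monotonic()
    frames = []
    while (elapsed := time.monotonic() - start) < duration_s:
        frames.append({"t": elapsed, "axes": sample_axes()})
        time.sleep(1 / 60)
    with open(path, "w") as f:
        json.dump(frames, f)

def playback(send_motion, path="gesture_clip.json"):
    """Replay a recorded clip, re-sending each frame at its original time."""
    with open(path) as f:
        frames = json.load(f)
    start = time.monotonic()
    for frame in frames:
        delay = frame["t"] - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        send_motion(frame["axes"])
```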
About MiraPlay AiO
- Currently free for all VaM and OSR users (I want more people to experiment and give feedback)
- Designed to be simple to set up and quick to use
- I’m adding and iterating features continuously
If you want setup notes or more details, I’ve put everything here:
mirabotx.com