HeadTracker

Eugene-E0a80fd8080ff8e submitted a new resource:

HeadTracker - Rotate the camera with the rotation of your head. A webcam is needed

This is the Head Tracking plugin. It moves the camera according to your head movements.
First, you set up a webcam either on top of or under your monitor, then you run HeadTracker6DoF.exe, which is a head pose sensor (sources are available on my github). Next, you add this plugin.

You may use the F2 button to change the middle point. Move the camera with your head to whatever should become the new middle point, then hold F2 while returning your head to a normal position. The camera stays still while you hold F2...
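In effect, holding F2 builds a new offset between your physical head pose and the in-game camera. A purely hypothetical sketch of that recentering logic (not the plugin's actual code; poses treated as simple additive 6-vectors for brevity) could look like:

```python
# Hypothetical sketch of F2-style recentering: while the key is held the
# camera keeps its frozen pose; on release, the difference between the frozen
# camera pose and the current head pose becomes the new offset.
import numpy as np

class Recenterer:
    def __init__(self):
        self.offset = np.zeros(6)      # x, y, z, yaw, pitch, roll
        self.frozen_output = None      # camera pose held while the key is down

    def update(self, head_pose, key_held):
        head_pose = np.asarray(head_pose, dtype=float)
        if key_held:
            if self.frozen_output is None:
                # Key just pressed: freeze the camera where it is now.
                self.frozen_output = head_pose + self.offset
            return self.frozen_output
        if self.frozen_output is not None:
            # Key released: fold the head movement made while holding the key
            # into the offset, which shifts the "middle point".
            self.offset = self.frozen_output - head_pose
            self.frozen_output = None
        return head_pose + self.offset
```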

 
Case
Don't get me wrong - I really dig your work, and I'm beyond happy someone in the VaM community is paving the way in terms of DIY (body) tracking, but ... I really liked your PhoneController app, and I was hoping you'd further explore the possible improvements you mentioned when you published it? (IIRC, you pondered using some custom libraries (SENSORS_6DOF?) to help with drift correction).
For background: I'd wager many people in the community can manage to save up the dough for a Quest or similar at some point, but there are only very, very few peeps that can afford dedicated commercial full-body tracking solutions. AFAIK, it's rare even amongst the semi-professional scene creators.

Granted, there's Driver4VR, which offers various DIY solutions for lower-body tracking - but my impression is that they're more focused on precise foot tracking for action games & VRChat? (I haven't tried it yet, but I would wager that, say, a Kinect would have trouble correctly tracking at least one coordinate of the hip, since the sensors all face in one direction.)

TL;DR - I think there'd be a lot of demand in the VaM community for a good, reasonably accurate DIY hip-tracking solution. Full lower-body tracking would be more awesome yet, but I think hip tracking is going to be most critical, since so much of animation & posture is influenced by that atom/controller.
Eugene-E0a80fd8080ff8e
Although this might seem like a step towards body tracking, it is not (yet).
This plugin only manages the camera: you turn your head to the left, VaM's camera turns to the left, and the image on your display moves to the right (so you see what was beyond the left side of your monitor). You have to keep your eyes on the monitor. Eyes are not tracked, just the head.

This is similar to what MS Flight Simulator people use. Search YouTube for "FaceTrackNoIR" -- there are a lot of videos for this application of head tracking.

I understand that - I was just trying to pester you into working on upgrading (the drift-correction of) your PhoneController plugin ;)

Eugene-E0a80fd8080ff8e
For hip tracking :) , this seems to be an easy task -- just slap an ArUco marker on your butt, and you're good to go. Probably copying an example snippet from the OpenCV docs would be enough for a PoC.
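For the curious, a minimal sketch in that spirit (single marker, single webcam) might look like the one below. The marker size, camera intrinsics and dictionary are placeholder assumptions, not values from any existing tool, and a real setup would need proper camera calibration:

```python
# Hypothetical PoC: detect one ArUco marker with OpenCV and estimate its
# 6DoF pose relative to the camera. Uses the OpenCV >= 4.7 ArucoDetector API.
import cv2
import numpy as np

MARKER_SIZE = 0.05  # marker side length in metres (assumption)

# Placeholder intrinsics; replace with values from cv2.calibrateCamera.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

# 3D corners of the marker in its own coordinate frame.
half = MARKER_SIZE / 2
obj_points = np.array([[-half,  half, 0],
                       [ half,  half, 0],
                       [ half, -half, 0],
                       [-half, -half, 0]], dtype=np.float32)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is not None:
        # Pose of the first detected marker: Rodrigues rotation + translation.
        found, rvec, tvec = cv2.solvePnP(obj_points,
                                         corners[0].reshape(4, 2),
                                         camera_matrix, dist_coeffs)
        if found:
            print("rotation:", rvec.ravel(), "translation (m):", tvec.ravel())
    cv2.imshow("aruco", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```

Getting those numbers into VaM (and smoothing them) is then a separate problem.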

I'm sorry, you're a few floors over my head here ...

* What's this OpenCV you speaketh of?
* What's PoC? (not Person of Color, I presume)?

I understand (theoretically) that an ArUco marker can be used as a target for tracking software. But in order to supply VaM with real-time 6DOF coordinates, I guess I'd still need:
* Camera(s) - (be it a webcam, a Kinect, or whatnot - what would be your recommendation?)
* Software to convert the camera input into coordinates
* An interface with VaM.

I was trying to interest you in coding the latter two (or maybe writing a tutorial, if VaM-compatible solutions already exist) - otherwise, I'd be standing around looking stupid with a useless marker on my butt ... ?

TL;DR - "I'm asking you to speak to me as you would to a small child, or a dog ..."
 
Hello!

* What's this OpenCV you speaketh of?
OpenCV is an open-source computer vision and image processing library. You can find out more on its website, Wikipedia or even YouTube.
* What's PoC? (not Person of Color, I presume)?
PoC is a Proof of Concept. It refers to a minimal program whose existence proves that the idea is doable.
I understand (theoretically) that an ArUco marker can be used as a target for tracking software. But in order to supply VaM with real-time 6DOF coordinates, I guess I'd still need:
* Camera(s) - (be it a webcam, a Kinect, or whatnot - what would be your recommendation?)
I use a Genius ECam 8000 and I would recommend AGAINST buying anything similar. Its Full HD picture looks like it was upscaled, and I was not able to get more than 16 FPS out of the cam.
* Software to convert the camera input into coordinates
* An interface with VaM.
I was trying to interest you in coding the latter two (or maybe writing a tutorial, if VaM-compatible solutions already exist) - otherwise, I'd be standing around looking stupid with a useless marker on my butt ... ?
Well, let's say it is on my list, but I doubt proper motion capture is doable by a team of just one person.
Right now I am working on another interesting project :) Soon you will see.

Regards, Eugene.
 
Eugene-E0a80fd8080ff8e updated HeadTracker with a new update entry:

major bugfixes and opentrack support

A major bug has been fixed. This should ease the shaking. If you have adjusted the deadband, dampings and limiters to high values, you may want to review them.

Also, support for opentrack was added. You may use it with an IR clip, TrackAI or any other tracker.
To use it, run opentrack with its output set to "UDP over network" with IP 127.0.0.1 and port 62731.
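For reference, opentrack's "UDP over network" output sends one small datagram per frame; to my understanding it carries six little-endian doubles: x, y, z (in cm) followed by yaw, pitch, roll (in degrees). A minimal listener sketch, just to inspect what arrives on that port (run it without the plugin, since only one program can bind the port):

```python
# Sketch of a receiver for opentrack's "UDP over network" output.
# Assumption: each datagram carries six little-endian doubles:
# x, y, z in centimetres, then yaw, pitch, roll in degrees.
import socket
import struct

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("127.0.0.1", 62731))  # same IP/port as configured in opentrack

while True:
    data, _ = sock.recvfrom(64)
    if len(data) < 48:  # not a full pose packet
        continue
    x, y, z, yaw, pitch, roll = struct.unpack("<6d", data[:48])
    print(f"pos=({x:.1f}, {y:.1f}, {z:.1f}) cm  "
          f"yaw={yaw:.1f} pitch={pitch:.1f} roll={roll:.1f} deg")
```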

 
This works really smoothly now with opentrack's neuralnet tracker.
 