Now that the Meta Quest 3 is here, PCVR hand tracking is available through both Air Link and Virtual Desktop. The good news is that, in VR, VaM does a very good job of matching the onscreen hands to our own in terms of general position and orientation. But it has no idea about individual fingers moving, and no way to translate hand motions into commands.
I get that these features would require a lot of coding, and are probably going to be relegated to the "Maybe in V2 -- Maybe" category, but I'd love to be able to use hand tracking to summon and interact with menus, and to interact more fully and naturally with VaM characters -- squeezing, pinching, and gesturing with fingers.