I need to implement a feature where the virtual hands representing the user's hands stop moving their fingers in 3D space. However, I still want to be able to use WebXR hand tracking for other purposes. Is there a way to achieve this?
I’m struggling to understand how hand movements are represented in XR. Specifically:
How is the movement of the real hand translated into XR (i.e., onto the virtual hand)?
Are finger movements handled through animations?
Is it done by lerping each finger based on the position of hand joints?
To manage virtual hand movements in WebXR while still allowing hand tracking, you can modify how the hand movements are processed. Here’s how you can address each of your concerns:
Translation of Hand Movement to XR: Virtual hands in XR environments like WebXR use the poses of hand joints obtained from the XRSession's joint tracking to position the virtual hand. The XRHandMeshModel in three.js, for example, updates the positions of the joints based on these tracked poses. The virtual representation uses these points to update the hand pose dynamically.
Handling Finger Movements: Finger movements can be managed via animations or directly through joint tracking as described above. In dynamic hand tracking scenarios like those provided by WebXR, direct joint positioning is common. The joint poses are fetched and updated per frame, which handles individual finger movements in near-real time.
Utilizing Lerp for Joints: You can indeed lerp between joint positions for smoother transitions if necessary. However, in real-time settings, direct updates from joint tracking are usually sufficient to capture realistic movement; interpolation is mainly useful for smoothing or filtering sensor noise.
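For illustration, here is a minimal, self-contained sketch of the lerp-based smoothing mentioned above. The `Vec3` type, the helper names, and the smoothing factor are all illustrative, not part of the WebXR API; in a real three.js project you would use `THREE.Vector3.lerp` instead.

```typescript
// Hypothetical minimal vector type for the sketch.
type Vec3 = { x: number; y: number; z: number };

function lerp(a: number, b: number, t: number): number {
    return a + (b - a) * t;
}

// Move `current` a fraction `t` of the way toward `target`.
function lerpVec3(current: Vec3, target: Vec3, t: number): Vec3 {
    return {
        x: lerp(current.x, target.x, t),
        y: lerp(current.y, target.y, t),
        z: lerp(current.z, target.z, t),
    };
}

// Each frame: nudge the rendered joint toward the freshly tracked
// pose instead of snapping to it, which filters sensor jitter.
const rendered: Vec3 = { x: 0, y: 0, z: 0 };
const tracked: Vec3 = { x: 1, y: 2, z: 0 };
const next = lerpVec3(rendered, tracked, 0.5);
// next is halfway between the rendered and tracked poses
```

A small smoothing factor (e.g. 0.2–0.3 per frame) gives heavier filtering at the cost of visible lag; 1.0 degenerates to direct joint updates.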
Implementation Strategy
To freeze finger movements but maintain tracking, you could:
Set the joint positions once and then halt further updates based on conditions (e.g., a user setting or specific gesture).
Disable joint position writing through condition checks in the update loop to freeze the visual representation but allow tracking data to continue being processed for other purposes.
Here’s a pseudo implementation strategy in TypeScript:
```typescript
import { Behaviour, NeedleXREventArgs } from "@needle-tools/engine";

class FreezeHandMovement extends Behaviour {
    shouldFreeze: boolean = true; // toggle this to freeze or unfreeze

    onUpdateXR(args: NeedleXREventArgs): void {
        const hand = args.xr.someHandRepresentation; // hypothetical hand object
        if (!this.shouldFreeze) {
            // update joints normally
            this.updateJointPositions(hand);
        } else {
            // optionally reset positions to a "resting" pose, or leave frozen
        }
    }

    updateJointPositions(hand: any) {
        // logic to update joint positions from XR hand-tracking data
    }
}
```
Hook this into the update cycle and process the joint data conditionally. You may also need to account for any other WebXR components you are using to keep their behavior consistent (see the documentation).
Hi @boss15493, to make sure I understand it right: you want to use hand tracking with the XR virtual hands for some amount of time, and then, through some event, freeze the hands (not disabling hand tracking altogether, but just controlling when the hand model is updated)?
Hi Marcel, that's exactly what I want. I'd like to create a hand poser so that when I grab an object, the hands remain in a predefined pose and can't move until the custom gesture I create is no longer recognized; then the hands return to moving normally.
Create a custom component that derives from the XRControllerModel component and use your custom component in the scene instead of the core XRControllerModel component.
Override the onBeforeRender method and call super.onBeforeRender as long as you want the virtual hand to follow hand tracking; stop calling super.onBeforeRender when you want the hand to be frozen.