Hand Poser

How could I apply a hand poser like the one in Unity XR Toolkit in Needle?

Original Post on Discord

by user 691445074055397447


Here is what I found for you:

In Needle Engine, direct equivalents to Unity XR Toolkit components such as the “Hand Poser” are not explicitly documented in the types.md files. However, Needle Engine provides components for working with WebXR and VR input devices. You can use the WebXRController class to access and interact with VR controllers, which includes hand-tracking functionality if the device supports it.

To simulate a hand poser, you could develop a custom component that controls the pose of a 3D hand model based on the input from the WebXRController class, adjusting the hand model to match the position and rotation provided by the tracked controller or hand data.

Here’s a step-by-step pseudocode outline:

  1. Create a new Behaviour component for hand posing.
  2. Reference an Object3D for the hand model.
  3. In the update method, access the WebXRController for hand data.
  4. Apply the position and rotation data from the controller to the hand model.

Below is a TypeScript code snippet for such a component. This is not a feature built into Needle Engine, but a suggestion for how you could implement a custom hand poser:

```typescript
import { Behaviour, WebXRController } from "@needle-tools/engine";
import { Object3D } from "three";

export class HandPoser extends Behaviour {

    // Reference to the hand model Object3D
    private handModel?: Object3D;

    start(): void {
        // Placeholder: fetch the hand model here. For simplicity this uses the
        // object the component is attached to; replace it with your own lookup.
        this.handModel = this.gameObject;
    }

    update(): void {
        // Get the WebXRController for the left or right hand.
        // Note: this lookup is illustrative pseudocode; check your Needle Engine
        // version for the actual way to access the active XR controllers.
        const webXRController = this.context.mainCameraComponent.getController(/* left or right */);

        if (webXRController && this.handModel) {
            // Apply the tracked position and rotation to the hand model
            this.handModel.position.copy(webXRController.controller.position);
            this.handModel.rotation.setFromQuaternion(webXRController.controller.quaternion);
        }
    }
}
```

Please note that this is a high-level conceptual example. You would need to refine it with proper hand-model fetching and handle the different states of hand tracking and controller input according to your app's specific requirements and the capabilities of the WebXRController class.


Is there any way to use a Custom Editor in Needle?

by user 691445074055397447

Here is what I found for you:

When using the Component Compiler, Needle Engine lets you control how C# code is generated via attributes in your TypeScript code. The following attributes are available (a usage sketch follows the list):

  • @generate-component: Force the generation of the next class.
  • @dont-generate-component: Disable the generation of the next class.
  • @serializeField: Decorate the generated field with [SerializeField].
  • @type UnityEngine.Camera: Specify the type of the generated C# field.
  • @nonSerialized: Skip generating the next field or method (see the documentation).
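
To my understanding, these attributes are written as comments directly above the class or member they affect. The sketch below illustrates the idea under that assumption; the class and member names (MyScript, targetCamera, speed, recalculate) are made up for illustration:

```typescript
import { Behaviour, serializable } from "@needle-tools/engine";
import { Object3D } from "three";

// @generate-component  (force C# stub generation for this class)
export class MyScript extends Behaviour {

    // @type UnityEngine.Camera  (the generated C# field will be typed as Camera)
    @serializable(Object3D)
    targetCamera?: Object3D;

    // @serializeField  (the generated C# field is decorated with [SerializeField])
    @serializable()
    private speed: number = 1;

    // @nonSerialized  (no C# member is generated for this method)
    recalculate() {
        // runtime-only logic that Unity does not need to know about
    }
}
```

The Component Compiler then emits a matching partial C# class in the Needle.Typescript.GeneratedComponents namespace, which you can extend as shown further below.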

Additionally, you can extend generated components using the partial keyword, allowing you to add functionality like drawing gizmos, adding context menus, or introducing extra fields and methods that are not part of the built-in components. This method is helpful for customizing the behavior of generated C# classes (see the documentation).

If you wish to use a custom editor, you can write one for the generated C# stub components: extend the generated component via a partial class and register a custom editor for it. Below is an example of how you can do this:

```csharp
// This partial class is what the Needle component compiler generates for you:
namespace Needle.Typescript.GeneratedComponents
{
    public partial class MyCustomComponent : UnityEngine.MonoBehaviour
    {
        // Your generated fields and methods here
    }
}

// In your own (non-generated) file you can extend the generated component:
namespace Needle.Typescript.GeneratedComponents
{
    public partial class MyCustomComponent : UnityEngine.MonoBehaviour
    {
        public void MyAdditionalMethod()
        {
            // Your additional method logic here
        }

        private void OnValidate()
        {
            // Your validation logic here
        }
    }

#if UNITY_EDITOR
    // You can also add a custom editor for the component
    [UnityEditor.CustomEditor(typeof(MyCustomComponent))]
    public class MyCustomComponentEditor : UnityEditor.Editor
    {
        public override void OnInspectorGUI()
        {
            UnityEditor.EditorGUILayout.HelpBox("This is my sample component", UnityEditor.MessageType.None);
            base.OnInspectorGUI();
        }
    }
#endif
}
```

You can refer to the Needle Engine documentation for more details on controlling component generation and extending generated components.



Yes, all generated components are marked as partial, so you can both extend the components themselves in Unity (e.g. add custom code to them, add scene gizmo drawing, …) and add custom editors for them.

Okay, great! Thank you so much!

by user 691445074055397447