How do I apply AR Face tracking/ Filter from Unity to Needle?

Just discovered Needle today. Playing with the image target and plane tracking in WebAR is already giving me goosebumps. I wanted to know if it's possible to develop a WebAR face tracking app with Needle, either with the ARCore or ARKit package in Unity, or with MediaPipe? If yes, can someone give me some direction? Thanks

Original Post on Discord

by user 373346625025736705

Hello. See here for a related discussion: Discord

by user 395602247196737546

Thanks for the reply, I have checked it out. Seems like I will have to use Needle Engine to achieve this. I have one more question: what is the purpose of these packages? GitHub - needle-mirror/com.unity.xr.arkit ("Provides native Apple ARKit integration for use with Unity's multi-platform XR API. Mirrored from UPM, not affiliated with Unity Technologies.")

by user 373346625025736705

Those are Unity's own privately developed packages. They do not live on any public repository, so the Needle guys mirror them to GitHub. This way, we can easily look into their code online, see a version history and so on. They are standard C# packages, so they do not work with Needle Engine.

I can see how the fact that both the engine and all of the mirror packages have "needle" in their names is a bit confusing, even more so as Needle Engine's fame overshadows everything else lately :wink:

by user 395602247196737546

True that. Basically, we as the company "Needle" provide a number of packages and services, and Needle Engine is the latest and biggest of those endeavours. Sorry for the confusion!

@zero-aspect yes, MediaPipe face tracking would be the way to go for cross-platform face tracking in the browser. While QuickLook on iOS supports face tracking, there’s no WebXR face tracking spec as of today unfortunately, so no Android support (and the iOS support is limited to rigid heads)
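Whichever tracker ends up providing the landmarks, the raw per-frame positions tend to jitter, which matters for a try-on effect. A minimal TypeScript sketch of exponential smoothing you could run each frame (the `LandmarkSmoother` name and the default alpha are illustrative assumptions, not from this thread):

```typescript
type Vec3 = { x: number; y: number; z: number };

// One-pole low-pass (exponential moving average) smoother for a
// stream of landmark positions. alpha in (0, 1]: 1 = no smoothing,
// small values = heavy smoothing (more lag, less jitter).
class LandmarkSmoother {
    private prev: Vec3 | null = null;
    constructor(private readonly alpha: number = 0.5) {}

    next(p: Vec3): Vec3 {
        if (!this.prev) {
            this.prev = { ...p };
        } else {
            this.prev = {
                x: this.prev.x + this.alpha * (p.x - this.prev.x),
                y: this.prev.y + this.alpha * (p.y - this.prev.y),
                z: this.prev.z + this.alpha * (p.z - this.prev.z),
            };
        }
        return { ...this.prev };
    }
}
```

Feed it one landmark per frame; the first sample passes through unchanged, later samples are blended with the running average.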

Got it. I will give it a try. I am basically trying to achieve a try-on effect with Needle. The hand tracking should be a good start for me. But I probably also need to figure out how to 1. render the camera feed and 2. calculate the 3D coordinates of the face landmarks from 2D data!? Never used TS or JS in my whole life. Gonna be fun :joy:

by user 373346625025736705
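On point 2 above (3D coordinates from 2D data): MediaPipe returns landmarks normalized to [0, 1] with the origin at the top left, so one common approach is to map them to NDC and place them at a fixed depth in front of the camera. A rough TypeScript sketch, assuming a simple pinhole camera looking down -Z (the function name and the fixed-depth simplification are illustrative, not Needle or MediaPipe API):

```typescript
type Landmark = { x: number; y: number; z: number };
type Vec3 = { x: number; y: number; z: number };

// Map a normalized landmark to a 3D point `distance` units in front
// of a pinhole camera with the given vertical FOV and aspect ratio.
function landmarkToWorld(
    lm: Landmark,
    verticalFovRad: number,
    aspect: number, // viewport width / height
    distance: number // depth in front of the camera, world units
): Vec3 {
    // Normalized [0,1] -> NDC [-1,1]; MediaPipe's y axis points down.
    const ndcX = lm.x * 2 - 1;
    const ndcY = 1 - lm.y * 2;
    // Half-height of the view frustum at the given distance.
    const halfH = distance * Math.tan(verticalFovRad / 2);
    return {
        x: ndcX * halfH * aspect,
        y: ndcY * halfH,
        z: -distance, // camera looks down -Z
    };
}
```

With three.js available you could instead use `camera.unproject` style helpers, but the math is the same idea: a landmark at the image center lands on the camera's forward axis.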

I have started playing with the MediaPipe example, but I'm struggling a lot to show the camera feed. I tried to follow the thread from @maXR, but no luck. This is what my background looks like: it detects the camera, but possibly some other component is hiding it. Any hints or references on how to approach this?

by user 373346625025736705

At first glance, it looks like you have to set the background to transparent :thinking: I'm at work atm but I'll try to have a look later :slightly_smiling_face:

by user 474974683163394049

Thanks for such a prompt reply!! I will keep trying and see if I can figure it out. I have made the background white, but then it just shows white. When I inspect the video component it does not select anything. Not sure what's going on, tbh.

by user 373346625025736705


by user 373346625025736705

Ah ok, so the video stream is there, the canvas is just "overlaying" it. Not quite sure how MediaPipe is implemented, but I guess you can bring your video in by setting the z-index of the canvas to e.g. -10. Doesn't solve your problem though :thinking:

by user 474974683163394049

To try things in a fresh project without MediaPipe, I used this project. It's an empty project with just the webcam, but I just see sky.

by user 373346625025736705

There is a VideoTexture class in three.js, which should help with using the video as the background. Not sure if Needle Engine supports video backgrounds or if you have to implement it yourself Oo

by user 474974683163394049

Understood. I will try that out if this goes nowhere. Let's see if anyone else from the Needle team has a way around this :frowning_with_open_mouth:

by user 373346625025736705

Some additional information before I give up and go to bed :frowning_with_open_mouth:
In the StackBlitz project I could somehow see the area the HTML video player occupies (Pic 1), but I don't see it in my local project (Pic 2).

There is a #shadow-root on Pic 2 which is not present in Pic 1. The code for both is the same.

by user 373346625025736705


by user 373346625025736705

This took me some time, so sorry for the late reply :confused: To get the camera feed, you'll have to do a few things:

  • First, remove the skybox from your scene.
  • I've set my camera background type to solid color with alpha = 0.
  • I removed the spectator camera, as this was confusing :sweat_smile:
  • Set this._video.playsInline = false;
  • Add CSS for your video element so it goes fullscreen and into your (now transparent) background:
video {
    position: fixed;
    overflow: hidden;
    min-width: 100%;
    min-height: 100%;
    width: auto;
    height: auto;
    z-index: -10;
}

by user 474974683163394049
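The CSS rule above can also be built and injected from a component script instead of a static stylesheet, which keeps the setup in one place. A sketch assuming that approach (the function name is made up for illustration; the rule text matches the snippet above):

```typescript
// Build the fullscreen-video CSS rule from the steps above as a string.
// A negative z-index pushes the video behind the rendering canvas.
function buildFullscreenVideoCss(zIndex: number = -10): string {
    return [
        "video {",
        "    position: fixed;",
        "    overflow: hidden;",
        "    min-width: 100%;",
        "    min-height: 100%;",
        "    width: auto;",
        "    height: auto;",
        `    z-index: ${zIndex};`,
        "}",
    ].join("\n");
}

// Browser-only usage (e.g. once at startup):
// const style = document.createElement("style");
// style.textContent = buildFullscreenVideoCss();
// document.head.appendChild(style);
```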

by user 474974683163394049

Thanks a lot for your time. For me it only sort of works when I put the video in the shadow DOM:

if (this.context.domElement.shadowRoot)
    this.context.domElement.shadowRoot.appendChild(this._video);

by user 373346625025736705
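A note on why this helps: Needle Engine renders inside a `<needle-engine>` element that uses a shadow root, so a video appended to the regular DOM may not end up layered with the canvas. The fallback logic from the snippet above can be sketched as a small helper (the helper name and the minimal structural types are assumptions for illustration, standing in for HTMLElement/ShadowRoot):

```typescript
// Choose where to append the video element: prefer the shadow root
// when one exists, otherwise fall back to the element itself.
interface NodeLike {
    appendChild(child: unknown): unknown;
}

interface HostLike extends NodeLike {
    shadowRoot?: NodeLike | null;
}

function pickVideoParent(domElement: HostLike): NodeLike {
    return domElement.shadowRoot ?? domElement;
}

// Browser-only usage, e.g. inside a component:
// pickVideoParent(this.context.domElement).appendChild(this._video);
```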