What does the setup look like in your Unity scene? I didn't find anything in the documentation.
by user 684154017479524352
You can find the sample scene here right now on the image-tracking
branch: https://github.com/needle-tools/needle-engine-samples/tree/samples/image-tracking
Can you estimate yet how long it will take until iOS devices are supported?
by user 684154017479524352
cc @herbst🌵
What do you think, is the Samsung S8+ too old? I couldn't do AR image tracking with this device. EDBYTO – e-learning applications for WebXR.
by user 684154017479524352
Did you enable the WebXR Incubations flag via chrome://flags/?
Image tracking currently requires it.
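For context, a minimal sketch of the kind of session request Chrome's WebXR image-tracking incubation expects: the `"image-tracking"` feature string plus a `trackedImages` array of `{ image, widthInMeters }` entries. The option names follow Chrome's marker-tracking explainer and only work with the WebXR Incubations flag enabled; this helper is hypothetical, not Needle Engine API.

```typescript
// Hedged sketch: build the session options for Chrome's experimental
// WebXR image tracking (needs chrome://flags/#webxr-incubations).
// `image` would be an ImageBitmap of the marker in a real browser.
function imageTrackingSessionInit(image: unknown, widthInMeters: number) {
  return {
    requiredFeatures: ["image-tracking"],
    // physical width of the printed marker, in meters
    trackedImages: [{ image, widthInMeters }],
  };
}

// Usage (browser only):
// const session = await navigator.xr.requestSession(
//   "immersive-ar", imageTrackingSessionInit(imageBitmap, 0.2));
```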
The scene doesn't use textures but vertex colors, and that's not supported by QuickLook, I believe.
Yes, I had already set that.
by user 684154017479524352
I actually added image tracking support for QuickLook yesterday
Have to clean it up a bit but works already.
There are unfortunately some QuickLook limitations on iOS, so the above sample won't work.
I'll take a look at whether baking vertex colors to a texture is feasible at runtime.
At some point we’ll potentially have a validator of sorts in the editor to show when a scene will / will not work in QuickLook
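To illustrate the idea of baking vertex colors to a texture (this is a hypothetical sketch, not the actual Needle Engine implementation): one simple approach is to give every vertex its own texel and point its UV at that texel's center, so a viewer that ignores vertex colors can still sample them from a texture.

```typescript
// Hedged sketch: bake per-vertex RGB colors into a square texture
// and emit matching UVs, one texel per vertex.
function bakeVertexColors(colors: [number, number, number][]) {
  const size = Math.ceil(Math.sqrt(colors.length)); // texture side length
  const pixels = new Uint8Array(size * size * 3);   // RGB texel data
  const uvs: [number, number][] = [];
  colors.forEach(([r, g, b], i) => {
    pixels.set([r, g, b], i * 3);
    const x = i % size;
    const y = Math.floor(i / size);
    // UV points at the center of texel (x, y)
    uvs.push([(x + 0.5) / size, (y + 0.5) / size]);
  });
  return { size, pixels, uvs };
}
```

In a real pipeline the `pixels` buffer would become a texture and `uvs` would replace the mesh's UV attribute before export.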
I was wrong there. There is NO image tracking on iOS, only AR.
by user 684154017479524352
Image tracking on iOS is not released yet. Felix just worked on it yesterday.
Image tracking on iOS works great. My textures are also displayed. Many thanks to Halle :)
by user 684154017479524352
Thanks, great to hear!
My own workaround is to use an AnimateBy preliminary…Action to drop the asset into place at scene start. AR Quick Look (ARQL) handles both well-designed scenes and badly authored assets with the origin/pivot in wild places: it first calculates a bounding box of the whole scene, wrongly assuming that its bottom lies on the ground plane.
by user 277531393297350676
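The grounding behavior described above can be sketched as a small calculation (a hypothetical illustration, not AR Quick Look's actual code): take the minimum Y over all scene points, i.e. the bottom of the bounding box, and the offset that shifts it onto y = 0 is its negation.

```typescript
// Hedged sketch: the vertical offset a viewer would apply when it
// assumes the scene's bounding-box bottom rests on the ground plane.
interface Vec3 { x: number; y: number; z: number; }

function groundOffset(points: Vec3[]): number {
  // bottom of the axis-aligned bounding box
  const minY = Math.min(...points.map(p => p.y));
  // shifting everything by -minY puts that bottom on y = 0
  return -minY;
}
```

If the author's pivot is in a "wild place", this offset moves the whole asset, which is exactly why a preliminary repositioning action can be needed as a workaround.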