Hi!
I’m working on an AR app where you can zoom/scale items, starting with a coffee bean at realistic scale and going all the way down to a virus, with about 8 objects at intermediate scales (bacterial cell, red blood cell, etc.). I am hoping to build an AR experience based on this 2D web-based interactive: Cell Size and Scale.
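For concreteness, here is my own back-of-envelope math (the 250,000x figure is from my scene; everything else is my assumption): with the coffee bean, the ~8 intermediate objects, and the virus spaced evenly on a log scale, each step between neighboring objects is only about a 4x change in size, even though the total range is enormous:

```python
import math

TOTAL_RATIO = 250_000  # coffee bean down to virus, from my scene
N_OBJECTS = 10         # coffee bean + ~8 intermediates + virus

# Per-step size ratio if the objects are spaced evenly on a log scale
steps = N_OBJECTS - 1
per_step = TOTAL_RATIO ** (1 / steps)
print(f"per-step ratio: {per_step:.2f}x")  # about 4x between neighbors

# Scale factor at each object, starting from the coffee bean at 1.0
scales = [per_step ** -i for i in range(N_OBJECTS)]
print(scales[0], scales[-1])  # 1.0 down to 1/250000
```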
I’ve modeled the objects to scale in 3D and imported them into Needle as a .glb, but I run into problems when I try to view them. I’m fairly sure this is due to the huge amount of scaling that needs to happen (around 250,000x from coffee bean to virus). I was really hoping we could implement this using the Needle viewer’s AR mode on Android/iOS, but each platform has its own issues.
- It works pretty well in Android AR: I can place the coffee bean and scale all the way down to the virus. However, the smallest objects render with flickering surfaces, and while pinch-scaling, all the objects become slightly stretched.
- In iOS AR there are a lot more problems. iOS appears to limit scaling to between 1% and 5000%. That is probably reasonable for most content, but across my scale range the coffee bean ends up huge and the virus is barely visible. There are also a lot of clipping issues with the larger objects.
- In the web-based Needle viewer (not using AR), zoom is limited and there are also a lot of clipping issues.
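In case it helps with diagnosis: my (possibly wrong) understanding is that the flickering is z-fighting from limited depth-buffer precision, which gets worse the further the near/far planes are stretched relative to the object size. A rough estimate of world-space depth resolution, assuming a standard 24-bit perspective depth buffer and camera settings I made up (not Needle’s actual defaults):

```python
# Rough world-space depth resolution of a perspective depth buffer.
# NDC depth is z(d) = (f+n)/(f-n) - 2fn/((f-n)*d), so dz/dd = 2fn/((f-n)*d^2),
# and one depth-buffer step (2/2^bits in NDC) corresponds to roughly
#   delta_d ~= d^2 * (f - n) / (f * n * 2^bits)  in world units.

def depth_resolution(d, near, far, bits=24):
    """Smallest distinguishable depth difference at distance d (world units)."""
    return d * d * (far - near) / (far * near * 2 ** bits)

# Assumed camera: near = 0.01 m, far = 1000 m, object held ~0.5 m away
res = depth_resolution(0.5, 0.01, 1000.0)
print(f"~{res * 1e6:.1f} micrometers of depth resolution at 0.5 m")

# Any surfaces of a model that are closer together than this along the
# view axis will z-fight, so a model scaled down ~250,000x can easily be
# thinner than the buffer can resolve unless the near plane is pulled in.
```

If that estimate is roughly right, it would explain why only the smallest objects flicker.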
Any ideas for solutions that might not be too difficult to implement? Note that I’m a total newbie to Needle and AR development, though I’m learning some Unity. If there are any tutorials out there that might point me in the right direction, that would be great!