Extreme zoom in AR

Hi!
I’m working on an AR app where you can zoom/scale items, starting with a coffee bean (which starts at a realistic scale) and allowing you to scale all the way down to a virus. There are a total of about 8 things in the intermediate scales (bacterial cell, red blood cell, etc.) I am hoping to build an AR experience based on this 2D web-based interactive: Cell Size and Scale

I’ve made objects to scale in 3D and imported them into Needle as a .glb, but I’m running into some issues when I try to view them. I’m sure this is due to the huge amount of scaling that needs to happen (around 250,000x from coffee bean to virus). I was really hoping we could implement this directly with the Needle viewer’s AR mode on Android/iOS, but am running into issues.

  1. It works pretty well in Android/AR - I can place the coffee bean and scale all the way down to the virus. Issues with rendering - specifically, flickering of surfaces - happen with the smallest objects, though. And while scaling (pinching), the objects all become slightly stretched.

  2. In iOS/AR, there are a lot more problems. iOS appears to limit scaling to between 1% and 5000%. I’m sure this is reasonable for most things, but at these scale ranges the coffee bean is huge and the virus is not really visible. There are also a lot of issues with clipping of the larger objects.

  3. In the web-based Needle viewer (not using AR), there are limitations on zoom and also a lot of clipping issues.

Any ideas for solutions that might not be too difficult to implement? Note that I’m a total newbie to Needle and AR app development, but am learning some Unity. If there are any tutorials out there that might point me in the right direction, that would be great!

Hi!

For creating these kinds of “infinite zoom” experiences, you’re right that placing objects at their actual real-world scale will usually run into floating-point precision limits and other rendering challenges.

However, there are viable approaches you can use to create the experience you’re looking for. One solution takes advantage of the fact that you’ll usually not see more than two adjacent “zoom levels” at the same time. You can make a container for each zoom level and scale these containers within reasonable ranges (say, from 0..1 when a level first appears, and from 1..10 while you’re zooming in to the next level). You can also disable objects once they’re outside the current zoom range (too large to reasonably still view).
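As a rough illustration of that idea, here’s a minimal, engine-agnostic sketch (plain TypeScript, no Needle or three.js API; the function name `levelStates` and the fixed 10x step between levels are my assumptions) that maps one continuous zoom value to a per-level container scale plus a visibility flag. You’d then apply each result to the corresponding container object in your scene:

```typescript
// Hypothetical sketch: one container per zoom level, each level assumed to
// be 10x smaller than the previous one. A continuous "zoom" value drives
// every container's scale; only the current and adjacent levels stay visible.
type LevelState = { scale: number; visible: boolean };

function levelStates(zoom: number, levelCount: number): LevelState[] {
  const states: LevelState[] = [];
  for (let level = 0; level < levelCount; level++) {
    // t is how far past this level the zoom currently is (0 = "at" it).
    const t = zoom - level;
    // A level sits at scale 1 when you arrive and grows toward 10 as you
    // zoom past it; the next level approaches from 0.1 toward 1.
    const scale = Math.pow(10, t);
    // Hide levels more than one step away, so nothing ever reaches the
    // extreme scales that cause precision and clipping problems.
    const visible = Math.abs(t) <= 1;
    states.push({ scale, visible });
  }
  return states;
}
```

In a real scene you’d call this every frame and copy each `scale` onto the matching container (and toggle its active state from `visible`), so the renderer only ever deals with objects in roughly the 0.1..10 range.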

This approach is somewhat similar to the zoom levels you may see on an application like Google Maps; when you zoom in you’re not actually enlarging the current level, new levels are loaded in and the previous ones fade out.

On iOS, the interaction might then not be pinch-to-zoom based; instead, you’d have buttons like “+” and “-” or “zoom in” and “zoom out” that step to the next and previous zoom level, respectively. These would then trigger animations that convey the zooming.
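One way to drive those button-triggered transitions is a tiny controller that holds an integer target level and eases a continuous zoom value toward it each frame; that zoom value is what you’d feed into the container scaling above. A minimal sketch (the class name, `speed` constant, and frame-update shape are all my assumptions, not a Needle API):

```typescript
// Hypothetical sketch: discrete zoom stepping for "+"/"-" buttons.
// "zoom" is a continuous value that animates toward the selected level.
class ZoomController {
  zoom = 0;           // continuous zoom, used to scale the level containers
  private target = 0; // integer level chosen via the buttons

  zoomIn(maxLevel: number) { this.target = Math.min(this.target + 1, maxLevel); }
  zoomOut() { this.target = Math.max(this.target - 1, 0); }

  // Call once per frame with the elapsed time in seconds; moves zoom
  // toward the target level at a constant rate, then stops exactly on it.
  update(dt: number) {
    const speed = 2; // levels per second, tune to taste
    const delta = this.target - this.zoom;
    const step = Math.sign(delta) * Math.min(Math.abs(delta), speed * dt);
    this.zoom += step;
  }
}
```

Linear easing keeps the example short; in practice you’d likely use your engine’s tweening or an ease-in/out curve so the transition between levels feels smoother.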

Hello @Janet, did you get to try the solution Felix suggested?