Hi, I’m currently doing some benchmarking to get a feeling for what’s possible with Needle. Interestingly, performance is impacted by post-processing effects much more than in a vanilla three.js scene.
It doesn’t really matter which effect is active; as long as the composer is active, the frame rate drops.
Here’s the “minimal” scene in 4K: I just added three cubes, disabled the directional light, and added a Volume with only tone mapping active:
I compared it to three.js examples, which I modified on the fly to run in fullscreen as well, and confirmed that both apps use the same drawingBufferSize of 4K.
Here’s the three.js scene using bloom and tone mapping
Thank you for providing detailed information about your benchmarking results. It seems that you’re experiencing performance drops when using post-processing effects with Needle Engine compared to a vanilla Three.js setup. Here are a few aspects to consider for your investigation:
1. Post-processing workflow: Review how Needle Engine integrates the post-processing pipeline. Differences in how the composer is set up or how effects are applied can lead to performance impacts.
2. Asset management: Ensure that your assets, including textures used in post-processing, are optimized for performance. Large textures or complex materials can drastically affect rendering speed, especially at 4K resolutions.
3. Volume component settings: While you’ve indicated that only tone mapping is active, double-check for any additional settings in the Volume component that might be contributing to the load.
4. Profiler tools: Use GPU and CPU profilers to trace where the bottleneck occurs. Tools like Chrome’s DevTools or NVIDIA Nsight can give insight into frames per second and rendering times.
5. Post-processing effects: If you haven’t already, test with a single post-processing effect activated at a time. This helps identify whether a particular effect significantly impacts performance.
6. Update Needle Engine: If the performance issue persists in version 4.2.6 and you are aware of the black screen bug in 4.3.0, check whether any commits/fixes have landed that might address performance or compatibility.
7. Unity export settings: Make sure that the export settings from Unity are optimized for performance when using Needle.
If you’ve already tested all of the above, you might consider sharing your scene setup or the specific Needle post-processing configuration you are using, to help identify further optimization areas.
One thing I noticed is that you’re instantiating the WebGLRenderer with the antialiasing option enabled. For better comparability I tried to disable it by running the following in the console:
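The snippet in question, reconstructed from the property I quote in my follow-up below (the guard only exists so it also runs outside the browser; in the console, the global Needle object is already defined):

```javascript
// In the browser console the global `Needle` object already exists;
// this guard just makes the snippet self-contained elsewhere.
const Needle = globalThis.Needle ?? { Context: { DefaultWebGLRendererParameters: {} } };

// These parameters are applied to the *next* WebGLRenderer instance,
// so the renderer has to be recreated afterwards for it to take effect.
Needle.Context.DefaultWebGLRendererParameters.antialias = false;

console.log(Needle.Context.DefaultWebGLRendererParameters.antialias); // false
```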
I had been looking for a way to disable it but couldn’t find anything anywhere (docs/code/forum/Discord). I guess that’s because I was searching for antialiasing rather than multisampling…
My findings while playing with it: any kind of multisampling instantly kills performance. I know this is somewhat expected at high fill rates, but here it’s pretty severe. Even setting it to 1 (which doesn’t help visually in any way) drops the frame rate below 50, and I can hear my fans spin up.
But what’s even more interesting: When there’s no composer active (no Volume), the rendered scene has pretty good AA and good performance.
My hack from the post above, using Needle.Context.DefaultWebGLRendererParameters.antialias = false; and restarting the renderer, does not disable the AA in this case either.
From the looks of it, it might be an FXAA pass injected somewhere when no composer is active?
Currently looking into it as well. When a composer is active, the composer renders the scene. Looking at the pmndrs postprocessing demos, it seems they never set multisampling at all and instead use SMAA (which is added by the Antialias component in Needle Engine). But the demos haven’t been updated in two years, so I’m currently looking at the source code.
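A minimal sketch of that pattern with the pmndrs postprocessing package (scene and camera are placeholders, and this needs a browser WebGL context to actually run): MSAA stays off everywhere, with SMAA added as an effect pass instead.

```typescript
import { PerspectiveCamera, Scene, WebGLRenderer } from "three";
import { EffectComposer, EffectPass, RenderPass, SMAAEffect } from "postprocessing";

// Keep MSAA off entirely: antialias: false on the renderer, and the
// composer's multisampling option left at its default of 0.
const renderer = new WebGLRenderer({ antialias: false });
const scene = new Scene();
const camera = new PerspectiveCamera(50, 16 / 9, 0.1, 100);

const composer = new EffectComposer(renderer, { multisampling: 0 });
composer.addPass(new RenderPass(scene, camera));
// SMAA is a screen-space post pass, so its cost does not scale with
// resolution the way multisampled framebuffers do at 4K.
composer.addPass(new EffectPass(camera, new SMAAEffect()));

// Per frame: composer.render() instead of renderer.render(scene, camera).
```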
I’m toying around with SMAA as well. The current implementation of the AntialiasingEffect component lacks configurability, though: it doesn’t even apply the single currently exposed parameter (preset) to the underlying effect.
In the meantime I’ve also exposed the edge detection mode and found that DEPTH performs visually worst in my use case. COLOR gives a much better result. It’s also the underlying SMAAEffect’s default and should be used here too, I reckon.
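With the pmndrs SMAAEffect, both of these can be passed to the constructor; a sketch (SMAAPreset.HIGH is just an example value, and the camera is assumed to exist):

```typescript
import { EdgeDetectionMode, EffectPass, SMAAEffect, SMAAPreset } from "postprocessing";

// COLOR edge detection (the SMAAEffect default) also catches edges that
// DEPTH-based detection misses, e.g. coplanar surfaces of different colors.
const smaa = new SMAAEffect({
  preset: SMAAPreset.HIGH,
  edgeDetectionMode: EdgeDetectionMode.COLOR,
});

// Then add it to the composer as usual:
// composer.addPass(new EffectPass(camera, smaa));
```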
And instead of calling createNewRenderer, you can just do the above in global scope (not from within a class or component, but right at the top of main.ts, for example).
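For a project setup, that looks roughly like this at the top of main.ts, assuming the global Needle.Context used in the console hack corresponds to the Context export of @needle-tools/engine:

```typescript
// main.ts — this runs before any renderer is created, so the default
// parameters are picked up on the first WebGLRenderer instantiation.
import { Context } from "@needle-tools/engine";

Context.DefaultWebGLRendererParameters.antialias = false;
```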
Regarding AA: we’ll probably expose a setting for multisample steps in the Volume component, but we’re still looking into it.
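For context: in the pmndrs postprocessing package, such a setting corresponds to the EffectComposer’s multisampling option, which can be given at construction time or changed at runtime; presumably a Volume setting would forward to it. A sketch (the renderer is assumed to exist):

```typescript
import { EffectComposer } from "postprocessing";

// Hypothetical: `renderer` is an existing three.js WebGLRenderer.
declare const renderer: import("three").WebGLRenderer;

// 0 disables MSAA on the composer's internal render targets;
// higher values request that many samples (clamped by the GPU).
const composer = new EffectComposer(renderer, { multisampling: 0 });

// The sample count can also be changed later:
composer.multisampling = 4;
```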