Would it be possible to render objects after post processing?

I have a scene with post processing enabled, dof being one of the effects.
I have some objects designed to be unlit, crisp and “on top” of the rendered scene.

In the image you can see that the DoF affects the white object. The white object is also overlapped by the particle quad.
chrome_mGSu0TB6fS.png

Original Post on Discord

by user 219335701790064640


Here is what I found for you :brain: Discord

Hello SteffenMFureli. It seems like you’re facing an issue where DoF (Depth of Field) post-processing is affecting the visibility of “unlit” crisp objects that should appear on top of other elements.

Given that “unlit” objects should not be affected by lighting or post-processing effects, you might want to ensure that such objects are rendered after the post-processing effects are applied. This is typically done using render queues or layers that allow you to control the order in which objects are rendered.

Here’s a suggested approach to ensure the unlit objects are rendered on top and unaffected by DoF:

  1. Make sure the unlit objects have a material that does not interact with lighting or post-processing. This may involve setting the depthWrite property to false.
  2. You can use a RenderTexture to render the scene with DoF applied, then render the unlit objects onto this texture separately (see the documentation).
  3. Sort the render order of the objects, ensuring that the unlit objects are rendered last.

Step by step in pseudocode:

  1. Render the main scene to a RenderTexture with DoF applied.
  2. Change the material of unlit objects to ensure they are unaffected by post-processing effects.
  3. Render the unlit objects onto the RenderTexture.
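
As a rough three.js sketch of these steps (assuming an EffectComposer-based post-processing setup; the BokehPass parameters and the mainScene/overlayScene names are just examples, not your exact project), it could look like this:

import { PerspectiveCamera, Scene, WebGLRenderer } from 'three';
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js';
import { BokehPass } from 'three/examples/jsm/postprocessing/BokehPass.js';

const renderer = new WebGLRenderer({ antialias: true }); // sizing / DOM setup omitted
const camera = new PerspectiveCamera(60, window.innerWidth / window.innerHeight, 0.1, 100);

const mainScene = new Scene();    // everything that should receive DoF
const overlayScene = new Scene(); // the unlit, crisp "on top" objects

// Post processing: render the main scene and apply DoF
const composer = new EffectComposer(renderer);
composer.addPass(new RenderPass(mainScene, camera));
composer.addPass(new BokehPass(mainScene, camera, { focus: 10, aperture: 0.01, maxblur: 0.01 }));

renderer.autoClear = false; // we clear manually so the overlay pass keeps the composed image

function renderFrame() {
  renderer.clear();                      // clear color + depth for the main pass
  composer.render();                     // main scene with DoF applied
  renderer.clearDepth();                 // reset only the depth buffer, keep the composed colors
  renderer.render(overlayScene, camera); // unlit objects drawn after post processing, so no DoF
  requestAnimationFrame(renderFrame);
}
requestAnimationFrame(renderFrame);

The important part is that the overlay scene is rendered after composer.render(), with the depth buffer cleared but the color buffer preserved.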

I hope this helps resolve the visibility issue you’re experiencing. If you need assistance with specific implementation details, such as code, let me know!


How would you create a material that doesn’t interact with post-processing?

by user 219335701790064640

Here is what I found for you :brain: Discord

To create a material that doesn’t interact with post-processing effects like Depth of Field (DoF), you can adjust the properties of the material.

In the case of DoF, you would want to ensure that the material of the object has the depthWrite property set to false. This property controls whether the object writes to the depth buffer, which is used by post-processing effects like DoF to determine the blur amount based on depth.

Here’s a TypeScript code example of creating a material that doesn’t interact with post-processing:

import { MeshBasicMaterial } from 'three';

const unlitMaterial = new MeshBasicMaterial({ color: 0xffffff, depthWrite: false });

In this code, we create a MeshBasicMaterial for the unlit objects and set the depthWrite property to false. This will ensure that the unlit objects don’t affect the depth buffer used by post-processing effects, allowing them to appear on top unaffected by DoF.

Let me know if you need further assistance with this or any other related topic!


This would make the mesh not affect the DoF, but the pixels rendered from the material still receive the DoF effect, as it is applied to the whole render.

by user 219335701790064640

You have reached your limit of 2 messages… please try again in a few hours

Can anyone else look into this issue?

by user 219335701790064640

Hey @SteffenMFureli Have you resolved this issue?

I don’t understand your setup, since DOF should be a fullscreen effect and it shouldn’t affect just a part of the screen.

You can change the render order on the materials to help the sorting issues between the particles and the white circle object.
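
For example, a minimal three.js-level sketch (particleQuad and whiteCircle are placeholder names standing in for your existing objects):

import { CircleGeometry, Mesh, MeshBasicMaterial, PlaneGeometry } from 'three';

// Placeholder objects standing in for the particle quad and the white circle
const particleQuad = new Mesh(new PlaneGeometry(1, 1), new MeshBasicMaterial({ transparent: true }));
const whiteCircle = new Mesh(new CircleGeometry(0.2, 32), new MeshBasicMaterial({ color: 0xffffff }));

// Higher renderOrder draws later, so the circle is no longer hidden behind the particle quad
particleQuad.renderOrder = 0;
whiteCircle.renderOrder = 1;

// Optionally skip depth testing so the circle always appears on top of scene geometry
whiteCircle.material.depthTest = false;

Note that this only fixes the sorting between the two objects; the DoF is still applied to the whole frame afterwards.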

------

Ah, I see more details in this thread: Discord

Perhaps the simplest solution would be to use UI. Are you able to use a Screenspace canvas for the circle element?

For the dot itself that could work, yes, I’ll give it a go. Thanks :slight_smile:

But I also have a box I need to render without the DoF effect, like in the attached image.
I suppose a RenderTexture approach might be possible if combined with Screenspace UI? Something to look into at least.

Thanks for answering

by user 219335701790064640

You can have multiple cameras, but only the main cam renders automatically - you can have an onAfterRender() method that calls “renderNow” on your second cam. Make sure to turn depth clear on and color clear off on that second camera
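
A rough sketch of that idea, written against the underlying three.js renderer rather than a specific Needle camera API (the this.context.renderer access, the lifecycle timing relative to post processing, and the overlay scene/camera setup here are assumptions to verify against the current Needle Engine API):

import { Behaviour } from '@needle-tools/engine';
import { PerspectiveCamera, Scene } from 'three';

export class OverlayCameraRenderer extends Behaviour {
    // Hypothetical overlay content that should stay crisp and unaffected by DoF;
    // in a real setup you would match the overlay camera to the main camera's transform and aspect
    overlayScene = new Scene();
    overlayCamera = new PerspectiveCamera(60, 1, 0.1, 100);

    onAfterRender() {
        const renderer = this.context.renderer; // assumed: the underlying three.js WebGLRenderer
        const prevAutoClear = renderer.autoClear;
        renderer.autoClear = false;  // "color clear off": keep the already rendered frame
        renderer.clearDepth();       // "depth clear on": the overlay ignores main-scene depth
        renderer.render(this.overlayScene, this.overlayCamera);
        renderer.autoClear = prevAutoClear;
    }
}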

Oh damn, seriously?
Would it be possible for Needle to do this automatically with a render stack? It would help make this process easier :slight_smile:

by user 219335701790064640

It would - and you can also build that yourself if you want to. We didn’t because, well, the URP API just changes too much to be worth the effort compared to “here’s how you render a second camera on top”.

One early design choice we made for Needle Engine is that only one camera renders by default, in contrast to Unity where all cameras render by default in a random order - they kind of had to build render stacks to work around that legacy.

By the way, render textures work too :slightly_smiling_face: but I think for your specific effect here a second cam rendering on top is better.

I agree.
I’m doing a test now :slight_smile:

Never liked it when the output order is arbitrary - like script execution order in Unity.

by user 219335701790064640

I might be blind, but where is the “render now” method?

by user 219335701790064640