.. _doc_advanced_postprocessing:

Advanced post-processing
========================

Introduction
------------

This tutorial describes an advanced method for post-processing in Godot.
In particular, it will explain how to write a post-processing shader that
uses the depth buffer. You should already be familiar with post-processing
generally and, in particular, with the methods outlined in the :ref:`custom post-processing tutorial <doc_custom_postprocessing>`.
Full screen quad
----------------

One way to make custom post-processing effects is by using a viewport. However,
there are two main drawbacks of using a Viewport:

1. The depth buffer cannot be accessed
2. The effect of the post-processing shader is not visible in the editor

To get around the limitation on using the depth buffer, use a :ref:`MeshInstance3D <class_MeshInstance3D>`
with a :ref:`QuadMesh <class_QuadMesh>` primitive. This allows us to use a
shader and to access the depth texture of the scene. Next, use a vertex shader
to make the quad cover the screen at all times so that the post-processing
effect will be applied at all times, including in the editor.
First, create a new MeshInstance3D and set its mesh to a QuadMesh. This creates
a quad centered at position ``(0, 0, 0)`` with a width and height of ``1``. Set
the width and height to ``2`` and enable **Flip Faces**. Right now, the quad
occupies a position in world space at the origin. However, we want it to move
with the camera so that it always covers the entire screen. To do this, we will
bypass the coordinate transforms that translate the vertex positions through the
different coordinate spaces and treat the vertices as if they were already in
clip space.
The vertex shader expects coordinates to be output in clip space, which are coordinates
ranging from ``-1`` at the left and bottom of the screen to ``1`` at the top and right
of the screen. This is why the QuadMesh needs to have a height and width of ``2``.
Godot handles the transform from model to view space to clip space behind the scenes,
so we need to nullify the effects of Godot's transformations. We do this by setting the
``POSITION`` built-in to our desired position. ``POSITION`` bypasses the built-in transformations
and sets the vertex position directly.
.. code-block:: glsl

    shader_type spatial;

    void vertex() {
        POSITION = vec4(VERTEX, 1.0);
    }
Even with this vertex shader, the quad keeps disappearing. This is due to frustum
culling, which is done on the CPU. Frustum culling uses the camera matrix and the
AABBs of Meshes to determine if the Mesh will be visible *before* passing it to the GPU.
The CPU has no knowledge of what we are doing with the vertices, so it assumes the
coordinates specified refer to world positions, not clip space positions, which results
in Godot culling the quad when we turn away from the center of the scene. In
order to keep the quad from being culled, there are a few options:

1. Add the QuadMesh as a child to the camera, so the camera is always pointed at it
2. Set the Geometry property ``extra_cull_margin`` as large as possible in the QuadMesh

The second option ensures that the quad is visible in the editor, while the first
option guarantees that it will still be visible even if the camera moves outside the cull margin.
You can also use both options.
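The cull margin can also be set from a script. A minimal sketch (the margin value is
an arbitrary large number, chosen only to comfortably exceed any realistic camera
distance, not a required constant):

::

    extends MeshInstance3D

    func _ready():
        # A sufficiently large cull margin inflates the AABB so the quad
        # is never frustum-culled, no matter where the camera looks.
        extra_cull_margin = 16384.0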
Depth texture
-------------

To read from the depth texture, we first need to create a texture uniform set to the depth buffer
by using ``hint_depth_texture``.

.. code-block:: glsl

    uniform sampler2D depth_texture : source_color, hint_depth_texture;
Once defined, the depth texture can be read with the ``texture()`` function.

.. code-block:: glsl

    float depth = texture(depth_texture, SCREEN_UV).x;
.. note:: Similar to accessing the screen texture, accessing the depth texture is only
          possible when reading from the current viewport. The depth texture cannot be
          accessed from another viewport to which you have rendered.
The values returned by ``depth_texture`` are between ``0.0`` and ``1.0`` and are nonlinear.
When displaying depth directly from the ``depth_texture``, everything will look almost
white unless it is very close. This is because the depth buffer stores objects closer
to the camera using more bits than those farther away, so most of the detail in the depth
buffer is found close to the camera. In order to make the depth value align with world or
model coordinates, we need to linearize the value. When we apply the projection matrix to the
vertex position, the z value is made nonlinear, so to linearize it, we multiply it by the
inverse of the projection matrix, which, in Godot, is accessible with the variable
``INV_PROJECTION_MATRIX``.
Firstly, take the screen space coordinates and transform them into normalized device
coordinates (NDC). NDC run from ``-1.0`` to ``1.0`` in the ``x`` and ``y`` directions and
from ``0.0`` to ``1.0`` in the ``z`` direction when using the Vulkan backend.
Reconstruct the NDC using ``SCREEN_UV`` for the ``x`` and ``y`` axes, and
the depth value for ``z``.

.. note::

    This tutorial assumes the use of the Vulkan renderer, which uses NDCs with a Z-range
    of ``[0.0, 1.0]``. In contrast, OpenGL uses NDCs with a Z-range of ``[-1.0, 1.0]``.
.. code-block:: glsl

    void fragment() {
        float depth = texture(depth_texture, SCREEN_UV).x;
        vec3 ndc = vec3(SCREEN_UV * 2.0 - 1.0, depth);
    }
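If you are targeting the OpenGL backend instead, the ``z`` axis must be remapped as
well, since its NDC Z-range is ``[-1.0, 1.0]``. A sketch of the corresponding
reconstruction:

.. code-block:: glsl

    void fragment() {
        float depth = texture(depth_texture, SCREEN_UV).x;
        // OpenGL NDC span [-1.0, 1.0] on all three axes,
        // so remap x, y, and z together.
        vec3 ndc = vec3(SCREEN_UV, depth) * 2.0 - 1.0;
    }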
Convert NDC to view space by multiplying the NDC by ``INV_PROJECTION_MATRIX``.
Recall that view space gives positions relative to the camera, so the ``z`` value will give us
the distance to the point.

.. code-block:: glsl

    void fragment() {
        ...
        vec4 view = INV_PROJECTION_MATRIX * vec4(ndc, 1.0);
        view.xyz /= view.w;
        float linear_depth = -view.z;
    }
Because the camera is facing the negative ``z`` direction, the position will have a negative ``z`` value.
In order to get a usable depth value, we have to negate ``view.z``.
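Putting the steps so far together, a complete shader that displays the linearized depth
as a grayscale value might look like the sketch below. The ``0.05`` scale factor is an
arbitrary choice to bring nearby distances into a visible range; adjust it to suit your
scene's scale:

.. code-block:: glsl

    shader_type spatial;
    // Skip lighting; we only output a debug color.
    render_mode unshaded;

    uniform sampler2D depth_texture : source_color, hint_depth_texture;

    void vertex() {
        POSITION = vec4(VERTEX, 1.0);
    }

    void fragment() {
        float depth = texture(depth_texture, SCREEN_UV).x;
        vec3 ndc = vec3(SCREEN_UV * 2.0 - 1.0, depth);
        vec4 view = INV_PROJECTION_MATRIX * vec4(ndc, 1.0);
        view.xyz /= view.w;
        float linear_depth = -view.z;
        // Darker = closer; the scale factor is arbitrary.
        ALBEDO = vec3(clamp(linear_depth * 0.05, 0.0, 1.0));
    }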
The world position can be constructed from the depth buffer using the following code. Note
that the ``INV_VIEW_MATRIX`` is needed to transform the position from view space into world space, so
it needs to be passed to the fragment shader with a varying.

.. code-block:: glsl

    varying mat4 CAMERA;

    void vertex() {
        CAMERA = INV_VIEW_MATRIX;
    }

    void fragment() {
        ...
        vec4 world = CAMERA * INV_PROJECTION_MATRIX * vec4(ndc, 1.0);
        vec3 world_position = world.xyz / world.w;
    }
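As a quick sanity check, the reconstructed world position can be visualized directly,
for example by wrapping each coordinate with ``fract()`` so that every 1x1x1 world-space
cell maps to a full color ramp. This is purely a debug visualization, not part of any
particular effect:

.. code-block:: glsl

    void fragment() {
        ...
        vec3 world_position = world.xyz / world.w;
        // Repeating color gradients should stay fixed in the world
        // as the camera moves if the reconstruction is correct.
        ALBEDO = fract(world_position);
    }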
An optimization
---------------

You can benefit from using a single large triangle rather than using a full
screen quad. The reason for this is explained `here <https://michaldrobot.com/2014/04/01/gcn-execution-patterns-in-full-screen-passes>`_.
However, the benefit is quite small and only becomes noticeable when running especially
complex fragment shaders.

Set the Mesh in the MeshInstance3D to an :ref:`ArrayMesh <class_ArrayMesh>`. An
ArrayMesh is a tool that allows you to easily construct a Mesh from Arrays for
vertices, normals, colors, etc.

Now, attach a script to the MeshInstance3D and use the following code:
::

    extends MeshInstance3D

    func _ready():
        # Create a single triangle out of vertices:
        var verts = PackedVector3Array()
        verts.append(Vector3(-1.0, -1.0, 0.0))
        verts.append(Vector3(-1.0, 3.0, 0.0))
        verts.append(Vector3(3.0, -1.0, 0.0))

        # Create an array of arrays.
        # This could contain normals, colors, UVs, etc.
        var mesh_array = []
        mesh_array.resize(Mesh.ARRAY_MAX) # Required size for the ArrayMesh Array.
        mesh_array[Mesh.ARRAY_VERTEX] = verts # Position of the vertex array in the ArrayMesh Array.

        # Create mesh from mesh_array:
        mesh.add_surface_from_arrays(Mesh.PRIMITIVE_TRIANGLES, mesh_array)
.. note:: The triangle is specified in normalized device coordinates.
          Recall, NDC run from ``-1.0`` to ``1.0`` in both the ``x`` and ``y``
          directions. This makes the screen ``2`` units wide and ``2`` units
          tall. In order to cover the entire screen with a single triangle, use
          a triangle that is ``4`` units wide and ``4`` units tall, double the
          screen's height and width.
Assign the same vertex shader from above and everything should look exactly the same.

The one drawback to using an ArrayMesh over using a QuadMesh is that the ArrayMesh
is not visible in the editor because the triangle is not constructed until the scene
is run. To get around that, construct a single triangle Mesh in a modeling program
and use that in the MeshInstance3D instead.