
Title: Three.js Post Processing
Description: How to Post Process in THREE.js
TOC: Post Processing
*Post processing* generally refers to applying some kind of effect or filter to
a 2D image. In the case of THREE.js we have a scene with a bunch of meshes in
it. We render that scene into a 2D image. Normally that image is rendered
directly into the canvas and displayed in the browser but instead we can [render
it to a render target](threejs-rendertargets.html) and then apply some *post
processing* effects to the result before drawing it to the canvas. It's called
post processing because it happens after (post) the main scene processing.

Examples of post processing are Instagram-like filters, Photoshop filters,
etc...
THREE.js has some example classes to help set up a post processing pipeline. The
way it works is you create an `EffectComposer` and to it you add multiple `Pass`
objects. You then call `EffectComposer.render` and it renders your scene to a
[render target](threejs-rendertargets.html) and then applies each `Pass`.

Each `Pass` can be some post processing effect like adding a vignette, blurring,
applying a bloom, applying film grain, adjusting the hue, saturation, contrast,
etc... and finally rendering the result to the canvas.
It's important to understand how `EffectComposer` functions. It
creates two [render targets](threejs-rendertargets.html). Let's call them
**rtA** and **rtB**.

Then, you call `EffectComposer.addPass` to add each pass in the order you want
them applied. The passes are then applied *something like* this.
<div class="threejs_center"><img src="resources/images/threejs-postprocessing.svg" style="width: 600px"></div>
First the scene you passed into `RenderPass` is rendered to **rtA**, then
**rtA** is passed to the next pass, whatever it is. That pass uses **rtA** as
input to do whatever it does and writes the results to **rtB**. **rtB** is then
passed to the next pass which uses **rtB** as input and writes back to **rtA**.
This continues through all the passes.
Each `Pass` has 4 basic options

## `enabled`

Whether or not to use this pass

## `needsSwap`

Whether or not to swap `rtA` and `rtB` after finishing this pass

## `clear`

Whether or not to clear before rendering this pass

## `renderToScreen`

Whether or not to render to the canvas instead of the current destination render
target. Usually you need to set this to true on the last pass you add to your
`EffectComposer`.
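The ping-pong scheme above, combined with these options, can be sketched in
plain JavaScript. This is just a conceptual illustration, not the actual
`EffectComposer` source; the names `runPasses`, `readBuffer`, and `writeBuffer`
are made up for this sketch, and `clear`/`renderToScreen` are omitted for
brevity.

```js
// Conceptual sketch of how passes are applied over two render targets.
function runPasses(passes, rtA, rtB) {
  let readBuffer = rtA;   // holds the previous pass's output
  let writeBuffer = rtB;  // where the current pass renders
  for (const pass of passes) {
    if (!pass.enabled) continue;           // `enabled` skips the pass entirely
    pass.render(writeBuffer, readBuffer);  // read the old result, write the new
    if (pass.needsSwap) {
      // `needsSwap` exchanges the two targets so the next pass
      // reads what this pass just wrote.
      [readBuffer, writeBuffer] = [writeBuffer, readBuffer];
    }
  }
  return readBuffer;  // the final result ends up in the read buffer
}
```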
Let's put together a basic example. We'll start with the example from [the
article on responsiveness](threejs-responsive.html).

To that first we create an `EffectComposer`.

```js
const composer = new EffectComposer(renderer);
```
Then as the first pass we add a `RenderPass` that will render our scene with our
camera into the first render target.

```js
composer.addPass(new RenderPass(scene, camera));
```
Next we add a `BloomPass`. A `BloomPass` renders its input to a generally
smaller render target and blurs the result. It then adds that blurred result on
top of the original input. This makes the scene *bloom*.

```js
const bloomPass = new BloomPass(
    1,    // strength
    25,   // kernel size
    4,    // sigma ?
    256,  // blur render target resolution
);
composer.addPass(bloomPass);
```
Finally we add a `FilmPass` that draws noise and scanlines on top of its input.

```js
const filmPass = new FilmPass(
    0.35,   // noise intensity
    0.025,  // scanline intensity
    648,    // scanline count
    false,  // grayscale
);
filmPass.renderToScreen = true;
composer.addPass(filmPass);
```
Since the `filmPass` is the last pass we set its `renderToScreen` property to
true to tell it to render to the canvas. Without setting this it would instead
render to the next render target.
To use these classes we need to import a bunch of scripts.

```js
import {EffectComposer} from './resources/threejs/r132/examples/jsm/postprocessing/EffectComposer.js';
import {RenderPass} from './resources/threejs/r132/examples/jsm/postprocessing/RenderPass.js';
import {BloomPass} from './resources/threejs/r132/examples/jsm/postprocessing/BloomPass.js';
import {FilmPass} from './resources/threejs/r132/examples/jsm/postprocessing/FilmPass.js';
```
For pretty much any post processing, `EffectComposer.js` and `RenderPass.js`
are required.

The last things we need to do are to use `EffectComposer.render` instead of
`WebGLRenderer.render` *and* to tell the `EffectComposer` to match the size of
the canvas.

```js
-function render(now) {
-  time *= 0.001;
+let then = 0;
+function render(now) {
+  now *= 0.001;  // convert to seconds
+  const deltaTime = now - then;
+  then = now;

  if (resizeRendererToDisplaySize(renderer)) {
    const canvas = renderer.domElement;
    camera.aspect = canvas.clientWidth / canvas.clientHeight;
    camera.updateProjectionMatrix();
+    composer.setSize(canvas.width, canvas.height);
  }

  cubes.forEach((cube, ndx) => {
    const speed = 1 + ndx * .1;
-    const rot = time * speed;
+    const rot = now * speed;
    cube.rotation.x = rot;
    cube.rotation.y = rot;
  });

-  renderer.render(scene, camera);
+  composer.render(deltaTime);

  requestAnimationFrame(render);
}
```
`EffectComposer.render` takes a `deltaTime`, which is the time in seconds since
the last frame was rendered. It passes this to the various effects in case any
of them are animated. In this case the `FilmPass` is animated.

{{{example url="../threejs-postprocessing.html" }}}
To change effect parameters at runtime usually requires setting uniform values.
Let's add a GUI to adjust some of the parameters. Figuring out which values you
can easily adjust and how to adjust them requires digging through the code for
that effect.

Looking inside
[`BloomPass.js`](https://github.com/mrdoob/three.js/blob/master/examples/js/postprocessing/BloomPass.js)
I found this line:

```js
this.copyUniforms[ "opacity" ].value = strength;
```

So we can set the strength by setting

```js
bloomPass.copyUniforms.opacity.value = someValue;
```
Similarly looking in
[`FilmPass.js`](https://github.com/mrdoob/three.js/blob/master/examples/js/postprocessing/FilmPass.js)
I found these lines:

```js
if ( grayscale !== undefined ) this.uniforms.grayscale.value = grayscale;
if ( noiseIntensity !== undefined ) this.uniforms.nIntensity.value = noiseIntensity;
if ( scanlinesIntensity !== undefined ) this.uniforms.sIntensity.value = scanlinesIntensity;
if ( scanlinesCount !== undefined ) this.uniforms.sCount.value = scanlinesCount;
```

which makes it pretty clear how to set them.
Let's make a quick GUI to set those values

```js
import {GUI} from '../3rdparty/dat.gui.module.js';
```

and

```js
const gui = new GUI();
{
  const folder = gui.addFolder('BloomPass');
  folder.add(bloomPass.copyUniforms.opacity, 'value', 0, 2).name('strength');
  folder.open();
}
{
  const folder = gui.addFolder('FilmPass');
  folder.add(filmPass.uniforms.grayscale, 'value').name('grayscale');
  folder.add(filmPass.uniforms.nIntensity, 'value', 0, 1).name('noise intensity');
  folder.add(filmPass.uniforms.sIntensity, 'value', 0, 1).name('scanline intensity');
  folder.add(filmPass.uniforms.sCount, 'value', 0, 1000).name('scanline count');
  folder.open();
}
```

and now we can adjust those settings

{{{example url="../threejs-postprocessing-gui.html" }}}
That was a small step toward making our own effect.

Post processing effects use shaders. Shaders are written in a language called
[GLSL (Graphics Library Shading Language)](https://www.khronos.org/files/opengles_shading_language.pdf). Going
over the entire language is way too large a topic for these articles. A few
resources to get started would be [this article](https://webglfundamentals.org/webgl/lessons/webgl-shaders-and-glsl.html)
and maybe [the Book of Shaders](https://thebookofshaders.com/).

I think an example to get you started would be helpful though, so let's make a
simple GLSL post processing shader. We'll make one that lets us multiply the
image by a color.

For post processing THREE.js provides a useful helper called `ShaderPass`.
It takes an object with info defining a vertex shader, a fragment shader, and
the default inputs. It handles setting up which texture to read from to
get the previous pass's results and where to render to, either one of the
`EffectComposer`'s render targets or the canvas.

Here's a simple post processing shader that multiplies the previous pass's
result by a color.
```js
const colorShader = {
  uniforms: {
    tDiffuse: { value: null },
    color:    { value: new THREE.Color(0x88CCFF) },
  },
  vertexShader: `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1);
    }
  `,
  fragmentShader: `
    varying vec2 vUv;
    uniform sampler2D tDiffuse;
    uniform vec3 color;
    void main() {
      vec4 previousPassColor = texture2D(tDiffuse, vUv);
      gl_FragColor = vec4(
          previousPassColor.rgb * color,
          previousPassColor.a);
    }
  `,
};
```
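To actually use this shader we wrap it in a `ShaderPass` and add it to the
composer. Here's a sketch of that wiring; the import path mirrors the ones used
earlier in this article.

```js
import {ShaderPass} from './resources/threejs/r132/examples/jsm/postprocessing/ShaderPass.js';

// ShaderPass clones the shader's uniforms so each pass instance
// has its own copy.
const colorPass = new ShaderPass(colorShader);
colorPass.renderToScreen = true;  // this will be our last pass
composer.addPass(colorPass);
```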
Above, `tDiffuse` is the name that `ShaderPass` uses to pass in the previous
pass's result texture so we pretty much always need that. We then declare
`color` as a THREE.js `Color`.

Next we need a vertex shader. For post processing the vertex shader shown here
is pretty much standard and rarely needs to be changed. Without going into too
many details (see the articles linked above) the variables `uv`, `projectionMatrix`,
`modelViewMatrix` and `position` are all magically added by THREE.js.

Finally we create a fragment shader. In it we get a pixel color from the
previous pass with this line

```glsl
vec4 previousPassColor = texture2D(tDiffuse, vUv);
```
we multiply it by our color and set `gl_FragColor` to the result

```glsl
gl_FragColor = vec4(
    previousPassColor.rgb * color,
    previousPassColor.a);
```
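In plain JavaScript terms, what the shader computes per pixel is just a
component-wise multiply that leaves alpha untouched. This is a conceptual
sketch, not code from the shader; channel values run from 0.0 to 1.0 as in
GLSL.

```js
// What the fragment shader does for each pixel, expressed in JS.
function multiplyColor(pixel, color) {
  return {
    r: pixel.r * color.r,
    g: pixel.g * color.g,
    b: pixel.b * color.b,
    a: pixel.a,  // alpha passes through unchanged
  };
}
```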
Adding some simple GUI to set the 3 values of the color

```js
const gui = new GUI();
gui.add(colorPass.uniforms.color.value, 'r', 0, 4).name('red');
gui.add(colorPass.uniforms.color.value, 'g', 0, 4).name('green');
gui.add(colorPass.uniforms.color.value, 'b', 0, 4).name('blue');
```

gives us a simple postprocessing effect that multiplies by a color.

{{{example url="../threejs-postprocessing-custom.html" }}}
As mentioned above, all the details of how to write GLSL and custom shaders are
too much for these articles. If you really want to know how WebGL itself works
then check out [these articles](https://webglfundamentals.org). Another great
resource is just to
[read through the existing post processing shaders in the THREE.js repo](https://github.com/mrdoob/three.js/tree/master/examples/js/shaders). Some
are more complicated than others but if you start with the smaller ones you can
hopefully get an idea of how they work.
Most of the post processing effects in the THREE.js repo are unfortunately
undocumented so to use them you'll have to [read through the examples](https://github.com/mrdoob/three.js/tree/master/examples) or
[the code for the effects themselves](https://github.com/mrdoob/three.js/tree/master/examples/js/postprocessing).

Hopefully these simple examples and the article on
[render targets](threejs-rendertargets.html) provide enough context to get started.