
Merge pull request #12 from Ali-RS/master

Add PBR article
David Bernard 9 years ago
Parent
Commit
48e9cb2c21

+ 4 - 0
src/docs/asciidoc/jme3.adoc

@@ -151,6 +151,10 @@ Exporting OgreXML scenes from Blender to JME3
 *  <<jme3/advanced/jme3_srgbpipeline#,Gamma correction or sRGB pipeline>>
 *  <<jme3/shader_video_tutorials#,Videos: jME3 introduction to shaders video tutorial series>>
 *  link:http://www.youtube.com/watch?v=IuEMUFwdheE[Video: jME3 Material with Alpha Channel]
+*  Article: Physically Based Rendering (PBR)
+**  <<jme3/advanced/pbr_part1#,Physically Based Rendering – Part one>>
+**  <<jme3/advanced/pbr_part2#,Physically Based Rendering – Part two>>
+**  <<jme3/advanced/pbr_part3#,Physically Based Rendering – Part three>>
 
 *Physics Integration*
 

+ 128 - 0
src/docs/asciidoc/jme3/advanced/pbr_part1.adoc

@@ -0,0 +1,128 @@
+= Physically Based Rendering – Part one
+
+I’ve been looking at Physically Based Rendering (PBR from now on) for a few weeks now, because that’s what all the cool kids are talking about these days. I read almost everything the interweb has on it and finally, somehow, wrapped my head around the mechanics behind the concept.
+
+None of the papers I read gave me the epiphany, though; the understanding came little by little, literally after reading some of the papers 10 times.
+
+The intent of this series of posts is first to brush up on the concept of PBR from the artist’s point of view (the easy part :D), and then to explain the physical concepts behind it and what you have to understand as a developer.
+
+This paper aims to present PBR as I would explain it to my mother. You shouldn’t need a degree in image rendering theory, nor should you need to be a genius, to understand what’s coming.
+
+There are a lot of papers out there, with a lot of complicated words and equations, that assume a solid background in image rendering, lighting, shading, etc…
+
+I won’t assume this here. 
+
+Of course, I’d like this to be as accurate as possible, so if you have more information, or if the explanations are not clear, feel free to chime in.
+
+*I’m an artist, I want to know what PBR is:*
+
+So you’re an artist, and you have some experience making assets for games. The most commonly used model for describing a material is the Phong reflection model (from link:https://en.wikipedia.org/wiki/Bui_Tuong_Phong[Bui Tuong Phong], a clever guy who died very young).
+
+This model describes how light reflects on a material by splitting it into 3 factors: Ambient color, Diffuse color, and Specular color. This should sound familiar to 3D game artists.
+
+This model is a very rough approximation of what’s really going on when light hits the surface of a real-life material, but until now it has been pretty much enough for video games. Of course there are dozens of other models, and even modifications of the Phong model, but this one is the most used, and that’s the one we use in jMonkeyEngine.
+
+The issue with this model is that it’s complicated to make a material look consistent under different lighting environments.
+
+   * Ambient is supposed to represent ambient lighting: some sort of omnipresent dim light that tries to fake the indirect lighting coming from light reflecting off the surrounding objects.
+   * Diffuse is more straightforward: it’s the actual color of the object when it’s under a white light.
+   * Specular represents the color of the reflected highlights, and its intensity is driven by a “shininess” parameter (at least in jME, but that’s pretty common). The issue is that the specular color also drives the intensity: the brighter the color, the more intense the specular will be. (See the sketch after this list.)
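+
+To make these three factors concrete, here is a minimal GLSL-style sketch of the Phong model described above. It is an illustration only, not jME’s actual shader code; all the variable names are made up for the example.
+[source]
+----
+// Minimal Phong sketch (illustrative; all names are hypothetical)
+vec3 N = normalize(normal);    // surface normal
+vec3 L = normalize(lightDir);  // direction towards the light
+vec3 V = normalize(viewDir);   // direction towards the eye
+vec3 R = reflect(-L, N);       // light direction reflected about the normal
+
+vec3 ambient  = ambientColor;                                        // omnipresent dim light
+vec3 diffuse  = diffuseColor * max(dot(N, L), 0.0);                  // raw object color
+vec3 specular = specularColor * pow(max(dot(R, V), 0.0), shininess); // highlight; its color also drives its intensity
+
+vec3 color = ambient + lightColor * (diffuse + specular);
+----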
+
+All of this leads to a lot of tweaking to get a correct look, and the result may not look as good as it should under a different lighting environment. It also relies heavily on an artist’s best guesses about the material.
+
+So here comes Physically Based Rendering. Not that the previous one was not physically based…but whatever, that sounds pretty cool.
+
+Everybody has their own way of implementing PBR, but every implementation shares common goals and concepts.
+
+*Goals:*
+
+   * Ease the artist’s material workflow.
+   * More “photo-realistic” rendering.
+
+*Concepts:*
+
+   * Every surface has a (specular) reflection; even the roughest ones reflect at grazing angles.
+   * Energy conservation: a surface cannot reflect more light than it receives.
+
+This wraps up the entire concept, but how does it translate into practice?
+
+A material can now be described with 3 parameters:
+
+*Base color:* Base color is the raw color of the material; it’s also often referred to as the Albedo. It’s similar to the Diffuse color you know from the Phong model, but with some crucial differences:
+
+   * It should not contain any shading information. Very often with the Phong model, Ambient Occlusion (AO) is baked into the diffuse map. Here, Base color must be the raw color of the material.
+   * It does not only influence the diffuse color, but also the specular color of the material.
+
+*Metalness:* The degree of metallicity of the material. What does that mean? Is your material rather metallic, or rather not (non-metallic materials are called dielectric materials in the literature)? Some implementations call that parameter “specular”, but I find it pretty misleading as it’s completely different from the specular we know today. In practice, just start out with extreme values to get a feel for it: 1 for metallic, 0 for dielectric.
+
+image::jme3/advanced/metalness.png[metalness.png,width="320",height="250",align="center"]
+Here is the same material with metalness of 0 (dielectric) on the left and 1 (metallic) on the right.
+
+Of course there are intermediate values, but from my reading, most dielectric materials should vary between 0.04 and 0.1, while metals are usually at 1. Those values are based on real-life measurements, and you can find some references about them link:https://seblagarde.wordpress.com/2012/04/30/dontnod-specular-and-glossiness-chart/[here] and link:https://seblagarde.wordpress.com/2014/04/14/dontnod-physically-based-rendering-chart-for-unreal-engine-4/[here]. Note that those values are not subject to interpretation; they are “known” factors, and artists should follow them if they want to keep a realistic look.
+
+*Roughness:* The degree of roughness of the material: is your material smooth or rough? 0 means smooth, 1 means rough. Some implementations refer to this as Smoothness or Glossiness; that’s essentially the same thing, except inverted: 1 is smooth and 0 is rough. I find the term “Roughness” pretty much self-explanatory; it doesn’t leave room for misinterpretation.
+
+image::jme3/advanced/Roughness.png[Roughness.png,width="320",height="250",align="center"]
+Here is the same material with different levels of roughness, from 0 (left) to 1 (right). As opposed to metalness, this parameter is very artist-driven. The roughness of a material does not really depend on physics; it’s more related to micro-scratches, wear, etc… So that’s where artists should get creative!
+
+These parameters are the basics of PBR. Of course, each of them can be stored in a texture, and other common additional parameters can be used.
+
+*For example:*
+
+   * Normal map: the same as with the Phong model.
+   * AO map: since we can’t bake AO into the diffuse anymore, it’s now an extra channel.
+
+The nice thing is that Metalness, Roughness and AO are grayscale values, so each of them only uses one channel of a texture. You can therefore pack those 3 maps into a single texture, as the sketch below shows.
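+
+Here is a minimal GLSL-style sketch of reading such a packed map. The packing convention (metalness in red, roughness in green, AO in blue) and the sampler name are hypothetical; any channel assignment works as long as the engine and the artist agree on it.
+[source]
+----
+// Hypothetical packed map: R = metalness, G = roughness, B = AO
+vec4 mra = texture2D(m_MetalRoughnessAoMap, texCoord);
+float metalness = mra.r;
+float roughness = mra.g;
+float ao        = mra.b;
+// the alpha channel remains free for anything else
+----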
+
+You can find an example asset that should work in a typical PBR implementation here. This page showcases pretty well what the textures should look like. 
+
+That’s it for PBR from the artist point of view. Next week I’ll explain what’s under the hood for you fellow developers 😉
+
+*Updates (01/01/2015)*
+
+Since this post, I have had some discussions about it, and it appears it lacks some information about the different art pipelines you may come across, their differences, and what to expect from them.
+
+The post above is about the *Metalness Workflow*.
+
+The question I frequently got about it is: “How does one specify the specular color if you just have a black and white metalness texture?”
+
+The answer is: you do it in the albedo map.
+
+In the metalness workflow the albedo map is used for both diffuse color and specular color. When the metalness is zero (dielectric material) the base color is the diffuse color of the material. When the metalness is one (metallic material), the base color is the specular color.
+
+So if you wonder what this base color should be, just look at it in the most naive way: “What color is that thing?”, and don’t worry about whether that’s diffuse or specular.
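+
+As a sketch, here is how a shader could derive the two colors from the metalness workflow inputs. The 0.04 constant is the typical dielectric reflectance mentioned above; the variable names are made up for the example.
+[source]
+----
+// Metalness workflow: one base color serves both purposes
+vec3 diffuseColor  = baseColor * (1.0 - metalness);         // metals have no diffuse
+vec3 specularColor = mix(vec3(0.04), baseColor, metalness); // dielectrics reflect ~4%
+----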
+
+The other common workflow is called the *Specular workflow*, as it uses a specular color map instead of the metalness map. In this workflow, the albedo map is the diffuse color, the specular map is the specular color, and you have a grayscale gloss map that is the same as the roughness map but inverted (1 is smooth and 0 is rough).
+
+Now, there are pros and cons to using one or the other. Here are the main points:
+
+== Metalness workflow
+
+*Pros*
+
+   * Uses less texture space: the albedo map is an RGBA map, metalness and roughness can be packed into another RGBA map, and you still have 2 spare channels for whatever you want (AO, cavity, …w/e).
+   * Harder to make implausible materials (some may see this as a con, though). It’s not more physically accurate, but you’re sure to follow the energy conservation paradigm.
+   * Easier color concept: the base color is the color of the material.
+
+*Cons*
+
+   * May produce some white artifacts at the junction between metal and non-metal.
+   * Harder to make implausible materials; not impossible, though.
+
+== Specular workflow
+
+*Pros*
+
+   * Closer to the current Phong workflow: diffuse map, specular map. It should be easier for seasoned artists to transition to PBR.
+
+*Cons*
+
+   * You’re in charge of the energy conservation paradigm (which may be seen as a + by some).
+   * More memory used: 2 RGBA textures for diffuse and specular. You may be able to pack glossiness into the alpha channel of the specular map, but then you have no room left for anything else and may have to use a third texture.
+
+ 
+
+IMO, the metalness workflow is better suited to a real-time 3D engine. And as an artist, I find it more intuitive.
+
+That said, as a developer making your own PBR pipeline, especially for an engine mainly used by indie devs, whatever workflow you choose, you can’t ignore the other. The free or paid PBR-ready models you can find are made with whatever workflow suited the artist. Some conversions are possible, but it’s easier for users to be able to use a model as is. That’s why I decided to support both in my implementation.

+ 171 - 0
src/docs/asciidoc/jme3/advanced/pbr_part2.adoc

@@ -0,0 +1,171 @@
+= Physically Based Rendering – Part Two
+
+In the <<pbr_part1#,previous post>>, I explained what you have to know about Physically Based Rendering if you are an artist. If you’re a developer reading this article, you may have tried, or are planning, to implement your own PBR system. If you have started to read some of the available literature, you’ve probably been struck by its mathematical complexity, and by the lack of explanation of the big picture. You usually see articles that focus on specific parts of the process and don’t say much about the other parts, as they are assumed to be easier. At some point you have to assemble all these parts, and I had a hard time figuring out how to do that from my readings. I guess it’s considered basic stuff by other authors, but I think it deserves a proper explanation.
+
+I don’t pretend these articles will enlighten you to the point where you are ready to implement your own system, but I hope they will give you a solid basis and understanding to start reading the literature without saying “WTF??” on every line, as I did.
+
+You can find a lexicon at the end, with all the strange words you’ll come across and their explanations.
+
+So here is what I figured out about using PBR and lighting in general in a 3D rendering pipeline.
+
+ 
+*Lighting*
+
+So first, let’s talk about lighting in games. It all boils down to 2 things:
+
+   * Computing the *Diffuse* reflection: this represents the light that reflects off a surface in all directions.
+   * Computing the *Specular* reflection: this represents the light that reflects off a surface directly to your eye.
+
+This image from Wikipedia is the simplest, and yet the most helpful, for understanding this:
+
+image::jme3/advanced/Lambert2.png[Lambert2.png,width="320",height="250",align="center"]
+By GianniG46 (Own work) [CC-BY-SA-3.0 (http://creativecommons.org/licenses/by-sa/3.0) or GFDL (http://www.gnu.org/copyleft/fdl.html)], via Wikimedia Commons
+
+To compute each of these factors, we’re going to use a function. This function goes by the delicate name of *Bidirectional Reflectance Distribution Function, or BRDF*.
+
+Don’t be scared by this; it’s a big name for what is really just a function. Usually, it will be a shader function.
+
+ 
+
+Of course there are different BRDFs, depending on what you want to compute and on the lighting model you use. BRDFs are usually named after the people who discovered/elaborated them.
+
+Also, most of the time in real-time rendering implementations, those BRDFs are approximated for the sake of performance. And incidentally, those approximations also have names, which can be people’s names or technique names…
+
+ 
+== Lighting in PBR
+
+Computing lighting for PBR is exactly the same as with current rendering (the system we use today with ambient, diffuse, and specular, sometimes called the ad-hoc system in the literature):
+
+For each light source, compute the diffuse factor and the specular factor. The main difference is that the BRDFs used are different: more physically accurate, and they work predictably under different light sources with few input parameters.
+
+ 
+
+So what is a light source?
+
+=== Direct light source
+
+Something that emits light. In games, the most common light sources are Directional lights (think of the sun), Spot lights (think of a torch), and Point lights (think of a light bulb).
+
+That’s what is commonly used in the ad-hoc system, and PBR also handles those types of lights.
+
+ 
+=== Indirect light source
+
+Something that reflects light and indirectly lights its surroundings. Think, for example, of a red wall next to a car in the daytime: the sunlight hits the wall, and the wall reflects red light that, in turn, lights up the car.
+
+This is not handled by the ad-hoc system, or very poorly faked with ambient lighting.
+
+This part is optional for PBR, but it’s actually the part you really want, because that’s what makes things pretty!
+
+In games, indirect lighting is done by using an environment map as a light source. This technique is called *Image Based Lighting (IBL)*.
+
+ 
+
+So let’s say we’re going for the full package: we need to compute the diffuse and specular contributions for each light source, be it direct or indirect.
+
+To do so, we need a BRDF for diffuse and a BRDF for specular, and we stick to them for each light source, for consistency. Also, those BRDFs should accept as inputs the parameters we want to expose to artists (base color, metalness, roughness), or parameters derived from them with minimal transformation.
+
+ 
+
+So the pseudocode for complete lighting is this:
+[source]
+----
+//direct lighting
+for each directLightSource {
+    directDiffuseFactor = DiffuseBRDF(directlightSource)
+    directSpecularFactor = SpecularBRDF(directLightSource)
+    directLighting += Albedo * directDiffuseFactor + SpecColor * directSpecularFactor
+}
+
+//Indirect lighting, done through Image Based Lighting with an environment map
+indirectDiffuseFactor = DiffuseBRDF(EnvMap)
+indirectSpecularFactor = SpecularBRDF(EnvMap)
+
+indirectLighting = Albedo * indirectDiffuseFactor + SpecColor * indirectSpecularFactor
+
+Lighting = directLighting + indirectLighting
+----
+
+I’ll go into more detail, later in this series, on how to compute each factor, but that’s pretty much it.
+
+ 
+== Choosing your BRDFs
+
+There is a vast choice of BRDFs, and I’m not going to talk about all of them, but focus on the ones that I use in my implementation. I’ll just point you to alternatives and provide links to relevant articles for more details.
+
+I chose to use the same BRDFs as the ones used in Unreal Engine 4, from link:http://blog.selfshadow.com/publications/s2013-shading-course/karis/s2013_pbs_epic_notes_v2.pdf[this article] by Brian Karis, as I completely trust his judgement. The provided code helped a great deal, but it was far from straightforward to integrate. In the end I had to fully research and understand all the ins and outs of BRDFs.
+
+ 
+=== Diffuse BRDF: Lambert
+
+The most used diffuse BRDF in games. It’s very popular because it’s very cheap to compute and gives good results. It is the simplest way of computing diffuse; link:https://en.wikipedia.org/wiki/Lambertian_reflectance[here are the details]. A minimal sketch follows the image below.
+
+image::jme3/advanced/DiffuseLambert.jpg[DiffuseLambert.jpg,width="320",height="250",align="center"]
+Diffuse Lambert factor for a direct light source (directional light) with a yellow surface color.
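+
+As a sketch, the Lambert diffuse factor for a direct light source boils down to a single dot product. The 1/PI normalization term is often folded into the light intensity; the variable names are made up for the example:
+[source]
+----
+// Lambert diffuse BRDF (sketch)
+// N: surface normal, L: direction towards the light
+float diffuseFactor = max(dot(N, L), 0.0);
+----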
+
+Some alternatives:
+
+*Oren-Nayar*: Gives better visual results than classic Lambert, and has the advantage of using an entry parameter called roughness… rings a bell? Unfortunately, the additional computation cost is not really worth it, IMO. link:https://en.wikipedia.org/wiki/Oren%E2%80%93Nayar_reflectance_model[Details here]
+
+*Hanrahan-Krueger*: Takes into consideration subsurface scattering for diffuse lighting (every material surface has layers, and light scatters into those different layers before exiting the material in a random direction). A lot of computation compared to Lambert, but it may be important if you want a good subsurface scattering look, for skin for example. link:http://cseweb.ucsd.edu/~ravir/6998/papers/p165-hanrahan.pdf[More details in this paper]
+
+ 
+
+ 
+=== Specular BRDF: Cook-Torrance
+
+This is a bit more complicated for specular. We need a physically plausible BRDF. We use what is called a *Microfacet BRDF*. So what is it?
+
+It states that, at a micro level, a surface is not flat, but formed of a multitude of little, randomly aligned surfaces: the microfacets. These surfaces act as small mirrors that reflect incoming light. The idea behind this BRDF is that only some of those facets may be oriented so that the incoming light reflects into your eye. The smoother the surface, the more aligned the facets, and the neater the light reflection. On the contrary, if a surface is rough, the facets are more randomly oriented, so the light reflection is scattered across the surface and looks more blurry.
+
+image::jme3/advanced/Specular.png[Specular.png,width="320",height="250",align="center"]
+Microfacet specular factor for a direct light source. On the left a smooth surface, on the right a rough one. Note how the reflection is scattered on the surface when it’s rough.
+
+The microfacet BRDF we use is called Cook-Torrance. From my readings, I couldn’t find any implementation that uses another specular BRDF. It seems this is the general form of any microfacet BRDF.
+[source]
+----
+f = D * F * G / (4 * (N.L) * (N.V));
+----
+*N.L* is the dot product between the normal of the shaded surface and the light direction.
+
+*N.V* is the dot product between the normal of the shaded surface and the view direction.
+
+The other terms are:
+
+   * *Normal Distribution Function, called D* (for distribution). You may also find references to it as NDF. It computes the distribution of the microfacets for the shaded surface.
+   * *Fresnel factor, called F*. Discovered by Augustin Fresnel (frenchies are sooo clever), it describes how light reflects and refracts at the interface between two different media (most often, in computer graphics: air and the shaded surface).
+   * *Geometry shadowing term, called G*. Defines the shadowing from the microfacets.
+
+That’s where it gets complicated. For each of these terms, there are several models or approximations to compute them.
+
+I’ve settled on these models and approximations (a sketch combining them follows the list):
+
+   * *D: Trowbridge-Reitz/GGX* Normal Distribution Function.
+   * *F: Fresnel term, Schlick*’s link:http://www.cs.virginia.edu/~jdl/bib/appearance/analytic%20models/schlick94b.pdf[approximation].
+   * *G: Schlick-GGX* approximation.
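+
+Here is a GLSL-style sketch of these choices plugged into the Cook-Torrance formula above. It follows the equations from Karis’s paper (including his roughness remapping for the G term on analytic lights), but the function and variable names are my own, not code from any particular engine:
+[source]
+----
+const float PI = 3.14159265;
+
+// D: Trowbridge-Reitz/GGX normal distribution
+float D_GGX(float NdotH, float roughness) {
+    float a  = roughness * roughness;
+    float a2 = a * a;
+    float d  = NdotH * NdotH * (a2 - 1.0) + 1.0;
+    return a2 / (PI * d * d);
+}
+
+// F: Schlick's approximation of the Fresnel term (F0 = specular color)
+vec3 F_Schlick(float VdotH, vec3 F0) {
+    return F0 + (1.0 - F0) * pow(1.0 - VdotH, 5.0);
+}
+
+// G: Schlick-GGX, with Karis's (roughness + 1)^2 / 8 remapping for direct lights
+float G_SchlickGGX(float NdotV, float NdotL, float roughness) {
+    float r  = roughness + 1.0;
+    float k  = r * r / 8.0;
+    float gv = NdotV / (NdotV * (1.0 - k) + k);
+    float gl = NdotL / (NdotL * (1.0 - k) + k);
+    return gv * gl;
+}
+
+// f = D * F * G / (4 * (N.L) * (N.V))
+vec3 CookTorrance(vec3 N, vec3 V, vec3 L, float roughness, vec3 F0) {
+    vec3 H = normalize(V + L); // half vector between view and light directions
+    float NdotL = max(dot(N, L), 0.0);
+    float NdotV = max(dot(N, V), 0.0);
+    float NdotH = max(dot(N, H), 0.0);
+    float VdotH = max(dot(V, H), 0.0);
+    vec3 f = D_GGX(NdotH, roughness) * F_Schlick(VdotH, F0)
+           * G_SchlickGGX(NdotV, NdotL, roughness)
+           / max(4.0 * NdotL * NdotV, 0.0001); // guard against division by zero
+    return f * NdotL; // the usual N.L factor applied when accumulating the light
+}
+----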
+
+I won’t go into the details of all the alternatives; I just want to give an overview of the whole process first. But I’ll dive into more technical detail on the terms I use in the following posts. For a neat overview of all the alternatives, you can see this link:http://graphicrants.blogspot.fr/2013/08/specular-brdf-reference.html[post] on Brian Karis’s blog.
+
+  
+
+That sums up the whole process, but there is still much to explain. In the next post, I’ll focus on indirect lighting, as it’s the part that gave me the hardest time to wrap my head around. I’ll explain the Image Based Lighting technique used, and how you can compute diffuse and specular from an environment map.
+
+<<pbr_part3#,Next Post>> 
+ 
+== Lexicon
+
+*Diffuse reflection:* light that reflects from a surface in every direction.
+
+*Specular reflection:* light that reflects from a surface toward the viewer.
+
+*Bidirectional Reflectance Distribution Function, or BRDF:* a function used to compute Diffuse or Specular reflection.
+
+*Image Based Lighting, or IBL:* a technique that uses an image as a light source.
+
+*Microfacet Specular BRDF:* a specular BRDF that assumes a surface is made of a multitude of very small, randomly aligned surfaces: the microfacets. It depends on 3 factors called D, F, and G.
+
+*Normal Distribution Function, called D* (for distribution). You may also find references to it as NDF. It computes the distribution of the microfacets for the shaded surface.
+
+*Fresnel factor, called F*. Discovered by Augustin Fresnel (frenchies are sooo clever), it describes how light reflects and refracts at the interface between two different media (most often, in computer graphics: air and the shaded surface).
+
+*Geometry shadowing term, called G*. Defines the shadowing from the microfacets.

+ 185 - 0
src/docs/asciidoc/jme3/advanced/pbr_part3.adoc

@@ -0,0 +1,185 @@
+= Physically Based Rendering – Part Three
+
+image::jme3/advanced/irradianceMap.png[irradianceMap.png,width="320",height="250",align="center"]
+*Note*: after several discussions within the team, I realized that some points were not clear in the “PBR for artists” post. I’ve made an update with additional information on how to handle metalness and specular. <<jme3/advanced/pbr_part1#,I invite you to read it>>.
+
+== Image Based Lighting in PBR
+
+In the <<jme3/advanced/pbr_part2#,previous post>>, I talked about the basics of PBR for developers, and explained the different steps of the lighting process with PBR.
+
+As said before, PBR does not necessarily imply having indirect lighting, but that’s what makes it look so good.
+
+Today I’m gonna focus on a technique called Image Based Lighting (IBL), which will allow us to cheaply compute this indirect lighting.
+
+As before, you can find at the end of the article a lexicon with definitions of the various unusual terms you’ll come across.
+
+== Indirect Lighting for PBR with Image Based Lighting
+
+Direct lighting is usually pretty clear for everyone as it uses common light sources (point light, directional light,…).
+
+However indirect lighting is not that obvious. First you need to understand what we want to simulate with indirect light.
+
+It is often referred to as *Global Illumination (or GI)*. This represents the light bouncing off surrounding objects that lights the shaded surface. There are several techniques to implement global illumination, but the most common is *Image Based Lighting (IBL)*. It is very often associated with PBR pipelines.
+
+So basically, a light source in a game is a color, optionally with other parameters like direction, position, radius, etc… An image has color information, and this color can be considered a light source.
+
+For Global Illumination, light is coming from everywhere. So a good way to simulate GI with IBL is to consider an environment map as a light source.
+
+=== Reminder on environment maps
+
+Most often, in-game environment maps are cube maps.
+
+How do we fetch a pixel from an environment map? We need a vector, often called the reflection vector (refVec), because that’s the reflection of the view direction about the shaded surface.
+
+A picture is worth a thousand words:
+
+image::jme3/advanced/Cube_mapped_reflection_example.jpg[Cube_mapped_reflection_example.jpg,width="320",height="250",align="center"]
+From Wikipedia: TopherTH at the English language Wikipedia [GFDL (http://www.gnu.org/copyleft/fdl.html) or CC-BY-SA-3.0 (http://creativecommons.org/licenses/by-sa/3.0/)], via Wikimedia Commons
+
+Here the reflected Ray is our reflection vector.
+
+Unfortunately we can’t take each pixel of the env map and compute light as if it was a direct light source and hope for the best.
+
+There’s crazy math around that topic, and to be honest I didn’t get all of it myself. So instead of explaining difficult math equations that may be confusing, I’m gonna go straight to the point: how are we going to compute lighting from the environment map?
+
+=== IBL Diffuse
+
+First, we need to compute the Diffuse factor from the environment map. Remember our diffuse BRDF from the last post? *Lambert*.
+
+To simplify: the Lambert diffuse BRDF, as used in games, is the light color multiplied by a visibility factor.
+
+This visibility factor is a function of the normal of the shaded geometry and the light direction.
+
+Let’s say you have a direct light source lighting the front side of a geometry. The back side of this geometry is in the dark. Meaning the front side visibility factor is 1 and the back side visibility factor is 0.
+
+For indirect lighting, we can ditch this visibility factor, because the light is coming from everywhere. So all of this simplifies to: diffuse factor = light color.
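+
+In GLSL-style pseudocode (a sketch; the names are made up for the example), the simplification looks like this:
+[source]
+----
+// direct light: the visibility factor depends on the light direction
+vec3 directDiffuse = lightColor * max(dot(N, L), 0.0);
+
+// indirect light: light comes from everywhere, so the visibility factor drops out
+vec3 indirectDiffuse = lightColor; // but which color? see below
+----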
+
+But what’s the light color for a given point?
+
+Technically, every pixel in the environment map is a light source, so a shaded point is lit by a vast number of pixels.
+
+image::jme3/advanced/irradiance.png[irradiance.png,width="320",height="250",align="center"]
+
+In this picture, the orange area represents the light rays coming from the cube map to the shaded pixel, which we have to consider to properly compute the light color. So the idea would be, for each shaded pixel, to fetch all those texels and combine their colors.
+
+As you can imagine, that’s not practical for a real-time rendering pipeline. Even with a 128×128 env map, that’s around 50k texture fetches per pixel.
+
+Fortunately, we can use something called an *Irradiance map*. An irradiance map is basically the aforementioned computation… except that it’s precomputed and baked into a texture. In practice, here is what it looks like.
+
+image::jme3/advanced/irradianceMap.png[irradianceMap.png,width="320",height="250",align="center"]
+On the left, the original cube map; on the right, the precomputed irradiance map.
+
+So at run time you just have to do one texture fetch in that map with the reflection vector. Pretty cool heh?
+
+Except that, to precompute that map, we still have to sample the cube map literally billions of times, and even if it’s at design time… it’s painfully slow.
+
+*Spherical Harmonics (SH) to the rescue*
+
+What’s that again? I won’t go into explaining them in detail (because I can’t, actually ;-P ), but just know that it’s once again some math magic with a big name on it. There is a post (linked in the lexicon below) where it’s explained in simple words, in terms of what you can use them for.
+
+To put it simply, SH can help us compute the irradiance map way faster. The article explains that it all boils down to computing only 9 spherical harmonic coefficients to efficiently approximate the irradiance.
+
+At this point you can even skip precomputing the irradiance map, and use those coefficients directly in your shader for each shaded pixel. That’s fast enough to be real time, and uses less memory than a cube map.
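+
+As a sketch of that shader-side evaluation: with 9 RGB coefficients computed offline, and assuming the band constants have been folded into them during precomputation, the irradiance for a normal n is just a short polynomial. The array name and coefficient layout here are hypothetical:
+[source]
+----
+// Irradiance from 9 SH coefficients
+// assumed order: Y00, Y1-1, Y10, Y11, Y2-2, Y2-1, Y20, Y21, Y22
+vec3 shIrradiance(vec3 n, vec3 sh[9]) {
+    return sh[0]
+         + sh[1] * n.y + sh[2] * n.z + sh[3] * n.x
+         + sh[4] * (n.x * n.y) + sh[5] * (n.y * n.z)
+         + sh[6] * (3.0 * n.z * n.z - 1.0)
+         + sh[7] * (n.x * n.z) + sh[8] * (n.x * n.x - n.y * n.y);
+}
+----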
+
+But still… it’s slower than one texture fetch, so I chose to compute the irradiance map and use it in the shader.
+
+With this technique, I can compute a 128×128 irradiance cube map on the CPU, in Java, in about 200ms. Too slow to be done every frame, but at design time that’s the blink of an eye.
+
+image::jme3/advanced/DiffuseIBL.png[DiffuseIBL.png,width="320",height="250",align="center"]
+Here is the diffuse effect of indirect lighting using an irradiance cube map
+
+=== IBL Specular
+
+Indirect diffuse is cool, but we want “shiny”!! Shiny implies specular lighting.
+
+It’s important to understand what we want as a specular reflection. We want it to be very neat when the roughness is low and very blurry when it’s high.
+
+image::jme3/advanced/Roughness.png[Roughness.png,width="320",height="250",align="center"]
+As roughness increases, the reflection gets blurrier.
+
+To do this, we have to solve an integral called the *radiance integral*.
+
+There is a method to do this called *importance sampling*. However, it requires a lot of samples to get correct results, and is therefore pretty slow.
+
+As an example, for the previous shot I was using this method with 1024 samples. It was barely interactive: it ran at 16 fps on a GTX 670.
+
+=== Thanks, Epic Games!
+
+Epic Games came up with a solution to this issue for Unreal Engine 4. Others did too, actually, but Epic Games made it public in a paper from Brian Karis. I can’t thank them enough for this.
+
+In this link:http://blog.selfshadow.com/publications/s2013-shading-course/karis/s2013_pbs_epic_notes_v2.pdf[paper], they explain how they do it in UE4. They use a method they call the *Split Sum Approximation*. It doesn’t make the computation faster, but it transforms it so that it can be baked into two prefiltered textures.
+
+   * The prefiltered environment map
+
+We are going to preprocess an env map on the CPU.
+
+As explained before, we need the reflection to get blurrier as the roughness increases. The main idea here is to store different levels of roughness in the env map’s mip maps. The first mip level will match roughness = 0 and the last will match roughness = 1.
+
+From mip level to mip level, we’re going to convolve (blur) the images depending on the roughness. The more the roughness increases, the more samples we use, and the more spread out they are.
+
+But that’s not all: we also want to “bake” the specular BRDF into the map, so for each pixel we are going to compute the Cook-Torrance microfacet BRDF (remember the last post).
+
+But, as we are preprocessing the map, we don’t have any information about the shaded surface’s normal and view direction. So we are going to assume they are all the same, and equal to the envVector we’ll use to fetch pixels from the map. We also assume that the shaded point is exactly at the center of the cube map.
+
+image::jme3/advanced/prefilteredEnvMapSampling.png[prefilteredEnvMapSampling.png,width="320",height="250",align="center"]
+
+This is an approximation again, and it has a cost in quality, but we’re all for approximations as long as they perform faster while still looking good, right?
+
+Here is what the result looks like:
+
+image::jme3/advanced/PrefilteredEnvMap.png[PrefilteredEnvMap.png,width="320",height="250",align="center"]
+The prefiltered environment map, with the different mip levels. Notice how the blur increases through them.
+
+So now we can evaluate the first sum of the split sum approximation with a single texture fetch. We are going to compute the LOD level (the mip level to fetch the texel from) according to the roughness.
+
+Note that the texture needs to be set up so that textureCube interpolates linearly between mip maps; that way, if the roughness value does not fall exactly on a mip level, the fetch interpolates between the two closest mip levels, as sketched below.
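+
+A minimal sketch of the roughness-to-LOD mapping used at fetch time. The simple linear mapping and the numMipLevels constant are assumptions; other mappings are possible as long as they match how the map was prefiltered:
+[source]
+----
+// Assuming mip 0 holds roughness = 0 and the last mip holds roughness = 1
+float getMipLevelFromRoughness(float roughness) {
+    return roughness * float(numMipLevels - 1); // numMipLevels: hypothetical constant
+}
+
+// fetch; trilinear filtering blends the two closest mip levels
+vec3 prefilteredColor = textureCube(PrefilteredEnvMap, refVec,
+                                    getMipLevelFromRoughness(roughness)).rgb;
+----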
+
+   * The BRDF integration Map
+
+Now we need the second sum of the split sum approximation.
+
+It’s an integration that has two inputs: the *roughness*, which varies from 0 to 1, and the dot product between the normal and the view direction (*N.V*, read “N dot V”), which also varies from 0 to 1.
+
+The outputs are a *scale* and a *bias*, also varying from 0 to 1.
+
+So basically, we can bake all combinations into a 2D map: roughness and N.V will be the texture coordinates, the red channel of the map will be the scale, and the green channel will be the bias (the blue channel is not used).
+
+Here is what it looks like:
+
+image::jme3/advanced/integrateBrdf.png[integrateBrdf.png,width="320",height="250",align="center"]
+
+The nice part is that this map is constant for white light: it does not depend on the environment. So you can bake it once and for all, then use it as an asset in your shaders.
+
+Now we have to combine values fetched from these maps to get the specular lighting.
+
+Here is what indirect specular alone looks like, with a roughness of 0.1:
+
+image::jme3/advanced/IndirectSpeculra.png[IndirectSpeculra.png,width="320",height="250",align="center"]
+
+*So in the end:*
+
+Our indirect lighting pseudocode looks like this:
+[source]
+----
+//diffuse
+indirectDiffuse = textureCube(IrradianceMap, refVec) * diffuseColor
+
+//specular
+lod = getMipLevelFromRoughness(roughness)
+prefilteredColor =  textureCube(PrefilteredEnvMap, refVec, lod)
+envBRDF = texture2D(BRDFIntegrationMap,vec2(roughness, ndotv)).xy
+indirectSpecular = prefilteredColor * (specularColor * envBRDF.x + envBRDF.y)
+
+indirectLighting = indirectDiffuse + indirectSpecular
+----
+
+That concludes the post. Quite a lot of information to process! Now you should have an idea of the whole thing. Next time, we are going to go under the hood, and YOU GONNA HAZ CODE!!
+
+== Lexicon
+
+*Global Illumination (GI):* a concept that represents all the lighting of a scene that is not coming from a direct light source.
+
+*Image Based Lighting (IBL):* a technique that uses an image as a light source.
+
+*Irradiance map:* a precomputed environment map that contains the diffuse lighting data of the environment.
+
+*Spherical Harmonics (SH):* link:https://dickyjim.wordpress.com/2013/09/04/spherical-harmonics-for-beginners/[Read this]
+
+*Importance Sampling:* a math technique to approximate the result of an integral.
+
+*Split Sum Approximation:* a way, used in Unreal Engine 4, to transform the specular radiance integral into 2 sums that can be easily baked into prefiltered textures.

BIN
src/docs/images/jme3/advanced/Cube_mapped_reflection_example.jpg


BIN
src/docs/images/jme3/advanced/DiffuseIBL.png


BIN
src/docs/images/jme3/advanced/DiffuseLambert.jpg


BIN
src/docs/images/jme3/advanced/IndirectSpeculra.png


BIN
src/docs/images/jme3/advanced/Lambert2.png


BIN
src/docs/images/jme3/advanced/PrefilteredEnvMap.png


BIN
src/docs/images/jme3/advanced/Roughness.png


BIN
src/docs/images/jme3/advanced/Specular.png


BIN
src/docs/images/jme3/advanced/integrateBrdf.png


BIN
src/docs/images/jme3/advanced/irradiance.png


BIN
src/docs/images/jme3/advanced/irradianceMap.png


BIN
src/docs/images/jme3/advanced/metalness.png


BIN
src/docs/images/jme3/advanced/prefilteredEnvMapSampling.png