
Merge pull request #5 from jMonkeyEngine/master

update from jMonkeyEngine/wiki
Yan 7 years ago
Commit
b3d448012a

+ 16 - 3
src/docs/asciidoc/jme3/advanced/pbr_part1.adoc

@@ -1,4 +1,12 @@
 = Physically Based Rendering – Part one
+:author:
+:revnumber:
+:revdate: 2018/01/15 23:16
+:relfileprefix: ../../
+:imagesdir: ../..
+:experimental:
+ifdef::env-github,env-browser[:outfilesuffix: .adoc]
+
 
 I’ve been looking at Physically Based Rendering (PBR from now on) for a few weeks now, because that’s what all the cool kids are talking about these days. I read almost all the interweb about it and finally somehow wrapped my head around the mechanics behind the concept.
 
@@ -55,14 +63,14 @@ A material can now be described with 3 parameters :
 
 *Metalness :* The degree of metallicity of the material. What does that mean? Is your material rather metallic or rather not (non-metallic materials are called dielectric materials in the literature)? Some implementations call this parameter “specular”, but I find that pretty misleading, as it’s completely different from the specular we know today. In practice, just start out with extreme values to get a feel for it: 1 for metallic, 0 for dielectric.
 
-image::metalness.png[metalness,width="320",height="250",align="center"]
+image::jme3/advanced/metalness.png[metalness,width="320",height="250",align="center"]
 Here is the same material with metalness of 0 (dielectric) on the left and 1 (metallic) on the right.
 
 Of course there are intermediate values, but from my reading, most dielectric materials should vary between 0.04 and 0.1, and metals are usually 1. Those values are based on real-life measurements and you can find some references about them link:https://seblagarde.wordpress.com/2012/04/30/dontnod-specular-and-glossiness-chart/[here] and link:https://seblagarde.wordpress.com/2014/04/14/dontnod-physically-based-rendering-chart-for-unreal-engine-4/[here]. Note that those values are not subject to interpretation; they are “known” factors, and artists should follow them if they want to keep a realistic look.
 
 *Roughness :* The degree of roughness of the material: is your material smooth or rough? 0 means smooth, 1 means rough. Some implementations refer to this as Smoothness or Glossiness. That’s essentially the same thing, except the other way around: 1 is smooth and 0 is rough. I find the term “Roughness” pretty much self-explanatory; it doesn’t leave room for misinterpretation.
 
-image::Roughness.png[Roughness,width="320",height="250",align="center"]
+image::jme3/advanced/Roughness.png[Roughness,width="320",height="250",align="center"]
 Here is the same material with different levels of roughness, from 0 (left) to 1 (right). As opposed to metalness, this parameter is very artist-driven. The roughness of a material does not really depend on physics; it’s more related to micro scratches, wear, etc… So that’s where artists should be creative!
 
 These parameters are the basics of PBR. Of course, each of them can be stored in a texture, and other common additional parameters can be used.
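To make these two parameters concrete, here is a minimal Java sketch (illustrative names, not jMonkeyEngine API) of how a metalness-workflow renderer typically derives its base reflectivity: metalness blends between the ~0.04 dielectric constant mentioned above and the albedo color, shown here per channel as a scalar.

```java
// Sketch: deriving base reflectivity (F0) from metalness, as commonly done
// in a metalness workflow. Names are illustrative, not jME API.
public class PbrParams {
    // Dielectrics reflect roughly 4% of light at normal incidence.
    static final float DIELECTRIC_F0 = 0.04f;

    // Lerp between the dielectric constant and the albedo channel:
    // metalness 0 -> 0.04, metalness 1 -> albedo (metals tint their specular).
    public static float baseReflectivity(float albedo, float metalness) {
        return DIELECTRIC_F0 * (1f - metalness) + albedo * metalness;
    }
}
```

With metalness 0 the albedo no longer influences the specular color at all, which matches the dielectric behavior described above.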
@@ -125,4 +133,9 @@ Now there are pro and cons on using one or the other. Here are the main points :
 
 IMO, the metalness workflow is more suited to real time 3D engine. And as an artist I find it more intuitive.
 
-That  said, as a developer making his PBR pipeline; especially for an engine mainly used by Indie devs; whatever pipeline you choose, you can’t ignore the other. Free or charged PBR ready model you can find are done with whatever workflow suited the artist. some conversion are possible, but that’s easier for user to be able to use the model as is. That’s why I decided to support both in my implementation.
+That said, as a developer making his own PBR pipeline, especially for an engine mainly used by indie devs, whatever pipeline you choose, you can’t ignore the other. The free or paid PBR-ready models you can find are made with whatever workflow suited the artist. Some conversions are possible, but it’s easier for users to be able to use the models as-is. That’s why I decided to support both in my implementation.
+
+'''
+
+*  <<jme3/advanced/pbr_part2#,Physically Based Rendering – Part Two>>
+*  <<jme3/advanced/pbr_part3#,Physically Based Rendering – Part Three>>

+ 17 - 6
src/docs/asciidoc/jme3/advanced/pbr_part2.adoc

@@ -1,6 +1,14 @@
 = Physically Based Rendering – Part Two
+:author:
+:revnumber:
+:revdate: 2018/01/15 23:16
+:relfileprefix: ../../
+:imagesdir: ../..
+:experimental:
+ifdef::env-github,env-browser[:outfilesuffix: .adoc]
 
-<<pbr_part1#,In previous post>>, I explained what you had to know about Physically Based Rendering if you were an artist. If you’re a developer, and reading this article, you may have tried, or are planning  to implement your own PBR system. If you started to read some of the available literature, you’ve probably been struck by the math complexity of it, and by the lack of explanation of the big picture. You usually see articles that focus on specifics parts of the process, and don’t talk much about other parts as they are assumed easier. At some point you have to assemble all these parts, and I had a hard time figuring out how to do it in my readings. I guess it’s considered basic stuff for other authors, but I think it deserves its proper explanation.
+
+<<jme3/advanced/pbr_part1#,In Part one>>, I explained what you need to know about Physically Based Rendering if you are an artist. If you’re a developer reading this article, you may have tried, or are planning, to implement your own PBR system. If you’ve started to read some of the available literature, you’ve probably been struck by its mathematical complexity, and by the lack of explanation of the big picture. You usually see articles that focus on specific parts of the process and don’t talk much about the other parts, as they are assumed to be easier. At some point you have to assemble all these parts, and I had a hard time figuring out how to do that from my readings. I guess it’s considered basic stuff by other authors, but I think it deserves a proper explanation.
 
 I don’t pretend these articles will enlighten you to the point where you are ready to implement your own system, but I hope they will give you a solid basis and understanding to start reading the literature without saying “WTF??” on every line, as I did.
 
@@ -18,7 +26,7 @@ So first, lets talk about lighting in games. It all boils down to 2 things :
 
 This image from Wikipedia is the most simple, and yet the most helpful, for understanding this:
 
-image::Lambert2.png[Lambert2,width="320",height="250",align="center"]
+image::jme3/advanced/Lambert2.png[Lambert2,width="320",height="250",align="center"]
 By GianniG46 (Own work) [CC-BY-SA-3.0 (http://creativecommons.org/licenses/by-sa/3.0) or GFDL (http://www.gnu.org/copyleft/fdl.html)], via Wikimedia Commons
 
 To compute each of these factors, we’re going to use a function. This function answers to the delicate name of *Bidirectional Reflectance Distribution Function or BRDF*.
@@ -100,7 +108,7 @@ I chose to use the same BRDF as the ones used in Unreal Engine 4 from link:http:
 
 The most used diffuse BRDF in games. It’s very popular because it’s very cheap to compute and gives good results. This is the simplest way of computing diffuse. link:https://en.wikipedia.org/wiki/Lambertian_reflectance[Here are the details].
 
-image::DiffuseLambert.jpg[DiffuseLambert,width="320",height="250",align="center"]
+image::jme3/advanced/DiffuseLambert.jpg[DiffuseLambert,width="320",height="250",align="center"]
 Diffuse Lambert factor for a direct light source (directional light) with a yellow surface color.
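The Lambert term shown above can be sketched in a few lines of Java (a scalar per-channel version with illustrative names; a real shader would do this in GLSL for each color channel): the albedo is divided by π and modulated by the clamped cosine between the normal and the light direction.

```java
// Sketch of the Lambert diffuse BRDF for one color channel.
// Vectors are float[3] and assumed normalized; names are illustrative.
public class LambertDiffuse {
    // Lambert BRDF: constant albedo / PI, modulated by the cosine of the
    // angle between the surface normal n and the light direction l.
    public static float shade(float albedo, float[] n, float[] l) {
        float nDotL = n[0] * l[0] + n[1] * l[1] + n[2] * l[2];
        return (albedo / (float) Math.PI) * Math.max(nDotL, 0f);
    }
}
```

The `max` clamp is what makes surfaces facing away from the light receive no direct diffuse contribution.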
 
 Some Alternatives :
@@ -118,7 +126,7 @@ This is a bit more complicated for specular. We need a physically plausible BRDF
 
 It states that at a micro level a surface is not planar, but formed of a multitude of little, randomly aligned surfaces: the microfacets. Those surfaces act as small mirrors that reflect incoming light. The idea behind this BRDF is that only some of those facets may be oriented so that the incoming light reflects toward your eye. The smoother the surface, the more aligned the facets are, and the neater the light reflection. On the contrary, if a surface is rough, the facets are more randomly oriented, so the light reflection is scattered across the surface and looks more blurry.
 
-image::Specular.png[Specular,width="320",height="250",align="center"]
+image::jme3/advanced/Specular.png[Specular,width="320",height="250",align="center"]
 Microfacet specular factor for a direct light source. On the left a smooth surface, on the right a rough one. Note how the reflection is scattered on the surface when it’s rough.
 
 The microfacet BRDF we use is called Cook-Torrance. From my readings, I couldn’t find any implementation that use another specular BRDF. It seems like this is the global form of any microfacet BRDF.
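As a sketch of the general shape of such a microfacet BRDF, here is a scalar Java version of Cook-Torrance using a common set of term choices (GGX distribution, Schlick Fresnel, Schlick-GGX/Smith geometry). These specific term choices and the `k` remapping are assumptions for illustration; the post discusses the actual alternatives and choices separately.

```java
// Scalar sketch of the Cook-Torrance specular BRDF: D * F * G / (4 * N.V * N.L).
// Dot products are precomputed and assumed clamped to [0, 1]; illustrative only.
public class CookTorrance {
    // GGX / Trowbridge-Reitz normal distribution function D.
    public static float distributionGGX(float nDotH, float roughness) {
        float a = roughness * roughness;
        float a2 = a * a;
        float d = nDotH * nDotH * (a2 - 1f) + 1f;
        return a2 / ((float) Math.PI * d * d);
    }

    // Schlick approximation of the Fresnel factor F.
    public static float fresnelSchlick(float vDotH, float f0) {
        return f0 + (1f - f0) * (float) Math.pow(1f - vDotH, 5.0);
    }

    // Schlick-GGX geometry term for one direction (combined with Smith below).
    public static float geometrySchlickGGX(float nDotX, float k) {
        return nDotX / (nDotX * (1f - k) + k);
    }

    public static float specular(float nDotL, float nDotV, float nDotH,
                                 float vDotH, float roughness, float f0) {
        float k = (roughness + 1f) * (roughness + 1f) / 8f; // direct-light remap
        float d = distributionGGX(nDotH, roughness);
        float f = fresnelSchlick(vDotH, f0);
        float g = geometrySchlickGGX(nDotV, k) * geometrySchlickGGX(nDotL, k);
        return d * f * g / Math.max(4f * nDotV * nDotL, 1e-4f);
    }
}
```

Note how Fresnel goes to 1 at grazing angles (vDotH near 0) regardless of f0, which is why even dull materials get edge reflections.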
@@ -148,10 +156,8 @@ I’ve settled to use those models and approximations :
 I won’t go into the details of all the alternatives; I just want to give an overview of the whole process first. But I’ll dive into more technical details on the terms I use in following posts. For a neat overview of all the alternatives, you can see this link:http://graphicrants.blogspot.fr/2013/08/specular-brdf-reference.html[post] on Brian Karis’s blog.
 
 
-
 That sums up the whole process, but there is still much to explain. In the next post I’ll focus on indirect lighting, as it’s the part that gave me the hardest time to wrap my head around. I’ll explain the Image Based Lighting technique used, and how you can compute diffuse and specular from an environment map.
 
-<<pbr_part3#,Next Post>>
 
 == Lexical :
 
@@ -170,3 +176,8 @@ That sums up the whole process, but there is still much to explain. In next post
 *Fresnel factor, called F*. Discovered by Augustin Fresnel (frenchies are sooo clever), it describes how light reflects and refracts at the interface between two different media (most often, in computer graphics: air and the shaded surface).
 
 *Geometry shadowing term G*. Defines the shadowing caused by the microfacets.
+
+'''
+
+*  <<jme3/advanced/pbr_part1#,Physically Based Rendering – Part one>>
+*  <<jme3/advanced/pbr_part3#,Physically Based Rendering – Part Three>>

+ 27 - 12
src/docs/asciidoc/jme3/advanced/pbr_part3.adoc

@@ -1,11 +1,21 @@
 = Physically Based Rendering – Part Three
+:author:
+:revnumber:
+:revdate: 2018/01/15 23:16
+:relfileprefix: ../../
+:imagesdir: ../..
+:experimental:
+ifdef::env-github,env-browser[:outfilesuffix: .adoc]
 
-image::irradianceMap.png[irradianceMap,width="320",height="250",align="center"]
-*Note* : after several discussions in the team, I realized that some points were not clear in the  “PBR for artists” post. I’ve made an update with additional information on how to handle metalness and specular. <<pbr_part1#,I invite you to read it>>.
+
+image::jme3/advanced/irradianceMap.png[irradianceMap,width="320",height="250",align="center"]
+*Note* : after several discussions within the team, I realized that some points were not clear in the “PBR for artists” post. I’ve made an update with additional information on how to handle metalness and specular. I invite you to read it.
+
+<<jme3/advanced/pbr_part1#,Physically Based Rendering – Part one>>
 
 == Image Based Lighting in PBR
 
-In the <<pbr_part2#,previous post>>, I talked about the basics of PBR for developers, and explained the different steps of the lighting process with PBR.
+In <<jme3/advanced/pbr_part2#,Physically Based Rendering – Part Two>>, I talked about the basics of PBR for developers, and explained the different steps of the lighting process with PBR.
 
 As said before, PBR does not necessarily imply having indirect lighting, but that’s what makes it look so good.
 
@@ -33,7 +43,7 @@ How do we fetch a pixel from an environment map? We need a vector. Often called
 
 A picture is worth a thousand words:
 
-image::Cube_mapped_reflection_example.jpg[Cube_mapped_reflection_example,width="320",height="250",align="center"]
+image::jme3/advanced/Cube_mapped_reflection_example.jpg[Cube_mapped_reflection_example,width="320",height="250",align="center"]
 From Wikipedia: TopherTH at the English language Wikipedia [GFDL (http://www.gnu.org/copyleft/fdl.html), GFDL (http://www.gnu.org/copyleft/fdl.html) or CC-BY-SA-3.0 (http://creativecommons.org/licenses/by-sa/3.0/)], via Wikimedia Commons
 
 Here the reflected Ray is our reflection vector.
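The reflection vector used to fetch the cube map follows the standard mirror formula R = 2(N·V)N − V. A small Java sketch (illustrative; jME's math classes provide their own vector operations):

```java
// Sketch: reflecting the view vector V about the surface normal N.
// Both float[3] vectors are assumed normalized; R is the cube-map lookup vector.
public class Reflection {
    public static float[] reflect(float[] v, float[] n) {
        float nDotV = n[0] * v[0] + n[1] * v[1] + n[2] * v[2];
        return new float[] {
            2f * nDotV * n[0] - v[0],
            2f * nDotV * n[1] - v[1],
            2f * nDotV * n[2] - v[2]
        };
    }
}
```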
@@ -58,7 +68,7 @@ But what’s the light color for a given point?
 
 Technically, every pixel in the environment map is a light source, so a shaded point is lit by a vast number of pixels.
 
-image::irradiance.png[irradiance,width="320",height="250",align="center"]
+image::jme3/advanced/irradiance.png[irradiance,width="320",height="250",align="center"]
 
 In this picture the orange area represents the light rays coming from the cube map to the shaded pixel, which we have to consider to properly compute the light color. So the idea would be, for each shaded pixel, to fetch all those texels and combine their colors.
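That "fetch all the texels and combine them" idea can be sketched as a cosine-weighted average over the hemisphere around the normal (scalar Java, illustrative names; a real irradiance bake integrates per cube-map face with proper solid-angle weights):

```java
// Sketch: cosine-weighted average of environment texels over the hemisphere
// around normal n -- the computation an irradiance map bakes offline.
// texelDirs holds normalized directions to each texel, texelColors one channel.
public class Irradiance {
    public static float irradiance(float[] n, float[][] texelDirs, float[] texelColors) {
        float sum = 0f, weight = 0f;
        for (int i = 0; i < texelColors.length; i++) {
            float cos = n[0] * texelDirs[i][0]
                      + n[1] * texelDirs[i][1]
                      + n[2] * texelDirs[i][2];
            if (cos > 0f) {          // only the upper hemisphere contributes
                sum += texelColors[i] * cos;
                weight += cos;
            }
        }
        return weight > 0f ? sum / weight : 0f;
    }
}
```

Doing this per shaded pixel at run time is exactly the cost the next paragraphs explain how to avoid by precomputing.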
 
@@ -66,7 +76,7 @@ As you can image that’s not practical for a real time rendering pipeline. Even
 
 Fortunately, we can use something called an *Irradiance map*. An irradiance map is basically the aforementioned computation… except that it’s precomputed and baked into a texture. In practice here is what it looks like.
 
-image::irradianceMap.png[irradianceMap,width="320",height="250",align="center"]
+image::jme3/advanced/irradianceMap.png[irradianceMap,width="320",height="250",align="center"]
 On the left the original cube map, on the right the precomputed irradiance map.
 
 So at run time you just have to do one texture fetch in that map with the reflection vector. Pretty cool heh?
@@ -85,7 +95,7 @@ But still…it’s slower than one texture fetch, so I chose to compute the Irra
 
 With this technique I can compute a 128×128 irradiance cube map on the CPU, in Java, in about 200 ms. Too slow to be done every frame, but at design time that’s the blink of an eye.
 
-image::DiffuseIBL.png[DiffuseIBL,width="320",height="250",align="center"]
+image::jme3/advanced/DiffuseIBL.png[DiffuseIBL,width="320",height="250",align="center"]
 Here is the diffuse effect of indirect lighting using an irradiance cube map
 
 === IBL Specular
@@ -94,7 +104,7 @@ Indirect diffuse is cool, but we want “shiny”!! Shiny implies specular light
 
 It’s important to understand what we want as a specular reflection. We want it to be very neat when the roughness is low and very blurry when it’s high.
 
-image::Roughness.png[Roughness,width="320",height="250",align="center"]
+image::jme3/advanced/Roughness.png[Roughness,width="320",height="250",align="center"]
 As roughness increases, the reflection gets more blurry.
 
 To do this, we have to resolve an integral called the *radiance integral.*
@@ -121,13 +131,13 @@ But that’s not all, we also want to “bake” the specular BRDF in the map, s
 
 But, as we are preprocessing the map, we don’t have any information about the shaded surface normal and view direction. So we are going to assume they are all the same, and equal to the envVector we’ll use to fetch pixels from the map. Also we assume that the shading point is exactly at the center of the cube map.
 
-image::prefilteredEnvMapSampling.png[prefilteredEnvMapSampling,width="320",height="250",align="center"]
+image::jme3/advanced/prefilteredEnvMapSampling.png[prefilteredEnvMapSampling,width="320",height="250",align="center"]
 
 This is an approximation again, and it has a cost in quality, but we’re all for approximations as long as they perform faster while still looking good, right?
 
 Here is what the result looks like
 
-image::PrefilteredEnvMap.png[PrefilteredEnvMap,width="320",height="250",align="center"]
+image::jme3/advanced/PrefilteredEnvMap.png[PrefilteredEnvMap,width="320",height="250",align="center"]
 The prefiltered environment map, with the different mip levels. Notice how the blur increases through them.
 
 So now we can evaluate the first sum of the split sum approximation with a single texture fetch. We are going to compute the LOD level (the mip level where we fetch the texel) according to the roughness.
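The roughness-to-LOD mapping can be as simple as a linear remap across the mip chain (a sketch under that assumption; engines sometimes use a non-linear curve instead):

```java
// Sketch: map roughness to a mip (LOD) level of the prefiltered environment
// map. Roughness 0 reads the sharp base level, roughness 1 the blurriest mip.
// A linear mapping is assumed here for illustration.
public class PrefilteredEnvMap {
    public static float lodFromRoughness(float roughness, int numMipLevels) {
        return roughness * (numMipLevels - 1);
    }
}
```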
@@ -146,7 +156,7 @@ So basically we can bake all combinations into a 2D map. roughness and N.L will
 
 Here is what it looks like :
 
-image::integrateBrdf.png[integrateBrdf,width="320",height="250",align="center"]
+image::jme3/advanced/integrateBrdf.png[integrateBrdf,width="320",height="250",align="center"]
 
 The nice part is that this map is constant for white light. It does not depend on the environment, so you can bake it once and for all and then use it as an asset in your shaders.
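At run time, the two precomputed pieces are combined in one line: the prefiltered environment color is scaled and biased by the pair fetched from the BRDF integration map. A scalar sketch of that combination (illustrative names; the `(scale, bias)` pair is what the red/green channels of the map above encode):

```java
// Sketch: combining the two terms of the split sum approximation.
// prefilteredColor comes from the prefiltered env map at the roughness LOD;
// (brdfScale, brdfBias) comes from the BRDF integration map at (N.V, roughness).
public class SplitSum {
    public static float specular(float prefilteredColor,
                                 float f0, float brdfScale, float brdfBias) {
        return prefilteredColor * (f0 * brdfScale + brdfBias);
    }
}
```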
 
@@ -154,7 +164,7 @@ Now we have to combine values fetched from these maps to get the specular lighti
 
 Here is what indirect specular alone looks like, with a roughness of 0.1.
 
-image::IndirectSpeculra.png[IndirectSpeculra,width="320",height="250",align="center"]
+image::jme3/advanced/IndirectSpeculra.png[IndirectSpeculra,width="320",height="250",align="center"]
 
 *So in the end :*
 
@@ -187,3 +197,8 @@ That concludes the post. Quite a lot of information to process. Now you should h
 *Importance Sampling :* A math technique to approximate the result of an integral.
 
 *Split Sum Approximation :* A way, used in Unreal Engine 4, to transform the specular radiance integral into 2 sums that can be easily baked into prefiltered textures.
+
+'''
+
+*  <<jme3/advanced/pbr_part1#,Physically Based Rendering – Part one>>
+*  <<jme3/advanced/pbr_part2#,Physically Based Rendering – Part Two>>

+ 2 - 0
src/docs/asciidoc/jme3/virtualreality.adoc

@@ -6,6 +6,8 @@
 :imagesdir: ..
 ifdef::env-github,env-browser[:outfilesuffix: .adoc]
 
+Please see this link:https://hub.jmonkeyengine.org/t/official-vr-module/37830/67[forum post] for additional information on the official JME VR module.
+
 jMonkeyEngine 3 has a wide range of support for Virtual Reality (VR). The known supported systems are:
 
 HTC Vive and systems supporting SteamVR/OpenVR