
Updated for feedback from jeremyong-az

Signed-off-by: santorac <[email protected]>
santorac committed 3 years ago
commit 186ed074f0
1 file changed with 3 additions and 2 deletions:
  1. rfcs/MaterialPipelineAbstraction.md (+3 -2)

+3 -2  rfcs/MaterialPipelineAbstraction.md

@@ -8,7 +8,7 @@ A similar RFC was published last year (https://github.com/o3de/sig-graphics-audi
 What is the relevance of this feature?
 ======================================
 
-The Atom renderer that ships with O3DE currently provides a _shader-centric_ workflow for authoring materials, where the material type definition is composed of complete shader programs. A tool called Material Canvas is in development that will add support for _node-editing_ workflows for authoring materials, but the current shader-centric workflow is being maintained (in addition to graph-based authoring of full shaders). The current problem can be simply stated by observing that the implementation of material algorithms is strongly coupled to the shaders used to author them. These shaders are hard-wired with assumptions about the render pipeline, including the render target layout, vertex input streams, and resource binding strategy. The current StandardPBR, EnhancedPBR, and similar material types are implemented against a Forward+ rendering pipeline. Without architectural changes, the material implementations cannot be easily ported to alternative rendering pipelines (e.g. deferred, visibility buffer, etc.). This is especially important for workloads that need to render high-density meshes (the Achilles’ heel of a Forward+ pipeline, due to quad-dispatch inefficiencies), but also restricts research and development of alternative rendering pathways. Over time, materials authored using the current abstraction are liable to ossify architectural decisions made early.
+The Atom renderer that ships with O3DE currently provides a _shader-centric_ workflow for authoring materials, where the material type definition is composed of complete shader programs. A tool called Material Canvas is in development that will add support for _node-editing_ workflows for authoring materials, but the current shader-centric workflow is being maintained (in addition to graph-based authoring of full shaders). The current problem can be simply stated by observing that the implementation of material algorithms is strongly coupled to the shaders used to author them. These shaders are hard-wired with assumptions about the render pipeline, including the render target layout, vertex input streams, and resource binding strategy. The current StandardPBR, EnhancedPBR, and similar material types are implemented against a Forward+ rendering pipeline. Without architectural changes, the material implementations cannot be easily ported to alternative rendering pipelines (e.g. deferred, visibility buffer, etc.). This is especially important for workloads that need to render high-density meshes (the Achilles’ heel of a Forward+ pipeline, due to quad-dispatch inefficiencies), but also restricts research and development of alternative rendering pathways. It is similarly difficult to make any changes to the default pipeline because it would impact the shader code for every material type. Over time, materials authored using the current abstraction are liable to ossify architectural decisions made early.
 
 O3DE should provide an abstraction that allows engineers and artists to define how a material is shaded and lit _once_, and leverage this abstraction to drive _any_ particular rendering pipeline (be it forward, deferred, v-buffer, ray tracing, or any other future technique).
 
@@ -101,6 +101,7 @@ void MaterialShadingFunction(VSOutput IN, out Surface outSurface, inout float de
     // It could adjust depth here
     // It could check alpha clipping here
     // GENERATED_DEPTH_OR_CLIPPING_INSTRUCTIONS_END
+    // (... or we could just call MaterialDepthAndOrClippingFunction() below, but this kind of nuance can be clarified as we build the system).
  
     surface.position = IN.m_worldPosition.xyz;
     surface.normal = normalize(IN.m_normal);
@@ -460,7 +461,7 @@ The set of shaders will include the combination of every lighting model and ever
 
 We will use Lua for these scripts because that's what material functors already use. Lua isn't strictly necessary (Python could be an option, since these scripts aren't used at runtime), but it provides nice continuity with the material functor system. Also, the material pipeline might eventually include a script that does execute at runtime (for example, to enable/disable certain shaders), so we should be prepared for that possibility.
 
-Each pipeline will define its own settings but there will likely be common ones where the same property name could map to multiple pipelines. For now we'll just merge the pipeline setting names from all pipelines, and assume that duplicate names will have the same usage in all pipelines. This should give the most convenient experience to the user, as there's no need to set the same value for each pipeline. If needed in the future, would could provide a way to disambiguate in case different pipelines use the same name for different things.
+Each pipeline will define its own settings but there will likely be common ones where the same property name could map to multiple pipelines. For now we'll just merge the pipeline setting names from all pipelines, and assume that duplicate names will have the same usage in all pipelines. This should give the most convenient experience to the user, as there's no need to set the same value for each pipeline. If needed in the future, we could provide a way to disambiguate in case different pipelines use the same name for different things.
 
 Each setting could be connected directly to a preprocessor flag, or processed by the pipeline's script to select which shaders to compile or to show/hide other properties. (We might want to consider always exposing every setting as a preprocessor flag rather than having to specify one explicitly in the properties list. If a setting is to be controlled automatically rather than explicitly set by the user, then there's no real reason to provide metadata for it, except to do error checking for unhandled settings. This could use more thought.)
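
To make the pipeline-script discussion in the hunk above more concrete, here is a minimal sketch of what such a Lua script could look like under the merged-settings approach. Every name in it (`GetSettings`, `ProcessPipelineSettings`, `IncludeShader`, `ExcludeShader`, `SetPreprocessorFlag`, and the setting names `isTransparent` / `castShadows`) is a placeholder assumption for illustration, not an existing API; the real interface would be defined as the system is built.

```lua
-- Hypothetical material pipeline script. All function and setting names are
-- illustrative placeholders, not a real O3DE API.

-- Settings this pipeline declares. Duplicate names across pipelines are
-- assumed to merge into a single user-facing property with the same meaning.
function GetSettings()
    return { "isTransparent", "castShadows" }
end

-- Build-time processing: given the merged setting values, decide which
-- shaders to compile and which preprocessor flags to pass to the generated
-- shader code.
function ProcessPipelineSettings(context)
    local isTransparent = context:GetSetting_bool("isTransparent")
    local castShadows = context:GetSetting_bool("castShadows")

    -- Select the shaders this material type needs in this pipeline.
    context:IncludeShader("ForwardPass")
    if castShadows then
        context:IncludeShader("ShadowmapPass")
    end
    if isTransparent then
        -- Transparent materials skip the depth pre-pass in this sketch.
        context:ExcludeShader("DepthPrePass")
    end

    -- A setting could also be forwarded directly as a preprocessor flag.
    context:SetPreprocessorFlag("MATERIAL_TRANSPARENT", isTransparent)
end
```

A script along these lines would also be a natural place for the runtime behavior mentioned above (enabling/disabling certain shaders per frame), if the material pipeline ends up supporting runtime execution.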