...menustart
- GPU Programming for Video Games, Summer 2020, Georgia Tech
- 14 Introduction to (HLSL) High-Level Shading Language
- 15 Unity's Built-In Shader Source Code
- 16 Preview of the IntroShaders UnityPackage
- 18: Shaders + Parameters = Materials
- 19: Unity Pipelines: Built-In vs. Scriptable
- 20: Simple Unity Shaders
- 21: Accessing Textures in Shader Code
- 22: Normal Vector Transformations
...menuend
- A C-like language and syntax
- But generally does NOT have
- Pointers
- Dynamic memory allocation (heap)
- Complex flow control
- loops must have a fixed size for SM 2.0 or earlier
- Call stack
- functions are often inlined; no practical recursion
- Two different input data types in shader code
- uniform (a keyword in HLSL/Cg) input:
- global, does not change per vertex
- Outside the scope of the shader function
- Defined at the beginning of the shader code
    - outside any function
- Don't need to explicitly write `uniform`
    - as long as you define the variable outside the functions (which you have to do for uniform variables anyway), it is treated as uniform
- Variable input
- Attributes associated with each vertex or pixel
- these are basically the big buffers of vertices being sent from main memory into the GPU; they also include things like the positions and colors of the fragments produced by the interpolator
- Declared using semantics
- A uniform variable value comes from an external source
- Your custom C++ DirectX or OpenGL game
- Your C# XNA or MonoGame application
- The Unity or Unreal 4 runtime, or your custom script
- Stored in a constant register (e.g., c0, read-only) in the GPU
- Same for all processed vertices or pixels in the entire shading process
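A minimal sketch of the two kinds of input, assuming a hand-written (non-Unity) setup; `_MyModelViewProjection` and `ExampleVert` are made-up names for illustration:

```
// Uniform input: set once from the CPU side (the engine runtime or your own
// code) and identical for every vertex in the draw call. The "uniform"
// keyword is implied because it is declared outside the functions.
float4x4 _MyModelViewProjection;   // hypothetical name; Unity fills in its own matrices

// Varying inputs: one value per vertex, delivered through the vertex buffers.
// The semantics (POSITION, COLOR0) are what bind them to the pipeline.
float4 ExampleVert(float4 vertexPos : POSITION,
                   float4 vertexCol : COLOR0) : SV_POSITION
{
    // vertexCol is here only to show a second per-vertex attribute.
    return mul(_MyModelViewProjection, vertexPos);
}
```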
- Something not in usual C/C++ programming (Unity Semantics Doc)
- When writing HLSL shader programs, input and output variables need to have their “intent” indicated via semantics.
- This is a standard concept in HLSL shader language. MSDN
- A colon and a keyword, e.g., `VertexPos : POSITION`, `VertexColor : COLOR0`, `VertexNormal : NORMAL`, `VertexUVcoord : TEXCOORD0`
- A glue that
- Binds a shader program to the rest of the graphics pipeline
- Connects the semantic variables to the pipeline
- Semantics are what make variables in HLSL different from ordinary C variables
- When writing an HLSL shader program, input and output variables need to indicate their intent via semantics; this is a standard concept of the HLSL language.
- A semantic describes information about the variable
- Essentially, a semantic specifies the relationship between a shader variable and the hardware; for example, POSITION says this variable is used in the position-related registers.
    v2f vert( float4 vertex : POSITION,   // vertex position input
              float2 uv : TEXCOORD0       // first texture coordinate input
            )
    { ...
Semantic | which shader | comments |
---|---|---|
POSITION | vert | Input: vertex position in object space. Output: position of the vertex in homogeneous space; compute the screen-space position by dividing (x,y,z) by w. Every vertex shader must write out a parameter with this semantic. |
SV_Target | frag | A fragment shader generally outputs a color, tagged with the SV_Target semantic: `fixed4 frag (v2f i) : SV_Target` |
SV_TargetN | frag | Multiple render targets: SV_Target1, SV_Target2, and so on. SV_Target0 is equivalent to SV_Target. |
SV_Depth | frag | Fragment-shader depth output. Normally a fragment shader does not override the Z-buffer value, but for some effects it is useful to output a custom per-pixel depth. Note that on many GPUs this turns off some depth-buffer optimizations. |
SV_POSITION | vert, frag | The position of the vertex in clip space, so the GPU knows where on screen (and at what depth) to rasterize it. |
TEXCOORD0, TEXCOORD1 | vert, frag | Used for arbitrary high-precision data such as texture coordinates and positions. |
COLOR0, COLOR1 | vert (out), frag (in) | Used for low-precision data in the 0-to-1 range, such as simple color values. |
VPOS | frag | Screen-space pixel position. |
VFACE | frag | The fragment shader can receive a variable indicating whether the rendered surface faces the camera or faces away from it. This is useful when rendering geometry that should be visible from both sides, typically leaves and similar thin objects. The VFACE value is positive for front-facing surfaces and negative for back-facing ones. |
SV_VertexID | vert | The vertex shader can receive a variable holding the "vertex number" (an unsigned integer). Useful when you want to fetch extra per-vertex data from textures or ComputeBuffers. This feature only exists from DX10 (Shader Model 4.0) and GLCore/OpenGL ES 3 onward, so the shader needs the `#pragma target 3.5` compile directive. |
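As a small sketch (struct and function names are illustrative), the same semantics are often attached to struct members instead of individual parameters; `UnityObjectToClipPos` comes from Unity's include files:

```
struct appdata
{
    float4 vertex : POSITION;   // object-space vertex position from the mesh
    float2 uv     : TEXCOORD0;  // first UV set
};

struct v2f
{
    float4 pos : SV_POSITION;   // clip-space position; every vertex shader must output this
    float2 uv  : TEXCOORD0;     // interpolated and handed to the fragment shader
};

v2f vert(appdata v)
{
    v2f o;
    o.pos = UnityObjectToClipPos(v.vertex);   // object space -> clip space
    o.uv  = v.uv;
    return o;
}

fixed4 frag(v2f i) : SV_Target   // SV_Target: the color written to the render target
{
    return fixed4(i.uv, 0, 1);   // visualize the uv's as red/green
}
```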
- Most commonly used C/C++ operations are supported
- No pointer support or indirection in Cg/HLSL, so no `*` or `->`
- Some ops in Shader Model 4 and higher only:
    - Bitwise logic operations: `&`, `^`, `|`, `&=`, ...
    - Shifts: `<<`, `>>`, `<<=`, `>>=`
    - Modulo: `%`
    - (I can't recall seeing any examples of shader code used for 3D graphics that uses those bit operations)
- dot(a,b)
- cross(a,b)
- distance(pt1,pt2) : Euclidean distance
- lerp(a,b,f) : r = (1-f)a + fb
- lit(NL, NH, pwr) : for Blinn-Phong diffuse and specular lighting
- mul(M, N)
- normalize(v)
- saturate(a) : clamp to values between 0 and 1
- reflect( I, N) : calculate reflection vector of ray I
- sincos(x, s, c) : calculate sin(x) and cos(x), x is the input, s & c are outputs
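A quick sketch (not from the lecture) that uses several of these intrinsics together in a Blinn-Phong-style calculation; all names and the exact shading model here are illustrative:

```
float3 ShadeSketch(float3 normal, float3 lightDir, float3 viewDir,
                   float3 coolColor, float3 warmColor, float shininess)
{
    float3 N = normalize(normal);        // normalize(v)
    float3 L = normalize(lightDir);
    float3 H = normalize(L + viewDir);   // Blinn-Phong half vector

    float NL = saturate(dot(N, L));      // dot(a,b), clamped to [0,1] by saturate
    float NH = saturate(dot(N, H));

    float4 lighting = lit(NL, NH, shininess);          // .y = diffuse term, .z = specular term
    float3 baseCol  = lerp(coolColor, warmColor, NL);  // (1-NL)*cool + NL*warm

    return baseCol * lighting.y + lighting.zzz;
}
```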
- Google `unity archive` and find "Unity - Download Archive"
- Unity provides the source code for all of its built-in shaders; although the shader code is embedded inside the Unity executable, the actual source code needs to be downloaded separately.
- Pick a Unity version and, from the drop-down list, choose "Built in Shaders" to download.
https://catlikecoding.com/unity/tutorials/
Unity Store Assets
Fantasy Sword and Shield
Comeback
- A material is a combination of a shader and a set of parameters.
- Assets/Texture
- Built-in pipelines:
- Forward Rendering
    - basically the familiar rasterization model: each object is rasterized and shaded directly as it is drawn
- Deferred Shading
    - a bit more complicated: it first renders various kinds of lighting-related information into buffers, and then runs a special shader over all the individual pixels after the fact to figure out how the lighting should actually work
- Legacy Deferred Lighting
- Legacy Vertex Lit
- Scriptable Render Pipelines:
- Universal Render Pipeline (URP, formerly Lightweight, LWRP)
    - can scale up and down (e.g., for mobile)
- High-Definition Render Pipeline (HDRP)
- Your own custom pipeline!
Fixed Color Shader
- Always ensure use of linear colorspace
- the actual HLSL is between `HLSLPROGRAM` and `ENDHLSL`
    - in some older shader examples for Unity, you'll probably see `CGPROGRAM` ... `ENDCG` instead of the HLSL pair
- everything outside that block is not actually part of HLSL; it is Unity-specific "meta" code that tells Unity how to organize the shader
- this shader has just 1 SubShader and 1 Pass
- any shader you write for Unity, you will want to include `UnityCG.cginc`
- what can admittedly be confusing if you write your own engine (or use an engine other than Unity) is that many of the variables we use here are ones Unity defines for you
    - e.g. `unity_ObjectToWorld`, `unity_MatrixVP`
    - if you were not using Unity's rendering routines, which fill in these variables and pass them from the CPU side into the GPU, you would have to create these matrices explicitly in your C# script on the CPU side and then send them to the GPU yourself
    - here the Unity runtime is handling that for you behind the scenes
- the `#pragma` directives tell the compiler which function is the vertex shader and which is the fragment shader
- `float4(positionOS.xyz, 1.0);` just hints the compiler toward a possible optimization
- `return mul(unity_MatrixVP, positionWS);`
    - notice that nowhere here do we divide by w; the perspective divide is handled by dedicated hardware on the card
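Putting those pieces together, a minimal fixed-color shader in the style described above might look roughly like this sketch (the shader path, function names, and output color are illustrative, not the lecture's exact code):

```
Shader "Examples/FixedColorSketch"
{
    SubShader
    {
        Pass
        {
            HLSLPROGRAM
            // Tell the compiler which function is the vertex shader and which is the fragment shader.
            #pragma vertex   VertFixedColor
            #pragma fragment FragFixedColor
            #include "UnityCG.cginc"   // brings in unity_ObjectToWorld, unity_MatrixVP, etc.

            float4 VertFixedColor(float4 positionOS : POSITION) : SV_POSITION
            {
                // Object space -> world space -> clip space.
                // No divide by w here; the perspective divide is done in hardware.
                float4 positionWS = mul(unity_ObjectToWorld, float4(positionOS.xyz, 1.0));
                return mul(unity_MatrixVP, positionWS);
            }

            half4 FragFixedColor() : SV_Target
            {
                return half4(1.0, 0.5, 0.0, 1.0);   // a fixed orange color
            }
            ENDHLSL
        }
    }
}
```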
- Alternative style, without return value
- So far the pixel shaders I've shown you haven't taken any parameters. Here we are going to take in a color parameter.
- Since we have 2 outputs, we have to use this "out" convention.
- Note: the color `out c` in VertGeomColor is not passed directly into FragGeomColor. The colors being passed in to the pixel shader are the interpolated colors.
    - Because the `COLOR0` in the vertex shader matches the `COLOR0` in the fragment shader, the semantics hook into the interpolation hardware and make that connection.
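A sketch of that out-parameter style, using the VertGeomColor / FragGeomColor names mentioned above (the exact parameter lists in the lecture's version may differ):

```
void VertGeomColor(float4 positionOS : POSITION,
                   float4 colorIn    : COLOR0,
                   out float4 positionCS : SV_POSITION,   // clip-space position output
                   out float4 c          : COLOR0)        // per-vertex color output
{
    float4 positionWS = mul(unity_ObjectToWorld, float4(positionOS.xyz, 1.0));
    positionCS = mul(unity_MatrixVP, positionWS);
    c = colorIn;   // the rasterizer interpolates this across the triangle
}

float4 FragGeomColor(float4 c : COLOR0) : SV_Target
{
    // c here is NOT the vertex shader's output directly; it is the
    // interpolated color produced by the interpolation hardware.
    return c;
}
```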
Properties
- these represent variables that you can assign in the Unity Inspector
- the variable name you see here, `_BaseTex`, is the variable you can use in the actual HLSL program
    - the underscore is not required, but Unity has a convention of putting an underscore on these properties
- `"Base (RGB)"` : this string is how the property will be labeled in the Unity Inspector
- `2D` describes the type of this property
- `"white"` sets up a default
- `{}` : options could perhaps go in there, but I don't think I've ever seen anybody actually put anything in it
- the `sampler2D _BaseTex;` declaration declares the `_BaseTex` uniform variable to be a sampler2D, which basically means it is a 2D texture
    - it must be declared with that type, because you specified its type as `2D` in Properties
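A sketch of how the property and the declaration line up; the `_BaseTex` name, label, type, and default are the ones discussed above, while the surrounding syntax is the standard ShaderLab Properties block:

```
Properties
{
    // _BaseTex     : the variable name usable in the HLSL code
    // "Base (RGB)" : the label shown in the Unity Inspector
    // 2D           : the property type;  "white" : the default;  {} : (rarely used) options
    _BaseTex ("Base (RGB)", 2D) = "white" {}
}
```

Inside the HLSLPROGRAM block, the matching uniform declaration is then `sampler2D _BaseTex;`.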
- If you want to add an offset to the texture's UVs, or repeat (tile) the texture, use `TRANSFORM_TEX`
    - this is annoyingly quite Unity-specific: the Unity shader compiler wants you to take each texture variable name (`_BaseTex`) and create a float4 variable that has `_ST` appended at the end (`_BaseTex_ST`)
    - the macro takes a UV texture coordinate and the name of the texture you want to look up:

        #define TRANSFORM_TEX( tex, name ) ( tex.xy * name##_ST.xy + name##_ST.zw )
- For transforming normals, just applying the rotation (but not the translation) used on the vertex positions, i.e., using only the upper-left 3x3 part of the 4x4 matrix, almost works.
- say a vector t tangent to the surface is transformed by a linear transform M, giving Mt
- to keep the normal perpendicular to the transformed tangent Mt, we need to apply some transform W to the normal n as well:

        (Wn)·(Mt) = 0
        (Wn)ᵀ(Mt) = 0        // dot product -> matrix multiplication
        (nᵀWᵀ)(Mt) = 0
        nᵀ(WᵀM)t = 0
        since nᵀt = 0 already, we can set WᵀM = I
        MᵀW = Iᵀ = I
        W = (Mᵀ)⁻¹
- so to transform a normal, you take the same matrix that you used to transform the vertices, you grab the upper left 3x3 part, and then you take its inverse transpose.
- as mentioned before, if M contains only rotation, the inverse of such a matrix is just its transpose, so W = (Mᵀ)⁻¹ = M; but that only holds in that special case.
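In Unity shader code, one common way to get this inverse-transpose effect (a sketch, assuming the built-in pipeline's matrices): multiplying with the vector on the left applies the transpose of the matrix, and `unity_WorldToObject` is the inverse of `unity_ObjectToWorld`, so the combination is exactly W = (Mᵀ)⁻¹ applied to the normal:

```
float3 TransformNormalSketch(float3 normalOS)
{
    // mul(vector, matrix) multiplies by the transpose relative to mul(matrix, vector),
    // and unity_WorldToObject is the inverse of unity_ObjectToWorld, so this applies
    // the inverse-transpose of the model matrix, using only its upper-left 3x3 part
    // (the float3x3 cast).
    return normalize(mul(normalOS, (float3x3)unity_WorldToObject));
}
```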