Shader Code

'''DISCLAIMER: This page is written by Unity users and is built around their knowledge. Some users may be more knowledgeable than others, so the information contained within may not be entirely complete or accurate.'''

= Structures =
Unity's shaders use structures to pass information down the rendering pipeline.

Semantics
Semantics are special qualifiers that are added to some structures. They define how fields in the structure are bound to data channels on the hardware, and/or convey information about the intended use of a field.
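As a sketch of how semantics look in practice, each field in a structure can be bound to a hardware data channel by appending a colon and the semantic name (the structure and field names here are illustrative):

```hlsl
struct appdata_custom
{
    float4 vertex   : POSITION;  // Bound to the vertex position channel.
    float3 normal   : NORMAL;    // Bound to the vertex normal channel.
    float4 texcoord : TEXCOORD0; // Bound to the first UV channel.
};
```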

Application to Vertex Shader Structure (appdata)
The first structure passes raw information about the geometry being rendered to the vertex shader. To create your own appdata structure, it must conform to the formatting and limitations outlined below. See the built-in appdata structures below for an example.

Format
Structure:

    struct [Name]
    {
        [One or more Fields]
    };

Field:

    [Type] [Name] : [Semantic];

Acceptable Fields
The following fields can be written in the structure in any order.

Built-In
Base structure; contains the least amount of data that most shaders will use.

    struct appdata_base
    {
        float4 vertex   : POSITION;  // The vertex position in model space.
        float3 normal   : NORMAL;    // The vertex normal in model space.
        float4 texcoord : TEXCOORD0; // The first UV coordinate.
    };

Tangents included. Tangents are used to rotate the normals of normal maps when the normal maps are also rotated. Use this structure if you wish to intervene in their calculation process and manipulate them. If you do not want to manipulate tangents, you may use the base structure instead, since they will be calculated anyway.

    struct appdata_tan
    {
        float4 vertex   : POSITION;  // The vertex position in model space.
        float3 normal   : NORMAL;    // The vertex normal in model space.
        float4 texcoord : TEXCOORD0; // The first UV coordinate.
        float4 tangent  : TANGENT;   // The tangent vector in model space (used for normal mapping).
    };

All the possible fields you can derive from a mesh about to be rendered are in this structure.

    struct appdata_full
    {
        float4 vertex    : POSITION;  // The vertex position in model space.
        float3 normal    : NORMAL;    // The vertex normal in model space.
        float4 texcoord  : TEXCOORD0; // The first UV coordinate.
        float4 texcoord1 : TEXCOORD1; // The second UV coordinate.
        float4 tangent   : TANGENT;   // The tangent vector in model space (used for normal mapping).
        float4 color     : COLOR;     // Per-vertex color.
    };
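To sketch how these structures are consumed, a minimal vertex shader might take appdata_base as its input and fill in a v2f structure (the v2f struct and the function name vert are assumptions for illustration):

```hlsl
v2f vert (appdata_base v)
{
    v2f o;
    o.pos = mul(UNITY_MATRIX_MVP, v.vertex); // Transform the vertex from model space to clip space.
    o.uv  = v.texcoord.xy;                   // Pass the first UV set through to the interpolator.
    return o;
}
```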

Vertex Shader to Fragment Shader Structure (v2f)
The second structure contains information generated by the vertex shader, which is passed to the fragment shader. The vertex shader calculates and returns these values on a per-vertex basis. An interpolator then calculates these same values on a per-pixel basis when the connected polygons are rasterized. The interpolated values are then used by the fragment shader. To create your own v2f structure, it must conform to the formatting and limitations outlined below. See the built-in v2f structures below for an example.

Format
Structure:

    struct [Name]
    {
        [One or more Fields]
    };

Field:

    [Type] [Name] : [Semantic];

or:

    [Type] [Name];

Acceptable Fields
The following fields can be written in the structure in any order. 

Built-In
This structure is designed specifically for implementing image effects. See also vert_img.

    struct v2f_img
    {
        float4 pos : SV_POSITION;
        half2  uv  : TEXCOORD0;
    };

This structure is used by vertex-lit shaders; it carries interpolated diffuse and specular colors alongside the UV coordinate.

    struct v2f_vertex_lit
    {
        float2 uv   : TEXCOORD0;
        fixed4 diff : COLOR0;
        fixed4 spec : COLOR1;
    };
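A custom v2f structure follows the same format. As a sketch (the names are illustrative), a clip-space position with the SV_POSITION semantic is required so the rasterizer knows where the vertex lands on screen:

```hlsl
struct v2f
{
    float4 pos : SV_POSITION; // Clip-space position; required by the rasterizer.
    float2 uv  : TEXCOORD0;   // Interpolated per pixel before reaching the fragment shader.
};
```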



Surface/Fragment Shader to Lighting Shader Structure (SurfaceOutput)
The third and final structure contains pixel values returned by either a surface or fragment shader. They are read as input to a lighting shader (such as Lambert, BlinnPhong or a custom lighting model) which then returns a single RGBA color value.

Format
Structure:

    struct [Name]
    {
        [One or more Fields]
    };

Field:

    [Type] [Name];

Note that semantics are not used in SurfaceOutput structures.

Acceptable Fields
The following fields can be written in the structure in any order.

Built-In
Default structure; must be used unless you have implemented your own custom lighting shader.

    struct SurfaceOutput
    {
        half3 Albedo;
        half3 Normal;
        half3 Emission;
        half  Specular;
        half  Gloss;
        half  Alpha;
    };
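To sketch how a custom lighting model reads SurfaceOutput, here is a simple Lambert-style lighting function (the name LightingSimpleLambert follows Unity's Lighting[Name] naming convention; pairing it with a surface shader via "#pragma surface surf SimpleLambert" is assumed):

```hlsl
half4 LightingSimpleLambert (SurfaceOutput s, half3 lightDir, half atten)
{
    half NdotL = max(0, dot(s.Normal, lightDir));           // Lambert diffuse term.
    half4 c;
    c.rgb = s.Albedo * _LightColor0.rgb * (NdotL * atten);  // Modulate by light color and attenuation.
    c.a   = s.Alpha;
    return c;                                               // Single RGBA color, as described above.
}
```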

Surface Shader input structure (Input)
The input structure Input generally contains any texture coordinates needed by the shader. Texture coordinates must be named "uv" followed by the texture name (or start with "uv2" to use the second texture coordinate set).

Additional values that can be put into Input structure:


 * float3 viewDir - will contain view direction, for computing Parallax effects, rim lighting etc.
 * float4 with COLOR semantic - will contain interpolated per-vertex color.
 * float4 screenPos - will contain screen space position for reflection effects. Used by WetStreet shader in Dark Unity for example.
 * float3 worldPos - will contain world space position.
 * float3 worldRefl - will contain world reflection vector if surface shader does not write to o.Normal. See Reflect-Diffuse shader for example.
 * float3 worldNormal - will contain world normal vector if surface shader does not write to o.Normal.
 * float3 worldRefl; INTERNAL_DATA - will contain world reflection vector if surface shader writes to o.Normal. To get the reflection vector based on per-pixel normal map, use WorldReflectionVector (IN, o.Normal). See Reflect-Bumped shader for example.
 * float3 worldNormal; INTERNAL_DATA - will contain world normal vector if surface shader writes to o.Normal. To get the normal vector based on per-pixel normal map, use WorldNormalVector (IN, o.Normal).
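Putting the naming rule and the extra values together, an Input structure for a shader that samples a texture property named _MainTex and computes rim lighting might look like this (the property name _MainTex is an assumption for illustration):

```hlsl
struct Input
{
    float2 uv_MainTex; // UVs for the texture property named _MainTex.
    float3 viewDir;    // View direction, e.g. for rim lighting or parallax effects.
};
```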

= Functions =
ShaderLab comes packaged with built-in, or "intrinsic", functions. Many of them are based on the intrinsic functions provided by shader languages like Cg, GLSL and HLSL, while others are unique to ShaderLab.
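As a sketch of intrinsic functions in use, a fragment shader might blend two texture samples with lerp and clamp the blend factor with saturate (the texture names _MainTex and _SecondTex, and the v2f layout, are assumptions for illustration):

```hlsl
fixed4 frag (v2f i) : SV_Target
{
    fixed4 a = tex2D(_MainTex, i.uv);   // Sample the first texture.
    fixed4 b = tex2D(_SecondTex, i.uv); // Sample the second texture.
    fixed  t = saturate(i.uv.x);        // Clamp the blend factor to [0, 1].
    return lerp(a, b, t);               // Linearly interpolate between the two samples.
}
```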

