6.1.8 Vertex shader output and varyings

Vertex shader output is defined in an output structure that must contain the transformed vertex coordinates. In the following example the output structure is very simple, but you can add other values to it.

The following code shows an example output structure using the semantics supported by Unity:
struct vertexOutput
{
	float4 pos : SV_POSITION;
	float4 tex : TEXCOORD0;
	float4 texSpecular : TEXCOORD1;
	float3 vertexInWorld : TEXCOORD2;
	float3 viewDirInWorld : TEXCOORD3;
	float3 normalInWorld : TEXCOORD4;
	float3 vertexToLightInWorld : TEXCOORD5;
	float4 vertexInScreenCoords : TEXCOORD6;
	float4 shadowsVertexInScreenCoords : TEXCOORD7;
};
The transformed vertex coordinates are defined with the semantic SV_POSITION. Two sets of texture coordinates, and several vectors and coordinates in different spaces, are also passed to the fragment shader using the semantics TEXCOORDn.
TEXCOORD0 is typically reserved for UVs and TEXCOORD1 for lightmap UVs, but technically you can send anything from TEXCOORD0 to TEXCOORD7 to the fragment shader. It is important to note that each interpolator, that is, each semantic, can hold a maximum of four floats. Larger variables such as matrices must be split across multiple interpolators. This means that if you define a matrix to be passed as a varying, for example float4x4 myMatrix : TEXCOORD2, Unity uses the interpolators from TEXCOORD2 to TEXCOORD5.
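As an illustrative sketch, with hypothetical field names not taken from the earlier example, the following structure shows how a matrix varying consumes four consecutive interpolators:

struct vertexOutput
{
	float4 pos : SV_POSITION;
	float4 tex : TEXCOORD0;			// main UVs
	float4x4 myMatrix : TEXCOORD2;		// occupies TEXCOORD2, TEXCOORD3, TEXCOORD4 and TEXCOORD5
	float3 viewDirInWorld : TEXCOORD6;	// the next free interpolator is TEXCOORD6
};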
Everything you send from the vertex shader to the fragment shader is linearly interpolated by default. For every pixel in the triangle defined by the vertices V1, V2 and V3, the rasterizer, located in the graphics pipeline between the vertex and fragment shaders, calculates the pixel value as a linear interpolation of the vertex values using the barycentric coordinates λ1, λ2 and λ3, that is, P = λ1V1 + λ2V2 + λ3V3, where λ1 + λ2 + λ3 = 1.
Figure 6-3 Linear interpolation using barycentric coordinates

The following diagram shows the results of color interpolation in a triangle with vertex colors red, green and blue.
Figure 6-4 Color Interpolation

The same interpolation is applied to any varying passed from the vertex shader to the fragment shader. This is a very powerful tool because the interpolation is performed by dedicated hardware. For example, if you have a plane and you want to apply a color as a function of the distance to a center point C, you pass the coordinates of C to the vertex shader, calculate the squared distance from the vertex to C, and pass that value to the fragment shader. The value is then automatically interpolated for you at every pixel of every triangle.
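A minimal sketch of this technique, assuming hypothetical material properties _Center, _ColorNear, _ColorFar and _InvRadiusSq, could look like this:

struct vertexOutput
{
	float4 pos : SV_POSITION;
	float sqrDistToCenter : TEXCOORD0;	// computed per vertex, interpolated per pixel
};

vertexOutput vert(appdata_base input)
{
	vertexOutput output;
	output.pos = UnityObjectToClipPos(input.vertex);
	// Squared distance to the center, calculated once per vertex.
	float3 delta = input.vertex.xyz - _Center.xyz;
	output.sqrDistToCenter = dot(delta, delta);
	return output;
}

float4 frag(vertexOutput input) : SV_Target
{
	// The hardware interpolator has already blended sqrDistToCenter for this pixel.
	float t = saturate(input.sqrDistToCenter * _InvRadiusSq);
	return lerp(_ColorNear, _ColorFar, t);
}

Note that an interpolated per-vertex value is only an approximation of the true per-pixel result; accepting this approximation is what makes the technique cheap.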
Values are linearly interpolated, so it is possible to perform per-vertex computations and reuse them in the fragment shader; that is, any value that can be linearly interpolated can be calculated in the vertex shader instead of the fragment shader. This can provide a substantial performance boost because the vertex shader runs on a much smaller data set than the fragment shader.
You must be careful with the use of varyings, especially on mobile devices, where performance and memory bandwidth consumption are critical to the success of many games. The more varyings you use, the more bandwidth is consumed writing vertex outputs and reading varyings in the fragment shader. Aim for a reasonable balance when using varyings.
Non-Confidential | ARM 100140_0201_00_en
Copyright © 2014, 2015 ARM. All rights reserved.