6.1.6 Vertex shaders

The vertex shader runs once for each vertex of the geometry. The purpose of the vertex shader is to transform the 3D position of each vertex, given in the local coordinates of the object, to the projected 2D position in screen space, and to calculate the depth value for the Z-buffer.

The vertex shader sample code is in 6.1.2 Shader structure.
The vertex shader must return the transformed position in its output. If the vertex shader does not return a value, the console displays the following error:
Shader error in 'Custom/ctTextured': '' : function does not return a value: vert at line 36
In the example, the vertex shader receives as input the vertex coordinates in local space and the texture coordinates. Vertex coordinates are transformed from local to screen space using the Model View Projection matrix UNITY_MATRIX_MVP, which is a Unity built-in value:
output.pos = mul(UNITY_MATRIX_MVP, input.vertex);
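For reference, a complete minimal vertex shader of this form might look as follows. This is a sketch rather than the full listing from 6.1.2; the structure names vertexInput and vertexOutput are assumptions based on the input and output identifiers used in this section:
struct vertexInput
{
    float4 vertex : POSITION;    // vertex position in local (object) space
    float3 normal : NORMAL;      // vertex normal in local space
    float4 texcoord : TEXCOORD0; // texture coordinates
};
struct vertexOutput
{
    float4 pos : SV_POSITION;    // projected position in screen space
    float4 texcoord : TEXCOORD0; // texture coordinates passed as a varying
};
vertexOutput vert(vertexInput input)
{
    vertexOutput output;
    // Transform the vertex from local space to screen space.
    output.pos = mul(UNITY_MATRIX_MVP, input.vertex);
    // Pass the texture coordinates through to the fragment shader.
    output.texcoord = input.texcoord;
    // Returning the output structure avoids the "function does not
    // return a value" error shown above.
    return output;
}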
Texture coordinates are passed to the fragment shader as a varying, but this does not mean that they cannot be transformed in the vertex shader.
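For example, the TRANSFORM_TEX macro from UnityCG.cginc applies the tiling and offset values of a texture, stored in its _ST variable, to the coordinates before they are interpolated. This sketch assumes the shader declares _MainTex together with its float4 _MainTex_ST variable:
// Apply the tiling and offset of _MainTex (stored in _MainTex_ST)
// to the UVs before they are passed on as a varying.
output.texcoord.xy = TRANSFORM_TEX(input.texcoord, _MainTex);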
Normals are transformed from object space to world space in a different manner. To guarantee that the normal remains perpendicular to the triangle after a non-uniform scaling operation, it must be multiplied by the transpose of the inverse of the transformation matrix. To apply the transpose, you flip the order of the factors in the multiplication. The inverse of the local-to-world matrix is the built-in Unity matrix _World2Object. It is a 4x4 matrix, so you must build a four-component vector from the three-component normal input vector:
float4 normalWorld = mul(float4(input.normal, 0.0), _World2Object);
When building the four-component vector, you add a zero as the fourth component. A zero fourth component makes the vector ignore the translation part of the 4x4 matrix, which is the correct behavior for directions, while for position coordinates the fourth component must be one.
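Comparing the two transformations makes the difference visible. _Object2World is the built-in local-to-world matrix, and the 1.0 fourth component in the position case is the unit the text refers to:
// Directions: a zero fourth component ignores the translation in the matrix.
float4 normalWorld = mul(float4(input.normal, 0.0), _World2Object);
// Positions: a unit fourth component applies the translation.
float4 posWorld = mul(_Object2World, float4(input.vertex.xyz, 1.0));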
You can skip normal transformation if the normals are already supplied in world coordinates. This saves work in the vertex shader. Avoid this optimization if the object mesh could potentially be handled by any Unity built-in shader, because built-in shaders expect normals in object coordinates.
Most graphics effects are implemented in the fragment shader, but you can also implement some effects in the vertex shader. Vertex Displacement Mapping, also known as Displacement Mapping, is a well-known technique that enables you to deform a polygonal mesh using a texture to add surface detail, for example, in terrain generation using height maps. To access this texture, also known as the displacement map, from the vertex shader, you must add the directive #pragma target 3.0 because vertex texture fetch is only available from shader model 3.0 onwards. Shader model 3.0 requires that at least 4 texture units are accessible inside the vertex shader. If you force the editor to use the OpenGL renderer, you must also add the #pragma glsl directive. If you do not declare this directive, the error message produced suggests it:
Shader error in 'Custom/ctTextured': function "tex2D" not supported in this profile (maybe you want #pragma glsl?) at line 57
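A minimal displacement mapping sketch is shown below. The _DispMap and _Displacement shader properties are assumptions, and tex2Dlod is used here so that the mip level is selected explicitly, which vertex texture fetch requires in most profiles:
#pragma target 3.0
// #pragma glsl   // add this if you force the OpenGL renderer
sampler2D _DispMap;  // assumed displacement map property
float _Displacement; // assumed displacement scale property
vertexOutput vert(vertexInput input)
{
    vertexOutput output;
    // Sample the displacement map at mip level 0 in the vertex shader.
    float height = tex2Dlod(_DispMap, float4(input.texcoord.xy, 0.0, 0.0)).r;
    // Displace the vertex along its normal before projecting it.
    input.vertex.xyz += input.normal * height * _Displacement;
    output.pos = mul(UNITY_MATRIX_MVP, input.vertex);
    output.texcoord = input.texcoord;
    return output;
}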
In the vertex shader you can also animate vertices using "procedural animation" techniques. You can use the time variable in shaders to modify the vertex coordinates as a function of time. Mesh skinning is another type of functionality implemented in the vertex shader; Unity uses it to animate the vertices of the meshes associated with character skeletons.
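As a simple procedural animation sketch, the built-in _Time variable, whose y component holds the elapsed time in seconds, can drive a sine-wave deformation. The _Amplitude and _Frequency properties are assumptions:
// Offset each vertex sideways with a sine wave travelling along the mesh.
// _Time.y is the time in seconds since the application started.
float wave = sin(_Time.y * _Frequency + input.vertex.y);
input.vertex.x += wave * _Amplitude;
output.pos = mul(UNITY_MATRIX_MVP, input.vertex);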