These are notes to be aware of when upgrading projects from Unity 4 to Unity 5, if your project uses custom shader code.
Shaders no longer apply a 2x multiply of light intensity. Instead, lights are automatically upgraded to be twice as bright. This creates more consistency and simplicity in light rigs. For example, a directional light shining on a white diffuse surface now results in exactly the color of the light. The upgrade does not affect animation, so if you have an animated light intensity value you must change your animation curves or script code, making the values 2x as large to get the same look.
In the case of custom shaders where you define your own lighting functions, you need to remove the * 2 yourself.
// A common pattern in shader code that has this problem will look like this
c.rgb = s.Albedo * _LightColor0.rgb * (diff * atten * 2);
// You need to fix the code so it looks more like this
c.rgb = s.Albedo * _LightColor0.rgb * (diff * atten);
The built-in lighting pipeline in Unity 5 can in some cases use more texture coordinate interpolators or a higher math instruction count (to get things like non-uniform mesh scale and dynamic GI working). Some of your existing surface shaders might run into texture coordinate or ALU instruction limits, especially if they were targeting shader model 2.0 (the default). Adding “#pragma target 3.0” can work around this issue. See http://docs.unity.cn/Manual/SL-ShaderPrograms.html for the reference.
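A minimal sketch of where the directive goes (the shader name and surf function here are placeholders):

Shader "Example/Target30Surface" {
    Properties {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        #pragma surface surf Lambert
        // Raise the compilation target from the default shader model 2.0 to 3.0,
        // which allows more interpolators and math instructions.
        #pragma target 3.0
        sampler2D _MainTex;
        struct Input {
            float2 uv_MainTex;
        };
        void surf (Input IN, inout SurfaceOutput o) {
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
}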
In Unity 5.0, non-uniform meshes are not “prescaled” on the CPU anymore. This means that normal and tangent vectors can be non-normalized in the vertex shader. If you’re doing manual lighting calculations there, you have to normalize them yourself (see the unity_Scale examples below). If you’re using Unity’s surface shaders, then all necessary code is generated for you.
Unity 5.0 makes built-in Fog work on Windows Phone and consoles, but in order to achieve that we’ve changed how Fog is done a bit. For surface shaders and fixed function shaders, nothing extra needs to be done: fog is added automatically (you can add “nofog” to the surface shader #pragma line to explicitly make it not support fog).
For manually written vertex/fragment shaders, fog does not happen automagically now. You need to add #pragma multi_compile_fog and fog handling macros to your shader code. Check the built-in shader source (for example Unlit-Normal) to see how to do it.
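A minimal sketch of such a pass body (assuming a _MainTex texture property; the macros come from UnityCG.cginc):

CGPROGRAM
#pragma vertex vert
#pragma fragment frag
// Compile fog-on and fog-off variants of this shader
#pragma multi_compile_fog
#include "UnityCG.cginc"

sampler2D _MainTex;

struct v2f {
    float4 pos : SV_POSITION;
    float2 uv : TEXCOORD0;
    UNITY_FOG_COORDS(1) // puts the fog factor in TEXCOORD1
};

v2f vert (appdata_base v) {
    v2f o;
    o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
    o.uv = v.texcoord.xy;
    UNITY_TRANSFER_FOG(o, o.pos); // compute fog from the clip-space position
    return o;
}

fixed4 frag (v2f i) : SV_Target {
    fixed4 col = tex2D(_MainTex, i.uv);
    UNITY_APPLY_FOG(i.fogCoord, col); // blend the color towards the fog color
    return col;
}
ENDCG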
By default, all opaque surface shaders now output 1.0 (“white”) into the alpha channel. If you want to stop that, use the “keepalpha” option on the #pragma surface line.
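For example (surf and the Lambert lighting model are placeholders for your own):

#pragma surface surf Lambert keepalpha // keep the computed alpha instead of forcing it to 1.0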
All alpha blended surface shaders now use the alpha component computed by the lighting function as the blend factor (instead of s.Alpha). If you’re using custom lighting functions, you probably want to add something like “c.a = s.Alpha” towards the end of them.
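A minimal sketch of a custom Lambert-style lighting function with that assignment in place (the LightingMyLambert name is hypothetical; _LightColor0 is available inside surface shaders):

half4 LightingMyLambert (SurfaceOutput s, half3 lightDir, half atten) {
    half diff = max(0, dot(s.Normal, lightDir));
    half4 c;
    c.rgb = s.Albedo * _LightColor0.rgb * (diff * atten); // note: no * 2, per the intensity change above
    c.a = s.Alpha; // used as the blend factor for alpha blended surface shaders
    return c;
}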
Unity no longer sorts by material index in the forward render loop. This improves performance, because more objects can be rendered without state changes between them. It breaks compatibility for content that relies on the material index as a way of sorting: in 4.x, a mesh with two materials would always render the first material first and the second material second. In Unity 5 this is not the case; the order depends on what reduces state changes the most when rendering the scene.
Unity 5.0 removed support for this fixed function shader functionality: UV coordinate generation (the TexGen command), UV transformation matrices on a texture property or via the Matrix command, and several fixed function texture combiner modes (signed add, multiply signed add, multiply subtract, dot product).
Any of the above will do nothing now, and the shader inspector will show warnings about their usage. You should rewrite affected shaders using programmable vertex+fragment shaders instead. All platforms support them nowadays, and there are no advantages whatsoever to using fixed function shaders.
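As a starting point, a minimal programmable replacement for a fixed function textured shader could look like this (the shader name is hypothetical):

Shader "Example/UnlitTexture" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _MainTex_ST;

            struct v2f {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
            };

            v2f vert (appdata_base v) {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                // TRANSFORM_TEX applies texture tiling/offset, replacing fixed function texture matrices
                o.uv = TRANSFORM_TEX(v.texcoord, _MainTex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target {
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}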
If you have fairly old versions of the Projector or Water shader packages in your project, the shaders there might be using this functionality. Upgrade the packages to their 5.0 versions.
Mixing partially fixed function and partially programmable shaders (e.g. fixed function vertex lighting with a pixel shader, or a vertex shader with texture combiners) is not supported anymore. It never worked on mobile, consoles or DirectX 11 anyway. This required changing the behavior of the Legacy/Reflective/VertexLit shader to not do that: it lost per-vertex specular support, but on the plus side it now behaves consistently between platforms.
Shaders are now compiled with the HLSL compiler rather than Cg. Mostly this should be transparent (it just results in fewer codegen bugs and slightly faster shaders); however, the HLSL compiler can be slightly more picky about syntax.
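One commonly hit example: surface shader vertex modifier functions with an “out” parameter must now fully initialize the output. A minimal sketch (Input is whatever input struct your surface shader declares):

void vert (inout appdata_full v, out Input o) {
    // Zero-initialize every member of o so the stricter compiler does not
    // complain about a partially initialized output variable.
    UNITY_INITIALIZE_OUTPUT(Input, o);
}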
The “unity_Scale” shader property has been removed. In 4.x, unity_Scale.w was 1 / the uniform scale of the transform; Unity 4.x only rendered non-scaled or uniformly scaled models. Other scales were performed on the CPU, which was very expensive and had an unexpected memory overhead.
In Unity 5.0, all of this is done on the GPU by simply passing matrices with non-uniform scale to the shaders. unity_Scale has been removed because it cannot represent the full scale. In most cases where “unity_Scale” was used, we recommend transforming to world space first instead. In the case of transforming normals, you now always have to normalize the transformed normal. In some cases this leads to slightly more expensive code in the vertex shader.
// Unity 4.x
float3 norm = mul ((float3x3)UNITY_MATRIX_IT_MV, v.normal * unity_Scale.w);
// Becomes this in Unity 5.0
float3 norm = normalize(mul ((float3x3)UNITY_MATRIX_IT_MV, v.normal));
// Unity 4.x
temp.xyzw = v.vertex.xzxz * unity_Scale.xzxz * _WaveScale4 + _WaveOffset;
// Becomes this in Unity 5.0
float4 wpos = mul (_Object2World, v.vertex);
temp.xyzw = wpos.xzxz * _WaveScale4 + _WaveOffset;
Forward rendered directional light shadows do not do a separate “shadow collector” pass anymore. Now they calculate screen-space shadows from a camera’s depth texture (just like in deferred lighting).
This means that LightMode=ShadowCollector passes in shaders aren’t used for anything; you can just remove them from your shaders.
The depth texture itself is not generated using shader replacement anymore; it is rendered with ShadowCaster shader passes. This means that as long as your objects can cast proper shadows, they will also appear in the camera’s depth texture properly (this was very hard to do before if you wanted custom vertex animation or funky alpha testing). It also means that Camera-DepthTexture.shader is not used for anything now. Also, all built-in shadow shaders used no backface culling; that was changed to match the culling mode of regular rendering.
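A minimal sketch of a ShadowCaster pass built from the helper macros in UnityCG.cginc (surface shaders generate one automatically; something like this can be added to manually written shaders):

Pass {
    Name "ShadowCaster"
    Tags { "LightMode" = "ShadowCaster" }
    CGPROGRAM
    #pragma vertex vert
    #pragma fragment frag
    // Compile the variants needed for the different shadow map setups
    #pragma multi_compile_shadowcaster
    #include "UnityCG.cginc"

    struct v2f {
        V2F_SHADOW_CASTER; // position plus, for point lights, a light-to-vertex vector
    };

    v2f vert (appdata_base v) {
        v2f o;
        TRANSFER_SHADOW_CASTER(o) // fills o from v.vertex / v.normal, applying the shadow bias
        return o;
    }

    float4 frag (v2f i) : SV_Target {
        SHADOW_CASTER_FRAGMENT(i) // outputs depth (or distance for point lights)
    }
    ENDCG
}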