Here are notes on my current knowledge on how shaders work. Much of it is probably wrong. It’s mostly for my own reference.

I was tasked with recreating the look of a shader provided by an outsourced vendor that was not performant on device. Furthermore, it only really worked well on high-polygon objects, and we had a mix of high- and low-polygon meshes to display together that needed to look unified in style. Creating this simplified shader also nearly doubled our performance.

Resources

https://www.packtpub.com/books/content/using-specular-unity

Surface Shader Documentation: https://docs.unity3d.com/Manual/SL-SurfaceShaderExamples.html

Always great tutorials from catlike coding: http://catlikecoding.com/unity/tutorials/rendering/part-2/

Good info on using world space info for gradation: https://spennerino.wordpress.com/2017/06/20/gradient-shader/

Cool interactive tutorial (Thanks Hunter!) for generic shader writing – the syntax is slightly different but the principles apply: https://hughsk.io/fragment-foundry/chapters/03-rgb.html

Custom Lighting Models

Lighting models are how we tell the shader to make light react to the surface.

I learned a lot from this resource: http://www.jordanstevenstechart.com/lighting-models

There are built-in models like ‘Lambert’. To create your own, write a function whose name starts with ‘Lighting’:

half4 LightingSimpleSpecular (SurfaceOutputColorSpec s, half3 lightDir, half3 viewDir, half atten) 

Then under CGPROGRAM, you can point the surface pragma at the new name (minus the ‘Lighting’ prefix):

#pragma surface surf SimpleSpecular

In this example, I’ve attempted to piece together learnings from various sites to create a good specular map look. It looks good in the editor but fails in two ways on device. The first is that any material with a strong spec gets odd blown-out areas, even in spots where the map has zero brightness. The second, and maybe related, problem is that on low-polygon objects, faces angled away from the light turn completely black. I’ve narrowed the latter down to the spec variable that I’m multiplying by. I compute the half vector as in a Blinn-Phong model, dot it with the normal, and raise the result to the power of x, x being the specular gloss value.

It’s definitely a problem in the calculation where I’m using pow with the gloss value to make the highlight on the surface. However, many posts online do this exactly the same way for mobile. SOLUTION: I was allowing gloss to go to zero. iOS has a problem with pow when the exponent is 0 where the editor does not.

half4 LightingSimpleSpecular (SurfaceOutputColorSpec s, half3 lightDir, half3 viewDir, half atten)
      {
          // Blinn-Phong style half vector between the view and light directions
          half3 h = normalize (viewDir + lightDir);
          float nh = max (0, dot (s.Normal, h));

          // Specular term: the gloss value is the pow exponent (see the pow problem above)
          float spec = pow (nh, _SpecularGloss) * s.Specular * _SpecularPower;

          // Diffuse term: plain N dot L, which goes negative on faces pointing away from the light
          half diff = dot (s.Normal, lightDir);

          half4 c;
          c.rgb = (s.Albedo * _LightColor0.rgb * diff * _DiffPower * _MainTint.rgb + _LightColor0.rgb * spec);
          c.a = 1.0;
          return c;
      }
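
One way to apply the fix above, as a sketch (the 0.01 floor is an arbitrary value of my own; clamping the property’s Range in the inspector works just as well):

          // Keep the gloss exponent away from zero so pow() behaves on iOS
          float gloss = max (_SpecularGloss, 0.01);
          float spec = pow (nh, gloss) * s.Specular * _SpecularPower;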

Possible clue for rendering discrepancy: http://answers.unity3d.com/questions/545907/why-does-inscreenpos-in-a-surface-shader-have-diff.html

Structs

One thing in particular that I’ve never understood (only copy-pasted in the past) is how this data gets passed around. It never occurred to me that these are not built-in things; they are engineer-created and can be named whatever I want.

This line takes the input values and spits out the output:  void surf (Input IN, inout SurfaceOutputColorSpec o) 

Both Input and SurfaceOutputColorSpec are structs that I created. In the surface function I assign whatever values I want, like a tex2D sample, and those can then be used in the custom lighting model.

One point of confusion, however, is that some examples use the surface function to assign the inspector-driven values to the struct, while others simply declare the floats above the lighting model and use them there directly. Both seem to work.
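
For reference, a minimal sketch of what the two structs might look like; the exact members are my guess based on the snippets below, not the actual shader:

      // Per-pixel inputs Unity fills in for the surface function
      struct Input
      {
          float2 uv_MainTex;
          float2 uv_BumpMap;
          float3 viewDir;      // used for the rim term later
          float4 screenPos;    // used for the screen-space hologram detail
      };

      // Custom output struct handed from surf() to the custom lighting function
      struct SurfaceOutputColorSpec
      {
          fixed3 Albedo;
          fixed3 Normal;
          fixed3 Emission;
          half Specular;
          fixed Alpha;
      };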

Rim Lighting with Normals

Following the basic rim lighting tutorial, I learned how to compare the normals of a surface with the view direction to create a rim light. However, that takes the normals of the normal map into account as well. In my case, my art directors essentially wanted a rim driven only by the geometry, while maintaining the detail of a normal map. My solution for this is not optimal. I ended up making a texture slot for an “empty normal”. Then in the surface function I unpacked the empty normal into the .Normal of the output struct, did my rim lighting line, then unpacked the actual normal over it. It’s weird.

          // Temporarily use the flat "empty" normal so the rim only sees the geometry
          o.Normal = UnpackNormal (tex2D (_SecondaryBump, IN.uv_BumpMap));
          half rim = 1.0 - saturate(dot (normalize(IN.viewDir), o.Normal) + _PushRimAngle);
          float2 screenUV = IN.screenPos.xy * _HologramDetail / IN.screenPos.w;
          o.Emission = clamp(_RimColor.rgb * tex2D (_EmissionMap, screenUV).r / 10.0, 0.0, 1.0) * pow (rim, _RimPower);
          // Swap the real normal map back in for everything else
          o.Normal = UnpackNormal (tex2D (_BumpMap, IN.uv_BumpMap));
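
(A possible simplification I haven’t tried: a flat normal map texel unpacks to (0, 0, 1) in tangent space, so the extra texture slot could probably be replaced with a constant.)

          // Untested alternative: a flat tangent-space normal in place of the "empty normal" texture
          o.Normal = fixed3(0.0, 0.0, 1.0);
          half rim = 1.0 - saturate(dot (normalize(IN.viewDir), o.Normal) + _PushRimAngle);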

Rim Pushing

I previously came up with this solution while using rim shading to create a lightsaber’s alpha. Instead of color, I was setting the alpha based on the angle of the geo in relation to the camera. However, pure falloff would require an angle completely perpendicular to the camera to fall all the way off, which was never happening on the cylinder. Pushing the rim also works nicely in other cases, giving me more control over how much rim is showing beyond just the rim power. I modified the documentation’s calculation like so:

half rim = 1.0 - saturate(dot (normalize(IN.viewDir), o.Normal) + _PushRimAngle);

Booleans in Shaders

There is no boolean property type, so this is a hack to expose a toggle-able option that turns something on or off in the shader.

At the top:

[Toggle]
_DESATURATE ("Desaturate", Float) = 0

In the pragma section:

#pragma shader_feature __ _DESATURATE_ON

In the surface shader: (In writing this up, I realize I could probably have been smarter about this by operating on the tex2D sample before assigning Albedo – something to look into.)

#if defined(_DESATURATE_ON)
              // Straight average of the three channels
              fixed3 col = o.Albedo;
              fixed average = (col.r + col.g + col.b) / 3.0;
              o.Albedo = fixed3(average, average, average);
#endif
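
(A common alternative to the straight average is a luminance-weighted gray; the Rec. 601 weights below are standard, but this isn’t what the shader above actually does.)

#if defined(_DESATURATE_ON)
              // Luminance-weighted desaturation using Rec. 601 weights
              fixed gray = dot(o.Albedo, fixed3(0.299, 0.587, 0.114));
              o.Albedo = fixed3(gray, gray, gray);
#endif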


ZWriting in Transparent Shader

https://docs.unity3d.com/Manual/SL-CullAndDepth.html

In order to keep the transparent shader from looking weird and showing itself through itself in ugly ways, I’ve found it necessary to write to the depth buffer. This means an additional pass:

Pass
{
    ZWrite On
    ColorMask 0
}
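
For context, a minimal sketch of where that pass sits in a transparent surface shader; the shader name, properties, and Lambert lighting are placeholders of my own, not our actual shader:

Shader "Custom/TransparentDepthPrepass"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
        _Alpha ("Alpha", Range(0, 1)) = 0.5
    }
    SubShader
    {
        Tags { "Queue" = "Transparent" "RenderType" = "Transparent" }

        // Depth-only pass: writes depth, outputs no color, so the lit pass
        // below cannot draw the mesh through itself
        Pass
        {
            ZWrite On
            ColorMask 0
        }

        CGPROGRAM
        #pragma surface surf Lambert alpha:fade

        sampler2D _MainTex;
        half _Alpha;

        struct Input
        {
            float2 uv_MainTex;
        };

        void surf (Input IN, inout SurfaceOutput o)
        {
            o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
            o.Alpha = _Alpha;
        }
        ENDCG
    }
}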


Gradient in World Space

I made promises when tasked with adding a fade to one of our holographic characters. I was absolutely sure I could grab the bounding box of a mesh and apply a screen-space transparency based on those positions. However, I couldn’t find that info anywhere.

Instead, I came up with a much more powerful solution based on this tutorial: https://spennerino.wordpress.com/2017/06/20/gradient-shader/

Where that tutorial drove color, I drove transparency. I also wrote a helper script that takes the positions of two dummy objects as the start and end of the fade, allowing the animator to animate the fade in and out with a lot of control. The downside is that changing material properties causes the material to register as changed in git every time you enter Play mode. I’ve yet to find a solution to this type of problem.

half _FadeStart;
      half _FadeEnd;

      void myvert(inout appdata_full v, out Input data)
      {
          UNITY_INITIALIZE_OUTPUT(Input, data);
          // World-space position of the vertex
          float4 pos = mul(unity_ObjectToWorld, v.vertex);
          // 0..1 fade factor based on where the vertex sits between the start and end heights
          data.fade = saturate((_FadeStart - pos.y) / (_FadeStart - _FadeEnd));
      }

and

// Set alpha for bounding box fade
          o.Alpha = lerp(0.0, 1.0, IN.fade);
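
For the two pieces above to connect, the Input struct needs a slot for the fade value and the surface pragma has to call the custom vertex function. A sketch, assuming a Lambert lighting model purely for illustration:

      #pragma surface surf Lambert vertex:myvert alpha:fade

      struct Input
      {
          float2 uv_MainTex;
          float fade;   // written per-vertex by myvert, interpolated for surf
      };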


Odd Tint Problem

I added a tint color option to my shader, and we had an interesting problem: on the side of the object facing away from the light, the tint was showing up as an inverted version of the tint color.

I was applying the tint in my custom lighting model, in the same line where I multiply by the N dot L. Since N dot L goes negative on faces pointing away from the light, we’re essentially taking the opposite of the lighting calculation and applying it to the tint, which is what inverts it. To solve this I moved the tint multiplication into the Albedo assignment down in my surface shader function. This way the tint affects the entire diffuse color before any lighting gets calculated.

Lighting Model Diffuse Calculation:  c.rgb = (s.Albedo * _LightColor0.rgb * diff * _DiffPower * _MainTint.rgb…. etc

Moved to surface function:

o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb * _MainTint.rgb;

Sorting Order for Transparent Meshes

https://forum.unity3d.com/threads/drawing-order-of-meshes-and-sprites.212006/

Vertex Color Weirdness

http://answers.unity3d.com/questions/189584/how-to-get-vertex-color-in-a-cg-shader.html

Avoiding Conditionals in Shader

http://theorangeduck.com/page/avoiding-shader-conditionals

The above looks like a useful resource; unfortunately, I couldn’t figure out how to get those functions recognized in my surface shader. Maybe there’s some kind of tag I need to add. Leave a comment if you know.

Instead, I found step() to do the trick: step(edge, x) returns 1 when x is greater than or equal to edge, and 0 otherwise.

I had an if / else if / else creating a sort of threshold effect using a noise texture, and I wanted to remove the branching. With some shifting around of values, I got it into a hopefully more performant, albeit a little less readable, form…


// Old conditional

if (noise <= _TransitionPower - _TransitionMidWidth)
{
    o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
}
else if (noise > _TransitionPower - _TransitionMidWidth && noise < _TransitionPower + _TransitionMidWidth)
{
    o.Albedo = _BurnColor.rgb;
}
else
{
    o.Albedo = tex2D (_SecondaryTex, IN.uv_MainTex).rgb;
}


// Using step instead
// Each mask below is 0 or 1, and only one of the three is active for a given noise value
// (the secondary texture only kicks in above the upper threshold, matching the else branch)
o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb * step(noise, _TransitionPower - _TransitionMidWidth) +
           tex2D (_SecondaryTex, IN.uv_MainTex).rgb * step(_TransitionPower + _TransitionMidWidth, noise) +
           _BurnColor.rgb * (noise > _TransitionPower - _TransitionMidWidth && noise < _TransitionPower + _TransitionMidWidth);


Depth Sorting in Transparent Shaders

https://docs.unity3d.com/Manual/SL-CullAndDepth.html

I had an issue implementing the solution above. On iPhone 8 only, I was seeing odd sorting issues in the mesh as it flashed transparent on and off. It turned out my two passes were z-fighting, rendering on top of one another. Paul came up with a great one-line solution after I spent SEVENTEEN hours trying to troubleshoot for our safety build… Use Offset 1,1 in the depth-only pass to push it back slightly in depth, making sure it renders behind the visible pass. For more details, see my forum post about it: https://forum.unity.com/threads/depth-testing-transparent-shader-zfighting-on-mobile.498897/
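
A sketch of the depth-only pass with that fix applied (the same pass as in the ZWrite section above):

Pass
{
    ZWrite On
    ColorMask 0
    Offset 1, 1   // depth offset nudges this pass back so it stops fighting the visible pass
}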