Alpha blending for realtime hair

Saturday, October 17, 2009

With my project progressing, I finally have to take care of the hair.
All the rendering is based on polygons. But polygon edges are hard lines, while hair is highly detailed and irregular. To get nice borders, some kind of transparency is needed. And here I ran into a problem which, as I realized, is well known but has no real solution...

When a polygon object is rendered, each triangle is rendered one after another. The order does not necessarily depend on position, rotation or visibility. So weird things like this can happen:


See how the draw order (squares at the bottom) affects the result.

One possibility would be to depth-sort all the triangles, but this can be quite expensive when drawing a few thousand polygons each frame.
For that reason, there is the ZBuffer: a buffer on the graphics card that stores the distance between the camera and each rendered surface. By checking the ZBuffer for the respective fragment, it's possible to tell whether the currently rendered polygon is occluded by some previously rendered polygon or not. If it is occluded, it will simply be skipped. This way you can make sure to draw on top only. Each result would look like the example on the left (1.1).
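In Unity's ShaderLab these are just the default render states; spelled out explicitly, an opaque pass would look something like this (a sketch for illustration, not a shader from the project):

```shaderlab
// Sketch only: the ZBuffer behaviour described above, written out as
// explicit (default) render states in Unity's ShaderLab.
Shader "Sketch/OpaqueDefaults" {
    SubShader {
        Pass {
            ZWrite On          // store this fragment's depth in the ZBuffer
            ZTest LEqual       // skip the fragment if something closer was drawn
            Color (1, 0, 0, 1) // fixed-function flat red, just for illustration
        }
    }
}
```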

That works great... but it fails when it comes to semi-transparent objects, because the ZBuffer can only hold one depth value per fragment.


In 2.2 the green plane completely occludes the plane in the back.

To prevent semitransparent objects from occluding objects behind them, the most common technique is to draw them after all the opaque objects have finished rendering. Again: this works fine, but...
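In Unity, this draw-after-opaque ordering is what the render queue does; a material opts in via the "Transparent" queue tag. A minimal sketch:

```shaderlab
// Sketch: objects tagged "Transparent" are drawn after everything in the
// default "Geometry" queue, roughly sorted back to front per object.
Shader "Sketch/DrawAfterOpaque" {
    SubShader {
        Tags { "Queue" = "Transparent" }
        Pass {
            Blend SrcAlpha OneMinusSrcAlpha  // standard alpha blending
            Color (0, 1, 0, 0.5)             // half-transparent green
        }
    }
}
```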


3.2 shows a problem with multiple semitransparent layers

If there are multiple layers of semitransparent objects, the result depends on the ZWrite state, i.e. whether the ZBuffer gets updated. If ZWrite is disabled, the result for the worst-case draw order looks like 3.2. If it is enabled, the same situation looks like 3.3.
Which one looks better depends heavily on the alpha value (i.e. how transparent the object is).
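The difference between 3.2 and 3.3 comes down to a single render state. As ShaderLab pass fragments (sketches, both assuming the object is drawn in the Transparent queue):

```shaderlab
// Variant 3.2: ZWrite disabled - transparent fragments never occlude each
// other, they only blend; the draw order still decides the final colour.
ZWrite Off
Blend SrcAlpha OneMinusSrcAlpha

// Variant 3.3: ZWrite enabled - a transparent fragment writes its depth,
// so later-drawn fragments behind it are rejected by the ZTest entirely.
ZWrite On
Blend SrcAlpha OneMinusSrcAlpha
```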

So it's often best to stick with a very small number of small semitransparent objects. This way the chance of seeing depth-sorting artifacts is quite low.
One common way to completely avoid semi-transparency is to use AlphaTest instead of alpha blending. This gives predictable results at the cost of quality.


AlphaTesting avoids semitransparency, but with poor results.
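As a ShaderLab sketch, such a cutout shader could look like this (texture and cutoff names are illustrative, not from the project):

```shaderlab
// Sketch: 1-bit transparency via AlphaTest. Fragments below the cutoff are
// discarded; everything else is drawn fully opaque, so the ZBuffer works as
// usual and no sorting is needed - at the cost of hard, aliased edges.
Shader "Sketch/HairCutout" {
    Properties {
        _MainTex ("Hair texture (RGBA)", 2D) = "white" {}
        _Cutoff ("Alpha cutoff", Range(0, 1)) = 0.5
    }
    SubShader {
        Pass {
            AlphaTest Greater [_Cutoff]
            SetTexture [_MainTex] { combine texture }
        }
    }
}
```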

Ok, so what does that mean for me...
Using 1-bit alpha was not an option, because it just looks bad. I did some tests, but it never came even close to looking acceptable. Depth-sorting all the triangles on the CPU would be too expensive, especially because Unity does all the skinning on the CPU.

After going through several different techniques and options, I finally ended up with the following procedure:

  1. Render the opaque parts of the hair (AlphaTest Equal 1)
  2. Render the semitransparent back faces of the hair (Cull Front)
  3. Render the semitransparent front faces (Cull Back)
Each of the three steps is split up into 2 passes:

  • The first pass runs once to gather the ambient light (cubemap sampler) and blends with normal alpha blending (SrcAlpha OneMinusSrcAlpha).
  • The second pass runs once for each light source with soft-additive alpha blending (SrcAlpha One). 
For the semitransparent parts, only the fragments that are closer than the opaque parts are rendered (ZTest Less) and the ZBuffer is not updated (ZWrite Off).
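Put together as ShaderLab render states, the three steps look roughly like this. This is only a sketch of the state setup, not the actual shader: the real version splits each step into an ambient pass (Blend SrcAlpha OneMinusSrcAlpha) and a per-light pass (Blend SrcAlpha One), and does the actual shading in CG.

```shaderlab
Shader "Sketch/HairThreeSteps" {
    Properties {
        _MainTex ("Hair texture (RGBA)", 2D) = "white" {}
    }
    SubShader {
        // 1. Opaque core: only fully opaque texels pass, depth is written.
        Pass {
            AlphaTest Equal 1
            SetTexture [_MainTex] { combine texture }
        }
        // 2. Semitransparent back faces first...
        Pass {
            Cull Front
            ZTest Less   // only fragments in front of the opaque core
            ZWrite Off   // leave the ZBuffer untouched
            Blend SrcAlpha OneMinusSrcAlpha
            SetTexture [_MainTex] { combine texture }
        }
        // 3. ...then the semitransparent front faces on top of them.
        Pass {
            Cull Back
            ZTest Less
            ZWrite Off
            Blend SrcAlpha OneMinusSrcAlpha
            SetTexture [_MainTex] { combine texture }
        }
    }
}
```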

This procedure (which is roughly based on this paper by Thorsten Scheuermann) does not completely solve the problem, but with a clean mesh the result can be acceptable.

I hope I can post first results next week, together with some decent anisotropic specular hair shading... :D


2 comment(s):

Jan Bubenik said...

I guess I was a little bit optimistic about that... still got lots of zFighting. I'm working on it! :(

Anonymous said...

Any chance you share the code for this shader?
