Monday, October 8, 2018

Ripples For Days



I see a lot of water shaders with all sorts of techniques for doing ripples when objects interact.  There's using the SDF in Unreal to figure out how far objects are from the water surface, but you can't pass in information about specific objects and their interactions.  There's passing specific points to the shader where you want ripples to happen, but that's limited to a handful of ripples whose state needs to be maintained manually.  And there are full fluid simulations, which look cool but can be costly to calculate and difficult to stylize.



The technique I've used in a few projects, including Prodeus and Super Lucky's Tale, takes the flexibility of a particle system and combines it with the shader performance of sampling a texture once.



The basic idea is to render all the ripples from an orthographic camera looking down and then re-project that texture onto your water surface.  This allows you to get all the ripple information and composite it with your water information, combining height, normal and foam information into one seamless shader.

Click Here to skip to the project on GitHub and try it out for yourself.


There are three components to this technique: the ripple renderer script that handles rendering all the ripples into a single texture, the ripple shader that is applied to the ripple particles being rendered, and the ripple include file which you use in your water shader to sample the ripple texture.

Ripple Renderer Script

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class DynamicRippleRenderer : MonoBehaviour
{
 public LayerMask layerMask;
 public int texSize = 2048;
 public float rippleDist = 64.0f;
 public Camera rippleCam;
 public RenderTexture targetTex;
Starting off the ripple renderer script, we need a layer mask.  The ripple particles will be on their own layer: I added a layer called "Ripples" at the bottom of the layer list and set both this field and the ripple particles' layer to it.
texSize defines the size of the render texture; a power of two is usually used.
rippleDist is how large to make the renderable ripple area; ripples outside this area will not be rendered.
rippleCam is a camera that will be created later to render the ripples.
targetTex is a render texture that will be created later that the camera will render to.

 void Start() {
  CreateTexture();
  CreateCamera();
 }

 void CreateTexture() {
  targetTex = new RenderTexture(texSize, texSize, 0, RenderTextureFormat.ARGBHalf, RenderTextureReadWrite.Linear);
  targetTex.Create();
 }

 void CreateCamera() {
  rippleCam = this.gameObject.AddComponent<Camera>(); // add a camera to this game object
  rippleCam.renderingPath = RenderingPath.Forward; // simple forward render path
  rippleCam.transform.rotation = Quaternion.Euler(90, 0, 0); // rotate the camera to face down
  rippleCam.orthographic = true; // the camera needs to be orthographic
  rippleCam.orthographicSize = rippleDist; // the area size that ripples can occupy
  rippleCam.nearClipPlane = 1.0f; // near clip plane doesn't have to be super small
  rippleCam.farClipPlane = 500.0f; // generous far clip plane
  rippleCam.depth = -10; // make this camera render before everything else
  rippleCam.targetTexture = targetTex; // set the target to the render texture we created
  rippleCam.cullingMask = layerMask; // only render the "Ripples" layer
  rippleCam.clearFlags = CameraClearFlags.SolidColor; // clear the texture to a solid color each frame
  rippleCam.backgroundColor = new Color(0.5f, 0.5f, 0.5f, 0.5f); // the ripples are rendered as overlay so clear to grey
  rippleCam.enabled = true;
 }
In the Start function we will call the two functions that create the render texture and the camera.

In CreateTexture a new render texture is created and assigned to the targetTex variable. The format I'm using is ARGBHalf to get better quality, but you can try ARGB32 to save some memory and bandwidth, or a single channel format if all you need is height.

In CreateCamera we add a new camera to this game object (did I mention this script should be applied to an empty game object?). There are lots of settings for the camera which I have hopefully explained in the comments, but the important ones are: the camera needs to be orthographic, pointed down, rendering to the target texture, drawing only the "Ripples" layer, and cleared to gray. It needs to be cleared to gray because the ripple shader is going to render as an overlay (light parts make things lighter and dark parts make things darker). This allows for normal map accumulation in a low range buffer (ARGB32) and is also order independent, which saves some memory and still looks pretty good. Alternatively you could clear to black, use an HDR buffer (ARGBHalf), and render additively with the negative parts of the normal map subtracting below zero.
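As a quick sanity check on the overlay math (a minimal sketch in plain Python, not Unity code): Blend DstColor SrcColor works out per channel to out = src * dst + dst * src = 2 * src * dst, so a source value of 0.5 leaves the gray buffer untouched, and since multiplication is commutative the ripple draw order doesn't matter.

```python
def overlay_blend(dst, src):
    # Unity's "Blend DstColor SrcColor": out = src*dst + dst*src = 2*src*dst
    return min(2.0 * src * dst, 1.0)

buffer = 0.5                                # buffer cleared to grey
assert overlay_blend(buffer, 0.5) == 0.5    # 0.5 is the neutral value
assert overlay_blend(buffer, 0.75) == 0.75  # values above 0.5 brighten
assert overlay_blend(buffer, 0.25) == 0.25  # values below 0.5 darken
```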
 void OnEnable() {
  Shader.EnableKeyword("DYNAMIC_RIPPLES_ON");
 }

 void OnDisable() {
  Shader.DisableKeyword("DYNAMIC_RIPPLES_ON");
 }
When the script is enabled or disabled the ripple feature in the shader will be toggled to save on performance.
 void LateUpdate() {

  Vector3 newPos = Vector3.zero;
  Vector3 viewOffset = Vector3.zero;

  if (Camera.main != null) {
   newPos = Camera.main.transform.position;
   viewOffset = newPos + Camera.main.transform.forward * rippleDist * 0.5f;
  }

  newPos.x = viewOffset.x;
  newPos.z = viewOffset.z;
  newPos.y += 250.0f;
  float mulSizeRes = (float)texSize / ( rippleDist * 2f );
  newPos.x = Mathf.Round (newPos.x * mulSizeRes) / mulSizeRes;
  newPos.z = Mathf.Round (newPos.z * mulSizeRes) / mulSizeRes;
  this.transform.position = newPos;
  this.transform.rotation = Quaternion.Euler(90, 0, 0);

  Shader.SetGlobalTexture ("_DynamicRippleTexture", targetTex);
  Shader.SetGlobalMatrix ("_DynamicRippleMatrix", rippleCam.worldToCameraMatrix);
  Shader.SetGlobalFloat ("_DynamicRippleSize", rippleCam.orthographicSize);

 }
}
In LateUpdate the ripple rendering camera will follow the main camera but will snap to the pixels in the ripple render texture. This will keep the pixels from swimming when you move the camera a little bit. The global shader variables for the ripple texture, ripple camera matrix, and the ripple camera size are also set.
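The snapping math can be checked in isolation. This is a Python sketch of the same rounding the script does, using the default texSize and rippleDist values:

```python
def snap_to_texel_grid(x, tex_size=2048, ripple_dist=64.0):
    # the texture covers rippleDist * 2 world units, so this many texels per unit:
    texels_per_unit = tex_size / (ripple_dist * 2.0)
    # round the position to the nearest texel, same as the Mathf.Round calls above
    return round(x * texels_per_unit) / texels_per_unit

# with 2048 texels over 128 units there are 16 texels per unit,
# so positions snap to 1/16 of a unit
assert snap_to_texel_grid(10.02) == 10.0
assert snap_to_texel_grid(10.05) == 10.0625
```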

The Ripple Shader

Shader "Custom/ParticleRipple"
{
 Properties {
 _MainTex ("Particle Texture", 2D) = "white" {}
 }

 Category{
  Tags { "Queue"="Transparent" "IgnoreProjector"="True" "RenderType"="Transparent" }
  Blend DstColor SrcColor // overlay blend mode
  //Blend One One // additive blend mode
  ColorMask RGBA
  Cull Off
  Lighting Off
  ZWrite Off

  SubShader {
   Pass {

    CGPROGRAM
    #pragma vertex vert
    #pragma fragment frag

    #include "UnityCG.cginc"

    sampler2D _MainTex;
    float4 _MainTex_ST;
   
    struct appdata_t {
     float4 vertex : POSITION;
     float2 texcoord : TEXCOORD0;
     fixed4 color : COLOR;
    };

    struct v2f {
     float4 vertex : SV_POSITION;
     float2 texcoord : TEXCOORD0;
     fixed4 color : COLOR;
    };
   
    v2f vert (appdata_t v)
    {
     v2f o;
     o.vertex = UnityObjectToClipPos(v.vertex);
     o.texcoord = TRANSFORM_TEX(v.texcoord, _MainTex);
     o.color = v.color;
     return o;
    }

    fixed4 frag (v2f IN) : SV_Target
    {
     half4 col = tex2D(_MainTex, IN.texcoord);
     col = lerp( half4( 0.5, 0.5, 0.5, 0.5 ), half4( col.xyz,1.0 ), col.w * IN.color.w );   
     return col;
    }
    ENDCG 
   }
  } 
 }
}
The ripple particle shader itself is very simple: it samples a texture and blends it toward gray based on the vertex alpha and the texture alpha. The RGB of the vertex color is not used. Blend DstColor SrcColor gives the overlay blend mode.



The ripple texture itself has the normal overlay packed into the red and green channels, the foam overlay packed into the blue channel, and the height / alpha overlay packed into the alpha channel. When using the overlay blend mode it's good to use the alpha to blend toward 0.5, because texture colors are not always exact and you can otherwise end up seeing the edges of the ripple particle. This texture is uncompressed and non-sRGB.
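For reference, the packing follows the usual overlay convention: every channel lives in 0 to 1 with 0.5 as the neutral "no effect" point. A hypothetical sketch (channel order per the description above):

```python
def pack_overlay(normal_x, normal_y, foam, height):
    # remap each -1..+1 overlay value to 0..1 so 0.5 means "no change"
    to01 = lambda v: v * 0.5 + 0.5
    return (to01(normal_x), to01(normal_y), to01(foam), to01(height))

# a pixel with no ripple influence packs to neutral grey in all four channels
assert pack_overlay(0.0, 0.0, 0.0, 0.0) == (0.5, 0.5, 0.5, 0.5)
```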

The Ripple Shader Include

#ifndef DYNAMIC_RIPPLE_INCLUDED
#define DYNAMIC_RIPPLE_INCLUDED

sampler2D _DynamicRippleTexture;
float4x4 _DynamicRippleMatrix;
float _DynamicRippleSize;

float4 WaterRipples(float3 worldPos, float3 worldNormal) {
 float2 rippleCoords = mul(_DynamicRippleMatrix, float4(worldPos, 1)).xy * (1.0 / _DynamicRippleSize);
 half rippleMask = saturate((1.0 - abs(rippleCoords.x)) * 20) * saturate((1.0 - abs(rippleCoords.y)) * 20) * saturate(worldNormal.y);
 half4 ripples = tex2D( _DynamicRippleTexture, saturate(rippleCoords.xy * 0.5 + 0.5) );
 ripples.xyz = pow(ripples.xyz, 0.45);

 ripples = ripples * 2.0 - 1.0;
 ripples *= rippleMask;

 return ripples;
}

#endif // DYNAMIC_RIPPLE_INCLUDED
The ripple include file has a function for sampling the ripple texture. The world position of the surface gets passed in and transformed by the ripple camera matrix, which puts it in -1 to +1 ripple coordinate space. A mask is made from this -1 to +1 space and the surface world normal, which also gets passed in; the ripples need to be faded out at the edges to avoid a harsh cutoff. When sampling the ripple texture, the -1 to +1 space gets converted to 0 to 1 coordinate space. For whatever reason, if your project is in linear color space, render textures always get sampled as sRGB textures regardless of what they are set to in script, so the texture needs to be raised to the power of 0.45. (If your project is in gamma space, all render textures are sampled as non-sRGB.) Because the ripples were rendered as an overlay, the texture then needs to be converted back to -1 to +1 space. Finally the result gets multiplied by the ripple mask and returned to the shader to be used however you like.
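The coordinate transform and edge mask can be sketched outside the shader. Assuming the ripple camera is axis-aligned and looking straight down (which the renderer script guarantees), the matrix multiply reduces to an offset from the camera center:

```python
def ripple_coords(world_x, world_z, cam_x, cam_z, ortho_size):
    # world position relative to the ripple camera, scaled to -1..+1 ripple space
    return (world_x - cam_x) / ortho_size, (world_z - cam_z) / ortho_size

def edge_mask(u, v, world_normal_y):
    # fade out near the edges of the ripple area, same as the saturate chain above
    sat = lambda x: min(max(x, 0.0), 1.0)
    return sat((1.0 - abs(u)) * 20.0) * sat((1.0 - abs(v)) * 20.0) * sat(world_normal_y)

u, v = ripple_coords(10.0, 5.0, 0.0, 0.0, 64.0)
assert edge_mask(u, v, 1.0) == 1.0      # well inside the area: full strength
assert edge_mask(1.0, 0.0, 1.0) == 0.0  # right at the edge: fully faded
```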

Water Shader



The water texture is packed similarly to the ripple texture. Normal information is in the red and green channels, foam is in the blue channel, and height is in the alpha channel. This texture is uncompressed and non SRGB.
Shader "Custom/Water" {
 Properties {
  _Color ("Color Dark", Color) = (1,1,1,1)
  _Color2("Color Light", Color) = (1,1,1,1)
  _FoamColor("Foam Color", Color) = (1,1,1,1)
  _MainTex ("Albedo (RGB)", 2D) = "white" {}
  _Scrolling("Water Scrolling", Vector) = (0,0,0,0)
  _Glossiness ("Smoothness", Range(0,1)) = 0.5
  _Metallic ("Metallic", Range(0,1)) = 0.0
 }
 SubShader {
  Tags { "RenderType"="Opaque" }
  LOD 200

  CGPROGRAM
  // Physically based Standard lighting model, and enable shadows on all light types
  #pragma surface surf Standard fullforwardshadows

  // Use shader model 3.0 target, to get nicer looking lighting
  #pragma target 3.0

  // multi compile for turning ripples on and off
  #pragma multi_compile _ DYNAMIC_RIPPLES_ON

  // the ripple include file that has the functions for sampling the ripple texture
  #include "RippleInclude.cginc"

  // the variables 
  sampler2D _MainTex;
  half _Glossiness;
  half _Metallic;
  fixed4 _Color; // dark color of the water
  fixed4 _Color2; // light color of the water
  fixed4 _FoamColor; // the color of the foam
  float4 _Scrolling; // X and Y scrolling speed for the water texture, Z and W is scrolling speed for the second water texture

  struct Input {
   float2 uv_MainTex; // needed for the water texture coords
   float3 worldNormal; // needed for WorldNormalVector() to work
   float3 worldPos; // we need the world position for sampling the ripple texture
   INTERNAL_DATA // also needed for WorldNormalVector() to work and any other normal calculations
  };

  // Add instancing support for this shader. You need to check 'Enable Instancing' on materials that use the shader.
  // See https://docs.unity3d.com/Manual/GPUInstancing.html for more information about instancing.
  // #pragma instancing_options assumeuniformscaling
  UNITY_INSTANCING_BUFFER_START(Props)
   // put more per-instance properties here
  UNITY_INSTANCING_BUFFER_END(Props)

Just setting up all the variables and telling the shader what information it's going to need to pass to the surface function. Make sure to include RippleInclude.cginc and enable multi_compile for DYNAMIC_RIPPLES_ON! Hopefully the comments explain what everything is.

  void surf (Input IN, inout SurfaceOutputStandard o) {
   
   // Albedo comes from a texture tinted by color
   fixed4 c = tex2D (_MainTex, IN.uv_MainTex + frac(_Time.yy * _Scrolling.xy));
   fixed4 c2 = tex2D(_MainTex, IN.uv_MainTex * 1.3 + frac(_Time.yy * _Scrolling.zw));

   // blend textures together
   c = (c + c2) * 0.5;

   // get the normal, foam, and height params
   half3 normal = half3(c.x, c.y, 1) * 2.0 - 1.0;
   normal.z = sqrt(1 - saturate(dot(normal.xy, normal.xy)));
   half foam = smoothstep(0.4, 0.6, c.z * c.z );
   half height = c.w;

For the surface function start by sampling the textures and blending them together. Then generate the normal, foam, and height which are the 3 things that will drive the overall look of the water.

#ifdef DYNAMIC_RIPPLES_ON
   // get the world normal, tangent, and binormal for masking the ripples and converting world normals to tangent normals
   float3 worldNormal = WorldNormalVector(IN, float3(0, 0, 1));
   float3 worldTangent = WorldNormalVector(IN, float3(1, 0, 0));
   float3 worldBinormal = WorldNormalVector(IN, float3(0, 1, 0));
   
   // sample the ripple texture
   half4 ripples = WaterRipples(IN.worldPos, worldNormal);

   // convert normal from world space to local space and add to surface normal
   // we only need the X and Y since this is an overlay for the existing water normals
   float2 rippleNormal = 0;
   rippleNormal.x = dot(worldTangent, half3(ripples.x, 0, ripples.y));
   rippleNormal.y = dot(worldBinormal, half3(ripples.x, 0, ripples.y));
   
   // add the normal foam and height contributions
   normal.xy += rippleNormal;
   foam += ripples.z * 5.0;
   height += ripples.w;
#endif

Put all the ripple related stuff inside the #ifdef so it won't run unless the ripple renderer is enabled. We have to get the world normal, tangent, and binormal for the surface in order to convert the ripple normals from world space to tangent space, so they can be blended properly with the tangent space normals of the water surface. Then all the ripple information gets added to the water information; I boosted the foam a little since that's the driving visual for this toony water.
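Those two dot products that move the ripple normal into tangent space can be sketched on their own (plain Python with hypothetical frame values, not shader code):

```python
def world_to_tangent_xy(ripple_x, ripple_y, world_tangent, world_binormal):
    # the ripple normal overlay lives in the world XZ plane: (x, 0, z)
    world_offset = (ripple_x, 0.0, ripple_y)
    dot = lambda a, b: sum(ai * bi for ai, bi in zip(a, b))
    # project it onto the surface tangent frame, as in the shader above
    return dot(world_tangent, world_offset), dot(world_binormal, world_offset)

# for a flat, axis-aligned water plane the frame is the identity,
# so the world offset passes through unchanged
assert world_to_tangent_xy(0.3, -0.1, (1, 0, 0), (0, 0, 1)) == (0.3, -0.1)
```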

   // tighten the foam transition for a toony look
   foam = smoothstep(0.45, 0.55, foam);

   // modify the height ( which is used as a light dark color mask ) by the normal
   height = height + (normal.x * 0.5) - (normal.y * 0.5);

   // smooth step the height to get a tighter transition
   height = smoothstep(0.50, 0.55, height);

   // blend between the light and dark water color based on the height
   float3 waterColor = lerp( _Color.rgb, _Color2.rgb, height );

   // blend between the water color and the foam color
   waterColor = lerp(waterColor, _FoamColor.rgb, foam);

Once everything that will contribute to the water has contributed, we can blend everything together to get the final water color.

   // half the color to the albedo for shadow
   o.Albedo = waterColor * 0.5;
   // half the color to the emissive for consistency
   o.Emission = waterColor * 0.5;
   o.Metallic = _Metallic;
   o.Smoothness = _Glossiness;
   // normal is flat but still needs to be set to get the INTERNAL_DATA
   o.Normal = float3(0,0,1);
   o.Alpha = c.a;
  }
  ENDCG
 }
 FallBack "Diffuse"
}
Half the water color gets sent to the Albedo to receive lighting and shadow, and the other half gets sent to the Emission to give the water a consistent look. While the Normal is flat, it still needs to be set to keep the compiler from excluding things in the INTERNAL_DATA.

When making a ripple particle system, be sure to set its layer to "Ripples" so it will be drawn by the ripple camera. Also set the particle rotation to 0 and the render mode to "Billboard" or "Horizontal Billboard" so the normal overlay from the ripple texture stays in world space. The main camera in the scene should have the "Ripples" layer disabled in its culling mask.

Performance





One slight annoyance with this technique is that it comes with a little overhead from the ripple camera. You can see that the ripple camera takes about 0.3 milliseconds to render, which is about the same amount of time the main camera takes to render this entire simple scene. The actual work of rendering the ripple particles only takes 0.02 milliseconds (varying with the number of ripples being rendered), but you always pay roughly 0.25 milliseconds of overhead for the second camera. You can get this precious quarter millisecond back by adding a command buffer to the main camera and rendering the ripples manually. That's a topic for another time though.

Monday, August 27, 2018

Lucky Bioms: Lava

The lava biome is another procedural-ish background that goes on forever.  The main components are the tessellated, displaced ground plane that makes up the rocky lava surface, and a second duplicate plane that uses a geometry shader for the lava bubbles.  On top of that there are some smoke particles that spawn in a large circle around the camera and some swirling ember particles.



The mesh is made up of triangulated quads that are 10x10 units square.

Terrain Mesh

These meshes and particles follow the camera on the X and Z, snapping to every 10 units.  This prevents the vertex swimming that can occur when a displacement map moves along tessellated geometry.

Following Camera

For the surface of the lava, I started with a pretty simple terrain made in World Machine.

World Machine Terrain

Simple Terrain Layout

This outputs the normal, height, and sediment channels, which I combined into a single texture and made tileable using Materialize.  This texture was going to be viewed at a very low resolution, so I added a bit of blur to the whole thing, just enough so that there are no single-pixel details.  The normals are also used for the flow direction of the lava streams, and the sediment is used as a mask for the streams.

Terrain

The other textures used are a tiling rock, made by taking some of the existing rock assets, arranging them in a tiling pattern, and rendering out the normals.  I generated an edge map, height map, and ambient occlusion map in Materialize and combined them all into a single texture.  The edge and AO maps are combined in the blue channel and used as an overlay for the rock color in the shader.

The tessellation is edge-length based with an adjustable cap.  This means the mesh topology changes as the camera gets nearer to and further from it, which can cause vertex swimming as the tessellated vertices slide over the displacement map.  An easy way to hide this is to use the mip maps of the displacement textures: this shader uses mip 0 at 50 meters and lerps to mip 8 at 400 meters.  There are better, mathier ways to determine the best mip level to use, but this was quick and easy and, along with some extra fog on the ground, hid most of the problems.
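That distance-to-mip mapping is just a clamped lerp. A small sketch with the same 50 meter / 400 meter / mip 8 numbers:

```python
def displacement_mip(distance, near=50.0, far=400.0, max_mip=8.0):
    # 0 at `near`, max_mip at `far`, clamped outside that range
    t = min(max((distance - near) / (far - near), 0.0), 1.0)
    return t * max_mip

assert displacement_mip(50.0) == 0.0
assert displacement_mip(400.0) == 8.0
assert displacement_mip(225.0) == 4.0  # halfway between gives the middle mip
```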

Rocks

The flowing lava and lava pool textures have normals in their red and green channels and a value overlay in the blue, same as the rock, but have a glow mask in the alpha channel.  These two textures don't contribute to the displacement: a moving displacement map would cause vertex swimming, and skipping these samples makes the vertex shader a bit cheaper.

Flowing Lava

Lava Pool

The height from the terrain is added to the height from the rocks, masked by the inverse of the sediment map.  The flowing lava is masked by the sediment map, its flow direction is generated from the blurry terrain normal map, and its intensity is scaled by the sediment map.  This makes the flow faster in the middle of a stream and slower at the edges.


Lava Flow

The lava pool is masked by a height cutoff and can be adjusted to make larger or smaller pools.


Lava Pool Height Adjust

The lava bubbles are made using a geometry shader to turn the triangles into particles.  I rendered out some flip book animations of fluid sims to create the splashes.  The alpha is changed to a distance field so the splashes will have a smoother cutoff when they get alpha clipped.


Bubble Particle 25%

Bubble Particle 100%

The displacement code needs to be copied into the bubble shader so that the bubbles know what height they should be at.  They also get masked out based on whether there is lava where they spawned.  And finally, since they have access to the flow map, they get a little nudge in that direction so it looks like they are following the lava streams.


Lava Bubble Flow

An interesting, and at the time of this implementation very much undocumented, trick you can do with opaque shaders is to adjust the depth they write.  This can help a lot when you have opaque particles clipping through opaque geometry.  By adjusting the depth of the particle ( screenDepth + (1.0 - alpha) * adjustAmount ) you can ensure there is never a hard line where the flat particle geo intersects the world.  In Super Lucky's Tale, custom surface shaders can adjust their depth by defining _DEPTH_ADJUST and passing the adjustment through the DepthAdjust parameter that is part of the custom lighting surface struct.
 
// depth adjust fragment struct
struct fragOut
{
 fixed4 color : SV_Target;
 float depth : SV_DEPTH;
};

// fragment shader
#ifdef _DEPTH_ADJUST
fragOut frag_surf (v2f_surf IN) {
#else
fixed4 frag_surf (v2f_surf IN) : SV_Target {
#endif

 #ifdef _DEPTH_ADJUST
  fragOut output;
  output.color = c; // color from the surface function
  output.depth = ( ( 1.0 / ( IN.screenPos.z + o.DepthAdjust ) ) - _ZBufferParams.w ) / _ZBufferParams.z;
  return output;
 #else
  return c;
 #endif
}
The smoke particles turned out to be far more work than they should have been.  At the time, Super Lucky's Tale was going to be a VR game, and one really annoying thing that happens with billboard particles in VR is that when you tilt your head to the side, the particles tilt with you.  I tried using vertically aligned particles, but those were horizontally aligned with the camera view direction and would whip around and cut through the world unnaturally when you turned your head.  So I wrote a big awful custom particle shader just for this smoke that was vertically aligned but pointed at the camera position.  These features are now standard in the particle editor, so hopefully no one will ever have to deal with that again.


Smoke Particle
Smoke Particle Distortion


As for the smoke pixel shader, it uses some fake lighting from the normal map and a distortion texture that gives it a swirly dissolve like the smoke from Breath of the Wild.  Drop some embers on there and it's done!

Flying Around


That's about it!  Phoenix FD was used for the fluid sims, World Machine for the terrain, and lots of little things were done with Materialize, which you can download for free here.

Monday, August 6, 2018

Lucky Bioms: Clouds

To try and make things easy for the design team, the backgrounds in Super Lucky's Tale are template scenes that go on infinitely.  The designers can then place down platforms and art without worrying too much about what is going on in the background.  These template scenes are usually a collection of assets that follow the camera around and use world space shaders and whatnot to make a procedural-ish, never-ending background.  In the Lucky Bioms series I'll go over a few of these background templates and explain a bit about how they were made.


The first biome I made was the clouds.  This biome is in one of the early public demos for Super Lucky's Tale.  The main reference was the Mario Kart 8 Cloudtop Cruise race course, which looks really cool but is static.  Static in Mario Kart is not so bad, since you are racing around the course and stuff is usually flying past you pretty quickly.  But for the slower pace of Lucky's Tale, things need to have a little motion all the time, so a tessellated displacement shader was the way to go.



3D Tiling Worley Noise

To make puffy clouds you could use a couple of displacement maps and scroll them on top of each other, but you would end up seeing the pattern and it might look a bit meh.  So I decided to try out 3D Worley noise.  Worley noise is expensive to calculate in 2D in a shader, and 3D would be even worse, so with the performance budget I thought it best to store the noise in a 3D texture.  Using textures often produces softer results than calculating each pixel, and since Lucky's Tale has a soft look that was even more reason to store the displacement in a texture.

Now I don't know much about Worley noise or making it tile, but luckily I found this Shadertoy by David Hoskins which covered most of the work.  From there it was pretty simple to extend it to 3D.
 
float worley3d(float3 p) {
 float d = 100.0;
 for (int xo = -1; xo <= 1; ++xo){
  for (int yo = -1; yo <= 1; ++yo){
   for (int zo = -1; zo <= 1; ++zo){
    float3 tp = floor(p) + float3(xo, yo, zo);
    d = min( d, length( p - tp - noise3d(tp) ) );
   }
  }
 }
 return cos(d); // use cosine to get round gradient
 //return 1.0 - d;
}

The first implementation was in a C# script and took about 15 seconds to generate a 32x32x32 noise texture; it was also a bit harsh on the edges where cells met.  I moved the noise generation to a compute shader and could generate a 128x128x128 noise texture with 5x5x5 supersampling instantly!  Below is a shader visualizing the noise texture in world space.  As the cube is dragged around you can see how cells appear and disappear.
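For reference, here's a quick CPU port of the worley3d function above, in Python rather than HLSL, ignoring the tiling wrap for brevity, and with a hypothetical sin-based hash standing in for noise3d(), which isn't shown in the post:

```python
import math

def noise3d(cell):
    # hypothetical stand-in hash: a stable pseudo-random offset in [0,1)^3 per cell
    n = cell[0] + cell[1] * 57.0 + cell[2] * 113.0
    h = lambda k: (math.sin(k) * 43758.5453) % 1.0
    return (h(n), h(n + 1.0), h(n + 2.0))

def worley3d(p):
    # distance to the nearest feature point in the 3x3x3 neighborhood of cells
    d = 100.0
    for xo in (-1, 0, 1):
        for yo in (-1, 0, 1):
            for zo in (-1, 0, 1):
                cell = (math.floor(p[0]) + xo,
                        math.floor(p[1]) + yo,
                        math.floor(p[2]) + zo)
                feature = [cell[i] + noise3d(cell)[i] for i in range(3)]
                d = min(d, math.dist(p, feature))
    return math.cos(d)  # cosine for a round gradient, as in the shader

assert -1.0 <= worley3d((0.5, 0.5, 0.5)) <= 1.0
assert worley3d((1.2, 3.4, 5.6)) == worley3d((1.2, 3.4, 5.6))  # deterministic
```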



Cloud Shader

The noise texture is mapped to surfaces using world space coordinates and is scrolled "through" a surface rather than across it.  As the surface samples different layers of the noise, the cloud cells expand and shrink in a way that scrolling 2D textures just can't replicate.

One thing about displacing geometry in a shader is that the normals aren't changed to reflect the displacement.  To fix this, tangent normals are generated in the vertex shader by sampling the noise map slightly offset in the world tangent and world binormal directions and using the differences to build tangent normals.  These new tangent normals are then passed through to the pixel shader, where they are treated like a tangent space normal map.  The normals are not super accurate, but they are nice and soft and fit well with the soft cartoony look of the game.



This is the displacement function from the cloud vertex shader.  It builds the tangent normals from the displacement map, adds the displacement values together, and updates the world normals and occlusion.  The tangent normals and occlusion get passed to the pixel shader in the vertex color.
 
void displaceStuff ( inout float3 worldPos, inout float3 worldNormal, float3 worldTangent, float3 worldBinormal, inout float3 localNormal, inout float occlusion, float tile ){

 // Main tex coords for cloud displacement
 float3 texCoords = worldPos.xyz * 0.01 * _Tiling1.xyz * tile + _Time.y * _Tiling2.xyz;
 float4 disp = tex3Dlod (_WorleyNoiseTex, float4( texCoords, 0) );

 // get the coords for the tangent and binormal displacement
 float3 texCoordsTan = texCoords + normalize( worldTangent.xyz ) * 0.05;
 float3 texCoordsBiNorm = texCoords + normalize( worldBinormal.xyz ) * 0.05;

 // sample the displacement for the tangent normal
 float4 dispTan = tex3Dlod (_WorleyNoiseTex, float4( texCoordsTan, 0) );
 float4 dispBiNorm = tex3Dlod (_WorleyNoiseTex, float4( texCoordsBiNorm, 0) );

 // get tangent normal offset from displacements
 float2 localNormOffset = float2( disp.x - dispTan.x, disp.x - dispBiNorm.x );

 // scale the normal by the one over the tiling value
 float oneOverTile = ( 1.0 / tile );
 localNormal.xy += localNormOffset * _BumpIntensity * oneOverTile;
 occlusion *= pow( disp.x, oneOverTile );

 // set up the tSpace thingy for converting tangent directions to world directions
 float3 tSpace0 = float3( worldTangent.x, worldBinormal.x, worldNormal.x );
 float3 tSpace1 = float3( worldTangent.y, worldBinormal.y, worldNormal.y );
 float3 tSpace2 = float3( worldTangent.z, worldBinormal.z, worldNormal.z );

 // update the world normal
 worldNormal.x = dot(tSpace0, localNormal);
 worldNormal.y = dot(tSpace1, localNormal);
 worldNormal.z = dot(tSpace2, localNormal);
 worldNormal = normalize( worldNormal );

 // push the world position in by the average value of the displacement map
 worldPos -= worldNormal * 0.7937 * _Displacement * oneOverTile;
 // push the world position out based on the new world normal
 worldPos += worldNormal * disp.x * _Displacement * oneOverTile;

}
The displacement happens along those new normals to make the clouds puff out more instead of displace straight up.



This unfortunately splits the mesh on texture seams, so you have to hide the seams of anything in the world you want to put the shader on.



The Cloud Plane

The cloud ground plane is a round mesh made up of 1x1 meter squares.


This mesh follows the camera on the X and Z, with its position snapped to the closest meter.  This ensures that while the mesh is following you, you never see it pop abruptly into place, because the topology doesn't change as it moves.


Obviously this mesh can't be tessellated and stretch off into the distance, as that would be too expensive.  Instead, the 3D Worley noise map is used in the global fog function and sampled at the level the ground plane sits at.  From that, the approximate height of the displaced cloud plane can be figured out, and objects beyond the cloud plane mesh can be faded to the solid fog color where the cloud plane would have intersected them.  The cloud plane itself fades to the solid fog color at a distance, so you rarely see the edge of it.


There is another shader for background clouds that supports transmission from the sun light and has a simpler pixel shader.  This shader also has a bit of code that pushes it back towards the camera far plane to keep it from drawing in front of the world.


Monday, July 23, 2018

Geometry Shader Adventures, Mesh Triangle to Particle

Geometry shaders are pretty cool because they let you turn a triangle into just about anything, so long as the output doesn't exceed 1 kilobyte (don't quote me on that). Here is a simple geometry shader that turns all triangles into screen-facing quads and gives them some particle-like motion that can be driven with a few parameters. If you want to fire up the above example in Unity, you can download the asset package below. It was exported with 2017.4.3f1 but should work for other versions too, since it's just an unlit shader.
Shader "Unlit/MeshToParticle"
{
 Properties
 {
  _MainTex ("Texture", 2D) = "white" {}
  _Color ("Color", Color) = (1,1,1,1)
  _Factor ("Factor", Float) = 2.0

  _Ramp ("Ramp", Range(0,1)) = 0
  _Size ("Size", Float) = 1.0
  _Spread ("Random Spread", Float) = 1.0
  _Frequency ("Noise Frequency", Float) = 1.0
  _Motion ("Motion Distance", Float) = 1.0

  _InvFade ("Soft Particles Factor", Range(0.01,3.0)) = 1.0
 }

These are just the parameters that will control the particles.

_MainTex is the particle texture.
_Color is the color of the particle, this gets multiplied by vertex color.
_Factor is how bright the particles should be (for boosting values over 1).
_Ramp drives the lifetime of the particles; sliding it back and forth will "play" them.
_Size is the size of the particles in world space.
_Spread is how far apart the particles will move in a random direction.
_Frequency is the frequency of curl like noise that will be added to the particles over their lifetime.
_Motion is how far the particles will travel.
_InvFade is for depth bias blending with opaque objects.

 SubShader
 {
  Tags { "Queue"="Transparent" "RenderType"="Transparent"}
  Blend One OneMinusSrcAlpha
  ColorMask RGB
  Cull Off Lighting Off ZWrite Off
  LOD 100

  Pass
  {
   CGPROGRAM
   #pragma vertex vert
   #pragma geometry geom
   #pragma fragment frag
   #pragma target 4.0
   #pragma multi_compile_particles

Defining the various shaders. The geometry shader runs between the vertex and pixel shaders, like so: Vertex Shader -> Geometry Shader -> Pixel Shader

   #include "UnityCG.cginc"

   sampler2D _MainTex;
   float4 _MainTex_ST;
   float4 _Color;
   float _Factor;

   float _Ramp;
   float _Size;
   float _Frequency;
   float _Spread;
   float _Motion;

   sampler2D_float _CameraDepthTexture;
   float _InvFade;

Just defining the variables to use.

   // data coming from unity
   struct appdata
   {
    float4 vertex : POSITION;
    float4 texcoord : TEXCOORD0;
    fixed4 color : COLOR;
   };

This is the data that Unity will feed to the vertex shader. The vertex shader isn't going to do much work, so we will use the same struct to send information to the geometry shader.

   // vertex shader mostly just passes information to the geometry shader
   appdata vert (appdata v)
   {
    appdata o;

    // change the position to world space
    float3 worldPos = mul( unity_ObjectToWorld, v.vertex ).xyz;
    o.vertex = float4(worldPos,1);

    // pass these through unchanged
    o.texcoord = v.texcoord;
    o.color = v.color;

    return o;
   }

The vertex shader just transforms the vertex position to world space.

   // information that will be sent to the pixel shader
   struct v2f {
    float4 vertex : SV_POSITION;
    fixed4 color : COLOR;
    float2 texcoord : TEXCOORD0;
    #ifdef SOFTPARTICLES_ON
     float4 projPos : TEXCOORD1;
    #endif
   };

This is the data that the geometry shader will send to the pixel shader. It looks like the output a vertex shader would normally produce.

   // geometry vertex function
   // this will all get called in geometry shader
    // it's nice to keep this stuff in its own function
   v2f geomVert (appdata v)
   {
    v2f o;
    o.vertex = UnityWorldToClipPos(v.vertex.xyz);
    o.color = v.color;
    o.texcoord = TRANSFORM_TEX(v.texcoord, _MainTex);
    #ifdef SOFTPARTICLES_ON
     o.projPos = ComputeScreenPos (o.vertex);
     // since the vertex is already in world space we need to 
      // skip some of the stuff in the COMPUTE_EYEDEPTH function
     // COMPUTE_EYEDEPTH(o.projPos.z);
     o.projPos.z = -mul(UNITY_MATRIX_V, v.vertex).z;
    #endif

    return o;
   }

This function is called in the geometry shader for each vertex it generates. It does most of the work that a vertex shader would normally do, so I like to think of it as a vertex shader for the geometry shader.

   // geometry shader
   [maxvertexcount(4)]
    void geom(triangle appdata input[3], inout TriangleStream<v2f> stream )
   {
    // get the values for the centers of the triangle
    float3 pointPosWorld = (input[0].vertex.xyz + input[1].vertex.xyz + input[2].vertex.xyz ) * 0.3333333;
    float4 pointColor = (input[0].color + input[1].color + input[2].color ) * 0.3333333;
    float4 uv = (input[0].texcoord + input[1].texcoord + input[2].texcoord ) * 0.3333333;

This is the actual geometry shader, the real meat and potatoes of the whole effect. The geometry shader gets sent a triangle (an array of 3 appdata structs), so to start we get values for the center of the triangle by averaging its 3 points.

    // lifetime based on tiling and ramp parameters
    half lifeTime = saturate( uv.x + lerp( -1.0, 1.0, _Ramp ) );

    // fade particle on and off based on lifetime
    float fade = smoothstep( 0.0, 0.1, lifeTime);
    fade *= 1.0 - smoothstep( 0.1, 1.0, lifeTime);

    // don't draw invisible particles
    if( fade == 0.0 ){
     return;
    }

    // multiply color alpha by fade value
    pointColor.w *= fade;

The particle lifetime is based on the uv.x value and is biased by the ramp value. This makes the lifetime go from 0 to 1 across the texture coordinates. A fade value is generated from the lifetime, and if that value is 0.0 (before or after the particle's lifetime) we return nothing, skipping the pixel shader and the rest of the work for this particle.
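The fade envelope is easier to see with the math pulled out of HLSL. Here's a Python sketch of the same smoothstep-in/smoothstep-out logic (the `smoothstep` helper mirrors the HLSL intrinsic; the function and parameter names are just for illustration):

```python
def smoothstep(edge0, edge1, x):
    # Mirror of the HLSL smoothstep intrinsic
    t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def particle_fade(uv_x, ramp):
    # lifetime based on uv.x, biased by the ramp parameter: lerp(-1, 1, ramp)
    life = min(max(uv_x + (-1.0 + 2.0 * ramp), 0.0), 1.0)
    # fade in over the first 10% of life, fade out over the rest
    fade = smoothstep(0.0, 0.1, life)
    fade *= 1.0 - smoothstep(0.1, 1.0, life)
    return fade

# A particle is invisible before and after its lifetime
print(particle_fade(0.5, 0.0))  # life = 0 -> 0.0
print(particle_fade(0.5, 1.0))  # life = 1 -> 0.0
```

Sliding `ramp` from 0 to 1 sweeps every particle through its fade-in, full visibility, and fade-out, staggered across the mesh by uv.x.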

    // random number seed from uv coords
    float3 seed = float3( uv.x + 0.3 + uv.y * 2.3, uv.x + 0.6 + uv.y * 3.1, uv.x + 0.9 + uv.y * 9.7 );
    // random number per particle based on seed
    float3 random3 = frac( sin( dot( seed * float3(138.215, 547.756, 318.269), float3(167.214, 531.148, 671.248) ) * float3(158.321,456.298,725.681) ) * float3(158.321,456.298,725.681) );
    // random direction from random number
    float3 randomDir = normalize( random3 - 0.5 );

We generate a random value for each particle. This could be used to add variability to all kinds of things like size, color, or rotation, but here we are just using it to get a random direction.
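The frac(sin(dot(...))) pattern is a classic shader-style hash: it turns a seed into a pseudo-random 0-1 value with no texture lookup, and the same seed always gives the same result, so particles are stable frame to frame. A simplified Python sketch of the idea (the constants are arbitrary, as they are in the shader, and this is a 2D reduction of the shader's 3D version):

```python
import math

def hash11(seed):
    # shader-style hash: frac(sin(seed * K1) * K2)
    x = math.sin(seed * 138.215) * 158.321
    return x - math.floor(x)  # frac()

def random_dir2(seed):
    # two hashed values -> a normalized 2D direction, like randomDir in the shader
    x = hash11(seed) - 0.5
    y = hash11(seed + 0.731) - 0.5
    length = math.hypot(x, y) or 1.0
    return (x / length, y / length)

# Deterministic: the same seed always produces the same direction
print(random_dir2(0.37) == random_dir2(0.37))  # True
```

In the shader the seed comes from the uv coordinates, which are unique per triangle, so each particle gets its own stable direction.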

    // curl-ish noise for making the particles move in an interesting way
    float3 noise3x = float3( uv.x, uv.x + 2.3, uv.x + 5.7 ) * _Frequency;
    float3 noise3y = float3( uv.y + 7.3, uv.y + 9.7, uv.y + 12.3 ) * _Frequency;
    float3 noiseDir = sin(noise3x.yzx * 5.731 ) * sin( noise3x.zxy * 3.756 ) * sin( noise3x.xyz * 2.786 );
    noiseDir += sin(noise3y.yzx * 7.731 ) * sin( noise3y.zxy * 5.756 ) * sin( noise3y.xyz * 3.786 );

We also generate some noise with sine functions seeded from the uvs. This creates some wispy, curl-like motion for the particles.
    // add the random direction and the curl direction to the world position
    pointPosWorld += randomDir * lifeTime * _Motion * _Spread;
    pointPosWorld += noiseDir * lifeTime * _Motion;

Then add the random and noise motion to the particle world position

    // the up and left camera direction for making the camera facing particle quad
    float3 camUp = UNITY_MATRIX_V[1].xyz * _Size * 0.5;
    float3 camLeft = UNITY_MATRIX_V[0].xyz * _Size * 0.5;

    // v1-----v2
    // |     / |
    // |    /  |
    // |   C   |
    // |  /    |
    // | /     |
    // v3-----v4

    float3 v1 = pointPosWorld + camUp + camLeft;
    float3 v2 = pointPosWorld + camUp - camLeft;
    float3 v3 = pointPosWorld - camUp + camLeft;
    float3 v4 = pointPosWorld - camUp - camLeft;

The camera up and left direction are hidden in the view matrix and we can use them to generate the positions for the 4 vertices.

    // send information for each vertex to the geomVert function

    appdata vertIN;
    vertIN.color = pointColor;

    vertIN.vertex = float4(v1,1);
    vertIN.texcoord.xy = float2(0,1);
    stream.Append( geomVert(vertIN) );

    vertIN.vertex = float4(v2,1);
    vertIN.texcoord.xy  = float2(1,1);
    stream.Append( geomVert(vertIN) );

    vertIN.vertex = float4(v3,1);
    vertIN.texcoord.xy  = float2(0,0);
    stream.Append( geomVert(vertIN) );

    vertIN.vertex = float4(v4,1);
    vertIN.texcoord.xy  = float2(1,0);
    stream.Append( geomVert(vertIN) );

   }
Now we can send some updated appdata to the geomVert function and append the result. The color will be the same for all the verts in the quad but the position and texture coordinates need to be updated before sending the appdata to the geomVert function.

stream.Append() adds a vertex to a triangle strip. The first 3 appends create the first triangle; a fourth append creates a second triangle from that vertex and the previous 2 vertices. This is known as a triangle strip (a super old-school term), and you can continue appending vertices, each one forming a new triangle with the previous 2 verts. You can make hair or blades of grass this way.
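The strip behavior is easy to sketch outside the shader: N appended vertices produce N - 2 triangles, each new vertex forming a triangle with the previous two. A small Python illustration (names are mine, not from the shader):

```python
def strip_to_triangles(verts):
    # Each vertex after the first two forms a triangle with the previous two,
    # which is exactly what repeated stream.Append() calls build up.
    return [(verts[i], verts[i + 1], verts[i + 2]) for i in range(len(verts) - 2)]

# The quad from the geometry shader: 4 appends -> 2 triangles
quad = ["v1", "v2", "v3", "v4"]
print(strip_to_triangles(quad))  # [('v1', 'v2', 'v3'), ('v2', 'v3', 'v4')]
```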

   // simple particle like pixel shader
   fixed4 frag (v2f IN) : SV_Target
   {
    #ifdef SOFTPARTICLES_ON
     float sceneZ = LinearEyeDepth (SAMPLE_DEPTH_TEXTURE_PROJ(_CameraDepthTexture, UNITY_PROJ_COORD(IN.projPos)));
     float partZ = IN.projPos.z;
     IN.color.w *= saturate (_InvFade * (sceneZ-partZ));
    #endif

    // sample the texture
    fixed4 col = tex2D(_MainTex, IN.texcoord);
    col *= _Color;
    col *= IN.color;
    col.xyz *= _Factor;

    // premultiplied alpha
    col.xyz *= col.w;

    return col;
   }
   ENDCG
  }
 }
}
The pixel shader looks like a simple particle shader because it pretty much is. There's lots more you can do with geometry shaders: you could add a tessellation shader and turn each mesh triangle into 100+ particles, or other crazy stuff. I hope this gives you some insight into geometry shaders and helps get you started making crazy stuff of your own.

Monday, July 16, 2018

Lucky Swooshes

While working on Super Lucky's Tale, one thing I thought was solved in a pretty cool way was the swooshes. Swooshes are used for collecting coins, spawning certain enemies, and Lucky's tail swipe effect. For this type of effect you want a fire-and-forget solution, and you also want it to be super predictable: it should do the same thing every time and keep a consistent look over its lifetime. An obvious approach might be to use a trail attached to an object that moves toward a target, but a few issues arise with that. If the target moves, the trail could end up with a funny shape, and it can be difficult to figure out exactly when the swoosh will arrive at the target, which doesn't help with timing when to spawn things.
The best solution turned out to be a mesh that was a strip of polygons with the beginning and end at (0,0,0), with a trail texture that scrolled across it. A script on the swoosh object informed the shader where the target was, and the vertex shader moved the end of the swoosh over to the target's position. The script could also orient the mesh to face the target. This allowed for lots of variation in the shape of the swooshes and also guaranteed that the swoosh would reach the target exactly when it was supposed to and always have the intended shape.
A swoosh model could be made with all kinds of twisting ribbons and then deformed along a path to make all kinds of fun shapes.

The important part of the vertex shader that moves the end of the swoosh is below:
// For screen swooshes, smoosh the mesh flat on the Y axis
v.vertex.y *= 1.0 - _ScreenSquish;

// Add some random offset on the X and Z axes for screen swooshes
float2 divergence = _Divergence.xy * saturate( sin ( v.uv.x * UNITY_PI ) );
v.vertex.xz += divergence * _ScreenSquish;

// Find the start position of the swoosh
float3 worldOrigin = mul( unity_ObjectToWorld, float4(0,0,0,1) ).xyz;

// Now figure out the end position relative to the start position
float3 endOffset = _TargetPos - worldOrigin;

// Get the world position of the vertex
float3 worldPos = mul( unity_ObjectToWorld, v.vertex ).xyz;

// Add the end offset to the vertex world position, masked by the uv coordinates
worldPos += endOffset * v.uv.x;

// Transform the world position to screen position
o.vertex = mul(UNITY_MATRIX_VP, float4(worldPos,1));

// Smoosh the swoosh against the screen if it is a screen swoosh
o.vertex.z = lerp( o.vertex.z, o.vertex.w, _ScreenSquish * 0.99 );
_ScreenSquish is a float from 0-1 passed in from script, telling the swoosh if it should be pressed against the screen, like the coin collect swooshes. This keeps it from being occluded by any opaque geometry while still being attached to a point in the world: it still follows a world space position, but that position is attached to the screen.
_Divergence is a float2 passed in from script that adds some offset to the middle of the swoosh so screen swooshes don't overlap and follow a bit of a random path.

The pixel shader is pretty simple, here's the basic swoosh texture lookup with a bit of fade out on either end:
// texture coords for swoosh texture
float2 swooshUV = saturate( IN.uv * _Tiling.xy + float2( lerp( _Tiling.z, _Tiling.w, _Ramp ), 0 ) );
half4 col = tex2D(_MainTex, swooshUV ) * _Color;

// start and end fade in
half edgeFade = saturate( ( 1.0 - abs( IN.uv.x * 2 - 1 ) ) * (1.0 / _FadeInOut ) );
edgeFade = smoothstep(0,1,edgeFade);

// multiply together
col *= edgeFade;
_Ramp is passed in from script to control the swoosh travel progression.
_Tiling is set in the material and allows control over the length of the swoosh and how _Ramp affects the swoosh travel.
_FadeInOut lets you set how much of the ends of the swoosh to fade out.

A material property block can be used to send the information right to the swoosh renderer without messing with the materials at all like so:
MaterialPropertyBlock MPB = new MaterialPropertyBlock ();
Renderer thisRenderer = this.GetComponent<Renderer> ();

if (targetScreen) {
 MPB.SetFloat ("_ScreenSquish", 1.0f);
 MPB.SetVector ("_Divergence", new Vector2 (Random.Range (-screenDivergence, screenDivergence), Random.Range (-screenDivergence, screenDivergence)));
}

MPB.SetFloat("_Ramp", ramp);

thisRenderer.SetPropertyBlock (MPB);
Now a swoosh can be spawned anywhere; its script will drive _Ramp over a specified time, so you know exactly when it will reach the end and what it will look like along the way.
Even when moving the camera a screen swoosh will always start at its world position and end at the screen position, the shape is always smooth and movement always fluid.
Lucky's tail swipe uses the same shader with an extra texture overlay to make it look more wispy. The tail swoosh is spawned at Lucky's position and rotation and then the end position is just updated to be Lucky's current position.
If Lucky is jumping, the tail swoosh will follow him in the air while maintaining its smooth shape. It's a subtle effect but helps tie it to Lucky.
But we're not done yet. You can get really fancy with a geometry shader by turning each triangle into a little particle. Here one of the swoosh ribbons has the swoosh particle shader on it. This shader turns each triangle into its own quad and gives it some movement over its lifetime. Because this is just a shader, you can play the whole effect backwards. And like the regular swooshes, the particle swooshes update their positions when the target moves.
How exactly that all works may be a post for another day though.

Monday, July 9, 2018

Dark and Stormy


This repo is available on GitHub: github.com/SquirrelyJones/DarkAndStormy
In this post I'll break down some of what's going on in this funky skybox shader for Unity. Some of the techniques used are Flow Mapping, Steep Parallax Mapping, and Front To Back Alpha Blending. There are plenty of resources that cover these techniques in detail, and this post is more about using them to make something cool than about thoroughly explaining each one. It should also be noted that this shader is not optimized and is structured for readability.

The Textures

This is the main cloud layer.
This is the flow map generated from the main cloud layer. The red and green channels are similar to a blurry normal map (OpenGL style) generated from the main cloud layer's height. The smaller clouds will flow outward from the thicker parts of the main clouds. The blue channel is a mask for places where there may be pinches due to the flow pushing a lot of the cloud texture into a small area. This doesn't look great on clouds, so we want to mask out places where it will occur.
This is the second cloud layer, it will add detail to the large clouds and be distorted by the large clouds flow map.
This is the wave distortion map. This distorts all the clouds and gives an ocean wave feel to the motion.
The wave distortion map was generated in Substance Designer using a cellular pattern with a heavy anisotropic blur applied. The blur direction should be perpendicular to the direction it will scroll to give it a proper wavy look.
The last texture is the Upper color that will show through the clouds.

The Shader

Shader "Skybox/Clouds"
{
 Properties
 {
  [NoScaleOffset] _CloudTex1 ("Clouds 1", 2D) = "white" {}
  [NoScaleOffset] _FlowTex1 ("Flow Tex 1", 2D) = "grey" {}
  _Tiling1("Tiling 1", Vector) = (1,1,0,0)

  [NoScaleOffset] _CloudTex2 ("Clouds 2", 2D) = "white" {}
  _Tiling2("Tiling 2", Vector) = (1,1,0,0)
  _Cloud2Amount ("Cloud 2 Amount", float) = 0.5
  _FlowSpeed ("Flow Speed", float) = 1
  _FlowAmount ("Flow Amount", float) = 1

  [NoScaleOffset] _WaveTex ("Wave", 2D) = "white" {}
  _TilingWave("Tiling Wave", Vector) = (1,1,0,0)
  _WaveAmount ("Wave Amount", float) = 0.5
  _WaveDistort ("Wave Distort", float) = 0.05

  _CloudScale ("Clouds Scale", float) = 1.0
  _CloudBias ("Clouds Bias", float) = 0.0

  [NoScaleOffset] _ColorTex ("Color Tex", 2D) = "white" {}
  _TilingColor("Tiling Color", Vector) = (1,1,0,0)
  _ColPow ("Color Power", float) = 1
  _ColFactor ("Color Factor", float) = 1

  _Color ("Color", Color) = (1.0,1.0,1.0,1)
  _Color2 ("Color2", Color) = (1.0,1.0,1.0,1)

  _CloudDensity ("Cloud Density", float) = 5.0

  _BumpOffset ("BumpOffset", float) = 0.1
  _Steps ("Steps", float) = 10

  _CloudHeight ("Cloud Height", float) = 100
  _Scale ("Scale", float) = 10

  _Speed ("Speed", float) = 1

  _LightSpread ("Light Spread PFPF", Vector) = (2.0,1.0,50.0,3.0)
 }
All the properties that can be played with.
 SubShader
 {
  Tags { "RenderType"="Opaque" }
  LOD 100

  Pass
  {
   CGPROGRAM
   #pragma vertex vert
   #pragma fragment frag
   
   #include "UnityCG.cginc"
   #define SKYBOX
   #include "FogInclude.cginc"
There is a custom include file that has a poor man's height fog and integrates the directional light color. The terrain shader also uses the same fog to keep things cohesive.
   sampler2D _CloudTex1;
   sampler2D _FlowTex1;
   sampler2D _CloudTex2;
   sampler2D _WaveTex;

   float4 _Tiling1;
   float4 _Tiling2;
   float4 _TilingWave;

   float _CloudScale;
   float _CloudBias;

   float _Cloud2Amount;
   float _WaveAmount;
   float _WaveDistort;
   float _FlowSpeed;
   float _FlowAmount;

   sampler2D _ColorTex;
   float4 _TilingColor;

   float4 _Color;
   float4 _Color2;

   float _CloudDensity;

   float _BumpOffset;
   float _Steps;

   float _CloudHeight;
   float _Scale;
   float _Speed;

   float4 _LightSpread;

   float _ColPow;
   float _ColFactor;
Just declaring all the property variables to be used.
   struct v2f
   {
    float4 vertex : SV_POSITION;
    float3 worldPos : TEXCOORD0; 
   };

   
   v2f vert (appdata_full v)
   {
    v2f o;
    o.vertex = UnityObjectToClipPos(v.vertex);
    o.worldPos = mul( unity_ObjectToWorld, v.vertex ).xyz;
    return o;
   }
The vertex shader is pretty lightweight, just need the world position for the pixel shader.
   float rand3( float3 co ){
       return frac( sin( dot( co.xyz ,float3(17.2486,32.76149, 368.71564) ) ) * 32168.47512);
   }
We'll need a random number for some noise. This will generate a random number based on a float3.
   half4 SampleClouds ( float3 uv, half3 sunTrans, half densityAdd ){

    // wave distortion
    float3 coordsWave = float3( uv.xy *_TilingWave.xy + ( _TilingWave.zw * _Speed * _Time.y ), 0.0 );
    half3 wave = tex2Dlod( _WaveTex, float4(coordsWave.xy,0,0) ).xyz;
The wave texture needs to be sampled first; it will distort the rest of the coordinates like a Gerstner wave. In all the _Tiling parameters, .xy is the tiling scale and .zw is the scrolling speed. All scrolling is multiplied by the global _Speed variable for easily adjusting the overall speed of the skybox.
    // first cloud layer
    float2 coords1 = uv.xy * _Tiling1.xy + ( _Tiling1.zw * _Speed * _Time.y ) + ( wave.xy - 0.5 ) * _WaveDistort;
    half4 clouds = tex2Dlod( _CloudTex1, float4(coords1.xy,0,0) );
    half3 cloudsFlow = tex2Dlod( _FlowTex1, float4(coords1.xy,0,0) ).xyz;
Using the red and green channels of the wave texture (xy), distort the uv coordinates for the first cloud layer. Also sample the cloud flow texture with the same coordinates.
    // set up time for second clouds layer
    float speed = _FlowSpeed * _Speed * 10;
    float timeFrac1 = frac( _Time.y * speed );
    float timeFrac2 = frac( _Time.y * speed + 0.5 );
    float timeLerp  = abs( timeFrac1 * 2.0 - 1.0 );
    timeFrac1 = ( timeFrac1 - 0.5 ) * _FlowAmount;
    timeFrac2 = ( timeFrac2 - 0.5 ) * _FlowAmount;
This is a standard setup for flow mapping.

    // second cloud layer uses flow map
    float2 coords2 = coords1 * _Tiling2.xy + ( _Tiling2.zw * _Speed * _Time.y );
    half4 clouds2 = tex2Dlod( _CloudTex2, float4(coords2.xy + ( cloudsFlow.xy - 0.5 ) * timeFrac1,0,0)  );
    half4 clouds2b = tex2Dlod( _CloudTex2, float4(coords2.xy + ( cloudsFlow.xy - 0.5 ) * timeFrac2 + 0.5,0,0)  );
    clouds2 = lerp( clouds2, clouds2b, timeLerp);
    clouds += ( clouds2 - 0.5 ) * _Cloud2Amount * cloudsFlow.z;
The second cloud layer coordinates start from the first cloud layer coordinates so the second layer stays relative to the first. Sample the second cloud layer using the flow map to distort the coordinates, then add the result to the base cloud layer, masked by the flow map's blue channel.
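The flow-mapping timing above is worth unpacking: the texture is sampled twice with offsets driven by two staggered sawtooth timers, and timeLerp crossfades between the samples so each one can wrap back to zero distortion while it is fully faded out. A Python sketch of just the timing math (function and argument names are mine):

```python
def flow_weights(time, speed, flow_amount=1.0):
    # Two staggered sawtooth timers, half a cycle apart (frac in HLSL)
    t1 = (time * speed) % 1.0
    t2 = (time * speed + 0.5) % 1.0
    # Crossfade weight: 1 when t1 wraps (its distortion resets), 0 when t2 wraps
    blend = abs(t1 * 2.0 - 1.0)
    # Centered offsets used to push the uvs along the flow direction
    offset1 = (t1 - 0.5) * flow_amount
    offset2 = (t2 - 0.5) * flow_amount
    return offset1, offset2, blend

# When layer 1 is at an extreme offset, it is fully faded out (blend = 1),
# so the wrap in its distortion is never visible
o1, o2, blend = flow_weights(0.0, 1.0)
print(o1, o2, blend)  # -0.5 0.0 1.0
```

This is why flow mapping never pops: whichever sample is about to reset its distortion always has zero weight at that moment.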
    // add wave to cloud height
    clouds.w += ( wave.z - 0.5 ) * _WaveAmount;
Add the wave texture's blue channel to the cloud height.
    // scale and bias clouds because we are adding lots of stuff together
     // and the values could go outside 0-1 range
    clouds.w = clouds.w * _CloudScale + _CloudBias;
Since everything is just getting added together there is the possibility that the values could go outside of 0-1 range. If things look weird we can manually scale and bias the final value back into a more reasonable range.
    // overhead light color
    float3 coords4 = float3( uv.xy * _TilingColor.xy + ( _TilingColor.zw * _Speed * _Time.y ), 0.0 );
    half4 cloudColor = tex2Dlod( _ColorTex, float4(coords4.xy,0,0)  );
Sample the overhead light color texture.
    // cloud color based on density
    half cloudHightMask = 1.0 - saturate( clouds.w );
    cloudHightMask = pow( cloudHightMask, _ColPow );
    clouds.xyz *= lerp( _Color2.xyz, _Color.xyz * cloudColor.xyz * _ColFactor, cloudHightMask );
Using the cloud height (the alpha channel of the clouds), lerp between the two colors and multiply into the overall cloud color. The power function is used to adjust the tightness of the "cracks" in the clouds that let light through.
    // subtract alpha based on height
    half cloudSub = 1.0 - uv.z;
    clouds.w = clouds.w - cloudSub * cloudSub;
Subtract the uv height position from the cloud height. This gives us the cloud density at the current height.
    // multiply density
    clouds.w = saturate( clouds.w * _CloudDensity );
Multiply the density by the _CloudDensity variable to control the softness of the clouds.
    // add extra density
    clouds.w = saturate( clouds.w + densityAdd );
Add any extra density if needed. This value is passed in and is 0 except for the final pass, in which it is 1.
    // add Sunlight
    clouds.xyz += sunTrans * cloudHightMask;
Add in the sun gradients masked by the cloud height mask.
    // pre-multiply alpha
    clouds.xyz *= clouds.w;
The front to back alpha blending function needs the alpha to be pre-multiplied.
    return clouds;
   }
This is the main function for sampling the clouds. The pixel shader will loop over this function.
   fixed4 frag (v2f IN) : SV_Target
   {
    // generate a view direction from the world position of the skybox mesh
    float3 viewDir = normalize( IN.worldPos - _WorldSpaceCameraPos );

    // get the falloff to the horizon
    float viewFalloff = 1.0 - saturate( dot( viewDir, float3(0,1,0) ) );

    // Add some up vector to the horizon to pull the clouds down
    float3 traceDir = normalize( viewDir + float3(0,viewFalloff * 0.1,0) );
We get the view direction by subtracting the camera position from the world position and normalizing the result. "traceDir" is the direction that will be used to generate the cloud uvs. It is just the view direction with a little bit of "up" added at the horizon. This adds a little bend to the clouds, as if they are curving around the planet, and keeps them from sprawling off into infinity at the horizon and causing all kinds of artifacts.
    // Generate uvs from the world position of the sky
    float3 worldPos = _WorldSpaceCameraPos + traceDir * ( ( _CloudHeight - _WorldSpaceCameraPos.y ) / max( traceDir.y, 0.00001) );
    float3 uv = float3( worldPos.xz * 0.01 * _Scale, 0 );
Use the camera position plus the trace direction to get a world position for the cloud layer. This way the clouds react to the camera moving; just make sure not to move the camera up through the clouds, or things get weird. Then make the uvs for the clouds from the world position, multiplying by the global scale variable for easy adjusting.
    // Make a spot for the sun, make it brighter at the horizon
    float lightDot = saturate( dot( _WorldSpaceLightPos0, viewDir ) * 0.5 + 0.5 );
    half3 lightTrans = _LightColor0.xyz * ( pow( lightDot,_LightSpread.x ) * _LightSpread.y + pow( lightDot,_LightSpread.z ) * _LightSpread.w );
    half3 lightTransTotal = lightTrans * pow(viewFalloff, 5 ) * 5.0 + 1.0;
Using the dot product of the first directional light's direction and the view direction, get a gradient in the direction of the sun, then use a power function to tighten up the gradient to your liking. This is the light from the sun that will shine through the back of the clouds. The _LightSpread parameter holds the power and factor for the two sun gradients that get added together for better control.
    // Figure out how for to move through the uvs for each step of the parallax offset
    half3 uvStep = half3( traceDir.xz * _BumpOffset * ( 1.0 / traceDir.y ), 1.0 ) * ( 1.0 / _Steps );
    uv += uvStep * rand3( IN.worldPos + _SinTime.w );
Standard steep parallax uv step amount. This is how far through the uvs and the cloud height we move with each sample. The starting uv is then jittered a bit with a random value per pixel to keep the result from looking like flat layers.
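A quick Python sketch of that step math, with illustrative inputs (the names are mine): dividing the horizontal offset by traceDir.y stretches the step at grazing angles, while the height component is fixed so a full march of _Steps samples always covers the whole 0-1 cloud volume.

```python
def parallax_step(trace_dir, bump_offset, steps):
    # uvStep = (traceDir.xz * bumpOffset / traceDir.y, 1.0) / steps
    x, y, z = trace_dir
    return (x * bump_offset / y / steps,
            z * bump_offset / y / steps,
            1.0 / steps)

# Marching `steps` times always covers the full 0-1 height of the cloud volume
step = parallax_step((0.3, 0.5, 0.2), bump_offset=0.1, steps=10)
print(step[2] * 10)  # 1.0
```

Note how a shallow traceDir.y makes the horizontal part of the step large, which is exactly the long skewed march you want when looking toward the horizon.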
    // initialize the accumulated color with fog
    half4 accColor = FogColorDensitySky(viewDir);
    half4 clouds = 0;
    [loop]for( int j = 0; j < _Steps; j++ ){
     // if we filled the alpha then break out of the loop
     if( accColor.w >= 1.0 ) { break; }

     // add the step offset to the uv
     uv += uvStep;

     // sample the clouds at the current position
     clouds = SampleClouds(uv, lightTransTotal, 0.0 );

     // add the current cloud color with front to back blending
     accColor += clouds * ( 1.0 - accColor.w );
    }
Start by getting the fog at the starting point. This creates an early-out opportunity for the loop, since we don't need to sample clouds once the accumulated color is fully opaque. Then iterate over the clouds, moving the uv with each iteration and adding the clouds to the accumulated color using front-to-back alpha blending.
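Front-to-back compositing with premultiplied colors is what makes that early-out possible. A Python sketch of the accumulation rule, assuming each sample is already premultiplied (rgb *= alpha, as SampleClouds does) and using a single float for rgb to keep it short:

```python
def composite_front_to_back(samples):
    # samples: list of (premultiplied_rgb, alpha) pairs, nearest first
    acc_rgb, acc_a = 0.0, 0.0
    for rgb, a in samples:
        if acc_a >= 1.0:       # early out: nothing behind can show through
            break
        # acc += sample * (1 - acc.alpha), same for color and alpha,
        # matching accColor += clouds * (1.0 - accColor.w) in the shader
        acc_rgb += rgb * (1.0 - acc_a)
        acc_a += a * (1.0 - acc_a)
    return acc_rgb, acc_a

# A fully opaque near sample hides everything behind it
rgb, a = composite_front_to_back([(0.8, 1.0), (0.5, 1.0)])
print(rgb, a)  # 0.8 1.0
```

Back-to-front blending would need every sample before any could be skipped; front-to-back lets the loop stop the moment alpha saturates.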
    // one last sample to fill gaps
    uv += uvStep;
    clouds = SampleClouds(uv, lightTransTotal, 1.0 );
    accColor += clouds * ( 1.0 - accColor.w );
Once we have iterated over the entire cloud volume, do one last sample without testing against the cloud height to fill in any holes from cloud values that didn't fit inside the volume.
    // return the color!
    return accColor;
   }
   ENDCG
  }
 }
}
Then return the color and we're done!