Refraction is the physical phenomenon that distorts the view through the boundary between two transparent materials with differing refractive indices. Common examples are looking into water or through a lens. The effect is easy to achieve with ray tracing, because ray tracing closely simulates the physical process of light moving through a scene. Today (2021), however, ray tracing can only be done offline, or in real time on limited-complexity scenes on the latest hardware.
Most games today use a technique called rasterization, which is much faster but has many limitations. One limitation is that refraction is not directly possible. After a bit of digging I found an approach that simulates refraction in a limited way. It assumes a single object in the scene and a cubemap environment, which is projected and rendered onto the object as though the environment were visible through it. The mathematics of refraction depends on the view angle and the surface normals of the object, and is straightforward to implement in a simple custom shader within Unity. See RefractLiquidShader.shader in the repo.
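The shader itself is written in HLSL, but the core calculation it performs is just the vector form of Snell's law, which shading languages expose as a `refract()` intrinsic. As a rough sketch of what that intrinsic computes (the function name and vector representation here are illustrative, not taken from the repo's shader):

```python
import math

def refract(incident, normal, eta):
    """Bend a unit incident direction through a surface with unit normal,
    following the vector form of Snell's law (the same math as HLSL's
    refract() intrinsic). eta is the ratio of refractive indices n1/n2,
    e.g. 1.0 / 1.33 for a ray going from air into water.
    Returns None when total internal reflection occurs."""
    cos_i = -sum(i * n for i, n in zip(incident, normal))
    k = 1.0 - eta * eta * (1.0 - cos_i * cos_i)
    if k < 0.0:
        return None  # total internal reflection: no transmitted ray
    return tuple(eta * i + (eta * cos_i - math.sqrt(k)) * n
                 for i, n in zip(incident, normal))

# A ray hitting the water straight on (along the normal) is not bent at all.
print(refract((0.0, -1.0, 0.0), (0.0, 1.0, 0.0), 1.0 / 1.33))
```

In the shader, the refracted direction produced this way is used directly as the lookup vector into the environment cubemap.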
Another characteristic of the surface of water, or indeed of any boundary from a material with a lower refractive index to one with a higher index, is a reflection whose strength depends on the view angle: shallow angles reflect more, steep angles less. This is why, when looking at a lake from a distance, you can't see under the water, but if you get close and look straight down you can. This effect can also be simulated in the shader, by projecting a cubemap environment onto the surface from above and weighting the reflection according to the Fresnel equations.
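In real-time shaders the full Fresnel equations are usually replaced by Schlick's approximation, which captures the same angle dependence cheaply. A small sketch of that weighting (the function name and default indices here are illustrative; the repo's shader may compute this differently):

```python
def schlick_fresnel(cos_theta, n1=1.0, n2=1.33):
    """Schlick's approximation to the Fresnel reflectance.
    cos_theta is the cosine of the angle between the view direction and
    the surface normal; n1/n2 default to air and water. Returns the
    fraction of light reflected (the rest is refracted)."""
    f0 = ((n1 - n2) / (n1 + n2)) ** 2  # reflectance at normal incidence
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

# Looking straight down into water reflects only ~2%; at a grazing
# angle the surface becomes an almost perfect mirror.
print(schlick_fresnel(1.0), schlick_fresnel(0.0))
```

The shader uses this factor to blend between the refracted cubemap sample and the reflected one, which is exactly why a distant lake looks mirror-like while the water at your feet looks transparent.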
Putting this together with the dynamic water surface and a nice cubemap, we get quite a satisfying result.
The Scene with the setup described in this section is RefractingLiquid.unity.
Being able to create the effect of refraction is nice, but only being able to see an environment map through the material is a bit of a limitation. However, there is a solution. Its basis lies in the fact that environment maps don't need to be static: by dynamically updating one with a special camera, we can capture what is happening in the scene and use that to render the refraction. The approach isn't perfect, because environment maps are a flat projection of what effectively appears on a distant sphere around the scene - that is their purpose, to provide an environment that wraps the scene. Objects that get close to the camera can end up with a distorted perspective. However, as we will see, it is a pretty good approximation.
To implement this technique we add a script called CubeMapRefraction.cs to our liquid surface object. It creates the camera at runtime and uses it to update the environment texture of the object's material.
A repo containing the full Unity project we are using in this tutorial is available on GitLab. The Scene with the setup described in this section is SeethroughWater.unity.
Continue with Part 4: Caustics Simulation in Unity