
Ocean wave formation

Posted on June 13, 2008. Filed under: shader, Uncategorized

I want to demonstrate the wave formation in this video. The radial waves are dominant, and three ordinary sinusoidal wave components are added to them. Changing the background image changes the reflection and refraction colors as well.
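As a rough illustration of the idea (this is not the demo's actual code; the wave center, amplitudes, frequencies and directions below are placeholders), the surface height can be built from a dominant radial wave plus three sinusoidal components:

// Hedged sketch: one dominant radial wave plus three directional sine waves.
float waveHeight(float2 pos, float time)
{
    // dominant radial wave spreading from a placeholder center at the origin
    float r = length(pos);
    float height = 0.6f * sin(2.0f * r - 1.5f * time);

    // three smaller directional sine components added on top
    height += 0.15f * sin(dot(pos, float2( 1.0f, 0.3f)) * 1.7f + 1.1f * time);
    height += 0.10f * sin(dot(pos, float2(-0.4f, 1.0f)) * 2.3f + 0.8f * time);
    height += 0.05f * sin(dot(pos, float2( 0.7f,-0.6f)) * 3.1f + 1.9f * time);
    return height;
}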


Choppy waves

Posted on April 25, 2008. Filed under: Lake water shader, shader, Technical background

The general methods discussed in these pages use randomly generated or sinusoidal wave formations. They are usually sufficient for water scenes under normal conditions, but there are cases when choppy waves are needed, for example stormy weather or shallow water, where the so-called “plunging breaker” waves are formed. In the following paragraphs I briefly introduce some of the approaches to get choppier waves.

Analytical Deformation Model

[UVTDFRWR] describes an efficient method that disturbs the displaced vertex positions analytically in the vertex shader. Explosions are a common effect in computer games, and to create an explosion effect they use the following formula:

where t is the time, r is the distance from the explosion center in the water plane, and b is a damping constant. The values of I0, w and k are chosen according to the given explosion and its parameters.

For rendering, they displace the vertex positions according to the previous formula, which results in convincing explosion effects.
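The formula image is not reproduced here, but based on the listed parameters a vertex-shader sketch of the general idea, a radially spreading and exponentially damped sinusoid, could look as follows. This is only an illustration; the exact expression and constants of [UVTDFRWR] may differ:

// Hedged sketch of the idea, not the exact formula of [UVTDFRWR]:
// a radially spreading, exponentially damped wave around the explosion center.
float explosionOffset(float2 pos, float2 center, float time)
{
    float I0 = 1.0f;   // initial magnitude
    float w  = 6.0f;   // angular frequency
    float k  = 4.0f;   // wave number
    float b  = 2.0f;   // damping constant
    float r  = length(pos - center);                   // distance from the explosion center
    return I0 * exp(-b * r) * sin(k * r - w * time);   // damped travelling ring
}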

Dynamic Displacement Mapping

[UVTDFRWR] introduces another approach as well. The necessary vertex displacement can be rendered in a separate pass and later combined with the water height-field. This way, some calculations can be done before running the application to gain performance. Depending on the basis of the water rendering, the displacements can be computed by the above-mentioned analytical model or, for example, by the Navier-Stokes equations.

Although these techniques can produce realistic water formations, they need huge textures to describe the details. The available texture memory and the shader performance can limit the applicability of these approaches.
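A minimal sketch of the idea, assuming the displacement has already been rendered into a texture (displacementMap is an illustrative name) that the vertex shader combines with the base height-field through vertex texture fetch:

// Hedged sketch: combine a precomputed displacement texture with the base height-field.
// heightMap and displacementMap are assumed to be filled in earlier passes.
sampler2D heightMap;
sampler2D displacementMap;

float sampleWaterHeight(float2 uv)
{
    float baseHeight = tex2Dlod(heightMap, float4(uv, 0, 0)).r;        // vertex texture fetch (vs_3_0)
    float extra      = tex2Dlod(displacementMap, float4(uv, 0, 0)).r;  // analytical or Navier-Stokes result
    return baseHeight + extra;
}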

Direct displacement

In [DWAaR] they compute the displacement vectors with FFT. Instead of modifying the height-field directly, the vertices are horizontally displaced using the following equation:

X = X + λD(X,t)

where λ is a constant controlling the amount of displacement, and D is the displacement vector. D is computed with the following sum:

D(X,t) = ΣK −i (K/k) h(K,t) e^(i K·X)

where K is the wave direction, t is the time, k is the magnitude of vector K, and h(K,t) is a complex number representing both the amplitude and the phase of the wave.

The difference between the original and the displaced waves is visualized in the following figure. The displaced waves on the right are much sharper than the original ones:

choppy waves - deformation

The source of the image is [DWAaR].
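Applying the displacement itself is cheap once D is known. A minimal sketch for one grid vertex, assuming D(X,t) has already been evaluated (for example from the FFT result) and lambda is the choppiness constant:

// Hedged sketch: horizontal (choppy) displacement of a grid vertex.
// dispXZ is assumed to hold D(X,t) for this vertex.
float3 displaceVertex(float3 vertexPos, float2 dispXZ, float lambda)
{
    vertexPos.x += lambda * dispXZ.x;   // X = X + lambda * D(X,t)
    vertexPos.z += lambda * dispXZ.y;   // the height (y) still comes from the height-field
    return vertexPos;
}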

Choppy Waves Using Gerstner Waves

If the rendered water surface is defined by the Gerstner equations, our task is easier. Gerstner waves are able to describe choppy wave forms. The amplitudes need to be limited in size, otherwise the breaking crests can look unrealistic. A good way to create choppy waves is to sum Gerstner waves with different amplitudes and phases. The summation can be carried out through the following sum:

x = x0 − Σi=1..N (Ki/ki) Ai sin(Ki·x0 − ωi t + φi),    y = Σi=1..N Ai cos(Ki·x0 − ωi t + φi)

where Ki is the set of wavevectors, ki the set of their magnitudes, Ai the set of amplitudes, ωi the set of frequencies, φi the set of phases and N is the number of sine waves.

The sum of 3 Gerstner waves is visualized in the following figure:

Gerstner wave summation

The source of the image is [GW].
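A minimal vertex-shader style sketch of such a summation; the directions, amplitudes, frequencies and phases below are placeholders, not values taken from [GW]:

// Hedged sketch: sum of N Gerstner waves with placeholder parameters.
#define N 3
static const float2 Kdir[N]  = { float2(1.0f, 0.0f), float2(0.7f, 0.7f), float2(0.0f, 1.0f) };
static const float  Amp[N]   = { 0.4f, 0.2f, 0.1f };
static const float  Freq[N]  = { 1.0f, 1.7f, 2.3f };
static const float  Phase[N] = { 0.0f, 0.5f, 1.3f };

float3 gerstnerPosition(float2 x0, float time)
{
    float3 p = float3(x0.x, 0.0f, x0.y);
    for (int i = 0; i < N; i++)
    {
        float theta = dot(Kdir[i], x0) - Freq[i] * time + Phase[i];
        p.xz -= normalize(Kdir[i]) * Amp[i] * sin(theta);   // horizontal term sharpens the crests
        p.y  += Amp[i] * cos(theta);                         // vertical term builds the height
    }
    return p;
}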

References

[UVTDFRWR] – Using Vertex Texture Displacement for Realistic Water Rendering

[IAoOW] – Damien Hinsinger, Fabrice Neyret, Marie-Paule Cani: Interactive Animation of Ocean Waves

[GW] – Jefrey Alcantara: Gerstner waves


Rendering caustics

Posted on April 21, 2008. Filed under: Lake water shader, shader, Uncategorized

Although environment mapping is supported by graphics hardware, it is only a good approximation when the reflecting/refracting object is small compared to its distance from the environment. This means environment mapping can be used only when the objects are close to the water surface. Objects under dynamic water surfaces also need a frequently updated environment map, so its usability is limited.

Several approaches render accurate caustics through ray tracing methods, but generally they are too time-consuming for real-time applications (see [LWIuBBT]). Other techniques approximate textures of underwater caustics on a plane using wave theory. Although these moving textures can be rendered onto arbitrary receivers at interactive frame rates, the repeating texture patterns are usually disturbing.

Graphics hardware has made significant progress in performance recently, and many hardware-based approaches have been developed for rendering caustics. Real caustics calculation needs intersection tests between the objects and the viewing rays reflected at the water surface. Generally, the illumination distribution of the object surfaces needs to be computed, but this is really time-consuming and difficult. Although backward ray tracing, adaptive radiosity textures and curved reflectors are published methods for creating realistic images of caustics, they cannot be done in real time because of the huge computational cost. For more details about these approaches, see [BRT], [ARTfBRT] and [IfCR].

[FRMfRaRCDtWS] describes a technique for rendering caustics fast. Their method takes into account three optical effects: reflective caustics, refractive caustics, and reflection/refraction on the water surface. It calculates the illumination distribution on the object surfaces through an efficient method using the GPU. In their texture-based volume rendering technique, objects are sliced and stored in two- or three-dimensional textures. By rendering the slices in back-to-front order, the final image is created, and the intensities of caustics are approximated on the slices only, not on the entire object. The method is visualized in the next figure:

Rendering Caustics

The source of the image is: [FRMfRaRCDtWS].

Although this reduces computation time, it does not enable real-time caustics rendering. The caustics map cannot be refreshed for every frame using this method.

Caustics maps store the intensities of caustics. They are generated by projecting the triangles of the water surface onto the objects in the water. The intersecting triangles influence the amount of light reaching the object. The intensity of a caustic triangle is proportional to the area of the water surface triangle divided by the area of the caustic triangle. The more triangles intersect each other and the higher their intensity is at a given point, the lighter that point is. In the end, the caustics map and the original illumination map are merged, as in the next figure:

Caustics rendering 2

The source of the image is: [FRMfRaRCDtWS].
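A minimal sketch of the intensity rule described above, with the triangle vertices assumed to be given in world space (names are illustrative):

// Hedged sketch: caustic triangle intensity as the ratio of the water-surface
// triangle area to the projected caustic triangle area.
float triangleArea(float3 a, float3 b, float3 c)
{
    return 0.5f * length(cross(b - a, c - a));
}

float causticIntensity(float3 w0, float3 w1, float3 w2,   // water-surface triangle
                       float3 c0, float3 c1, float3 c2)   // projected caustic triangle
{
    float causticArea = max(triangleArea(c0, c1, c2), 1e-6f);   // avoid division by zero
    return triangleArea(w0, w1, w2) / causticArea;
}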

[IISTfAC] introduces a faster approach for rendering caustics. The method emits particles from the light source and gathers their contributions as viewed from the eye. To gain efficiency, they emit photons in a regular pattern instead of along random paths. The pattern is defined by the image pixels in a rendering from the viewpoint of the light. Put another way: counting how many times the light source sees a particular region is equivalent to counting how many particles hit that region. For multiple light sources, multiple rendering passes are required. Several steps are approximated to reduce the required resources, for example interpolation among neighbouring pixels, no volumetric scattering effects, and a restriction to point lights.

In [IRoCuIWV], a more accurate method is described. In the first pass, the positions of the receivers are rendered to a texture. In the second pass, a bounding volume is drawn for each caustic volume. For points inside the volume, the caustic intensity is computed and accumulated in the frame buffer. They also take warped caustic volumes into account, which is skipped in the other caustics-rendering techniques. Their technique can produce real-time performance for general caustic computation, but it is not fast enough for entire ocean surfaces. For fully dynamic water surfaces with dynamic lighting, their method rendered the following image at 1280 x 500 pixels at 0.2 fps:

Caustics rendering example

For more details, see [IRoCuIWV].

In [DWAaR], they simplify the problem to reach real-time performance. They consider only first-order rays and assume the receiving surface lies at a constant depth. The incoming light beams are refracted, and the refracted rays are then intersected with the given plane. The next figure illustrates the method:

Caustic triangles

To reduce the necessary calculations, only a small part of the caustics map is calculated, and they show a method to tile it seamlessly over the entire image. Finally, the sun’s ray direction and the positions of the triangles are used to calculate the texture coordinates by projection. For further discussion of this method, see [DWAaR].
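A minimal sketch of this first-order computation, assuming the receiver is the plane y = -depth and lightDir points from the sun towards the surface (the names and the water index of refraction are illustrative):

// Hedged sketch: refract the sun ray at a surface point and intersect the
// refracted ray with a flat receiver plane at y = -depth.
float2 causticHitPoint(float3 surfacePos, float3 surfaceNormal, float3 lightDir, float depth)
{
    const float eta = 1.0f / 1.33f;                          // air-to-water ratio of refraction indices
    float3 refr = refract(normalize(lightDir), normalize(surfaceNormal), eta);
    float t = (-depth - surfacePos.y) / refr.y;              // ray parameter down to the receiver plane
    float3 hit = surfacePos + t * refr;
    return hit.xz;                                           // projected, used as texture coordinates
}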

The main ideas of caustics rendering were briefly introduced. The accurate methods use ray tracing techniques, but they cannot produce real-time performance without cheating. The most frequently used approaches rely on pre-generated caustic textures and try to avoid visible repetition.

References

[FRMfRaRCDtWS] – Kei Iwasaki, Yoshinori Dobashi and Tomoyuki Nishita: A Fast Rendering Method for Refractive and Reflective Caustics Due to Water Surfaces

[BRT] – J. Arvo, “Backward Ray Tracing,” SIGGRAPH

[ARTfBRT] – P.S. Heckbert, “Adaptive Radiosity Textures for Bidirectional Ray Tracing,” Proc. SIGGRAPH

[IfCR] – D. Mitchell, P. Hanrahan, “Illumination from Curved Reflections,” Proc. SIGGRAPH

[IISTfAC] – Chris Wyman, Scott Davis: Interactive Image-Space Techniques for Approximating Caustics

[IRoCuIWV] – Manfred Ernst, Tomas Akenine-Möller, Henrik Wann Jensen: Interactive Rendering of Caustics using Interpolated Warped Volumes

[LWIuBBT] – Mark Watt: Light-Water Interaction using Backward Beam Tracing

[DWAaR] – Lasse Staff Jensen, Robert Goliáš: Deep-Water Animation and Rendering


Specular highlights

Posted on April 18, 2008. Filed under: Lake water shader, shader, Technical background


Specular highlights are approximated by adding some light color to specific areas, as the Phong illumination model describes. For computational reasons, the half-vector is used instead of the reflection vector; for more details, see the Water mathematics chapter. The half-vector is also approximated, and some perturbation is added from the values of the bump map (scaled by specPerturb). In this demo, I used the following code for this:

float4 speccolor;
// fixed directional light source and the perturbed half-vector between the eye and light directions
float3 lightSourceDir = normalize(float3(0.1f,0.6f,0.5f));
float3 halfvec = normalize(eyeVector + lightSourceDir + float3(perturbation.x*specPerturb, perturbation.y*specPerturb, 0));

The angle between the surface normal and the half-vector is calculated using the dot product between them. An input variable (specPower) adjusts the exponent, so that specular highlights appear only when the angle between the vectors is very small. Finally, the specular color is added to the original one.

float3 temp = 0;
// raise the cosine of the angle to specPower to narrow the highlight
temp.x = pow(dot(halfvec,normalVector),specPower);
speccolor = float4(0.98,0.97,0.7,0.6);   // warm highlight color, alpha used as intensity
speccolor = speccolor*temp.x;
// premultiply by the alpha component and add the highlight to the fragment color
speccolor = float4(speccolor.x*speccolor.w,speccolor.y*speccolor.w,speccolor.z*speccolor.w,0);
Output.Color = Output.Color + speccolor;

Some screenshots of lake water specular highlights:

screenshot_specular_4

screenshot_specular_2 screenshot_specular_3 screenshot_specular_4


UV Flipping Technique to Avoid Repetition

Posted on April 11, 2008. Filed under: Lake water shader, shader, Uncategorized

Alex Vlachos
A common problem exists among many shaders that rely on scrolling two copies of the same texture in slightly different directions. No matter what angle or speed the textures are scrolling, they will eventually line up exactly and cause a visual hiccup. This scrolling technique is commonly used to gain the effect of having random patterns, like water caustics. With caustics, the pattern appears to stop every once in a while even though it is scrolling at a constant speed. This was also encountered when implementing the Ocean Water and Reflective and Refractive Water shaders when we scrolled two copies of the same bump map in opposite directions to produce high frequency waves (see “Rendering Ocean Water” and “Rippling Refractive and Reflective Water” later in this book). The method we used to solve this problem is to compute the two sets of texture coordinates like you normally would. Then immediately before using those texture coordinates to fetch from textures, choose one of the sets of texture coordinates and swap the U and V. This will effectively flip the texture along the diagonal. Using this method, the intermittent hiccup that appears due to the two copies being perfectly aligned is now impossible since one of the textures is reflected about the diagonal of the texture.

Alex Vlachos: UV Flipping Technique to Avoid Repetition [D3DShaderX]
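In shader code the trick boils down to a few lines. A minimal sketch (names like bumpMap, scrollDir1 and scrollDir2 are illustrative, not taken from the book):

// Hedged sketch of the UV flipping trick: scroll two copies of the same bump map,
// but swap U and V on one set so the two layers can never line up exactly.
sampler2D bumpMap;

float3 sampleBump(float2 baseUV, float2 scrollDir1, float2 scrollDir2, float time)
{
    float2 uv1 = baseUV + scrollDir1 * time;
    float2 uv2 = baseUV + scrollDir2 * time;
    uv2 = uv2.yx;                                   // flip along the diagonal
    float3 b1 = tex2D(bumpMap, uv1).xyz;
    float3 b2 = tex2D(bumpMap, uv2).xyz;
    return normalize(b1 + b2 - 1.0f);               // combine the two perturbations
}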


Particle systems

Posted on April 10, 2008. Filed under: Lake water shader, shader, Technical background, Uncategorized

Physics-based approaches have become very popular recently, and improving hardware performance makes real-time particle systems possible as well. Depending on the problem, vertex-based or pixel-based solutions can be appropriate for making a huge amount of independent particles seem alive. Particle system techniques can be combined with other water animation approaches to get a more realistic result.

Particle system approaches need to answer two questions: how do the particles move, and what are the particles as objects? The whole system can have a velocity vector, but this vector does not need to be constant across the entire flow. The next figure visualizes this:

particle system velocity vector

The answer to the second question is that our particles can be negligible in size and in mass as well. But they can carry further information to make other kinds of interaction possible too, for example color, temperature and pressure, depending on the expected result.

The particles move according to physical laws; their motion can be calculated in time steps with the help of the previously discussed velocity-vector map. To be able to make these calculations on graphics hardware, a texture must store the positions of the particles, so their positions are sampled into a texture. These textures are called particle maps:

particle map

To get the positions of the particles in the next time step, we trace them as if they moved along the velocity-vector map on their own. This approach is called forward mapping. It is illustrated in the next figure:

forward mapping

The described technique suffers from some problems. First, if the velocity is too small, some particles can stay in the same grid cell forever: they are assumed to start from the center of the cell in each iteration, but they cannot leave the cell in one time step, so they snap back to the center again. Second, there might be cells that stay empty forever for the same reason. Both effects cause stationary particles and gaps.

To overcome these issues, backward mapping can be used instead of forward mapping. For each grid cell, we calculate which cell its particle could have originated from. Then we determine the color of the cell using the color of that original cell. If interpolation is used, the surrounding colors can also be taken into account, and we can avoid stationary particles and empty gaps as well:

backward mapping

Based on the previous considerations, the graphics hardware-based method for texture advection is as follows. The velocity map and the particle map are stored in separate two-component textures. A standard 2D map can be represented this way; the third dimension is added through approximations to gain performance. Offset textures are part of the hardware-supported pixel operations, so the movement along the velocity field can be implemented with them. Inflow and outflow (particle generation and removal) are outside the scope of this discussion. More detailed explanations and source code can be found in [SHADERX].
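A minimal pixel-shader style sketch of the backward-mapping step, assuming the velocity field and the particle map are stored in two textures and the time step and grid scale are passed in as constants (names are illustrative, not from [SHADERX]):

// Hedged sketch of backward mapping: for each cell, look up where its particle
// came from and fetch the particle map there. velocityMap and particleMap are
// assumed to be updated in earlier passes.
sampler2D velocityMap;
sampler2D particleMap;

float4 advect(float2 uv, float timeStep, float2 texelScale)
{
    float2 velocity = tex2D(velocityMap, uv).xy;             // 2D velocity at this cell
    float2 sourceUV = uv - velocity * timeStep * texelScale; // trace backwards along the flow
    return tex2D(particleMap, sourceUV);                     // bilinear filtering interpolates neighbours
}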

Particle systems can be a good solution for real-time interaction between external objects and the water surface. They can efficiently animate the moving surface as well, but usually they are applied together with other techniques. Flowing water, water drops, spray and waterfalls are just some of the water-related effects that can be implemented through particle systems.

Spray is modeled as a separate subsystem in [DSoSF], as mentioned earlier in The Navier-Stokes Equations chapter. When an area of the surface has a high upward velocity, particles are distributed over that area. The particles do not interact with each other; they only fall back to the water surface because of gravity and are then removed from the system. This technique can be visually very convincing for spray simulation.

The source of these illustrative figures is [SHADERX].


