19 October 2024

How to implement water movement in Godot

Japanese landscape we are going to implement
In my previous article, I explained how to create a shader to reflect a scene on the surface of a sprite. However, such a reflection is only possible on an absolutely still sheet of water. Normally, water has surface disturbances that distort that reflection. In this article, we're going to enhance the previous shader so that it shows this distortion and changes over time.

As with other articles, you can find the code for this one in my GitHub repository. I recommend downloading it and opening it locally in your Godot editor to follow along with the explanations, but make sure to download the code from the specific commit I’ve linked to. I’m constantly making changes to the project in the repository, so if you download code from a later commit than the one linked, what you see might differ from what I show here.

I don’t want to repeat explanations, so if you’re not yet familiar with UV coordinates in a Godot shader, I recommend reading the article I linked to at the beginning.

With that context in place, we are going to evolve the shader we applied to the rectangle of the water sprite. The code for that shader is in the file Shaders/WaterShader.gdshader.

To parameterize the effect and configure it from the inspector, I have added the following uniform variables:

Shader parameters configurable from the inspector

The wave_strength and wave_speed parameters are intuitive: the first defines the amplitude of the distortion of the reflected image, while the second defines the speed of the distortion movement.

However, the wave_threshold parameter requires some explanation. If you place a mirror at a person’s feet, you expect the reflection of those feet to be in contact with the feet themselves. The reflection is like a shadow: it typically begins at the feet. The problem is that the algorithm we will explore here may distort the image from the very top edge of the reflection, so that it no longer touches the lower edge of the reflected image. This can detract from the realism of the effect, so I added the wave_threshold parameter to define at what fraction of the sprite the distortion effect begins to apply at full strength. In my example, it’s set to 0.11, which means that the distortion will ramp up over a distance from the top edge of the water sprite equivalent to 11% of its height (i.e., up to UV.y = 0.11).

Finally, there is the wave_noise parameter. This is an image. For a parameter like this, we could use any kind of image, but for our distortion algorithm, we’re interested in a very specific one—a noise image. A noise image contains random black and white patches. Those who are old enough might remember the image seen on analog TVs when the antenna was broken or had poor reception; in this case, it would be a very similar image. We could search for a noise image online, but fortunately, they are so common that Godot allows us to generate them using a NoiseTexture2D resource.

Configuring the noise image

A configuration like the one shown in the figure will suffice for our case. In fact, I only changed two parameters from the default configuration. I disabled Mipmaps generation because this image won’t be viewed from a distance, and I enabled the Seamless option because, as we will see later, we will traverse the image continuously, and we don’t want noticeable jumps when we reach the edge. Lastly, I configured the noise generation algorithm to FastNoiseLite, though this detail is not crucial.

The usefulness of the noise image will become apparent now, as we dive into the shader code.

To give you an idea of the values to configure for the above variables, I used the following values (I didn’t include in the screenshot the parameters configured in the previous article for the static reflection):

Shader parameter values
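For reference, the parameters above could be declared at the top of Shaders/WaterShader.gdshader roughly like this. This is only a sketch: apart from wave_threshold = 0.11, which I mentioned above, the default values here are illustrative and not necessarily the ones in the repository.

```gdshader
shader_type canvas_item;

// Sketch of the new uniforms described in this article; defaults are illustrative.
uniform float wave_strength = 0.02;   // amplitude of the reflection's distortion
uniform float wave_speed = 1.0;       // speed of the distortion movement
uniform float wave_threshold = 0.11;  // UV.y at which the waves reach full strength
uniform sampler2D wave_noise;         // seamless NoiseTexture2D assigned in the inspector
```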

Taking the above into account, if we look at the main function of the shader, fragment(), we’ll see that there’s very little difference from the reflection shader.

fragment() shader method

If we compare it to the code from the reflections article, we’ll notice the new addition on line 50: a call to a new method, get_wave_offset(), which returns a two-dimensional vector that is then added to the UV coordinate used to sample (in line 58) the color to reflect.

What’s happening here? Well, the distortion effect of the reflection is achieved not by reflecting the color that corresponds to a specific point but by reflecting a color from a nearby point. What we will do is traverse the noise image. This image has colors ranging from black to pure white; that is, from a value of 0 to 1 (simultaneously in its three channels). For each UV coordinate we want to render for the water rectangle, we’ll sample an equivalent point in the noise image, and the amount of white in that point will be used to offset the sampling of the color to reflect by an equivalent amount in its X and Y coordinates. Since the noise image does not have abrupt changes between black and white, the result is that the offsets will change gradually as we traverse the UV coordinates, which will cause the resulting distortion to change gradually as well.
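In GDShader terms, the core of that mapping can be sketched in two lines (variable names here are illustrative, not the repo’s):

```gdshader
// Brightness of the noise at this point: 0.0 (black) to 1.0 (white).
float noise_value = texture(wave_noise, UV).r;
// Use that brightness as an equal offset on X and Y for the reflection lookup.
vec2 offset = vec2(noise_value) * wave_strength;
```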

Let’s review the code for the get_wave_offset() method to better understand the above:

Code for the get_wave_offset() method

The method receives the UV coordinate being rendered and the number of seconds that have passed since the game started as parameters.

Line 42 of the method relates to the wave_threshold parameter we discussed earlier. When rendering the water rectangle from its top edge, we don’t want the distortion to act with full force from that very edge because it could generate noticeable aberrations. Imagine, for example, a person standing at the water’s edge. If the distortion acted with full force from the top edge of the water, the reflection of the person’s feet would appear disconnected from the feet, which would look unnatural. So, line 42 ensures that the distortion gradually increases from 0 until UV.y reaches the wave_threshold, after which the strength remains at 1.

Line 43 samples the noise image using a call to the texture() method. If we were to simply sample using the UV coordinate alone, without considering time or speed, the result would be a reflection with a static distorted image. Let’s assume for a moment that this was the case, and we didn’t take time or speed into account. What would happen is that for each UV coordinate, we would always sample the same point in the noise image, and therefore always get the same distortion vector (for that point). However, by adding time, the sampling point in the noise image changes every time the same UV coordinate is rendered, causing the distortion image to vary over time. With that in mind, the multiplicative factors are easy to understand: wave_speed causes the noise image sampling to change faster, which speeds up the changes in the distorted image; multiplying the resulting distortion vector by wave_strength_by_distance reduces the distortion near the top edge (as we explained earlier); and multiplying by wave_strength increases the amplitude of the distortion by scaling the vector returned by the method.
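Putting those two lines together, the whole method could look roughly like this. It is a sketch reconstructed from the description above, so the repo code in the screenshot remains the authoritative version; in particular, the exact way the ramp and the time-based drift are computed is my assumption.

```gdshader
vec2 get_wave_offset(vec2 uv, float time) {
    // Ramp the wave strength from 0 at the top edge up to 1 at wave_threshold.
    float wave_strength_by_distance = min(uv.y / wave_threshold, 1.0);
    // Drift the noise sampling point over time so the distortion animates;
    // wave_speed makes the drift (and thus the changes) faster.
    vec2 noise_uv = uv + vec2(time * wave_speed);
    float noise_value = texture(wave_noise, noise_uv).r;
    // Scale the offset by the edge ramp and by the configured amplitude.
    return vec2(noise_value) * wave_strength_by_distance * wave_strength;
}
```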

And that’s it—the vector returned by get_wave_offset() is then used in line 58 of the fragment() method to offset the sampling of the point being reflected in the water. The effect would be like the lower part of this figure:

Our final rendered water movement

13 October 2024

How to implement water reflections in Godot

Our Japanese image
I love the simplicity of Godot's shaders. It's amazing how easy it is to apply many of the visual effects that bring a game to life. I've been learning some of those effects and want to practice them in a test scene that combines several of them.

The scene will depict a typical Japanese postcard in pixel art: a cherry blossom tree swaying in the wind, dropping flowers, framed by a starry night with a full moon, Mount Fuji in the background, and a stream of water in the foreground. Besides the movement of the tree, I want to implement effects like falling petals, rain, wind, cloud movement, lightning, and water effects—both in its movement and reflection. In this first article, I’ll cover how to implement the water reflection effect.

The first thing to note is that the source code for this project is in my GitHub repository. The link points to the version of the project that implements the reflection. In future projects, I will tag the commits where the code is included, and those are the commits I’ll link to in the article. As I progress with the articles, I might tweak earlier code, so it's important to download the commit I link to, ensuring you see the same code discussed in the article.

I recommend downloading the project and opening it in Godot so you can see how I’ve set up the scene. You’ll see I’ve layered 2D Sprites in the following Z-Index order:

  1. Starry sky
  2. Moon
  3. Mount Fuji
  4. Large black cloud
  5. Medium gray cloud
  6. Small cloud
  7. Grass
  8. Tree
  9. Water
  10. Grass occluder

Remember that sprites with a lower Z-Index are drawn before those with a higher one, allowing the latter to cover the former.

The grass occluder is a kind of patch I use to cover part of the water sprite so that its rectangular shape isn’t noticeable. We’ll explore why I used it later.

Water Configuration

Let's focus on the water. Its Sprite2D (Water) is just a white rectangle. I moved and resized it to occupy the entire lower third of the image. To keep the scene more organized, I placed it in its own layer (Layer 6) by setting both its "Visibility Layer" and "Light Mask" in the CanvasItem configuration.

In that same CanvasItem section, I set its Z-Index to 1000 to ensure it appears above all the other sprites in the scene, except for the occluder. This makes sense not only from an artistic point of view, as the water is in the foreground, but also because the technique we’ll explore only reflects images of objects with a Z-Index lower than the shader node's. We’ll see why in a moment.

Lastly, I assigned a ShaderMaterial based on a GDShader to the water’s "Material" property.

Let’s check out the code for that shader.

Water Reflection Shader

This shader is in the file Shaders/WaterShader.gdshader in the repository. As the scene is 2D, it's a canvas_item shader.

It provides the following parameters to the outside:

Shader variables we offer to the inspector

All of these can be set from the inspector, except for screen_texture:

Shader Inspector

Remember, as I’ve mentioned in previous articles, a uniform Sampler2D with the hint_screen_texture attribute is treated as a texture that constantly updates with the image drawn on the screen. This allows us to access the colors present on the screen and replicate them in the water reflection.

This shader doesn’t distort the image; it only manipulates the colors, so it only has a fragment() component:

fragment() method

As seen in the previous code, the shader operates in three steps.

The first step (line 33) calculates which point on the screen image is reflected at the point of water the shader is rendering.

Once we know which point to reflect, the screen texture is sampled at that point to get its color (line 38).

Finally, the reflected color is blended with the desired water color to produce the final color to display when rendering. Remember, in GDShader, the color stored in the COLOR variable will be the one displayed for the pixel being rendered.

We’ve seen the second step (line 38) in other shaders in previous articles. It's basically like using the eyedropper tool in Photoshop or GIMP: it picks the color of a specific point on an image. Keep in mind that, when the shader executes, the image consists only of the sprites drawn up to that point. That is, sprites with a Z-Index lower than the one with our shader. This makes sense: you can’t sample a color from the screen that hasn’t been drawn yet. If objects are missing from the reflection, it’s likely because their Z-Index is higher than the shader node’s.
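The three steps can be sketched like this. The method signatures are my approximations of what the article describes; the screenshot above shows the repo’s exact code.

```gdshader
shader_type canvas_item;

// Live copy of everything drawn so far (sprites with a lower Z-Index than this node).
uniform sampler2D screen_texture : hint_screen_texture;

void fragment() {
    // 1. Which screen point is mirrored onto this water pixel?
    vec2 mirrored_uv = get_mirrored_coordinate(UV, SCREEN_UV);
    // 2. Pick that point's color from the screen texture.
    vec4 mirrored_color = texture(screen_texture, mirrored_uv);
    // 3. Tint it with the water color before outputting it.
    COLOR = get_resulting_water_color(mirrored_color, UV);
}
```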

Now, let’s see how to calculate the screen coordinate to reflect:

get_mirrored_coordinate() method

This method tests your understanding of the different coordinate types used in a 2D shader.

In 2D, you have two types of coordinates:

UV coordinates: These have X and Y components, both ranging from 0 to 1. In the rectangle of the water sprite that we want to render with our shader, the origin of the coordinates is the top-left corner of the rectangle. X increases to the right, and Y increases downward. The bottom-right corner of the water rectangle corresponds to the coordinate limit (1, 1).

SCREEN_UV coordinates: These are oriented the same way as UV coordinates, but the origin is in the top-left corner of the screen, and the coordinate limit is at the bottom-right corner of the screen.

Note that when the water sprite is fully displayed on the screen, the UV coordinates span a subset of the SCREEN_UV coordinates.

To better understand how the two types of UV coordinates work, refer to the following diagram:

UV coordinate types

The diagram schematically represents the scene we’re working with. The red area represents the image displayed on the screen, which in our case includes the sky, clouds, moon, mountain, tree, grass, and water. The blue rectangle represents the water, specifically the rectangular sprite where we’re applying the shader.

Both coordinate systems start in the top-left corner. The SCREEN_UV system starts in the screen’s top-left corner, while the UV system starts in the top-left corner of the sprite being rendered. In both cases, the end of the coordinate system (1, 1) is in the bottom-right corner of the screen and the sprite, respectively. These are normalized coordinates, meaning we always work within the range of 0 to 1, regardless of the element’s actual size.

To explain how to calculate the reflection, I’ve included a triangle to represent Mount Fuji. The triangle in the red area is the mountain itself, while the triangle in the blue area represents its reflection.

Suppose our shader is rendering the coordinate (0.66, 0.66), as represented in the diagram (please note, the measurements are approximate). The shader doesn’t know what color to show for the reflection, so it needs to sample the color from a point in the red area. But which point?

Calculating the X-coordinate of the reflected point is easy because it’s the same as the reflection point: 0.66.

The trick lies in the Y-coordinate. If the reflection point is at UV.Y = 0.66, it means it's 1 - 0.66 = 0.33 away from the bottom edge (rounded to two decimal places for clarity). In our case, where the image to be reflected is above and its reflection appears below, the natural expectation is that the image will appear vertically inverted. Therefore, if the reflection point was 0.33 away from the bottom edge of the rectangle, the reflected point will be 0.33 away from the top edge of the screen. Thus, the Y-coordinate of the reflected point will be 0.33. This is precisely the calculation done in line 11 of the get_mirrored_coordinate() method.

So, as the shader scans the rectangle from left to right and top to bottom to render its points, it samples the screen from left to right and bottom to top (note the difference) to acquire the colors to reflect.

This process has two side effects to consider.

The first is that if the reflection surface (our shader’s rectangle) has less vertical height than the reflected surface (the screen), as in our case, the reflection will be a "squashed" version of the original image. You can see what I mean in the image at the start of the article. In our case, this isn’t a problem; it’s even desirable as it gives more depth, mimicking the effect you'd see if the water’s surface were longitudinal to our line of sight.

The second side effect is that, as we scan the screen to sample the reflected colors, there will come a point where we sample the lower third of the screen where the water rectangle itself is located. An interesting phenomenon will occur: What will happen when the reflection shader starts sampling pixels where it’s rendering the reflection? In the best case, the color sampled will be black because we’re trying to sample pixels that haven’t been painted yet (that’s precisely the job of our shader). So what will be reflected is a black blotch. To avoid this, we must ensure that our sampling doesn’t dip below the screen height where the water rectangle begins.

Using our example from the image at the beginning of the article, we can estimate that the water rectangle occupies the lower third of the screen. Therefore, sampling should only take place between SCREEN_UV.Y = 0 and SCREEN_UV.Y = 0.66. To achieve this, I use line 13 of get_mirrored_coordinate(). The mix() method linearly interpolates between its first two parameters, using the third as the blend factor. For example, mix(0, 0.66, 0.5) points to the midpoint between 0 and 0.66, giving a result of 0.33.

By limiting the vertical range of the pixels to sample for reflection, we ensure that only the part of the screen we care about is reflected.

With all this in place, we now have the screen coordinate that we need to sample in order to get the color to reflect in the pixel our shader is rendering (line 15 of get_mirrored_coordinate()).

This coordinate will then be used in line 38 of the fragment() method to sample the screen.
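Putting lines 11, 13, and 15 together, get_mirrored_coordinate() might look like the following sketch. The uniform name max_mirrored_y, the parameter list, and the exact remapping are my assumptions; in the repo the sampling limit may be hard-coded or named differently.

```gdshader
// Highest SCREEN_UV.y we allow ourselves to sample: where the water rectangle begins.
uniform float max_mirrored_y = 0.66;

vec2 get_mirrored_coordinate(vec2 uv, vec2 screen_uv) {
    // Vertical flip: distance from the sprite's bottom edge becomes
    // distance from the screen's top edge.
    float mirrored_y = 1.0 - uv.y;
    // Remap into [0, max_mirrored_y] so we never sample the still-unpainted
    // water area below that limit.
    mirrored_y = mix(0.0, max_mirrored_y, mirrored_y);
    // X is unchanged: the reflection sits directly below what it mirrors.
    return vec2(screen_uv.x, mirrored_y);
}
```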

Once the color from that point on the screen is obtained, we could directly assign it to the COLOR property of the pixel being rendered by our shader. However, this would create a reflection with colors that are exactly the same as the reflected object, which is not very realistic. Typically, a reflective surface will overlay a certain tint on the reflected colors, due to dirt on the surface or the surface's own color. In our case, we will assume the water is blue, so we need to blend a certain amount of blue into the reflected colors. This is handled by the get_resulting_water_color() method, which is called from line 40 of the main fragment() method.

get_resulting_water_color() method

The main effect of this method is that the water becomes bluer as you get closer to the bottom edge of the water rectangle; conversely, the closer you are to the top edge, the more the original reflected colors dominate. For this reason, the mix() method is used in line 29 of get_resulting_water_color(). The higher the third parameter (water_color_mix), the closer the resulting color will be to the second parameter (water_color, set in the inspector). If the third parameter is zero, mix() will return the first parameter's color (highlighted_color).

From this basic behavior, there are a couple of additional considerations. In many implementations, UV.Y is used as the third parameter of mix(). However, I chose to add the option to configure a limit on the maximum intensity of the water's color. This is done in lines 25-28 using the clamp() method. This method will return the value of currentUV.y as long as it falls within the range limited by 0.0 and water_color_intensity. If the value is below the lower limit, the method returns 0.0, and if it exceeds the upper limit, it will return the value of water_color_intensity. Since the result of clamp() is passed to mix(), this ensures that the third parameter's value will never exceed the limit set in the inspector, via the uniform water_color_intensity.

Another consideration is that I’ve added a brightness boost for the reflected images, implemented in lines 20 to 22. I’ve configured a uniform called mirrored_colors_intensity to define this boost in the inspector. In line 22, this value is added to the three color components of the reflected color, which in practice increases its brightness; the same line also clamps the result so it doesn’t exceed the valid color range, although this check may be redundant.
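Based on that description, the method could be sketched as follows. The uniform defaults are illustrative, and the exact arithmetic is my approximation of the repo code shown in the screenshot above.

```gdshader
uniform vec4 water_color : source_color = vec4(0.1, 0.3, 0.8, 1.0);
uniform float water_color_intensity = 0.6;     // cap on how blue the water gets
uniform float mirrored_colors_intensity = 0.2; // brightness boost for reflections

vec4 get_resulting_water_color(vec4 mirrored_color, vec2 current_uv) {
    // Brighten the reflected color, clamped to the valid color range.
    vec3 boosted = clamp(mirrored_color.rgb + vec3(mirrored_colors_intensity), 0.0, 1.0);
    vec4 highlighted_color = vec4(boosted, mirrored_color.a);
    // The lower the pixel, the bluer the water, capped by water_color_intensity.
    float water_color_mix = clamp(current_uv.y, 0.0, water_color_intensity);
    return mix(highlighted_color, water_color, water_color_mix);
}
```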

The Occluder

Remember that we mentioned this shader can only reflect sprites that were drawn before the shader itself. In other words, it can only reflect sprites with a lower Z-Index than the shader. Since we want to reflect all sprites (even partially the grass sprite), this means the water rectangle needs to be placed above all other elements.

If we simply place the water rectangle above everything else, the result would look like this:

Water without occluder

It would look odd for the shoreline to be perfectly straight. That’s why I’ve added an occluder.

An occluder is a sprite fragment that acts like a patch to cover something. It usually shares the color of the element it overlaps so that they appear to be a single piece. In my case, I’ve used the following occluder:

Occluder

The occluder I used has the same color as the grass, so when placed over the lower part of the grass, it doesn't look like two separate pieces. On the other hand, by covering the top edge of the water rectangle, it makes the shoreline more irregular, giving it a more natural appearance.

The result is the image that opened this article, which I will reproduce here again to show the final result in detail:

Water with occluder

With this, I conclude this article. In a future article, I will cover how to implement some movement in the water to bring it to life.