27 October 2024

"Make Online Games Using Unity's NEW Multiplayer Framework" course by GameDevTV

Tanks
I've been taking courses for some time now to learn the multiplayer functionalities offered by both Godot and Unity. It’s a very complex field that you should dive into only after mastering the engine basics, and both engines have received major updates recently, so it’s essential to choose a course that’s up-to-date.

In this case, I’ve been dedicating time to the course Make Online Games Using Unity's NEW Multiplayer Framework, offered by GameDevTV on the Udemy platform. They also offer it on their own GameDevTV platform, but I got it during a sale on Udemy.

The course starts with the development of a simple shooting game where each player’s tank moves around a top-down arena, shooting at other players' tanks. As in other GameDevTV courses, beyond the core multiplayer focus, it also covers various typical game mechanics, such as movement, player input, shooting, health management, collectibles, GUI, and particle systems (I especially liked how they implemented the tank tracks). All of these mechanics are explained in detail, and the code provided for implementing them is high quality. Overall, I felt that the instructor is experienced and adheres to best coding practices.

As for the networking itself, it’s a vast topic, and the course explains it well. I would have liked a bit more focus on the fundamental concepts of RPCs, as even after finishing the course I still feel I haven’t fully grasped some of the nuances. My impression is that the RPC section explains the how but could put more emphasis on the why behind certain approaches. That said, I found RPCs to work much like they do in Godot, so if you’re already familiar with them there, the Unity approach should come easily. I admit I’m not quite there yet.

Where the course does go into extensive, valuable detail is in integrating Unity Game Services (UGS) multiplayer functionalities. This is great because UGS provides numerous services ready to use, saving us the trouble of implementing and maintaining them from scratch. When analyzing all that UGS offers, it’s clear that Godot is still quite underdeveloped in this area. Even comparing what W4 Games provides to UGS, the former comes up short, offering only a subset of what the latter has already developed.

Overall, my impression of the course is very positive. I think it covers the most important aspects for a game with up to several dozen players. If you’re looking to create an MMORPG with Unity, this course probably won’t suit your needs, but it’s also unclear if Unity and UGS are the best options for a game of that type.

In summary, it’s a highly recommended course for anyone interested in making multiplayer games. However, I would suggest not taking it until you’re truly comfortable with the engine. It’s at a medium-advanced level.

How to implement a tree moved by wind in Godot

Cherry tree
In previous articles, I explained how to use shaders to make the water in a 2D scene more realistic. Shaders can achieve much more than that. Another common use is to make vegetation move as if swayed by the wind. This adds life to the scene, making its elements appear more organic.

To explain how to achieve this, I’ll evolve the scene from the previous articles. As before, I recommend you download the repository at the specific commit for this article, to ensure that the code you study is exactly the same version that I’ll be showing here.

In the previous shaders, we manipulated the colors to display from the fragment() method, but before that, Godot shaders execute another method called vertex(). While fragment() is used to manipulate the color of the pixels on the rendered surface, vertex() is used to deform the surface by manipulating its vertices. This is highly effective in 3D, as it allows for dynamic mesh deformation, simulating effects like ocean waves or the groove left behind when something drags through snow. In 2D it is also useful, though somewhat limited, since any sprite has only four vertices that we can manipulate (see the skeleton sketch after this list):

  • Top left: VERTEX(0,0)
  • Top right: VERTEX(Width,0)
  • Bottom left: VERTEX(0,Height)
  • Bottom right: VERTEX(Width,Height)
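Before looking at the tree shader itself, this is the bare skeleton of a canvas_item shader with both methods (a minimal sketch of the structure, not the article’s actual file):

```gdshader
shader_type canvas_item;

void vertex() {
    // Runs once per vertex; VERTEX holds the vertex position in pixels
    // and can be reassigned to deform the surface.
}

void fragment() {
    // Runs once per rendered pixel; COLOR holds the color to draw.
}
```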

The VERTEX concept is equivalent to UV, which we used in the fragment() method. Just as fragment() executes for every pixel on the rendered surface, receiving its relative position in the UV variable, the vertex() method executes for every vertex on the surface, receiving its position in the VERTEX variable. The key difference is that UV coordinates are normalized between 0 and 1, whereas VERTEX coordinates are not normalized and range over the width and height in pixels of the sprite where we apply the shader. If you’re still unsure what UV coordinates are, I recommend reviewing the article where I explained them.

It doesn’t matter if a sprite has an irregular shape; in shader terms, it is rectangular, with one vertex at each corner, and simply filled with transparent color (alpha = 0) wherever there is no drawing (assuming the sprite image has an alpha channel).

Although a 2D sprite lacks the vertex resolution of a 3D mesh, by manipulating the position of its vertices, we can stretch and compress the sprite. This is very effective for simulating vegetation that is compressed, stretched, and ultimately moved by the wind.

Looking at the code in the repository’s example, you’ll see that I’ve applied a shader to the tree, located in Shaders/Tree.gdshader. Being a 2D shader, it is of type canvas_item, like those in the previous articles.

Shader configuration

The shader exports the following variables for editing through the inspector (a sketch of their declarations follows the list):

Shader variables exported to inspector.
  1. wind_strength: This variable models the wind's strength. The greater it is, the more the tree canopy will sway.
  2. horizontal_offset: This creates an asymmetrical sway. The closer this value is to 1.0, the more the canopy tends to sway to the right; conversely, the closer it is to -1.0, the more it will lean left. If the value is zero, the sway will be symmetrical.
  3. maximum_oscillation_amplitude: This defines the maximum displacement (in pixels) for the tree canopy. It’s the displacement reached when wind_strength is 1.0.
  4. time: This variable isn’t intended for user manipulation but for an external script. It synchronizes the canopy's oscillation with the particle emitter that releases flowers. The particle emitter will be covered in a later article, so please ignore it for now. If there were no particle emitter, I could have used the TIME variable, which is internal to the shader and provides a counter with the shader’s active time.
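As a reference, the declarations behind that inspector panel might look like this (a sketch: the names match the article, but the hint ranges and default values are my assumptions):

```gdshader
shader_type canvas_item;

uniform float wind_strength : hint_range(0.0, 1.0) = 0.2;
uniform float horizontal_offset : hint_range(-1.0, 1.0) = 0.0;
uniform float maximum_oscillation_amplitude = 10.0;
// Fed by an external script to stay in sync with the particle emitter.
uniform float time = 0.0;
```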

Let’s now examine the vertex() method:

vertex() method

In line 11, the code limits the manipulation to the top vertices. Why limit ourselves to the top vertices? Because vegetation is firmly anchored to the ground by its roots, so only the top sways in the wind. Therefore, we only modify the top vertices, leaving the bottom ones in their default position. In theory, I should have been able to filter using VERTEX.y instead of UV.y, but when I tried, I didn’t get the desired effect since the entire image swayed, not just the top vertices.

Also, keep in mind that the comparison in line 11 cannot be for exact equality because floating-point numbers are not precise. Hence, the comparison is not "equal to zero" but "very close to zero."

The oscillation is generated in line 14 using a sinusoid with a maximum amplitude set by wind_strength. Remember that a sinusoid produces values between -1 and +1, so multiplying it by wind_strength makes the resulting oscillation range between -wind_strength and +wind_strength. Since wind_strength is limited between 0 and 1, the oscillation will never exceed -1 and +1, even at wind_strength's maximum value.

Note that inside the sinusoid there are two terms. One is time, allowing the sinusoid to change as time progresses. The other is VERTEX.x, which varies for each top vertex, allowing their oscillations to be unsynchronized, creating a stretching and compressing effect. If we were rendering vegetation without foliage, such as grass, and didn’t want this stretching and compressing effect, we could remove the VERTEX.x term, making the two vertices move in sync.

In line 15, we multiply by the maximum displacement to achieve the desired oscillation amplitude.

In line 16, we add the displacement to the vertex's horizontal position.

Finally, in line 17, we register the vertex's new position.
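Putting those steps together, the vertex() method might look roughly like this (a sketch reconstructed from the description above; the exact expressions, such as where horizontal_offset enters the formula, are my assumptions):

```gdshader
void vertex() {
    // Only the top vertices sway; the bottom ones stay anchored like roots.
    // Compare against "very close to zero", not zero itself, because
    // floating-point values are not exact.
    if (UV.y < 0.0001) {
        // Sinusoid in [-wind_strength, +wind_strength]; VERTEX.x desynchronizes
        // the two top corners, and horizontal_offset biases the sway to one side.
        float oscillation = wind_strength * sin(time + VERTEX.x) + horizontal_offset;
        // Scale the oscillation to the configured amplitude in pixels.
        float displacement = oscillation * maximum_oscillation_amplitude;
        // Register the vertex's new horizontal position.
        VERTEX = vec2(VERTEX.x + displacement, VERTEX.y);
    }
}
```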

By moving only the two top vertices and leaving the bottom ones fixed, the effect is as if pinning the two bottom corners of a flexible rectangle in place and moving the two top corners.

Surface deformation achieved with the shader.

When you overlay a tree image onto a surface deformed in this way, you get the effect we aimed to create:

Movement of the tree

19 October 2024

How to implement water movement in Godot

Japanese landscape we are going to implement
In my previous article, I explained how to create a shader to reflect a scene on the surface of a sprite. However, such a reflection is only possible on an absolutely still sheet of water. Normally, water has surface disturbances that distort that reflection. In this article, we're going to enhance the previous shader so that it shows this distortion and changes over time.

As with other articles, you can find the code for this one in my GitHub repository. I recommend downloading it and opening it locally in your Godot editor to follow along with the explanations, but make sure to download the code from the specific commit I’ve linked to. I’m constantly making changes to the project in the repository, so if you download code from a later commit than the one linked, what you see might differ from what I show here.

I don’t want to repeat explanations, so if you’re not yet familiar with UV coordinates in a Godot shader, I recommend reading the article I linked to at the beginning.

Therefore, we are going to evolve the shader we applied to the rectangle of the water sprite. The code for that shader is in the file Shaders/WaterShader.gdshader.

To parameterize the effect and configure it from the inspector, I have added the following uniform variables:

Shader parameters configurable from the inspector
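A sketch of how those uniforms might be declared at the top of the existing shader (the names follow the article; the hint ranges and defaults are my assumptions):

```gdshader
// Amplitude of the reflection's distortion.
uniform float wave_strength : hint_range(0.0, 1.0) = 0.02;
// Speed of the distortion's movement.
uniform float wave_speed : hint_range(0.0, 10.0) = 1.0;
// Fraction of the sprite over which the distortion ramps up from the top edge.
uniform float wave_threshold : hint_range(0.0, 1.0) = 0.11;
// Noise image used to drive the distortion.
uniform sampler2D wave_noise;
```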

The wave_strength and wave_speed parameters are intuitive: the first defines the amplitude of the distortion of the reflected image, while the second defines the speed of the distortion movement.

However, the wave_threshold parameter requires some explanation. If you place a mirror at a person’s feet, you expect the reflection of those feet to be in contact with the feet themselves. The reflection is like a shadow: it typically begins from the feet. The problem is that the algorithm we will explore here may distort the image from the very top edge of the reflection, causing it not to be in contact with the lower edge of the reflected image. This can detract from the realism of the effect, so I added the wave_threshold parameter to define at what fraction of the sprite the distortion reaches full strength. In my example, it’s set to 0.11, which means that the distortion ramps up over a distance from the top edge of the water sprite equivalent to 11% of its height (i.e., until UV.y = 0.11).

Finally, there is the wave_noise parameter. This is an image. We could use any kind of image for a parameter like this, but our distortion algorithm calls for a very specific one: a noise image. A noise image contains random patches ranging from black to white. Those old enough might remember the static seen on analog TVs with a broken antenna or poor reception; a noise image looks very similar. We could search for a noise image online, but fortunately they are so common that Godot can generate them for us through a NoiseTexture2D resource.

Configuring the noise image

A configuration like the one shown in the figure will suffice for our case. In fact, I only changed two parameters from the default configuration: I disabled Mipmaps generation, because this image won’t be viewed from a distance, and I enabled the Seamless option because, as we will see later, we will traverse the image continuously and don’t want noticeable jumps when we reach its edge. As for the generator itself, I used the FastNoiseLite algorithm, though this detail is not crucial.

The usefulness of the noise image will become apparent now, as we dive into the shader code.

To give you an idea of the values to configure for the above variables, here are the ones I used (the screenshot doesn’t include the parameters already configured in the previous article for the static reflection):

Shader parameter values

Taking the above into account, if we look at the main function of the shader, fragment(), we’ll see that there’s very little difference from the reflection shader.

fragment() shader method

If we compare it to the code from the reflections article, we’ll notice the new addition on line 50: a call to a new method, get_wave_offset(), which returns a two-dimensional vector that is then added to the UV coordinate used to sample (in line 58) the color to reflect.
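In rough form, the updated fragment() might read like the following sketch (the helper names follow the articles, and I assume the reflection helpers and the screen_texture uniform from the previous article are unchanged):

```gdshader
void fragment() {
    // New in this article: a per-pixel offset that distorts the reflection.
    vec2 wave_offset = get_wave_offset(UV, TIME);
    // Reflection logic from the previous article.
    vec2 mirrored_uv = get_mirrored_coordinate(SCREEN_UV, UV);
    // Sampling at a slightly displaced point is what bends the reflected image.
    vec4 mirrored_color = texture(screen_texture, mirrored_uv + wave_offset);
    COLOR = get_resulting_water_color(mirrored_color, UV);
}
```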

What’s happening here? The distortion effect is achieved not by reflecting the color that corresponds to a given point, but by reflecting the color of a nearby point. What we will do is traverse the noise image. This image has colors ranging from black to pure white, that is, from a value of 0 to 1 (simultaneously in its three channels). For each UV coordinate we want to render in the water rectangle, we’ll sample an equivalent point in the noise image, and the amount of white at that point will be used to offset the sampling of the color to reflect by an equivalent amount in its X and Y coordinates. Since the noise image has no abrupt changes between black and white, the offsets will change gradually as we traverse the UV coordinates, which makes the resulting distortion change gradually as well.

Let’s review the code for the get_wave_offset() method to better understand the above:

Code for the get_wave_offset() method

The method receives the UV coordinate being rendered and the number of seconds that have passed since the game started as parameters.

Line 42 of the method relates to the wave_threshold parameter we discussed earlier. When rendering the water rectangle from its top edge, we don’t want the distortion to act with full force from that very edge because it could generate noticeable aberrations. Imagine, for example, a person standing at the water’s edge. If the distortion acted with full force from the top edge of the water, the reflection of the person’s feet would appear disconnected from the feet, which would look unnatural. So, line 42 ensures that the distortion gradually increases from 0 until UV.y reaches the wave_threshold, after which the strength remains at 1.

Line 43 samples the noise image using a call to the texture() method. If we were to simply sample using the UV coordinate alone, without considering time or speed, the result would be a reflection with a static distorted image: for each UV coordinate, we would always sample the same point in the noise image, and therefore always get the same distortion vector for that point. By adding time, however, the sampling point in the noise image changes every time the same UV coordinate is rendered, causing the distortion to vary over time.

With that in mind, the multiplicative factors are easy to understand: wave_speed makes the noise image sampling change faster, which speeds up the changes in the distorted image; multiplying the resulting distortion vector by wave_strength_by_distance reduces the distortion near the top edge (as we explained earlier); and multiplying by wave_strength increases the amplitude of the distortion by scaling the vector returned by the method.
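Gathering those pieces, a minimal sketch of get_wave_offset() under the assumptions above (whether the ramp toward wave_threshold is linear is also my assumption):

```gdshader
vec2 get_wave_offset(vec2 uv, float time) {
    // Ramp the distortion in from 0.0 at the top edge up to full strength
    // once UV.y reaches wave_threshold (the role of line 42).
    float wave_strength_by_distance = clamp(uv.y / wave_threshold, 0.0, 1.0);
    // Sample the seamless noise; advancing the coordinate by time * wave_speed
    // moves the sampling point, so the distortion changes over time (line 43).
    vec2 noise_sample = texture(wave_noise, uv + time * wave_speed).rg;
    return noise_sample * wave_strength_by_distance * wave_strength;
}
```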

And that’s it—the vector returned by get_wave_offset() is then used in line 58 of the fragment() method to offset the sampling of the point being reflected in the water. The effect would be like the lower part of this figure:

Our final rendered water movement

13 October 2024

How to implement water reflections in Godot

Our Japanese image
I love the simplicity of Godot's shaders. It's amazing how easy it is to apply many of the visual effects that bring a game to life. I've been learning some of those effects and want to practice them in a test scene that combines several of them.

The scene will depict a typical Japanese postcard in pixel art: a cherry blossom tree swaying in the wind, dropping flowers, framed by a starry night with a full moon, Mount Fuji in the background, and a stream of water in the foreground. Besides the movement of the tree, I want to implement effects like falling petals, rain, wind, cloud movement, lightning, and water effects—both in its movement and reflection. In this first article, I’ll cover how to implement the water reflection effect.

The first thing to note is that the source code for this project is in my GitHub repository. The link points to the version of the project that implements the reflection. In future articles, I will tag the commits where each article's code is included, and those are the commits I’ll link to. As I progress with the articles, I might tweak earlier code, so it's important to download the commit I link to, ensuring you see the same code discussed in the article.

I recommend downloading the project and opening it in Godot so you can see how I’ve set up the scene. You’ll see I’ve layered 2D Sprites in the following Z-Index order:

  1. Starry sky
  2. Moon
  3. Mount Fuji
  4. Large black cloud
  5. Medium gray cloud
  6. Small cloud
  7. Grass
  8. Tree
  9. Water
  10. Grass occluder

Remember that sprites with a lower Z-Index are drawn before those with a higher one, allowing the latter to cover the former.

The grass occluder is a kind of patch I use to cover part of the water sprite so that its rectangular shape isn’t noticeable. We’ll explore why I used it later.

Water Configuration

Let's focus on the water. Its Sprite2D (Water) is just a white rectangle. I moved and resized it to occupy the entire lower third of the image. To keep the scene more organized, I placed it in its own layer (Layer 6) by setting both its "Visibility Layer" and "Light Mask" in the CanvasItem configuration.

In that same CanvasItem section, I set its Z-Index to 1000 to ensure it appears above all the other sprites in the scene, except for the occluder. This makes sense not only from an artistic point of view, as the water is in the foreground, but also because the technique we’ll explore only reflects images of objects with a Z-Index lower than the shader node's. We’ll see why in a moment.

Lastly, I assigned a ShaderMaterial based on a GDShader to the water’s "Material" property.

Let’s check out the code for that shader.

Water Reflection Shader

This shader is in the file Shaders/WaterShader.gdshader in the repository. As the scene is 2D, it's a canvas_item shader.

It provides the following parameters to the outside:

Shader variables we offer to the inspector

All of these can be set from the inspector, except for screen_texture:

Shader Inspector

Remember, as I’ve mentioned in previous articles, a uniform Sampler2D with the hint_screen_texture attribute is treated as a texture that constantly updates with the image drawn on the screen. This allows us to access the colors present on the screen and replicate them in the water reflection.
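These declarations might look like the following sketch (the names are taken from the article; the color default and hint ranges are my assumptions):

```gdshader
shader_type canvas_item;

// Constantly updated with the image already drawn on the screen.
uniform sampler2D screen_texture : hint_screen_texture;
// Tint of the water, blended into the reflected colors.
uniform vec4 water_color : source_color = vec4(0.1, 0.3, 0.7, 1.0);
// Cap on how strongly the water color can dominate.
uniform float water_color_intensity : hint_range(0.0, 1.0) = 0.5;
// Brightness boost applied to the reflected colors.
uniform float mirrored_colors_intensity : hint_range(0.0, 1.0) = 0.1;
```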

This shader doesn’t distort the image; it only manipulates the colors, so it only has a fragment() component:

fragment() method

As seen in the previous code, the shader operates in three steps.

The first step (line 33) calculates which point on the screen image is reflected at the point of water the shader is rendering.

Once we know which point to reflect, the screen image is sampled at that point to get its color (line 38).

Finally, the reflected color is blended with the desired water color to produce the final color to display when rendering. Remember, in GDShader, the color stored in the COLOR variable will be the one displayed for the pixel being rendered.

We’ve seen the second step (line 38) in other shaders in previous articles. It's basically like using the eyedropper tool in Photoshop or GIMP: it picks the color of a specific point on an image. Keep in mind that, when the shader executes, the image consists only of the sprites drawn up to that point. That is, sprites with a Z-Index lower than the one with our shader. This makes sense: you can’t sample a color from the screen that hasn’t been drawn yet. If objects are missing from the reflection, it’s likely because their Z-Index is higher than the shader node’s.
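Those three steps could be summarized in a sketch like this (the line numbers in the comments refer to the article's screenshots; the helper signatures are my assumptions):

```gdshader
void fragment() {
    // Step 1 (line 33): which screen point is mirrored at this water pixel?
    vec2 mirrored_uv = get_mirrored_coordinate(SCREEN_UV, UV);
    // Step 2 (line 38): sample the screen at that point to get the color to reflect.
    vec4 mirrored_color = texture(screen_texture, mirrored_uv);
    // Step 3 (line 40): blend the reflected color with the water color.
    COLOR = get_resulting_water_color(mirrored_color, UV);
}
```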

Now, let’s see how to calculate the screen coordinate to reflect:

get_mirrored_coordinate() method

This method tests your understanding of the different coordinate types used in a 2D shader.

In 2D, you have two types of coordinates:

UV coordinates: These have X and Y components, both ranging from 0 to 1. In the rectangle of the water sprite that we want to render with our shader, the origin of the coordinates is the top-left corner of the rectangle. X increases to the right, and Y increases downward. The bottom-right corner of the water rectangle corresponds to the coordinate limit (1, 1).

SCREEN_UV coordinates: These are oriented the same way as UV coordinates, but the origin is in the top-left corner of the screen, and the coordinate limit is at the bottom-right corner of the screen.

Note that when the water sprite is fully displayed on the screen, the UV coordinates span a subset of the SCREEN_UV coordinates.

To better understand how the two types of UV coordinates work, refer to the following diagram:

UV coordinate types

The diagram schematically represents the scene we’re working with. The red area represents the image displayed on the screen, which in our case includes the sky, clouds, moon, mountain, tree, grass, and water. The blue rectangle represents the water, specifically the rectangular sprite where we’re applying the shader.

Both coordinate systems start in the top-left corner. The SCREEN_UV system starts in the screen’s top-left corner, while the UV system starts in the top-left corner of the sprite being rendered. In both cases, the end of the coordinate system (1, 1) is in the bottom-right corner of the screen and the sprite, respectively. These are normalized coordinates, meaning we always work within the range of 0 to 1, regardless of the element’s actual size.

To explain how to calculate the reflection, I’ve included two triangles representing Mount Fuji: the one in the red area is the mountain itself, while the one in the blue area represents its reflection.

Suppose our shader is rendering the coordinate (0.66, 0.66), as represented in the diagram (please note, the measurements are approximate). The shader doesn’t know what color to show for the reflection, so it needs to sample the color from a point in the red area. But which point?

Calculating the X-coordinate of the reflected point is easy because it’s the same as the reflection point: 0.66.

The trick lies in the Y-coordinate. If the reflection point is at UV.Y = 0.66, it means it's 1 - 0.66 = 0.33 away from the bottom edge (rounded to two decimal places for clarity). In our case, where the image to be reflected is above and its reflection appears below, the natural expectation is that the image will appear vertically inverted. Therefore, if the reflection point was 0.33 away from the bottom edge of the rectangle, the reflected point will be 0.33 away from the top edge of the screen. Thus, the Y-coordinate of the reflected point will be 0.33. This is precisely the calculation done in line 11 of the get_mirrored_coordinate() method.

So, as the shader scans the rectangle from left to right and top to bottom to render its points, it samples the screen from left to right and bottom to top (note the difference) to acquire the colors to reflect.

This process has two side effects to consider.

The first is that if the reflection surface (our shader’s rectangle) has less vertical height than the reflected surface (the screen), as in our case, the reflection will be a "squashed" version of the original image. You can see what I mean in the image at the start of the article. In our case, this isn’t a problem; it’s even desirable, as it adds depth, mimicking the foreshortening you’d see if the water’s surface stretched away along our line of sight.

The second side effect is that, as we scan the screen to sample the reflected colors, there will come a point where we reach the lower third of the screen, where the water rectangle itself is located. What happens when the reflection shader starts sampling pixels from the very area where it’s rendering the reflection? In the best case, the sampled color will be black, because we’re trying to sample pixels that haven’t been painted yet (painting them is precisely the job of our shader), so the reflection would show a black blotch. To avoid this, we must ensure that our sampling doesn’t dip below the screen height where the water rectangle begins.

Using our example from the image at the beginning of the article, we can estimate that the water rectangle occupies the lower third of the screen. Therefore, sampling should only take place between SCREEN_UV.Y = 0 and SCREEN_UV.Y = 0.66. To achieve this, I use line 13 of get_mirrored_coordinate(). This mix() method allows you to obtain the value of its third parameter within a range defined by the first two. For example, mix(0, 0.66, 0.5) would point to the midpoint between 0 and 0.66, giving a result of 0.33.

By limiting the vertical range of the pixels to sample for reflection, we ensure that only the part of the screen we care about is reflected.

With all this in place, we now have the screen coordinate that we need to sample in order to get the color to reflect in the pixel our shader is rendering (line 15 of get_mirrored_coordinate()).

This coordinate will then be used in line 38 of the fragment() method to sample the screen.
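Putting the whole calculation together, here is a sketch of get_mirrored_coordinate(). The uniform marking where the water begins on screen is a hypothetical name; the article doesn’t say what it calls that value:

```gdshader
// Hypothetical name: fraction of the screen height above the water rectangle.
uniform float screen_portion_to_reflect : hint_range(0.0, 1.0) = 0.66;

vec2 get_mirrored_coordinate(vec2 screen_uv, vec2 uv) {
    // The X-coordinate is unchanged by a vertical mirror.
    float mirrored_x = screen_uv.x;
    // A point at distance (1.0 - uv.y) from the sprite's bottom edge reflects the
    // screen point at that same distance from the top edge (line 11's calculation).
    float mirrored_y = 1.0 - uv.y;
    // Keep the sampling above the water rectangle itself (line 13's mix()).
    mirrored_y = mix(0.0, screen_portion_to_reflect, mirrored_y);
    return vec2(mirrored_x, mirrored_y);
}
```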

Once the color from that point on the screen is obtained, we could directly assign it to the COLOR property of the pixel being rendered by our shader. However, this would create a reflection with colors that are exactly the same as the reflected object, which is not very realistic. Typically, a reflective surface will overlay a certain tint on the reflected colors, due to dirt on the surface or the surface's own color. In our case, we will assume the water is blue, so we need to blend a certain amount of blue into the reflected colors. This is handled by the get_resulting_water_color() method, which is called from line 40 of the main fragment() method.

get_resulting_water_color() method

The main effect of this method is that the water becomes bluer as you approach the bottom edge of the water rectangle; conversely, the closer you are to the top edge, the more the original reflected colors dominate. For this reason, the mix() method is used in line 29 of get_resulting_water_color(). The higher the third parameter (water_color_mix), the closer the resulting color will be to the second parameter (water_color, set in the inspector). If the third parameter is zero, mix() will return the first parameter's color (highlighted_color).

From this basic behavior, there are a couple of additional considerations. In many implementations, UV.Y is used as the third parameter of mix(). However, I chose to add the option to configure a limit on the maximum intensity of the water's color. This is done in lines 25-28 using the clamp() method. This method will return the value of currentUV.y as long as it falls within the range limited by 0.0 and water_color_intensity. If the value is below the lower limit, the method returns 0.0, and if it exceeds the upper limit, it will return the value of water_color_intensity. Since the result of clamp() is passed to mix(), this ensures that the third parameter's value will never exceed the limit set in the inspector, via the uniform water_color_intensity.

Another consideration is that I’ve added a brightness boost for the reflected images. This is done between lines 20 and 22. I’ve configured a uniform called mirrored_colors_intensity to define this boost in the inspector. In line 22, this value is used to increase the three color components of the reflected color, which in practice increases its brightness; there, I also ensure that the resulting value does not exceed the color limits, although this check may be redundant.
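In code, the method might look like this sketch (I’ve assumed the brightness boost is additive; the article doesn’t show whether it adds or multiplies):

```gdshader
vec4 get_resulting_water_color(vec4 mirrored_color, vec2 current_uv) {
    // Brightness boost for the reflected colors, clamped so that no
    // channel exceeds 1.0 (lines 20-22).
    vec3 highlighted_color = clamp(mirrored_color.rgb + mirrored_colors_intensity, 0.0, 1.0);
    // The lower we render in the water rectangle, the more the water color
    // dominates, capped by water_color_intensity (lines 25-28).
    float water_color_mix = clamp(current_uv.y, 0.0, water_color_intensity);
    // Blend the reflection with the water color (line 29).
    vec3 final_color = mix(highlighted_color, water_color.rgb, water_color_mix);
    return vec4(final_color, 1.0);
}
```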

The Occluder

Remember that we mentioned this shader can only reflect sprites that were drawn before the shader itself. In other words, it can only reflect sprites with a lower Z-Index than the shader. Since we want to reflect all sprites (even partially the grass sprite), this means the water rectangle needs to be placed above all other elements.

If we simply place the water rectangle above everything else, the result would look like this:

Water without occluder

It would look odd for the shoreline to be perfectly straight. That’s why I’ve added an occluder.

An occluder is a sprite fragment that acts like a patch to cover something. It usually shares the color of the element it overlaps so that they appear to be a single piece. In my case, I’ve used the following occluder:

Occluder

The occluder I used has the same color as the grass, so when placed over the lower part of the grass, it doesn't look like two separate pieces. On the other hand, by covering the top edge of the water rectangle, it makes the shoreline more irregular, giving it a more natural appearance.

The result is the image that opened this article, which I will reproduce here again to show the final result in detail:

Water with occluder

With this, I conclude this article. In a future article, I will cover how to implement some movement in the water to bring it to life.