26 December 2024

Automated testing (TDD) in Godot - GdUnit4

GdUnit4 logo
TDD (Test-Driven Development) is a software development methodology where you write tests first and then the code necessary to make those tests pass. By having to define the tests beforehand, TDD forces you to consider what functionality you want to achieve before writing any code. You need to design what inputs your test will receive and what outputs will be considered correct. These inputs and outputs will define your functionality, and having defined them in advance allows you to implement it in a cleaner and more encapsulated way.

Many people dislike this methodology because they find it boring to start with tests and prefer to dive straight into writing code and then test it manually. They often justify this by claiming they can't afford to waste time writing tests. The problem arises when their project starts growing in size and complexity, and introducing new features becomes a nightmare because older functionalities break. This often happens unnoticed, as it becomes impossible to manually test the entire application with every change. Consequently, hidden bugs accumulate with each update. By the time one of these bugs is discovered, it becomes incredibly difficult to determine which update caused it, making the resolution process a torment.

With TDD, this doesn't happen because the tests you design for each feature are added to the test suite and can be automatically re-executed when incorporating new functionalities. When a new feature breaks something from previous ones, the corresponding tests will fail, serving as an alarm to indicate where the problem lies and allowing you to resolve it efficiently.

Games are just another application. TDD can be applied equally to them, and that's why all engines have automated testing frameworks. In this article, we will focus on one of the most popular in Godot Engine: GdUnit4.

GdUnit4 allows test automation using both GDScript and C#. Since I use the latter to develop in Godot, we will focus on it throughout this article. However, if you're a GDScript user, I recommend you keep reading because many concepts are similar in that language.

Plugin Installation

To install it, go to the AssetLib tab at the top of the Godot editor. There, search for "gdunit4" and click on the result.

GdUnit4 in the AssetLib

In the pop-up window, click "Download" and accept the default installation folder.

After doing this, the plugin will be installed in your project folder but will remain disabled. To enable it, go to Project → Project Settings... → Plugins and check the Enabled box.

GdUnit4 plugin activation

I recommend restarting Godot after enabling the plugin; otherwise, errors might appear.

C# Environment Configuration

The next step is to configure your C# environment to use GdUnit4 without issues.

Ensure you have .NET 8.0 installed.

Open your .csproj file and make the following changes in the <PropertyGroup> section:

  • Change TargetFramework to net8.0.
  • Add <LangVersion>11.0</LangVersion>.
  • Add <CopyLocalLockFileAssemblies>true</CopyLocalLockFileAssemblies>.
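
Putting those changes together, the <PropertyGroup> section might end up looking roughly like this (a sketch; a real Godot-generated .csproj will contain other properties as well):

<PropertyGroup>
    <TargetFramework>net8.0</TargetFramework>
    <LangVersion>11.0</LangVersion>
    <CopyLocalLockFileAssemblies>true</CopyLocalLockFileAssemblies>
</PropertyGroup>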

Create an <ItemGroup> section (if it doesn’t already exist) and add the following:


<ItemGroup>
    <PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.9.0" />
    <PackageReference Include="gdUnit4.api" Version="4.3.*" />
    <PackageReference Include="gdUnit4.test.adapter" Version="2.*" />
</ItemGroup>


As an example, here's my .csproj file:

A csproj file adapted for GdUnit4

If you rebuild your project (either from your IDE or MSBuild) without errors, your configuration is correct.

IDE Configuration

GdUnit4 officially supports Visual Studio, Visual Studio Code, and Rider. Configuration varies slightly between them.

I use Rider, so I'll focus on that IDE.

GdUnit4 expects an environment variable named GODOT_BIN with the path to the Godot executable. I use ChickenSoft's GodotEnv tool to manage Godot versions, which maintains a GODOT variable pointing to the editor executable. I created a user-level GODOT_BIN variable pointing to GODOT. If you don't use GodotEnv, manually set the path to the executable.
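
For instance, on Windows you could set the variable from a terminal with the built-in setx command (the path below is just an example; point it at your own executable):

setx GODOT_BIN "C:\Godot\Godot_v4.3-stable_mono_win64.exe"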

Env var configuration in Windows

In Rider, make sure the Godot support plugin is enabled, and then configure VSTest adapter support:

Enable the "Enable VSTest adapters support" option and add a wildcard (*) in the "Projects with unit tests" list.

Configuration for VSTest support

Create a .runsettings file at your project root with the following content:


<?xml version="1.0" encoding="utf-8"?>
<RunSettings>
    <RunConfiguration>
        <MaxCpuCount>1</MaxCpuCount>
        <ResultsDirectory>./TestResults</ResultsDirectory>
        <TargetFrameworks>net7.0;net8.0</TargetFrameworks>
        <TestSessionTimeout>180000</TestSessionTimeout>
        <TreatNoTestsAsError>true</TreatNoTestsAsError>
    </RunConfiguration>
    <LoggerRunSettings>
        <Loggers>
            <Logger friendlyName="console" enabled="True">
                <Configuration>
                    <Verbosity>detailed</Verbosity>
                </Configuration>
            </Logger>
        </Loggers>
    </LoggerRunSettings>
    <GdUnit4>
        <DisplayName>FullyQualifiedName</DisplayName>
    </GdUnit4>
</RunSettings>


Point Rider to this file in its Test Runner settings.

Path to .runsettings configuration

Once all of the above is done, you need to restart Rider to apply the configuration.

If you want to verify that the setup is correct, you can create a folder named Tests in your project and, inside it, a C# file (you can call it ExampleTest.cs) with the following content:

Tests/ExampleTest.cs

The example is designed so that Success() will be a passing test, while Failed() will intentionally fail.
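
In case the screenshot isn't legible, a minimal version of that file might look like this (a sketch using GdUnit4's C# attributes and assertions):

using GdUnit4;
using static GdUnit4.Assertions;

[TestSuite]
public class ExampleTest
{
    [TestCase]
    public void Success()
    {
        // A trivially true assertion: this test always passes.
        AssertBool(true).IsTrue();
    }

    [TestCase]
    public void Failed()
    {
        // An intentionally failing assertion, to see what a failure looks like.
        AssertBool(false).IsTrue();
    }
}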

You can run this test either from Rider or from the Godot editor.

In Rider, you first need to enable the test panel by navigating to View → Tool Windows → Tests. From the test panel, you can run an individual test or all tests at once.

Test Execution Tab in Rider


To run them from Godot, you need to go to the GdUnit tab.

Test Execution Tab in Godot

If the Godot tab doesn’t display the tests, it might not be looking in the correct folder. To verify this, click the tools button in the top-left corner of the tab and ensure that the "Test Lookup Folder" parameter points to the folder where your tests are located.

Path to the Test Folder

If the tests still don’t appear, I recommend restarting. Both Godot and Rider sometimes fail to detect changes or newly added tests. When this happens, restarting Godot or Rider (whichever is failing) usually resolves the issue. I assume this problem will be addressed over time. In any case, from the tests I’ve conducted, those run through Rider have consistently worked much better than those executed directly from the Godot editor.

Testing a Game

All the previous setup has been arduous, but the good thing is that you only need to do it once. From then on, it’s just a matter of creating tests and running them.

There are many things to test in a game. Unit tests focus on testing specific methods and functions, but I’m going to explain something broader: integration tests. These tests evaluate the game at the same level a player would, interacting with objects and verifying whether the game reacts as expected. For this purpose, it’s common to create special test levels, where functionalities are concentrated to allow for quick and efficient testing.

To make this clear, let’s use a simple example. Imagine we have a game where there’s an element (an intelligent agent) that must move towards a specific marker's position. To verify the agent's correct behavior, we would place it at one end of the scenario, the marker at the other, and wait a few seconds before evaluating the agent's position. If it has moved close enough to the marker, we can conclude that it’s working correctly.

You can download the Godot code for this example from a specific commit in one of my repositories. If you open it in the Godot editor, load the scene Tests/TestLevels/SimpleBehaviorTestLevel.tscn, set it as the main scene (Project → Project Settings... → General → Application → Run → Main Scene) and run the game, you’ll see that it behaves just as described earlier: the red crosshair will move to where you click, and the green ball will move towards the crosshair’s position. So, manually, we can verify that the game behaves as expected. Now let’s automate this verification.

The game we’re testing

The first step is to set up the level with the necessary elements for testing. These auxiliary elements would only get in the way in the levels accessed by players, which is why dedicated test levels are usually created (hence, this one is in the folder Tests/TestLevels).

Our test level has the following structure:

Test Level Structure

The elements are as follows:

  • ClearCourtyard: This is the box where our agent moves. It’s a simple TileMap where I’ve drawn dark gray walls (impassable) and a light gray floor (where movement happens).
  • Target: This is a scene whose main node is a Marker2D, with a Sprite2D underneath containing the crosshair image. The main node has a script that listens to Input events and reacts to left mouse button clicks, placing itself at the clicked screen position.
  • SeekMovingAgent: This is the agent whose behavior we want to verify.
  • StartPosition1: This is the position where we want the agent to start at the beginning of the test.
  • TargetPosition1: This is the position where we want the Target to start at the beginning of the test.

The code for our test will consist of one or more C# files located in the Tests folder. In this case, we’ll only have one file, but we could have several if we were testing different functionalities. Each file can contain multiple tests. A test is essentially a method marked with the [TestCase] attribute. The class containing that method must be marked with the [TestSuite] attribute so that the test runner includes it when executing tests.

Our example test is located in the file Tests/SimpleBehaviorTests.cs.

The first half of the file focuses on preparations.

Test Preparation

As you can see in line 9, I’ve stored the path to the level we want to test in a constant.

We load this level in line 17. This line belongs to the method LoadScene(), which we can name however we like as long as we mark it with the [BeforeTest] attribute. This signals that LoadScene() should run just before each test. This way, we ensure that the level starts fresh each time, and that previous tests don’t interfere with subsequent ones.

There are other useful attributes:

  • [Before]: Runs the decorated method once before all tests in the class execute. Useful for initializing resources shared across tests that don’t need resetting.
  • [AfterTest] and [After]: These are the counterparts of [BeforeTest] and [Before], used for cleaning up after tests.
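
To give you an idea of how these pieces fit together, here's a sketch of the preparation part. I'm using GdUnit4's ISceneRunner as one way to load the level; the repository's exact code is in the screenshots, so treat the details here as illustrative:

using GdUnit4;
using Godot;
using System.Threading.Tasks;
using static GdUnit4.Assertions;

[TestSuite]
public class SimpleBehaviorTests
{
    // The path to the level under test, stored in a constant as mentioned above.
    private const string TestLevelPath = "res://Tests/TestLevels/SimpleBehaviorTestLevel.tscn";

    private ISceneRunner _sceneRunner;

    [BeforeTest]
    public void LoadScene()
    {
        // Load a fresh instance of the level before every test.
        _sceneRunner = ISceneRunner.Load(TestLevelPath);
    }
}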

Once we have a reference to the freshly loaded level, we can start the test:

The code for our test

Between lines 24 and 30, we gather references to the test elements using the FindChild() method. In each call, we pass the name of the node we want to retrieve. Since FindChild() returns a Node type, we need to cast it to its actual type.

Once we have the references to the elements, we place them at their starting positions. In line 33, we place _target at the position marked by _targetPosition. In line 34, we place _movingAgent at the position marked by _agentStartPosition.

We could have set these positions via code, but if they need adjustment later, it’s much easier to tweak them visually in the editor rather than editing the code.

In line 37, we activate the agent so it starts moving. In general, I recommend keeping agents deactivated by default and activating only those needed for each specific test, avoiding interference.

In line 39, we wait for 2.5 seconds, which we estimate is sufficient for the agent to reach the target.

Finally, the moment of truth arrives. In line 41, we measure the distance between the agent and the target. If this distance is smaller than a tiny threshold (as floating-point math isn’t perfectly precise), we assume the agent successfully reached its target (line 43).
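
Continuing the sketch from above, the test method itself might look like this (node names match the scene structure described earlier; the agent's type and activation mechanism are assumptions, since the actual code lives in the repository):

[TestCase]
public async Task SeekMovingAgentReachesTarget()
{
    Node level = _sceneRunner.Scene();

    // Gather references; FindChild() returns Node, so we cast to the actual types.
    var target = (Marker2D)level.FindChild("Target");
    var movingAgent = (Node2D)level.FindChild("SeekMovingAgent");
    var agentStartPosition = (Marker2D)level.FindChild("StartPosition1");
    var targetPosition = (Marker2D)level.FindChild("TargetPosition1");

    // Place everything at its starting position.
    target.GlobalPosition = targetPosition.GlobalPosition;
    movingAgent.GlobalPosition = agentStartPosition.GlobalPosition;

    // The original also activates the agent here, through a property on its script.

    // Wait long enough for the agent to reach the target.
    await _sceneRunner.AwaitMillis(2500);

    // The agent should now be within a small threshold of the target.
    float distance = movingAgent.GlobalPosition.DistanceTo(target.GlobalPosition);
    AssertFloat(distance).IsLess(1.0f);
}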

Conclusion

Like this example, many different tests can be created. The usual approach is to group related tests into a single class (a .cs file) for organizational clarity.

This test was highly visual, as it evaluated movement through position changes. However, other tests might focus on internal values of the agent, verifying specific properties using assertions.

Lastly, I highly recommend checking out the official GdUnit4 documentation. While originally focused on GDScript, it has started supporting C#. The documentation contains specific installation guidelines for C# and example code with tabs for both languages.

21 December 2024

Course "Unity Cutscenes: Master Cinematics, Animation, and Trailers" by GameDev.tv

I just finished another course from GameDev.tv included in the Humble Bundle pack I bought recently. This time, it was "Unity Cutscenes: Master Cinematics, Animation, and Trailers". Like the others, this course is available both on GameDev.tv's own platform and on Udemy. Its name is quite descriptive of its focus: teaching you the tools Unity provides for creating cinematic scenes.

As you probably know, a cinematic scene is a sequence generated in real-time, using the game engine, that narrates an important part of the story. Using cinematics has two significant advantages: on one hand, it saves the storage space required for pre-rendered video sequences; on the other hand, since it runs using the same assets as the rest of the game, there’s no risk of breaking the game's aesthetic, allowing smoother transitions between gameplay and cutscenes, and vice versa.

The course structures its 10 hours into five sections. In my opinion, the first two are the most interesting because they introduce genuinely new concepts. In the first section, it covers importing animations from Mixamo, explains the basics of rigging and skeletons (from Unity's perspective), and dives deeply into the Timeline. The second section is, in my opinion, the best because it thoroughly explains Unity's gem: the Cinemachine system. Specifically, it covers the use and transitions between multiple cameras, tracking cameras, "travelling" shots using Dolly Tracks, and automatic camera switching when a character leaves the field of view (the foundation of a camera system like the one in the first Resident Evil). All of this is explained clearly and in detail. I truly enjoyed that part.

The problem starts after that. The last three sections span just under seven hours of the course, but I think they could have been condensed into a third of that time. In these sections, the instructor stops introducing new concepts and focuses on illustrating the previously covered topics through examples. While this isn’t inherently bad, dedicating only 3.5 hours to introducing concepts and then spending over 6.5 hours repeatedly fine-tuning the same examples feels very unbalanced. It became quite tedious to keep making adjustments in the same windows just to frame the shot at the "cool" angle the instructor wanted. Admittedly, the examples are impressive, and you can tell the instructor enjoys polishing them, but I believe it would have been far more productive and engaging if he had kept the examples simpler and used the time to explain more of the concepts that fit within the course's scope. Topics like lighting, VFX (with VFX-Graph), or shaders are only briefly touched upon and could have been explored in more depth.

Additionally, the instructor has a very thick accent, making him hard to understand even if you're used to following courses in English. To make matters worse, the course lacks subtitles (which is unusual for GameDev.tv), so you can’t rely on them during moments when the instructor’s explanation isn’t clear. A real shame.

Therefore, my recommendation is to buy this course during a sale if you're curious about learning Cinemachine or Timeline, and focus on the first two sections. The following ones can be watched at double speed, slowing down only when the instructor does something genuinely new (which doesn’t happen often), or you can skip them altogether — in my opinion, you won’t miss much.

20 December 2024

Particle systems in Godot introduction (Part II) - The rain

In one of my previous articles, I explained how we could use Godot’s particle systems to simulate the falling of cherry blossom petals. It was a very subtle effect but encapsulates the basic principles of what particle systems offer. Another much more spectacular and common effect is rain. If we think about it, adding rain to our game using a particle system is very simple. In this article, we’ll see how.

As with the other articles, I’ve shared the code and resources for this one on GitHub. I recommend downloading it and opening it in the Godot editor to comfortably follow along with what I explain here. The code is in the SakuraDemo repository, although I recommend downloading this specific commit to ensure you see exactly the version I’ll explain.

Once you’ve done that, you can open the project in Godot. The new feature, compared to previous SakuraDemo examples, is that I’ve introduced three new nodes: two particle emitters (Rain and RainSplash) and a LightOccluder2D (WaterRainCollider). However, I’ve disabled the latter, making it invisible in the editor (by closing the eye icon in the hierarchy) and setting its Process → Mode to Disabled in the inspector (so the node doesn’t execute any logic on each frame).

Scene Hierarchy

In this article, we’ll explain the two particle emitters. The first one, Rain, generates the raindrops, while the second, RainSplash, activates at the point where a drop ceases to exist, simulating the small splashes that occur when rain falls on the lake’s water. In particle system terms, RainSplash is what’s called a sub-emitter. Sub-emitters are named this way because their particles originate at the endpoints (either due to lifespan expiration or collision) of particles in a parent particle system.

We mentioned that particles can cease to exist either because their lifespan expires or because they collide. The thing is, particle systems are managed by the GPU, which has no knowledge of the CollisionShapes that populate our scene, as these are managed by the CPU. What particles can detect are polygons defined on the screen with a LightOccluder2D. In theory, when a particle touches one of the edges of a LightOccluder2D, it will cease to exist. In practice, it’s a bit more complex. If the particles are too small and fast, they may pass through the edges of a LightOccluder2D without being detected. To solve this, you need to increase the physical (but not visible) size of the particles by adjusting the Collision → Base Size parameter of the GPUParticles2D node in the particle system. If you experiment with increasing that physical size, you’ll see the particles penetrate less into the LightOccluder2D polygon.

Another difficulty with LightOccluder2D nodes is that, in my opinion, they’re only useful for simulating surfaces that rain can hit if your game is purely side-scrolling. However, in a perspective-based scene like this one, we don’t want our raindrops to collide with the edge of a polygon, but rather to land across the visual area covered by the water sprite. While reducing the physical size of the particles can allow them to reach the water area, I haven’t observed much difference compared to the effect achieved by randomizing the particles’ lifetimes, as we’ll see below. That’s why, in the end, I ended up disabling the LightOccluder2D. Still, I recommend activating it and experimenting with the particles’ physical size to observe its effect.

LightOccluder2D Shape

Let’s focus on the primary rain effect. If you access the Rain node, you’ll see I’ve made the following configurations:

  • Amount: Set to 1000 to ensure a sufficient number of raindrops.
  • Sub Emitter: Drag the particle system node you want to activate when your raindrops die here. In this case, it’s RainSplash, which we’ll discuss later in this article.
  • Time → Lifetime: I set it to 0.7 because this is the minimum time a raindrop needs to travel from its generation point above the top edge of the screen to the area of the screen where the water is located.
  • Time → Preprocess: It would look odd if the drops appeared nearly stationary at the top edge of the screen and then accelerated from there. Intuitively, you’d expect the drop to already be moving at high speed, having originated from a great height above the screen. This parameter allows you to simulate that prior acceleration, making particles appear with physical characteristics as if they started falling 0.5 seconds earlier. Through trial and error, I found 0.5 seconds sufficient to achieve a believable effect.
  • Drawing → Visibility Rect: This is the rectangle within which the particles will be visible. In my case, I wanted the rain to cover the entire screen, so I configured a rectangle spanning the entire screen.
  • CanvasItem → LightMask and Visibility Layer: I placed the particle system in its own layer. For this specific case, I think it has no effect, but I like to keep things tidy.
  • Ordering → Z Index: This parameter is important. Raindrops should not be obscured by anything, so their Z-Index must be higher than that of the other sprites. For this reason, I set it to 1,200.

As with the cherry blossom particle system, the ParticleProcessMaterial created for the Process Material parameter has its own settings:

  • Lifetime Randomness: Increasing this value expands the range of possible lifetimes. The value you set here is subtracted from 1.0 to calculate the lower bound of the range. In my case, I set it to 0.22, meaning the raindrops will have random lifetimes between 0.78 times the lifetime and 1.0 times the lifetime. With this parameter, we can make the raindrops cover the entire lake sprite. The lower range values represent drops that will reach the lake’s shore, while the upper range values represent drops that will live long enough to reach the bottom edge of the screen. All intermediate values will be distributed across the screen area occupied by the lake.
  • Disable Z: Since this is a 2D scene, particles moving along the Z-axis doesn’t make sense.
  • Spawn → Position → Emission Shape: I chose Box because I want the particles to generate more or less simultaneously along the entire top edge of the screen. This aligns well with the volume of a long box parallel to the screen.
  • Spawn → Position → Emission Box Extents: This defines the shape of the box where particles will generate. In my case, it’s narrow (Y=1) but long enough to cover the top edge of the screen (X=700).
  • Accelerations → Gravity: Raindrops are generated in the sky, so they fall at high speed by the time they reach the ground. We can simulate this high velocity by setting gravity to 3,000 on the vertical axis (Y).
  • Display → Scale: It would be boring if all raindrops were the same size. Instead, I added variety by setting a minimum size of 0.2 and a maximum of 1.
  • Display → Scale Over Velocity Curve: The acceleration of the drops is more visually engaging if they stretch as they speed up. This can be achieved using this parameter by creating a curve resource and making it rise along the Y-axis as it approaches the maximum velocity. In my case, I set the drops to have an initial size of 15 when they start falling and 30 when they reach 25% of their maximum velocity.

Curve to scale particle size along the Y-axis as velocity increases.

  • Display → Color Curves → Color Initial Ramp: Another way to add variety to the drops is to give them slightly different colors. With this parameter, we can set a gradient so each drop adopts a color from it.
  • Display → Color Curves → Alpha Curve: Another curve, this time to vary opacity throughout the particle’s lifespan. I configured a bell curve so the particle is transparent at the beginning and end of its life but visible in the middle.
  • Collision → Mode: Defines what happens when a particle collides. If you’re using a LightOccluder2D to simulate impact zones, you’d likely want the drop to disappear on impact, so you’d set this to Hide On Contact. However, if you’re simulating other types of particles and want them to bounce, you’d use Rigid.
  • Sub Emitter → Mode: Defines when to activate the secondary particle system, the sub-emitter. In our case, this is RainSplash. I set this parameter to At End so the secondary particle system activates at the position where a particle extinguishes due to its lifespan ending. If we had simulated collisions, we’d set it to At Collision here.
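
For reference, the same settings can also be applied from code. Here's a sketch using Godot's C# API with the values discussed above, as it might appear in the _Ready() of a script sitting on the emitter's parent node (the node path is illustrative):

public override void _Ready()
{
    var rain = GetNode<GpuParticles2D>("Rain");
    var material = (ParticleProcessMaterial)rain.ProcessMaterial;

    material.LifetimeRandomness = 0.22f;       // Drops live between 0.78 and 1.0 of Lifetime.
    material.ParticleFlagDisableZ = true;      // 2D scene: no Z-axis movement.
    material.EmissionShape = ParticleProcessMaterial.EmissionShapeEnum.Box;
    material.EmissionBoxExtents = new Vector3(700, 1, 1);
    material.Gravity = new Vector3(0, 3000, 0); // Fast vertical fall.
    material.ScaleMin = 0.2f;
    material.ScaleMax = 1.0f;
    material.CollisionMode = ParticleProcessMaterial.CollisionModeEnum.HideOnContact;
    material.SubEmitterMode = ParticleProcessMaterial.SubEmitterModeEnum.AtEnd;
}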

If we only wanted to simulate the drops, we could stop here. However, as I’ve mentioned throughout the article, we want to simulate the splashes produced when raindrops hit the lake water. To do this, we need to configure the RainSplash node, the particle system that activates whenever a drop extinguishes.

Just like with the Rain node, there are two levels of configuration: general node settings and specific Process Material settings.

At the general node level, I used the following parameters:

  • Emitting: I set this to false because the Rain node will handle its activation whenever one of its drops extinguishes.
  • Amount: I set this to 200. However, the effect is so subtle and quick that there aren’t significant differences between values for this parameter.
  • Time → Lifetime: The splash effect is so brief that I left this time at 0.2.
  • CanvasItem → LightMask and Visibility Layer: I kept these splashes in the same layer as the rain.
  • Ordering → Z Index: I placed it above the raindrops with a value of 1,300.

At the Process Material level, the settings I used were:

  • Lifetime Randomness: I set this to 1 for maximum randomness in the splashes' lifetimes.
  • Spawn → Velocity → Direction: To make the splashes rise before falling, I set an initial velocity vector of (0, -1, 0).
  • Spawn → Velocity → Spread: In my case, I noticed little difference when modifying this parameter, but in theory, it should randomize the initial velocity vector within an angular range.
  • Accelerations → Gravity: The splashes should fall quickly but not as fast as the raindrops. So, I set it to 500.
  • Display → Color Curves → Alpha Curve: I used a curve similar to the one for the Rain node.

And that’s it. With this configuration, I achieved a sufficiently convincing rain and splash effect. However, this is neither the only nor the best way to do it. All the parameters we’ve adjusted can be set to different values, and there are many others we haven’t even touched. I encourage you to experiment with other parameters and values to see their effects. I have no doubt that, with a bit of time, you’ll achieve effects better than mine.

Final Rain Effect

06 December 2024

Course "Mastering Game Feel in Unity: Where Code Meets Fun!" by GameDev.tv

I'm still working through the courses from the latest Humble Bundle pack I purchased, featuring courses by GameDev.tv. This time, I tackled "Mastering Game Feel in Unity: Where Code Meets Fun!". It's an 8-hour course aimed at developers who have already completed introductory courses and want to dip their toes into more advanced topics. You can purchase it either on the GameDev.tv platform or on Udemy.

The author starts with a pre-made game based on what is taught in the beginner-level course. It’s a 2D shoot'em up where you control your character with the keyboard and aim with the mouse. At this stage, the game is fully functional but flat and boring. The author's thesis is that by adding small, specific details, you can make the game incredibly engaging and appealing. Almost all these details revolve around the idea of providing feedback to the player, visually reinforcing their actions. If the player shoots, the camera should shake slightly; if they throw a grenade, the camera should shake more intensely. If the player moves, their character should sway in the direction of motion and kick up a slight dust trail. And so on, with a long list of examples, each one used to demonstrate a different Unity feature.

For camera shakes, the course briefly introduces Cinemachine impulses. Dust trails and explosions serve as an excuse to overcome the fear of particle systems. The main character is given the ability to fly with a jetpack, which serves as a way to configure a Trail Renderer. The course breathes life into the game environment using Unity's 2D lighting capabilities in the URP, and gets unexpected mileage out of HDR colors with a simple Bloom post-processing effect. Sound isn't overlooked either, providing an opportunity to use Unity’s AudioMixer for the first time (at least for me).

As a result, the course touches on a lot of features. All of them are presented through examples, none explored in great depth but with enough detail to get comfortable with Unity tools and mentally map out where they might be useful. When the time comes to use them, you'll have the chance to dive deeper.

I diverged quite a bit from the course at the coding level. I'm not saying the author writes poor code—far from it—but their organizational style didn't quite resonate with me. It's a matter of personal preference. For instance, they tend to use events to subscribe methods within the same class as the event, while also leaning heavily (in my opinion) on direct component references. I usually work the other way around: using direct method calls within the same class, references to components below the caller, and events to communicate upwards in the hierarchy or between components at the same level. I'm not claiming my way is better, but I feel more comfortable with it than the author’s approach. In the end, I implemented the code in my own way, and as I progressed through the course, my version of the code became increasingly different from the author’s. It’s not a big deal. I'm mentioning this because if you take the course and find the author's coding style doesn’t work for you, don’t be afraid to do things your way. You'll probably learn much more that way than by just copying the instructor’s code.

It’s also true that GameDev.tv courses give you room to do things your way. A hallmark of their approach is the frequent challenges they present. In each one, the instructor describes what they want you to achieve functionally and then asks you to pause the video and try to implement it yourself. When you resume, the instructor explains how they solved it. If you get stuck, their solution helps you out; and if you manage to solve it, the instructor’s approach might reveal new options you hadn’t considered. Either way, it’s a much more interesting, fun, and rewarding way to learn than just following the instructor’s steps.

The only thing to keep in mind is that with this method, it’s normal for your implementation to gradually diverge from the instructor’s. By the final chapters, it’s not uncommon to spend extra time adapting what the instructor does to your own code version. This means you often spend much more time than the course estimates per chapter, but I think the learning process is much more effective.

In conclusion, I don’t regret taking the course. I found it entertaining and interesting, and I finished it having learned about a lot of Unity features I wasn’t aware of before. For a single course, that’s not bad at all.

23 November 2024

Course "Unity Shader Graph: Create Procedural Shaders & Dynamic FX" by GameDev.tv

From now until the end of the year, I’m taking as many courses as I can before fully dedicating myself to my new project. I’m currently focusing on the GameDev.tv courses included in a Humble Bundle pack I recently purchased. These courses are hosted on GameDev.tv’s original platform, although most of them are also available on Udemy. Since I recently finished a shader course in Unity by Penny de Byl, this one seemed like the natural progression.

Unlike Penny de Byl’s course, this one doesn’t implement shaders in code but instead uses Unity’s visual language for shaders, Shader Graph. Shader Graphs still feel confusing to me compared to code, but I have to admit that writing Godot’s shader code feels immensely simpler than Unity’s CG/HLSL. I suspect I’ll stick to code for shaders in Godot, but in Unity, I’ll likely rely on Shader Graphs (begrudgingly).

This course uses several examples to teach you how to work with Shader Graphs. These examples increase in complexity, starting with simple vertex displacement shaders and progressing to a more complex ocean wave effect, with others in between for simulating fire and snow. The fire, snow, and ocean shaders were the most valuable for me, though the snow example was explained in a rather confusing and rushed way, and it was the only one I couldn’t fully get to work.

Overall, the course achieves its goal of helping you overcome any fear of Shader Graphs. However, it suffers from the same issue as many other shader courses: focusing on how to implement specific shaders without explaining why they’re implemented that way. Too often, the course boils down to listing the nodes to deploy and how to connect them. While you can refer to Unity’s (excellent) documentation for each node, the documentation only explains the node’s inputs, outputs, and immediate functionality. It doesn’t provide the broader context or concepts behind them. In this sense, I think Penny de Byl’s course, which I mentioned in my previous review, makes a much greater effort to help you understand the concepts underpinning each shader implementation.

That said, I’ve progressed through the course, and generally, things have worked out. However, I feel like I’ve implemented many of these shaders more by intuition than by certainty. I’m not sure whether that intuition is thanks to the instructor or simply because I’ve taken several courses on the subject and, no matter how clumsy I remain in this discipline, some knowledge always sticks.

What I really liked about this course is its challenge-oriented structure, which I’ve seen in other GameDev.tv courses. It seems to be a hallmark of their teaching approach: explaining certain parts of an example and then, as a challenge, asking you to implement the next section on your own using what you just learned. After a pause, the instructor explains the solution so you can compare your work or see how it was done if you couldn’t figure it out. This dynamic makes the courses very engaging, keeps you interested, and forces you to apply the concepts immediately. It’s incredible how much you learn when you roll up your sleeves. I must admit I miss this approach when taking more traditional courses with a purely lecture-based format.

The course is in English, but the instructor has clear pronunciation, and you can understand him well. Subtitles (in English) are also available in case you need help with any words. Overall, it’s easy to follow.

Is it worth it? I wouldn’t buy it at full price. It seems overpriced to me. However, it might be worth it if you catch one of the frequent sales and get it for €10 or €12, or as part of a broader course bundle.


Course "Shader Development from Scratch for Unity" by Penny de Byl

I must admit that one of the things I find most challenging to learn about game development is shaders. No matter how proficient you are in Unity, Unreal, or Godot, with shaders, you essentially have to start from scratch. They are essential for making your game’s textures and visual effects truly appealing, but to begin using them, you need to master a set of concepts that seem barely related to other disciplines. It’s no surprise that in large teams, there are people entirely specialized in this area.

However, my biggest issue isn’t learning something new but rather that all these concepts seem artificially cryptic, the documentation opaque, and the shader syntax contrived and unintuitive. I must admit that the little I’ve learned about Godot shaders seems to be the most intuitive I’ve come across so far, though even then, there are numerous “awkward” aspects. In Unity’s case, I know its shaders are extremely powerful, but everything feels rather chaotic and confusing to me.

Despite this, I’m determined to understand them and achieve a reasonable level of mastery. That’s why I’ve taken several courses on the topic, gradually making progress—though far less than their authors promised. The reason is that most courses simply list the steps to achieve a specific effect, but very few take the time to explain the concepts behind shaders that justify those steps.

Fortunately, Penny de Byl (whom I discovered through her excellent courses on game AI) makes an exceptional effort to explain these concepts in this course, "Shader Development from Scratch for Unity", available on Udemy. She doesn’t always succeed completely in explaining them, but at least she tries and does so from the most basic concepts to the more advanced ones.

The course focuses on 3D shaders, dedicating most of its time to how to apply effects to the textures of 3D objects. It avoids Unity’s visual language for shaders and instead uses the specific programming language of this engine. Contrary to what I might have expected, I’ve realized that I understand shaders much better when they’re implemented in code rather than in any of the visual languages used by game engines. Perhaps because I come from a programming background, I find it easier to read code from top to bottom than to navigate the spaghetti plate resulting from a visual language.

Out of the course’s nine sections, I think the first six are quite understandable. I found the explanation of how to use the Dot Product in shaders particularly enlightening. Sections 7 and 8 start to get tangled and seem much less well explained. As for the final section, which covers volumetric shaders, I must have been especially tired because I admit I understood very little of it. I’ll probably focus on other topics for a while and revisit that section later. Hopefully, on a second pass, when I’m fresher, what’s explained there will make more sense.

Although there are things I haven’t grasped, there are many others that I’ve encountered in other courses but only managed to understand here, thanks to Penny de Byl’s explanations. Additionally, the Q&A section for each class is almost as interesting as the class itself. I recommend reading all the questions asked by other students for each class; you’ll probably find others who’ve had the same doubts as you. The instructor is thorough and answers almost all the questions, which is very enlightening. Sometimes, she even acknowledges mistakes in the class or includes corrections and links to external resources to resolve doubts, which is especially valuable for tying all the concepts together.

Is the course worth it? Yes, without a doubt. If you find the course discounted on Udemy, I think it’s a great opportunity to dip your toes into the world of shaders and to discover an author with some truly interesting courses.

10 November 2024

Particle systems in Godot introduction (Part I) - The sakura flowers

Flowers falling from sakura tree
In previous articles, we worked on a Japanese scene, using shaders to give the water reflection and movement, as well as to animate the tree. In this article, we’ll set shaders aside and introduce a deceptively simple tool that, when used well, can create multiple effects. This tool is called particle systems.

A particle system is a component that emits a continuous stream of particles. Different effects can be achieved by adjusting the number of particles, their speed, direction, or visual appearance. My idea is to add three particle systems to the Japanese scene from the previous article: one to simulate cherry blossom petals falling, another to visually represent the wind, and another to create a rain effect.

In this article, we’ll see how to implement the falling cherry blossom petals. This is a simple example, but I think it’s useful to understand particle systems and to get comfortable using them. Also, with a few variations, this example can be adapted to similar effects, like autumn leaves falling.

As in previous examples, it’s important to download the project from my GitHub to follow along with the explanations. Make sure to download the exact commit; otherwise, you might see code that differs from this article.

The effect we want is for the cherry blossoms to fall softly from the top of the tree. To preserve the scene’s zen character, we’ll keep the number of petals low, although we’ll see how to increase it. In fact, we’ll subtly increase it when the wind’s strength rises. We’ll also see how to move the particle system in sync with the movement of the tree's canopy.

Configuring the Particle System

In Godot, a particle system is just another node, called GPUParticles2D (with an equivalent node in 3D). Since our system will be linked to the tree, it makes sense for the particle system node to be a child of the tree node.

Once we create the particle system node under the tree node, we can begin configuring it.

Particle Emitter Node

A particle emitter has two levels of configuration: node level and material level.

The most basic settings are configured at the node level:

Particle emitter node configuration


Emitting: Activates the particle emitter. If unchecked, the emitter will stop working.

Amount: Controls the number of particles emitted per unit of time. I set it to 1 for a subtle effect, but if you want to see the emitter in full effect, try setting it to 5 or 10.

Amount Ratio: Multiplies the value of Amount in each cycle, letting us vary the emitter’s flow without restarting it. Changing Amount via script would restart the emitter, which looks unrealistic, while modifying Amount Ratio adjusts the flow without interruption (see the code sketch at the end of this section).

Time → Lifetime: Determines how long each particle remains visible, from birth to disappearance. I set it to 3 so the petals reach the ground before disappearing.

Drawing → Visibility Rect: Specifies a rectangle within which the particles are visible. When particles leave this rectangle, they stop being visible.

Texture: Allows us to assign a sprite to give the particles a visual appearance. Instead of an individual sprite, I used a sprite sheet to generate an animation.

A sprite sheet is a single image composed of multiple individual images, useful for organizing all images in a few files rather than many individual ones. Engines are optimized for sprite sheets, which is more efficient than loading multiple individual image files.

Sakura flowers sprite sheet

You can make a sprite sheet by gathering individual images into a single one, using an editor like GIMP or a dedicated tool like Texture Packer (the one I used). Keep in mind that, if you want to use your sprite sheet for an animation, you must arrange the individual images horizontally and distribute them at even intervals within the sprite sheet file.

To animate the sprite, configure the node at the material level by assigning a ParticleProcessMaterial resource to the Process Material field. Set Animation → Speed to 1 and Animation → Offset to a range from 0 (min) to 1 (max). This makes the sprite sheet animate at a constant speed as each particle falls, and each particle starts from a different image in the sprite sheet, creating variety and an organic look.

For variety in particle size, I set the Scale parameter to a random size between 0.03 and 0.05. I also set Lifetime Randomness to 0.07 to make particle lifetimes vary slightly.

Since this is a 2D scene, I disabled Z-axis movement using Disable Z. For the shape of the particle generator, I used a sphere instead of the default cube because the tree canopy resembles a sphere. The sphere radius is set in Emission Shape Radius, and I offset the sphere slightly to the right to match the tree’s asymmetrical canopy shape.

To make the particles fall, I set a gravity value for the Y-axis in Process Material → Accelerations → Gravity. After some testing, 50 felt like the best value.
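
For reference, this configuration can be mirrored from code. Here's a sketch using Godot's C# API (the node name is illustrative, and AmountRatio is the property mentioned earlier for varying the flow without restarting the emitter):

public override void _Ready()
{
    var flowers = GetNode<GpuParticles2D>("FlowerEmitter");
    flowers.AmountRatio = 1.0f; // Lower this at runtime to reduce flow without restarting.

    var material = (ParticleProcessMaterial)flowers.ProcessMaterial;
    material.AnimSpeedMin = 1f;
    material.AnimSpeedMax = 1f;
    material.AnimOffsetMin = 0f;  // Each particle starts at a random frame
    material.AnimOffsetMax = 1f;  // of the sprite sheet.
    material.ScaleMin = 0.03f;
    material.ScaleMax = 0.05f;
    material.LifetimeRandomness = 0.07f;
    material.ParticleFlagDisableZ = true;
    material.EmissionShape = ParticleProcessMaterial.EmissionShapeEnum.Sphere;
    material.Gravity = new Vector3(0, 50, 0);
}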

Moving the Particle System

The configured system behaves as a spherical volume in which particles randomly appear and fall due to gravity. If our tree were static, we could stop here, but it’s not. In previous chapters, we implemented wind-induced tree sway. For realism, the particle system’s volume should move along with the canopy.

Making the particle system a child of the tree isn’t enough, because the tree node itself doesn’t move (its sway is purely a visual effect of the shader). Instead, we’ll add a script to the particle system to move it in sync with the canopy’s movement.

The particle system’s script, located at Scripts/LeafEmitter.cs, exports four fields that can be configured from the inspector:

LeafEmitter.cs exported fields.

minimumFlowerAmount: Number of petals emitted when the tree is at rest.

OscillationMultiplier: Defines the maximum amplitude of the particle system’s oscillatory movement.

WindStrength: Wind strength.

HorizontalOffset: Offsets the oscillation direction.

LeafEmitter.cs process method

The algorithm executed in each frame is simple: it calculates the oscillation (line 55) and uses it to shift the particle system’s horizontal position (line 57).

The method get_oscillation() calculates the oscillation:

LeafEmitter.cs get_oscillation() method

The get_oscillation() method uses the same algorithm as the tree shader to define the movement of the upper vertices of the texture.
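
Putting the described behavior into code, the script might look roughly like this (a sketch; the exact implementation, including the flower-amount handling, is in Scripts/LeafEmitter.cs):

using Godot;

public partial class LeafEmitter : GpuParticles2D
{
    // Petals emitted at rest; used elsewhere in the full script.
    [Export] private int minimumFlowerAmount = 1;
    [Export] public float OscillationMultiplier { get; set; } = 10f;
    [Export] public float WindStrength { get; set; } = 0.5f;
    [Export] public float HorizontalOffset { get; set; }

    // Set externally (by the tree script) to keep shader and emitter in sync.
    public float Time { get; set; }

    private Vector2 _restPosition;

    public override void _Ready()
    {
        _restPosition = Position;
    }

    public override void _Process(double delta)
    {
        // Calculate the oscillation and shift the emitter's horizontal position.
        float oscillation = GetOscillation();
        Position = _restPosition + new Vector2(oscillation, 0);
    }

    private float GetOscillation()
    {
        // The same algorithm the tree shader uses for its top vertices.
        return (Mathf.Sin(Time) * WindStrength + HorizontalOffset) * OscillationMultiplier;
    }
}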

Shader synchronization with particle system

To synchronize the shader with the particle system, we can use a script located on the tree that has access to both the tree shader and the particle emitter script.

The script is located at Scripts/Tree.cs.

The configuration fields it exports to the inspector are as follows:

Exported Fields of Tree.cs

The trick here is that these fields are actually properties that apply modifications to both the tree oscillation shader and the LeafEmitter particle emitter script.

For example, the WindStrength property in Tree.cs works like this:

Tree.cs WindStrength property

When a value is requested for this property, it returns the value set in the tree's oscillation shader (line 20). On the other hand, when you set a value, the property applies it to the shader (line 27) as well as to the particle emitter script (line 30).
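
A sketch of what that property might look like; _treeMaterial (the tree's ShaderMaterial) and _leafEmitter (the LeafEmitter script) are assumed fields, and the uniform name matches the shader from the previous article:

[Export]
public float WindStrength
{
    // Read the value back from the shader, keeping it the single source of truth.
    get => (float)_treeMaterial.GetShaderParameter("wind_strength");
    set
    {
        // Apply the value to the shader...
        _treeMaterial.SetShaderParameter("wind_strength", value);
        // ...and replicate it to the particle emitter script.
        _leafEmitter.WindStrength = value;
    }
}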

The other two properties, HorizontalOffset and MaximumOscillationAmplitude, function similarly.

This way, any changes we make to the properties of Tree.cs are automatically replicated in the equivalent properties of LeafEmitter.

However, this method synchronizes all oscillation properties except for one: time. To ensure both Tree and LeafEmitter use the same time reference for their oscillations, we’ll manage the time count ourselves and pass it to both the tree shader and the particle emitter script.

The time count is handled in the _Process() method of the tree script:

Process() Method in Tree.cs

In line 102 of the method, we simply add the time elapsed since the last frame (delta).

Time Property in Tree.cs

The Time property is responsible for updating both the shader and the particle emitter. Line 83 updates the shader, while line 84 updates the particle emitter.
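
Sketched in code, the time handling might look like this (again assuming the _treeMaterial and _leafEmitter fields from before):

private float _time;

public override void _Process(double delta)
{
    // Accumulate the elapsed time ourselves so both consumers share one clock.
    Time += (float)delta;
}

public float Time
{
    get => _time;
    set
    {
        _time = value;
        _treeMaterial.SetShaderParameter("time", value); // Update the shader.
        _leafEmitter.Time = value;                       // Update the particle emitter.
    }
}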

With the same algorithm, parameters, and time variable, both the tree and the particle emitter can oscillate in sync. In the screenshot below, I have selected the particle emitter node, with its position marked by a cross. Notice how it moves in rhythm with the tree.

Sakura tree oscillation synced with particle emitter movement

This concludes our introduction to particle emitters. In future articles, I will explain how to implement a rain effect, as well as a visual representation of the wind.

27 October 2024

"Make Online Games Using Unity's NEW Multiplayer Framework" course by GameDevTV

Tanks
I've been taking courses for some time now to learn the multiplayer functionalities offered by both Godot and Unity. It’s a very complex field that you should dive into only after mastering the engine basics, and both engines have received major updates recently, so it’s essential to choose a course that’s up-to-date.

In this case, I’ve been dedicating time to the course Make Online Games Using Unity's NEW Multiplayer Framework, offered by GameDevTV on the Udemy platform. They also offer it on their own GameDevTV platform, but I got it during a sale on Udemy.

The course starts with the development of a simple shooting game where each player’s tank moves around a top-down arena, shooting at other players' tanks. As in other GameDevTV courses, beyond the core multiplayer focus, it also covers various typical game mechanics, such as movement, player input, shooting, health management, collectibles, GUI, and particle systems (I especially liked how they implemented the tank tracks). All of these mechanics are explained in detail, and the code provided for implementing them is high quality. Overall, I felt that the instructor is experienced and adheres to best coding practices.

As for the networking section, it covers a vast topic, and it’s well explained. I would have liked a bit more focus on the fundamental concepts of RPC, as even after finishing the course, I still feel I haven’t fully grasped some nuances. My impression is that the RPC section explains the how but could benefit from more emphasis on the why behind certain approaches. That said, I found the RPC part to be similar to what’s in Godot, so if you're already familiar with it in Godot, the Unity approach should come easily. I have to admit that I’m not quite there yet.

Where the course does go into extensive, valuable detail is in integrating Unity Game Services (UGS) multiplayer functionalities. This is great because UGS provides numerous services ready to use, saving us the trouble of implementing and maintaining them from scratch. When analyzing all that UGS offers, it’s clear that Godot is still quite underdeveloped in this area. Even comparing what W4 Games provides to UGS, the former comes up short, offering only a subset of what the latter has already developed.

Overall, my impression of the course is very positive. I think it covers the most important aspects for a game with up to several dozen players. If you’re looking to create an MMORPG with Unity, this course probably won’t suit your needs, but it’s also unclear if Unity and UGS are the best options for a game of that type.

In summary, it’s a highly recommended course for anyone interested in making multiplayer games. However, I would suggest not taking it until you’re truly comfortable with the engine. It’s at a medium-advanced level.

How to implement a tree moved by wind in Godot

Cherry tree
In previous articles, I explained how to use shaders to make the water in a 2D scene more realistic. Shaders can achieve much more than that. Another common use is to make vegetation move as if swayed by the wind. This adds life to the scene, making its elements appear more organic.

To explain how to achieve this, I’ll evolve the scene from the previous articles. As before, I recommend you download the repository at the specific commit for this article, to ensure that the code you study is exactly the same version that I’ll be showing here.

In the previous shaders, we manipulated the displayed colors from the fragment() method; but before fragment() runs, Godot shaders execute another method called vertex(). While fragment() is used to manipulate the color of the pixels on the rendered surface, vertex() is used to deform the surface by manipulating its vertices. This is highly effective in 3D, as it allows for dynamic mesh deformation, simulating effects like ocean movement or the indentation that occurs when making a groove in the snow. In 2D, it is also useful, though somewhat limited, since any sprite has only four vertices that we can manipulate:

  • Top left: VERTEX(0,0)
  • Top right: VERTEX(Width,0)
  • Bottom left: VERTEX(0,Height)
  • Bottom right: VERTEX(Width,Height)

The VERTEX concept is equivalent to UV, which we used in the fragment() method. If fragment() executes for every pixel on the rendered surface, collecting its relative position in the UV variable, the vertex() method executes for every vertex on the surface, gathering its position in the VERTEX variable. The key difference is that UV coordinates are normalized between 0 and 1, whereas VERTEX coordinates are not normalized and vary between the width and height in pixels of the sprite where we apply the shader. If you still don’t recognize what UV coordinates are, I recommend reviewing the article where I explained it.

It doesn’t matter if a sprite has an irregular shape; in shader terms, it is rectangular, with the four vertices in each corner, just filled with transparent color (alpha=0) where there is no drawing (if the sprite image has an alpha channel).

Although a 2D sprite lacks the vertex resolution of a 3D mesh, by manipulating the position of its vertices, we can stretch and compress the sprite. This is very effective for simulating vegetation that is compressed, stretched, and ultimately moved by the wind.

Looking at the code in the repository’s example, you’ll see that I’ve applied a shader to the tree, located in Shaders/Tree.gdshader. Being a 2D shader, it is of type canvas_item, like those in the previous articles.

Shader configuration

The shader exports the following variables for editing through the inspector:

Shader variables exported to inspector.
  1. wind_strength: This variable models the wind's strength. The greater it is, the more the tree canopy will sway.
  2. horizontal_offset: This creates an asymmetrical sway. The closer this value is to 1.0, the more the canopy tends to sway to the right; conversely, the closer it is to -1.0, the more it will lean left. If the value is zero, the sway will be symmetrical.
  3. maximum_oscillation_amplitude: This defines the maximum displacement (in pixels) for the tree canopy. It’s the maximum displacement reached when wind strength is 1.0.
  4. time: This variable isn’t intended for user manipulation but for an external script. It synchronizes the canopy's oscillation with the particle emitter that releases flowers. The particle emitter will be covered in a later article, so please ignore it for now. If there were no particle emitter, I could have used the TIME variable, which is internal to the shader and provides a counter with the shader’s active time.

Let’s now examine the vertex() method:

vertex() method

In line 11, the code limits the manipulation to the top vertices. Why limit ourselves to the top vertices? Because vegetation is firmly anchored to the ground by its roots, so only the top sways in the wind. Therefore, we only modify the top vertices, leaving the bottom ones in their default position. In theory, I should have been able to filter using VERTEX.y instead of UV.y, but when I tried, I didn’t get the desired effect since the entire image swayed, not just the top vertices.

Also, keep in mind that the comparison in line 11 cannot be for exact equality because floating-point numbers are not precise. Hence, the comparison is not "equal to zero" but "very close to zero."

The oscillation is generated in line 14 using a sinusoid with a maximum amplitude set by wind_strength. Remember that a sinusoid produces values between -1 and +1, so multiplying it by its amplitude means its resultant oscillation will range between -wind_strength and +wind_strength. Since wind_strength is limited between 0 and 1, the oscillation will not exceed -1 and +1 at the maximum value of wind_strength.

Note that inside the sinusoid there are two terms. One is time, allowing the sinusoid to change as time progresses. The other is VERTEX.x, which varies for each top vertex, allowing their oscillations to be unsynchronized, creating a stretching and compressing effect. If we were rendering vegetation without foliage, such as grass, and didn’t want this stretching and compressing effect, we could remove the VERTEX.x term, making the two vertices move in sync.

In line 15, we multiply by the maximum displacement to achieve the desired oscillation amplitude.

In line 16, we add the displacement to the vertex's horizontal position.

Finally, in line 17, we register the vertex's new position.
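
Putting the walkthrough together, a reconstruction of the shader might look like this (a sketch with illustrative defaults; the exact file is Shaders/Tree.gdshader in the repository):

shader_type canvas_item;

uniform float wind_strength : hint_range(0.0, 1.0) = 0.5;
uniform float horizontal_offset : hint_range(-1.0, 1.0) = 0.0;
uniform float maximum_oscillation_amplitude = 10.0;
uniform float time = 0.0;

void vertex() {
    // Only the top vertices move; the bottom ones stay anchored to the ground.
    if (UV.y < 0.01) {
        // Sinusoid in [-wind_strength, +wind_strength]; the VERTEX.x term
        // desynchronizes the two top corners, and horizontal_offset biases the sway.
        float oscillation = sin(time + VERTEX.x) * wind_strength + horizontal_offset;
        // Scale to the desired maximum displacement in pixels.
        float displacement = oscillation * maximum_oscillation_amplitude;
        // Register the vertex's new horizontal position.
        VERTEX.x += displacement;
    }
}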

By moving only the two top vertices and leaving the bottom ones fixed, the effect is as if pinning the two bottom corners of a flexible rectangle in place and moving the two top corners.

Surface deformation achieved with the shader.

When you overlay a tree image onto a surface deformed in this way, you get the effect we aimed to create:

Movement of the tree