Game development in Go: Ebitengine shaders

Hello, my friend. Stay awhile and listen!

Does game development in Go programming language fascinate you? You’ve come to the right place!

Today I want to share some of my Ebitengine shader experience with you. Most of the examples I’ll use and refer to come from my open-source games, Roboden and Decipherism. You can find their sources on GitHub.

Without further ado, let’s do it!

A Brief Shaders Intro

I’ll only talk about fragment shaders (also known as pixel shaders) because those are the only ones supported in Ebitengine.

A fragment shader is an algorithm that describes how to transform the pixels of an image before displaying it. Most often, this algorithm is described in code, but there are visual ways to create shaders as well. The specific shader language dialect used to describe these programs depends on the engine you are using, as they may try to hide the details of the specific graphics card you are writing the shader for (but more on that later).

One of the simplest examples is a shader that multiplies the alpha channel of each pixel by 0.5, making the image semi-transparent. Shaders can also be very complex and create effects similar to animation: distortions, waves, dynamic color changes.

Please keep in mind that I’m no shader expert by any standard. I’ll also simplify some of the terminology to make things easier to understand for newcomers. All opinions shared here are at least partially subjective.

Motivation

But why do we need shaders at all? In most cases, there are ways to change the alpha channel without using shaders. A wave animation can be created with a normal frame-based sequence.

That’s mostly true, but shaders have undeniable advantages:

  • They are almost always more efficient than alternatives
  • Shaders can simplify the implementation (so it’s faster to do it this way)
  • Shader code is more maintainable than a set of components implementing the same effect

This article is structured as a series of tasks and their solutions: we start by defining a goal, then implement the desired effect using shaders.

Round 1: Applying the Damage Mask

Let’s assume that we have buildings in our game. You may want to visually display the building’s damage level. How can we achieve this?

I have prepared the assets for us. Building sprites, top view, four variations:

Now to the damage mask. When there is no damage, this mask will have zero opacity. As the received damage increases, the alpha channel of the mask increases, making it more noticeable.

Four variations of the damage mask:

I’ve only drawn one mask and rotated it by 90 degrees to get 4 sprites.

Now let’s try to apply them. Let’s assume the damage level is around 100% and the visibility of the mask is absolute:

It doesn’t look very neat: this mask only fits square-shaped sprites.

It is possible to solve this issue without shaders:

  • We can draw masks of different shapes that perfectly fit the sprite.
  • Alternatively, we can use round damage textures that can fit anywhere.
  • Or we can apply intersection composite modes when rendering.

But let’s try to solve the task using a shader, shall we?

First, I’ll write the shader, and then I’ll show you how to apply it to the image.

package main

var HP float // A health level, a value in [0, 1] range

func Fragment(_ vec4, texCoord vec2, _ vec4) vec4 {
	c := imageSrc0At(texCoord)    // A pixel from the building image
	mask := imageSrc1At(texCoord) // A pixel from the damage mask image
	if c.a != 0.0 && mask.a != 0.0 {
		a := clamp(HP+(1.0-mask.a), 0.0, 1.0)
		// Create a darker pixel if it's inside a damage mask.
		return vec4(c.r*a, c.g*a, c.b*a, c.a)
	}
	return c // Otherwise, leave a pixel color as is
}

Our shader has some dependencies:

  • The texture of the main image (Src0)
  • The texture of the damage (Src1)
  • The HP parameter for calculating color components

Src0 and Src1 should be of the same size, so each pixel in Src0 has a corresponding pixel in Src1. For each intersection of opaque pixels in Src0 and Src1, we calculate a new color.
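To build some intuition for the `clamp(HP+(1.0-mask.a), 0.0, 1.0)` line, here is the same arithmetic as a plain Go function (a sketch for experimentation, not engine code): at full health the multiplier is 1.0, so the color is unchanged, and it drops toward 0 as HP falls and the mask pixel becomes more opaque.

```go
package main

import "fmt"

func clamp(v, lo, hi float64) float64 {
	if v < lo {
		return lo
	}
	if v > hi {
		return hi
	}
	return v
}

// damageMultiplier reproduces the shader's color multiplier:
// full HP or a transparent mask pixel yields 1.0 (no darkening),
// low HP under an opaque mask pixel darkens the color toward 0.
func damageMultiplier(hp, maskAlpha float64) float64 {
	return clamp(hp+(1.0-maskAlpha), 0.0, 1.0)
}

func main() {
	fmt.Println(damageMultiplier(1.0, 1.0)) // full health: no darkening
	fmt.Println(damageMultiplier(0.0, 1.0)) // no health, opaque mask: black
	fmt.Println(damageMultiplier(0.25, 0.5))
}
```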

The shader result looks like this:

You can modify the shader to prevent the damage texture from overlaying the object’s outline. By checking other color components (apart from c.a), you can determine whether the textures should be blended in that pixel or not.

Kage Shader Language Overview

Did you notice that we wrote the shader in a Go-like language?

We use Kage as a shader language in Ebitengine. Kage’s translator parses Go code and generates the corresponding snippet in the required shader language dialect. For example, on my machine, the shader from the previous example is transformed into the following code:

#if defined(GL_ES)
precision highp float;
#else
#define lowp
#define mediump
#define highp
#endif

int modInt(int x, int y) {
	return x - y*(x/y);
}

uniform vec2 U0;
uniform vec2 U1[4];
// ... many other uniform declarations.

varying vec2 V0;
varying vec4 V1;

vec4 F5(in vec2 l0);
vec4 F7(in vec2 l0);
vec4 F12(in vec4 l0, in vec2 l1, in vec4 l2);

vec4 F5(in vec2 l0) { /* ... */ }

vec4 F7(in vec2 l0) { /* ... */ }

vec4 F12(in vec4 l0, in vec2 l1, in vec4 l2) {
	vec4 l3 = vec4(0);
	vec4 l4 = vec4(0);
	l3 = F5(l1);
	l4 = F7(l1);
	if ((((l3).a) != (0.0)) && (((l4).a) != (0.0))) {
		float l5 = float(0);
		l5 = clamp((U8) + ((1.0) - ((l4).a)), 0.0, 1.0);
		return vec4(((l3).r)*(l5), ((l3).g)*(l5), ((l3).b)*(l5), (l3).a);
	}
	return l3;
}

void main(void) {
	gl_FragColor = F12(gl_FragCoord, V0, V1);
}

Fascinating, isn’t it?

My opinion on Kage is mixed. On one hand, I understand why this layer of abstraction was added. On the other hand, Kage makes working with shaders more difficult for both beginners and experienced shader creators. For beginners, it is challenging to learn something with minimal documentation, while for experienced creators, it is difficult to apply existing knowledge.

Advantages of Kage:

  • Convenient editing features similar to Go: gofmt, autocompletion, go to definition
  • Familiar Go concepts work (e.g. multi-value return)
  • Improved portability due to the ability to translate into any target language
  • The Kage layer allows the engine to have more control over shaders

Disadvantages of Kage:

  • It’s harder to use existing shaders; they need to be rewritten in Kage
  • Less documentation, examples, and tutorials available
  • Less shader tooling (looking forward to a visual shader editor by Sedyh)
  • Debugging Kage is more challenging

Here are some useful hints about Kage that will come in handy pretty soon:

  • The entry point is the Fragment() func, we’ll examine its parameters later.
  • The types we need are int, float, vec2, and vec4.
  • Kage provides built-in funcs such as distance(), clamp(), and imageSrc0At().
  • You can also define your own functions and numerical constants.
  • Kage supports swizzling (accessing vector components in any combination, e.g. c.rgb or v.xy).
  • Arithmetic operations such as * also work on vectors (vec2, vec4).
  • Coordinates are represented with vec2 (x, y).
  • Colors are represented with vec4 (r, g, b, a).
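To make a few of these hints concrete, here is a small hypothetical Kage fragment (not tied to any example in this article) combining a custom function, swizzling, and vector arithmetic:

```go
// dim scales the color channels while keeping the alpha intact.
// c.rgb is swizzling: it picks three components at once,
// and multiplying by a float scales all of them together.
func dim(c vec4, factor float) vec4 {
	return vec4(c.rgb*factor, c.a)
}

func Fragment(_ vec4, texCoord vec2, _ vec4) vec4 {
	return dim(imageSrc0At(texCoord), 0.5)
}
```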

Connecting a Shader

Let’s assume that our game scene contains Sprite objects. They hold an *ebiten.Image and, optionally, a compiled shader.

type Sprite struct {
	x, y float64

	img *ebiten.Image

	shader        *ebiten.Shader
	shaderTexture *ebiten.Image
	shaderParams  map[string]any
}

Rendering sprites without shaders can be done as follows:

func (s *Sprite) Draw(dst *ebiten.Image) {
	var options ebiten.DrawImageOptions
	options.GeoM.Translate(s.x, s.y)
	dst.DrawImage(s.img, &options)
}

Next, in our root game.Draw() function, we call Sprite.Draw() on every sprite object to render every one of them.

Now let’s add shader-aware drawing:

func (s *Sprite) Draw(dst *ebiten.Image) {
	// If there is no shader, do it as before.
	if s.shader == nil {
		var options ebiten.DrawImageOptions
		options.GeoM.Translate(s.x, s.y)
		dst.DrawImage(s.img, &options)
		return
	}
	// We'll need a different draw options type here.
	var options ebiten.DrawRectShaderOptions
	options.GeoM.Translate(s.x, s.y)
	options.Images[0] = s.img           // Src0
	options.Images[1] = s.shaderTexture // Src1
	options.Uniforms = s.shaderParams
	b := s.img.Bounds()
	dst.DrawRectShader(b.Dx(), b.Dy(), s.shader, &options)
}

The amount of code definitely increased, but there’s nothing fundamentally difficult out there. We need to correctly populate DrawRectShaderOptions and call DrawRectShader() instead of DrawImage().

Where do s.shaderParams and s.shaderTexture come from? I suggest associating them with the sprite once when assigning the shader:

type ShaderParams struct {
	Compiled *ebiten.Shader
	Uniforms map[string]any
	Src1     *ebiten.Image
	// ... can be extended to include Src2, and so on.
}

func (s *Sprite) SetShader(params ShaderParams) {
	s.shader = params.Compiled
	s.shaderParams = params.Uniforms
	s.shaderTexture = params.Src1
}

*ebiten.Shader can be reused for all sprites that require that shader effect. Similarly, a single *ebiten.Image can be used as Src1 everywhere. However, the “data” (uniforms) for each sprite will be specific to that sprite.

Since a Go map is effectively a pointer wrapper, changes made to it outside will be visible inside the Sprite. We’ll take advantage of this to modify shader parameters.
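This reference behavior is plain Go semantics and easy to verify in isolation (the struct here is a stripped-down stand-in for the Sprite):

```go
package main

import "fmt"

type sprite struct {
	shaderParams map[string]any
}

func main() {
	// The owner keeps a handle to the uniforms map...
	params := map[string]any{"HP": float32(1.0)}
	s := sprite{shaderParams: params}

	// ...and a later update through that handle is visible to the
	// sprite, because both variables refer to the same map.
	params["HP"] = float32(0.5)
	fmt.Println(s.shaderParams["HP"])
}
```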

The code for an object that uses a shader sprite would look something like this:

func (b *Building) Init() {
	b.shaderData = map[string]any{"HP": 1.0}
	b.sprite = NewSprite()
	b.sprite.SetShader(ShaderParams{
		Compiled: damageShader,
		Src1:     damageMask,
		Uniforms: b.shaderData,
	})
}

func (b *Building) OnDamage(damage float64) {
	b.hp -= damage
	if b.hp <= 0 {
		b.destroy()
		return
	}
	// Updating the shader parameter.
	// Note: we're using float32 here.
	// Kage shader uniforms support int, float32, and []float32 types.
	b.shaderData["HP"] = float32(b.hp / b.maxHP)
}

Where:

  • damageShader is an *ebiten.Shader created from our Kage snippet.
  • damageMask is an *ebiten.Image that contains the damage mask.
  • b.shaderData belongs to the Building object; the shader never mutates that map.

The shader script itself is a regular file, just data. You can store it next to your application or embed it directly into the binary using go:embed. To compile the shader, pass the shader source code bytes to the ebiten.NewShader() function.
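As a sketch of that wiring (the file name damage.kage and the helper function are made up for illustration):

```go
package main

import (
	_ "embed"

	"github.com/hajimehoshi/ebiten/v2"
)

// The shader source travels inside the binary thanks to go:embed.
// damage.kage is a hypothetical file next to this source file.
//
//go:embed damage.kage
var damageShaderSrc []byte

// compileDamageShader should run once; the resulting *ebiten.Shader
// can then be shared by every sprite that needs this effect.
func compileDamageShader() *ebiten.Shader {
	shader, err := ebiten.NewShader(damageShaderSrc)
	if err != nil {
		panic(err) // a compile error in the Kage source
	}
	return shader
}
```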

Round 2: the Pick Effect

You can find fonts that look like handwriting on the internet. However, every occurrence of a letter will look identical, which is unrealistic. Some entropy during rendering could help here.

There are different ways to achieve this entropy. In my Decipherism game, I just randomly shuffled some neighboring pixels when rendering the text:

Let’s recall the signature of the fragment shader (ignoring the irrelevant parameters):

func Fragment(_ vec4, texCoord vec2, _ vec4) vec4

texCoord is a texel coordinate on the source texture.

All we need to know about texels is that their values range from 0 to 1. If an image is 500 pixels wide, a texel coordinate of 0.5 corresponds to 250 pixels within the bounds of that image.

The imageSrc0At() function takes texel coordinates. But what if we want to work at the pixel level? Conversions between texels and pixels are possible.

Ebitengine allows us to define custom functions inside shaders, so we’ll do just that:

// tex2pixCoord converts a texel coordinate texCoord
// into a pixel coordinate, taking the texture offset into account.
func tex2pixCoord(texCoord vec2) vec2 {
	pixSize := imageSrcTextureSize()
	originTexCoord, _ := imageSrcRegionOnTexture()
	actualTexCoord := texCoord - originTexCoord
	actualPixCoord := actualTexCoord * pixSize
	return actualPixCoord
}

Since Ebitengine combines multiple images into atlases automagically, our source image could be at some non-zero offset inside that atlas. Because of this, we need to subtract the origin to translate the texel coordinate into one that we can interpret as a regular pixel coordinate on the image.

Here is what we’ll do:

  • Convert texels to pixel coordinates.
  • Apply logic to the pixels.
  • Convert pixels back to texels in the end.

For the last step in that list, we’ll need a tex2pixCoord() counterpart:

func pix2texCoord(actualPixCoord vec2) vec2 {
	pixSize := imageSrcTextureSize()
	actualTexCoord := actualPixCoord / pixSize
	originTexCoord, _ := imageSrcRegionOnTexture()
	texCoord := actualTexCoord + originTexCoord
	return texCoord
}
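The same conversions can be checked with plain float math in Go; the atlas size and offset below are made-up values, and a single axis is enough to show the idea:

```go
package main

import "fmt"

// tex2pix and pix2tex mirror the shader helpers for one axis:
// a texel coordinate in [0, 1] maps to a pixel coordinate on the
// source image, accounting for the image's offset inside the atlas.
func tex2pix(texCoord, originTex, textureSize float64) float64 {
	return (texCoord - originTex) * textureSize
}

func pix2tex(pixCoord, originTex, textureSize float64) float64 {
	return pixCoord/textureSize + originTex
}

func main() {
	const textureSize = 512.0 // atlas width, in pixels
	const origin = 0.25       // our image starts at 25% of the atlas

	// 0.5 in texels lands at pixel 128 of our image: (0.5-0.25)*512.
	fmt.Println(tex2pix(0.5, origin, textureSize))
	// The two conversions are inverses of each other.
	fmt.Println(pix2tex(tex2pix(0.5, origin, textureSize), origin, textureSize))
}
```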

Next, we need to apply something like the pick filter. I can suggest the following implementation:

func applyPixPick(pixCoord vec2, dist float, m, hash int) vec2 {
	// dist - how far we're offsetting a pixel
	// dir - the direction we're moving that pixel in
	dir := hash % m
	// Kage doesn't have a switch statement (yet?),
	// hence the if/else chain below.
	// We also need explicit int conversions there, otherwise we get an
	// "operands of `==' must have the same type" error: Ebitengine turns
	// a bare 0 literal into 0.0, which the underlying driver
	// interprets as a float value.
	if dir == int(0) {
		pixCoord.x += dist
	} else if dir == int(1) {
		pixCoord.x -= dist
	} else if dir == int(2) {
		pixCoord.y += dist
	} else if dir == int(3) {
		pixCoord.y -= dist
	}
	// Otherwise, don't move it anywhere.
	return pixCoord
}

A lower m parameter makes more pixels shuffled; a higher m value keeps more pixels in their original positions.
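Since this function is pure integer and float math, it can be ported to plain Go one-to-one and inspected there (a sketch for experimentation, not engine code):

```go
package main

import "fmt"

type vec2 struct{ x, y float64 }

// applyPixPick mirrors the shader function: hash % m picks one of
// four directions (0-3) to nudge the pixel by dist; any other
// remainder leaves the pixel in place.
func applyPixPick(p vec2, dist float64, m, hash int) vec2 {
	switch hash % m { // plain Go does have switch
	case 0:
		p.x += dist
	case 1:
		p.x -= dist
	case 2:
		p.y += dist
	case 3:
		p.y -= dist
	}
	return p
}

func main() {
	// With m=15, only remainders 0..3 out of 0..14 move a pixel,
	// so most pixels stay put; m=4 would move every single pixel.
	fmt.Println(applyPixPick(vec2{10, 10}, 1, 15, 30)) // 30%15=0: x+1
	fmt.Println(applyPixPick(vec2{10, 10}, 1, 15, 7))  // 7%15=7: unmoved
}
```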

Now, the remaining question is: where do we get the hash value from? In theory, it is a pseudo-random value that determines what to do with a specific pixel. But there’s no rand() inside shaders, of course.

Let’s write a pseudo-random function ourselves then:

func shaderRand(pixCoord vec2) int {
	return int(pixCoord.x+pixCoord.y) * int(pixCoord.y*5)
}

With all the functions we created above, we can express the fragment processor:

func Fragment(_ vec4, texCoord vec2, _ vec4) vec4 {
	c := imageSrc0At(texCoord)
	actualPixCoord := tex2pixCoord(texCoord)
	if c.a != 0.0 {
		h := shaderRand(actualPixCoord)
		p := applyPixPick(actualPixCoord, 1.0, 15, h)
		return imageSrc0At(pix2texCoord(p))
	}
	return c
}

This shader will produce the desired pick effect.

Round 3: the CRT Display Effect

In my Decipherism game, I needed to implement a terminal screen that would scream “retro hacking”. The terminal screen displayed elements of a scheme that implemented a certain encoding algorithm.

Here’s what it looked like in the end:

This is how it looks without a shader:

We’ll need a more sophisticated pseudo-random numbers generation here. For this purpose, I’ll introduce two external parameters:

  • Tick - a value that changes over time
  • Seed - each element will have its own seed for some extra randomness

A new shaderRand() will look like this:

func shaderRand(pixCoord vec2) (seedMod, randValue int) {
	pixSize := imageSrcTextureSize()
	pixelOffset := int(pixCoord.x) + int(pixCoord.y*pixSize.x)
	seedMod = pixelOffset % int(Seed)
	pixelOffset += seedMod
	return seedMod, pixelOffset + int(Seed)
}

seedMod will be used as an additional source of randomness by the caller.

In addition, we need to create some animated glitch-like distortions. I would say it resembles the video degradation effect.

func applyVideoDegradation(y float, c vec4) vec4 {
	if c.a != 0.0 {
		// Darken every row except each 4th one on the Y axis,
		// creating an animated scanline-like effect.
		if int(y+Tick)%4 != int(0) {
			return c * 0.6
		}
	}
	return c
}
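The row-selection logic is easy to sanity-check outside the shader; a plain Go port of just the predicate might look like this:

```go
package main

import "fmt"

// darkenRow reports whether a destination row y gets darkened at a
// given tick. Three out of every four rows are, which reads as
// scanlines, and the surviving bright row shifts as the tick grows.
func darkenRow(y, tick int) bool {
	return (y+tick)%4 != 0
}

func main() {
	for y := 0; y < 4; y++ {
		fmt.Println(y, darkenRow(y, 0))
	}
}
```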

The final code for the fragment shader:

func Fragment(pos vec4, texCoord vec2, _ vec4) vec4 {
	c := imageSrc0At(texCoord)

	actualPixCoord := tex2pixCoord(texCoord)
	if c.a != 0.0 {
		seedMod, h := shaderRand(actualPixCoord)
		dist := 1.0
		if seedMod == int(0) {
			dist = 2.0
		}
		p := applyPixPick(actualPixCoord, dist, 5, h)
		return applyVideoDegradation(pos.y, imageSrc0At(pix2texCoord(p)))
	}

	return c
}

I’m using the pos parameter for the first time here. It represents the position on the destination image (in pixels). By using this value, I avoid issues when rotating source textures: the glitch waves always go from top to bottom, instead of going from right to left when the source texture is rotated by 90 degrees.

Round 4: a Looped Texture Animation

Let’s take a texture of an energy beam:

…and start looping it along the X-axis:

Here’s another example:

My first attempt at solving this looked like the code below:

var Time float

func Fragment(_ vec4, texCoord vec2, _ vec4) vec4 {
	pixSize := imageSrcTextureSize()
	_, srcRegion := imageSrcRegionOnTexture()
	width := pixSize.x * srcRegion.x
	actualPixCoord := tex2pixCoord(texCoord)
	p := vec2(slide(actualPixCoord.x, width), actualPixCoord.y)
	return imageSrc0At(pix2texCoord(p))
}

func slide(v, size float) float {
	return mod(v-(100*Time), size)
}

When applied to the image above, this shader yields this result:

The sliding direction depends on whether Time increases or decreases.

This is almost what we need, but the loop is rough due to the sharp transition at both ends of the image. To achieve a better-looking result, we need to add some code to this shader:

func Fragment(_ vec4, texCoord vec2, _ vec4) vec4 {
	pixSize := imageSrcTextureSize()
	_, srcRegion := imageSrcRegionOnTexture()
	width := pixSize.x * srcRegion.x
	actualPixCoord := tex2pixCoord(texCoord)
	p := vec2(slide(actualPixCoord.x, width), actualPixCoord.y)

	c := imageSrc0At(pix2texCoord(p))
	const cutoffThreshold = 10.0
	if actualPixCoord.x <= cutoffThreshold {
		c *= actualPixCoord.x * 0.1
	} else if actualPixCoord.x >= (width - cutoffThreshold) {
		c *= (width - actualPixCoord.x) * 0.1
	}

	return c
}

We added a gradient that reduces the opacity of the image. Pixels closer to the edge of the image will have higher transparency.

I have another cool example for you. We can make a planet rotation effect using almost the same shader! First, we’ll need a rectangular texture:

The shader will be similar to the previous ones, but with the addition of shadows and a rendering radius:

var Time float

func Fragment(_ vec4, texCoord vec2, _ vec4) vec4 {
	_, srcRegion := imageSrcRegionOnTexture()
	pixSize := imageSrcTextureSize()
	sizes := pixSize * srcRegion
	width := sizes.x
	height := sizes.y
	actualPixCoord := tex2pixCoord(texCoord)

	// We won't render anything that is outside of the circle
	// radius (planetSize). This way, we keep only the center
	// section of the texture.
	const planetSize = 64.0
	center := vec2(width, height) * 0.5
	if distance(center, actualPixCoord) > planetSize {
		return vec4(0)
	}

	// The light source will be slightly off from the center.
	lightPos := vec2(center.x*0.85, center.y*0.9)
	lightDist := distance(lightPos, actualPixCoord) / planetSize
	colorMultiplier := vec4(1, 1, 1, 1)
	// A higher distance from the light source makes the color darker.
	colorMultiplier.xyz *= clamp(1.8-lightDist*1.6, 0.0, 1.0)

	// As a last step, apply a sliding loop animation.
	p := vec2(slide(actualPixCoord.x, width), actualPixCoord.y)
	return imageSrc0At(pix2texCoord(p)) * colorMultiplier
}

Are you ready to see the results?

Round 5: Building Construction Effect

In the Roboden game, you can build bases and turrets. The animation of constructing a new building is done through shaders.

In the game, it looks like this:

For convenience, here are frames from the animation above, in isolation:

The t parameter (named Time in the shader) is controlled by the game logic. When workers construct a building, t increases. t represents a normalized construction progress value (from 0 to 1).

This shader requires almost everything we’ve learned so far:

  • Rendering only the part of the texture that is outside the circle.
  • Adding the dark outline to the rendering circle to make it look less flat.
  • Moving and painting pixels close to the rendering outline (pick effect).

Let’s start by introducing helper functions:

func shaderRand(p vec2) int {
	return int(p.x+p.y) * int(p.y*5)
}

func sourceSize() vec2 {
	pixSize := imageSrcTextureSize()
	_, srcRegion := imageSrcRegionOnTexture()
	return pixSize * srcRegion
}

The shader itself had many parameters that I hardcoded manually to get the result I wanted. I modified the shader specifically for this article to make it more versatile (and look less crappy).

func Fragment(_ vec4, texCoord vec2, _ vec4) vec4 {
	// texCoord is known to be in Src0 bounds, therefore we
	// can use an unsafe imageAt version which works faster.
	c := imageSrc0UnsafeAt(texCoord)
	if c.a == 0 {
		return c
	}

	actualPixPos := tex2pixCoord(texCoord)

	// We'll take the image size into account
	// to make this shader work with images of different sizes.
	sizes := sourceSize()
	width := sizes.x // For square images it's enough to have a width alone

	// Defining a rendering area and its dt movement.
	initialY := -2.0
	offsetY := width * 0.15 * Time
	circleCenter := vec2(width*0.5, initialY-offsetY)
	dist := distance(actualPixPos, circleCenter)

	progress := 1.4 - Time
	if dist > ((width * 0.95) * progress) {
		// Everything outside of this area is drawn without distortions.
		return c
	}

	spread := 0
	colorMultiplier := vec4(0)

	// Defining several distance rings.
	// We're using if/else again due to the lack of switch statement.
	if dist > ((width * 0.85) * progress) {
		spread = 15
		colorMultiplier = vec4(1, 1.1, 1.3, 1.0)
	} else if dist > ((width * 0.75) * progress) {
		spread = 11
		colorMultiplier = vec4(0.9, 1.2, 1.6, 1.0)
	} else if dist > ((width * 0.65) * progress) {
		spread = 7
		colorMultiplier = vec4(0.8, 1.4, 2.0, 1.0)
	} else if dist > ((width * 0.62) * progress) {
		spread = 6
		colorMultiplier = vec4(0.25, 0.25, 0.25, 1.0)
	} else {
		// Too close to the circle, skip this area.
		return vec4(0)
	}

	h := shaderRand(actualPixPos)
	p := applyPixPick(actualPixPos, 1, spread, h)
	if p == actualPixPos {
		// If pixel wasn't moved at all, render it with its original color.
		return c
	}
	return imageSrc0At(pix2texCoord(p)) * colorMultiplier
}

As Time increases, we move our abstract circle up, which changes the distribution of displayed pixels due to the updated distance from the circle’s center.
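The ring selection above is pure distance math, so it can be probed in plain Go (thresholds copied from the shader; a sketch for intuition, not engine code):

```go
package main

import "fmt"

// ringSpread returns the pick-effect spread for a pixel at a given
// distance from the circle center: -1 means the pixel is outside
// every ring (drawn as-is), 0 means it's inside the circle (hidden).
func ringSpread(dist, width, time float64) int {
	progress := 1.4 - time
	switch {
	case dist > width*0.95*progress:
		return -1 // untouched area
	case dist > width*0.85*progress:
		return 15
	case dist > width*0.75*progress:
		return 11
	case dist > width*0.65*progress:
		return 7
	case dist > width*0.62*progress:
		return 6
	default:
		return 0 // not rendered yet
	}
}

func main() {
	// On a 100px-wide sprite at time=0, the outermost threshold sits
	// around dist=133 (100*0.95*1.4); as time grows, every ring shrinks.
	fmt.Println(ringSpread(140, 100, 0)) // outside: -1
	fmt.Println(ringSpread(125, 100, 0)) // outer ring: 15
	fmt.Println(ringSpread(50, 100, 0))  // hidden center: 0
}
```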

This was the last of the shaders I wanted to show you in this article.

Want even more shaders? Check out examples/shader in the Ebitengine repository. It includes these effects:

  • Dissolve effect
  • Radial blur
  • Water reflections effect
  • Chromatic aberration

Finally, here is a small list of my suggestions regarding shaders in Ebitengine:

  • Store shader source code inside the binary using go:embed.
  • Compile each shader only once and reuse *ebiten.Shader.
  • When a shader is not needed*, draw using DrawImage instead of DrawRectShader.
  • Write more helper functions in shaders as they improve readability a ton.
  • Work at the pixel level if it makes the algorithm more concise.

(*) For example, the damage mask at HP=1.0 will not change the appearance, so you can draw the sprite using DrawImage() instead of DrawRectShader().

Go Gamedev Entry Checklist

Do you want to write your game in Go, but you don’t know how to get started?

  1. Join the official Ebitengine discord channel.
  2. Complete the Ebitengine Tour.
  3. Come up with some simple game idea (asteroids will do) and start making it.

While working on your game, use these resources:

You probably won’t need shaders in your first game, but you might end up wanting to use them anyway; and I won’t stop you. Shaders are awesome.

I hope that you enjoyed this article and found it useful. If you want to say thanks to me, star the Roboden GitHub repository. :)