[Devember 2022] a space project | open source space game

Hello Internet. I make game.

The project:
A 2D procedurally generated space combat / exploration game. Kinda like the original arcade Asteroids, only a little bit bigger. Fly around in ships, shoot things, do a barrel roll! Procedurally generated planets and stars.

Current state: Very much in a sandbox debug/developer mode with temporary test objects for engine development. No sound yet…

Current goal: Fix bugs. Flesh out and polish existing features with a focus on gameplay. Improve graphics. Reach a stable vertical slice.

I have been working on this side project on and off for…a while… The scope is a little ambitious for a solo dev, so I’m coming to terms with that and trying to focus on core mechanics and engine features and aim for “fun”. I have a plan of attack and it’s gonna take longer than devember, but I like the idea of the challenge and have some spare time the next couple months between contracts. I’m gonna poke at this mess and see how much progress I can make.

Release date: before I die?
The Steam Deck looks pretty neat and I think it would be a great platform to target if I can get my hands on one. I already have controller support working on Manjaro so in theory it should “just work” out of the box since the Steam Deck is built on Arch.

There’s no build available yet because it’s not quite demo ready. I’ll aim to get a build out by the end of the month and hopefully get some feedback. I’m open to constructive criticism and any ideas/features that might improve the project. I’ve been developing this game for myself, so it’s sorta been made in a vacuum because it’s never really been in a state I was comfortable to share. Also I’m generally more of a read-only kinda person when it comes to the internet, so here goes to breaking out of comfort zones.


Good luck, I’m excited to follow! I’ve been playing around with LibGDX a bunch over the last couple of months (I’m a software dev by trade, but not in the gaming industry), so I was pleasantly surprised by your choice of framework.


Thanks, I love libGDX. It’s a really great library with some engine-like features, but it still gives me a lot of low-level control.

I first got started making games with GameMaker 6/7 when I was younger. I only ever finished and published one (not so great) game to the old YoYo Games sandbox. Made some classic brick breaker / breakout clones, and some other toys and prototypes that never got finished. Then I started learning Java, and when I got the idea for this project I had the intention of writing my own engine from scratch with no dependencies, for fun and a challenge.

It started off with a basic double-buffered JFrame: load some images from disk, render sprites pixel by pixel, move them around the screen with mouse and keyboard input, and do some basic AABB (Axis-Aligned Bounding Box) collision detection in a render loop at a smooth 60fps. Woohoo! But then I started learning about cameras, viewports, shaders, meshes and models, lighting, particles, UI, physics…the list goes on, all the other things that go into rendering and game engines… I became overwhelmed.
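For anyone curious, an AABB test like the one described boils down to a few comparisons per axis. A minimal self-contained sketch (illustrative names, not the engine’s actual code):

```java
// Minimal axis-aligned bounding box overlap test, as described above.
// Two boxes overlap only if their extents overlap on both axes.
public final class AABB {
    public float x, y, width, height;

    public AABB(float x, float y, float width, float height) {
        this.x = x; this.y = y; this.width = width; this.height = height;
    }

    public boolean overlaps(AABB other) {
        return x < other.x + other.width && x + width > other.x
            && y < other.y + other.height && y + height > other.y;
    }
}
```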

I had severely underestimated the challenge of writing a game engine from scratch.

So for the sake of sanity and productivity I ported to libGDX (OpenGL), and it’s been the best decision for the project. It has lots of great community libraries and great documentation.


Since this game is in space, I’ll start by breaking down how space is currently rendered. I want space to feel big and have the illusion of depth within the limitation of a top-down 2D perspective. To do this I use a few layered textures rendered in a parallax style at different depths. This is the same technique you see in most side scrollers, where the background scenery layers move slowly relative to the foreground layers.

Essentially: parallax = cameraPosition * depth; where depth is how much the layer moves relative to the camera. A depth of 0.5 would make the layer move half as much.
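As a sketch of that formula (illustrative names, not the project’s actual classes):

```java
// Hypothetical parallax layer: its draw offset is the camera position scaled
// by a depth factor. depth = 0 pins the layer to the screen, depth = 1 locks
// it to the world/camera, and values in between drift slower than the camera.
public class ParallaxLayer {
    private final float depth;

    public ParallaxLayer(float depth) {
        this.depth = depth;
    }

    /** Offset at which to draw this layer for a given camera position. */
    public float[] offset(float camX, float camY) {
        return new float[]{ camX * depth, camY * depth };
    }
}
```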

Currently the space background is a blend of 4 layers. 3 star layers at different depths, and a colorized noise layer.

For the very back nebula layer, it’s simply a 2D heightmap using Kurt Spencer’s OpenSimplexNoise. I use 3 instances of noise for 3 channels (red, blue, alpha), each with a unique seed. For each pixel in the pixmap, evaluate the noise and draw a pixel with the mixed channels from the noise.

static OpenSimplexNoise alphaNoise = new OpenSimplexNoise(SEED);
static OpenSimplexNoise redNoise = new OpenSimplexNoise(SEED + 1);
static OpenSimplexNoise blueNoise = new OpenSimplexNoise(SEED + 2);

public static Texture generateSpaceBackgroundDust(int tX, int tY, int tileSize) {
    Pixmap pixmap = new Pixmap(tileSize, tileSize, Format.RGBA8888);
    double featureSize = 100;
    for (int y = 0; y < pixmap.getHeight(); y++) {
        for (int x = 0; x < pixmap.getWidth(); x++) {
            //world-space noise coordinates for this pixel
            double nX = (x + (tX * tileSize)) / featureSize;
            double nY = (y + (tY * tileSize)) / featureSize;
            double opacity = alphaNoise.eval(nX, nY, 0);
            opacity = (opacity * 0.5) + 0.5; //normalize from range [-1:1] to [0:1]
            double red = redNoise.eval(nX, nY, 0);
            red = (red * 0.5) + 0.5;
            double blue = blueNoise.eval(nX, nY, 0);
            blue = (blue * 0.5) + 0.5;
            pixmap.setColor((float) red, 0, (float) blue, (float) opacity);
            pixmap.drawPixel(x, pixmap.getHeight() - 1 - y);
        }
    }
    Texture tex = new Texture(pixmap);
    pixmap.dispose(); //prevent memory leak
    return tex;
}

This gives us a simple colorized noise layer for a sort of dusty nebula look. It’s fairly primitive but works for now. I’ll be adding better colors and more details like distant spiral galaxies and other cool “space stuff”.

Generating the star layers is very simple: it’s just randomly placed white dots of varying brightness. But plain white stars are boring. So let’s add some color!

Star temperatures typically range from ~2,000K to 40,000K.
Some common color temperatures for reference (kelvin):

     1900	Candle flame
     2000	Sunlight at sunset
     2800	Tungsten bulb—60 watt
     2900	Tungsten bulb—200 watt
     3300	Tungsten/halogen lamp
     3780	Carbon arc lamp
     5500	Sunlight plus skylight
     5772   Sun's "effective temperature"
     6000	Xenon strobe light
     6500	Overcast sky
     7500	North sky light

Disclaimer: I am not a physicist, only an ape with a search engine. There may be errata in the following numbers / calculations so let me know if I made a mistake.

We can calculate the peak wavelength of the black body radiation using Wien’s displacement law: λₘT = b

  • λₘ = The wavelength corresponding to peak intensity
  • T = The absolute temperature in kelvin
  • b = Wien’s displacement constant: 2.898 × 10⁻³ m·K (0.2898 cm·K)

Which essentially means:
Hotter things - peak at shorter wavelengths - bluer
Cooler things - peak at longer wavelengths - redder

//b = Wien's displacement constant: 2.8977719 mm·K (so the result is in mm)
public static final double b = 2.8977719;

public static double temperatureToWavelength(double kelvin) {
    return b / kelvin;
}

So our peakWavelength = b / kelvin → 2.8977719 / 5772 ≈ 5.02 × 10⁻⁴ mm = 502nm.
The Sun’s temperature of about 5772K gives us a peak wavelength of 502nm, which turns out to be a sort of light green. Neat!

Green? But the sun is not only green! Very true: this calculation does not give us the full emitted spectrum, only the peak wavelength of emission.
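As a quick sanity check of Wien’s law in code (a hypothetical standalone helper, with b expressed in nm·K so the result comes out directly in nanometers):

```java
// Wien's displacement law sanity check. Here b is expressed in nm·K
// (2.8977719e6 nm·K = 2.8977719e-3 m·K) so the result is in nanometers.
public class Wien {
    public static final double B_NM_K = 2.8977719e6;

    /** Peak black-body emission wavelength in nanometers for a temperature in kelvin. */
    public static double peakWavelengthNm(double kelvin) {
        return B_NM_K / kelvin;
    }
}
```

For example, 5772K (the Sun) lands near 502 nm, while 2,000K peaks in the infrared and 40,000K peaks in the ultraviolet, which is exactly the out-of-visible-range problem discussed below the color table.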

For now we’re going to convert that wavelength to a color.

Typical color ranges:
     Color   Wavelength(nm) Frequency(THz)
     Red     620-750        484-400
     Orange  590-620        508-484
     Yellow  570-590        526-508
     Green   495-570        606-526
     Blue    450-495        668-606
     Violet  380-450        789-668
A typical human eye will respond to wavelengths from about 380 to about 750 nanometers.
Tristimulus values: The human eye with normal vision has three kinds of cone cells that sense light, having peaks of spectral sensitivity in
     short   420 nm – 440 nm
     middle  530 nm – 540 nm
     long    560 nm – 580 nm

So we want a method to approximate RGB [0-255] values for wavelengths between 380 nm and 780 nm.

Thankfully this is a solved problem, thanks to the following:

/** approximate RGB [0-255] values for wavelengths between 380 nm and 780 nm
 * Ported from: RGB VALUES FOR VISIBLE WAVELENGTHS by Dan Bruton ([email protected])
 * http://www.physics.sfasu.edu/astro/color/spectra.html */
public static int[] wavelengthToRGB(double wavelength, double gamma) {
    double factor;
    double red, green, blue;
    if ((wavelength >= 380) && (wavelength < 440)) {
        red = -(wavelength - 440) / (440 - 380);
        green = 0.0;
        blue = 1.0;
    } else if ((wavelength >= 440) && (wavelength < 490)) {
        red = 0.0;
        green = (wavelength - 440) / (490 - 440);
        blue = 1.0;
    } else if ((wavelength >= 490) && (wavelength < 510)) {
        red = 0.0;
        green = 1.0;
        blue = -(wavelength - 510) / (510 - 490);
    } else if ((wavelength >= 510) && (wavelength < 580)) {
        red = (wavelength - 510) / (580 - 510);
        green = 1.0;
        blue = 0.0;
    } else if ((wavelength >= 580) && (wavelength < 645)) {
        red = 1.0;
        green = -(wavelength - 645) / (645 - 580);
        blue = 0.0;
    } else if ((wavelength >= 645) && (wavelength < 781)) {
        red = 1.0;
        green = 0.0;
        blue = 0.0;
    } else {
        red = 0.0;
        green = 0.0;
        blue = 0.0;
    }

    // Let the intensity fall off near the vision limits
    if ((wavelength >= 380) && (wavelength < 420)) {
        factor = 0.3 + 0.7 * (wavelength - 380) / (420 - 380);
    } else if ((wavelength >= 420) && (wavelength < 701)) {
        factor = 1.0;
    } else if ((wavelength >= 701) && (wavelength < 781)) {
        factor = 0.3 + 0.7 * (780 - wavelength) / (780 - 700);
    } else {
        factor = 0.0;
    }

    final double intensityMax = 255;
    int[] rgb = new int[3];
    // Don't want 0^x = 1 for x <> 0
    rgb[0] = red   == 0.0 ? 0 : (int) Math.round(intensityMax * Math.pow(red * factor, gamma));
    rgb[1] = green == 0.0 ? 0 : (int) Math.round(intensityMax * Math.pow(green * factor, gamma));
    rgb[2] = blue  == 0.0 ? 0 : (int) Math.round(intensityMax * Math.pow(blue * factor, gamma));
    return rgb;
}

Finally we apply the calculated RGB color for a given wavelength to our stars. This gives us some nice red, yellow, green, and blue dots across the spectrum, but we have a problem: we are only calculating the peak wavelength and missing the rest of the spectrum. For a temperature range of 2,000K to 40,000K, many values have a peak radiation wavelength that falls outside the range of our vision (ultraviolet / infrared). The wavelengthToRGB() function is tuned for 380nm to 780nm (roughly 7626K down to 3715K). Any star outside of that temperature range is being calculated as black with no color.

To fix this, for now I just added a check to turn black stars white; otherwise I use the raw peak wavelength as the color for the star. This isn’t quite right, as in reality those stars still emit plenty of visible spectrum in the blue (hot) or red (cold). So in the future I will fix this by using the “CIE 1931 color space” to get the proper color temperature, and play with Planck’s law to calculate the rest of the color spectrum.

This effect is good enough for now, but there’s a problem with the gameplay feel: Since the parallax layers are so deep, they don’t move that much when the player moves in their space ship. This makes it difficult to see how fast our player is moving unless there are other objects on screen. We lack a frame of reference!

So I added another “dust” layer, which is basically the same thing as the background dust layer, but this time with no parallax effect so it is essentially at the same depth as the player. I’ll be adding more game objects and details like floaty particles and “space debris”, which should also help visually with frame of reference.

Also I should probably consider a color palette of sorts to make sure the ships and game objects are distinct from the background. Currently the colors are just random mixes of whatever I thought looked kinda cool at the time.

I’ll break down the spaceships in a future post.


Procedurally generated spaceships and barrel roll!

Let me preface this by saying I am not a graphics artist. I just really like geometry and simple shapes so this game is mostly programmer art.

A less experienced me wrote this part years ago and I was obsessed with the idea of procedural generation at the time. I wanted to apply it to EVERYTHING, including textures.

The body of the ship is built by a very simple algorithm based on coin flips to create varied edges. It first chooses a random body size and a starting point for the edge. The real code is old and ugly, so here is some pseudocode. The algorithm is as follows:

height = 0;
for (each pixel along the edge) {
	//there is a 50% chance to change or not change height
	if (random boolean) {
		//50% chance the edge will add 1 or subtract 1 from the current height
		if (random boolean) {
			height += 1;
		} else {
			height -= 1;
		}
	}
}

The edge is mirrored on the other side for symmetry. Wings are just a half triangle mirrored on either side of the body. It’s not the most sophisticated algorithm, but this produces some ok variation of shapes:
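The pseudocode above could look something like this as a standalone sketch (the names and the clamp-at-zero detail are my own assumptions, not the actual project code):

```java
import java.util.Random;

// Sketch of the coin-flip edge builder described above: walk along the ship
// body and randomly nudge the edge height up or down. Mirroring the resulting
// edge across the centerline would give the symmetric hull.
public class ShipEdge {
    public static int[] buildEdge(int length, int startHeight, long seed) {
        Random rng = new Random(seed);
        int[] edge = new int[length];
        int height = startHeight;
        for (int i = 0; i < length; i++) {
            if (rng.nextBoolean()) {                  // 50% chance to change height
                height += rng.nextBoolean() ? 1 : -1; // 50/50 up or down
            }
            edge[i] = Math.max(0, height);            // assumed clamp so the edge stays valid
        }
        return edge;
    }
}
```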

tbh it doesn’t look that great… They don’t really look like spaceships, but assets can easily be swapped later, so these placeholders will suffice for now.

Now we add a particle effect layer to show engine fire, and a gentle roll to the body to give a feel of movement:

Wait, how does one “roll” a 2D sprite?

One of the key features I wanted was a dodging mechanic / barrel roll effect for avoiding incoming projectiles. This is simply an impulse to the left or right. For the animation I wanted the ship to do a barrel roll (Star Fox 64 was dope!). But this is a 2D game. How do I rotate a 2D sprite on a 3D axis?

My first attempt was to pull and stretch the edges of the texture, similar to how you might use the transformation tool in Photoshop. I originally tried to do this effect with a shader using this: GitHub - bitlush/android-arbitrary-quadrilaterals-in-opengl-es-2-0: Arbitrary Quadrilaterals in OpenGL ES 2.0 for Android

But I couldn’t figure it out or get it to work at the time, so I switched to actually using a 3D renderable model. This is where the engine gets a little tricky, as I start to mix 2D and 3D rendering layers. The model is basically just a rectangle with no width, and a texture for each side. Based off this:


This complicates the rendering pipeline a little and probably adds a performance hit, but it works for now. I also need to figure out how to apply a light source so the 3D rotation casts a little bit of shadow, which might make the rotation look better. The point light source could be the nearby stars.

To make the movement feel a little more reactive, we lag the camera behind the player using linear interpolation instead of locking the camera position directly to the player. There’s a good GDC talk on camera stuff:

Here is with the camera locked to player (with debug render on to get an idea of the motion vectors):
camera.position = player.position
This feels static and boring. Like the ship isn’t moving. Here is with camera linear interpolation enabled:
camera.position = lerp(player.position, lerpAmount * delta)

It’s amazing how much of a difference this very simple change makes in terms of feel.
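The lerp idea boils down to moving the camera a fraction of the remaining distance to the player each frame, so it eases in rather than snapping. A self-contained sketch of the same math (libGDX’s Vector3.lerp does the equivalent; the names and tuning value here are illustrative):

```java
// Frame-rate-scaled linear interpolation toward a target. Each call moves
// the camera a fraction (lerpAmount * delta, capped at 1) of the remaining
// distance, which produces the smooth lagging follow described above.
public class SmoothCamera {
    public float x, y;

    public void follow(float targetX, float targetY, float lerpAmount, float delta) {
        float alpha = Math.min(1f, lerpAmount * delta);
        x += (targetX - x) * alpha;
        y += (targetY - y) * alpha;
    }
}
```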

But there’s still a lot of camera work to be done. For example, I need to figure out how to keep multiple entities in focus when the player is in combat with another ship, like how the camera zooms in and out for you in Smash Bros. Problems for later…

Speaking of combat. You can shoot!
[Mouse left-click] or [B on Xbox controller] or [O on playstation]

And if you don’t feel like dodging the incoming projectiles: you can use a shield to block them. It has a bit of a charge and discharge time and it’s only active when fully charged:
Hold [Shift] or [A on Xbox controller] or [X on playstation]

I want to add different kinds of ships and allow the player to customize their ship either visually or functionally somehow, but that’s out of scope for now.

I will focus on only one type of ship and getting it to feel right in terms of responsiveness and “game feel”, meaning tuning the impulse and animation timings to find what feels good. Another good talk:


For the “reverse” particle effect when charging the ships gun, are you doing that by hand or using the 2D particle editor? That looks really good!


Yep, that’s exactly what I built the particles in:


That tool pops out a little text file that defines the properties for the particles. Which looks something like this:

You then load that into a particle emitter (I’m also using pooling for memory):

//engine fire
fireEffect = new ParticleEffect();
fireEffect.load(Gdx.files.internal("particles/shipEngineFire.particle"), Gdx.files.internal("particles/"));
fireEffectPool = new ParticleEffectPool(fireEffect, 20, 20);
engineColor = new float[]{ 1, 0.34901962f, 0.047058824f };
engineColorBoost = new float[]{ 0.047058824f, 0.34901962f, 1 };
engineColorHyper = new float[]{ 1, 0.047058824f, 0.34901962f };

//projectile charge
chargeEffect = new ParticleEffect();
chargeEffect.load(Gdx.files.internal("particles/absorb2.particle"), Gdx.files.internal("particles/"));
chargeEffectPool = new ParticleEffectPool(chargeEffect, 20, 20);

You can also control all the particle properties programmatically.

In the case of the charge particle, we need to update the emitter position to be at our projectile’s position:

private void updateChargeParticle(Entity entity, ParticleComponent particle) {
    ChargeCannonComponent cannon = Mappers.chargeCannon.get(entity);
    if (cannon != null) {
        if (cannon.isCharging) {
            //...
        } else {
            //...
        }
    }
    TransformComponent transform = Mappers.transform.get(entity);
    particle.pooledEffect.setPosition(transform.pos.x, transform.pos.y);
}

I keep my particle files in …/assets/particles/

The particle system code is all in systems/ParticleSystem if you’re curious.

I am going to add basically the same inward-gravity particle effect when charging up the shield, but with blue particles instead. Or whatever color the shield ends up being; still just picking random colors as I go. That’d prolly look neat.


Update on colors: Luckily all the heavy lifting is done for us by John Walker’s Colour Rendering of Spectra:

I won’t go into how it works here but the curious can check the link. I ported the supplied public domain C to Java.

Now when I generate the star texture I give each star a random temperature, then set the color for the star based on that temperature.

(I’m using Vectors to store the color values because the built-in libgdx Color class clamps the r, g, b values when setting them, which was messing with calculations before giving the final normalized result.)
Vector3 has x, y, z so: x = red, y = green, z = blue

public static Texture generateSpaceBackgroundStars(int tileX, int tileY, int tileSize, float depth) {
	MathUtils.random.setSeed((long) (MyMath.getSeed(tileX, tileY) * (depth * 1000)));
	Pixmap pixmap = new Pixmap(tileSize, tileSize, Format.RGBA4444);
	int numStars = 200;
	for (int i = 0; i < numStars; ++i) {
		int x = MathUtils.random(tileSize);
		int y = MathUtils.random(tileSize);
		//give star a random temperature
		double temperature = MathUtils.random(2000, 40000); //kelvin
		//calculate the color for the given temperature
		Vector3 spectrum = BlackBodyColorSpectrum.spectrumToXYZ(temperature);
		Vector3 color = BlackBodyColorSpectrum.xyzToRGB(BlackBodyColorSpectrum.SMPTEsystem, spectrum.x, spectrum.y, spectrum.z);
		Vector3 normal = BlackBodyColorSpectrum.normRGB(color.x, color.y, color.z);
		//draw a point with that color
		pixmap.setColor(normal.x, normal.y, normal.z, 1);
		pixmap.drawPixel(x, y);
	}
	//create texture and dispose pixmap to prevent memory leak
	Texture texture = new Texture(pixmap);
	pixmap.dispose();
	return texture;
}

And this is the current result (blue = hot, red = cold)

Right now the background stars are all just single pixels, but I’ll add some size variation later…

I realized I can also mix this color-temperature stuff into the nebula generation, so it’s based on pockets of heat and cold rather than arbitrary mixes of blue and red noise.

I don’t even know what art style to settle on, so I’m not sure why I’m spending so much time on the colors of the stars, but I’m having fun learning about cool space stuff. I want to play around with generating stuff like this in the future (this is a really cool in-browser demo):
How it works: Rendering a Galaxy with the density wave theory

Now onto the “real” stars. The background parallax layer star textures are purely cosmetic and don’t serve any gameplay purpose. But we also have some much closer actual star entities that are part of the gameplay.

Currently the galaxy generation is just randomly placed points, and at each point will be a star or body of some flavor.

For the star entities I generate a circular noise texture. The height values are represented in grayscale, where 0 is black and 1 is white.

/** generate circular grayscale heightmap to represent star and features */
public static Texture generateStar(long seed, int radius, double scale) {
    OpenSimplexNoise noise = new OpenSimplexNoise(seed);
    Pixmap pixmap = new Pixmap(radius * 2, radius * 2, Format.RGBA8888);
    // draw circle
    pixmap.setColor(0.5f, 0.5f, 0.5f, 1);
    pixmap.fillCircle(radius, radius, radius - 1);
    //add layer of noise
    for (int y = 0; y < pixmap.getHeight(); ++y) {
        for (int x = 0; x < pixmap.getWidth(); ++x) {
            //only draw on circle
            if (pixmap.getPixel(x, y) != 0) {
                double nx = x / scale, ny = y / scale;
                float i = (float) noise.eval(nx, ny, 0);
                i = (i * 0.5f) + 0.5f; //normalize from range [-1:1] to [0:1]
                pixmap.setColor(i, i, i, 1);
                pixmap.drawPixel(x, y);
            }
        }
    }
    Texture texture = new Texture(pixmap);
    texture.setFilter(Texture.TextureFilter.Linear, Texture.TextureFilter.Linear);
    pixmap.dispose();
    return texture;
}

I am not sure how much depth to go into in these breakdowns; I am definitely skipping over some things, like how noise works. I feel like most people who have played Minecraft are at least vaguely familiar with the concept of a seed and maybe rng noise. If not, this is a great resource on how perlin noise is generated (I am using OpenSimplex instead of perlin, but the same principles apply):

Anywho, I wanted to bring a little life into this static noise. So I’m learning how to write shaders, and here is my first-pass attempt at animating the texture. I’m shifting the heightmap values by adding an offset that changes over time; dark values become brighter and bright values become darker in a sort of endless morphing. Then we assign a color per height value.

It currently looks a little dull, so I’ll play with saturation and other effects to fix that. Previously I had been baking the colors into the texture during generation, but doing the coloring live in a shader offers a lot more flexibility.

For those not familiar with shaders: they’re essentially mini-programs that run on the GPU instead of the CPU. In the case of this fragment shader, we can think of it as running this program for each pixel in our texture.

  • First we sample the current pixel color from the texture coordinate. The color vector contains the red, green, blue, alpha channels for the pixel.

  • Then we create a new height value by offsetting the current height value. The color channels are represented as values from 0 to 1, so we wrap the values with modulo to keep them between 0 and 1.

  • Finally we colorize the new shifted height values. Since this is a grayscale image we know the values for red, green, and blue will all be equal so we only need to check one channel to get the height value.
    If the height is greater than 0.5 (half): the pixel will be a shade of yellow (by removing the blue channel). Otherwise: it will be a shade of red (by removing the green and blue channels).

varying vec4 v_color;
varying vec2 v_texCoord0;

uniform sampler2D u_sampler2D;
uniform float u_shift;

void main() {
	vec4 color = texture2D(u_sampler2D, v_texCoord0) * v_color;

	//shift height values to animate colors
	vec3 fireColor = mod(color.rgb + u_shift, vec3(1.0));//wrap values [0-1] with modulus
	if (fireColor.r > 0.5) {
		//set to shades of yellow
		fireColor.b = 0.0;
	} else {
		//set to shades of red
		fireColor.r = 1.0 - fireColor.r;
		fireColor.g = 0.0;
		fireColor.b = 0.0;
	}

	gl_FragColor = vec4(fireColor, color.a);
}

For the control of the height shifting: We can pass information from our cpu code to the gpu code (java → glsl) via uniforms!

In our shader we defined a uniform float that holds the shift value: uniform float u_shift; and we can set that value using ShaderProgram.setUniformf(<name of uniform>, <data>);
Which looks something like this (where deltaTime is the time between frames, for framerate-independent calculations):

float shift = 0; //accumulator
float shiftSpeed = 0.5f; //however fast you want values to change

public void update(float deltaTime) {
    //update shift value and pass it to the shader
    shift += shiftSpeed * deltaTime;
    starShader.setUniformf("u_shift", (float) Math.sin(shift));
    //do rendering things...
}

By passing in the sine of our accumulator, we will always have a value that oscillates between -1 and 1.

And of course we need to apply the shader to the sprite batch (shaders are stored in …/assets/shaders/):

SpriteBatch spriteBatch = new SpriteBatch();
ShaderProgram.pedantic = false;
ShaderProgram starShader = new ShaderProgram(Gdx.files.internal("shaders/starAnimate.vert"), Gdx.files.internal("shaders/starAnimate.frag"));
if (starShader.isCompiled()) {
    spriteBatch.setShader(starShader);
}

This is not the final effect, but it’s much more interesting than a static image, so that’s pretty fun while I try to wrap my head around shaders. The texture could use some more octaves/layers. But first I need some brightness and lighting to make it not so dull: bloom! Which I understand so far as blending in a layer of brightened Gaussian blur. That should make it look more like a light source, I think?

Right now all the star entities use this sorta yellow-and-red color profile, but I’ll try to feed the color temperature stuff from earlier into the shader, so in theory I can dynamically render a hotter blue star vs a cooler red star by passing in the color temperature as a uniform. Which should be pretty cool.

I am not sure how I am going to do the flares yet. Probably some fiery sorta particles emitting outward all along the edge of the star. Maybe another texture for the flares and some kinda shader to move them.

Then eventually have the star as a light source for adding shadows to nearby planets and entities.

Also I’m starting to really like shaders, and I am heavily considering replacing the parallax system with a single quad that fills the viewport and writing a shader to render space stuff to the quad. This should allow for way more visual possibilities and nicer rendering of space stuff, nebulae and galaxies. I already have a prototype based on an existing project that can essentially render shader toys (with some conversion) to a quad, but I am having trouble getting the image not to stretch/squish on window resize. It only looks OK when the window is a perfect square. The solution may be to render to a perfect-square FBO, then fill the viewport with that…? todo: learn how frame buffers work.

The controls need some tweaks. The movement is currently relative to the orientation of the ship, meaning moving left moves the ship to its left, which is fine when your ship is facing up or “forward”. This is the intended behavior, but when facing the opposite direction (pointed down, facing something below you), left now pushes your ship to the right of the screen. It is a little disorienting and awkward when trying to navigate around asteroids. Perhaps this is a side effect of tying both facing and movement controls to a single stick on the controller. It might feel better to split movement to the left stick and orientation to the right stick… todo: test that theory.

Speaking of asteroids, stuff about asteroids and polygons coming up next…


Other than visuals, the last couple weeks I have spent mainly on implementing destructible asteroids and spawning mechanics.

One of the inspirations for this game is the original arcade Asteroids. Let’s take a quick look at the way the asteroids break up in the original:
When you destroy an asteroid, it just spawns 2 new smaller child asteroids that don’t really relate to the parent asteroid.

This is something I wanted to improve upon by having the pieces that break off make up the original object, meaning the sum of the child bodies should maintain the total mass and surface area of the parent body. How to accomplish this? Voronoi diagrams!

The plan is to do this using Voronoi diagrams, but for now I am just using Delaunay triangulation until I fix my voronoi stuff. A cool property of voronoi diagrams is that the polygons they create are always convex. If you have ever tried to implement collision detection, you will know why this is a nice feature to have. I also just find them very pretty, and I think they should make for a satisfying shatter mechanic.

G E O M E T R Y == A E S T H E T I C

Now this is something I worked on years ago and gave up on due to being busy with education at the time (I struggled with the maths, and Fortune’s algorithm intimidated me). I never got around to finishing it, as I got distracted with other parts of the engine and busy with life. Yikes, I just went and looked at my commit history and my original voronoi code goes back to 2016, meaning I left this problem for ~6 years… feels bad. But first, I need to remember what the heck I was even doing… I have this problem where I like to reinvent wheels and try to solve problems myself instead of using an existing solution (Fortune’s sweep-line algorithm). This was my hacky, almost-working approach:

The dual graph of a Voronoi diagram is the Delaunay triangulation for the same set of points. And since libgdx has a lovely DelaunayTriangulator class built in, my plan of attack was to abuse this property to derive the voronoi cells from the delaunay cells.

First we calculate the delaunay triangulation for a set of points. We also define the outermost points as the convex hull, in red. All points within will be the shattered pieces.

Then to get the voronoi vertices, we calculate the circumcircle of each triangle. A circumscribed circle, or circumcircle, of a polygon is a circle that passes through all the vertices of the polygon. The center of this circle is called the circumcenter and its radius is called the circumradius.
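The circumcenter has a standard closed-form solution, so a helper for this step might look like this (a standalone sketch, not the project’s actual code):

```java
// Circumcenter of a triangle: the point equidistant from all three vertices,
// i.e. the center of the circumcircle described above. Standard closed form
// derived from the perpendicular bisectors of the triangle's edges.
public class Circumcircle {
    public static double[] circumcenter(double ax, double ay,
                                        double bx, double by,
                                        double cx, double cy) {
        double d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by));
        double a2 = ax * ax + ay * ay;
        double b2 = bx * bx + by * by;
        double c2 = cx * cx + cy * cy;
        double ux = (a2 * (by - cy) + b2 * (cy - ay) + c2 * (ay - by)) / d;
        double uy = (a2 * (cx - bx) + b2 * (ax - cx) + c2 * (bx - ax)) / d;
        return new double[]{ ux, uy };
    }
}
```

Note that d goes to zero for degenerate (collinear) triangles, which a robust version would have to guard against.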
It starts getting a little messy with more points:

Once we have all the circumcenters, we know these are the vertices of the voronoi cells. So I try to connect each cell by determining the neighboring cells. But sometimes these points fall outside the hull (outside the bounds of the polygon), in which case we want to connect to the edge of the hull and no further. Abusing the dual-graph property again, we know that the edges of a voronoi cell will always pass through the midpoints of the delaunay edges. So we calculate the midpoint of each delaunay edge, and that gives us the edge points.


It’s honestly pretty close, aside from some literal… edge cases ;p The center cells are all correct, but there are issues calculating some of the cells where the edge meets the hull, and it is sometimes broken where I calculate from midpoint to circumcenter.

So my goal is to either fix my original jank, or it may be better to do it the “right way” and replace it with Fortune’s algorithm. I have reinvented far too many wheels at this point… What I really need is a simple function that takes in a set of vertices defining the polygon hull plus all interior points, and returns a list of vertices defining each sub-polygon…

For now here’s a demo with just Delaunay for shattering and debug rendering. It isn’t as pretty as voronoi will be as we just get a bunch of triangle shards, but it does still maintain the original body and works well enough for testing gameplay.

Now we generate an asteroid belt around our star!

Playing with the shattering physics, here’s a fun little scenario where we crash a set of asteroids into another set of asteroids.

I’m done adding new features for the moment; the focus will be on tweaking and tuning the physics, density and impact feel, the scale of objects, and player movement. Plus cleaning up the code and improving rendering.

Also there is not much of a “gameplay loop” at the moment. So I think next is to add some resource drops from the asteroids, e.g. minerals, metals, gems? And then perhaps adding a space station we can dock at to sell our loot. At a space station you could buy ships and upgrades etc.


Been feeling overwhelmed by my ever-growing todo list (and a battle against winter SAD + a lovely dash of anhedonia…). Anywho, back to dev!

Mostly focused on asteroids still. Been playing with different physics models for orbiting bodies. I originally wanted to use some good old Newtonian n-body mechanics where bodies have mass and attract each other. But it turns out gravitationally orbiting bodies are a chaotic system, not a stable one: extremely sensitive to initial conditions, where any tiny influence on a “stable” system will throw it into chaos (see the 3-body problem and the double pendulum). Even if we did calculate a perfectly stable set of initial conditions, the system would be thrown out of order once the player starts interacting with objects (e.g. shooting an asteroid), returning the system to chaos.

I am not sure how ‘simulatory’ vs ‘arcadey’ to make the gameplay physics feel, and the same problem applies to the scale of the universe. I am obviously using much, much smaller distances between planets and stars, because space is absurdly large and emulating reality, where one would spend literal years traveling between planets, would not make for a fun gaming experience. Keeping things smaller also helps mitigate floating point rounding errors with large coordinates in deep space. Also to consider are the Box2D constraints: Box2D is tuned for MKS (meter-kilogram-second) units and optimized for objects between 0.1 and 10 meters, with a translation limit of 2 meters per physics step, giving a velocity cap of 2 × steps-per-second (~120 with my current settings, which is a little anemic, but I can bypass this limit via my own physics system for the “hyperdrive” feature where the player can travel much faster; that’s a topic for another post).
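The back-of-envelope math on that velocity cap, assuming Box2D’s default max translation of 2 meters per step (a sketch of the arithmetic, not engine code):

```java
public class SpeedCap {
    // Box2D clamps movement to maxTranslation meters per physics step,
    // so the effective top speed scales linearly with the step rate.
    public static float maxSpeed(float maxTranslationMeters, float stepsPerSecond) {
        return maxTranslationMeters * stepsPerSecond;
    }
}
```

With 60 steps per second that works out to 2 × 60 = 120 m/s; running more steps per second raises the ceiling but costs more CPU per frame.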

Often games lie to the player for the sake of gameplay. Since I don’t want the player to be able to influence the orbit of planets, we fake it. I am “cheating” by essentially treating each orbiting body as a simplified 2-body problem where the parent body never moves and the orbiting body is locked in a perfectly circular orbit around it. Sacrificing realism for simplicity/predictability.

First we choose a distance from the parent body; this is the fixed orbit radius. And a tangential speed for how fast to move the body around the point of orbit. In this simplified model we don’t have to worry about mass and gravity and all that. To calculate the position of the body, simply convert from polar to Cartesian (angle & radius → XY) coordinates:

public static Vector2 polarToCartesian(float angle, float radius) {
	float x = (float) (Math.cos(angle) * radius);
	float y = (float) (Math.sin(angle) * radius);
	return new Vector2(x, y);
}

With a little tweaking I can essentially turn the orbit into a function of time, allowing us to “move forward or backwards in time”. This timeshift won’t be a final gameplay mechanic, but it’s neat for testing/debugging things.
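The orbit-as-a-function-of-time idea boils down to making the angle depend only on elapsed time. A hedged, self-contained sketch (names are illustrative and it uses a plain float array instead of libgdx’s Vector2; not my actual engine code):

```java
public class Orbit {
    // Same polar-to-Cartesian conversion as above, restated standalone.
    public static float[] polarToCartesian(float angle, float radius) {
        return new float[] { (float) (Math.cos(angle) * radius),
                             (float) (Math.sin(angle) * radius) };
    }

    // Parent-relative position at time t for a fixed circular orbit.
    // Because position is a pure function of t, feeding in a smaller t
    // "rewinds" the orbit, which is what makes the timeshift debugging trick work.
    public static float[] orbitPosition(float startAngle, float angularSpeed,
                                        float radius, float t) {
        return polarToCartesian(startAngle + angularSpeed * t, radius);
    }
}
```

Stepping `t` forward each frame gives normal orbital motion; stepping it backward plays the orbit in reverse with no extra state to track.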


Orbit mechanics for the asteroids are a little trickier: we use the same simplified 2-body orbit model to keep them orbiting our star, but they’re treated a little differently because the asteroids have actual physics bodies that the player can interact with via collision and impulse resolution. This introduces a problem, as these 2 systems are in conflict with each other. I want to keep the asteroids orbiting the parent body, but also allow the player to hit them and knock them out of orbit. So we tell another lie to the player: the asteroids are given a state, ‘free’ or ‘orbit locked’.

By default when spawned in a belt the asteroids are ‘orbit locked’ and the body velocities are set perpendicular to the angle between the asteroid and parent body.
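Setting that perpendicular velocity is just a 90-degree rotation of the radial direction. A hedged sketch under the same illustrative-names caveat as above:

```java
public class OrbitVelocity {
    // Returns {vx, vy}: velocity of the given speed, perpendicular to the
    // line from the parent body to the asteroid (counter-clockwise orbit).
    public static float[] tangentialVelocity(float parentX, float parentY,
                                             float bodyX, float bodyY, float speed) {
        float angle = (float) Math.atan2(bodyY - parentY, bodyX - parentX);
        // Rotating the radial unit vector (cos a, sin a) by +90° gives (-sin a, cos a).
        return new float[] { (float) (-Math.sin(angle) * speed),
                             (float) ( Math.cos(angle) * speed) };
    }
}
```

An asteroid directly to the right of the star ends up moving straight “up”, which is the tangent of a counter-clockwise orbit; negate both components for clockwise.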

But once an asteroid is interacted with and meets an impulse threshold (e.g. collision with another asteroid, or a hard enough collision with the player, but not collision with a bullet), the state is switched to ‘free’, in which regular conservation of motion is respected and the asteroid floats freely, influenced by regular impulses and collisions. Also: if the player collides with an asteroid hard enough, it will damage the player’s ship unless the shield is active. So don’t ram asteroids with your ship!

This will take some tweaking and tuning to get right, but it works pretty well for now.

I’ll also need to mask a texture over the polygons so they look more asteroid-like and less like arbitrarily colored shapes.

More fun playing with shatter mechanics, I find it rather satisfying to crash an asteroid field into an asteroid belt:

I did have a LOT of fun making this little interactive n-body toy, using some numerical integration to learn about and play with the mechanics of chaos.
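The core of such a toy is only a few lines of integration. A hedged, simplified sketch (semi-implicit Euler with G and all masses normalized to 1, not my actual code):

```java
public class NBody {
    // Positions and velocities for n equal-mass bodies (m = 1, G = 1).
    public double[] x, y, vx, vy;

    public NBody(double[] x, double[] y, double[] vx, double[] vy) {
        this.x = x; this.y = y; this.vx = vx; this.vy = vy;
    }

    // One semi-implicit Euler step: accumulate pairwise gravitational
    // accelerations, then update velocities first and positions with the
    // *new* velocities (more stable for orbits than plain explicit Euler).
    public void step(double dt) {
        int n = x.length;
        double[] ax = new double[n], ay = new double[n];
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                if (i == j) continue;
                double dx = x[j] - x[i], dy = y[j] - y[i];
                double r = Math.sqrt(dx * dx + dy * dy);
                double inv3 = 1.0 / (r * r * r); // a = G*m * dr / r^3, G = m = 1
                ax[i] += dx * inv3;
                ay[i] += dy * inv3;
            }
        }
        for (int i = 0; i < n; i++) {
            vx[i] += ax[i] * dt;
            vy[i] += ay[i] * dt;
            x[i] += vx[i] * dt;
            y[i] += vy[i] * dt;
        }
    }
}
```

Because pairwise accelerations are equal and opposite, total momentum stays conserved even as individual trajectories go chaotic; that asymmetry between conserved global quantities and wildly sensitive individual paths is exactly what makes the toy fun to poke at.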


The figure 8 is a classic “stable” solution as computed by Carles Simó (positions given as complex numbers in the plane):

x1(0) = −x2(0) = 0.97000436 − 0.24308753i
x3(0) = 0
ẋ3(0) = −2ẋ1(0) = −2ẋ2(0) = −0.93240737 − 0.86473146i

EDIT: struggling to find the original source; hopefully I am not mis-crediting. Please correct me if I am wrong. I pulled those values from this paper:

Also see:

However looks like this was discovered earlier by Cristopher Moore:

Here are some other cool stable periodic solutions: 3 Body Problem - Periodic Solutions - YouTube

Veritasium has some good stuff on chaos too.

Overall I am pretty happy with the progress made this month. Unfortunately I didn’t quite get to the point where I have a nice demo to present, as there is still so much to do. I’m not sure if I should just say screw it and release a build as-is in this very unfinished developer state, with all the debug rendering and crap, and just call it a basic engine demo for now so people can poke at it? Or spend a few extra weeks (probably longer, I typically underestimate) trying to make it a little more presentable first?

Those who are brave/impatient can try following the build instructions in the readme.md. Be warned: all my devices are 1080p, so if you have an ultrawide or 4K display there may be some scaling/rendering issues, as that is untested territory. Please let me know if you do.

This thread is now just an unofficial official devlog I guess. Debating putting together a slightly more coherent and detailed devlog (feat. more sources) of which these posts are a sort of rough draft as I untangle my notes and brainfuzz.

Merry new year, y’all!


Update: Just confirmed some broken rendering on high-res screens.

edit: removed unrelated rant about how much i dislike hole punches and notches in phone screens

Anywho… The Pixel 4 XL has a resolution of 1440 x 3040 pixels (which is sort of a lie thanks to the rounded corners…but I digress), and my parallax layers don’t quite reach the edges. And the title screen animation seems to be a little heavy at that resolution, tanking the FPS, so I’ll try to fix that soon.


One trick you can do is put a “drop shadow” right behind an object so it has a subtle dark outline, similar to a mouse pointer or windows. When done right in a game it’s not noticeable unless you know it’s there, and it really helps things pop out. Learned this while modding SPAZ, which despite the name and corny introduction was a fun game with significant mod-ability.

I’m loving the details of the physics and math you are using.


Under which objects? Like spaceships and stuff? That would probably help with contrast yeah, good idea. Looking at those SPAZ screenshots, you mean this outline?

I may have to acquire that game now for some inspiration/ideas…

Now for the part where I overthink the implementation: should it be another texture underneath, or can it be done with a shader? Not sure if a regular vertex/fragment shader works the same on 3D meshes; I feel like the barrel roll feature complicates things. I was looking into outline shaders for highlighting objects, which would be useful for some other stuff I have planned.

Glad you are enjoying the physics and math, it’s a lot of fun. I struggled with maths when I was younger (cuz highschool sucked and I didn’t care at the time), so I had to rediscover it for myself and put in a lot of work to relearn it. It was only as I got into more involved programming, ran into certain problems, and had no choice but to learn that I realized how useful math is; now I love it. Game dev is a lot of trigonometry, vectors, matrices, algebra, topology. I’m still a little slow to wrap my head around certain concepts, but through trial and error and free education on youtube I’ve developed a (mostly) functional foundation. Fourier is currently one of my favorite mathematicians; my partner and I used an FFT to build an audio visualizer for a final project in one of our Computer Engineering courses. Good times…

Now I’m obsessed with space and physics and fractals and light and endless list, it’s such a cool world. I really want a solar telescope, bit pricey tho.

I’m super excited to see what comes out of the Inouye Solar Telescope

Image Credit: Inouye Solar Telescope: First Light - NSO - National Solar Observatory

Also very excited for when the James Webb starts sending back images.


The glow around the ships are the shields. The drop shadow is part of the raw ship image. There’s basically one visibly darkened pixel around the ships right outside the “edge”.

You can finally see it if you open these up in an editor that can show png alpha with a magenta background. Like I said, very subtle.
ship_bigBrother_pirate ship_saucer_pirate ship_tug_pirate

When I modded I actually went slightly heavier with the shadow, with it lasting about 2-3 pixels or so.

The game is definitely a great reference for making what I call “thick space”, space that’s atmospheric and visually fun to be in rather than anywhere close to realistic. The game itself is on sale for $5 as of right now on gog and $4 on steam.
If you do get it, here is the mod pack. The game was made with a heavily modified torque-2d engine if you are curious. The devs would certainly encourage you NOT to try the same lol. I believe the engine on github might be able to make compatible effect files, though I personally used a pirated version at the time.
And windows 10 black screen fix if needed.

This game is also a good example of what NOT to do with some effects. They quickly become too much and you can’t see shit; one of the first of many things I did when I modded the game was reduce laser width by a third. I have hot opinions about what makes effects good or bad that I can spout off on if you like.


Ok, yeah I see it now.

Simply add an extra few pixels around the edge, no need for shaders. Added to the todo list. Thanks.

I’ll check it out for sure. Looks like torque-2D sits on top of box2d. Neat.

I actually do plan to add a laser, the functionality would be to slice asteroids where the laser intersects. My effects are very basic right now, just simple particle effects. It’s definitely easy to go overboard on the ‘juice’. I love hot opinions, hit me.


Alrighty. So I broke the parallax layer while fighting with a separate camera and an ExtendViewport to scale the rendering for more screen resolutions and ratios; it’s close but not quite there.

This got me thinking about the rendering pipeline as a whole, and what the correct way to organize things might be.

This is a high level view of the current rendering pipeline in order of execution. Render order is important for which layer draws on top of which layer to make up the composition of the scene.

Rendering Pipeline: 
	clear screen -> clear color and depth buffer: clears display between frames
	cam update -> move, zoom, and update camera
	space parallax renderer -> render background parallax star/noise layers
	sprite 2D renderer -> render regular textures, sorted by z depth
	sprite 2D shader renderer -> render textures with shader applied: stars
	asteroid renderer -> custom shaperenderer to draw filled polygons; to be swapped out with texture mask over polygon
	sprite 3D renderer -> render 3d renderables of spaceships
	shield renderer -> shape render shield polygon
	particle renderer -> render particles on top of sprites
	hud system -> render player info, minimap, healthbars, draw touch screen controls on mobile, and in game Menu when paused
	screen transition system -> renders full screen white out animation between game states and handles transition logic
	debug system -> render debug and diagnostic info (fps): should always be last so we can see debug info at all times

To add a bloom effect to the stars, the plan is to render the stars to an FBO, then apply the bloom shader to that FBO, which will probably happen in the sprite 2D shader system. Considering using the gdx-vfx library, which handles bloom and more; it’s in beta but looks stable enough. I generally prefer to implement things myself and minimize dependencies, but no need to reinvent wheels.

Found PolygonSpriteBatch and PolygonRegion which might solve the texture mapping for asteroid polygons. libgdx/PolygonRegionTest.java at master · libgdx/libgdx · GitHub

I end up using the shape renderer a lot for debug, UI and in game drawing. There’s a ShapeDrawer library that does smoother rendering and some other handy things, so I may migrate to that.

Considering a lighting system. Stars could be a light source. Ship engine fire could be a light source. Could fake it with particles / sprites as lights. Looking into box2dlights.

The render steps could probably be simplified, because I have just been adding layers as I need them. Trying to figure out requirements so I can put a little planning into implementing things in a memory-friendly and Entity Component System-friendly way, e.g. what textures and resources can be pooled, where to apply shaders, how to minimize flushing the batch. Questions for a profiler…

I’m looking for resources on what a good rendering pipeline / architecture should look like in OpenGL and ECS. Some interesting reading.

The next release of libgdx moves to LWJGL 3, which brings some nice things with it such as better Linux support, Vulkan bindings, and ARM build support: M1 Macs!

Been spending more time reading than coding. Gonna take a lil break from graphics.


Spent more time outside on the bike than behind a screen over the summer.

Back for another sprint. Gonna be lighter on the technical posts this time to focus on dev.

I borrowed a steam deck from a friend and got a build to work on there. But I had to go to the desktop side to get java running:

sudo steamos-readonly disable
sudo pacman-key --init
sudo pacman-key --populate archlinux
sudo pacman -S jre-openjdk

After that it did Just Work™!

But only from the desktop side… It wouldn’t launch from steam os side for some reason.

Also curious is that from the desktop side, the gamepad controls were using desktop input and not controller input (e.g. the right thumbstick was moving the mouse rather than the analog stick in game). I got it working with an external controller plugged in.

Maybe there’s a setting to treat gamepad input as gamepad input from the desktop side?

I tried gdx-vfx but ran into some issues with the frame buffer or blending that I can’t figure out.

I gave up on bloom for now, but here are some pretty images of rendering bugs:

It looks similar to a GL_CLAMP_TO_EDGE artifact, but that didn’t seem to be the issue.

Sometimes my gpu gets angry and even my second monitor artifacts beautifully:


I don’t think that’s a texture-sampling bug, it looks more like a render-target issue. Normally when you see those trails the framebuffer isn’t being cleared.

Yeah, it didn’t end up being clamp-to-edge; it just looked similar at the edges.

I think it’s something to do with the specific bloom shader’s blending of the buffers in the vfx library that I couldn’t figure out. Some of the other effects worked just fine.

I’ll fight with it later… I just need to make things work for now, so almost everything uses ShapeRenderer because it’s just easy to use.


Hot plugging for controllers!:

Any thoughts on wing designs?