Roche Fusion Technical Recollection

A few days ago I was idly scrolling through my twitter history, reminiscing about the last few years of my activities as a game developer. For about two years a major part of that – and my life in general – was Roche Fusion, the game project I started together with Tom Rijnbeek in December 2012, and which we finally finished – with a team of 7 – in January 2015.

For today’s post I thought I’d step through the major points of Roche Fusion’s early development from a technical standpoint, and give some insight into our development process.

Next week I will continue on the same topic and try to give some more insight into the later stages of the game’s development.

If you are unfamiliar with the game, check out its trailer, and if you like explosions, consider getting it on Steam.

Note that I will skip a lot of features of the game – including most of its content gameplay-wise – and instead focus on the purely technical bits I developed. Given my role as team-lead and graphics-programmer, technical in this case mostly means graphical.

Starting out (December 2012 – October 2013)

Roche Fusion started out as a small 2-day prototype built in my own graphics engine, which I had created just for fun. For comparison, here is what it looked like back then.

After having the code lying around for several months, Tom and I thought it would be a great idea to create a game together. After some back and forth we decided that we were going to work on a space shooter, and to use my prototype as a basis.

From that day in early December 2012 on, we started building our team and writing code – and to be honest, we had no idea how long we would be busy with the game. Unfortunately, little public evidence of those days exists – with the exception of some entirely outdated posts on our public developer forums.

After two months of development we had more or less re-written the entire prototype in a much more flexible way, and started adding more content and functionality.

On November 1, 2013, we published our first gameplay trailer, marking the release of our first public alpha.

First major graphical redesign (November 2013)

With time we realised that our ambitions far exceeded our original ideas of how long it would take us to finish the game. We made the decision to commit, and thus started working on a number of exciting features.

One of the biggest changes was a complete overhaul of our rendering pipeline to include post processing and complicated custom shaders.

This not only helped us waste plentiful GPU cycles, but also gave us a much clearer vision of where we wanted to go with the graphical style of the game.

Not to mention that it was the start of a process that ended with Roche Fusion being the epilepsy-inducing beauty that it is today.

Some minor hiccups aside that is…

This was also when we decided on the final name – Roche Fusion.

Another big graphical change was the inclusion of particle effects based on the enemies’ sprites. In essence, on destruction we tear apart the respective sprite and create particles from its pieces, to make it look like the ship literally falls apart.
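To illustrate the general idea – this is a rough Python sketch, not the actual game code, and the names and numbers are purely illustrative – one can cut the sprite’s area into a grid of fragments, keep each fragment’s UV rectangle so it still shows its piece of the original image, and fling the fragments outwards from the centre:

```python
import random

def shatter_sprite(sprite_w, sprite_h, cell=8):
    """Split a sprite's area into small rectangular fragments, each
    becoming a particle that keeps its source UV rectangle so the
    fragments together still show the original image."""
    particles = []
    for x in range(0, sprite_w, cell):
        for y in range(0, sprite_h, cell):
            # centre of this fragment, relative to the sprite centre
            cx = x + cell / 2 - sprite_w / 2
            cy = y + cell / 2 - sprite_h / 2
            particles.append({
                # UV rectangle into the original sprite (x, y, w, h in [0, 1])
                "uv": (x / sprite_w, y / sprite_h, cell / sprite_w, cell / sprite_h),
                # fling fragments outwards from the centre, with some jitter
                "velocity": (cx * 3 + random.uniform(-20, 20),
                             cy * 3 + random.uniform(-20, 20)),
                "rotation_speed": random.uniform(-5, 5),
            })
    return particles
```

Each fragment is then rendered like any other particle, but textured with its slice of the ship’s sprite, so the explosion visibly consists of the ship’s own parts.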

In January 2014 we felt confident that we had nailed down what would make Roche Fusion a great game, and started calling our public builds betas, with the release of a shiny new trailer.

Key-framed sprite animations (November 2013 – January 2014)

Another important change was the decision to not be satisfied with static enemy sprites. Instead we wanted each enemy to consist of multiple sprites that would be skinned on a key-frame animated skeleton.

Apart from writing the entire animation system, I also wrote a program to preview animations while editing them, which proved invaluable when we considerably stepped up the amount of content in Roche Fusion later on.

Some might argue that my type signatures went slightly over the top, but I still consider the resulting animation system a masterpiece of generic code design – certainly by my standards at the time.
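The heart of any such system is sampling a value between keyframes. Here is a minimal, hypothetical Python sketch of that core – vastly simpler than the generic system described above, but it shows the idea: it works for any value type that supports addition and scalar multiplication (floats, vectors, transforms, …):

```python
from bisect import bisect_right

class Keyframe:
    def __init__(self, time, value):
        self.time, self.value = time, value

def sample(keyframes, t):
    """Linearly interpolate a list of keyframes (sorted by time) at time t,
    clamping to the first/last value outside the animated range."""
    times = [k.time for k in keyframes]
    i = bisect_right(times, t)
    if i == 0:
        return keyframes[0].value
    if i == len(keyframes):
        return keyframes[-1].value
    a, b = keyframes[i - 1], keyframes[i]
    f = (t - a.time) / (b.time - a.time)
    return a.value * (1 - f) + b.value * f
```

Sampling one such curve per bone parameter (position, rotation, scale) each frame, and skinning the sprites onto the resulting skeleton, gives the animated enemies.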

Screen-space UV distortions (December 2013 – February 2014)

During this time I also developed perhaps one of the most impressive effects in Roche Fusion: the black hole weapon. It makes use of post-processing to distort the rendered image in screen-space, which allowed for a number of interesting effects down the line.
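Conceptually, the post-processing step boils down to a dependent texture read: each output pixel looks up an offset from a distortion buffer and samples the rendered scene at the displaced position. Here is a CPU-side sketch of what the pixel shader does (hypothetical Python/NumPy, using pixel offsets for simplicity rather than actual UV coordinates):

```python
import numpy as np

def apply_distortion(scene, offsets):
    """Post-process pass: for each pixel, look up a per-pixel offset
    (in pixels) from the distortion buffer and sample the rendered
    scene there.
    scene:   (H, W, 3) rendered image
    offsets: (H, W, 2) per-pixel (dx, dy) displacement"""
    h, w = scene.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # displaced sample positions, clamped to the screen edges
    sample_x = np.clip((xs + offsets[..., 0]).astype(int), 0, w - 1)
    sample_y = np.clip((ys + offsets[..., 1]).astype(int), 0, h - 1)
    return scene[sample_y, sample_x]
```

A black hole then just writes a radial “pull towards the centre” pattern into the offset buffer, and the scene appears to bend around it.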

Originally we could only render a single such effect at a time unless we wanted to invite a number of ugly artefacts onto the screen.

However, in early 2014 I rewrote this part of the rendering pipeline to allow us to render any number of such effects, with any amount of overlap.

Sometimes with interesting side effects.

Remember: if you store screen-space UV coordinates in a texture, using 8 bits per channel may not be what you want.
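The back-of-the-envelope numbers make it clear why (assuming a 1920 pixel wide framebuffer purely for illustration):

```python
# An 8-bit channel can only represent 256 distinct values, so a
# full-screen UV coordinate stored in it snaps to steps of
# screen_width / 255 pixels -- far coarser than a pixel.
screen_width = 1920
uv_step_8bit = screen_width / 255     # ~7.5 pixels per step: blocky artefacts
# A 16-bit channel is comfortably sub-pixel instead:
uv_step_16bit = screen_width / 65535  # ~0.03 pixels per step
```

With only 8 bits, sampled positions jump in steps of several pixels, which shows up as the kind of banded, blocky artefacts hinted at above.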

With this system in place we started to use distortion effects for other parts of the game as well.

Suffice it to say, working on this was a lot of fun (and cause for too many bad puns).

Since these kinds of distortions turned out to be an extremely cheap graphical effect, we started using them for the shock-waves of every single explosion – and if you know the game, it has a lot of those.

If you are interested in how we achieved this effect, feel free to check out this overview I wrote on IndieDB at the time. Also make sure to drop me a line if you would like me to go into more detail on this – or anything else – in a future post.

Propagating explosions and other area effects (March 2014)

While overall a small point, I wanted to give this effect its own section since it is in many ways essential for how Roche Fusion plays.

We noticed that it looked rather odd when an explosion caused enemies in a large radius to explode instantly, while the graphical effect of the explosion took a noticeable amount of time to traverse the same space.

Instead, we wrote an area-of-effect system that allows any sort of effect to affect game objects in any sort of area – including areas that change over time.

This made large explosions and shock waves look much more powerful and overall impressive, and had the side effect of distributing the computationally heavy enemy explosions over multiple frames. Given the sometimes ridiculous number of particles in the game, this is in fact key to Roche Fusion’s great performance even on older systems – at no loss (and even gains) in visual quality.
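The core trick can be sketched as follows (hypothetical Python; names and structure are illustrative): each frame the wave front advances, and only objects inside the ring swept during that frame are affected. That way the gameplay effect travels with the visual one, each object is hit exactly once, and the work naturally spreads across frames:

```python
def update_shockwave(wave, objects, dt):
    """Advance an expanding shockwave one frame and apply its effect to
    every object the front passes over during this step."""
    old_r = wave["radius"]
    wave["radius"] += wave["speed"] * dt
    for obj in objects:
        dx = obj["x"] - wave["x"]
        dy = obj["y"] - wave["y"]
        d2 = dx * dx + dy * dy
        # hit only objects inside the ring swept this frame
        if old_r ** 2 <= d2 < wave["radius"] ** 2:
            obj["hit"] = True
    return wave
```

A distant enemy is thus destroyed a few frames later than a nearby one – exactly when the visible shock-wave reaches it.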

Procedural sprite colouring (March – July 2014)

Something we realised in early 2014 was that it would be amazing – and not too much effort – to make Roche Fusion playable in local co-op.

To allow players to tell apart their ships, I wrote a system that allowed us to colour all of our in-game sprites procedurally.

For this purpose, we store a separate colour channel for each of the three colours we wanted (main hull, glowing areas, detail colour). In the pixel shader, each channel is then multiplied by its assigned colour, and the results are added up and displayed.
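The recombination looks something like the following – a NumPy sketch of the shader math, not the actual shader, with the channel assignment (R = hull, G = glow, B = detail) assumed for illustration:

```python
import numpy as np

def colourise(mask, hull, glow, detail):
    """Recombine a channel-mask sprite with three tint colours.
    mask: (H, W, 3) with R = hull weight, G = glow weight, B = detail
    weight; each tint is an RGB triple. Mirrors the pixel shader:
    out = R * hull + G * glow + B * detail."""
    mask = mask.astype(float)
    return (mask[..., 0:1] * np.array(hull, dtype=float)
          + mask[..., 1:2] * np.array(glow, dtype=float)
          + mask[..., 2:3] * np.array(detail, dtype=float))
```

Swapping the three tints per player is then all it takes to give each ship its own colour scheme.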

When it became clear in early summer 2014 that we wanted a number of player ships, drones and other objects to use the same system, I developed a small tool that automatically extracts all the colour channels from a regular sprite, which ended up saving us a lot of time.
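One way such a tool can work – this is purely an illustrative guess, not a description of the actual tool – is to assign each pixel to the nearest of the three reference colours and store its brightness in the corresponding channel:

```python
import numpy as np

def extract_channels(sprite, hull, glow, detail):
    """Split a regular coloured sprite (H, W, 3, values 0-255) into a
    three-channel mask by assigning each pixel to the nearest of the
    three reference colours and storing its brightness there."""
    refs = np.array([hull, glow, detail], dtype=float)    # (3, 3)
    px = sprite.astype(float)                             # (H, W, 3)
    # squared distance of every pixel to every reference colour
    d = ((px[..., None, :] - refs) ** 2).sum(axis=-1)     # (H, W, 3)
    nearest = d.argmin(axis=-1)                           # (H, W)
    brightness = px.max(axis=-1) / 255.0
    mask = np.zeros_like(px)
    h, w = nearest.shape
    mask[np.arange(h)[:, None], np.arange(w)[None, :], nearest] = brightness
    return mask
```

Feeding such a mask back through the recombination step should reproduce the original sprite closely – which is exactly the check the next image shows.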

Note how the image shows both the input sprites, and a re-assembled output sprite in the top left. The difference between the two is very small, which allowed us to use the resulting sprites in-game without any further work.


I learned a lot during the development of Roche Fusion, and I hope you find this kind of recap of the game’s development as interesting as I do.

Let me know if you would like me to go into more detail on any of these topics, or anything else, and I’ll make sure to cover it in a future post.

Until then, see you next week, when we continue this somewhat nostalgic recollection, and of course:

Enjoy the pixels!
