Soft 3D particles
Creating Advanced 3D particle systems in GameMaker #1
Along with many other missing 3D features, GameMaker lacks native support for any kind of advanced particle system when dealing with 3D. We’ve seen numerous implementations of 3D particles in the past; Iccurd’s 3D particle extension, for example, was a step forward in getting particles into GameMaker.
However, despite being able to render particles, we still fall into the same trap: we lack a system capable of handling significant numbers of particles in real-time.
As part of our creation of a more advanced 3D pipeline in native GMS, we have broken down particles into a number of sub-problems:
Order independent transparency
This is to ensure that particles drawn on top of each other do not cause alpha artefacts that corrupt the pixels behind them.
Fast and efficient buffering of particles
Currently, one limitation on having large numbers of particles is draw calls. Ideally we want to batch our particles into one draw call (or a small number of them) to reduce wasted performance.
Pixel fill rate performance issues
This is a rather fundamental problem of particle systems in general: when you have a dense pack of particles, the same pixel gets filled in multiple times by different particles when using transparency. I.e. if you have 10 particles overlapping, the same pixel may get filled 10 times, which means 10 fragment shader executions and terrible performance.
Particle billboard clipping
The last of the immediate issues is that billboard particles can cut into objects as they rotate to face the camera. This results in an ugly artefact: a harsh line appears where the edge of the particle meets the scene’s geometry. This is especially bad if particles have fast motion.
Particle simulation cost
Whilst it may not seem like a problem at first, particles still need to be simulated somewhere, and with large particle counts this can quickly become quite costly on performance when performed solely on the CPU.
Some of these issues are straightforward to solve; others are quite tricky. In this blog post, I will discuss our solution to the 4th problem, and perhaps throw around some ideas for where we can start on the others.
For the sake of this project, we are only really interested in results which can yield a performance gain, and in ways we can use YYC to our advantage to achieve results we couldn’t previously.
Soft / volume particles
I’m going to jump straight to problem #4, because the solution to this problem is actually quite trivial. The idea came somewhat out of the blue after a gaming session on Modern Warfare 2, and the question of what the “soft particles” option actually did.
We soon realised that the setting completely removed the billboard clipping issue, and then it struck me that you could simply perform a depth test between the current depth of the pixel and the depth of the scene. If a point on a particle has a depth close to that of the scene, that region is where the clipping artefact will occur.
So, given a simple depth buffer of our scene, the following process was used to soften particle edges:
The first thing we can do is determine the depth of the scene at every point within a particle’s geometry. This is achieved by transforming the vertex position into a 2D texture coordinate which we can use to sample the depth buffer.
vec3 NormalisedDeviceCoordinate = v_HomogenousClipSpace.xyz / v_HomogenousClipSpace.w;
vec2 screenCoord = NormalisedDeviceCoordinate.xy * 0.5 + 0.5;
screenCoord.y = 1.0-screenCoord.y;
Here we perform a perspective divide on the clip-space position (i.e. the position in the vertex shader after multiplying by the MVP matrix; this step could be skipped if we had access to gl_FragCoord).
After this, we can perform the ol’ scale and bias transformation to get the NDC coordinate out of the -1.0 to 1.0 range and into the 0.0 to 1.0 range so we can use it as a texture UV coordinate.
Finally, we need to flip the y value because of GM’s texture coordinate inconsistency. This gives us scene-depth information per fragment, which we can compare with the particle’s per-fragment depth.
We can now perform a straightforward depth test between the particle’s depth and the scene depth. I did this by defining a DEPTH_TOLERANCE: if the difference in depth is smaller than this tolerance, then the region is marked as being near the scene geometry.
Using this data, we can filter out the pixels which lie near scene geometry to prevent clipping.
Though it is worth noting this first stage does not create a smooth fade; it simply determines which regions we are interested in modifying.
This can simply be done in code with a basic check:
float delta = depth-vDepth;
float opacity = ( delta < DEPTH_TOLERANCE_UNITS_MAX ) ? 0.0 : 1.0;
gl_FragColor = vec4( mix( vec3( 1.0 ), vec3( 1.0, 0.0, 0.0), 1.0-opacity), 1.0);
The delta is defined as the difference between the scene depth and the particle depth (where vDepth is passed from the vertex shader, and depth is sampled from the depth buffer).
The final result simply mixes between white and red depending on the value of opacity, so we can visually see how the shader is working.
We can then smooth the transition between fully opaque and fully transparent using a simple formula. Once again, the red determines the transparency of the pixel. In this case, the bottom of the particle smoothly fades away where it touches the ground, rather than clipping through the floor.
float opacity = ( delta < DEPTH_TOLERANCE_UNITS_MAX ) ? ( (delta-DEPTH_TOLERANCE_UNITS_MIN) / (DEPTH_TOLERANCE_UNITS_MAX-DEPTH_TOLERANCE_UNITS_MIN) ) : 1.0;
In this code, I have also defined a minimum tolerance bound. This allows a little padding between the scene and the particle, so the transition is not immediate. A large value here can make the particle look poor, so it is best to keep the padding small.
It is also worth noting that different particles will require different levels of softness. Effects like smoke where particles are larger will often look better with a much more gradual gradient, whereas with other effects, you may only want a small gradient.
Putting this all together
When the final effect has been applied to particles, we can quickly see the difference between our scene with the soft particle shader enabled and without.
As you can probably see, the final result looks nice and smooth, and the particles blend together nicely into a mist. The problems with the original scene are not as visible from a static image; the real benefit comes from viewing the scene from a moving camera. At this point, it is easy to make a distinction between the two, and we are left with a much more pleasing result.
I have not provided full source code for this effect; however, there are a few things worth mentioning. First of all, when dealing with GM’s surfaces, rendering particles to a surface with transparency may yield odd results at first. This is down to the default way alpha is applied on surfaces: rather than blending alpha, GM will replace the old alpha with the new alpha, meaning you will get background-colour bleeding.
There is a relatively straightforward fix: you can first set the blend mode to one suited to pre-multiplied alpha, i.e. (bm_one, bm_inv_src_alpha):
And then modify the shader to pre-multiply alpha. This can simply be achieved by multiplying the rgb of gl_FragColor by the alpha of gl_FragColor:
gl_FragColor.rgb *= gl_FragColor.a;
The final remark for working with particles in this system is that you will want to disable z-buffer writing for particles (while leaving the depth test enabled). This allows transparent particles to be rendered on top of each other.
I will cover further optimisations and better techniques in a future post.