As I’m finishing this up, the Summer Games Done Quick event of 2016 has just started – you should watch it.

I’ve dealt with some physics simulation issues lately that I wanted to talk about – this is probably going to be less coherent than usual since it is essentially a multitude of solutions to a problem that good system design would have prevented in the first place, but it has been a big enough deal for Backworlds development that it feels worth discussing.

“Physics simulation” in Backworlds is limited to the gravitational pull and jumping arcs of the player avatar and various game objects – due to the simplicity of the algorithm and how easily it can be split into several discrete steps, we solve this with simple Euler integration. This means that physics will produce incorrect results depending on the simulation frequency – put simply, low framerates will cause gravity to behave in an unrealistic way. We are fine with this since it is not enough to be disturbing – as I have touched on before, when it comes to game simulation, plausibility is far more important than realism.

The bigger problem is that a changing framerate changes the behavior – because position is updated separately from velocity within a single timestep rather than simultaneously, a lower framerate will cause the object to either over- or undershoot a higher-framerate simulation. In our case, since gravity is applied after velocity, the player avatar will jump higher at low framerates.
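To make the effect concrete, here is a minimal sketch (not Backworlds code – the initial velocity and gravity values are made up) of the update order described above, position first and gravity second, showing that a 30 fps simulation reaches a higher apex than a 60 fps one:

```python
def jump_apex(v0, g, dt):
    # Explicit Euler in the order described above: position is
    # advanced with the current velocity, then gravity is applied.
    y, v = 0.0, v0
    while v > 0.0:
        y += v * dt  # move with the old velocity...
        v -= g * dt  # ...then apply gravity
    return y  # the last y is the apex, since v was positive until now

# The analytic apex is v0**2 / (2 * g) = 2.5 with these values.
print(jump_apex(10.0, 20.0, 1.0 / 60.0))  # ~2.583 at 60 fps
print(jump_apex(10.0, 20.0, 1.0 / 30.0))  # ~2.667 at 30 fps
```

Both step sizes overshoot the analytic curve, but the larger step overshoots more – exactly the framerate-dependent jump height the post describes.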

More updates means lower jump height
Not having a specific jump height introduces level design problems. While we could force the game to always run at 60 fps, I am not a fan of introducing restrictions, and it could lead to slowdowns on machines where we cannot always maintain that framerate. To be perfectly clear, the right solution would have been to account for this problem upfront – had Backworlds been more reliant on precision platforming, I would have rebuilt the gameplay part of the engine – but luckily there are ways to mitigate the problems without rewrites.

Separate simulation/render framerates

If we’re rendering more frames than we are simulating, we need to interpolate between old frames or predict new ones so the intermediate frames are not identical

A common solution these days is to structure the engine around a fixed game-simulation frequency and let rendering update as quickly as it can, predicting in-between frames. This way we get a predictable game simulation but still smoother animation, either by running the game and rendering on separate threads or by interleaving several render frames between game frames. There are a few problems with this approach: a higher framerate will not make the game more responsive, and the single-threaded variant can suffer from uneven frame pacing – but most importantly, it does not work well if the game was not built with the system in mind. For Backworlds, I use this system for objects that update very rarely – less than 10 times per second – but most objects are too complex to be retrofitted in a reasonable amount of time.
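The standard shape of such a loop (a generic sketch, not the Backworlds implementation) accumulates real time, runs as many fixed steps as fit, and leaves the remainder as an interpolation factor between the two most recent simulation states for rendering:

```python
SIM_DT = 1.0 / 60.0  # assumed fixed simulation step

def frame(state, prev_state, accumulator, frame_dt, simulate):
    # Run zero or more fixed-size simulation steps for the elapsed
    # real time; the leftover fraction of a step becomes the factor
    # used to blend prev_state and state when rendering.
    accumulator += frame_dt
    while accumulator >= SIM_DT:
        prev_state = state
        state = simulate(state, SIM_DT)
        accumulator -= SIM_DT
    alpha = accumulator / SIM_DT  # 0..1: render lerp(prev_state, state, alpha)
    return state, prev_state, accumulator, alpha
```

Rendering with `alpha` interpolates between old frames rather than predicting new ones, which is the simpler of the two options mentioned above – prediction would extrapolate from `state` instead.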

Clamped simulation time

If we’re simulating a really long period of time, we can do it in several smaller steps

A cheap solution that we’re using for the most complex objects is to simply limit the amount of time we allow the game to simulate at once – if the game dips to 30 fps, we evaluate two frames of 1/60th of a second each rather than one frame of 1/30th of a second. This way we can limit the numerical errors – if game frames are expensive there is a slight risk of this negatively affecting performance, but Backworlds objects tend to be few and simple in terms of their gameplay code. In theory we could impose a minimum frame time as well and not update the game until it has passed, but this would reintroduce the problems of a fixed framerate and would ultimately not gain us much, since 60 fps is close enough to the actual curve.
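A sketch of that clamping (generic code, with 1/60th of a second as the assumed maximum step) – one long frame is split into several capped substeps:

```python
MAX_STEP = 1.0 / 60.0  # assumed largest timestep allowed per substep

def simulate_clamped(state, frame_dt, step):
    # Split one long frame into capped substeps: a 1/30 s frame
    # becomes two 1/60 s steps instead of one big step.
    while frame_dt > 1e-9:  # epsilon guards against float residue
        dt = min(frame_dt, MAX_STEP)
        state = step(state, dt)
        frame_dt -= dt
    return state
```

Note that the last substep can be shorter than the cap, so behavior is still not perfectly framerate-independent – it just bounds the error, which is the point the post makes.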

Hybrid fixed/flexible simulation time

One option is to only keep values that may cause problems in fixed-time simulation

Something I’ve been using lately for objects that are too complex to disconnect from the general game update, but that do not have too many exception cases, is a hybrid between fixed and flexible simulation time. Essentially, first-order values (like position changing with velocity) update with a flexible frametime, but second-order ones at a constant 60 fps. It works like this: we store a counter for every velocity value and increase it every flexible frame – whenever the counter exceeds 1/60th of a second, we decrease it by that amount and add 1/60th of a second’s worth of acceleration to the velocity. This way we do not have to sacrifice precision or smooth animation, but still get predictable behavior. The drawback is that we cannot use this if acceleration is not constant, or in cases where a lot of other systems impact the velocity per frame.
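As a sketch of the counter scheme (hypothetical names, constant gravity assumed), position integrates with the real frame time while velocity only changes in fixed 1/60th-of-a-second quanta:

```python
FIXED_DT = 1.0 / 60.0  # assumed fixed step for second-order values

class HybridBody:
    def __init__(self, velocity, gravity):
        self.y = 0.0        # first-order: uses the flexible frame time
        self.v = velocity   # second-order target: stepped at 60 fps
        self.g = gravity
        self.counter = 0.0  # time banked toward the next fixed step

    def update(self, frame_dt):
        self.y += self.v * frame_dt      # flexible-timestep integration
        self.counter += frame_dt
        while self.counter >= FIXED_DT:  # apply acceleration in quanta
            self.counter -= FIXED_DT
            self.v -= self.g * FIXED_DT
```

Since the velocity sequence depends only on how many fixed quanta have elapsed, the jump arc stays predictable regardless of how the frame times happen to fall.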

Just ignore it

Sometimes, it doesn’t really matter

For objects that only have a decorative visual effect, we do not really care if it looks slightly different at lower framerates. We usually introduce some special-case handling for spikes, but in general we are going to have other problems long before a reduced framerate causes problems in these objects.

Backworlds was originally built for a short competition and a few of its systems have not scaled well with the increased scope – as we have more or less locked down the types of objects in the game at this point, we have managed to work around this one rather than rebuild the engine.

Happy 4th of July to our American friends!