Over the coming weeks, as we continue development on our next game – and in an attempt to get into the habit of updating our website more regularly! – we will be posting a series of development diaries. These diaries will not only document our progress for our own benefit, but will hopefully give other indie developers an insight into how we work, which they may find useful when applying similar concepts to their own projects.
In our first dev diary we will cover some of the core tech we have been developing on and off since our first release in 2008. To illustrate it, we will be showing off a prototype we were working on called “H3”.
H3 was a prototype for a puzzle-based platformer.
Most platformers are purely level-based: the player starts a level, travels from A to B (sometimes completing a set of objectives along the way) and then moves on to the next level. In H3 we wanted to make a continuous, seamless game where the emphasis was on the journey, while still playing like a traditional platformer.
The following videos show some game-specific features that our core tech makes possible.
In this video you can see some of the basic mechanics that were implemented in the H3 prototype.
For this technique a simple collision-tagging method is used. When the collision data is created, each collision polygon can have up to 5 numerical ‘tags’ attached to it. These tags can be used for anything from triggering auto jumps, to choosing which footstep sound effect to play or which footstep particle effect to draw, to overriding the default physics settings for that surface (e.g. friction/restitution).
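To make the tagging idea concrete, here is a minimal sketch of how per-polygon tags might drive surface behaviour. The five-slot layout comes from the description above, but the slot meanings, tag values and function names are our own illustrative assumptions, not the actual game data.

```cpp
#include <array>
#include <string>

// Hypothetical per-polygon tag data: up to 5 numeric tags, as described.
struct CollisionPolygon {
    std::array<int, 5> tags{};
};

// Assumed convention for this sketch: slot 0 = surface type,
// slot 1 = physics override (0 = default, 1 = slippery).
std::string FootstepSoundFor(const CollisionPolygon& poly) {
    switch (poly.tags[0]) {
        case 1:  return "footstep_grass";
        case 2:  return "footstep_rock";
        default: return "footstep_default";
    }
}

float FrictionFor(const CollisionPolygon& poly) {
    // A slide tag lowers friction, giving the slope-slide behaviour
    // seen near the end of the video.
    return poly.tags[1] == 1 ? 0.05f : 0.8f;
}
```

The appeal of this scheme is that designers can change how a surface sounds, looks and feels purely by editing tag values on the collision data, with no code changes.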
In the case of the auto climb/jump system we use a single numeric tag with a value of 1 or 2 to pick which animation should be played based on the height of the object. When the jump animation starts, the system grabs the current root bone position so that when the animation finishes, the player’s new position is correctly matched with the final animation frame within the level.
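The root-position matching step can be sketched as follows: record the root bone position when the jump starts, then offset the player by the animation's net root motion when it ends. The struct and function names here are ours, purely for illustration.

```cpp
// Minimal 2D vector for the sketch.
struct Vec2 { float x, y; };

// Offset the player by the animation's net root motion so the final
// pose lines up with the level geometry, as described above.
Vec2 EndPosition(Vec2 playerAtStart, Vec2 rootAtStart, Vec2 rootAtEnd) {
    return { playerAtStart.x + (rootAtEnd.x - rootAtStart.x),
             playerAtStart.y + (rootAtEnd.y - rootAtStart.y) };
}
```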
Near the end of the video you can also see another use of the tags which causes the player to slide down a slope.
Switches are managed via an invisible area that checks, firstly, whether the player is inside it and, secondly, whether the use/action button was pressed. When a switch is used it calls an ‘OnUse’ scripted function, which triggers any valid scripted event (such as moving a block, moving the camera, or enabling/disabling another switch).
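A rough sketch of that switch logic, assuming the trigger stores its scripted event as a callback (the names are ours, not the actual engine API):

```cpp
#include <functional>

// Axis-aligned box used for the invisible "use" region.
struct AABB {
    float minX, minY, maxX, maxY;
    bool Contains(float x, float y) const {
        return x >= minX && x <= maxX && y >= minY && y <= maxY;
    }
};

struct Switch {
    AABB area;                    // invisible trigger region
    std::function<void()> onUse;  // scripted 'OnUse' event

    // Called once per frame with the player position and input state.
    void Update(float px, float py, bool usePressed) {
        if (usePressed && area.Contains(px, py) && onUse)
            onUse();              // fire the scripted event
    }
};
```

In the real game the callback would dispatch into a Lua script rather than a C++ lambda, but the check order is the same: inside the area first, button pressed second.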
Ladders are handled in much the same way as switches: there is an invisible “use” area in front of the ladder. If the player chooses to “use the ladder”, a scripted event moves the player towards the ladder and into position, so that the player can then control their ascent or descent. To ensure the hand positions of the climb animation match the rung positions, and to ensure mount/dismount positions are handled correctly, we also encode into the invisible area the number of ‘climb’ animations that should be played for that ladder instance. This makes the system very robust, as it enables us to have ladders of any height, and gives us direct control when matching our animations to movement.
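The encoded climb count fully determines the dismount position, which is why animation and movement can never drift apart. A minimal sketch, assuming each climb animation advances the player by a fixed rung spacing (the constant and names below are illustrative):

```cpp
// Per-instance ladder data, as encoded in the invisible use area.
struct Ladder {
    float baseY;       // mount position at the foot of the ladder
    int   climbLoops;  // number of climb animations for this ladder
};

// Assumed: one climb animation moves the player one rung spacing.
constexpr float kRungStep = 0.5f;

// The dismount height follows directly from the encoded loop count,
// so it always matches the animation, whatever the ladder's height.
float DismountY(const Ladder& ladder) {
    return ladder.baseY + ladder.climbLoops * kRungStep;
}
```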
This video shows a test level which includes a block-based puzzle. Each block's movement is handled via a scripted event: when the player uses a switch, the blocks move to form a climbable staircase.
The blocks are simple Box2D objects attached to the world on a prismatic joint. The prismatic joint allows a block to be moved horizontally or vertically, and its travel can be limited between min/max values depending on how it is set up.
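The essential behaviour of those joint limits can be sketched without the full physics engine: a prismatic joint constrains a body to slide along one axis, with its translation clamped between a lower and upper limit. This is only an illustration of the limiting behaviour; in actual Box2D you would configure it via `b2PrismaticJointDef` (`enableLimit`, `lowerTranslation`, `upperTranslation`).

```cpp
#include <algorithm>

// Sketch of a prismatic joint's translation limits (names are ours).
struct PrismaticLimit {
    float lower, upper;  // min/max travel along the joint axis

    // Clamp a requested translation so the block stays on its track.
    float Apply(float translation) const {
        return std::clamp(translation, lower, upper);
    }
};
```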
The blocks are tagged in exactly the same way as a normal world collision obstacle, which means the player’s auto climb/jump code works on them just as it would on a static obstacle.
For a final look at the H3 prototype, this video shows a render test to see how we wanted things to look.
The world itself is made up of multiple layers that are ordered to make sure all the alpha and blending is correctly passed through the render system. Most of the level consists of 2D quads that are textured with silhouettes of grass, branches, rocks and other objects, which gives it a very organic look.
Since the game was to contain a limited colour range, we experimented with some overlay effects to try and give the world a more distinctive style. As illustrated in the video, we implemented a noise effect and a sketch effect. These effects were achieved using some basic greyscale textures mapped to a full-screen quad.
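The core of such an overlay is a simple per-pixel multiply of the greyscale texture over the rendered scene. In the game this runs on the GPU as part of drawing the full-screen quad; the CPU version below just shows the maths, under our assumption that a standard multiply blend was used.

```cpp
#include <cstdint>

// Multiply a greyscale overlay value over a scene channel.
// overlay = 255 leaves the scene untouched; darker overlay values
// darken it, which is how a noise or sketch texture tints the image.
uint8_t MultiplyBlend(uint8_t scene, uint8_t overlay) {
    return static_cast<uint8_t>((scene * overlay) / 255);
}
```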
Our Tech and Tools in Brief
We currently use Maya as our main ‘editor’. In Maya we have written a selection of custom exporters and plug-ins that are all derived from a generic base type which we can adapt to the specific needs of whichever game we are working on.
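The "generic base type" pattern might look something like the sketch below: a shared export flow with per-game hooks. The class and method names are ours for illustration; the real plug-ins are built against the Maya API rather than plain C++.

```cpp
#include <string>

// Hypothetical exporter base type: shared flow, game-specific hooks.
class ExporterBase {
public:
    virtual ~ExporterBase() = default;

    // Common export flow: gather scene data, then write it out in
    // whatever format the derived (per-game) exporter supplies.
    bool Export(const std::string& path) {
        return Gather() && Write(path);
    }

protected:
    virtual bool Gather() = 0;                        // walk the scene
    virtual bool Write(const std::string& path) = 0;  // per-game output
};

// A per-game exporter only needs to override the two hooks.
class CollisionExporter : public ExporterBase {
protected:
    bool Gather() override { return true; }           // stub for the sketch
    bool Write(const std::string&) override { return true; }
};
```

Keeping the flow in the base class means each new game only has to supply the scene-walking and file-format code, which is what lets the tools adapt quickly from project to project.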
From a code point of view we mainly use open source technology, which helps us enormously when it comes to getting things done quickly. The following are some of the libraries currently in our pipeline:
- Box2D – We use this for collision and physics in games where we only need 2D feedback. All the Cabby levels used Box2D physics for the vehicles and levels.
- FreeType – When a game requires nicely rendered fonts, FreeType is used to create quad glyphs from TTF files to pass to OpenGL.
- Ogg Vorbis – Air Cadets used Ogg Vorbis for sound-effect playback. We have since switched to AVFoundation from the iPhone SDK, as it uses hardware decompression, which is much faster overall.
- Lua – Lua is our choice for anything script-driven. Cabby, for example, used Lua scripts for level setups, so we could customise various parts without ever needing to recompile the game.
- ODE – This is a recent addition which will be used for any full 3D collision/physics we do in the future.
- libpng/zlib – Used to load/save our PNG files in both game code and exporters.
Our current philosophy is to keep things light and simple, so that we spend less time building tech and more time concentrating purely on creating games!
We hope you enjoyed this first look at some of our core tech in action. In the next few weeks we will hopefully be able to show off how we’re applying it to our next game…so watch this space! If you have any questions, comments or requests for things you’d like to see in future instalments of our development diaries, then please leave a comment.