A Uniject example

Let’s take a look at one of the example scenes that comes with Uniject – the bleeping, bouncing spheres!

We’re going to see how we can:

  • Write a TestableComponent that drives our bouncing lights
  • Have Uniject create the bouncing light GameObject hierarchy for us
  • Load and verify an audio clip and physic material
  • Test all of our code
  • Do all of the above without opening the Unity editor!

Here’s a functional spec for the scene:

  • It has a thin box on the floor
  • It features random, bouncing, light emitting spheres
  • When the spheres collide, they beep and change their light colour

The lights are derived from Uniject’s ‘TestableComponent’ class, which is semantically equivalent to MonoBehaviour:

public class BouncingLight : TestableComponent {

    private ILight light;
    private Random rand;
    private IAudioSource source;
    private UnityEngine.AudioClip beep;

    public const float killThresholdY = -5.0f;

    public BouncingLight(TestableGameObject obj,
                         Sphere sphere,
                         IAudioSource source,
                         ILight light,
                         Random rand,
                         IPhysicMaterial material,
                         AudioClip beep) : base(obj) {
        this.light = light;
        this.rand = rand;
        this.beep = beep;
        this.source = source;
        sphere.collider.material = material;
        light.intensity = 2.0f;
        light.range = 5;
    }

    public override void Update() {
        // Destroy the sphere once it has fallen below the kill threshold.
        if (this.Obj.transform.Position.y < killThresholdY) {
            this.Obj.Destroy();
        }
    }

    public override void OnCollisionEnter(Collision collision) {
        // On every collision, beep and switch to a random colour.
        source.PlayOneShot(beep);
        light.color = new Color((float) rand.NextDouble(),
                                (float) rand.NextDouble(),
                                (float) rand.NextDouble());
    }
}

This code is all that is required for Uniject to instantiate a complete, working GameObject hierarchy including referenced audio clip and physic material.

Now for the biggest advantage; we can execute every line of this code in a fraction of a second, without opening Unity at all!

Here is a test that instantiates our bouncing light, simulates a frame and ensures it has not destroyed itself:

    public class TestCollisions : BaseInjectedTest {

        [Test]
        public void testBouncingLightDoesNotDestroyAboveThreshold() {
            BouncingLight light = kernel.Get<BouncingLight>();
            step(1); // simulate a single frame
            Assert.IsFalse(light.Obj.destroyed);
        }
    }

And here’s the test runner in MonoDevelop 3.0:

So we’ve tested our code; how do we instantiate a TestableComponent in Unity?

Enter Ninject:

BouncingLight light = UnityInjector.get().Get<BouncingLight>();

And our Bouncing Light is created for us:

It’s worth pointing out that this GameObject hierarchy has been created for us automatically by Uniject; it has a light, sphere collider, rigid body and audio source, and it has the audio clip of our beep sound and the bouncy physic material.

There is no prefab required and no object construction code.

There is one further class involved that sets up the scene and randomly spawns the bouncing lights. It, too, is testable:

public class TestableCollisions {
    public TestableCollisions(IntervalBasedCallback caller,
                              [GameObjectBoundary] Box box,
                              Factory factory,
                              Random rand) {

        caller.callback = () => {
            // Spawn a bouncing light at a random position above the box.
            BouncingLight light = factory.create();
            light.Obj.transform.Position = new Vector3(
                (float) rand.NextDouble(),
                4 + (float) rand.NextDouble() * 10.0f,
                (float) rand.NextDouble());
        };

        caller.interval = TimeSpan.FromSeconds(2);
        box.Obj.transform.localScale = new Vector3(50, 1, 50);
    }
}

Here is an integration test that instantiates the entire scene and simulates a few seconds worth of frames, to verify that it is spawning bouncing spheres:

public void testExampleSpawnsObjects() {
    kernel.Get<TestableCollisions>(); // instantiate the entire scene
    int count = objectCount;
    step(180); // simulate a few seconds' worth of frames
    Assert.Greater(objectCount, count);
}

But you’re referencing resources with strings!

It’s true, we’ve totally subverted Unity’s convention of editor-assigned references; if we change the location of our resources, won’t we break our code?

The difference is we’ll know about it immediately if we break a referenced resource; Uniject verifies that referenced resources actually exist. In a fraction of a second a test can verify every referenced resource in our entire project, and we don’t even need to fire up Unity to do so.
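Such a check could look something like the following sketch; the resourcePaths collection and loader here are hypothetical stand-ins for whatever Uniject actually exposes:

```csharp
// Hypothetical sketch: assert that every string-referenced resource
// resolves to a real asset, without starting Unity at all.
[Test]
public void testAllReferencedResourcesExist() {
    foreach (string path in resourcePaths) {
        Assert.IsNotNull(loader.load(path), "Missing resource: " + path);
    }
}
```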

I wouldn’t care to know how much time I’ve wasted due to the brittleness of editor assigned references; Uniject has the potential to eliminate a swathe of runtime errors.

Note that we have a separate test solution in a sibling folder to the Assets folder. This solution contains the test project where the unit tests are actually written, and it references the .csproj files that are automatically created and managed for us by Unity.

We also use the latest version of MonoDevelop, 3.0, rather than the version that ships with Unity, which has a broken unit test runner.

Browse Uniject on Github.


Uniject is now on Github!

The framework is integrated into an elementary sample project, demonstrating how a TestableComponent can be both unit tested in your favourite IDE and runnable in Unity.


On Testability and Unity 3D

Testability is a crucial consideration when we write code, and for us it includes the ability to execute and unit test our code outside of Unity.

Unfortunately Unity throws up a few challenges here: MonoBehaviours, along with most types inside UnityEngine.dll, are not testable.

This has led us to develop ‘Uniject’ – a C# testability framework for Unity that offers:

  • Plain Old C Sharp, testable MonoBehaviour equivalents
  • A robust and flexible way of creating GameObjects automatically, by inference of the code that drives them
  • Constructors!
  • An extremely flexible code base – in short, the benefits of DI + IOC.

The first attempt

Here’s how to make an untestable zombie, taken from our latest game, The Clones of Corpus:

public class Zombie : MonoBehaviour {

    private AudioSource audioSource;

    void Start() {
        this.audioSource = GetComponent<AudioSource>();
    }
}

This zombie is untestable for two reasons:

  • Only Unity knows how to create MonoBehaviours
  • We depend on concrete types in the UnityEngine namespace that we can’t mock out

So, how might we make our zombie testable?

The key is to break its dependence on UnityEngine*, and instead depend on interfaces which mirror their UnityEngine equivalent, supplied as parameters using the Dependency Injection pattern.
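To make that concrete, here is a minimal sketch of a mirrored interface and a test double; the member names are illustrative rather than Uniject’s actual IAudioSource surface:

```csharp
// A hand-written interface mirroring part of UnityEngine.AudioSource.
public interface IAudioSource {
    void Play();
    bool isPlaying { get; }
}

// A fake used under NUnit: it records calls instead of touching Unity.
public class FakeAudioSource : IAudioSource {
    private bool played;
    public void Play() { played = true; }
    public bool isPlaying { get { return played; } }
}
```

Code written against IAudioSource neither knows nor cares which implementation it receives.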

Here’s how our testable zombie looks (many dependencies omitted):

public class Zombie : Testable.TestableComponent {

    private IAudioSource audioSource;

    public Zombie(Testable.TestableGameObject obj,
                  IAudioSource audioSource, ...) : base(obj) {
        this.audioSource = audioSource;
    }
}
We use Ninject, an inversion of control framework, to actually construct our objects at runtime.
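For readers unfamiliar with Ninject: construction amounts to asking a kernel, configured with a module of bindings, for the top-level type. StandardKernel and Get<T>() are Ninject’s real entry points; the module name here is illustrative:

```csharp
// Build a kernel from a module of bindings (illustrative name),
// then ask it for a fully constructed Zombie.
IKernel kernel = new StandardKernel(new TestModule());
Zombie zombie = kernel.Get<Zombie>(); // Ninject resolves the whole dependency graph
```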

A test!

We’re now working with Plain Old C Sharp Objects, here’s one of our NUnit tests:

public void testZombieKilled() {
    Zombie zombie = kernel.Get<Zombie>();
    // ... inflict fatal damage on the zombie (omitted) ...
    Assert.AreEqual(ZombieState.DYING, zombie.getState());
}

(Not shown is the testing base class that sets up Ninject for us and provides the means to ‘step’, simulating one or more frames.)

How it works

Everything we need to instantiate a zombie is declared as a constructor parameter:


TestableGameObject

This is a dependency of the TestableComponent base class. It is equivalent to the UnityEngine.GameObject class; in the same way that MonoBehaviours belong to GameObjects, TestableComponents belong to a TestableGameObject.

IAudioSource

This mirrors the UnityEngine AudioSource class. There are a number of other parameters which are not shown, such as INavMeshAgent, ISphereCollider…

Auto wiring

We configure Ninject with different Modules for running under NUnit and Unity. The NUnit module tells Ninject to use mock implementations of our interfaces, and the Unity module tells it to use our ‘real’ implementations that wrap their UnityEngine equivalents.
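A sketch of what the two modules might look like; the binding names are illustrative, and Uniject’s actual modules bind many more interfaces:

```csharp
// Loaded when running under NUnit: interfaces resolve to fakes.
public class TestModule : NinjectModule {
    public override void Load() {
        Bind<IAudioSource>().To<FakeAudioSource>();
        Bind<ILight>().To<FakeLight>();
    }
}

// Loaded when running under Unity: interfaces resolve to wrappers
// around their UnityEngine equivalents.
public class UnityModule : NinjectModule {
    public override void Load() {
        Bind<IAudioSource>().To<UnityAudioSource>();
        Bind<ILight>().To<UnityLight>();
    }
}
```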

The Unity Ninject module contains some special scoping to ensure our TestableComponents are translated into an appropriate GameObject hierarchy.

An instantiation

So how does our zombie actually get created when we call the following?
Zombie zombie = kernel.Get<Zombie>();


Ninject sees that our Zombie requires an instance of TestableGameObject. This is bound to a class that wraps a UnityEngine.GameObject, so Ninject creates one, and our Unity GameObject is created.

Next, Ninject tries to create our IAudioSource parameter. This is bound to a concrete class that wraps the UnityEngine.AudioSource class (a Unity component). This wrapper itself depends on having a GameObject to add the AudioSource to, which it takes as a constructor parameter. A custom Ninject scoping ensures that the same GameObject is supplied as was created for the TestableGameObject.

This process continues for the remaining dependencies.


An interesting consequence of this decoupling is the ease of porting our code to another game engine. To get our code running on Windows Phone 7, one would merely need to provide XNA based implementations of the interfaces in the Testable namespace.

The original Last Stand, despite being published as a pure java android game, is playable as a standalone desktop java application (and was mostly playtested this way).


The framework has been verified on Desktop, Android and iOS builds. Name mangling makes it unsuitable for Flash builds.

The Price!


This indirection comes at a price:

  • All calls to UnityEngine now go through an interface.
  • Object construction is slower, since Ninject resolves dependencies reflectively.

In practice, we did not notice either of these while making The Clones of Corpus.

What we did notice was the massive increase in productivity these patterns can bring, which is extensively documented elsewhere.

*Mostly; we still use some essential structs, like Vector3.

On Concept Art

These are the original concepts for the characters in the introduction sequence.

Image Production

I am still slightly old fashioned when it comes to the production of my images. All my work is planned in rough on paper, transferred and cleaned up on the lightboard, coloured with ink and only then put into the computer to be composed and have tone added. I still feel that a piece of paper is the best way to lay down ideas, since you can arrange all your images in front of you. I colour before the image is put into the computer because it gives the image a natural, chaotic ink texture that I love.

I’ve used this picture of the office by way of illustration, as it were.

I started with the dimensions of the screen, as seen in the faint grey line. I then went about planning the image, trying to fit everything in whilst keeping the composition correct. I map out in blue pencil – a habit picked up from the animation part of my degree – as I can then go over the top with a darker pencil, so when traced on the lightboard only the clean markings of the dark pencil show through.

Once the clean image has been traced onto a new piece of paper, the image is inked in nothing more than a tone of grey ink. This ‘flat’ image is then scanned in on a flatbed.

Using Photoshop, the image is broken up into layers of varying tone. The top layer is then deleted in areas where I want to add tone. The process is repeated down through the increasingly dark layers. I do this in preference to just adding black, since it retains the texture of the ink and is a very simple method of giving a painted effect.

The final stage is to add any lighting effects, which is simply a matter of painting white over the top. Once that’s done, further alterations are made to the tone to balance out anything offset by the added light. I also add a little blur to give the image more depth.