Christopher Jones, Games Development

On what I've been doing recently... (GIF heavy)

5/7/2017

0 Comments

 
So here's the fun Catch-22 of profile websites and game development logs: you want to show off your best, but a game in prototype is by definition an unfinished mess. Yet unless you talk about it, the tumbleweeds roll through your blog, and then you have a completely different problem...

Anyway, here's what I've been doing recently: how many of you would be interested in hearing about a 3D procedural boss-fighting game?


Unity, Blender and Instancing Models

5/7/2016

13 Comments

 
As many of you may know, Blender files can be imported directly by Unity as models (teeechnically it's just telling the Blender file to export itself as FBX and then importing that, but it's de facto Blender -> Unity for the end user). Honestly, I find it impossible not to want to use Blender as my level / modelling tool rather than trying to build levels out of prefabs in Unity. But with the default importer, there's a catch: instancing.


Unity and Custom Character Controllers, Revisited

4/30/2016

3 Comments

 
So, some time ago I wrote about the mayhem involved in getting a non-axially aligned character controller to function in Unity.

There were tears, there were shenanigans. By the end of it, there was a functional end product... but with more experience, it's time to look back on it with a more critical eye. For one thing, wouldn't it be amazing if you could just use standard Unity Rigidbody and Colliders to solve your character movement?

It is, in fact, possible, and without all the shaking and falling through mesh colliders that - at least for me - plagued my earlier attempts with that method.
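As a taster of where the article ends up, the core of the idea looks roughly like this - a bare-bones sketch included here for flavour, not the code from the post itself; the class name and tuning values are made up:

using UnityEngine;

// Sketch only: drive a Rigidbody with custom gravity along an arbitrary 'up' axis.
[RequireComponent(typeof(Rigidbody))]
public class RigidbodyCharacter : MonoBehaviour {
    public Vector3 up = Vector3.up;   // the character's current 'up', not necessarily world up
    public float moveSpeed = 5f;
    public float gravity = 20f;

    private Rigidbody body;

    void Awake() {
        body = GetComponent<Rigidbody>();
        body.useGravity = false;      // we supply our own gravity along 'up'
        body.freezeRotation = true;   // stop physics toppling the capsule
    }

    void FixedUpdate() {
        // Movement in the plane perpendicular to 'up'; keep existing velocity along 'up'.
        Vector3 input = transform.right * Input.GetAxis("Horizontal")
                      + transform.forward * Input.GetAxis("Vertical");
        Vector3 planar = Vector3.ProjectOnPlane(input, up) * moveSpeed;
        Vector3 vertical = Vector3.Project(body.velocity, up);
        body.velocity = planar + vertical;

        body.AddForce(-up * gravity, ForceMode.Acceleration);   // custom gravity
    }
}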

Read on!


Unity vs the Custom Character Controller

10/25/2014

2 Comments

 
A/N from the future: This is an old post, and whilst the problems it aims to solve are valid, it's not an approach I would recommend. For those struggling for an alternative to the Unity Character Controller, see my later 'revisited' article on this topic.

So as a bunch of people have noticed, the stock Character Controller in Unity has a number of flaws. Erik Ross has done an excellent series of posts on the subject, to be found here, which served as a very helpful reference for implementing my own.



I'm back! Plus Polymorphic State Machines

10/22/2014

0 Comments

 
Had some trouble logging in to my site (Weebly kept getting caught in a login loop) but eventually solved it by just using a different browser. Which I could really have done with thinking of earlier...

Anyway, I have a number of projects on the go: I am now a member of the Pixel Spill studio!

I'm still faffing around in Unity in my spare time, though, and I've picked up a few tricks that might be worth passing along...

Dynamic State Systems
I mostly built this pattern in response to a problem I was having with character controllers. In the interests of code re-use, I wanted the player to simply be an 'Actor' driven around by their input, whilst AI characters were 'Actors' pushed around by their AI overlords. Functionally, these Actors are the same.

However, I didn't want all Actors created equal; for example, some simple enemies just need to be able to move, but the Player needs to be able to run, jump, crouch and do all manner of wonderful things. The natural solution is to simply use a class hierarchy - HumanActor : Actor in my case - but how to build in a state machine whilst also supporting polymorphism?

Another related issue I was having was essentially a form of 'property bloat', where I needed to define various parameters repeatedly for different states (i.e. speed when crouching vs. speed when standing vs. speed when falling through the air, and so on).

Ultimately, I solved this by transferring state mechanics and parameters to nested classes. The basic code is as follows: (Pastebin)

The core feature is that the states themselves and their unique behaviours are defined in custom, nested classes. We keep a dictionary of these classes, allowing us to set state by passing a string as input; child classes can thus add more states by expanding that dictionary. In my case, I had an Actor class with a GroundState and an AirState, and HumanActor which also introduced a CrouchState.
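For anyone who doesn't fancy digging through the pastebin, the skeleton looks something like this - a stripped-down sketch of the pattern rather than the pastebin code itself; names like RegisterState and StateUpdate are just for illustration:

using System.Collections.Generic;
using UnityEngine;

public class Actor : MonoBehaviour {
    // Each state is a nested class carrying its own behaviour and parameters.
    // Marking it [System.Serializable] is what gives the Inspector rollout.
    [System.Serializable]
    public class ActorState {
        public float moveSpeed = 5f;
        protected Actor owner;

        public virtual void Initialise(Actor owner) { this.owner = owner; }
        public virtual void Enter() { }
        public virtual void StateUpdate() { }
    }

    [System.Serializable]
    public class GroundState : ActorState {
        public override void StateUpdate() { /* grounded movement, driven by the owner's input */ }
    }

    [System.Serializable]
    public class AirState : ActorState {
        public float gravity = 20f;
        public override void StateUpdate() { /* falling, air control and so on */ }
    }

    public GroundState groundState = new GroundState();
    public AirState airState = new AirState();

    protected Dictionary<string, ActorState> states = new Dictionary<string, ActorState>();
    protected ActorState currentState;

    protected virtual void Awake() {
        // Every state must be registered (and initialised) up front,
        // or it won't know what it's controlling and SetState will miss it.
        RegisterState("Ground", groundState);
        RegisterState("Air", airState);
        SetState("Ground");
    }

    protected void RegisterState(string key, ActorState state) {
        state.Initialise(this);
        states[key] = state;
    }

    public void SetState(string key) {
        ActorState next;
        if (!states.TryGetValue(key, out next)) {   // the 'unrecognised string' gotcha
            Debug.LogWarning("Unknown state: " + key);
            return;
        }
        currentState = next;
        currentState.Enter();
    }

    void Update() {
        if (currentState != null) currentState.StateUpdate();
    }
}

// A child class adds more states simply by registering them on top of the inherited ones.
public class HumanActor : Actor {
    [System.Serializable]
    public class CrouchState : ActorState {
        public override void StateUpdate() { /* crouched movement */ }
    }

    public CrouchState crouchState = new CrouchState();

    protected override void Awake() {
        base.Awake();
        RegisterState("Crouch", crouchState);
    }
}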

As a bonus, because the state-unique properties are defined in the state itself, you get a collapsible rollout in the Unity Inspector, making things much easier to handle!

There are a few 'gotchas' - the most obvious being 'what happens if I pass a string it doesn't recognise?', which is fairly easy to catch. The other is that you need to initialise every state on startup; otherwise the state class won't know what it's supposed to be controlling and you'll probably get flooded with null reference errors.

You don't have to use strings, of course: I just used them here to support polymorphism. If that's not a problem, you can just as easily ditch the dictionary and use an enum of states and a switch statement in SetState instead (which I've done for my game camera).
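For reference, that non-polymorphic version looks roughly like this (just a sketch, not my actual camera code; the mode names are illustrative):

using UnityEngine;

public enum CameraMode { Follow, Free }

public class GameCamera : MonoBehaviour {
    private CameraMode currentMode;

    public void SetState(CameraMode newMode) {
        switch (newMode) {
            case CameraMode.Follow: /* snap back to orbiting the follow target */ break;
            case CameraMode.Free:   /* detach and move around independently */    break;
        }
        currentMode = newMode;
    }
}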

Another curious 'gotcha' I ran into relates to a bug in MonoDevelop. I had a camera script that, amongst other things, moved itself around to orbit a follow target according to mouse input, but I wanted it to have extra modes of functionality (like being able to detach and move around independently), so I built it a state system similar to the above.

As there was no point in re-typing the old orbiting code, I just copied it over into the Follow state's Update() function, using properties in the base CameraState to link to the camera's transform (so I didn't have to change everything to owner.transform), again as in the pastebin code. This worked in the other camera states I made, which also changed transform.position etc. in their own way, but with this copied one I got an error stating that a nested class couldn't access an instance property in its parent. Even though the Follow state had a transform property, MonoDevelop still thought it was referring to the MonoBehaviour's transform property. Simply re-writing the offending line (without actually changing anything) solved it; it appeared to be a case of MonoDevelop's metadata shooting itself in the foot.

Hope this proves useful! I have a few other topics I've picked up to talk on, and a log of development I hadn't been able to post until now, but that can wait.

Custom Image / RenderTexture effects and Unity

12/2/2013

0 Comments

 
So I recently got shot in the foot by an unexpected issue: custom render effects that work just fine in the Unity Editor, but mysteriously disappear when you run them in a standalone build - with a special thanks to Mark Backler for helping catch the issue (which should teach me to test a build before sending it off for someone to look at, even if it's only been in development a week...).

Doubly troublesome in my case given that one of the effects (2D lighting) had core gameplay implications, but fortunately it proved easy enough to track down. When a build is produced, amongst other things it creates a new folder, <build_name>, containing various files. Key amongst them: a build log.

Popping open mine highlighted this little error:
NullReferenceException
  at (wrapper managed-to-native) UnityEngine.Material:Internal_CreateWithShader (UnityEngine.Material,UnityEngine.Shader)
  at UnityEngine.Material..ctor (UnityEngine.Shader shader) [0x00000] in <filename unknown>:0
  at SFXPass.get_mat () [0x00000] in <filename unknown>:0
  at PathfinderDoomFX.Awake () [0x00000] in <filename unknown>:0
Some quick googling later highlighted the culprit: Shader.Find(). Long story short, when Unity builds a project, it strips out content it believes isn't being used. In the case of shaders, it will strip out anything that does not meet the following criterion: 'it is used in a material that is used in a scene'. Guess what falls outside those bounds? Shaders referenced only in scripts; for example, for image effects.

So. How to dodge around this little conundrum? Fairly simple, really: stick your custom image effect shaders in a Resources folder. Ordinarily, of course, assets are either loaded via the Resources.Load method or by being assigned in the Inspector. Unity can scan through the latter case and pick out any asset that's being used and thus needs including in the project, and it handles the former case by just automatically including everything in Resources folders blindly. Shader.Find() is one of those odd methods out that can reference things the above techniques would miss, and hence - whilst it will work in the Editor, where the game has access to all the assets, even the non-final ones - it will fail in a full, leaner, standalone build.
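For reference, the sort of code that trips this up is the classic lazy material getter (a generic sketch of the pattern rather than my actual SFXPass code; the shader name here is made up):

using UnityEngine;

[ExecuteInEditMode]
public class CustomImageEffect : MonoBehaviour {
    private Material _mat;
    private Material mat {
        get {
            // Shader.Find() only succeeds in a build if the shader survived stripping -
            // i.e. it lives in a Resources folder or the 'Always Included Shaders' list.
            if (_mat == null)
                _mat = new Material(Shader.Find("Hidden/MyCustomEffect"));
            return _mat;
        }
    }

    void OnRenderImage(RenderTexture source, RenderTexture destination) {
        Graphics.Blit(source, destination, mat);
    }
}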

Finally, there's also the 'Always Included Shaders' array in the Project Settings > Graphics rollout. This can be useful if, like me, you're re-using some of the Image Effects shaders but not the components themselves (you have to write your own scripts if you want them to run their calculations on a RenderTexture rather than the screen) and don't want to touch or rearrange the Standard Assets themselves, which obviously rules out sticking them in a Resources folder. For your own custom image effect shaders, though, I would recommend the Resources folder option over the Always Included array; with Resources, they automatically get included simply by being there, making things much easier to maintain.


Unity, GetComponent(T) and Interfaces

11/9/2013

9 Comments

 
A/N: This is actually a post discussing GetComponent<T>, with angled brackets, but putting those in the title made Weebly cry. Just to be clear though, this is about the generic GetComponent function.

Here's a fun little quibble I've run into whilst mucking about with learning to code. I've seen it trip up a few other people on UnityAnswers and there's a pretty simple solution, thus; TIME TO DUST OFF THE BLOG.

Polymorphism is an exceptionally powerful concept. It's also the reason I will always recommend people script in C# rather than Boo or Unity's weird hybrid Not-Quite-Java. In my current case, I found a situation wherein I needed to use an interface with Unity's GetComponent<T>() function; it relates to making an 'IWeapon' interface for... well, components designed as weapons to be 'fired' (which in said game can mean anything from launching a single projectile to starting up a grappling beam with which you smash spaceships into spaceships and presumably investigate what maniacal laughter sounds like from behind an oxygen mask). As these 'weapon' components would be performing a very wide range of tasks, I felt the typical inheritance tree model would be too constricting and that an interface would be best.

Except, of course, for the GetComponent<T>() function. Which can, as you might imagine, only search for and return Components - objects deriving from the UnityEngine.Component class - and an interface is no guarantee of that.

The solution? It's actually hilariously simple. There is a way to get interface-like behaviour whilst still guaranteeing derivation from a certain base class. Namely, the abstract class.

Firstly, let's review. The purpose of an interface is purely to provide an agreed public contract; namely, any other class working with something that implements a specific interface is guaranteed that it holds the methods and properties defined within that interface. So for example:

public interface IWeapon {
    void Fire();
    void StartFiring();
    void StopFiring();
}
If my weapon components all implemented that interface, then in all AI logic, player input handling and so forth, all that code would require is:
IWeapon weapon = GetComponent<IWeapon>();
weapon.Fire();
And the weapon component - be it a simple projectile launcher, raytracer or magic cat missile spewer - would fire. In short, the input / AI code doesn't need to know how the specific weapon fires; it just needs to know it is a weapon, and be able to tell it to fire. In this way, the weapon components themselves can handle their firing behaviour in all their own myriad ways.

And this is what interfaces give you: a simple guarantee that any class that implements a given interface (and, most beautiful of all, a single class can implement more than one interface) will contain the functions the interface specifies. For instance, that IWeapon interface guarantees that any implementing class will have a public Fire() method with zero arguments, returning void. Which is all anything interacting with a weapon component will need to know.

However, here's the catch: all my weapon components also need to be components (classes deriving from MonoBehaviour that you can slap onto GameObjects from the Inspector), and an interface does not guarantee this. It just declares what functions and properties will be publicly available. Thus GetComponent<T>(), which can only search for and return Component-derived objects, fails when it comes to searching for interfaces. If, as above, I wrote:
IWeapon weaponComponent = GetComponent<IWeapon>();
It would fail. Compiler error.
error CS0309: The type `IWeapon' must be convertible to `UnityEngine.Component' in order to use it as parameter `T' in the generic type or method `UnityEngine.Component.GetComponent<T>()'
So how can we save this? We want a public contract - a guarantee that a weapon component will have the method Fire() - without defining what that method does and without hampering what we can do with it. Well, mercifully, there's a way around this:

Enter, the abstract class.
public abstract class AbstractWeapon : MonoBehaviour, IWeapon {
    public abstract void Fire();
    public abstract void StartFiring();
    public abstract void StopFiring();
}
(note how I still have it implementing IWeapon, given that it's true and I might as well, though you could probably remove the IWeapon interface at this point)
Now, if I have my weapon components all derive from AbstractWeapon, we ensure they implement IWeapon and we ensure they derive from MonoBehaviour, which GetComponent<T>() can work with:
AbstractWeapon weaponComponent = GetComponent<AbstractWeapon>();
An abstract class and any declared abstract methods are basically a way of saying "There will be a function/property here, and my derived classes will implement it". Much like an interface, it declares properties and functions without declaring how they work, leaving that up to the derived classes (which, like an interface, will have to implement said code by overriding the abstract functions or there'll be a compile error), but it also guarantees the base class (MonoBehaviour) and all the code that comes with that.
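To make that concrete, a derived weapon might look something like this - a throwaway sketch rather than anything from the actual project; the class, its fields and the projectile prefab are all made up for illustration:

using UnityEngine;

// Hypothetical concrete weapon deriving from the AbstractWeapon above.
public class ProjectileWeapon : AbstractWeapon {
    public Rigidbody projectilePrefab;   // assigned in the Inspector
    public float muzzleVelocity = 30f;

    public override void Fire() {
        // Spawn a projectile and launch it along the weapon's forward vector.
        Rigidbody shot = (Rigidbody)Instantiate(projectilePrefab, transform.position, transform.rotation);
        shot.velocity = transform.forward * muzzleVelocity;
    }

    public override void StartFiring() { Fire(); }   // single-shot weapon: just fire once
    public override void StopFiring() { }            // nothing to wind down
}

The input or AI code never needs to know this class exists; it still just calls GetComponent<AbstractWeapon>() and tells the result to Fire().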

There is a downside, of course: a class can implement any number of interfaces, but can only have one base class; it's still an inheritance tree and thus there will naturally be a 'split' involved somewhere. Fortunately Unity's component-based design lets you side-step this a little (whilst there will be splits, there's nothing to say you can't just have two different components).

Obviously you can still have some behaviour and code defined in the above abstract class example (which will probably prove advantageous; i.e. providing protected helper functions derived classes can call on rather than waste time writing the same snippet of code multiple times) - something you would not be able to do with an interface - but in this case, with all the methods marked abstract, it provides exactly the same functionality as an interface with a pre-defined base class. Which is exactly what we want.

    Author

    A UK-based amateur game developer.

    Archives

    May 2017
    May 2016
    April 2016
    October 2014
    December 2013
    November 2013

    Categories

    All
    Scripting And Theory
    Shaders Are Fun
    Shaders Are Hilarious
    Unity 4

