A Unity programmer in Unreal Engine’s Court: Introduction

By Mike

Well, I have a lot of updating to do here. That tagline in the “About” section, knocked off from Mythbusters, hasn’t been accurate for a long time.

As of now I have five years of game development experience, including the outside-of-classroom projects I did during my last year of grad school. Adam has exited the games industry entirely for the tech sector, although he's still exploring games on his own time and is regularly there for me to bounce thoughts and ideas off of. For both of us, our school days are so far in the rear-view mirror as to be irrelevant.

In those years I've worked at WayForward Technologies, Stompy Bot Productions, and NostalgiCO, and I've contributed to a couple of other independent game projects. I've worked on three published projects built in Unity 4.x, going on a fourth with Cryamore, coming sometime in early 2017. I worked with Unreal 4 for about a year prior to its public release, and recently worked on an Unreal 4 VR project for Gear VR. I've wrapped my arms around a wide array of systems, including localization, UI, save data, combat, game state management, animation, input management, and a variety of eccentric Android systems, not the least of which were the Fire Phone and the illustrious Mad Catz MOJO.


Here’s a build target everybody wants on their resume.

Yeah… I'm a little different from when I started this blog. Currently I'm one of the most-read Quora writers in game development, and I'm writing much more regularly about game development topics, so I figure it's way past time to update this blog.

I'm embarking on a bit of a personal project, which I hope to reveal soon. Since the majority of my professional experience to date has been with Unity, the decision of which engine to build it in seemed like a no-brainer, but as I've had time to do some research the choice has become less and less clear. As an experiment I've been implementing some preliminary systems in each engine, and I felt it was worth sharing the experience so far.

StarCaster, The Prototype

StarCaster is an RPG project I've wanted to build for quite a long time: essentially a JRPG set in a 1990s space anime/sci-fi setting. Think Firefly meets Final Fantasy 9, or perhaps just Skies of Arcadia in space, and you'll get an idea of what I'm aiming for.

For those in the know, basically I just want Outlaw Star, but in RPG form.


Motley crew on a cool space ship having cool space adventures in a setting full of wonder and adventure and inconvenient bills. No game to date has quite scratched the itch for me.

It's an ambitious project, but at this point I've worked on both 'Til Morning's Light and Cryamore, which has helped me grow confident in my ability to implement projects of this kind.

While I'm a long way off from having the capital necessary to build it in full, I'm fully capable of producing the game systems and UI, and of building up the workflow and tools for implementing this spin on the classic JRPG. Content is always going to be demanding, but if anything, I've found that the systemic elements are much easier to handle than most people make them out to be, especially if you resist the urge to reinvent the wheel or reach for sky-high production values.

Voice acting can be an expensive double-edged sword, so let’s just not bother with it.

Workflow, I would argue, is my goal here far more than the end product. I want to go back to this genre's roots and get a feel for what would make content both easy and fun for a team to make.

That last part is key. Very few projects I've worked on have been free of creative frustrations among team members, and as it turns out, the majority of people working on a game aren't in any kind of creative position at all. The job is more like construction work than like being the production crew for a play.

While there will always be aspects of the job that feel that way (nobody's going to be free of checking the localization table for spelling errors), I want to see if I can lean things in the other direction a bit, and make the work feel more like the way people imagine making this kind of game should feel.

The Start of a Build: Tedious Bullcrap Extravaganza

My first order of business with this prototype was getting all of the tedious bullcrap work out of the way first: the text localization system, save data, menu flow, audio management, and scene transitions, to name a few. These are the things people take for granted and put off, causing each to become somebody's full-time job to maintain and troubleshoot throughout a project. Nailing them down early gives you the fundamentals much more quickly.

I began working in Unity 5.4, because I'm extremely familiar with Unity and have built all the requisite systems for this kind of game in Unity at one point or another. I've also got a big library of asset store packages that cut out about 60% of the work of implementing a few of these.

I got as far as building save data, audio, a menu management/stack system…

Music by Benjamin Ray / First Turn Fold

… As well as character movement, and a camera management system a bit reminiscent of God of War.

Note: The blinking on entering the “maintenance area” is a result of the fade using a bounce curve. I couldn’t help but experiment with fades that used multiple kinds of curves.

This is the point where I stopped to evaluate how my Unity workflow was going, because an interesting series of snags during the implementation of the camera and movement controls caused me to second-guess whether Unity was really saving me that many headaches.

Big Savings on Big-Picture Systems

For extremely large-scale systems I found that the Unity Asset Store was saving me a lot of trouble. Some of the frameworks that I found most useful were Master Audio, an audio management package that greatly expands Unity’s existing audio support…


The ProCore suite, including ProBuilder and ProGrids — two indispensable packages that provide the ability to create level geometry in-engine…



These represent basic functionality that Unity doesn't support by default and that would be broadly applicable to a lot of games. Tools like these are easy to justify, because the cost of either ignoring them or building them yourself goes way beyond their $45–65 asset store price tag.

This makes a very effective argument for the asset store in general: why spend the entire development cycle troubleshooting a visual scripter when someone else is graciously doing it full-time?

Camera Chaos

Where this reasoning stumbled was when I started trying out more specific or directed tools. I'll use the camera as an example I feel is emblematic of the problem.

When I was handling the camera system, I first attempted to use a well-rated (five stars) asset store framework for camera control. While it made for an extremely effective, quick, and user-friendly proof of concept, it wasn't an effective tool, for two reasons.

First, it had the relationship between camera states, triggers, and transitions backwards: the different camera states controlled how transitions worked, rather than the triggers controlling them. Second, it had no easing-curve support for interpolations, which means the camera harshly, linearly interpolates to every target position.
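To make the first complaint concrete, here's the relationship I wanted, sketched in hypothetical Unity C# (none of these class names come from the actual asset, and the CameraManager singleton is assumed): the trigger, not the state, owns the transition, so two different triggers can enter the same camera state with different durations and curves.

```csharp
using UnityEngine;

// Hypothetical sketch: a camera state describes only where the camera
// should be, while the trigger that sends you there owns the transition.
public class CameraState : MonoBehaviour
{
    public Vector3 offset;          // desired offset from the follow target
    public float fieldOfView = 60f;
}

public class CameraTrigger : MonoBehaviour
{
    public CameraState targetState;        // where the camera ends up
    public float transitionDuration = 1f;  // owned by the trigger, not the state
    public AnimationCurve transitionEase   // easing is chosen per-transition
        = AnimationCurve.EaseInOut(0f, 0f, 1f, 1f);

    void OnTriggerEnter(Collider other)
    {
        // The (assumed) camera manager blends to the target state using this
        // trigger's duration and curve, rather than settings baked into the
        // state itself.
        CameraManager.Instance.BlendTo(targetState, transitionDuration, transitionEase);
    }
}
```

The asset I tried had this backwards: the state carried the blend settings, so every path into a given state transitioned identically.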

That’s definitely not a five-star system. That’s like a three-star system at best.

When I cracked it open to make modifications, I found the code was a bit of a mess. So I decided to take the concepts behind it and re-engineer it, using DOTween as my base. DOTween is a popular tweening library for Unity that adds extensions to most common types and structs for interpolating and moving based on code-driven values; I had used it for my menus and found it strikingly easy and reliable, so I figured it would work here too.

Wrong! DOTween, under certain circumstances (most of them), does not let you change a tween's end values in real time. It's more a playback system than a live computation: whatever the start and end values were when you initialized the tween remain the start and end values throughout. This meant the camera would interpolate to a desired position, sit there for a second, then let go and catch up with the player.

Finally, I tracked down an easing library from Flash, converted it to C#, and implemented it manually using Unity coroutines, resulting in the camera system above, which finally behaves perfectly.
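A minimal sketch of that approach: a Penner-style easing function driven by a plain coroutine. The names and the specific easing function here are illustrative, not my exact code.

```csharp
using System;
using System.Collections;
using UnityEngine;

public class EasedMover : MonoBehaviour
{
    // Quadratic ease-out, in the style of the classic Flash easing
    // libraries; t is normalized time in [0, 1].
    static float EaseOutQuad(float t)
    {
        return 1f - (1f - t) * (1f - t);
    }

    public IEnumerator MoveTo(Func<Vector3> getTarget, float duration)
    {
        Vector3 start = transform.position;
        for (float elapsed = 0f; elapsed < duration; elapsed += Time.deltaTime)
        {
            // Unlike a pre-baked tween, the end value is re-evaluated every
            // frame, so a moving target is followed instead of snapshotted.
            transform.position = Vector3.Lerp(
                start, getTarget(), EaseOutQuad(elapsed / duration));
            yield return null;
        }
        transform.position = getTarget();
    }
}
```

Passing a `Func<Vector3>` rather than a fixed `Vector3` is the key difference from the DOTween behavior described above, since the tween's end value there is captured at initialization.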

So in the end that’s two Unity frameworks, both considered top-notch, that failed to save me any effort.

Game Framework Shortcomings

I'll also touch briefly on the movement system. My character uses Unity's NavMeshAgent, which lets characters navigate an environment using a navigation mesh.


This makes it very easy to choreograph characters with move-to commands, which you can actually see happening in my camera demonstration. The fade-through trigger starts an auto-move that puts the player into a "safe" position so they don't suffer control disorientation.

Problem: The nav mesh agent in Unity is a substitute for the Character Controller and other forms of environmental collision. It is not compatible with physics, and it also does not permit the character to leave the Nav Mesh. Ever. In any way. This is an advantage when handling edge detection, but it also means that characters can’t jump unless you specifically turn off the Nav Mesh Agent and turn on something else.
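One possible workaround, sketched here under assumptions (a kinematic Rigidbody standing in for "something else", and a deliberately naive landing check), is to disable the agent entirely for the duration of the jump and warp it back onto the nav mesh on landing:

```csharp
using UnityEngine;
using UnityEngine.AI;

// Sketch of the agent-toggling workaround; not production-ready.
public class JumpWorkaround : MonoBehaviour
{
    NavMeshAgent agent;
    Rigidbody body;

    void Awake()
    {
        agent = GetComponent<NavMeshAgent>();
        body = GetComponent<Rigidbody>();
        body.isKinematic = true; // the agent drives movement while grounded
    }

    public void Jump(Vector3 launchVelocity)
    {
        // The agent clamps the character to the nav mesh, so it has to be
        // disabled outright before physics can take over for the jump.
        agent.enabled = false;
        body.isKinematic = false;
        body.velocity = launchVelocity;
    }

    void OnCollisionEnter(Collision collision)
    {
        if (!agent.enabled)
        {
            // On landing, hand control back to navigation. Warp re-attaches
            // the agent to the nav mesh at the character's current position.
            body.isKinematic = true;
            agent.enabled = true;
            agent.Warp(transform.position);
        }
    }
}
```

It works, but every jump now straddles two unrelated movement systems, which is exactly the kind of seam that multiplies in a combat-heavy game.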

I can develop workarounds for this use case during exploration, but knowing the kinds of movement my combat system is going to entail, that seems like a hard sell. Meanwhile, remembering that Unity's tech art pipeline for mesh importing and animation management has not been pleasant, I worry about all the hoop-jumping I'll need to put artists through for combat.

An Unreal Realization

Now the kicker!

Nearly every one of the asset store frameworks I’m using has some equivalent built into Unreal by default.

Having recently worked on an Unreal-based project, I had formerly come to the conclusion that Unity was much more productive than Unreal, based on its gameplay programming workflow. Unity doesn't provide much workflow architecture or game framework by default, but that makes it easy to figure out your starting point: it will always be a GameObject with a MonoBehaviour attached.

As I got a look at how things behaved, though, and started comparing these asset store features to Epic Games' native implementations of the same concepts, I began to doubt I was getting the productivity I thought I was from working in Unity and adopting asset store tools.

Therefore, as an experiment I’ve been reconstructing the systems I built above in Unreal 4 using C++.

So far the most challenging part of this isn't deciphering Unreal Engine itself so much as deciphering the gameplay framework: acquainting myself with the giant mountain of things it can already do and where all that functionality lives.

It has by far the better tech art and animation pipeline, something I’ve known for a long time and that’s been completely true since Unreal 3.

It has a variety of easing functions that are fully customizable.

It has a spline system already.

It has a comprehensive audio management tool.

It has a character class that supports nav meshes in addition to a variety of movement modes, and you don’t have to turn off navigation to use them.

It has game state management built in, including a singleton class called GameInstance that persists between scenes, as well as Game Mode classes that handle the initialization and management of individual game types. Level files include a default game type in their data. This is a system I essentially had to re-engineer for Unity, and one that requires me to remember to drop a prefab into every level I make.

Hell, it even has a text localization system, a spreadsheet importer, and save data, as well as a system specifically built for interpolating the camera the way I’ve presented above. It takes a good deal of setup to accomplish, but you can basically re-create that using nothing more than the Level Blueprint.

The only problem is that Unreal's documentation on these matters, although worlds better than it was during the days of UDK, isn't as clear or as topic-oriented as Unity's, which makes it difficult to find or decipher all of these incredibly time-saving features.

While I initially found Unity more productive, as I grow acquainted with the ins and outs of Unreal's systems, I'm finding that it upholds the professional standard I'm looking to achieve far more readily. Formerly I would have complained about how Player Controllers, Game Instance, Game Modes, Player States, Pawns, etc. are all set up in relation to each other; what I'm finding is that I spend an awful lot of time at the start of each project implementing something that works exactly like this. Especially for a project of my intended scale, the robustness seems more and more welcome.

Therefore, this blog post marks the start of a series I intend to write explaining my findings as I continue to build this up and likely transition over to Unreal: a guide to the Unreal gameplay framework, its common classes, and all the tools it puts at your disposal that are perhaps not as forthcoming as a migrant from Unity would hope.
