Space Dust Studios

Developer Blog


Unity vs Unreal: Choosing an engine for Space Dust Racing – Part 2: The Unreal Deal


In case you missed it, you can check out the first part of Unity vs Unreal: Choosing an engine for Space Dust Racing at Part 1: Adventures in Unity.


Unreal GDC

Three of us made it over to GDC this year, and after spending the previous few days participating in the speed dating of game development (also known as Game Connection America), we were leisurely wandering around the GDC expo halls.

We had a chat with the guys at the Unity booth, showed them one of our playable prototypes, then went and said hello to a friend at the Qualcomm booth. Next we stared for a while at Allegorithmic's demonstration of their Substance workflow, followed by a quick laugh at Goat Simulator. Soon after, we spotted an odd-looking sign on the back of some unknown booth which basically said:

Get Unreal. Full engine and source. $19/MO +5%

After a few exchanges of WTH, we walked around to the front. Sure enough, it was Epic Games' Unreal Engine 4 booth, with a ton of people gathered around and as many staff members eager to answer the onslaught of questions. After a few more exchanges of WTH with the staff, we joined the queue and had a look at the presentation running in the back. Yep, looked legit. Fast forward a couple of hours to the hotel room, where we signed up and downloaded a copy to start checking it out.

A collection of snapshots taken while in San Francisco attending Game Connection and GDC 2014.

Rendering

For us, probably the biggest asset Unreal brings to the table compared to Unity is its rendering technology (though its networking and source access run it close). It features modern physically-based shading, full-scene reflections, TXAA, integrated GPU particles and an efficient, easy-to-use terrain system (including foliage and spline tools), as well as a host of other desirable features. Most of them are easy to use, efficient and, importantly, well integrated.

A good example is the environment reflection captures, which 'just' involve placing capture primitives (sphere or box) into the scene. These are captured, and at runtime they are selected, (approximately) re-projected and blended together in the same screen-coverage-determines-cost fashion as the rest of the deferred rendering pipeline. They are also blended with dynamic screen-space reflections and use the Lightmass data to reduce leaking. All in all, a pleasant system to work with.
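For the curious, here is a rough sketch of what one of those capture primitives amounts to in code. The wrapper actor is hypothetical (in practice you would just drag a capture into the level in the editor), and the include paths and constructor signature varied across early 4.x releases:

```cpp
// SphereProbe.h – hypothetical wrapper actor; only the engine's sphere
// reflection capture component and its InfluenceRadius are real names here.
#include "GameFramework/Actor.h"
#include "Components/SphereReflectionCaptureComponent.h"
#include "SphereProbe.generated.h"

UCLASS()
class ASphereProbe : public AActor
{
    GENERATED_UCLASS_BODY()
};

// SphereProbe.cpp
ASphereProbe::ASphereProbe(const FObjectInitializer& OI)
    : Super(OI)
{
    USphereReflectionCaptureComponent* Probe =
        OI.CreateDefaultSubobject<USphereReflectionCaptureComponent>(this, TEXT("Probe"));
    RootComponent = Probe;

    // Pixels within this radius (in cm) consider this capture when the
    // renderer selects and blends nearby cubemaps.
    Probe->InfluenceRadius = 1000.0f;
}
```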

Another example is the particle system. GPU particles are embedded within the standard particle system and support most of the standard particle features, while adding screen-space collisions, vector fields and far greater particle counts. They have also made a good effort to provide particle lighting whose cost is decoupled from complexity.
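As a small illustration of that embedding (the function and asset names below are our own): because GPU emitters are just another emitter type inside a regular particle system asset, game code spawns them exactly as it would CPU particles.

```cpp
// Minimal sketch: spawning a particle system that may contain GPU sprite
// emitters – the call is identical to the CPU-only case.
#include "Kismet/GameplayStatics.h"

void SpawnExhaustSparks(UObject* WorldContext, UParticleSystem* SparksFX,
                        const FVector& Location)
{
    // SparksFX is authored in Cascade; whether its emitters use the GPU
    // sprite type-data module makes no difference to this call.
    UGameplayStatics::SpawnEmitterAtLocation(WorldContext, SparksFX, Location);
}
```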

Materials

Underlying most of the graphical content is the material system. It is a visual scripting/node-graph interface that covers most of your needs, from static and skeletal mesh materials to particles and post-processing effects. To keep things clean, you can group node logic into material functions and expose it in a simpler form. You can also write custom material expression code snippets if you need advanced shader features or just want to simplify your node graph.
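Parameters exposed in the graph can also be driven from game code through dynamic material instances. A minimal sketch (the function and the "DamageAmount" parameter name are hypothetical):

```cpp
// Wrap a mesh's material in a runtime-editable instance and drive one of
// its exposed scalar parameters from C++.
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

void SetupDamageMaterial(UStaticMeshComponent* Mesh)
{
    UMaterialInstanceDynamic* MID =
        UMaterialInstanceDynamic::Create(Mesh->GetMaterial(0), Mesh);
    Mesh->SetMaterial(0, MID);

    // "DamageAmount" is an assumed scalar parameter node in the graph.
    MID->SetScalarParameterValue(FName(TEXT("DamageAmount")), 0.75f);
}
```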

The material and material function editors.

Completed material graphs are first compiled down to HLSL or GLSL, which then gets included by template shaders that essentially pick and choose which parts of the material code each required shader type needs. The result is fed to the device to be compiled further, producing fairly efficient shaders. There is even an option within the material editor to view the generated HLSL code.

If you need to do something more advanced, such as hand-optimizing, using compute shaders or supplying constant arrays, things get a little less ideal, as you'll need to jump in and start modifying the engine code – but at least that option is there (first world problems, eh?).

Renderer WIP

Other parts of the renderer can be a bit lacking, missing or just generally incomplete. For example, a single reflection capture is used for each translucent object, picked by a simple distance-to-center heuristic that ignores the capture's influence radius – which can easily end up selecting a non-ideal cubemap.

In another case, we set out to test foliage instancing with a few boulder meshes. We followed the lightmap requirements but couldn't get the lighting to match – only to find that the UInstancedStaticMeshComponent::GetStaticLightingInfo function was empty.

Unreal Engine 4 visualizing a bunch of the rendering buffers.

Networking

Another huge plus for Unreal over Unity, for us, is its networking and replication solution. Its overall architecture has been proven over many iterations of the engine, and with Epic Games' Fortnite as well as many other upcoming games utilizing the current iteration, it should be just as solid.
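To give a flavour of the replication model (the vehicle class and its members below are hypothetical, and the exact constructor signature shifted across early 4.x releases): server-owned state is marked up with reflection macros, and clients request changes through validated server RPCs.

```cpp
// VehiclePawn.h – one replicated property plus a validated server RPC.
#include "GameFramework/Pawn.h"
#include "VehiclePawn.generated.h"

UCLASS()
class AVehiclePawn : public APawn
{
    GENERATED_UCLASS_BODY()

    // Sent from the server to all clients whenever it changes.
    UPROPERTY(Replicated)
    float BoostCharge;

    // Clients call this; it executes on the authoritative server.
    UFUNCTION(Server, Reliable, WithValidation)
    void ServerActivateBoost();

    virtual void GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const override;
};

// VehiclePawn.cpp
#include "Net/UnrealNetwork.h"

AVehiclePawn::AVehiclePawn(const FObjectInitializer& OI)
    : Super(OI), BoostCharge(1.0f) {}

// Register which properties replicate.
void AVehiclePawn::GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const
{
    Super::GetLifetimeReplicatedProps(OutLifetimeProps);
    DOREPLIFETIME(AVehiclePawn, BoostCharge);
}

// The _Validate/_Implementation pair is generated by the specifiers above.
bool AVehiclePawn::ServerActivateBoost_Validate() { return true; }
void AVehiclePawn::ServerActivateBoost_Implementation()
{
    BoostCharge = 0.0f; // changed on the server; replicates out to clients
}
```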

Right out of the box they support dedicated authoritative servers, which most commercial online games require. Even better, these can be compiled to run under Linux. If you need to host a bunch of game instances on a server-farm back-end, Windows boxes are almost always much more expensive than Linux boxes on the same hardware: Microsoft's Azure instances are about 50% more expensive under Windows than Linux, while Amazon EC2 Windows instances are almost double the price of their Linux equivalents.

Another nicety is that they have made it fairly easy to run dedicated servers and test multiple users on the one machine, all from within the editor.

Who needs friends? Four-way multiplayer on the one desktop.

Networking under Unity

Under Unity, from what we could tell, most people ignore the built-in solution (for a bunch of reasons) and instead go for a third-party option such as Photon or uLink, though there is a whole spreadsheet of alternatives that the community has put together for comparison. If you want a C++ server, you will be looking at writing both your own plugin for Unity and the back-end server yourself.

The solution we were leaning towards was not really ideal. It did support dedicated authoritative servers on which we could run our fundamental physics and gameplay simulation, but as it was written in C# we would have needed to run it on Mono under Linux, and we had been warned to expect noticeably reduced performance compared to Windows. At the time, going by the forums, this approach was also mostly untested on Linux. For these reasons, along with the upfront and ongoing costs involved and a bit of misalignment with what we really needed, we budgeted for writing our own solution.

At that stage we also re-evaluated UDK/UE3 because of its networking framework, however due to cost and engine limitations we decided against it. Now that UE4 is out, it solves a lot of those issues for us.

Networking resources

If you want to learn more about the basics of networking and replication under Unreal, there is a great long-standing UE3 documentation page, as well as a smaller UE4 page and a Networking Tutorials playlist that uses their Blueprints system. (Watching those videos, I was a little concerned that the server side of the blueprint might ship with the client, which would be far from ideal – though I'm not sure that is actually the case.)

Blueprints

Blueprints would have to be one of the better and best-integrated visual scripting languages I have used. It is fairly clean and easy to use, but what sets it apart is its runtime visual debugger – debugging a visual language without one can be painful. It is also great to see them continuing to add features that make the workflow that much better. For anyone struggling to learn C# under Unity, Blueprints could potentially offer the answer.
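Blueprints also pair naturally with C++: expensive logic lives in code and is surfaced to the graphs via reflection macros. A small sketch (the class and function names are hypothetical):

```cpp
// PowerUp.h – exposing C++ to Blueprint graphs.
#include "GameFramework/Actor.h"
#include "PowerUp.generated.h"

UCLASS(Blueprintable)
class APowerUp : public AActor
{
    GENERATED_UCLASS_BODY()

    // Callable from any Blueprint graph.
    UFUNCTION(BlueprintCallable, Category = "PowerUp")
    void ApplyToVehicle(class APawn* Vehicle);

    // Declared in C++, implemented visually in the Blueprint editor –
    // e.g. designers hook up particles and audio without touching code.
    UFUNCTION(BlueprintImplementableEvent, Category = "PowerUp")
    void OnApplied();
};

// PowerUp.cpp
APowerUp::APowerUp(const FObjectInitializer& OI) : Super(OI) {}

void APowerUp::ApplyToVehicle(APawn* Vehicle)
{
    // ...gameplay effect applied in C++...
    OnApplied(); // hand control back to the Blueprint side
}
```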

I have heard that they have roughly a similar runtime cost to UnrealScript, at around 10x slower than the equivalent C++, though used correctly that can be mostly mitigated. It does make me wonder, however, why they aren't compiled down to C++ and then run through an optimizing compiler, in a similar way to the material graphs.

If only the links were depicted as strands of spaghetti, we could redefine spaghetti code.

Sample projects

Also worth a mention is the number of high-quality, varied and meticulously maintained example projects that have been coming out of the Marketplace. While learning the engine, I would often load up and refer to the Shooter Game, Strategy Game or Content Examples, but I also spent a good amount of time looking over the Elemental Demo and Effects Cave, as well as most of the other projects. Although we aren't using the vehicle physics released in UE4.2, the Vehicle Game is another top example.

They are all definitely worth a look and you can find supporting documentation on their site.

The rest?

Most of the other core built-in systems, such as NavMesh, audio, streaming and the animation tools, look solid (if a little FPS/TPS oriented). The engine also includes CSG tools, which are handy for prototyping and blocking out levels.

The Windows XInput implementation is currently missing vibration support, but otherwise works well. They have also included a nice plugin framework for adding additional input modules.

The experimental behaviour trees look good. The cinematics tool, Matinee, is cool, but looks set to eventually be superseded by a newer system called Sequencer. They are also in the process of improving their UI systems by adding a WYSIWYG system called UMG (Unreal Motion Graphics). It will be built on top of Slate, which is currently the usual choice for game UI and is also what the editor itself is built with.

I had better leave it here for this post but will cover the rest of the rest next time, including some of the key considerations.

Unity vs Unreal: Choosing an engine for Space Dust Racing – Part 1: Adventures in Unity


Ahh… finally, a break from the beautiful artwork our awesome art team has produced – it's time to break out the programmer art. I promise this short series of posts will be nice and dry, as I cover our take on Unity vs Unreal Engine, including some of the reasons why we switched from Unity to Unreal for Space Dust Racing, and some of our experiences with both engines so far. This post may get a little technical in places, so feel free to skim!

Back in the day…

Prior to starting Space Dust Studios, the majority of our experience was with cross-platform in-house engines. Depending on who you ask, this was both a blessing and a curse. On one hand we often wished the technology was more mature, but on the other we were lucky to be close enough to the underlying technology to make changes or add functionality, and were always able to solve any issues we faced.

Moving to a startup environment put us in a very different situation. We weren't ready to licence a million-dollar engine, and building our own did not align with our resources or goals. UDK and the CryEngine FreeSDK – while great in parts – seemed severely limited in the kind of rendering technology you could implement on top of them, and we needed something more flexible for the type of environments and effects we were looking to create.


Unity

I had started looking at Unity 3.5, and before too long Unity 4 was out, sporting a new deferred DirectX 11 renderer architected in a way not too dissimilar to what we were used to.

The Butterfly Effect video was released at the same time and looked great. It showcased their DirectX 11 rendering alongside additional features such as physically-based shading, HDR image-based lighting, and improvements for character animation, hair, skin and clothing (though unfortunately many of these haven't been released during the 4.x development cycle).

The underlying rendering core is inflexible: as far as we could tell, it doesn't allow for techniques such as proper stencil-optimized deferred decals, reduced-resolution lighting, custom shadow attenuation or (we suspect) TXAA, and you can't really create your own custom light types or uber lighting shaders, to name a few. That aside, there is still a fair bit of leeway for customized rendering: you get access to additional render targets and the ability to write shaders, you can modify a few key internal shaders and generate procedural geometry, and there is a native plugin system through which you can start calling the DirectX device directly.

Unfortunately, parts of the feature set seemed only surface deep. Many of the DirectX 11 features were not fully exposed, and were difficult or error-prone to get working via a native plugin. We quickly came across the lack of support for floating-point textures and 3D texture importing, limited texture addressing and filtering support (UVW), support for Dispatch but not DispatchIndirect, and limited support for multiple render targets (we needed 6-8 MRTs for certain effects but Unity artificially capped it at 4), as well as plenty of bugs, some of which would have made the game unshippable for us without support or source access to fix them.

“We license Unity source code on a per-case and per-title basis via special arrangements made by our business development team. As this can be quite expensive, we do not generally license source code to smaller operations, educational institutions, nor to companies in countries which do not have adequate legal intellectual property protection.”

– From Unity’s FAQ on licensing source

An example: alt-tabbing away from a fullscreen DirectX 11 deferred game would crash when returning to it. I lodged a report with a test-case project via the in-editor crash tool back in Unity 4.0.1f, then checked the issue tracker, and sure enough it had been a reported issue back in Feb 2013 – but it is still yet to be fixed. (I did, however, recently see an acknowledgement of the issue in the Unity 4.5 release notes, so hopefully it will be fixed in the next release.)

Another example was a sporadic issue on certain PCs where, before even getting as far as the input/graphics configuration dialog (which appears before we are given control), the game would become unresponsive… but not always.

Community

“630 thousand monthly active developers and nearly 2.9 million registered developers”

– From Unity’s public relations page

Thankfully the Unity community is fairly active in the forums (and generally just awesome), so for many of the other bugs we came across we were not the first, and someone else had often posted a workaround where one was possible (crashing on exit, the AddForce oddity). For other issues, we knew we would probably need to reimplement a feature in a plugin to avoid them (XInput, I'm looking at you).

There are tons of community-created plugins available on the Asset Store. Some are free, some are licensed per seat and others per project. To get set up comfortably, you really need to set aside a good chunk of cash for a bunch of these.

DIY

There are plenty of basic features that you end up building yourself and then sharing between your projects: better tools for working with prefab hierarchies, a suitable object pooling framework, and in-game debug menus and helpers, to name just a few. Hopefully the next big release will start filling in these gaps.

In-editor footage from an early volumetric rendering prototype built in Unity with a custom rendering extension plugin.

Due to the rendering limitations we started crossing off features, but we also wrote a renderer extension plugin DLL to work around what issues we could. To set up the sampler states as required and provide support for higher MRT counts and the formats we wanted, we had to jump through a ton of hoops, including calling through to the plugin in a fairly delicate way. It was an ongoing concern of ours that future Unity versions could break the implementation and require a redesign or the stripping out of features. In practice, thankfully, it made it through with only minor modifications between upgrades, and we were able to strip the plugin back a bit as engine bugs were fixed. Alas, with access to the source, it would all have been trivial from the start.
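For context, a Unity 4-era native rendering plugin hangs off two exported entry points. The sketch below follows the shape of Unity's RenderingPluginExample from that era; our actual plugin logic is omitted, and the renderer enum value comes from Unity's plugin headers of the time:

```cpp
// Trimmed sketch of the native rendering plugin entry points Unity calls.
#include <d3d11.h>

#define EXPORT_API extern "C" __declspec(dllexport)

static ID3D11Device* g_Device = nullptr;

// Called by Unity when the graphics device is created or destroyed.
EXPORT_API void UnitySetGraphicsDevice(void* device, int deviceType, int eventType)
{
    const int kGfxRendererD3D11 = 2; // from Unity's plugin headers
    if (deviceType == kGfxRendererD3D11)
        g_Device = static_cast<ID3D11Device*>(device);
}

// Called on the render thread via GL.IssuePluginEvent(eventID) from C#.
// This is where we bound extra MRTs and custom sampler states.
EXPORT_API void UnityRenderEvent(int eventID)
{
    if (!g_Device)
        return;
    // ...issue custom state changes and draw calls for the given event...
}
```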

Overall

There were plenty of positive experiences while using Unity. It was great having diff-able text formats for assets, it was easy to get almost anything started, and overall our prototypes came together nicely. Before all the big GDC announcements, I had implemented physically-based shading with planar and captured reflections in Unity, all fairly painlessly, and I was looking forward to working on screen-space reflections when I got back from GDC.

Footage from our physically-based shading prototype built in Unity. Flicks between two preset times of day.

I’ll leave this post here for now, but before I do I quickly want to mention that, as part of its GDC announcements, Unity revealed that it is busy working away on Unity 5 (with physically-based shading and a bunch of new features). We expect it to be a fantastic release when it’s done.

Next time I’ll touch on our adventures at GDC and Unreal Engine 4.
