The Machinery Beta — March 2020 (version 2020.3)

Welcome to the very first beta release of Our Machinery. Since these are our first official release notes they will contain a quick overview of what’s in the engine, rather than a description of what has changed since “last time”. Future release notes will be more traditional!

Editor

The Machinery comes with a unified editor for editing game assets. (And other kinds of data. — The Machinery is not just for games, but for all kinds of demos, simulations and visualizations.)

The interface is tab based, with each tab having a specific function. You can rearrange the tabs, dock them or undock them and create new windows as you wish. You can even have multiple tabs of the same type. For example — two Scene tabs can be used to view the scene from two different cameras. Two Asset Browser tabs can be used to quickly drag and drop assets between directories.

A project in The Machinery consists of a collection of assets (models, sounds, entities, etc). It can be saved either as a single file (which is useful for small simple projects) or as a directory with individual files for each asset (which is useful for big projects, and when you want to collaborate with others using version control).

One of the most important assets is the Entity. We use an entity-component system (ECS). Entities are our game objects and you can add Components to them to give them certain functionality. For example, the Light component adds a light to the entity and the Physics Shape component adds a collision shape to the entity. An entity can’t have more than one component of each type. If you need multiple lights, you need a separate entity to hold each light.
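
The entity-component split is easy to sketch in plain C. The snippet below is purely illustrative and uses hypothetical names (it is not The Machinery’s actual API): an entity is just an ID, and each component type stores its own data keyed by that ID.

```c
// Illustrative sketch only -- not The Machinery's actual API. An entity is
// just an ID, and component data lives in arrays indexed by that ID.

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

typedef uint32_t entity_t;

// Hypothetical component types.
typedef struct { float x, y, z; } link_component_t;              // position linked to parent
typedef struct { float r, g, b, intensity; } light_component_t;

enum { MAX_ENTITIES = 1024 };

static link_component_t links[MAX_ENTITIES];
static light_component_t lights[MAX_ENTITIES];
static bool has_light[MAX_ENTITIES];

int main(void)
{
    // A "streetlight" entity: one Link component and one Light component.
    // Since an entity can only hold one component of each type, a lamp with
    // two bulbs would use two child entities, each with its own Light.
    entity_t streetlight = 1;
    links[streetlight] = (link_component_t){ 0.f, 4.f, 0.f };
    lights[streetlight] = (light_component_t){ 1.f, 0.9f, 0.7f, 1500.f };
    has_light[streetlight] = true;

    if (has_light[streetlight])
        printf("light intensity: %.0f\n", lights[streetlight].intensity);
    return 0;
}
```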

In addition to components, an entity can also have a list of child entities. Child entities are logically owned by their parent entities and get deleted together with the parent. If the child has a Link component (which is the normal case) its position is also linked to the parent, so that it moves together with the parent.

Note that The Machinery does not make any distinction between levels/scenes and entities. A “level” or a “scene” is just an entity with a lot of child entities in it. You use the Scene tab and the Entity Tree to set up entities.

Simulation

You can test run a level by opening the Simulate tab. When you simulate a level, things like physics, sound and animation will run. These things are disabled in the Scene tab because you don’t want to edit the scene with everything moving around.

You can add gameplay to the simulation by adding Graph Components to your entities. This lets you script your entities using a visual scripting language. You can also write gameplay code in C by creating a plugin (more about that later).

In this screenshot the Simulate tab is running in full-screen mode. You can toggle any tab in and out of full-screen mode by pressing F11.

Creation Graphs

In The Machinery there are no such things as texture or material assets; instead, we only have Creation Graphs. Each creation graph ends up becoming a single asset in the project, but depending on what the graph outputs, it can contain any number of buffers, images and shaders. On top of that, it can also contain GPU workloads in the form of draw and compute dispatch calls, as well as CPU workloads and data that can be consumed by other systems, e.g. bounding volumes.

Creation graphs allow you to freely mix nodes executing on the GPU with nodes executing on the CPU within the same graph. Essentially, creation graphs make it possible to freely define what asset granularity you want in your projects, while at the same time giving you a less rigid definition of how your assets are composed than what you might be used to from other game engines.

In the beta of The Machinery we have just scratched the surface of what we believe will be possible to achieve with Creation Graphs. Our focus has been to take them to a point where they can represent typical assets such as textures and materials, while hopefully not feeling too clunky to work with for non-technical artists. They are also used to set up draw calls for rendering of meshes.

Our hope is that even at this rather early stage we can convince technical artists of their power and encourage experimentation.

Asset Pipeline

When importing a 3D scene created in another DCC tool into The Machinery, it ends up as a .dcc_asset in the Asset Browser. A dcc_asset can be thought of as an intermediate representation of the contents found in the imported file. Currently, The Machinery uses The Open-Asset-Importer-Lib as its default file importer for 3D scenes. The import runs as a background task and we only do limited processing of the data during the import step, just enough so that we can visualize and reason about the scene inside The Machinery.

While the resulting dcc_asset can be used directly in an entity, that is rarely what you want; typically there is a need to do further tweaks to the imported scene or parts of its data. To unpack the content found inside a dcc_asset, there is a set of nodes exposed to the Creation Graph, together with a special component called Entity Rigger which is responsible for creating an entity representation matching the scene hierarchy and objects found in the dcc_asset.

To minimize the burden of manually creating an entity with an Entity Rigger component and then setting up creation graphs for all textures and materials found inside the dcc_asset, we have automated the workflow into a one-click process, accessible through the Properties tab when a dcc_asset is selected in the Asset Browser.

What this automatic process does behind the scenes is instantiate a number of prototype creation graphs that describe what to do with images and materials, as well as how to rig draw calls for all meshes. While we ship with predefined creation graphs for this (found under core/creation_graphs/), we definitely encourage anyone interested to play with these graphs or roll their own.

Prototypes

Once you have created a nice looking entity, you often want to use it in more than one place. For example, you may want to reuse a streetlight at multiple points along a street.

In The Machinery, you do this using our Prototype system. First, create an entity asset representing the streetlight. Then, drag this entity into another entity scene. The streetlight becomes an Instance in the other entity, with the original streetlight asset as its Prototype. (Note that some other engines use the term Prefab instead of Prototype.)

Prototypes and instances are linked so that if you make any changes to the prototype, those changes are automatically reflected in all instances of the prototype. Note also that in The Machinery, prototypes aren’t special in any way. Any entity asset can be used as a prototype.

Sometimes, you don’t want a perfect copy of the prototype, but want to make some small modifications. For example, maybe one of the streetlights is broken, and its light source should be turned off.

In The Machinery, you can do that by Overriding properties on the prototype in a specific instance. Overrides work on individual properties, so if you override for example the color of a light source, you will still continue to inherit everything else from the prototype. Any changes to the prototype, except for the color of the light source, will still be reflected in the instance. If you want to override a property of a child entity, you have to override every object above it in the entity hierarchy in order to “drill down” to that specific property.

In addition to overriding properties, you can also add and remove child entities and components in your overridden instance. And if you save your instance with your overrides as an asset, that asset can in turn be used as a prototype by someone else. So you can have multiple instances of the “broken streetlight” asset. And those instances can in turn override specific properties on their prototype.

But prototypes are not just for entities. The concept of prototyping and instancing is built into the engine and works with all kinds of engine assets. For example, if you want to make some small modifications to a Creation Graph, you can use it as a prototype and then override specific nodes in the instance.

Physics

The engine implements physics simulation using the PhysX library. You can add a Physics Shape Component to an entity to give it a collision shape and a Physics Body Component to enable physics simulation. We also implement joints through the Physics Joint Component and a physical character controller using the Physics Mover Component.

Animation

The Machinery supports advanced animation blending through a hierarchical state machine.

An Animation State Machine Asset has States, Transitions, Events and Variables. A State represents a particular behavior and plays a particular animation, or a blend of animations. The animated entity changes state by taking a Transition from one state to another. A transition is usually triggered by an Event and specifies how the animation should be blended between the two states. For example, a state machine may transition from a Running state to a Jumping state on receiving a jump event. Variables set externally can be used to control the play speed, how animations are blended, etc. Transitions can also be set up to happen automatically when variables reach certain values.
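
The data model behind this is small enough to sketch. The following is a hedged illustration in C with hypothetical names (not the engine’s actual ASM format): states that play animations, a transition triggered by an event, and the lookup that moves the machine from one state to the next.

```c
// Illustrative data-model sketch only -- not the engine's actual ASM format.
// It mirrors the concepts above: states that play animations and transitions
// triggered by named events.

#include <stdio.h>
#include <string.h>

typedef struct { const char *name; const char *animation; } state_t;

typedef struct {
    int from, to;            // indices into the state array
    const char *event;       // e.g. "jump"; a real ASM could also trigger on variables
    float blend_time;        // cross-fade duration in seconds
} transition_t;

static const state_t states[] = { { "Running", "run_cycle" }, { "Jumping", "jump_start" } };
static const transition_t transitions[] = { { 0, 1, "jump", 0.2f } };

// Returns the new state index after an event, or the current one if no
// transition matches.
static int handle_event(int current, const char *event)
{
    for (int i = 0; i < (int)(sizeof(transitions) / sizeof(transitions[0])); ++i)
        if (transitions[i].from == current && transitions[i].event &&
            strcmp(transitions[i].event, event) == 0)
            return transitions[i].to;
    return current;
}

int main(void)
{
    int state = 0;
    state = handle_event(state, "jump");
    printf("now in state: %s (playing %s)\n", states[state].name, states[state].animation);
    return 0;
}
```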

The animation states are arranged in a Layer hierarchy. Animations in higher layers play over animations in lower layers, hiding them. For example, you can play a Hurt animation in a higher layer to show the player getting hurt without disturbing the underlying Walking animation. You can specify opacity Blend Sets for animations in higher layers to control which bones they are applied to. Bones that are not in the blend set will “show through” from the underlying layers. For example, you can play a Shooting animation in a higher layer with a blend set that just affects the arms, and the legs will continue to play the underlying Walking animation.

Sound

The Machinery has a basic sound system. You can import WAV files and trigger their playback using the visual scripting language or gameplay plugins. We support 3D positioning of sounds as well as 5.1 and 7.1 surround sounds.

Rendering

The Machinery features a modular and data-driven rendering architecture, developed to take full advantage of modern explicit graphics APIs such as Vulkan and DX12.

At its core is the Renderer plugin, which is responsible for exposing a platform-agnostic API for setting up GPU resources and scheduling GPU work. It makes it trivial to build command buffers in parallel and supports explicit reasoning about mGPU as well as multiple GPU queues (graphics, compute, transfer, etc). Output from the Renderer plugin is a set of abstract command buffers that are sent to one or many render backends that translate the command buffers to graphics API calls as efficiently as possible. In the beta version of The Machinery there’s only a Vulkan render backend available.
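
As a rough mental model of that split, here is a hedged sketch in C using hypothetical types (not the actual Renderer plugin API): work is recorded into an API-agnostic command buffer with sort keys, and a backend later walks the buffer and issues the real graphics API calls.

```c
// Conceptual sketch only, with hypothetical names -- not the actual Renderer
// plugin API. It illustrates the split described above: API-agnostic commands
// are recorded first and translated by a backend afterwards.

#include <stdint.h>
#include <stdio.h>

typedef enum { CMD_DISPATCH_COMPUTE, CMD_DRAW } command_type_t;

typedef struct {
    command_type_t type;
    uint64_t sort_key;       // used to order commands recorded in parallel (not sorted in this sketch)
    uint32_t count;          // thread groups or vertices, depending on type
} abstract_command_t;

typedef struct {
    abstract_command_t commands[256];
    uint32_t n;
} command_buffer_t;

static void record_draw(command_buffer_t *cb, uint64_t sort_key, uint32_t vertices)
{
    cb->commands[cb->n++] = (abstract_command_t){ CMD_DRAW, sort_key, vertices };
}

// A backend walks the abstract commands and issues the equivalent API calls;
// a Vulkan backend would call vkCmdDraw() etc. where this sketch just prints.
static void backend_submit(const command_buffer_t *cb)
{
    for (uint32_t i = 0; i < cb->n; ++i)
        if (cb->commands[i].type == CMD_DRAW)
            printf("draw %u vertices (key %llu)\n", cb->commands[i].count,
                   (unsigned long long)cb->commands[i].sort_key);
}

int main(void)
{
    command_buffer_t cb = { 0 };
    record_draw(&cb, 100, 36);
    backend_submit(&cb);
    return 0;
}
```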

While the Renderer in itself is a very powerful abstraction, it is still fairly minimalistic and low-level, so to ease development of more advanced rendering features we provide two additional plugins called Render Graph and Shader System.

The Render Graph provides a system for authoring and executing self-contained lighting and post processing effects through a Modules concept. It features efficient management of large transient GPU resources (such as render targets) as well as automatic handling of resource transitions.

The Shader System implements a high-level C API as well as a JSON frontend for authoring shaders in HLSL. It also supports interoperability with the Creation Graph making it possible to link together snippets of shader code into complete shader programs.

Default Render Pipe

On top of the Render Graph and Shader System we can build render pipelines. The Default Render Pipe is another plugin that implements the default rendering pipeline used when rendering viewports in The Machinery. It defines the flow of a rendered frame by linking together a set of Render Graph modules, each specifying a number of rendering passes.

At the moment the default rendering pipeline in The Machinery is not very feature rich. It implements a basic hybrid deferred/forward renderer with support for the standard types of analytical light sources as well as IBLs (which can be captured using the Cubemap Capture component). Shadows are handled using standard shadow mapping. For directional lights, we support a variable number of cascades.

For anti-aliasing of the scene geometry, we use TAA based on Marco Salvi’s implementation using Variance Clipping. At the moment there is no post processing stack available in The Machinery.
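
If you are unfamiliar with variance clipping, the sketch below illustrates the core idea. It is a generic illustration written in plain C rather than the engine’s actual shader code: the history sample is pulled inside a color-space box built from the mean and standard deviation of the current frame’s 3×3 neighborhood.

```c
// Generic illustration of the variance-clipping idea (Salvi), not the
// engine's actual shader code. In a real TAA shader this runs on the GPU per
// pixel; plain C floats are used here just to show the math.

#include <math.h>
#include <stdio.h>

typedef struct { float r, g, b; } color_t;

// Clip `history` against an AABB built from the mean and standard deviation
// of the current frame's 3x3 neighborhood: [mu - gamma*sigma, mu + gamma*sigma].
static color_t variance_clip(const color_t n[9], color_t history, float gamma)
{
    color_t m1 = { 0 }, m2 = { 0 };
    for (int i = 0; i < 9; ++i) {
        m1.r += n[i].r; m1.g += n[i].g; m1.b += n[i].b;
        m2.r += n[i].r * n[i].r; m2.g += n[i].g * n[i].g; m2.b += n[i].b * n[i].b;
    }
    color_t mu = { m1.r / 9, m1.g / 9, m1.b / 9 };
    color_t sigma = {
        sqrtf(fmaxf(m2.r / 9 - mu.r * mu.r, 0.f)),
        sqrtf(fmaxf(m2.g / 9 - mu.g * mu.g, 0.f)),
        sqrtf(fmaxf(m2.b / 9 - mu.b * mu.b, 0.f)),
    };

    // Move the history sample toward the box center until it is inside the box.
    color_t d = { history.r - mu.r, history.g - mu.g, history.b - mu.b };
    float t = 1.f;
    if (fabsf(d.r) > gamma * sigma.r) t = fminf(t, gamma * sigma.r / fabsf(d.r));
    if (fabsf(d.g) > gamma * sigma.g) t = fminf(t, gamma * sigma.g / fabsf(d.g));
    if (fabsf(d.b) > gamma * sigma.b) t = fminf(t, gamma * sigma.b / fabsf(d.b));
    return (color_t){ mu.r + d.r * t, mu.g + d.g * t, mu.b + d.b * t };
}

int main(void)
{
    color_t neighborhood[9];
    for (int i = 0; i < 9; ++i) neighborhood[i] = (color_t){ 0.5f, 0.5f, 0.5f };
    color_t clipped = variance_clip(neighborhood, (color_t){ 1.f, 0.f, 0.f }, 1.f);
    printf("clipped history: %.2f %.2f %.2f\n", clipped.r, clipped.g, clipped.b);
    return 0;
}
```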

Over the coming months we intend to flesh out the feature set of the default rendering pipeline significantly. The SDK ships with full source for both the render pipeline itself and all the shaders and shader snippets exposed to the creation graph. Feel free to play around with it. Shaders can be hot-reloaded using F5 in the editor, and the rendering pipe plugin is automatically hot-reloaded if you recompile it.

Collaboration

The Machinery supports real-time collaborative editing between two or more users. All the users in a collaboration session will see all the changes made by the other users.

To start a collaboration session, one user acts as the host and starts the session, then other users join in as clients. This is done through the Collaboration tab in the editor.

In our view, real-time collaboration is not intended to replace version control, but rather to complement it. Having 50 people working collaboratively in the same project would be pretty chaotic and there would be no good way of tracking changes.

Instead, we think real-time collaboration is best used in small “cabals” of maybe 2-5 users with a specific goal, for example to up the quality of a specific level. One user would act as the host and invite the others to the session. When the session ends, the host would save the changes locally on her machine and then check them into version control. That way, they can be tracked and reverted just as any other change.

Plugins

The Machinery is based around a plugin architecture. All the functionality is implemented as plugins, and you can decide which plugins you want to use. You can extend the engine with your own plugins. That way, you can add new editor tabs, tools, entity components, UI controls, etc, etc. You can also swap out parts of the engine. For example, you can implement your own render pipe and use that instead of the default one that we provide.

Since we build all the internal engine features as plugins, we’re constantly “eating our own dog food” with regards to the plugin system. Anything we do in our plugins, you can do in yours.
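
To give a feel for what a plugin looks like, here is a hedged skeleton in C. The tm_load_plugin entry point and the api registry parameter follow the general pattern of our plugin system, but treat the header path, the exact registry function signatures and everything prefixed my_ as assumptions rather than precise SDK details.

```c
// Hedged plugin skeleton. The entry-point pattern is as described in the
// text (a DLL that the engine loads and unloads), but header paths and
// registry calls are assumptions and may differ from this beta's SDK.

#include <foundation/api_registry.h>
#include <stdbool.h>

// A tiny API this plugin could expose to other plugins (hypothetical).
struct my_gameplay_api {
    void (*update)(float dt);
};

static void update(float dt)
{
    // Gameplay code goes here. This is the C alternative (or complement) to
    // visual scripting with Graph Components.
    (void)dt;
}

static struct my_gameplay_api my_gameplay = { .update = update };

// Called by the engine when the plugin DLL is loaded (load == true) or
// unloaded (load == false).
TM_DLL_EXPORT void tm_load_plugin(struct tm_api_registry_api *reg, bool load)
{
    if (load)
        reg->set("my_gameplay_api", &my_gameplay, sizeof(my_gameplay));   // register our API (assumed signature)
    else
        reg->remove(&my_gameplay);                                        // unregister on unload (assumed signature)
}
```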

SDK

In addition to extending the engine with plugins, you can also just use the engine as an SDK and write your own applications on top of it. For example, you could just use our UI toolkit to implement a simple drawing program. Or maybe you just want to use our data processing code to write a little command-line tool that runs as a part of your data pipeline. The possibilities are endless.

All the APIs in our SDK are written in C11, which makes them accessible to all languages that can use an FFI to call functions using the C ABI.

by The Machinery Team