What is a Game Engine?
A game engine is a reusable software development kit that enables the creation of video games.
A fundamental aspect of a game engine is its capacity for reuse. It should be extensible and capable of serving as the foundation for numerous different games without requiring major modifications.
Generality vs. Optimality: While game engines aim for generality, there's often a trade-off: the more general-purpose an engine, the less optimal it may be for running a particular game on a specific platform. Engines are often carefully crafted and fine-tuned for a specific game on a particular hardware platform.
In Game Engine Architecture, Jason Gregory notes that "The advent of ever-faster computer hardware and specialized graphics cards, along with ever-more-efficient rendering algorithms and data structures, is beginning to soften the differences between the graphics engines of different genres. It is now possible to use a first-person shooter engine to build a real-time strategy game, for example." [1]
From this we can conclude that the more versatile a game engine is, the less likely it is to be perfectly optimized for one specific game or platform. However, as hardware and algorithms improve, game engines are becoming more adaptable, blurring the lines between the genres they can support.
Unreal Engine Codebase
In this document we dive into the Unreal Engine codebase. To navigate and analyse its architecture we can open any C++ game project solution in Visual Studio, or open the Unreal Engine solution file if we compiled the engine from source code.
At the top level of the hierarchy we can see there are five folders:
Engine, Game, Programs, Rules, and Visualizers.
Engine
The Engine folder contains the source code for the Unreal Engine itself. This is the heart of the engine, including all the core modules and the editor.
The Game folder contains all the C++ source code files (modules) that define our game's unique logic, characters, gameplay mechanics, any custom classes we create, and anything else specific to the project. If you opened the engine source solution rather than a game project, this folder is not present.
The Programs folder holds the source code for various standalone applications and tools that are part of the Unreal Engine ecosystem. This can include things like the Shader Compiler, UnrealBuildTool, and UnrealFrontend. These programs are essential for building, running, and managing our project, but we typically won't need to modify them.
The Rules folder contains the build system configuration files. These files are typically written in C# and define targets for the Unreal Build Tool, dictating how engine modules are linked together into different executables (Editor, Client, Server, etc.). It doesn't contain engine logic, but without it the engine can't be built into usable binaries.
The Visualizers folder holds code for debugging and visualization tools. This can make debugging much easier by providing a clearer view of the data you're working with.
You may wonder why the Programs folder is not part of the Engine folder.
This separation is a deliberate architectural choice by Epic Games, and it serves several important purposes:
The Engine and Programs folders are separated because they target different layers of Unreal’s architecture. The Engine folder contains the C++ runtime and editor modules that are linked into the main executables (UE5.exe, UE5Editor.exe) and are essential for running games or the editor itself. The Programs folder, by contrast, hosts standalone utilities such as UnrealBuildTool (UBT), UnrealHeaderTool, and ShaderCompileWorker, some of which, such as UnrealBuildTool, are written in C#. These tools can be executed independently to build engine binaries, process headers, compile shaders, or package content, effectively changing or generating pieces of the engine and projects. This separation keeps the reusable engine framework distinct from the auxiliary build and pipeline tools, while still allowing them to interact closely during development.
This structure helps clarify the development workflow. When you're working on gameplay code, you know to go into the Game folder. When you're configuring how your project is built, you look at the Rules folder. And when you're looking at the tools that power the process, you know to find them in Programs. This logical separation helps developers quickly find what they need and understand the purpose of each component.
We have mentioned UnrealBuildTool several times; before going further we should understand it better.
UnrealBuildTool, like any build system tool, is a program that automates the process of converting source code into a usable software product. Instead of manually running different commands for each file, a build system handles everything for you.
Here's what these tools typically do:
Compilation: It tells the compiler which source files to turn into machine code. For a large project, this can involve thousands of files in a specific order.
Dependency Management: It automatically finds, downloads, and manages external libraries and packages that your code relies on. This saves you from having to manually track down and install them.
Linking: After compilation, it links all the compiled files and libraries together into a single, executable program or library.
Task Automation: It can be configured to run other tasks as part of the build process, such as running tests, generating documentation, or packaging the final product for distribution.
For small, single-file projects, a simple compiler command might be enough. But as projects grow more complex, with multiple files, libraries, and languages, a build system becomes essential. It ensures that the build process is reproducible, consistent, and efficient, saving developers a lot of time and preventing errors.
Some common examples of build system tools are Make, CMake, Gradle, and MSBuild.
Let's go to our main destination, the Engine folder. The structure we see in the project solution is an overall view of the engine's entire codebase.
Game engines generally consist of a tool suite, named the Editor, and a Runtime component, and Unreal Engine is no exception. It can be broken into two important components: the Editor and the Runtime Engine.
The Editor is the suite of tools used to create and edit content for the game.
The Runtime Engine is the part that runs the game and contains the game-related components.
Most game engines completely separate the Editor from the Runtime Engine, meaning the tools and space used for development run in an isolated layer above the system that executes the game.
Unity follows this model, which makes Play-in-Editor (PIE) slower but prevents most runtime crashes from taking down the entire editor.
Unreal Engine and the older Quake engines, by contrast, integrate their tool suites (UnrealEd, in Unreal's case) directly into the runtime. This design allows the Editor to share the same asset system, reflection framework, and memory space as the game itself. The result is fast PIE performance, since developers are effectively running the real game inside the editor, but with the trade-off that a runtime crash often brings down the entire editor.
Unreal Engine → The Editor and Runtime share the same asset system, UObject reflection system, and memory space. PIE is basically running the real runtime inside the editor process. That’s why:
It runs with near-runtime performance (we’re literally running the game code).
A crash in PIE often takes down the whole editor (shared process).
Unity (and others like Godot) → The Editor runs the game in a more sandboxed process or VM-like environment. The runtime simulation is abstracted away from the editor internals. That’s why:
PIE is noticeably slower (extra indirection, plus it is less optimized than the actual player build). Crashes in play mode usually don't kill the whole editor because of the isolation between editor and runtime.
So, Unreal’s speed advantage in PIE comes from integration, while Unity’s stability comes from separation.
Although we can clearly see two separate folders for Runtime and Editor, that split exists for the sake of organized source code. The integration happens at the binary level and through the module system.
When you compile the Unreal Engine from source, the Unreal Build Tool (UBT) links the Editor modules directly into the main engine executable. This means that the editor and the engine are not two separate applications that communicate with each other. Instead, the Unreal Editor is essentially a specific build configuration of the engine itself. It is the engine running with all the editor-specific functionality enabled. This allows the editor to run the game logic "live" in a seamless way through features like Play-in-Editor (PIE), because the editor and the game runtime are parts of the same process.
Comparison with Unity Game Engine:
The difference in Play-in-Editor (PIE) performance between Unreal Engine and Unity comes down to their architectural choices: Unreal Engine's PIE is faster, while Unity's PIE takes longer. Here's why:
We already covered how Unreal Engine PIE works. Unity, on the other hand, separates the Editor from the Runtime Engine. When you press "Play," Unity has to:
Reload assets into memory.
Recompile scripts if there are changes.
Initialize the game environment from scratch.
This extra loading time is a trade-off for modularity—Unity's separation allows for more flexibility in certain workflows, but it does slow down iteration speed compared to Unreal.
Another reason Unreal Engine made this architectural choice may be that it is mostly used for high-end games with heavy graphics and performance demands, compared to Unity, which is more commonly used for mobile games; for such projects PIE loading time can become very long and hurt developer productivity.
The engine's modular design is what makes this tight integration possible. The Runtime folder contains the core modules that are essential for the game to run on any platform. The Editor folder contains modules that are only needed for the development environment.
However, the editor modules have dependencies on the runtime modules; for example, the LevelEditor module needs to know about the UWorld and AActor classes from the Runtime\Engine module. During the build process for the editor, the UBT includes and links both the runtime and editor modules together to create a single executable.
Let's first see the Runtime Engine Architecture then we come back to the classes in the Runtime\Engine module.
Runtime Engine Architecture
Most modern game engines, including Unreal Engine, Unity, Godot, and others, fundamentally employ a layered architecture. A key principle of this design is that lower layers generally don't depend on upper layers. This structure effectively prevents circular dependencies and fosters modularity. Jason Gregory notes that "Circular dependencies can lead to undesirable coupling between systems, make the software untestable and inhibit code reuse. This is especially true for a large-scale system like a game engine." [1]
Benefits of Layered and Modular Design:
When discussing modern game engines, it's important to understand that layered and modular designs often go hand-in-hand, creating powerful synergies. While layers define the overall vertical structure and dependencies, modules represent independent, self-contained units of code that sit within or across these layers, handling specific functionalities.
Modular Design:
Maintainability & Evolution: This is one of the biggest advantages. When functionality is separated into modules, fixing bugs or making changes tends to be more localized. That helps avoid unexpected side effects and makes long-term updates much easier.
Scalability (Both Team and Project): A modular setup allows multiple developers or teams to work in parallel without stepping on each other’s toes. It also makes it easier to scale the engine itself to support larger game worlds and feature sets.
Testability: Clear module boundaries and interfaces make it easier to write unit tests, which improves code quality and cuts down on debugging time over the long run.
Flexibility & Cross-Platform Support: When modules are well-isolated, it's much easier to swap out implementations—like changing the physics engine or supporting a new platform—without affecting the rest of the system.
Code Organization: A layered approach provides structure, so developers can quickly understand where different pieces of functionality live and how they fit together.
Monolithic Architectures
Unlike modular designs, monolithic architectures are characterized by tightly coupled components with minimal separation of concerns. In these systems, core functionalities like rendering, gameplay logic, and resource management are often deeply interwoven, making it hard to draw clear boundaries or define clean interfaces.
Common Examples:
Legacy Engines: Older engines like the original Quake were designed with tight hardware constraints in mind. They prioritized raw performance through close system integration, often at the expense of long-term maintainability.
Single-Purpose Engines: Sometimes, a game engine is built specifically for one title, with all engineering efforts focused on squeezing maximum performance for that game alone—without concern for future reuse.
Minimalist or Experimental Projects: In some cases, simplicity and fast iteration take priority, especially when scalability and long-term flexibility aren’t important.
Major Drawbacks (Why Monoliths Don’t Scale Well):
Hard to Maintain: Any change, even a small one, risks introducing bugs in unrelated parts of the codebase. Over time, maintaining and updating the system becomes extremely error-prone.
Poor Scalability: Adding features or making significant changes becomes increasingly difficult, as the tight coupling between systems limits flexibility.
Limited Testability: Without clear interfaces or separation, writing unit tests is often not feasible. Debugging usually depends on full-system testing, which is time-consuming.
Challenging for Teams: Monolithic codebases make it difficult for multiple developers to work independently. Conflicts are common, and coordinating changes becomes a bottleneck.
Unreal Engine Modules
“Modules are the basic building block of Unreal Engine's (UE) software architecture. These encapsulate specific editor tools, runtime features, libraries, or other functionality in standalone units of code.” Epic Games documentations[3]
Organizing your project with modules provides the following benefits:
Modules enforce good code separation, providing a means to encapsulate functionality and hide internal parts of the code.
Modules are compiled as separate compilation units. This means only modules that have changed will need to compile, and build times for larger projects will be significantly faster.
Modules are linked together in a dependency graph and limit header includes to code that is actually used, per the Include What You Use (IWYU) standard. This means modules that are not used in your project will be safely excluded from compilation.
You can control when specific modules are loaded and unloaded at runtime. This provides a way to optimize the performance of your project by managing which systems are available and active.
Modules can be included or excluded from your project based on certain conditions, such as which platform the project is being compiled for.[3]
The Unreal build system builds projects according to Target.cs files in your projects and the Build.cs files in your modules, not according to the solution files for your IDE. The IDE solution is generated automatically when editing code, but the Unreal Build Tool (UBT) will ignore it when compiling projects.
Every single module in Unreal Engine has a .Build.cs file. This file is written in C# and is the core of how the Unreal Build Tool (UBT) understands and compiles that module.
When configuring your Build.cs files, you will mainly use the PrivateDependencyModuleNames and PublicDependencyModuleNames lists. Adding module names to these lists will set the modules that are available to your module's code.
For example, if you add the "Slate" and "SlateUI" module names to your private dependencies list, you will be able to include Slate UI classes within your module.[3]
PublicDependency: You should use the PublicDependencyModuleNames list if you use the classes from a module publicly, such as in a public .h file. This will make it possible for other modules that depend on your module to include your header files without issues.
PrivateDependency: You should use the PrivateDependencyModuleNames list if they are only used privately, such as in .cpp files. Private dependencies are preferred wherever possible, as they can reduce your project's compile times.
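As an illustration, here is a minimal, hedged .Build.cs sketch in C# (the language these files are written in). The module name "MyGameplayModule" and its dependency lists are hypothetical examples, not taken from the engine source:

```csharp
// Hypothetical module "MyGameplayModule": a minimal .Build.cs sketch, not from the engine source.
using UnrealBuildTool;

public class MyGameplayModule : ModuleRules
{
    public MyGameplayModule(ReadOnlyTargetRules Target) : base(Target)
    {
        PCHUsage = PCHUsageMode.UseExplicitOrSharedPCHs;

        // Modules whose types appear in this module's public headers.
        PublicDependencyModuleNames.AddRange(new string[] { "Core", "CoreUObject", "Engine" });

        // Modules used only inside .cpp files; private dependencies help keep compile times down.
        PrivateDependencyModuleNames.AddRange(new string[] { "Slate", "SlateCore" });
    }
}
```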
The .Build.cs file is the primary way you configure how a module is built, defining its relationships with other parts of the engine and external code. The picture below shows the .Build.cs files of multiple modules from Runtime.[3]
Your .uproject and .uplugin files contain a Modules list defining which modules are included in your project and how they will load.
When you regenerate your project files, entries for your modules will be added to this list automatically if they are not already present, provided that you have included them in the dependency chain.
Most gameplay modules will simply list their Name, while their Type will be set to Runtime. If their LoadingPhase is not defined it will be set to Default. There are a variety of other module types, loading phases, and additional parameters that control which platforms a module will and won't load on.
The most common module types are Runtime and Editor, which are used for in-game classes and editor-only classes, respectively.
Name: Description
Runtime: Loads on all targets, except programs.
RuntimeNoCommandlet: Loads on all targets, except programs and the editor running commandlets.
RuntimeAndProgram: Loads on all targets, including supported programs.
CookedOnly: Loads only in cooked games.
UncookedOnly: Loads only in uncooked games.
Developer: Deprecated due to ambiguities.
DeveloperTool: Loads on any targets where bBuildDeveloperTools is enabled.
Editor: Loads only when the editor is starting up.
EditorNoCommandlet: Loads only when the editor is starting up, but not in commandlet mode.
EditorAndProgram: Loads only on editor and program targets.
Program: Only loads on program targets.
ServerOnly: Loads on all targets except dedicated clients.
ClientOnly: Loads on all targets except dedicated servers.
ClientOnlyNoCommandlet: Loads in the editor and client, but not in commandlets.
How Layers Improve Testability
Isolation of Components – Each layer has a specific responsibility, so you can test individual layers without affecting others.
Mocking & Dependency Injection – Since layers interact through defined interfaces, you can replace dependencies with mock objects for unit testing.
Encapsulation of Business Logic – The core logic is separated from UI and infrastructure, allowing for pure unit tests without external dependencies.
Predictable Execution Flow – Layers follow a structured order, making it easier to write integration tests that verify interactions between layers.
In Unreal Engine the upper-most layers contain the well-known GameFramework classes such as PlayerController and GameModeBase. The lower layers contain platform-specific implementations such as Runtime/Unix.
From top to bottom, the layers are:
Game-Specific Subsystems
Gameplay Foundations, Rendering, Scene Graph / Culling, Visual Effects, Front End, Skeletal Animation, Collision & Physics, Animation, AI, HID (Input), Audio
Resources (Resource Manager)
Core Systems
Platform Independence Layer (Networking, File System)
3rd Party SDKs (DirectX, OpenGL, PhysX)
OS
Drivers
Hardware
Top: Game Logic, AI, Rendering High-Level, etc. (Abstract, deals with game concepts)
Middle: Core Systems, Resource Management, Platform Independence (Providing general engine functionality)
Bottom: 3rd Party SDKs, OS, Drivers, Hardware (Dealing with the specifics of the machine)
Also to keep the project modular, many features within these layers (e.g. Replication Graph, Gameplay Ability System) are separated out into optional Plugins.
The above table shows all of the major runtime components that make up a typical 3D game engine, taken from Jason Gregory's Game Engine Architecture [1]; the book explains each component in detail and more.
As we said, our goal is to analyse the Unreal Engine architecture, so let's look at a simplified table tailored to Unreal Engine and dive into each layer.
We will now examine each layer in detail, from bottom to top.
The Role of the Target Hardware Layer in Unreal Engine’s Architecture
The target hardware layer, shown in isolation in the figure, represents the computer system or console on which the game will run. While Unreal Engine aims for cross-platform compatibility, there are inevitably some optimizations and code adjustments that are specific to the hardware of different platforms.
This layer is tasked with doing these optimizations and code adjustments for that specific hardware of that platform.
The platforms that Unreal Engine currently supports are Windows, Apple devices, Xbox, PlayStation, Nintendo, Android, Web, and VR.
The Role of the Drivers Layer in Unreal Engine’s Architecture
Drivers manage hardware resources and provide an interface (abstraction) for the operating system to interact with the myriad variants of hardware devices.
The Drivers Layer bridges the gap between the generic commands from the operating system and the specific instructions required by the hardware.
The Role of the Operating System Layer in Unreal Engine’s Architecture
This part of Unreal Engine handles the various operating systems which share hardware resources between multiple applications, one being your game. Unlike consoles of old where a game could "own" the entire device and assume full control of memory and compute resources, modern consoles and modern operating systems employ preemptive multitasking and can have multiple applications running alongside your game (e.g. Xbox Live, Netflix, Voice Chat, store downloads) that take over certain system resources or pause the game entirely.
Some reasons why this layer exists:
Implement memory access and tracking for each platform.
Obtain platform properties regarding features that are supported (e.g. Texture Streaming, High Quality Light Maps, Audio Streaming)
Access (and wrap functions for) platform native APIs (e.g. Atomics, File I/O, Time)
Execute general platform commands (e.g. get orientation of screen, get network type)
Provide platform-specific implementations of OS functions (e.g. FPlatformProcess::Sleep, FPlatformProcess::LaunchURL)
FGenericPlatformMisc and FGenericPlatformProcess are examples of OS-layer classes.
The Unreal Engine, like most graphical applications, starts with a window, and the OS provides the foundational mechanisms for creating and managing these windows.
On Windows, the operating system provides the Windows API, a vast set of tools and functions that applications can use; part of it is the underlying graphical subsystem. A stack of software components, including the window manager, the Graphics Device Interface (GDI) for basic drawing, the graphics driver provided by hardware manufacturers, and ultimately the graphics processing unit (GPU) itself, all work together to render what you see on the screen.
The entry point for the engine depends on the platform. Every Windows program has an entry-point function called WinMain.
Unreal Engine's entry point for Windows, like that of most Windows applications, is the WinMain function defined in Runtime/Launch/Windows/LaunchWindows.cpp.
Each supported platform has their respective entry point:
* MacOS: INT32_MAIN_INT32_ARGC_TCHAR_ARGV in Mac/LaunchMac.cpp
* Linux: int main in Linux/LaunchLinux.cpp
* IOS: int main in IOS/LaunchIOS.cpp
Launch.cpp also contains the engine loop; it is a very simple while loop.
Every game has a kind of heartbeat, a constant internal rhythm that keeps everything running and moving forward. The fundamental "heartbeat" of a real-time game is its game loop, a repeating cycle that drives the entire experience. For a simple game, the loop involves reading player input, updating game objects, running physics and collision checks, and rendering the results. This cycle runs dozens or even hundreds of times per second, creating a live and responsive feel. A primary challenge in developing game loops is consistently meeting strict deadlines, such as the 16 milliseconds required to achieve 60 frames per second.
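To make the idea concrete, here is a small, self-contained C++ sketch of such a loop. It is illustrative only; Unreal's real loop lives in FEngineLoop and is far richer, and every function body here is a stand-in:

```cpp
// A minimal, self-contained sketch of the game-loop "heartbeat" described above.
#include <chrono>
#include <cstdio>

static void ReadPlayerInput()                { /* poll input devices */ }
static void UpdateGameObjects(float Dt)      { /* advance gameplay by Dt seconds */ (void)Dt; }
static void RunPhysicsAndCollision(float Dt) { /* step the physics world */ (void)Dt; }
static void RenderFrame()                    { /* submit draw calls */ }

int main()
{
    using Clock = std::chrono::steady_clock;
    auto Previous = Clock::now();
    int FramesLeft = 600; // stand-in for "until quit is requested"

    while (FramesLeft-- > 0)
    {
        const auto Now = Clock::now();
        const float DeltaSeconds = std::chrono::duration<float>(Now - Previous).count();
        Previous = Now;

        ReadPlayerInput();
        UpdateGameObjects(DeltaSeconds);
        RunPhysicsAndCollision(DeltaSeconds);
        RenderFrame(); // the whole cycle must fit in ~16 ms to hold 60 FPS
    }
    std::puts("loop finished");
    return 0;
}
```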
The Role of the 3rd Party SDKs Layer in Unreal Engine’s Architecture
In Unreal Engine’s architecture, the 3rd Party SDKs Layer plays a key role. Sitting just above the OS and Drivers layers, this layer connects the engine to a wide range of external Software Development Kits (SDKs) that provide specialized functionality—everything from rendering and physics to networking.
Instead of reinventing the wheel for every system, this layer leverages these well-established SDKs to save development time and tap into the expertise of specialized vendors. A few examples include:
Graphics APIs like DirectX, Vulkan, and OpenGL. Rather than building low-level graphics systems from scratch for each platform, Unreal integrates with these standard APIs to interface with the GPU. These SDKs are mature, well-optimized, and constantly updated to support new hardware features.
Physics Engines: legacy integrations such as NVIDIA's PhysX were used to handle things like collision detection and rigid body dynamics. Building a high-performance, stable physics engine is a major undertaking, so using an existing one makes sense both technically and economically, although Unreal now uses its own Chaos physics.
Other Specialized SDKs, like CUDA (general-purpose GPU computing), GeForce NOW (cloud gaming), Steamworks (online services), Python (for scripting), Oculus SDK (VR), WebRTC (real-time video streaming), and SpeedTree (tree rendering), give Unreal access to functionality that would otherwise take years to develop internally.
From a software architecture standpoint, this layer is vital for a few key reasons:
Leverages External Expertise: Integrating SDKs allows Unreal to take advantage of the years of R&D poured into these technologies by third parties, freeing Epic Games to focus on the engine’s core systems and game development tools.
Enables Rapid Access to New Tech: Because many third-party SDKs evolve quickly, Unreal can stay on the cutting edge by updating its integrations, keeping developers current with the latest rendering features, performance improvements, and hardware support.
Supports Multi-Platform Goals: Many SDKs are cross-platform or provide platform-specific versions, which fits perfectly with Unreal Engine’s aim to support a wide range of devices and operating systems. For example, the RHI (Rendering Hardware Interface) wraps these SDKs in a common abstraction so the engine’s rendering code remains platform-agnostic.
Reduces Internal Maintenance Load: Maintaining third-party SDK integrations isn’t free, but it’s often far less costly than developing and supporting full in-house alternatives for each system.
Aligns with Industry Standards: Integrating with widely-used SDKs helps Unreal stick to industry norms. This lowers the learning curve for developers already familiar with tools like DirectX or Steamworks, and makes the engine more attractive to teams and studios.
In short, the 3rd Party SDKs Layer is a foundational part of Unreal’s flexibility and power. It allows the engine to deliver advanced capabilities while staying lean, scalable, and up to date with the latest tech.
Most third-party SDKs in Unreal are stored under Engine/Source/ThirdParty, where you’ll find either their source code, prebuilt static libraries (.lib on Windows, .a on Linux/macOS), or both. These libraries are typically precompiled by Epic and linked into Unreal’s modules during the build. By default, they don’t appear in your IDE’s solution explorer since they aren’t part of Unreal’s core source. If needed, developers can regenerate project files with the -THIRDPARTY flag to include these external projects, but this is usually only required for debugging or rebuilding the third-party SDKs themselves.
The Role of the Platform Independence Layer in Unreal Engine’s Architecture
Unreal Engine's Platform Independence Layer is called the Hardware Abstraction Layer (HAL). Everything under Runtime/Core/Public/HAL falls under this layer, which means the Core module depends heavily on the HAL. The HAL provides the lowest-level, most essential services that must be platform-specific, such as memory management, file I/O, and threading. Core itself is a module that provides fundamental data types, containers, and utilities; it is built on top of the HAL's platform-specific implementations.
HAL provides the raw, platform-dependent tools (like malloc for a specific OS or a function to create a thread). The Core module then uses these tools to build the higher-level, platform-independent utilities that the rest of the engine relies on, such as the TArray container, FString, and the engine's threading library.
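The following is a simplified, hedged sketch of the pattern this describes, not the engine's actual headers: each platform provides a concrete implementation, and a compile-time alias picks the right one, so higher-level code only ever names the platform-neutral type. All type names ending in "Sketch" are invented for illustration.

```cpp
// A hedged sketch of the HAL selection pattern: platform-neutral callers use
// FPlatformProcessSketch without knowing which OS implementation backs it.
struct FGenericPlatformProcessSketch
{
    static void Sleep(float Seconds); // shared declaration / generic fallback
};

#if defined(_WIN32)
    #include <windows.h>
    struct FWindowsPlatformProcessSketch : FGenericPlatformProcessSketch
    {
        static void Sleep(float Seconds) { ::Sleep(static_cast<DWORD>(Seconds * 1000.0f)); }
    };
    using FPlatformProcessSketch = FWindowsPlatformProcessSketch;
#else
    #include <unistd.h>
    struct FUnixPlatformProcessSketch : FGenericPlatformProcessSketch
    {
        static void Sleep(float Seconds) { ::usleep(static_cast<unsigned>(Seconds * 1000000.0f)); }
    };
    using FPlatformProcessSketch = FUnixPlatformProcessSketch;
#endif

// Platform-independent code, such as Core's utilities, can now simply write:
//     FPlatformProcessSketch::Sleep(0.5f);
```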
Key responsibilities of the HAL include:
Memory Management (Malloc* and Memory* files): Unreal Engine uses highly specialized memory allocators, often replacing the default ones provided by the OS. Files like MallocBinned.cpp, MallocJemalloc.cpp, and MallocMimalloc.cpp show that Unreal Engine provides multiple alternative allocators to optimize for different performance characteristics and platforms. LowLevelMemoryUtils.cpp and MemoryMisc.cpp provide core memory-related utilities. This is a crucial part of the HAL's job, as efficient memory management is vital for game performance.
Threading (Thread.cpp, PThreadRunnableThread.cpp, ThreadingBase.cpp): These files provide the core implementation of the engine's threading model. This includes creating and managing threads, synchronization primitives, and thread-local storage. The PThreadRunnableThread.cpp file, for example, points to the use of POSIX threads, which is a common API on Linux and macOS, further illustrating the platform-specific nature of this layer.
File System (FileManagerGeneric.cpp, IPlatformFile*): These files implement the engine's platform-agnostic file system. IPlatformFile* files define the interface for file I/O, and the engine then provides different implementations for each platform, ensuring that code like FFileHelper::LoadFileToString works identically whether the game is running on Windows or a console.
Miscellaneous: Files like ExceptionHandling.cpp, PlatformMemory.cpp, and PlatformMemoryHelpers.cpp cover other essential, low-level platform-specific details. ConsoleManager.cpp and FeedbackContextAnsi.cpp handle basic console output and logging.
In essence, the Platform Independence Layer is the great enabler of Unreal Engine's multi-platform capabilities. It provides the necessary abstraction to shield the higher-level, platform-agnostic parts of the engine from the inherent variations of the underlying hardware and operating systems, making development more efficient and the engine more versatile.
For a comparison point, consider older console architectures, where a game could "own" the entire device and directly access hardware resources. In such scenarios, a Platform Independence Layer was often nonexistent or minimal because the software was written for a single, fixed hardware configuration.
Modern operating systems and consoles, however, involve preemptive multitasking and resource sharing between multiple applications. The necessity of the OS layer, and by extension the HAL, arises from this shift towards more complex, multi-application environments where direct hardware control by a single application is not feasible or desirable. The HAL is essential for navigating these complexities and allowing the engine to coexist with the operating system and other processes while still efficiently utilizing hardware resources.
In comparison to an architecture without a robust HAL, a multi-platform engine would require significant duplication of code for each platform, leading to increased development time, higher maintenance costs, more bugs, and slower adoption of new technologies. The Platform Independence Layer is a fundamental architectural pattern that directly addresses these challenges, making Unreal Engine capable of powering games across a wide range of devices.
Other members of this layer include:
PhysicsCore is the platform- and solver-agnostic abstraction. It defines the API that the rest of the engine uses for physics. The files Runtime/PhysicsCore/Public contain the core interfaces and data structures. For example:
BodyInstanceCore.h defines the abstract representation of a physical body.
CollisionShape.h provides a generic way to describe collision geometry (sphere, box, etc.).
ChaosInterfaceWrapperCore.h and PhysXInterfaceWrapperCore.h are key files that highlight this purpose. They are the wrappers that translate the engine's generic physics requests into specific instructions for either the Chaos or PhysX solver.
This separation means that a developer writing gameplay code for a UStaticMeshComponent doesn't need to know if the engine is using Chaos or PhysX; they just interact with the PhysicsCore API. we’ll see more in the Collision & Physics Layer.
Networking: It provides a consistent API for network operations, even though the underlying implementation might use different operating system sockets or protocols. (Unreal Engine implements its own custom networking protocol on top of the User Datagram Protocol (UDP), adding reliability on top of the unreliable delivery you otherwise get with UDP due to dropped and out-of-order packets.)
The Role of the Core Systems Layer in Unreal Engine’s Architecture
The Core Systems Layer is the foundation of Unreal Engine’s runtime architecture, sitting directly above the Platform Independence Layer. It provides the fundamental services, data structures, and utilities on which all other engine modules depend. Without it, higher layers like Rendering, Animation, or Gameplay would lack the essential building blocks needed to function.
Key responsibilities include:
Data Structures & Algorithms – Custom containers (TArray, TMap, TSet) and templates optimized for performance-critical workloads, avoiding the overhead of STL/Boost.
Memory Management – Engine-wide allocators and tracking systems, designed for low-level efficiency and debugging of memory usage.
Math & Utility Libraries – Core math types (FVector, FMatrix, FRotator) and algorithms supporting physics, graphics, and gameplay systems.
String and Name System – Efficient handling of text and identifiers through FString and FName, enabling fast comparison and reduced memory usage.
Serialization & Object Handles – Facilities for saving/loading objects, managing unique IDs, and supporting the reflection-based object system.
Threading & Concurrency – Asynchronous task systems, thread pools, and synchronization primitives that power scalable multithreading.
Assertions & Logging – Runtime diagnostics (check, ensure, UE_LOG) and crash reporting to detect and document invalid engine states.
Debugging & Profiling Hooks – Cycle counters, stat tracking, and integration with the Trace framework used by Unreal Insights.
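To ground these facilities, here is a small hedged sketch of how they appear in typical UE5 gameplay code. It assumes a module that depends on "Core"; the log category and values are made up for illustration:

```cpp
// A hedged sketch of Core-layer facilities in everyday Unreal C++ code.
#include "CoreMinimal.h"

DEFINE_LOG_CATEGORY_STATIC(LogCoreSketch, Log, All);

void CoreFacilitiesSketch()
{
    // Containers and strings from Core instead of the STL.
    TArray<int32> Scores = { 10, 42, 7 };
    Scores.Sort();

    FString PlayerLabel = FString::Printf(TEXT("Best score: %d"), Scores.Last());
    FName   RowName(TEXT("Player_01")); // FName: cheap-to-compare identifier

    // Assertions and logging for runtime diagnostics.
    check(Scores.Num() > 0);            // hard assert
    ensure(!PlayerLabel.IsEmpty());     // soft assert: logs and continues
    UE_LOG(LogCoreSketch, Log, TEXT("%s (row %s)"), *PlayerLabel, *RowName.ToString());

    // Math types shared by physics, rendering, and gameplay.
    FVector Offset(100.f, 0.f, 50.f);
    UE_LOG(LogCoreSketch, Verbose, TEXT("Offset length = %f"), Offset.Length());
}
```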
One of the most important parts of this layer is the UObject system, which provides reflection, garbage collection, and the basis for nearly all higher-level engine objects. UObject’s lifecycle management (mark-and-sweep garbage collection with UPROPERTY tracking) ensures memory safety and integration across subsystems like Blueprints, serialization, and networking.
From an architectural perspective, the Core Systems Layer delivers three critical benefits:
Performance – Custom low-level systems tuned for real-time simulation.
Consistency – A common foundation used by all modules, keeping the massive codebase coherent.
Scalability – Abstractions that can support multiple platforms, massive projects, and varied game genres.
In addition to its higher-level systems, the Core layer also implements many classic engine-level optimizations described in general game architecture literature. While gameplay programmers rarely deal with these directly, they are a critical part of Unreal’s performance and scalability under the hood. We will address some of these concerns by phrasing them as questions then answering them.
How does the engine manage memory?
At the lowest level, how a C or C++ program manages memory is paramount. Data lives in different places: the stack for temporary values, the heap for dynamic allocations, and separate areas for the code itself. A critical detail is endianness.
Endianness is the order in which bytes are stored in computer memory for multi-byte data types like integers and floating-point numbers.
So imagine a number like 0xABCD1234. On one type of machine its bytes are stored in one order, but on another they are stored in the reverse order. Managing this is crucial: if we try to load a saved game from one system onto another and the two interpret the byte order differently, our progress becomes garbled data. It is this management that ensures games can move data seamlessly across different hardware.
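The following self-contained C++ sketch shows the issue and the usual fix (an explicit byte swap before interpreting foreign data); the value 0xABCD1234 is just an example:

```cpp
// Endianness sketch: the same 32-bit value has a different in-memory byte order
// on little- and big-endian machines, so serialized data must agree on an order.
#include <cstdint>
#include <cstdio>
#include <cstring>

static std::uint32_t SwapBytes32(std::uint32_t Value)
{
    return (Value >> 24) | ((Value >> 8) & 0x0000FF00u) |
           ((Value << 8) & 0x00FF0000u) | (Value << 24);
}

int main()
{
    const std::uint32_t Value = 0xABCD1234u;

    unsigned char Bytes[4];
    std::memcpy(Bytes, &Value, sizeof(Bytes));
    std::printf("In-memory order on this machine: %02X %02X %02X %02X\n",
                Bytes[0], Bytes[1], Bytes[2], Bytes[3]); // 34 12 CD AB on little-endian

    // When loading data written with the opposite byte order, swap before interpreting.
    std::printf("Byte-swapped value: 0x%08X\n", SwapBytes32(Value)); // 0x3412CDAB
    return 0;
}
```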
What about data alignment and packing? Why is that important for games?
When the compiler stores data in a structure, it often adds invisible padding bytes to ensure elements line up on specific memory addresses. This makes accessing them faster, but it also wastes space. We use data packing to solve this: rearranging the members within our data structures. By simply putting the smaller variables together, we can often eliminate those wasted padding bytes entirely.
This makes our game's data footprint much smaller and faster to access. It seems tiny, but it adds up massively in a large game; it's optimization right down at the byte level. And it's not just how data is stored, it's also how quickly the processor gets to it.
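Here is a short, self-contained C++ sketch of the effect. The exact sizes depend on the compiler and target; the comments state the likely result on a typical 64-bit toolchain:

```cpp
// Padding vs. packing: reordering members so the small fields sit together
// removes most of the compiler's hidden padding bytes.
#include <cstdint>
#include <cstdio>

struct Padded            // likely 24 bytes on a typical 64-bit compiler
{
    std::uint8_t  Flag;      // 1 byte + 7 bytes padding so Position aligns to 8
    double        Position;  // 8 bytes
    std::uint8_t  Team;      // 1 byte + 3 bytes padding so Health aligns to 4
    std::uint32_t Health;    // 4 bytes (+ trailing padding rounds the struct up)
};

struct Packed            // likely 16 bytes: same data, friendlier layout
{
    double        Position;  // largest member first
    std::uint32_t Health;
    std::uint8_t  Flag;
    std::uint8_t  Team;      // only 2 bytes of trailing padding remain
};

int main()
{
    std::printf("Padded: %zu bytes, Packed: %zu bytes\n", sizeof(Padded), sizeof(Packed));
    return 0;
}
```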
What are SIMD and parallel processing? How do they make games run so fast?
Single Instruction, Multiple Data (SIMD) is a game changer: instead of the CPU processing one piece of data at a time, a single instruction operates on multiple data elements simultaneously, doing things in batches. This is how games can crunch massive amounts of visual or physics data almost instantly. But to get the most out of it, your data needs to be properly aligned in memory, usually to 16-byte boundaries; otherwise you face significant performance hits.
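A minimal sketch using SSE intrinsics (x86/x64 only) shows both points: one instruction adds four floats at once, and the loads assume 16-byte alignment:

```cpp
// SIMD sketch: _mm_add_ps performs four float additions with a single instruction.
#include <xmmintrin.h>
#include <cstdio>

int main()
{
    alignas(16) float A[4]      = { 1.f, 2.f, 3.f, 4.f };
    alignas(16) float B[4]      = { 10.f, 20.f, 30.f, 40.f };
    alignas(16) float Result[4] = {};

    __m128 VecA = _mm_load_ps(A);         // aligned loads require 16-byte boundaries
    __m128 VecB = _mm_load_ps(B);
    __m128 Sum  = _mm_add_ps(VecA, VecB); // 4 additions in one instruction
    _mm_store_ps(Result, Sum);

    std::printf("%g %g %g %g\n", Result[0], Result[1], Result[2], Result[3]); // 11 22 33 44
    return 0;
}
```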
What are specialised memory allocation strategies that engines use beyond the standard way computers handle memory?
Standard memory allocation can lead to problems, so beyond the general-purpose heap, game engines rely on highly specialised allocators. For instance, a double-ended stack allocator was famously used in the game Hydro Thunder: imagine a block of memory where two stacks grow towards each other from opposite ends, which helps prevent fragmentation. Then there are pool allocators, which are perfect for efficiently allocating lots of fixed-size blocks of memory, say for every bullet or particle in a shooter game. And crucially, aligned allocation ensures data starts at memory addresses that are multiples of certain values, like 16 bytes, which is essential for performance, especially with SIMD. The biggest enemy here, though, is memory fragmentation: available memory gets broken up into tiny, unusable chunks scattered everywhere. Clever solutions involve things like using smart pointers or handles instead of direct memory addresses. These allow the engine to potentially move blocks of memory around in the background to consolidate free space (defragmentation), often spreading the cost over many frames so the player never notices.
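As a concrete illustration of one of these strategies, here is a small, self-contained sketch of a fixed-size pool allocator. It is a generic textbook-style example, not Unreal's MallocBinned or any engine allocator:

```cpp
// Fixed-size pool allocator sketch: free blocks form an intrusive free list,
// so Allocate/Release are O(1) and the pool never fragments.
#include <cstddef>
#include <cstdio>
#include <vector>

class FixedPool
{
public:
    FixedPool(std::size_t BlockSize, std::size_t BlockCount)
        : Storage(BlockSize * BlockCount), FreeHead(nullptr)
    {
        // Thread every block onto the free list (BlockSize must be >= sizeof(void*)).
        for (std::size_t i = 0; i < BlockCount; ++i)
        {
            void* Block = Storage.data() + i * BlockSize;
            *static_cast<void**>(Block) = FreeHead;
            FreeHead = Block;
        }
    }

    void* Allocate()          // pop one block, or nullptr if the pool is exhausted
    {
        if (!FreeHead) return nullptr;
        void* Block = FreeHead;
        FreeHead = *static_cast<void**>(Block);
        return Block;
    }

    void Release(void* Block) // push the block back onto the free list
    {
        *static_cast<void**>(Block) = FreeHead;
        FreeHead = Block;
    }

private:
    std::vector<unsigned char> Storage;
    void* FreeHead;
};

int main()
{
    FixedPool BulletPool(/*BlockSize=*/64, /*BlockCount=*/1024);
    void* Bullet = BulletPool.Allocate();
    std::printf("allocated block at %p\n", Bullet);
    BulletPool.Release(Bullet);
    return 0;
}
```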
The Role of the Resources (Game Assets) Layer in Unreal Engine’s Architecture
Unreal Engine's Resources (Game Assets) Layer acts as the centralized hub for all content used within a game. This layer, managed primarily by the Asset Manager, provides a unified and efficient interface for accessing, loading, and managing various types of game assets. At its core, Unreal Engine treats all content as "Assets," which are serialized to specific file formats, primarily .uasset and .umap files.
The Asset Manager is a high-level system within Unreal Engine, so its implementation is spread across several modules, but the core functionality is housed in the Engine/Source/Runtime/AssetManager module. The UAssetManager class, which can be extended for project-specific needs, is the central object that orchestrates asset discovery, loading, and unloading.
.uasset files: These are the most common asset files in Unreal Engine. Almost every piece of content you create or import into the engine, from a simple texture to a complex Blueprint class, is saved as a .uasset file. These files contain the serialized data of a single UObject (Unreal Object) and its properties. They are highly optimized for Unreal Engine's internal use.
.umap files: These specifically represent "maps" or "levels" within Unreal Engine. A .umap file essentially describes the layout of a game world, including the placement of actors, lighting information, navigation data, and references to all the assets the level uses. We will come back to it shortly.
Unreal Engine also has a concept of Primary and Secondary assets. A Primary Asset is an asset that can be directly managed and loaded by the AssetManager (e.g., a level or a data asset), while a Secondary Asset is an asset that is loaded automatically because it's referenced by a primary asset (e.g., a texture used by a character model). This distinction is crucial for how the engine handles packaging, chunking, and memory management.
Unreal Engine's Asset Manager is a singleton object (meaning there's only one instance of it globally) that serves as the backbone of this layer. Its responsibilities include:
Asset Discovery and Registration: The Asset Manager scans designated content directories (defined in project settings) to discover and register available assets, creating an "Asset Registry" that can be queried.
Asynchronous Loading: A key feature of the Asset Manager (and its underlying FStreamableManager) is the ability to load assets asynchronously. This means assets can be loaded in the background without freezing the game, which is crucial for smooth streaming of content as players move through large worlds or for displaying loading screens (see the sketch after this list).
Asset Bundles: For more fine-grained control, Asset Bundles allow developers to tag specific parts of an asset (or groups of assets) that can be loaded independently. This is useful for optimizing memory usage.
Cooking and Packaging: During the "cooking" process (preparing the game for a specific platform), the Asset Manager will optimise and package assets into .pak files. This converts them into final, binary formats optimized for the target platform, removes editor-only data, and can split content into chunks for efficient distribution and streaming.
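The following is a hedged sketch of the asynchronous-loading item above, as it might appear in a UE5 gameplay module. The asset path "/Game/Weapons/Rifle.Rifle" is hypothetical; the streamable-manager calls follow the commonly used pattern but are illustrative, not a definitive implementation:

```cpp
// Hedged sketch: requesting an asset asynchronously through the Asset Manager.
#include "Engine/AssetManager.h"
#include "Engine/StreamableManager.h"

void RequestRifleAsync()
{
    FStreamableManager& Streamable = UAssetManager::Get().GetStreamableManager();
    const FSoftObjectPath RiflePath(TEXT("/Game/Weapons/Rifle.Rifle")); // hypothetical asset

    // The request returns immediately; the lambda runs once the asset
    // (and everything it references) has finished loading.
    Streamable.RequestAsyncLoad(RiflePath, FStreamableDelegate::CreateLambda([RiflePath]()
    {
        UObject* Loaded = RiflePath.ResolveObject(); // now safe: the object is in memory
        UE_LOG(LogTemp, Log, TEXT("Finished streaming %s"), *GetNameSafe(Loaded));
    }));
}
```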
The Resource Layer handles a vast array of asset types, each with its own internal structure and rendering/logic pipeline:
3D Model Resource (Static Meshes & Skeletal Meshes):
UStaticMesh Assets represent static geometry (e.g., props, environment pieces). They contain vertex data, normals, UVs, and references to materials.
USkeletalMesh Assets represent deformable geometry, typically used for characters. They contain vertex data, a skeleton hierarchy (bones), and references to materials.
Texture Resource (UTexture): Image data used for materials. This includes diffuse maps, normal maps, roughness maps, ambient occlusion maps, etc. The manager handles various image formats and streaming levels of detail (MIP maps).
Material Resource (UMaterial / UMaterialInstance): Define how surfaces look, reacting to light and other properties. They are built using a node-based graph that combines textures, mathematical operations, and parameters. Material Instances allow artists to create variations of a master material without recompiling shaders.
Font Resource (UFont): Defines characters and their properties for displaying text in UI elements or 3D text.
Skeleton Resource (USkeleton): The hierarchical bone structure used by Skeletal Meshes to define how a character can be posed and animated.
Animation Resource (UAnimSequence, UAnimBlueprint, etc.): Store motion data for Skeletal Meshes, ranging from individual animation clips to complex state machines and blending logic.
Collision Resource: Defines the physical boundaries and interaction properties for objects in the world. This can be generated from the mesh geometry or custom-defined.
Physics Parameters (UPhysicsAsset): A collection of bodies and constraints associated with a Skeletal Mesh, used by the physics engine for realistic character ragdolls or destructible objects.
Sound Resource (USoundWave, USoundCue, USoundAttenuation): Audio clips and the rules for their playback, spatialization, and effects.
Particle System Resource (UNiagaraSystem, UParticleSystem): Define visual effects like explosions, smoke, fire, and magical spells using a combination of particles, sprites, and often textures and materials.
Blueprint Class Resource (UBlueprint): Visual scripting assets that allow designers and artists to create game logic, actors, and components without writing C++ code. These compile down to C++ classes during packaging.
Data Asset Resource (UDataAsset): Simple, structured data containers used to store game data like item properties, enemy stats, or level configurations, allowing designers to easily tweak values without code changes.
The UWorld, which is saved as a .umap file, is a special kind of asset. It's the central container for a level. When this file is loaded, the engine doesn't just load the map itself. Instead, it reads the serialized data within the .umap file to understand what objects (actors) are in the world and where they are placed.
The UWorld then acts as the central point of contact for the Asset Manager. The Asset Manager ensures that all the other assets that are referenced by the UWorld (such as meshes, materials, and textures) are also loaded into memory.
This process is what allows for efficient content streaming. The Asset Manager can manage the lifecycle of these assets, unloading them from memory when they're no longer needed (e.g., when a player moves away from a certain area of the world), and loading new ones as needed. This prevents the game from using an excessive amount of memory by keeping the entire world loaded at once.
It's a foundational concept that allows for large, dynamic game worlds to be created and run efficiently on various hardware.
Unreal Engine 5 introduced a feature called World Partition that takes this a step further. Instead of having a single .umap file, it stores the world data in a persistent level that is subdivided into streamable grid cells. This allows for even more granular control over what assets are loaded and unloaded, which is especially useful for creating massive open-world environments.
The Role of the Collision & Physics Layer in Unreal Engine’s Architecture
This layer of the engine is dedicated to simulating the physical interactions of objects within the game world. Its core responsibilities encompass collision detection (determining when objects touch or overlap) and rigid body dynamics (simulating how solid objects move in response to forces).
The physics system in Unreal Engine is a great example of using a layered architecture that separates core functionality from its implementation. This design allows the engine to be flexible, supporting different physics engines (like Chaos and the legacy PhysX) and enabling higher-level gameplay code to remain largely independent of the underlying physics solver. We already explained the Physics & Collision Wrappers (PhysicsCore) in Platform Independence Layer. Now we want to go deeper into how it is implemented.
The files in Runtime/Engine/Private/PhysicsEngine are the concrete implementations of the interfaces defined in PhysicsCore. This is where the engine provides its own unique, high-level physics functionality, such as:
PhysScene_Chaos.cpp: This file contains the actual C++ code that creates and manages the physics world using the Chaos solver.
ConstraintInstance.cpp: This is the implementation of the physics constraints you use in-game (like joints and hinges). It takes the high-level request and uses the wrapped Chaos or PhysX API to execute it.
Forces: Mechanisms to apply impulses, continuous forces, or torques to rigid bodies, driving their motion. This is the primary way gameplay logic influences physics.
Constraints (Joints): Define how two rigid bodies are physically connected or restricted relative to each other. These are high-level abstractions provided by the physics engine to model complex physical relationships.
So, PhysicsCore defines the "what" and PhysicsEngine provides the "how" for the engine's physics system.
Unreal Engine's Implementation:
Historically, Unreal Engine integrated NVIDIA's PhysX SDK, which was free to use and later open-sourced. This allowed Unreal Engine to benefit from a mature, high-performance physics solution.
However, in recent years, Unreal Engine has been transitioning to its own in-house physics solution, Chaos Physics Engine. This strategic move provides Epic Games with greater control, flexibility, and the ability to tightly integrate physics features with other core engine systems (like destruction and animation) without reliance on external vendors.
Shapes / Collidables (Collision Primitives):
These define the geometric representation used for collision detection.
Typically, simple primitive shapes like boxes, spheres, capsules, and convex hulls are used for efficient broad-phase and narrow-phase collision checks, rather than the complex visual mesh geometry. This is a crucial optimization: complex visual meshes are computationally expensive for collision, so simpler "collision meshes" or primitives are used for physics. The physics layer provides the mechanisms to attach these shapes to rigid bodies.
The Role of the Human Interface Devices (HID) Layer in Unreal Engine’s Architecture
This layer can be broken down into three logical parts, from the most abstract to the most concrete:
Game-Specific Interface: This is the highest level, where input is translated into game actions. It's not about which button was pressed, but what that button does in the game (e.g., "Jump," "Shoot," "Move Forward"). This layer is typically configured in the editor and through C++ classes that inherit from APlayerController and UPlayerInput.
Core Abstraction: This is the middle layer that provides a unified API for all input devices. It takes the raw data from the Physical Device I/O layer and presents it in a standardized format that the Game-Specific Interface can use.
Physical Device I/O: This is the lowest level, where the engine communicates directly with the operating system to receive raw input data from physical devices. This is where the engine handles raw data from keyboards, mice, gamepads, and other peripherals.
The code for the HID layer is spread across several modules in Unreal Engine, reflecting its layered nature:
Core Abstraction: The Runtime/InputCore module is the heart of this layer. It defines the core data types and classes for handling input, such as FKey (which represents a keyboard key or gamepad button) and FVector for analog input.
Physical Device I/O: This is handled by platform-specific code. The platform modules (e.g., Runtime/Core/Private/Windows) are responsible for communicating with the OS to get raw input.
High-Level Gameplay Integration: The Runtime/Engine module contains the classes that translate input into gameplay actions. For example, APlayerController handles input, and the Input folder in Runtime/Engine/Private has code for input mapping and binding.
You may be wondering about the "Enhanced Input" plugin. It is the new standard for handling input in Unreal Engine, and it's used in virtually all new projects. While it is a plugin, it's enabled by default in new projects from Unreal Engine 5.1 onward, effectively making it the go-to system. It lives under Plugins/EnhancedInput.
The Enhanced Input plugin provides a highly flexible and robust framework for handling user input. Instead of the old, more rigid system of "Action Mappings" and "Axis Mappings," it uses an asset-based approach with four core concepts:
Input Actions: These are data assets that represent a single conceptual action in your game, like "Jump," "Fire Weapon," or "Look Around." They are not tied to any specific key or button.
Input Mapping Contexts (IMC): These assets map your physical inputs (e.g., keyboard keys, mouse buttons, gamepad sticks) to your Input Actions. You can have multiple IMCs and add or remove them at runtime. For example, you might have one IMC for gameplay and a different one for a menu.
Modifiers: These are pre-processors that can alter the raw input value before it's used. Common modifiers include things like "dead zones" for analog sticks, "smoothing" for camera input, or converting input vectors from local to world space.
Triggers: These determine whether an Input Action should activate based on the input value. Examples include "pressed," "released," "held," or a "tap."
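Here is a hedged C++ sketch of how these four concepts come together. It assumes a UE5 project with the EnhancedInput plugin enabled; AMySketchCharacter, JumpAction, and DefaultMappingContext are hypothetical, and a real class would also carry UCLASS()/GENERATED_BODY() and expose the two assets as UPROPERTYs assigned in the editor:

```cpp
// Hedged sketch: activating an Input Mapping Context and binding an Input Action.
#include "EnhancedInputComponent.h"
#include "EnhancedInputSubsystems.h"
#include "GameFramework/Character.h"
#include "InputAction.h"
#include "InputMappingContext.h"

class AMySketchCharacter : public ACharacter
{
public:
    virtual void SetupPlayerInputComponent(UInputComponent* PlayerInputComponent) override;

    UInputAction*         JumpAction = nullptr;            // "Jump" Input Action asset
    UInputMappingContext* DefaultMappingContext = nullptr; // gameplay mapping context asset
};

void AMySketchCharacter::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // Activate the mapping context (physical keys -> Input Actions) for the local player.
    if (APlayerController* PC = Cast<APlayerController>(GetController()))
    {
        if (auto* Subsystem = ULocalPlayer::GetSubsystem<UEnhancedInputLocalPlayerSubsystem>(PC->GetLocalPlayer()))
        {
            Subsystem->AddMappingContext(DefaultMappingContext, /*Priority=*/0);
        }
    }

    // Bind the Input Action; its Modifiers and Triggers decide when Triggered fires.
    if (auto* EnhancedInput = Cast<UEnhancedInputComponent>(PlayerInputComponent))
    {
        EnhancedInput->BindAction(JumpAction, ETriggerEvent::Triggered, this, &ACharacter::Jump);
    }
}
```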
The main reason Enhanced Input is a plugin is to maintain backward compatibility with older projects and the legacy input system from Unreal Engine 4. It provides a modern, more powerful alternative without breaking existing projects. By being a plugin, it gives developers an upgrade path and allows them to choose which input system they want to use.
The Role of the Low-Level Renderer Layer in Unreal Engine’s Architecture
The low-level renderer provides a unified, platform-agnostic interface between Unreal Engine's higher-level rendering code and the GPU, hiding API-specific complexity (Direct3D 12, Vulkan, Metal) behind a consistent abstraction.
Architectural Importance:
Platform Agnosticism: The RHI's primary role is to provide a consistent API for the rest of the engine's rendering code, regardless of the underlying graphics hardware or operating system. This allows Unreal Engine to support a vast array of platforms (PC, consoles, mobile, VR) without rewriting the entire rendering pipeline for each.
Encapsulation of Complexity: It encapsulates the low-level, often intricate, details of graphics API calls (e.g., creating device contexts, managing command lists, binding render targets). This allows higher-level rendering systems to focus on what to render, rather than how to interact with a specific API.
Performance Optimization: The RHI is meticulously designed for minimal overhead, directly translating engine commands into native graphics API calls to maximize GPU utilization.
Maintainability & Evolution: When a new graphics API emerges (e.g., a new version of DirectX or Vulkan), only the RHI layer needs to be updated or extended with a new implementation, minimizing ripple effects across the entire rendering engine.
Implementation Structure: The RHI's core is found in Runtime/RHI and Runtime/RHICore, with specific implementations for each graphics API (e.g., DirectX RHI, Vulkan RHI, OpenGL RHI).
Key responsibilities of the RHI include:
Resource Management: Creation, tracking, and lifetime management of GPU resources (textures, buffers, samplers).
Pipeline State Control: Creation and caching of PSOs (Pipeline State Objects) for graphics, compute, and ray tracing.
GPU Profiling & Diagnostics: Timers, markers, and debug information (GPUProfiler.h, GpuProfilerTrace.h).
Validation: Optional runtime checks to catch API misuse (RHIValidation*.h).
Platform Backends: Concrete implementations for each supported API in /Windows, /Android, /IOS.
The per-frame flow is:
Game Thread → enqueues render commands (high-level renderer).
Render Thread → builds GPU work, sets PSO/state.
RHI → records to FRHICommandList & submits to backend.
Backend (D3D12/Vulkan/Metal) → native API + driver → GPU.
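A hedged sketch of the first two hops in that flow is shown below. ENQUEUE_RENDER_COMMAND hands a lambda from the game thread to the render thread, where RHI work is recorded; the function name and the empty body are placeholders for illustration:

```cpp
// Hedged sketch: handing work from the game thread to the render thread via the RHI command list.
#include "RenderingThread.h"
#include "RHICommandList.h"

void EnqueueSketchRenderWork()
{
    // Game thread: enqueue the command and return immediately.
    ENQUEUE_RENDER_COMMAND(SketchCommand)(
        [](FRHICommandListImmediate& RHICmdList)
        {
            // Render thread: record platform-agnostic RHI work here (set pipeline state,
            // issue draws/dispatches, copy resources). The recorded commands are later
            // translated by the active backend (D3D12/Vulkan/Metal) into native API calls.
            (void)RHICmdList;
        });
}
```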
The Role of the Scene Graph / Culling Optimizations Layer in Unreal Engine’s Architecture
The Scene Management & Culling layer is responsible for organizing and filtering world geometry and actors so that only relevant and visible data is processed by the renderer and simulation systems. Its main role is to minimize CPU and GPU workload by determining what needs to be drawn and what can be ignored each frame.
Unreal Engine maintains two parallel scene hierarchies:
Gameplay Scene Graph – Hierarchical structure of SceneComponent objects, defining spatial relationships between actors and components.
Render Scene Graph – Optimized internal representation (FScene) used by the renderer for visibility checks, lighting, and GPU submission.
These graphs are synchronized via scene proxies, allowing gameplay logic and rendering to remain decoupled while sharing spatial information.
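A minimal, hedged sketch of that synchronization point: a primitive component creates a scene proxy, the render-thread representation that FScene culls and draws. UMyPrimitiveComponent and FMySceneProxy are hypothetical names, and a real proxy would also implement the actual draw methods (e.g., GetDynamicMeshElements).

```cpp
// Hedged sketch: the game-thread component hands the renderer a proxy object,
// so gameplay and rendering stay decoupled while sharing spatial information.
#include "PrimitiveSceneProxy.h"
#include "Components/PrimitiveComponent.h"

class FMySceneProxy : public FPrimitiveSceneProxy
{
public:
    explicit FMySceneProxy(const UPrimitiveComponent* Component)
        : FPrimitiveSceneProxy(Component)
    {}

    // Tells the renderer in which passes this proxy is relevant for a given view.
    virtual FPrimitiveViewRelevance GetViewRelevance(const FSceneView* View) const override
    {
        FPrimitiveViewRelevance Relevance;
        Relevance.bDrawRelevance = IsShown(View);   // respects visibility flags and show-only lists
        Relevance.bDynamicRelevance = true;         // drawn via dynamic mesh elements
        return Relevance;
    }

    virtual SIZE_T GetTypeHash() const override
    {
        static size_t UniquePointer;
        return reinterpret_cast<SIZE_T>(&UniquePointer);
    }

    virtual uint32 GetMemoryFootprint() const override { return sizeof(*this); }
};

// The component override the renderer calls when the primitive is added to the scene.
FPrimitiveSceneProxy* UMyPrimitiveComponent::CreateSceneProxy()
{
    return new FMySceneProxy(this);
}
```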
Spatial Organization
Uses hierarchical transforms for grouping and positioning.
Supports world partitioning and level streaming for large environments.
Frustum Culling
Discards objects outside the camera’s view volume.
Occlusion Culling
Skips rendering of objects blocked by other geometry.
Distance & Relevance Filtering
Ignores objects too far away or irrelevant to the current rendering pass.
Level of Detail (LOD)
Switches between high- and low-resolution meshes or materials based on distance and screen size.
Architectural Placement
Gameplay graph is managed in the Runtime\Engine module, closely tied to actors and components.
Render graph & culling logic live in the Renderer module, operating just before GPU command submission.
World partitioning and streaming span both domains, influencing visibility and asset loading.
Key Benefits
Performance – Reduces draw calls and GPU memory usage.
Scalability – Handles large worlds efficiently.
Cross-Platform – Culling logic is API-agnostic, working across RHI backends.
The Role of the Visual Effects Layer in Unreal Engine’s Architecture
The Visual Effects layer is responsible for delivering advanced rendering features that go beyond basic scene geometry and textures. It is focused on artistic quality and visual immersion, leveraging the low-level rendering pipeline and GPU-accelerated shaders to simulate realistic lighting, atmospheric effects, and cinematic post-processing.
This layer depends heavily on the Renderer module and its interaction with the RHI (Render Hardware Interface) to push complex effects in real-time.
It sits above the Low-Level Renderer (RHI, rendering pipeline setup) and below the Game Framework & Scene Management layers (which determine what should be visible).
Primary Location: Runtime/Renderer module, with some features in Runtime/Engine and Niagara for particle systems.
Additional Support: Post-processing and lighting shaders in Shading, Renderer, and Niagara modules.
Lighting Models
Static light mapping for baked GI.
Dynamic shadows for real-time lighting changes.
HDR lighting for high-range intensity representation.
Material-Based Effects
Subsurface scattering for skin, wax, and similar materials.
Reflection and refraction via environment mapping.
Particle & VFX Systems
Niagara for complex GPU-accelerated particle simulations.
Cascade (legacy) for sprite-based effects.
Decal Rendering
Project bullet holes, dirt, and graffiti dynamically without modifying mesh geometry.
Post-Processing Pipeline
Bloom, motion blur, depth of field, screen space ambient occlusion (SSAO), chromatic aberration, and tone mapping.
Atmospheric & Volumetric Effects
Fog, volumetric lighting, god rays, and weather effects.
Key Modules
Renderer – Main module for deferred/forward rendering, post-processing, lighting, and shadows.
Niagara – GPU/CPU particle effects and simulations.
Engine – Materials, shaders, decals, basic visual effects hooks.
Shading – Shader implementations for lighting models and post effects.
RenderCore – Shared rendering utilities and render command dispatching.
Key Benefits
Visual Fidelity – Delivers high-end graphics for modern platforms.
Flexibility – Modular systems allow easy addition of new effects.
Performance Tuning – Effects can scale based on platform capability.
The Role of the Front End Layer in Unreal Engine’s Architecture
This layer is responsible for everything the player interacts with that isn't direct gameplay in the 3D world. It encompasses the user interface, presentation elements, and narrative delivery systems that frame the core game experience.
Architectural Importance: This layer provides the visual and interactive bridge between the player and the game's internal logic. It often involves a separate rendering pass or dedicated UI rendering pipeline, distinct from the main 3D scene, ensuring UI elements are always drawn on top and perform efficiently.
The HUD and in-game GUI are implemented primarily through:
UMG (Unreal Motion Graphics) – Widget-based UI for HUD elements.
Slate / SlateCore – Underlying low-level UI framework.
Engine – AHUD class and legacy Canvas drawing.
It’s part of the UI systems above the Game Framework layer, with rendering handled through Slate’s integration into the rendering pipeline.
Full-motion video (FMV) and media playback are provided by:
MediaAssets – High-level API for playing video.
Platform-specific media modules (e.g., WmfMedia, AvfMedia, ElectraPlayer).
Media playback runs as a runtime service that outputs textures, which are then rendered through the UI system or onto world geometry.
In-game cinematics are provided by:
LevelSequence – Cinematic playback.
MovieScene – Underlying timeline and track system.
CinematicCamera – Specialized camera controls.
Sits above the Game Framework, controlling cameras, animation, and rendering order through sequencing.
In-game menus use the same modules as the HUD (UMG, Slate, SlateCore); they are part of the UI layer, not a distinct runtime layer, and menu logic is implemented in game code.
Wrappers / Attract Mode
No dedicated runtime implementation.
Not an engine-provided runtime layer; it is built at the game project level.
The “Front End Layer” from the typical game engine model does not exist as a single, clearly defined runtime layer in Unreal Engine.
Instead:
HUD, GUI, and menus are handled by the UI subsystem (UMG, Slate), which sits above the Game Framework.
FMV and cinematics are handled by media playback and sequencing systems, which are separate runtime modules.
Wrappers / Attract Mode are purely game-level implementations.
The Role of the Skeletal Animation Layer in Unreal Engine’s Architecture
This layer powers character motion in Unreal Engine, controlling how models move, react, and interact.
Engine – Core skeletal mesh, animation components, animation graph execution.
AnimGraphRuntime – Runtime nodes for animation blueprints (state machines, blend nodes, IK solvers).
AnimationCore – Low-level math utilities for bone transforms and IK.
ClothingSystemRuntimeCommon / platform-specific clothing runtimes – Cloth simulation on skeletal meshes.
PhysicsCore / Chaos – Physics integration for ragdolls and physics-based bones.
SkeletalMesh code in Renderer – Final GPU skinning and vertex deformation.
Animation Graph Execution
Animation Blueprint state machines and blend spaces handled by AnimGraphRuntime.
Inverse Kinematics (IK)
Built-in solvers like FABRIK, CCDIK, and Two-Bone IK from AnimationCore.
Attachment System
Bone/socket attachment handled by USkeletalMeshComponent in the Engine module.
Blending
Linear interpolation (LERP) and additive blending in animation nodes (see the blending sketch after this list).
Animation Playback
Controls for looping, rate scaling, and montage playback.
Partial / Layered Animation
Animation layers, slot nodes, and mask-based blending for localized motion.
Decompression
Animation asset loading and decompression handled in Engine runtime.
Ragdoll Physics
Physics asset + Chaos physics simulation for joint constraints.
Skeletal Mesh Rendering
Applies bone transformations to mesh vertices and sends them to the GPU for rendering.
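Below is the blending sketch referenced above: a hedged illustration of the math behind linear pose blending (not the engine’s actual animation-node code), lerping translation and scale while slerping rotation for each bone.

```cpp
// Hedged illustration of per-bone pose blending math using engine math types.
#include "Math/Transform.h"
#include "Containers/Array.h"

FTransform BlendBone(const FTransform& A, const FTransform& B, float Alpha)
{
    FTransform Result;
    Result.SetTranslation(FMath::Lerp(A.GetTranslation(), B.GetTranslation(), Alpha));
    Result.SetRotation(FQuat::Slerp(A.GetRotation(), B.GetRotation(), Alpha));
    Result.SetScale3D(FMath::Lerp(A.GetScale3D(), B.GetScale3D(), Alpha));
    return Result;
}

void BlendPoses(const TArray<FTransform>& PoseA, const TArray<FTransform>& PoseB,
                float Alpha, TArray<FTransform>& OutPose)
{
    // Both input poses are assumed to use the same skeleton and bone ordering.
    OutPose.SetNum(PoseA.Num());
    for (int32 Bone = 0; Bone < PoseA.Num(); ++Bone)
    {
        OutPose[Bone] = BlendBone(PoseA[Bone], PoseB[Bone], Alpha);
    }
}
```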
The Role of the Audio Layer in Unreal Engine’s Architecture
This layer is responsible for all sound-related aspects of the game: it handles playback, spatialization, mixing, and processing of sounds. It also provides both low-level DSP (Digital Signal Processing) capabilities and high-level audio asset management, integrating with the gameplay framework for triggering and controlling sounds.
AudioMixer – Core cross-platform audio rendering engine (mixing, DSP graph, submixes).
AudioExtensions – Interfaces for audio spatialization, reverb, occlusion plugins.
AudioModulation – Runtime control of audio parameters via curves, envelopes, and modulation sources.
AudioPlatform* modules – Platform-specific backends (e.g., AudioPlatformXAudio2, AudioPlatformCoreAudio).
MetasoundEngine – Node-based runtime DSP and procedural audio generation.
SoundFieldRendering – Ambisonics and spatial sound field processing.
DSP / Effects
Real-time effects via the submix system (EQ, reverb, compression, delay).
Procedural audio processing with Metasound (node graph executed in AudioMixer’s DSP thread).
3D Audio Model
Spatialization plugins for binaural audio and surround formats.
Attenuation, occlusion, and environmental reverb.
HRTF (Head-Related Transfer Function) support for VR/AR.
Audio Playback / Management
Sound Cue system for asset-based playback.
Streaming and decoding of compressed audio formats (Ogg Vorbis, Opus, ADPCM).
Voice prioritization and concurrency control.
Submix routing and real-time parameter control.
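For a sense of how gameplay code typically triggers this machinery, here is a small, hedged sketch of playing a spatialized one-shot sound; AMyWeapon and ImpactSound are hypothetical, while UGameplayStatics::PlaySoundAtLocation is the engine call.

```cpp
// Hedged sketch: playing a sound asset at a world location.
// ImpactSound is a hypothetical USoundBase* property; attenuation, concurrency,
// and spatialization settings come from the asset or optional override objects.
#include "Kismet/GameplayStatics.h"
#include "Sound/SoundBase.h"

void AMyWeapon::PlayImpactSound(const FVector& ImpactLocation)
{
    if (ImpactSound)
    {
        // Routed through the AudioMixer: spatialized, attenuated, and mixed into submixes.
        UGameplayStatics::PlaySoundAtLocation(this, ImpactSound, ImpactLocation);
    }
}
```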
The Role of the Gameplay Foundations Layer in Unreal Engine’s Architecture
This layer provides the essential runtime systems that coordinate game flow, manage world content, handle events, and enable interactive gameplay.
Game flow is implemented via UGameInstance, AGameModeBase, and AGameStateBase in the Engine module, which together coordinate match flow, rules, and level transitions.
Game state changes are often handled through Blueprint scripting or C++ state variables.
GameMode defines rules and flow for a match/level (server-only).
GameState holds match-wide replicated data (shared across clients).
PlayerState stores per-player replicated data (e.g., score, health).
GameInstance holds persistent, local data across levels and sessions.
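A minimal, hedged sketch of how these classes are wired together in a game module (class names prefixed with “My” are hypothetical; the base classes and configuration properties are engine-provided):

```cpp
// Hedged sketch: the GameMode (server-only) selects which framework classes define this match.
#include "GameFramework/GameModeBase.h"
#include "MyGameState.h"        // hypothetical AMyGameState : public AGameStateBase
#include "MyPlayerController.h" // hypothetical
#include "MyCharacter.h"        // hypothetical

AMyGameMode::AMyGameMode()
{
    GameStateClass        = AMyGameState::StaticClass();        // replicated match-wide data
    PlayerControllerClass = AMyPlayerController::StaticClass(); // one per human player
    DefaultPawnClass      = AMyCharacter::StaticClass();        // pawn spawned for each player
}
```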
Unreal does not include an embedded text-based scripting language at runtime.
Blueprint visual scripting serves as the runtime-executed scripting system, compiled to bytecode and interpreted by the Blueprint VM.
Static world elements are managed by ULevel and UWorld.
Includes non-moving actors (e.g., environment geometry, static meshes).
Integrated with world partitioning and level streaming for large worlds.
The UObject class that we mentioned in the Core Layer is the basis for almost everything in Unreal and is used to interact with its main systems. A UObject itself does not have a Transform or a physical presence in the game world. UObjects are garbage-collected by UE’s memory manager.
GC uses a mark-and-sweep process; objects are marked reachable if referenced (e.g., via UPROPERTY()), otherwise freed in the sweep phase.
AActor inherits from UObject and represents any entity placed in or spawned into the world, with a transform and world association.
Components (UActorComponent, USceneComponent) define modular functionality for actors.
Physics interaction uses the Physics & Collision layer (covered separately).
Any gameplay object with per-frame updates is integrated into the Actor tick system.
Movement, animation, and interaction logic are processed in real time based on game rules and physics.
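A small, hedged sketch tying these ideas together: an actor that composes a component, keeps it reachable for the garbage collector via UPROPERTY(), and opts into the Actor tick system. The class and file names are hypothetical.

```cpp
// Hedged sketch of a component-composed, ticking actor (hypothetical names).
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "MyRotatingActor.generated.h"

UCLASS()
class AMyRotatingActor : public AActor
{
    GENERATED_BODY()

public:
    AMyRotatingActor()
    {
        PrimaryActorTick.bCanEverTick = true;   // opt in to the Actor tick system

        // Composition: the actor "has a" mesh component rather than inheriting rendering code.
        Mesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Mesh"));
        RootComponent = Mesh;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        // Simple per-frame update driven by the tick system.
        AddActorLocalRotation(FRotator(0.f, 90.f * DeltaSeconds, 0.f));
    }

private:
    // UPROPERTY keeps this reference visible to mark-and-sweep GC and to the editor.
    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* Mesh = nullptr;
};
```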
Built on Unreal’s delegate system and event dispatchers. Includes FDelegate, FMulticastDelegate, and Blueprint event dispatchers for runtime communication between objects. They can be found under Runtime/Core/Public/Delegates.
GameplayTags and GameplayMessageSubsystem (optional) provide data-driven messaging.
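A hedged sketch of the delegate pattern (hypothetical names, and a plain C++ class rather than a UObject to keep it short): one object declares and broadcasts a multicast delegate, and any number of listeners bind to it at runtime.

```cpp
// Hedged sketch of multicast delegates as event dispatchers.
#include "CoreMinimal.h"
#include "Delegates/DelegateCombinations.h"

// Declares a multicast delegate type taking the new health value.
DECLARE_MULTICAST_DELEGATE_OneParam(FOnHealthChanged, float /*NewHealth*/);

class FHealthModel
{
public:
    FOnHealthChanged OnHealthChanged;   // event other systems can bind to

    void ApplyDamage(float Amount)
    {
        Health -= Amount;
        OnHealthChanged.Broadcast(Health);   // notify every bound listener
    }

private:
    float Health = 100.f;
};

// A listener binds a lambda (member functions can be bound with AddRaw/AddUObject/AddSP).
void BindHealthListener(FHealthModel& Model)
{
    Model.OnHealthChanged.AddLambda([](float NewHealth)
    {
        UE_LOG(LogTemp, Log, TEXT("Health is now %f"), NewHealth);
    });
}
```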
World loading and streaming are managed by World Partition, Level Streaming, and UWorld.
Supports asynchronous loading/unloading of world sections, both for large open worlds and for traditional sublevel setups.
The runtime object model architectures refer to the concrete implementation of a game's abstract object model within the gameplay foundation system at runtime. This runtime model is the in-game manifestation of the abstract model presented to designers in the world editor. While the tool-side object model defines the types of dynamic elements and their attributes and behaviours as seen by designers, the runtime object model provides the actual language constructs and software systems used by programmers to implement these.
Game engines generally follow one of two primary architectural styles for runtime object models:
Object-centric architectures
Property-centric architectures
In an object-centric design, each logical game object (as defined on the tool-side) is represented at runtime by a single class instance or a small collection of interconnected instances. The object's attributes and behaviours are encapsulated within these classes. The game world is seen as a collection of these game objects.
Various implementations exist under this umbrella:
Simple Object-Based Model (e.g., Hydro Thunder): Some engines, even those not strictly object-oriented like C-based Hydro Thunder, extend their language to support rudimentary object-oriented features. Hydro Thunder used a WorldOb_t struct with data members for position, orientation, mesh, collision spheres, and animation state, along with pointers to custom update and draw functions (m_pUpdate, m_pDraw) that acted polymorphically. This allowed specific game object types to maintain custom state information and have polymorphic behaviours and visual appearances.
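A hedged illustration of that pattern (not the actual Hydro Thunder source): a C-style “world object” struct whose update and draw function pointers give each object type polymorphic behaviour without C++ classes.

```cpp
// Hedged, self-contained illustration of function-pointer polymorphism in a C-style engine.
#include <cstdio>

struct WorldOb_t;   // forward declaration

typedef void (*UpdateFn)(WorldOb_t* Obj, float DeltaTime);
typedef void (*DrawFn)(const WorldOb_t* Obj);

struct WorldOb_t
{
    float    Position[3];
    float    Orientation[4];
    void*    pMesh;          // renderable mesh data
    void*    pUserData;      // per-type custom state
    UpdateFn m_pUpdate;      // polymorphic behaviour
    DrawFn   m_pDraw;        // polymorphic appearance
};

// A boat-specific implementation plugged into the generic struct.
static void UpdateBoat(WorldOb_t* Obj, float DeltaTime)
{
    Obj->Position[0] += 5.0f * DeltaTime;   // drift forward
}

static void DrawBoat(const WorldOb_t* Obj)
{
    std::printf("Drawing boat at x=%f\n", Obj->Position[0]);
}

int main()
{
    WorldOb_t Boat = {};
    Boat.m_pUpdate = &UpdateBoat;
    Boat.m_pDraw   = &DrawBoat;

    Boat.m_pUpdate(&Boat, 0.016f);   // per-frame "virtual" calls through function pointers
    Boat.m_pDraw(&Boat);
    return 0;
}
```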
Monolithic Class Hierarchies: A common approach where game object types are classified taxonomically using inheritance. This leads to a hierarchy where common, generic functionality resides at the root, and more specific functionality is added towards the leaves.
Advantages: Intuitive and straightforward for representing interrelated object types.
Disadvantages:
Complexity: Can become very wide and deep, making it difficult to understand a class without understanding all its parent classes.
Inflexibility: Limited in describing multi-dimensional taxonomies (e.g., an amphibious vehicle doesn't fit neatly into separate 'land vehicle' and 'aquatic vehicle' branches).
"Deadly Diamond": Multiple inheritance, while a potential solution for multi-dimensional taxonomies, can introduce confusion and technical difficulties.
"Bubble-Up Effect": Features intended for specific classes tend to "bubble up" to more generic base classes to facilitate code sharing, leading to root classes accumulating too much functionality (e.g., Unreal's Actor class handling rendering, animation, physics, audio, network replication, etc.).
Composition to Simplify Hierarchy: Favours the "has-a" relationship over "is-a". Instead of inheriting from many base classes, a main "hub" class (e.g., GameObject) composes other classes (components) like Transform, MeshInstance, AnimationController, and RigidBody.
Advantages: Reduces hierarchy width, depth, and complexity. Allows for greater flexibility, as new component types can be added without altering the core game object class.
Component Ownership: The "hub" class typically manages the lifetime of its components.
Unreal Engine's Hybrid Approach: While a Monolithic Class Hierarchy can lead to issues, Unreal Engine's design leverages both this and Composition to create a robust and flexible system. The core of this approach is the AActor class. It serves as the root of a significant, but not entirely monolithic, class hierarchy. All playable characters, objects, and world elements ultimately inherit from AActor, which provides fundamental functionality like transform data, ticking, and network replication. This is-a relationship ensures a consistent baseline for every object in the world. To solve the problems of an overly deep hierarchy, Unreal Engine uses the Component System. This system allows developers to attach UActorComponent objects to an AActor. A UStaticMeshComponent gives an AActor a visual representation, a UBoxComponent gives it a collision volume, and a UCameraComponent makes it a camera. This has-a relationship allows developers to build complex objects by composing simple, reusable components, rather than relying on deep, complex inheritance chains. This hybrid approach gets the best of both worlds: a strong, reliable foundation from the AActor hierarchy and the flexibility of a component-based system.
Unity's Architecture: Unity is primarily a Composition-based engine. Its architecture is built around the GameObject and Component model. A GameObject is essentially an empty container that has a Transform component by default. All functionality, such as rendering, physics, or scripting, is added to the GameObject by attaching various components. For instance, to give a GameObject a visual mesh, you'd add a MeshFilter and a MeshRenderer component. To give it physics, you'd add a Rigidbody and a Collider component. There is no single, deep class hierarchy for game objects; instead, all functionality is built from the composition of components. This makes Unity's architecture highly flexible and modular, allowing for easy mixing and matching of functionality.
Generic Components (Pure Component Model): This is what happens if we take the componentization concept to its extreme and move literally all of the functionality out of the root GameObject class into various component classes. The root game object class manages a generic linked list of components, often deriving from a common base class. Components are linked only indirectly by sharing a unique ID, rather than through direct ownership by a hub object.
Advantages: New component types can be created without modifying the GameObject class, and a game object can have an arbitrary number of instances of each component type.
Disadvantages: More difficult to implement. Requires mechanisms like a factory pattern or data-driven models for component instantiation. Inter-component communication can be challenging without a central hub.
In a property-centric design, a game object is represented only by a unique ID (e.g., an integer, hashed string ID, or string). The object's properties are distributed across multiple data tables, one for each property type, and are keyed by the object's unique ID.
Behavior Implementation: Behaviours are implemented either within the property classes themselves or via script code.
Property Classes: Each property type can be a class (e.g., a "Health" property that handles damage and destruction). The overall behaviour of a game object is the aggregation of its properties' behaviours.
Scripting: Property values can be stored as raw data, and script code manages the object's behaviours, including responding to events.
Properties vs. Components: While similar to components in object-centric designs (both use multiple sub-objects for a logical game object), the distinction lies in their roles. Property objects define attributes (e.g., health, visual representation), while components often represent linkages to low-level engine subsystems (e.g., renderer, animation, physics).
Advantages:
Memory Efficiency: More efficient as only active attribute data is stored.
Data-Driven: Easier to define new attributes without recompiling the game.
Cache-Friendly: Data of the same type is stored contiguously in memory (a struct-of-arrays approach), reducing cache misses and improving performance on modern hardware (see the sketch after this list).
Disadvantages:
Relationship Enforcement: Harder to enforce relationships between properties.
Debugging: Can be trickier to debug as all properties of a game object aren't in a single, inspectable unit.
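The sketch referenced above: a hedged, non-engine illustration of a property-centric layout in which game objects are plain IDs and each property type lives in its own contiguous table (struct of arrays).

```cpp
// Hedged sketch of a property-centric object model: objects are IDs, properties are tables.
#include <cstdint>
#include <cstdio>
#include <unordered_map>
#include <vector>

using GameObjectId = std::uint32_t;

// One table per property type; only objects that need a property pay for it.
struct HealthTable
{
    std::unordered_map<GameObjectId, std::size_t> Index;  // ID -> slot
    std::vector<float> Values;                            // contiguous health data

    void Add(GameObjectId Id, float Health)
    {
        Index[Id] = Values.size();
        Values.push_back(Health);
    }

    void DamageAll(float Amount)
    {
        for (float& H : Values)   // tight, cache-friendly loop over one attribute type
        {
            H -= Amount;
        }
    }
};

struct PositionTable
{
    std::unordered_map<GameObjectId, std::size_t> Index;
    std::vector<float> X, Y, Z;   // a second, independent property table
};

int main()
{
    HealthTable Healths;
    Healths.Add(/*Id=*/1, 100.f);
    Healths.Add(/*Id=*/7, 50.f);   // object 7 has health but may have no position entry
    Healths.DamageAll(10.f);
    std::printf("Object 1 health: %f\n", Healths.Values[Healths.Index[1]]);
    return 0;
}
```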
Regardless of the architectural style, a runtime object model typically provides the following capabilities:
Dynamic Spawning and Destruction: Managing the creation and removal of game objects and their associated resources during gameplay.
Linkage to Low-Level Engine Systems: Ensuring game objects can access services from rendering, animation, physics, audio, and other engine systems.
Real-time Simulation of Object Behaviours: Updating the states of all game objects over time, often in a specific order due to interdependencies.
Ability to Define New Game Object Types: Flexibility to add new object types, ideally in a data-driven manner.
Unique Object IDs: Providing unique identifiers (e.g., hashed string IDs) for finding and distinguishing objects at runtime.
Game Object Queries: Mechanisms to find objects by ID, type, or arbitrary criteria (e.g., proximity-based queries).
Game Object References: Ways to hold references to objects (e.g., pointers, smart pointers, handles) that are robust to memory relocation and object deletion.
Finite State Machine (FSM) Support: Often used to model object behaviours.
Saving and Loading / Object Persistence: Ability to save and reload game object states to/from disk, often requiring runtime type identification (RTTI), reflection, and abstract construction.
The Role of the Game-Specific Subsystems Layer in Unreal Engine’s Architecture
This layer provides the specialized gameplay subsystems, base modules, and extensible classes that game developers rely on to implement core mechanics. Unlike the lower Gameplay Foundations Layer, which establishes world management and object lifecycles, this layer exposes ready-to-use gameplay frameworks (movement, AI, camera, abilities, etc.) that can be customized and combined to build entirely different genres of games.
It is still part of Unreal Engine itself, but designed for developers to subclass, extend, and plug into the higher-level Game Layer.
Primary Runtime Modules: Engine, AIModule, NavigationSystem, GameplayAbilities, GameplayTasks, GameplayTags, Water, Landscape, and related subsystems.
Character / ACharacter: A ready-made pawn class with walking, running, crouching, swimming, and jumping logic.
CharacterMovementComponent: Encapsulates physics-based movement, path following, and network prediction.
Pawn / APawn: Base controllable entity for implementing custom movement mechanics (e.g., vehicles, creatures).
CameraComponent: General-purpose view control; forms the base for FPS or TPS.
SpringArmComponent: Provides smooth, collision-aware offsets for third-person or over-the-shoulder cameras.
Animation Blueprints: Provide state machines for controlling character animations (idle, walk, run, jump).
Blend Spaces: Smoothly interpolate between animations (e.g., walk ↔ run ↔ sprint).
IK Systems: Extend motion realism (e.g., foot placement).
AnimationMontages: High-level tools for timed gameplay-driven animation (attacks, reloads, cutscenes).
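A hedged sketch of a third-person setup built from these pieces; AMyThirdPersonCharacter, SpringArm, and Camera are hypothetical names assumed to be declared in the class header, while the component types and their properties are engine-provided.

```cpp
// Hedged sketch: composing a third-person character from engine building blocks.
#include "GameFramework/Character.h"
#include "GameFramework/SpringArmComponent.h"
#include "GameFramework/CharacterMovementComponent.h"
#include "Camera/CameraComponent.h"

AMyThirdPersonCharacter::AMyThirdPersonCharacter()
{
    // Collision-aware boom keeps the camera from clipping through walls.
    SpringArm = CreateDefaultSubobject<USpringArmComponent>(TEXT("SpringArm"));
    SpringArm->SetupAttachment(RootComponent);
    SpringArm->TargetArmLength = 300.f;        // over-the-shoulder distance
    SpringArm->bUsePawnControlRotation = true; // follow controller rotation

    Camera = CreateDefaultSubobject<UCameraComponent>(TEXT("FollowCamera"));
    Camera->SetupAttachment(SpringArm);

    // Movement behaviour (walk speed, jumping, etc.) comes from the inherited
    // UCharacterMovementComponent and can be tuned here.
    GetCharacterMovement()->MaxWalkSpeed = 450.f;
}
```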
Although the Gameplay Ability System (GAS) is a plugin, its widespread use and versatility make it worth covering here.
Modules: GameplayAbilities, GameplayTags, GameplayTasks.
Key Components:
UAbilitySystemComponent: Manages abilities and effects for actors.
UGameplayAbility: Defines actions (spells, attacks, powers).
UGameplayEffect: Handles buffs, debuffs, damage, and timed effects.
GameplayTags: Provides flexible, data-driven state labeling and queries.
Strengths: Highly modular, network-ready, prediction-aware, and ideal for RPGs, shooters, MOBAs, and other ability-driven games.
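A small, hedged sketch of granting and activating an ability with GAS (AMyHero, AbilitySystemComponent, and FireballAbilityClass are hypothetical members; the GAS calls come from the GameplayAbilities plugin):

```cpp
// Hedged sketch: grant an ability to an actor's ability system component, then activate it.
#include "AbilitySystemComponent.h"
#include "Abilities/GameplayAbility.h"

void AMyHero::GrantAndCastFireball()
{
    if (AbilitySystemComponent && FireballAbilityClass)
    {
        // Grant the ability (typically done once, with authority on the server).
        FGameplayAbilitySpec Spec(FireballAbilityClass, /*Level=*/1);
        AbilitySystemComponent->GiveAbility(Spec);

        // Attempt activation; costs, cooldowns, and triggers are evaluated by the
        // ability itself and any applied GameplayEffects.
        AbilitySystemComponent->TryActivateAbilityByClass(FireballAbilityClass);
    }
}
```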
Modules: AIModule, NavigationSystem.
Core Components:
Blackboard & Behavior Trees: Goal-oriented decision-making systems.
Perception System: Sight traces, hearing, and stimulus-based AI sensing.
Pathfinding: Navigation meshes (NavMesh) with pathfinding algorithms (A* under the hood).
AIController: Controls pawns with AI logic.
Environment Query System (EQS): Advanced decision-making queries for tactical AI.
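A hedged sketch of kicking off AI decision-making when a controller takes over a pawn; AMyAIController and BehaviorTreeAsset are hypothetical names, while RunBehaviorTree is the engine call.

```cpp
// Hedged sketch: an AI controller starts a behavior tree when it possesses a pawn.
#include "AIController.h"
#include "BehaviorTree/BehaviorTree.h"

void AMyAIController::OnPossess(APawn* InPawn)
{
    Super::OnPossess(InPawn);

    if (BehaviorTreeAsset)   // hypothetical UPROPERTY pointing at a UBehaviorTree asset
    {
        // Starts the Blackboard + Behavior Tree pair driving this pawn's decisions.
        RunBehaviorTree(BehaviorTreeAsset);
    }
}
```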
Landscape Module: Terrain rendering, LOD, and sculpted world features.
Water Plugin: Simulation and rendering of oceans, lakes, rivers with buoyancy.
Effects Integration: Particle systems (Niagara) and decals tied to gameplay events (footsteps, bullet impacts).
The Game-Specific Subsystems Layer is a library of extensible systems inside the engine, not unique to a game. Developers subclass and configure these systems to build their own mechanics. For example:
Using CharacterMovement + CameraComponent creates an FPS or TPS.
Using GAS builds ability-driven RPGs, shooters, or action games.
Using Blackboard + Behavior Trees enables tactical AI opponents.
Using Landscape and Water creates open-world environments.
Together, these subsystems form the developer-facing gameplay toolkit that empowers game teams to implement varied genres on top of Unreal Engine’s foundations.
The Role of the Game Layer (Game Code) in Unreal Engine’s Architecture
This final "layer" is unique as it's not part of the Unreal Engine itself, but rather the custom application built on top of the engine's extensive framework. It represents the unique rules, logic, content, and experiences that define a specific game. Architecturally, this is where all the engine's foundational, rendering, physics, animation, and gameplay systems converge to create a cohesive playable product.
It lives under the Game folder and contains whatever modules game developers have added to their game.
The Gameplay Framework in Unreal Engine provides multiple classes and components to serve as building blocks for your projects. Here is a simplified breakdown of the concepts:
Actors: These are the fundamental building blocks of a level, acting as a container for other components. Think of them as any object you can place in the world, like a car or a tree.
Pawns & Characters: A Pawn is an Actor that can be controlled by a player or AI. A Character is a specialized Pawn that can walk and is often used for player-controlled figures.
Controllers: These are non-physical Actors that control the actions of a Pawn. A Player Controller is for human players, while an AI Controller is for non-player characters. A Controller can possess a Pawn to take control of it.
GameMode: This class defines the rules of your game, such as the scoring system or how players join. It only exists on the server, not on the client's machine.
Cameras: These define the player's viewpoint, showing them the game world.
User Interfaces (UI) & HUDs: These display information to the player and allow for interaction, like showing health bars or a main menu.
Gameplay Timers: These are used to schedule events to happen after a specific delay or at regular intervals, such as a countdown timer (see the sketch after this list).
Modular Gameplay: This refers to a development approach using plugins to create separate, self-contained features. This keeps your game's code organized and prevents features from interfering with one another.
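The timer sketch referenced above, assuming a hypothetical ASpawner actor with an FTimerHandle member and a SpawnWave method:

```cpp
// Hedged sketch: schedule a member function to run on a repeating interval.
#include "GameFramework/Actor.h"
#include "TimerManager.h"

void ASpawner::BeginPlay()
{
    Super::BeginPlay();

    // Call SpawnWave every 5 seconds, starting 2 seconds from now.
    GetWorldTimerManager().SetTimer(
        SpawnTimerHandle,            // FTimerHandle member, used later to pause or clear
        this, &ASpawner::SpawnWave,
        /*InRate=*/5.0f,
        /*bLoop=*/true,
        /*InFirstDelay=*/2.0f);
}

void ASpawner::SpawnWave()
{
    // Game-specific spawning logic would go here.
}
```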
These core gameplay classes relate to each other as follows: a game is made up of a GameMode and GameState. Human players joining the game are associated with PlayerControllers. These PlayerControllers allow players to possess Pawns in the game so they can have physical representations in the level. PlayerControllers also give players input controls, a heads-up display (HUD), and a PlayerCameraManager for handling camera views.
The Editor Architecture
Unreal Engine 5 provides a combination of tools, editors, and systems you can use to create your game or application. This section is based on the official Unreal Engine documentation [3] and uses the following terms:
A tool is something you use to perform a specific task, like placing Actors inside a level, or painting terrain.
An editor is a collection of tools you use to achieve something more complex. For example, the Level Editor enables you to build your game's levels, or you can change the look and feel of Materials inside the Materials Editor.
A system is a large collection of features that work together to produce some aspect of the game or application. For example, Blueprint is a system used to visually script gameplay elements.
Different Editors:
Level Editor
The Level Editor is the primary editor where you construct your gameplay levels. This is where you define the play space for your game by adding different types of Actors and Geometry, Blueprints Visual Scripting, Niagara, and so on. By default, when you create or open a project, Unreal Engine 5 will open the Level Editor.
Static Mesh Editor
You can use the Static Mesh Editor to preview the look, collision, and UV mapping, as well as set and manipulate the properties of Static Meshes. Inside the Static Mesh Editor, you can also set up LODs (or Level of Detail settings) for your Static Mesh assets to control how simple or detailed they appear based on how and where your game is running.
Material Editor
The Material Editor is where you create and edit materials. Materials are assets that can be applied to a mesh to control its visual look. For example, you can create a dirt Material and apply it to floors in your level to create a surface that looks like it is covered in dirt.
Blueprint Editor
The Blueprint Editor is where you can work with and modify Blueprints. These are special Assets that you can use to create gameplay elements (for example, controlling an Actor or scripting an event), modify Materials, or implement other Unreal Engine features without the need to write any C++ code.
Physics Asset Editor
You can use the Physics Asset Editor to create Physics Assets for use with Skeletal Meshes. In practice, this is how you implement physics features like deformations and collisions. You can start from nothing and build to a full ragdoll setup, or use the automation tools to create a basic set of Physics Bodies and Physics Constraints.
Behavior Tree Editor / AI Behavior
The Behavior Tree Editor is where you can script Artificial Intelligence (AI) through a visual node-based system (similar to Blueprints) for Actors in your levels. You can create any number of different behaviors for enemies, non-player characters (NPCs), vehicles, and so on.
Niagara Editor / Particle Effects
The Niagara Editor is for creating special effects by leveraging a fully modular particle effects system composed of separate particle emitters for each effect. Emitters can be saved in the Content Browser for future use, and serve as the basis of new emitters in your current or future projects.
UMG UI Editor
The UMG (Unreal Motion Graphics) UI Editor is a visual UI authoring tool that you can use to create UI elements, such as in-game heads-up displays, menus, or other interface-related graphics.
Font Editor
Use the Font Editor to add, organize and preview Font Assets. You can also define font parameters, such as Font Asset layout and hinting policies (font hinting is a mathematical method that ensures your text will be readable at any display size).
Sequencer Editor
The Sequencer Editor gives you the ability to create in-game cinematics with a specialized multi-track editor. By creating Level Sequences and adding Tracks, you can define the makeup of each Track, which will determine the content for the scene. Tracks can consist of things like Animations (for animating a character), Transformations (moving things around in the scene), Audio (for including music or sound effects), and so on.
Animation Editor
The Animation Editor within Unreal Engine 5 is used for editing Skeleton Assets, Skeletal Meshes, Animation Blueprints, and various other animation assets.
Control Rig Editor
Control Rig is a suite of animation tools for you to rig and animate characters directly in-engine. Using Control Rig, you can bypass the need to rig and animate in external tools, and instead animate in Unreal Editor directly. With this system, you can create and rig custom controls on a character, animate in Sequencer, and use a variety of other animation tools to aid your animating process.
Sound Cue Editor
The behavior of audio playback in Unreal Engine 5 is defined within Sound Cues, which can be edited using the Sound Cue Editor. Inside this editor, you can combine and mix several sound assets to produce a single mixed output saved as a Sound Cue.
Media Editor
Use the Media Editor to define media files or URLs to use as source media for playback inside Unreal Engine 5.
You can define settings for how your source media will play back, such as auto-play, play rate, and looping, but you can't edit media directly.
nDisplay 3D Config Editor
nDisplay renders your Unreal Engine scene on multiple synchronized display devices, such as powerwalls, domes, and curved screens. With the nDisplay Configuration Editor, you can create your nDisplay setup and visualize how content will be rendered across all the display devices.
DMX Library Editor
DMX (Digital Multiplex) is a standard for digital communication used throughout the live-events industry to control various devices, such as lighting fixtures, lasers, smoke machines, mechanical devices, and electronic billboards. In the DMX Library Editor, you can customize these devices and their commands.
The Editor Architecture in Source Code
The Editor itself is modular, just like the Runtime, but with its own separate module tree.
All editors and tools are implemented as Editor modules under Engine/Source/Editor/.
Each tool and editor listed above (Level Editor, Blueprint Editor, Material Editor, etc.) is its own module or part of a suite of modules.
Example:
Engine/Source/Editor/LevelEditor/ → Level Editor.
Engine/Source/Editor/BlueprintEditor/ → Blueprint Editor.
Engine/Source/Editor/MaterialEditor/ → Material Editor.
The Runtime has a clear layered hierarchy (Core → Engine → Renderer → Gameplay Framework → Game), but the Editor does not follow the same strict layering; instead, it is organized by tool modules that depend on runtime modules.
For example:
The Material Editor depends on Runtime/Engine (materials, shaders).
The Animation Editor depends on Runtime/Engine and AnimGraphRuntime.
The Behavior Tree Editor depends on AIModule.
So instead of a vertical hierarchy, the Editor is more of a tool ecosystem sitting on top of the Runtime.
Properties marked with UPROPERTY() macros in C++ are exposed to the Editor by the Reflection System.
Under the hood:
UHT (Unreal Header Tool) parses these macros and generates metadata (in .generated.h files).
The Details Panel in the Editor queries this metadata to build editable UI widgets (sliders, checkboxes, etc.).
This is why adding UPROPERTY(EditAnywhere) makes a field editable in the Editor without writing custom editor code.
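A tiny, hedged example (AMyLamp and its fields are hypothetical): the UPROPERTY specifiers below are all that is needed for these fields to appear as editable widgets in the Details Panel.

```cpp
// Hedged sketch: reflection metadata generated by UHT drives the Details Panel UI.
#include "GameFramework/Actor.h"
#include "MyLamp.generated.h"

UCLASS()
class AMyLamp : public AActor
{
    GENERATED_BODY()

public:
    // EditAnywhere + Category determine how the Details Panel builds the widget;
    // no editor-side code is written by hand for this.
    UPROPERTY(EditAnywhere, Category = "Light")
    float Intensity = 5000.f;

    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Light")
    FLinearColor Color = FLinearColor::White;
};
```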
Slate / SlateCore: Low-level UI framework used by all editors and tools.
UnrealEd: The core editor module that provides shared editor infrastructure (menus, asset browser, undo/redo, property editors).
EditorFramework: Provides base classes for asset editors and editor modes.
AssetTools: System for asset creation/import/export, used by all specialized editors.
As noted above, all editor-only code lives under Engine/Source/Editor/.
The build system (UnrealBuildTool, UBT) treats everything in Source/Editor/ as editor-only.
By convention:
Source/Runtime/… = included in packaged game.
Source/Editor/… = stripped from packaged game.
Source/Developer/… = usually editor/dev-only helpers (also excluded from shipping).
Key Editor Modules in Unreal Engine
UnrealEd – The heart of the editor. Provides base classes for asset editing, Play-in-Editor (PIE), undo/redo, property editing, and general editor services. Almost all editor tools depend on this module.
EditorFramework – Provides shared functionality for building asset editors and editor modes (e.g., selection, gizmos).
PropertyEditor – Generates details panels and property customization UI based on UObject reflection metadata.
AssetTools – Asset creation, import/export pipelines, asset type actions.
ContentBrowser – The Content Browser tool for asset discovery, organization, and editing.
LevelEditor – The main viewport editor for constructing levels, placing actors, and working with worlds.
EditorModes – Supports different level editing modes (geometry editing, foliage painting, landscape sculpting).
FoliageEdit – Tools for placing and editing foliage assets.
LandscapeEditor – Terrain sculpting, painting, and layer blending for landscapes.
MaterialEditor – Node-based editor for creating and previewing materials.
StaticMeshEditor – Inspect, edit, and configure static meshes (collision, LODs, UVs).
SkeletalMeshEditor / AnimationEditor – Tools for editing skeletons, meshes, animations.
ControlRigEditor – Rigging and animation authoring inside UE.
PhysicsAssetEditor – Ragdoll and physics setup for skeletal meshes.
SoundCueEditor – Combines multiple sounds into playable sound cues.
NiagaraEditor – Fully modular particle effects editor.
FontEditor – Asset-specific editor for fonts.
MediaEditor – Preview and configure media assets.
BlueprintEditor – Visual scripting editor for Blueprints.
Kismet – Legacy visual scripting tools, still used in some editor workflows.
BehaviorTreeEditor – Editor for AI behavior trees.
EnvironmentQueryEditor – Editor for EQS (Environment Query System).
Sequencer – Timeline-based editor for cinematics.
MovieSceneTools – Shared editor tools for tracks and clips used by Sequencer.
UMGEditor – WYSIWYG editor for UMG widgets and HUDs.
CurveEditor – General-purpose curve editing UI, reused across many tools (material curves, animation curves, etc.).
ViewportInteraction – Framework for viewport gizmos, snapping, and transform widgets.
DMXEditor – Tools for DMX (live event/lighting) integration.
nDisplayEditor – Multi-screen, cluster rendering configuration.
ChaosEditor – Authoring and preview tools for Chaos destruction and physics assets.
MeshPaint – Editor mode for vertex painting and texture painting.
Unlike the Runtime, which is structured into vertical layers (Core → Engine → Renderer → Gameplay), the Editor is structured into modules by domain.
UnrealEd + EditorFramework + PropertyEditor form the “core” upon which all other editors are built.
Each specialized editor (MaterialEditor, BlueprintEditor, NiagaraEditor, etc.) is its own module under Source/Editor/.
Editor modules depend heavily on runtime modules (Engine, CoreUObject, Slate, Renderer, AIModule, etc.), but runtime never depends on Editor.
Resources:
[1] Gregory, Jason. Game Engine Architecture. AK Peters/CRC Press, 2018.
[2] islamhaqq/UnrealEngineDeepDive – A deep dive into Unreal Engine's architecture (GitHub repository).
[3] Unreal Engine 5.1 Documentation, Epic Developer Community.