Unreal Engine 4.23 released!

Press release · 2019-09-05 08:17 · Software

Ray tracing improvements and Chaos physics.

New Features:

  • New: Chaos - Destruction (Beta): Revealed in a demo at GDC 2019, Chaos is Unreal Engine's new high-performance physics and destruction system available to preview in Beta form with the 4.23 release. With Chaos, users can achieve cinematic-quality visuals in real-time in scenes with massive-scale levels of destruction and unprecedented artist control over content creation.
    • Geometry Collections - These are a new type of asset in Unreal for destructible objects. They can be built from one or more Static Meshes, including those gathered together in Blueprints or even nested Blueprints. Geometry Collections let the artist choose what to simulate and they also offer flexibility in terms of how you organize and author your destruction.
    • Fracturing - Once you have a Geometry Collection, you can break it into pieces using the Fracturing tools. You can fracture each part individually, or apply one pattern across multiple pieces. In addition to standard Voronoi fractures, you can use Radial fractures, Clustered Voronoi fractures, and Planar Cutting using noise to get more natural results.
    • Clustering - With optimization in mind, sub-fracturing lets you control where to add complexity. Each time you sub-fracture, an extra Level is added to the Geometry Collection. The Chaos system tracks each subsequent Level and stores that information in a controllable grouping called a Cluster, so each fracture Level can be combined into its own set of Clusters.
    • Fields - Fields are how you directly interact with and control simulations. A Field can drive any attribute on any part of your Geometry Collection: vary the mass, make a region static, make a corner more breakable than the middle, or apply a force.
    • Cached Simulations - With caching, high fidelity simulations can be pre-cached and played back in real-time resulting in a kinematic Geometry Collection. This means that you can author large scale destruction events and still allow interactivity with the player and environment.
    • Niagara Integration - The Chaos system is a first class citizen of UE4, and as such, lives alongside all other systems that simulate your world including Niagara. Incorporating visual effects into your simulations can add a lot of depth and realism to the world. For example, when a building breaks apart it generates a large amount of dust and smoke. To create the marriage between destruction and VFX, data from the physics system can be sent to Niagara when an object collides or breaks apart, and that data can be used to generate interesting secondary effects.
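Conceptually, a Field is a function over space that is sampled at each piece of a Geometry Collection to drive some attribute. Below is a minimal Python sketch of that idea, assuming a simple radial falloff and an activation threshold; the names and numbers are illustrative, not the Chaos API:

```python
from dataclasses import dataclass
import math

@dataclass
class Piece:
    position: tuple          # world-space centroid of the fracture piece
    mass: float
    dynamic: bool = False    # whether the piece has been released from its kinematic state

def radial_falloff(center, radius, magnitude, position):
    """Field value that is strongest at `center` and falls off linearly to 0 at `radius`."""
    d = math.dist(center, position)
    if d >= radius:
        return 0.0
    return magnitude * (1.0 - d / radius)

def apply_strain_field(pieces, center, radius, magnitude, threshold):
    """Activate (make dynamic) every piece whose sampled field value exceeds `threshold`."""
    for p in pieces:
        if radial_falloff(center, radius, magnitude, p.position) > threshold:
            p.dynamic = True

pieces = [Piece((0, 0, 0), 1.0), Piece((5, 0, 0), 1.0), Piece((20, 0, 0), 1.0)]
apply_strain_field(pieces, center=(0, 0, 0), radius=10.0, magnitude=100.0, threshold=25.0)
print([p.dynamic for p in pieces])  # → [True, True, False]: only pieces inside the falloff wake up
```

The same sampling pattern extends to any attribute mentioned above (mass, static flags, break thresholds, applied forces); only the attribute being written changes.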

 

  • Real-Time Ray Tracing Improvements (Beta) - Ray Tracing support has received many optimizations and stability improvements in addition to several important new features.
    • Performance and Stability - A large focus this release has been on improving stability, performance, and quality of Ray Tracing features in Unreal Engine. This means:
      - Expanded DirectX 12 Support for Unreal Engine as a whole
      - Improved Denoiser quality for Ray Traced Features
      - Increased Ray Traced Global Illumination (RTGI) quality
      - Additional Geometry and Material Support - support for additional geometry and material types, such as: Landscape Terrain (measured on a 2080 Ti in KiteDemo: ~2ms geometry update time and ~500MB video memory); Hierarchical Instanced Static Meshes (HISM) and Instanced Static Meshes (ISM); Procedural Meshes; Transmission with SubSurface Materials; and World Position Offset (WPO) support for Landscape and Skeletal Mesh geometries
    • Multi-Bounce Reflection Fallback - Improved support for multi-bounce Ray Traced Reflections (RTR) by falling back to Reflection Captures in the scene. Intra-reflections (reflections inside of reflections) that would otherwise render black, such as those beyond the max reflection distance, now fall back to these raster techniques instead. This can subtly improve reflection quality without using multiple ray-traced bounces, greatly increasing performance.
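The fallback decision itself is simple to state: if a reflection ray misses or runs out of bounce budget, shade from a pre-baked Reflection Capture rather than returning black. A hedged Python sketch of that decision (the functions are stand-ins for illustration, not UE4's renderer):

```python
def shade_reflection(trace, capture_lookup, origin, direction):
    """Return the ray-traced reflection color if the trace succeeds; otherwise
    fall back to the raster Reflection Capture instead of shading black."""
    result = trace(origin, direction)  # None = miss, out of bounce budget, or past max distance
    if result is None:
        return capture_lookup(origin, direction)
    return result

# Illustrative stubs: a trace that always fails, and a capture returning a sky color.
always_miss = lambda o, d: None
capture = lambda o, d: (0.4, 0.6, 0.9)
print(shade_reflection(always_miss, capture, (0, 0, 0), (0, 0, 1)))  # → (0.4, 0.6, 0.9)
```

The trade-off described above falls out of this structure: the capture is an approximation, but it is far cheaper than tracing additional bounces.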

 

  • New: Virtual Texturing (Beta) - With this release, Virtual Texturing beta support enables you to create and use large textures for a lower and more constant memory footprint at runtime.
    • Streaming Virtual Texturing - Streaming Virtual Texturing uses Virtual Texture assets to offer an alternative way to stream textures from disk compared to existing Mip-based streaming. Traditional Mip-based texture streaming performs offline analysis of Material UV usage, then at runtime decides which Mip levels of a texture to load based on object visibility and camera distance. With Virtual Textures, all Mip levels are instead split into tiles of a small fixed size, and the GPU determines which Virtual Texture tiles are accessed by the visible pixels on screen. Streaming Virtual Texturing can reduce texture memory overhead and increase performance when using very large textures (including Lightmaps and UDIM textures); however, sampling a Virtual Texture is more expensive than sampling a regular texture.
    • Runtime Virtual Texturing - Runtime Virtual Texturing uses a Runtime Virtual Texture asset with a volume placed in the level. It works similarly to traditional texture mapping except that it's rendered on demand using the GPU at runtime. Runtime Virtual Textures can be used to cache shading data over large areas making them a good fit for Landscape shading.
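The tile bookkeeping behind Streaming Virtual Texturing can be sketched in a few lines: every (UV, mip) sample maps to one fixed-size tile, and the streamer only loads the union of tiles that visible pixels actually touched. A simplified Python illustration, with assumed tile and texture sizes rather than UE4's actual values:

```python
TEXTURE_SIZE = 16384   # virtual texture dimensions in texels (assumed square)
TILE_SIZE = 128        # fixed tile size in texels: the streaming granularity

def tile_request(u, v, mip):
    """Map a UV sample at a given mip level to the (mip, tile_x, tile_y) it touches."""
    size = TEXTURE_SIZE >> mip               # texture size halves with each mip level
    x = min(int(u * size), size - 1)
    y = min(int(v * size), size - 1)
    return (mip, x // TILE_SIZE, y // TILE_SIZE)

def tiles_to_stream(samples):
    """Collect the unique set of tiles touched by all visible-pixel samples."""
    return {tile_request(u, v, mip) for (u, v, mip) in samples}

samples = [(0.1, 0.1, 0), (0.1, 0.1, 0), (0.9, 0.2, 2)]
print(sorted(tiles_to_stream(samples)))  # → [(0, 12, 12), (2, 28, 6)]
```

Because many pixels collapse onto the same tile, the requested set stays small even for a 16K texture, which is where the lower, more constant memory footprint comes from.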
  • New: HoloLens 2 Native Support (Beta) - Developers can now begin creating for the HoloLens 2.

 

  • New: Virtual Production Pipeline Improvements - Unreal Engine continues to lead the way with advancements in what is possible in a virtual production pipeline! Virtually scout environments and compose shots, use the virtual world to light the real world, connect live broadcast elements with digital representations to build a seamless experience, and control it all remotely using custom-built interfaces.
    • In-Camera VFX (Beta) - Using improvements for In-Camera VFX, you can achieve final shots live on set that combine real-world actors and props with Unreal Engine environment backgrounds, using an LED wall that can either display an Unreal Engine scene, or a digital greenscreen for real-time compositing in UE4.
    • Camera frustum-based rendering enables real-world actors and props to receive lighting and reflections from the CG environment, and in some cases eliminates post-production workflows, significantly accelerating overall production. Save time by quickly placing greenscreens digitally on an LED wall with a click of a button instead of physically setting them up on stage. The entire solution can scale to LED walls of virtually any size or configuration thanks to nDisplay multi-display technology.
    • VR Scouting for Filmmakers (Beta) - The new VR Scouting tools give filmmakers in virtual production environments new ways to navigate and interact with the virtual world in VR, helping them make better creative decisions.

 

 

More details in the Unreal Engine 4.23 blog entry.

Author: Press release · Editor: Michał Franczak
Tags: unreal engine ue4