Aligning Unreal Engine Live Streaming for Real-Time Use Cases

Cutting-edge tools used in extended reality (XR) and video game development will soon become far more effective at enabling creators to speed production and delivery of the realistic user experiences markets have been clamoring for.

Groundbreaking innovations in Epic Games’ Unreal Engine live streaming production tools have made photorealism the new benchmark in the development of animated applications. Notably, advances like real-time ray tracing, densification of micro-polygon geometries, streamlined aggregation of visual effects (VFX), batch rendering of multiple camera inputs, and accelerated encoding bring stunning levels of realism to user experiences.

The Real-Time Infrastructure Imperative

But to fully exploit the power of these innovations, developers need to be able to employ Unreal Engine live streaming tools in workflows running on real-time interactive streaming infrastructure. And they must be able to do so quickly, without the laborious task of integrating Unreal Engine tools with an infrastructure platform suited to their needs.

There are many reasons why ready access to real-time streaming infrastructure is essential to the development process, starting with the fact that game and XR application developers, like workers everywhere in the Covid era, must be able to collaborate remotely. They require synchronized, high-resolution visibility into the elements they're working on, along with the ability to see each other as they work. Moreover, all participants in the real-time workflow must have delay-free access to cloud-based tools.

These requirements not only rule out reliance on traditional high-latency, unidirectional streaming infrastructure. They’re also beyond the capabilities of traditional video conferencing systems, which, as explained in this blog, place unacceptable limitations on scalability, video quality, and other aspects of user experience.

Beyond the real-time collaboration requirements, developers must be able to design their productions for interoperability with real-time streaming infrastructure. The commercial success of live-action XR and gaming applications depends on infrastructure that can stream content in any direction among any number of users at any distance with end-to-end latencies below the threshold of perceptible delay.

The New Tie-In Between Unreal Engine and Real-Time Infrastructure

Developers will soon have ready access to these capabilities. Red5 Pro is developing a plugin that will allow Unreal Engine users to easily implement their workflows on its multi-cloud Experience Delivery Network (XDN) platform.

As explained in this white paper and underscored by use cases in operation worldwide, XDN architecture meets all real-time streaming requirements with the scalability of simultaneous connectivity to millions of users at global distances. Streams running on XDN infrastructure reach users at end-to-end latencies in the 200ms-400ms range, and latencies dip well below that level when XDN edge points are positioned close to end users.

These latest Unreal Engine innovations make applications in the XR domain, including virtual, augmented, and mixed reality (VR, AR, MR), far more compelling to end users. In all cases, these advances rely on the accelerated processing power of device hardware to execute operations in real time.

For example, one of the biggest contributors to more realistic user experiences is ray tracing, a rendering technique NVIDIA first made available for real-time game rendering with the release of its GeForce RTX GPUs in 2018. “Never have you had the ability to do things like shadows and reflections like this in real time,” noted NVIDIA developer Richard Cowgill in a blog describing Unreal Engine’s adoption of the technique.

As supported by Unreal Engine, ray tracing estimates the spatial dispersion of light rays hitting image surfaces to simulate realistic light and shadow effects that aren’t captured by cameras or, in the case of animated scenes, aren’t normally factored into the creation process. Real-time execution allows scene lighting to adapt on the fly to things like the sun’s changing angle, someone turning on a flashlight, or a door opening to an adjacent room.
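
To make the concept concrete, here is a minimal sketch, in TypeScript, of the shadow-ray test at the heart of ray-traced lighting. It is a generic illustration of the technique rather than Unreal Engine’s renderer; the single-sphere scene and every name in it are hypothetical.

```typescript
// Minimal illustration of the shadow-ray test used in ray-traced lighting.
// Generic sketch only: the single-sphere scene and all names are hypothetical.

type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;
const len = (a: Vec3): number => Math.sqrt(dot(a, a));
const normalize = (a: Vec3): Vec3 => {
  const l = len(a);
  return { x: a.x / l, y: a.y / l, z: a.z / l };
};

// A sphere standing in for arbitrary occluding scene geometry.
interface Sphere { center: Vec3; radius: number; }

// True if a ray from `origin` along unit vector `dir` hits the sphere before `maxDist`.
function intersectsSphere(origin: Vec3, dir: Vec3, s: Sphere, maxDist: number): boolean {
  const oc = sub(origin, s.center);
  const b = 2 * dot(oc, dir);
  const c = dot(oc, oc) - s.radius * s.radius;
  const disc = b * b - 4 * c;
  if (disc < 0) return false;
  const t = (-b - Math.sqrt(disc)) / 2;
  return t > 1e-4 && t < maxDist; // epsilon avoids self-shadowing artifacts
}

// Shade a surface point: cast a shadow ray toward the light; if anything blocks it,
// the point is in shadow. Because the light position is an input, moving the light
// (a swinging sun, a flashlight switching on) changes the result frame by frame.
function directLight(point: Vec3, normal: Vec3, light: Vec3, occluder: Sphere): number {
  const toLight = sub(light, point);
  const dist = len(toLight);
  const dir = normalize(toLight);
  if (intersectsSphere(point, dir, occluder, dist)) return 0; // fully in shadow
  return Math.max(0, dot(normal, dir));                       // simple Lambertian term
}

// Example: the same surface point lit from two different light positions.
const blocker: Sphere = { center: { x: 0, y: 1, z: 0 }, radius: 0.5 };
const p: Vec3 = { x: 0, y: 0, z: 0 };
const n: Vec3 = { x: 0, y: 1, z: 0 };
console.log(directLight(p, n, { x: 0, y: 3, z: 0 }, blocker)); // 0: the sphere casts a shadow
console.log(directLight(p, n, { x: 3, y: 3, z: 0 }, blocker)); // > 0: light reaches the point
```

The epsilon in the intersection test keeps a surface from shadowing itself, a standard guard in any ray tracer.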

Another major recent advance in image realism is Unreal Engine’s Nanite micro-polygon geometry tool, which eliminates the need to create multiple versions of 3D models by breaking geometry into millions or billions of scalable triangles. Polygons used in frame-by-frame real-time renderings can be instantly retrieved from storage in the solid-state drives (SSDs) that populate much of today’s gaming and XR hardware.
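
The sketch below illustrates the general idea behind that kind of scalable geometry: choosing a level of detail so that triangles project to roughly one pixel each on screen. It is a simplified illustration under assumed constants (a 4K display and a 60-degree vertical field of view), not Nanite’s actual clustering algorithm, and all names and numbers in it are hypothetical.

```typescript
// Rough sketch of scalable-triangle LOD selection: pick a detail level so that
// triangles project to roughly one pixel each. Illustration only; this is not
// Nanite's algorithm, and all constants and names are hypothetical.

interface MeshLod {
  triangleCount: number; // triangles stored for this level (streamed from SSD in a real system)
}

// Approximate on-screen pixel area covered by an object of radius `r` metres
// seen from `d` metres away, for an assumed 4K display and 60-degree vertical FOV.
function projectedPixelArea(r: number, d: number, verticalPixels = 2160, fovY = Math.PI / 3): number {
  const pixelsPerMetre = verticalPixels / (2 * d * Math.tan(fovY / 2));
  const radiusPx = r * pixelsPerMetre;
  return Math.PI * radiusPx * radiusPx;
}

// Choose the coarsest level that still provides roughly one triangle per covered pixel.
function selectLod(lods: MeshLod[], r: number, d: number): MeshLod {
  const targetTriangles = projectedPixelArea(r, d);
  for (const lod of lods) {              // lods are ordered coarsest to finest
    if (lod.triangleCount >= targetTriangles) return lod;
  }
  return lods[lods.length - 1];          // object fills the screen: use the finest level
}

// Example: a 1 m statue with four detail levels, viewed from 50 m and from 2 m.
const lods: MeshLod[] = [
  { triangleCount: 2_000 },
  { triangleCount: 50_000 },
  { triangleCount: 1_000_000 },
  { triangleCount: 20_000_000 },
];
console.log(selectLod(lods, 0.5, 50).triangleCount); // far away: a few thousand triangles suffice
console.log(selectLod(lods, 0.5, 2).triangleCount);  // close up: a million-triangle version is justified
```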

Pre-integrations between the Unreal Engine toolset and the functions supported by XDN architecture will make it easier to incorporate these innovations into the design of live real-time applications. Furthermore, there are Unreal Engine enhancements that directly rely on real-time streaming in the development process.

For example, Epic has made it possible for remote team members, connected via real-time infrastructure, to collaborate simultaneously on the same region of a virtual world without interfering with each other. Working on their own replica of the same segment, streamed from a grid automatically created by the engine’s new World Partition function, they can see what everyone else is doing as they work toward consensus on an outcome.

More broadly, real-time collaboration workflows extend to making cloud-based instances of development tools available for simultaneous use in remotely staffed development projects. Real-time connectivity to tool components running as virtual machines or containers in datacenters avoids the costs of equipping every remote workstation with a full toolset and frees team members to participate from any location using personal devices.

Hardware Integration Strengthens XDN Foundation for Application Development

Red5 Pro’s progress toward creating plugins that tie next-gen creative tools to XDN architecture parallels similar efforts toward tight integration with the hardware that’s essential to processing the complex functions initiated by those tools. These hardware integrations provide a direct path to reducing processing latencies in use cases that depend on real-time streaming.

For example, XDN pre-integrations with chipsets produced by AMD (formerly Xilinx) provide XDN users with ready access to hardware acceleration that speeds transcoding to levels suited to real-time streaming. Creating an environment where XDN users can bring together pre-integrated processors with pre-integrated Unreal Engine tools is an obvious and major next step toward realizing the Metaverse vision of seamless flow between real and virtual worlds.

A project demonstrating advances in holographic application development undertaken by Epic and Microsoft in 2019 provides a hint of what’s in store as these integrations come together to create the foundation for a new realm of possibilities. The purpose was to demonstrate a shared holographic experience for people wearing Microsoft’s HoloLens 2, the latest version of the company’s MR eyewear.

In the demo, John Knoll, Executive Creative Director at Industrial Light & Magic, and Andrew Chaikin, space historian and author of A Man on the Moon, told the story of the Apollo 11 mission. They used various holographic images to examine the technical details of the Saturn V rocket stages and the moon lander and to stream holographic motion-picture renderings of the initial launch, landing maneuvers, views of the approaching moon surface, and the first steps taken by Neil Armstrong.

In essence, the project showed how Unreal Engine’s advanced rendering techniques, employed over real-time networking, enabled tightly synchronized shared experiences with holograms manipulated by hand motions in free space. A wirelessly connected PC, powerful enough to execute Unreal Engine’s processing innovations on images comprising 15 million polygons, made it possible to project holograms far more detailed and realistic than the usual HoloLens projections rendered with the device’s onboard, smartphone-class compute power.

Commenting on the project in a blog posted by Unreal Engine, Microsoft Cloud and AI group technical fellow Alex Kipman said, “Epic just showed us how to directly stream high-polygon content, with no decimation, to HoloLens. Unreal Engine enables HoloLens 2 to display holograms of infinite detail, far in excess of what is possible with edge compute and rendering alone.”

The demonstration relied on local Wi-Fi connectivity, but real-time connectivity over longer distances, as provided by XDN infrastructure, could serve as well. What that means for developers is that they will be able to collaborate at any distance on MR use cases involving interactions with free-space holograms displayed at dazzling levels of clarity and detail.

More generally, the project showed how end users will one day be able to interact socially in real or virtual scenarios populated with realistic, movable holographic images that appear the same to all participants. Some of the technical details behind the Apollo 11 demo, such as manual camera tuning, will have to be automated before such use cases can become routine, but this is the future that will unfold as real-time connectivity over XDN infrastructure takes hold.

As noted earlier, the XDN platform is built on a cross-cloud architecture that supports tightly synchronized real-time streaming to and from any number of endpoints at any distance at latencies in the 200ms-400ms range. With content ingested at XDN Nodes positioned at deep edge points with distributed cloud compute resources, roundtrip latencies can be reduced to the sub-60ms levels that prevailed in the Apollo 11 demo.

The XDN Architecture

XDN infrastructure is built on automatically orchestrated hierarchies of Origin, Relay, and Edge Nodes operating in one or more cloud clusters. The platform makes use of the Real-Time Transport Protocol (RTP) as the foundation for interactive streaming via WebRTC (Web Real-Time Communication) and the Real-Time Streaming Protocol (RTSP). In most cases, WebRTC is the preferred option for streaming on the XDN platform by virtue of its support in all the major browsers, which eliminates the need for device plugins.
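
As a rough illustration of what browser-based playback involves, the sketch below subscribes to a remote stream using the standard RTCPeerConnection API. The signaling endpoint, its URL, and the plain HTTP offer/answer exchange are placeholders rather than any particular vendor’s interface; a real deployment would use the signaling mechanism provided by the streaming platform.

```typescript
// Minimal browser-side WebRTC playback using the standard RTCPeerConnection API.
// The signaling URL and the plain HTTP offer/answer exchange below are hypothetical
// placeholders; a real deployment would use the platform's own signaling protocol.

async function subscribe(videoElement: HTMLVideoElement, signalingUrl: string): Promise<void> {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
  });

  // Receive-only: no local capture is needed for playback.
  pc.addTransceiver("video", { direction: "recvonly" });
  pc.addTransceiver("audio", { direction: "recvonly" });

  // Attach incoming media tracks to the <video> element as they arrive.
  pc.ontrack = (event) => {
    videoElement.srcObject = event.streams[0];
  };

  // Offer/answer exchange with an edge node via a hypothetical HTTP endpoint.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  const response = await fetch(signalingUrl, {
    method: "POST",
    headers: { "Content-Type": "application/sdp" },
    body: pc.localDescription!.sdp,
  });
  const answerSdp = await response.text();
  await pc.setRemoteDescription({ type: "answer", sdp: answerSdp });
}

// Usage (URL is illustrative only):
// subscribe(document.querySelector("video")!, "https://edge.example.com/subscribe/stream1");
```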

There are also other options for receiving and transmitting video in real time when devices aren’t running any of these browsers. RTSP, often the preferred option when mobile devices are targeted, can be activated through the Red5 Pro iOS and Android SDKs. And video can be ingested onto the XDN platform in other formats as well, including Real-Time Messaging Protocol (RTMP), Secure Reliable Transport (SRT), and MPEG Transport Stream (MPEG-TS). The XDN retains these encapsulations while relying on RTP as the underlying real-time transport mechanism.

The XDN platform also provides full support for the multi-profile transcodes used with adaptive bitrate (ABR) streaming by utilizing intelligent Edge Node interactions with client devices to deliver content in the profiles appropriate to each user. And to ensure ubiquitous connectivity for every XDN use case, the platform supports content delivery in HTTP Live Streaming (HLS) mode as a fallback. In the rare instances where devices can’t be engaged via any of the other XDN-supported protocols, they will still be able to render the streamed content, albeit with the multi-second latencies that typify HTTP-based streaming.
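
To show how such a fallback might be wired up on the client, here is an illustrative sketch that attempts WebRTC playback first and drops back to HLS when real-time delivery isn’t available. The stream URLs are placeholders, the subscribe helper refers to the hypothetical WebRTC sketch above, and hls.js is shown simply as one common way to play HLS in browsers without native support.

```typescript
// Illustrative client-side fallback: try WebRTC first, drop back to HLS otherwise.
// Stream URLs are placeholders; hls.js is one common way to play HLS where the
// browser lacks native support.

import Hls from "hls.js";

// `subscribe` is the hypothetical WebRTC playback helper sketched earlier.
declare function subscribe(video: HTMLVideoElement, url: string): Promise<void>;

async function playWithFallback(video: HTMLVideoElement): Promise<void> {
  const webrtcSignalingUrl = "https://edge.example.com/subscribe/stream1";  // placeholder
  const hlsUrl = "https://edge.example.com/stream1/playlist.m3u8";          // placeholder

  if ("RTCPeerConnection" in window) {
    try {
      await subscribe(video, webrtcSignalingUrl); // real-time, interactive path
      return;
    } catch {
      // fall through to HLS if signaling or ICE negotiation fails
    }
  }

  if (video.canPlayType("application/vnd.apple.mpegurl")) {
    video.src = hlsUrl;                 // Safari and iOS play HLS natively
  } else if (Hls.isSupported()) {
    const hls = new Hls();
    hls.loadSource(hlsUrl);             // multi-second latency, but universal reach
    hls.attachMedia(video);
  }
}
```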

XDN Nodes can be deployed on multiple cloud infrastructure-as-a-service (IaaS) platforms. This can be done by leveraging pre-integrations with major suppliers like AWS, Google Cloud, Microsoft Azure, and DigitalOcean or through integrations with many other IaaS platforms enabled by Red5 Pro’s use of the Terraform multi-cloud toolset.


Developers employing Unreal Engine or any other toolset used in game and XR development have two major priorities: in a work environment transformed by the Covid-19 pandemic, they must be able to collaborate remotely in real time, and they must be able to build interfaces that let them exploit all the benefits of real-time interactive streaming.

This is why making XDN connectivity readily available for collaboration and innovation in the design of live-streamed gaming and XR applications is a top priority at Red5 Pro. To learn more about our progress in this direction and the many ways real-time streaming can be used to enhance realism and other advances in user experience, contact info@red5.net or schedule a call.