Beyond the Games: Video Streaming in Unreal Engine

When Epic Games launched Unreal and its underlying game engine in 1998, their goal was simple enough: create better games. As they made their tools available to others, however — first through direct licensing agreements, then via the Unreal Developer Kit in 2009, and eventually through Unreal Engine in 2014 — it became clear that there was demand for tools that could create photo-realistic video content extending beyond the gaming world. Over the past few years, that demand has only grown. We're now seeing video and video streaming built on Unreal Engine in industries ranging from television to automotive.

Film/Television

Given how realistic video games have become, it shouldn't be at all surprising to see other forms of video take advantage of the same tools for creating striking special effects with real-time rendering and on-the-fly editing. Star Wars fans likely know that The Mandalorian paired Unreal Engine with Industrial Light & Magic's 270-degree LED wall soundstage ("The Volume") for the backdrops of its stunning shots, allowing the production to control every aspect of the environment for ideal shooting and acting conditions without the expense of on-location work.

Less widely known is that the BBC also used Unreal Engine for its coverage of the 2022 Winter Olympics (as well as other news stories), creating several virtual "sets" to transport its hosts from a green screen room to fanciful winter-themed locales.

Perhaps the splashiest example comes from the live finale of the singing show The Voice, where Coldplay and BTS performed together on stage while in different parts of the world. While Coldplay performed live on the studio stage, BTS appeared alongside them as holograms in a synchronized and dynamic demonstration of video streaming technology at work.

Vehicle Prototyping, Design, Customization, User Interfaces, and Testing

Cars may not be the first things that come to mind when you think of video streaming, especially in connection with a video game engine, but manufacturers are starting to see the benefits of real-time rendering for in-car graphics, as well as the ability to stream content to — and from — vehicles. Volvo recently announced that it will use Unreal Engine in its cars to power rendering on its video-based user interfaces. Likewise, other premium car manufacturers, such as Audi and Pagani, employ Unreal Engine to let users design custom configurations of their vehicles. Finally, Porsche builds real-world simulations in the game engine to test how its driving technology will perform outside the R&D walls.

Fashion and Shopping

If you were to imagine the least likely shoe brands to end up in the metaverse, Timberland would probably rank high on that list. Known as a rugged, outdoorsy brand, it’s hard to imagine them hiking their way through a virtual landscape. But that’s exactly what happened when they paired up with Epic Games and CONCEPTKICKS: a reimagining of the classic footwear for a virtual lifestyle with an interactive display in Fortnite.

This is not fashion’s first foray into gaming, of course, but we are seeing more and more immersive and engaging examples. One of the more well-known is the Balenciaga Fall 2021 fashion lineup, which was presented via a video game called Afterworld: Age of Tomorrow. As players delved into the story, they were surrounded by examples of the fashion icon’s latest line in a realistic virtual environment.

Indeed, fashion is the perfect fit for virtual environments, allowing users to see and even virtually try on the clothes before purchasing. We may not have to wait long for this to be ubiquitous, either. Bods is a platform that invites users to create realistic avatars called “bods” and see what clothes actually look like on their body type. If retailers buy in, it could minimize returns for online purchases, while opening up more shopping opportunities, especially for those who are homebound. Better still, there’s room to move beyond virtual mirrors to streaming video content that allows users to see how the clothes would look and respond in various environments from rock climbing to a night out at the club.

Science

The photo-realistic nature of Unreal Engine opens a number of possibilities for scientific endeavors, including simulations. NASA co-hosted a competition with Epic Games that invited scientists and creatives to design virtual reality environments and scenarios reflecting the experiences voyagers might face on Mars. But we don't need to look to space for examples. Unreal Engine has been used for everything from realistic human anatomy renderings to an entirely virtual biology lab. These types of simulations offer immersive testing sandboxes and, when paired with a robust video streaming platform like Red5 Pro, powerful remote learning opportunities.


As Red5 Pro focuses on providing businesses with ultra-low latency video streaming at the speed of thought, supporting tools like Unreal Engine is important to us as well. This capability is coming soon, and we can't wait to see what you make. Drop us a note at info@red5.net for early access! And if we can support you in bringing your ideas to life, feel free to reach out and set up a call to discuss the possibilities.