How Interactive TV Becomes Metaverse TV

What once was known as interactive TV is morphing into a potpourri of online entertainment experiences. But these experiences can only reach their full potential with support from interactive real-time video infrastructure.

So far, the transition from legacy to over-the-top (OTT) TV services has spawned an outpouring of fairly primitive interactive use cases bound by the limitations of traditional one-way, high-latency streaming technology. The result is applications marred by shortcomings such as:

  • Text-only chat responses to out-of-sync live video streams.
  • Severe scaling limitations when audiences are able to engage via video communications.
  • Less-than-satisfactory attempts to personalize user experiences with hard-to-implement graphic overlays, multi-screen viewing options, and second-screen applications.
  • Highly limited applications of extended reality (XR) technologies in support of immersive engagements with live-streamed content.

Notwithstanding these drawbacks, it stands to reason that media companies and advertisers would be looking at interactivity as fundamental to drawing viewers as they bring traditional TV fare to the internet. After all, they’re trying to appeal to a generation of viewers who have grown up consuming video on platforms like Twitch, YouTube, Snapchat and TikTok, where interactions with each other and their video stars are taken for granted.

But there’s no reason providers of high-value video services should limit themselves to mimicking what people are accustomed to on those platforms when it’s possible to augment the appeal of their content by taking interactivity to another level. A far more compelling picture of what can be done comes into focus when providers choose to rely on network infrastructure that supports multidirectional video streamed across any size audience at any distance in real time.

This is the foundation for ITV in the emerging Metaverse era, which, as discussed in this blog, is witnessing the seamless merger between real-world and video-rich virtualized experiences in multiple realms of internet activity. As providers who are relying on Red5 Pro’s Experience Delivery Network (XDN) infrastructure are demonstrating worldwide, there’s no technological barrier to delivering a Metaverse version of ITV.

The Future Version of Shared Viewing Experiences

For example, consider the contrast between what can be done using real-time interactive video infrastructure and most current approaches to audience engagement with live-streamed sports, esports, concerts, game shows, and other content. Lately, text-based chat, a mainstay in esports, has been cropping up in many other programming categories, and there’s been a flurry of video-based watch party initiatives supported by the likes of Facebook, Amazon, Hulu, HBO Max, and Sling TV.

Communication via video is a big improvement over chat, but most of these watch party platforms are limited to shared viewing of on-demand content, which makes it easier to ensure a group’s chosen video file is streamed at the same time to all participants over one-way CDN infrastructure. Sling TV claims its Sling Watch Party feature is the first to offer video chat with live TV content, but, like the others, it restricts participation in a shared viewing session to just a handful of people.

There also are some live video-chat-enabled watch-party apps now available from various entities working with the NFL, NBA, soccer leagues, and other sports organizations, albeit with similar limits on the number of participants. For example, Yahoo Sports imposed a four-person limit on the watch-party feature it added to its mobile app in conjunction with the start of the 2020 NFL season.

There are no such restrictions when live-streamed content is delivered for social viewing over XDN infrastructure. For example, software platform developer StageConnect is using XDN infrastructure to enable simultaneous real-time reception of live concert streams across mass audiences with support for video-based interactions among any number of viewers. When on-stage personalities entertain discussions with audience members, remote video responses are curated at the venue and highlighted on a display wall, allowing everyone to be a visible part of the event.

This kind of real-time engagement with remote audiences can be brought into play in a variety of scenarios beyond those commonly associated with watch parties. The fact that audience members can participate with an on-screen presence in any type of program is fundamental to the transformations in traditional TV entertainment that will become the norm in the Metaverse era.

Video-Rich Audience Participation in Game Shows

Game shows are a case in point.

It was just over four years ago that HQ Trivia, a twice-daily live-streamed trivia game reminiscent of the old Trivial Pursuit board game, began making waves by drawing millions of people competing for hundreds of thousands of dollars in prize money. HQ Trivia, named Time Magazine’s 2017 App of the Year, crashed and burned in the wake of a series of largely self-imposed problems, but the enthusiasm for such competitions, with or without prize money, has spawned a cottage industry in streamer-generated quiz shows playing on Twitch and elsewhere.

According to Twitch records, the top five of 28 such shows running on the platform registered 5,484 viewership hours over a recent 30-day tracking period. Meanwhile, at the professional level, trivia appears to be alive and well after newcomer BigBrain drew $4.5 million in seed funding to launch a mobile app on iOS offering cash prizes in competitions that run a dozen times daily.

In another noteworthy instance, the BBC augmented its popular Antiques Roadshow series with an interactive game-playing version offered through Twitch, Facebook, and YouTube. Winners competing for the title of Roadie Scholar of the Day earned points for offering the most accurate appraisals of featured items. User shoutouts highlighting rankings on a leaderboard contributed to the game’s success, according to an account written by Eli Stonberg, CEO of Hovercast, which supplied tools used in the production.

Real-time interactive video streaming will go much further in bringing new vitality to the game show paradigm by giving contestants a visible presence in the proceedings. And it will spark new approaches to building viewership through audience participation in other types of linear programming.

Another recent initiative undertaken by the BBC involving its music reality show, The Voice UK, offers a glimpse of what’s in store. As described in a blog by Tom Bowers, executive producer of developer Hypothesis Media, show producers commissioned his firm to create and manage an interactive visual component for the show that would appeal to its large social media following. This led to implementation of a virtual room hosting over 500 participants whose video streams can be selected for on-screen appearances during the live show.

As Bowers notes, enabling virtual audience participation via video “doesn’t have to look like ‘Zoom on TV.’” In this case, Hypothesis partnered with other developers who are using a proprietary platform rather than an open-source one like XDN to support interactive streaming. The platform, touting “sub-second” latencies, approaches but doesn’t consistently achieve real-time speeds and requires use of client plug-ins. But it does reflect intensifying efforts to address the rising demand for video-enhanced interactivity in linear programming.

What’s really needed, though, is what the XDN platform provides, as described in this white paper: a multi-cloud approach to engaging mass audiences in interactive video streaming. Streams are delivered through the most popular browsers, with no plug-in requirements, at latencies no greater than 200-400ms across multiple continents and 50ms or less over intra-regional distances.
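To make the no-plug-in claim concrete, here is a minimal TypeScript sketch of subscribing to a real-time stream in a standard browser via the WebRTC-based Red5 Pro HTML5 SDK. The host, stream name, and configuration values shown are illustrative assumptions rather than a definitive integration recipe.

```typescript
// Minimal sketch: subscribing to a real-time WebRTC stream in the browser.
// Assumes the Red5 Pro HTML5 SDK script is loaded on the page; the host,
// stream name, and configuration keys below are illustrative assumptions.
declare const red5prosdk: any;

async function watchLiveStream(): Promise<void> {
  const subscriber = new red5prosdk.RTCSubscriber();

  await subscriber.init({
    protocol: 'wss',                       // secure WebSocket signaling
    host: 'edge.xdn-example.net',          // hypothetical XDN edge node
    port: 443,
    app: 'live',                           // application scope on the server
    streamName: 'event-main',              // hypothetical stream name
    mediaElementId: 'red5pro-subscriber'   // id of the <video> element on the page
  });

  await subscriber.subscribe();            // playback begins in the <video> element
}

watchLiveStream().catch(err => console.error('Subscription failed:', err));
```

Because delivery rides on WebRTC, the same page works in Chrome, Firefox, Safari, and Edge without any player plug-in.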

Enhancing Shopping, Gambling and Other Interactive Engagements

Such capabilities are also essential to delivering on the full potential of livestream shopping, a phenomenon that took off in China and is now gaining broad traction with marketers worldwide. For example, in the U.S. several major players have mounted cloud-based livestream shopping platforms, including Instagram Shopping Live, Amazon Live, and Google Shoploop. There’s also a range of innovative youth-oriented approaches to the category emanating from startups like Popshop Live and TalkShopLive.

Livestream shopping typically involves long-form streaming scenarios analogous to traditional TV shopping channels like QVC and HSN but with the added benefit of direct interactivity between presenters and viewers. The underlying platforms support direct in-video purchasing options and voice or text chat links that presenters can respond to live as part of the webcast. But they generally lack any support for video-based communications.

For anyone looking for a more compelling way to draw viewers, the obvious next step is to enable a synchronized real-time viewing experience in conjunction with support for real-time video communications from shoppers, no matter how many might be watching. As in the watch party, quiz show and other scenarios discussed here, higher levels of engagement can be expected with livestream shopping when viewers know they have an opportunity to be seen during a presentation.

Another side of e-commerce that requires synchronized interactions with video streamed in real time involves auction-based sales. Multiple types of traditional auctions requiring real-time interaction between auctioneers and bidders are moving online. And, in the individual selling and bidding space pioneered by eBay, newcomers to the field are beginning to support real-time live presentations by individual sellers.

Online gambling, too, is starting to incorporate real-time streaming. Casino-type gambling involves remotely located players who need to interact visually with each other and dealers in real time. Sports betting takes on a new dimension, known as micro-betting, when people watching an event unfold in real time can bet in advance on what might happen next. Video engagement with winners provides an added inducement to participation.

The same principles spill into many other applications where split-second decisions depend on synchronized real-time reception of content across large audiences. For example, answers to polling questions about what’s happening in real time are more meaningful if they can be displayed while the referenced action is still relevant. Investing in stocks benefits from knowing what share prices really are as opposed to what they were a half minute earlier.

Capabilities like those embodied in XDN architecture are also critical to another component of next-generation video services: an evolved form of the old ITV vision of personalizing viewing experiences. Once latency becomes imperceptible in production collaborations over great distances, producers gain a real-time operating environment for accomplishing things that either were not doable before or were done with less-than-stellar results.

Better Approaches to Personalizing Viewing Experiences

For example, Skreens Entertainment Technologies operates a platform-as-a-service (PaaS) that enables producers to compile multiple video, graphics, and data feeds, synchronized to the unfolding event, into a unified screen rendering that can be personalized for each viewer. Here, real-time streaming allows everything to be brought together from remote locations for packaging with the live streams without production delays.

“We’re supporting very sophisticated, ultra-low latency engagements, doing all the heavy lifting with studio-quality production mechanisms that can be called through our APIs from the cloud,” says Skreens founder and CEO Marc Todd. “With a production studio at your fingertips, you can be resizing video on the fly in real time.”

All elements are collected, decoded, combined, and re-encoded into single live production streams by way of operational commands to the cloud from the Skreens UI on producers’ computers. Viewers can order up their own event highlights and access stats relevant to their fantasy leagues, favorite players and teams, and other interests.
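To picture that command flow, the sketch below shows how a producer-side client might ask a cloud compositing service to combine several live inputs into one output rendering. The endpoint, payload shape, and layer model are hypothetical illustrations of the pattern, not the actual Skreens API.

```typescript
// Hypothetical sketch of a producer client sending a compositing command to a
// cloud service that decodes, combines, and re-encodes live inputs into one
// output stream. The endpoint and payload fields are illustrative, not a real
// vendor API.
interface SceneLayer {
  sourceId: string;                                       // camera feed, graphics, or data widget
  rect: { x: number; y: number; w: number; h: number };   // placement on the output canvas
}

async function composeScene(outputStream: string, layers: SceneLayer[]): Promise<void> {
  const response = await fetch('https://compositor.example.com/v1/scenes', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ output: outputStream, layers })
  });
  if (!response.ok) {
    throw new Error(`Compositor rejected scene: ${response.status}`);
  }
}

// Example: main game feed full screen with a fantasy-stats widget in the corner.
composeScene('viewer-personalized-feed', [
  { sourceId: 'camera-main',   rect: { x: 0,    y: 0,  w: 1920, h: 1080 } },
  { sourceId: 'fantasy-stats', rect: { x: 1440, y: 60, w: 420,  h: 240 } }
]).catch(console.error);
```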

In addition, interactive video streaming with real-time end-to-end latencies in the 200-400ms range can easily be added to traditionally linear end-user experiences. “A customer can be broadcasting to millions, and anyone in that audience can send a video up to us for distribution to everyone else,” Todd notes. “Or you might have a guy doing a cooking blog who wants the audience to be able to participate in a recipe with their own videos.”

Another approach to using XDN infrastructure for compelling personalized experiences involves streaming virtually any combination of dynamically responsive graphics, video clips, sound bites, and text in overlays that can be rendered on each user’s device in sync with live streamed sports or other linear video. The ability to precisely pair the content and overlays frame by frame in real time is “a real game changer,” says Andrew Heimbold, who serves as president of applications developer Reality Check Systems (RCS) and helps lead digital overlay platform supplier Singular.Live.
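One way to picture that pairing on the client is to key each overlay cue to the media timestamp of the video and apply cues as the playback clock reaches them, as in this minimal TypeScript sketch; the cue format and renderOverlay callback are assumptions for illustration.

```typescript
// Minimal sketch: keeping dynamic overlays in sync with a live video element
// by keying each cue to a media timestamp. The OverlayCue shape and the
// renderOverlay callback are illustrative assumptions.
interface OverlayCue {
  time: number;     // media time (seconds) at which the cue becomes active
  payload: string;  // e.g. serialized graphics, score, or social data
}

function syncOverlays(
  video: HTMLVideoElement,
  cues: OverlayCue[],                       // assumed to be sorted by time
  renderOverlay: (payload: string) => void
): void {
  let next = 0;

  const tick = () => {
    // Apply every cue whose timestamp the video clock has reached.
    while (next < cues.length && cues[next].time <= video.currentTime) {
      renderOverlay(cues[next].payload);
      next++;
    }
    requestAnimationFrame(tick);            // re-check roughly once per displayed frame
  };
  requestAnimationFrame(tick);
}
```

Where the browser supports HTMLVideoElement.requestVideoFrameCallback(), the same loop can be driven by per-frame presentation timestamps for even tighter pairing.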

As described by Heimbold, RCS is working with leading sports leagues, networks, federations, and social media outlets worldwide to drive audience engagement through customized fusions of dynamic graphics, real-time data, and social media. As a case in point, he cites Sky Deutschland’s affiliation with London-based online video publisher (OVP) Grabyo in conjunction with the May 2020 return to top-tier European football league play following a drop in the Covid-19 infection rate.

Sky complemented its broadcast of Germany’s Bundesliga competition with a live-streamed “Fan Engagement” feed featuring graphics enhancements curated by graphics operators working from remote locations. Along with contributing to the safety of personnel amid the ongoing pandemic, the strategy cut the costs of bringing the graphics operators and all their gear to the game sites or Sky’s studios.

Grabyo enables launch of such overlays from its Grabyo Producer platform with support for multiple live inputs, social data, and TV graphics. Outlets using Grabyo Producer aggregate billions of views annually on streams from websites, mobile apps, YouTube, and social media operations like Facebook Live and Periscope.

Several other OVPs have integrated the digital overlay capabilities into their workflows so that any customer can readily add enhancements customized to their needs directly from the publishing platform. For example, Los Angeles-based OVP Frequency Studio has made such capabilities available to over 100 outlets with multiple channel feeds reaching more than 100 million monthly viewers.

“You can generate graphics in real time and distribute them anywhere,” Heimbold says. “You can render them in a broadcast feed or on a million devices. We’ve been working with Red5 Pro because they have an amazing scalable infrastructure that allows you to synchronize distribution of the video and overlay streams at ultra-low latency to millions of end users.”

The same techniques apply when it comes to providing multiple viewing options with a live-streamed event. In these cases, multiple camera feeds arriving in sync from a remote site can be curated by production personnel at a central location to provide single-stream coverage to a mass audience. Or, when XDN infrastructure is employed end to end, the camera feeds can be sent as separate streams to all users to enable individual selection of views on unfolding events.
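For the second scenario, a client-side camera selector can be as simple as dropping the current subscription and attaching the newly chosen stream; the subscription interface and stream names in this sketch are illustrative assumptions rather than a specific SDK’s API.

```typescript
// Sketch: letting a viewer switch between camera angles delivered as separate
// real-time streams. The subscription interface and stream names are
// illustrative assumptions, not a specific SDK's API.
interface StreamSubscription {
  unsubscribe(): Promise<void>;
}
type SubscribeFn = (streamName: string) => Promise<StreamSubscription>;

const cameraFeeds = ['camera-endzone', 'camera-sideline', 'camera-skycam'];
let active: StreamSubscription | null = null;

async function switchCamera(streamName: string, subscribe: SubscribeFn): Promise<void> {
  if (active) {
    await active.unsubscribe();            // drop the currently selected angle
  }
  active = await subscribe(streamName);    // attach the newly selected angle
}

// Example: wiring a picker, where subscribeWithSdk is a hypothetical helper
// wrapping the underlying real-time subscription call.
// switchCamera(cameraFeeds[1], subscribeWithSdk);
```

Because all of the feeds are already arriving in sync, the switch can happen without the multi-second re-buffering typical of conventional HTTP-based players.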

An Expanding Software Foundation for Metaverse TV Development

The developers cited so far are just some of the enterprises that are responding to the demand for next-generation interactive video services. While some, like Hovercast, may not have engaged real-time video streaming to enhance their applications, most are well positioned to put their software stacks to work for this big leap forward on XDN infrastructure.

For example, Genvid Technologies, a leading supplier of interactive streaming technology for the esports market, has designed its software stack to work on next-gen real-time and traditional CDN infrastructures. As a result, when real-time infrastructure is employed, developers can easily add video capture and streaming capabilities to their spectator client applications.

But there’s more to what can be done than empowering viewers to communicate via video. When real-time infrastructure is employed, Genvid’s customers can create new types of audience participation environments that put audience members in the middle of gaming action, not just as commentators, but as influencers on the outcomes.

This represents still another realm of development pointing to the Metaverse era. For example, one application developer using the Genvid SDK gives audience members influence over AI-driven animated creatures battling Twitch streamers, who usually just webcast their exploits for people to watch. Another innovation along similar lines allows audience members to control the conditions confronting participants in a survival game.

Such capabilities are also entering the domain of audience-influenced storytelling, where, rather than game-like competition, the action flow is a function of a narrative that can branch along different trajectories depending on what the audience decides. This sort of real-time mass audience engagement also has implications for new types of reality shows that could involve viewers’ interactions with real people and their surroundings rather than animated characters.

The XR Viewing Experience

Advanced software platforms are also bringing XR technology to life in networked entertainment. This goes to the heart of developments associated with the emerging Metaverse, where pervasive use of VR and other versions of XR technology in everyday life is a defining characteristic.

Where live programming is concerned, much of the activity centers on delivery of 180° or 360° immersive viewing experiences through VR headgear. In the U.S., the NBA has shown the biggest commitment to the technology with dozens of VR broadcasts over the past three seasons. VR coverage of pro soccer is now a regular feature in Europe. In the UK, Sky Worlds is a VR programming service offering regular 180° coverage of netball, a sport similar to basketball, as well as Premier League soccer. BT, too, has made VR coverage of Premier League games a regular feature of its services.

Mobile carriers see live VR coverage of sports and other live events as a strong selling point for 5G. Verizon, for example, has begun offering 360° streaming from multiple event venues, including the Indianapolis 500, Liga MX soccer games, the 2021 Oscars, and Live Nation music clubs throughout the U.S. Deutsche Telekom, which has already experimented with VR coverage of music events, has launched an initiative aimed at delivering sports, gaming, and other VR content over its 5G network.

South Korea’s KT, too, is on an aggressive VR track. Having already launched a “Super VR TV” IPTV service, the carrier has signaled it will be among the first to introduce VR on 5G networks.

These are but the rudimentary beginnings of what can be accomplished with development of programming for immersive viewing on VR headsets. There’s an abundance of narrative content developed for offline viewing, including documentaries, music videos, and new types of dramas that put the user in the middle of the action. Once real-time networking capabilities are brought into play, such programming is likely to become abundantly available to the growing legions of people who own VR gear.

One example can be seen in the BBC’s popular home makeover show Your Home Made Perfect, now in its third season. In its current iteration, the program provides viewers 2D visibility into the VR experiences of homeowners involved in exploring alternative house designs created by the show’s architects. A version optimized for VR viewing would allow the audience to share in the immersive experience of the featured homeowners.

The mixed reality integrations of virtual elements with the real world that are supported by AR eyewear and, a bit farther out on the horizon, holographic renderings in free space, will add to the vast array of new programming formats that are sure to come as we enter full immersion into the Metaverse. The XDN platform provides all the infrastructural support needed by anyone who wants to take interactive programming to the next level.

To learn more about how the XDN platform can bring Metaverse TV to life, contact info@red5.net or schedule a call.