LiDAR VR/AR: How LiDAR Raises Prospects for Extended Reality Adoption

The prospects for mass participation in extended reality (XR) applications appear to be getting better with implementations of the 3D spatial laser scanning technology—known as LiDAR—in new products from Apple and other suppliers.

LiDAR, an acronym for light detection and ranging, parallels what radio detection and ranging (radar) does with radio waves, but at far greater precision. In this case, reflected laser pulses are used to support AI-assisted representations of a targeted space in three dimensions. The technology has been used since the 1960s in surveying, archeology, meteorology, space exploration, and many other disciplines, but it's only recently that component miniaturization and advances in AI have enabled use of 3D photonic imaging in smartphone cameras, XR eyewear, and other consumer devices.

As explained in our blog opening this series on XR trends, the development of more useful virtual reality (VR), augmented reality (AR), mixed reality (MR), and holographic applications largely depends on market access to the type of real-time interactive video streaming infrastructure supported by Red5 Pro's Experience Delivery Network (XDN) platform. The emergence of LiDAR technology as a facilitator of XR use cases is another sign of how fast the need for such networking capabilities is taking hold in the XR space, just as it is in myriad other live streaming scenarios across the consumer and enterprise markets.

A Photonic Approach to Improving 3D Imaging Granularity

LiDAR scanning, sometimes referred to as time-of-flight (ToF) scanning, analyzes the picosecond variations in roundtrip durations of recurring laser pulses bouncing off everything in a viewed space. AI-fueled algorithms reference this information to instantaneously generate volumetric point clouds that map how light and color should be applied to each pixel in the captured image.
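
In practical terms, the underlying arithmetic is straightforward: a surface's range is half the pulse's round-trip time multiplied by the speed of light. The minimal sketch below, using hypothetical timing values rather than any particular sensor's output, shows why picosecond-scale precision matters:

```swift
import Foundation

// Round-trip time of flight to distance: d = (c * Δt) / 2.
// The divide-by-two accounts for the pulse traveling out and back.
let speedOfLight = 299_792_458.0  // meters per second

func distance(forRoundTrip seconds: Double) -> Double {
    speedOfLight * seconds / 2.0
}

// A ~33-nanosecond round trip corresponds to a surface about 5 m away,
// while each picosecond of timing error shifts the measured range by
// only ~0.15 mm -- which is what makes fine-grained depth maps possible.
print(distance(forRoundTrip: 33.356e-9))  // ≈ 5.0 m
print(distance(forRoundTrip: 1e-12))      // ≈ 0.00015 m per picosecond
```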

The device uses these 3D pixel maps to produce higher quality photos and videos, and to enable users to accurately position AR elements in screen displays with adherence to occlusion, shadows, coloring, and other effects matched to real conditions. In addition, and especially important to networked XR applications, the technology can be used to support realistic holographic projections of users and other photographed elements into real and virtual spaces.
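
On Apple devices, these capabilities surface through ARKit's scene-reconstruction and scene-depth options. Below is a minimal sketch of enabling LiDAR-backed meshing and real-world occlusion in a RealityKit view; the identifiers are standard ARKit/RealityKit API, but the snippet is illustrative rather than a complete app:

```swift
import ARKit
import RealityKit

func makeLiDARConfiguredView() -> ARView {
    let arView = ARView(frame: .zero)
    let config = ARWorldTrackingConfiguration()

    // Scene reconstruction builds the LiDAR-derived mesh of the space.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    // Per-pixel depth lets virtual content respect real geometry.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }
    // Ask RealityKit to hide virtual objects behind real ones.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)

    arView.session.run(config)
    return arView
}
```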

Apple made waves when it introduced LiDAR scanning in the high-end iPhone 12 Pro and Pro Max and, this year, the iPhone 13 Pro and Pro Max, following its 2020 implementation of LiDAR in the iPad Pro. It remains to be seen whether press reports that the company will extend the capability beyond the Pro models prove accurate, but there's little doubt the technology will become more widely supported in succeeding iOS devices.

Apple says these initial implementations scan at distances up to 5 meters. No doubt the company will continue to improve performance, as it did with the September introduction of high-quality portrait photography in the iPhone 13 Pro and Pro Max, which relies on LiDAR to accurately blur backgrounds in order to highlight faces and other elements in the foreground.
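
For a sense of how apps consume those measurements, the sketch below samples the depth value at the center of an ARKit frame, assuming a session configured with the .sceneDepth semantics shown above; values are reported in meters, and readings near the edge of the sensor's range should be treated with caution:

```swift
import ARKit

// Reads the LiDAR depth at the center of the frame, in meters.
func centerDepth(from frame: ARFrame) -> Float? {
    guard let depthData = frame.sceneDepth else { return nil }
    let depthMap = depthData.depthMap  // 32-bit float depth buffer

    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }
    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }

    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)

    // Index the middle row, then the middle column.
    let row = base.advanced(by: (height / 2) * rowBytes)
        .assumingMemoryBound(to: Float32.self)
    let depth = row[width / 2]
    return depth.isFinite && depth > 0 ? depth : nil
}
```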

More Devices, More XR Use Cases for LiDAR

In fact, these iPhone applications appear to be just the beginning of what the company is planning to do with LiDAR technology. According to an early 2021 report from Bloomberg and more recent updates elsewhere, Apple is working on a high-end VR head-mounted device (HMD) with pricing in the $3,000 range for possible release in 2022. These outlets' sources say the company plans to follow up at an unspecified date with release of an even pricier set of eyewear supporting MR, which takes AR to the level of virtually inserting realistic, life-size 3D elements into real spaces.

The longer development schedule and higher pricing for the MR glasses attest to the challenges of packing in all the miniaturized components essential to injecting virtual elements into the eyewear's view of real spaces, including multiple cameras, sensors, image projectors, and LiDAR emitters. How much harder this is to accomplish than creating a completely walled-off VR experience is reflected in the gap between current prices for popular HMDs, in the $300-$900 range, and price tags like $3,500 for Microsoft's HoloLens 2 and $2,295 for Magic Leap's eyewear.

Apple has been building support for creating AR apps since the release of iOS 11 and the iPhone X in 2017. Now, with LiDAR scanning coming into play, users of previously developed AR apps can run them with far greater confidence in the authenticity of what they see.

For example, as reported by PC Magazine, they can:

  • get more accurate screen views of themselves outfitted with clothing or wearing makeup they’re considering for purchase;
  • better assess how a new piece of furniture would look in their home (see the placement sketch after this list);
  • more reliably explore different approaches to landscaping property;
  • see how their belongings would really look in the rooms of a prospective new home;
  • play more interesting versions of games like Hot Lava, where virtual characters race through a course that more realistically combines virtual with real objects.
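
As an illustration of the furniture scenario above, here is a minimal RealityKit sketch that anchors a model wherever a screen tap's raycast intersects real-world geometry; chairModel is a hypothetical, already-loaded ModelEntity, and with scene reconstruction enabled the raycast resolves against the LiDAR mesh rather than a rough plane estimate:

```swift
import ARKit
import RealityKit

// Places a model where a screen tap intersects real-world geometry.
func place(_ chairModel: ModelEntity, at screenPoint: CGPoint, in arView: ARView) {
    guard let result = arView.raycast(
        from: screenPoint,
        allowing: .estimatedPlane,
        alignment: .horizontal
    ).first else { return }

    // Anchor the model to the real-world position the ray hit.
    let anchor = AnchorEntity(world: result.worldTransform)
    anchor.addChild(chairModel)
    arView.scene.addAnchor(anchor)
}
```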

The AR app count, expedited by developers' ready access to Apple's ARKit development framework, continues to grow across multiple fields with ever greater levels of sophistication. A case in point is the Seeing AI app developed by Microsoft for iOS to provide blind and visually impaired people AI-driven verbal descriptions of the people, objects, scenes, and text appearing in iPhone and iPad viewfinders. LiDAR support even enables a haptic proximity feature that helps people feel how close they are to objects.
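
The proximity feature suggests a simple pattern: map measured distance to haptic intensity so that closer objects produce stronger taps. The sketch below approximates the idea with standard UIKit haptics; it is an assumed design, not Microsoft's actual implementation:

```swift
import UIKit

let haptics = UIImpactFeedbackGenerator(style: .medium)

// Maps distance to haptic strength: the closer the object, the stronger
// the tap. The 5 m default mirrors the LiDAR range Apple cites.
func pulseForProximity(distanceInMeters: Double, maxRange: Double = 5.0) {
    guard distanceInMeters > 0, distanceInMeters < maxRange else { return }
    let intensity = 1.0 - (distanceInMeters / maxRange)  // linear falloff
    haptics.impactOccurred(intensity: CGFloat(intensity))
}
```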

Apple’s use of LiDAR scanning is at the vanguard of what promises to be pervasive use of the technology in the years ahead. So far, Android devices have relied on the far less granular non-scanning approach to ToF measurements with single laser bursts. But reports that some Android OEMs have ordered shipments of vertical-cavity surface-emitting lasers (VCSELs) used with LiDAR scanning suggest this could change.

Google, of course, has been promoting AR since the introduction of Google Glass eyewear in 2013, which, after dismal consumer market performance, has evolved into a product for the enterprise market. Since 2017, the company has made its ARCore development kit available for building AR apps that run on Android smartphones, mirroring a lot of the apps in the Apple store.

The LiDAR Contribution to Holographic Use Cases

What role, if any, LiDAR will play in Android app development remains to be seen. But hints of what’s in store for use of LiDAR in XR beyond what Apple is doing can be seen elsewhere, including in developments related to holography.

For example, Looking Glass Factory, a supplier of holographic displays, has developed a precursor to what can be expected from holography driven by photos and video captured on LiDAR-equipped smartphones, with or without holographically enabled AR and VR headgear. Depth-sensing cameras like Microsoft's Azure Kinect or Intel's discontinued RealSense can be employed as well, but they don't have the mass market implications of an iPhone.

The Looking Glass Portrait and its higher-end 4K and 8K iterations are computerized displays that generate holograms without the aid of eyewear. The devices turn a selfie, any other photo, or video clips captured by an iPhone 13 Pro into holograms floating at the screen surface for 3D observation from any direction within a 58-degree horizontal viewing cone measured from a center-screen vertex.
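
Displays in this class work by interleaving many discrete renders of a scene across the viewing cone, so each eye position sees a slightly different perspective. As a rough sketch of the geometry, the camera yaw for each view can be spread evenly across the 58-degree cone; the 45-view count here is an illustrative assumption, not a published device specification:

```swift
import Foundation

let coneDegrees = 58.0
let viewCount = 45  // assumed for illustration

// Yaw angle for each rendered view, swept from -29° to +29°
// relative to the center of the screen.
let yawAngles: [Double] = (0..<viewCount).map { i in
    let t = Double(i) / Double(viewCount - 1)  // 0.0 ... 1.0 across the cone
    return -coneDegrees / 2.0 + t * coneDegrees
}
```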

The Implications for Real-Time Networking with XDN Infrastructure

The technology is positioned for a wide range of use cases, including many involving real-time network connectivity in video conferencing, live entertainment, medical care and training, design collaboration and much else. The baseline 7.9-inch screen units retail for less than $400.

This is but one example of emerging hologram-producing products that have a bearing on future directions in XR. For example, Light Field Lab has taken the technology to a new level with development of projectors that generate completely free-standing holograms suspended in thin air beyond the display surface. We’ll take a deeper look at holography and how it might play in XR in a forthcoming blog.

Here the point is that the emergence of LiDAR-enabled smartphones heralds the mass availability of a means by which people can project themselves into virtual spaces with the verisimilitude that's lacking in today's avatar-populated VR animations. As XR experiences become ever more user friendly, demand for the real-time interactive network connectivity essential to bringing the most compelling use cases to life will intensify.

For an in-depth explanation of how XDN infrastructure fulfills the requirements surrounding XR and all other live streaming scenarios where real-time infrastructure is essential, see the white paper entitled The World Needs an Interactive Real-Time Streaming Infrastructure. To learn how Red5 Pro can assist with putting multi-cloud XDN architecture to work serving any real-time streaming use case at any scale over any distance, contact info@red5.net or schedule a call.