AR is currently limited by technology — but it won’t be for long

Science fiction has long flirted with blending the real and the virtual. From holographic rooms aboard spaceships to full immersion within virtual worlds, movie and TV heroes have existed in environments where there’s no distinction between what’s real and what is not.

Back in the present, technology isn’t yet at that stage, but that doesn’t mean elements of sci-fi can’t leak through. For example, gaming continues to embrace the virtual reality space, where you don a headset and are plunged into a virtual universe that exists all around you, rather than one confined to a screen. Augmented reality (AR), though, takes a different approach, instead overlaying the digital atop the real.

Apple’s early ARKit advertising showcased integrating a massive virtual dinosaur with the real world

Applications for this technology suggest a future where AR objects could be ‘projected’ in front of your eyes throughout your daily life. Right now, bar miserable attempts at AR spectacles, the technology from a consumer perspective remains limited to device displays. You launch an AR app and use its camera to integrate virtual objects with your surroundings.

Apple was at the forefront of this movement, introducing ARKit in 2017 and rapidly evolving it over subsequent years. The technology can now more readily understand multiple surfaces (walls as well as floors) and apply occlusion, masking virtual objects when real-world items would obscure them.
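In code, the plane detection and occlusion described above come down to a few configuration calls on an ARKit session. A minimal Swift sketch, assuming a RealityKit `ARView` as in a typical ARKit app (the function name is illustrative):

```swift
import ARKit
import RealityKit

func startARSession(in arView: ARView) {
    // Track the device in world space and look for both floors and walls.
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal, .vertical]

    // Where the hardware supports it, hide virtual content behind people,
    // so a rendered object is occluded just as a real one would be.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }

    arView.session.run(config)
}
```

This is a device-bound configuration fragment: it only does anything meaningful on an iPhone or iPad with the relevant camera hardware, hence the feature check before enabling occlusion.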

Apple’s ‘Pro’ devices all have a LiDAR scanner as part of the camera system

LiDAR takes this to the next level, and it’s all about speed and accuracy. The LiDAR scanner first appeared on 2020’s iPad Pro and is part of the camera system of the new iPhone 12 Pro. It works by using a pulsed laser to generate a 3D model of objects up to five meters (around 16 feet) away. This happens almost instantaneously and without you, the user, having to do anything.

When comparing AR apps on devices with and without LiDAR, it’s like night and day. On those without, you have to move your device around until it finds a surface to use, and even then the image can be wobbly. With a LiDAR-equipped device, AR elements appear instantly, and are more stable and more accurate. This provides the potential for moving AR beyond niche fare. For example, Apple’s Measure app on older iPhones and iPads is an amusing curiosity, but you wouldn’t want to rely on it; with LiDAR, it’s solid enough to use for measurements.
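The instant, stable placement on LiDAR hardware comes from ARKit’s scene reconstruction, which builds a live triangle mesh of the room rather than hunting for flat surfaces. A hedged Swift sketch of enabling it, again assuming a RealityKit `ARView` (function name illustrative):

```swift
import ARKit
import RealityKit

func enableSceneReconstruction(in arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // Scene reconstruction is only available on LiDAR-equipped devices,
    // so feature-detect rather than assume.
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
        return // Fall back to plain plane detection on older hardware.
    }
    config.sceneReconstruction = .mesh

    // RealityKit can use the reconstructed mesh directly, both to occlude
    // virtual objects and to let them collide with real surfaces.
    arView.environment.sceneUnderstanding.options.insert([.occlusion, .physics])

    arView.session.run(config)
}
```

On a non-LiDAR device the guard simply bails out, which mirrors the user-facing difference the comparison above describes.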

LiDAR and Complete Anatomy combine to help assess a patient’s motion

The result is a new breed of applications able to meaningfully improve people’s lives. Complete Anatomy utilizes LiDAR to assess the motion of patients in three dimensions to help them recover from injury. RoomScan LiDAR does what you’d expect from its name, enabling you to quickly scan rooms and create floor plans for redesigning your home. Elsewhere, the system assists in photography, improving autofocus in low light.

Limitations remain with Apple’s LiDAR system: it can’t see behind objects; the resolution isn’t sufficient for detailed 3D scanning; and it’s currently only included with Apple’s most expensive hardware. But as we’ve seen in recent years, technology evolves at speed. If Apple remains committed to AR, its LiDAR will improve and trickle down to lower-priced hardware. In a few years, every device you own could be equipped with one. And although you still won’t quite be reveling in the fun of a Holodeck, you will at least be able to better explore objects in 3D space and virtually manipulate your home and the wider world around you — all from a single handheld device.