Fake eye contact – iOS 13 will adjust your focus in FaceTime

iOS 13 and iPadOS 13 are on the horizon, and we heard about most of the new features coming our way back at WWDC in June. But since the betas have gone public, people have been discovering even more changes that haven’t previously been mentioned by Apple.

One of those little tidbits will change the way people use FaceTime, Apple’s video-calling app. A new feature called Attention Correction will digitally adjust your eyes in real time to simulate eye contact with the other caller. We’re living in the future, folks!

There is always a slight social disconnect when video calling somebody due to the distance between the screen and the camera. If you look directly into the camera, it will appear to the recipient as though you’re looking straight at them – but most of the time, users are looking down towards the screen so they can see who they’re talking to. This makes it difficult to maintain eye contact over a video call.
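To put rough numbers on that disconnect: if the front camera sits a few centimetres above the point on the screen you’re actually watching, your gaze lands visibly below the lens from the other person’s point of view. The figures below are assumptions for illustration, not anything Apple has published.

```swift
import Foundation

// Back-of-the-envelope sketch of the gaze offset (all values are assumed).
let cameraToScreenCentre = 0.07   // metres from the front camera to where you look
let viewingDistance = 0.35        // metres from your face to the phone

// Apparent downward gaze error, in degrees, from the other caller's perspective.
let gazeErrorDegrees = atan(cameraToScreenCentre / viewingDistance) * 180 / .pi
print(String(format: "Gaze appears ~%.1f degrees below the camera", gazeErrorDegrees))
// Roughly 11 degrees: enough for the other person to notice you aren't "looking at" them.
```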

That’s where this clever feature comes in, using ARKit alongside Apple’s facial recognition smarts to track your head in real time and subtly warp your eye line so it appears as though you’re looking directly into the camera when you’re actually focused on the screen. The feature was discovered and tested by Will Sigmon on Twitter, whom you can see demoing it below.

The image on the right has Attention Correction activated (credit: Will Sigmon)
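Apple hasn’t said how the effect is actually built, but ARKit already exposes the raw ingredients on TrueDepth-equipped devices: per-frame face anchors that include an estimated gaze point. Here is a minimal, purely hypothetical sketch of reading that data (the class name and printout are ours, not Apple’s, and this is not a claim about Apple’s own pipeline).

```swift
import ARKit

/// Hypothetical sketch: how an app might read ARKit's per-frame gaze estimate.
/// This is not Apple's Attention Correction implementation, just the public
/// face-tracking API that a correction pass could plausibly build on.
final class GazeTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking needs the TrueDepth camera (iPhone X-class hardware and newer).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Where ARKit estimates you're looking, in the face's own coordinate space.
            let gaze = face.lookAtPoint
            // A correction step would warp the eye region toward the camera based on
            // an offset like this before the frame reaches the other caller.
            print("Estimated gaze point: x=\(gaze.x), y=\(gaze.y), z=\(gaze.z)")
        }
    }
}
```

Presumably some rendering pass then re-projects the eye region using tracking data along these lines, which would help explain why the effect seems to need Apple’s newest hardware.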

It’s a small feature, for sure, but it looks to work surprisingly well and could give FaceTime an extra edge as the video-calling service of choice.

There’s no official word yet, but it seems as though this process is limited to Apple’s newest devices – iPhone XS, iPhone XR, and iPad Pro – and we strongly suspect it only works in one-on-one chats. Still, this is a smart addition. It just goes to show how much is coming this year that such a technically impressive feature didn’t even make the cut at Apple’s two-hour-plus presentation last month.