Hands on with Apple’s FaceTime Attention Correction feature in iOS 13

In the latest iOS 13 beta, Apple is leaning into its augmented reality prowess to fix a common eye contact issue during FaceTime calls. AppleInsider goes hands on and dives into how it works.

FaceTime Attention Correction fixes your gaze

A quick primer, for the uninitiated. When making a FaceTime call, a user naturally wants to look at the screen —at the person they are conversing with. On the other end, the recipient of the call just sees the caller looking down.

When the caller looks at the camera, the recipient sees the caller looking them in the eye —a much more natural point of view. However, the caller is no longer looking at the recipient on the screen, which means they can only catch the other person's reactions in their peripheral vision rather than seeing them clearly.

You can see this for yourself: open the Camera app, look directly at the screen and take a picture, then look at the camera and take another.

FaceTime Attention Correction toggle in Settings

With the third beta of iOS 13, Apple added a new toggle for FaceTime Attention Correction. Enabling it adjusts your gaze so that it appears more natural to the call recipient during a FaceTime call.

An iPhone’s TrueDepth Camera System

This uses the TrueDepth camera system, the same system that powers Animoji, Face ID unlocking, and even the augmented reality effects already found in FaceTime.

Taking a look behind the scenes, the new eye contact functionality is powered, of course, by Apple’s ARKit framework. It creates a 3D face map and depth map of the user, determines where the eyes are, and adjusts the eyes accordingly.
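How Apple warps the image isn't public, but the face and eye data the article describes is exposed through ARKit's face tracking API. The sketch below, which assumes only the documented `ARFaceAnchor` properties, shows how an app can read per-eye poses and the estimated gaze point from the TrueDepth camera; the final warping step is illustrative, not Apple's actual method.

```swift
import ARKit

// Minimal sketch: reading eye pose and gaze data from ARKit face tracking.
// leftEyeTransform, rightEyeTransform, and lookAtPoint are real ARFaceAnchor
// properties (ARKit 2 and later); the correction step itself is hypothetical.
class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Per-eye poses relative to the face anchor, plus the point
            // in face space the user appears to be looking at.
            let leftEye = face.leftEyeTransform
            let rightEye = face.rightEyeTransform
            let gaze = face.lookAtPoint
            // A real implementation would use these to warp the eye
            // regions of the video frame toward the camera.
            _ = (leftEye, rightEye, gaze)
        }
    }
}
```

Since the anchor also carries the full face geometry mesh, the same callback is where an effect like this can measure how far the eyes deviate from the camera axis on every frame.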

We don’t know with certainty, but Apple is likely using ARKit 3 APIs, which would explain why this functionality is limited to the iPhone XS and iPhone XS Max: those APIs aren’t supported on the iPhone X or earlier. It doesn’t explain, however, the lack of support on the iPhone XR, which also has the A12 Bionic processor.
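The capability split the speculation above refers to is something apps can probe directly. A minimal sketch, assuming the documented ARKit 3 check `ARWorldTrackingConfiguration.supportsUserFaceTracking` (true only on A12-class devices and later):

```swift
import ARKit

// Sketch of a runtime capability check mirroring the device cutoffs
// discussed above. supportsUserFaceTracking is an ARKit 3 (iOS 13) API
// that reports true only on A12 Bionic or newer hardware.
func describeARSupport() -> String {
    if ARWorldTrackingConfiguration.supportsUserFaceTracking {
        return "ARKit 3 face tracking available (A12 Bionic or later)"
    } else if ARFaceTrackingConfiguration.isSupported {
        return "TrueDepth face tracking only (e.g. iPhone X)"
    } else {
        return "No TrueDepth face tracking hardware"
    }
}
```

By that check alone, the iPhone XR would qualify, which is why its exclusion from the feature remains unexplained.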

Slight warping from the augmented reality feature of the attention correction feature

If you pass a straight object in front of your eyes while on camera, you can see the slight distortion that is a byproduct of the augmented reality processing. There is a bit of a curve around the eyes and even the nose.

Otherwise, the whole thing looks very natural; few people would notice it without prior knowledge of the feature. It simply looks like the person calling is looking right at you, instead of at your nose or your chin.

FaceTime Attention Correction example

Here you can see us looking at the screen and then at the camera within the Camera app. When looking at the screen, it appears we are looking down. In a FaceTime call with the feature enabled, you can see that looking directly at the camera makes it appear we are looking slightly upward.
