iOS 13 uses ARKit to solve one of the biggest FaceTime complaints

The eye-line problem is finally fixed. If you own an iPhone XS or XS Max, that is! Photo: Apple

There’s something weirdly off-putting about the eye-line problem with video calling services like FaceTime and Skype.

That’s because users have to choose between looking directly at the camera lens and missing what’s happening on screen, or looking at the screen and appearing to stare at the other person’s neck. Neither option is ideal for a tool that’s meant to feel like a face-to-face conversation.

Fortunately, it’s something Apple fixes in iOS 13.

The new “FaceTime Attention Correction” feature is present in the latest iOS 13 developer beta. It’s only available for iPhone XS and XS Max devices, but it’s nonetheless appreciated.

To correct your gaze, the feature uses Apple’s vaunted augmented reality framework, ARKit. In essence, it maps a user’s face and then adjusts the apparent position of their eyes. In other words, what you’re seeing is a doctored image, but one that shouldn’t be noticeable to most users.
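Apple hasn’t detailed how the correction works under the hood, but ARKit’s face tracking already exposes the kind of data such a feature would need. Here’s a minimal, hypothetical Swift sketch showing how an app can read the per-eye transforms and estimated gaze point from an ARFaceAnchor; the image warping FaceTime applies on top of this is Apple’s own and isn’t shown here.

```swift
import ARKit
import UIKit

/// Minimal face-tracking session that reads the per-eye transforms and
/// estimated gaze point ARKit publishes for each tracked face.
class GazeReaderViewController: UIViewController, ARSessionDelegate {
    private let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Face tracking needs the TrueDepth camera; bail out gracefully otherwise.
        guard ARFaceTrackingConfiguration.isSupported else { return }

        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Transforms of each eye and the estimated gaze target,
            // all expressed relative to the face anchor.
            let leftEye = face.leftEyeTransform
            let rightEye = face.rightEyeTransform
            let gaze = face.lookAtPoint

            // A gaze-correction pipeline could use these values to decide how
            // to warp the eye regions of the outgoing video frame. Apple hasn't
            // published how FaceTime does that step, so it's omitted here.
            print(leftEye, rightEye, gaze)
        }
    }
}
```

ARKit face tracking relies on the TrueDepth camera, which is likely part of the reason the feature is limited to recent hardware like the iPhone XS and XS Max.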

For more information on the other great features found in the latest iOS 13 beta, check out the roundup by my Cult of Mac buddy Charlie.

Via: The Verge
