
Photo: Apple
There’s something weirdly off-putting about the eye-line problem with video calling services like FaceTime and Skype.
That’s because users have to choose between looking directly at the camera lens, and missing what’s happening on screen, or looking at the screen and appearing to stare at the other person’s neck. Neither is ideal for a tool that’s meant to make it feel like you’re having a face-to-face conversation.
Fortunately, it’s something Apple fixes in iOS 13.
How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly.
Notice the warping of the line across both the eyes and nose. pic.twitter.com/U7PMa4oNGN
— Dave Schukin (@schukin) July 3, 2019
The new “FaceTime Attention Correction” feature appears in the latest iOS 13 developer beta. It’s only available on the iPhone XS and XS Max, but it’s a welcome addition nonetheless.
To do this, it uses Apple’s vaunted augmented reality tech ARKit. In essence, it maps a user’s face and then changes the position of their eyes. In other words, what you’re seeing is a doctored image — but one that shouldn’t be overly noticeable for the majority of users.
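To get a feel for what “adjusting the eyes” means, here is a deliberately simplified toy sketch of the idea in Python. This is not Apple’s implementation — FaceTime uses ARKit’s depth and face-tracking data to warp the image in 3D — but it illustrates the basic trick of shifting pixels in the eye region so the eyes appear aimed at the camera. The function name, region coordinates, and shift amount are all invented for illustration.

```python
# Toy illustration of gaze correction via local image warping.
# NOT Apple's implementation: the real feature uses ARKit's depth map
# of the face. This just shows the core idea of re-aiming the eyes
# by moving pixels inside the eye region.

def correct_gaze(image, eye_rows, eye_cols, shift=2):
    """Shift pixels inside the eye region up by `shift` rows,
    simulating eyes re-aimed at the camera lens above the screen."""
    height = len(image)
    out = [row[:] for row in image]  # copy the frame
    for r in range(*eye_rows):
        for c in range(*eye_cols):
            src = min(r + shift, height - 1)  # sample from lower in the frame
            out[r][c] = image[src][c]
    return out

# A tiny 6x6 "frame": 0 = skin, 1 = iris pixels sitting low in the
# eye region (i.e. eyes looking down at the screen, not the camera).
frame = [[0] * 6 for _ in range(6)]
frame[4][2] = frame[4][3] = 1

# After the warp, the iris pixels land higher in the eye region,
# as if looking at the camera.
corrected = correct_gaze(frame, eye_rows=(2, 5), eye_cols=(1, 5), shift=2)
```

The warping visible in the tweet above — the bent line across the eyes and nose — is the 3D equivalent of this kind of local pixel displacement.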
For more information on the other great features found in the latest iOS 13 beta, check out the roundup by my Cult of Mac buddy Charlie.
Via: The Verge