- Apple added a new feature to the iPhone in the iOS 13 beta that makes it seem like you're making eye contact during a FaceTime call even when you're looking at the screen instead of the camera, according to two people who have used the beta and posted about it on Twitter.
- Apple may be using ARKit, its suite of tools for developing augmented reality apps, to generate this effect.
- The feature only appears to be available in the developer beta of iOS 13, but it could make its way to every iPhone when iOS 13 launches later this year.
Anyone who has ever video chatted using FaceTime or a similar app knows that it can be difficult to maintain eye contact while also looking at the person you’re speaking with. That’s because establishing eye contact requires you to look into your device’s camera, not at the other participant in the conversation on screen.
Apple is apparently trying to solve this in iOS 13 with a new feature called FaceTime Attention Correction, as app designer Mike Rundle and podcast co-host Will Sigmon recently posted on Twitter. The description of the feature simply says that eye contact with the camera will be more accurate during FaceTime calls, as a screenshot posted by Rundle indicates.
And the feature actually works, according to Sigmon and Rundle.
“Looking at him on-screen (not at the camera) produces a picture of me looking dead at his eyes like I was staring into the camera,” Rundle said in a tweet.
Sigmon's tweet below shows how a FaceTime conversation could look with the feature turned on.
Guys – “FaceTime Attention Correction” in iOS 13 beta 3 is wild.
Here are some comparison photos featuring @flyosity: https://t.co/HxHhVONsi1 pic.twitter.com/jKK41L5ucI
— Will Sigmon (@WSig) July 2, 2019
Apple did not immediately respond to Business Insider's request for confirmation and additional details about how the feature works. The FaceTime Attention Correction function appears to be available only in the developer version of the iOS 13 beta, as it does not currently appear in the public beta's FaceTime settings.
The feature was discovered just as Apple released the third iteration of its iOS 13 developer beta on Tuesday.
To achieve this effect, Apple uses ARKit to build a depth map of the user’s face and adjusts the eyes as needed, Dave Schukin, cofounder of Observant, a company that makes software for monitoring whether drivers are paying attention to the road, wrote on Twitter. The Verge first spotted Schukin’s tweet.
How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly.
Notice the warping of the line across both the eyes and nose. pic.twitter.com/U7PMa4oNGN
— Dave Schukin 🤘 (@schukin) July 3, 2019
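Apple has not documented how FaceTime Attention Correction works, but the raw data Schukin describes is exposed through ARKit's public face-tracking API on TrueDepth-equipped iPhones. The hypothetical sketch below shows how an app could access a depth map of the user's face along with per-eye poses and the gaze point; the actual eye-warping step is Apple's own and is omitted here.

```swift
import ARKit

// Hypothetical sketch only: reading the face-tracking data (depth map,
// eye transforms, gaze point) that an attention-correction effect could
// build on. This is not Apple's implementation.
class FaceTrackingDelegate: NSObject, ARSessionDelegate {

    func startFaceTracking(on session: ARSession) {
        // Face tracking requires a TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // ARKit provides a pose for each eye and the point the user
            // is looking at, all relative to the face anchor.
            let leftEye = faceAnchor.leftEyeTransform
            let rightEye = faceAnchor.rightEyeTransform
            let lookAt = faceAnchor.lookAtPoint
            // A correction pass could compare `lookAt` with the camera's
            // position and warp the eye regions of the outgoing video so
            // the gaze appears to meet the camera.
            _ = (leftEye, rightEye, lookAt)
        }
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Per-frame depth data for the face is also available, as a
        // CVPixelBuffer of per-pixel depth values.
        if let depthMap = frame.capturedDepthData?.depthDataMap {
            _ = depthMap
        }
    }
}
```

The visible warping along straight lines in Schukin's video is consistent with a depth-guided image warp of this kind being applied only around the eyes and nose, rather than a full 3D re-render of the face.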
The feature could be a new addition coming to every iPhone with iOS 13 later this year, along with confirmed features like Dark Mode and redesigns of apps like Reminders, Photos, and Apple Maps. The company typically releases the latest version of iOS in September to coincide with the launch of its new iPhones, but the firm has not announced an official release date beyond this fall. If you own an iPhone but aren’t a developer, you can try the software by installing the public beta after registering on Apple’s website.