iOS 13's new feature will fake eye contact between FaceTime users
The third beta version of iOS 13, Apple's latest mobile software, is out with a new feature that will fake eye contact for you during FaceTime calls. The feature, which sounds like some sort of wizardry, artificially adjusts your eyes so that the other person sees you looking directly at them even when you are actually looking at the screen.
Everything to know about the "FaceTime Attention Correction" feature
Normally, in video calls, we tend to look at the person on our display rather than make eye contact with them by looking directly into the front-facing camera (as we would during a Skype interview). But with this new feature, dubbed FaceTime Attention Correction, Apple will automatically adjust your gaze so you appear to be looking into the camera even when you are staring at the screen.
The feature is currently available on the iPhone XS, XS Max, and XR
According to several user posts on Reddit and Twitter, the feature appears to work only on the iPhone XS, iPhone XS Max, and iPhone XR running the third beta version of iOS 13. If you have any of these devices, you can test out the feature by turning it on from within FaceTime's settings.
So, how does the feature work?
Our guess is that Apple is using an image-manipulation algorithm that tracks your eye movement (via the Face ID hardware) and generates realistic-looking eye contact whenever it detects a change in your gaze. When your whole face moves, however, it doesn't appear to adjust your eyes. Notably, Apple hasn't shared any details on how the feature works.
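To make that guess concrete, here is a minimal, purely illustrative Swift sketch of how an app could read gaze estimates from the TrueDepth camera using ARKit's face tracking. This is not Apple's implementation; the step that would actually warp the eye regions of the video frame is only noted in a comment, since Apple hasn't disclosed its method.

```swift
import ARKit

// Illustrative sketch only: reads gaze data from ARKit face tracking.
// The eye-warping pass that FaceTime presumably applies is hypothetical.
final class GazeTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires the TrueDepth camera (the Face ID hardware).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint estimates where the user is looking, in face-anchor space.
            let gaze = face.lookAtPoint
            // A real attention-correction pass would presumably warp the eye
            // regions of the frame toward the camera when the gaze sits near
            // the display rather than the lens.
            print("Estimated gaze offset: x=\(gaze.x), y=\(gaze.y), z=\(gaze.z)")
        }
    }
}
```

This would also be consistent with the feature being limited to the iPhone XS, XS Max, and XR, since the required face-tracking support depends on the TrueDepth camera and recent hardware.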
Here's everything we don't know about the feature
Evidently, there are many unanswered questions. Firstly, what we have described above is pure conjecture, so we would like Apple to explain how it is achieving the effect. Secondly, we would like to know which devices will ultimately get the feature. And finally, will it work on group calls, or when there are multiple people in the frame?