#NewsBytesExplainer: How Apple's Face ID system verifies your face
What's the story
For more than a year, Apple has been offering a proprietary facial recognition system on its iPhones, a technology known as Face ID.
It replaces Touch ID and employs a 'TrueDepth' camera to authenticate a user and allow them access to the device.
Now, you may have already used Face ID, but the million-dollar question is, how does it really work?
Let's find out.
Purpose
Face ID creates a facial map for authentication
Face ID unlocks the iPhone just as Touch ID does, but by verifying your face instead of your fingerprint.
It creates an incredibly detailed 3D map of your face and uses it to authenticate the person looking at the device.
If that face matches the registered one, the phone unlocks; if not, it prompts you to try again or enter the iPhone's passcode.
3D face map
How the face map is created
In order to create the depth map of a user's face, Face ID leverages what Apple calls a TrueDepth camera system.
It employs the device's front camera along with a series of sensors and light projectors to produce several images of a person's facial features.
Then, these images are merged to build the 3D map for authentication.
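To picture what "merging several images" could mean, here is a minimal Swift sketch that averages repeated depth readings of the same facial points into a single, steadier map. The function name and the sample numbers are purely illustrative; Apple's actual merging pipeline is not public.

```swift
import Foundation

// Hedged illustration only: average repeated depth readings of the same
// facial points across several captures into one steadier map.
// This is a generic idea, not Apple's actual merging pipeline.
func mergeDepthFrames(_ frames: [[Float]]) -> [Float] {
    guard let first = frames.first else { return [] }
    var merged = [Float](repeating: 0, count: first.count)
    for frame in frames {
        // Assumes every frame samples the same points in the same order.
        for (index, depth) in frame.enumerated() where index < merged.count {
            merged[index] += depth
        }
    }
    return merged.map { $0 / Float(frames.count) }
}

// Three noisy captures of four facial points collapse into one averaged map.
let combined = mergeDepthFrames([[0.30, 0.32, 0.29, 0.31],
                                 [0.31, 0.31, 0.30, 0.30],
                                 [0.29, 0.33, 0.28, 0.32]])
print(combined)   // ≈ [0.30, 0.32, 0.29, 0.31]
```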
Step #1
Infrared light is employed for face modeling
The process of building a face map begins when you initiate Face ID setup and move your face in a circle.
The system first uses a flood illuminator to light your face with invisible infrared light, ensuring it can be detected even in low-light or dark conditions.
Then, its dot projector produces over 30,000 dots of IR light to create the 3D map of facial features.
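The dot projector works on the general principle of structured light: each projected dot shifts in the infrared image by an amount that depends on how far away the surface is. The sketch below shows that generic depth-from-disparity calculation in Swift, with an invented baseline and focal length; it is not Apple's exact math.

```swift
import Foundation

// Generic structured-light idea (not Apple's exact math): a projected dot
// shifts sideways in the infrared image depending on how far the surface is.
// Knowing the projector-camera baseline and focal length, that shift
// (disparity, in pixels) converts to a depth estimate.
func depthFromDisparity(disparityPixels: Double,
                        baselineMeters: Double,
                        focalLengthPixels: Double) -> Double? {
    guard disparityPixels > 0 else { return nil }   // no shift, no estimate
    return (baselineMeters * focalLengthPixels) / disparityPixels
}

// Example with invented numbers: a 2 cm baseline, ~1,400 px focal length, and
// a 95 px dot shift place that point roughly 0.29 m from the camera.
if let depth = depthFromDisparity(disparityPixels: 95,
                                  baselineMeters: 0.02,
                                  focalLengthPixels: 1400) {
    print(String(format: "Estimated depth: %.2f m", depth))
}
```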
Step #2
Infrared camera captures the dot pattern
After the dot pattern is created, the iPhone's infrared camera captures images of it.
All this information is then processed with Apple's neural networks to build and store a mathematical model of your face.
The company says its Face ID technology is backed by multiple neural networks trained on more than a billion images.
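Conceptually, that stored "mathematical model" can be thought of as a fixed-length list of numbers (an embedding) produced by the networks. The Swift sketch below shows what enrollment might look like under that assumption; encodeFace is a hypothetical stand-in, since Apple's on-device networks are not exposed to developers.

```swift
import Foundation

// Hedged sketch: treat the stored "mathematical model" as a fixed-length
// vector of numbers (an embedding). `encodeFace` is a hypothetical stand-in
// for Apple's on-device neural networks, which are not publicly exposed.
struct EnrolledFace {
    let embedding: [Float]   // compact numeric summary of the face
}

func encodeFace(depthSamples: [Float]) -> [Float] {
    // Placeholder: a real system would run a trained neural network here.
    // We only length-normalise the samples so the example runs end to end.
    let norm = depthSamples.map { $0 * $0 }.reduce(0, +).squareRoot()
    return norm > 0 ? depthSamples.map { $0 / norm } : depthSamples
}

// Enrollment: capture the dot-pattern data, encode it, keep the embedding.
let enrolled = EnrolledFace(embedding: encodeFace(depthSamples: [0.31, 0.29, 0.33, 0.30]))
print(enrolled.embedding.count)   // 4 numbers standing in for the stored model
```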
Verification
Matching faces against stored facial models
Now, when a person looks at an iPhone, the device's IR camera captures their image, which is matched against the stored facial model in real time.
All of this processing, Apple says, is handled by the A11 chip's neural engine, which can perform 600 billion operations per second.
As a result, Face ID can unlock the device or authorize payments in a split second.
Information
Comparison score is generated for authentication
Notably, Face ID assigns a similarity score after comparing the scanned face with the stored model. If this score is higher than a pre-defined threshold, the authentication succeeds.
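As a rough illustration of threshold-based matching, the Swift sketch below compares a fresh scan's embedding against the enrolled one using cosine similarity and accepts only if the score clears a cut-off. The 0.98 threshold and the vectors are invented for the example; Apple does not publish its actual scoring function.

```swift
import Foundation

// Hedged sketch of threshold-based matching: compare the fresh scan's
// embedding with the enrolled one via cosine similarity and accept only if
// the score clears a preset cut-off. The 0.98 threshold and the vectors are
// invented; Apple does not publish its actual scoring function.
func cosineSimilarity(_ a: [Float], _ b: [Float]) -> Float {
    let dot = zip(a, b).reduce(Float(0)) { $0 + $1.0 * $1.1 }
    let magA = a.map { $0 * $0 }.reduce(0, +).squareRoot()
    let magB = b.map { $0 * $0 }.reduce(0, +).squareRoot()
    return (magA > 0 && magB > 0) ? dot / (magA * magB) : 0
}

func isMatch(scan: [Float], enrolled: [Float], threshold: Float = 0.98) -> Bool {
    return cosineSimilarity(scan, enrolled) >= threshold
}

let storedModel: [Float] = [0.51, 0.48, 0.55, 0.50]
let freshScan: [Float]   = [0.50, 0.49, 0.54, 0.51]
print(isMatch(scan: freshScan, enrolled: storedModel) ? "Unlock" : "Try again or enter passcode")
```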