Meta's new AI ad tech aims to reduce discrimination
What's the story
In a bid to address concerns about its ads being discriminatory, Meta has rolled out changes to ensure that the real audience for an ad closely matches the target audience.
Dubbed Variance Reduction System (VRS), it is meant to ensure that ads are not biased toward certain cultural groups.
The update is part of a settlement with the Department of Justice (DOJ).
Context
Why does this story matter?
Even in this day and age, discriminatory ads, or ads that favor one demographic group over another, remain rampant.
Meta's new AI tech sends a strong message against such ads. It will also inspire other companies to act against discriminatory advertisements.
Although the tech itself is exciting, we will have to see how well it performs before making any tall claims.
Ad targeting
Advertisers used ad tools to discriminate against certain groups
Meta allows advertisers to target people based on interests, demographics, and behavior. An 'ad auction' then decides which ad to show a user at a particular time.
Although the company had rules in place against discriminatory ads, it received complaints that advertisers were potentially using ad tools to exclude people from certain services, including housing and employment opportunities.
VRS
VRS compares actual audience against targeted audience
With VRS, Meta aims to ensure that the audience that actually sees a housing, credit, or employment ad closely reflects the targeted audience.
The company will achieve this by measuring the audience for a particular ad and then comparing it with the demographic distribution (age, gender, or estimated race) of the audience the advertiser selected.
The new system is based on aggregate data.
Application
The system will tweak the auction value of ads
Once VRS has the data, it will tweak the ad's auction value so the ad is shown more or less often to a certain group. Initially, Meta is applying the new system to housing ads.
The company plans to expand it to employment and credit ads over the next year.
Per Meta, the system won't compromise privacy as it is based on aggregate data.
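The two steps described above, aggregating the demographic distribution of an ad's actual audience and nudging the auction value toward the advertiser's targeted distribution, can be sketched roughly as follows. All function names, group labels, and the multiplicative adjustment rule are illustrative assumptions, not Meta's actual implementation.

```python
# Hypothetical sketch of the variance-reduction idea: compare the
# actual audience distribution with the targeted one, then adjust
# the ad's auction value per group. Purely illustrative.

def measure_distribution(impressions):
    """Turn aggregate impression counts per group into shares."""
    total = sum(impressions.values())
    return {group: count / total for group, count in impressions.items()}

def adjust_auction_values(base_value, target_dist, actual_dist):
    """Raise the auction value for under-served groups and lower it
    for over-served ones (a simple multiplicative rule, assumed)."""
    adjusted = {}
    for group, target_share in target_dist.items():
        actual_share = actual_dist.get(group, 0.0)
        ratio = target_share / actual_share if actual_share > 0 else 2.0
        # Clamp the correction so values stay within a sane range.
        adjusted[group] = base_value * min(max(ratio, 0.5), 2.0)
    return adjusted

# Example: advertiser targeted a 50/50 split, but group_a is over-served.
target = {"group_a": 0.5, "group_b": 0.5}
actual = measure_distribution({"group_a": 80, "group_b": 20})
values = adjust_auction_values(1.0, target, actual)
```

In this toy example, the under-served `group_b` ends up with a higher auction value than `group_a`, so the ad would be shown to it more often. Note the sketch only ever sees aggregate counts per group, mirroring the article's point that the system works on aggregate data rather than individual profiles.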
Settlement
DOJ accused Meta in 2019 of enabling housing discrimination
The rollout of VRS comes six months after Meta agreed to settle DOJ charges. In 2019, the department had accused Meta of enabling housing discrimination through ad targeting.
It was the first time the DOJ moved against algorithmic bias. As part of the settlement, Meta agreed to shelve its 'Special Ad Audiences' tool that allegedly used a biased algorithm.