SAN FRANCISCO — Meta on Tuesday agreed to change its ad technology and pay a $115,054 fine as part of a settlement with the Justice Department over allegations that the company’s ad systems discriminated against Facebook users by limiting who could see real estate ads on the platform based on their race, sex and postal code.
Under the agreement, Meta, the company formerly known as Facebook, said it would change its technology and use a new computer-assisted method that aims to regularly check whether people who are targeted and eligible to receive real estate ads are actually seeing those ads. The new method, called a “variance reduction system,” relies on machine learning to ensure that advertisers are delivering housing-related ads to specific protected classes of people.
“Meta will – for the first time – change its ad serving system to address algorithmic discrimination,” Damian Williams, the U.S. Attorney for the Southern District of New York, said in a press release. “But if Meta fails to demonstrate that it has modified its delivery system sufficiently to guard against algorithmic bias, this office will pursue litigation.”
Facebook, which has become a commercial colossus by collecting data from its users and allowing advertisers to target ads based on the characteristics of an audience, has faced complaints for years that some of these practices are biased and discriminatory. The company’s advertising systems allowed marketers to choose who saw their advertisements using thousands of different characteristics, which also allowed those advertisers to exclude people in a number of protected categories, such as race, sex and age.
The Justice Department filed its lawsuit and settlement against Meta on Tuesday. In its complaint, the agency said it had concluded that “Facebook could achieve its interests by maximizing its revenue and providing users with relevant advertising through less discriminatory means.”
Although the settlement relates specifically to real estate ads, Meta said it also plans to apply its new system to check the targeting of employment and credit-related ads. The company has previously faced backlash for allowing bias against women in job postings and for excluding certain groups of people from seeing credit card ads.
The issue of biased ad targeting has been especially contentious in housing. In 2016, the potential for advertising discrimination on Facebook was revealed in an investigation by ProPublica, which showed that the company’s technology made it easy for marketers to exclude specific ethnic groups for advertising purposes.
In 2018, Ben Carson, then the secretary of the Department of Housing and Urban Development, announced an official complaint against Facebook, accusing the company of having ad systems that “unlawfully discriminate” based on categories such as race, religion and disability. In 2019, HUD sued Facebook for engaging in housing discrimination and violating the Fair Housing Act. The agency said Facebook’s systems did not deliver ads to “a diverse audience,” even if an advertiser wanted the ad to be seen widely.
“Facebook discriminates against people based on who they are and where they live,” Mr. Carson said at the time. “Using a computer to limit someone’s housing choices can be just as discriminatory as slamming a door in someone’s face.”
The Justice Department’s lawsuit and settlement are based in part on HUD’s 2019 discrimination investigation and accusation against Facebook.
In its own testing related to the matter, the US Attorney’s Office for the Southern District of New York found that Meta’s ad systems steered real estate listings away from certain categories of people, even when advertisers did not intend to do so. The ads were directed “disproportionately toward white users and away from black users, and vice versa,” according to the Justice Department’s complaint.
Many real estate ads in neighborhoods where most people were white were also directed primarily at white users, while real estate ads in predominantly black areas were served primarily to black users, the complaint added. As a result, according to the complaint, Facebook’s algorithms “actually and predictably reinforce or perpetuate segregated housing patterns.”
In recent years, civil rights groups have also pushed back against the large and complex advertising systems that underpin some of the largest internet platforms. The groups argued that these systems have inherent biases and that tech companies like Meta, Google and others should do more to combat those biases.
The area of study, known as “algorithmic fairness”, has been a topic of significant interest among computer scientists in the field of artificial intelligence. Leading researchers, including former Google scientists like Timnit Gebru and Margaret Mitchell, have been sounding the alarm about these biases for years.
In the years that followed, Facebook narrowed the types of categories marketers could choose from when buying real estate listings, reducing the number to the hundreds and eliminating targeting options based on race, age and postal code.
Chancela Al-Mansour, executive director of the Housing Rights Center in Los Angeles, said it was “essential” that “fair housing laws be aggressively enforced”.
“Real estate listings had become tools of illegal behavior, including segregation and discrimination in housing, employment and credit,” she said. “Most users had no idea they were either being targeted or denied based on their race and other characteristics.”
Meta’s new ad technology, which is still in development, will occasionally check who is receiving ads for housing, jobs, and credit, and ensure that those audiences match the people marketers want to target. If the ads being served start to be heavily skewed towards white males in their twenties, for example, the new system will theoretically recognize this and shift the ads to serve more evenly among larger and more varied audiences.
“We’ll occasionally take a snapshot of marketers’ audiences, see who they’re targeting, and remove as much variance from that audience as possible,” Roy L. Austin, Meta’s vice president of civil rights and associate general counsel, said in an interview. He called it a “significant technological advancement in how machine learning is used to deliver personalized ads.”
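The public description above amounts to a measure-and-compensate loop: periodically compare who is actually seeing an ad against who was eligible to see it, then adjust delivery to shrink the gap. The sketch below is purely illustrative of that idea, not Meta’s actual implementation; the function name, the group labels and the use of simple share ratios as reweighting factors are all invented for this example.

```python
def variance_reduction_weights(eligible, served):
    """Illustrative sketch only (not Meta's system): compare the group
    mix of users eligible for an ad with the mix of users who actually
    saw it, and return per-group factors that would nudge future
    delivery back toward the eligible mix (>1 means under-served)."""
    total_eligible = sum(eligible.values())
    total_served = sum(served.values())
    eligible_share = {g: n / total_eligible for g, n in eligible.items()}
    served_share = {g: served.get(g, 0) / total_served for g in eligible}
    # A group that received a smaller share of impressions than its
    # share of the eligible audience gets a weight above 1.
    return {
        g: (eligible_share[g] / served_share[g]) if served_share[g] else float("inf")
        for g in eligible
    }

# Hypothetical numbers: the eligible audience is split 50/50,
# but delivery skewed 80/20 toward group_a.
weights = variance_reduction_weights(
    eligible={"group_a": 5000, "group_b": 5000},
    served={"group_a": 800, "group_b": 200},
)
# group_a is over-served (weight 0.625); group_b under-served (weight 2.5).
```

In a real delivery system the comparison would feed back into the ad auction rather than being a one-shot calculation, and — as the article notes — the actual system relies on machine learning over aggregate measurements rather than a direct ratio like this.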
Meta said it would work with HUD over the next few months to integrate the technology into Meta’s ad targeting systems, and agreed to a third-party audit of the new system’s effectiveness.
The company also said it would no longer use a feature called “special ad audiences,” a tool it had developed to help advertisers broaden the groups of people their ads reached. The Justice Department said the tool also engaged in discriminatory practices. Meta said the tool was an early effort to fight bias and that its new methods would be more effective.
The $115,054 penalty Meta agreed to pay in the settlement is the maximum available under the Fair Housing Act, the Justice Department said.
“The public should know that Facebook’s latest abuse was worth the same amount of money Meta makes in about 20 seconds,” said Jason Kint, chief executive of Digital Content Next, an association of premium publishers.
As part of the settlement, Meta admitted no wrongdoing.