Facebook promised to remove “sensitive” ads. Here’s what it left behind.

Late last year, after facing years of criticism for its practices, Facebook announced a change to its multibillion-dollar advertising system: Companies that buy ads would no longer be able to target people based on interest categories such as race, religion, health status, politics, or sexual orientation.

More than three months after the change supposedly took effect, however, The Markup found that such ad targeting is still available on Facebook’s platform. Some obvious ad categories have indeed been removed, such as “Young Conservatives”, “Rachel Maddow”, “Hispanic Culture” and “Hinduism” – all categories we found as options on the platform in early January that have since disappeared. Other obvious indicators of race, religion, health status and sexual orientation, however, remain.

As early as 2018, CEO Mark Zuckerberg told Congress that the company had “removed the ability to exclude ethnic groups and other sensitive categories from ad targeting. So it’s no longer a feature that’s even available.”

The Markup found, however, that while “Hispanic culture” was removed, for example, “Spanish language” was not. “Tea Party Patriots” was removed, but “Tea party” and “The Tea Party” were still available. “Social Equality” and “Social Justice” are gone, but advertisers could still target “Social Movement” and “Social Change”.

Starbucks, for example, was still able to use options remaining after the change to place an ad for its pistachio latte aimed at users interested in “contemporary R&B”, “telenovela”, “Spanish language” and “K-pop”, all proxies for Black, Latino, and Asian audiences on Facebook.

Facebook hasn’t explained how it determines which ad options are “sensitive” and, in response to questions from The Markup, declined to detail how it makes those decisions. But in the days after The Markup contacted Facebook for comment, the company removed several potentially sensitive ad targeting options we had flagged.

“Removing sensitive targeting options is an ongoing process, and we’re constantly reviewing available options to make sure they meet people’s evolving expectations of how advertisers can reach them on our platform,” Dale Hogan, a spokesperson for Facebook’s parent company, Meta, said in a statement. “If we discover additional options that we deem sensitive, we will remove them.”

Facebook’s ad targeting system is the not-so-secret key to the company’s massive financial success. By tracking the interests of users online, the company promises, advertisers can find the people most likely to pay for their products and services and show ads directly to them.

But the company has faced backlash for offering advertisers “interest” categories that touch on more fundamental – and sometimes deeply personal – details about a user. These interests can be used in surprising ways to discriminate, from excluding people of color from real estate listings to fueling political bias to targeting users with specific illnesses.

Facebook critics say the company has had ample opportunity to fix problems with its ad system, and that trying to repair the platform by removing individual “sensitive” terms masks an underlying problem: the company’s platform may simply be too big and unwieldy to fix without more fundamental changes.

Removing a handful of terms it deems sensitive isn’t enough, according to Aleksandra Korolova, an assistant professor of computer science at the University of Southern California.

“It’s obvious to everyone in the field that it’s not a complete solution,” she said.

Clear proxies for deleted terms are still on Facebook

The Markup gathered a list of potentially sensitive terms starting in late October, before Facebook’s removal of interest targeting options took effect. The data was collected through Citizen Browser, a project in which The Markup receives Facebook data from a national panel of Facebook users.

We also put together a list of terms that Facebook’s tools recommended to advertisers when they entered a potentially sensitive term – the company suggested “BET” and “Essence (magazine)” when advertisers searched for “African American culture”, for example.

Then, also using Facebook’s tools, we calculated how similar the terms were to their suggestions by comparing the number of users the ads were expected to reach, what Facebook calls an “audience”. (See the details of our analysis on GitHub.)

To find gaps in Facebook’s cleanup process, we then searched those terms again in Facebook’s public advertising tools at the end of January to see which ones the company had removed following its change.

In some cases, we found that options still available reached almost exactly the same users as options removed. “BET,” the acronym for Black Entertainment Television, was removed, but “BET Hip Hop Awards,” which was previously recommended with BET and had a 99% audience overlap, was still available.
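How might an overlap figure like that be computed? As a rough illustration only (this is not The Markup’s actual analysis code, which is linked above on GitHub), the sketch below uses hypothetical audience-size estimates and simple inclusion-exclusion arithmetic to estimate what share of one interest’s audience also appears in another’s; the interest names and numbers are made up for the example.

```python
# Minimal sketch: estimate audience overlap between two interest-targeting
# options from audience-size estimates. All figures below are hypothetical.

def overlap_share(size_a: int, size_b: int, size_a_or_b: int) -> float:
    """Share of the smaller audience that also belongs to the other audience.

    Uses inclusion-exclusion: |A ∩ B| = |A| + |B| - |A ∪ B|, where
    `size_a_or_b` is the estimated reach when both interests are targeted together.
    """
    intersection = size_a + size_b - size_a_or_b
    return intersection / min(size_a, size_b)

# Hypothetical audience estimates (number of users) for two interest options.
audiences = {
    "Interest A": 41_000_000,
    "Interest B": 38_000_000,
    "A or B combined": 41_500_000,  # reach when targeting either interest
}

share = overlap_share(
    audiences["Interest A"],
    audiences["Interest B"],
    audiences["A or B combined"],
)
print(f"Estimated overlap: {share:.0%} of the smaller audience")  # -> 99%
```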

“Gay pride” was also removed as an option, but by using the term “RuPaul’s Drag Race”, advertisers could still reach more than 13 million of the same users.

These proxies were not just theoretically available to advertisers on Facebook. The Markup found that companies were actively using them to target ads to people on the social network. Using Citizen Browser, we found several examples of proxies for race and political affiliation being used for targeting.

Ancestry, the genealogy service, for example, targeted ads using the terms “telenovela”, “BET Hip Hop Awards”, “African culture” and “Afrobeat”.

Facebook removed “Fox News Channel” as a targeting option that could reach conservative users, but we saw conservative satire website The Babylon Bee target an ad ridiculing Anthony Fauci using the then-still-available interest category “Judge Jeanine Pirro”, a Fox News personality.

Prior to Fox News Channel’s removal, we found that 86% of users tagged with an interest in Judge Jeanine Pirro were also tagged with an interest in the cable news network.

Facebook also hasn’t completely phased out targeting based on medical conditions, we found. “Autism Awareness” was removed, but “Epidemiology of Autism” was still available. “Diabetes mellitus awareness” was removed, but the closely related “sugar substitute” was not. We found an ad from Medtronic, a medical device company, using the term to promote an insulin pen for diabetes management on Facebook.

Even Facebook itself used the proxies. We found an ad placed by the company promoting its groups feature to users interested in “Vibe (magazine)”, a stand-in for removed terms that targeted Black audiences.

Starbucks, Ancestry and The Babylon Bee did not respond to requests for comment on their ad targeting practices. Pamela Reese, a spokeswoman for Medtronic, said the company has stopped using “sugar substitute” as a targeting option and that Medtronic adheres to FDA regulations on medical device advertising.

The Markup provided several examples of these potential proxy terms to Facebook, including “telenovela”, “BET Hip Hop Awards”, “RuPaul’s Drag Race”, and “Judge Jeanine Pirro”. The company quietly removed them after we sent our request for comment.

Facebook critics like Korolova say Facebook has a habit of promising significant changes to its advertising platform and then breaking those promises. Research has documented problems with proxy targeting in its ads for years, and Facebook could have taken stronger action to fix them, she argues.

“If they wanted to, they could do better,” Korolova said.

Facebook says its recent changes were necessary to prevent abuse, but some organizations that say they use Facebook for social good have complained that the new policies put up barriers to their work. Climate activists and medical researchers have complained that the changes have limited their ability to reach relevant audiences.

Daniel Carr, a recruitment consultant for SMASH Labs, a medical research group that uses Facebook to recruit gay and bisexual men for studies, said the recent changes forced them to shift from terms such as “LGBT culture” to pop culture references such as “RuPaul’s Drag Race”. Carr said study enrollment has held steady, but the change has not made things easier for them.

“It made things more complicated on our end, and it actually didn’t change anything, except that Facebook can now say, ‘We don’t allow you to be targeted by these things,’” Carr said. “It’s a political gesture, if ever there was one.”

Angie Waller manages The Markup’s Citizen Browser project, its custom application that monitors what is served algorithmically to a paid panel of Facebook users in the United States and Germany. Colin Lecher is a journalist at The Markup.

Header artwork by Gabriel Hongsdusit is republished with permission from The Markup.
