How Facebook let advertisers target ‘white genocide’ conspiracy theorists


November 2, 2018 10:26 pm

Credit: Illustration by Tam Nguyen/Ad Age

Facebook let another racist trope slip into its advertising platform, this time allowing marketers to select a target audience of people interested in the bogus “white genocide” theory, it was revealed Friday.

After running a series of tests on Facebook’s automated platform, The Intercept discovered the ability to target people who had demonstrated an interest in the conspiracy. Facebook said this specific audience — people who believe outside influences are colluding to destroy the white race — should never have been created in its system.

This wasn’t the first time Facebook’s ad tools were used to create offensive targeting groups. A year ago, ProPublica found that advertisers could target “Jew haters” and other anti-Semitic terms.

So, what happened?

In this case, under the “interests” category, which advertisers use to direct ads to a relevant audience, the term “white genocide conspiracy theory” was presented as a group available for targeting. Facebook said 168,600 people fell into that bucket. Those were people who “expressed an interest in or like pages related to ‘white genocide conspiracy theory.'”

How does this kind of target category even pop up?

In this case, humans had to approve the category. Interest-based target audiences are updated all the time so marketers can use current references to reach people. For instance, a new pop star or movie might need to be added as an interest for targeting. Also, popular conversations permeating the culture can lead to new interest-based target audiences, as in this case. The interests are sometimes suggested by an algorithm that looks at terms and hashtags from new pages on Facebook. Ultimately, however, all the interest-based categories pass through human approval, and this one should not have been accepted, according to Joe Osborne, a Facebook spokesman.
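
To make the workflow described above concrete, here is a minimal sketch of an automated-suggestion pipeline with a human approval gate. All names here are hypothetical illustrations, not Facebook's actual code.

```python
# Hypothetical sketch of the process described above: an algorithm
# proposes interest categories from page terms/hashtags, and a human
# reviewer must approve each one before advertisers can target it.
from dataclasses import dataclass

@dataclass
class InterestCandidate:
    term: str           # e.g. a term or hashtag pulled from a new page
    audience_size: int  # number of users who match the term

class InterestReviewQueue:
    def __init__(self) -> None:
        self.pending: list[InterestCandidate] = []
        self.live: list[InterestCandidate] = []

    def suggest(self, term: str, audience_size: int) -> None:
        """Algorithmic step: queue a candidate for human review."""
        self.pending.append(InterestCandidate(term, audience_size))

    def review(self, candidate: InterestCandidate, approve: bool) -> None:
        """Human step: nothing becomes targetable without approval."""
        self.pending.remove(candidate)
        if approve:
            self.live.append(candidate)  # now available to advertisers
```

In these terms, the failure happened at the review step: a candidate that should have been rejected was approved and became a live targeting option.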

“It’s against our ads principles and never should have been in our system to begin with,” Osborne said in a prepared statement from the company. “We deeply apologize for this error.”

Where did the ‘white genocide’ term originate?

This category was approved by Facebook in August, around the time President Trump was tweeting about alleged “white genocide” in South Africa. There have been media reports of racial tension in South Africa, and rising fear, particularly on the right wing of the political spectrum, that the country’s white population was under threat. The term “white genocide” is often thrown around by white supremacists looking to stoke fear about minorities. The alleged Pittsburgh synagogue shooter appeared to harbor beliefs about Jewish people orchestrating a genocide against white people.

Who belongs to this group?

Facebook does not say exactly how it comes up with the groups, but there are pages on the site with content related to the subject. The Intercept said that, as further targeting options, Facebook’s automated system suggested interests in far-right media figures such as Tucker Carlson. The interest group related to white genocide conspiracies has been removed from Facebook.

Were any ads bought for that group?

Yes. Facebook said a few advertisers found the interest group and used it for their campaigns. The ads themselves were not offensive, however, and would have been approved under any targeting parameters, Osborne said. One ad was for a lecture about conspiracies and another was about news stories that referenced “white genocide,” according to Osborne. Facebook would not have checked those campaigns for their targeting parameters because all the options were presumably vetted before entering the system.

Is Facebook’s ad platform all better now?

No. This is a complex problem. ProPublica was previously able to find “Jew haters” as a targeting option because thousands of Facebook’s more than 2 billion users had typed that term into the fields asking for their “job” and “education” experiences. Facebook did not account for the likelihood that those self-reported fields could be abused when it opened them up for targeting. Since then, Facebook has reviewed and approved all employment and education targeting categories before they go live. These kinds of system flaws are common throughout digital advertising, where it’s all too easy for automated platforms to drop the ball. For instance, the term “kill all Jews” trended on Twitter on Friday because of a proliferation of news stories about anti-Semitism in Brooklyn that referenced the phrase. Twitter removed the trend and apologized as well.
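
By contrast, the earlier ProPublica flaw is easy to model: self-reported free-text fields flowed straight into the targeting index with no approval step at all. A hypothetical toy example, with invented data and names:

```python
# Hypothetical illustration of the flaw ProPublica found: free-text
# profile fields surfaced directly as targeting options, so any string
# typed by enough users became targetable. Invented data; not real code.
from collections import Counter

profiles = [
    {"job": "teacher"},
    {"job": "teacher"},
    {"job": "<abusive phrase>"},   # free text goes in unvetted
    {"job": "<abusive phrase>"},
]

MIN_AUDIENCE = 2  # stand-in for "thousands of users" in the real system

job_counts = Counter(p["job"] for p in profiles)
targetable = [term for term, n in job_counts.items() if n >= MIN_AUDIENCE]
print(targetable)  # ['teacher', '<abusive phrase>'] -- the abuse slips through

# The fix the article describes adds a human approval gate, so a
# self-reported term only becomes a category after manual review.
```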
