Teenagers on Facebook can be targeted by adverts endorsing alcohol, drugs, gambling, smoking and eating disorders, according to a report by a watchdog group. The Tech Transparency Project created six test adverts and submitted them to Facebook, saying it wanted to reach users aged 13 to 17.
Facebook approved all the ads within hours; one promoting pill parties was approved in just 43 minutes.
“This is an easy fix, and Facebook should have had the foresight to make it a long time ago,” said Tech Transparency Project director Katie Paul. “Whether this was an oversight or a money-grab is not important. It’s completely unacceptable.”
As you scroll around Facebook and the wider web, its algorithms keep tabs on your behaviour. Eventually, it places you into categories based on what it’s observed about you: your political leanings, your favourite music, your interests and hobbies, and so on. This is what draws advertisers, who want to show ads tailored to these groups.
But many users are unaware that Facebook can infer everything from their race to their sexuality or relationship status just from their online activity. Moreover, several of these categories are inappropriate for minors. The report found that Facebook used teenagers’ behaviour to place them in interest categories for “alcoholic beverages,” “extreme weight loss,” and “tobacco,” even noting if the teens were single so they could be targeted by dating site ads.
All Facebook users are placed in interest categories. But minors under 18 aren’t supposed to be placed in certain adult categories. Facebook has been in hot water for showing inappropriate adverts to children since at least 2014. As recently as 2019, an investigation by The Guardian found that children were still being labelled as interested in tobacco and alcohol.
Reporters have uncovered other issues with the company’s algorithmically created categories. In 2017, a ProPublica report found that the company was permitting advertisers to target users who listed their own occupation as “jew hunters.”
The next year Facebook apologised for indicating that thousands of users in Russia were “interested in treason.” Then, in 2019, Facebook settled with civil rights groups who alleged the company allowed advertisers to discriminate against certain groups when posting ads for jobs and housing.
Facebook has guardrails in place to stop such adverts from being shown to underage users, but TTP’s director says the test adverts were approved “in a matter of hours.”
“There’s absolutely no reason why Facebook should have tagged nearly a million teens as potentially interested in ‘alcoholic beverages’ and other categories,” Paul said.
Facebook did not comment before this article was initially published. After the article was published, a spokesperson said in a statement, “We’re investigating why some of these violating adverts were not detected. We prohibit adverts about alcohol, weight loss products and certain other topics from being shown to people under the age of 18, and we have age restriction tools so that businesses can better control who sees their content. We also may re-review ads after they are live.”
TTP created six test adverts, each designed around a topic users under 18 are not supposed to see. These include an ad for “ana tips” (“ana” is a well-known abbreviation for anorexia), which TTP says it targeted at users that Facebook classifies as being interested in “extreme weight loss” and “diet food.” A fake vaping ad targeted underage users classified as interested in “electronic cigarettes” and “tobacco.” Advertisers aren’t permitted to target users under 18 with dating site ads, but TTP’s test ad was approved in only two hours.
In addition to creating the categories, Facebook also shows advertisers its “estimated reach,” the number of users who may see any ad once it’s placed. Facebook estimated as many as 900,000 users would see the alcohol ad, while as many as 5 million would see the dating site ad. Unless the social network promptly corrects how it enforces its own rules around ad placement, the group warns, Facebook is “positioned to profit from harmful messages … aimed at a vulnerable age group.”
- A Wired report