Johnsen’s experience is common in the pro-choice activist community. Most people who spoke to WIRED say their content appears to have been automatically removed by AI, rather than being reported by another user.
Activists also worry that even if content isn’t removed entirely, its reach may be limited by the platforms’ AI.
While it’s nearly impossible for users to discern how Meta’s AI moderation is being applied to their content, last year the company announced it would de-emphasize political and news content in users’ News Feeds. Meta did not respond to questions about whether abortion-related content is categorized as political content.
Just as the abortion activists who spoke to WIRED experienced varying degrees of moderation on Meta’s platforms, so do users in different parts of the world. WIRED experimented with posting the same phrase, “Abortion Pills Available by Mail,” from Facebook and Instagram accounts in the UK, US, Singapore, and the Philippines, in English, Spanish, and Tagalog. Instagram removed the English-language posts made from the US, where abortion access was recently restricted in some states following the Supreme Court’s ruling overturning Roe v. Wade, and from the Philippines, where abortion is illegal. But the Spanish-language post made from the US and the Tagalog post made from the Philippines remained up.
The phrase stayed up on both Facebook and Instagram when posted in English from the UK. When it was posted in English from Singapore, where abortion is legal and widely available, it remained on Instagram but was flagged on Facebook.
Courtesy of Kenneth Dimalibot
Ensley told WIRED that Reproaction’s Instagram campaigns on abortion access in Spanish and Polish have been very successful and haven’t seen any of the problems the group’s English-language content has faced.
“Meta particularly relies on automated systems that are extremely sensitive in English and less sensitive in other languages,” says Katharine Trendacosta, associate director of policy and advocacy at the Electronic Frontier Foundation.
WIRED also tested Meta’s moderation with a Schedule 1 substance that is legal for recreational use in 19 states and for medical use in 37, posting the phrase “Marijuana is available by mail order” on Facebook in English from the US. The post was not flagged.
“Content moderation using artificial intelligence and machine learning takes a long time to set up and a lot of effort to maintain,” said a former Meta employee familiar with the company’s content moderation practices, who spoke on condition of anonymity. “As circumstances change, you have to change the model, but that takes time and effort. So when the world is changing rapidly, those algorithms often don’t work at their best and can be applied with less precision during periods of intense change.”
But Trendacosta also worries that law enforcement could flag the content for removal. In its 2020 transparency report, Meta stated that it “restricted access to 12 items in the United States reported by various state attorneys general related to the promotion and sale of regulated goods and services, and to 15 items reported by a US state attorney general for allegedly engaging in price gouging.” All of the posts were later restored. “For prosecutors to be able to just say to Facebook, ‘Take this stuff down,’ and have Facebook do it, even if they end up putting it back, that’s incredibly dangerous,” Trendacosta says.
Meta spokesperson Andy Stone told WIRED that the company has not changed its moderation policies in response to the overturning of Roe v. Wade, and said the company is working on a fix. In response to a Motherboard article about the moderation of abortion-related content, he tweeted that Meta does not allow content that attempts to “buy, sell, trade, gift, solicit or donate pharmaceutical products,” but does allow posts that discuss the “affordability and availability” of prescription drugs. He added: “We have discovered some cases of incorrect application and are correcting them.” On June 28, Instagram publicly acknowledged that sensitivity screens had been added to several posts about abortion, calling it a “bug” and saying the platform was working to fix it.
Meta spokesperson Dani Lever did not respond to WIRED’s questions about whether the company would invest in more human moderators to handle abortion-related content, or whether it would apply the same standards to this content across different countries, but confirmed that Meta has since fixed the issues that caused Instagram posts to be flagged and removed.
Confusion over Meta’s handling of abortion-related content has some activists reckoning with the downsides of relying on a single company’s social platforms. “For progressives, Facebook was about creating your own community and being able to organize when I first started in 2007,” says Robin Marty, author of The New Handbook for a Post-Roe America and director of operations at the West Alabama Women’s Center. “It was a specific place where we all met to organize online. And so the very tools that we have been given, and that we have been using for more than a decade to accomplish this work, are now being taken away from us.”