After years of on-and-off temporary suspensions, Instagram permanently banned Pornhub’s account in September. Then, for a short period of time this weekend, the account was reinstated. By Tuesday, it was permanently banned again.
“This was done in error,” an Instagram spokesperson told TechCrunch. “As we’ve said previously, we permanently disabled this Instagram account for repeatedly violating our policies.”
Instagram’s content guidelines prohibit nudity and sexual solicitation. A Pornhub spokesperson told TechCrunch, though, that they believe the adult streaming platform’s account did not violate any guidelines. Instagram has not commented on the exact reasoning for the ban, or which policies the account violated.
It’s worrying from a moderation perspective if a permanently banned Instagram account can accidentally get switched back on. Pornhub told TechCrunch that its account even received a notice from Instagram, stating that its ban had been a mistake (that message itself was apparently sent in error).
The conflict over Pornhub’s Instagram account illustrates a larger issue impacting creators like sex educators, pole dancers and sex workers. Even when their posts do not violate content guidelines, many have found that their accounts are prone to being disabled or suspended. Adult stars may not be able to share NSFW content on Instagram, but the safe-for-work accounts they maintain there are often how they communicate with their large followings.
Due to U.S. legislation like SESTA/FOSTA, which passed in 2018, it has become harder for online sex workers to safely and legally make a living. The law is positioned as a way to curb sex trafficking, but in practice, it has been shown to make sex work less safe. Since the legislation carves out an exception to Section 230 that holds online platforms liable for facilitating prostitution and trafficking, social networks and credit card processors alike have become skittish about running afoul of the law. Yet as of 2021, federal prosecutors had used the legislation only once, according to a government report.
When Pornhub’s account received a permanent suspension in September, the platform shared an open letter to Meta executives on Twitter.
“Pornhub’s safe-for-work account has been disabled for three weeks,” the letter reads. “In the interim, Kim Kardashian has posted her fully exposed ass to her 330 million followers without any restrictive action from Instagram. We are happy to see that Kim and the artistic team behind the image are free to share their work on the platform, but question why we are denied the same treatment.”
Now, a Pornhub spokesperson continues to speak out against what the company called Instagram’s “arbitrarily and selectively-enforced ‘standards.’”
“Within hours of reinstating our Instagram account, Meta has demonstrated that its policies have no rhyme or reason when they deactivated our account again, despite not violating any guidelines,” the spokesperson told TechCrunch. “… Meta and Instagram’s haphazard and irrational enforcement of their policies place undue hardships on the livelihoods of those in the adult industry, an already marginalized group. We call on Meta to once again reverse its decision.”
Pornhub’s parent company MindGeek is currently in the midst of multiple lawsuits alleging that it knowingly profited off of child sexual abuse material (CSAM). In December 2020, the platform removed all non-verified content, and it now requires those who upload videos to verify their identity.
Online platforms are legally required to report incidents of CSAM to the National Center for Missing and Exploited Children’s CyberTipline, though this does not necessarily require companies to proactively seek out CSAM to flag and remove. According to a report from 2021, Meta reported 22 million instances of CSAM on Facebook and over 3 million on Instagram. MindGeek made 13,229 reports.
Source @TechCrunch