Reddit faces class action suit for hosting child pornography
Reddit is facing an unprecedented lawsuit under the Fight Online Sex Trafficking Act and the Stop Enabling Sex Traffickers Act (FOSTA-SESTA), bills signed into law by then-US President Donald Trump in 2018. The self-proclaimed front page of the internet is being sued by an unnamed woman for failing to remove child sexual abuse content, or more precisely, pornographic images of her posted by her ex-boyfriend.
Victim contends Reddit wrongfully monetized child sex abuse content
The suit contends that Reddit ran advertisements on the platform while acknowledging the presence of underage pornographic content, which the suit argues makes that content a "commercial sex act" under the law. By this logic, Reddit's alleged knowledge of such pornographic material on an ad-monetized platform violates the FOSTA-SESTA provisions.
Lawsuit invokes 2018 FOSTA-SESTA amendments to deny Reddit safe harbor
The woman claims that Reddit's inaction violates the FOSTA-SESTA amendments to Section 230 of the Communications Decency Act. Section 230 generally shields platforms from liability for user-generated content; the FOSTA-SESTA amendments strip that safe harbor protection when user-generated "sex trafficking" content is involved, exposing Reddit to legal ramifications.
Reddit accused of failing to stop repeated policy violations
"Because Reddit refused to help, it fell to Jane Doe to monitor no less than 36 subreddits — that she knows of — which Reddit allowed her ex-boyfriend to repeatedly use to repeatedly post child pornography," said the woman in her lawsuit against Reddit.
Woman claims Reddit was slow to scrub content off platform
The woman says she asked Reddit moderators to scrub the pornographic photos her ex-boyfriend had uploaded when she was 16 and legally still a minor. The lawsuit claims that Reddit took "several days" to comply and ban the perpetrator, and that the platform allegedly could not stop the ex-boyfriend (whose age hasn't been revealed) from creating new accounts to re-upload the photos.
Reddit spokesperson denies allegations, claims the platform is in compliance
"We deploy both automated tools and human intelligence to proactively detect and prevent the dissemination of CSAM material. When we find such material, we purge it and permanently ban the user from accessing Reddit," the company clarified in a statement to The Verge.
Reddit claims it goes 'above and beyond' with automated moderation
Meanwhile, Reddit has denied the suit's claim that it condones child sexual abuse, countering that it goes "above and beyond" by using both automated tools and human moderation to remove such content. Reddit says it removes illegal content and bans perpetrators, while reporting offending users and preserving data for legal proceedings.
Reddit faces class action lawsuit for hosting revenge porn
The plaintiff is now pursuing this as a class action lawsuit covering all victims of nonconsensual pornographic images posted on Reddit. The practice, widely dubbed "revenge porn," involves jilted lovers posting sexually compromising photos of former partners to loosely moderated platforms such as Reddit in order to get even. The class action suit could have serious ramifications for Reddit.