It’s hard to shock Collin Williams, a volunteer moderator for r/RoastMe, a comedic insult forum with 2.3 million members. But about a year ago he was surprised by something in his own comment history.
The ghosts of posts he had attempted to remove were still there.
Per 2020 numbers, 52 million people visit Reddit’s site each day and peruse some three million topic-specific forums, or subreddits. Those communities are kept on-topic (and in theory, consistent with Reddit’s content policies), mostly thanks to human volunteers and automated bots, collectively referred to as moderators or “mods.”
As a moderator, Williams tends to a garden of posts and comments, a process that includes removing posts that break subreddit rules. Moderators set their own rules for what is postable in their own forums but also play a role in enforcing Reddit’s content policy.
On r/RoastMe, those moderator-created rules outlaw posting pictures of someone without their permission or of people under 18, for instance. But even after Williams removed offending posts from his subreddit, they were, somehow, still viewable in his account’s comment history.
Then, Williams had an epiphany. This probably wasn’t a problem just on his subreddit.
Williams was right. Across Reddit, when a moderator removes a post, the post is unlisted from the subreddit’s main feed. But images or links within that post don’t actually disappear. Posts removed by moderators are still readily available to anyone on Reddit in the comment history of the moderator who flagged it—complete with an explanation of the rule it violates—or to anyone who retained a direct URL to the post.
“The moderator’s entire history becomes this giant list—basically an index—of everything that they’ve removed and for what reason. Reddit kind of accidentally created this giant index of stuff that humans have flagged as being inappropriate on the site,” Williams said.
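Williams’s observation, that a moderator’s comment history doubles as an index of removed content, can be sketched programmatically. Reddit serves public JSON listings of a user’s comments (appending `.json` to a profile URL is a long-standing Reddit feature). The snippet below is a minimal, hypothetical sketch that filters such a listing for removal-reason comments and collects the permalinks of the posts they point to; the field names (`body`, `link_permalink`) and the `"Removed:"` heuristic are illustrative assumptions, not Reddit’s documented moderation schema.

```python
# Hypothetical sketch: build an index of removed posts from a moderator's
# public comment listing. Field names and the "Removed:" prefix heuristic
# are assumptions for illustration, not a documented Reddit schema.

def index_removed_posts(comment_listing: dict) -> list[tuple[str, str]]:
    """Return (permalink, removal_reason) pairs for comments that look
    like moderator removal notices."""
    results = []
    for child in comment_listing.get("data", {}).get("children", []):
        comment = child.get("data", {})
        body = comment.get("body", "")
        permalink = comment.get("link_permalink", "")
        # Heuristic: removal notices typically state which rule was broken.
        if permalink and body.startswith("Removed:"):
            results.append((permalink, body))
    return results

# Example with a mock listing shaped like a /user/<name>/comments.json response:
mock_listing = {
    "data": {
        "children": [
            {"data": {"body": "Removed: rule 2, no photos of minors",
                      "link_permalink": "https://www.reddit.com/r/example/comments/abc123/"}},
            {"data": {"body": "Nice roast!",
                      "link_permalink": "https://www.reddit.com/r/example/comments/def456/"}},
        ]
    }
}

for link, reason in index_removed_posts(mock_listing):
    print(link, "->", reason)
```

The point of the sketch is that no privileged access is needed: each removal notice publicly pairs a still-reachable post URL with the reason it was flagged.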
This is happening across subreddits. You can still find moderator-removed video game memes or reposts (a major Reddit faux pas) of dog photos. But it also happens on subreddits dedicated to posting sexual or sexualized content. In those cases, this loophole allows posts to persist on the site, even though they violate Reddit’s content policy.
For example, the subreddit-specific rules of r/TikTokThots, a subreddit dedicated to sexualized TikTok videos, explicitly instruct users not to post videos of people presumed to be under 18, in keeping with the Reddit-wide content policy against “the posting of sexual or suggestive content involving minors.” But The Markup was still able to find removed posts in moderators’ comment histories in which certain images, flagged by moderators for depicting presumed minors, were still live and visible.
In the more risqué subreddit r/TikTokNSFW, moderators removed at least one nude image because it “contains minors,” according to moderator judgment, yet the image is still visible on the site through the removing moderator’s comment history.
“This is thus far a huge miss by the engineering and UX team and must be rectified immediately in order to prevent new or recurrent victimization of those featured in the content,” said Sameer Hinduja, co-director of the Cyberbullying Research Center and a professor of criminology at Florida Atlantic University.
Williams privately reached out to Reddit’s r/ModSupport, a point of contact between Reddit administrators (i.e., employees) and volunteer moderators, on Sept. 15, alerting them to this issue. He later received a DM from a Reddit administrator saying, “Thanks for sending this in. We’ll take a look!” But as of publication time, there had been no change.
About one month later, Williams also posted publicly in Reddit’s r/ModSupport to alert other moderators to the potential problem. Reddit’s Community Team removed his post.
Reddit has not responded to multiple requests for comment for this story.
Reddit has continually grappled with how to police its endless list of online communities. This has played out most publicly in Reddit’s NSFW (not safe for work) content, particularly nude or sexualized images.
Before 2012, Reddit did not have a publicly posted policy explicitly banning content that sexualized minors, but the company did report “clear cut cases” of child sexual abuse material (CSAM) to the National Center for Missing and Exploited Children. When it came to what Reddit termed “legally grey” areas surrounding CSAM, Reddit initially dealt with them on a “case by case basis.” But eventually Reddit took more sweeping action to curb CSAM on its platform. For example, Reddit administrators banned the controversial community r/jailbait after a nude picture of a 14-year-old girl was posted on the subreddit.
By 2012, Reddit had reformed its content policy: All subreddits that focused on the sexualization of children were banned. But in 2014, The Washington Post suggested little had substantively changed, with “every kind of filth and depravity you can conceive of—and probably quite a bit that you can’t” still live on the site.
Reddit’s current policy bans any sexual or suggestive content involving minors or anyone who even appears to be under 18. Users and moderators can report images sexualizing minors to Reddit administrators, who will remove the posts themselves and then report blatant CSAM to the National Center for Missing and Exploited Children.
However, Reddit is currently facing a class-action lawsuit related to alleged lax enforcement of CSAM policy. In a complaint filed in April 2021 in the U.S. District Court for the Central District of California, one woman accused the company of allowing an ex-boyfriend to repeatedly post nude videos and images of her as a teenager.
“Reddit claims it is combating child pornography by banning certain subreddits, but only after they gain enough popularity to garner media attention,” the complaint reads.
Reddit filed a motion to dismiss this case, citing Section 230 of the Communications Decency Act, which protects websites against liability from content posted by third parties. A court granted that motion in October, and the plaintiff has since appealed to the U.S. Court of Appeals for the Ninth Circuit.
While Reddit has banned communities that violate its rules, the architecture of Reddit’s site itself allows content deemed by moderators to violate Reddit’s policies to persist in a moderator’s comment history. Hinduja said this loophole can present problems for people struggling to remove sensitive content from Reddit’s platform.
“It clearly is problematic that sexual content or personally identifiable information which violates their policies (and in the history, clear explanation is typically provided as to why the material transgresses their platform rules) still is available to see,” he said.
There are some cases in which removed posts don’t have accessible images. If users delete their own content (if prompted by a moderator, or of their own volition), the images seem to truly disappear from the site, including any comment history. If a user doesn’t take this action, there’s not much moderators can do to truly delete content that breaks Reddit’s rules, other than report it to Reddit.
Reddit’s guidelines for moderators acknowledge that moderators can remove posts from subreddits but don’t have the power to truly delete posts from Reddit. Moderators, to some extent, are aware of this fact.
A moderator for six Reddit communities, u/manawesome326, told The Markup that removing posts as a moderator can prevent posts from showing up in subreddit feeds, but it “leaves the shell of the post accessible from the post’s URL,” which makes it accessible via moderator histories.
Moderators of more risqué subreddits are also aware that they can’t truly remove content. A moderator for r/TikTokNSFW, u/LeThisLeThatLeNO, also noted to The Markup that removed posts on Reddit are still viewable and that the best a moderator can do is to “click the report button and hope the overlords notice it.”
“In my experience (over 1 year ago) making reports almost always led to nothing, unless a whole avalanche of reports go in,” the moderator said via Reddit.
Reddit has previously attempted to limit who can view removed content. In a post in r/changelog, a subreddit dedicated to platform updates, Reddit administrators wrote, “Stumbling across removed and deleted posts that still have titles, comments, or links visible can be a confusing and negative experience for users, particularly people who are new to Reddit.”
The platform said that only moderators, the original poster, or Reddit administrators would be able to view removed posts that had previously been accessible from a direct URL. The limits applied only to posts with fewer than two upvotes or comments.
Many moderators, though, had major issues with this change: subreddits often refer to removed content in conversations, and moderators find it useful to view accounts’ past posts (even user-deleted ones) to investigate patterns of bad behavior.
The trial was discontinued in late June. Reddit did not respond to questions asking for specifics on why the trial had been discontinued.
Meanwhile, Williams has tried to block the route most Redditors might use to find deleted posts: his comment history. He has started to privately message users who post rule-breaking content on his subreddit to explain why it was removed rather than leaving a public comment. This method, he hopes, will make removed posts harder to find.
If anyone had happened to squirrel away a direct link to a removed post, however, the offending images are still accessible.
Although Reddit does have the power to truly delete posts, it’s still the volunteer moderators, real and automated, who do the bulk of content management on Reddit. (Posts removed by bots also still contain visible images if viewed through the bot’s comment history.)
According to Reddit’s 2020 transparency report, Reddit administrators removed 2 percent of all content added to the site in 2020, whereas moderators, including bots, removed 4 percent of content.
Loopholes like this one create a gray area when it comes to content moderation, said Richard Rogers, the chair in New Media and Digital Culture at the University of Amsterdam.
While moderators are a first line of defense responsible for keeping prohibited content off of their subreddits, this loophole means they have little power to enforce what Rogers calls “platform cleanliness.”
“Then the responsibility shifts from the individual subreddit and the moderators to the platform,” Rogers said to The Markup. “So seemingly, here, we have a disconnect between those two levels of content moderation.”
In the meantime, images in removed posts will continue to linger in Reddit’s corners, just a click away from being discovered.
This article was originally published on The Markup and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.