apps
Giphy Is Back on Instagram and Snapchat After Claiming to Have Solved Its Racist Sticker Problem

Gif-sharing service Giphy is back on Instagram and Snapchat after an incident last month in which an extremely goddamn racist gif slipped past its moderation filters, according to The Verge. Read More >>

youtube
YouTube: What If We Just Inserted a Link to Wikipedia Under Some of the Conspiracy Videos

On Tuesday at the SXSW tech festival, YouTube CEO Susan Wojcicki proposed one solution to the conspiracy screeds and “false flag” hoax videos that are slowly but surely taking over the Google-owned video site by gaming its algorithms. Read More >>

youtube
YouTube’s New Moderation Team Stumbles Out the Gate

Following the mass school shooting that killed 17 people and wounded more than a dozen others in Parkland, Florida, this month, YouTube launched a campaign to use some of its 10,000 new moderators to somewhat thin out the ranks of the conspiracy peddlers and far-right nuts who have become rampant across the site. Though other tech companies have been pressured into such action over the past year, the matter was particularly pressing for YouTube—which yet again promoted virulent conspiracy theories speculating that the shootings were a hoax or that the survivors were “crisis actors.” Read More >>

reddit
Reddit Bans AI-Powered Fake Porn

Following Twitter, Pornhub, Discord, and Imgur, Reddit has taken action to ban the posting of AI-generated fake porn—commonly called “deepfakes.” Read More >>

youtube
YouTube Is Probably, Maybe Hiring Some People to Make It Less Welcoming to Child Predators

As you may have heard, YouTube is growing its content moderation team to 10,000 staffers. Sounds like progress! Of course, the move comes as a response to the ever-expanding gallery of horrors the site has unwittingly played host to over the years—most recently, various forms of child exploitation and predation—but let’s review the actual announcement from CEO Susan Wojcicki: Read More >>

youtube
Google Gives 10,000 People the Worst Job in the World

Google is Doing Something about the latest scandal it finds itself in (YouTube/comments/paedophiles), revealing a plan to boost its video content moderation team to as many as 10,000 people. That's 10,000 people who will be watching awful YouTube content, day in, day out, like the millions of children around the world their paymaster profits from. Read More >>

youtube
YouTube Says It Will Crack Down on All Those Creepy Videos Targeted at or Featuring Kids

Google, along with fellow tech giants like Amazon, Facebook, and Twitter, has drawn increasing scrutiny this year over concerns that its “concentrated authority resembles the divine right of kings,” as the New York Times put it. In recent months, it has faced stumbling blocks when it surfaced misinformation and conspiracy theories during crises like mass shootings, and as it became embroiled in the ever-expanding Russian electoral interference scandal. But one particularly disturbing note concerned Google subsidiary YouTube and its YouTube Kids section, which everyone seems to have recently realised was promoting weird-ass, creepy content to children via algorithmically suggested videos and a seeming lack of moderation. Read More >>

twitter
Twitter Says It Will Finally Do Something About Those Hordes of Nazis

Twitter CEO Jack Dorsey admitted on Friday that the company’s minimalist approach to moderation was not working, saying the site is committed to taking a “more aggressive stance in our rules and how to enforce them” in the wake of numerous high-profile public relations nightmares. Read More >>

facebook
Facebook Will Add 3,000 More People to Watch Murders and Suicides

Facebook has a problem. Not the one where they admitted to being a megaphone for propaganda and psy-ops. Or the one where they narced on at-risk teens. No, today’s news concerns how the social giant/massive data collection scheme has (increasingly) become an unwilling platform for users to broadcast violent crimes, sexual acts, child exploitation, and suicide. Read More >>

facebook
Facebook Reported BBC Journalists To The Police Because They Flagged Up Child Abuse Images

Facebook has reported a group of BBC journalists to the police after they provided, at the company’s own request, child abuse images they had found while investigating exploitation on the social network. Read More >>

internet
How a Video Game Chat Client Became the Web’s New Cesspool of Abuse

Over 25 million users have flocked to Discord, a text and voice chat platform for gamers, since its launch in May 2015. Despite raising at least $30 million in venture capital funding, the company has only five “customer experience” personnel and no moderators on its staff. From what I’ve seen, users who wish to engage in harassment, raid servers, or bombard chats and users with child pornography suffer no lasting repercussions for doing so. That seemingly any server can become the victim of organised attacks represents the strained and failing infrastructure of moderation—on Discord, and on virtually any community on the internet. Read More >>

state of the internet
Why Trolls Won in 2016

A decade is all it took, more or less, for the internet to become unmanageable. Read More >>

facebook
The Horrifying Job of Facebook Content Moderators

Isn't Facebook great? (It's not.) But isn't it nice and clean and kid friendly? This is true for a very specific reason: the social media giant outsources the gnarly task of finding and deleting inappropriate content. In the November issue of Wired, Adrian Chen offers a peek into the darkest corners of the industry. It's only a little horrifying. Read More >>