Facebook’s Militia 'Mistake,' FCC’s Speech Police, and TikTok’s Secret Sauce
Photo: Chris McGrath (Getty Images)
Hellfeed is your bimonthly resource for news on the current heading of the social media garbage barge.
The never-ending battle over content moderation is only intensifying as we get closer to the 2020 elections and beyond. We’re trying out new formats to round up and summarize developments in this space—both in the interest of everyone’s sanity and to give the ramifications of various screaming matches time to become clear. For now, we’re calling this Hellfeed. Let us know your thoughts and feedback in the comments below.
Facebook ruled heavily armed militia group didn’t violate TOS
Facebook moderators declined to take down the page of a militant group calling itself the “Kenosha Guard” or its “Defend Kenosha” event, which called for an ad hoc militia to “take up arms” against Black Lives Matter protesters in Kenosha, Wisconsin. The pages had been reported hundreds of times over numerous posts and comments threatening or encouraging violence, but Facebook moderators repeatedly ruled the content didn’t break its rules. The night of the event, August 25, 17-year-old self-avowed militiaman Kyle Rittenhouse opened fire, killing two people.
The Kenosha Guard page and the event didn’t disappear from Facebook until some nine hours after the shooting. To make this clusterfuck even worse, Facebook and CEO Mark Zuckerberg admitted they screwed up big time, but told employees and the media that its moderation teams were the ones to take the event page down under the company’s recent change to its policies on militia groups. It later emerged that the organizer of the event had deleted it themselves; Facebook chalked up its false claim of credit to a “mistake” rather than, say, an outright lie.
All of this was predictable to the point of being the default way the company operates. Facebook doesn’t have a blanket policy prohibiting events where attendees are encouraged to bring firearms, just a ban on encouraging people to bring guns to specific locations like houses of worship and separate policies on threatening violence with weapons. Changing that policy would doubtless draw the wrath of the gun lobby—and its well-placed allies in Congress and the White House—and in the past few years Facebook has repeatedly caved when mulling changes that could anger conservatives. In the meantime, far-right activists and other armed groups have made ample use of the lax rules.
But even when the rules are tightened, Facebook is unwilling or unable to consistently enforce them. It can’t even prevent banned gun sales. The Kenosha group was full of violent rhetoric that should have sent up red flags even without the policy update on militias. But again and again, Facebook ignores these kinds of things or puts off taking action until they inevitably blow up in its face. Same old.
Trump’s birdbrained executive order on Section 230 moves forward
The Federal Communications Commission’s deadline for submitting comments on the Trump administration’s executive order to turn the FCC into its online speech police was this week. In short, Trump’s petition would task the FCC with investigating whether websites and social networks unfairly discriminate against conservatives—something it doesn’t have the jurisdiction to do—and punishing those it finds in violation by removing their Section 230 protections against civil liability for user-generated content.
Trump’s order would totally break the internet, and the FCC may never seriously consider it at all—one of the three Republican members of the agency’s five-member commission has already trashed it. But we compiled some of the most depraved comments Trump supporters sent in to support the petition. For a more thoughtful take on the situation, TechDirt’s Mike Masnick submitted comments on how Section 230 enabled the site to build a thriving web community.
Reddit has big plans for its streaming network
Reddit plans to expand Reddit Public Access Network, its streaming service, according to a report in The Verge. It is considering removing the 45-minute cap on the length of broadcasts, and in the coming months it plans to allow all subreddits to broadcast. (Currently, while streaming is open to all users, just over a dozen subreddits have access.) Best of luck with that can of worms.
It grows. It feeds.
QAnon, the future of the GOP, has continued its death march into the mainstream despite announcements last month by Facebook and Twitter that they were cracking down on violent rhetoric associated with the far-right, anti-Semitic movement. It’s not clear what anyone can do to slow it down at this point, as it is merging with the rest of the right-wing media ecosystem. (Evangelicals are increasingly being sucked in, according to the MIT Technology Review and USA Today.) But Facebook deleted an ad in which QAnon candidate Marjorie Taylor Greene threatened AOC and others with a gun, so there’s that.
Facebook is all over the fucking place on political ads
Facebook announced a ban on new political advertising in the week before the 2020 elections—another wild swing from its double-down on free speech rhetoric earlier this year. (It still allows candidates to lie in ads.) Slate argues this will prevent the Trump or Biden campaigns from responding to late-breaking developments or sending out reminders on voting procedures. The ban doesn’t apply to political ads purchased in advance, which means campaigns, PACs, and lobbyists might simply lock in massive ad buys before the window begins.
Pinterest, like Twitter, already has a ban in place on political advertising. This week it announced it will restrict ads on searches for election-related terms like “Trump,” “Biden,” “polling place,” and “voting,” and stop recommending such content.
2020 burnout
Worth a read: The Washington Post published a feature examining how moderators, paid and unpaid, on sites ranging from Reddit and Nextdoor to Facebook and Google are getting seriously burned out in 2020.
The pandemic is still making everything worse
The rapid spread of Covid-19 misinformation continues to embarrass social networks; Twitter deleted a post from a QAnon account claiming the U.S. has seen only 9,000 deaths from the novel coronavirus, after it was retweeted by the president.
YouTube’s second-quarter Community Guidelines Enforcement Report showed it doubled down on automation amid “greatly reduced human review capacity” due to Covid-19, removing 11 million videos from April to June 2020. That predictably increased error rates; YouTube said that although less than 3 percent of removals were appealed, it reinstated 50 percent of those videos. Compare that to Facebook, which processed virtually no appeals in its second quarter.
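To put those percentages in perspective, here is a rough back-of-envelope calculation (the per-video counts are our own extrapolation; YouTube’s report gives only the 11 million total and the two rates):

```python
# Back-of-envelope math on YouTube's Q2 2020 enforcement report.
# Assumption: the reported rates apply uniformly to the 11 million
# removals; YouTube publishes percentages, not exact counts.

removals = 11_000_000    # videos removed, April through June 2020
appeal_rate = 0.03       # upper bound: "less than 3 percent" were appealed
reinstate_rate = 0.50    # half of appealed videos were reinstated

appealed = removals * appeal_rate          # up to ~330,000 appeals
reinstated = appealed * reinstate_rate     # up to ~165,000 reinstatements

print(f"Appealed:   up to {appealed:,.0f} videos")
print(f"Reinstated: up to {reinstated:,.0f} videos")
```

Under those assumptions, that’s on the order of 165,000 wrongly removed videos in a single quarter—and that only counts uploaders who bothered to appeal.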
China might not let TikTok give away its secret sauce
The feds’ effort to coerce Beijing-based ByteDance into selling its wildly popular TikTok app to a U.S. company hit a big snag this week: new Chinese government restrictions on exporting AI technology. ByteDance is reportedly trying to get clarity on whether these extend to the black-box algorithms it uses to choose videos that appear on user feeds; TikTok’s asking price of about $30 billion would plummet if its finely tuned brainjacking techniques aren’t part of the deal.