Friday, July 27, 2018

Facebook trips on its own moderation failures

After weeks of speculation around how it plans to handle conspiracy website Infowars, its creator Alex Jones and others that spread false information, Facebook finally gave us an answer: inconsistently.

The company hit Jones with a 30-day ban after it removed four videos that he shared on the Infowars Facebook Page.

The move is Facebook’s first that curtails the reach of Jones, who has been a major talking point in the media because he is continually allowed a voice on the social network, despite spreading “alternative theories” on events like 9/11 and the San Bernardino shootings.

Confusion

Sounds good so far, but for a six-hour period today, it didn’t seem as though Facebook itself even knew what was going on.

CNET reported that Jones had been hit with a 30-day suspension for posting four videos that violated its community standards on the Infowars page, which counts him as a moderator. When reached by TechCrunch to confirm the report, Facebook said Jones had only been handed a warning and that, in the event of another warning, a 30-day ban would then follow.

After hours of waiting for further confirmation and emails to the contrary, Facebook clarified that in fact Jones’ personal account was given a 30-day ban, while Infowars received a warning but no ban.

Facebook is effectively shooting the messenger while allowing the page — which pushed the videos out to its audience — to remain in place.

In subsequent emails, Facebook explained that the inconsistency is because Jones’ personal account had already received a past warning, which triggers the 30-day ban. Surprisingly, though, this is a first warning for the Infowars page.

At least, that’s what we think has happened because Facebook hasn’t fully clarified the exact summary of events. (We have asked.)

Beyond the four videos, there’s a lot riding on this decision — it sets a precedent. Infowars is one of the largest sites of its kind, but there are plenty of other organizations that thrive on pumping out misleading or false content that plays into insecurities, misplaced nationalistic pride and more.

That’s why Infowars (involuntarily) became the subject of two Facebook video events held with press this month. On both occasions, Facebook executives said that even those peddling false information deserve to have a voice on the social network, no matter how questionable or inflammatory their views may be. CEO Mark Zuckerberg himself even said Holocaust deniers have free speech on the service.

Based on today, so long as they spew their message within the Facebook community rules, they are fine.

Follow fast

In fact, you could take it further and suggest that if they don’t raise the suspicions of rival platforms like YouTube, they’ll remain untouched on Facebook.

The Jones/Infowars videos were pulled by Facebook days after being removed from YouTube. Indeed, one of the Facebook videos had even survived a review after it was flagged to Facebook moderators last month. The reviewer marked the video as acceptable and it remained on the platform — until this week.

Facebook called that decision a mistake, but arguably it’s a mistake that wouldn’t have been rectified had YouTube not raised the alarm by banning the videos on its platform first. (YouTube has well-documented content moderation problems of its own, so the fact that it is running circles around Facebook should be a major concern for the social network’s management.)

That Facebook is unable to communicate a significant decision like this in a cohesive manner doesn’t inspire confidence that it has its house in order when it comes to video moderation. If anything, it shows that the social network is playing catch-up and winging it on what is a critical topic.

Its platform is being used nefariously worldwide, whether to sway elections or incite racial violence in foreign lands. Now, more than ever, Facebook needs to nail down the basics of handling malicious content like Infowars, which, unlike those other threats, is hiding in plain sight.



from TechCrunch https://ift.tt/2uU1bYh
