Tags: censorship

Do social media and networking platforms unfairly restrict speech and content by users expressing certain political views? Is the “marketplace of ideas” subject to entry barriers imposed by the platforms themselves? Twitter has recently been in the news for a “Block & Report Spam” feature whereby complaints can trigger a suspension, and some claim that leftists are gaming the system to take down certain posters. Twitter claims never to filter or moderate content proactively, but the system seems to invite abuse by activists at either end of the political spectrum.

Facebook admits that it attempts to enforce a set of “community standards” covering the general areas of safety, respect (including hate speech and nudity), security, and intellectual property. There is ample evidence, however, that enforcement of these standards is “arbitrary and capricious”. Examples include inconsistent treatment of “Death to Israel” posts, images of buttocks, sculptures or drawings of body parts versus actual body parts, and a variety of gay-themed images. These cases and many others are likely a consequence of different moderators responding to complaints differently as they attempt to interpret and enforce rules that are vague by necessity. In addition, decisions to censor or suspend users are sometimes reversed by committee at higher levels, only to be made again later. And there have been allegations that content from advertisers is treated with a “lighter touch” than content from non-advertisers. Perhaps the organization is simply trying to find a fair way to moderate a complicated social thicket, but the effort seems largely misplaced. A broader policy of content neutrality, leaving users to censor their own feeds as they are already empowered to do, would avoid many of the inconsistencies.

Facebook also admits to employing contractors as “news curators”. This, together with the mysterious disappearance of certain “trending topics” with a conservative bent, has led to strong complaints of blacklisting and censorship. Gizmodo describes the curators’ instructions from Facebook:

They were also told to select articles from a list of preferred media outlets that included sites like the New York Times, Time, Variety, and other traditional outlets. They would regularly avoid sites like World Star Hip Hop, The Blaze, and Breitbart, but were never explicitly told to suppress those outlets….  News curators also have the power to “deactivate” (or blacklist) a trending topic—a power that those we spoke to exercised on a daily basis. A topic was often blacklisted if it didn’t have at least three traditional news sources covering it, but otherwise the protocol was murky—meaning a curator could ostensibly blacklist a topic without a particularly good reason for doing so.

This has the potential to create a bias in favor of certain viewpoints. If a trending topic comes from a disfavored source or involves a disfavored viewpoint, “news curation” amounts to a distasteful cover for outright political censorship. The Facebook system is also vulnerable to the sort of “mobbing” by activists that has been problematic for Twitter. Some of the complaints of unfair treatment by Facebook undoubtedly have merit, and such bias could have an influence great enough to alter election outcomes.

Some forms of censorship on these platforms may be justified, such as preventing threats, abuse, or harassment. The platforms must also comply with laws that are more restrictive in certain countries. Nevertheless, whatever the content standards, and whatever political bias might result, the platforms are operated by private entities. They can do as they please, as much as anyone might hate it. The accusers are entitled to complain, of course, but they should bear in mind that these platforms are not exactly an open marketplace or a public square, however tempting it is to think of them that way. They could be open and free, given a more enlightened approach by the organizations that run them, but as things stand they are not. Positive action remains an option for those who object: agitate, package your content more carefully, or leave the platform and find an online community to your liking.