{"id":1632,"date":"2013-03-07T10:47:03","date_gmt":"2013-03-07T15:47:03","guid":{"rendered":"http:\/\/blogs.law.harvard.edu\/herdict\/?p=1632"},"modified":"2013-03-07T10:47:03","modified_gmt":"2013-03-07T15:47:03","slug":"social-media-censorship","status":"publish","type":"post","link":"https:\/\/archive.blogs.harvard.edu\/herdict\/2013\/03\/07\/social-media-censorship\/","title":{"rendered":"Social Media Censorship"},"content":{"rendered":"<p>Recently, Facebook has been accused of <a href=\"http:\/\/www.examiner.com\/article\/is-facebook-actively-censoring-conservative-bloggers\">actively censoring the accounts of conservative bloggers<\/a>. As might be expected, Facebook posters from the opposite end of the social and political spectrum have reported <a href=\"http:\/\/allfacebook.com\/liberals-cry-foul-over-censorship-on-facebook_b55197\">liberal censorship<\/a> as well. Perhaps the problem isn\u2019t a systematic political bias, but instead overzealous application of censorship defined by Facebook\u2019s <a href=\"http:\/\/www.facebook.com\/communitystandards\">community standards<\/a>. Individual interpretation of proscribed content categories may lead to erring on the side of \u201cprotection\u201d of users rather than protection of free speech.<\/p>\n<p>Diane Sori, a blogger for Patriot Factor, reports that she has repeatedly been blocked from posting. As an experiment, The Examiner attempted to post some content, and <a href=\"http:\/\/www.examiner.com\/article\/is-facebook-actively-censoring-conservative-bloggers\">was warned to slow down before being blocked for two days<\/a>. According to the website <a href=\"http:\/\/www.facebookcensorship.com\/\">FacebookCensorship.com<\/a>. 
Facebook has actively been censoring conservative content for some time now, while leaving left-wing and liberal content untouched, <a href=\"http:\/\/www.examiner.com\/article\/double-standard-palin-hate-page-flourishes-as-facebook-bans-conservatives\">even if it could reasonably be deemed offensive<\/a>.<\/p>\n<p>As a counterpoint to the discussion of conservative censorship, Liberal Lamp Post <a href=\"http:\/\/www.addictinginfo.org\/2011\/08\/15\/hey-facebook-are-you-really-censoring-political-free-speech-in-america\/\">presents examples of censorship of liberal posts<\/a>, specifically the blocking of links to a liberal guide to Republican talking points and other material, with blocks lasting for 15 days. Commenters on the site go on to note apolitical examples, from animal rescue and Oxfam charity appeals to issues outside the US, that have also been blocked under the banner of anti-spamming.<\/p>\n<p>In October last year, Facebook came under fire <a href=\"http:\/\/www.slate.com\/blogs\/future_tense\/2012\/10\/31\/facebook_censors_anti_obama_navy_seals_meme_apologizes_breitbart_outraged.html\">for censoring an anti-Obama meme<\/a> posted by the account Special Operations Speaks (SOS). While Facebook is known to have an automatic spam detection filter, it also has a staff of human moderators who manually check content for anything deemed offensive or inappropriate. The deletion of the anti-Obama meme was done by one of these moderators in accordance with Facebook\u2019s policy. Facebook subsequently <a href=\"http:\/\/www.slate.com\/blogs\/future_tense\/2012\/10\/31\/facebook_censors_anti_obama_navy_seals_meme_apologizes_breitbart_outraged.html\">reversed the decision and apologized for it<\/a>. Because of these decisions and reversals, many people feel the policies are incomprehensible and\/or inconsistently applied. 
For instance, Facebook has prohibited photos of breastfeeding mothers and of drunk people sleeping with things drawn onto their faces, <a href=\"http:\/\/gawker.com\/5885714\">but not crushed heads, excessive blood, or humorously offensive content<\/a>.<\/p>\n<p>In December 2012, Richard Gage, the founder of an organization known as Architects &amp; Engineers for 9\/11 Truth, found that <a href=\"http:\/\/digitaljournal.com\/article\/339963\">his page had been taken down<\/a> along with the pages of several of his peers. A reporter for the alternative news website Infowars, Darrin McBreen, has also had his page removed, having been told by Facebook that he \u201cshould be careful about making political statements\u201d and that \u201c<a href=\"http:\/\/digitaljournal.com\/article\/339963\">Facebook is about building relationships not a platform for your political viewpoint<\/a>.\u201d<\/p>\n<p>Facebook supposedly instituted its community standards policy in order \u201c<a href=\"http:\/\/www.facebook.com\/communitystandards\">to balance the needs and interests of a global population<\/a>,\u201d and to protect its users from spam, hate speech, and abuse. This is a reasonable position given how quickly the user experience would degenerate if automatic spammers and abusive trolls were allowed to run amok on the network. The problem, of course, is that no organization can be completely neutral and that what constitutes offensive content is always subjective. Attempting to police the content of users who question the truth of 9\/11, criticize Barack Obama, or spin Republican talking points certainly seems misguided, even if it is not politically motivated.<\/p>\n<p>Used as a political tool, Facebook could be incredibly powerful. In the 2010 and 2012 elections in the United States, Facebook allowed users to tell their friends when they voted. 
According to Facebook&#8217;s research, <a href=\"http:\/\/techcrunch.com\/2012\/11\/06\/click-facebooks-im-voting-button-research-shows-it-boosts-turnout\/\">this may have increased turnout by as much as 2.2%<\/a>. But as Harvard University Professor Jonathan Zittrain has pointed out, <a href=\"http:\/\/www.technologyreview.com\/view\/511111\/thank-god-for-facebook-when-platforms-proselytize\/\">Facebook could use this power to try to influence elections<\/a>; what if it only showed the voting message to people it thought were from one party? To be clear, Facebook hasn&#8217;t done such a thing. However, this thought experiment demonstrates the risks if Facebook is not even-handed in its content removal policies. Couldn&#8217;t skewing the content removed (and the content that remains) influence the political leanings of users in the same way an &#8220;I voted&#8221; message would?<\/p>\n<p>Going beyond systematic political censorship, is it appropriate for Facebook to impose any censorship through the lens of its moderators\u2019 sensibilities? It\u2019s difficult for individuals to maintain total objectivity in controversial areas once they are authorized to judge posts against a vague policy that simply cannot provide rules for consistent treatment of every possibility. Personal bias is likely to creep into moderators\u2019 interpretation of the already lengthy community standards. Moreover, in order to keep operating costs low, Facebook moderators are given <a href=\"http:\/\/www.theatlantic.com\/technology\/archive\/2013\/02\/facebook-workers-try-to-spend-less-than-1-second-determining-whether-content-is-appropriate\/273402\/\">only half a second to look at each page<\/a>. As a result, they might miss controversial content or make mistakes when determining whether content is &#8216;appropriate&#8217;.<\/p>\n<p>Facebook could avoid this problem by taking a more hands-off approach to potentially offensive content. 
While Facebook has chosen to <a href=\"http:\/\/www.facebook.com\/communitystandards\">implement a comprehensive policy<\/a> outlawing anything it deems violent or threatening, hate speech, bullying, spam, pornography, or fraud, as well as anything that violates copyright or encourages self-harm, Twitter has chosen a <a href=\"https:\/\/support.twitter.com\/articles\/18311-the-twitter-rules\">far more liberal policy<\/a>. Twitter allows almost everything except pornography, copyright infringement, threats, and impersonation intended to mislead. Twitter does have a policy that allows it to remove content following a government request, but it doesn\u2019t have to do so, and <a href=\"http:\/\/articles.cnn.com\/2012-10-18\/tech\/tech_twitter-censorship_1_alex-macgillivray-twitter-neo-nazi\">has already refused to do so several times<\/a>. Rather than banning a user or deleting \u201coffensive\u201d content, Twitter instead helpfully <a href=\"https:\/\/support.twitter.com\/articles\/15794-abusive-behavior\">suggests that users simply block anyone they find offensive<\/a>. This seems a sensible option: it preserves the so-called offender\u2019s right to free speech and allows each user to make a personal decision about what is and is not acceptable.<\/p>\n<p><strong>Jean-Loup Richet, Special Herdict Contributor<\/strong><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Recently, Facebook has been accused of actively censoring the accounts of conservative bloggers. As might be expected, Facebook posters from the opposite end of the social and political spectrum have reported liberal censorship as well. Perhaps the problem isn\u2019t a systematic political bias, but instead overzealous application of censorship defined by Facebook\u2019s community standards. 
Individual [&hellip;]<\/p>\n","protected":false},"author":4591,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2},"jetpack_post_was_ever_published":false},"categories":[4236],"tags":[3687,981,253,4306,3261],"class_list":["post-1632","post","type-post","status-publish","format-standard","hentry","category-herdict-web","tag-censorship","tag-facebook","tag-filtering","tag-herdict","tag-twitter"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p4LdGs-qk","_links":{"self":[{"href":"https:\/\/archive.blogs.harvard.edu\/herdict\/wp-json\/wp\/v2\/posts\/1632","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/archive.blogs.harvard.edu\/herdict\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/archive.blogs.harvard.edu\/herdict\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/archive.blogs.harvard.edu\/herdict\/wp-json\/wp\/v2\/users\/4591"}],"replies":[{"embeddable":true,"href":"https:\/\/archive.blogs.harvard.edu\/herdict\/wp-json\/wp\/v2\/comments?post=1632"}],"version-history":[{"count":10,"href":"https:\/\/archive.blogs.harvard.edu\/herdict\/wp-json\/wp\/v2\/posts\/1632\/revisions"}],"predecessor-version":[{"id":1636,"href":"https:\/\/archive.blogs.harvard.edu\/herdict\/wp-json\/wp\/v2\/posts\/1632\/revisions\/1636"}],"wp:attachment":[{"href":"https:\/\/archive.blogs.harvard.edu\/herdict\/wp-jso
n\/wp\/v2\/media?parent=1632"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/archive.blogs.harvard.edu\/herdict\/wp-json\/wp\/v2\/categories?post=1632"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/archive.blogs.harvard.edu\/herdict\/wp-json\/wp\/v2\/tags?post=1632"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}