Before Twitter and YouTube could respond, millions of users saw video of journalist James Foley being beheaded. An equally large audience viewed naked photos of celebrities on sites such as 4chan and reddit after a hacker stole the photos from the cloud. The material was eventually removed, but not before a surprising controversy arose around these incidents: Should social media sites be in the business of censoring content and, by extension, users?
The short answer is that if you're a user who thinks Facebook, 4chan, or any other website is treading on your right to free speech, you probably won't have a compelling court case. Most sites describe their censorship practices to some degree within the terms of service (TOS) that all users must agree to. However, the censorship process itself need not be disclosed, and these private entities have the final word when it comes to what users can and cannot post.
In some cases, there are existing laws that govern the content some social media users may be sharing. “With regard to intellectual property issues, they are immune from liability for violations by their users as long as they honor take-down notices,” says Mark B. Baer, founder of the law corporation Mark B. Baer, Inc. and a frequent contributor to The Huffington Post. “Social media sites are not owned and operated by the government, and therefore, the owners of those sites may limit otherwise constitutionally protected speech. Furthermore, sites have been known to allow otherwise constitutionally prohibited material to remain online until ordered otherwise by a court of law. The nude photos of celebrities were removed only after the sites received copyright takedown notices.”
On YouTube, the TOS lists several things a user agrees to avoid when uploading videos or making comments. Most of this covers copyrighted and trademarked content, which has clear legal guidelines; for other content, the site asks that you follow “common-sense rules that will help you steer clear of trouble.” Fail to follow the rules, and your content may be deleted.
Eric R. Chad, of the intellectual property law firm Merchant & Gould, says there are exceptions to First Amendment protections of free expression, including restrictions on obscenity and the distribution of stolen media, which could expose people sharing the stolen pictures and videos of celebrities to criminal and civil liability. At the same time, according to Chad, from a legal perspective, social media sites aren’t required to take anything down and have no responsibility to protect users from potentially troubling or offensive material.
In the Foley case, Twitter’s decision to delete images and videos of the beheading may be understandable. However, many argue that because more and more people are getting their news from social media, those sites need to err on the side of free speech. That doesn’t seem to be the norm yet.
In September, the pastor of a Baptist church in Georgia had his video banned from YouTube after he posted his Sunday sermon about the persecution of Christians in the Middle East. His entire account was terminated, a move he told his congregation stemmed from the site mistakenly classifying the sermon as “hate speech.”
“YouTube has clear policies that prohibit content like gratuitous violence, hate speech and incitement to commit violent acts, and we remove videos violating these policies when flagged by our users,” the social media site said in a statement. “We also terminate any account registered by a member of a designated Foreign Terrorist Organization and used in an official capacity to further its interests.”
Robert Quigley, senior lecturer at the School of Journalism at the University of Texas-Austin, believes that if corporate offices, worried about ad sales, adopt a policy of shutting down uncomfortable speech across entire platforms such as Google, Twitter, and Facebook, we could lose something in the process. “For example, Facebook doesn’t allow nudity. Therefore, is certain art banned from criticism? What about breastfeeding? I think it’s a slippery slope if they want to morally police their platforms,” he says. “I think social media sites should err on the side of allowing objectionable material. In the case of a beheading, I think we can all agree that it is horrific. However, once we entrust the social media corporations to make morality decisions, we can end up having controversial topics or even noncontroversial topics taken right out of the public discourse.”
“Personally, I find the trend of petitioning these networks to limit the dissemination of otherwise legal material bothersome,” Chad says. “A recent example is a college cheerleader who hunts big game in Africa and posts pictures of her kills on Facebook. A number of users petitioned Facebook to ban the woman or otherwise restrict her ability to post such pictures. Facebook has no obligation to ‘protect’ users from this content, and I find the notion that it does to be disturbing.” In the end, Facebook decided not to remove the photos, standing behind its user’s right to free speech. But Facebook users may also find that they are unable to post content that violates the site’s TOS.
Ultimately, we are talking about free platforms that provide a service, and no one has a right to use those platforms to post whatever they want, whenever they want. Whether or not you like the policies of any given search engine or social media site, remember that when you’re in someone else’s home, you play by the host’s rules. If you want the right to post freely, you’ll have to start your own site.