Social networks provide a platform where people can express themselves freely. In Kenya, bloggers and social media users are governed by laws that reportedly contain gaps which have led to many bloggers being arrested. The question often raised by critics is: how far should social networks go in censoring hate speech? Extreme racist comments posted on the discussion website Reddit in the wake of the Charleston church shooting have once again raised questions about freedom of speech and the internet.
Reddit has long championed freedom of speech, and it has continued to uphold that principle even in the face of criticism. In one case, pictures of an underage girl were posted on Reddit without anyone’s consent. The site did not ban the user responsible, Violentacrez, though he lost his job after his real identity was exposed by the website Gawker.
Reddit’s anti-censorship stance came under renewed pressure after reports emerged of posts expressing support for the man charged with murdering nine worshipers in a black church in Charleston. The posts were made under a thread, or “subreddit”, called Coontown – which, as the offensive name suggests, is a corner of Reddit made up mostly of virulently racist and white supremacist posts. One commenter called the shooter “one of us”. In another popular post, a moderator said “we don’t advocate violence here”, but went on to say the life of a black person “has no more value than the life of a flea or a tick.”
However extreme, such comments are legal under American law, and Reddit is, by and large, an American company, according to Eugene Volokh, a law professor. “American law protects people’s ability to express all sorts of views, including views in support of crime or violence, without fear of government restriction,” Volokh says. “But a private institution like Reddit is also free to say we don’t want our facilities to be used as a means of disseminating this information.”
Earlier this month Reddit banned five subreddits, including racist and anti-gay threads and a mocking forum called “Fat People Hate”. The company said it was “banning behavior, not ideas” and that the offending threads were specifically targeting and harassing individuals. “We want as little involvement as possible in managing online interactions but will be involved when needed to protect privacy and free expression, and to prevent harassment,” the company said. “While we do not always agree with the content and views expressed on the site, we do protect the right of people to express their views.”
Mark Potok of the Southern Poverty Law Centre, a non-profit organisation based in Alabama which tracks US hate speech, says the site has attracted people who previously hid out on other sites dedicated to extremist ideology. “More and more people in the white supremacist world are moving out of organised groups and into more public spaces. Reddit really has become a home for some incredible websites,” he says, citing not only a string of racist subreddits but a thread that actively encourages the rape of women.
A report recently issued by the SPLC found that the world of online hate, long dominated by website forums like Stormfront and its smaller neo-Nazi rival Vanguard News Network, has found a new – and wildly popular – home on the internet.
“There’s a lot of hypocrisy in Reddit banning a particular subreddit that mocks people for being fat while they allow other extreme content to trundle along just fine,” Potok says. While he acknowledged that Reddit is within its rights to allow racist comments, he suggested that the site and other social networks could act more like traditional publishers who exercise editorial control over what does or doesn’t go into their publications.
Jillian York of the Electronic Frontier Foundation questioned whether private companies are best placed to make decisions about censorship. She pointed to controversies over breastfeeding pictures on social media and noted that the biggest social networks ban nudity, which might not be offensive to all or even most users.
“Legally these companies have the right to ban whatever they want, but when it comes down to it the rules are skewed and the enforcement of them is as well,” she says. York argues that the biggest social networks should be more proactive in devising ways to implement community policing – for instance, Twitter’s recently announced shared “block” lists, which will make it easier for users to block multiple accounts.
“There are some subreddits with very little viewership that get highlighted repeatedly for their content, but those are a tiny fraction of the content on the site,” the company said, the BBC reports.