Algorithms for hate

The foundational notion of the Internet needs to change


Hassan Niazi March 26, 2019
The writer is a lawyer based in Lahore and also teaches at the Lahore University of Management Sciences. He holds an LL.M. from New York University where he was a Hauser Global Scholar. He tweets @HNiaziii

Since the dawn of social media, we have seen it as a force that fosters communication and offers new avenues for people to interact. At its core it was a platform for bringing people together. It allowed us to consume information that interested us in almost limitless quantities and created new forms of expression for people. Social media didn’t seem to have a downside, and even if it did, those flaws failed to tip the scales when compared with the convenience that was on offer.

So, platforms like Facebook, Twitter, YouTube and Reddit thrived, free from any constraints or accountability under the law for the content published on them.

That utopia is now unravelling. The darker side of social media was exposed when Facebook was used to manipulate the US presidential election that Donald Trump won. Algorithms turned us into slaves who could no longer control our own opinions or separate truth from falsehood. Our private lives were laid bare by companies like Cambridge Analytica.

Concerns for privacy were our first indication that all was not well in the world of hashtags and photo filters. Then came the truth, or our inability to find it, as fake news monopolised our news feeds. Now, after a terrorist attack in Christchurch that appears to have been carried out with the aim of making its footage go viral, social media’s ability to propagate violent ideas is something the world has to confront.

The attack in Christchurch has shown how terrorists can harness the power of social media to project their propaganda to millions of people simply by clicking the ‘share’ button. Once that is done, algorithms like the one deployed by YouTube will make sure the material reaches the right audience. Those on the verge of radicalisation, but not quite there yet, will view the video and perhaps embrace its violent call.

Never before have terrorists had such an easy way to share their manifestos. The original video of the Christchurch terror spree was taken down within an hour of being uploaded, but copies of it multiplied faster than Facebook could remove them. A hydra of violent content swept through different social media platforms as the world grappled with the new reality of viral terrorism. To get an idea of the scale, consider this: Facebook announced that it had removed 1.5 million copies of the video, while at one point a copy was being posted to YouTube every second.

No terror group has been able to harness the potential of social media quite like white supremacists. 8chan, the sick platform where most of them assemble, is a testament to that. A terrorist hideout in plain sight. But because different standards apply to white supremacists than to other terrorists, tech giants allow their platforms to be overrun by them.

Content such as the Christchurch shooting should be banned. This is a no-brainer for the law. It is content that is not protected by traditional free speech jurisprudence, since actual harm to people is required for it to be created. The trickier question is whether racist speech inciting violence on the Internet should suffer the same fate. Government censorship of such speech is easy to propose, but as a solution it is neither as simple as it sounds, nor are its consequences easy to predict.

Social media gives us a wealth of avenues to exercise free expression. Any law that tries to curb online speech must do so within the ambit of established free speech jurisprudence. That jurisprudence is currently ill-suited to tackle social media. The law regarding free speech works within certain categories: different standards apply to speech in newspapers, books, public spaces and television. The contours of free speech on social media are still developing and remain a major point of contention in legal debate. As more and more terrorist organisations claim control of the Internet’s power to organise and disseminate information, the law will have to answer the intractable question of how to balance free expression with potential threats to national security. It will have to do so while preserving the vibrancy of the diverse ideas that exist on social media and stamping out the negativity. Not an easy task, but the debate must begin.

While the law wrestles with its conundrums, social media companies are not absolved of their moral responsibility to prevent radicalising content, a responsibility owed to the people of the world who have allowed them to net such tremendous profits. As private profit-making organisations, these companies need to rethink business models built around content being as shocking as possible in order to draw more eyeballs. That means platforms like YouTube rethinking the current algorithm that gradually exposes users to more extreme content through suggested videos, helping along the radicalisation process.

The foundational notion of the Internet — that apart from extreme cases, websites are not responsible for the content that users upload — needs to change. Publishers have a responsibility to prevent hate on their platforms. When social media companies allow their platforms to be used to inspire fear in certain segments of the population via hate speech or violent conduct, the public must hold them accountable.

In the midst of all this, there is one element that needs to be at the centre of this debate: the human element. At the end of the day, violent content is not being viewed only by fringe groups of unhinged individuals. A community on Reddit called ‘WatchPeopleDie’ was recently taken down. The community’s name leaves little to the imagination with regard to its content. Yet it had over four hundred thousand subscribers. We can demand all the reckoning we want from social media companies, but we also need to reckon with what it is in human nature that drives the appetite to consume vast quantities of real, violent content in our spare time.

Published in The Express Tribune, March 26th, 2019.
