Tackling fake news

Without traditional institutional media gatekeepers, political discourse is no longer based on a common set of facts


M Ziauddin December 09, 2017
The writer served as executive editor of The Express Tribune from 2009 to 2014

Concern about the proliferation of disinformation, misinformation and propaganda has reached the point where many governments are proposing new legislation, according to Kelly Born’s article titled ‘Six features of the disinformation age’, published in the International Politics and Society newsletter. This year, Germany’s parliament adopted a law that provides for fines of up to €50 million on popular sites like Facebook and YouTube if they fail to remove ‘obviously illegal’ content, such as hate speech and incitements to violence, within 24 hours. Singapore has announced plans to introduce similar legislation next year to tackle ‘fake news’. In July, the US Congress approved sweeping sanctions against Russia, partly in response to its alleged sponsorship of disinformation campaigns aimed at influencing the US elections.

In Born’s view, such action is vital if we are to break the vicious cycle of disinformation and political polarisation that undermines democracies’ ability to function. “But while these legislative interventions all target digital platforms, they often fail to account for at least six ways in which today’s disinformation and propaganda differ from yesterday’s,” Born contends.

First, there is the democratisation of information creation and distribution. Any individual or group can now communicate with — and thereby influence — large numbers of others online. This has its benefits, but also carries risks — beginning with the loss of journalistic standards of excellence. Without traditional institutional media gatekeepers, political discourse is no longer based on a common set of facts.

The second feature — a direct by-product of democratisation — is information socialisation. Rather than receiving our information directly from institutional gatekeepers, today we acquire it via peer-to-peer sharing. Such peer networks may elevate content based on factors like clicks or engagement among friends, rather than accuracy or importance.

The third element is atomisation — the divorce of individual news stories from brand or source. Previously, readers could easily tell credible outlets from non-credible ones. Today, the original source of an article matters less than who in their network shares it.

The fourth element is anonymity in information creation and distribution. Online news often lacks not only a brand, but also a by-line. This obscures potential conflicts of interest, gives state actors plausible deniability when they intervene in foreign information environments, and creates fertile ground for bots to thrive.

Fifthly, today’s information environment is characterised by personalisation. Internet content creators can run A/B tests (controlled experiments comparing two versions of a message) and adapt micro-targeted messages in real time. According to a recent exposé, by leveraging “automated emotional manipulation alongside swarms of bots, Facebook dark posts (unpublished posts) and fake news networks,” groups like Cambridge Analytica can create personalised, adaptive and ultimately addictive propaganda.

The final element, as the Stanford law professor Nate Persily has observed, is sovereignty. Unlike television, print and radio, social-media platforms like Facebook or Twitter are self-regulating — and are not very good at it. It was not until mid-September that Facebook even agreed to disclose information about political campaign ads; it still refuses to offer data on other forms of disinformation.

It is this lack of data that is undermining responses to the proliferation of disinformation and propaganda, not to mention the political polarisation and tribalism that they fuel. Facebook is the chief culprit: with an average of 1.32 billion daily active users, its impact is massive, yet the company refuses to give outside researchers access to the information needed to understand the most fundamental questions at the intersection of the internet and politics. (Twitter does share data with researchers, but it remains an exception.)

We are living in a brave new world of disinformation. As long as only its purveyors have the data we need to understand it, the responses we craft will remain inadequate. And to the extent that they are poorly targeted, they may even end up doing more harm than good.

Published in The Express Tribune, December 9th, 2017.

