A few days back, MIT unveiled 'Norman', billed as the world's first artificial-intelligence psychopath: an algorithm that sees negative extremes where a normally trained algorithm would not. For instance, when Norman is put through a psychological test in the form of Rorschach inkblots, instead of seeing birds sitting on a tree it sees a person being electrocuted. Likewise, when shown an inkblot of a baseball glove, it instead sees a man murdered by machine gun in broad daylight. This experiment presents an alarming situation for us living in Pakistan.
The reason Norman is a 'psychopath' that sees only negative extremes is that it was trained exclusively on selectively negative data: pictures of people dying, of violence and of horrible human conditions. MIT trained another, otherwise identical algorithm on happy images of cats, birds and people. Naturally, this happy algorithm saw happier things in the same inkblot test. The experiment draws our attention to the fact that the data matters even more than the algorithm, and that it ultimately determines what the AI machine will become.
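The point that two identical algorithms can diverge purely because of what they were fed can be sketched in a few lines of toy Python. This is an illustration only, not MIT's actual model: 'training' here is nothing more than counting word frequencies, and the two tiny corpora are invented stand-ins for the dark and happy datasets.

```python
from collections import Counter

def train(captions):
    """'Training' is just counting word frequencies here --
    a toy stand-in for how a real model absorbs its data."""
    words = [w for caption in captions for w in caption.split()]
    return Counter(words)

def describe(model):
    """Shown an ambiguous input, the model can only answer with
    whatever its training data made most likely."""
    return model.most_common(1)[0][0]

# The same algorithm, two different diets of data
norman = train(["man electrocuted", "man shot", "man killed"])
happy = train(["bird singing", "bird flying", "bird nesting"])

print(describe(norman))  # -> man
print(describe(happy))   # -> bird
```

Nothing in the code distinguishes the two models except their input, which is precisely the experiment's lesson.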
In many ways, human beings are no different from an AI machine. We learn through data in the form of images, texts, experiences and sounds, and our lifelong perceptions are essentially built on that. This has far deeper repercussions for us humans. Consider the situation of people in Pakistan, especially the country's young generations: the type of images, conversations and experiences we have been going through since 9/11 reflects a very dark, 'Norman'-like future for most of us.
With the click of a button, we have over a hundred news channels telecasting, 24/7, images of misery, terror, violence, brutality and controversy. Even as the security situation in the country has improved, our debate has slipped below any moral threshold, with the Reham Khan fiasco and many others before it.
Turn off the TV and you might escape mainstream media, but there is no escape from social media, which has become a venting ground for our collective anger and outrage against anything that does not fall in line with our 'logic'. With political parties spending vast amounts of money on campaigns to lie about, malign and abuse rival party leaders through social media brigades, many of them bots, the type of data being fed to our people is evidently radicalising and polarising the country.
In this madness, as the MIT experiment suggests, the people of Pakistan are en route to Norman-like pessimism, negativity and psychopathy. There are clear signs of this already: we refuse to acknowledge anything good happening in the country, degrade our heroes and see only the negative in people. These are not just random signs but a 'code red' of a decaying nation and its people.
But can we blame the people when they are only the consumers of data, just like the algorithm? What matters more, as the MIT experiment shows, is the data, and of course those responsible for generating it. It is about time the elected political leaders, media houses and government institutions were made to realise the kind of hell they are unleashing on the future. This is not a secluded militant camp where a dozen kids are being brainwashed against the state. This is an entire country of 200 million people, of whom 60% are under the age of 25 and all the more vulnerable to absorbing bad data without filters.
The political leaders must immediately stop this vicious barbarity and sit across from one another to devise common ground on social media and media ethics, and bring sanity back to our national discourse. Let political leaders be reminded that when madness takes over, the first to go down in history are those in power.
Published in The Express Tribune, June 13th, 2018.