Algorithmic cauldron

The fact that some can manipulate these algorithms seems to give policymakers worldwide a cause for concern


Farrukh Khan Pitafi December 17, 2022
The writer is an Islamabad-based TV journalist. He tweets @FarrukhKPitafi and can be reached at contact@farrukh.net


What do you think these three clips, ‘pawry ho rahi hai’, ‘mera dil ye pukare aaja’ and ‘Lahore da pawa, Akhtar Lawa’, have in common? Yes, they all went viral on social media, but what explains this virality? Algorithms. We live in the age of algorithms. Their rise has only just begun to remould a world already aggressively reshaped by the internet and technology.

When we try to understand the current transformation, Nietzsche’s aphorism, “if you gaze into the abyss, the abyss gazes into you”, comes to mind. We have been studying and commanding machines and software for a while; now it is their turn to understand and moderate our behaviour. I am not complaining. So far, they have done nothing but help. But it tells of the shape of things to come, of our feeble attempts to control these unfixed, mutating forces of change, and of the need to put them to good use.

But before trying to look under the hood and appreciate the enormity of the change, let us return to whence we began: social media virality. How many times have you watched the videos above and wondered what was so special about them? I have often heard people complain that her use of the word party is pretentious, that her dance is just so-so, or that Akhtar Lawa has an unfortunate past. Think whatever you will; an audience and the algorithms have decided they are stars. That’s right: one-hit wonders. Dananeer Mobeen is now a budding model and a TV star. The number of foreign and local videos copying Ayesha’s steps, and her TV appearances, keeps multiplying daily. The same goes for Akhtar Lawa, even though we still don’t know what shape his career will take given his age and the limited skills on display.

But how does it work? Unlike most algorithms, the story here is fairly simple. You post something on social media. If you are lucky and an influencer stumbles upon it and reshares it, you may soon witness a snowball effect. But how does it reach those influencers in the first place? Too much has been written on the art and science of online content creation and distribution to bear repeating here. Suffice it to say that everything counts, from the nature of the content and its production values to search engine optimisation (SEO) and search engine marketing (SEM). In a nutshell, a lot of thought goes into it. Others turn to professional boosters who will get your work trending, but that requires money, and not everyone has it. And it certainly does not explain spontaneous content like the ‘pawry’ clip. Yet if you trace the growth of that clip, you realise the initial reaction was critical and sarcastic. That takes us to another aspect of virality: it is value-neutral. Whatever the original reason for trending, if it is seen, it sells.
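The snowball effect described above is easy to put in numbers. Here is a toy back-of-the-envelope model; the share rate, follower counts, and seed audience below are all invented for illustration and bear no relation to any real platform’s figures.

```python
# A toy model of the "snowball effect": each viewer reshares the clip
# with some probability, so reach compounds generation by generation.
# All rates and audience sizes here are made-up numbers.

def viral_reach(seed_viewers: int, share_rate: float,
                followers_per_sharer: int, generations: int) -> int:
    """Estimate total views after a number of resharing rounds."""
    total = seed_viewers
    current = seed_viewers
    for _ in range(generations):
        sharers = int(current * share_rate)       # fraction who reshare
        current = sharers * followers_per_sharer  # new audience reached
        total += current
    return total

# A clip seen by 100 people, 5% of whom reshare to 200 followers each,
# crosses a million total views within four rounds of resharing.
print(viral_reach(100, 0.05, 200, 4))
```

The point of the sketch is only that modest per-person sharing compounds multiplicatively, which is why one influencer’s reshare can matter more than thousands of ordinary views.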

This is where social media algorithms come in. A platform’s core engines assess each item, weighing early engagement signals. If they decide that a clip, tweet, photo, or text is likely to evoke public interest, they escalate it to the recommended-content list, where far wider audiences can access it.
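No platform publishes its actual ranking code, but the escalation step described above can be sketched as a toy scoring function. The weights, the threshold, and the post data below are all invented for illustration; real feed engines use far more signals.

```python
# A minimal sketch (not any platform's real algorithm) of feed
# escalation: score each post by early engagement signals and promote
# those above a threshold to a recommended-content list.

def engagement_score(views: int, likes: int, shares: int) -> float:
    """Engagement per view; shares weighted most, as reshared content spreads furthest."""
    if views == 0:
        return 0.0
    return (likes + 3 * shares) / views

def recommend(posts: list[dict], threshold: float = 0.1) -> list[str]:
    """Return ids of posts whose engagement rate clears the threshold."""
    promoted = []
    for post in posts:
        score = engagement_score(post["views"], post["likes"], post["shares"])
        if score >= threshold:
            promoted.append(post["id"])
    return promoted

posts = [
    {"id": "pawry", "views": 1000, "likes": 80, "shares": 60},
    {"id": "cat_photo", "views": 1000, "likes": 20, "shares": 5},
]
print(recommend(posts))  # only the high-engagement clip is promoted
```

The design choice worth noting is that the score is a *rate*, not a raw count: a clip with a small but highly engaged early audience can out-rank content that merely has many passive views, which is exactly how an unknown poster can leapfrog established accounts.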

In our growing corporate/job culture, where anonymous, identical work cubicles threaten to turn us into nameless, faceless work drones, this is a welcome change of pace. Create something awesome, and social media algorithms will make you an instant star. If you find a way to sustain that moment, you get a very successful career. If not, well, better luck next time.

The fact that some can manipulate these algorithms seems to give policymakers worldwide cause for concern. In Pakistan, you must have heard a lot about hybrid and fifth-generation warfare, and then, of course, about the Prevention of Electronic Crimes Act (PECA). But none of this grasps the beast’s true nature. The algorithms that worry you because they can be subverted to society’s detriment are vulnerable only because they are still evolving. If you know anything about machine learning, artificial neural networks, or artificial intelligence, you will see that these systems learn very swiftly; concepts like recursion and nested processing loops (several loops working inside the main loop, like Russian dolls) hint at how quickly their capabilities compound. Such software, search engines, and trend subroutines will be easy neither to manipulate nor to regulate. Their powers grow with your use: the more data you generate, the more directly you teach them who you are and how you choose. Stripped of all our attempts to project motives and hidden desires onto these programs, this is a very useful skill set.

Take, for instance, the issue of deepfakes. If software keeps processing datasets as big as the internet itself, with virtually unlimited processing power, will it not be able to detect what is fake and what is real? That is happening every single day. And these systems operate within defined parameters; it is only when our imagination runs amok that we see monsters in the machines. When have they ever given you real cause to complain? So, in the coming days, the ability of both state and non-state actors to game the merit system will be quite limited, and so will the ability to regulate it. I consider initiatives like PECA the failing attempts of outdated legacy outfits to maintain some semblance of control. Laws and ordinances take time to draft, pass, and update; these technologies do not wait.
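The claim that “their powers grow with your use” can be shown with a toy online learner; the hidden preference cutoff and the feedback values below are invented for illustration. Each recorded interaction narrows the machine’s estimate of what the user wants, with no redesign of the software itself.

```python
# A toy online learner: the user clicks items whose value exceeds some
# hidden preference cutoff. Every observed click or skip tightens the
# bounds on that cutoff, so the estimate sharpens purely through use.

def learn_cutoff(feedback: list[tuple[float, bool]]) -> tuple[float, float]:
    """Return (estimated cutoff, remaining uncertainty) from (value, clicked) pairs."""
    low, high = 0.0, 1.0
    for value, clicked in feedback:
        if clicked:
            high = min(high, value)   # cutoff is at or below any clicked value
        else:
            low = max(low, value)     # cutoff is above any ignored value
    return round((low + high) / 2, 3), round(high - low, 3)

# Two interactions give a coarse estimate; two more shrink the
# uncertainty from 0.7 down to 0.1 without changing a line of code.
print(learn_cutoff([(0.9, True), (0.2, False)]))
print(learn_cutoff([(0.9, True), (0.2, False), (0.6, True), (0.5, False)]))
```

This is the sense in which regulation chases a moving target: the system’s accuracy is a function of accumulated user data, not of any fixed, inspectable rulebook.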

But as reliance on such platforms continues to grow, many concerns emerge. How will they affect the social fabric? What about the safety of your data? What about doxxing that could jeopardise your physical safety? Will this culture of instant likes and dislikes seep into our daily lives, the justice system, and social and familial interactions? These are the real questions. The Cambridge Analytica fiasco showed how your data and information can be exploited and weaponised against you and the system. What happens when data harvesters develop far more sophisticated tools and deploy them on a mercenary basis? What if a nation’s lawmakers decide that the judicial system requires a jury comprising all citizens, reached through a social media app and a digital identity, so that your like and dislike buttons determine who is innocent and who is guilty? These questions require the state’s immediate attention and resources. If there is a lesson for our state, it is this: instead of trying to regulate what changes at a pace much faster than our imagination, it is prudent to find answers to these questions, because that, at least, is doable.

Published in The Express Tribune, December 17th, 2022.

