Facebook has trapped us with one hilariously simple oversight

Welcome to the matrix


Dilshan Senaratne November 28, 2016

Sometime in mid-2006, Zuckerberg and his team at Facebook came out with what today is arguably the cornerstone of global information dissemination — the news feed. The move at the time created such an uproar that Zuck himself penned (keyed?) a blog post titled “Calm down. Breathe. We hear you.” in a bid to pacify public rage.

A decade down the line, the outcries are no more, and Facebookers hardly remember a pre-news feed era. The news feed today, for many of us, is the strongest link to what is going on in the community we live in and the world as a whole. As a result, Facebook today is the most powerful media tool at the disposal of politicians, marketers, and artists alike.

Over the years, as Facebook grew its user base, its users also grew their personal networks exponentially. In 2014, Statista reported an average of 650 friends per user in the 18 to 24 age category, illustrating the kind of connections we are dealing with. Considering that the average Facebook user reportedly generates 90 pieces of content per month, simple math (650 friends x 90 posts each) tells us that the typical Facebook user is served something in the range of 58,500 individual items of content every month.


The growth of content itself plays right into the strategy Zuckerberg envisioned in his college dorm many years ago, but this success yields a pretty obvious complication: which content gets priority.

For engagement's sake, it's critical to get that prioritization right (we should be presented first with what interests us most). To address this deceptively tricky problem, Facebook came up with a complex ranking algorithm.

You see, Facebook in its earliest days was driven by a motto which read in the boldest of fonts — "Move fast and break things," and that's exactly what they did. They moved fast enough to extend their data-capture perimeter to include our preferences. Largely unknown to us, for many years now a team of data scientists based out of the company's headquarters in California has been analyzing the way we like, comment, share, and engage with the content we see.

The result is an algorithm smart enough to know what we might like and prefer to see at the top of our news feeds. In essence, Facebook is gently nudging the right content in our direction to help us see what we want to see.
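To make the idea concrete, here is a minimal sketch of engagement-weighted ranking. The post data, weights, and topic affinities below are made up for illustration; this is only the general principle described above, not Facebook's actual algorithm.

# A toy illustration of engagement-weighted feed ranking -- not Facebook's
# real ranking system, just a sketch of the general idea.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    likes: int
    comments: int
    shares: int

def relevance_score(post: Post, my_topic_affinity: dict) -> float:
    """Score a post by how much the user tends to engage with similar content."""
    # Hypothetical weights: shares and comments count as stronger signals than likes.
    engagement = post.likes * 1.0 + post.comments * 2.0 + post.shares * 3.0
    # Boost content on topics the user already interacts with most.
    affinity = my_topic_affinity.get(post.topic, 0.1)
    return engagement * affinity

posts = [
    Post("alice", "politics", likes=120, comments=40, shares=10),
    Post("bob", "sports", likes=300, comments=5, shares=2),
]
# Affinities inferred from past likes, comments, and shares (values made up here).
my_topic_affinity = {"politics": 0.9, "sports": 0.2}

feed = sorted(posts, key=lambda p: relevance_score(p, my_topic_affinity), reverse=True)
for p in feed:
    print(p.author, p.topic, round(relevance_score(p, my_topic_affinity), 1))

Run it and the politics post outranks the far more popular sports post, simply because the user's past behavior says politics is what they engage with.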

Relevance is the name of the game, and while the concept is not without noble merit, Zuckerberg hasn't quite let go of his self-serving, intellectual-property-robbing Harvard days. The advertising revenue Facebook is rolling in right now is also heavily dependent on relevance for its effectiveness.

The commercial targeting efficacy Facebook offers is unparalleled at this point, and the ROI is completely transparent and based on engagement. This ability to know who might potentially buy something is the ace up Facebook's sleeve, and it hasn't cost us too much. Not until now, at least.


The hilariously simple oversight

The year is 2016, and Donald Trump, the least likely candidate to have risen to presidential consideration in modern US history, has triumphed over Hillary Clinton, former first lady and secretary of state. Turmoil and chaos ring through America, and the blame game has only just begun.

The Trump victory caught many off guard. In the age of connectivity, could blatant public sentiments such as presidential preferences really stay largely unknown? When most of us know what our roughly 650 connections had for lunch, it's strange to imagine that we didn't know the same people's political standpoints. Did they conceal them?

They didn’t.

Facebook is perhaps the most frequented opinion billboard of modern times; a reported 28 percent of all time spent online goes to social media. To add to social media's monopoly on our mindshare, most of the articles and external content platforms we visit are likely reached through links from Facebook.

The usage in itself is not the problem. In fact, people express their views freely on social media, so, rationally, the platform should offer a very real view of the situation on the ground.

Unfortunately, however, the same algorithm that sorted relevant content for you has been at work behind the curtain. Facebook has unknowingly (or knowingly) sealed us inside a bubble of our own content preferences. Hillary supporters only ever saw content that called Trump out on his outlandish behavior and supported the Clinton cause. Democrats stayed steadily grounded and assured in the coming of America's first female president, while all hell broke loose outside of what is now coming to be called the filter bubble.
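The feedback loop behind this is easy to sketch. Continuing the toy model from earlier, with entirely made-up numbers: every time you engage with content you already agree with, the ranker raises that topic's weight, which surfaces more of the same and buries the rest.

# Toy sketch of the filter-bubble feedback loop (made-up numbers, not real data).
affinity = {"pro_clinton": 0.5, "pro_trump": 0.5}  # start out neutral

def update(affinity, engaged_topic, rate=0.2):
    """Nudge up the weight of whatever the user engaged with; let the others decay."""
    for topic in affinity:
        if topic == engaged_topic:
            affinity[topic] += rate * (1.0 - affinity[topic])
        else:
            affinity[topic] -= rate * affinity[topic]

# A Clinton supporter keeps engaging with content they already agree with...
for _ in range(10):
    update(affinity, "pro_clinton")

print(affinity)  # roughly {'pro_clinton': 0.95, 'pro_trump': 0.05}
# ...so opposing content is ranked ever lower and effectively vanishes from the feed.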


The filter bubble is a hilariously simple oversight with some pretty sobering consequences. Everything you know about what the world is like, at least the portion of your experience that has been molded digitally, may have no real validity.

More room for unbiased facts

Present day. Zuckerberg took the stage in the election aftermath to address the accusations directed at Facebook over the filter bubble. Of course, he denied that the claims were plausible, shifted the blame to how content is managed rather than how it is ranked, and then pointed to the humans who now play a role in moderating the news feed algorithm. All fine defenses; none of them really solves the problem, but hey, let's not let the truth get in the way of the story.

Humans as a species, for all their sentience, have severe difficulties navigating their natural biases. To keep it simple, we are constantly perceiving things inaccurately. Add fuel to the fire with a primary information source that only heightens those biases, and we have soup on our hands.

Unbiased facts are critical for forming arguments; they are also critical for formulating accurate hypotheses — neither of which can be achieved if we are only shown what we would like to see.

The key lesson here is simply a matter of adjusting our judgment. There is no way to overcome the filter bubble except by consciously questioning our standpoints and the information we receive. A little time away from news-feed-driven content is a potential solution, but who are we kidding; we are too far into the bubble to tear our way out of it.

So, in closing, welcome to the matrix; try and stay aware.

This article originally appeared on Tech in Asia.
