Facebook might know more about you than your wife.

In a recent article, a man reported that Facebook had uncovered a long-withheld piece of information from his family's history and begun tailoring advertisements toward it. This is not the first instance of Facebook appearing to deduce interests that, while possibly true, are best kept outside a social media setting. Yet this is fundamentally how social media works. While the applications undoubtedly provide wonderful services of connectivity, communication, and information sharing, the platforms themselves have spiraled into an amber trap for time and information.

Most recently, ProPublica broke the story that “Facebook’s ad tools could target racists and anti-Semites using the very information those users self-report.” Following this discovery, a slew of other investigations has shown that even Google’s search engine allows sellers to deliberately place ads next to slanderous and hateful rhetoric, sometimes even suggesting that advertisers include hateful terms in their keyword targeting.


According to New York Times contributor and UNC professor Zeynep Tufekci, “Facebook is a social medium which, at its core, is an architecture designed to capture your attention. You may in the moment want to do something, but that is not necessarily what you would want to do if you were asked in the morning.”

So, what are the implications of this attentional, political, and statistical bottleneck?


It has become common knowledge that Facebook and social media played an enormous role in the 2016 American election cycle. Once Facebook’s algorithm recognized a person’s political leanings, the application lumped them into silos of like-minded opinion. Facebook thus created an illusion of free-flowing information but, in reality, had placed its users in a room full of self-reflecting mirrors.

In the aforementioned interview, Tufekci notes that “people who supported Hillary Clinton and posted regularly about her never actually saw the posts of Donald Trump supporters. That is, unless you are one of the few who is combative and engages in a conversation which, in turn, keeps you in the app for longer.” An argument means more time on site, and that is a win for Facebook.

In the past, people received their information from television, radio, and print news. These were all public performances where politics and advertisements could be condemned and fact-checked by the opposing view. With Facebook, however, only the targeted user sees a particular post, and it is usually something with which they agree – even if it isn’t true. Thus arises the phenomenon of “fake news”: as an individual’s digital universe becomes more and more private, exaggeration goes unnoticed because the other side never sees it. It is not that these fictions are a new development for humanity; people have been lying and exaggerating forever. What is most insidious is that now, rather than using clubs and hammers to provoke a public reaction, people can be surgical and manipulative behind closed doors.


The explosion of social media has thus borne a new brand of capitalism, one that siphons personal and behavioral data to be sold and used in advertising. In a sense, Facebook, Google, and other tech giants have automated a system that force-feeds a person’s tendencies back to them, rather than allowing those tendencies to vary. This is surveillance capitalism at its finest: a global machine that both produces and consumes the same resource – data.

Facebook headquarters entrance sign in Menlo Park.

One of the scarier aspects of this growth in surveillance capitalism and data sales is the potential socio-political implications further down the line. As a platform that pervades our lives, “Facebook knows what happened in the last election but they’re keeping it as private data. It is asymmetric information and we are not allowed to understand it. The powerful have the knowledge and the masses are kept in the dark,” says Professor Tufekci.

This obscurity is a strange outcome for a concept like social media which, at its inception, was based upon transparency and sociability. One would think that Facebook’s ability to understand group and individual behavior more thoroughly would lead not to murkiness but to clarity. Instead of using population-level data to analyze what a society really wants and needs, large social media companies are doing exactly the opposite.

In their defense, this is largely an unintentional side effect of being both a radically new technology and one that scaled faster than anticipated. It took just eight years for Facebook to grow from 1 million users to 1 billion. That is perhaps the most rapid scaling of anything in the history of mankind. As intelligent as Mr. Zuckerberg and Ms. Sandberg may be, humans did not evolve to wield so much power so quickly. They simply were not prepared.

In the words of historian Yuval Noah Harari: “What is more dangerous: giving nuclear weapons to sheep unaccustomed to power, or to a wolf who has evolved with it for millions of years?”

There is no doubt that social media and the data it generates can be of great value to the world; however, for the time being, we are just sheep with nuclear weapons.