What We Learned from Facebook This April

It feels like only yesterday that we watched Mark Zuckerberg’s overly earnest eyes stare into dozens of cameras as he gave testimony in light of revelations that Cambridge Analytica used Facebook data to influence U.S. voters. Almost two weeks have passed since Zuckerberg spoke to members of the U.S. Senate and House of Representatives for ten grueling hours. Where are we now?

Many people are used to thinking of Facebook as a symbol of the “American dream,” with its “humble” beginnings in a Harvard dorm room, started by a college dropout (a smart, but still very lucky, white man). Facebook connects over two billion users across the globe; it played an important part in allowing people to mobilize for the Arab Spring of 2011, helping overthrow dictatorships in favor of more democratic regimes.

However, in the past few months, amid the scrutiny of the 2016 U.S. elections, Facebook has made news not as a tool that supports democracy but as a threat to it. It all started when Aleksandr Kogan, a data scientist who also lectured at St. Petersburg State University, gathered Facebook data from millions of American users. He then sold it to Cambridge Analytica, which worked in support of Donald Trump’s 2016 presidential campaign. As a result, we are left with a myriad of questions about Facebook’s approach to privacy and its intimate role in our lives.

For the past few years, Mark Zuckerberg, Facebook’s CEO and its public face, has portrayed his company as humanitarian and “unbiased.” But there is a dark side to Facebook. Like many large corporations, Facebook outsources labor that is unwanted in the U.S. and cheaper in the developing world. Those Facebook employees, working under the vague title of “Community Operations team,” have to screen copious amounts of user-generated content, including disturbing and traumatizing material depicting murder, suicide, and rape. A lesser-known ethical issue is how much Facebook pays its employees. While last year’s median salary appears high at $240,000, in reality an employee’s pay is calculated by a formula that takes previous salary into account, meaning salaries are significantly lower for most, especially given the current gender pay gap, which is even worse for women of color. A Facebook project manager can receive as little as $46,000 a year.

Ethical questions are looming large, and the issue of privacy has taken center stage. Apart from the main concern about user data, the testimony also revealed a desperate need for tech literacy among politicians. The footage featured Zuckerberg’s reassurance that Facebook does not listen to you through your phone. The company’s main source of revenue is advertising; as a result, your clicks shape your ad suggestions and your newsfeed in general. No spying needed.

Both Zuckerberg and lawmakers seemed to agree that Facebook needs regulation. “I think some bills will pass,” said Abhishek Nagaraj, an assistant professor at UC Berkeley’s Haas School of Business. “I see some agreement on both sides of the aisle that a legal framework for internet/data businesses is important, and with the GDPR being passed in the EU, there is some precedent and pressure on U.S. lawmakers.”

Another very important issue raised during the testimony was the need to make the terms of agreement more accessible to the general public. It was clear that the average user, myself included, had no idea how to find the information Facebook has on them. Seeing it all on one page makes you really reevaluate what you share on Facebook. We know that our newsfeeds are a reflection of our bubbles, which are sharply divided along party lines. But no matter what you would like the world to be, it is healthy to be informed about what other people read and think. I am not exactly looking forward to reading Fox News or Trump’s Twitter, but I cannot and should not pretend they do not matter.

Zuckerberg could make Facebook explicitly liberal, since it is a private company, but he realizes that Facebook has become something much bigger. It requires us to discuss freedom of speech, freedom of expression, and hate speech. A popular stance appears to be against limiting any amount of free speech, as that may be a slippery slope to censorship. Freedom of speech, however, is an ideal that does not account for realities of our society such as race- and gender-based hatred online, which produces very real outcomes, including the psychological toll of online abuse and real-world discrimination and violence against people of color, women, and other marginalized groups.

In addition to the aforementioned issues, there are still more Facebook-related questions we as a society need to contemplate:

  • What is a Facebook friend? How does it affect our IRL social interactions?
  • Are our worries about our Facebook data simply a reflection of our broader concerns of our digital footprint? How do we become digitally literate?
  • Zuckerberg admitted he is responsible for Facebook’s content, but what about our own ability to think critically about the world? The sheer amount of data makes it impossible to identify every article containing fake news. While Facebook holds responsibility for its platform, we must also retain personal responsibility for the ways we interact with and digest content. It is time we went back and also addressed issues in public schools, which too often fail to teach critical thinking.

As we wait to see what lawmakers and Facebook decide to do next, we too have a lot of thinking to do.

Laura Auketayeva

History PhD student at American University
Hobbies include making memes and stand-up comedy
