From its stance on extremist content, to its vast caches of user data, Facebook is a corporation whose power must, finally, be reined in, says freelance journalist Ellie Mae O'Hagan
The revelation that Cambridge Analytica exploited the data of 50 million Facebook profiles to target American voters is indeed frightening. But Cambridge Analytica shouldn't act as a diversion from the real bad guy in this story: Facebook. It is mystifying that as his company regulates the flow of information to billions of human beings, encouraging certain purchasing habits and opinions, and monitoring people's interactions, Mark Zuckerberg is invited to give lectures at Harvard without being treated with due scepticism.
We have now reached the point where an unaccountable private corporation is holding detailed data on over a quarter of the world's population. Zuckerberg and his company have been avoiding responsibility for some time. Governments everywhere need to get serious in how they deal with Facebook.
After trolls were sent to jail for sending threatening messages to the activist Caroline Criado-Perez and MP Stella Creasy, a debate ensued over whether the likes of Facebook and Twitter should be classified as platforms or publishers. Facebook is treated as if it is simply a conduit for information, meaning it is not liable for the content its users share, in the same way that BT can't be sued when people make threatening phone calls.
In 2014 Iain MacKenzie, a spokesperson for Facebook, said: "Every piece of content on Facebook has an associated report option that escalates it to your user operations team for review. Additionally, individuals can block anyone who is harassing them, ensuring they will be unable to interact further. Facebook tackles malicious behaviour through a combination of social mechanisms and technological solutions appropriate for a mass-scale online opportunity."
But the company is evasive about the number of moderators it employs, how they work, and how decisions are made. It has started taking a firmer line on far-right content, recently removing Britain First pages from the site, but it is still resisting many legislative attempts to regulate its content. What content users then see is decided by an algorithm that can change without any consultation, whether with governments or with the businesses that rely on Facebook for revenue, meaning that some of those businesses can be quickly wiped off the map. In February 2018 the website Digiday reported on LittleThings, a four-year-old site that shut down overnight after Facebook decided to prioritise user posts over publisher content. A hundred jobs were lost.
Facebook wasn't the only contributor to LittleThings' demise, but those working at the website said there was nowhere else to go after the algorithm change. And this isn't the only example: in 2013 an algorithm change halved the traffic of viral content website Upworthy, something from which the website has never recovered.