Facebook goes dark, shining a light on where we are

More than 3.5 billion people worldwide use Facebook and its apps. On the evening of October 4, Facebook, Instagram and WhatsApp experienced a six-hour outage, disrupting billions of lives. Fatima Dia looks at whistleblower Frances Haugen's hearing, the blackout, and what it all means for our mental health.

On the evening of October 4, Facebook, Instagram and WhatsApp experienced an outage that lasted six hours. Several rather funny memes and a brief Twitter dominance later, the nearly six-hour pause reminds us once again that we have delved too deep into the virtual world.

“Hello literally everyone,” tweeted Twitter’s official account.

Tottenham Hotspur footballer and Brazilian international Lucas Moura tweeted that the six hours without social media had him actually engaged in real conversation with his wife, noting that “she’s a really cool person.”

See, the jokes were hilarious, and the world did seem to unite over the mutual shock of Facebook and its apps going down. But it was a big hit to Facebook: the company lost $121 billion in market value on Monday, and Mark Zuckerberg’s net worth fell by $6 billion. That quickly.

The darker undercurrent remains: we have become too reliant on social media, and sometimes we don’t even realise just how much. 

How embedded in our lives is social media?

Raymond Zhong points out in the New York Times that the use of Facebook and its apps has gone way beyond mere chatting and sharing of pictures. 

Businesses are run through Instagram, political campaigns are carried out over Facebook, and aid services are offered through WhatsApp. Zhong wrote that politicians in Mexico were “cut off from their constituents,” pharmacies in Brazil stopped receiving prescription orders, and a nonprofit organization in Colombia that connects with victims of gender-based violence via WhatsApp halted its services.

Have you recently tried to create an account with an online store, or log into one online game or another? Usually, there’s an option that lets you log in through your Facebook account. Access to these websites and services was also cut off when Facebook went down.

There is an illusion of separation among the different apps used for all these purposes, from just having fun to offering mental health services to education. The truth is that these platforms have offered opportunity after opportunity for growth and connection, as well as access to services some people wouldn’t otherwise have.

But the even bigger truth remains that a single company monopolises all this access.

Facebook has acquired 78 companies over the past 15 years, including Instagram, WhatsApp, Oculus, LiveRail, Threadsy, and Atlas Solutions.

Alex Berryhill, director of digital operations for Cosas de Mujeres in Colombia, told Zhong: “Technologies are tools, not solutions.”

And like any other tool, they can be abused.

Facebook and the law

In the 2010s, Cambridge Analytica collected personal data belonging to 87 million Facebook users. The firm, which worked for the Trump campaign, used the data for analytical assistance during the 2016 presidential race in the United States, according to the company’s whistleblower Christopher Wylie. In his book, Mindf*ck: Cambridge Analytica and the Plot to Break America, Wylie describes how the information we make available through social media can be “synthesized and weaponized” to impact our thoughts and feelings, and even manipulate our habits, including our voting ones.

“It really starts to hit home that you have ended up contributing to manipulating these people’s world views to a point where they believe things that aren’t true… you promote radicalized thinking at scale or you provoke and encourage misogynistic viewpoints, and you end up harming society,” said Wylie in an interview with the Times.

Facebook CEO Mark Zuckerberg testified before Congress, and the company apologised for its role in the harvesting of supposedly private information. The company was later fined $5 billion by the Federal Trade Commission (FTC).

In December 2020, the FTC sued Facebook under the country’s antitrust law: the collection of federal and state laws that regulate the conduct and organization of business corporations, generally intended to promote competition and prevent monopolies.

Federal Trade Commission v. Facebook concerns the company’s acquisitions of Instagram and WhatsApp; the FTC claims Facebook holds monopoly power in the US social networking market. The suit, however, was dismissed by a federal judge in June 2021 for insufficient evidence that Facebook is a monopoly.

Allowed to amend its complaint, the FTC refiled in August 2021, asserting that Facebook has been a monopoly since 2011. The amended suit also cites evidence of Facebook’s power to control prices or exclude competition, most notably the creation of Bulletin, a writing platform that charges writers no entry fee, undercutting competitors that cannot afford to waive such fees.

The Whistleblower 

A couple of months ago, a whistleblower shared internal documents with the Wall Street Journal spanning three years of Facebook’s own research. The research shows that the platform, and especially Instagram, can damage mental health and body image. One internal presentation slide said the app makes body-image issues worse for one in three teen girls, according to the Journal.

Facebook’s head of public policy told CNN, when the Journal’s report came out, that the company was working to fix the issue and was “cautiously optimistic”. Critics point out that such a statement, even if it counts as a step, is not enough: the report shows Facebook has known about Instagram’s impact on mental health from the beginning, but has done nothing about it.

“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook, and Facebook over and over again chose to optimize for its own interests,” Frances Haugen told 60 Minutes on Sunday night, when the whistleblower revealed herself. Haugen joined Facebook in 2019 as a product manager on the Civic Integrity team, working on issues including democracy and misinformation. Her sentiments towards the company began to change when Facebook dissolved the team shortly after the 2020 election. She handed in her resignation in April and left in May of this year.

Facebook spokesperson Lena Pietsch told CNN Business that stating Facebook encourages the spread of misinformation and harm to its users is “simply not true,” reiterating that the company “continues to make significant improvements” to tackle these issues.

Haugen Hearing 

The Guardian called Tuesday’s hearing “one of the most useful Big Tech hearings yet” due to Haugen’s expertise and lengthy list of “actionable suggestions” for changing the company for the better.

One of the issues highlighted in the Journal’s September report that led to this hearing is Facebook’s planned Instagram Kids app. Haugen said that Instagram is “distinctly worse” than other platforms because it is about “bodies and about comparing lifestyles.” That characteristic of the app is blamed for the continuing deterioration of children’s and young teens’ mental health, especially teen girls’, as the Journal’s report states. An Instagram Kids app thus seems needlessly harmful, and Haugen expressed her worry that Facebook is not actually planning to abandon the idea, despite the company saying so.

In addition, Haugen called for new and stronger regulation of Facebook, to which senators agreed. Comparing Big Tech to Big Tobacco, senators argued that regulation similar to that imposed on cigarettes could be adapted for social media.

“Congress will be taking action,” said Senator Ed Markey during the hearing. “We will not allow your company to harm our children and our families and our democracy, any longer,” Markey added.

One of the most interesting points Haugen made during her hearing shifted the perspective outside the US: she claimed that Facebook does not fairly distribute its resources for researching misinformation and hate speech, leaving non-English content underserved. She argues that such gaps help fuel violence in other parts of the world, including Ethiopia. This is a significant point because of Facebook’s global status: a company with this much access, and this much potential to influence its users, should not fall short with its global audience. Such systemic discrimination is problematic in many ways, not least the message it sends: a subtle nod to English-language superiority. The platform is global; so should the resource allocation be.

Moreover, the impact of social media on users’ health is evident in the US, and that is with the resources currently devoted to English content. What is left to say about the unfiltered, unmoderated, potentially harmful speech in other languages, and how it affects other people’s mental health?

Last but not least, Haugen condemned Facebook’s lack of transparency and its suppression of scrutiny by internal and external auditors, referencing the company’s decision in August to cut off researchers at New York University from data about vaccine misinformation.

It’s important to note, however, that Haugen’s stated purpose from the beginning is not to make people hate Facebook, but to change it for the better. Some of the changes she suggests include: an independent government oversight body staffed by former tech workers who understand the algorithms, changing the News Feed to be chronological rather than ranked by algorithm, and requiring Facebook to be more transparent about its internal research.

The general public seemed to agree that this was one of the most productive hearings on Big Tech yet, with possible concrete solutions drawn up. However, Haugen explicitly said she was against breaking up Facebook, a statement many saw as a bit hypocritical. Will Oremus, a Washington Post journalist and tech writer, points out that Haugen views Facebook’s problems as shared by all social media platforms, and in her eyes, regulation is the answer; a problem concentrated in one place, a Facebook “monopoly”, is easier to deal with than one fragmented across the market.

“One thing I think Haugen’s anti-break-up argument overlooks,” Oremus said on Twitter shortly after the hearing, is that “a company of Facebook’s size and power can influence, outmaneuver, and in some cases just steamroll government efforts to rein it in. So instead of independent oversight, you get regulatory capture.”

The truth is we rely on social media for more than just enjoyment or sharing of news. It has become a platform for information, businesses, help, and services— essentially encompassing all areas in our lives. The outage on Monday night paused the world for a few hours, and although many saw it as a reprieve from the fast-paced never-ending flow of information we’re dependent on, others felt it a little more fundamentally, with their livelihoods put on hold until the services were made available again. 

Facebook’s sheer size and its monopolisation of such a vital tool in today’s world give the company too much power. We’ve seen over and over again the subtle and not-so-subtle ways in which social media platforms use algorithms to influence users, children even more so.

Some of Haugen’s suggestions, specifically changing the News Feed from an algorithmic timeline back into a chronological one, imply a drastic change to the way social media functions now. Oremus says: “[A chronological timeline] would make our feeds crappy and boring, and either spam-ridden or overly sanitized or both. The question is: have things gotten so bad that it’s worth it?”

Well, have they? 