In Facebook’s early days, Mark Zuckerberg ended weekly meetings by raising his fist and shouting “domination”.
On a call with investors on Monday, the boyish social media titan struck a similarly defiant tone, promising that Facebook would throw its weight behind efforts to lure younger users back to the platform after their numbers had dwindled. He pledged the company would build the “successor to the mobile internet”, an avatar-filled virtual world known as the metaverse.
But the chief executive was also swift to address mounting allegations that his company has relentlessly placed “profits over safety” and downplayed its role in the poisoning of democratic society - misleading both investors and the public in the process.
“When we make decisions, we need to balance competing social equities,” Zuckerberg said, citing as his first example balancing free speech - of which he has been a fierce proponent - with reducing harmful content.
“It makes a good sound bite to say that we don’t solve these impossible trade-offs because we’re just focused on making money, but the reality is these questions are not primarily about our business, but about balancing different difficult social values,” he added.
Thanks to one of the biggest leaks in corporate history, Facebook is fighting claims that it has done little to shed the “growth at all costs” culture that turbocharged its rise to 3.58 billion users and quarterly sales of more than $29 billion. It is an image that Zuckerberg has sought to overturn with billion-dollar investments in moderation, safety and what his company calls “integrity” work.
But Frances Haugen, an employee on the Facebook integrity team until May 2021, argues that the company’s commitment to the cause is insincere. To prove it, she leaked tens of thousands of internal documents - including many from employee discussion sites, company presentations and research papers - that have unveiled the inner workings of Facebook. She has also filed eight complaints against the company with US securities regulators. Facebook, in turn, has sought to portray Haugen as a junior employee cherry-picking documents to fit her own narrative, with little knowledge of some of the issues on which she has taken a view.
The reality is that the documents show Facebook is painfully aware of the harm the platform and its algorithms can cause: exacerbating the poor mental health of teenagers, accelerating polarisation in countries where the political landscape is fragile, fuelling misinformation and conspiracy theories.
In many cases, Facebook researchers are actively wrangling with ways to solve these thorny issues. When these good-faith efforts fall short, it is often because they are stymied - sometimes by top-down pressure, but also by the technical and bureaucratic challenges that come with managing a sprawling $915 billion company. Although its share price remains resilient, the leak has also shown where Facebook’s future problems could lie, as global regulators circle the company and its Big Tech counterparts.
Facebook has “huge numbers of people working on analysing and fixing” its content problems, says Benedict Evans, an independent technology analyst. “But trade-offs, [organisational] structure, conflicting priorities, language, tech limitations, politics and massive growth means lots of that work is broken.”
Losing the youth
The latest firestorm of Facebook criticism has built up in several stages, part of a slick campaign by Haugen and a team of legal and press relations professionals supporting her. Copies of the internal documents were disclosed to US regulators and provided to Congress in redacted form by Haugen’s legal counsel. The Wall Street Journal was the first media group to receive and report on the documents.
Haugen (37) then went on 60 Minutes, the US TV news programme, in early October to reveal herself as the source. Two days later, she testified before Congress. Now, a consortium of news organisations, including the Financial Times, has obtained the redacted versions of the documents received by Congress, prompting fresh coverage.
They reveal Facebook to be acutely aware that it is considered by younger generations to be desperately uncool. This is not a new phenomenon, but it has become more pronounced in recent years. According to a March 2021 document, daily user numbers on Facebook in the US for teenagers and for young adults aged 18 to 29 are in decline, and are projected to fall by 4 and 45 per cent respectively over the next two years.
“Young adults perceive [Facebook] content as boring, misleading and negative,” one November 2020 research presentation reads, citing data from multiple qualitative and quantitative studies. As a network, it is considered “outdated”, and time spent on it “unproductive”.
Even Instagram, the photo app it bought in 2012 for $1 billion that has until now been a magnet for young people, shows “worrying” trends in the consumption and production of content by users, other documents from 2021 reveal. This is blamed partly on the dizzying rise of Chinese-owned rival TikTok during the pandemic and, experts say, does not bode well for the company.
“I can’t think of a social platform which has begun a sustained decline in terms of users that has then been able to recover from that,” says Andrew Lipsman, principal ecommerce analyst at Insider Intelligence. “[Though] the trends can take time to bear out.”
On Monday, Zuckerberg scrambled to dispel allegations that Facebook was hiding such challenges from investors by announcing that the company would be “retooling” its teams “to make serving young adults the north star, rather than optimising for the larger number of older people”.
It is against this tense backdrop that Facebook has been making many of the decisions outlined in the documents. Among them, the company has embraced controversial efforts to build a version of Instagram for under-13s, Instagram for Kids. This, the documents reveal, is despite internal research showing a complex impact on the mental health of young people, with some affected detrimentally but others benefiting from social media use.
Facebook says the push to attract those aged under 13 is an attempt to offer parents extra controls when their children will probably be on the internet anyway. But two former staffers, speaking on condition of anonymity, dismiss this suggestion.
The efforts are instead designed to get young people hooked on the platform early, one former employee says. “[This remains] an internal culture where product managers are motivated - and compensated - to show impact by driving engagement and user growth,” the person adds.
‘Embarrassing to work here’
Brian Boland, the former vice-president of partnerships marketing at the company, agrees that Facebook cares about safety “to a point” but “errs on the side of growth”. The company is also, he says, the clearest example of the issues thrown up by the new frontier of surveillance capitalism - the commodification of a person’s online data for profitmaking purposes.
These issues do not all appear to be of Facebook’s own deliberate making but rather, some documents show, an unpredictable byproduct of its dizzying growth. But the fundamentals of how the site operates - engaging users with “likes” and “shares” - have not escaped internal scrutiny, prompting soul-searching comments from employees.
One Facebook worker wrote in response to an August 2019 research paper that the company had “compelling evidence” that its “core product mechanics” such as recommending groups or people to users and optimising for engagement were a significant part of why hate speech and other unwanted material was able to “flourish” on the platform.
There is evidence of some efforts to measure and mitigate the issue. A document presented to Zuckerberg in February 2020 outlined Project Daisy, an initiative to hide “likes” and other metrics from users to see if removing the measure of popularity would make users feel better about using the platform.
The effects of Project Daisy on wellbeing were negligible, an internal study subsequently found. It did, however, have a much more marked impact on advertising, driving down performance. An option to hide likes, introduced after the Project Daisy discussions, can now be found on Instagram. Another project, codenamed Drebbel, sought to monitor the spiralling effects of so-called “rabbit holes”, where users are directed towards harmful material by the site’s recommendation algorithms.
Among Haugen’s most powerful accusations against Facebook is that the company has not just ignored but knowingly fanned the flames of violence and misinformation across the world, and especially outside the English-speaking world. Internal documents show a crippling lack of in-country moderation and of local linguistic support for widely spoken languages such as Arabic, which has helped compound horrific real-world harms, from ethnic cleansing to sex trafficking and religious rioting, in places such as Dubai, Ethiopia, India and Myanmar.
In her testimony to members of the UK parliament on Monday, Haugen said the consequences of Facebook’s choices in the so-called Global South were a “core part” of why she came forward, describing the violent ethnic conflict amplified by Facebook in Ethiopia as the “opening chapter of a novel that is going to be horrific to read”.
Other documents show that Facebook has become a Petri dish for co-ordinated extremist groups all over the world. Sophie Zhang is a former data scientist on Facebook’s “fake engagement” team, which was created to identify and shut down inauthentic activity. She blew the whistle on the company’s inertia in tackling political manipulation, and says she began uncovering evidence of manipulative activity on the platform in multiple non-US markets in September 2018.
“Essentially I thought when I handed [the evidence] over, they would prosecute it. I didn’t expect I would have to do it myself. I was trying to get [colleagues] interested, and everyone agreed it was terrible. But no one thought it was important enough to act,” she says.
For poorer, smaller countries, it would often take outside reports and pressure - complaints from NGOs, opposition parties or reputable US media - to spur change, Zhang says. She describes one instance of political manipulation that Facebook did not want to act on until an external party threatened to go to The New York Times. Facebook says the allegations are untrue.
But Zhang is not alone in her experience. Documents show internal dismay at the company’s failures to curb hate speech and misinformation ahead of the January 6 Capitol insurrection in Washington. In others, frustrated staffers vent on its internal message board that their efforts to clean up the platform are hindered by senior leadership.
“It’s hard not to feel like the work my team and I do on preventing incitement of violence is completely erased by forces within the company but outside of our control,” wrote one staffer. “It makes it embarrassing to work here.”
Several employees argue in the posts that content policy in particular is at the whim of political and media pressure, with senior leadership allowing exceptions to be made either to placate a vocal high-profile figure, or to avoid accusations of bias from rightwing leaders. One calls for a “firewall” between the policy and lobbying team and the content team to avoid conflicts of interest.
‘Profits over safety’
It is clear from the documents that within the sprawling company there are competing teams with different priorities, leading to complaints of inconsistent policy enforcement. And whatever happens, Zuckerberg - as chief executive, chairman and controlling shareholder with about 58 per cent of the voting shares - has the final word.
In one episode unveiled in the documents, he is alleged to have blocked a change that would have helped stem hateful content in users’ news feeds ahead of the 2020 US presidential election over concerns it might hurt active engagement. Facebook denies this, pointing to changes that came into force in September 2020.
“It’s worth trying to think creatively about ways to have due process,” one former senior staffer told the Financial Times. “He can’t get fired and that doesn’t seem fair.”
Facebook defenders note that it is a public, profitmaking company with an obligation to shareholders. Critics hope, given what they see as Zuckerberg’s unaccountable power in a regulatory vacuum, that the latest deluge of bad press will prompt Congress to step in - for example, to tackle harmful speech or to mandate more transparency about how companies’ algorithms work.
Indeed, in a so-far floundering experiment in self-regulation, Facebook’s own Oversight Board, launched 12 months ago to help it make tricky moderation decisions, has complained that the company has not been forthcoming in its response to requests for information.
“There are enough examples of [Facebook choosing profits over safety] that make it worthy of congressional and parliamentary oversight,” says Boland.
The question is what this will look like and whether any areas of legislation can earn bipartisan support. Issues around free speech in the US tend to split left and right, although lawmakers are co-operating on areas such as improving online protection for children.
“I hope this deepens the conversations about what we think companies should be prioritising and where we should be regulated versus driven by them,” says Katie Harbath, a former Facebook public policy director. “The discussion around Facebook really does centre around the trade-offs the company makes and where they draw the line.”
The US Securities and Exchange Commission has yet to say whether it will act on Haugen’s concerns and pursue any investigations. “The closer the relationship between the whistleblower’s allegations and the company’s financial performance, the higher the probability that the agency will take action,” says Joseph Grundfest, a law professor at Stanford and a former SEC commissioner.
Investors, for now, are unperturbed; Facebook’s share price moved up slightly on Monday morning despite the deluge of news. The company’s pockets remain as deep as ever. “The smaller competitors are the ones who are going to get hurt,” says Youssef Squali, managing director at Truist Securities and a leading Facebook analyst. “Facebook will be able to invest in technology, and lobbying, to get to where it needs to get to.” –Copyright The Financial Times Limited 2021