Facebook chose to maximise engagement at users’ expense, whistleblower says

US senators call for probes as ex-employee Frances Haugen testifies before Congress

Facebook repeatedly chose to maximise online engagement instead of minimising harm to users, as it struggled to retain staff and younger users, a whistleblower told Congress on Tuesday.

During testimony before the Senate commerce committee, Frances Haugen, a former Facebook employee, described how the company prioritised social interaction on its platforms, even when those interactions exacerbated addiction, bullying and eating disorders.

“Facebook knows that content that elicits an extreme reaction from you is more likely to get a click, a comment or reshare,” she told the committee. “They prioritise content in your feed so you will give little hits of dopamine to your friends, and they will create more content.”

Former Facebook employee Frances Haugen at a US Senate Committee on Commerce, Science, and Transportation hearing entitled ‘Protecting Kids Online: Testimony from a Facebook Whistleblower’ on Capitol Hill in Washington, DC. Photograph: Drew Angerer/EPA/Pool

Her testimony deepened the sense of crisis gathering around Facebook, just a day after a widespread outage made its popular apps inaccessible for hours, and strengthened calls by members of Congress in both parties for stricter regulation of its services.

"Congress has to intervene," said Richard Blumenthal, the Democratic chair of the consumer protection subcommittee, who said the Securities and Exchange Commission and Federal Trade Commission should investigate Ms Haugen's claims.

‘Defensive and demeaning’

Facebook said it disagreed with Ms Haugen’s characterisation of the issues discussed. Ms Haugen had “worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives,” Lena Pietsch, the company’s director of policy communications, said in a statement after the hearing.

In comments to reporters, Mr Blumenthal called the statement “defensive and demeaning”.

Ms Haugen worked on Facebook’s civic integrity unit until it was disbanded last year. She left the company earlier this year, taking with her tens of thousands of pages of internal documents that she says show how the company hid the effect of its platforms on users, especially younger ones, such as teenage girls whose preoccupation with body image may be deepened by viewing photos on Instagram.

2020 election

During the hearing, she described how Facebook changed its algorithms in the run-up to the 2020 US election to make it less likely that harmful or false content would go viral, but then undid those changes immediately afterwards.

She suggested the decision might have helped lead to the events of January 6th, when rioters who falsely believed Joe Biden had stolen the election from Donald Trump stormed the Capitol building in Washington.

“The fact that they had to break the glass on January 6th and turn [the controls] back on, I think that’s deeply problematic,” she said.

She said Mark Zuckerberg, the company’s chief executive, made a similar decision in April 2020, when he rejected a proposal from Facebook employees to make it less likely for content to go viral in countries at risk of social violence.

“Mark was presented with these options, and chose to not remove downstream MSI [a metric which prioritises content if it is likely to trigger a response from users], in April of 2020 – even just in isolated and at-risk countries.”

In her testimony, Ms Haugen painted a picture of a fragile company desperate to maintain rapid growth even amid a series of crises, from the Cambridge Analytica scandal to allegations it allowed political misinformation to flourish online. She told senators that Facebook struggled to attract staff, owing in part to its plummeting reputation, which in turn left it unable to tackle problems such as the spread of hate speech.

“A pattern of behaviour that I saw on Facebook was that often problems were so understaffed [that] there was an implicit discouragement from having better detection systems,” she said.

In a separate complaint to the SEC, she alleged the company had concealed a years-long decline in younger users in the US. Ms Haugen pointed to internal company projections that a drop in engagement from American teens could drive an overall decline in its US daily users by as much as 45 per cent between 2021 and 2023.

As a result, Ms Haugen told the committee, Facebook has tried to recruit children on to its platforms.

“Facebook’s internal documents talk about the importance of getting younger users – for example, tweens on Instagram – because they know that children bring their parents online,” she said.

While Facebook recently paused plans to launch a version of Instagram for children under 13, Ms Haugen said she would not be surprised if the company continued to develop it.

She said Facebook had even hidden internal research showing how damaging its services can be for those children. “When Facebook is directly asked questions as important as, ‘How do you impact the health and safety of our children?’, they choose to mislead and misdirect.”

Ms Haugen’s revelations reinforced calls from senators in both parties for tougher regulation of large technology companies, including enacting federal privacy legislation and limiting legal protections for platforms that publish user content.

She also suggested reforming Section 230 of the Communications Decency Act so that social media companies could be sued over the decisions their algorithms make in promoting certain content.

“[Platforms] have 100 per cent control over their algorithms,” she said. “Facebook should not get a free pass on choices it makes to prioritise growth and virality and reactiveness over public safety.” – Copyright The Financial Times Limited 2021