Frances Haugen’s interview reveals damning truths about Facebook

Former Facebook employee Frances Haugen, who leaked materials to the Wall Street Journal last month, revealed damning truths about her former workplace in an interview with US news program 60 Minutes on Sunday night.

The 37-year-old tech professional told CBS journalist Scott Pelley that Facebook is “tearing our societies apart”, and that it has continued to put profit over safety.

“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook,” Haugen said. “And Facebook, over and over again, chose to optimise for its own interests, like making more money.”

In the interview, Haugen revealed that the tech firm changed its algorithm after the 2020 presidential election, causing a spate of misinformation to spread across the platform.

“And as soon as the election was over, they turned them [the safety systems] back off or they changed the settings back to what they were before, to prioritise growth over safety,” she said. “And that really feels like a betrayal of democracy to me.”

Haugen joined the company in 2019 as a product manager to help it combat misinformation. Before that, she was a co-founder and CTO of the dating app Hinge.

With a career spanning 15 years, Haugen has worked for tech giants including Google and Pinterest, and said that Facebook had the worst policies regarding the restriction of harmful content.

“I’ve seen a bunch of social networks and it was substantially worse at Facebook than anything I’d seen before.” 

When asked about Facebook’s founder and chief executive, Mark Zuckerberg, Haugen said she has “a lot of empathy” for the multibillionaire.

“Mark has never set out to make a hateful platform,” she explained. “But he has allowed choices to be made where the side-effects of those choices are that hateful, polarising content gets more distribution and more reach.”

Last month, the leaked Facebook Files, which Haugen was responsible for, revealed that Facebook had withheld its own research on the harmful effects of Instagram for two years, including statistics on the damage the platform does to the mental health of teenage girls.

Journalists at the Wall Street Journal reported seeing a slide from a 2019 internal Facebook presentation, showing research that indicated the platform, which has roughly one billion monthly active users globally, is harmful for a large proportion of users, especially teenage girls.

“We make body image issues worse for 1 in 3 teen girls,” one of the slides read.

“What’s super tragic is Facebook’s own research says, as these young women begin to consume this eating disorder content, they get more and more depressed,” Haugen remarked. 

“And it actually makes them use the app more. And so, they end up in this feedback cycle where they hate their bodies more and more. Facebook’s own research says it is not just that Instagram is dangerous for teenagers, that it harms teenagers, it’s that it is distinctly worse than other forms of social media.”

“Imagine you know what’s going on inside of Facebook and you know no one on the outside knows. I knew what my future looked like if I continued to stay inside of Facebook, which is person after person after person has tackled this inside of Facebook and ground themselves to the ground.”

When asked why she decided to leak the materials now, Haugen said she had copied tens of thousands of documents from Facebook’s internal system showing that the company was not taking significant steps to combat online hate and misinformation, despite public comments to the contrary.

“At some point in 2021, I realised, ‘OK, I’m gonna have to do this in a systemic way, and I have to get out enough that no one can question that this is real,’” Haugen said.

“When we live in an information environment that is full of angry, hateful, polarising content it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other,” she added. 

In 2018, Facebook changed the algorithm on its news feed (its primary feature) to prioritise content that propelled user engagement.

Haugen said this shift highlighted divisive issues and content.

“One of the consequences of how Facebook is picking out that content today is it is optimising for content that gets engagement, or reaction,” she said.

“But its own research is showing that content that is hateful, that is divisive, that is polarising – it’s easier to inspire people to anger than it is to other emotions.”

“Facebook has realised that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money.”

“You are forcing us to take positions that we don’t like, that we know are bad for society. We know if we don’t take those positions, we won’t win in the marketplace of social media.”

Facebook responded to a request for comment from 60 Minutes, saying, “Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place.”

“We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true. If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.”

Haugen is due to testify in Washington DC on Tuesday before a Senate subcommittee in a hearing titled “Protecting Kids Online”, about Facebook’s research into Instagram’s effect on the mental health of young users.

Image: CBS 60 Minutes
