Facebook whistleblower Frances Haugen speaks publicly ahead of congressional testimony


A Facebook whistleblower revealed her identity in a Sunday evening interview while slamming the social media giant for prioritizing divisive content over safety in order to reap greater profits.

Frances Haugen, 37, spoke out publicly for the first time since quitting Facebook in May, when the company dismantled her unit, which had tried to tackle misinformation on the popular platform.

Before leaving the company, Haugen copied thousands of pages of internal documents, some of which had already been reported on, to back up her claims.

“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook,” Haugen said on CBS’s “60 Minutes.”

“Facebook, over and over again, chose to optimize for its own interests, like making more money,” said Haugen.

Haugen, a data scientist from Iowa, linked what she characterized as Facebook’s inaction in squashing misinformation to the Jan. 6 US Capitol riot.

After the polarizing 2020 election, Haugen said, the company dissolved the Civic Integrity unit and disabled some safety features it had put in place to reduce misinformation.

“They told us, ‘We’re dissolving Civic Integrity.’ Like, they basically said, ‘Oh good, we made it through the election. There wasn’t riots. We can get rid of Civic Integrity now,’” said Haugen.

“Fast forward a couple months, we got the insurrection.”

Haugen said Facebook may be complicit in the Capitol riot due to its disbanding of the Civic Integrity unit after the 2020 election.
AP Photo/Julio Cortez, File

“As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety,” Haugen said of the safety features.

“And that really feels like a betrayal of democracy to me.”

Facebook told CBS that the work undertaken by the dissolved division was distributed internally to other units.

Haugen told host Scott Pelley that Facebook allows divisive content to flourish because of changes it made in 2018 to its algorithms, which prioritize content for individual accounts based on their past engagement.

“One of the consequences of how Facebook is picking out that content today is it is optimizing for content that gets engagement, or reaction,” said Haugen.

Examples of disinformation posted on Facebook.

“But its own research is showing that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than it is to other emotions,” said Haugen.

“Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money,” Haugen charged.

Haugen is set to testify before Congress this week. She has already filed reams of anonymous complaints against the company with federal authorities.

In the interview that aired Sunday, Haugen said she obtained a 2019 internal report detailing complaints from European political parties over the content dominating the platform as a result of its algorithm.

Haugen is scheduled to testify before Congress about Facebook this week.
Robert Fortunato/CBS News/60 Minutes via AP

Haugen said the parties “feel strongly that the change to the algorithm has forced them to skew negative in their communications on Facebook … leading them into more extreme policy positions,” according to Pelley.

In a statement to “60 Minutes,” Facebook denied the allegations that the company encourages harmful content.

“We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true,” the company said.

“If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.”