The person who leaked the internal Facebook research behind a series of Wall Street Journal stories about the social network’s knowledge of the harm its platforms cause, and its efforts to publicly downplay that harm, revealed her identity on 60 Minutes on Sunday. She is Frances Haugen, a former product manager who worked on Facebook’s algorithmic products.
Haugen, who worked at Facebook for about two years, told 60 Minutes she leaked the documents to the Journal after seeing a conflict of interest at Facebook between what’s good for the company and what’s good for the public.
“Facebook, over and over again, chose to optimize for its own interests, like making more money,” she told 60 Minutes’ Scott Pelley in an interview.
“I knew what my future looked like if I continued to stay inside of Facebook, which is person after person after person has tackled this inside of Facebook and ground themselves to the ground,” the 37-year-old data scientist said.
The Wall Street Journal’s series of stories on the documents found, among other things, that the company ignored research about Instagram’s negative effects on teen girls and made an algorithm change intended to improve interaction on the platform that actually made users “angrier.”
Haugen explained how the algorithm has “thousands of options” for what it could show you in your feed based on what you’ve engaged with in the past.
“One of the consequences of how Facebook is picking out that content today is it is — optimizing for content that gets engagement, or reaction,” she said. “But its own research is showing that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than it is to other emotions.”
During last year’s elections, Haugen said she was assigned to Facebook’s Civic Integrity project, which worked to identify and reduce risks to elections, including misinformation. She said the company knew the dangers associated with the 2020 election, but that its response was temporary. Employees were told the unit was being dissolved because the election had ended without riots, she said.
“Fast forward a couple months, we got the insurrection,” she said. “And when they got rid of Civic Integrity, it was the moment where I was like, ‘I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.’
“And as soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety,” she said. “And that really feels like a betrayal of democracy to me.”
Facebook didn’t immediately respond to a request for comment on Haugen’s appearance on 60 Minutes. However, The New York Times reported over the weekend that Nick Clegg, Facebook’s head of policy and global affairs, sent a 1,500-word memo to employees ahead of the news magazine’s segment.
“Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out,” he wrote, according to The Times. “But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization.”
Haugen’s appearance on 60 Minutes comes after a Senate subcommittee held a hearing about the harmful impact of Facebook and Instagram on the mental health of young people, including teenagers. US lawmakers are seeking more answers from the social media giant after The Wall Street Journal published a series of stories about the company’s knowledge of its platforms’ problems even as it downplayed them publicly. One in three teen girls reported that Instagram made their body image issues worse, according to a 2019 presentation cited by the Journal.
During the hearing, Facebook’s Global Head of Safety Antigone Davis pushed back on the news outlet’s characterization of its internal research. “I want to be clear that this research is not a bombshell,” Davis said. “It’s not causal research.”
Instagram, which is owned by Facebook, is pausing development of a kids’ version of the app. The social network also released some of its internal research and said it’s looking at ways to release more data.
Davis’ remarks didn’t appear to appease lawmakers, who are planning to hold more hearings on the issue. Haugen is scheduled to testify before the Senate subcommittee on consumer protection on Tuesday. During the 60 Minutes interview, she suggested the federal government should impose regulations.
“Facebook has demonstrated they cannot act independently,” she said. “Facebook, over and over again, has shown it chooses profit over safety. It is subsidizing, it is paying for its profits with our safety.”
CNET’s Queenie Wong and Andrew Morse contributed to this report.