Everything the whistleblower has accused Facebook of.
Data scientist Frances Haugen, 37, a former Facebook product manager on the company’s civic misinformation team, has come forward as the source of the leak of thousands of pages of internal documents to the Wall Street Journal that could damage the company.
Ms Haugen, who left the tech giant in March this year, appeared on CBS’s 60 Minutes on Sunday night to discuss her efforts to expose what she saw as Facebook’s misconduct, accusing the company of prioritising provocative content over the public good and of “paying for its profits with our safety”.
Her earlier claims, published anonymously in the WSJ in recent weeks, showed that celebrities, politicians and other high-profile users were treated differently from the general public by the site and exempted from certain content rules under a system known as “XCheck”; that the company’s response to complaints about human traffickers and drug dealers using its pages was often weak; and that Facebook actively promoted flattering stories about itself as part of “Project Amplify”.
She also revealed that the company is facing a lawsuit from a group of its shareholders, who allege that its $5 billion (£3.65 billion) payment to the US Federal Trade Commission to settle the Cambridge Analytica data scandal was so large because it was designed to shield its founder, Mark Zuckerberg, from personal liability.
Perhaps Ms Haugen’s most shocking accusation is that Facebook’s own research found its Instagram platform to be harmful to the mental health and self-esteem of teenage girls, but that the company failed to act on the findings.
According to the documents she obtained, one internal study found that 32% of teenage girls said that when they felt bad about their bodies, Instagram made them feel worse.
Speaking to 60 Minutes host Scott Pelley, the analyst, who has also worked at other Silicon Valley giants such as Google, Pinterest and Yelp, said that while she did not believe Mr Zuckerberg ever set out to create a harmful platform: “The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”
“What I saw at Facebook, over and over again, was that there were conflicts of interest between what was good for the public and what was good for Facebook,” she said. “And Facebook, over and over again, chose to optimise for its own interests, like making more money.”
Ms Haugen said during the programme that she believes a 2018 change to the algorithm that dictates what appears in users’ News Feeds saw the company prioritise divisive, engagement-driving content.
“Its own research is showing that content that is hateful, that is divisive, that is polarising, it’s easier to inspire people to anger than it is to other emotions,” she told CBS.
She also claimed that Facebook introduced safety systems during the 2020 US presidential election to curb the spread of misinformation but, “as soon as the election was over, they turned them off or changed the settings back to what they were before, to prioritise growth over safety”, allowing some of those who staged the 6 January Capitol riot in Washington, DC, to plan it via Facebook, among other platforms.
Before the episode aired, the company’s vice-president of global affairs (and former Liberal Democrat leader) Nick Clegg issued an internal memo in which he wrote: “The available evidence simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarisation.”
Mr Clegg then appeared on CNN on Sunday to reiterate his view that it was “ludicrous” to suggest that social media was to blame for the Capitol attack by supporters of Donald Trump seeking to overturn the 45th president’s election defeat.
Lena Pietsch, Facebook’s director of policy communications, also responded to the CBS broadcast: “Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.
“If any research had identified an exact solution to these complex challenges, the tech industry, governments and society would have solved them a long time ago.”
Yael Eisenstat, another former employee who became a prominent critic of the social network, told Vox that Ms Haugen’s revelations marked a “big moment” for the company.
“For years we have known about many of these issues – through journalists and researchers – but Facebook has always managed to claim it was handling the problems, so we shouldn’t believe its critics. This time, the documents speak for themselves.”
Ms Haugen is due to testify on Tuesday before a Senate subcommittee in a hearing titled “Protecting Kids Online”, on the company’s research into Instagram’s impact on the mental health of young users.