Frances Haugen, a Facebook whistleblower, speaks at a hearing of the Senate Commerce, Science and Transportation subcommittee in Washington, D.C., on Tuesday, Oct. 5, 2021.
Stephanie Reynolds | Bloomberg | Getty Images
Facebook whistleblower Frances Haugen said Monday that the company refuses to take responsibility for its service or to encourage employees to speak out about problematic behavior, and that this has created the toxic situation that exists today.
“Facebook does not want to admit that it is accountable to anyone,” Haugen said Monday at a hearing before the British Parliament, which is considering a new law aimed at tackling harmful content online.
Monday marked Haugen’s second public appearance since she revealed herself as the source behind the trove of internal documents that sparked The Wall Street Journal’s “Facebook Files” series. Haugen testified before the U.S. Congress earlier this month, and her documents have since been shared with numerous media outlets.
Facebook’s leadership is focused on growth, creating a culture that celebrates the positive aspects of the company’s services at the expense of addressing the problems they cause, Haugen said Monday.
“Facebook is overwhelmingly full of conscientious, kind and empathetic people,” she said. “Good people who are embedded in systems with bad incentives are led to bad actions. There is a real pattern of people who are willing to look the other way being promoted over people who raise alarms.”
According to Haugen, Facebook has not established a way for employees to flag issues so that management can address them or researchers can examine them.
“Facebook has repeatedly shown that not only does it not want to publish that data, but when it does publish it, the data is often misleading,” she said.
This attitude is rooted in Facebook’s start-up culture, and it will not change until regulation forces the company to change its incentives, Haugen said.
“When they see a conflict of interest between profits and people, they keep choosing profits,” Haugen said.
A Facebook spokesperson said in an emailed statement that the company agrees on the need for regulation “so companies like ours aren’t making these decisions on our own.” The spokesperson also repeated Facebook’s rebuttal to recent news coverage, saying the company has “spent $13 billion and hired 40,000 people to do one job: keep people safe on our apps.”
Here are the highlights from Monday’s hearing:
Facebook Chairman and CEO Mark Zuckerberg.
Erin Scott | Reuters
Is Facebook Evil?
Member of Parliament John Nicholson asked Haugen whether Facebook is simply evil.
“Your evidence has shown us that Facebook is failing to prevent harm to children, failing to prevent the spread of disinformation and failing to prevent the spread of hate speech,” Nicholson said. “It has the power to deal with these issues; it is just choosing not to. It makes me wonder whether Facebook is just fundamentally evil. Is Facebook evil?”
Haugen said the word she would use is “negligence.”
“I believe there is a pattern of inadequacy, that Facebook is unwilling to acknowledge its own power,” she said. “They believe in flatness, and they won’t accept the consequences of their actions. So I think it is negligence and ignorance, but I can’t see into their hearts.”
Adam Mosseri, head of Instagram at Facebook.
Beck Diefenbach | Reuters
Worries over Instagram for kids
In its series, the Journal highlighted that Facebook was aware its Instagram service is detrimental to the mental health of teenagers.
Following the public outcry over that report, Facebook last month paused development of a version of Instagram designed for children under the age of 13.
The topic was revisited at Monday’s hearing.
Inside Facebook, Haugen said, dependence on the company’s products is referred to as “problematic use.” According to Haugen, Facebook found that problematic use is much worse among young people than among older users.
To meet the criteria for problematic use, a user must be self-aware and honest enough to admit they lack control over their usage. According to Haugen, between 5.8% and 8% of teens who have used Facebook’s products for a year report problematic use by the time they turn 14.
“That’s a huge problem,” she said. “Since few 14-year-olds are that self-aware and honest, the real numbers are probably 15% to 20%. I am deeply concerned about Facebook’s role in hurting the most vulnerable among us.”
According to Haugen, Facebook’s own research shows that Instagram is not only dangerous for teenagers, but more harmful than other forms of social media.
“When kids describe their usage of Instagram, Facebook’s own research describes it as an addict’s narrative. The kids say, ‘This makes me unhappy. I feel like I don’t have the ability to control my usage of it. And I feel like if I left, I’d be ostracized,’” Haugen said. “I am deeply worried that it may not be possible to make Instagram safe for a 14-year-old, and I sincerely doubt it is possible to make it safe for a 10-year-old.”
“A novel that is scary to read”
At the hearing, Haugen mentioned one of the Journal’s articles, which reported that armed groups used Facebook to incite violence in Ethiopia. According to the report, the company does not have enough employees who speak the relevant languages to monitor how its services are being used there.
Haugen said this situation risks repeating itself in other vulnerable countries around the world, which is one of the main reasons she came forward.
“I believe situations like Ethiopia are just the opening chapters of a novel that will be horrific to read,” Haugen said.
Regulation may be good
Haugen praised the U.K. for considering regulation of social media services and said such rules could actually help Facebook.
“I think regulation could actually be good for Facebook’s long-term success, because it would force Facebook back into a place where it is more pleasant to be,” she said.
A report published Monday, based on Haugen’s documents, showed that the number of teenage users of the Facebook app in the U.S. has declined 13% since 2019 and is projected to fall 45% over the next two years. According to the internal documents, the number of users between the ages of 20 and 30 was expected to decline 4% over the same period.
Haugen said the company could reverse this decline if regulation changed Facebook’s incentives and made its apps more pleasant for users.
“If Facebook becomes safer and more pleasant, it will be a more profitable company 10 years from now, because the toxic version of Facebook is slowly losing users,” she said.