By Barbara Ortutay and David Klepper
Facebook has repeatedly insisted that it has learned its lesson since Russian agents and other opportunists abused the platform in an attempt to manipulate the 2016 U.S. presidential election, and that it is no longer a conduit for misinformation, voter suppression, and election turmoil.
But it has been a long and halting journey for the social network. Despite the billions of dollars the company has spent on the effort, some key outsiders and former Facebook employees say its attempts to revise its rules and strengthen its safeguards remain inadequate to the task. As for why, they point to the company's persistent reluctance to act decisively for much of that time.
“Am I worried about the election?” said Roger McNamee, a Silicon Valley venture capitalist and early Facebook investor who has become a vocal critic. “At the company’s current size, it is a clear and present danger to democracy and national security.”
The company’s rhetoric has certainly been updated. CEO Mark Zuckerberg now says the platform faces challenges that would have been unimaginable in 2016, among them the potential for public unrest surrounding a disputed election, unrest that Facebook could easily exacerbate.
“This election won’t be business as usual,” Zuckerberg wrote in a September Facebook post outlining the company’s efforts to encourage voting and remove false information from the service. “We all have a responsibility to protect democracy.”
Still, for years, Facebook executives seemed to be caught by surprise every time the platform, created to connect the world, was used for malicious purposes. It is as if no one could have predicted that people would use Facebook to live-stream murders and suicides, incite ethnic cleansing, promote fake cancer cures, or try to steal elections. Zuckerberg has offered multiple apologies over the years.
Other platforms such as Twitter and YouTube also struggle with false information and offensive content, but Facebook stands out for its reach and scale and, compared with many other platforms, for its slow response to the challenges identified in 2016.
Shortly after President Donald Trump’s election, Zuckerberg brushed off the notion that “fake news” on Facebook could have influenced the 2016 election, calling it “a pretty crazy idea.” A week later, he walked the comment back.
Since its slow response to those 2016 threats, Facebook has issued a series of mea culpas and promised to do better. “I don’t think they’re good at listening,” said David Kirkpatrick, author of a book on Facebook’s rise. “What has changed is that more and more people are saying they need to do something.”
The company hired outside fact-checkers, added restrictions to political advertising, and removed thousands of accounts, pages, and groups found to be engaging in “coordinated inauthentic behavior” — wait, rather, in what Facebook calls coordinated deception: its term for fake accounts and groups that maliciously target political discourse in countries from Albania to Zimbabwe.
It has also begun adding warning labels to posts containing false information about voting, and it is taking steps to limit the distribution of misleading posts. In recent weeks, the platform banned posts denying the Holocaust and joined Twitter in limiting the spread of an unverified political story about Hunter Biden, son of Democratic presidential candidate Joe Biden, published by the conservative New York Post.
All of this undoubtedly leaves Facebook in a better position than it was in four years ago. But that doesn’t mean it is fully prepared. Despite strict rules prohibiting them, violent militias still use the platform to organize. Recently, that included a foiled plot to kidnap the governor of Michigan.
In the four years since the last election, Facebook’s revenue and user base have skyrocketed. Analysts expect the company to generate $80 billion in revenue and $23.2 billion in profit this year, according to FactSet. It now boasts 2.7 billion users worldwide, up from 1.8 billion in 2016.
Facebook faces government investigations into its size and market power, including an antitrust probe by the U.S. Federal Trade Commission. An earlier FTC investigation ended with a $5 billion fine for Facebook but required no additional changes.
“Their top priority is growth, not reducing harm,” Kirkpatrick said. “And it’s unlikely to change.”
Part of the problem: Zuckerberg holds a firm grip on the company but doesn’t take criticism of himself or his creation seriously, charges Jennifer Grygiel, a communications professor at Syracuse University and a social media expert. But the public, they said, knows what is going on. “They are seeing the misinformation about COVID. They understand how Donald Trump is using it. They can see it.”
Facebook says it takes the challenge of misinformation seriously, especially when it comes to elections.
In a statement, the company pointed to its election and voting policies, saying that “elections have changed since 2016, and so has Facebook.” It now has more people and better technology protecting the platform, it said, and its content policies and enforcement have improved.
Grygiel says such comments are to be expected. “This company runs on PR instead of an ethical business model.”
Kirkpatrick says the board members and executives who pushed back against the CEO (a group that includes the founders of Instagram and WhatsApp) have left the company.
“He’s convinced that Facebook’s overall impact on the world is positive” and that critics don’t give him enough credit, Kirkpatrick said of Zuckerberg. As a result, the Facebook CEO isn’t inclined to take constructive feedback. “He doesn’t have to do anything he doesn’t want to. He has no oversight,” Kirkpatrick said.
The federal government has so far left Facebook to its own devices, and that lack of accountability has only emboldened the company, according to U.S. Rep. Pramila Jayapal, a Washington state Democrat who grilled Zuckerberg at a Capitol Hill hearing in July.
She said warning labels are of limited value if the platform’s underlying algorithms are designed to push polarizing material at users. “I think Facebook has done some things that show it understands its role, but they have been, in my opinion, too few and too late.”
Source: Associated Press