Researchers explain why they believe Facebook mishandles political advertising: NPR

Facebook has revamped its handling of political advertising over the years, but researchers who conducted a comprehensive audit of millions of ads say the social media company's efforts have had uneven results.

They say the problems include over-enforcement against political advertising in the United States and under-enforcement in other countries.

And despite Facebook's ban on political advertising around the time of last year's U.S. elections, the platform allowed more than 70,000 political ads to be posted anyway, according to a research team based at NYU Cybersecurity for Democracy and the Belgian university KU Leuven.

Their research was published early Thursday. The researchers will also present their findings at a security conference next August.

After analyzing more than 4.2 million political ads and 29.6 million non-political ads from more than 215,000 advertisers, the researchers found that Facebook's enforcement misses 61% more ads worldwide than it detects, while 55% of the ads it flags in the U.S. are actually non-political.

Researchers criticize Facebook’s use of “elementary” methods

Laura Edelson of New York University, the lead author of the study, says that two things that emerged from the study surprised her.

“One is this very high false positive rate in the United States,” Edelson said.

She added that part of the surprise was due to the "elementary" way Facebook appears to use keyword models to classify ads and content.

“We can do better,” Edelson said. “This isn’t the cutting edge of content moderation or problematic content detection. There are many more sophisticated methods available here that Facebook doesn’t seem to use.”

"Facebook does include humans as part of its ad and content moderation pipeline, but it is clearly automation-first," she said. "There is a precision problem with that approach."

Another surprise was the apparent trouble Facebook had enforcing its political advertising ban in the United States. After the policy was announced, many political advertisers simply stopped advertising. But not everyone complied, Edelson said: "A significant number of them continued to advertise and just stopped declaring their ads as political."

She said it was apparently easy for political advertisers to evade the ban.

"The errors here aren't subtle," Edelson said. "They really just reflect a lack of investment."

Facebook responds to researcher findings

A spokesperson for Facebook's parent company, Meta, told NPR in response to a request for comment on the forthcoming study:

"Most of the political ads they examined were disclosed and labeled as they should be. In fact, their findings suggest that less than 5% of all political ads had potential problems.

"For a complete picture, this report should also note that we provide more transparency into political advertising than television, radio and other digital advertising platforms."

At issue in the US: more topics are being politicized

As for what counts as a political ad, the researchers note that Facebook's own political advertising policy says it applies to "ads about social issues, elections or politics," and that the company configures its systems to apply rules based on that definition.

In recent years, as language around social and health issues has become increasingly politicized, the range of what can be interpreted as a political message has broadened. The researchers link that trend to Facebook's tendency to mislabel non-political U.S. ads as political.

Edelson points to how Facebook handles COVID-19 information.

"A lot of pandemic- and COVID-related content has been politicized," she said. "A lot of vaccine-related content has been politicized. But the way Facebook has handled it hasn't been very nuanced."

Much of the problem, in Edelson's view, is that Facebook relies on automated detection mechanisms that aren't very accurate.

"Ads that showed people wearing masks were flagged as political. Ads that mentioned or discussed vaccines or COVID were flagged as political," she said.
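The researchers describe Facebook's classification as an "elementary" keyword approach. A minimal sketch can show why such a model produces the false positives Edelson describes: the keyword list, function name and example ads below are all hypothetical illustrations, not Facebook's actual system or terms.

```python
# Hypothetical sketch of a naive keyword-based classifier, the kind of
# "elementary" method the researchers say Facebook appears to use.
# The keyword list and ads here are invented for illustration only.
POLITICAL_KEYWORDS = {"election", "vote", "senator", "vaccine", "covid", "mask"}

def is_flagged_political(ad_text: str) -> bool:
    """Flag an ad if any keyword appears among its words (case-insensitive)."""
    words = set(ad_text.lower().split())
    return not POLITICAL_KEYWORDS.isdisjoint(words)

# A health-clinic ad gets flagged purely because it mentions a vaccine drive,
# while a reworded campaign ad avoiding the keywords slips through.
health_ad = "free vaccine drive this weekend at the community center"
reworded_campaign_ad = "support our candidate for a stronger economy"

print(is_flagged_political(health_ad))            # True  (false positive)
print(is_flagged_political(reworded_campaign_ad)) # False (false negative)
```

Because the model only tests for word presence, it cannot distinguish a public-health announcement that mentions vaccines from political messaging about vaccine policy, which is the mislabeling pattern the study documents.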

By mislabeling health messages as political ads, Facebook created a new problem that it then had to solve, the researchers said.

"Facebook wound up creating a carve-out policy for government health agencies," Edelson said, so that it could flag content that was political about COVID without catching messages like "this is where you can get the vaccine."

In that scenario, if a community organization wants to run an ad saying it is hosting a weekend vaccine drive, "Facebook might flag that ad as a political ad," said Victor Le Pochat of KU Leuven, the study's other lead author. From there, he said, the ad could be removed.

"These kinds of false detections by Facebook could then prevent these community organizations from publicizing their vaccine drives," he said.

The researchers also found some improvements

Given what's known about Facebook's handling of political advertising in the 2016 election, NPR asked the researchers whether they have seen improvement since then.

“I think it’s a little better,” Edelson said.

“We see it in our data,” Le Pochat added.

"We find that Facebook is able to catch more ads that are improperly declared," he said, compared with a few years ago.

Le Pochat said the majority of ads are properly declared: "Most advertisers comply with Facebook's policy. We found over 4 million correctly declared political ads. And then we found over 150,000 ads that advertisers didn't declare."

The study supports recently leaked Facebook documents

The research comes weeks after a trove of Facebook documents leaked by whistleblower Frances Haugen showed that the social media giant has struggled to cope with political and social complexities, particularly in countries where people post content in Arabic, Hindi and other widely spoken languages.

In some cases, the company mistakenly banned everyday language, according to the documents. In others, incendiary content reportedly slipped through Facebook's screening systems and spread.

"Our findings confirm that Facebook pays less attention to ensuring that communities outside the United States are also protected from the harms of misleading political advertising," Le Pochat said.

"We see this very high false positive rate in the United States because it looks like Facebook is using a keyword model to detect political content in the United States, and we don't see that pattern in other countries," Edelson said.

That reflects where Facebook has chosen to invest money and time, she said.

"To deploy a keyword model like that, you need some knowledge of a country's politics in order to build the keyword list," Edelson said. "And it doesn't look like Facebook has invested in understanding the politics of every country where it runs political ads. To do that kind of detection in Malaysia, Macedonia or Argentina, you have to understand the political situation in those countries."
