
Bluesky sees a 17x increase in moderation reports in 2024 after rapid growth


Bluesky on Friday published a moderation report covering the past year, noting the huge growth the social network experienced in 2024 and how that affected the workload of its Trust & Safety team. It also noted that the largest number of reports came from users reporting accounts or posts for harassment, trolling, or intolerance: problems that have plagued Bluesky during its growth, and that have even led to large-scale protests over some individual moderation decisions. The report does not offer details or explain why the company did or did not take action against individual users, including those on the most-blocked list.

Bluesky added more than 23 million users in 2024, as it became a new destination for ex-Twitter/X users for a variety of reasons. During the year, the social network benefited from several changes at X, including its decisions to change how blocking works and to train AI on user data. Other users left X after the results of the US presidential election, based on how the politics of X's owner, Elon Musk, began to dominate the platform. The app also saw a surge in users while X was temporarily banned in Brazil in September.

To meet the demands of this growth, Bluesky grew its moderation team to roughly 100 moderators, the company said, and is continuing to hire. The company also began offering psychological counseling to team members to help them cope with the difficult work of constant exposure to graphic content. (This is an area we hope AI will one day address, as humans are not built to handle this type of work.)

In total, users submitted 6.48 million reports to Bluesky's moderation service in 2024, a 17x increase from 2023, when there were only 358,000 reports. Starting this year, Bluesky will begin accepting moderation reports directly from its app. Similar to X, this will allow users to more easily track actions and updates. Later, it will support in-app appeals as well.
When Brazilian users flooded into Bluesky in August, the company saw as many as 50,000 reports a day at the peak. This led to a backlog of moderation reports and required Bluesky to hire more Portuguese-speaking staff, including through a contract vendor. In addition, Bluesky began automating more reporting categories beyond just spam to help it handle the influx, though this sometimes led to false positives. Still, automation helped cut processing times to mere "seconds" for "high certainty" accounts. Before automation, most reports were handled within 40 minutes. Human moderators now remain in the loop to deal with false positives and appeals, even if they aren't always handling the initial decisions.

Bluesky says that 4.57% of its active users (1.19 million) made at least one moderation report in 2024, down from 5.6% in 2023. Most of the reports, 3.5 million, were of individual posts. Account profiles were reported 47,000 times, often for a profile picture or banner photo. Lists were reported 45,000 times; DMs were reported 17,700 times; and Feeds and Starter Packs received 5,300 and 1,900 reports, respectively.

Most of the reports concerned anti-social behavior, such as trolling and harassment, a signal from Bluesky users that they want to see a less toxic social network than X. The other reports fell into the following categories, Bluesky says:

- Misleading content (impersonation, misinformation, or false claims about identity or affiliations): 1.20 million
- Spam (excessive mentions, replies, or repetitive content): 1.40 million
- Unwanted sexual content (nudity or mature content not labeled correctly): 630,000
- Illegal or urgent issues (clear violations of the law or of Bluesky's terms of service): 933,000
- Other (issues that don't fit into the above categories): 726,000

The company also offered an update on its labeling service, which includes the labels applied to posts and accounts.
Human labelers added 55,422 "sexual figure" labels, followed by 22,412 "rude" labels, 13,201 "spam" labels, 11,341 "intolerant" labels, and 3,046 "threat" labels. In 2024, 93,076 users submitted a total of 205,000 appeals of Bluesky's moderation decisions. There were also 66,308 account takedowns by moderators and 35,842 automated account takedowns.

Bluesky additionally fielded 238 requests from law enforcement, governments, and law firms. The company responded to 182 of these and complied with 146. Most of the requests were law enforcement requests from Germany, the US, Brazil, and Japan, the company said.

Bluesky's full report also digs into other types of issues, including trademark and copyright claims and child safety/CSAM reports. The company noted that it submitted 1,154 confirmed CSAM reports to the National Center for Missing & Exploited Children (NCMEC).
