
UK online safety bill could change the face of the internet

The British government likes to trumpet the benefits of free speech. “Freedom of speech is at the heart of our democracy,” Boris Johnson said last year, promoting a bill intended to stop controversial speakers from being “no-platformed” at universities. That Higher Education (Freedom of Speech) Bill was included in the Queen’s Speech, which set out the government’s legislative agenda earlier this month.

But outside the lecture hall, Mr Johnson’s government stands accused of censoriousness of its own. Also in the Queen’s Speech was the Online Safety Bill (OSB), a bumper piece of legislation that imposes sweeping new obligations on search engines, social-media sites, forums, video sites and more. Ministers say it will “lead the world”. Everyone agrees it is ambitious: it runs to 225 pages, contains 194 separate clauses and could affect up to 25,000 companies. The argument is over what the results will be.

For its advocates, the bill, which dates back to Theresa May’s time as prime minister, aims to make Britain “the safest place in the world to use the internet”. To that end, the OSB imposes a “duty of care” on technology companies, a concept borrowed from workplace health-and-safety law. Tech firms will be legally required to protect British users from racism, death threats, sexual exploitation, dangerous advertising, the promotion of anorexia and more.

Sites that children are likely to visit (which in practice means most sites) may have to check identification to verify that visitors are over 18. The biggest sites will also have to deal with content that is legal but deemed “harmful” by Whitehall. Penalties can reach 10% of a company’s global revenue, and sites can even be blocked altogether.

Civil libertarians are not happy. According to a legal opinion commissioned by the charity Index on Censorship, the bill is likely to violate the free-speech provisions of the European Convention on Human Rights (to which Britain remains a signatory). David Davis, a former cabinet minister, describes it as “a crackdown on free speech online” and “a charter of censorship”.

The bill establishes several tiers of objectionable content. The worst is reserved for material that breaks existing laws, such as abetting suicide, threatening murder or assisting illegal immigration. The largest companies (the definition of “large” remains unclear) must proactively keep such things off their sites. The sheer size of the big platforms (500 hours of video are uploaded to YouTube every minute) makes it impractical for humans to check every post. Companies will have to rely on automated enforcement.

But algorithms are a blunt tool, says Mark Johnson of Big Brother Watch, a civil-liberties campaign group. They often struggle with nuance and context. Can an algorithm, he asks, reliably tell the difference between someone promoting suicide and someone posting on Mumsnet [a big web forum] about feeling suicidal with post-natal depression? With billions of dollars at stake, the risk is that companies err on the side of caution and enthusiastically block harmless posts, he says.

The second tier covers posts that are not illegal in themselves but are considered “harmful”. Exactly what falls into that category is unclear (ministers will decide later). But the government has talked about everything from vaccine scepticism and bullying to the glamorisation of anorexia and racist insults hurled at British football teams. Sites will be legally required to minimise the chances that children see such posts. For everyone else, they must make an active decision about whether to block or downplay such content, or to keep promoting and recommending it like anything else.

Campaigners argue that these “legal but harmful” provisions amount to backdoor censorship, creating a whole new category of speech in law. (Ruth Smeeth, a former MP who runs Index on Censorship, calls them a “clusterfuck”.) The government has tried to reassure sceptics by pointing out that tech companies will be free to leave such posts up if they choose. Ms Smeeth is unconvinced: can anyone imagine the political pressure on a platform that publicly declares it is happy to host that kind of content? And those that decide to suppress it will, once again, be relying on idiot-savant algorithms.

There are other objections. A raft of new communications offences relies on subjective definitions of psychological harm. Jonathan Hall, the government’s independent reviewer of terrorism legislation, is unhappy about similar ambiguity in the bill’s definition of terrorism. Some police officers worry that deleting posts could destroy digital evidence of crimes. Yet parliamentary support remains strong. Last year Sir Keir Starmer, Labour’s leader, berated the government for not acting faster. The bill seems likely to pass.

Online, no entry

It could have an international impact. Tech companies may choose to apply the new rules only in Britain. But if maintaining a separate set of rules for one subset of users proves too awkward, one option is to apply at least some of the OSB’s requirements to their services everywhere. There is precedent: last year Britain introduced the Age Appropriate Design Code, which imposes stricter privacy rules for children online. In response, Google, TikTok and others made changes globally. Another possibility is that some foreign firms, especially smaller ones, may stop serving Britain altogether. In many ways, the price of safety may be silence. ■
