Tech world may face huge fines if it doesn’t scrub CSAM from encrypted chats – The Register

Tech companies could be fined $25 million (£18 million), or ten percent of their global annual revenue, if they don't build suitable mechanisms to scan for child sex abuse material (CSAM) in end-to-end encrypted messages and an amended UK law is passed.

The proposed update to the Online Safety bill [PDF], currently working its way through Parliament, states that British and foreign providers of a "regulated user-to-user service" must report any shared child sexual exploitation and abuse (CSEA) content to the country's National Crime Agency. The amendment to the legislation makes it clear that companies must develop software capable of peering into messages, even end-to-end encrypted chatter, to actively detect and report CSEA material to the authorities or face sanctions.

Truly secure end-to-end encrypted messages can only be read by those participating in the conversation, not network eavesdroppers nor the app's makers. However, it is possible for chat software developers to add a filter, potentially on each device, that automatically scans for certain illegal material before it's encrypted and sent or after it's received and decrypted.
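
For illustration, here's a minimal sketch of that on-device filtering flow in Python. The hash function, match threshold, blocklist, and reporting step are all assumptions for the example rather than anything specified in the bill; real deployments use proprietary perceptual hashes such as Microsoft's PhotoDNA or Apple's NeuralHash.

```python
# Toy client-side scan: hash an outgoing image against a blocklist
# of known-bad hashes *before* it is encrypted and sent. Everything
# here is illustrative: the hash, threshold, and blocklist are
# placeholders, not any real system's values.
from PIL import Image  # pip install Pillow


def average_hash(path: str) -> int:
    """Tiny perceptual hash: 8x8 grayscale, one bit per pixel vs the mean."""
    pixels = list(Image.open(path).convert("L").resize((8, 8)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | int(p > mean)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def scan_before_send(path: str, blocklist: set[int], threshold: int = 5) -> bool:
    """Return True if the image matches the blocklist and should be
    reported instead of encrypted and sent."""
    h = average_hash(path)
    return any(hamming(h, known) <= threshold for known in blocklist)
```

Note that everything above runs on the device before encryption, so the ciphertext itself is never weakened; the trust question simply moves to whoever controls the hash blocklist and the threshold.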

How well that computer-vision process would work in practice, and whether the false positive rate causes people's private and lawful chatter to be beamed to the government, remains to be seen. Netizens may also not trust that only CSEA content is being reported.
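
To put the false-positive worry in perspective, here's a back-of-the-envelope calculation; both input figures are assumed for illustration and do not come from the bill or any vendor:

```python
# Base-rate arithmetic with assumed numbers: even a very accurate
# filter flags a lot of lawful traffic at messaging-platform scale.
messages_per_day = 10_000_000_000   # assumed volume for a large platform
false_positive_rate = 1e-6          # assumed: one error per million scans

wrongly_flagged = messages_per_day * false_positive_rate
print(f"{wrongly_flagged:,.0f} lawful messages flagged per day")
# 10,000 lawful messages flagged per day
```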

Alternatively, an app maker could engineer their service and code to intercept and inspect the messages as they whiz between a conversation's participants, but that would undermine the whole end-to-end nature. And stuff that isn't end-to-end encrypted can be monitored as the app or service provider chooses.
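
For contrast, here's a short sketch of the end-to-end property itself, using the PyNaCl library: the relaying server only ever handles ciphertext, which is precisely what any in-transit inspection scheme would have to defeat.

```python
# Sketch of the end-to-end property using PyNaCl (pip install pynacl).
# Alice and Bob each hold a private key; the relay sees only ciphertext.
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"hello Bob")

# A server relaying `ciphertext` cannot decrypt it: it holds neither
# private key. In-transit scanning would require the provider to hold
# key material, which is exactly what breaks "end-to-end".
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"hello Bob"
```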

Whichever way it's implemented, the British Conservative government, or what's left of it after a ministerial revolt against Prime Minister Boris Johnson, wants communications, encrypted or not, to be screened for CSEA material, and has positioned its Online Safety bill to that effect.

"Things like end-to-end encryption significantly reduce the ability for platforms to detect child sexual abuse," the UK's Home Secretary Priti Patel well, Home Secretary at time of writing on Wednesday argued earlier in the day. "The Online Safety Bill sets a clear legal duty to prevent, identify, and remove child sexual abuse content, irrespective of the technologies they use. Nobody can sensibly deny that this is a moral imperative."

If the legislation is passed by Parliament, Ofcom, the UK's communications watchdog, will have the power to force tech companies to pay penalties if this inspection system isn't implemented. "The onus is on tech companies to develop or source technology to mitigate the risks, regardless of their design choices. If they fail to do so, Ofcom will be able to impose fines of up to £18 million or [ten percent] of the company's global annual turnover, depending on which is higher," Patel warned.

"We do not want to censor anyone or restrict free speech, but we must do more to combat these foul, hugely destructive crimes," she added.

Building in automatic detection of CSEA content is controversial. Engineers, legal experts, and activists have highlighted the risks of developing such capabilities. It may torpedo users' privacy, and potentially gives government officials a foot in the door to monitoring people's conversations. For instance, these filters, once implemented, could be expanded beyond child abuse.

Patel, however, believes changes to encryption systems to support this scanning can still preserve users' privacy while combating CSEA: "The UK government wholeheartedly supports the responsible use of encryption technologies. We, and other child safety and tech experts, believe that it is possible to implement end-to-end encryption in a way that preserves users' right to privacy, while ensuring children remain safe online."

"If end-to-end encryption is implemented without the relevant safety mitigations in place, this will become much harder. It will significantly reduce tech companies' and law enforcement's ability to detect child sexual abuse happening online. This is obviously unacceptable," she said.

Last year, Apple quietly paused plans to scan for CSAM on iPhones after its detection scheme was heavily criticized by academics and advocacy groups.

"Once this capability is built into Apple products, the company and its competitors will face enormous pressure and potentially legal requirements from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable," declared a letter signed by more than 90 human-rights groups.

The Online Safety bill also attempts to tackle disinformation by getting social networks to filter out state-backed interference and reduce the distribution of stolen information intended to undermine democracy.

The likes of YouTube may also be banned from removing news content until publishers and outlets have had a chance to appeal.
