Police Chiefs Call for Solutions to Access Encrypted Data in Serious Crime Cases
April 24, 2024 / Crime / The Hacker News
European Police Chiefs said that the complementary partnership between law enforcement agencies and the technology industry is at risk due to end-to-end encryption (E2EE).
They called on the industry and governments to take urgent action to ensure public safety across social media platforms.
"Privacy measures currently being rolled out, such as end-to-end encryption, will stop tech companies from seeing any offending that occurs on their platforms," Europol said.
"It will also stop law enforcement's ability to obtain and use this evidence in investigations to prevent and prosecute the most serious crimes such as child sexual abuse, human trafficking, drug smuggling, homicides, economic crime, and terrorism offenses."
The idea that E2EE protections could stymie law enforcement is often referred to as the "going dark" problem, with officials arguing that it creates new obstacles to gathering evidence of nefarious activity.
The development comes against the backdrop of Meta rolling out E2EE in Messenger by default for personal calls and one-to-one personal messages as of December 2023.
The U.K. National Crime Agency (NCA) has since criticized the company's design choices, arguing that they make it harder to protect children from sexual abuse online and undermine law enforcement's ability to investigate crime and keep the public safe from serious threats.
"Encryption can be hugely beneficial, protecting users from a range of crimes," NCA Director General Graeme Biggar said. "But the blunt and increasingly widespread rollout by major tech companies of end-to-end encryption, without sufficient consideration for public safety, is putting users in danger."
Europol's Executive Director Catherine de Bolle noted that tech companies have a social responsibility to develop a safe environment without hampering law enforcement's ability to collect evidence.
The joint declaration also urges the tech industry to build products with cybersecurity in mind, while at the same time providing a mechanism for identifying and flagging harmful and illegal content.
"We do not accept that there need be a binary choice between cybersecurity or privacy on the one hand and public safety on the other," the agencies said.
"Our view is that technical solutions do exist; they simply require flexibility from industry as well as from governments. We recognise that the solutions will be different for each capability, and also differ between platforms."
Meta, for what it's worth, already relies on a variety of signals gleaned from unencrypted information and user reports to combat child sexual exploitation on WhatsApp.
Earlier this month, the social media giant also said it's piloting a new set of features on Instagram that use client-side scanning to protect young people from sextortion and intimate image abuse.
"Nudity protection uses on-device machine learning to analyze whether an image sent in a DM on Instagram contains nudity," Meta said.
"Because the images are analyzed on the device itself, nudity protection will work in end-to-end encrypted chats, where Meta won't have access to these images – unless someone chooses to report them to us."