E.U. Privacy Rule Would Rein In the Hunt for Online Child Sexual Abuse

December 7, 2020 ·  By Gabriel J.X. Dance and Adam Satariano for www.nytimes.com

Regulators argue that while abuse imagery on the internet is abhorrent, unchecked scanning for it by tech companies could violate privacy rights. A showdown looms.

The Europol headquarters in The Hague. Law enforcement agencies and child advocacy groups are worried about a new digital privacy rule scheduled to take effect this month. Credit: Yuriko Nakao/Bloomberg via Getty Images

Privacy concerns in Europe have led to some of the world’s toughest restrictions on companies like Facebook and Google and the ways they monitor people online.

The crackdown has been widely popular, but the regulatory push is now entangled in the global fight against child exploitation, setting off a fierce debate about how far internet companies should be allowed to go when collecting evidence on their platforms of possible crimes against minors.

A rule scheduled to take effect on Dec. 20 would inhibit the monitoring of email, messaging apps and other digital services in the European Union. It would also restrict the use of software that scans for child sexual abuse imagery and so-called grooming by online predators. The practice would be banned without a court order.

European officials have spent the past several weeks trying to negotiate a deal allowing the detection to continue. But some privacy groups and lawmakers argue that while the criminal activity is abhorrent, scanning for it in personal communications risks violating the privacy rights of Europeans.

“Every time things like these unbelievable crimes are happening, or there is a terrorist attack, it’s very easy to say we have to be strong and we have to restrict rights,” said Birgit Sippel, a German member of the European Parliament. “We have to be very careful.”

Birgit Sippel, a German member of the European Parliament, is seeking a compromise that would allow much of the scanning to continue. Credit: Gergely Kelemen Zoltan/MTI via AP

Of the more than 52 million photos, videos and other materials related to online child sexual abuse reported between January and September this year, over 2.3 million came from the European Union, according to the U.S. federal clearinghouse for the imagery.

Under the new rule, part of Europe’s ePrivacy Directive, the rate of reports would drop precipitously, because automated scanning is responsible for nearly all of them. Photo- and video-scanning software uses algorithms to compare users’ content with previously identified abuse imagery. Other software targeted at grooming searches for key words and phrases known to be used by predators.
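The comparison step can be illustrated with a small sketch. This is not the software the companies actually run, which relies on perceptual hashing designed to survive resizing and re-encoding; the toy version below simply checks an upload's exact hash against a hypothetical set of hashes supplied by a clearinghouse.

    # Illustrative sketch only: matching an upload against previously
    # identified imagery. Real scanning systems use perceptual hashes that
    # tolerate cropping and re-compression; this toy version uses exact
    # SHA-256 digests as a stand-in. The hash below is a placeholder.
    import hashlib

    KNOWN_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def file_hash(path: str) -> str:
        """Return the SHA-256 hex digest of a file, read in chunks."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def is_known_image(path: str) -> bool:
        """True if the upload matches a previously identified item."""
        return file_hash(path) in KNOWN_HASHES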

Facebook, the most prolific reporter of child sexual abuse imagery worldwide, said it would stop proactive scanning entirely in the E.U. if the regulation took effect. In an email, Antigone Davis, Facebook’s global head of safety, said the company was “concerned that the new rules as written today would limit our ability to prevent, detect and respond to harm,” but said it was “committed to complying with the updated privacy laws.”

There are also concerns among child protection groups that there could be a domino effect — that Facebook and other companies may cease scanning worldwide, because they do not currently have a legal obligation to do so.

“The issue that we’re talking about is global,” said Julie Cordua, the chief executive of Thorn, a nonprofit that develops and licenses technologies to defend children from online abuse. “What happens in the E.U. will have cascading effects around the world.”

Child protection organizations, international law enforcement agencies and U.S. lawmakers have warned that the rule would be a major setback for global efforts to combat the exploitation of children.

“It would be a total failure if during the pandemic and the lockdowns going on in many countries that we should now forbid the detection of grooming,” said Ylva Johansson, a Swedish member of the European Commission with responsibilities for security strategy and terrorism.

Ms. Johansson and other officials are pushing to find a compromise that would allow the scanning to continue for several years, but under a deadline imposed by previous privacy legislation, they would need to settle on a solution by Dec. 20.

“There is this balance between the privacy of the user and the privacy of the child victim,” she said. “The role for politicians is to find the right balance.”

The New York Times reported in 2019 that online child sexual abuse imagery had grown exponentially in recent years and was rampant across the internet, infesting nearly all major platforms. Perpetrators often leverage multiple services, including cloud storage, messaging apps and social media networks. Online video games are another frequent target, with some abusers grooming hundreds and even thousands of victims while they play.

The new restrictions in Europe can be traced to a policy change in 2018 that brought email, some direct messaging and internet services like Facebook, Gmail and Skype under regulations that would prevent companies from monitoring electronic communications. The rule was scheduled to take effect this month to give companies and governments time to prepare.

With the deadline looming, European officials are facing criticism for waiting until the last minute to resolve an issue with broad implications for privacy and child safety.

Whether a compromise can be reached may depend on the debate over grooming-detection software. Last month, Ms. Sippel proposed a competing rule that would allow scanning for photos and videos but ban the grooming software, although it was unclear if she had enough support in Parliament for that position. A committee is scheduled to consider the proposal on Monday.

Unlike imagery-scanning technology, which is almost 100 percent accurate in identifying illegal photos and videos, grooming software is right about 90 percent of the time, according to Hany Farid, a professor at the University of California, Berkeley, who assisted in the development of both technologies. That means about a tenth of the material flagged by the grooming software is not illicit.

Hany Farid, who has worked to develop technologies that detect online abuse, in his office at the University of California, Berkeley. Credit: Kholood Eid for The New York Times

Dr. Farid compared grooming software used by companies in the United States and Europe to spam-filtering software, which searches for combinations of words and phrases. Technologies that scan for spam and malware would be exempted from the new regulation.

“I don’t hear anybody complaining that my spam filter reads my email,” Dr. Farid said.
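In the spirit of that spam-filter analogy, the idea behind grooming detection can be sketched roughly as follows. The phrase list, scoring and threshold here are invented for illustration, not the real classifiers, which remain more sophisticated and, as noted above, still misfire about a tenth of the time.

    # A rough sketch of keyword-and-phrase flagging, in the spirit of the
    # spam-filter comparison above. Phrases and threshold are hypothetical.
    FLAGGED_PHRASES = [
        "keep this a secret",
        "don't tell your parents",
        "send me a photo",
    ]
    SCORE_THRESHOLD = 2  # arbitrary cut-off for this toy example

    def grooming_score(message: str) -> int:
        """Count how many flagged phrases appear in a single message."""
        text = message.lower()
        return sum(1 for phrase in FLAGGED_PHRASES if phrase in text)

    def should_flag(conversation: list[str]) -> bool:
        """Refer a conversation for human review once its combined score crosses the threshold."""
        return sum(grooming_score(m) for m in conversation) >= SCORE_THRESHOLD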

When grooming is discovered, there is a major upside compared with the detection of illegal photos and videos: A grooming report is more likely to result in the rescue of a child because the illegal activity is happening in real time.

“The grooming of children for sexual purposes is always about a child on the verge of or in the midst of abuse,” said John Shehan, a vice president at the National Center for Missing and Exploited Children, the U.S. federal clearinghouse that works with technology companies and law enforcement agencies around the world.

As of September, according to the clearinghouse, 1,020 reports of grooming had come from the European Union. Cases of grooming were reported in all 27 E.U. countries and contained many examples of “sextortion” — when an adult poses as a minor to solicit photos or videos, then uses the imagery as blackmail to further exploit the child.

Diego Naranjo, head of policy at European Digital Rights in Brussels, an advocacy group, said the subject was fraught because anyone who questioned the tech companies’ practices was cast as “somebody who doesn’t care about the children.”

Even so, he said, the tech companies and child protection groups had not made a strong enough case for scanning to justify the intrusion on privacy.

“They haven’t given evidence needed to show this is proportionate,” he said. “We don’t open every letter in the mail to see if there is something illegal.”

The European Data Protection Supervisor, an agency that advises on privacy issues, said clearer safeguards were needed for consumers. Privacy is considered a legally protected human right in the European Union. In an opinion published last month, the agency said “confidentiality of communications is a cornerstone of the fundamental rights to respect for private and family life.”

The tech industry has largely stayed out of the public debate.

While Facebook said it would stop proactive scanning in Europe, other companies have remained quiet. In October, Microsoft filed a declaration with authorities saying that its detection software was used solely to identify child abuse and not for any commercial purpose. But a company spokesman would not indicate if it would stop scanning under the new regulations.

Google, which reported 3.6 million illegal photos and videos in 2019, did not respond to multiple requests for comment.

Advocacy groups and law enforcement agencies around the world have drawn attention to the rule in recent weeks in hopes of derailing it.

In a statement last month, the European Union Cybercrime Task Force described scanning by companies as an “essential function in the fight against child sexual exploitation and child sexual abuse online,” and warned that stopping it could result in a significant reduction in criminal investigations.

On Thursday, Senator Tom Cotton, Republican of Arkansas, announced that he would introduce a resolution urging the E.U. to let companies continue monitoring. “Closing our eyes to child exploitation doesn’t mean it stops,” Mr. Cotton said.

Twelve members of the U.S. House of Representatives, both Democrats and Republicans, made a similar case in a letter last month to the European Parliament.

Ms. Sippel said she was hopeful a compromise could be found. In the meantime, she predicted, the broader debate about how to balance privacy and security would continue.

“You can always go too far, or you cannot go far enough,” she said. “That is what we are debating.”
