
Operation Renewed Hope identifies more than 300 probable victims of child sexual abuse using controversial AI

August 24, 2023 · By Antonia O'Flaherty for www.abc.net.au


A global investigation has used facial recognition to identify more than 300 probable child victims of sexual abuse online. (ABC News: Lisa Batty)


A global investigation has identified more than 300 probable victims of child sexual abuse with the help of facial recognition technology that is currently banned in Australia. 

The children’s cases, some decades old, had gone cold, with too few distinguishing clues in the online abuse material seized by authorities to identify them.

Operation Renewed Hope, run out of the United States by the Child Exploitation Investigations Unit of Homeland Security Investigations’ Cyber Crimes Center, finished earlier this month.

It involved two Australian investigators. 

But the technology used to identify hundreds of victims is banned here, after it was found to breach the Australian Privacy Act. 

Clearview AI privacy breaches

The operation used the US-based company Clearview AI’s facial recognition technology as well as other victim identification techniques. 

Founded by Australian Hoan Ton-That, the company scrapes images of people online and stores biometric information in its database of more than 30 billion images.

The company then sells access to that database to other companies and law enforcement agencies, a practice the Australian privacy commissioner has banned in Australia.

The privacy commissioner ordered the company to stop collecting images from Australia and to destroy the data and images it had already gathered.

The AFP-led Australian Centre to Counter Child Exploitation (ACCCE) trialled the technology for free between 2019 and 2020, and conducted searches using facial images of individuals located in Australia.

The privacy commissioner determined the AFP had failed to comply with its privacy obligations in using the Clearview AI facial recognition tool.

The Administrative Appeals Tribunal (AAT) recently upheld a 2021 decision by the Office of the Australian Information Commissioner that the company had breached the Privacy Act by collecting images from servers in Australia for use in its database.

Fines have been issued to the company in the UK and in some European countries. 

Law enforcement agencies in the United States are able to use the technology.

In Operation Renewed Hope, victim identification specialists trawled through the child sexual abuse material held by Homeland Security Investigations, with each “series” of abuse material containing anywhere from one image or video to hundreds.

The three-week operation led to the arrests of active abusers in the United States and Canada.

Queensland Police Service’s Taskforce Argos analyst Scott Anderson helped identify the forgotten children in Operation Renewed Hope. 

He said only one Australian victim had been identified. 

“I think out of the 650 series [of abuse material] that we did look through out of the possible 10,000 that are still sitting there, resulted in about 311 referrals for victims,” he said.

“To my knowledge there was only one Australian victim that came up during the review.

“Fortunately, she had already made disclosures to the police about that, but nobody had made the connections between this material and some of the other material that she had disclosed about.

“So, we’re still able to close those cases, which is good.”

Authorities use new technology to track down victims of abuse on the dark web. (ABC News: Lisa Batty)

He said the operation was a “good case study” of how this type of technology could help victim ID specialists to identify children in millions of abuse images already on the internet.

“It would have been nice to see the abuse identified earlier on to prevent the abuse from continuing,” he said.

“If we can start to shave off years or decades off abuse of people, it’s definitely some technology and capabilities that we should probably start discussing on how to responsibly use them.

“It’s not just a one-to-one, we don’t just press a button, and it says, ‘hey this is your person’ and somebody goes out and arrests, there’s a lot of investigation that goes around once these tools provide a potential identification.”

‘Anything that helps save children’

Former head of Taskforce Argos, Jon Rouse, said the potential identification of more than 300 children showed the technology could be incredibly beneficial in child protection investigations.

“I’d be an advocate for anything that helps us save children,” he said. 

“Ultimately, we’re fighting this war with our hands tied behind our backs already, because of Tor [software], for example, anonymised environments.

“The advantage is well and truly in the hands of the child sex offenders.

“I’m asking that the Commonwealth government review the use of this kind of technology by law enforcement to save children.”

Griffith University’s senior lecturer in cyber security, David Tuffley, said scraping videos and images from the web was a breach of the Privacy Act.

“Obtaining it in the first place is the problem with it from the Australian point of view,” he said.

“Australian law enforcement, both at the federal and state level, are pretty keen to make use of these latest tools, because one of the problems with cyber crime is that bad guys are constantly using the latest things.

“The good guys are sometimes hamstrung as to what they can do about that.”

QUT chair in digital economy Professor Marek Kowalkiewicz said there was a risk of people being wrongly identified. 

“It means we are all in a constant police line-up; a lot of police departments use it.

“Clearview AI and others would point out that this technology can be very successful and helpful in identifying potential victims.

“Obviously identifying perpetrators in this case, in this way, it is a law enforcement dream but this is where we need to ask: do the ends justify the means?”

Privacy Act review

A spokesperson for the federal Attorney-General’s office said agencies using facial recognition technology must comply with the Privacy Act.

“Biometric information used for the purpose of verification or identification is considered ‘sensitive information’ under the Privacy Act,” the spokesperson said.

“An enforcement body may collect sensitive information if it believes that collecting the information is reasonably necessary for, or directly related to, one or more of its functions or activities.”

Earlier this year the Attorney-General released the department’s report on the review of the Privacy Act — with submissions currently being reviewed.

“The review recommended that entities engaging in high privacy risk activities, including in relation to biometric verification or identification, be required to do more to identify these risks and implement measures to mitigate them,” the spokesperson said.

