“This eventually leads to harassment and threats to share the images unless money is sent,” said Withers. “Children are usually the victims of sextortion; one study found that 25% of victims were 13 or younger when they were first threatened and over two-thirds of sextortion victims were girls threatened before the age of 16 (Thorn, 2018).”
Is Twitter Failing Our Children?
Experts find it deeply concerning that Twitter and other social media platforms are not doing their part to eliminate the CSAM spread through their services. The volume of data that needs to be scrubbed internally is substantial, and relying on just one or two people for that job is simply inadequate, even with external agencies assisting.
“Having a child safety team for online monitoring is critical for organizations operating on social media,” suggested Dr. Brian Gant, assistant professor of cybersecurity at Maryville University.
“In Twitter’s case most importantly because there is consensual pornography that is shared in large numbers on the platform,” Gant noted. “Not having an internal team to discern what is consensual, and what would be considered innocent images or child exploitation is paramount.”
The failure to act could be seen as enabling the predators to strike.
“Social media platforms are exacerbating child abuse when they allow users to condone pedophilia, exploitation, pornography, and other forms of abuse as well as enhancing the ability for children to be groomed, controlled, and exploited,” added Lois A. Ritter, associate professor for the master of public health program at the University of Nevada, Reno.
The reduction in the child safety team is thus viewed with alarm.
“Social media platforms have a social and ethical responsibility to monitor the material on their sites to prevent and disrupt such horrific acts and prevent child victimization,” said Ritter. “Having staff monitor posts and follow up on complaints in a timely manner is critical. Unfortunately, profit often trumps child welfare. If this is a permanent staffing change, children will suffer.”
However, even with a large team of moderators, it could be impossible to monitor all the content on the platform.
“Automated technological tools can help but these should not take the place of a human moderator who will have to make decisions about what is real or not, or what is child sex abuse or not,” said Withers. “Maybe we need to hold these companies to a higher standard? They could be held responsible for creating an environment which allows for the proliferation of child sex abuse material.”
Of course, such content isn’t just spread on social media. It existed long before the Internet age.
“We should also remember that the United States is one of the largest producers and consumers of child abuse content in the world,” Withers continued. “We need to ask ourselves why and what we can do about reducing the demand for such content.”