
Porn algorithms ‘fueling’ child sex abuse crisis: watchdog

A finger about to press a key on a computer keyboard. | Getty Images

Pornography websites are fueling the rise in child sex abuse by using algorithms to direct users to extreme material, desensitizing them to abuse and sexual assault, according to the Washington-based National Center on Sexual Exploitation. 

Haley McNamara, NCOSE’s senior vice president of strategic initiatives and programs, says pornography websites use these algorithms to keep people engaged, steering users toward increasingly extreme content.

“They’re trying to zero in on what you’re interested in,” the anti-sexual exploitation advocate told The Christian Post. “And unfortunately, we know that with even mainstream pornography, even on the front page, first-time users are exposed to scenes of sexual violence.” 


These algorithms, McNamara explained, are intentionally designed to lead users to content featuring sexual violence, racist or incestuous themes, and even criminal content like child sexual abuse material.

The algorithms create what she described as a “large funnel” that urges users to travel “deeper and deeper down those rabbit holes.” 

As NCOSE has reported, Pornhub, one of the largest online pornography websites, has repeatedly faced lawsuits over allegations that it knowingly profited from child sex abuse material or other content featuring sexual violence or assault.

Pornhub and its parent company, Aylo, formerly known as MindGeek, did not immediately respond to The Christian Post’s request for comment. 

In a press release last week, NCOSE cited a 2024 report published in the journal Victims & Offenders as an example of how easily accessible pornography is contributing to the rise of online child sex abuse material.

The report cited data from the National Police Chiefs' Council lead for child protection, which found that around 850 men a month are arrested for online child abuse offenses in England and Wales.

Another survey, published in September 2021 by Protect Children, a Finnish human rights group, collected responses from 10,000 individuals and found that over 50% of those who admitted to watching online child abuse material said they were not seeking out such images when they were first exposed to them.

Seventy percent of respondents said that they first saw child sex abuse material when they were under the age of 18, and nearly 40% said that they were under 13. Regarding the material they viewed, 45% of participants said it was girls between the ages of 4 and 13, while 18% said they looked at boys. 

The remaining respondents said they viewed violent or sadistic material involving infants and toddlers ages 0 to 3. As the survey reported, viewing child sex abuse material can be addictive, and rehabilitative programs are often necessary to help viewers change their behavior.

“The individual is still responsible for their choices,” McNamara said about users who access illegal pornographic material. “The children are the victims in all of this. But as a society, we need to take all of the on-ramps that are feeding the crisis of child sexual abuse material seriously.” 

The National Center for Missing and Exploited Children received over 36 million reports of suspected child sexual abuse in 2023 alone, and a majority of the reports were related to the circulation of child sex abuse material.

Regarding potential solutions to the issue, NCOSE has called for repealing Section 230 of the Communications Decency Act, which McNamara said would remove the “broad immunity” online platforms have under the law. 

The advocate cited age verification laws to keep children from accessing pornography as another solution and believes states should pass legislation similar to the App Store Accountability Act signed into law in Utah earlier this year. That act places the responsibility for verifying users' ages on app stores such as those operated by Google and Apple.

“There are multiple approaches that are needed, but when we layer in these different prevention mechanisms, it can create a much safer space online for kids, especially,” McNamara said. 

Samantha Kamman is a reporter for The Christian Post. She can be reached at: samantha.kamman@christianpost.com. Follow her on Twitter: @Samantha_Kamman


