In a troubling revelation, the National Center on Sexual Exploitation (NCOSE) has highlighted a disturbing trend: pornography websites are contributing to the rise in child sexual abuse by employing algorithms that lead users to increasingly extreme and harmful content. This development raises urgent questions about the responsibility of these platforms and the impact on society.
Haley McNamara, NCOSE’s senior vice president of strategic initiatives, explains that these algorithms are designed to keep users engaged by directing them toward content that often includes sexual violence and exploitation. “They’re trying to zero in on what you’re interested in,” McNamara states. Unfortunately, this means that even first-time visitors to these sites can encounter scenes depicting sexual violence right from the start.
The implications of such exposure are grave. According to McNamara, these algorithms create a “large funnel” that draws users ever deeper into a world of disturbing material. This is not just a theoretical concern; Pornhub, one of the largest pornography websites, has faced numerous lawsuits alleging that it profits from child sexual abuse material and other forms of sexual violence.
In a recent press release, NCOSE referenced a report from the journal Victims & Offenders, which highlights the alarming accessibility of pornography as a factor in the increase of online child sexual abuse material. The report cites data from the National Police Chiefs’ Council revealing that approximately 850 men are arrested each month for online child abuse offenses in England and Wales alone.
A survey by Protect Children, a Finnish human rights organization, found that over half of those who viewed online child abuse material were not actively seeking it out when they first encountered it. This is particularly concerning, as 70% of respondents reported first seeing such material before the age of 18, with many exposed before the age of 13. The findings underscore the urgent need for protective measures to shield children from such content.
McNamara emphasizes that while individuals are responsible for their choices, society must acknowledge the systemic issues that allow such exploitation to flourish. “The children are the victims in all of this,” she asserts. The National Center for Missing & Exploited Children received over 36 million reports of suspected child sexual exploitation in 2023, a significant portion of which involved the distribution of child sexual abuse material.
To combat this growing crisis, NCOSE advocates for several solutions, including the repeal of Section 230 of the Communications Decency Act. This would strip online platforms of their broad legal immunity, holding them accountable for the content they host. Additionally, McNamara supports age verification laws to prevent children from accessing pornography, such as the App Store Accountability Act recently signed into law in Utah. That legislation shifts the responsibility for verifying user ages to app stores operated by companies like Google and Apple.
As the conversation around online safety continues, it’s clear that a multifaceted approach is necessary. By layering various prevention mechanisms, we can create a safer online environment for children. It is imperative that we take these concerns seriously and work collectively to protect the most vulnerable members of our society.
For more information on efforts to combat online sexual exploitation, visit the National Center on Sexual Exploitation’s website and stay informed about the steps being taken to ensure a safer digital landscape for all.