
Instagram connects pedophiles to child sex content: researchers

Instagram has created a task force to investigate researchers’ claims that its algorithms connected pedophiles to underage sexual content.

An investigation into the Meta-owned social media website by The Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst found that its services cater to people who wish to view illicit materials by connecting them to sellers of such content through its recommendation system.

Instagram’s recommendation systems are intended to connect people online with shared interests. 

As the newspaper reported Wednesday, the researchers behind the investigation discovered that the social media website allowed users to search for the hashtags “#pedowh**e” and “#preteensex,” connecting them to accounts advertising child-sex material. 

The Christian Post attempted to use these hashtags in Instagram’s search bar on Thursday, but the search did not yield any results.

According to the report, the accounts offering child sex abuse material post menus of content instead of publishing it openly. The Stanford Internet Observatory found that some accounts also sell videos of children harming themselves and performing sexual acts with animals. For a price, children are even available for “meet-ups.” 

The distribution or production of child pornography is prohibited under federal law, and violators face severe penalties. 

In response to a Thursday inquiry from CP about the WSJ report, a Meta spokesperson condemned child exploitation as a “horrific crime,” assuring that the company works to prevent it on its platforms and supports the prosecution of the criminals behind it. 

“Predators constantly change their tactics in their pursuit to harm children, and that’s why we have strict policies and technology to prevent them from finding or interacting with teens on our apps, and hire specialist teams who focus on understanding their evolving behaviors so we can eliminate abusive networks,” the spokesperson wrote. 

The Meta representative shared that, between 2020 and 2022, the team dismantled 27 abusive networks. In January 2023, the team deactivated over 490,000 accounts for violating the company’s child safety policies. 

Meta also has “detailed and robust policies” against child abuse, nudity or exploitation, including child sexual abuse material (CSAM) and inappropriate interactions with children, with the spokesperson providing a link to such policies.

“We’re continuously exploring ways to actively defend against this behavior, and we set up an internal task force to investigate these claims and immediately address them,” the spokesperson continued. 

“For example, we fixed a technical issue that unexpectedly prevented certain user reports from reaching content reviewers, we provided updated guidance to our content reviewers to more easily identify and remove predatory accounts, and we restricted thousands of additional search terms and hashtags on Instagram. We’re committed to continuing our work to protect teens, obstruct criminals, and support law enforcement in bringing them to justice.” 

Providing some background on the issue, the Meta spokesperson noted that WSJ’s reporting showed that child sex abuse material is not shared openly on the company’s platforms, highlighting that the article stated “accounts offering to sell illicit sex material generally don’t publish it openly.” 

The spokesperson also stressed that the report did not find evidence that child sex abuse material is traded on Meta’s platforms, pointing out that the WSJ report acknowledged the accounts selling it often link “to off-platform content trading sites.”

The spokesperson noted that the report found the recommendation system may connect users to accounts that “advertise child-sex material for sale.” 

“To take on predators who attempt to use our services to connect online, we hire specialists with backgrounds in law enforcement and work with child safety experts and organizations to identify these networks, monitor their latest tactics and eliminate them,” the Meta representative stated. 

“We know predators may attempt to set up multiple accounts to evade our enforcement, so when they violate certain child safety policies, we disable the accounts held by the account holder and restrict the device from setting up future accounts.”

Between May 27 and June 2, Meta automatically blocked over 29,000 devices for violating the company’s Instagram child safety policies, according to the spokesperson.

The National Center on Sexual Exploitation lists Instagram on the organization’s 2023 Dirty Dozen List of mainstream contributors to sexual exploitation. 

“Instagram has enabled predators to reach children for far too long,” NCOSE Vice President Haley McNamara told CP Thursday, adding that NCOSE is “glad to see Instagram take some action to remove pedophile networks, but the social media giant must continue to work proactively to prevent this from continuing to happen on a mass scale — without waiting for bad press.” 

McNamara recommended several systemic improvements Instagram could make to prevent sexual exploitation, such as “investing in technology to scan for and block sexually explicit content in posts and messages.”

McNamara also suggested that the social media website prohibit accounts, hashtags, comments and content sexualizing minors. In addition, she advised the website to restrict adult Instagram users from seeing minors on a “suggested users” list or “discover people” page and to remove or limit high-risk features such as “Vanish Mode” for young teens. 

Samantha Kamman is a reporter for The Christian Post. She can be reached at: [email protected]. Follow her on Twitter: @Samantha_Kamman
