“I have to tell you, our son is gone.”
These are words no parent should ever have to hear or speak. But they are the words one father had to say to his wife over the phone one night.
Their teenage son had died by suicide after being sexually exploited on Instagram. His exploiters used secretly obtained nude images to blackmail him, demanding money and threatening to post the images online for everyone to see. They also threatened to harm or kill his parents.
“You better stop,” they told him at one point.
And that's what the boy did.
This shocking true story (the boy's name has been withheld to protect the family's privacy) is unfortunately all too common. It illustrates the frightening reality of the dangers children face on social media and other technology platforms.
Sadly, a growing number of children are dying or being deeply traumatized on social media apps that are dangerous by their very design. Online, children are exposed to threats of sexual extortion, unwanted sexual advances, and harmful content actively fed to them by powerful algorithms, as well as features that allow criminals to find and connect with minors. It is difficult to imagine another time in history when children were so easily exposed to sexual exploitation.
The Internet Watch Foundation recently lamented that “there's never been a worse time for kids to be online.” The hotline uncovered a record amount of child sexual abuse material (a more accurate term than “child pornography”) and an “unprecedented” and “shocking” number of exploited children.
The National Center for Missing & Exploited Children received more than 32 million reports of online child sexual abuse content in 2022, and the FBI has repeatedly warned of the growing threat of sexual extortion.
Big Tech companies recognize the dangers their platforms pose to children, yet they are slow to fix them or ignore them entirely. That may boost profits, but it puts children's well-being at serious risk.
Internal Meta documents on child safety were unsealed as part of a lawsuit filed by the New Mexico attorney general. They revealed that, in 2021, Meta estimated 100,000 children per day were receiving sexually explicit content, such as photos of adult genitalia.
At a congressional hearing last November, former Meta security consultant and whistleblower Arturo Bejar testified that Meta acts on only 2 percent of user reports across its platforms, including Instagram, reports that can include sexual harassment or sexual advances from strangers. The Wall Street Journal reported that one in eight users under the age of 16 “said they had received unwanted sexual advances on [Instagram] over the past seven days.”
The same goes for other platforms. In October 2023, Discord announced it would blur sexually explicit content for teenage users by default, becoming the first social media company to say it would do so. Months later, however, those changes still have not been implemented. Discord also deceptively claims that it proactively detects and removes child sexual abuse material, even though the eSafety Commissioner's transparency report shows that Discord does not monitor for or remove such material across all areas of its platform and does not provide an in-app reporting mechanism.
Snapchat's signature disappearing-message technology has made it a favorite platform for sex offenders who exploit young users; it is cited as the platform most used for sexual extortion. Snap has made several changes to the platform, but none of them address its most abused features.
X (formerly Twitter) was just ranked second only to Kik for “severe sexual content” in parental-control app Bark's annual report.
There are also concerns about TikTok's effectiveness in moderating and reporting child sexual abuse content. According to Thorn, TikTok tied with X (along with Snapchat, Facebook, Instagram and Messenger) among the top five platforms on which the most minors reported having had a sexual experience online.
The CEOs of Meta, Snap, X, Discord and TikTok are scheduled to testify before Congress on Jan. 31 about how their platforms facilitate child sexual abuse material. These executives will testify that they are doing everything in their power to stop child sexual abuse; no doubt we will hear more lies and excuses.
The truth is that, despite all the tools and so-called safety changes they advertise and promote, more and more children are being victimized on their platforms. Big Tech companies have proven either unwilling or unable to keep children safe online. Whichever it is, that should frighten us all.
We cannot expect Big Tech companies to voluntarily do the right thing. It is long past time for Congress to hold these companies accountable and pass legislation that forces them to get serious about protecting our children.
Several child protection bills have been introduced in Congress. Despite mounting evidence of online harm to children, Congress has yet to pass legislation to prevent this harm.
We have all heard enough excuses. Children are being threatened, harassed and exposed to extreme harm online, to the point where an alarming number believe ending their lives is their only option. This fact alone makes me fear that if Congress does not force these companies to adopt even the most basic protections, they never will.
Lina Nealon is vice president and director of corporate advocacy at the National Center on Sexual Exploitation.