As technology evolves, so do the methods of victimization. Digital sexual abuse, including “revenge porn,” cyberflashing, and deepfake “pornography,” is rapidly increasing. Recent studies show that 1 in 12 U.S. adults has experienced image-based sexual abuse, with women being the primary targets. Globally, the situation is even more dire. In South Korea, deepfake-related sexual crimes surged in the first seven months of this year alone, with 297 cases reported—up from 180 for all of last year and nearly double the number from 2021.
Platforms like Telegram, with hundreds of millions of users, facilitate these abuses by allowing digital sexual abuse to flourish unchecked. An outdated law from the 1990s—Section 230 of the 1996 U.S. Communications Decency Act—prevents us from holding tech platforms accountable, allowing harmful content to spread and severely damaging victims’ mental health, reputations, and safety. Despite some recent improvements—like Pornhub’s removal of unverified videos and anti-cyberflashing laws in the U.K.—these measures are insufficient. Without coordinated global action, millions will continue to be victimized.
Below, I delve into some of the latest issues, their urgency, and the necessary steps forward.
Telegram’s Founder Arrested
Pavel Durov, the founder of Telegram, was recently arrested in France, bringing further attention to the platform’s role in enabling digital sexual abuse. Telegram has long been under scrutiny for its lax content moderation policies, which allow the spread of non-consensual content, including “revenge porn” and deepfake pornography. Durov’s arrest raises serious questions about whether the platform will finally be held accountable for facilitating widespread exploitation.
This arrest comes on the heels of mounting international criticism, particularly in countries like South Korea, where Telegram has been used to distribute illegal deepfake content. While Telegram offers its users privacy and encryption, these features have made it a hotspot for the spread of illicit material, further highlighting the need for legal reforms to ensure platform accountability.
Section 230 and Platform Immunity in the U.S.
The Communications Decency Act’s Section 230 continues to be a critical barrier in holding platforms accountable for the harmful content they allow. This law shields tech platforms from liability for the material their users post, making it nearly impossible for victims of digital sexual abuse to seek justice.
While there are efforts to chip away at this protection both in the courts and in Congress, the tech industry has resisted meaningful change. One in five women reports being sexually harassed online, yet platforms like Meta and Telegram remain largely unaccountable, able to profit from user engagement without being held responsible for moderating dangerous or illegal content.
“Revenge Porn” Scandal: Love Island and the Spread of Intimate Images
The “Love Island” scandal centered on Georgia Harrison, whose ex-boyfriend, Stephen Bear, secretly recorded an intimate video of them and shared it without her consent. Bear uploaded the footage to OnlyFans, leading to his 2023 conviction for voyeurism and disclosing private sexual images. Harrison, who bravely pursued legal action, helped bring about changes to U.K. law that make it easier for victims to seek justice. She was awarded £207,900 in damages and continues to advocate for victims of online sexual abuse. She is now releasing a docuseries documenting her journey toward justice.
This case highlights just how widespread and damaging non-consensual image sharing has become. Revenge porn affects 1 in 8 young women, according to a 2017 survey conducted by the Cyber Civil Rights Initiative, and the consequences can be devastating—emotionally, professionally, and personally.
To make online spaces safer, we need key reforms. Section 230 must be updated so platforms can be held accountable for harmful content. Deepfake technology should be regulated, giving victims clear legal recourse. International cooperation is essential to tackle the global nature of these crimes. And platforms like Meta and Telegram must adopt stricter content moderation, improve response times, and invest in victim support to protect users effectively.