
#386: New Nth Room: Middle Schoolers Deepfake Videos Of Mom, Sister, Classmates In "Humiliation Room"

By Stephanie Soo & Ramble

In this episode of Rotten Mango, Stephanie Soo examines the alarming rise of deepfake pornography targeting women and minors in South Korea. She reveals the harrowing scale of the Telegram chat room ecosystem, where thousands of users, including young students, create and share non-consensual deepfakes of classmates, friends, and family.

The episode delves into the devastating impact on victims and the systemic failures enabling these crimes. Perpetrators, who view deepfake creation as mere "entertainment," face little legal deterrence, often receiving only probation or suspended sentences. Law enforcement struggles to combat secure platforms like Telegram, and some officials adopt dismissive attitudes toward victims, further compounding the issue.


This is a preview of the Shortform summary of the Aug 29, 2024 episode of the Rotten Mango podcast.



1-Page Summary

The Deepfake Epidemic in South Korea

In South Korea, there is growing alarm over thousands of Telegram chat rooms producing and sharing non-consensual deepfake pornography, often targeting women and minors. According to Soo, the scale of this problem is staggering:

Chat Room Requirements and Reach

To access the deepfake chat rooms, users must provide extensive personal details and photos of their victims, including classmates, friends, and family members. The rooms extend across over 400 schools and universities, with over 70 individual rooms tied to one channel alone, Soo reports.

Young Perpetrators View It as "Entertainment"

Many of the perpetrators are students themselves, some as young as middle schoolers, who view creating and sharing deepfakes of their peers and teachers as a form of "entertainment" or "prank." They go to extreme lengths like secret recordings and physical assaults to obtain source material.

Devastating Impact on Victims

Soo describes victims experiencing intense trauma, panic, and a profound loss of trust upon discovering deepfakes of themselves. They feel disgusted with their bodies and paranoid about who has seen the content, sometimes being forced to cut ties with trusted individuals like family members involved in the crimes. Victims fear long-term consequences like ruined job prospects and physical harm if recognized. Many feel compelled to delete their entire online presence to avoid further exploitation.

Systemic Failures Enable the Crimes

Despite the severity, the legal system provides little deterrence, often giving perpetrators mere probation or suspended sentences, Soo finds. Law enforcement claims to be powerless against secure platforms like Telegram. Some officials express concerns about overregulating deepfakes or even imply that victims are partially responsible, further enabling dismissive attitudes.


Additional Materials

Actionables

  • You can enhance your digital literacy by learning about the technology behind deepfakes and how to spot them, which will help you become more aware of the content you encounter online. Start by exploring free online resources or tutorials that explain how deepfake technology works and the common signs that a video or image may be manipulated. This knowledge can make you more critical of the media you consume and share, reducing the spread of deepfake content.
  • Protect your online privacy by adjusting your social media settings to limit who can view and share your personal photos and information. Dive into the privacy settings of each platform you use and customize them to ensure that only people you trust have access to your personal content. This can help prevent your images from being used without your consent.
  • Support victims of deepfake abuse by donating to or volunteering with organizations that provide legal aid, counseling, or advocacy for those affected. Research groups that focus on digital rights and victim support, and consider contributing your time or resources to their efforts. Your involvement can help create a stronger support network for individuals who have been harmed by deepfake technology.


The epidemic of deepfake and non-consensual explicit content in South Korea, particularly targeting women and minors

Across South Korea, alarm is growing over the mass production and sharing of non-consensual deepfake pornography, with a disconcerting number of cases involving women and minors.

Thousands of Telegram chat rooms across South Korea have been uncovered, revealing a massive network of perpetrators creating and sharing deepfake sexual content of victims, many of whom are minors.

The controversy intensified with the revelation that over 400 schools had related Telegram chat rooms. This network of perpetrators demanded users submit personal information and images of individuals they know—including classmates, friends, and family members—to be manipulated into pornographic deepfake videos.

The chat rooms require users to submit personal details and photos of acquaintances, friends, and family members, which are then used to generate deepfake pornographic videos.

To access these chat rooms, users had to provide extensive personal details and photos of their victims. Required details could include name, date of birth, occupation, school, grade, age, Instagram handle, phone number, and even home address.

The scale of this epidemic is staggering, with over 400 schools, or 7% of all middle schools, high schools, and colleges in South Korea, identified as having associated Telegram chat rooms.

The list of affected institutions includes dozens of middle schools, high schools, and even top universities. Over 70 individual chat rooms were identified within a single Telegram channel, each corresponding to a different university or school.

Perpetrators, often young students, view creating and sharing these deepfakes as a form of "entertainment" or "prank", with little awareness of the severe consequences.

Many perpetrators are students, some as young as middle schoolers, who target their own classmates and teachers. The casual tone of their conversations in the threads reveals a disturbing attitude toward creating and distributing these images.

Many perpetrators are students themselves, even as young as middle schoolers, who target their own classmates and teachers.

Reports have surfaced of middle school students deepfaking their peers and of a middle school boy who produced deepfake content of 12 female classmates and two teachers.

Some perpetrators go to extreme lengths, such as secretly recording victims or obtaining personal photos, in order to create more content to share in the chat rooms.

The lengths to which perpetrators will go to create this content are alarming; cases of secret recordings and obtaining personal photographs are common. Some have gone as far as committing physical assaults to garner material for their deepfakes.

The epidemic’s reach extends to a range of victims, from schoolchildren to professional adults such as lawyers and teachers. The deepfakes have been used to create videos that depict individuals in sexually compromising and abusive situations. For instance, Stephanie Soo brought up a case of a woman who became a lawyer, only to discover her images shared in a group chat from her law school days, implicating someone within her professional circle.

Thi ...


Additional Materials

Actionables

  • You can enhance your digital literacy by learning about the technology behind deepfakes and the legal implications of creating and sharing such content. Start by researching articles and free online resources that explain how deepfake technology works and the current laws regarding digital consent and privacy. This knowledge will empower you to recognize deepfakes and understand the seriousness of the issue, which can lead to more informed decisions about your online behavior and the content you share.
  • Protect your personal information by conducting a privacy audit on your social media accounts and tightening security settings. Go through your profiles on platforms like Facebook, Instagram, and Twitter to review what information is public. Remove or hide sensitive details that could be used to create deepfakes, such as your full date of birth, location, and photos where your face is clearly visible. This proactive step reduces the risk of your personal data being misused.
  • Foster open conversations wi ...


The devastating psychological and social consequences for victims, including shattered trust and safety

The rise of deepfake technology has brought a wave of psychological trauma and social disruption, particularly for female victims. Their lived experiences paint an alarming picture of the repercussions deepfakes have on individuals' sense of security and their trust in the social fabric.

Victims, upon discovering the deepfake content, experience intense trauma, panic, and a profound loss of trust in those around them.

Women in Korea describe a feeling akin to 'social collapse,' which speaks volumes about the impact on their sense of security. The distribution of these deepfakes, often through numerous chat rooms, fuels this distrust, potentially affecting victims' employment opportunities and exposing them to physical harm.

Both students and teachers have found themselves embroiled in deepfake scandals. A high schooler was horrified by explicit deepfake photos of herself that appeared authentic, and a teacher discovered deepfake material of herself being disseminated by her students. This sense of betrayal, particularly when those involved were trusted individuals such as students or, in one horrific instance, a victim's brother, disrupts the foundational sense of safety one feels in personal relationships and community.

Victims report intense emotional disturbances, such as feeling the need to cut themselves out of their own bodies—a graphic illustration of the severity of the psychological impact of being deepfaked. This trauma reinforces paranoid thoughts, leaving victims in constant fear of who might have seen the manipulated images and videos.

Victims report feeling disgusted with themselves and their bodies, and become paranoid about every interaction, unsure of who may have seen the content.

Victims endure an overwhelming sense of paranoia, constantly questioning who among their acquaintances or passersby may have seen or distributed the harmful content. This paranoia extends beyond the digital realm, with implications on every personal interaction they have.

A case Stephanie Soo recounts, of a woman who struggled for a year after discovering her brother had taken compromising photos of her, underscores this burden. Unable to trust even her own family, the victim illustrates the deeply personal nature of the violation and the struggle to reclaim ownership of and comfort within one's own body.

The deepfake videos can have far-reaching consequences, including jeopardizing victims' future employment prospects and exposing them to potential physical harm if recognized.

Beyond personal trauma, the threat looms of potential blackmail and physical danger due to recognition from deepfake content. Officials warn that such images can ruin job prospects and lead to physical violence. Victims are left to grapple with both the immediate emotional toll and the long-term social and economic repercussions.

The impact of these violations extends beyond the individual victims, damaging the broader social ...


Additional Materials

Actionables

  • Educate yourself on digital literacy to discern real from fake content by taking free online courses or tutorials that focus on media literacy and critical thinking skills. Understanding the technical aspects of how deepfakes are created and the common signs of manipulated media can empower you to question the authenticity of suspicious content you encounter online. For example, you might look for inconsistencies in lighting, shadows, or facial expressions in videos you come across on social media.
  • Support victims of deepfakes by advocating for and donating to organizations that provide legal and psychological assistance to those affected. By contributing to these causes, you help create a support network for individuals who have been harmed by deepfake technology. You can research and find reputable organizations that work on these issues and consider setting up a monthly donation to help sustain their efforts.
  • Implement robust personal cybersecurity measures to protec ...


The systemic failures and dismissive attitudes that enable this problem to persist, including lenient sentences for perpetrators

In South Korea, the systemic failures and dismissive attitudes towards deepfake-related crimes allow such activities to continue with minimal deterrence. Despite the severity of these crimes, the legal system often fails to provide significant consequences for the perpetrators.

Sentences that frequently amount to probation or suspension send a clear message that these crimes are not taken seriously. An egregious example is a high schooler convicted of creating and distributing deepfake videos who received only six months in prison because he admitted his mistakes and had no prior criminal record. In practice, perpetrators often receive probation or a suspended sentence without even a fine, even though making deepfakes for distribution can be punished with up to five years in prison or a fine of 50 million won. Moreover, only five individuals were sentenced to prison for purely deepfake-related crimes in the previous year.

Even when perpetrators are identified and prosecuted, the penalties are viewed as a "slap on the wrist," doing little to discourage future offenders.

Officers note a lack of awareness that deepfaking is a crime, suggesting deficiencies in education and law enforcement around the issue. The case of a middle schooler who produced deepfakes reinforced the narrative that young offenders face negligible consequences. Perpetrators share among themselves that unless they've left identifying information, they are unlikely to be caught, due to Telegram's lack of cooperation with the police.

The dismissive attitudes of some government officials and the public have further compounded the problem, minimizing the experiences of victims.

Parents have been noted to question the fairness of severely punishing their sons for what they perceive to be a mere "joke," trivializing the crime and downplaying its impact. Even when victims seek help from authorities, they're often told there's little that can be done against crimes committed on secure platforms like Telegram. Victims' families can also be dismissive: one mother merely apologized during an argument, and one father asked how long his daughter would keep talking about her experience.

Some officials have expressed concerns about "overregulation" of deepfakes, while others have suggested that women are partially res ...


Additional Materials

Counterarguments

  • The legal system may prioritize rehabilitation over punishment, especially for youthful offenders, to avoid the negative long-term impacts of incarceration.
  • Leniency in sentences could be due to the challenges in quantifying the harm caused by deepfake crimes compared to more traditional offenses.
  • The small number of prison sentences might reflect the difficulty in prosecuting new types of cybercrimes, where evidence gathering and attribution can be complex.
  • The perception of penalties as insufficient may not account for the full range of consequences for the perpetrator, including social stigma and future legal restrictions.
  • Lack of awareness about deepfaking as a crime could be due to its relatively recent emergence, and efforts to educate law enforcement and the public may be underway.
  • Dismissive attitudes among officials and the public could stem from a lack of understanding of the technology and its potential for harm, rather than a willful minimization of victim experiences.
  • Concerns about overregulation may be valid in the context of mai ...

Actionables

  • You can educate yourself on the technical aspects of deepfakes to better understand how they are created and spread. By learning the basics of deepfake technology, such as the software used and the process of creating a deepfake, you'll be better equipped to identify potential deepfakes. This knowledge can also help you explain the issue to others, raising awareness in your own circles.
  • Start a digital literacy initiative in your local community to inform others about the implications of deepfake technology. This could involve simple activities like creating informative flyers, hosting small group discussions, or sharing educational content on social media. The goal is to increase public understanding of deepfakes, which can contribute to a more informed community that takes the issue seriously.
  • Advocate for clearer communication channels between victim ...
