Sometimes we are not aware of the dangers inherent in many software tools that appear under the "free" label. To take a recent example, last August one of those instant "wow" effect apps went viral: a free app that lets you create deepfakes in real time. The problem is that what you imagine as harmless entertainment can be a terrifying weapon in other hands. We already knew about the problems that accompany deepfakes, but in South Korea the issue has exploded.
Deepfakes in schools. After it was revealed that police were investigating deepfake porn networks at two universities in South Korea, local journalist Ko Narin decided to take on the investigation herself. A few weeks ago she published a report on what she found: dozens of Telegram chat groups where users shared photos of women they knew, "transformed" into pornographic images via deepfakes.
Ko's work showed that there are not only chats run by university students, but also private rooms dedicated to secondary and even primary schools where minors are present, some with up to 220,000 members: virtual spaces called "humiliation rooms" or "rooms of friends of friends," with strict access conditions. The report shocked the country, showing that every level of education is targeted, and put the Telegram app itself under scrutiny.
The first stone. It all started a few years ago. In 2020, while South Korean authorities were pursuing a blackmail ring that forced young women to make explicit videos for paying viewers, they found something darker on social media: pornographic images with other people's faces attached.
At the time, the authorities didn't know what to do with these first attempts at deepfake pornography. South Korea's National Assembly eventually passed a vaguely worded law against those who produced and distributed such material. It didn't stop a new wave of crimes using AI technology; after several investigations, it now seems the country has developed an entire online culture of misogyny.
In the process, the place of women in society is once again being called into question.
The latest crackdown. Last week the national police agency said it was investigating 513 cases of deepfake pornography, in which the faces of real women and girls are digitally grafted onto other bodies without their knowledge.
That represents a 70% increase in just 40 days since the previous count, and it underlines the country's struggle to control the use of digital technology to sexually abuse women and girls.
Molka. South Koreans have been familiar for years with this popular term for covert recordings of victims' bodies, made with both mobile phones and small hidden cameras. So much so that in 2020 the country's president acknowledged that it had become "a part of everyday life." According to police data, there were around 1,100 complaints about molka crimes in 2010; by 2018 the figure had reached 6,800 cases per year. And whereas spy-camera recordings accounted for 3.6% of sexual crimes in 2006, by 2015 they made up almost half of them.
The country has seen a rise in reported sexual crimes, but it's unclear how much of this reflects more incidents and how much reflects Korean women becoming more willing to report them; most likely it's a combination of both. The deeply digital nature of Korean society also matters: when the first camera phones went on sale in the early 2000s, authorities forced manufacturers to design them so that starting a recording or taking a photo emits a sound audible to bystanders, precisely to deter covert filming.
Molka and deepfakes today. Years have passed since online sex crimes first made the news, and the main thing that has improved is the technology for spreading the images. In fact, today's technology is so advanced that ordinary people can hardly tell whether an image is fake.
As the country tries to tackle the threat, experts have noted that excitement about new technologies in South Korea can sometimes outweigh concerns about their ethical implications.
Misogyny. For many women, these deepfakes are simply the latest online expression of their country's deep-rooted misogyny: a culture that has bred young men who now find it amusing to share sexually degrading images of women online. The fear has led dozens of women and teenagers across the country to delete their photos from the web, or deactivate their accounts outright, in recent weeks for fear of being "next."
"Korean society does not treat women as human beings," Lee Yu-jin, a student whose university was among the hundreds of middle schools, high schools and universities where students were victimized, told the New York Times. She wondered why the government had not done more "before stealing photos of friends and using them for sexual humiliation became a digital culture."
Deepfake capital. South Korea currently holds the unenviable title of the country hardest hit by deepfake pornography. According to a 2023 report by Security Hero, an American startup focused on identity-theft protection, South Korean singers and actresses account for 53% of the people featured in deepfakes worldwide.
In parallel, the police have launched an investigation into Telegram, and the country’s media regulator plans to hold talks with representatives of the messaging app to find a common solution to the problem. The Ministry of Education also announced the creation of a working group to investigate incidents in schools, teach children how to protect their image and support victims.
Every effort falls short. In fact, even students and teachers who aren't directly affected "experience extreme fear and anxiety about the possibility of it being used in sexual crimes or distributed online without their knowledge," according to the Korean Teachers Federation union.
Image | Andrew and Annemarie, Pixabay
In Xataka | The nations and stereotypes of the world as seen by Japan in this magnificent map from 1932
In Xataka | South Korea is already considering starting school earlier for girls. The reason: raising the birth rate