“You remove it but it keeps coming back”: New laws leave adult digital sex crime victims little recourse

Posted on : 2021-12-12 09:01 KST Modified on : 2021-12-12 09:32 KST
Some experts are calling for the abolition of the statute of limitations for all digital sexual crimes, regardless of the victim’s age
courtesy of Getty Images Bank

A number of changes have taken place since the Hankyoreh first reported in November 2019 on the digital sexual exploitation taking place on the messaging service Telegram. Hidden cameras, known commonly in Korea as “molka,” are now referred to as “illegal photography”; pornography is called “sexually exploitative material.”

The changes have been broad in scope, from the public’s sensitivity to matters of sexual abuse to the formulation of related policies and the enactment and amendment of legislation.

But when it comes to practically reducing sexual crimes and offering reparations for the damage done, the picture is still incomplete. Much remains unresolved in connection with reporting, investigations, trials and support for victims.

Each stage of the process can mean substantial suffering for victims, but public attention has been scattered in many different directions. Hence the slow but steady efforts to shed light on the forms of victimization that go relatively unnoticed.

“Up until recently, I’ve still been helping with the removal of illegally photographed images of Soranet victims from nine years ago,” said “K,” who works with the deletion support team at the Digital Sexual Crime Victims Support Center.

Created in 1999, Soranet became South Korea’s biggest nonconsensual pornography website before it was shut down in 2016. But the illegally filmed footage that was distributed through it continues to circulate online.

A simple search for “Soranet” on social media sites turns up a list of addresses — available to anyone — that lead to a “second Soranet” site. That site features vast amounts of pornographic material of indeterminate dates.

The sexually exploitative material that K has been deleting in the past two weeks is the same material from years earlier.

“You remove it and remove it, but it keeps coming back,” K said, adding that “many of the victims suffer because of its recirculation.”

Since last year, the so-called Nth Room Prevention Act has been phased in, with much stronger investigations and punishments for digital sexual crimes. For the victims, however, the reality has not changed much. Some analysts have suggested that with the focus set mainly on children and adolescent victims, adult victims are stuck in a blind spot.

Seven years: that’s the statute of limitations for digital sexual crimes against adult victims. If seven years pass without the victim ever becoming aware that the material was in circulation, they have no legal recourse against the original perpetrator, even if they later learn of their victimization and attempt to pursue action.

It also means that when videos that first circulated seven years earlier begin appearing again online, there are no means available to punish the person who first put them there.

For many crimes, time limits are placed on prosecution owing to factors such as the practical limits of investigation. But analysts argue, with good reason, that not enough consideration has been given to the nature of digital sexual crimes as a long-term threat: one that repeatedly undermines a victim’s social stability, as the crime is committed anew with each fresh viewing of the sexually exploitative material.

Given these characteristics of digital sexual crimes, the response has differed in cases where the victim is a young child or adolescent. On Sept. 24, an amendment to the Act on the Protection of Children and Youth Against Sex Offenses went into effect, abolishing the statute of limitations on the production, importation or exportation of sexually exploitative material depicting children or adolescents. This follow-up measure was adopted in the wake of the Nth Room incident on Telegram, in which 62% of the victims were in their teens.

It also remains commonplace for adult victims not to be recognized fully as victims. Based on an amendment enacted in June 2020, the above act refers to videos showing the victimization of children and adolescents as “sexually exploitative material,” rather than as “pornography using children and adolescents,” the term that had previously been used.

But under the Act on Special Cases Concerning the Punishment of Sexual Crimes, which chiefly applies in cases involving adult victims, victimizing videos are still referred to as “photographs and reproductions thereof that may elicit feelings of sexual desire or shame.”

“When the Children and Youth Sex Offense Protection Act spells out that victimizing videos are ‘sexually exploitative material,’ that makes it clear that the victims are indeed victims of a crime of exploitation,” explained attorney Cho Eun-ho of MINBYUN-Lawyers for a Democratic Society, who has represented victims of sexual exploitation on Telegram. “This means that they do not intend to hold the victim in any way accountable.”

“But victimizing videos showing adult victims are still referred to as ‘pornography,’” she added.

“In some respects, the fact that the law hasn’t changed means that the existing attitudes in the courts and investigative institutions haven’t improved either,” she said.

Experts agreed that when it comes to victims of digital sexual crimes, distinctions based on age are meaningless.

Victims of all ages live in a state of constant anxiety — unable to rest easy even after the center has helped get the videos or photographs in question removed. Once uploaded to a website, a video can spread to dozens to hundreds of other sites instantaneously through simultaneous uploads; downloads are simple, meaning that the videos might show up again at any time.

It’s an even bigger problem when the original videos went up with titles, including ones referring to specific universities where the victims attended school. Over the years, they may end up mentioned again on online communities, resulting in secondary victimization — sometimes with the video itself resurfacing.

“It’s unfortunate that since the Nth Room incident, the focus of legislation has been mainly on underage victims,” said K, who works to delete videos at the center.

“I worry that it could send the implicit message that we don’t intend to protect adult victims.”

Some have argued that in light of the nature of digital sexual crimes, the statute of limitations should be abolished in cases of material showing adult victims, or that measures should be put in place to allow for its extension. In the case of sexually exploitative material showing children and adolescents, the statute of limitations on production and importation or exportation was abolished outright.

“As far as the matter of abolishing the statute of limitations on digital sexual crimes is concerned, there’s no need to distinguish between children and adults,” argued Cho Eun-ho.

“Digital evidence can be used to establish the facts, such as who circulated something and when. Just as they’ve extended the statute of limitations in cases where scientific evidence has been discovered, we need to move in the direction of extending the statute of limitations in cases of digital sexual crimes,” she urged.

A 2010 amendment to the Sexual Crime Punishment Act allowed for a 10-year extension to the statute of limitations in cases where scientific evidence proving a sexual crime is discovered.

In the National Assembly, a bill was recently introduced to fill in the cracks of the Nth Room Prevention Act.

On Nov. 18, Democratic Party lawmaker Hong Jung-min spearheaded a proposed amendment to the Sexual Crime Punishment Act that would extend to crimes involving adult victims the special investigative provisions, including identity non-disclosure and undercover investigations, that previously applied only to digital sexual crimes involving minors.

This was motivated by the nature of digital sexual crimes and their basis in anonymity, which makes it difficult to investigate them or acquire evidence.

On Oct. 28, Korea’s National Police Agency announced that in the first month since it began conducting undercover investigations of digital sexual crimes involving minors, it had arrested 58 suspects in connection with 35 cases.

By Park Go-eun, staff reporter

Please direct questions or comments to [english@hani.co.kr]

