SEOUL, Jan. 14 — Following the heated scandal over Luda, a chatbot unplugged amid controversies over its hypersexualisation and unfiltered comments on sexual minorities, South Korea faces another socio-technological challenge: how to tackle artificial intelligence technology that turns real, living celebrities into victims of deepfake porn.
On Wednesday, an anonymous petitioner began an online petition demanding stronger punishment for websites that distribute deepfake porn involving Korean female celebrities and for people who download them, Yonhap news agency reported.
“Videos featuring the victimised female celebrities are distributed on various social network services, and (they) are tortured with malicious comments of a sexually harassing and insulting nature,” the petitioner wrote.
The petitioner noted that this often leaves young female celebrities, including those who are underage, powerlessly exposed to sexual predators. “Deepfake is undeniably a sexual crime,” the petitioner stressed.
With unusual speed, the petition gathered more than 330,000 signatures in a single day as of Thursday afternoon.
Calls for the government to regulate deepfake porn have spread to Twitter, where fans are actively sharing hashtags such as “deepfake_strong punishment” and “publicise_illegal composite” and reporting the names of online spaces and mobile apps where deepfake porn is created and shared.
Deepfake, a portmanteau of “deep learning” and “fake,” refers to digital representations, such as videos or images, produced with sophisticated artificial intelligence techniques that can lead viewers to wrongly perceive the fabricated media as real.
While there are cases of deepfake being used positively, such as to create digital renderings of deceased family members or celebrities, the controversial technology has often been criticised for being a source of fake news, fraud and defamation.
Source: BERNAMA