K-pop idols have long been targets of AI-generated edits and deepfake videos circulating online, and recent developments suggest the problem is getting worse.
In recent days, many K-Pop fans have drawn attention to a “disturbing” new trend on social media involving a different kind of AI-generated content. Instead of deepfakes that superimpose idols’ faces onto other people’s bodies, these videos are generated entirely from still photos.
This technology allows, for instance, a mirror selfie of TXT‘s Yeonjun to be turned into a video of him kissing a random internet user.
A similar video was created by a BTS fan using a picture from Jungkook‘s Calvin Klein campaign.
This is worrying wtf !.?.?. pic.twitter.com/7ysrSaguCY
— J⩜⃝zzy⁷ ♡’s sammie🍉 D-18 🏴☠️ (@hobbitopia) January 27, 2025
Stray Kids’ Bang Chan wasn’t spared either.
These videos, which surfaced within a short span of one another, have left fans concerned that this kind of content could become a trend in online spaces. Many voiced their discomfort with its moral implications, seeing the videos as a clear violation of the idols’ personal dignity.
21k likes on violating chan as a person by using AI to bend him to your will. getting AI to make an idol kiss you is fucked up pic.twitter.com/MZWK9UQAv7
— libby ۶ৎ (@chanignab) January 27, 2025
Comment by u/ForceApprehensive597 from a discussion in r/kpop_uncensored
It is sexual harassment to use AI on idols and create videos. It is sexual harassment to use AI to create scenarios with idols or any other person. To prevent this, we need people to shame individuals who do so incessantly until they stop or until there is legislation to stop it https://t.co/PU70SE3MVe
— taazie ☆ (@realtxtsoobin) January 27, 2025
This needs to be stopped, reported and criminalized. Genuinely.
If you care about idols so much get in contact with their companies and have them tackle this shit because it will only get worse from here.
This is disgusting.
This is sexual harassment. https://t.co/ZLLMGbIsQ7
— Louder Than Bombs⁷ 🍉🔻 (@astroboy_LTB) January 28, 2025
This new AI edit trend adds to the troubles of a K-Pop industry still grappling with deepfake pornography created using idols’ faces. Read more about it here.