The police investigation remains open. The photo of one of the minors included a fly; that is the logo of Clothoff, the application that was presumably used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”
Yes, let’s name the tool in the article so everybody can participate in the abuse
I doubt not naming it would accomplish much of anything.
Considering that AI services typically cost money, especially those advertising adult themes, naming it kinda does support the hosts of such services.
Then again, naming and shaming puts pressure on them too. But in the end I doubt it matters. Those who want to use them will find them.
Of course, and the tool itself isn’t even the problem, but rather people using the edited pictures for things like blackmail. From a technical standpoint it isn’t too dissimilar to old-fashioned photoshopping. Face swapping can probably even provide much higher-quality results, especially if you have a lot of source material to pull from (you want matching angles for an accurate-looking result). Those AI-drawn bodies often have severe anatomical issues that make them very obvious and look VERY different from their advertisement materials.
True. Especially as just googling ‘undress AI free’ yields tons of results which may be more or less legit.
You can literally Google ‘AI nude generation tool’ and get multiple results already. And I do sort of agree with you, as I’m not sure how naming this specific tool was necessary or beneficial here. But I don’t think withholding the name is going to prevent anyone interested in such a tool from finding one. The software/tool itself is (currently) not illegal.