Police investigation remains open. The photo of one of the minors included a fly; that is the logo of Clothoff, the application that is presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”
I doubt that not naming it would do much of anything.
Considering that AI services typically cost money, especially those advertising adult themes, naming it kinda does support the hosts of such services.
Then again, naming and shaming puts pressure on them too. But in the end I doubt it matters. Those who want to use them will find them.
Of course, though that isn’t even the problem; the problem is people using the edited pictures for things like blackmail or whatever. From a technical standpoint it isn’t too dissimilar to old-school photoshopping. Face swapping can probably even produce much higher-quality results, especially if you have a lot of source material to pull from (you want matching angles for an accurate-looking result). Those AI-drawn bodies often have severe anatomical issues that make them very obvious and look VERY different from their advertising materials.
True. Especially as just googling ‘undress AI free’ yields tons of results, which may be more or less legit.