• webghost0101@sopuli.xyz · 8 points · 6 months ago

    They did literally nothing and seem to have used the default Stable Diffusion model, which is only meant as a tech demo. It would have been easy to put “(((nude, nudity, naked, sexual, violence, gore)))” as the negative prompt.

    • megopie@beehaw.org · 7 points · 6 months ago

      The problem is that negative prompts only help so much; when the training data is so heavily poisoned in one direction, stuff still gets through.