shish_mish@lemmy.world to Technology@lemmy.world · English · 4 months ago
Researchers jailbreak AI chatbots with ASCII art -- ArtPrompt bypasses safety measures to unlock malicious queries (www.tomshardware.com)
cross-posted to: technology@lemm.ee
Even_Adder@lemmy.dbzer0.com · English · 4 months ago
You should know this exists already then,