misk@sopuli.xyz to Technology@lemmy.world · English · 1 year ago
Jailbroken AI Chatbots Can Jailbreak Other Chatbots (www.scientificamerican.com)
Cross-posted to: hackernews@lemmy.smeargle.fans, hackernews@derp.foo
Syrus@lemmy.world · 1 year ago
You would need to know the recipe to avoid making it by accident.
Echo Dot · 1 year ago
Especially considering it’s actually quite easy to make by accident.