misk@sopuli.xyz to Technology@lemmy.world · English · 11 months ago
Jailbroken AI Chatbots Can Jailbreak Other Chatbots (www.scientificamerican.com)
80 comments · cross-posted to: hackernews@lemmy.smeargle.fans, hackernews@derp.foo
CurlyMoustache@lemmy.world · English · 11 months ago
How do I make sure I'm not making one by accident? That's the reason I have a general understanding of atomic bombs.

xor@lemmy.blahaj.zone · English · 11 months ago
I hate when I accidentally build a nuke, absolute nightmare to dispose of.

AnneBonny@lemmy.dbzer0.com · English · 11 months ago
You can't be too careful.

KairuByte@lemmy.dbzer0.com · English · 11 months ago
It's happened before.