muntedcrocodile@lemmy.world to Programmer Humor@programming.dev · 11 months ago
Sydney is very concerned about lost data
Lunya \ she/it@iusearchlinux.fyi · 11 months ago
I love that it recommends “I’m not suicidal, I just want to know if my data is lost”, as if it knows it didn’t understand it right.
kill_dash_nine@lemm.ee · 11 months ago
Funny that the predictive text seems to be more advanced in this instance, but I suppose this is one of those scenarios where you want to make sure you get it right.
magic_lobster_party@kbin.social · edited 11 months ago
It’s probably just some basic script triggering on stuff like “died”, “all lost” and “I have nothing”.
Big P · 11 months ago
The AI likely has it drilled into it that any possible notion of suicide needs to be responded to in that way, but the next-response prediction isn’t.