alb_004@lemm.ee to Technology@lemmy.world · English · 7 months ago
ChatGPT provides false information about people, and OpenAI can't correct it (noyb.eu)
cross-posted to: goodnews@lemmy.ml, technology@lemmy.world, fuck_ai@lemmy.world, news@lemmy.world
NeoNachtwaechter@lemmy.world · English · 7 months ago

Replying to: "LLMs don't actually store any of their training data"

Data protection law covers all kinds of data processing. For example, input is processing, and so is output. See Article 4 of the GDPR, which defines "processing."

If you really want to rely on excuses, you'd need much better ones.
vithigar@lemmy.ca · English · 7 months ago

Right, so keep personal data out of the training set and use it only in an easily readable and editable context. The model will still "hallucinate" details about people if you ask it for details about people, but those people would be fictitious.
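The architecture vithigar describes can be sketched in a few lines. This is a minimal, hypothetical illustration (the names `context_store` and `build_prompt` are invented for this sketch, not any real API): personal data lives in an editable store and is injected into the prompt at query time, so a GDPR correction becomes a plain record update rather than a retraining job.

```python
# Hypothetical sketch: personal data kept out of model weights and
# held in an editable store instead, injected at query time.

context_store = {
    # person_id -> current (correctable) record
    "jane.doe": {"role": "journalist", "city": "Vienna"},
}

def build_prompt(person_id: str, question: str) -> str:
    """Prepend the person's current record to the question so the
    model answers from correctable data, not memorized training data."""
    record = context_store.get(person_id)
    facts = ("; ".join(f"{k}: {v}" for k, v in record.items())
             if record else "no data on file")
    return f"Known facts ({facts}). Answer only from these facts: {question}"

# A correction under data protection law is now a simple write,
# with no model change needed:
context_store["jane.doe"]["city"] = "Graz"
```

After the update, every subsequent prompt carries the corrected record; the weights were never involved, which is the point of the "editable context" approach.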