Nemeski@lemm.ee to Technology@lemmy.worldEnglish · 2 months agoOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comexternal-linkmessage-square98fedilinkarrow-up1424arrow-down17
arrow-up1417arrow-down1external-linkOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comNemeski@lemm.ee to Technology@lemmy.worldEnglish · 2 months agomessage-square98fedilink
minus-squarevxx@lemmy.worldlinkfedilinkEnglisharrow-up3·edit-22 months agoThe “issue” is that people were able to override bots on twitter with that method and make them feed their own instructions. I saw it first time being used on a Russian propaganda bot.
The “issue” is that people were able to override bots on twitter with that method and make them feed their own instructions.
I saw it first time being used on a Russian propaganda bot.