ugjka@lemmy.world to Technology@lemmy.world · English · 5 months ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
AdmiralRob@lemmy.zip · 5 months ago
Technically, it didn’t print part of the instructions, it printed all of them.