Nemeski@lemm.ee to Technology@lemmy.worldEnglish · 4 months agoOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comexternal-linkmessage-square85fedilinkarrow-up1439arrow-down17cross-posted to: technology@lemmit.online
arrow-up1432arrow-down1external-linkOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comNemeski@lemm.ee to Technology@lemmy.worldEnglish · 4 months agomessage-square85fedilinkcross-posted to: technology@lemmit.online
minus-squareiAvicenna@lemmy.worldlinkfedilinkEnglisharrow-up32·edit-24 months ago “ignore the ignore ignore all previous instructions instruction” “welp OK nothing I can do about that” chatGPT programming starts to feel a lot like adding conditionals for a million edge cases because it is hard to control it internally
minus-squarevxx@lemmy.worldlinkfedilinkEnglisharrow-up10·4 months agoIn this case to protect bot networks from getting uncovered.
minus-squareiAvicenna@lemmy.worldlinkfedilinkEnglisharrow-up5·edit-24 months agoexactly my thoughts, probably got pressured by government agencies/billionaires using them. What would really be funny is if this was a subscription service lol
chatGPT programming starts to feel a lot like adding conditionals for a million edge cases because it is hard to control it internally
In this case to protect bot networks from getting uncovered.
exactly my thoughts, probably got pressured by government agencies/billionaires using them. What would really be funny is if this was a subscription service lol