Lee Duna@lemmy.nz to Fuck AI@lemmy.world · English · 2 days ago
ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
Cross-posted to: technology@lemmy.world
Grimy@lemmy.world · 2 days ago
"Safety systems" is simply censorship, pushed to the public as a good thing so it's easier to kill and ban open source solutions when the time comes.

lumen@feddit.nl · 2 days ago
You think? I'm convinced the guardrails LLM companies impose only serve to boost these companies' reputations.