ylai@lemmy.ml to AI@lemmy.ml · English · 10 months ago
AI chatbots tend to choose violence and nuclear strikes in wargames (www.newscientist.com)
20 comments
cross-posted to: technology@lemmygrad.ml, worldnews@lemmit.online, futurology, technology@lemmy.world, nottheonion@lemmy.world
FaceDeer@kbin.social · 10 months ago
I wouldn’t be surprised if this actually factors into this outcome. AI is trying to do what humans expect it to do, and our fiction is full of AIs that turn violent.
averyminya@beehaw.org · 10 months ago
Not to mention humans’ own tendencies toward violence.