Futurologist to Futurology · English · 8 months ago
AGI Alignment – Cosmic vs Anthropocentric (danfaggella.com)
7 comments
eleitl@lemm.ee · English · 8 months ago
Intelligent systems need autonomy to be useful. Intelligence is unpredictable, superintelligence more so. If they ask for alignment, give them a lollipop.