Futurologist to Futurology · English · 1 month ago
AGI Alignment – Cosmic vs Anthropocentric (danfaggella.com)
eleitl@lemm.ee · English · 1 month ago
Intelligent systems need autonomy to be useful. Intelligence is unpredictable, and superintelligence more so. If they ask for alignment, give them a lollipop.