• @EspiritdescaliMA
    27 months ago

    I don’t think LLMs will lead to AGI, but at some point a system will be implemented that does lead to AGI, and it will be unexpected.

    This seems to be the worst-case scenario, as the AGI will likely be clever enough to ensure humans don’t realise it’s AGI. There are network effects and complexity here that we don’t fully understand. The net result is the same: we lose the control problem. Badly.