• Rhaedas@kbin.social
    11 months ago (edited)

    For decades, both science and sci-fi have been predicting that we won’t be ready for the emergence of AGI/ASI. That still holds true, even as the potential grows. If we’re really lucky, AGI isn’t possible at all, but I think merely powerful AI tools like LLMs will end up being just as dangerous through their misuse by power seekers and profiteers. We saw this coming, and even though the people actually building these systems are talking about the dangers, we’re barreling forward without a care.