• Red_October@piefed.world · 15 days ago

    Yeah, no shit? LLMs don’t actually know or understand anything; the fact that a scientific paper was retracted means nothing to them, because all the training cares about is the patterns of word usage. It doesn’t matter that some core part of the paper was wrong: it was SAID, and it is presumed to be grammatically correct.

    This is a core reason why using LLMs as a search engine is fucking stupid: they don’t filter for what’s accurate, only for what has been said before.
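
    A toy sketch of that “patterns of word usage” point, with a hypothetical mini-corpus (real LLMs are vastly bigger and predict tokens with neural nets, but the training objective is the same kind of co-occurrence statistics, with no field anywhere for “retracted” or “true”):

        from collections import Counter, defaultdict

        # Hypothetical corpus: a claim repeated twice before
        # retraction, plus one correction after it.
        corpus = [
            "the drug cured the disease",
            "the drug cured the disease",
            "the drug did not cure the disease",
        ]

        # "Training" is just counting which word follows which.
        counts = defaultdict(Counter)
        for sentence in corpus:
            words = sentence.split()
            for prev, nxt in zip(words, words[1:]):
                counts[prev][nxt] += 1

        def next_word_probs(prev):
            total = sum(counts[prev].values())
            return {w: round(c / total, 2) for w, c in counts[prev].items()}

        # The retracted phrasing appeared twice, so it still wins 2-to-1;
        # nothing in the counts records that it was withdrawn.
        print(next_word_probs("drug"))  # {'cured': 0.67, 'did': 0.33}

    The only way the correction “wins” in a setup like this is if it gets repeated more often than the original claim.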

• Kairos@lemmy.today · 16 days ago

    Once again, the statistical word parrot does not have any understanding of what it’s doing or what it’s saying, nor any ability to “learn” new things.