• General_Effort@lemmy.world
    8 months ago

    In theory there’s an inflection point at which models become sophisticated enough that they can self-sustain by generating their own training data to recursively improve.

    That sounds surprising. Do you have a source?