• Scubus@sh.itjust.works · 3 days ago

Yeah, I definitely get that. I suspect there will soon be techniques for sanitizing training data, although that just makes unethical capture easier. And assuming the final goal is sentience, I'm not entirely sure it's unethical to train on other people's data as long as you control for overfitting. The reasoning being that humans do the exact same thing: we train on every piece of media we've ever seen and use it to inspire "new" forms of media. Humans don't tend to have original thoughts, we just rehash what we've heard. So every time you see a piece of media, you quite literally steal it mentally. It's clearly a different argument with modern AI, I'm not claiming it does the same thing. But its main issue here seems to be overfitting: too much of its inspiration can be seen directly, and sometimes it comes off as simply copying an image that was in its training data (rough sketch of what I mean below). That's not inspiration, that's plagiarism.
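To be concrete about the "copying from training data" part, here's a minimal sketch of one way you could flag that. Everything in it is hypothetical and made up for illustration (the `memorization_score` function, the random stand-in embeddings, the 0.95 threshold); it's not how any real model vendor audits outputs. The idea is just: if a generated output sits suspiciously close to one specific training example in embedding space, that's memorization rather than "inspiration".

```python
# Hypothetical sketch: flag a generated sample as likely memorized when it
# is nearly identical to some single training example in embedding space.
import numpy as np

def memorization_score(generated: np.ndarray, training: np.ndarray) -> float:
    """Highest cosine similarity between one generated embedding and
    every training embedding (1.0 means an exact copy)."""
    g = generated / np.linalg.norm(generated)
    t = training / np.linalg.norm(training, axis=1, keepdims=True)
    return float(np.max(t @ g))

rng = np.random.default_rng(0)
# Stand-ins for real image embeddings -- random vectors for the example.
train_embeddings = rng.normal(size=(10_000, 512))
# A generated sample that is essentially a near-copy of training item #42.
sample = train_embeddings[42] + rng.normal(scale=0.01, size=512)

score = memorization_score(sample, train_embeddings)
if score > 0.95:  # arbitrary cutoff for the sketch; a real audit would calibrate this
    print(f"likely memorized (similarity {score:.3f}) -- plagiarism, not inspiration")
```

The interesting design question is where you set that threshold: too strict and you only catch pixel-level copies, too loose and you start calling ordinary stylistic influence "plagiarism", which is basically the same line we argue about with humans.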

And yeah, I tend to assume we're going to kill off capitalism, because if we don't, this discussion isn't going to matter anyway.