- cross-posted to:
- technology@lemmy.world
- artificial_intel@lemmy.ml
Since it’s been a controversy on here a couple of times, here’s a great example of how you can demonstrate that an LLM can produce something it can’t have seen before.
It doesn’t prove anything beyond doubt, but I think these kinds of experiments show, to something like a civil-law standard of proof, that they’re not merely parrots.
I think we can all agree that LLMs are just simple machines responding to stimuli and producing what we want to hear, but apparently we’re not ready for the conversation about whether complex machines made out of proteins are any different?
The human exceptionalism runs strong. At best, you can argue that this specific machine isn’t thinking. Once you say no machine could do it, you’re forgetting that you too are stardust.
I get accused of falling for the ELIZA effect, but I suspect some of the critics are driven by something like hubris.