It seems unlikely that anyone could make that claim with any real confidence until we have quantum computers in our pockets. The real issue with scaling is the binary digital computing paradigm, not any inherent limit on AI/neural-network intelligence. In fact, it depends heavily on how you define "intelligence": my own research toward a unified "theory of everything" suggests that human intelligence is fundamentally repetitive imitation (mimicry). In that sense it is no different from AI learning algorithms, just more advanced: we have far more neurons than any AI model has parameters.