I picked sentience as the culmination of the definitions of intelligence, awareness, etc., because those definitions end up being circular with one another, and sentience has a concrete definition that society and science widely accept as something that can be demonstrated.
I would argue otherwise. A black box running an algorithm I coded to check the Collatz conjecture for any given number actually has no intelligence whatsoever: it doesn't do anything intelligent, it just runs through a set of steps, completely without awareness that it is doing anything at all. It may seem intelligent to you because you don't understand what it does, but at the end of the day it just runs through instructions.
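To make that concrete, here is a minimal sketch of what such a black box would be doing internally. This is my own illustration, not code from the discussion; the function name collatz_steps and the limit of 10,000 are assumptions for the example, and note that code like this can only verify the conjecture for individual numbers, it cannot prove it in general.

```python
def collatz_steps(n: int) -> int:
    """Count the steps for n to reach 1 under the Collatz rules.

    The function has no awareness of what it is doing: it just applies
    the two rules (halve if even, 3n + 1 if odd) until n reaches 1.
    """
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps


if __name__ == "__main__":
    # Verify the conjecture for every number below an arbitrary limit.
    for n in range(1, 10_000):
        collatz_steps(n)  # would loop forever if a counterexample existed
    print("Collatz holds for all n below 10,000")
```

Nothing in that loop "understands" the conjecture; it mechanically applies the halve-or-3n+1 rule, which is exactly the sense in which the box only runs through instructions.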
I wouldn’t call the snake head responding to a stimulus intelligent, as it is not using any form of thought to react; it’s purely mechanical. In the same way, a program written to solve a problem is mechanical: it doesn’t solve anything itself, it simply runs through, or reacts to, a set of given instructions.