Because training data is just data weighted for accuracy.

It's like holding two balls, one vibrant blue and the other navy. You ask it to pick the blue ball and it consistently goes for the vibrant one, because that one most accurately matches "blue."
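A minimal sketch of that analogy (the scores are made-up illustrative numbers, not real model outputs): the system just scores each option against the prompt and always returns the highest-scoring one, with no judgment involved.

```python
def pick_ball(options, scores):
    # Deterministically return whichever option scored highest for the prompt.
    return max(options, key=lambda o: scores[o])

# Hypothetical similarity scores for the prompt "pick the blue ball":
scores = {"vibrant blue": 0.92, "navy": 0.61}

print(pick_ball(["vibrant blue", "navy"], scores))  # always "vibrant blue"
```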
Sure, it's some aspect of AI. But hopefully we don't go down the path of assuming it's sentient or capable of decision-making on the basis of emotional intelligence.