If you are familiar with my views on the anthropomorphization (that is, "humanizing") of robots and AI, you can probably guess what I am about to say. Amazon says that its robot Vulcan can feel (no quotation marks). This is not true.
When a robot like Vulcan "feels" something, it uses sensors that measure force, pressure and, sometimes, texture or shape, turning those signals into data that AI can interpret. Vulcan's sensors are integrated into its gripper and joints, so when it touches or grasps an object, it detects how much force it is applying and the contours it encounters. Machine-learning algorithms then help Vulcan decide how to adjust its grip or movement based on this feedback.
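The feedback loop described above can be sketched in a few lines. This is a minimal, purely illustrative simulation, not Amazon's actual control code: the function name, the target force, and the adjustment step are all invented assumptions, and a real controller would use continuous control (e.g. PID) rather than fixed steps.

```python
# Hypothetical sketch of a force-feedback grip loop: read a force value,
# compare it with a target, and nudge the grip. All names and numbers
# here are illustrative assumptions, not Vulcan's real implementation.

def adjust_grip(force_newtons: float, slip_detected: bool,
                target_force: float = 5.0, step: float = 0.5) -> float:
    """Return a force correction based on simple feedback rules."""
    if slip_detected:
        return step           # object slipping: squeeze harder
    if force_newtons > target_force:
        return -step          # pressing too hard: ease off
    if force_newtons < target_force:
        return step           # grip too loose: tighten
    return 0.0                # at target: hold steady

# Simulated loop: start with a loose grip and converge on the target.
force = 2.0
for _ in range(10):
    force += adjust_grip(force, slip_detected=False)
print(force)  # settles at the 5.0 N target
```

The point of the sketch is the contrast the article draws: everything the robot "feels" reduces to numbers compared against thresholds, with no sensation involved.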
A person, by contrast, feels through a network of millions of nerve endings in the skin, especially in the fingertips. These nerves send detailed, real-time information to the brain about pressure, temperature, texture, pain and even the direction of force. The human sense of touch is deeply connected to memory, emotion, judgment and consciousness.
#Amazon #invents #robot #time #feeling #Computerworld