This robot can feel objects by simply seeing them
Researchers at MIT's Computer Science and Artificial Intelligence Lab have developed a robot that can simply look at an object and predict how it 'feels'. The machine uses a novel AI algorithm trained on a mix of tactile and visual data. Ultimately, the researchers believe, this work could make warehouse robots more efficient at handling objects of different types. Here's more.
MIT's machine imagines the feeling of touching an object
The robot developed by MIT researchers - built around a KUKA robotic arm - captures visual data representing an object. It processes this information with a sophisticated AI engine and predicts how it would feel to touch the seen object. But here's the thing: the system also works the other way around, touching an object to predict what it would look like.
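To give a rough idea of what such bidirectional, cross-modal prediction can look like in code, here is a minimal, illustrative PyTorch sketch of an encoder-decoder that maps one sensory modality to the other. The class and variable names (CrossModalPredictor, vision_to_touch, touch_to_vision), layer sizes, and architecture are assumptions for illustration only - not MIT's actual model, which is reported to use a GAN-style image-to-image approach.

```python
# Illustrative sketch only: a minimal cross-modal encoder-decoder in PyTorch.
# Names, sizes, and architecture are assumptions, not the published MIT system.
import torch
import torch.nn as nn

class CrossModalPredictor(nn.Module):
    """Maps one modality (e.g. an RGB patch of the contact region) to the other (e.g. a tactile map)."""
    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=4, stride=2, padding=1),  # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2, padding=1),           # 32 -> 16
            nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1),            # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, out_channels, kernel_size=4, stride=2, padding=1),  # 32 -> 64
            nn.Tanh(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

# Two directions, each trained on paired data: sight -> touch and touch -> sight.
vision_to_touch = CrossModalPredictor(in_channels=3, out_channels=3)
touch_to_vision = CrossModalPredictor(in_channels=3, out_channels=3)

rgb_patch = torch.randn(1, 3, 64, 64)           # camera view of the contact region
predicted_tactile = vision_to_touch(rgb_patch)  # 'imagined' tactile reading of the same spot
```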
So, it predicts feel from sight, and vice versa
"By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge," project lead Yunzhu Li said. "By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings."
But how does all of this happen?
The algorithm achieves this by learning the connection between the tactile and visual data collected by the researchers. First, the team fitted the robotic arm with a special tactile sensor called GelSight and made it touch 200 household objects 12,000 times. Then, the tactile data captured by the robot and the visual data representing the objects were fed into the AI algorithm.
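To make that training setup concrete, here is a small, hedged sketch of how such paired visual/tactile recordings could be wrapped into a supervised training loop. The dataset class, data layout, toy model, and L1 reconstruction loss are assumptions for illustration; the published system's dataset format and training objective may differ.

```python
# Illustrative sketch only: pairing camera frames with GelSight-style tactile frames
# for supervised training. Layout and loss are assumptions, not the published setup.
import torch
import torch.nn as nn
from torch.utils.data import Dataset, DataLoader

class VisualTactilePairs(Dataset):
    """Each sample couples a camera image of a touch with the tactile image it produced."""
    def __init__(self, pairs):
        # pairs: list of (visual_tensor, tactile_tensor) gathered during the ~12,000 touches
        self.pairs = pairs

    def __len__(self):
        return len(self.pairs)

    def __getitem__(self, idx):
        visual, tactile = self.pairs[idx]
        return visual, tactile

# Toy stand-in data: 8 random visual/tactile pairs at 64x64 resolution.
fake_pairs = [(torch.randn(3, 64, 64), torch.randn(3, 64, 64)) for _ in range(8)]
loader = DataLoader(VisualTactilePairs(fake_pairs), batch_size=4, shuffle=True)

# Tiny stand-in model that keeps the spatial resolution of its input.
model = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(32, 3, kernel_size=3, padding=1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for visual, tactile in loader:
    predicted = model(visual)                                 # predict touch from sight
    loss = nn.functional.l1_loss(predicted, tactile)          # pixel-wise reconstruction loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```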
This could make robots efficient in real-world settings
As of now, the AI-powered robotic arm works only in controlled environments. However, the team hopes to expand the system's capabilities by training it on a wider set of data. They hope the AI will eventually be able to apply these capabilities in real-world environments - like a warehouse - differentiating between different types of objects and handling them appropriately.