Meta AI researchers equip robots with a sense of touch, creating unsettling sensations.

Touch the future.

Meta has advanced the capabilities of robots by adding a new dimension to their skills: machines that can not only "see" and "hear" but also "feel," thanks to tools that mimic the sense of touch. The team at Meta's Fundamental AI Research (FAIR) division has released technologies that let robots identify and react to the surfaces and objects they touch, which could be essential for handling delicate items without damaging them.

As part of this effort, Meta has introduced a tactile perception technology called Sparsh, which allows artificial intelligence to recognize features such as pressure, texture, and object movement without requiring large labeled datasets. The analogy offered is that, just as a person can feel and describe an object in the dark, robots will be able to do something similar.

To give the AI system data about physical contact, Meta partnered with GelSight to develop Digit 360, a sensor that acts as a robotic "finger." The device is equipped with highly sensitive sensors that let the AI capture precise information about the objects it touches and adjust the applied pressure to the task at hand, such as lifting or rotating an object.
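The behavior described, sensing contact pressure and adjusting grip force to match the task, is essentially a feedback control loop. The minimal sketch below illustrates the idea with a simple proportional controller; the function names, the toy sensor model, and all numeric values are hypothetical and are not the Digit 360 API.

```python
# Hypothetical sketch of tactile feedback control: nudge a gripper's force
# toward a target contact pressure reported by a tactile sensor.
# All names and values here are illustrative, not Meta's actual interface.

def adjust_grip(current_pressure: float, target_pressure: float,
                grip_force: float, gain: float = 0.5) -> float:
    """Return an updated grip force based on the pressure error."""
    error = target_pressure - current_pressure
    # Increase force if we press too lightly, relax it if too hard;
    # never command a negative force.
    return max(0.0, grip_force + gain * error)

# Simulated control loop with a toy sensor model (pressure ~ applied force).
force = 0.0
for _ in range(20):
    pressure = 0.8 * force  # toy stand-in for a tactile sensor reading
    force = adjust_grip(pressure, target_pressure=1.0, grip_force=force)

print(round(0.8 * force, 3))  # contact pressure settles near the target
```

In practice a real controller would fold in the richer signals Sparsh extracts (texture, slip, shear) rather than a single scalar pressure, but the sense-compare-adjust loop is the core pattern.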

Additionally, to equip the rest of the robotic hand, Meta created Plexus in collaboration with Wonik Robotics, integrating multiple tactile sensors throughout the device. The goal is for robots to handle fragile or intricately shaped objects with human-like dexterity.

Meta emphasizes that coordination between tactile perception and motor action is crucial for the development of embodied artificial intelligence. The potential applications of this technology are vast; they could include robotic surgical assistants capable of detecting subtle variations in the human body and reacting rapidly with precise movements, as well as improving the manufacturing of delicate devices and optimizing the teamwork of multiple robotic hands.

The ability of robots to feel could also transform virtual experiences, making digital environments feel more real by translating physical sensations into their virtual counterparts.

This approach of mimicking human senses is not limited to touch. Research at Penn State has demonstrated that artificial intelligence models can simulate the sense of taste, detecting subtle differences in flavors, while the company Osmo has managed to replicate the sense of smell in machines, even creating scents from chemical combinations without human intervention.