
Meta introduces touch technology for robots

November 3, 2024

Meta, known for its work in artificial intelligence, has introduced a new set of tools that will allow robots to experience the world through touch. The technology could greatly expand what robots can do and make them more useful in a variety of situations, including handling and interacting with fragile objects.


Sparsh, the touch-perception model behind the system, gives AI a way to recognize pressure, texture, and motion without needing a large labeled dataset. This is similar to how a person can feel around in the dark and describe sensations without knowing exactly what they are touching.
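
To make the idea concrete, here is a minimal sketch (not Meta's actual Sparsh code) of the general pattern such a system follows: a pretrained touch encoder turns raw sensor frames into general-purpose features, and small readout heads then estimate properties like pressure or slip. All module names, shapes, and the random "tactile frame" input below are illustrative assumptions, not Sparsh's real interface.

    # Illustrative sketch only: pretrained touch encoder + small task heads.
    import torch
    import torch.nn as nn

    class TouchEncoder(nn.Module):
        """Stand-in for a tactile encoder trained on unlabeled sensor frames,
        producing a general-purpose feature vector."""
        def __init__(self, feat_dim: int = 128):
            super().__init__()
            self.backbone = nn.Sequential(
                nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(32, feat_dim),
            )

        def forward(self, tactile_frame: torch.Tensor) -> torch.Tensor:
            return self.backbone(tactile_frame)

    # Lightweight heads reuse the same features for different touch properties.
    encoder = TouchEncoder()
    pressure_head = nn.Linear(128, 1)   # estimated contact force
    slip_head = nn.Linear(128, 2)       # slip / no-slip logits

    frame = torch.rand(1, 3, 64, 64)    # fake camera-based tactile frame
    features = encoder(frame)
    print("pressure estimate:", pressure_head(features).item())
    print("slip logits:", slip_head(features).tolist())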

To feed touch information into the AI model, Meta collaborated with GelSight to create the Digit 360 robotic fingertip. Digit 360's sensors are extremely sensitive, allowing the system not only to detect fine details of an object but also to adjust the pressure it applies while performing an action, such as lifting or rotating the object.
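
The kind of pressure adjustment described here can be pictured as a simple feedback loop: read the fingertip, compare against a target pressure, and tighten or relax the grip accordingly. The toy controller below is an invented illustration of that idea, not Digit 360's actual API; the sensor model and gain values are assumptions made for the example.

    # Illustrative sketch only: a toy proportional grip controller.
    import random

    def read_pressure(grip_force: float) -> float:
        """Fake sensor: measured pressure roughly tracks applied force, with noise."""
        return grip_force * 0.9 + random.uniform(-0.02, 0.02)

    def adjust_grip(target_pressure: float, steps: int = 20, gain: float = 0.5) -> float:
        """Increase or relax grip force until the fingertip reads the target pressure."""
        grip_force = 0.0
        for _ in range(steps):
            error = target_pressure - read_pressure(grip_force)
            grip_force += gain * error            # push harder or ease off
            grip_force = max(grip_force, 0.0)     # cannot apply negative force
        return grip_force

    print("force needed for a gentle 1.0-unit grip:", round(adjust_grip(1.0), 3))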

Beyond the fingertip, Meta partnered with Wonik Robotics to create the Plexus system, which distributes multiple touch sensors across the device. Meta claims that Plexus can mimic human touch well enough to work with fragile or awkwardly shaped objects.
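
A rough way to picture what distributing sensors buys you: instead of isolated fingertip readings, the controller sees one coherent tactile snapshot of the whole hand. The sketch below is a toy illustration of that aggregation step, not Plexus's real software; the sensor locations and threshold are invented for the example.

    # Illustrative sketch only: pooling many distributed touch readings
    # into a single snapshot a controller can act on.
    from dataclasses import dataclass

    @dataclass
    class TouchReading:
        location: str      # e.g. "index_tip", "palm_center" (names invented here)
        pressure: float    # normalized 0..1

    def snapshot(readings: list[TouchReading]) -> dict:
        """Summarize the whole hand: where contact is happening and how hard."""
        contacts = [r for r in readings if r.pressure > 0.05]
        return {
            "contact_points": [r.location for r in contacts],
            "max_pressure": max((r.pressure for r in contacts), default=0.0),
        }

    hand = [
        TouchReading("index_tip", 0.42),
        TouchReading("thumb_tip", 0.38),
        TouchReading("palm_center", 0.01),   # barely touching
    ]
    print(snapshot(hand))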

"The human hand is remarkably good at conveying tactile information to the brain through the skin, from the fingertips to the palm. This allows the hand muscles to be activated when making decisions, for example about how to type on a keyboard or how to interact with an object that is too hot," the developers explained in their blog. "Achieving embodied artificial intelligence requires similar coordination between tactile sensing and motor actuation in a robot hand."

This technology could have many applications, including robotic surgical assistants that sense small changes in the body and respond quickly with precise movements that match or exceed human responses. It could also be useful for manufacturing delicate devices without breaking them and for improving coordination between multiple robotic arms. In addition, it could make virtual experiences more realistic by conveying how virtual objects and environments should feel.


Using AI to simulate touch isn't the only human sense being recreated for machines. Researchers at the University of Pennsylvania recently showed that models paired with an electronic tongue can mimic the sense of taste well enough to recognize subtle differences in flavor. Osmo, meanwhile, trained a model to simulate a sense of smell that in some respects outperforms a human's. The company demonstrated how AI can analyze a scent accurately enough to recreate it from scratch, selecting and combining chemicals without human intervention.

Source: Port Altele
