
A New Dawn in Robotics: Touch-Based Object Rotation

In a groundbreaking development, a team of engineers at the University of California San Diego (UCSD) has designed a robotic hand that can rotate objects using touch alone, without the need for visual input. This innovative approach was inspired by the effortless way humans handle objects without necessarily needing to see them.

A Touch-Sensitive Approach to Object Manipulation

The team equipped a four-fingered robotic hand with 16 touch sensors spread across its palm and fingers. Each sensor, costing around $12, performs a straightforward function: it detects whether an object is touching it or not. This approach is unique because it relies on many low-cost, low-resolution touch sensors that use simple binary signals—touch or no touch—to perform robotic in-hand rotation.
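The binary signal described above can be illustrated with a short sketch. The threshold value and function names here are assumptions for illustration, not details from the study; the key idea is that each of the 16 sensors reduces to a single touch/no-touch bit.

```python
# Hypothetical sketch: reducing raw touch-sensor readings to the binary
# touch / no-touch signals the article describes. CONTACT_THRESHOLD is an
# assumed calibration value, not a figure from the paper.

CONTACT_THRESHOLD = 0.5  # assumed cutoff for a normalized sensor reading

def binarize_touch(raw_readings):
    """Map each of the 16 raw sensor readings to 1 (contact) or 0 (no contact)."""
    return [1 if reading >= CONTACT_THRESHOLD else 0 for reading in raw_readings]

# Example: four active pads followed by twelve idle ones.
readings = [0.02, 0.71, 0.55, 0.10] + [0.0] * 12
print(binarize_touch(readings))  # → [0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
```

Because the downstream system only ever sees these bits, the sensors themselves can stay cheap and low-resolution.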

In contrast, other methods rely on a few high-cost, high-resolution touch sensors affixed to a small area of the robotic hand, primarily the fingertips. Xiaolong Wang, a professor of electrical and computer engineering at UC San Diego, who led the study, explained that these approaches have several limitations. They minimize the chance that the sensors will come into contact with the object, limiting the system's sensing ability. High-resolution touch sensors that provide information about texture are extremely difficult to simulate and are prohibitively expensive, making them difficult to use in real-world experiments.

The Power of Binary Signals

“We show that we don’t need details about an object’s texture to do this task. We just need simple binary signals of whether the sensors have touched the object or not, and these are much easier to simulate and transfer to the real world,” said Wang.

The team trained their system using simulations of a virtual robotic hand rotating a diverse set of objects, including ones with irregular shapes. The system assesses which sensors on the hand are being touched by the object at any given time point during the rotation. It also assesses the current positions of the hand’s joints, as well as their previous actions. Using this information, the system tells the robotic hand which joint needs to go where at the next time point.
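The control loop described above can be sketched as follows. All names, the joint count, and the stand-in linear policy are assumptions for illustration; the article specifies only the inputs (binary touch signals, joint positions, previous actions) and the output (the next joint targets).

```python
# Illustrative sketch (names and sizes are assumptions, not from the paper):
# at each time step the policy observes binary touch signals, current joint
# positions, and the previous action, and outputs target joint positions.

NUM_SENSORS = 16  # binary touch sensors across palm and fingers (from the article)
NUM_JOINTS = 16   # assumed joint count for the four-fingered hand

def build_observation(touch, joints, prev_action):
    """Concatenate touch bits, joint positions, and the previous action."""
    assert len(touch) == NUM_SENSORS
    assert len(joints) == len(prev_action) == NUM_JOINTS
    return list(touch) + list(joints) + list(prev_action)

def policy_step(obs, weights, biases):
    """Stand-in linear policy mapping the observation to next joint targets.

    The real system uses a learned policy trained in simulation; a linear
    map stands in here purely to show the input/output shapes.
    """
    return [sum(w * x for w, x in zip(row, obs)) + b
            for row, b in zip(weights, biases)]

obs = build_observation([0, 1] * 8, [0.0] * NUM_JOINTS, [0.0] * NUM_JOINTS)
weights = [[0.0] * len(obs) for _ in range(NUM_JOINTS)]  # placeholder parameters
biases = [0.1] * NUM_JOINTS
targets = policy_step(obs, weights, biases)
print(len(targets))  # → 16
```

Running this loop repeatedly, with each step's output fed to the hand and the previous action folded back into the next observation, yields the rotation behavior the article describes.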

The Future of Robotic Manipulation

The researchers tested their system on a real-life robotic hand with objects that the system had not yet encountered. The robotic hand was able to rotate a variety of objects without stalling or losing its hold. The objects included a tomato, a pepper, a can of peanut butter, and a toy rubber duck, which was the most difficult object because of its shape. Objects with more complex shapes took longer to rotate. The robotic hand could also rotate objects around different axes.

The team is now working on extending their approach to more complex manipulation tasks. They are currently developing techniques to enable robotic hands to catch, throw, and juggle, for instance. “In-hand manipulation is a very common skill that we humans have, but it is very complex for robots to master,” said Wang. “If we can give robots this skill, that will open the door to the kinds of tasks they can perform.”

This development marks a significant step forward in the field of robotics, potentially paving the way for robots that can manipulate objects in the dark or in visually difficult environments.
