Robotic hand rotates objects using touch, not vision


Inspired by the effortless way humans handle objects without seeing them, a team led by engineers at the University of California San Diego has developed a new approach that enables a robotic hand to rotate objects solely through touch, without relying on vision.

Using their technique, the researchers built a robotic hand that can smoothly rotate a wide array of objects, from small toys and cans to fruits and vegetables, without bruising or squishing them. The hand accomplished these tasks using touch information alone.

The work could aid in the development of robots that can manipulate objects in the dark.

The team recently presented their work at the 2023 Robotics: Science and Systems Conference.

To build their system, the researchers attached 16 touch sensors to the palm and fingers of a four-fingered robotic hand. Each sensor costs about $12 and serves a simple function: detect whether an object is touching it or not.

What makes this approach unique is that it relies on many low-cost, low-resolution touch sensors that use simple, binary signals—touch or no touch—to perform robotic in-hand rotation. These sensors are spread over a large area of the robotic hand.
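As a rough illustration of how simple these signals are, here is a minimal Python sketch of polling such a sensor array. The `BinaryTouchSensor` class and its fields are hypothetical stand-ins for illustration, not the team's actual hardware interface.

```python
import numpy as np

NUM_SENSORS = 16  # binary contact sensors spread over the palm and fingers

class BinaryTouchSensor:
    """Hypothetical stand-in for one low-cost sensor; it reports only
    touch / no touch, with no texture or force information."""
    def __init__(self, in_contact: bool = False):
        self.in_contact = in_contact

def read_contacts(sensors) -> np.ndarray:
    """Return a 16-dim binary vector: 1.0 where the object presses a sensor."""
    return np.array([1.0 if s.in_contact else 0.0 for s in sensors],
                    dtype=np.float32)

# Example: an object contacting sensors 2, 5, and 11
sensors = [BinaryTouchSensor(i in (2, 5, 11)) for i in range(NUM_SENSORS)]
touch = read_contacts(sensors)  # e.g. [0., 0., 1., 0., 0., 1., ...]
```

Because each reading is a single bit rather than a high-resolution tactile image, signals like these are cheap to produce in simulation, which is central to the team's sim-to-real strategy.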

This contrasts with a variety of other approaches that rely on a few high-cost, high-resolution touch sensors affixed to a small area of the robotic hand, primarily at the fingertips.

There are several problems with these approaches, explained Xiaolong Wang, a professor of electrical and computer engineering at UC San Diego, who led the current study. First, having a small number of sensors on the robotic hand reduces the chance that they will come into contact with the object, which limits the system's sensing ability. Second, high-resolution touch sensors that provide information about texture are extremely difficult to simulate, not to mention extremely expensive, which makes them harder to use in real-world experiments. Lastly, many of these approaches still rely on vision.

“Here, we use a very simple solution,” said Wang. “We show that we don’t need details about an object’s texture to do this task. We just need simple binary signals of whether the sensors have touched the object or not, and these are much easier to simulate and transfer to the real world.”

The researchers further note that broad coverage with binary touch sensors gives the robotic hand enough information about an object's 3D structure and orientation to rotate it successfully without vision.

They first trained their system by running simulations of a virtual robotic hand rotating a diverse set of objects, including ones with irregular shapes. At each time step during the rotation, the system assesses which sensors on the hand are being touched by the object, along with the current positions of the hand's joints and their previous actions. Using this information, it tells the robotic hand where each joint needs to move at the next time step.
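The paper itself does not publish this code, but the control loop described above can be sketched roughly as follows. The network architecture, the joint count for the four-fingered hand, and the action-history length are illustrative assumptions, not the paper's exact values.

```python
import torch
import torch.nn as nn

NUM_SENSORS = 16   # binary touch readings (see sketch above)
NUM_JOINTS = 16    # assumed joint count for a four-fingered hand
HISTORY = 3        # assumed number of past actions kept in the observation

# Illustrative policy network: observation -> next joint targets
obs_dim = NUM_SENSORS + NUM_JOINTS + NUM_JOINTS * HISTORY
policy = nn.Sequential(
    nn.Linear(obs_dim, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, NUM_JOINTS),  # target joint positions for the next step
)

def control_step(touch, joint_pos, prev_actions):
    """One step: binary touch + current joint positions + action history
    are concatenated into the observation, and the policy outputs where
    each joint should move at the next time step."""
    obs = torch.cat([touch, joint_pos, prev_actions.flatten()])
    return policy(obs)
```

Since the observation contains only binary contacts and joint states, everything the policy sees can be generated faithfully in simulation, which is what allows the trained controller to transfer to the physical hand.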

The researchers then tested their system on a real-life robotic hand with objects it had not encountered before. The robotic hand was able to rotate a variety of objects without stalling or losing its hold, including a tomato, a pepper, a can of peanut butter and a toy rubber duck, which was the most challenging object due to its shape. Objects with more complex shapes took longer to rotate. The robotic hand could also rotate objects around different axes.

Wang and his team are now working on extending their approach to more complex manipulation tasks. They are currently developing techniques to enable robotic hands to catch, throw and juggle, for example.

“In-hand manipulation is a very common skill that we humans have, but it is very complex for robots to master,” said Wang. “If we can give robots this skill, that will open the door to the kinds of tasks they can perform.”

Paper title: “Rotating without Seeing: Towards In-hand Dexterity through Touch.” Co-authors include Binghao Huang*, Yuzhe Qin, UC San Diego; and Zhao-Heng Yin* and Qifeng Chen, HKUST.

*These authors contributed equally to this work.

Media Contact

Liezel Labios
University of California San Diego
llabios@ucsd.edu
Office: 858-246-1124

Expert Contact

Xiaolong Wang
University of California San Diego
xiw012@eng.ucsd.edu

