The next time you’re scrolling through your phone, take a moment to appreciate the feat: The seemingly mundane act is possible because of the coordination of 34 muscles, 27 joints, and over 100 tendons and ligaments in your hand. Indeed, our hands are perhaps the most nimble parts of our bodies. Mimicking their many nuanced gestures has been a longstanding challenge in robotics and virtual reality.
Now, MIT engineers have designed an ultrasound wristband that precisely tracks a wearer’s hand movements in real time. The wristband produces ultrasound images of the wrist’s muscles, tendons, and ligaments as the hand moves, and is paired with an artificial intelligence algorithm that continuously translates the images into the corresponding positions of the five fingers and palm.
The researchers can train the wristband to learn a wearer’s hand motions, which the device can communicate in real time to a robot or a virtual environment.
In demonstrations, the team has shown that a person wearing the wristband can wirelessly control a robotic hand. As the person gestures or points, the robot does the same. In a form of wireless marionette interaction, the wearer can manipulate the robot to play a simple tune on the piano and shoot a small basketball into a desktop hoop. With the same wristband, a wearer can also manipulate objects on a computer screen, for instance pinching their fingers together to enlarge or shrink a virtual object.
The team is using the wristband to collect hand motion data from many more users with different hand sizes, finger shapes, and gestures. They envision building a large dataset of hand motions that can be mined, for instance, to train humanoid robots in dexterity tasks, such as performing certain surgical procedures. The ultrasound band could also be used to grasp, manipulate, and interact with objects in video games, design applications, or other virtual settings.
“We expect this work has immediate impact in potentially replacing hand tracking techniques with wearable ultrasound bands in virtual and augmented reality,” says Xuanhe Zhao, the Uncas and Helen Whitaker Professor of Mechanical Engineering at MIT. “It could also provide huge amounts of training data for dexterous humanoid robots.”
Zhao, Gengxi Lu, and their colleagues present the wristband’s new design in a paper appearing today. Their MIT co-authors are former postdocs Xiaoyu Chen, Shucong Li, and Bolei Deng; graduate students SeongHyeon Kim and Dian Li; postdocs Shu Wang and Runze Li; and Anantha Chandrakasan, MIT provost and the Vannevar Bush Professor of Electrical Engineering and Computer Science. Other co-authors are graduate students Yushun Zheng and Junhang Zhang, Baoqiang Liu, Chen Gong, and Professor Qifa Zhou from the University of Southern California.
Seeing strings
There are currently a number of approaches to capturing and mimicking human hand dexterity in robots. Some approaches use cameras to record a person’s hand movements as they manipulate objects or perform tasks. Others involve having a person wear a glove with sensors, which records the person’s hand movements and transmits the data to a receiving robot. But setting up a complex camera system for various applications is impractical and prone to visual obstructions. And sensor-laden gloves can limit a person’s natural hand motions and sensations.
A third approach uses the electrical signals from muscles in the wrist or forearm, which scientists then correlate with specific hand movements. Researchers have made significant advances with this approach, but the signals are easily affected by noise in the environment. They are also not sensitive enough to distinguish subtle changes in movement. For example, they might discern whether a thumb and index finger are pinched together or pulled apart, but not much of the in-between path.
Zhao’s team wondered whether ultrasound imaging might capture more dexterous and continuous hand movements. His group has been developing various types of ultrasound stickers — miniaturized versions of the transducers used in doctors’ offices, paired with a hydrogel material that can safely stick to skin.
In their new study, the team incorporated the ultrasound sticker design into a wearable wristband to continuously image the muscles and tendons in the wrist.
“The tendons and muscles in your wrist are like strings pulling on puppets, which are your fingers,” Lu says. “So the idea is: Every time you take an image of the state of the strings, you’ll know the state of the hand.”
Mapping manipulation
The team designed a wristband with an ultrasound sticker that is the size of a smartwatch, and added onboard electronics that are about as small as a cellphone. They attached the wristband to a volunteer’s wrist and confirmed that the device produced clear and continuous images of the wrist as the volunteer moved their fingers in various gestures.
The challenge then was to relate the black-and-white ultrasound images of the wrist to specific positions of the hand. As it turns out, the fingers and thumb are capable of 22 degrees of freedom, or different ways of extending and angling. The researchers found that they could identify specific regions of their ultrasound images of the wrist that correlate to each of those 22 degrees of freedom. For example, changes in one region relate to thumb extension, while changes in another region correlate with movements of the index finger.
To establish these connections, a volunteer wearing the wristband would move their hand into various positions while the researchers recorded the gestures with multiple cameras surrounding the volunteer. By matching changes in certain regions of the ultrasound images with the hand positions recorded by the cameras, the team could label wrist image regions with the corresponding degree of freedom in the hand. But doing this translation continuously, and in real time, would be an impossible task for humans.
So, the team turned to artificial intelligence. They used an AI algorithm that can be trained to recognize image patterns and correlate them with specific labels — in this case, the hand’s various degrees of freedom. The researchers trained the algorithm on ultrasound images that they meticulously labeled, annotating the image regions related to a specific degree of freedom. They then tested the algorithm on a new set of ultrasound images and found that it accurately predicted the corresponding hand gestures.
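The article does not include code, but the pipeline it describes — labeled ultrasound frames in, 22 joint values out — is a standard supervised regression setup. The sketch below illustrates only that input/output contract, with invented sizes and synthetic data standing in for real ultrasound frames and camera-derived labels; the actual system uses a trained deep-learning model, not the simple linear fit shown here.

```python
import numpy as np

rng = np.random.default_rng(0)

N_PIXELS = 16 * 16   # flattened ultrasound frame (illustrative size)
N_DOF = 22           # degrees of freedom of the fingers and thumb

# Synthetic stand-in for the labeled training set: each frame is paired
# with 22 joint values, as recorded by the surrounding cameras.
true_map = rng.normal(size=(N_PIXELS, N_DOF))
frames = rng.normal(size=(500, N_PIXELS))
poses = frames @ true_map  # labels (exactly linear here, by construction)

# Fit a linear model frame -> pose by least squares; the real system uses
# a deep network, but the learning problem has the same shape.
weights, *_ = np.linalg.lstsq(frames, poses, rcond=None)

# "Real-time" inference: translate a new frame into a hand pose.
new_frame = rng.normal(size=(1, N_PIXELS))
predicted_pose = new_frame @ weights
print(predicted_pose.shape)  # (1, 22): one value per degree of freedom
```

Because the synthetic labels are exactly linear in the pixels, the least-squares fit recovers the mapping; real ultrasound-to-pose mappings are nonlinear, which is why a neural network is needed.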
Once the researchers successfully paired the AI algorithm with the wristband, they tested the device on more volunteers. For the new study, eight volunteers with different hand and wrist sizes wore the wristband while they formed various hand gestures and grasps, including making the signs for all 26 letters in American Sign Language. They also held objects such as a tennis ball, a plastic bottle, a pair of scissors, and a pencil. In each case, the wristband precisely tracked and predicted the position of the hand.
To demonstrate potential applications, the team developed a simple computer program that they wirelessly paired with the wristband. As a wearer went through the motions of pinching and grasping, the gestures corresponded to zooming in and out on an object on the computer screen, and virtually moving and manipulating it in a smooth and continuous fashion.
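The smooth, continuous control described above — as opposed to the binary pinched/not-pinched output of muscle-signal methods — can be illustrated with a toy mapping from tracked pinch distance to zoom factor. All the numbers here (distance range, zoom limits) are invented for the example and are not from the study.

```python
def zoom_factor(pinch_dist_cm: float,
                min_dist: float = 1.0,
                max_dist: float = 8.0) -> float:
    """Map a tracked thumb-index fingertip distance to a zoom factor.

    A fully closed pinch (<= min_dist) zooms out to 0.5x; a fully open
    one (>= max_dist) zooms in to 2.0x; in between, the factor varies
    smoothly, giving continuous rather than on/off control.
    """
    # Clamp to the usable range, then interpolate linearly.
    d = min(max(pinch_dist_cm, min_dist), max_dist)
    t = (d - min_dist) / (max_dist - min_dist)
    return 0.5 + t * (2.0 - 0.5)

print(zoom_factor(1.0))  # fully pinched -> 0.5
print(zoom_factor(8.0))  # fully open   -> 2.0
```

The point of the sketch is the in-between path: because the wristband reports a continuous pose, every intermediate pinch distance maps to an intermediate zoom level.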
The researchers also tested the wristband as a wireless controller for a simple industrial robotic hand. While wearing the wristband, a volunteer went through the motions of playing a keyboard. The robot in turn mimicked the motions in real time to play a simple tune on a piano. The same robot was also able to mimic a person’s finger taps to play a desktop basketball game.
Zhao is planning to further miniaturize the wristband’s hardware, as well as to train the AI software on many more gestures and movements from volunteers with a wider range of hand shapes and sizes. Ultimately, the team is building toward a wearable hand tracker that can be worn by anyone to wirelessly manipulate humanoid robots or virtual objects with high dexterity.
“We believe this is the most advanced way to track dexterous hand motion, through wearable imaging of the wrist,” Zhao says. “We expect these wearable ultrasound bands can provide intuitive and versatile controls for virtual reality and robotic hands.”
This research was supported, in part, by MIT, the U.S. National Institutes of Health, the U.S. National Science Foundation, the U.S. Department of Defense, and the Singapore National Research Foundation through the Singapore-MIT Alliance for Research and Technology.
