
Human–Machine Interaction AI Wearable Lets You Control Machines With Gestures

Source: University of California San Diego | Reading time: 3 min


A new AI-powered wearable overcomes one of the biggest barriers in gesture tech: motion noise. The system can accurately control machines during running, vibrations and even turbulent ocean conditions, marking a major leap for human-machine interfaces.

Wearable technology uses everyday gestures to reliably control robotic devices even under excessive motion noise, such as when the user is running, riding in a vehicle or in environments with turbulence. (Source: David Baillot / UC San Diego Jacobs School of Engineering)

Engineers at the University of California San Diego have developed a next-generation wearable system that enables people to control machines using everyday gestures — even while running, riding in a car or floating on turbulent ocean waves. The system, described in a study published Nov. 17 in Nature Sensors, combines stretchable electronics with artificial intelligence to overcome a long-standing challenge in wearable technology: reliable recognition of gesture signals in real-world environments.

Wearable technologies with gesture sensors work well when a user is sitting still, but their signals start to fall apart under excessive motion noise, explained study co-first author Xiangjun Chen, a postdoctoral researcher in the Aiiso Yufeng Li Family Department of Chemical and Nano Engineering at the UC San Diego Jacobs School of Engineering. This limits their practicality in daily life. “Our system overcomes this limitation,” Chen said. “By integrating AI to clean noisy sensor data in real time, the technology enables everyday gestures to reliably control machines even in highly dynamic environments.”


The technology could enable patients in rehabilitation or individuals with limited mobility, for example, to use natural gestures to control robotic aids without relying on fine motor skills. Industrial workers and first responders could potentially use the technology for hands-free control of tools and robots in high-motion or hazardous environments. It could even enable divers and remote operators to command underwater robots despite turbulent conditions. In consumer devices, the system could make gesture-based controls more reliable in everyday settings.

The work was a collaboration between the labs of Sheng Xu and Joseph Wang, both professors in the Aiiso Yufeng Li Family Department of Chemical and Nano Engineering at the UC San Diego Jacobs School of Engineering.

To the researchers’ knowledge, this is the first wearable human-machine interface that works reliably across a wide range of motion disturbances. As a result, it can work with the way people actually move.

The device is a soft electronic patch that is glued onto a cloth armband. It integrates motion and muscle sensors, a Bluetooth microcontroller and a stretchable battery into a compact, multilayered system. The system was trained from a composite dataset of real gestures and conditions, from running and shaking to the movement of ocean waves. Signals from the arm are captured and processed by a customized deep-learning framework that strips away interference, interprets the gesture, and transmits a command to control a machine — such as a robotic arm — in real time.
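The denoise-then-classify pipeline described above can be sketched in a few lines. The following is a hypothetical illustration, not the authors' actual model: a moving-average filter stands in for the learned denoising stage, and nearest-template matching stands in for the deep-learning gesture classifier. All names, templates and noise levels are illustrative assumptions.

```python
# Hypothetical sketch of a denoise-then-classify gesture pipeline.
# The filter and classifier are simple stand-ins for the paper's
# deep-learning framework; all values here are made up for illustration.
import math
import random

def moving_average(signal, window=5):
    """Stand-in for the denoising stage: smooth out high-frequency
    motion noise before the gesture is classified."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        chunk = signal[max(0, i - half):i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def classify(signal, templates):
    """Stand-in for the classifier: pick the gesture whose template
    is closest to the cleaned signal in L2 distance."""
    def dist(name):
        return math.sqrt(sum((x - y) ** 2
                             for x, y in zip(signal, templates[name])))
    return min(templates, key=dist)

# Toy gesture templates (e.g., muscle-sensor envelopes for two gestures).
templates = {
    "fist": [1.0] * 20,
    "open_hand": [0.0] * 20,
}

# Simulate a noisy "fist" signal: true level 1.0 plus motion noise.
random.seed(0)
noisy = [1.0 + random.uniform(-0.8, 0.8) for _ in range(20)]

cleaned = moving_average(noisy)
command = classify(cleaned, templates)
print(command)  # recovers "fist" despite the injected noise
```

In the real system, the cleaned signal's classification would then be transmitted over Bluetooth as a command to the robotic arm; here the point is only the two-stage structure — suppress motion interference first, then interpret the gesture.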

“This advancement brings us closer to intuitive and robust human-machine interfaces that can be deployed in daily life,” Chen said.

The system was tested in multiple dynamic conditions. Subjects used the device to control a robotic arm while running, exposed to high-frequency vibrations, and under a combination of disturbances. The device was also validated under simulated ocean conditions using the Scripps Ocean-Atmosphere Research Simulator at UC San Diego’s Scripps Institution of Oceanography, which recreated both lab-generated and real sea motion. In all cases, the system delivered accurate, low-latency performance.

Originally, this project was inspired by the idea of helping military divers control underwater robots. But the team soon realized that motion interference wasn't unique to underwater environments: it is a common challenge across the field of wearable technology, one that has long limited the performance of such systems in everyday life.

“This work establishes a new method for noise tolerance in wearable sensors,” Chen said. “It paves the way for next-generation wearable systems that are not only stretchable and wireless, but also capable of learning from complex environments and individual users.”

Original Article: “A noise-tolerant human-machine interface based on deep learning-enhanced wearable sensors,” Nature Sensors. Co-first authors on the study are UC San Diego researchers Xiangjun Chen, Zhiyuan Lou, Xiaoxiang Gao and Lu Yin. DOI: 10.1038/s44460-025-00001-3

