Choosing exoskeleton settings like a Pandora radio station

October 19, 2023
Through a simple and convenient touchscreen interface, the algorithm learns the assistance preferences of the wearer. Video: Levi Hutmacher.

Taking inspiration from music streaming services, a team of engineers at the University of Michigan, Google and Georgia Tech has designed the simplest way for users to program their own exoskeleton assistance settings.

Of course, what’s simple for the user is more complex underneath: a machine learning algorithm repeatedly offers pairs of assistance profiles that it predicts are most likely to be comfortable for the wearer. The user selects one of the two, and the predictor offers another assistance profile that it believes might be better. This approach lets users set their exoskeleton assistance according to their own preferences through a very simple interface, one well suited to a smartwatch or phone.
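The select-between-pairs loop described above can be sketched with a simple Bradley-Terry-style score update. This is an illustrative toy, not the team's actual algorithm; the profile parameters (peak torque, timing) and their values are hypothetical:

```python
import math

# Candidate assistance profiles: (peak torque in Nm, timing as % of stride).
# These parameters and values are hypothetical, for illustration only.
profiles = [(t, s) for t in (20, 30, 40) for s in (45, 50, 55)]
scores = {p: 0.0 for p in profiles}  # higher score = more likely preferred

def choose_pair():
    """Offer the two profiles currently estimated as most comfortable."""
    ranked = sorted(profiles, key=lambda p: scores[p], reverse=True)
    return ranked[0], ranked[1]

def update(winner, loser, k=1.0):
    """Bradley-Terry-style score update after the wearer picks `winner`."""
    p_win = 1.0 / (1.0 + math.exp(scores[loser] - scores[winner]))
    scores[winner] += k * (1.0 - p_win)
    scores[loser] -= k * (1.0 - p_win)

def simulated_choice(a, b, target=(30, 50)):
    """Stand-in for the wearer, who secretly prefers 30 Nm at 50% stride."""
    dist = lambda p: abs(p[0] - target[0]) + abs(p[1] - target[1])
    return (a, b) if dist(a) <= dist(b) else (b, a)

for _ in range(20):
    winner, loser = simulated_choice(*choose_pair())
    update(winner, loser)

best = max(profiles, key=lambda p: scores[p])  # converges to (30, 50)
```

Each round, the wearer only taps a preference between two options; the scores do the rest, steering the offered pairs toward the profile the wearer likes best, which is what makes the interface simple enough for a phone or smartwatch.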

Continue reading ⇒

How evolution overshot the optimum bone structure in jerboas

October 17, 2022

Simulation of jerboa foot bones under stress, comparing fused and unfused metatarsals
In simulation, a jerboa species’ fused metatarsals (left), or foot bones, withstood stress better than unfused metatarsals (right), but not as well as partially fused metatarsals.

Foot bones that are separate in small hopping rodents are fused in their larger cousins, and a team of researchers at the University of Michigan and University of California, San Diego, wanted to know why. 

It appears that once evolution set jerboa bones on the path toward fusing together, the process overshot the optimum amount of fusion (the structure that best dissipated the stresses of jumping and landing) and left the bones fully bonded.
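As a toy illustration of the overshoot idea (the curve below is invented for illustration and is not the study's biomechanical model), suppose stress dissipation peaks at partial fusion:

```python
# Hypothetical stress-dissipation curve over fusion fraction f
# (0 = fully separate metatarsals, 1 = fully fused). Illustrative only.
def dissipation(f):
    return 4 * f * (1.4 - f)  # peaks at f = 0.7

none_fused = dissipation(0.0)       # 0.0
fully_fused = dissipation(1.0)      # 1.6
partially_fused = dissipation(0.7)  # 1.96, better than either extreme
```

On a curve like this, full fusion still beats no fusion, consistent with the simulations above, but the optimum sits at partial fusion, which is the sense in which evolution "overshot."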

This finding could inform the design of future robotic legs capable of withstanding the higher forces associated with rapid bursts of agile locomotion.

Continue reading ⇒

How we can better link mind and machine

July 28, 2022
A user's legs walking with a powered ankle exoskeleton on a treadmill
A user demonstrates walking with a lower-body exoskeleton. In a new study, powered exoskeleton users had trouble acting on instructional haptic feedback cues, a finding that will inform how future human-machine interaction is designed. Photo: Brenda Ahearn/University of Michigan, College of Engineering, Communications and Marketing

A team led by University of Michigan researchers recently tested how exoskeleton users responded to the task of matching haptic feedback to the timing of each footstep. The team found that the haptic cues added mental workload, leading to less effective use of the exoskeleton, and highlighted hurdles for future human-machine design.

“When we introduce haptic feedback while walking with an exoskeleton, we usually intend for the user to understand and maintain coordination with the exoskeleton,” said Man I (Maggie) Wu, a robotics PhD student.

“We discovered that the exoskeleton actually introduces a competing mental load. We really need to understand how this affects the user while they attempt to complete tasks.”

Continue reading ⇒

Exoskeletons with personalize-your-own settings

March 30, 2022
Leo Medrano, a PhD student in the Neurobionics Lab at the University of Michigan, tests out an ankle exoskeleton on a two-track treadmill. Researchers were able to give the exoskeleton user direct control to tune its behavior, allowing them to find the right torque and timing settings for themselves.

To transform human mobility, exoskeletons need to interact seamlessly with their user, providing the right level of assistance at the right time to cooperate with our muscles as we move. 

To help achieve this, University of Michigan researchers gave users direct control to customize the behavior of an ankle exoskeleton.

Not only was the process faster than the conventional approach, in which an expert would decide the settings, but it may have incorporated preferences an expert would have missed. For instance, user height and weight, which are commonly used metrics for tuning exoskeletons and robotic prostheses, had no effect on preferred settings.

Continue reading ⇒

$1M for open-source first-responder robots

September 16, 2021
A mini-cheetah out on the Robot Garden at the Ford Motor Company Robotics Building. Photo: Levi Hutmacher.

Tomorrow’s wildfire fighters and other first responders may tag-team with robotic assistants that can hike through wilderness areas and disaster zones, thanks to a University of Michigan research project funded by a new $1 million grant from the National Science Foundation. 

A key goal of the three-year project is to enable robots to navigate in real time, without the need for a preexisting map of the terrain they’re to traverse.

The project aims to take bipedal (or two-legged) walking robots to a new level, equipping them to adapt on the fly to treacherous ground, dodge obstacles or decide whether a given area is safe for walking. The technology could enable robots to go into areas that are too dangerous for humans, including collapsed buildings and other disaster areas. It could also lead to prosthetics that are more intuitive for their users.

Continue reading ⇒