Researchers at the University of Waterloo have developed a system that uses gait gestures—deliberate changes in how we walk—to control augmented reality devices, offering a hands-free way to interact with technology.
Many futurists envision a world where we control apps with our minds. While that vision may still be years from fruition, new research suggests a first step could involve bypassing our hands altogether and controlling new technologies with a little help from our feet.
Researchers at the University of Waterloo in Ontario, Canada, have explored how deliberate changes in walking patterns can control augmented reality (AR) devices. The study, led by Daniel Vogel, a professor of computer science, introduces a novel approach to interacting with technology that could redefine how we engage with AR systems.
“There’s a long history of using feet to control machines. For example, the pedals on a car, but very little research has been done into using the way we walk as an input for a device,” said Ching-Yi Tsai, the lead author of the study and a former visiting scholar at the University of Waterloo’s David R. Cheriton School of Computer Science.
The idea originated during the pandemic when Vogel faced the practical frustrations of using touchscreens in freezing temperatures. Ontario winters can be cold enough that removing your hands from pockets or mittens is nearly unbearable, even for short periods. On one such occasion, while walking to get coffee in the cold, Vogel had a breakthrough: what if walking itself could replace hand-based controls?
“Extreme movements like dance steps or a jump would likely be easy for a system to recognize, but these might be harder to perform, and they would deviate too far from normal walking for people to feel comfortable doing them in public,” Vogel said. “We didn’t want users to feel like someone from Monty Python’s Ministry of Silly Walks!”
In the study, volunteers tested 22 different foot motions, evaluated for ease of movement, compatibility with natural walking, and social acceptability in public settings. The researchers narrowed the list to seven ideal gait gestures.
In subsequent trials, participants wore an AR headset displaying a simple menu overlaid onto their real-world environment. Using the gait gestures, they could control a music player, order coffee, and answer calls. Because the technology is still in development, the researchers remotely triggered the commands to simulate the experience. They also built a proof-of-concept recognizer that identified the gestures with 92 percent accuracy.
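The paper's actual recognizer is not reproduced here, but conceptually a gait-gesture classifier can work by segmenting foot-sensor data into strides, extracting a few features (such as stride timing and how hard the foot strikes the ground), and matching them against calibration templates for each gesture. The sketch below is a hypothetical illustration under those assumptions, not the Waterloo team's code; the gesture names, features, and thresholds are invented for the example.

```python
import math
from dataclasses import dataclass


@dataclass
class StrideFeatures:
    """Hand-picked features for one stride window (hypothetical)."""
    stride_duration_s: float    # time between successive heel strikes of the same foot
    peak_vertical_accel: float  # strongest vertical impact in the window (in g)
    step_length_ratio: float    # left/right step-length asymmetry (1.0 = symmetric)


# Hypothetical templates: average features recorded while a user deliberately
# performs each gait gesture during a short calibration walk.
TEMPLATES = {
    "normal_walk": StrideFeatures(1.10, 1.2, 1.00),
    "short_step":  StrideFeatures(0.80, 1.1, 0.70),
    "hard_strike": StrideFeatures(1.10, 2.0, 1.00),
    "double_tap":  StrideFeatures(1.30, 1.6, 1.00),
}


def distance(a: StrideFeatures, b: StrideFeatures) -> float:
    """Euclidean distance in roughly normalized feature space."""
    return math.sqrt(
        (a.stride_duration_s - b.stride_duration_s) ** 2
        + ((a.peak_vertical_accel - b.peak_vertical_accel) / 2.0) ** 2
        + (a.step_length_ratio - b.step_length_ratio) ** 2
    )


def classify(stride: StrideFeatures, reject_threshold: float = 0.35) -> str:
    """Nearest-template classification with a rejection threshold so ordinary
    walking variation is not misread as a command."""
    label, best_dist = min(
        ((name, distance(stride, template)) for name, template in TEMPLATES.items()),
        key=lambda pair: pair[1],
    )
    return label if best_dist <= reject_threshold else "normal_walk"


if __name__ == "__main__":
    # A stride with an unusually strong heel strike should map to "hard_strike".
    observed = StrideFeatures(stride_duration_s=1.08,
                              peak_vertical_accel=1.9,
                              step_length_ratio=0.98)
    print(classify(observed))  # -> hard_strike
```

A real system would work on raw inertial data and likely use a learned model rather than fixed templates, but the same idea applies: only deliberate deviations from a user's normal gait should cross the threshold and fire a menu command.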
“We aren’t at a point yet where AR headsets are widely used,” Tsai said. “But this research shows that if we get there, this input option has got legs!”
The study, “Gait Gestures: Examining Stride and Foot Strike Variation as an Input Method While Walking,” was recently published in the proceedings of UIST 2024.
Chrissy Newton is a PR professional and founder of VOCAB Communications. She hosts the Rebelliously Curious podcast, which can be found on The Debrief’s YouTube Channel. Follow her on X: @ChrissyNewton and at chrissynewton.com.