Audio Clouds: head, hand and device gestures for input on mobile devices

(via). Still from the same lab in Glasgow, there is also a nice project investigating 3D audio on wearable computers to increase display space, as well as how head, hand and device gestures may be used for input on mobile devices. It's called "Audio Clouds". There is news on the BBC about it.

"The idea behind the whole thing is to look at new ways to present information," Professor Stephen Brewster told the BBC. (...) "We hope to develop interfaces that are truly mobile, allowing users to concentrate on the real world while interacting with their mobile device as naturally as if they were talking to a friend while walking."

"Lots of times, you need to use your eyes to operate a gadget - even with an iPod, you need to take it out of your pocket to look at the screen to control it. If you could do something with your hands, or other gestures, you would not have to take it out of your pocket," explained Professor Brewster.

The researchers have developed ways to control gadgets, such as personal digital assistants (PDAs) and music players, using 3D sound for output and gestures for input. (...) Professor Brewster and his Multimodal Interaction Group realised that they could get other information out of accelerometers too: the actual variations in a person's gait could be read and harnessed for different uses.
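To give an idea of how gait can be read out of an accelerometer, here is a minimal sketch of peak detection on the acceleration magnitude, the common trick behind step counters. This is purely illustrative: the function name, the 50 Hz sample rate, the 1.2 g threshold and the synthetic walking signal are my assumptions, not anything from the Audio Clouds work itself.

```python
import math

def step_count(samples, rate_hz=50.0, threshold=1.2, min_gap_s=0.3):
    """Count steps as peaks in accelerometer magnitude.

    samples: list of (x, y, z) readings in g. Threshold and minimum
    gap between steps are illustrative values, not from the project.
    """
    min_gap = int(min_gap_s * rate_hz)  # refractory period between footfalls
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    steps, last = 0, -min_gap
    for i in range(1, len(mags) - 1):
        # a local maximum above the threshold counts as one footfall
        if (mags[i] > threshold and mags[i] >= mags[i - 1]
                and mags[i] > mags[i + 1] and i - last >= min_gap):
            steps += 1
            last = i
    return steps

# synthetic walking signal: gravity plus a ~2 Hz bounce, 5 s at 50 Hz
data = [(0.0, 0.0, 1.0 + 0.5 * math.sin(2 * math.pi * 2.0 * t / 50.0))
        for t in range(250)]
print(step_count(data))  # -> 10 (2 steps/s over 5 s)
```

Richer gait features - cadence, stride regularity, asymmetry - fall out of the same signal once the peaks are found, which is presumably the kind of "other information" the group has in mind.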

This kind of stuff is now getting closer to the market: phone companies are about to release similar products. I am eager to see people waving in the streets just to zip files around or to shuffle the songs on their iPods!