About tongue-based interactions

People interested in tongue-based interactions should have a look at this thesis (in Japanese, though); it presents results from various tests and analyses of potential stimulus recognition (at least judging from what Babelfish managed to translate).

The next step is to find uses for it, as in Nikawa's work: "Tongue-Controlled Electro-Musical Instrument", The 18th International Congress on Acoustics, Vol. III, pp. 1905-1908, April 2004.

This study aims to develop a new electronic instrument that even severely handicapped people with quadriplegia can play in order to improve their quality of life (QOL). Ordinary orchestral and percussion instruments require fine movements of the limbs and cannot be used by those with quadriplegia. In this study, we made a prototype of an electronic musical instrument that can be played by tongue movement. This instrument is composed of an operation board inside the mouth and a sound generator. The signals emitted from the operation board are transmitted to the sound generator equipped inside a personal computer. Music is generated through speakers.
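The abstract does not detail the signal path, but the architecture it describes (an in-mouth operation board sending events to a sound generator on a PC, with audio out through speakers) suggests a pipeline along these lines. This is a minimal sketch under assumptions of my own: that the operation board exposes discrete pad press/release events and that the sound generator accepts MIDI-style note numbers; the pad layout, note mapping, and `SoundGenerator` stand-in are all hypothetical, not taken from Nikawa's paper.

```python
from dataclasses import dataclass

# Hypothetical mapping from pad positions on the in-mouth operation board
# to MIDI note numbers (a C major scale starting at middle C).
PAD_TO_NOTE = {0: 60, 1: 62, 2: 64, 3: 65, 4: 67, 5: 69, 6: 71, 7: 72}


@dataclass
class PadEvent:
    """A discrete press/release event from the operation board."""
    pad: int        # which pad the tongue touched
    pressed: bool   # True on press, False on release


class SoundGenerator:
    """Stand-in for the PC-side sound generator; here it just logs messages."""
    def note_on(self, note: int) -> None:
        print(f"note_on  {note}")

    def note_off(self, note: int) -> None:
        print(f"note_off {note}")


def handle_event(event: PadEvent, synth: SoundGenerator) -> None:
    """Translate an operation-board event into a sound-generator command."""
    note = PAD_TO_NOTE.get(event.pad)
    if note is None:
        return  # pad not mapped to a note
    if event.pressed:
        synth.note_on(note)
    else:
        synth.note_off(note)


if __name__ == "__main__":
    synth = SoundGenerator()
    # Simulated stream of tongue presses: play pads 0, 2, 4 (C, E, G).
    for pad in (0, 2, 4):
        handle_event(PadEvent(pad, True), synth)
        handle_event(PadEvent(pad, False), synth)
```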

Another example is the tongue-controlled Nintendo GBA, another curious hack, this one using a New Abilities TTK: a tongue-touch wireless keyboard transmitter (an orthodontic retainer with nine membrane buttons).

Others propose using the tongue as a "third arm" for astronauts:

The proposed alternative hands-free computer control system, ACCS (Alternative Computer Control System), (...) ACCS will provide pilots and astronauts with an additional flight-control channel, allowing continuous computer control of the flying apparatus under maximum G-force and vibration, and even blind, when blood surges back from the retina. ACCS is placed in a person's mouth (and comprises a tongue-controlled directional command module along with 12 additional commands). It does not interfere with breathing, talking or the consumption of fluids.
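The description only mentions a tongue-controlled directional module plus 12 extra commands, so the control surface is presumably a small discrete command set. Below is a minimal sketch under that assumption; the `Direction` values, the command table, and the `dispatch` function are hypothetical illustrations, not part of the actual ACCS.

```python
from enum import Enum, auto
from typing import Optional


class Direction(Enum):
    """Hypothetical tongue-controlled directional command module."""
    UP = auto()
    DOWN = auto()
    LEFT = auto()
    RIGHT = auto()


# Hypothetical table for the 12 additional commands mentioned above;
# the actual ACCS command set is not specified in the description.
EXTRA_COMMANDS = {
    1: "select",
    2: "back",
    3: "zoom_in",
    4: "zoom_out",
    # ... the remaining commands (5-12) would fill out the set
}


def dispatch(direction: Optional[Direction] = None,
             command: Optional[int] = None) -> str:
    """Turn a tongue input into a computer-control message (here, just a string)."""
    if direction is not None:
        return f"move:{direction.name.lower()}"
    if command is not None:
        return f"command:{EXTRA_COMMANDS.get(command, 'unknown')}"
    return "noop"


if __name__ == "__main__":
    print(dispatch(direction=Direction.LEFT))  # -> move:left
    print(dispatch(command=3))                 # -> command:zoom_in
```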

Why do I blog this? A websurf through curious human-computer interaction systems...