Beyond the Big Barrier
First published in Personal Computer World, February 1996.
IN JUST 50 YEARS, the computer has revolutionised our lives, and our futures. But one problem remains: the interface between computer and mind. This is the Big Barrier, and it is almost as impenetrable today as it was in those early days when we punched paper tape.
The race is on to breach the Big Barrier. Researchers in industry and academia are seeking new, more expressive ways to communicate with machines. One exciting field is Virtual Reality, where the goal is to make human and machine occupy the same conceptual space. Done properly, the illusion can be compelling. But there are drawbacks: the head-mounted displays and datagloves are expensive, often cumbersome and uncomfortable, and inappropriate for many interaction tasks. Other approaches include natural speech recognition, gaze tracking, and gesture recognition, where neural networks interpret hand, body and face gestures. Recently, researchers writing in the prestigious journal Presence suggested a "Nose Gesture Interface Device". Such is the climate of today's research that not everyone spotted the joke.
All of these methods, however, share one unhelpful and ultimately limiting feature: they're all based on the user's muscular movement. Imagine if we could communicate with the computer just by thinking.
The idea of using our brains to directly control a machine isn't particularly new. As far back as 1967, Edmond Dewan described experiments using subjects wired to an electroencephalograph (EEG), which records and graphs the electrical activity of the brain. With practice, the subjects were able to reduce the amplitude of their brains' alpha rhythms, to transmit Morse code to a teleprinter.
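Dewan's scheme can be caricatured in a few lines of code. The sketch below (in Python, with invented thresholds and timings) reads a sequence of per-window alpha amplitudes and turns runs of "alpha on" windows into Morse symbols: a short burst is a dot, a longer one a dash.

```python
def decode_morse_onoff(amplitudes, threshold=0.5, dot_max=2):
    """Classify each analysis window as 'alpha on' or 'alpha off',
    then read runs of 'on' windows as Morse symbols: runs of up to
    dot_max windows are dots, longer runs are dashes. The threshold
    and timings here are invented for illustration."""
    symbols, run = [], 0
    for a in list(amplitudes) + [0.0]:   # sentinel flushes the final run
        if a >= threshold:
            run += 1
        elif run:
            symbols.append('.' if run <= dot_max else '-')
            run = 0
    return ''.join(symbols)

# A short burst followed by a long one: Morse ".-", the letter A.
message = decode_morse_onoff([0.9, 0.9, 0.0, 0.9, 0.9, 0.9, 0.9, 0.0])
```

The real experiments were far subtler, of course; the point is only that once the brain can reliably switch one measurable quantity between two states, any digital code can ride on top of it.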
Research into the Brain-Computer Interface, or BCI, began in earnest in the early 1970s, when the United States Department of Defense saw the promise of fighter pilots using their minds to directly control their planes. Given the technology of the time, there was limited success, and the programme was cancelled. But the groundwork was laid for a field of research now growing rapidly. A major motivation has been to help patients suffering from conditions such as cerebral palsy, or spinal injuries, which inhibit physical control but leave intellectual faculties intact. Over the last decade, great advances have been made.
A successful BCI must be bi-directional. Getting information into the brain is relatively easy: we can use the normal sensory channels, such as sight or hearing. But getting the information out of the brain by studying its electrical signature is a much harder problem. Although leading researcher Jonathan Wolpaw has commented that "in theory the brain's intentions should be discernible in the spontaneous EEG", the sheer complexity of the brain's measurable activity produces EEG traces which present a formidable problem of interpretation. However, by focusing on very specific areas of brain activity, such as motor function, it is possible to analyse EEG data using filters, Fourier transforms, and neural networks, to extract some useful signal from the noise.
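To get a flavour of what "extracting signal from the noise" involves, here is a minimal sketch in Python, using the NumPy library; the sampling rate and the synthetic trace are invented for illustration. It uses a Fourier transform to estimate what fraction of a trace's power lies in a chosen frequency band. A real system would add spatial and temporal filtering, and a trained classifier, on top.

```python
import numpy as np

def band_power(eeg, fs, lo, hi):
    """Fraction of the trace's total power in the band [lo, hi] Hz."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2        # power spectrum
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)   # frequency axis
    band = (freqs >= lo) & (freqs <= hi)
    return spectrum[band].sum() / spectrum.sum()

# Synthetic trace: a 10 Hz alpha-like rhythm buried in broadband noise.
fs = 256                                  # sampling rate, Hz (invented)
t = np.arange(4 * fs) / fs                # four seconds of samples
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.8 * rng.standard_normal(t.size)

alpha = band_power(eeg, fs, 8, 12)        # well above the noise floor
```

Even this toy version shows why the approach works: a rhythm the eye cannot see in the raw trace stands out clearly once the power is sorted by frequency.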
Using this technique, workers at the New York State Department of Health have been conducting experiments with "mu rhythm", an 8-12 Hz brain rhythm centred on the sensorimotor cortex. With some biofeedback training, subjects learned to move a cursor around a screen, by modulating their mu waves. Similarly, at the University of Illinois, researchers have trained subjects to control a "thought typewriter", which displays their chosen letters and words on a screen. Research of this kind is going on in a number of laboratories worldwide.
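The cursor experiments rest on the same band-power idea. The sketch below (Python with NumPy; the resting level and gain are invented calibration constants, not figures from the actual experiments) maps mu-band power above a subject's resting level to a vertical cursor step: enhancing the mu rhythm nudges the cursor one way, suppressing it the other.

```python
import numpy as np

def mu_power(window, fs=256):
    """Relative power in the 8-12 Hz mu band of one EEG window."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    band = (freqs >= 8) & (freqs <= 12)
    return spectrum[band].sum() / spectrum.sum()

def cursor_step(window, rest_level=0.1, gain=40.0):
    """Vertical cursor movement for one window: positive when the
    subject enhances the mu rhythm above the (invented) calibrated
    resting level, negative when they suppress it."""
    return gain * (mu_power(window) - rest_level)
```

In the real experiments the resting level is calibrated per subject, and the mapping is refined continuously as subject and machine adapt to one another.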
Although the equipment used in these experiments is complex, and requires substantial calibration and tuning to work with specific individuals, it is possible to have a go yourself. There are a number of cheap EEG monitoring systems available aimed at the hobbyist, with PC or Mac interfaces. IBVA Technologies' Interactive Brain Wave Analyser, for example, comprises a headband with adhesive electrodes which send data by radio to a Mac interface box. This processes the raw EEG to give data which can be viewed as 3D graphics, or converted to MIDI to control a soundcard or synthesiser. One user is Sylvia Pengilly, Professor of Music Theory at Loyola University in New Orleans. "I always wanted to 'think' my music into the computer," says Pengilly. "It's still in the beginning stages, but I can control the form of the music according to the moods I set".
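How exactly the IBVA maps brain waves onto MIDI is its own affair, but the general idea is easy to caricature. The sketch below (plain Python; the note range and linear scaling are invented) maps a relative band-power figure between 0 and 1 onto a MIDI note number and velocity, which a soundcard or synthesiser could then play.

```python
def power_to_midi(rel_power):
    """Map a relative band power in [0, 1] to a (note, velocity) pair.
    Notes span the two octaves above middle C (MIDI note 60); both the
    range and the linear scaling are invented for illustration."""
    rel_power = max(0.0, min(1.0, rel_power))     # clamp to [0, 1]
    note = 60 + round(rel_power * 24)             # 60 = middle C
    velocity = 40 + round(rel_power * 87)         # 127 = MIDI maximum
    return note, velocity
```

Crude as this mapping is, it captures what Pengilly describes: the subject does not pick notes, but steers the broad shape of the music by shifting the character of their EEG.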
But one's feet should stay firmly on the ground, at least for now. Although the principles of EEG-based control sound simple, the extraction of meaningful data from an EEG trace remains rudimentary. Signals relating to motor function are readily recognisable; abstract thoughts, such as "I'm thinking about blue skies", remain a challenge to decipher.
An alternative to EEG analysis is to implant electrodes into the brain matter itself, and stimulate and monitor groups of brain cells directly. Stimulating individual nerve cells has always been problematic due to the toxic effects of the metal leads, but recently researchers at the Max Planck Institute for Biochemistry in Munich have overcome this by creating a "silicon to neuron junction", which can directly stimulate a nerve cell without damaging it. Used in conjunction with existing "neuron transistors", which sense the ionic potential of a nerve cell, this technology paves the way for two-way communication. The dangers of inserting electrodes into brains, of course, remain.
BCI research is red-hot. Although still at an exploratory stage, the implications of recent research results are phenomenally exciting. Perhaps one day, the human-computer interface, the Big Barrier, will simply disappear altogether. And instead of typing this article on my PC, I will merely have to think it.
Toby Howard is a Lecturer in Computer Graphics at the University of Manchester.