Neural interfaces to computers

Toby Howard

This article first appeared in Personal Computer World magazine, December 1997.

WHEN READERS OF New Scientist were recently asked to vote on what applications of genetic engineering they would most like to see, the responses were illuminating. Among the ideas were increased bladder elasticity for beer drinkers, tan-on-command skin for sunbathers, telepathy for everyone, and the genetic modification of politicians to make them incapable of lying. All rather implausible, but one item high on the wish list may soon become reality: controlling computers using nothing more than our nerve impulses. This is "bio-sensing", and it offers a revolutionary new way for humans and computers to interact.

It's two years since Futures last reported on efforts to create a direct brain-to-computer interface, and there have been few newsworthy developments. Using scalp electrodes to monitor the brain's electrical activity, about the best that can currently be achieved is to train subjects to modulate their "mu rhythm" brain waves, which are known to be connected with limb movement. These changes can be picked up and used to control the movement of a cursor on a computer screen. But it's an erratic business at the best of times, and the subject needs to undergo intensive biofeedback training.

The problem with this approach is that the brain waves picked up by even the most sophisticated scalp sensors reveal a cranial cacophony of electrical activity. It's incredibly hard to filter out the noise to reveal signals which the subject can (to some extent) control voluntarily.
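To give a flavour of what such a system has to do, here is a minimal sketch, in Python, of isolating the 8-12 Hz mu band from a single scalp-electrode channel and turning its power into a cursor speed. The sampling rate, filter settings, baseline and gain are all invented for illustration, and the "EEG" is simulated noise; a real system would calibrate these per subject.

    # Illustrative sketch only: isolate the 8-12 Hz "mu rhythm" from a single
    # (simulated) scalp-electrode channel and map its power to cursor movement.
    # All signals, thresholds and gains here are assumed, not measured.

    import numpy as np
    from scipy.signal import butter, filtfilt

    FS = 256  # assumed sampling rate in Hz

    def mu_band_power(eeg_channel, fs=FS, low=8.0, high=12.0):
        """Band-pass the raw signal to the mu band and return its mean power."""
        b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
        mu = filtfilt(b, a, eeg_channel)
        return np.mean(mu ** 2)

    def cursor_velocity(power, baseline, gain=50.0):
        """Map deviation of mu power from a resting baseline to a cursor speed."""
        return gain * (power - baseline) / baseline

    # Simulated one-second window of "EEG": noise plus a 10 Hz component.
    t = np.arange(FS) / FS
    window = 0.5 * np.random.randn(FS) + 2.0 * np.sin(2 * np.pi * 10 * t)

    baseline = 0.5  # resting mu power, which would be measured during calibration
    print("cursor velocity:", cursor_velocity(mu_band_power(window), baseline))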

Many researchers are now turning their attention away from the brain itself, towards the nerves that communicate its instructions around the body. If you want to detect a subject wiggling his toes, for example, it makes more sense to monitor the relatively simple electrical activity of the muscles in the feet, rather than to try and fathom the mysteries of the whole brain.

A technique currently attracting much attention is electro-oculography, which measures the tiny electrical impulses associated with voluntary movements of the eyes. Using small electrodes attached to the forehead and under the eyes, it's possible to accurately track the horizontal and vertical components of a person's gaze.
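In outline, the processing is simple: two channel voltages, one horizontal and one vertical, are converted to gaze angles. The sketch below assumes an idealised linear calibration; the gains and offsets are made up, and a real system would measure them for each wearer.

    # Illustrative sketch only: convert horizontal and vertical EOG voltages
    # into approximate gaze angles using a simple linear calibration. The gains
    # and offsets are invented; a real system would calibrate per subject.

    def eog_to_gaze(v_horizontal, v_vertical,
                    gain_h=40.0, gain_v=30.0,   # degrees of gaze per millivolt (assumed)
                    offset_h=0.0, offset_v=0.0):
        """Return (azimuth, elevation) in degrees from two EOG channel voltages (mV)."""
        azimuth = gain_h * (v_horizontal - offset_h)
        elevation = gain_v * (v_vertical - offset_v)
        return azimuth, elevation

    # Example: a small rightward and upward deflection of the eyes.
    az, el = eog_to_gaze(0.25, 0.10)
    print(f"gaze: {az:.1f} deg right, {el:.1f} deg up")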

One company playing a leading role in using muscle and eye signals for computer control is BioControl Systems Inc, based in Palo Alto, California. In a recent demonstration of a simulated surgical procedure, BioControl's Anthony Lloyd was able to accurately move an endoscope attached to a six-degree-of-freedom robot, controlled by the neural signals generated by his eye movements.

The technique has obvious applications for the handicapped, and there have already been some remarkable success stories. Researchers at Syracuse University have created a system which enables a 16-year-old quadriplegic boy to navigate the Web using signals derived entirely from his facial muscles. Another patient, an 18-month-old girl paralysed after a severe spinal injury, quickly learnt how to move an icon around the screen using just her eyes.

As well as non-invasive bio-sensing, direct neural interfaces are also under development, to monitor the state of an individual nerve cell without interfering with its activity. Researchers at the Max Planck Institute for Biochemistry in Munich have created a "silicon-to-neuron junction", although so far they have only experimented with leeches.

Working somewhat higher up the evolutionary scale, a team at the Georgia Institute of Technology are undertaking an ambitious program to enable a patient with paralysed limbs to control a robot arm, simply by thinking of the desired movement. Their idea is to implant microscopic electrodes in the motor cortex of the brain to record the neural signals directly responsible for arm and hand movements. These signals will then be analysed using pattern recognition techniques to determine the intended limb movements, from which commands will be generated to control the robot arm. Currently, the system is being tested with rhesus monkeys.
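The article does not describe the Georgia Tech decoding method in detail, but the general shape of such a decoder can be sketched. The fragment below uses a simple nearest-centroid rule over a vector of firing rates; the electrode count, the rates, the movement classes and the robot command strings are all hypothetical.

    # Illustrative sketch only: classify a vector of neural firing rates into an
    # intended movement with a nearest-centroid rule, then issue a robot-arm
    # command. Electrode counts, rates and commands are invented for illustration.

    import numpy as np

    # Mean firing-rate "templates" (spikes/sec per electrode) learned during training.
    templates = {
        "reach_left":  np.array([42.0, 10.0,  5.0,  8.0]),
        "reach_right": np.array([ 6.0, 38.0, 12.0,  9.0]),
        "grasp":       np.array([ 9.0, 11.0, 45.0, 40.0]),
    }

    def classify(firing_rates):
        """Return the movement whose template is closest to the observed rates."""
        return min(templates, key=lambda m: np.linalg.norm(firing_rates - templates[m]))

    def arm_command(movement):
        """Translate a decoded movement into a (hypothetical) robot-arm command."""
        return {"reach_left": "MOVE -10 0 0",
                "reach_right": "MOVE +10 0 0",
                "grasp": "CLOSE_GRIPPER"}[movement]

    observed = np.array([8.0, 36.0, 14.0, 10.0])   # one window of recorded activity
    print(arm_command(classify(observed)))          # -> MOVE +10 0 0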

Bio-sensing needn't just be for expensive research programs -- there are several cheap devices on the market to fiddle about with at home. Late last year a Californian company called The Other 90% Technologies introduced a bio-PC interface based on a sensor which fits around a finger. "The first computer product operated by human thought", runs their somewhat hyperbolic publicity, although the sensor can apparently be used to control the player in a number of games sold by the company.

While the true Brain-Computer Interface remains beyond our reach, it is entirely possible that neural links with computers will one day become mainstream, and today's genetic fantasies might not sound so daft. We might indeed think of our computers as extensions to our own mind and body, and enjoy the luxury of LED arrays implanted into our skin in discreet places, offering detailed readouts of the body's condition.

This is the realm of the "trans-human", the merging of computer and flesh, claimed by some to be the next step in our evolution. We'll all eventually become six-million-dollar-men (and women), say groups like the Extropians, who see the human body as obsolete in an increasingly technological world.

Of course, all these computer attachments to our clapped-out bodies will need a power source. One New Scientist reader has the answer: we'll simply alter our genetic code to include that clever muscle found in electric eels.

Toby Howard teaches at the University of Manchester.