Computer mind-reading

Toby Howard

This article first appeared in Personal Computer World magazine, November 2000.

Have you ever longed for a computer that could read your mind and, for once, do exactly what you wanted? With new research into direct brain-computer interfaces, that wish might soon come true.

Research into a hands-free brain-computer interface (BCI) has traditionally followed two approaches: biofeedback and stimulus-and-response. With biofeedback, a subject is connected to an electroencephalograph (EEG), and particular groups of brain signals are monitored. One widely used signal is the "mu" rhythm, an 8-12 Hz brain rhythm centred on the sensorimotor cortex. The varying amplitude of the signal is used to control a cursor on a computer display. After a period of training, about 4 out of 5 subjects learn, to some extent, how to make the cursor move -- even if they're not consciously aware of exactly how they're doing it. The problem with biofeedback is that the training period can stretch to months, and the results vary widely from subject to subject and from task to task.
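To make the idea concrete, here is a rough sketch in Python of how a biofeedback loop of this kind could work: estimate the power in the 8-12 Hz mu band over a short window of EEG, and nudge a cursor up or down depending on whether that power sits above or below a calibration baseline. It is purely illustrative -- the sample rate, window length, baseline calibration and the simulated EEG data are all assumptions, not details of any real system.

```python
# Illustrative sketch only: mu-band biofeedback driving a one-dimensional cursor.
# The EEG samples here are simulated random data, not real recordings.

import numpy as np

SAMPLE_RATE = 256          # samples per second (assumed)
WINDOW_SECONDS = 0.5       # length of each analysis window (assumed)

def mu_band_power(window, sample_rate=SAMPLE_RATE):
    """Return the mean spectral power in the 8-12 Hz band of one EEG window."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / sample_rate)
    band = (freqs >= 8.0) & (freqs <= 12.0)
    return spectrum[band].mean()

def update_cursor(cursor_y, window, baseline, gain=1.0):
    """Move the cursor up when mu power exceeds the baseline, down otherwise."""
    power = mu_band_power(window)
    return cursor_y + gain * np.sign(power - baseline)

# Simulated session: calibrate a baseline, then process a stream of windows.
rng = np.random.default_rng(0)
n = int(SAMPLE_RATE * WINDOW_SECONDS)
baseline = mu_band_power(rng.normal(size=n))

cursor_y = 0.0
for _ in range(10):
    cursor_y = update_cursor(cursor_y, rng.normal(size=n), baseline)
print("final cursor position:", cursor_y)
```

In a real system the subject, not a random-number generator, supplies the signal: by learning to raise or suppress their own mu rhythm, they steer the cursor.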

The stimulus-and-response technique differs from biofeedback in that when a subject is given a certain stimulus, the brain will automatically produce a measurable response -- so there's no need to train the subject to manipulate specific brain waves. One signal, the "P300 evoked potential", is ideal for this approach. First discovered in 1965, the P300 signal is the signature of the rush of neural activity that occurs about 300 milliseconds after a subject notices an external stimulus which they've been asked to watch out for.

At the University of Rochester in New York, researcher Jessica Bayliss is using the P300 signal to let people control objects in a virtual 3D world. Subjects in her experiments wear a skull-cap instrumented with 27 EEG sensors. On top of the cap the subject dons a pair of standard VR goggles, which provide a view of a computer-generated 3D world. In this simple world there's a table-lamp, a stereo system, and a TV. Above each object there's a flashing light. Each of the lights flashes at its own rate, and out of sync with the others. If the subject wants to switch on any of the objects, they simply think about which object they're interested in, and the P300 signal does the rest.

Suppose the subject wishes to switch the TV on. Whenever the light above it flashes on, the brain recognises the correspondence between "the light is on" and "I want to switch the TV on", and 300ms later generates a P300 signal. Bayliss's system automatically compares any P300 traces recorded on the EEG with the states of the flashing lights. If any are found to be in sync -- such that one of the lights flashes on followed 300ms later by a P300 signal -- then the system knows which object the subject was thinking of, and can switch the object on in the virtual world. In effect, the subject has communicated a "yes/no" decision entirely by thinking, and it works about 85% of the time.
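As a rough illustration of that matching step, the following Python sketch compares logged flash times against detected P300 times and picks the object whose flashes are most often followed by a P300 roughly 300ms later. The data structures, the latency tolerance and the example numbers are assumptions made for illustration; they are not details of Bayliss's system.

```python
# Illustrative sketch only: match P300 timestamps to the flash schedule of each
# virtual object and report the object the subject was most likely attending to.

P300_DELAY = 0.300     # expected stimulus-to-P300 latency in seconds
TOLERANCE = 0.050      # accepted jitter around that latency (assumed)

def matches(flash_times, p300_times, delay=P300_DELAY, tol=TOLERANCE):
    """Count flashes that are followed by a P300 roughly `delay` seconds later."""
    return sum(
        any(abs((p - f) - delay) <= tol for p in p300_times)
        for f in flash_times
    )

def selected_object(flash_log, p300_times):
    """Return the object whose flash pattern best lines up with the P300s."""
    scores = {obj: matches(times, p300_times) for obj, times in flash_log.items()}
    return max(scores, key=scores.get)

# Example: three objects flashing out of sync; the P300s line up with the TV.
flash_log = {
    "lamp":   [0.0, 1.1, 2.2, 3.3],
    "stereo": [0.4, 1.5, 2.6, 3.7],
    "tv":     [0.8, 1.9, 3.0, 4.1],
}
p300_times = [1.10, 2.21, 3.31, 4.40]   # each ~300 ms after a TV flash
print(selected_object(flash_log, p300_times))   # -> "tv"
```

Because each light flashes at its own rate and out of step with the others, only one object's schedule will line up with the stream of P300s, which is what lets the system tell the three objects apart.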

What makes these experiments significant is that picking up the brain's electrical activity is fraught with problems. The signals are tiny, no more than a few millionths of a volt, so they're easily swamped by any stray electromagnetic noise in the environment. Most BCI experiments are therefore done in carefully shielded laboratories. Bayliss's achievement is to have found a way to clearly measure the signals in a very electrically noisy place -- a virtual reality laboratory stuffed with computers, displays and tracking devices.

The work of Bayliss and others is paving the way for BCI-enabled consumer products. You'll be able to control your ear-implanted phone/MP3 player just by thinking about it. Minds will soon be boggling, and presumably so will any machines they happen to be controlling.

Toby Howard teaches at the University of Manchester.