From fringeware@org.cactus.wixer Sun Feb 14 14:54:32 1993
Received: from uk.ac.nsf by helios.dmu.ac.uk with NiFTP; Sun, 14 Feb 93 14:54:31 GMT
Received: from raid.dell.com by sun3.nsfnet-relay.ac.uk with Internet SMTP id ; Sun, 14 Feb 1993 14:54:38 +0000
Received: from bigtex by raid.dell.com (5.67/jrv-1.2i) with UUCP id AA00779; Sun, 14 Feb 93 13:56:45 GMT
Received: from wixer by bigtex.cactus.org (5.67/jrv-1.2u) with UUCP id AA08758 (for cph@dmu.ac.uk); Sun, 14 Feb 93 09:45:25 GMT
Received: by wixer (5.65/1.35) id AA11554; Sun, 14 Feb 93 03:15:17 -0600
Message-Id: <9302140915.AA11554@wixer>
To: cph@uk.ac.dmu
From: fringeware%wixer.cactus.org%wixer@org.cactus.bigtex (FringeWare Inc.)
List-Server: fringeware-request@wixer.cactus.org
Errors-To: fringeware-owner%wixer.cactus.org%wixer@org.cactus.bigtex
Date: Wed, 10 Feb 93 09:54:53 -0800
Reply-To: space2%u.washington.edu%wixer@org.cactus.bigtex (Tim)
Subject: GROK - Mind Control of Computers?
Sender: fringeware@org.cactus.wixer
Status: OR

Sent from the cyberdeck of: space2@u.washington.edu (Tim)

In article <1993Feb10.023303.10672@reed.edu> you write:
>Saw this in the Times, thought you all might be interested.
>
>Zeke
>
>------
>
>THE NEW YORK TIMES, TUESDAY, FEBRUARY 9, 1993
>
>Computers Are Starting to Take Humans' Wishes as Their Commands
>
>By ANDREW POLLACK
>
>ATSUGI, Japan -- People now control computers with a keyboard, a mouse or in some cases with spoken commands. But at Japan's largest computer company, Fujitsu Ltd., and at several other laboratories around the world, researchers are developing ways to control a computer by merely thinking a command.
>
>A New York State Department of Health research team has developed a system that allows users, after some training, to move a cursor slowly up and down or side to side on a computer screen by mental action alone. University of Illinois psychologists developed a way of allowing people to type, albeit at a rate of only 2.3 characters a minute, by spelling out words in their minds.
>
>And at the research laboratories of the Nippon Telegraph and Telephone Corporation, Japan's main telephone company, researchers have devised techniques to tell from brain waves, with a fair degree of accuracy, the direction a person will move a joystick. A similar project is under way at Graz University of Technology in Austria.
>
>"This is no parapsychological exercise," said Emanuel Donchin, a professor of psychology at the University of Illinois who led the development of the thought-controlled typewriter. Rather, such mind-over-cursor techniques work by having computers analyze electric signals emitted by the brain as it works. The signals are collected by electroencephalography, or EEG, a technique that involves attaching electrodes to the scalp. It has long been used to diagnose brain disorders.
>
>Complete human-brain computer interaction is certainly decades away and might never move beyond science fiction. But in the next decade, practical if limited systems for helping severely handicapped people communicate or operate appliances are seen as feasible. A related technique in which electrical signals to the muscles are detected and analyzed is also being explored to help paralyzed people operate artificial arms or legs.
>
>"If we can use a computer without even uttering a sound, it would be easier," said Norio Fujimaki, one of the three researchers participating in an experimental program on thought-driven computers at Fujitsu's research laboratory in this city near Yokohama.
>
>Attempts to develop thought input for computers began in the 1970's with the "biocybernetics" program financed by the United States Defense Department. One goal was to enable a computer to determine the state of mind of a fighter pilot so it could better assist him in operating the plane, said Professor Donchin, who was involved in the work.
>
>But the program was discontinued in the early 1980's, and since then work in this field, aimed mainly at medical uses, has been sporadic, hurt by shortages of financing and technical obstacles. Research in this area often raises concerns about whether technology will be developed to read minds. But Professor Donchin and others say that most of the systems under development cannot eavesdrop on a person's thoughts.
>
>Indeed, for now and in the near future it is a major challenge to recognize from brain waves if a person means "yes" or "no," let alone to understand complex thoughts. That is because there is little understanding about the connection between any particular thought and the voltages emitted by brain cells.
>
>Moreover, any one signal may be drowned out by the signals from all the other brain activities going on at the same time.
>
>Don't Breathe, Please
>
>"It's difficult enough to have a speech recognition device, but there you know the language," said Erich Sutter, a senior scientist at the Smith-Kettlewell Eye Research Institute in San Francisco who developed a system using EEG that can tell where on a computer screen a person is looking. "With EEG signals, we really don't know the language the brain uses, and the brain may be doing all sorts of things unrelated to the thought you are trying to dig out."
>
>Consider the first efforts at thought input by Dr. Fujimaki of Fujitsu and his collaborator, Prof. Shinya Kuriki of Hokkaido University.
>
>A volunteer sitting in a chair would have 12 electrodes attached to his or her scalp. Because any movement, even blinking or looking at the scenery, would generate a brain signal 10 times larger than the one the researchers were trying to detect, subjects had their heads locked in one position with a special brace. They were told to stare at a black dot and to breathe, blink and swallow as little as possible.
>
>The subjects were told to say the sound "ah" in their mind -- without actually voicing it -- when they saw one color of flashing light, but not to say it when they saw another color. By averaging dozens of readings, Dr. Fujimaki could detect a difference in brain pattern when a person was mentally saying "ah."
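
The "averaging dozens of readings" step above is ordinary evoked-response averaging: the part of each reading that is time-locked to the flash adds up across repetitions, while the unrelated background activity tends to cancel out. Here is a minimal sketch of the idea in Python; the trial count, sampling rate and waveform are invented for illustration and are not taken from the Fujitsu experiment.

import numpy as np

rng = np.random.default_rng(0)
n_trials, n_samples = 60, 256            # e.g. 60 repetitions, 1 s at 256 Hz
t = np.arange(n_samples) / 256.0

# A tiny response locked to the stimulus at t = 0.3 s, buried in much larger
# background activity (modeled here as white noise).
evoked = 1.0 * np.exp(-((t - 0.3) ** 2) / 0.002)
trials = evoked + 10.0 * rng.standard_normal((n_trials, n_samples))

single = trials[0]                        # one reading: the response is invisible
average = trials.mean(axis=0)             # average of 60: the response emerges

print("noise level in one reading :", round(single.std(), 2))
print("noise level after averaging:", round((average - evoked).std(), 2))
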
>"It's Far From Practical"
>
>But the need to take so many readings rules out the use of the technique for computer control. Ideally, a person would want to think the letter "a" only once and have it recognized. "In our experiment, 10 hours are required to communicate only one vowel," Dr. Fujimaki said. "It's far from practical communications."
>
>Other researchers have made more progress by using particular signals that are easier to detect and analyze.
>
>At the University of Illinois, Professor Donchin took advantage of what is known as the "oddball paradigm." When someone sees something that he or she has been waiting for but that occurs only rarely, the brain emits a detectable signal about three-tenths of a second later.
>
>To develop his brain-activated typewriter, Professor Donchin arranged the letters of the alphabet in rows and columns that were displayed on a computer screen. The rows and columns were flashed one by one in a random order. When either the row or the column containing the letter a person was thinking about flashed on the screen, the person's brain would emit the telltale signal. By knowing the row and column, the computer could then identify the proper letter.
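
The selection step is worth spelling out: the computer scores the brain's response to every row flash and every column flash, and the intended letter is the cell where the best-scoring row and column intersect. Below is a toy sketch of that logic in Python; the 6-by-6 grid, the faked response scorer and the repetition count are assumptions for illustration, not Professor Donchin's actual implementation.

import random
import string

CHARS = string.ascii_uppercase + "0123456789"            # 36 symbols -> 6x6 grid
GRID = [list(CHARS[i:i + 6]) for i in range(0, 36, 6)]

def flash_response(flashed_cells, target, rng):
    """Stand-in for scoring the EEG epoch after one flash: the score is
    larger, on average, when the flashed row/column contains the target."""
    return rng.gauss(0.0, 1.0) + (3.0 if target in flashed_cells else 0.0)

def spell_one_letter(target, n_repeats=10, seed=1):
    rng = random.Random(seed)
    row_score, col_score = [0.0] * 6, [0.0] * 6
    for _ in range(n_repeats):                            # repeat and accumulate
        order = [("row", i) for i in range(6)] + [("col", j) for j in range(6)]
        rng.shuffle(order)                                # flash in random order
        for kind, idx in order:
            cells = GRID[idx] if kind == "row" else [GRID[r][idx] for r in range(6)]
            score = flash_response(cells, target, rng)
            (row_score if kind == "row" else col_score)[idx] += score
    r = row_score.index(max(row_score))                   # best-responding row
    c = col_score.index(max(col_score))                   # best-responding column
    return GRID[r][c]                                     # their intersection

print(spell_one_letter("P"))                              # prints 'P' with high probability

In a real system the scorer would of course operate on the EEG recorded about three-tenths of a second after each flash, as the article describes.
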
>Disciplining the Brain
>
>At the New York State Department of Health's Wadsworth Center for Laboratories and Research in Albany, Dr. Jonathan R. Wolpaw and his colleagues get around the problem of having a computer try to guess what the brain is thinking. Their approach is to train the brain to emit signals that can be easily understood by a computer. "It's putting the task on the brain," Dr. Wolpaw said.
>
>Dr. Wolpaw's technique uses mu waves, which are rhythmic signals emitted by the brain's sensorimotor center when it is in idle mode. In Dr. Wolpaw's system, electrodes measure the amplitude of the mu waves and translate large amplitudes into an upward movement of the cursor and low amplitudes into a downward movement.
>
>In one experiment, four of five subjects gradually learned to control their mu waves enough to move a cursor from the center of the screen to either the top or the bottom in about three seconds.
>
>Some subjects found that particular thoughts, say, about weightlifting, would move the cursor down, while thoughts about relaxing moved the cursor up. After a while, such imagery was no longer needed, Dr. Wolpaw said.
>
>By using more detailed measurements of the mu rhythms, Dr. Wolpaw's team has recently succeeded in enabling people to move the cursor side to side as well as up or down. But people still cannot bring the cursor to a particular point and stop, a level of control needed to develop the mental equivalent of a computer's mouse.
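
The cursor mapping just described reduces to a simple control rule: estimate the amplitude of the mu rhythm (roughly 8 to 12 Hz) over a short window of EEG and move the cursor up when the amplitude is high, down when it is low. The sketch below shows that rule on synthetic data; the sampling rate, window length, band edges, baseline and gain are illustrative guesses, not the Wadsworth group's parameters.

import numpy as np

FS = 250                         # sampling rate in Hz (assumed)
WINDOW = FS // 2                 # analyse half-second windows

def mu_amplitude(window, fs=FS, band=(8.0, 12.0)):
    """RMS amplitude of the mu band (about 8-12 Hz) in one window of EEG."""
    spectrum = np.fft.rfft(window * np.hanning(len(window)))
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return np.sqrt(np.mean(np.abs(spectrum[in_band]) ** 2))

def cursor_step(window, baseline, gain=0.5):
    """Large mu amplitude -> positive step (cursor up); small -> negative (down)."""
    return gain * (mu_amplitude(window) - baseline)

# Usage on synthetic data: a strong 10 Hz rhythm (sensorimotor cortex "idling")
# versus the same rhythm suppressed.
t = np.arange(WINDOW) / FS
idle = 5.0 * np.sin(2 * np.pi * 10 * t)
busy = 0.5 * np.sin(2 * np.pi * 10 * t)
baseline = (mu_amplitude(idle) + mu_amplitude(busy)) / 2
print("idle window ->", cursor_step(idle, baseline))      # positive: move up
print("busy window ->", cursor_step(busy, baseline))      # negative: move down
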
>Akira Hiraiwa and his colleagues at Nippon Telegraph and Telephone have taken advantage of the fact that the brain emits certain voltages before an action is taken. They developed a pattern-matching computer known as a neural network that could tell the difference between signals corresponding to a left and right movement of a joystick. But it was difficult to have the system work fast enough to make the prediction before the movement occurred, although researchers in Austria, using a similar technique, say they can do this.
>
>Even for paralyzed people, brain control right now is still impractical, compared with other techniques that have been developed to allow people to control computers by eye movements or breath.
>
>In recent years, techniques have been developed that provide better images of the working brain than EEG does. Positron emission tomography and fast magnetic resonance imaging have provided pictures of the brain as it performs a function like recalling a word.
>
>Dr. Fujimaki of Fujitsu hopes to use extremely sensitive superconducting sensors to read the faint magnetic waves emitted by the brain. Such magnetic measures would be more detailed than the electric signals now used, he said.
>
>The new techniques might not prove practical for computer control, however, because they require multi-million-dollar machines. Reading magnetic brain signals, for instance, requires a person to be in a special magnetically shielded room and to wear a special helmet, equipment that together costs about $1 million.
>
>Still, even if they cannot be directly used for thought input into computers, the new techniques will provide insights into the operations of the brain. Such knowledge could help develop the thought-driven machines of the future.
>
>--
> zeke@reed.edu