Biometrics

Toby Howard

This article first appeared in Personal Computer World, March 1997.

IT WILL BE THE ULTIMATE in personal ID. No cards, no passwords -- just you. You're scanned, and a computer checks if 'you' really are you. The machine won't be fooled by someone made up to look like you, or even by your identical twin. It's coming.

Most of us have several passwords to remember, along with at least one PIN for the cash dispenser, and perhaps security codes for our workplace, our home burglar alarm, our mobile phone, and all the rest. Do we really need to fill our heads with this rubbish? No, say the proponents of 'biometrics', an emerging technology set to supersede PINs and text-based security systems.

Biometric systems make measurements of a person's physical characteristics and behaviour, which can subsequently be used for comparison against a database of known individuals. Using biometrics, computers will be able to identify us from the shape of our face or the distribution of the heat it emits, from fingerprints, voice, the patterns of blood vessels in the retina, the shape of our iris, DNA, handwriting, the length and shape of fingers, and the patterns of veins in our hands.

With all these techniques, the principle is the same: if there's a match against a stored template previously scanned from you, then 'you' are you; if there isn't a match, you're somebody else.
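
To make the principle concrete, here is a minimal sketch of template matching, assuming each scan has already been reduced to a numeric feature vector. The vectors, names and threshold below are entirely hypothetical.

    # A minimal sketch of biometric template matching (all data hypothetical).
    # Each scan is assumed to have been reduced to a numeric feature vector;
    # a match means the distance to a stored template falls below a threshold.

    import math

    def distance(a, b):
        """Euclidean distance between two feature vectors."""
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def identify(scan, database, threshold=0.5):
        """Return the enrolled name whose template lies closest to the
        scan, provided it is within the threshold; otherwise None."""
        best_name, best_dist = None, threshold
        for name, template in database.items():
            d = distance(scan, template)
            if d < best_dist:
                best_name, best_dist = name, d
        return best_name

    # Templates previously scanned and enrolled (made-up numbers).
    enrolled = {"alice": [0.12, 0.87, 0.45], "bob": [0.91, 0.33, 0.68]}

    print(identify([0.10, 0.85, 0.47], enrolled))  # close to alice: 'alice'
    print(identify([0.50, 0.10, 0.10], enrolled))  # matches nobody: None

Real systems differ mainly in how the feature vector is derived from the scan, and in how the threshold trades off false matches against false rejections.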

The widespread applications of biometric identification are clear: access control, immigration checking, election management, identification of newborn babies, law enforcement, and so on. Some commercial systems are already available and in place. In the United States, the Connecticut Department of Social Services has introduced an identification system based on finger images to crack down on welfare fraud, and devices which can recognise the geometry of hands are in use at San Francisco airport. But if there is a Holy Grail for biometrics researchers, it is to create a foolproof recognition system based on a computer just looking at you.

Like almost anything to do with the way our brains work, the mechanisms by which we recognise faces remain largely unknown. The problem has long fascinated psychologists, who have devised endless ingenious experiments to try to understand how our brains perform this monumental task of data processing. Studies of patients suffering from prosopagnosia, the neurological inability to recognise faces, and grim experiments on macaque monkeys have suggested that there are specific collections of brain cells concerned with face recognition, but there is still no deep understanding of how normal brains recognise faces.

It is this ignorance which makes trying to program a computer to recognise a face so extraordinarily hard, as researchers have been finding for decades. One major problem has been the lack of any underlying theoretical model. Although most implementors of face recognition systems take quite different approaches, their techniques share a number of common steps: (1) capture the image; (2) strip off any background features and isolate the face; (3) attempt to identify the main facial features -- eyes, nose, lips; (4) normalise the facial features according to a standardised grid system. At this stage the normalised face can be matched against a face database.
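
As a schematic illustration of those four steps, here is a toy pipeline in which facial features are just labelled (x, y) coordinates. Every data structure and function body is a hypothetical stand-in: a real system performs heavy image processing at each stage.

    # A toy, runnable skeleton of the four-step pipeline described above.
    # All names and data here are hypothetical, not a real library's API.

    def capture(camera):
        return camera()  # (1) capture the image

    def isolate_face(image):
        return image["face_region"]  # (2) strip the background, keep the face

    def locate_features(face):
        # (3) identify the main facial features: eyes, nose, lips
        return {k: face[k] for k in ("eyes", "nose", "lips")}

    def normalise(features):
        # (4) re-express the features on a standard grid centred on the
        # nose, so faces at different positions become comparable
        ox, oy = features["nose"]
        return {k: (x - ox, y - oy) for k, (x, y) in features.items()}

    def match(face, database, tolerance=2):
        # Finally, compare the normalised face against stored templates.
        for name, template in database.items():
            if all(abs(face[k][0] - t[0]) + abs(face[k][1] - t[1]) <= tolerance
                   for k, t in template.items()):
                return name
        return None

    camera = lambda: {"face_region":
                      {"eyes": (40, 30), "nose": (42, 45), "lips": (41, 60)}}
    templates = {"alice": {"eyes": (-2, -15), "nose": (0, 0), "lips": (-1, 15)}}

    face = normalise(locate_features(isolate_face(capture(camera))))
    print(match(face, templates))  # -> alice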

The process is fraught with difficulty. As Abraham Lincoln once said, "Common-looking people are the best in the world, that is the reason the Lord makes so many of them". And common-looking people are very difficult to tell apart, too. Another problem is that people change their hairstyles and facial hair, wear hats and glasses, stand at odd angles to the camera, and frown and smile. And when did you last look like your passport photograph?

Nevertheless, there are several face recognition systems already on the market. Most are PC-based, such as TrueFace, ZN-Face, FaceVACS and NeuraWare, and all need a fast Pentium. Manufacturers of face recognition and other biometric identification systems generally make sweeping claims for their accuracy, but according to the US Government's Biometric Consortium, field tests often reveal far lower reliability. For face recognition the reliability of the match will rarely exceed 90%, but for many applications an accuracy of less than 100% simply won't be acceptable.

Research at the Sandia National Laboratories has shown that of the various biometric systems available or under development, hand recognition is the most reliable, and voice recognition the least. There are sometimes unexpected drawbacks, too: it was recently reported in New Scientist that a researcher at the University of Adelaide has discovered that the fingerprints of koalas are uncannily similar to those of humans. So a fingerprint recognition system would probably be a bad choice for an Australian wildlife park.

The use of biometric recognition techniques, when they become reliable and cheap enough, will fundamentally change our relationship with computers. Traditionally, it has been up to us to announce our presence to a computer. If you walk up to a PC running a screensaver, you have to touch the keyboard or mouse to wake it up. You are in control. But with a dozing machine that wakes up only if it likes the look of the person approaching it, the game changes. Now who is in control?

The widespread use of biometrics is undoubtedly around the corner, but it raises serious cultural and ethical questions. Do you want to be scanned and measured by a computer? To what extent are you still free if a computer can look at you and decide whether you are 'you' or not? What if someone obtains your biometric profile and impersonates you with it? Thoughts like this make some people afraid that biometric technology will steal their souls.

Nevertheless, it can't be long before Big Brother will not only be watching us, but sensing us -- whether we like it or not.

For more information on face recognition and other biometric technologies, visit www.cs.rug.nl/~peterkr/FACE and www.vitro.bloomington.in.us:8080/~BC .

Toby Howard teaches at the University of Manchester.