This article first appeared in Personal Computer World magazine, March 2000.
AS FAR AS communicating with computers goes, we've just about got sight and sound figured out. But have you ever wondered what it would be like to take your data, and instead of just visualizing it, touch it too?
Touchy-feely interfaces, or "haptic systems" as they're more properly known, have been around for some time. Game enthusiasts happily blast away using devices like the Microsoft SideWinder joystick, or the more recent Logitech WingMan force-feedback mouse. But as a general-purpose method for human-computer interaction, touch has been little explored. Now recent advances in the technology mean a whole range of haptic applications are on the horizon -- from virtual surgery training to 3D sculpting.
A new system called ReachIn offers convincing evidence that the haptic interface is going to become very big news indeed. ReachIn provides stereoscopic imagery and force-feedback, using a customised PC configuration. The monitor is angled at 45 degrees down towards the desktop. Where the keyboard would normally sit is a mirror, which reflects the monitor screen. Underneath the mirror is a force-feedback stylus, which the user holds.
The stereo graphics are achieved using the CrystalEyes system, which works on the principle that because our eyes are a few centimetres apart, when we view a real 3D scene each eye sees a slightly different perspective image. The brain then magically knits the two views together. Using CrystalEyes, the monitor rapidly switches between two displays of the scene: one as it would appear if viewed by the left eye only, the other as the right eye would see it.
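The idea can be sketched in a few lines of code. This is an illustrative guess at how the two per-eye views might be set up, not ReachIn's actual rendering code: place two virtual cameras, offset left and right by half the distance between a viewer's eyes (the 6.5 cm figure is a typical adult average, used here purely for illustration).

```python
# Hypothetical sketch: generating the two per-eye viewpoints for a
# stereo display. Not taken from ReachIn or CrystalEyes -- just the
# general principle described in the article.

INTEROCULAR_M = 0.065  # typical adult eye separation, in metres (illustrative)

def eye_camera_positions(head_x):
    """Return the x-coordinates of the left- and right-eye virtual cameras.

    Each camera sits half the interocular distance to one side of the
    viewer's head position, so each renders a slightly different
    perspective -- just as real eyes do.
    """
    half = INTEROCULAR_M / 2.0
    return head_x - half, head_x + half

left_cam, right_cam = eye_camera_positions(0.0)
print(left_cam, right_cam)  # -0.0325 0.0325
```

Render the scene once from each camera position and you have the two images the brain will knit back into depth.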
Looking at the monitor with the naked eye, all you see is a shimmering blurry image. But put on the cordless CrystalEyes glasses, rather like a pair of oversized wraparound sunglasses, and -- voila! -- you see a true 3D image floating in space. It works because each lens in the glasses is a liquid crystal shutter, which can be electrically flipped from transparent to opaque. A small box mounted on top of the monitor sends an infra-red signal to the glasses to tell them which image is currently on display. If it's a left-eye image, the left lens opens, and the right one shuts. And vice versa, at least 30 times a second. It might sound unlikely that this works, but it does, and the image quality is superb.
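The alternation scheme is simple enough to spell out. Here is a minimal sketch of the logic -- an assumption about how any shutter-glasses system of this kind behaves, not the actual CrystalEyes protocol: even display frames carry the left-eye view and open only the left lens, odd frames do the opposite.

```python
# Hypothetical sketch of shutter-glasses synchronisation. The infra-red
# emitter on the monitor plays the role of telling the glasses which
# frame is on screen; here a simple frame counter stands in for it.

FRAMES_PER_EYE_HZ = 30              # each eye must see at least 30 images/sec
DISPLAY_HZ = 2 * FRAMES_PER_EYE_HZ  # so the monitor alternates at 60 Hz or more

def shutter_state(frame_index):
    """Return which image is shown and which lens is open for one frame.

    Even frames: left-eye image, left lens open, right lens shut.
    Odd frames:  right-eye image, right lens open, left lens shut.
    """
    if frame_index % 2 == 0:
        return {"image": "left", "left_lens": "open", "right_lens": "shut"}
    return {"image": "right", "left_lens": "shut", "right_lens": "open"}

print(shutter_state(0)["image"])  # left
print(shutter_state(1)["image"])  # right
```

Over any pair of consecutive frames each eye receives exactly one view, fast enough that the brain fuses the flicker into a single solid 3D scene.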
Below the mirror the user holds the stylus of a Phantom force-feedback device -- a wonder of miniature engineering. The stylus, about the size of a ballpoint pen and attached to the body of the device by a lightweight linkage, can be moved freely around by the user. The device can also press the stylus back against the user, using three tiny motors under software control.
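A common technique for rendering the sensation of touching a surface -- offered here as a hedged sketch of the general approach, not the Phantom's actual control software -- is a "penalty" force model: whenever the stylus tip penetrates a virtual surface, the motors push it back out with a force proportional to the penetration depth, like a stiff spring. The stiffness value below is illustrative only.

```python
# Hypothetical sketch of penalty-based haptic rendering: the deeper the
# stylus tip sinks into a virtual surface, the harder the motors push back.

STIFFNESS_N_PER_M = 500.0  # spring stiffness in newtons per metre (illustrative)

def contact_force(tip_depth_m):
    """Force in newtons pushing the stylus back out of the surface.

    tip_depth_m: how far the tip is below the virtual surface.
    Zero or negative depth means no contact, so no force at all.
    """
    if tip_depth_m <= 0.0:
        return 0.0
    return STIFFNESS_N_PER_M * tip_depth_m

print(contact_force(0.002))   # 2 mm into the surface -> 1.0 N of resistance
print(contact_force(-0.01))   # above the surface -> 0.0 N
```

A control loop running this calculation hundreds of times a second is what makes a bump on a virtual object feel solid under the pen.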
It's hard to describe in words how it feels to use the system, but the effect is stunning. At a recent demo I was not the first person in the room to shout "wow!" as I dragged the stylus across an object floating in space in front of my eyes, able to feel the bumps on the surface. It was weird, to say the least, and not a little schizophrenic -- you're seeing and touching an object you know does not really exist. Then I tried making an injection into a vein in a virtual hand, and the feeling of resistance as I pushed the "needle" through the "skin" was very creepy indeed.
The haptic interface is compelling. As for the full virtual experience, it can only be a matter of time before we find ways for computers to stimulate our other senses too. But that's probably best left to your imagination.
Toby Howard teaches at the University of Manchester.