
SURGICAL REALITY

The doctor will see you now

Long seen as a technology in search of an application, virtual reality is finding a home in medical research, training, surgery, and rehabilitation. Its potential sounds futuristic, but its practical advantages may put VR in the mainstream of 21st century medicine

By STUART M. DAMBROT

VIRTUAL REALITY, OR VR--the experience of exploring and interacting with a computer-generated environment--ranges from text-oriented online forums and multiplayer games to spatial simulations that combine audio, video, animation, 3-D graphics, and even scent. Some of the more realistic effects use a helmet-like apparatus with small screens in front of the eyes, each giving a slightly different view so as to mimic stereoscopic vision. Sensors mounted in gloves, a bodysuit, and footwear convey the user's movements to the computer, which changes the images accordingly to simulate movement. Computer-generated physical feedback adds a "feel" to the visual illusion, and computer-controlled sounds and odors reinforce the virtual environment. VR is used in electronic games, amusement-park attractions, and simulated construction designs; experimental and envisioned uses include education, industrial design, and art.
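
To make the stereoscopic trick concrete, here is a small illustrative sketch in Python; the function name, the head-pose inputs, and the interpupillary distance are assumptions for illustration, not any headset maker's actual code. It offsets two virtual cameras from the tracked head position, one for each eye-mounted screen.

    import numpy as np

    IPD = 0.064  # assumed interpupillary distance in meters

    def eye_positions(head_position, head_right_vector, ipd=IPD):
        """Offset the tracked head position half an IPD along the head's
        right-pointing axis to get the two virtual camera positions."""
        right = np.asarray(head_right_vector, dtype=float)
        right /= np.linalg.norm(right)
        head = np.asarray(head_position, dtype=float)
        left_eye = head - right * (ipd / 2.0)
        right_eye = head + right * (ipd / 2.0)
        return left_eye, right_eye

    # Example: head 1.6 m above the floor, facing forward, so "right" is +x.
    left, right = eye_positions([0.0, 1.6, 0.0], [1.0, 0.0, 0.0])
    # Each frame, the scene would be rendered once from each position and the
    # two images sent to the two helmet-mounted screens, one per eye.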

Practical VR applications have great potential in activities where real presence is complex or impossible, such as medicine and surgery. By modeling molecular structures in three dimensions, VR is aiding in genetic mapping and drug synthesis. VR helps surgeons learn invasive techniques without a cadaver and lets physicians conduct real-time remote diagnosis and treatment, leveraging resources and improving survival rates. And interactive worlds borrowed from entertainment applications allow the disabled to communicate and control their environment to an unprecedented degree.

Like most new fields, VR relies on both starry-eyed visions and realistic engineering. Columbia's leading medical VR researcher places his work in the latter category. For now, says Michael Treat, associate professor of clinical surgery at the College of Physicians and Surgeons, virtual medicine is still in its infancy. "If you draw an analogy to aviation," says Dr. Treat, "we're only about one to two years after Kitty Hawk."

Medicine as a communicative art

SPEAKING AT VRWORLD Spring 95, Shaun Jones, program manager of advanced biomedical technology at the Advanced Research Projects Agency, stated that "there no longer is medicine, there is simply information with a medical flavor. Nearly everything we need to know about a patient can be brought to us electronically or digitally." Jones described a clinical setting in which electronic sensors put vital signs, electronic medical records, and sophisticated imaging directly at a physician's disposal.

"This virtual world," Jones continued, "allows us to electronically educate, train, prototype, test, and evaluate.... We can practice and optimize before we perform the actual surgery.... We can bring the expertise of the surgeon to remote areas of the country or the world. We can dissolve time and space, and we can project not only surgical expertise in terms of dexterity, but also judgment and wisdom, anywhere we can put bits and bytes."

VR's ability to display previously unimaged data in rich 3-D detail has myriad biomedical uses. One of these, created by Rachael Brady and her coworkers at the National Center for Supercomputing Applications and at the University of Illinois at Urbana-Champaign, is Crumbs, an immersive virtual environment for the visualization and identification of biological structures.(1) This tool's CAVE (Cave Automatic Virtual Environment) system, an immersive projection-screen stereo display (see figure 1), uses data from diverse sources including confocal microscopy, magnetic resonance imaging (MRI), electron microscopic serial sections and tomography, and computational fluid dynamic simulations. (The system takes its name from the fairy tale of Hansel and Gretel marking their way through the forest with bread crumbs; users immerse themselves inside a data volume, walking along a structure and dropping markers along its path.)
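
The bread-crumb metaphor is simple enough to sketch in a few lines of Python; the class and method names here are illustrative, not the NCSA Crumbs code. Markers dropped as the user walks along a structure form an ordered trail, and the traced structure's length falls out of the marker positions.

    import numpy as np

    class CrumbTrail:
        def __init__(self):
            self.markers = []  # ordered 3-D positions inside the data volume

        def drop(self, position):
            """Record a marker at the user's current position in the volume."""
            self.markers.append(np.asarray(position, dtype=float))

        def path_length(self):
            """Total length of the traced structure, summed segment by segment."""
            if len(self.markers) < 2:
                return 0.0
            pts = np.vstack(self.markers)
            return float(np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1)))

    trail = CrumbTrail()
    for p in [(0, 0, 0), (1, 0, 0), (1, 2, 0)]:  # walking along a structure
        trail.drop(p)
    print(trail.path_length())  # 3.0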

Anatomical images are essential in training as well as in the surgeon's toolkit, but they are often not available where and when they are needed. Digital image storage, retrieval, and manipulation may finally solve this dilemma. One such example is the National Library of Medicine's Visible Human Project, an online digital database of body images from MRI, computed tomography (CT), and high-resolution photography, maintained at the University of Colorado School of Medicine.

At the Mayo Foundation, Richard Robb and Bruce Cameron are working on the Virtual Reality Assisted Surgery Program (VRASP). Designed for operating rooms, VRASP will display images in real time during craniofacial, orthopedic, brain, and prostate operations. Images will be retrieved from databases, rendered, displayed, and manipulated in response to the surgeon's commands. The clinical goal is dynamic fusing of 3-D body-scan data with the actual patient. VRASP will bring presurgical planning data and rehearsal information to the OR to optimize effectiveness, minimize morbidity, and reduce costs.(2)
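
The article does not spell out how the Mayo team performs that fusion, but one standard approach to registering a pre-operative scan to the patient on the table is rigid point-based registration: locate the same landmark points in both coordinate systems, then solve for the rotation and translation that best align them. The Python sketch below uses the Kabsch (SVD) least-squares solution and is offered only as an illustration of the general technique.

    import numpy as np

    def rigid_registration(scan_points, patient_points):
        """Return rotation R and translation t mapping scan coordinates onto
        patient (operating-room) coordinates in a least-squares sense."""
        P = np.asarray(scan_points, dtype=float)
        Q = np.asarray(patient_points, dtype=float)
        p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
        H = (P - p_mean).T @ (Q - q_mean)          # cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = q_mean - R @ p_mean
        return R, t

    # Example: three landmarks located in the scan and again on the patient,
    # where the patient's frame is the scan frame rotated 90 degrees and shifted.
    scan = [(0, 0, 0), (10, 0, 0), (0, 10, 0)]
    patient = [(5, 5, 0), (5, 15, 0), (-5, 5, 0)]
    R, t = rigid_registration(scan, patient)       # R @ scan_point + t ~ patient_point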

Virtual Presence Ltd. of London has taken another approach with the Minimally Invasive Surgery Trainer (MISTVR), consisting of a frame supporting a pair of trocars (sharp instruments used with a cannula to puncture a body cavity for fluid aspiration) linked to a high-end PC, rather than the graphics workstations or supercomputers typically required for VR training. MISTVR fosters basic perceptual-motor competencies for remote handling of human tissue and allows recording of these skills for certification purposes. Tasks being developed for MISTVR include the presentation of basic visual cues and manipulative features for tissue cut and lift, arterial/duct clipping, acquisition and lifting of objects, stacking, one- and two-handed operation, and object interchange between instruments.

Another project, dedicated to delivering 3-D images over the World Wide Web, is the Digital Anatomist Program, a collaboration between anatomists and computer scientists within the University of Washington Department of Biological Structure. This program's long-term goals are fourfold: to develop methods for representing and managing structural information about the human body; to develop computer programs that use this information to solve problems in clinical medicine, research, and education; to develop a knowledge base of human gross anatomy; and to disseminate this information.

Being there (not!): from telepresence to telemanipulation

AS USEFUL AS 3-D images are, rendered graphics alone cannot simulate the feel of surgery: the weight, elasticity, contractility, bleeding, and deformations of flexible and living tissue. VR's tactile capacities are well-suited to this technical challenge. "Computerized surgical simulation offers the potential of improving surgical skills without risk to patients," Jonathan Merril, M.D., co-founder of High Techsplanations, Inc., told the VRWorld audience. This firm's TELEOS software authoring environment converts sequential flat images into organ and tissue models that behave as realistically as they look. This lifelike simulation, with interfaces to tactile and force-feedback hardware, offers the promise of real-time interaction with deformable objects, blood-flow simulation, and collision detection, as well as photorealism.

"Surgical instruments must also behave; they must know how to interact with the tissues," Merril adds. "A scissors will cut tissue when a certain amount of pressure is applied, while a blunt instrument may not." Training applications range from skills as common as intravenous needle placement to more involved endoscopic procedures. In Merril's view, working with these simulations is "as challenging as laparoscopic surgery."

In the view of Columbia's Michael Treat, laparoscopy is the closest we now come to true VR medicine. Less than a decade ago, laparoscopic cholecystectomy (gallbladder removal) was viewed with amazement, suspicion, and even fear; today, it's the gold standard. Still, says Treat, laparoscopic instruments and VR surgery have a long way to go. Laparoscopes currently have only one degree of freedom of movement in the instruments attached to their ends. Treat envisions the day when more advanced laparoscopes will rotate as a human wrist can; the difficult part is actually designing a device capable of these complex motions. "We're working backwards from what we know we want the device to do," he says. "It's not a trivial thing to move your hand."

Future VR devices might also ease the surgical interface. "Now you get a stiff neck from looking up at a TV set while your hands are down, working," says Treat. A more natural interface would immerse the surgeon in the environment, bringing back the feel of an operation. In addition, heads-up displays, like those designed for fighter pilots, might provide on-site, real-time CT or ultrasound pictures. "Instead of sweating it out, looking at the screen and guessing where a blood vessel is, for instance, the surgeon would have increased input from these imaging modalities," says Treat.

Taking the technology one step further, Treat envisions robotic devices and instruments that could be coupled to the entire interface and programmed to perform surgery. An instrument equipped with a pulsed laser light, for instance, could determine whether an intestinal polyp is benign or malignant; actuators on the same instrument could then remove the polyp if necessary. The system could even modify scale and give the operator the feeling of walking into the intestine and harvesting the polyp.

Other robotic instruments could be programmed to perform simple tasks such as making incisions or suturing tissues. When a surgeon ties knots, says Treat, tension varies widely; a surgical robot could ensure equal tension and equal distance between stitches. Furthermore, robots programmed to complete surgical tasks could reduce the time necessary to perform an operation--removing a gallbladder in, say, one minute--thereby reducing the impact on the body. "The human surgeon will always be a part of it. You'll always have the judgment factor: when to cut or when not to," says Treat. "But if you could harness the speed and accuracy of a machine, you lower the costs and have quicker and better recoveries. Machines will do the work faster and more reliably, and a human will balance the factors in an intuitive way."

Conquering disabilities, extending abilities

THE SAME APPROACH can help patients as well as surgeons. A rehabilitation system patented by Grigore Burdea and Noshir Langrana employs a force-feedback system, such as a glove, to simulate virtual deformable objects. Before rehabilitation, the patient places his or her hand in a sensing glove that measures the force exerted by the fingers. Information from the sensing glove is received by an interface and transmitted to a computer, which uses the information to diagnose the patient's manual capability and generates rehabilitation-control signals for a force-feedback glove. The patient places a hand in this glove and attempts to bring the fingers together, as though grasping the virtual object. The force-feedback glove resists the squeezing movement in a manner that simulates the feel of the object. The force exerted by the patient's fingers is fed back to the computer control system, where it can be recorded or used to modify future rehabilitation-control signals. The basic concept can also be applied to the arms, legs, neck, knees, elbows, and other articulated joints.(3)
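
The feedback loop the patent describes can be sketched in Python; the names, the capability metric, and the adaptation policy below are assumptions for illustration, not the patented system's design. The idea is to measure what the patient can exert, set the glove's resistance from that measurement, and let the recorded forces of each exercise adjust the next.

    def assess(measured_forces):
        """Estimate grip capability as the mean force over a trial (assumed metric)."""
        return sum(measured_forces) / len(measured_forces)

    def resistance_setting(capability, fraction=0.6):
        """Resist at a fraction of current capability (assumed therapy policy)."""
        return fraction * capability

    def rehabilitation_session(initial_forces, exercise_forces):
        """Run one session: set resistance, record forces, adapt the estimate."""
        capability = assess(initial_forces)
        log = []
        for exerted in exercise_forces:      # force the patient produced on each squeeze
            target = resistance_setting(capability)
            log.append((target, exerted))
            # Nudge the capability estimate toward what the patient actually managed.
            capability = 0.9 * capability + 0.1 * exerted
        return capability, log

    final_capability, log = rehabilitation_session(
        initial_forces=[8.0, 9.0, 7.5],      # newtons reported by the sensing glove
        exercise_forces=[9.0, 9.5, 10.0, 10.5],
    )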

Already deployed is Virtual Reality Concepts' PT-VR, a combination of PC hardware, Sense8 WorldToolKit(TM) software, and VR input devices that enables physical therapists to treat patients without the weight restrictions of real objects. While the PT-VR system can be used for any part of the body, its primary use is in hand therapy because the hand involves the most intricate muscle and nerve functions. (Full-body suits are also available.) A glove picks up movements of the hand requiring therapy and sends signals to the PC, which displays hand motions on a monitor or VR helmet. Since virtual objects have no weight, therapists can concentrate on developing hand-eye coordination and fine motor skills.
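
That data path is easy to sketch as well; the Python below is illustrative only, not the PT-VR product, and assumes a simple linear calibration from each flex sensor to a joint angle. The resulting angles are what would drive the hand model shown on the monitor or in the helmet.

    def sensor_to_angle(reading, reading_min=0.0, reading_max=1023.0, angle_max=90.0):
        """Map a raw flex-sensor reading to a joint angle in degrees (assumed calibration)."""
        reading = min(max(reading, reading_min), reading_max)
        return angle_max * (reading - reading_min) / (reading_max - reading_min)

    def update_hand_model(sensor_readings):
        """One display update: convert every sensor reading to a joint angle."""
        return [sensor_to_angle(r) for r in sensor_readings]

    # Example frame from a glove with one flex sensor per finger (assumed layout).
    joint_angles = update_hand_model([120, 512, 830, 1023, 0])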

The Internet itself is embracing VR; new technologies like VRML (Virtual Reality Modeling Language) are replacing 2-D home pages with a true sense of place. Predictions on when 3-D browsers and full interactivity will take over the Web, however, have varied widely; in this field, hype is widespread and vaporware inevitable. As cyberspace sprouts a third dimension, researchers such as Treat are doing the patient, practical work of translating 3-D VR concepts into the surgical mainstream--advancing the field from Kitty Hawk toward Charles Lindbergh's transatlantic crossing, if not quite as far as Cape Canaveral.

How does one prepare for this brave new world? If nothing else, by making sure that the family physician has a good screen-side manner.

  1. Brady RB, Pixton J, Baxter G, et al. "Crumbs: a virtual environment tracking tool for biological imaging." Proc. IEEE Symposium on Frontiers in Biomedical Visualization, pp. 18-25. Atlanta, Ga., Oct. 30, 1995.

  2. Medicine Meets Virtual Reality III: Interactive Technology and the New Paradigm for Healthcare, Jan 19-22, Hyatt Regency San Diego. Sponsored by the University of California, San Diego School of Medicine in cooperation with the Advanced Research Projects Agency (ARPA) and Commission of the European Union.

  3. Burdea GC, Langrana NA, inventors; Greenleaf Medical Systems Inc., assignee. U.S. Patent 5,429,140, issued July 4, 1995.



STUART M. DAMBROT is president of Prose Communications, Inc., a California-based firm covering established and emerging technologies.

PHOTOS: Jo Ann Eurell and Janet Sinn-Hanlon (1&2); Mark Martinelli (3)

