College of American Pathologists

Making AP count with computer-assisted imaging

CAP Today, September 2008
Feature Story

Anne Paxton

It’s been over a year since Baystate Health started integrating computer-assisted imaging analysis, or CAIA, into the Department of Pathology. And there is one thing department chairman Richard Friedberg, MD, PhD, can definitely attest to about CAIA or any other new technology: “You don’t just pop them in and turn them on.”

Presenting with Liron Pantanowitz, MD, Baystate’s director of pathology informatics, at the CAP Futurescape conference in June, Dr. Friedberg described their talk as a “reality check” on CAIA. “There’s a learning curve any time you adopt a new technology. You have to figure things out, and that’s very important.”

That computer-assisted imaging tools are less readily accepted by pathologists over 45 years of age is one of several potential obstacles. But Drs. Friedberg and Pantanowitz said the use of computer analysis techniques in a decentralized arrangement can improve reproducibility and decrease the effects of observer biases in quantitative immunohistochemistry. CAIA, they said, is a prime example of anatomic pathology’s transition to a more quantitative basis.

With a 22-pathologist department, Baystate sees about 50,000 surgical specimens a year and performs up to 7 million laboratory tests. “As a rule of thumb, we see about one new breast cancer case a day, so there’s enough new stuff coming through for us to apply these tools and really do something with them,” Dr. Friedberg said.

Unlike clinical pathology with its reliance on numbers, anatomic pathology is still at a semiquantitative stage, but it is evolving, he said. “We’re getting away from the idea that this is a bad cell because I sat next to somebody who told me it was a bad cell—that sort of guild mentality with anointed experts—and moving more toward quantitative, reproducible, validated, specific, and reliable approaches.”

Pointing to the current effort to make the HER2/neu immunohistochemical test a quantitative one, Dr. Friedberg said, “What we’re seeing, in essence, is that stains are becoming assays, and that really is the difference between a quantitative and qualitative perspective.”

“We’re having debates among ourselves as to whether the FISH approach, counting gene copies, is a better way of looking at HER2/neu expression than the IHC approach of seeing what proteins are on the surface. But in general, we’re beginning to see results that are directly tied to treatment, not just prognosis,” he said. “We’re trying to find technologies to use in anatomic pathology to predict and dictate treatments. And that is actually more and more of a clinical pathology perspective.”

A second key trend is that anatomic pathology is evolving along radiology and imaging lines. Because radiology acquires its images digitally, it can process them, put them in series to make them move, and change settings. In anatomic pathology, where digital microscopic images are still being acquired optically, “we don’t have that luxury yet. We have to photograph photons coming through a stained piece of tissue.” As pathology transitions to digital image acquisition, “then you can do all the digital processing afterwards that the radiologists have such fun with.”

Ironically, hospitals didn’t make the move to digital radiology for quality purposes, pointed out Dr. Friedberg, who was involved with early digital radiology systems from 1996 to 2001; the quality gains were an unintended consequence. “The way you sold digital radiology to hospital administrators was to show them they could get rid of the file room. With a 24/7/365 operation, you no longer had the cost of the people to fetch films, the cost of the film and silver and developing chemicals and space.”

Pathology will benefit from some of the same techniques that have evolved from digital radiology, such as coincidence imaging. “Most pathologists are not accustomed to taking two images and overlaying them.” For radiologists, however, it has become standard. “Not only can you take a CT scan, but you can superimpose on it a coregistered PET scan to determine if the targeted area is PET positive.”

Can pathologists do that? With spectral imaging or quantum dots, Dr. Friedberg says, “we probably can. You can see whether the exact same cell is HER2/neu positive or ER positive or PR positive. With additional techniques, you can effectively remove a component, change the contrast, and so on. Those are great tools. Imagine if you were looking at a lesion saying ‘Let me remove the stroma’ and so on; you’d have different ways of looking at the same tissue.”
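The two operations described above, superimposing a coregistered image and “removing a component,” can be sketched in a few lines. This is a toy illustration with hypothetical helper names, not any vendor’s API; real spectral unmixing separates stains by their absorption spectra rather than by raw RGB channels.

```python
import numpy as np

def blend(base, overlay_img, alpha=0.4):
    """Superimpose a coregistered image (e.g., a stain channel or a
    PET map) on a base image by simple alpha blending."""
    out = (1 - alpha) * base.astype(float) + alpha * overlay_img.astype(float)
    return np.clip(out, 0, 255).astype(np.uint8)

def remove_channel(image, channel):
    """Crudely 'remove a component' by zeroing one color channel.
    True stain separation (color deconvolution) is more involved."""
    out = image.copy()
    out[..., channel] = 0
    return out
```

With two coregistered images of the same tissue, `blend` produces the kind of fused view radiologists use for PET/CT, and `remove_channel` hints at how a stromal or stain component could be suppressed from view.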

Dynamic images will also become common. “The next time you see a radiologist trying to look at 1,600 CT images, you’ll know why they prefer to see them as a movie. It’s a lot easier to look at it that way.”

As the tools of radiology and pathology converge, especially in the area of breast cancer and immunohistochemistry, “it doesn’t necessarily mean that pathologists are going to be reading chest x-rays in six years. But the fields are moving closer together because the technology is aligning,” he says. Eventually, all “image-based” pathologists will use computer-assisted analytic tools to assay specimens, and intelligently designed PACS (picture archiving and communication systems) will revolutionize pathology workflow, Dr. Friedberg predicts.

Under the currently accepted gold standard for determining breast markers like estrogen or progesterone receptors and HER2/neu in breast cancers, pathologists provide a semiquantitative score by evaluating immunostained tissue on a slide or perhaps even a digital image, Dr. Pantanowitz said.

But in borderline cases, or where the cases are weakly stained, the intra- and interobserver reproducibility of reporting out these immunostains worsens. Moreover, although scoring criteria exist, the postanalytical interpretation is still somewhat subjective, he said. Can computer-aided analysis give everyone the same yardstick with which to score their breast cancer cases?
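As an illustration of the kind of semiquantitative scoring criteria in use, the published Allred scheme for ER/PR combines a proportion score with a staining-intensity score. The cut-offs below follow the published Allred bins; this is general context, not a description of Baystate’s specific protocol.

```python
def allred_proportion_score(pct_positive):
    """Proportion component of the Allred score for ER/PR
    immunohistochemistry (cut-offs per the published scheme)."""
    if pct_positive == 0:
        return 0
    if pct_positive < 1:
        return 1
    if pct_positive <= 10:
        return 2
    if pct_positive <= 33:
        return 3
    if pct_positive <= 66:
        return 4
    return 5

def allred_total(pct_positive, intensity):
    """Total Allred score (0-8): proportion score plus intensity
    score (0 none, 1 weak, 2 intermediate, 3 strong)."""
    p = allred_proportion_score(pct_positive)
    return 0 if p == 0 else p + intensity
```

The subjectivity Dr. Pantanowitz describes lives in the inputs: two pathologists eyeballing the same slide may estimate different percentages and intensities, which is exactly what a computed percent-positivity aims to standardize.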

Early studies from 1992 showed that manual scoring by pathologists was better than computer-assisted analysis. Later, a few studies showed that manual analysis and CAIA were comparable, but most of the data since 2004 have shown that computer-assisted analysis is better: more effective, more consistent, and more precise, Dr. Pantanowitz noted.

Pathologists should take into account certain potential drawbacks before moving to CAIA, he said. For example, “the expense of a computer-assisted system may be hard to justify if volumes are low. It also requires commitment from the pathologists and may be a time-consuming process. Small amounts of tissue may generate erroneously low results. And studies have found that nonspecific staining may cause some interference with the analysis, as may artifacts such as dust on the slide.”

The algorithm for object-oriented image analysis is morphology-based, Dr. Pantanowitz explained. Using steps such as color normalization, background extraction, segmentation in the image, classification, and selection, “what it’s trying to do is separate out tissue elements, such as the tumor epithelium from stroma. That permits selection of regions of interest and filtering out of unwanted areas. You can then count selected cells to obtain a quantitative result, and thereby a measure of protein expression.”
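The pipeline steps Dr. Pantanowitz lists can be sketched in miniature. The toy function below (an assumption for illustration, not BioImagene’s Pathium algorithm) does background extraction by excluding near-white glass, crude classification by flagging pixels where the brown DAB chromogen dominates, and reports the stained fraction of tissue as a rough proxy for protein expression; real systems segment whole cells and nuclei, not pixels.

```python
import numpy as np

def score_ihc(image, stain_threshold=0.25, tissue_threshold=0.9):
    """Toy sketch of a CAIA scoring pipeline on an RGB uint8 image.
    Returns the fraction of tissue pixels classified as DAB-stained."""
    img = image.astype(float) / 255.0
    # Background extraction: near-white pixels are glass, not tissue.
    brightness = img.mean(axis=-1)
    tissue = brightness < tissue_threshold
    # Crude classification: DAB stain is brown, i.e., red and green
    # channels well above blue.
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    stained = tissue & ((r + g) / 2 - b > stain_threshold)
    n_tissue = tissue.sum()
    if n_tissue == 0:
        return 0.0
    return stained.sum() / n_tissue
```

Even this sketch shows why region selection matters: the result depends entirely on which pixels are admitted as “tissue of interest,” which is the filtering step the article describes.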

This type of CAIA system was implemented and validated at Baystate. “We have distant medical centers, some as far as an hour’s drive away, and we have pathologists there who interpret and score breast markers,” Dr. Pantanowitz said. “We have a significant breast immunohistochemistry caseload, with more than one run a day.”

But Baystate did not want to use CAIA to centralize interpretation. “We didn’t want only one pathologist restricted to doing all the work. Also, there were bandwidth considerations to take into account.”

After upgrading everyone’s computers to handle multimedia and attaching the same model of digital camera to each pathologist’s microscope, Baystate put BioImagene’s Pathium product on its server. “Then we trained everyone and began our validation.”

“The system is complicated,” Dr. Pantanowitz said, but it was rolled out at Baystate and designed to mimic the real workflow in the department. “Digital images of the control immunohistochemical slides were acquired by an advanced technologist who entered the images along with patient data into the computer system.” Christopher N. Otis, MD, director of surgical pathology and immunohistochemistry at Baystate, then validated the controls before pathologists received their slides with stained breast markers to image and analyze based on the control for that run.

Baystate quickly found that there is a need to standardize the acquisition of digital images. “Even though our own informatics team installed all these cameras and uniformly calibrated them, we soon found out that these cameras all took different quality images, even of the exact same material. So we had to make sure, in moving toward image analysis, that all cameras were calibrated, not on a weekly or daily basis, but on every case to be imaged,” he said. They also asked participating pathologists to manually score their cases. “As a result, we were able to compare their manual score versus CAIA and determine how long each method took. A number of HER2/neu cases had FISH performed.”

“We found that the results for ER and PR were mostly concordant. Pathologists’ CAIA and FISH results for HER2/neu cases were also all in concordance.”

Tissue heterogeneity with respect to immunostaining presented challenges. “When looking at an immunohistochemical-stained slide, not all areas stain equally; some areas are often more positive than others. So where do you score, photograph, analyze? The literature wasn’t very helpful in this regard.”

Baystate is now looking at ways to improve the digital image analysis process. “We feel that with adoption of a virtual workflow system like whole-slide imaging,” Dr. Pantanowitz said, “it will be much easier to standardize. It would also be helpful if a one-click process were available for analysis, including deciding which areas were appropriate to analyze.”

It’s clear that pathologists need to be involved in this whole process if they’re going to adopt it, he noted. The companies, too, have to coordinate their efforts. “Ideally, imaging systems need to be integrated into our AP LISs so that we can improve the workflow, being able to readily share data and images between disparate systems. Also, digital image algorithms need to incorporate some degree of artificial intelligence, so they can learn from pathologists feeding back into the system.”

But more clinical outcome studies are needed, Dr. Pantanowitz said. “At the end of the day, does it matter if a little more accuracy gained by using these computer-assisted imaging analysis systems doesn’t really affect patient outcome?” he said. “We need to know how our patients respond to their targeted therapy. In other words, we need prospective clinical validation of this novel technology.”

Anne Paxton is a writer in Seattle.
© 2014 College of American Pathologists. All rights reserved.