College of American Pathologists

CAP Today
January 2009
PAP/NGC Programs Review

Analysis of appealed cases in PT

Roger B. Lane Jr., MD

The CAP proficiency testing program for gynecologic cytology, known as Pap PT, began in 2006. Participants who receive a failing score can appeal the result on any slide in their proficiency testing slide set. A study by Barbara Crothers et al., published this month, details the appeals process and examines the results of all appealed cases from the 2006 Pap PT program, providing insight into trends in appealed cases (Appeals in gynecologic cytology proficiency testing: review and analysis of data from the 2006 College of American Pathologists Gynecologic Cytology Proficiency Testing Program. Arch Pathol Lab Med. 2009;133:44–48).

The study provides a brief history of national proficiency testing for gynecologic cytology in the United States, beginning with the passage of the Clinical Laboratory Improvement Amendments of 1988 and culminating with the implementation of federal regulations 17 years later. The government-mandated diagnostic categories and scoring system are reviewed. The study describes how slides are selected for CAP educational programs, the process of field validation, and the criteria for including slides as challenges in the CAP proficiency testing program.

The Pap PT appeals process begins when a participant contacts the CAP and requests review of a slide. The challenged slide is submitted for masked review by three cytopathologists from the CAP Cytopathology Committee. The referees are given the original information provided to the appealing participant but are masked to the participant’s answer, the reference diagnosis, the other reviewers’ opinions, and the reason for the appeal. For the slide to be upheld, all three referees’ results must exactly match the target interpretation, and the quality of the preparation must be approved unanimously. If the slide fails review, or if even a single referee pathologist deems it to be of poor technical quality, the appeal is granted and the slide is withdrawn from the PT program. Contesting individuals are awarded the missed points for the case, and the results of their PT event are re-graded accordingly.
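The unanimity rule described above can be sketched as a small decision function. The function name, category labels, and data shapes here are illustrative assumptions, not taken from the CAP program:

```python
# Hypothetical sketch of the appeal decision rule described in the text.
# Category labels ("D", "C", etc.) and the function name are illustrative.

def appeal_granted(referee_interpretations, target_interpretation,
                   quality_approvals):
    """An appeal succeeds if any masked referee deviates from the
    target interpretation, or if any referee rejects the slide's
    technical quality; otherwise the slide is upheld."""
    unanimous_match = all(r == target_interpretation
                          for r in referee_interpretations)
    unanimous_quality = all(quality_approvals)
    return not (unanimous_match and unanimous_quality)

# One referee calls the slide category C instead of the category D target,
# so the appeal is granted:
print(appeal_granted(["D", "D", "C"], "D", [True, True, True]))  # True
```

Note that a single dissent of either kind — interpretation or slide quality — is enough to withdraw the slide, which is what makes the review standard strict in the appellant's favor.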

The passing rate for Pap PT in 2006 was 94 percent. Of the 589 participants who failed, 155 (26 percent of failures and 1.6 percent of total participants) appealed the reference category on 86 slides. Forty-five appeals (29 percent) were granted, and 43 of these participants received passing scores. One hundred ten appeals were denied. Appeals were requested most often for slides in CMS category D (HGSIL or carcinoma), and category D also had the highest absolute number of appeals granted. Slides from category C (LGSIL) were appealed infrequently, but those appeals were frequently granted. Slides from category B (negative slides) were appealed frequently but most often denied.
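As a quick arithmetic check, the appeal figures quoted above are internally consistent:

```python
# Verifying the 2006 Pap PT appeal figures reported in the study.
failures = 589   # participants who failed the 2006 event
appeals = 155    # failing participants who appealed
granted = 45     # appeals granted
denied = 110     # appeals denied

print(round(appeals / failures * 100))  # 26 (percent of failures who appealed)
print(round(granted / appeals * 100))   # 29 (percent of appeals granted)
print(granted + denied == appeals)      # True: every appeal was either granted or denied
```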

The study by Crothers et al. discusses these findings and others in more detail, including specific features of the appealed slides and how those features related to the likelihood of a successful appeal. The study concludes with an excellent discussion of the shortcomings and flaws of the current government-mandated proficiency testing system for gynecologic cytology.

Dr. Lane, a member of the CAP Cytopathology Committee, is a pathologist with Southeastern Pathology Associates, Brunswick, Ga.

Performance in the test setting

Roger B. Lane Jr., MD

Do participants interpret the same Pap test slide differently in a proficiency testing program versus an educational program? This question is examined in an article by Jonathan Hughes et al., due out next month (Changes in participant performance in the “test-taking” environment: observations from the 2006 College of American Pathologists Gynecologic Cytology Proficiency Testing Program [PAP PT]. Arch Pathol Lab Med. 2009;133; in press).

The CAP Interlaboratory Comparison Program in Gynecologic Cytology has existed for 18 years as an educational and laboratory accreditation activity for pathologists and cytotechnologists. The CAP Gynecologic Cytology Proficiency Testing Program began in 2006 after approval from the Centers for Medicare and Medicaid Services. The CAP proficiency testing slide sets are derived from field-validated CAP educational program slides, so all slides in the CAP proficiency testing program have previously circulated through the CAP educational program. Many of the same participants are enrolled in both programs.

The article reviews the slide selection procedure for the educational program and describes how a slide becomes field validated. The process by which proficiency testing slides are chosen from CAP educational program slides is also discussed, as are differences in criteria for evaluating participant responses in the two programs. Finally, performance characteristics of slides in the proficiency testing program are compared with historical data for the same slides in the CAP educational program to see if the test-taking environment alters slide interpretation.

Hughes et al. found that pathologists and cytotechnologists in the test-taking environment of the CAP proficiency testing program were more likely to interpret negative slides as positive, and less likely to interpret positive slides as negative, than historical data from the same slides in the CAP educational program would predict. This is most likely because proficiency testing examinees are acutely aware that the CLIA-mandated differential scoring system penalizes under-interpretation more severely than over-interpretation. If examinees encounter a slide they perceive to be borderline between negative and positive, they will (and, given the rules of the examination, should) upgrade their interpretation.
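The incentive the authors describe can be illustrated with a toy expected-penalty calculation. The penalty weights below are hypothetical, chosen only to encode the asymmetry the article attributes to the CLIA scoring system; they are not the actual grading values:

```python
# Illustrative only: hypothetical penalty weights showing why a
# risk-averse examinee upgrades a borderline slide. These numbers are
# NOT the real CLIA grading matrix; they merely capture the article's
# point that under-interpretation costs more than over-interpretation.

PENALTY = {
    ("positive", "negative"): 10,  # calling a positive slide negative (hypothetical)
    ("negative", "positive"): 2,   # calling a negative slide positive (hypothetical)
}

def expected_penalty(call, p_positive):
    """Expected penalty for a borderline slide the examinee judges to be
    positive with probability p_positive, under the weights above."""
    if call == "negative":
        return p_positive * PENALTY[("positive", "negative")]
    return (1 - p_positive) * PENALTY[("negative", "positive")]

# For a 50/50 borderline slide, calling it positive is much cheaper on average:
print(expected_penalty("negative", 0.5))  # 5.0
print(expected_penalty("positive", 0.5))  # 1.0
```

Under any such asymmetric scheme, upgrading the borderline call minimizes expected penalty, which is the rational "defensive" behavior the study observed.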

The article concludes that both cytotechnologists and pathologists appear to employ a defensive strategy in the CLIA-mandated gynecologic cytology proficiency testing program, evaluating a Pap test slide differently than they do when the same slide appears in nonpunitive educational or laboratory accreditation exercises. This difference appears to be a direct result of the artificial testing environment and manifests specifically as over-interpretation of abnormal or potentially abnormal slides. Further evaluation of this phenomenon may help in designing more appropriate measures of cytologist proficiency.

Dr. Lane, a member of the CAP Cytopathology Committee, is a pathologist with Southeastern Pathology Associates, Brunswick, Ga.