This month, the CAP unveils a new version of its Laboratory Accreditation Program checklists, which includes content changes and a new format aimed at improving the laboratory inspection and accreditation process. “It’s probably the biggest set of changes we’ve done all at once since 1961,” says Richard C. Friedberg, MD, PhD, vice chair of the CAP Council on Accreditation.
Extensively revised and redesigned, the new checklists are intended “to improve the consistency, to facilitate the accuracy of inspectors’ activities on site, and to make it easier for labs to prepare for inspection,” says Frank R. Rudy, MD, chair of the Commission on Laboratory Accreditation and former chair of the Accreditation Committee.
All the elements of the new version have been hammered out, debated, tested in the field, refined, and re-trialed, Dr. Rudy notes. “We’ve tried out different examples of checklists using various fonts, and using content changes to greater or lesser degrees. And we’ve surveyed our inspectors and inspectees to find whether the changes would enhance the inspection experience and ability of labs to remain consistently in compliance.”
Some of the changes have to do with the checklists’ “look and feel.”
“We made relatively simple changes in the fonts, used a new layout, and changed the graphics in ways we thought made it easier to follow the checklist and find specific items,” Dr. Rudy says. “We put in subject headers, so there’s a method to provide a phrase that gives the key concept of the requirement. That makes it easy for the inspector to very quickly identify the intent and nature of a specific checklist item. It facilitates the use of the checklist by less experienced inspectors.”
Across the board, one of the most noticeable changes is the conversion of checklist items from questions to declarative statements. “We think it’s a little easier to understand a requirement in declarative form as opposed to a question,” says Stephen J. Sarewitz, MD, chair of the Checklists Committee. “It’s certainly easier to write a complex statement in that form, and to make it concise and clear.”
Two substantive additions to the checklists are Evidence of Compliance and R.O.A.D. (Read, Observe, Ask, Discover). R.O.A.D., a framework to help inspectors with their reviews of labs, has been used on a trial basis but will now become officially incorporated into the checklists.
Evidence of Compliance, or EOC, addresses a gap that some inspectors and laboratories had observed over the years. “At times, inspectors and laboratories have questions about exactly what documents or proof must be provided to show the requirements have been met, and for many requirements, there is more than one way you can comply,” Dr. Rudy says. Members and staff identified a number of checklist items that needed elaboration. So Evidence of Compliance highlights the specific policies, procedures, records, reports, charts, or other materials that are needed to show the lab is in compliance.
One example would be the checklist item for QC data in hematology. “What, as an inspector, do you need to look for? You need a written procedure to define how you monitor the analytic precision, including the statistical analysis of data so you can determine whether or not the testing is within control limits. And you’ll need the QC records to make sure there was regular monitoring by the lab of the QC results. If there were situations—as occurs in all labs—in which the testing was outside parameters, you need evidence that appropriate action was taken,” Dr. Rudy says.
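The control-limit monitoring Dr. Rudy describes can be sketched in a few lines. This Python snippet is illustrative only: it assumes simple mean ± 2 SD limits computed from prior runs, whereas real laboratories use validated QC software and formal rule sets such as the Westgard rules.

```python
from statistics import mean, stdev

def qc_within_limits(history, new_value, k=2.0):
    """Flag a QC result that falls outside mean ± k*SD of prior control runs.

    Returns (in_control, (lower_limit, upper_limit)).
    """
    m, s = mean(history), stdev(history)
    lower, upper = m - k * s, m + k * s
    return lower <= new_value <= upper, (lower, upper)

# Hypothetical daily hemoglobin control values (g/dL) from prior runs
history = [13.1, 13.0, 13.2, 12.9, 13.1, 13.0, 13.2, 13.1]

in_control, limits = qc_within_limits(history, 13.05)
# A result outside the limits would trigger documented corrective action
out_of_control, _ = qc_within_limits(history, 14.0)
```

A lab's written procedure would define both the statistical method and the action taken when a result falls out of control, which is the evidence an inspector looks for.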
Not every checklist requirement needs an Evidence of Compliance component; for example, a requirement that there be a written policy on some issue is self-evident. But Dr. Sarewitz explains how Evidence of Compliance will clarify some checklist requirements. “In almost every checklist there is an item on PT participation. For example, in chemistry it’s item 10000: ‘The laboratory participates in appropriate required CAP Surveys or another PT program accepted by CAP.’” Now there is a new Evidence of Compliance field that says records are required, such as a CAP order form or purchase order indicating the lab is enrolled in proficiency testing for all analytes on its activity menu for which the CAP requires PT, or a record of completed submitted results for those analytes. “That wasn’t there at all before,” Dr. Sarewitz says.
Beyond just a list of documentation, Evidence of Compliance “is providing to inspectees a greater understanding of exactly what lab practices they need to conduct,” Dr. Rudy emphasizes. “The key thing is that you are, in fact, doing very specific scientific activities, and it’s the performance of those activities that makes certain one is carrying out the checklist requirements.”
From time to time, inspectors have been finding inadequate documentation of checklist compliance. “It happens with some frequency. But we expect that Evidence of Compliance will not only facilitate the inspection, it will also bring greater consistency across lab practice and greater consistency in the inspections themselves,” Dr. Rudy says.
Will Evidence of Compliance mean extra paperwork? Quite the reverse, Dr. Sarewitz says. EOC will make it easier for labs facing inspection to gather materials. “It really doesn’t add any burden. In fact, it might save some labs trouble because they’ll know what to provide, instead of using documents that don’t provide good evidence.”
Inspectors have already become familiar with “Read, Observe, Ask, Discover” in its trial phase, but now the R.O.A.D. acronym will be a permanent fixture of the checklists. “These are instructions to the inspector as to how to evaluate compliance in various areas of the checklist,” Dr. Sarewitz explains. Rather than review each individual requirement, CAP inspectors are encouraged to focus on the inspector instructions for a group of related questions. Once inspectors identify areas of concern through R.O.A.D., they are asked to “drill down” to more specific requirements when necessary, and review more details outlined in the prescriptive Evidence of Compliance statements.
The motivation behind R.O.A.D., Dr. Rudy says, was to help streamline the inspection approach and provide greater consistency in inspections. “We hope it provides greater efficiency and allows inspectors, at times, to look more at the big picture, rather than necessarily at just individual requirements.”
For R.O.A.D., icons have been added as a graphic aid. “In the inspector’s version of the checklist, R.O.A.D. instructions are scattered in strategic parts,” Dr. Sarewitz says. “What is new are the icons to help draw the inspector’s eye. It doesn’t affect the content of the checklist requirements but it’s a little more information for the inspector.” A short subject header sits above each item to help inspectors quickly locate the one they are seeking.
Aside from the broad-based changes, there are other content updates throughout the checklists that have “revised” flags. “These flags are designed to key labs into changes where they may need to change their procedures,” Dr. Sarewitz explains. He talked with CAP TODAY about the most significant revisions:
The checklists currently require review and approval of new and substantially changed procedures by the laboratory director or a designee qualified to be a director. This requirement corresponds to a CLIA provision. The CAP is now engaged in discussions with the CMS about whether review of these new and changed procedures needs to be limited to the director only. At CAP TODAY press time, no decision had been reached. No revision is being contemplated for the separate CAP requirement for annual review of all procedures (for example, CHM.11100), which may be fulfilled by the director or the director's designee, he says.
In the previous checklist for chemistry, the term “clinically reportable range” was used in reference to quantitative analytes that are reported with allowance for dilution or concentration. “That term is not going to exist anymore,” Dr. Sarewitz says. “Instead, there are two new checklist items on dilutions and concentrations of specimens. One will simply state that if the value of the analyte or concentration of analyte in a patient specimen is outside the analytic measurement range, then it must be diluted or concentrated so that it can be brought into the range if a quantitative result is to be reported.” If it chooses, the lab can just say the result is greater than or less than the limits of the analytic measurement range. The second new checklist item says the lab must determine the maximum degree of dilution or concentration that is allowable for each applicable analyte. “The reason is that after a certain point, dilution or concentration affects the accuracy of the method, and it’s no longer reliable. Those maximums need to be reviewed every year to ensure they are reasonable.”
The last edition of the checklist said not every element of competency assessment had to be evaluated. Although the CMS has “gone back and forth a little” on competency assessment, Dr. Sarewitz says, it is now saying that for nonwaived tests, “all six elements have to be evaluated at every assessment.”
But changes are being made to these checklist requirements (GEN. 55500) that should make inspections a little easier. “There is an explicit statement that certain ongoing activities may be used for competency assessment,” he says. “For example, if a supervisor has observed an individual recording results on worksheets, or doing instrument maintenance, just over the normal course of work, and it’s satisfactory, that would be acceptable as a competency assessment element. The supervisor could just check off that the element had been observed and is okay.”
The CAP does require competency assessment for waived tests even though the CMS does not. However, all six elements need not be assessed at each event, so that provision of the checklist does not change.
A new checklist item on training has been added to emphasize that training is separate from competency assessment. “If individuals are trained to do a certain test or method, then they must have a competency assessment event prior to starting patient testing. That is separate from training. Then in the first year of employment, they would have another competency assessment event at six months, then annually thereafter. That requirement has always been there; it just hasn’t been explicit enough.”
Another requirement added to the quality management elements is that for the error-monitoring program (GEN.20262), the lab must keep occurrence records and error logs. "I think most labs do this anyway," Dr. Sarewitz says. "But it's important enough to warrant a requirement that the lab keep logs and must do an appropriate review and follow-up of the logs to find patterns of errors, to determine if system changes can be made to reduce the incidence of errors."
One element is not new but should be mentioned, Dr. Sarewitz says: the CLIA requirement that every instrument or method in the lab be compared twice a year (CHM.13800). "Before 2009, we stated that the requirement applied only to quantitative tests, but CMS said no, it applies to both qualitative and quantitative tests." There has been a lot of unnecessary concern about this requirement, he says. "It's important to note that it does not require the two methods you are comparing to give the same result. It's just that for instruments within the same CLIA number, the lab has to determine the relationship between the two. In terms of how many samples need to be run and what the acceptability requirements are, that's entirely up to the lab director." The requirement does not apply to waived tests, he adds, though the lab may choose to compare a waived method within a single CAP number.
The requirements on temperatures of refrigerators or freezers where reagents are stored (CHM.12333, CHM.24700) are now more explicit as to recording of the temperatures, Dr. Sarewitz says. "That temperature must be checked and recorded every day, and the requirement is that the number has to be written down, or there has to be a mark on a graph where you can look over and see the number. You can't just put a check mark and say it was okay. There also has to be documentation of who did the check, and initials are acceptable for that."
Another new element is that automated temperature recording devices and remote devices are acceptable, but the lab must have access to the data at all times and the functionality of the system must be checked daily.
Two new checklist items in the Laboratory General checklist bear directly on the quality of data transmission across computer interfaces, Dr. Sarewitz says. One requirement is that the lab must verify the accuracy of data transmitted across interfaces (GEN.48500). The other is that the lab director, or designee with director qualifications, must approve the content and format of reports (GEN.41067). "We've made some changes to make these two items match a little better and make it a little easier for people." A key question was how frequently the accuracy checks need to be done. "It was formerly 'periodically,' but we're changing it so that both checks must be done every two years. That's the same as the requirement to check calculations that are done by instruments or computer systems. So all three requirements are going to be the same to make it easier." The calculation requirement applies only to calculations that are modifiable by the user.
There was considerable discussion, he notes, on how far down the interface chain the checks must be made. “It could go from the laboratory information system, to the hospital system, to a clinic system, to the PDA carried by the physician—but there is a practical limit to how far down the interface chain a laboratory can check data transmission and reports, particularly for interfaces and systems over which it has no control. So we’ve refined the previous requirement. The checks are limited to the first downstream system in which the clinician can be expected to access the data, and the interfaces to this system.”
For a hospital laboratory, that would generally be the HIS, he says. “For a commercial lab that might have numerous interfaces to various offices, the lab does need to check every different interface, and this can be documented by a screen shot or electronic records.”
As to checks of accuracy, the revised checklist includes specific types of reports that should be verified and recommends that two examples of each type be checked. For example, for new interfaces, the lab should validate at least two examples of surgical pathology reports, cytopathology reports, a clinical lab test report like electrophoresis, a quantitative lab report like routine chemistry or hematology, a qualitative lab report such as serology or microbiology, and corrected reports. For repeat validations, the laboratory would select four of these report types and validate those.
Checklist items addressing many aspects of direct-to-consumer testing had not yet been finalized at CAP TODAY press time, but one requirement (GEN.41465) was deleted from this round of checklist changes, Dr. Sarewitz says. Until now, the Laboratory Accreditation Program has required that the laboratory send a test result to the health care provider of the consumer's choice, if the consumer requested it. That provision has turned out to be impractical. "You could have a health fair with hundreds and hundreds of people, and to send that many results out presents a problem. Second, there is at least one state where you need permission in advance from the clinician prior to sending the result. Also, situations have arisen in which the result is sent to a physician with whom the consumer has no relationship. So we felt it would be better to remove that requirement." However, the lab is still required to provide a health care professional whom the consumer can contact if there's a problem.
There are revisions to the quality control requirements for certain nonwaived tests (in CHM.13900, for example). The revisions involve tests for which daily QC can be limited to internal (built-in) controls (that is, the laboratory does not need to run external controls each day of patient testing).
First, daily QC can be limited to internal controls for FDA-cleared or -approved tests classified as moderate or high complexity. “Before, for high-complexity tests, except in molecular microbiology, external controls had to be run each day of testing,” Dr. Sarewitz says.
Second, before relying on internal controls as the only daily QC, laboratories must perform a validation study by running external controls for at least 25 samples. In the 2009 checklist, there was no minimum sample size for the validation study, except for internal liquid controls, for which there was a minimum sample size of 20 samples. “Now there is no distinction between liquid and other types of internal controls. All need the 25-sample minimum,” Dr. Sarewitz says. If the laboratory uses multiple identical devices, the 25-sample minimum applies only to validation of the initial device. The number of samples for the validation of the remaining devices is at the discretion of the laboratory director.
Third, external controls must be run in some additional situations: after major system maintenance, after software upgrades, and at least every 30 days (unless the manufacturer requires more frequent external QC).
There are two big areas of change in anatomic pathology, Dr. Sarewitz says. The first is how to report cancer cases. “In the 2009 checklist, the requirements had been changed to be consistent with the American College of Surgeons Commission on Cancer. But it was decided that the LAP really couldn’t be the vehicle to enforce that, for two reasons.” Many CAP-accredited labs are not in institutions with certified cancer programs, so some of what was required was unnecessary for them. “Second, we want to limit the LAP to those parameters that are really important to quality testing and patient safety. There has to be a different program to look at complete compliance with COC accreditation.”
In line with this thinking, “we first removed the requirement that the lab has to audit its pathology reports to ensure that all scientifically valid data elements are included. That requirement is totally gone. There already is a standing checklist item that all data elements required are included in the surgical pathology report, and our feeling is that once a lab sets up the system to follow that, it’s pretty much on auto-pilot.”
The LAP also retained the “phase zero” checklist item (meaning the accreditation program doesn’t require it at this time) that protocols should be reported in synoptic format. “We do want information on what percentage of labs are doing synoptic reporting, and in general it’s a recommendation, but there’s not universal belief that it’s the best method to report results. In our practice, we do it that way. But others don’t, and that’s fine.”
But changes were made to this checklist item. “It previously said that every individual element of the synoptic report had to be in a separate line. You could not say ‘left breast, invasive cancer.’ You would have to say ‘laterality: left, organ: breast,’ etc. That’s removed, because that generates a report that’s extremely difficult to read, and the idea is to communicate the result to clinicians in a way that avoids misunderstanding.”
The checklist now requires that the lab use the current version of the cancer reporting protocols, though it may use previous versions for up to eight months after publication of a new edition to allow for the time lag that occurs in accreditation, he says. “Labs receive the checklists they are going to be inspected against quite a number of months in advance, so there still could be some labs, after the date of publication of new protocols, that are getting old checklists.”
All required data elements from the cancer protocols must be included in the cancer reports, but the inspector is instructed to cite the lab only if there is a pattern of repeated failure to include all required items. An occasional example of noncompliance on this checklist item will not be cited.
The new guideline for estrogen and progesterone receptor testing was released in April and is now part of the 2010 checklist edition, Dr. Sarewitz says. “We ran the guidelines by the Immunohistochemistry Committee and then refined them for accreditation purposes.”
The key requirement is a uniform scoring system. “The lab will need to define a positive test as one percent or greater positive-staining cells. There’s variation in that now, and it’s a problem, because patients go from one lab to another, from one hospital to another, and they have a result that’s estrogen positive in one hospital and negative in another.”
Labs now need to report both the percentage of cells that are immunoreactive and the intensity of the staining. “That intensity can be stated as just a subjective comment such as weak, moderate, or strong, or it can be an actual numerical score,” Dr. Sarewitz says. “There are a number of situations in which the laboratory should suggest that the result could be questioned and repeat testing considered. These would most often be if you have negative results in situations where there’s some question as to whether it’s a true negative. For example, well-differentiated, low-grade carcinomas are usually positive, and if a receptor assay turns out to be negative, the lab might question that. Or if there are no normal ducts in the specimen to serve as positive internal controls and there is a negative result, you could not exclude a problem with the reactivity of the tissue—though this would apply only to primary breast biopsies.”
Proficiency testing for PR and ER testing is available this year and required in 2011. There is a separate document on validation of ER and PR testing, Dr. Sarewitz notes, but the timing of its release was such that the 2010 checklist will not include specific validation requirements. The ER and PR tests, however, are still subject to the generic validation requirements of the IHC section of the checklist.
In the last two editions of the checklists, at the CMS' request, the accreditation program distinguished between two types of macroscopic tissue examination: processing and grossing, Dr. Sarewitz says. "For non-pathologists who assist at grossing, the individual had to be qualified as high-complexity testing personnel under CLIA. But for processing, that was not the case because processing—defined as macroscopic examination of simple specimens requiring no sampling or dissection—is much simpler."
Late last year, he notes, the CMS determined that for all macroscopic tissue examination, the individual had to be qualified as high-complexity personnel. "So essentially, the entire concept of processing goes away," he notes. While this change may cause some difficulty in some labs, "I think sometimes people don't quite understand the requirements for high-complexity personnel. It does not mean a bachelor's degree is required—just an associate's degree in science or a certain number of semester hours in various sciences" (details are in the Anatomic Pathology checklist, ANP.11610).
The new version of the accreditation checklists won’t be the last word. “This is a work in progress,” Dr. Rudy says. “As we get feedback from inspectors and inspectees, we will make modifications.”
Because of the growing complexity in the laboratory—including more complex tests, new disciplines like molecular pathology and genetic testing, added regulations—the checklist requirements are inevitably getting more complicated and longer, says Dr. Sarewitz.
“In the old days, our former chairman had a rule that if anyone wanted to add a checklist requirement, we had to find one to get rid of, so the list wouldn’t get any bigger. Unfortunately, the world has changed, and we just can’t keep to that ideal paradigm anymore. But we’re making a concerted effort to keep the checklists as clear and user-friendly as possible.”
Anne Paxton is a writer in Seattle.