
Lab Accreditation Program in sync with CLIA


February 2003
Originally published in CAP TODAY

In its best performance yet, the CAP Laboratory Accreditation Program scored a two percent disparity rate on its Clinical Laboratory Improvement Amendments validation review for fiscal year 2001. Government surveyors conduct the annual follow-up inspections to confirm that accrediting organizations with deeming authority are on the mark in determining whether laboratories meet condition-level CLIA requirements.

"The two percent disparity rate tells us that in two out of 100 times, our inspectors did not uncover a condition out of compliance with a CLIA requirement," explains Peter Mockridge, PhD, director of the CAP Laboratory Accreditation Program.

CAP's two percent variance for 2001 compares to seven percent for the previous year and an overall four percent nonweighted average—far from the 20 percent threshold that would trigger the Centers for Medicare and Medicaid Services to review an accrediting organization's deeming authority under CLIA. "The CAP has always had the lowest disparity rate for a major accrediting program for laboratories," says Ronald Lepoff, MD, chair of the CAP Commission on Laboratory Accreditation.

Equally impressive, CMS validation inspectors found no survey disparities in CAP-accredited laboratories in California, a region in which a large number of labs have been disciplined for CLIA noncompliance. And of the major players in laboratory accreditation, the CAP's program was the only one CMS approved for the maximum six years. "The Joint Commission on Accreditation of Healthcare Organizations got a three-year approval of its program and COLA got a two-year approval," Dr. Mockridge reports.

CMS calculated the CAP's two percent variance rate based on its validation surveys of 50 CAP-accredited laboratories. Of that number, CMS inspectors uncovered two labs with condition-level deficiencies, one of which had been missed by the LAP's team of volunteer inspectors.
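Put as a calculation, and assuming the rate reflects only the one condition-level finding the LAP team missed (consistent with the figures above), the arithmetic works out as:

\[
\text{disparity rate} = \frac{\text{validated labs with a missed condition-level deficiency}}{\text{labs validated}} = \frac{1}{50} = 2\%
\]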

"The condition-level deficiency recorded by CMS validation surveyors suggests the laboratory director was not qualified to direct moderate-complexity testing," Dr. Mockridge says. He believes the laboratory in question may have changed directors without notifying the College in the intervening period between the CAP inspection and the government's validation survey. "The CAP does have its own standards for director qualifications, which are consistent with CLIA, and we review the CVs of the lab directors," he says.

CMS is required to conduct the annual CLIA-mandated validation surveys within 90 days after the accreditation inspection, a window Dr. Mockridge considers too long. Over the past two years, CMS has been performing some validation surveys simultaneously with the accrediting organizations' inspections, and these almost always produce the same results, reports Judy Yost, director of CMS' division of laboratory services.

Rate expected to rise

While the College is pleased with the two percent disparity rate, it expects the rate to go back up for 2002. "That needs to be said," Dr. Mockridge emphasizes, "because we don't want to set up unreasonable expectations for performance." Dr. Lepoff projects the 2002 rate will be around five to six percent, although the final number is still unknown. "We do track the validation surveys and know where we have a disagreement with CMS' findings," he says.

To some extent, the validation survey disparity rate can be the luck of the draw—or it can reflect legitimate discrepancies between the College's findings and those of CMS. In some cases, the College does not concur with CMS' findings or the significance of those findings, "but there's no process to remedy that disagreement," Dr. Mockridge says.

According to Yost, accrediting organizations can question CMS' findings for validation surveys if they disagree with them, but, essentially, CMS' findings stand. However, if an accrediting organization were in danger of losing its deemed status, CMS would consider doing a point-by-point review of its findings, Yost says. "The regulations do include a reconsideration process for that situation," she notes.

"CMS uses a very broad approach in comparing the findings of accreditation organizations' surveys to its own," Yost adds. "We don't nit-pick standards but only look for condition-level—e.g. more serious, comprehensive—discrepancies."

Comparing apples and oranges

Yet, in some ways, comparing CMS' validation reviews and CAP inspections is like looking at apples and oranges. "If generalizations can be made about the difference in focus between CMS and the CAP, it's that the CAP has a lot more questions on its checklist," Dr. Lepoff explains. "And many of these questions are not focused on regulatory issues but rather on quality improvement and on whether labs are practicing state-of-the-art laboratory medicine." By contrast, CMS focuses only on CLIA requirements, though the CAP also checks laboratories for CLIA compliance because it has deeming authority.

The CAP also operates the only accreditation program that inspects the entire laboratory and relies on people who work in a laboratory every day to conduct the inspections. "Other than the four or five medical technologist staff in the CAP central office who assist in inspecting small, mostly physician office labs, all CAP inspectors are unpaid volunteers," Dr. Lepoff says.