College of American Pathologists

  Clinical Abstracts





January 2009

Michael Bissell, MD, PhD, MPH

Use of unbound bilirubin versus total bilirubin to predict cytotoxicity
Blood donor screening for parvovirus B19 in Germany and Austria
Link between evolutionary and physiological variation in hemoglobin
Predictive biomarker in rheumatoid arthritis
Use of MRSA risk factor screening to improve surveillance culturing
Use of a reverse line blot hybridization assay for fungal identification

Use of unbound bilirubin versus total bilirubin to predict cytotoxicity

Plasma levels of unconjugated bilirubin are elevated in almost all newborn infants. In some infants with markedly elevated levels, bilirubin causes neurotoxicity, sometimes resulting in permanent neurologic dysfunction. Management guidelines for jaundiced term and near-term infants, published by the American Academy of Pediatrics, are based on the premise that total serum bilirubin concentration (BT) is the best predictor of risk for bilirubin-induced neurologic damage (BIND). However, clinical evidence has indicated that BT beyond a threshold value of 20 mg/dL is a poor discriminator of individual risk for BIND. Because more than 99.9 percent of total plasma unconjugated bilirubin (UCB) (BT) is bound to albumin or apolipoprotein D, and only unbound bilirubin can enter the brain across an intact blood-brain barrier, the level of unbound free bilirubin (Bf) should provide a more accurate indication of the risk of kernicterus. In jaundiced newborns, plasma Bf levels at any given BT or BT/albumin ratio can vary widely due to varying concentrations of albumin and apolipoprotein D, differences in the binding affinity, or the presence of inhibitors of binding. Therefore, Bf cannot be predicted from the concentrations of BT and albumin in plasma or culture medium and must be measured directly. A modified, enzymatic peroxidase method has been developed to measure Bf in plasma and tissue culture media with minimal dilution of the sample. The authors conducted a study to directly test the hypothesis that Bf measured with the peroxidase method, rather than BT, predicts the toxicity of UCB in several cell lines under a variety of incubation conditions, using tetrazolium reduction to assess cell viability. The authors specifically assessed in vitro cytotoxicity in four cell lines exposed to different Bf concentrations obtained by varying BT/albumin ratio, using serum albumins with different binding affinities, or by displacing UCB from albumin with a sulphonamide. 
They assessed Bf by the modified, minimally diluted peroxidase method. The authors found that cytotoxicity varied among cell lines but was invariably related to Bf and not BT. Light exposure decreased toxicity in parallel with a decrease in Bf. In the absence of albumin, no cytotoxicity was found at a Bf of 150 nM, whereas in the presence of albumin, a similar Bf resulted in a 40 percent reduction in viability, indicating the importance of total cellular uptake of UCB in eliciting toxic effects. The authors concluded that in the presence of albumin-bound UCB, bilirubin-induced cytotoxicity in a given cell line is predicted by Bf, irrespective of the source and concentration of albumin or total bilirubin level.
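The authors' point that Bf at a given BT and albumin level depends strongly on binding affinity, and so must be measured rather than predicted, can be illustrated with a simple single-site equilibrium model. The following sketch is purely illustrative: the function name, the Kd values, and the one-site model itself are our assumptions, not the authors' peroxidase method.

```python
import math

def free_bilirubin(bt_um, alb_um, kd_um):
    """Predict unbound bilirubin (Bf, in uM) from a single-site
    equilibrium Alb + B <-> Alb:B with dissociation constant Kd.
    Mass balance gives the quadratic
        Bf^2 + (Alb - BT + Kd)*Bf - Kd*BT = 0,
    solved here for its positive root (all concentrations in uM)."""
    b = alb_um - bt_um + kd_um
    return (-b + math.sqrt(b * b + 4.0 * kd_um * bt_um)) / 2.0

# Typical jaundiced-newborn values: BT 20 mg/dL ~ 342 uM, albumin
# 3.5 g/dL ~ 527 uM. The Kd values below are illustrative only;
# small shifts in affinity move the predicted Bf several-fold.
for kd in (0.005, 0.01, 0.05):
    bf_nm = free_bilirubin(342.0, 527.0, kd) * 1000.0
    print(f"Kd = {kd} uM -> predicted Bf ~ {bf_nm:.0f} nM")
```

Because the effective affinity in a given plasma sample is unknown (and binding inhibitors may be present), this kind of prediction is unreliable, which is precisely why direct measurement of Bf is required.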

Calligaris SD, Bellarosa C, Giraudi P, et al. Cytotoxicity is predicted by unbound and not total bilirubin concentration. Pediatr Res. 2007;62:576–580.

Correspondence: Dr. Sebastian D. Calligaris


Blood donor screening for parvovirus B19 in Germany and Austria

Parvovirus B19 was detected for the first time in 1975 in a blood product from a healthy donor. During the onset of B19 infection, virus concentration can increase to as much as 10¹⁴ virions/mL. Because B19 is a nonlipid-enveloped viral pathogen, inactivation methods, such as solvent detergent treatment, are ineffective for reducing virus concentration in plasma. Most infections occur in childhood and result in a mild rash and formation of protective antibodies. Infection normally results in seroconversion with neutralizing immunoglobulin G (IgG) antibodies, affording lifelong protection from reinfection in most cases. However, chronic infection may be associated with a poor antibody response. Screening for B19 DNA by minipool real-time nucleic acid amplification technology (NAT; testing in donor pools of up to 96 samples per pool) was introduced into the authors’ blood donor screening protocol in April 2000. The authors conducted a study involving 2.8 million blood donations from Germany and Austria that were screened for B19 by real-time minipool NAT. A subgroup of 50 B19 DNA-positive donors was screened for B19 IgG and IgM antibodies and B19 DNA during a six-month period. Results were compared with those for 100 B19 DNA-negative donors. The authors found that data accumulated from 2000 to 2006 indicated a high-incidence period from May 2004 to January 2006. In total, the incidence was 12.7 per 100,000 donations for high virus loads of 10⁵ IU/mL or greater and 261.5 per 100,000 donations for virus loads below 10⁵ IU/mL. Median virus concentration in the case group was 4.85 × 10⁷ IU/mL at time point T0 and was reduced to 4 × 10² IU/mL at the time of the next donation (three months later). Neutralizing antibodies (VP2) were detected in all donations in which virus load had fallen below 10⁵ IU/mL. The authors concluded that release of B19 DNA-positive blood products with a concentration of less than 10⁵ IU/mL is considered safe because of the high level of neutralizing VP2 antibodies. 
Blood products with a high B19 DNA concentration (10⁵ IU/mL or greater), some of which did not contain neutralizing antibodies, were discarded to protect at-risk individuals.

Schmidt M, Themann A, Drexler C, et al. Blood donor screening for parvovirus B19 in Germany and Austria. Transfusion. 2007;47:1775–1782.

Correspondence: Dr. Michael Schmidt


Link between evolutionary and physiological variation in hemoglobin

Physiological mechanisms responsive to the environment may enable rapid and reversible variation in phenotype without a change in the genotype. On longer time scales, such as generations, mutations can alter the genotype and, therefore, permanently alter the phenotype. Despite the mechanistic differences in how physiological and evolutionary (genetic) variations arise, both may act in similar ways and on similar molecular targets to change the phenotype. For example, response to the environment can change the activity of an enzyme by a posttranslational modification of an amino acid, such as phosphorylation of serine. A stable change in environmental conditions will result in a physiological adaptation that will allow a significant proportion of the population to survive, even at reduced fitness. Although qualitative discussions of the relationship between evolutionary and physiological variations abound, there has been little quantitative analysis. To this end, the authors conducted a study to quantitatively evaluate the relationship between physiology and evolution by putting this discussion into a very specific biological context that is amenable to quantitative analysis. They used the hemoglobin molecule as a model system to quantify the relationship between physiological and evolutionary adaptations. They compared the measurements of oxygen saturation curves of 25 mammals with those of human hemoglobin under a wide range of physiological conditions. They fit the data sets to the Monod-Wyman-Changeux model to extract microscopic parameters. Their analysis demonstrated that physiological and evolutionary change act on different parameters. The main parameter that changes in the physiology of hemoglobin is relatively constant in evolution, whereas the main parameter that changes in the evolution of hemoglobin is relatively constant in physiology. 
This orthogonality suggests continued selection for physiological adaptability and hints at a role for this adaptability in evolutionary change.
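The Monod-Wyman-Changeux saturation function the authors fit to their data can be written in a few lines. The sketch below uses the standard two-state MWC form; the parameter values are illustrative placeholders, not the fitted values reported in the paper.

```python
def mwc_saturation(po2, kr, c, L, n=4):
    """Fractional O2 saturation of hemoglobin under the two-state
    Monod-Wyman-Changeux model.
    po2 : O2 partial pressure (same units as kr)
    kr  : O2 dissociation constant of the relaxed (R) state
    c   : KR/KT ratio (< 1; the tense T state binds O2 more weakly)
    L   : allosteric constant [T0]/[R0]
    n   : number of binding sites (4 for tetrameric hemoglobin)"""
    a = po2 / kr
    num = a * (1 + a) ** (n - 1) + L * c * a * (1 + c * a) ** (n - 1)
    den = (1 + a) ** n + L * (1 + c * a) ** n
    return num / den

# Illustrative parameters only; sweeping po2 traces the familiar
# sigmoid oxygen-binding curve.
KR, C, L = 2.0, 0.01, 1e5
for p in (5, 26, 100):  # mmHg; ~26 mmHg is a typical human P50
    print(p, round(mwc_saturation(p, KR, C, L), 3))
```

Fitting curves of this form to saturation data from different species (or from one species under varied physiological conditions) yields the microscopic parameters whose independent behavior underlies the orthogonality the authors describe.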

Milo R, Hou JH, Springer M, et al. The relationship between evolutionary and physiological variation in hemoglobin. Proc Natl Acad Sci USA. 2007;104:16998–17003.

Correspondence: Marc W. Kirschner


Predictive biomarker in rheumatoid arthritis

Most cardiovascular deaths among people with rheumatoid arthritis are due to congestive heart failure or myocardial infarction, implicating ischemic heart disease. As in the general population, classic risk factors, such as age, gender, hypertension, diabetes mellitus, smoking, and socioeconomic status, have been associated with mortality. In addition, comorbid conditions and clinical manifestations of rheumatoid arthritis (RA), including markers of inflammation, rheumatoid factor, nodular disease, joint counts, and functional disability, have been shown to be significant risk factors. Increased systemic inflammation, in particular, appears to confer an additional risk of atherosclerosis and cardiovascular mortality in people with RA. Little information exists about how genetic factors might affect mortality among people with RA. The authors conducted a study in which HLA-DRB1 genotyping was carried out on blood samples from 767 patients recruited for the Early RA Study (ERAS), a multicenter, inception cohort study with follow-up of more than 18 years. Dates and causes of death (n=186) were obtained from the Office of National Statistics. The association of HLA-DRB1 alleles with risk of mortality was assessed using Cox proportional hazards regression analyses. Multivariate stepwise models were used to assess the predictive value of HLA-DRB1 genotypes relative to other potential baseline risk factors. The authors found that the shared epitope was not significantly associated with overall mortality. However, the presence of two shared epitope alleles was associated with risk of mortality from ischemic heart disease (hazard ratio [HR], 2.02; 95 percent confidence interval [CI], 1.04–3.94; P=.04) and malignancy (HR, 2.18; 95 percent CI, 1.17–4.08; P=.01). 
Analysis of specific shared epitope genotypes (corrected for age and gender) revealed that the HLA-DRB1*0101/*0401 and *0404/*0404 genotypes were the strongest predictors of mortality from ischemic heart disease (HR, 5.11 and 7.55, respectively). DRB1*0101/*0401 showed a possible interaction with smoking. Male gender, erythrocyte sedimentation rate, and Carstairs Deprivation Index were also predictive, but the Health Assessment Questionnaire score, rheumatoid factor, nodules, and swollen joint counts were not. Mortality due to malignancy was particularly associated with DRB1*0101 genotypes. The authors concluded that the risk of mortality due to ischemic heart disease or cancer in people with RA is increased in those carrying HLA-DRB1 genotypes with particular homozygous and compound heterozygous shared epitope combinations.
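A Cox model reports each hazard ratio as HR = exp(β), with a 95 percent confidence interval of exp(β ± 1.96·SE), so the standard error of the log hazard ratio can be recovered from a reported interval. The helper below is our own illustrative sketch; the only inputs are the figures quoted above.

```python
import math

def hr_from_ci(hr, lo, hi, z=1.96):
    """Back out the log hazard ratio and its standard error from a
    reported HR and 95% CI (CI limits are exp(beta +/- z*SE)), and
    flag whether the interval excludes 1 (i.e. P < .05)."""
    beta = math.log(hr)
    se = (math.log(hi) - math.log(lo)) / (2 * z)
    significant = lo > 1.0 or hi < 1.0
    return beta, se, significant

# Two shared-epitope copies vs ischemic-heart-disease mortality:
# HR 2.02 (95% CI 1.04-3.94), as reported in the abstract above.
beta, se, sig = hr_from_ci(2.02, 1.04, 3.94)
print(f"log HR = {beta:.2f}, SE = {se:.2f}, CI excludes 1: {sig}")
```

The same check applied to the malignancy association (HR 2.18; 95 percent CI 1.17–4.08) likewise confirms an interval that excludes 1, consistent with the reported P values.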

Mattey DL, Thomson W, Ollier WER, et al. Association of DRB1 shared epitope genotypes with early mortality in rheumatoid arthritis: results of eighteen years of followup from the Early Rheumatoid Arthritis Study. Arthritis Rheum. 2007;56:1408–1416.

Correspondence: Dr. D. L. Mattey


Use of MRSA risk factor screening to improve surveillance culturing

Methicillin-resistant Staphylococcus aureus (MRSA) continues to cause significant morbidity and mortality. Infection with MRSA, unlike infection with methicillin-sensitive S. aureus strains, has been associated with higher mortality rates, longer hospital stays, and higher hospital charges. The prevalence of community-acquired MRSA has increased in some segments of the community in recent years, resulting in increasing numbers of patients who are MRSA positive being admitted to the hospital, where they can spread the organism. These findings have increased the need to devise systems that screen patients at admission to identify those at high risk of having MRSA so they can be isolated to prevent subsequent nosocomial spread. To this end, the authors conducted a study in which they performed anterior nares surveillance cultures of all patients on admission to and discharge from the general internal medicine floor in a community hospital over a seven-week period. The patients completed a questionnaire on MRSA risk factors. Forty-one (10.2 percent) of the 401 patients had MRSA on admission. Of 48 risk measures analyzed, 10 were significantly associated with admission MRSA, and seven of these were independently associated in stepwise logistic regression analysis. Factor analysis identified eight latent variables that contained most of the predictive information in the 48 risk measures. Repeat logistic regression analysis, including the latent variables, revealed three independent risk measures for admission MRSA: prior nursing home stay (relative risk [RR], 6.18; 95 percent confidence interval [CI], 3.56–10.72; P<.0001), history of MRSA infection (RR, 3.97; 95 percent CI, 1.94–8.12; P=.0002), and factor 3 (RR, 3.14; 95 percent CI, 1.56–6.31; P=.0013), which represented the combined effects of homelessness, jail stay, promiscuity, and illicit drug use. 
Multivariable models had greater sensitivity for detecting admission MRSA than any single risk measure and allowed detection of 78 percent to 90 percent of admission MRSA from admission surveillance cultures on 46 percent to 58 percent of admissions. The authors concluded that their study illustrates the potential usefulness of multivariable models in admission screening and potential approaches to developing such models. Additional research should be conducted to determine whether multivariable screening improves sensitivity and efficiency in other hospital settings and which risk measures contribute most to admission screening.
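The relative risks in studies of this design come from a 2×2 table of exposure versus MRSA carriage. A minimal sketch of that calculation, using the standard log-RR normal approximation for the confidence interval, follows; the counts are hypothetical, not the study's data.

```python
import math

def relative_risk(a, b, c, d, z=1.96):
    """Relative risk of MRSA carriage for exposed vs unexposed
    patients, from a 2x2 table:
        exposed:   a carriers, b non-carriers
        unexposed: c carriers, d non-carriers
    Returns (RR, CI low, CI high) via the log-RR normal
    approximation: SE^2 = 1/a - 1/(a+b) + 1/c - 1/(c+d)."""
    p_exposed = a / (a + b)
    p_unexposed = c / (c + d)
    rr = p_exposed / p_unexposed
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# Hypothetical counts: 12 of 30 patients with a prior nursing-home
# stay carried MRSA on admission, vs 29 of 371 without that history.
rr, lo, hi = relative_risk(12, 18, 29, 342)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

In the study itself, such crude relative risks are then adjusted in the multivariable logistic models, which is why the independent risk measures differ from the full list of univariately significant ones.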

Haley CC, Mittal D, LaViolette A, et al. Methicillin-resistant Staphylococcus aureus infection or colonization present at hospital admission: multivariable risk factor screening to increase efficiency of surveillance culturing. J Clin Microbiol. 2007;45:3031–3038.

Correspondence: Clinton C. Haley


Use of a reverse line blot hybridization assay for fungal identification

Invasive fungal infections can cause morbidity and mortality in severely ill and immunocompromised patients. Recent epidemiological trends indicate a significant shift toward species of Candida and Aspergillus, other than Candida albicans and Aspergillus fumigatus, and a diverse range of less common fungal opportunists. Given many of these pathogens’ reduced susceptibility to standard antifungal agents, timely and accurate identification at the species level is essential in guiding clinical management. However, conventional, culture-based, phenotypic identification methods are slow and prone to misidentification, particularly with less common or unusual species. In addition, the databases of commercial yeast identification systems do not contain all potential pathogens. Molecular approaches using polymerase chain reaction (PCR)-based methods have been developed to rapidly detect fungi. In particular, the internal transcribed spacer (ITS) regions ITS1 and ITS2 of the fungal ribosomal DNA gene complex have shown promise as targets for identifying species in a variety of formats, including DNA sequencing and DNA probe hybridization. Length and sequence polymorphisms within the ITS region have permitted accurate identification of pathogenic yeasts and molds. The authors evaluated a combined panfungal PCR-reverse line blot (RLB) hybridization assay based on ITS1 and ITS2 region polymorphisms to identify 159 Candida, Cryptococcus neoformans, and Aspergillus isolates (22 species). The authors also studied the assay’s ability to identify fungal pathogens directly from 27 clinical specimens. ITS sequence analysis was performed to resolve discrepant identifications or where no RLB result was obtained. Species-specific ITS2- and ITS1-based probes identified 155 of 159 isolates (97.5 percent) and 149 of 159 isolates (93.7 percent), respectively. 
All strains were unambiguously differentiated, with the exception of cross-reactivity between the Candida norvegensis probe and Candida haemulonii DNA product. Species identification of the pathogen was made for all 21 specimens (sensitivity, 100 percent) where species-specific probes were included in the RLB. However, there was no ITS2 probe-based hybridization signal for two specimens. Results were concordant with the culture results for 18 (85.7 percent) specimens. The assay provided species identification without a culture result (two specimens) and detected mixed infection (one specimen). The results indicated that the RLB assay can detect yeasts and Aspergillus spp. in clinical specimens and that it is necessary to incorporate ITS1- and ITS2-targeted probes for optimal sensitivity. The test has potential utility in the early diagnosis of invasive fungal infection since fungal DNA was detected in all 27 specimens. ITS sequencing may be performed to achieve species identification before incorporating probes to detect other fungal species.

Zeng X, Kong F, Halliday C, et al. Reverse line blot hybridization assay for identification of medically important fungi from culture and clinical specimens. J Clin Microbiol. 2007;45:2872–2880.

Correspondence: Tania C. Sorrell


Dr. Bissell is professor, Department of Pathology, Ohio State University, Columbus.