Rapidly growing mycobacteria represent a subgroup of environmental mycobacteria that includes the commonly recognized pathogen Mycobacterium abscessus as well as other species, such as M. fortuitum and M. chelonae. Evidence suggests that rapidly growing mycobacteria (RGM) are significant respiratory pathogens in patients with chronic lung diseases, such as cystic fibrosis (CF), but detecting RGM respiratory infection poses considerable challenges. While RGM can grow on many standard culture media, they grow relatively slowly—they are rapidly growing only relative to other mycobacteria. On nonselective agar, RGM are typically overgrown by other common CF pathogens, such as Pseudomonas aeruginosa. As a result, RGM in respiratory secretions from patients with CF are typically detected using culture for acid-fast bacilli (AFB culture), which involves a decontamination step to eliminate Pseudomonas and other competing pathogens. AFB cultures are time-consuming and expensive and are generally performed less often and more selectively than routine culture of respiratory secretions. Cystic Fibrosis Foundation guidelines recommend more selective surveillance for nontuberculous mycobacteria, as opposed to the minimum quarterly surveillance recommended for many other respiratory pathogens. The authors’ anecdotal evidence suggested that RGM could be recovered on Burkholderia cepacia selective agar (BCSA), one of the culture media used in routine cultures of respiratory secretions from CF patients. Furthermore, recovery was enhanced if the incubation period for BCSA was extended from the standard five days to 14 days.
To assess the potential impact of extending the incubation of respiratory secretions from CF patients in BCSA cultures, the authors implemented this change and examined results from all routine and AFB cultures of samples from CF patients for two years before (4,212 samples by routine culture and 1,810 samples by AFB culture; 670 patients) and two years after (4,720 samples by routine culture and 2,179 samples by AFB culture; 695 patients) the change. Clinical relevance was assessed with samples from a subgroup of 340 patients followed regularly throughout both periods. Extending the incubation of BCSA enhanced RGM recovery from routine cultures (0.7 percent before and 2.8 percent after; P<.001), whereas recovery from AFB cultures was unchanged (5.5 percent before and 5.7 percent after). Estimates of RGM-detection sensitivity by culture- or patient-based methods ranged from about 65 percent to 75 percent for routine cultures (nonsignificantly lower than the approximately 80 percent to 85 percent for AFB cultures) and were adversely affected by co-culture with mold or nonpseudomonal, nonfermenting Gram-negative rods. In the after period, 16 CF patients met the criteria for RGM infection by routine culture, including four who did not meet the criteria by AFB culture. The authors concluded that a simple methodological change enhanced recovery of RGM from routine cultures. The modified culture method could be used to improve screening for RGM in CF patients or as a simpler method for following patients with known RGM infection. However, this method should be used cautiously in patients with certain co-infections.
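The sensitivity estimates quoted above are, in essence, the fraction of all positives (detected by either method) that a single method recovers. A minimal sketch with hypothetical counts, not the study's data:

```python
def sensitivity(detected_by_method, detected_by_any):
    """Fraction of true positives (positive by either method)
    that a single culture method recovers."""
    return detected_by_method / detected_by_any

# Hypothetical counts for illustration only: suppose 40 samples were
# positive by routine culture, AFB culture, or both, with 28 positive
# on routine culture and 33 positive on AFB culture.
routine_sens = sensitivity(28, 40)  # falls in the reported 65-75% range
afb_sens = sensitivity(33, 40)      # falls in the reported 80-85% range
print(routine_sens, afb_sens)
```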
Esther CR, Hoberman S, Fine J, et al. Detection of rapidly growing mycobacteria in routine cultures of samples from patients with cystic fibrosis. J Clin Microbiol. 2011;49:1421–1425.
Correspondence: Peter Gilligan at email@example.com
Choosing the optimal management strategy for a newly diagnosed localized prostate cancer is based, at least in part, on understanding the natural history of the disease in the absence of aggressive treatment intervention. Useful information about disease natural history for men who were clinically diagnosed and initially untreated before widespread adoption of the prostate-specific antigen (PSA) test is available from population-based cohort studies. However, relatively little information is available for men diagnosed after a routine PSA test. Although the prognosis for screen-detected tumors seems to be better than that for clinically diagnosed tumors, predicting the natural history of a specific PSA-detected tumor is complicated by overdiagnosis and the lead time associated with the test. A large proportion of PSA-detected cancers are overdiagnosed—that is, they would never have progressed to a symptomatic state or have been clinically diagnosed in the absence of testing. By definition, an overdiagnosed tumor has an entirely different prognosis than a tumor that was not overdiagnosed. The clinical challenge is to determine whether a given case is overdiagnosed at the time of diagnosis. Further complicating the situation, even if a cancer that is not overdiagnosed is identified, prognosis depends critically on lead time, which is the time by which diagnosis is advanced through screening. Lead times can be highly variable across patients, primarily due to the heterogeneity of the disease. Unfortunately, there is no sure way to assess whether a tumor is overdiagnosed or to predict its lead time in clinical practice. Consequently, once a cancer has been detected by screening, it is typically treated, altering its natural history. Medical personnel can then no longer observe whether or when the tumor would have progressed in the absence of treatment.
This data limitation has spawned the development of model-based approaches for inferring lead time, overdiagnosis, and future survival from observed data on disease-specific incidences and deaths. The authors used three independently developed models of prostate cancer natural history to project risks of clinical progression events and disease-specific deaths for PSA-detected cases, assuming patients received no primary treatment. The three models projected that 20 percent to 33 percent of men have preclinical onset—of these, 38 percent to 50 percent would be clinically diagnosed and 12 percent to 25 percent would die of the disease in the absence of screening and primary treatment. Men younger than 60 at PSA detection and with a Gleason score of 2 to 7 have a 67 percent to 93 percent risk of being clinically diagnosed in the absence of screening and a 23 percent to 34 percent risk of dying of the disease in the absence of primary treatment. For a Gleason score of 8 to 10, these risks are 90 percent to 96 percent and 63 percent to 83 percent, respectively. The authors concluded that risks of disease progression among untreated PSA-detected cases can be nontrivial, particularly for younger men and men with high Gleason scores. Model projections can be useful for making decisions about treatment.
Gulati R, Wever EM, Tsodikov A, et al. What if I don’t treat my PSA-detected prostate cancer? Answers from three natural history models. Cancer Epidemiol Biomarkers Prev. 2011;20:740–750.
Correspondence: Roman Gulati at firstname.lastname@example.org
The most widely used creatinine-based equation for estimating glomerular filtration rate (eGFR) is MDRD-4. It and its successor, CKD-EPI, do not include a term for the patient’s body weight or actual body surface area. Consequently, these eGFR equations are vulnerable to bias (nonrandom errors) that can confound their application to individuals and to group comparisons. The authors proposed to retrofit these eGFR equations to include the patient’s actual body weight (or a related term) and actual body surface area. They addressed the problem incurred by these omissions, how the problem can be remedied, and what should be done until it is remedied. MDRD-4 was devised at a time when it was advantageous not to require body weight or body surface area because clinical laboratories, which produced the creatinine measurement, did not have ready access to patients’ height or weight. However, these laboratories did have patient demographics—age, race, and gender—each of which influences serum creatinine level. These demographics and the serum creatinine level became the data set for MDRD-4/CKD-EPI. Nowadays, with widespread and growing use of electronic medical records, it would be easy to incorporate the patient’s body weight and body surface area into MDRD-4/CKD-EPI. The MDRD-4 equation was devised to estimate actual GFR more accurately than is possible from interpreting serum creatinine level alone. The MDRD-4 equation has become very influential. It is the basis for the K-DOQI stages of chronic kidney disease, which are widely used clinically and in chronic kidney disease epidemiology. Furthermore, most clinical laboratories now automatically report MDRD-4 using a standardized creatinine measurement championed by K-DOQI. MDRD-4’s predecessor was the Cockcroft-Gault (CG) eGFR. CG is used widely in Europe but never gained popularity in the United States.
The disadvantages of CG are that it requires body weight, does not adjust for body surface area or African ancestry, and estimates creatinine clearance, not GFR. In addition, CG shows greater variability than MDRD-4, which has been interpreted as evidence of decreased accuracy. However, the authors suggest that MDRD-4 only seems more accurate than CG because, in effect, MDRD-4 has only a single variable contributing to its variance—that is, serum creatinine—whereas CG eGFR has serum creatinine and body weight contributing to its variance. Therefore, for patients of the same age, race, gender, and serum creatinine level, there is only one possible value for MDRD-4 eGFR. However, for this same set of patients, there are numerous possible correct values for CG eGFR, depending on body weight. The greater variability of CG compared to MDRD-4 reflects this reality. Therefore, the authors assert that adding a weight term and the patient’s actual body surface area would improve the accuracy of MDRD-4/CKD-EPI. There is now considerable effort underway to replace creatinine-based eGFR with other measures. The authors suggest, however, that a properly retrofitted CKD-EPI may make that effort unnecessary. Body weight is the main determinant of serum creatinine at any GFR. MDRD-4 attempts to adjust for lack of a body weight term by using an averaged body surface area. However, even an accurately determined body surface area is not an adequate substitute for body weight.
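The contrast between the two equations can be made concrete. The sketch below implements the standard published forms of both (the IDMS-traceable MDRD-4 coefficients and the original Cockcroft-Gault formula, drawn from the general literature rather than from this article) and shows why CG output varies with body weight while MDRD-4 does not:

```python
def mdrd4_egfr(scr_mg_dl, age, female=False, black=False):
    """IDMS-traceable 4-variable MDRD eGFR in mL/min/1.73 m^2.
    Note that no body-weight or body-surface-area term appears."""
    egfr = 175 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

def cockcroft_gault(scr_mg_dl, age, weight_kg, female=False):
    """Cockcroft-Gault creatinine clearance in mL/min (not BSA-adjusted)."""
    crcl = (140 - age) * weight_kg / (72 * scr_mg_dl)
    if female:
        crcl *= 0.85
    return crcl

# Same creatinine, age, and sex: MDRD-4 yields a single value,
# whereas CG varies with body weight.
print(round(mdrd4_egfr(1.0, 60), 1))             # one value regardless of weight
print(round(cockcroft_gault(1.0, 60, 60.0), 1))  # lighter patient
print(round(cockcroft_gault(1.0, 60, 100.0), 1)) # heavier patient
```

This is the authors' point about variance: for a fixed age, race, gender, and creatinine, MDRD-4 collapses to one number, while CG spreads across the weight distribution.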
Hebert PL, Nori US, Bhatt UY, et al. A modest proposal for improving the accuracy of creatinine-based GFR estimating equations. Nephrol Dial Transplant. 2011;26:2426–2428.
Correspondence: Lee A. Hebert at email@example.com
Glutamate neurotoxicity is determined by the balance between glutamate release in the brain and efflux of excess glutamate from the brain. Brain-to-blood efflux of glutamate is increased by decreasing the concentration of glutamate in blood. Little is known about the effect of hyperthermia on blood glutamate concentrations and the effectiveness of blood glutamate-decreasing mechanisms under these conditions. If hyperthermia decreases blood glutamate concentrations by activating stress mechanisms, as hypothesized, then blunting the stress response by blocking β-adrenergic receptors should prevent this decrease. Furthermore, there should be a concurrent process of leakage of glutamate from muscle tissue into blood during hyperthermia, resulting in a countervailing increase in blood glutamate concentrations. The authors conducted a study in which they investigated the effects of hyperthermia on blood glutamate levels and studied the effects of the β-adrenergic receptor antagonist propranolol on stress-induced changes in glutamate levels. They then studied the effectiveness of the blood glutamate scavenger oxaloacetate against hyperthermia-induced increases in glutamate levels. For the study, 24 rats were randomly divided into three groups. Rats’ body temperatures were increased by 1°C every 40 minutes, from 37°C to 42°C. The first group of rats received 1 mL/100 g of isotonic saline (control). The second group received 1 mL/100 g of 1 M oxaloacetate when their temperatures reached 39°C. The third group received 10 mg/kg of propranolol before warming was initiated. The authors found that warming the rats from 37°C to 39°C decreased blood glutamate levels in the control group (P<.01) and the oxaloacetate treatment group (P<.0001), whereas further increases in temperature from 40°C to 42°C increased blood glutamate levels (P<.01 and P<.0001, respectively).
Pretreatment with propranolol prevented the decrease in blood glutamate concentrations seen in mild hyperthermia and did not affect the increase in blood glutamate levels seen at temperatures of 41°C and 42°C (P<.005). The results of this study demonstrated that hyperthermia leads to decreases in glutamate levels in the blood, presumably by activating the sympathetic nervous system. Oxaloacetate, previously reported to reduce blood glutamate levels at 37°C, was ineffective at temperatures above 40°C. Propranolol pretreatment blunted the initial decrease in blood glutamate but thereafter had no effect when compared with the outcomes in the control and treatment groups. The authors concluded that understanding the mechanisms underlying glutamate regulation in the blood during states of hyperthermia and stress has important clinical implications in treating neurodegenerative conditions.
Zlotnik A, Gurevich B, Artru AA, et al. The effect of hyperthermia on blood glutamate levels. Anesth Analg. 2010;111:1497–1504.
Correspondence: Dr. Akiva Leibowitz at firstname.lastname@example.org
The Joint Task Force for the Redefinition of Myocardial Infarction predicated its redefinition of acute myocardial infarction on detecting an increase or decrease in cardiac troponin T (cTnT) or cardiac troponin I (cTnI), with at least one value greater than the 99th percentile reference value in patients with evidence of myocardial ischemia. The advent of high-sensitivity cTnT and cTnI assays has enabled measurement of previously undetectable cardiac troponin concentrations in healthy individuals and patients with an acute coronary syndrome. Despite adding changes in cardiac troponin concentrations to the aforementioned definition, the magnitude of an increase or decrease during serial sampling that is indicative of acute myocardial infarction has not been fully determined. A change of 20 percent or more has been suggested for patients with cardiac troponin already elevated at baseline. To aid in the interpretation of changes in cardiac troponin concentration, the authors sought to establish biological variation and reference change values (RCVs) by applying the normal and lognormal approaches for cTnT sampled at hourly and weekly intervals in healthy individuals and measured on the Roche E 170 and Elecsys 2010 automated platforms. High-sensitivity cTnT was measured at baseline and then after one, two, three, and four hours, and after one, two, three, and four weeks in 20 and 17 healthy individuals, respectively. A healthy status was established by physical examination and MRI analysis at rest and during dobutamine stress, lung function testing, and blood sample testing. The authors found that hourly total and within-individual coefficients of variation (CV) were 18 percent and 15 percent, respectively, for the E 170 assay and 24 percent and 21 percent, respectively, for the Elecsys 2010 assay. 
Weekly total and within-individual CVs for these assays were 32 percent and 31 percent, respectively, for the E 170 assay and 32 percent and 30 percent, respectively, for the Elecsys 2010 assay. The RCVs for the E 170 and Elecsys 2010 assays were +46 percent and +62 percent (hourly), respectively, and +87 percent and +86 percent (weekly), respectively. The corresponding lognormal values were +64 percent/-39 percent and +90 percent/-47 percent (hourly), and +138 percent/-58 percent and +135 percent/-58 percent (weekly). The authors concluded that RCVs appear attractive for interpreting high-sensitivity cTnT results. The short-term biological variation of high-sensitivity cTnT is low but becomes somewhat more important at intermediate sampling intervals. Knowledge of this variation is important for interpreting results from patients in whom cTnT values increase from low concentrations.
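The normal and lognormal RCV approaches named above follow standard biological-variation formulas. A minimal sketch, using the hourly total CV of 18 percent reported for the E 170 as an illustrative input; the z value of 1.96 is an assumed choice, and the sketch is not guaranteed to reproduce the paper's exact figures:

```python
import math

Z = 1.96  # assumed two-sided 95% coverage factor

def rcv_normal(cv_total):
    """Symmetric reference change value: sqrt(2) * z * CV."""
    return math.sqrt(2) * Z * cv_total

def rcv_lognormal(cv_total):
    """Asymmetric RCV limits from the lognormal approach:
    sigma = sqrt(ln(1 + CV^2)); limits = exp(+/- sqrt(2)*z*sigma) - 1."""
    sigma = math.sqrt(math.log(1 + cv_total ** 2))
    spread = math.sqrt(2) * Z * sigma
    return math.exp(spread) - 1, math.exp(-spread) - 1

up, down = rcv_lognormal(0.18)
print(f"normal RCV: +/-{rcv_normal(0.18):.0%}")
print(f"lognormal RCV: +{up:.0%} / {down:.0%}")
```

The asymmetry of the lognormal limits (a larger allowable rise than fall) is the reason separate upward and downward percentages are reported in the abstract.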
Frankenstein L, Wu AHB, Hallermayer K, et al. Biological variation and reference change value of high-sensitivity troponin T in healthy individuals during short and intermediate follow-up periods. Clin Chem. 2011;57(7):1068–1071.
Correspondence: Lutz Frankenstein at email@example.com
Total hemoglobin is one of the most frequently ordered laboratory tests. Hemoglobin levels must be obtained through blood draws that are invasive and time-consuming and can provide data at only one time point. Since the introduction of pulse oximetry, continuous noninvasive monitoring of blood oxygenation in the operating room has become a standard of care. During the past two decades, attempts have been made to find a suitable alternative to laboratory hemoglobin measurement and to provide a continuous noninvasive hemoglobin monitor for use in clinical medicine. Such a monitor could supply continuous, noninvasive hemoglobin measurements and immediate clinical information, prompting more rapid medical intervention. The Radical-7 Pulse CO-Oximeter from Masimo Corp. uses signal extraction technology pulse oximetry, which enables the use of adaptive filters to isolate arterial signals using parallel processing engine technology. Masimo Rainbow technology, which uses multiple (7+) wavelengths of light, enables the device to noninvasively measure total hemoglobin (SpHb), carboxyhemoglobin, and methemoglobin, as well as oxyhemoglobin and pulse rate. The use of this technology to determine and follow SpHb continuously is of clinical interest in all areas of medicine, but particularly in the operating room for patients undergoing major surgical procedures. The device has been shown to provide validated hemoglobin values in physiologically normal patients. However, minimal data are available about the accuracy and reliability of this device in patients with severe illness or those undergoing major operative interventions. The authors conducted a study to analyze the agreement and correlation between the SpHb measurement and the gold standard laboratory hemoglobin measurement in a series of patients undergoing major surgical procedures.
They collected data prospectively for two cohorts of patients: those undergoing major operations requiring hemodynamic monitoring (operating room) and critically ill patients (intensive care unit). Noninvasive hemoglobin measurements were recorded and compared with laboratory hemoglobin measurements. Data were collected on 70 patients (operating room, 25; intensive care unit, 45). The overall correlation of the Masimo SpHb and laboratory Hb was 0.77 (P<.001) in the operating room group, with a mean difference of 0.29 g/dL (95 percent confidence interval [CI], 0.08–0.49). The overall correlation in the intensive care unit group was 0.67 (P<.001), with a mean difference of 0.05 g/dL (95 percent CI, -0.22 to 0.31). The authors concluded that noninvasive hemoglobin monitoring is correlated with laboratory values and that the findings support the continued study of noninvasive hemoglobin monitoring.
Causey MW, Miller S, Foster A, et al. Validation of noninvasive hemoglobin measurements using the Masimo Radical-7 SpHb Station. Am J Surg. 2011;201:590–596.
Correspondence: Dr. Marlin Wayne Causey at firstname.lastname@example.org
Clinical pathology abstracts editors: Deborah Sesok-Pizzini, MD, MBA, associate professor, Department of Clinical Pathology and Laboratory Medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, and medical director, Blood Bank and Transfusion Medicine, Children's Hospital of Philadelphia; Michael Bissell, MD, PhD, MPH, professor, Department of Pathology, Ohio State University, Columbus.