Responses to changes in sodium intake in compensated heart failure
Characteristics of HLA-specific B cells and clinical transplantation
Recent changes in Staphylococcus aureus strains in bloodstream infections
Use of rapid tests for malaria in the United Kingdom
Significance of short-chain aliphatic amines in human urine
Partial immunodeficiencies and recurrent respiratory infections
Comparison of CA 19-9 assays
Link between urine nitrite and hyperbilirubinemia?
GGT as a predictor of chronic kidney disease
The inability to excrete excess sodium is well characterized in untreated heart failure patients, and these patients are therefore advised to reduce dietary salt intake. Over the past two decades, medical treatment of heart failure has improved, and today most patients receive β-adrenoreceptor blockers and inhibitors of the renin-angiotensin-aldosterone system (RAAS). It has been demonstrated in such patients that acute central intravascular volume expansion elicits hemodynamic, neuroendocrine, and renal responses similar to those in healthy controls: the intact baroreflex-mediated response to increased cardiac and arterial filling lowers systemic vascular resistance and suppresses neuroendocrine mediators, thereby promoting renal sodium excretion. In healthy people, high sodium intake augments intravascular filling, decreases vascular resistance, improves cardiac performance, and suppresses vasoconstrictor hormones, so it is conceivable that patients with medically treated compensated heart failure would exhibit similar beneficial effects. The authors examined 12 patients with compensated heart failure receiving angiotensin-converting enzyme inhibitors and β-adrenoreceptor blockers and 12 healthy controls after one week of high and one week of low sodium intake, investigating whether the hemodynamic and neuroendocrine responses to variations in sodium intake differed between the two groups. During steady-state conditions, hemodynamic and neuroendocrine examinations were performed at rest and during bicycle exercise. In seated heart failure patients, high sodium intake increased body weight (1.6±0.4%), plasma volume (9±2%), cardiac index (14±6%), and stroke volume index (21±5%), whereas mean arterial pressure was unchanged; total peripheral resistance consequently decreased by 10±4 percent. Similar hemodynamic changes were observed during an incremental bicycle exercise test. 
Plasma concentrations of angiotensin II and norepinephrine were suppressed, whereas plasma pro-B-type natriuretic peptide remained unchanged. The authors concluded that high sodium intake was tolerated without any excessive sodium and water retention in medically treated patients with compensated heart failure. The observation that high sodium intake improves cardiac performance, induces peripheral vasodilatation, and suppresses the release of vasoconstrictor hormones does not support the advice for heart failure patients to restrict dietary sodium.
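The reported fall in total peripheral resistance follows from the basic hemodynamic relation TPR = MAP/CO: with mean arterial pressure unchanged and cardiac index up 14 percent, resistance must fall by roughly 12 percent, consistent with the reported 10±4 percent decrease. A minimal sketch of that arithmetic (illustrative only, not the study's raw data):

```python
# Total peripheral resistance follows from TPR = MAP / CO.
# With MAP unchanged and cardiac index up 14%, TPR must fall by
# 1 - 1/1.14, or about 12%, within the reported 10 +/- 4% decrease.

def tpr_change(map_change_pct: float, co_change_pct: float) -> float:
    """Percent change in TPR given percent changes in MAP and cardiac output."""
    map_factor = 1 + map_change_pct / 100
    co_factor = 1 + co_change_pct / 100
    return (map_factor / co_factor - 1) * 100

delta = tpr_change(0.0, 14.0)   # MAP unchanged, cardiac index +14%
print(round(delta, 1))          # ~ -12.3, i.e. a 12.3% fall in TPR
```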
Damgaard M, Norsk P, Gustafsson F, et al. Hemodynamic and neuroendocrine responses to changes in sodium intake in compensated heart failure. Am J Physiol Regul Integr Comp Physiol. 2006;290:R1294–R1301.
Reprints: M. Damgaard, Dept. of Cardiovascular Medicine, Bispebjerg Hospital, Bispebjerg Bakke 23, DK2400 Copenhagen, Denmark; email@example.com
Antibody to donor human leukocyte antigens is a risk factor for transplants of all types of organs and tissues. The development, during the last decade, of highly sensitive and specific solid-phase immunoassays for human leukocyte antigen (HLA)-specific antibody has greatly improved the opportunities for identifying patients at increased risk for antibody-mediated rejection and for successfully transplanting the sensitized patient. It is well known that the frequency of sensitization and, in turn, of reduced access to and success with transplant varies among groups based on gender and race. The increased frequency of sensitization among females has been attributed to sensitization via pregnancy, while exposure via transfusion to HLA antigens more common among whites has been cited as one cause for higher rates of sensitization among blacks. However, there may also be underlying differences in immune responsiveness in one or both of these groups, as suggested by the higher rates of autoimmune disease in both populations. Persistence of HLA-specific antibody also varies among individuals: the antibodies persist for years in some and are short-lived in others. This variability is probably related to the extent and route of sensitization and to inherent differences in immunocompetency and immunoregulation among individuals. The authors investigated the characteristics of HLA-specific B cells that may be relevant to clinical transplantation. HLA-specific B cells were identified by staining with HLA tetramers (tet), and the distribution of CD27 and CD38 among these cells was measured in groups defined by various parameters. The authors also investigated a possible correlation between frequencies of HLA-specific B cells and production of HLA-specific antibody after transplantation. They found no correlation between the frequencies of CD27+tet+ (33–44% versus 34–36%) or CD38+tet+ (57–65% versus 59–66%) B cells and a previous mismatch for the HLA antigen of the tetramer. 
However, there was an increase in CD38+tet+ B cells among patients making antibody to the tetramer antigen (67–72% versus 53–56%). Blacks had lower frequencies of CD27+ B cells than did whites (11.8% versus 28.9%; P=.003) but had greater increases of these cells among tet+ cells than did whites. There was a higher frequency of tet+ B cells among patients who developed new antibody to the HLA antigen of the tetramer after transplantation (3.9–8.6%) than among those who did not (1.1–3.7%). The authors concluded that the phenotype of HLA-specific B cells reflects current or historic sensitization to HLA and may reflect inherent differences between groups defined by race or gender, or both. The frequencies of HLA-specific B cells may predict patients at risk of producing donor-specific antibody after transplantation.
Zachary AA, Kopchaliiska D, Montgomery RA, et al. HLA-specific B cells. Transplantation. 2007;83:989–994.
Reprints: Dr. Andrea A. Zachary, Johns Hopkins University Immunogenetics Laboratory, 2041 E. Monument St., Baltimore, MD 21205; firstname.lastname@example.org
Methicillin-resistant Staphylococcus aureus (MRSA) frequently causes disease outbreaks and has become endemic in many regions, adding to the morbidity, mortality, and cost of care associated with hospital-acquired infections. Health care institutions have adopted enhanced surveillance and infection control measures to address this unresolved problem. In particular, reporting of bloodstream infections (BSI) caused by MRSA is often mandatory, and reduction of BSI rates is a performance target. In the Centre region of France, an extensive, prospective, longitudinal, region-wide survey of BSI has been underway since 2000. Data are collected for three months of each year in a large number of health care facilities to establish a comprehensive picture of the epidemiology of severe hospital-acquired infections. MRSA BSI and methicillin-sensitive S. aureus (MSSA) BSI are extensively studied within this framework. All of the S. aureus strains isolated during successive study periods are sent to the authors’ central laboratory for susceptibility testing, molecular typing, and analysis of virulence genes, with the aim of determining the spread and diversity of S. aureus strains in the region. For this study, the authors looked for major changes in the epidemiology of antibiotic resistance and of virulence genes in strains of S. aureus responsible for BSI. They identified a need to focus efforts on preventing MRSA and MSSA BSI and raised the issue of whether the use of fluoroquinolones has contributed to the acquisition of resistance and virulence genes by S. aureus strains. The authors studied 358 S. aureus strains isolated from BSI observed during an epidemiological study covering 2,007,681 days of hospitalization in 32 health care institutions between 2004 and 2006. The strains were tested for antibiotic susceptibility and characterized genetically. The authors found that the incidence of S. 
aureus BSI declined steadily through 2004 and 2005 and then increased significantly (by 80%) in 2006. This was largely due to an increase in BSI involving MSSA strains and non-multiresistant oxacillin-resistant S. aureus (NORSA) strains. Ninety-six percent of the NORSA strains were resistant only to methicillin and fluoroquinolones. Most of the MSSA strains belonged to a small number of pulsed-field gel electrophoresis (PFGE) divisions and were associated with epidemic phenomena in health care facilities. The NORSA strains also clustered into a limited number of PFGE divisions but could not be related to any local outbreak in medical facilities. In 2006, there was a significant increase (by 275%) in the incidence of BSI associated with tst gene-positive MSSA strains, and the first three BSI associated with tst gene-positive MRSA were observed. PFGE data revealed limited heterogeneity among the tst gene-positive strains, without any outbreak in the health care facilities. This study underlines the need for infection control teams to focus efforts on preventing MRSA and MSSA BSI. As demonstrated in vitro, fluoroquinolones may enhance horizontal transfer of virulence and antibiotic resistance genes. These antibiotics are widely used in France, so these findings raise the issue of whether their use has contributed to the acquisition of mecA and tst genes by S. aureus strains.
Van der Mee-Marquet N, Epinette C, Loyau J, et al. Staphylococcus aureus strains isolated from bloodstream infections changed significantly in 2006. J Clin Microbiol. 2007;45:851–857.
Reprints: Nathalie van der Mee-Marquet, Laboratoire de Bacteriologie et Hygiene, Hopital Trousseau, 37044 Tours Cedex, France; n.vandermee@chu-tours.fr
The diagnosis of malaria in clinical laboratories in the United Kingdom has, until recently, depended almost exclusively on microscopy, and this technique remains the most widely used. Although polymerase chain reaction (PCR) has a sensitivity and specificity superior to blood film microscopy, using PCR for the primary diagnosis of malaria is an unrealistic goal for most laboratories in the United Kingdom. Approximately 2,000 imported cases of malaria are reported in the United Kingdom each year, a major proportion of the 12,000 cases reported annually in Europe. Furthermore, the case-fatality rate for Plasmodium falciparum malaria is reported to be as high as 3.6 percent, and even up to 20 percent in some non-endemic countries. Prompt and accurate diagnosis and treatment of malaria should be a priority in any region, but in a non-endemic area many malaria cases are seen in non-immune travelers, in whom even very low parasitemias can result in serious illness. The situation is compounded if the diagnosis is delayed by lack of familiarity with the clinical presentations of malaria or by difficulty in detecting and speciating malaria parasites in blood films. Rapid diagnostic tests could become important for detecting malaria parasites both in U.K. practice and, under certain circumstances, in countries where malaria is endemic. Rapid diagnostic tests use immunochromatographic methods to detect malaria antigens from parasites in lysed blood. They usually use a test strip bearing monoclonal antibodies directed against the target parasite antigens, and they are commercially available in kits provided with all the reagents. The depth and sophistication of training required to carry out the tests and to interpret the results are substantially less than those required to achieve proficiency in malaria microscopy. As yet, there is no established protocol for the use of rapid diagnostic tests in U.K. 
laboratory practice. Consequently, the authors conducted a survey of the use of such tests in U.K. laboratories subscribing to the United Kingdom National External Quality Assessment Scheme blood parasitology and hematology schemes. The survey generated an overall response rate of 60.3 percent. Rapid diagnostic tests were found to be the preferred choice, either alone or in conjunction with microscopy, in 31.2 percent of the samples examined during normal working hours and in 44.3 percent of the specimens examined on call. The authors found that during on-call hours the use of rapid diagnostic tests increased, and the tests changed the diagnosis in 12 percent of laboratories. However, no established protocol for the use of rapid diagnostic tests was identified in the United Kingdom. The authors suggest a protocol, which remains to be validated in the laboratory setting.
Chilton D, Malik ANJ, Armstrong M, et al. Use of rapid diagnostic tests for diagnosis of malaria in the UK. J Clin Pathol. 2006;59:862–866.
Reprints: P.L. Chiodini, Dept. of Clinical Parasitology, Hospital for Tropical Diseases, Mortimer Market Centre, Capper St., London WC1E 6AU, United Kingdom; email@example.com
Several low-molecular-weight aliphatic amines are present in human urine, and all appear to have endogenous and exogenous origins. Despite having been detected many decades ago, the amines’ basic physiological significance and possible pathological associations are still unclear, although some have been shown to influence cell growth. It has been proposed that these amines play a significant role in the central nervous system disturbances observed during renal and hepatic dysfunction, especially when the blood-brain barrier is also compromised. This is a reasonable proposal because, owing to their low molecular weight, solubility in both aqueous and lipid environments, and electron-rich amino groupings, these molecules can readily access brain and spinal tissues and interfere with neurologic function. Nevertheless, the relationships among these amines are not known, and the relative importance of metabolic pathways proposed on the basis of chemical structure is uncertain, especially in humans. The authors conducted a study in which mathematical analysis of data on the daily urinary excretion of four of these aliphatic amines (methylamine, dimethylamine, trimethylamine, and ethylamine) and one related N-oxide (trimethylamine N-oxide) from 203 healthy volunteers provided insights into the interrelationships of these compounds. The advantage of analyzing parallel data sets is also evident, and such mathematical approaches may help to make transparent relationships that are otherwise unclear. Principal component analysis highlighted a female subgroup with raised trimethylamine levels, and the possibility of hormonal influence on the N-oxidation of trimethylamine has been proposed. A second subgroup, of men who ate a large meal of fish before the study, displayed raised levels of all compounds except ethylamine. 
The authors found that in all cases, ethylamine was least significantly correlated with the other urinary components and appeared metabolically unrelated.
Mitchell SC, Bollard ME, Zhang AQ. Short-chain aliphatic amines in human urine: a mathematical examination of metabolic interrelationships. Metabolism. 2007;56:19–23.
Reprints: Stephen C. Mitchell, SORA Division, Biomolecular Medicine, Faculty of Medicine, Imperial College London, South Kensington, London SW7 2AZ, United Kingdom; firstname.lastname@example.org
Streptococcus pneumoniae and Haemophilus influenzae are important bacteria in recurrent respiratory infections. The immunological response to them is based on the synthesis of specific immunoglobulins, generation of complement factors, and phagocytosis. Antibodies to polysaccharide antigens of S. pneumoniae and H. influenzae are important in protecting against these microorganisms. The role and relative frequency of partial immunodeficiencies and polymorphisms of the immune system in increased susceptibility to respiratory infections are poorly understood. Nor is it known whether the combined presence of several (partial) immune defects contributes to increased susceptibility to infection. The authors conducted a study to identify the nature and incidence of the immune defects that underlie increased susceptibility to recurrent respiratory infections. They evaluated the prevalence of IgA, IgM, IgG, and IgG subclass deficiencies, impairment in the antibody response against pneumococcal polysaccharides, G2m(n) allotypes, FcγRIIa polymorphisms, partial C2 and partial C4 deficiency, promoter polymorphisms in MBL2, and lymphocyte subset deficiencies in a control population and in consecutive children with recurrent respiratory infections. IgA or IgG subclass deficiency, or both, was found in 27 of 55 patients (49%) and six of 43 controls (14%) (P=.0006). An impaired antibody response to polysaccharides was found in seven patients (19%) and zero of 37 controls (P=.002). The G2m(n) marker was absent in 25 of 55 patients (45%) and six of 42 controls (14%) (P=.009). The MBL2 variants O/O, A/O, and A/A occurred in 9, 14, and 32 of the 55 patients, respectively, and in 1, 19, and 23 of the 43 controls, respectively (P=.05). There was no increase in the prevalence of partial C4 deficiency, C2 deficiency, lymphocyte subset deficiency, or FcγRIIa polymorphism in the patients compared with the controls. 
A combination of at least two immune defects was found in 31 of 55 patients (56%) and four of 42 controls (11.6%) (P<.0001). The authors concluded that specific antipolysaccharide antibody deficiency; IgA or IgG subclass deficiency, or both; G2m(n) allotype; and MBL2 genotype are susceptibility factors for recurrent respiratory infections. The coexistence of several immune defects was the strongest risk factor in this study.
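Group comparisons like those above (for example, IgA or IgG subclass deficiency in 27 of 55 patients versus six of 43 controls) are typically tested on a 2×2 contingency table. The abstract does not state which test the authors used, so the following self-contained two-sided Fisher exact test is only an illustration:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact P value for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables with the same
    margins whose probability is <= that of the observed table.
    """
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2

    def p_table(x):  # probability of the table with cell (1,1) equal to x
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo = max(0, col1 - row2)
    hi = min(col1, row1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# IgA/IgG subclass deficiency: 27 of 55 patients vs 6 of 43 controls.
p = fisher_exact_two_sided(27, 55 - 27, 6, 43 - 6)
print(p < 0.001)  # strongly significant, in line with the reported P=.0006
```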
Bossuyt X, Moens L, Van Hoeyveld E, et al. Coexistence of (partial) immune defects and risk of recurrent respiratory infections. Clin Chem. 2007;53:124–130.
Reprints: Xavier Bossuyt, Laboratory Medicine, University Hospital Leuven, Herestraat 49, B-3000 Leuven, Belgium; email@example.com
Tumor markers are useful in managing various cancers. Among these, cancer antigen (CA) 19-9 is a marker for pancreatic and colorectal carcinoma. However, CA 19-9 has not been shown to be a good screening test for pancreatic cancer in asymptomatic people. CA 19-9 is used primarily in serial monitoring during palliative chemotherapy in conjunction with imaging tests. Serial measurements are also useful for follow-up after potentially curative surgery. Measurement of CA 19-9 may also be useful for monitoring other cancers, including gastric, hepatobiliary, hepatocellular, breast, and ovarian. CA 19-9 is a glycolipid, the sialylated form of the Lewis a blood group antigen. In serum, it exists as a mucin, a high-molecular-mass (200–1,000 kDa) glycoprotein complex. CA 19-9 is synthesized by normal human pancreatic and biliary ductular cells and by gastric, colonic, endometrial, and salivary epithelia. The original monoclonal antibody against CA 19-9 was developed from a human colon carcinoma cell line, SW-1116. Immunoradiometric assays formerly were used to test for CA 19-9, but they have, in large part, been replaced by automated, nonisotopic immunoassays. Although imprecision has improved with nonisotopic immunoassays, agreement among CA 19-9 results has worsened in the last few years, since the automated techniques became available. The concentration of CA 19-9 in a given specimen, determined using assays from different manufacturers, can vary because of differences in assay methods, antibodies used, and reagent specificity. Although many quality assessment efforts have been initiated, discrepancies are commonly observed. The authors undertook a study to assess the performance characteristics of five automated, commercially available CA 19-9 assays—the Architect i2000 (Abbott Diagnostics), Advia Centaur (Bayer Diagnostics), UniCel DxI 800 (Beckman Coulter), Immulite 2000 (Diagnostic Products), and Elecsys E170 (Roche Diagnostics). 
All methods were evaluated for limit of detection, linearity, imprecision, method comparison, and reference intervals. All limit-of-detection results were below 2 kU/L and met the manufacturers’ claims. In linearity studies, deviation from target values ranged from 4.5 to 26.7 percent. All methods showed acceptable imprecision, with total coefficients of variation of less than eight percent. Method comparison by Passing-Bablok analysis yielded slopes ranging from 1.00 to 2.06 and correlation coefficients of 0.85 to 0.98. Between 97.6 and 99.2 percent of results from healthy volunteers were less than 35 kU/L. All methods showed acceptable analytic performance.
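Passing-Bablok analysis, used for the method comparison above, is a nonparametric regression built on the median of all pairwise slopes between data points. A simplified, illustrative sketch follows; the paired CA 19-9 values are invented, and the full procedure's tie corrections and confidence intervals are omitted:

```python
import statistics

def passing_bablok(x, y):
    """Simplified Passing-Bablok regression for method comparison.

    Slope: shifted median of all pairwise slopes (slopes of exactly -1
    are excluded; slopes below -1 shift the median index). Intercept:
    median of y - slope * x. This is an illustrative sketch, not a
    validated implementation.
    """
    slopes = []
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            if x[j] != x[i]:
                s = (y[j] - y[i]) / (x[j] - x[i])
                if s != -1:
                    slopes.append(s)
    slopes.sort()
    k = sum(1 for s in slopes if s < -1)  # offset for slopes below -1
    n = len(slopes)
    if n % 2 == 1:
        slope = slopes[(n - 1) // 2 + k]
    else:
        slope = 0.5 * (slopes[n // 2 - 1 + k] + slopes[n // 2 + k])
    intercept = statistics.median(yi - slope * xi for xi, yi in zip(x, y))
    return slope, intercept

# Hypothetical paired CA 19-9 results (kU/L) from two assays:
method_a = [10, 35, 60, 120, 240]
method_b = [12, 40, 70, 138, 276]
slope, intercept = passing_bablok(method_a, method_b)
print(round(slope, 2))  # ~1.15: proportional bias of method B relative to A
```

A slope near 1 with an intercept near 0 indicates agreement; the slopes of 1.00 to 2.06 reported above show that some assay pairs differ by a factor of two.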
La’ulu SL, Roberts WL. Performance characteristics of five automated CA 19-9 assays. Am J Clin Pathol. 2007;127:436–440.
Reprints: W. L. Roberts, c/o ARUP Laboratories, 500 Chipeta Way, Salt Lake City, UT 84108
The specificity of urine nitrite for urinary tract infection is high, ranging from 85 percent to 99 percent, according to published literature. Despite this, many physicians in the authors’ emergency department have noticed an apparent association between elevated bilirubin and false-positive urine nitrite tests. A typical example is a young man with signs and symptoms of viral hepatitis who has a positive urine nitrite test with no evidence of urinary tract infection (no white blood cells or bacteria in his urine and a negative urine culture). Such cases are not rare events in the authors’ emergency department, located near the Texas-Mexico border, because the seropositivity rate for hepatitis A in U.S. counties along the Mexico border is two to three times greater than in the rest of the country. In addition, the prevalence of chronic hepatitis C in the authors’ county has been estimated to be greater than two percent (more than 10,000 cases), whereas most Texas counties have fewer than 1,000 cases and a prevalence rate of less than 1.75 percent. The authors undertook a study to determine if the anecdotal observations correlating hyperbilirubinemia with false-positive urine nitrite tests are accurate. They tested this hypothesis by determining whether the specificity and positive predictive value of urine nitrite for urinary tract infections differed between patients with normal and elevated total serum bilirubin levels. The authors conducted an institutional review board-approved, retrospective review of 12 months of patient data, compiling information from patients having urinalysis, urine culture, and total serum bilirubin measured. Patients were divided into three groups according to total serum bilirubin: less than 1.5 mg/dL, 1.5 to 3.0 mg/dL, and greater than 3.0 mg/dL. 
The point estimates and 95 percent confidence intervals of the sensitivity, specificity, false-positive proportion (proportion of false-positive to all positive tests), and other test characteristics of urine nitrite as an indicator of urinary tract infection were calculated and tested for trend as a function of the three total serum bilirubin ranges. The authors found that 3,174 patients met their study criteria. Specificity of the nitrite test decreased as a function of increasing total serum bilirubin (0.974, 0.966, and 0.855 for the three total bilirubin levels, respectively), with a significant trend (P<.0001). No significant trend was noted in the corresponding sensitivity values (0.380, 0.417, and 0.241, respectively) (P=.55). The false-positive proportion also increased as a function of total serum bilirubin (17.5%, 17.3%, and 72%) (P<.0001). Therefore, if a patient’s total serum bilirubin was elevated to the point of jaundice (>3.0 mg/dL), it was approximately four times more likely that a positive urine nitrite test would be a false positive (nitrite-positive/culture-negative) compared with those having normal serum bilirubin levels. The authors concluded that specificity of the urine nitrite test for urinary tract infection decreases as a function of increasing serum bilirubin. Most patients with hyperbilirubinemia and a positive nitrite test in this sample did not have an associated urinary tract infection.
Watts S, Bryan D, Marill K. Is there a link between hyperbilirubinemia and elevated urine nitrite? Am J Emerg Med. 2007;25:10–14.
Reprints: Dr. Susan Watts, Dept. of Emergency Medicine, Texas Tech University Health Sciences Center at El Paso, 4800 Alberta Ave., El Paso, TX 79905; firstname.lastname@example.org
In 1999, the U.S. Renal Data System documented a persistent increase in kidney failure, with 340,000 patients requiring dialysis therapy or transplantation, and projected that this number would increase to 651,000 patients by 2010. Korea has seen a recent dramatic increase in the prevalence of end-stage renal disease (ESRD) patients requiring renal replacement therapy—from 303.6 per million population in 1994 to 854 per million population in 2004. Chronic kidney disease (CKD) often progresses to ESRD with its attendant complications, and treating the earlier stages of CKD is effective in slowing progression to ESRD. Consequently, it is important to identify the precursors of CKD. However, few prospective studies have provided data on risk factors for developing CKD in the Asian population. Serum γ-glutamyltransferase (GGT) has been used widely as an index of alcohol intake or liver dysfunction. Serum GGT has also been proposed as a sensitive marker of oxidative stress because it has dose–response associations with many cardiovascular disease risk factors, as well as with future risk of diabetes. However, little research has examined whether GGT is associated with the prospective development of CKD. The authors conducted a study to evaluate the association between GGT and risk for CKD in nonhypertensive, nondiabetic male Korean workers. The study cohort included 10,337 healthy males with normal baseline kidney function and without proteinuria. Participants were workers in a semiconductor manufacturing company and its 13 affiliates. CKD was defined as the presence of proteinuria or a glomerular filtration rate (GFR) of less than 60 mL/min per 1.73 m². Cox proportional hazards models were used to calculate adjusted hazard ratios in separate models for CKD. During a follow-up period of 25,774.4 person-years, 366 men developed CKD. 
After adjustment for age, baseline GFR, triglycerides, and high-density lipoprotein cholesterol, the risk for CKD increased with increasing quartile of serum GGT (P<.001). The relative risk for CKD in the top quartile of serum GGT compared with the bottom quartile was 1.90 (95% confidence interval, 1.37–2.63). These associations were also apparent in participants who consumed no more than 20 g/day of alcohol, those of normal weight, those with alanine aminotransferase values within reference intervals or C-reactive protein of less than 3.0 mg/L, and participants without metabolic syndrome. The authors’ findings indicate that serum GGT may be an early predictor of developing CKD, independent of baseline confounding factors.
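The crude incidence implied by these figures is simple arithmetic: 366 incident cases over 25,774.4 person-years, or roughly 14 cases per 1,000 person-years. A minimal sketch:

```python
def incidence_rate_per_1000(cases: int, person_years: float) -> float:
    """Crude incidence rate per 1,000 person-years of follow-up."""
    return cases / person_years * 1000

# 366 incident CKD cases over 25,774.4 person-years (figures from the study).
rate = incidence_rate_per_1000(366, 25774.4)
print(round(rate, 1))  # ~14.2 cases per 1,000 person-years
```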
Ryu S, Chang Y, Kim D-I, et al. γ-Glutamyltransferase as a predictor of chronic kidney disease in nonhypertensive and nondiabetic Korean men. Clin Chem. 2007;53:71–77.
Reprints: Seungho Ryu, Kangbuk Samsung Hospital, 108 Pyung dong, Jongro-Gu, Seoul, Korea 110-746; email@example.com
Dr. Bissell is Professor and Director of Clinical Services and Vice Chair, Department of Pathology, Ohio State University Medical Center, Columbus.