
  Clinical Abstracts


CAP Today

August 2002

Female genital tract shedding of HIV-1
HIV-1 transmission from mother to child and between sexual partners is likely related to plasma viral load, and female-to-male transmission is likely as frequent as male-to-female transmission. Transmission typically results from direct contact with virus in the genital tract. Previous studies of HIV-1 shedding in the female genital tract were conducted in small cohorts or among African women. The authors conducted a cross-sectional study of a large cohort of HIV-1-infected women to characterize HIV-1 shedding in the genital tract. They enrolled 311 HIV-positive women from Jan. 30, 1997 to July 1, 1998. These women were a representative subset of the Women's Interagency HIV Study, which was conducted at five sites nationwide. The authors performed clinical assessments, cultured HIV-1, and measured HIV-1 RNA in peripheral blood mononuclear cells and genital secretions. HIV-1 shedding was defined as the presence of HIV-1 RNA or culturable virus in genital secretions. HIV-1 RNA was present in the genital secretions of 57 percent of the women; infectious virus was detected in only six percent. Genital tract HIV shedding was found in 80 percent of women with detectable plasma RNA and 78 percent of women with positive peripheral blood mononuclear cell cultures. However, 33 percent of women with less than 500 copies/mL of plasma RNA and 39 percent of those with negative peripheral blood mononuclear cell cultures also had genital tract shedding. The authors concluded that plasma RNA concentration was the most important predictor of genital HIV shedding, even among women receiving antiretroviral therapy. That HIV-1 shedding occurred in women with less than 500 copies/mL of plasma HIV-1 RNA suggests, however, that a separate reservoir of HIV-1 replication may exist in some women.

Kovacs A, Wasserman SS, Burns D, et al. Determinants of HIV-1 shedding in the genital tract of women. Lancet. 2001;358:1593-1601.

Reprints: Dr. Andrea Kovacs, Comprehensive Maternal-Child and Adolescent HIV Management and Research Center, Los Angeles County and University of Southern California Medical Center, University of Southern California Keck School of Medicine, 1640 Marengo St., HRA Bldg., Los Angeles, CA 90033; akovacs@hsc.usc.edu

Cerebrospinal fluid LDH isoenzymes
The normal lactic dehydrogenase (LDH) isoenzyme pattern in cerebrospinal fluid is LDH-1, 38 to 58 percent; LDH-2, 26 to 36 percent; LDH-3, 12 to 24 percent; LDH-4, one to seven percent; and LDH-5, zero to five percent. Alterations of this pattern indicate different disease states in the central nervous system; patients with bacterial meningitis, for example, tend to show elevated levels of LDH-4 and LDH-5. The authors measured total LDH and LDH isoenzyme concentrations in the CSF of children with arrested or progressive hydrocephalus. They collected CSF from 10 patients, aged two to 16 months, who had hydrocephalus, and compared the findings with those from 15 pediatric patients with normal CSF. Mean total LDH in the CSF was significantly higher in patients with hydrocephalus (101 ± 23.11 U/L) than in controls (33.53 ± 5.75 U/L). In the controls, LDH-1 was the main fraction, followed by LDH-2 and LDH-3; only small concentrations of LDH-4 and LDH-5 were detected. In contrast, the hydrocephalus patients showed lower concentrations of LDH-1 and higher concentrations of LDH-2 and LDH-3, differences that were statistically significant (P<0.001). LDH-4 and LDH-5 appeared to be unaffected in the hydrocephalus group.
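The reference ranges quoted above lend themselves to a simple range check. The following Python sketch is purely illustrative — the ranges are those stated in the abstract, the example profile is invented, and nothing here is a clinical tool:

```python
# Normal CSF LDH isoenzyme ranges (percent of total), as quoted above.
NORMAL_RANGES = {
    "LDH-1": (38, 58),
    "LDH-2": (26, 36),
    "LDH-3": (12, 24),
    "LDH-4": (1, 7),
    "LDH-5": (0, 5),
}

def flag_abnormal(profile):
    """Return the isoenzyme fractions falling outside the normal CSF ranges."""
    flags = {}
    for fraction, (low, high) in NORMAL_RANGES.items():
        value = profile[fraction]
        if value < low:
            flags[fraction] = "low"
        elif value > high:
            flags[fraction] = "high"
    return flags

# Hypothetical profile resembling the hydrocephalus pattern described above:
# lower LDH-1 with higher LDH-2 and LDH-3.
example = {"LDH-1": 30, "LDH-2": 40, "LDH-3": 26, "LDH-4": 3, "LDH-5": 1}
print(flag_abnormal(example))  # {'LDH-1': 'low', 'LDH-2': 'high', 'LDH-3': 'high'}
```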

Nussinovitch M, Volovitz B, Finkelstein J, et al. Lactic dehydrogenase isoenzymes in cerebrospinal fluid associated with hydrocephalus. Acta Paediatr. 2001;90:972-974.

Reprints: M. Nussinovitch, Dept. of Paediatrics C, Schneider Children's Medical Centre of Israel, Petah Tikva, Israel 49202

ToRCH testing by antigen microarray
Advances in automated microdeposition technologies have led to the development of high-density ordered arrays of molecules, including DNA, proteins, and peptides. These microarrays may be an extremely powerful tool for understanding functional relationships among a large repertoire of genes or antigens. The authors developed such a protein microarray for the ToRCH panel, printing antigens for Toxoplasma gondii, rubella virus, cytomegalovirus, and herpes simplex virus types 1 and 2 on activated glass slides using high-speed robotics. The detection limit of the resulting microarray assay was 0.5 pg of IgG or IgM bound to the slides. The within-slide, between-slide, and between-batch precision profiles showed coefficients of variation of 1.7 to 18 percent for all antigens. Overall, concordance between enzyme-linked immunosorbent assays and the microarrays in classifying sera was greater than 80 percent. The authors concluded that microarrays are suitable for the serodiagnosis of infectious diseases and that the ToRCH microarray assay performs as well as ELISA.

Mezzasoma L, Bacarese-Hamilton T, Di Cristina M, et al. Antigen microarrays for serodiagnosis of infectious diseases. Clin Chem. 2002;48:121-130.

Reprints: Andrea Crisanti, Imperial College of Science, Technology and Medicine, Dept. of Biology, Biomedical Sciences Building, 5th floor, Imperial College Road, London SW7 2AZ, United Kingdom; acrs@ic.ac.uk

Antibiotics and false-positive urine drug screens
Random drug testing has become common practice in the workplace and the criminal justice system, as well as under other circumstances. The immunoassays used for such screening generally are reliable and produce relatively few false-positive results. It was reported as early as 1997, however, that quinolone antibiotics, in particular ofloxacin, could cause false-positive opiate results with the EMIT II (enzyme multiplied immunoassay technique) assay (Syva, San Jose, Calif.). This finding does not appear to be well known, and the authors, after encountering a case of false-positive opiate screening with levofloxacin, studied the effects of quinolone antibiotics on urine opiate assays. From September 1998 to March 1999, they examined the cross-reactivity of levofloxacin and ofloxacin using five commercial opiate screening assays and urine from six healthy volunteers, and they tested the activity of 13 quinolone antibiotics—levofloxacin, ofloxacin, pefloxacin, enoxacin, moxifloxacin, gatifloxacin, trovafloxacin, sparfloxacin, lomefloxacin, ciprofloxacin, clinafloxacin, norfloxacin, and nalidixic acid. The main outcome measure was whether a compound drove the opiate assay above its positive threshold of 300 ng/mL of morphine. Nine of the quinolones exceeded this threshold in at least one assay, four of the five assay systems gave false-positive results for at least one quinolone, and 11 of the 13 compounds caused some opiate activity in at least one assay system. At least one compound caused opiate assay activity in all five assay systems. The quinolones most likely to cause false-positive results were levofloxacin, ofloxacin, and pefloxacin. Positive results were obtained in the urine of all six volunteers. The authors concluded that greater attention should be focused on the cross-reactivity of quinolones with immunoassays for opiates.

Baden LR, Horowitz G, Jacoby H, et al. Quinolones and false-positive urine screening for opiates by immunoassay technology. JAMA. 2001;286:3115-3119.

Reprints: Dr. Lindsey R. Baden, Division of Infectious Diseases, Brigham and Women's Hospital, 15 Francis St., PBB-A400, Boston, MA 02115; lbaden@partners.org

C-reactive protein and surgical complications
C-reactive protein levels generally have been found to be superior to leukocyte counts and the erythrocyte sedimentation rate for detecting surgical complications in patients with bacterial infections. They have also been shown to reflect the extent of surgical trauma, and preoperative C-reactive protein levels are considered a risk factor for predicting postoperative outcomes. The kinetics of change in C-reactive protein levels, however, are not well understood with respect to differentiating infection from surgical trauma. The authors studied the kinetics of C-reactive protein in 330 patients who underwent operative fracture treatment, with measurements obtained before and after surgery. The pattern of C-reactive protein values was similar in all patients who had an uneventful postoperative course: the peak occurred on the second postoperative day and was higher for femoral fractures (15.4 mg/dL) than for ankle fractures (3.5 mg/dL). In 47 patients with complicated postoperative recoveries, C-reactive protein was useful as a marker for risk stratification and as an early indicator of infection. Nine patients with deep wound infections showed marked rises in C-reactive protein, and in seven of them the elevation preceded the onset of clinical symptoms. A cut-off level of 14 mg/dL on the fourth postoperative day was identified for patients with deep wound infection.

Scherer MA, Neumaier M, von Gumppenberg S. C-reactive protein in patients who had operative fracture treatment. Clin Orthop. 2001;393:287-293.

Reprints: Dr. Michael A. Scherer, Abt. für Unfallchirurgie (Dept. of Trauma and Reconstructive Surgery), University Hospital rechts der Isar der TU-München, Ismaningerstr. 22, D-81675 München, Germany

Methods for estimating plasma bicarbonate in critically ill patients
Some researchers have questioned the reliability and applicability of the constants in the Henderson-Hasselbalch equation when it is used to estimate plasma bicarbonate concentration in critically ill patients. An alternative is to measure total carbon dioxide enzymatically, but this method is undermined by loss of carbon dioxide to the atmosphere. A third approach, derived from the Stewart approach to acid-base physiology, estimates plasma bicarbonate concentration by manipulating the strong-ion-gap equation; this method had not previously been evaluated against the others. Using data from a recent study of acid-base disorders in critically ill patients, the authors asked how well the Henderson-Hasselbalch, enzymatic, and strong-ion-gap methods agree in estimating plasma bicarbonate concentration. They collected 100 data sets from the records of routine daily blood samples from critically ill patients and performed Bland-Altman analyses to compare the three methods, proposing that a bias greater than ±1 mmol/L, or limits of agreement wider than ±2 mmol/L around the bias, would be clinically significant. Comparing the Henderson-Hasselbalch method with the enzymatic method, the bias was 2.1 mmol/L and the limits of agreement were -1.8 mmol/L to 5.9 mmol/L. Comparing the Henderson-Hasselbalch method with the strong-ion-gap method, the bias was -9.1 mmol/L and the limits of agreement were -17.1 mmol/L to -1.1 mmol/L. Finally, comparing the enzymatic method with the strong-ion-gap method, the bias was -11.2 mmol/L and the limits of agreement were -18.2 mmol/L to -4.2 mmol/L. The authors concluded that agreement between the two established methods is poor and that agreement between the established assays and the strong-ion-gap method is poorer still; the latter is too inaccurate for clinical application.
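The two calculations at issue — the Henderson-Hasselbalch bicarbonate estimate and the Bland-Altman bias with limits of agreement — can be sketched as follows. This is a minimal illustration using the conventional constants (pK' = 6.1, CO2 solubility 0.0301 mmol/L per mm Hg, the very constants whose validity in critical illness the study questions); the paired values are invented, not the study's data:

```python
from statistics import mean, stdev

def hh_bicarbonate(ph, pco2_mmhg, pk=6.1, sol=0.0301):
    """Plasma bicarbonate (mmol/L) from the Henderson-Hasselbalch equation,
    using the conventional pK' and CO2 solubility constants."""
    return sol * pco2_mmhg * 10 ** (ph - pk)

def bland_altman(a, b):
    """Bias (mean difference) and 95% limits of agreement (bias ± 1.96 SD)
    between two sets of paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    half_width = 1.96 * stdev(diffs)
    return bias, (bias - half_width, bias + half_width)

# Invented paired values for illustration only.
hh = [hh_bicarbonate(7.40, 40), hh_bicarbonate(7.30, 50), hh_bicarbonate(7.25, 35)]
enzymatic = [22.0, 22.5, 15.5]
bias, (lower, upper) = bland_altman(hh, enzymatic)
print(round(bias, 1), round(lower, 1), round(upper, 1))
```

At pH 7.40 and pCO2 40 mm Hg, the equation returns roughly 24 mmol/L, the textbook normal value, which is a quick sanity check on the constants.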

Story DA, Poustie S, Bellomo R. Comparison of three methods to estimate plasma bicarbonate in critically ill patients: Henderson-Hasselbalch, enzymatic, and strong-ion-gap. Anaesth Intensive Care. 2001;29:585-590.

Reprints: Dr. D.A. Story, Dept. of Anaesthesia, Austin and Repatriation Medical Centre, Studley Road, Heidelberg, Victoria 3084, Australia

Candidate genetic marker for bone mineral density variation
Osteoporosis is characterized by low bone mineral density (BMD). Population genetic evidence indicates that variation in bone mineral density is under strong genetic control, with heritability estimates ranging from 0.5 to 0.9, but molecular genetic studies of BMD variation have produced inconsistent findings. The few linkage studies reported to date have generated intriguing results concerning the marker D11S987 on chromosome 11q12-13. Three distinct Mendelian traits related to bone mineral density—osteoporosis-pseudoglioma, autosomal recessive osteopetrosis, and autosomal dominant high bone mass—have shown significant linkage to chromosome 11q12-13 in human pedigrees. An earlier study of 374 sibling pairs reported significant linkage of D11S987 to normal BMD variation, with a maximum logarithm of odds (LOD) score of 3.5, and a subsequent linkage study involving 595 sibling pairs supported the linkage with reduced significance (LOD score greater than 2.2). To investigate this possible link, the authors genotyped five markers in a genomic region of about 27 cM centered on D11S987 and measured BMD and other traits, including weight, in 635 individuals from 53 pedigrees. Each pedigree was ascertained through a proband with a BMD Z-score of less than -1.28 at the hip or spine. Using several different analyses, the authors found little evidence linking these five markers to BMD of the hip or wrist or to total body bone mineral content; the maximum LOD score at D11S987 was 0.15. The authors concluded that although they cannot exclude linkage of the D11S987 region to BMD variation, there is no evidence linking the marker to BMD in their study population.

Deng HW, Xu FH, Conway T, et al. Is population bone mineral density variation linked to the marker D11S987 on chromosome 11q12-13? J Clin Endocrinol Metab. 2001;86: 3735-3741.

Reprints: Dr. Hong-Wen Deng, Osteoporosis Research Center, Creighton University, 601 N. 30th St., Ste. 6787, Omaha, NE 68131; deng@creighton.edu

Haptoglobin phenotypic variation
Human haptoglobin (Hp) is characterized by a genetic polymorphism involving three distinct phenotypes—Hp 1-1, Hp 2-1, and Hp 2-2—that are molecularly heterogeneous. Hp 1-1 is a small (86 kDa) molecule of well-defined structure; Hp 2-1 is a set of heteropolymers of 86 to 300 kDa; and Hp 2-2 forms large macromolecular complexes of 170 to 1,000 kDa. Formation of the haptoglobin-hemoglobin (Hp-Hb) complex is greatly influenced by Hp phenotype. The hepatic clearance of free Hb from plasma, for example, appears to be less efficient for the Hp 2-2 phenotype than for the other phenotypes, producing a degree of iron-driven oxidative stress. The authors postulated a relationship between Hp phenotype and iron status. They determined the Hp phenotypes of 717 healthy adults and measured serum indicators of body iron compartments: iron, transferrin saturation, ferritin, and soluble transferrin receptor. Intracellular iron status in human monocyte-macrophages was studied by measuring cytosolic L- and H-ferritin concentrations as well as the in vitro uptake of 125I-labeled Hb-Hp complexes. In males, but not in females, the Hp 2-2 phenotype was associated with higher serum iron, higher transferrin saturation, and a higher ferritin concentration than the Hp 1-1 or Hp 2-1 phenotypes, whereas soluble transferrin receptor concentrations were lower. Serum ferritin correlated with monocyte L-ferritin content, which was also highest in the male Hp 2-2 subgroup. The authors concluded that the Hp 2-2 phenotype affects serum markers of iron status in healthy males and is associated with higher L-ferritin concentrations in monocyte-macrophages because of an iron delocalization pathway that occurs selectively in Hp 2-2 subjects.

Langlois MR, Martin ME, Boelaert JR, et al. The haptoglobin 2-2 phenotype affects serum markers of iron status in healthy males. Clin Chem. 2000;46:1619-1625.

Reprints: M.R. Langlois, Laboratory of Clinical Chemistry, University Hospital Gent, De Pintelaan 185, B-9000 Gent, Belgium; michel.langlois@rug.ac.be

Selenium levels in autopsy tissues in Poland
Selenium content varies from tissue to tissue and from one geographic region to another, depending on the selenium content of the soil. The highest tissue levels of selenium normally occur in the kidney; approximately half that amount is found in the liver, and lower amounts in the brain, lungs, and muscles. Selenium deficiency has been implicated in certain human diseases, including cardiovascular disease and cancer. The authors conducted a study in Poland in which they determined selenium levels in tissues taken at autopsy from 46 healthy individuals killed in accidents and from 75 individuals who died of various diseases. Selenium levels on a per-weight-unit basis ranged, in descending order, from 469 in kidney through liver, spleen, pancreas, heart, brain, lung, and bone to 51 in skeletal muscle. Nevertheless, the highest proportion of total body selenium was found in skeletal muscle (27.5 percent); much less was found in bone (16 percent) and blood (10 percent). Levels were much lower in the tissues of subjects with cancer than in controls, and the lowest levels were found in the livers of alcoholics. Tissue selenium levels in the study subjects were significantly lower than levels reported in Japan, the United States, Canada, and other countries, reflecting the inadequate selenium content of the soil in Poland.

Zachara BA, Pawluk H, Bloch-Boguslawska E, et al. Tissue level, distribution, and total body selenium content in healthy and diseased humans in Poland. Arch Environ Health. 2001;56:461-466.

Reprints: Dr. Bronislaw A. Zachara, Dept. of Biochemistry, Medical University, 85-092 Bydgoszcz, Poland; bronz@aci.amb.bydgoszcz.pl

Salmonella epidemiology
Epidemiologic surveillance for Salmonella species in the United States began in 1962 and is jointly conducted by the Council of State and Territorial Epidemiologists, the Association of Public Health Laboratories, and the CDC. Its objectives are to define endemic patterns of salmonellosis, identify trends in disease transmission, detect outbreaks, and monitor control efforts. The authors reviewed trends in Salmonella infections in the United States from 1987 through 1997. During this time, 441,863 Salmonella isolates were reported, with the highest age-specific rate among infants (159.5/100,000 infants at two months of age). Annual isolation rates decreased from 19/100,000 persons to 13/100,000 persons, but the trends varied by serotype. The isolation rate of Salmonella serotype Enteritidis increased until 1996, whereas rates for serotypes Hadar and Heidelberg declined. Serotypes that increased in frequency were significantly more likely than those that decreased to be associated with contact with reptiles. Recent declines in food-associated serotypes may reflect changes in the meat, poultry, and egg industries that preceded or anticipated the 1996 implementation of pathogen-reduction programs. The authors suggested that additional educational measures be taken to control the emergence of reptile-associated salmonellosis.

Olsen SJ, Bishop R, Brenner FW, et al. The changing epidemiology of Salmonella: trends in serotypes isolated from humans in the United States, 1987-1997. J Infect Dis. 2001;183:753-761.

Reprints: Dr. Sonja J. Olsen, Centers for Disease Control and Prevention, Foodborne and Diarrheal Diseases Branch, 1600 Clifton Road, MS A-38, Atlanta, GA 30333; sco2@cdc.gov