Clinical Laboratory Medicine
2nd Edition

CLINICAL TOXICOLOGY OF SELECTED DRUGS
Part of "22 - Toxicology"
Ethanol
Ethanol is the most widely used social drug, and its abuse is one of the most important causes of injury and disease. Recent attempts to combat the problems of driving while under the influence of alcohol (DUI) have placed additional responsibilities on hospitals and clinical laboratories because the results of blood alcohol analyses, although obtained for medical reasons, may be used for legal purposes.
Ethanol usually is consumed in beers (3% to 6% ethanol), wines (10% to 12%), and distilled beverages (40% to 50%). The term “proof” means twice the percentage of ethanol by volume. Ethanol is absorbed rapidly, and there is significant intersubject variability in the peak blood-alcohol concentration attained and the time to reach it because of food intake and physiological variables (29). Ethanol distributes in total body water. Therefore, at equilibrium, ethanol concentration in any tissue or fluid is a function of the water content of that specimen. A plasma/blood ratio of 1.18 (range of 1.10 to 1.35), a urine/blood ratio of 1.3, and a blood/breath ratio of 2,407 (range of 1,981 to 2,833) have been determined (30, 31).
Ethanol is metabolized mostly by liver alcohol dehydrogenase to acetaldehyde at a rate that follows first-order kinetics at low concentration (less than 20 mg/dL). At higher concentration, metabolism proceeds at zero-order kinetics, which is independent of dose or initial blood-alcohol concentration. The average rate of metabolism is approximately 16 mg/dL/hr for men with significant variability (10 to 25 mg/dL/hr) and may be 10% to 20% lower for women (32). Individuals with high blood-ethanol concentration (greater than 3 g/L) have a rate of metabolism that is higher and more unpredictable (33).
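Because elimination in the toxic range is zero order, a measured level declines roughly linearly with time. A minimal sketch (Python) using the population rates quoted above; this is an illustration only, not a forensic back-extrapolation method:
```python
def projected_bac(initial_mg_dl, hours, rate_mg_dl_per_hr=16.0):
    """Project a blood-ethanol level forward in time assuming zero-order
    (constant-rate) elimination; the approximation holds only above the
    ~20 mg/dL point at which kinetics become first order."""
    return max(initial_mg_dl - rate_mg_dl_per_hr * hours, 0.0)

# A 180 mg/dL level re-checked 4 hours later, bracketed by the cited
# 10 to 25 mg/dL/hr population range:
for rate in (10.0, 16.0, 25.0):
    print(rate, projected_bac(180.0, 4.0, rate))  # 140.0, 116.0, 80.0 mg/dL
```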
Ethanol is a central nervous system (CNS) depressant. In nontolerant subjects, ethanol-induced CNS dysfunction ranges from limited muscular incoordination at blood concentrations less than 50 mg/dL to coma, respiratory failure, and death at 400 mg/dL or higher. Blood levels are difficult to interpret in tolerant subjects; seemingly alert patients with ethanol levels in excess of 600 mg/dL have been reported.
Other diseases such as diabetic ketoacidosis and subdural hematoma can simulate the clinical signs of alcohol intoxication. Blood alcohol measurements are ordered in the investigation of patients who present with an anion gap metabolic acidosis or who are comatose. Medically, it may be desirable, or necessary, to know the patient’s blood alcohol concentration before administering anesthetics or medications.
Clinical toxicology laboratories are increasingly involved in the collection and analysis of blood for ethanol that subsequently has forensic implications. A set of guidelines has been developed in response to the need for information dealing with various aspects of blood-alcohol analysis (34).
Type of Specimen
Plasma or serum usually is analyzed in clinical laboratories. Most state laws on drinking and driving, however, define alcohol concentration in terms of whole blood. Alcohol concentration in whole blood is lower than in plasma or serum; the frequently used serum/whole-blood ratio is 1.18 (34). The analysis of either serum or whole blood yields results with the same clinical significance. The difference between serum and whole-blood ethanol concentrations is significant in forensic analysis, however, particularly when a “blood” alcohol result falls in the vicinity of a statutory threshold pertaining to drinking and driving. Conversion of results obtained on serum or plasma to whole-blood values using a population mean ratio of 1.18 is discouraged; for legal purposes, it is best to analyze whole blood. Laboratories should identify the specimen type in their reports. Venipuncture sites should be cleansed with a nonalcohol-containing disinfectant such as benzalkonium chloride or aqueous povidone-iodine. If analysis will be delayed for a few hours, sodium fluoride at 1% (higher for longer-term storage) is an effective preservative (34).
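The reason a population-mean conversion is discouraged can be made concrete. The sketch below (Python) uses the 1.18 mean and the 1.10 to 1.35 range cited above; the 80 mg/dL statutory limit is a hypothetical example:
```python
serum_mg_dl = 100.0  # measured serum ethanol

# Whole blood = serum / ratio, evaluated at the population mean and at
# the extremes of the reported serum/whole-blood range (34):
for ratio in (1.10, 1.18, 1.35):
    print(f"ratio {ratio}: {serum_mg_dl / ratio:.0f} mg/dL whole blood")
# Prints 91, 85, and 74 mg/dL: an individual subject could fall on either
# side of a hypothetical 80 mg/dL (0.08% w/v) statutory limit, which is
# why conversion by a population mean is discouraged.
```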
Urine is not a suitable specimen because blood-alcohol concentration cannot be established with sufficient reliability from
alcohol concentration of a pooled-bladder urine specimen and because of the great variability of the blood:urine ratio.
The use of breath and saliva for ethanol measurement has considerable appeal relative to blood (35, 36). Its advantages are the noninvasive nature of sample collection, the ease of test performance, and rapid turnaround time. The breath-alcohol test is based on Henry’s law, which states that in a closed system at constant temperature, the concentration of a gas above a liquid is proportional to the concentration of the gas dissolved in the liquid. The distribution of the gas between the gas and liquid phases is determined by the partition ratio. In the breath-alcohol test, the assumption is that an equilibrium state exists between the alcohol in the blood perfusing the lung and that in alveolar air (breath). If the partition ratio for alcohol is known, blood-alcohol concentration can be determined indirectly by measuring alcohol concentration in breath. In forensic testing, a blood/breath partition ratio of 2,100 has been adopted and is used by measuring devices, although values have been reported to range from 1,981 to 2,833 (31). Careful consideration must be given to developing a quality-assurance program for POCT alcohol tests. Users of clinical breath-alcohol tests can benefit from the collective experience of the forensic community, which has long used breath-alcohol testing in traffic law enforcement (37).
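A minimal illustration of the indirect calculation (Python); the 2,100 blood/breath partition ratio is the forensic convention cited above, and the range endpoints show the biological spread:
```python
def blood_from_breath(breath_g_per_ml, partition_ratio=2100.0):
    """Estimate blood alcohol (g/mL) from breath alcohol (g/mL):
    C(blood) = partition_ratio x C(breath)."""
    return partition_ratio * breath_g_per_ml

# A breath concentration that reads exactly 0.10 g/dL blood under the
# 2,100 convention, recomputed at the extremes of the reported range (31):
breath = (0.10 / 100.0) / 2100.0  # g of alcohol per mL of breath
for ratio in (1981.0, 2100.0, 2833.0):
    print(ratio, round(blood_from_breath(breath, ratio) * 100.0, 3), "g/dL")
# 0.094, 0.1, 0.135 -- the subject's true ratio matters when a result
# sits near a statutory threshold.
```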
The advantages of saliva testing are the noninvasive nature of sample collection, ready availability of saliva in sufficient quantity for analysis, and the ease of test performance (36). A device for measuring alcohol in saliva (STC Diagnostics, Bethlehem, PA) is based on enzymatic oxidation of alcohol by alcohol dehydrogenase. Saliva-alcohol results are in good agreement with alcohol concentrations in venous blood and end-expired breath, but insufficient sample volume was found to be a problem with highly intoxicated patients (36).
Analytical Methods
Many methods for alcohol analysis are in use in clinical laboratories (34). They can be grouped into the following categories:
  • Chemical oxidation of ethanol with acid dichromate.
    The reduction of dichromate is proportional to ethanol concentration. This is a nonspecific reaction; other alcohols and paraldehyde (as its metabolite acetaldehyde) also will give positive results. It is used in older breath analyzers.
  • Enzymatic oxidation using alcohol dehydrogenase (ADH).
    The extent of interference by methanol and isopropanol depends on the source of ADH and can be minimized by careful choice of assay parameters. Enzymatic methods are simple and the most frequently used methods in clinical laboratories. Laboratories, however, must be prepared to investigate suspected ingestions of alcohols other than ethanol or the co-ingestion of ethanol and another alcohol.
  • Gas chromatography.
    Gas chromatography is the most specific method (38). It yields information on both the identity of the alcohol as well as its concentration. For emergency analysis, direct injection of a diluted sample has the advantage of eliminating the 15- to 30-minute equilibration period required for headspace analysis. The use of an internal standard (e.g., n-propanol) improves precision.
Concentration Units
Many state laws on drinking and driving define blood-alcohol concentration as percent by weight/volume (% w/v). Clinical laboratories usually report alcohol results in concentration units of milligram alcohol per 100 mL (dL) of blood (or serum). An alcohol concentration of 100 mg/dL is equivalent to 0.1% w/v.
For breath-alcohol testing in the clinical setting, devices should display results in concentration units of grams of alcohol per 100 mL of blood, not grams of alcohol per 210 L of breath. Physicians and nurses are accustomed to reviewing medical laboratory test results in concentration units based on 100 mL of blood (or serum).
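These unit relationships are pure arithmetic; the short sketch below (Python) assumes nothing beyond the conversions stated above:
```python
def mg_dl_to_percent_wv(mg_dl):
    """100 mg/dL = 0.1 g per 100 mL = 0.1% w/v."""
    return mg_dl / 1000.0

def breath_unit_to_mg_dl(g_per_210l_breath):
    """Under the 2,100 partition convention, 210 L of breath carries the
    same alcohol as 100 mL of blood, so g/210 L of breath is numerically
    equal to g/100 mL of blood (% w/v)."""
    return g_per_210l_breath * 1000.0

print(mg_dl_to_percent_wv(100.0))   # 0.1 (% w/v)
print(breath_unit_to_mg_dl(0.10))   # 100.0 (mg/dL)
```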
Methanol and Isopropanol
Methanol and isopropanol are important industrial chemicals that also are available as household items: methanol as a constituent of some antifreeze and windshield-washer solutions; isopropanol as a disinfectant (30% to 99.9% solution) or as rubbing alcohol (70% solution). Intoxication with methanol and isopropanol can result from accidental ingestion, industrial exposure, deliberate self-poisoning, or their use as substitutes for ethanol.
Both methanol and isopropanol are absorbed readily following ingestion. They are metabolized by hepatic alcohol dehydrogenase at rates one tenth or less of that for ethanol. Methanol is oxidized to highly toxic formaldehyde and then to formic acid. Formic acid is much more toxic than methanol and accounts for the profound anion gap metabolic acidosis and the ocular toxicity of methanol poisoning. Formate levels (generally not available in clinical laboratories) correlate better with poor outcome (39). Toxic symptoms of methanol may include inebriation, headache, dizziness, seizure, and coma. Nausea, vomiting, stiff neck, abdominal pain, and malaise also are common complaints. After ingestion, the serum methanol level peaks at about 2 hours, but there may be a latent period of about 24 hours with a deceiving lack of severe toxic manifestations, during which appropriate treatment is critical (40).
Isopropanol is metabolized to acetone, which accounts for the CNS effects and ketonemia. Toxic manifestations include CNS depression, lethargy, weakness, hypotension, and abdominal pain. Hemorrhagic tracheobronchitis and gastritis are characteristic findings. Clinical laboratory findings include ketonemia, ketonuria, and elevated osmolality. There is no metabolic acidosis. Treatment of isopropanol overdose is mainly supportive, and most patients respond well. Very high isopropanol levels (400 to 500 mg/dL) usually are associated with severe hypotension and coma, and hemodialysis should be considered for these patients (40).
Because the toxicity of methanol is due to its ADH-generated metabolites, treatment involves competitive inhibition of ADH with a saturating concentration (100 to 150 mg/dL) of ethanol, the preferred substrate, allowing the kidneys to eliminate methanol provided adequate urine flow is maintained (40).
Hemodialysis is indicated if the methanol level is greater than 25 mg/dL and should be continued until methanol reaches undetectable levels. 4-Methylpyrazole (fomepizole), a specific inhibitor of ADH, is undergoing a clinical trial to assess its efficacy as an antidote for methanol and ethylene glycol poisoning (41).
The method of choice for the identification and measurement of methanol and isopropanol is GC. The GC methods for ethanol generally are applicable to these two alcohols as well, provided acetone is adequately resolved from the alcohols (38). Low methanol levels are seen periodically in alcoholics after binge drinking. Ethanol inhibits methanol metabolism, allowing the accumulation of small amounts of methanol originating from endogenous production or consumption of fermented beverages. As many as 6% of patients with ethanol levels greater than 100 mg/dL had serum-methanol levels of 4.5 mg/dL or more (42).
The popular enzymatic assay for ethanol will not detect methanol and isopropanol because of the weak enzyme activity when these alcohols are substrates. Toxic serum levels of these osmotically active substances, however, will result in a significant osmolal gap between measured osmolality and calculated serum osmolality, thus allowing the evaluation of an acute situation when specific assays for methanol and isopropanol are not available. A formula to estimate serum osmolality is (43):
[1.86 × Na (mmol/L) + glucose (mg/dL)/18 + BUN (mg/dL)/2.8] ÷ 0.93
The calculated osmolality is divided by 0.93 because serum is approximately 93% water. The measured osmolality must be determined by freezing-point depression osmometry (not by using a vapor-pressure osmometer). The expected contribution to measured osmolality of a serum-methanol concentration of 100 mg/dL is 31 mOsm/kg. Ethanol is the most common cause for elevation of serum osmolality. Users of this approach must be aware of its nonspecificity because other alcohols, acetone, and ethylene glycol also can raise serum osmolality if present in sufficient concentration.
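A worked version of the osmolal-gap screen (Python); the formula and the 100-mg/dL methanol example are from the text, a molecular weight of 32 for methanol is standard, and the patient values are hypothetical:
```python
def calculated_osmolality(na_mmol_l, glucose_mg_dl, bun_mg_dl):
    """Estimated serum osmolality (mOsm/kg) by the formula above (43)."""
    return (1.86 * na_mmol_l + glucose_mg_dl / 18.0 + bun_mg_dl / 2.8) / 0.93

def alcohol_contribution(conc_mg_dl, mol_wt):
    """mOsm/kg contributed by an osmotically active alcohol:
    mg/dL -> mg/L (x10) -> mmol/L (/MW), roughly mOsm/kg in dilute serum."""
    return conc_mg_dl * 10.0 / mol_wt

# 100 mg/dL of methanol (MW 32) contributes ~31 mOsm/kg, matching the text:
print(round(alcohol_contribution(100.0, 32.0), 1))   # 31.2

# Hypothetical patient, measured osmolality 320 mOsm/kg by freezing point:
calc = calculated_osmolality(na_mmol_l=140.0, glucose_mg_dl=90.0, bun_mg_dl=14.0)
print(round(calc, 1), round(320.0 - calc, 1))  # ~290.8 and a gap of ~29.2,
# large enough to suggest an unmeasured osmole such as one of the alcohols
```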
Ethylene Glycol
Ethylene glycol is the principal component of antifreeze products and brake fluids. It is absorbed readily in the gastrointestinal tract and is metabolized by oxidation to glycolic acid, glyoxylic acid, and formic acid by alcohol dehydrogenase and glycolic acid oxidase (40). Ethylene glycol itself is only mildly toxic; its metabolites, however, are highly toxic (44). Persistent vomiting and gradual onset of CNS depression appear within 4 to 8 hours, accompanied by a large anion gap metabolic acidosis. A common urinalysis finding is the presence of calcium oxalate or hippurate crystals; acute renal tubular necrosis can develop within 12 hours to 2 days.
Treatment of ethylene-glycol poisoning consists of correction of metabolic acidosis, inhibition of metabolism by ethanol or 4-methylpyrazol, and hemodialysis. Hemodialysis rapidly clears both ethylene glycol and its toxic metabolites from the bloodstream and is recommended for patients who are symptomatic or have blood levels of ethylene glycol greater than 25 mg/dL (40).
There is no easy assay for ethylene glycol, and most clinical laboratories do not provide ethylene glycol analysis. Serum osmolality measurement and the calculation of the osmolal gap can provide a rough approximation of ethylene glycol concentration, but only if ethanol and other alcohols are known to be absent (43).
Early gas chromatographic methods, including those based on direct injection, were plagued by “trailing peaks,” “ghost peaks,” variable recovery, and interference by propylene glycol or 2,3-butanediol (present in serum of some alcoholics) (45). A recent GC procedure has improved precision and accuracy, and also can measure glycolic acid (46).
An enzymatic method based on the action of glycerol dehydrogenase on ethylene glycol is rapid and easy to perform (47). The enzyme assay will not detect propylene glycol, and it is not specific for ethylene glycol; known interfering substances include glycerol and 2,3-butanediol (48). Specimens containing high lactate dehydrogenase activity or high lactate levels will give false-positive results (49). For laboratories unable to maintain a chromatographic assay for ethylene glycol, the enzyme assay, despite its specificity limitation, is a clinically useful alternative. It has a high negative predictive value in eliminating ethylene glycol from the list of etiologies causing unexplained metabolic acidosis.
Acetaminophen
Acetaminophen is an effective analgesic and antipyretic drug that lacks antiinflammatory action. It is available in pure form and also in combination with other drugs such as codeine and propoxyphene. It presents less risk for producing gastrointestinal ulceration and hemorrhage than aspirin and other nonsteroidal antiinflammatory drugs. With the reported link of Reye’s syndrome to aspirin use, usage of acetaminophen as an over-the-counter medication has surpassed that of aspirin in recent years. At typical nonprescription doses of 325 to 1,000 mg every 4 hours, acetaminophen is a safe drug. At higher doses, acetaminophen is hepatotoxic, although toxic dosage is variable; liver damage may occur after single doses of 7.5 g or greater in healthy adults or 150 mg/kg in children (50). Individuals on other medications that are hepatotoxic or those who have liver diseases or who are alcoholics, however, will be susceptible to acetaminophen toxic effects at lower doses (51).
Acetaminophen after a therapeutic dose is eliminated mostly as glucuronide or sulfate conjugates. Following overdose, the conjugation pathways are saturated and formation of a highly reactive intermediate (probably N-acetyl-benzoquinoneimine) takes place via the cytochrome P-450 mixed-function oxidase system (52). This metabolite, normally detoxified by endogenous glutathione, reacts with and destroys hepatocytes once glutathione stores are depleted.
Acetaminophen is absorbed rapidly, with peak plasma concentration reached within 30 to 120 minutes after therapeutic doses. Delayed peaks may occur with slower gastric emptying following large doses. Clinically, a patient overdosed on acetaminophen may present in four phases (50). In the initial phase, lasting 12 to 24 hours after ingestion, the patient usually exhibits gastrointestinal (GI) irritability, nausea, and vomiting. Some patients may be asymptomatic. During the second phase (24 to 72 hours postingestion) the patient may feel reasonably well while liver function test results become abnormal. If significant hepatic
necrosis has occurred, the third phase (72 to 96 hours postingestion) is characterized by the sequelae of hepatic necrosis including coagulopathy, jaundice, and encephalopathy. If the patient survives phase 3, complete resolution of hepatic dysfunction will ensue (4 days to 2 weeks).
N-Acetylcysteine is an effective antidote. In the United States, the standard oral regimen consists of a loading dose of 140 mg/kg followed by 17 doses of 70 mg/kg every 4 hours. Treatment is most successful when started within 8 hours of ingestion, and its effectiveness appears to extend to high-risk patients treated as late as 24 hours postingestion (53). In Canada and Britain, intravenous administration over a 20-hour period is the approved protocol (150 mg/kg loading dose over 15 minutes, followed by 50 mg/kg over 4 hours and then 100 mg/kg over 16 hours) (54). A nomogram relating time since ingestion, plasma drug concentration, and risk of hepatotoxicity is used in evaluating the need for N-acetylcysteine treatment (Fig. 22.2) (55). The nomogram is for use with acute ingestion only; it cannot be used to assess toxicity resulting from chronic misuse of acetaminophen. Because early treatment is critical to a favorable outcome, and the initial plasma-drug concentration is a crucial factor in the decision to initiate therapy, prompt and reliable measurement of the plasma-acetaminophen level is an important emergency toxicology service. Numerous methods are available for the analysis of acetaminophen. Colorimetric tests such as the cresol-ammonia spot test (11) or the Glynn and Kendal method are fast and sensitive, although the Glynn and Kendal method suffers interference from salicylates. The colorimetric method based on the prior hydrolysis of acetaminophen and its conjugated metabolites to indophenol is
not recommended because the aforementioned nomogram is based on the serum concentration of unconjugated acetaminophen only (56). HPLC procedures, though simple and rapid, are not as convenient as the FPIA and EMIT methods that are available in most clinical laboratories.
FIGURE 22.2. Nomogram relating acetaminophen plasma concentration, time since ingestion, and risk of toxicity. (Modified from Rumack BH, Matthew H. Acetaminophen poisoning and toxicity. Pediatrics 1975;55:871–876.)
Salicylates
Salicylate, as one of the least expensive and most widely used drugs, has been the cause of many drug overdose cases, particularly in the very young and the elderly. Salicylate still ranks as a leading cause of childhood poisoning deaths and still is commonly used in self-poisoning by adults (57, 58). Many derivatives of salicylic acid are available commercially; the most important is acetylsalicylic acid (aspirin), which is hydrolyzed rapidly to salicylic acid, and circulates in the blood in the ionized form, salicylate. In serum, salicylate at therapeutic concentration is highly protein bound, and the extent of binding varies with total salicylate concentration. The major metabolic pathways of salicylate conjugation with glycine and with glucuronic acid are saturable. Thus, the half-life of elimination of salicylate as well as the serum level of salicylate will increase disproportionately with increasing dosage (57).
The primary pathophysiological effects of salicylism are complex (59). They include direct stimulation of the respiratory center resulting in hyperventilation, respiratory alkalosis and compensatory excretion of base, uncoupling of oxidative phosphorylation, interference with the Krebs cycle, and accumulation of organic acids leading to metabolic acidosis. In children, respiratory alkalosis is transient and a late-stage dominant metabolic acidosis is common. In adult patients, the most common acid-base disturbance is mixed respiratory alkalosis and metabolic acidosis. Associated with acid-base disturbances are fluid and electrolyte imbalance and dehydration. Other metabolic effects of salicylate toxicity are hyperthermia and impaired glucose metabolism with either hyperglycemia or hypoglycemia. In overdosed patients who are acidemic, the lower blood pH will increase the amount of nonionized salicylate (pKa = 3.0) for transfer into the CNS. Thus, CNS disturbances often accompany those intoxicated patients who are severely acidemic.
The toxic severity after acute ingestion is related to the amount of drug ingested. Ingestion of less than 0.15 g/kg is unlikely to result in toxic symptoms. Mild to moderate toxic reactions can be expected from an ingested dose of 0.15 to 0.3 g/kg, doses of more than 0.3 g/kg lead to severe reactions, and ingestion of greater than 0.5 g/kg is potentially lethal. Chronic intoxication or therapeutic overdose results from excessive therapeutic administration of salicylate over a period of 12 hours or longer; zero-order kinetics lead to accumulation of salicylate in serum to toxic levels (60). Chronic salicylate intoxication is a diagnostic problem, particularly among elderly patients, because the presenting symptoms often are ascribed to other causes. Thus, the intoxication often goes unrecognized, and appropriate therapy is delayed. These patients suffer significant morbidity and mortality.
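Read as a triage rule, these dose bands translate into a short sketch (Python); the cut points are those quoted above, with the mild-to-moderate band taken as 0.15 to 0.3 g/kg:
```python
def acute_salicylate_risk(dose_g, weight_kg):
    """Classify an acute salicylate ingestion by dose per kilogram of body
    weight, using the bands quoted in the text (illustration only)."""
    g_per_kg = dose_g / weight_kg
    if g_per_kg < 0.15:
        return "toxic symptoms unlikely"
    if g_per_kg <= 0.3:
        return "mild to moderate toxicity expected"
    if g_per_kg < 0.5:
        return "severe toxicity expected"
    return "potentially lethal"

# Thirty 325-mg aspirin tablets (9.75 g) in a 70-kg adult, ~0.14 g/kg:
print(acute_salicylate_risk(30 * 0.325, 70.0))  # toxic symptoms unlikely
```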
Treatment for salicylate intoxication includes measures to prevent further absorption of salicylate and to correct the metabolic imbalances such as fluid and electrolyte depletion and acid-base disturbances. Alkalization with sodium bicarbonate to enhance the renal elimination of the drug should be considered in adults with serum salicylate level exceeding 50 mg/dL and children with levels greater than 35 mg/dL. If the salicylate level is greater than 100 mg/dL, and the patient is very ill or is unable to eliminate the salicylates, hemodialysis is indicated (59).
The availability of salicylate assay on an emergency basis is critical to confirm the clinical suspicion of acute salicylate intoxication. Diagnosis of chronic salicylate intoxication, particularly in the elderly, is much more difficult without a high degree of suspicion because patients may have become drowsy and confused and are unable to offer a reliable drug history. Therefore, documentation of elevated serum salicylate levels becomes very important in the differential diagnosis (57).
A nomogram (Done’s nomogram) (62) has been constructed to facilitate the interpretation of salicylate levels at different intervals after ingestion in order to predict the severity of intoxication (Fig. 22.3). The clinical value of the nomogram is limited because it was based on data from a pediatric population after a single acute ingestion of regular formulation (63). It does not predict the rate of salicylate elimination or future serum salicylate levels and it is not applicable for assessing the severity of chronic salicylate intoxication.
FIGURE 22.3. Nomogram for salicylate poisoning. (From Done AK. Salicylate intoxication. Significance of measurements of salicylate in blood in cases of acute ingestion. Pediatrics 1960;26:800–807.)
If the ingested salicylate is an enteric-coated or sustained-release formulation, the absorption of salicylate will be delayed. The diagnosis of salicylate intoxication based on the serum level at admission can be missed because of delayed absorption (64). If ingestion of enteric-coated or sustained-release salicylate is suspected, the patient must be observed for at least
24 hours, and serum salicylate determination should be repeated. Peak salicylate level may not be attained until 60 to 70 hours postingestion.
Simple qualitative screening tests such as ferric chloride and Trinder’s reagent for urine salicylate are useful for quick confirmation of salicylate overdose if quantitation of serum levels is not available immediately. All positive results should be confirmed using quantitative assays and serum samples (57).
The majority of salicylate assays in use are colorimetric assays based on the reaction of salicylic acid with ferric ion in an acid medium to give a purple color complex (56, 57). This simple and rapid reaction is not specific for salicylate; metabolites of salicylate, ketone bodies, and catechols such as tyrosine and the catecholamines also react. Ketone bodies and catecholamines frequently are elevated in disease states that are associated with metabolic derangement, such as ketoacidosis and Reye’s syndrome, and salicylate concentrations measured in these patients by a colorimetric method can be falsely high (65). Colorimetric assays nonetheless have proven acceptable for routine clinical use of salicylate levels in the diagnosis of salicylate intoxication. The assay based on Trinder’s method (66) is the most commonly used, although newer techniques such as FPIA, HPLC, and enzymatic assay using salicylate hydroxylase are available.
A different approach to the measurement of salicylate is the use of the enzyme salicylate hydroxylase (EC 1.14.13.1) purified from Pseudomonas cepacia (67, 68, 69). This enzyme, in the presence of reduced nicotinamide adenine dinucleotide (NADH) or reduced nicotinamide adenine dinucleotide phosphate (NADPH), converts salicylate to catechol. Salicylate concentration can be determined by monitoring photometrically the consumption of NADH (67) or the formation of catechol (68, 69). Compounds that are known to interfere with the colorimetric assays do not affect the enzymatic methods.
Barbiturates
Barbiturate poisoning, whether by accident or, more often, in suicide attempts, was a major health problem in the past. It has lessened in recent years because their use as anxiolytics has been replaced by the safer benzodiazepines; medical use of barbiturates now is primarily for treatment of insomnia and convulsive disorders, and as anesthetic and preanesthetic medications. The barbiturates are derivatives of barbituric acid (Fig. 22.4). Pharmacologically, they can be divided broadly into ultra-short-acting (e.g., thiopental), short- to intermediate-acting (e.g., secobarbital and butalbital), and long-acting groups (e.g., phenobarbital) depending on their duration of action (70). Drug abusers prefer the short- to intermediate-acting barbiturates because of their relatively rapid onset of action (15 to 40 minutes). Duration of drug action, which may last up to 6 hours, is dependent on lipid solubility. Thus, the short-acting barbiturates have higher lipid solubility, greater potency, and more rapid clearance from the central compartment. In contrast, the long-acting barbiturates have lower lipid solubility, lower potency, and much longer half-lives of elimination.
FIGURE 22.4. Common barbiturates.
The barbiturates are weak acids with pKa values ranging between 7.2 and 7.9. Hence, at pH 7.4, a barbiturate such as phenobarbital (pKa = 7.2) will be about 40% nonionized, whereas secobarbital (pKa = 7.9) is 76% nonionized and therefore is more lipid soluble and more rapidly distributed to tissues.
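These percentages follow directly from the Henderson-Hasselbalch relationship for a weak acid; a minimal check (Python):
```python
def nonionized_fraction(pka, ph=7.4):
    """Fraction of a weak acid in the nonionized (lipid-soluble) form,
    from Henderson-Hasselbalch: 1 / (1 + 10**(pH - pKa))."""
    return 1.0 / (1.0 + 10.0 ** (ph - pka))

print(round(nonionized_fraction(7.2), 2))  # 0.39 -> phenobarbital, about 40%
print(round(nonionized_fraction(7.9), 2))  # 0.76 -> secobarbital, 76%
```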
The major actions of the barbiturates are their depressant effects on the CNS and cardiovascular system. Acute barbiturate intoxication characteristically is associated with coma and shock; the former always must be differentiated from other forms of coma or CNS injury. Plasma barbiturate levels may be helpful in making a diagnosis, but they are of limited value in predicting the severity of the overdose or the duration of coma, which are related more closely to brain than to plasma barbiturate concentration. In addition, chronic barbiturate users are expected to have higher plasma concentrations for any grade of coma because of the tolerance they have developed. Moreover, the depth of coma can be greater than might be expected from the plasma concentration if other CNS depressants, such as alcohol, also have been ingested.
Treatment of barbiturate intoxication consists of aggressive support to combat shock and hypoxia in addition to lavage and charcoal treatment. Alkalization of urine to pH 7.5 to 8.0 with sodium bicarbonate will increase the fraction of ionized drug and will enhance excretion in urine. This is helpful for long-acting barbiturates because their principal means of elimination is renal. Alkalization is less effective for short-acting barbiturates because they have higher pKa values, are more highly protein-bound, and they are metabolized primarily by the liver (70).
Immunoassays detect barbiturates as a group and provide rapid results. Barbiturates other than the one used as the assay calibrator may have different cross-reactivities to the assay antibody. For example, an enzyme immunoassay that uses secobarbital as the calibrator will detect the less reactive phenobarbital only at much higher concentrations.
Identification of the specific barbiturate(s) requires chromatographic separation. TLC can effectively separate phenobarbital from other barbiturates, but the short- and intermediate-acting group (amobarbital, butabarbital, butalbital, secobarbital, and pentobarbital) have similar migration, and identification requires skill and experience (18). Gas chromatography and HPLC methods for identification of the commonly encountered barbiturates are available (71, 72); mass spectral identification of the barbiturates must be done with care because these drugs are structurally similar and yield similar spectral data (73).
Benzodiazepines
The benzodiazepines are among the most frequently prescribed drugs. They vary in their potency in anxiolytic, hypnotic, muscle relaxant, anticonvulsant, and anesthetic effects. One useful classification of the benzodiazepines is according to their half-lives: long-acting (greater than 24 hours), intermediate- to short-acting (5 to 24 hours), and ultra-short-acting (less than 5 hours) (Table 22.2). A long-acting benzodiazepine such as diazepam is well suited as an anxiolytic agent, whereas a short-acting benzodiazepine such as triazolam is more appropriate as a hypnotic (70).
TABLE 22.2. COMMON BENZODIAZEPINES

Generic Name Trade Name Half-life (hr)

1,4-Benzodiazepines    
Chlordiazepoxide Librium 6–27
Diazepam Valium 21–37
Oxazepam Serax 4–11
Clorazepate Tranxene 2
Flurazepam Dalmane 2–3
Lorazepam Ativan 9–16
Temazepam Restoril 3–13
Halazepam Paxipam 14–16
Prazepam Centrax 1.3
Triazolobenzodiazepines    
Alprazolam Xanax 10–12
Triazolam Halcion 2.6
Midazolam Versed 1–4

Benzodiazepines also are useful for treating neuromuscular diseases, sleep disorders, seizure, drug withdrawal, and as pre-anesthetic agents. Continuous use has led to tolerance, addiction, dependence, and abuse (74).
All the benzodiazepines are metabolized and cleared by the kidney as glucuronide or sulfate conjugates. Many of the metabolites are active, and some benzodiazepines, such as clorazepate and flurazepam, are considered to be prodrugs (Fig. 22.5). The major biotransformation routes are demethylation (e.g., chlordiazepoxide, temazepam) and hydroxylation (e.g., diazepam, nordiazepam).
FIGURE 22.5. Major biotransformation pathways of common benzodiazepines. 1, dealkylation; 2, hydroxylation; 3, glucuronidation; 4, decarboxylation.
With the popularity of these drugs, overdose is a frequent occurrence, yet fatalities resulting from benzodiazepines alone are very rare (75). Patients intoxicated with benzodiazepines are sedated and side effects include respiratory depression, weakness, headache, blurred vision, nausea, and diarrhea. Serious intoxication with benzodiazepines usually results from co-ingestion with other CNS depressants (e.g., ethanol). Therefore, in the evaluation of a patient suspected of benzodiazepine intoxication, it is important to investigate if other drugs also have been ingested. Plasma benzodiazepine concentration does not predict the severity of intoxication or outcome. A specific benzodiazepine antagonist, flumazenil, has been reported to effectively reverse benzodiazepine-induced CNS symptoms. Its benefit, however, must be weighed against the risk of precipitating acute benzodiazepine withdrawal, or seizure among patients who also may have taken seizure-causing drugs such as the cyclic antidepressants (76).
Benzodiazepine levels in blood and plasma can be determined by HPLC or GC assays (77, 78). Some benzodiazepines, such as chlordiazepoxide and oxazepam, are heat-labile and are more suitable for HPLC analysis. GC methods using electron-capture or mass spectrometric detection have the best sensitivity, capable of detecting triazolobenzodiazepines at low ng/mL levels, whereas those dependent on FID detection are adequate for drug levels in the high therapeutic or toxic ranges. TLC does not easily detect benzodiazepines without prior acid hydrolysis to form benzophenones, which are strongly fluorescent (79). Specific identification is not possible because different benzodiazepines can form the same benzophenone. Also, some benzodiazepines
(e.g., alprazolam and triazolam) are stable to acid hydrolysis and are not converted to benzophenones.
Commercially available immunoassays for benzodiazepines in urine or serum are designed to detect oxazepam or nordiazepam, metabolites that are common to several benzodiazepines (Fig. 22.5). Immunoassays have sufficient cross-reactivity with many of the newer benzodiazepines to detect therapeutic levels of some (e.g., alprazolam), but only higher levels for others (e.g., triazolam) (78, 80, 81).
Cyclic Antidepressants
The cyclic antidepressants are a major cause of life-threatening drug overdose and are responsible for more deaths than any other drug class except the analgesics (58). Drugs in this class of commonly prescribed antidepressants include the tricyclic compounds, such as imipramine, desipramine, trimipramine, amitriptyline, nortriptyline, doxepin, and loxapine; the tetracyclic compounds maprotiline and mianserin; and the bicyclic fluoxetine. Trazodone, a triazolopyridine, and amoxapine, a dibenzoxazepine, while structurally unrelated to the tricyclic compounds, often are classified or described with them. These cyclic compounds are rapidly absorbed from the GI tract, but absorption may be delayed after large doses as a result of the anticholinergic effect of delayed gastric emptying. The volumes of distribution of these drugs are large (10 to 50 L/kg), and the extent of binding to plasma protein is high (greater than 90%). There is extensive first-pass metabolism, and the major hydroxylated metabolites of the tricyclic compounds and demethylated metabolites of the bi- and tetracyclic compounds are pharmacologically active. The toxic effects of the tricyclic and tetracyclic antidepressants are similar and are related mostly to their anticholinergic and cardiotoxic effects. Anticholinergic effects result in agitation, seizure, coma, and hallucination. Circulatory collapse, serious cardiac arrhythmia, conduction disturbances, and heart block can occur; arrhythmia is the leading cause of death in tricyclic overdoses (82). The toxicity of trazodone differs from that of the tricyclic and tetracyclic antidepressants in that there are no anticholinergic signs or symptoms; the primary manifestations are CNS effects and hypotension. Fluoxetine adverse effects are anxiety, nervousness, and insomnia.
Life-threatening events following a tricyclic overdose usually occur within the first 6 hours of ingestion (83). A patient is at low risk for toxicity if none of the following develops within 6 hours of ingestion: QRS interval greater than 100 msec, arrhythmia, altered mental status or seizures, respiratory depression, or hypotension (84).
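Rendered schematically, this rule is a simple all-findings-absent checklist (Python); the function and parameter names are hypothetical, not from the cited study:
```python
def low_risk_tca_ingestion(qrs_msec, arrhythmia, altered_mental_status,
                           seizures, respiratory_depression, hypotension):
    """Schematic of the 6-hour rule (84): a patient observed for 6 hours
    is at low risk only if every finding below is absent."""
    return not any([
        qrs_msec > 100,
        arrhythmia,
        altered_mental_status,
        seizures,
        respiratory_depression,
        hypotension,
    ])

# A patient with QRS 90 msec and none of the other findings at 6 hours:
print(low_risk_tca_ingestion(90, False, False, False, False, False))  # True
```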
Tricyclic antidepressant-intoxicated patients with peak (not admission) serum drug levels greater than 1,000 ng/mL were at increased risk for seizures and ventricular arrhythmias (85). There was no correlation, however, between drug levels and toxic symptoms over a wide range of tricyclic antidepressant levels, including those below 1,000 ng/mL, and drug levels did not predict severity of symptoms (83). Therefore, measurement of serum tricyclic antidepressant levels, particularly if requested on a “stat” basis to predict the severity of acute tricyclic antidepressant poisoning, is unwarranted.
The tricyclic antidepressants can be detected and identified in urine by TLC or one of the tricyclic antidepressant group-specific immunoassays. Because the toxic effects resulting from an overdose of these compounds are so similar, demonstration of the presence of one of these drugs without specific identification often is sufficient for management. Immunoassays (EIA, FPIA) are designed primarily for detection of the major tricyclic antidepressants (imipramine, desipramine, amitriptyline, nortriptyline) but, because of the cross-reactivities of the antibody for other members of the tricyclic antidepressant family, these immunoassays also are capable of detecting other tricyclic antidepressants and metabolites (86). A single-use device designed to test for tricyclic antidepressants and several drugs of abuse has been evaluated for use in the emergency department (13). Although a positive result does not necessarily indicate a toxic concentration of the drug, a negative response will rule out the possibility of an overdose. Quantitative immunoassays (EMIT) for the four major tricyclic antidepressants also are available. These assays require solid-phase extraction from serum or plasma before analysis. Quantitative determination of serum levels is performed best using HPLC (77, 87).
Cyclobenzaprine is prescribed extensively as a muscle relaxant. It is a tricyclic compound, structurally differing from amitriptyline only in having a double bond in the cycloheptane ring. Analytical distinction between the two is difficult; they are indistinguishable in immunoassay cross-reactivity and in chromatographic elution on TLC or HPLC. Distinction can be achieved by HPLC using a photodiode array detector to exploit differences in their UV spectra (88) or by careful examination of their different fluorescence characteristics in Stage III of the TOXI-LAB TLC system (89).
Iron
Iron supplements in various forms of iron salts are readily available and ferrous sulfate, the cheapest and most common iron salt, is involved frequently in overdose. Other iron salts are gluconate, fumarate, succinate, lactate, chloride, ferrocholinate, and glutamate. The elemental iron dose ingested can be calculated from the percentage of elemental iron in the formulation (Table 22.3). Acute iron poisoning is particularly common in the pediatric population, with the majority of the reported exposure occurring in children less than 6 years of age (58).
TABLE 22.3. ELEMENTAL IRON EQUIVALENTS IN IRON SALTS

Salt % Fe mg Fe/Tablet (a)

Sulfate 20 65
Gluconate 12 38
Fumarate 33 106
Lactate 19
Chloride 28
Ferrocholinate 13

(a) 325-mg tablet
Fe, iron
In acute iron poisoning, an estimation of the amount of elemental iron is important in assessing potential toxicity. An ingestion of 10 to 20 mg of elemental iron per kilogram of body weight can be associated with toxicity (90). A dose of 20 to 60 mg/kg can pose moderate risk, whereas a dose greater than 60 mg/kg has high risk for serious toxicity.
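Table 22.3 and these risk bands combine into a simple dose estimate, sketched below (Python); the iron fractions are those of the table and the case is hypothetical:
```python
ELEMENTAL_FE_FRACTION = {  # from Table 22.3
    "sulfate": 0.20, "gluconate": 0.12, "fumarate": 0.33,
    "lactate": 0.19, "chloride": 0.28, "ferrocholinate": 0.13,
}

def elemental_iron_mg_per_kg(salt, tablet_mg, n_tablets, weight_kg):
    """Estimate the elemental iron dose (mg/kg) of an acute ingestion."""
    return ELEMENTAL_FE_FRACTION[salt] * tablet_mg * n_tablets / weight_kg

# Ten 325-mg ferrous sulfate tablets (65 mg Fe each) in a 15-kg child:
dose = elemental_iron_mg_per_kg("sulfate", 325, 10, 15)
print(round(dose))  # ~43 mg/kg, in the 20 to 60 mg/kg moderate-risk band
```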
Five clinical stages of acute iron poisoning have been described. The first stage is the initial postingestion period lasting up to 6 hours. As iron is absorbed in the duodenum and upper small intestine, the GI symptoms manifested at this stage are a direct result of the corrosive action of iron; the absence of these symptoms during this period excludes serious injury. During the next stage (latent period) of up to 12 hours, the patient may appear to improve. If the dose ingested is sufficiently large, the patient’s condition may progress directly to a third stage (12 to 48 hours postingestion) of systemic toxicity with cardiovascular collapse, seizure, coma, and shock. The fourth stage is hepatic failure 2 to 3 days postingestion. In a fifth, late stage (2 to 4 weeks postingestion), some survivors develop strictures and GI tract obstruction (90).
Ingestion of iron can be documented by abdominal radiograph, which can reveal radiopaque iron-containing pills prior to dissolution as much as 6 hours or longer after ingestion of adult-strength tablets. Pediatric preparations, which are chewable iron supplements, dissolve rapidly in 30 to 60 minutes and are not always seen (91).
A qualitative test to predict potential for toxicity is the deferoxamine challenge test. A sufficiently large deferoxamine dose (90 mg/kg) is given to bind free iron in plasma, forming the reddish ferrioxamine complex, which is excreted in urine. The appearance of ferrioxamine in urine within 4 to 6 hours implies potential toxicity. False-negative results have been reported, and a negative challenge test should not rule out toxicity (92).
Serum iron level usually is determined along with total iron-binding capacity (TIBC). Serum iron levels between 300 and 500 μg/dL are associated with significant GI toxicity. Patients with serum iron levels above 500 μg/dL can exhibit systemic toxicity and shock, and will require chelation therapy. Sampling of blood for determination of serum iron should be on admission and again at 4 to 6 hours postingestion when peak levels are thought to occur.
Theoretically, if the serum iron value is greater than the TIBC, then the free circulating iron is potentially toxic. The use of TIBC is discouraged because of spurious elevation of TIBC following iron poisoning from a laboratory artifact (93).
The management of acute iron poisoning includes removal of residual iron in the GI tract by emesis or lavage and standard support therapy (90). Deferoxamine therapy is used to chelate free serum iron.
Most methods for measuring serum iron levels are based on spectrophotometric measurement of a color complex formed by the chelation of ferrous iron to a dye. Deferoxamine, as a competing chelator, interferes with dye-binding colorimetric assays to give falsely low results (94). Therefore, blood for serum iron levels should be drawn before deferoxamine chelation therapy. Atomic absorption spectroscopy for serum iron is not routinely available but is a method that can be used in the presence of deferoxamine.
Cannabinoids
The term cannabinoids denotes a group of more than 60 compounds found in the plant Cannabis sativa L. Δ9-Tetrahydrocannabinol (THC) is the major psychoactive compound of marijuana.
The effects of THC, when smoked, appear within minutes and seldom last longer than 2 or 3 hours. Oral intake delays the onset of symptoms for 30 minutes to 2 hours but the duration of drug action is longer. The effects of THC are tempered by dose, route of administration, and experience of the user. There usually is a sense of euphoria, an altered perception of time, a keener sense of hearing, and more vivid visual imagery. Both short-term memory and task performance are impaired. Higher doses can induce frank hallucinations, delusions, and paranoid feelings. The most consistent cardiovascular effects are an increase in pulse rate and conjunctival reddening (95).
THC, a lipophilic drug with a large volume of distribution estimated to be about 10 L/kg, is sequestered in certain organs such as the liver and lung. In man, THC is transformed rapidly, first by the hepatic cytochrome P-450 enzyme system to 11-hydroxy-Δ9-THC (11-hydroxy-THC), which then is oxidized by alcohol dehydrogenase to 11-nor-Δ9-THC-9-carboxylic acid (THCA). 11-Hydroxy-THC is a psychoactive metabolite, whereas THCA is devoid of psychoactivity (95).
The primary urinary metabolite is THCA, which exists both as the free acid and glucuronide conjugate. Because of the large tissue storage of THC, continuous reentry of THC from tissues into the central compartment followed by metabolism means
that THCA is excreted into the urine long after a person has stopped using marijuana. The duration over which urine samples test positive depends on the dose, the metabolism of THC, the route and frequency of use, the timing of urine collection, the quantity of liquid taken prior to specimen collection, and the assay cutoff. Generally, infrequent users test positive for 2 to 5 days after each dose when a 20-μg/L cutoff is used, but heavy users have been reported to test positive for more than 46 days (96, 97). Thus, a positive urine result indicates only that the subject has been exposed to marijuana hours to weeks before the collection of the urine specimen, and the result cannot be used to estimate the time of exposure. Because of variation in a patient’s hydration status, serial tests may show results fluctuating between positive and negative. A positive result following a negative one could be a result of residual excretion or dehydration and not reuse of marijuana. Serial determination of urine THCA concentrations normalized to urine creatinine can be helpful in interpretation (98, 99).
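A sketch of the creatinine-normalization idea (Python); the specimen values are hypothetical and the decision factor is a placeholder assumption, not a validated cutoff; see (98, 99) for validated approaches:
```python
def normalized_thca(thca_ng_ml, creatinine_mg_dl):
    """THCA normalized to urine creatinine (ng/mg), damping the
    hydration-driven swings described above."""
    return thca_ng_ml / creatinine_mg_dl

def suggests_new_use(ratio_now, ratio_prev, rise=1.5):
    """Illustrative rule only: a substantial rise in the normalized ratio
    between serial specimens argues against simple residual excretion.
    The 1.5-fold factor is a placeholder, not a validated cutoff."""
    return ratio_now / ratio_prev > rise

day_1 = normalized_thca(150.0, 120.0)  # 1.25 ng/mg
day_4 = normalized_thca(90.0, 40.0)    # 2.25 ng/mg despite a lower raw level
print(suggests_new_use(day_4, day_1))  # True: the raw decline masked new use
```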
Because marijuana is smoked frequently in social situations, it is possible that nonsmokers could inhale a sufficient amount of cannabinoids present in sidestream smoke to produce a positive urine cannabinoids test. Studies have shown that this can occur, but only after exposure to high concentrations of marijuana smoke in a small, unventilated area (100). Such extreme exposure conditions are not encountered in the usual social situations, and with higher cutoffs (e.g., 50 or 100 μg/L) used by many screening assays, the detection of a positive urine from passive inhalation is unlikely (101).
Screening usually is done with immunoassays as they have good sensitivity and they can be automated for batch analysis of large numbers of samples. The popular commercial immunoassays use antibodies to detect THCA, the major urinary metabolite, and many of the other metabolites of THC. These assays, therefore, measure the sum of the immunoreactive THC metabolites. In contrast, the chromatographic assays (TLC, GC, and GC-MS) separate and specifically detect and quantify THCA. The threshold concentrations of these confirmation chromatographic methods usually are set lower than those of the initial tests. For example, in the Federal Workplace Drug Testing Programs, a confirmation threshold of 15 ng/mL is used when the immunoassay cutoff is 50 ng/mL (102).
If a chromatographic assay is used, hydrolysis of the glucuronide metabolites is necessary. TLC has a detection limit as low as 10 μg/L, whereas GC-MS methods, using a deuterated internal standard and selected ion monitoring, can have a limit of quantitation of 5 ng/mL and lower.
Hemp and marijuana both are Cannabis sativa L. Hemp, cultivated to produce fiber and seeds for commercial use, generally contains relatively small amounts of Δ9-THC. Recently, food products derived from hemp seeds and hemp seed oil have become popular health food items and are easily available on the internet. These products contain varying amounts of Δ9-THC (103), and ingestion of these products can lead to positive drug test results for marijuana (104, 105). In one study, volunteers ingested 11 or 22 g of hemp seed oil containing a high Δ9-THC concentration (1,500 μg/g of oil). Urine specimens collected were positive by the marijuana test for 6 days, with Δ9-THCA concentrations reaching 431 ng/mL by GC-MS analysis. All subjects felt the psychotropic effects of THC (103).
Opioids
The opioid drugs can be divided into naturally occurring, semi-synthetic, and synthetic groups. The naturally occurring opioids are morphine and codeine, which are derived from the opium poppy, Papaver somniferum L. Semi-synthetic opioids include heroin, hydromorphone, oxycodone, and oxymorphone. Synthetic opioids are meperidine, methadone, diphenoxylate, propoxyphene, and the fentanyls. Opioids have similar pharmacological effects and potential for addiction and tolerance; they vary only in their potency for analgesia, duration of action, and extent of abuse (106).
Opioids produce analgesic, respiratory depressant, euphoric, and emetic effects. The triad of miosis, coma, and respiratory depression is pathognomonic of opioid poisoning (107). The response to the pure opioid antagonist naloxone is both diagnostic and therapeutic for opioid intoxication.
Heroin and methadone are the most frequently abused opioids. In recent years, concomitant use of heroin or methadone with cocaine has been reported. The preferred route of heroin administration is intravenous, although heroin of sufficient purity now is available to be smoked or administered intranasally. Heroin (diacetylmorphine) is deacetylated rapidly to 6-monoacetylmorphine (6-MAM) with a half-life of 3 minutes, and 6-MAM then undergoes a second deacetylation step to form morphine (108). Consequently, only 6-MAM and morphine are found in the urine of heroin users (109). Morphine and 6-MAM are equally potent in their opioid effects. Thus, heroin acts as a prodrug for the active metabolites, 6-MAM and morphine.
Codeine is rapidly absorbed from an oral dose, with plasma concentration peaking at 1 hour postingestion. It is extensively metabolized to norcodeine, but at least 10% of the dose is transformed to morphine. Most of the dose is excreted in the urine as glucuronide conjugates of codeine, norcodeine, and morphine. Thus, morphine is a metabolite of both heroin and codeine. Consumption of baked goods containing poppy seeds can result in detectable amounts of morphine and codeine in urine (110, 111) because some batches of poppy seeds are contaminated with these opiates. When a urine specimen tests positive for morphine, it becomes important to ascertain whether the morphine comes from codeine in prescription medications, from heroin/morphine abuse, or from poppy-seed ingestion. A set of guidelines has been proposed to help in the interpretation of urine opiate results and the determination of the source of morphine and codeine (111, 112). Test results that can rule out poppy seeds as the sole source for morphine and codeine are: (i) a codeine level greater than 300 ng/mL with a morphine-to-codeine ratio less than 2 (indicative of codeine use); (ii) a high morphine level (greater than 1,000 ng/mL) when codeine is undetectable; (iii) a total morphine level exceeding 5,000 ng/mL (indicative of abuse of heroin, morphine, or codeine); and (iv) the presence of 6-MAM (a positive indication of heroin use).
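The four criteria read naturally as a rule set; a schematic rendering follows (Python), with thresholds as quoted above, the morphine-without-codeine level read as 1,000 ng/mL, and a codeine value of zero standing in for "undetectable":
```python
def poppy_seeds_ruled_out(morphine_ng_ml, codeine_ng_ml, six_mam_present):
    """True if any of criteria (i)-(iv) above excludes poppy seeds as the
    sole source of urinary morphine and codeine."""
    if codeine_ng_ml > 300 and morphine_ng_ml / codeine_ng_ml < 2:
        return True                  # (i) indicative of codeine use
    if morphine_ng_ml > 1000 and codeine_ng_ml == 0:
        return True                  # (ii) morphine without codeine
    if morphine_ng_ml > 5000:
        return True                  # (iii) heroin/morphine/codeine abuse
    if six_mam_present:
        return True                  # (iv) 6-MAM is proof of heroin use
    return False

print(poppy_seeds_ruled_out(450, 0, False))    # False: consistent with seeds
print(poppy_seeds_ruled_out(1200, 0, False))   # True by criterion (ii)
print(poppy_seeds_ruled_out(800, 500, False))  # True by criterion (i)
```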
To avoid “false-positive” morphine results as a result of poppy-seed ingestion, the Federal Workplace Drug Testing Program has raised the initial and confirmation test thresholds to 2,000 ng/mL of morphine and codeine. In addition, the Program now requires the test for the presence of 6-MAM in urine specimens confirmed to contain morphine (113).
The fentanyls are analgesic-anesthetic drugs that are many times more potent than morphine. Many analogs of fentanyl (3-methyl-, α-methyl-, and parafluoro- derivatives) have appeared on the street as “China white” (114). Heroin containing fentanyl poses a grave risk to heroin abusers. A standard urine drug screen most likely will fail to detect a fentanyl overdose because fentanyl concentration in urine is very low (μg/L), and the available urine immunoassays for opiates have no reactivity with the fentanyls.
Immunoassays are used commonly as preliminary tests for the opiates (morphine and codeine), methadone, and propoxyphene. The semi-synthetic opioids (dihydrocodeine, hydrocodone, hydromorphone, and oxycodone) also are detected by opiate immunoassays, thus necessitating a confirmation test (TLC, HPLC, GC, or GC-MS) that can distinguish between morphine and the other opioids. Conventional TLC has a detection limit of 1 mg/L of free morphine, which makes it suitable for use in overdose cases but not for drug abuse testing.
Cocaine
Cocaine (benzoylmethylecgonine) is an alkaloid extracted from the leaves of the Erythroxylon coca plant and purified as the hydrochloride salt (cocaine HCl). It is a powerful CNS stimulant, and in recent years the illicit use of cocaine has increased rapidly.
Cocaine hydrochloride is snorted or administered intravenously. Many cocaine users used to prepare free-base crystals for smoking (“free-basing”) by dissolving cocaine HCl in a solution of baking soda or ammonia, and extracting it with a solvent (ether) that is evaporated, leaving relatively pure cocaine crystals. “Crack” (so named because of the crackling sound made by the crystals when heated) also is a free-base form of cocaine, prepared by precipitation from an alkaline solution. Crack is relatively pure cocaine (80% to 90%), and when heated, is mostly vaporized rather than pyrolyzed. Its low cost and wide availability have worsened the cocaine epidemic.
Cocaine, a powerful CNS stimulant, produces heightened alertness, self-confidence, and an intense feeling of euphoria (“rush”). These stimulatory effects are followed by depression (“crash”). It is the positive reinforcement of the rush, and the desire to escape the crash, that lead to chronic cocaine abuse (115). Psychosis, repeated grand mal seizures, and coma are common following acute intoxication. Other clinical manifestations include arrhythmia, myocardial infarction, hypertensive crisis, cerebral vascular accidents, hyperthermia, and respiratory arrest. Spontaneous abortion and abruptio placentae are obstetric complications in women who use cocaine during pregnancy (3). Low birth weight and higher risks of congenital malformations, perinatal mortality, and neurobehavioral impairment are common outcomes for a fetus exposed to cocaine in utero (4). Necrosis of the nasal septum from snorting cocaine, lung damage, and pulmonary edema are some of the other complications associated with cocaine use.
Bioavailability of intranasal cocaine is variable (20% to 60%) and appears to be dose dependent. Smoking free-base cocaine provides more effective absorption (57% bioavailability) (116), with plasma concentration peaking rapidly within 5 minutes. The euphoric effect of smoking cocaine usually lasts for only 20 minutes. Similar results were obtained in subjects given cocaine intravenously. Cocaine is lipid soluble, readily crosses cell membranes, and distributes rapidly across the blood-brain barrier and the placenta.
The main routes of metabolism of cocaine are enzymatic and nonenzymatic hydrolysis of the methyl ester, giving benzoylecgonine (BE), and enzymatic hydrolysis of the benzoyl group by plasma and liver esterases, yielding ecgonine methyl ester (EME) (117). Further hydrolysis of both of these compounds gives ecgonine. Norcocaine, the product of N-demethylation, undergoes similar hydrolysis and is believed to be pharmacologically active, but it is present in very small amounts. The two major metabolites, BE and EME, are excreted into urine in about equal amounts (40% to 50%). Cocaine itself is excreted in very small amounts, less than 1% of the dose in 3 days. Elimination half-lives for BE, EME, and cocaine have been calculated from literature data to be 7.5, 3.6, and 1.5 hours, respectively (118). A pyrolytic product of crack cocaine, anhydroecgonine methyl ester (methylecgonidine, MEG), has been identified and detected in the urine, hair, saliva, sweat, and blood of crack smokers, but not in samples obtained from users of cocaine by other routes (119).
A pharmacologically active metabolite, cocaethylene, has been found in cocaine users following concurrent use of cocaine and ethanol (120). Cocaethylene is frequently present in clinical specimens which are positive for benzoylecgonine (121).
Addition of sodium fluoride to inhibit esterase activity, together with storage at low temperature, will stabilize cocaine in blood specimens for long-term storage. The major urine metabolites, BE and EME, are relatively stable.
Immunoassays are popular because they do not require sample pretreatment, and they are fast and easy to perform. Because these assays were designed primarily for use with urine specimens, the target analyte is the major urinary metabolite benzoylecgonine (122). Benzoylecgonine is not effectively extracted by nonpolar solvents and requires the addition of an alcohol (e.g., ethanol) or chloroform to make the extraction solvent more polar. The resulting extract can be used for various chromatographic techniques (TLC, HPLC, GC, and GC-MS). TLC sensitivity generally is 1 to 2 μg/mL, which is adequate for cocaine overdose cases. Sensitivity has been improved, with claimed detection limits of 0.25 μg/mL for use in drug abuse testing (TOXI-LAB). A disadvantage of this TLC procedure is the limited migration of BE away from the origin.
Gas chromatography procedures require derivatization of the carboxylic acid group of BE before chromatography (23). Procedures using a flame ionization detector achieve sensitivity down to 200 ng/mL, which can be enhanced further by the use of a mass-spectrometer detector.
The high specificity and sensitivity of GC-MS assays are utilized routinely for confirmation of BE in urine, using deuterated BE as the internal standard and the selected ion monitoring (SIM) mode.
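The quantitative principle behind such isotope-dilution confirmation can be sketched as follows. The function and all numeric values below are hypothetical illustrations, not parameters of any particular commercial assay.

# Minimal sketch of internal-standard (isotope-dilution) quantitation
# with a deuterated internal standard such as BE-d3. All numbers are
# hypothetical and chosen only to illustrate the arithmetic.
def quantify_by_internal_standard(area_analyte, area_istd,
                                  istd_conc_ng_ml, response_factor=1.0):
    """Analyte concentration from the ratio of the analyte's SIM peak
    area to that of the co-extracted deuterated internal standard.
    The response factor is established from calibrators run with the
    same amount of internal standard."""
    return (area_analyte / area_istd) * istd_conc_ng_ml / response_factor

# Example: BE peak area 41,000; BE-d3 area 50,000; 500 ng/mL BE-d3 added.
conc = quantify_by_internal_standard(41_000, 50_000, 500.0)
print(f"BE ~ {conc:.0f} ng/mL")  # ~410 ng/mL, above a 300 ng/mL cutoff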
Ecgonine methyl ester (EME) is a major urinary metabolite of cocaine. Its presence in urine is an indicator of cocaine use that is as sensitive and specific as that of BE. The use of EME has the advantage that, unlike BE, it is easily extracted from urine, and it has good chromatographic properties in TLC and GC systems (123).
Urine BE levels typically decline to below the usual 300 ng/mL cutoff within 24 to 96 hours after last exposure, but among long-term, high-dose abusers BE reportedly remained detectable for 10 to 22 days (124). Detection times, however, are assay-dependent. In a study of human subjects given 20 mg of cocaine HCl intravenously, the mean times of detection of the last positive urine specimen (greater than or equal to 300 ng/mL BE) after cocaine administration varied from 16.9 to 52.9 hours, depending on the commercial test used (TLC, EIA, FPIA, RIA) (122).
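Under the simplifying assumption of first-order elimination with the 7.5-hour BE half-life cited earlier, the detection window at a given cutoff grows only logarithmically with the starting concentration. The sketch below uses hypothetical initial concentrations for illustration.

import math

def hours_to_reach_cutoff(c0_ng_ml, cutoff_ng_ml, half_life_h):
    """Time for concentration to fall from c0 to the cutoff under
    first-order elimination: t = t_half * log2(c0 / cutoff)."""
    return half_life_h * math.log2(c0_ng_ml / cutoff_ng_ml)

BE_HALF_LIFE_H = 7.5  # from the text (118)
for c0 in (5_000, 50_000, 500_000):  # hypothetical initial BE levels
    t = hours_to_reach_cutoff(c0, 300, BE_HALF_LIFE_H)
    print(f"initial {c0:,} ng/mL -> below 300 ng/mL after ~{t:.0f} h")
# Each 10-fold increase in the initial level adds only ~25 h
# (7.5 * log2(10)). The much longer windows reported in chronic users
# (124) cannot arise from this simple model, suggesting accumulation
# and slower terminal kinetics not captured here.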
Sympathomimetic Drugs
Sympathomimetic drugs mimic the actions of the endogenous neurotransmitters that stimulate the sympathetic nervous system. Sympathomimetic agents include over-the-counter medications (e.g., ephedrine, pseudoephedrine, and phenylpropanolamine), illegal street drugs (e.g., amphetamine, methamphetamine, MDMA, and MDA), and herbal preparations (ephedra, Ma Huang). The clinical presentations of these many different agents are the classic sympathomimetic signs and symptoms, such as tachycardia, hypertension, diaphoresis, hyperthermia, acute psychosis, and seizures. Poisoning occurs secondary to the use of prescription agents, over-the-counter medications, and illicit drugs. A contemporary example is the abuse of designer agents (particularly MDMA, or “ecstasy”) at rave dance parties, where users have suffered dehydration, hyperthermia, and cardiac arrhythmia (125).
Methamphetamine and amphetamine are strong CNS stimulants that at low doses produce euphoria, increased alertness, intensified emotions, and a sense of well-being. Because of its ease of manufacture, ready availability, and longer half-life (approximately 10 hours, compared with about 1.5 hours for cocaine), the abuse of methamphetamine has increased in recent years (126). Methamphetamine may be taken orally, intravenously, or by smoking; the smokeable form has the street name “ice.” Methamphetamine is excreted in the urine largely as unchanged drug, with a small amount (approximately 7%) demethylated to amphetamine. With a pKa of approximately 9.9 for both amphetamine and methamphetamine, elimination of these drugs is highly dependent on urine pH: up to 74% of a dose of amphetamine is excreted unchanged in acidic urine, but only 2% in alkaline urine (30).
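The urine-pH effect follows from the Henderson-Hasselbalch relationship: for a weak base with a pKa of 9.9, only the small un-ionized fraction can be reabsorbed from the renal tubule, and that fraction rises steeply with pH. A minimal sketch, with illustrative pH values:

# Henderson-Hasselbalch sketch for a weak base (pKa ~9.9, per the text).
# Illustrative only; urinary recovery also depends on flow and dose.
def fraction_unionized_base(ph, pka=9.9):
    """Un-ionized (reabsorbable) fraction of a weak base:
    f = 1 / (1 + 10 ** (pka - ph))."""
    return 1.0 / (1.0 + 10 ** (pka - ph))

for ph in (5.0, 6.5, 8.0):
    print(f"urine pH {ph}: {100 * fraction_unionized_base(ph):.4f}% un-ionized")
# At pH 5 essentially all drug is ionized and trapped in the urine
# (high excretion); at pH 8 the un-ionized, reabsorbable fraction is
# roughly 1,000-fold greater, consistent with the far lower urinary
# recovery in alkaline urine.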
The sympathomimetics can be detected by immunoassays. Many kits are commercially available, either as reagent packs for automated instruments or as single-use devices designed for near-patient testing (15, 16). In choosing a kit, one must be familiar with the immunospecificity of the assay and the cutoff used. Most products on the market are designed to meet the demands of workplace drug testing, with high specificity for methamphetamine and amphetamine, the target drugs of abuse in most testing programs. Assays with low cross-reactivity with over-the-counter sympathomimetic amines such as ephedrine or phenylpropanolamine reduce the need for costly GC-MS confirmation. In the clinical setting, however, it is important to be able to detect other sympathomimetics as well: emergency department and drug-treatment clinic encounters with designer drugs (MDMA, MDA), prescription (phentermine) or nonprescription (ephedrine, PPA) drugs, and herbal preparations (ephedra) are common.
Specific identification of individual sympathomimetics requires chromatographic assays, typically those based on TLC or GC-MS. In a TLC kit commonly used in many clinical toxicology laboratories, the migration and color characteristics of the sympathomimetics are similar, although subtle differences do allow correct identification by expert chromatographers (18). The manufacturer of the kit provides a special (remigration) procedure for detection and differentiation of the common sympathomimetics. The detection limits for the sympathomimetics are in the range of 0.5 to 1.0 μg/mL, which is adequate for testing patients suspected of an overdose.
Gas chromatography of the sympathomimetics requires chemical derivatization. The reagents most commonly used with primary and secondary amines include heptafluorobutyric anhydride (HFBA), pentafluoropropionic anhydride (PFPA), trifluoroacetic anhydride (TFAA), and 4-carbethoxyhexafluorobutyryl chloride (4-CB). A methamphetamine artifact has been reported in some specimens derivatized with these reagents (127). The source appeared to be ephedrine, present at very high concentrations in these specimens; under the conditions of the assay, a small amount of the ephedrine was converted to methamphetamine. The laboratory should be careful in reporting a positive methamphetamine result when ephedrine is present at high concentration (>0.1 mg/mL) and amphetamine, a methamphetamine metabolite, is not detected at ≥200 ng/mL. Various solutions to this problem have been proposed, including lowering the injection-port temperature or using a different derivative (128). The most effective solution is the fragmentation of ephedrine and related compounds by periodate treatment (129).
The d and l enantiomers of methamphetamine have very different CNS stimulant potencies. d-Methamphetamine is about 10 times more potent as a stimulant, is a controlled substance in the United States, and is the preferred drug of abuse. l-Methamphetamine has greater peripheral vasoconstrictive action and is used in an over-the-counter nasal inhaler (Vicks Inhaler, Procter & Gamble, Cincinnati, OH). Prescription pharmaceutical methamphetamine is the d isomer, whereas illicitly synthesized methamphetamine can be pure d-methamphetamine or the racemate. The standard clinical toxicology tests for the amphetamines (TLC, HPLC, or GC-MS) cannot differentiate between the enantiomers. A chiral derivatizing reagent may be used to convert the enantiomers into diastereomeric derivatives, which can be separated by standard chromatography (130).
A number of drugs or compounds are metabolized to methamphetamine and/or amphetamine. These include amphetaminil, benzphetamine, clobenzorex, deprenyl, dimethylamphetamine, ethylamphetamine, famprofazone, fencamine, fenethylline, fenproporex, furfenorex, mefenorex, mesocarb, and prenylamine (128). Proper interpretation of positive methamphetamine or amphetamine results must take into account the medication history of the patient.
Date-Rape and Knockout Drugs
The use of drugs to facilitate sexual assault is not new. Ethanol is the most frequently involved drug, being particularly effective when used together with other CNS depressants. Newer drugs that have been implicated include flunitrazepam and γ-hydroxybutyrate (GHB), although any benzodiazepine is effective, particularly when it is ingested with ethanol (131). These drugs cause the victim to lose consciousness and the ability to make rational decisions. Moreover, perpetrators take advantage of the anterograde amnesia that their victims frequently suffer.
Flunitrazepam (Rohypnol), with street names such as “rooffies,” “rochies,” “rocha,” and “rophies,” is not legally available in the United States. It dissolves readily in alcohol and is colorless, odorless, and tasteless. Detection of flunitrazepam in blood or urine is difficult because of the small amount administered (typically 1 to 2 mg) and its extensive metabolism. Documentation of exposure usually is based on the detection of the major metabolite, 7-aminoflunitrazepam. Many of the commercially available immunoassays for benzodiazepines have sufficient cross-reactivity with this metabolite to detect its presence in urine. Definitive confirmation is by GC-MS (132). It is recommended that urine and blood specimens be collected as soon as possible after the alleged assault, preferably within 24 hours (133).
Gamma-hydroxybutyrate is a naturally occurring product of the metabolism of the neurotransmitter γ-aminobutyric acid (GABA). In the 1980s, it was abused by body builders as a steroid enhancer and was easily available in health-food stores and mail-order outlets until its ban by the Food and Drug Administration in 1990. It continues to be popular as an illicit drug among “rave” party participants for its euphoric and hallucinogenic properties. Gamma-hydroxybutyrate is a CNS depressant, and an excessive amount ingested in combination with ethanol can cause profound coma (134, 135); hence its implication as a date-rape drug. Gamma-butyrolactone (GBL), with the street name “blue nitro,” and 1,4-butanediol (BD) are converted in vivo to GHB. In February 2000, GHB, GBL, and BD were added to Schedule I of the Controlled Substances Act.
There is no readily available screening test for GHB. Some laboratories may be able to use their urine organic-acid screening methods or specific GC-MS procedures to detect GHB (136, 137).