CHAPTER 3

Clinical Chemistry

Margi Sirois

KEY POINTS

• The availability of affordable in-house analyzers for blood chemistry analysis has improved patient care.

• Clinical chemistry testing usually requires either a serum or plasma sample.

• Protein assays commonly performed in the veterinary practice laboratory are the total protein and albumin tests.

• Tests of the hepatobiliary system commonly performed in the veterinary practice laboratory are alanine aminotransferase, aspartate aminotransferase, bilirubin, and alkaline phosphatase.

• Common tests for kidney function include blood urea nitrogen and creatinine.

• Pancreatic function tests evaluate either the endocrine or the exocrine functions of the pancreas.

• Tests of pancreatic exocrine function include amylase and lipase.

• Pancreatic endocrine function can be evaluated with the glucose, fructosamine, and glycosylated hemoglobin tests.

• Other endocrine function tests (e.g., adrenocorticotropic hormone, thyroid assays) primarily use immunologic methods rather than chemical analyses.

• Electrolyte assays performed in the veterinary practice laboratory include sodium, potassium, and chloride.

• Some electrolyte analyzers can also evaluate calcium, phosphorus, magnesium, and bicarbonate.

In both human and veterinary medical practice, current trends indicate a move toward greater point-of-care capabilities. This translates into better customer service and enhances the practice of veterinary medicine. Determinations of levels of the various chemical constituents in blood can be an important aid in the formulation of an accurate diagnosis, prescription of proper therapy, and documentation of the response to treatment. The chemicals being assayed are generally associated with particular organ functions; they may be enzymes characteristic of certain organs or metabolites and metabolic by-products that are processed by certain organs. Analysis of these components usually requires a carefully collected blood serum sample. Plasma may be used in some cases. Chemical measurements should be completed within 1 hour after blood collection. If testing will be delayed, freezing the sample will preserve the integrity of most of the constituents. Freezing may interfere with some test methods, however. Certain anticoagulants may also interfere with particular chemical analyses. Many factors other than disease influence the results of chemistry tests. These factors may be preanalytical, analytical, or postanalytical (see Chapter 1).

Many veterinary practices own or lease chemistry analyzers to perform routine chemical assays. This focus on in-house laboratory work makes the veterinary technicians’ laboratory skills perhaps their biggest asset to the practice.

As the person most likely to be in charge of the laboratory, the veterinary technician must become familiar with the types of analytic instruments available (see Chapter 1), the variety of testing procedures used, and the rationale underlying the analyses. The most important contribution the technician can make to the practice laboratory is accurate and reliable test results. In vitro results must reflect, as closely as possible, the actual in vivo levels of blood constituents.

SAMPLE COLLECTION

Most chemical analyses require collection and preparation of serum samples. Whole blood or blood plasma may be used for some test methods or with specific types of equipment. The instructions that accompany the chemistry analyzer should be consulted for the type of sample required. Collection of a high-quality sample on which to perform an assay has a direct effect on the quality of test results. Most adverse influences on sample quality can be avoided with careful consideration of sample collection and handling.

Specific blood collection protocols vary depending on the patient species, volume of blood needed, method of restraint, and types of samples needed. Chapter 2 contains additional information on blood collection protocols, supplies, and equipment. Blood samples for chemical testing should always be collected before treatment is initiated. Administration of certain medications and treatments may affect results of biochemical testing. Preprandial samples, or samples from an animal that has not eaten for 12 hours, are preferred. Postprandial samples, or samples collected after an animal has eaten, may produce erroneous results. Samples taken after the patient has eaten can produce false values for a number of blood components, including glucose, urea, and lipase. Regardless of the method of blood collection, the sample must be labeled immediately after it has been collected. The tube should be labeled with the date and time of collection, the owner’s name, the patient’s name, and the patient’s clinic identification number. If submitted to a laboratory, include with the sample a request form that includes all necessary sample identification and a clear indication of which tests are requested.

Plasma

Plasma is the fluid portion of whole blood in which the cells are suspended. It is composed of approximately 90% water and 10% dissolved constituents, such as proteins, carbohydrates, vitamins, hormones, enzymes, lipids, salts, waste materials, antibodies, and other ions and molecules. Procedure 3-1 describes obtaining a plasma sample. The sample must not be contaminated with any cells from the bottom of the tube after centrifugation. If the sample cannot be centrifuged within 1 hour, it must be refrigerated. If heparinized plasma has been stored overnight after separation or has been frozen, the sample should be centrifuged again to remove any fibrin strands that may have formed. Freezing may affect certain test results; the test instructions should be consulted for all the tests that must be run before a plasma sample is frozen.

PROCEDURE 3-1   Plasma Sample Preparation

1. Collect a blood sample in a container with the appropriate anticoagulant.

2. Mix the blood-filled container with a gentle rocking motion 12 times.

3. Make sure the container is covered to prevent evaporation during centrifugation.

4. Centrifuge (within an hour of collection) at 2000 to 3000 rpm for 10 minutes.

5. With a capillary pipette, carefully remove the fluid plasma layer from the bottom layer of cells.

6. Transfer the plasma to a container labeled with the date, time of collection, patient’s name, and case or clinic number.

7. Process immediately or refrigerate or freeze as appropriate.

Serum

Serum is plasma from which fibrinogen, a plasma protein, has been removed. During the clotting process, the soluble fibrinogen in plasma is converted to an insoluble fibrin clot matrix. When blood clots, the fluid that is squeezed out around the cellular clot is serum. Obtaining a serum sample is described in Procedure 3-2. Centrifuging at speeds greater than 2000 to 3000 rpm or for a prolonged time may result in hemolysis. Serum separator tubes (SST) contain a gel that forms a physical barrier between serum or plasma and blood cells during centrifugation. The inside walls of the tube also contain silica particles that assist in clot activation. Blood collected into an SST should be mixed by inverting the tube several times and then allowing the sample to clot for 30 minutes before centrifugation. SST transport tubes are also available. These contain approximately double the amount of gel in a standard SST. The additional gel barrier helps minimize any interaction between the serum and cells after centrifugation so that test results are not likely to be affected if tests are delayed. Any prolonged delays in testing require that the serum be removed from the SST and placed in a sterile tube. The tube can then be refrigerated or frozen. Freezing may affect some test results; therefore the test instructions should be consulted for all the tests that must be run before a serum sample is frozen.

PROCEDURE 3-2   Serum Sample Preparation

1. Collect a whole blood sample in a container that contains no anticoagulant.

2. Allow the blood to clot in its original container at room temperature for 20 to 30 minutes.

3. Gently separate the clot from the container by running a wooden applicator stick around the wall of the container between the clot and the wall.

4. Cover the sample and centrifuge at 2000 to 3000 rpm for 10 minutes.

5. With a capillary pipette, remove the serum from the clot.

6. Transfer the serum to a container labeled with the date, time of collection, patient’s name, and clinic or case number.

7. Refrigerate or freeze the sample as appropriate.

Factors Influencing Results

Many factors other than disease influence the results of chemistry tests. Hemolysis, lipemia, certain medications, and inappropriate sample handling can all lead to inaccurate results. Effects of sample compromise are summarized in Table 3-1.

TABLE 3-1

Effects of Sample Compromise


*Variable effect depending on analyte and test method.

From Sirois M: Principles and practice of veterinary technology, ed 2, St Louis, 2004, Mosby.

Hemolysis

Hemolysis may result when a blood sample is drawn into a moist syringe, mixed too vigorously after sample collection, forced through a needle when being transferred to a tube, or frozen as a whole blood sample. A syringe must be completely dry before it is used because water in the syringe may cause hemolysis. The needle from a syringe should be removed before blood is transferred to a tube. Forcing blood through a small needle opening may rupture cells. When transferring a blood sample to a tube, the veterinary technician should expel the blood slowly from the syringe without causing bubbles to form. Hemolysis can also result when excess alcohol is used to clean the skin and not allowed to dry before beginning the blood collection procedure.

Hemolysis, regardless of cause, can greatly alter the makeup of a serum or plasma sample. For example, fluid from ruptured blood cells can dilute the sample, resulting in falsely lower concentrations of constituents than are actually present in the animal. Certain constituents normally not found in high concentrations in serum or plasma escape from ruptured blood cells, causing falsely elevated concentrations in the sample. Hemolysis may elevate levels of potassium, organic phosphorus, and certain enzymes in the blood. Hemolysis also interferes with lipase activity and bilirubin determinations. Therefore, plasma or serum is frequently the preferred sample over whole blood, and serum is frequently preferred over plasma.

Chemical Contamination

Sterile tubes are not necessary for collection of blood samples for routine chemical assays. However, the tubes must be chemically pure. Detergents must be completely rinsed from reusable tubes so that the detergents do not interfere with test results.

Improper Labeling

Serious errors may result if a tube containing the sample is not labeled immediately after the sample is collected. The tube should be labeled with the date, time of collection, patient’s name, and clinic number. The veterinary technician should double check the sample identification with the request form, if one is used, as the sample is prepared and the test is run.

Improper Sample Handling

Ideally, all chemical measurements should be completed within an hour of sample collection, which is not always feasible. In this case, samples must be properly handled and stored so that levels of their chemical constituents approximate those in the patient’s body at the time of collection. Samples must not be allowed to become too warm. Heat may be detrimental to a sample, destroying some chemicals and activating others, such as enzymes. If a serum or plasma sample has been frozen, it must be thoroughly mixed after thawing to avoid concentration gradients.

Patient Influences

If practical, a sample should be obtained from a fasting animal. The blood glucose level can be elevated and the inorganic phosphorus level decreased immediately after a meal. Also, postprandial (after-eating) lipemia results in a turbid or cloudy plasma or serum. Kidney assays are also affected because of the transient increase in glomerular filtration rate (GFR) after eating. Water intake need not be restricted before obtaining a blood sample.

REFERENCE RANGES

Reference ranges are also known as normal values. The reference range for a particular blood constituent is a range of values derived when a laboratory has repeatedly assayed samples from a significant number of clinically normal animals of a given species by specific test methods. Numerous medicine and clinical pathology books list the reference ranges of blood constituents for domestic species. Alternatively, reference ranges may be formulated by local diagnostic laboratories or in individual practice laboratories.

Establishing reference range values for any laboratory is time-consuming and expensive. To establish a list of reference values for the laboratory, the veterinary technician would have to assay samples from a significant number of clinically normal animals. Some investigators recommend analysis of at least 20 animals; others recommend more than 100 animals with similar characteristics. Other considerations include the variety of breeds and species most often seen in the veterinary practice; the gender and sexual status, such as intact or neutered, of the tested animals; the environment, including husbandry and nutrition, of these animals; and climate. Climate is a consideration because drastic seasonal changes may also affect assay results.
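For roughly Gaussian data, a reference interval is commonly taken as the mean plus or minus two standard deviations of values from clinically normal animals. The sketch below illustrates that calculation; the canine total protein values are hypothetical, and percentile-based methods are generally preferred for larger or non-Gaussian data sets.

```python
from statistics import mean, stdev

def reference_interval(values):
    """Estimate a reference interval as mean +/- 2 standard deviations.

    A simple approach that assumes roughly Gaussian data from clinically
    normal animals of one species (at least 20, per the guidance above).
    """
    if len(values) < 20:
        raise ValueError("need at least 20 clinically normal animals")
    m = mean(values)
    sd = stdev(values)
    return (m - 2 * sd, m + 2 * sd)

# Hypothetical total protein values (g/dl) from 20 healthy dogs
tp_values = [6.2, 6.5, 6.8, 7.0, 6.4, 6.9, 7.1, 6.6, 6.3, 6.7,
             7.2, 6.5, 6.8, 6.4, 7.0, 6.6, 6.9, 6.7, 6.5, 6.8]
low, high = reference_interval(tp_values)
```

Values outside the computed interval would then be flagged against this laboratory's own population rather than a published range.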

PROTEIN ASSAYS

Plasma proteins are produced primarily by the liver and the immune system, consisting of reticuloendothelial tissues, lymphoid tissues, and plasma cells. Proteins have many functions in the body, and alterations in plasma protein concentrations occur in a variety of disease conditions, especially disease of the liver and kidneys. More than 200 plasma proteins exist. Some plasma protein concentrations change markedly during certain diseases and can be used as diagnostic aids. Other protein concentrations change little during disease. Age-related changes in plasma protein concentrations are also seen. Plasma protein functions include the following:

• Helping form the structural matrix of all cells, organs, and tissues

• Maintaining osmotic pressure

• Serving as enzymes for biochemical reactions

• Acting as buffers in acid-base balance

• Serving as hormones

• Functioning in blood coagulation

• Defending the body against pathogenic microorganisms

• Serving as transport/carrier molecules for most constituents of plasma

The plasma protein assays commonly performed in veterinary medicine include total protein, albumin, and fibrinogen.

Total Protein

Total plasma protein measurements include fibrinogen values, whereas total serum protein determinations measure all the protein fractions except fibrinogen, which is removed during the clotting process. The total protein concentration may be affected by altered hepatic synthesis, altered protein distribution, and altered protein breakdown or excretion, as well as dehydration and overhydration.

Total protein concentrations are especially valuable in determining an animal’s state of hydration. A dehydrated animal usually has a relatively elevated total protein concentration (hyperproteinemia), whereas an overhydrated animal usually has a relatively decreased total protein concentration (hypoproteinemia). Total protein concentrations also are useful as initial screening tests for patients with edema, ascites, diarrhea, weight loss, hepatic and renal disease, and blood clotting problems.

Two methods are commonly used for determination of total protein levels: the refractometric method and the biuret photometric method. The refractometric method measures the refractive index of serum or plasma with a refractometer (see Chapter 1). The refractive index of the sample is a function of the concentration of solid particles in the sample. In plasma, the primary solids are the proteins. This method is a good screening test because it is fast, inexpensive, and accurate. The biuret method measures the number of molecules containing more than three peptide bonds in serum or plasma. This method is commonly used in analytic instruments in the laboratory. It is a simple method and yields accurate results. Other chemical tests to measure protein include dye-binding methods, precipitation methods, and the Lowry method. These tests are not commonly performed in veterinary practice. They are usually used to measure a small amount of protein in urine and cerebrospinal fluid (CSF). Specialized tests to separate the various protein populations are performed in some reference laboratories and research facilities. These methods include salt fractionation, chromatography, and gel electrophoresis (see Chapter 8).

The aforementioned tests can be performed on samples other than serum and plasma (e.g., urine, CSF). Other tests also used include the sulfosalicylic acid test, Pandy test, and Nonne-Apelt test. For the Pandy test, 1 drop of CSF is added to 1 ml of saturated aqueous phenol. Turbidity is observed before and after the mixture is shaken. After shaking, the CSF disperses as small droplets in the phenol, which should not be confused with a positive reaction (i.e., development of turbidity). If the sample is normal, no appreciable immunoglobulin is present and the solution remains clear (at most, slightly turbid), which is considered a negative result. If immunoglobulin is present at a concentration of 25 mg/dl or more, the solution becomes cloudy white. The degree of turbidity may be subjectively graded from 1+ to 4+, corresponding to increasing immunoglobulin concentration. For the Nonne-Apelt test, 1 ml of saturated ammonium sulfate solution is overlaid carefully with 1 ml of CSF and allowed to stand undisturbed for 3 minutes. The junction between the two fluids remains clear with normal CSF. However, if CSF immunoglobulin concentration is increased, a white-gray zone forms at the junction. This reaction may be graded subjectively from 1+ to 4+, reflecting increasing immunoglobulin concentration. The sulfosalicylic acid test is described in Chapter 5.

Albumin

Albumin is one of the most important proteins in plasma or serum. It makes up 35% to 50% of the total plasma protein in most animals, and any significant state of hypoproteinemia is most likely caused by albumin loss. Hepatocytes synthesize albumin, and any diffuse liver disease may result in decreased albumin synthesis. Renal disease, dietary intake, and intestinal protein absorption also may influence the plasma albumin level. Albumin is the major binding and transport protein in the blood and is responsible for maintaining osmotic pressure of plasma. The primary photometric test for albumin is the bromcresol green dye-binding method.

Globulins

The globulins are a complex group of proteins. Alpha globulins are synthesized in the liver and function primarily as transport and binding proteins. Two important proteins in this fraction are high-density lipoproteins and very-low-density lipoproteins. Beta globulins include complement (C3, C4), transferrin, and ferritin. They are responsible for iron transport, heme binding, and fibrin formation and lysis. Gamma globulins (immunoglobulins) are synthesized by plasma cells and are responsible for antibody production (immunity). Immunoglobulins (Ig) identified in animals are IgG, IgD, IgE, IgA, and IgM.

Direct chemical measurements of globulin are rarely performed. Globulin concentration is normally estimated by determining the difference between the total protein and albumin concentrations.

Albumin/Globulin Ratio

An alteration in the normal ratio of albumin to globulin (A/G) is frequently the first indication of a protein abnormality. The ratio is analyzed in conjunction with a protein profile. The A/G can be used to detect increased or decreased albumin and globulin concentrations. Many pathologic conditions alter the A/G. However, if the albumin and globulin concentrations are reduced in equal proportions, such as with hemorrhage, no alteration in A/G will be present.

The A/G is determined by dividing the albumin concentration by the globulin concentration. In dogs, horses, sheep, and goats, the albumin concentration is usually greater than the globulin concentration (A/G is more than 1.00). In cattle, pigs, and cats, the albumin concentration is usually equal to or less than the globulin concentration (A/G is less than 1.00).
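Because globulin is estimated as total protein minus albumin, and the A/G is albumin divided by globulin, both values fall out of two measured quantities. A minimal sketch, using hypothetical canine values in g/dl:

```python
def globulin(total_protein, albumin):
    """Estimate globulin (g/dl) as total protein minus albumin,
    since globulin is rarely measured directly."""
    return total_protein - albumin

def ag_ratio(albumin, globulin_conc):
    """Albumin/globulin (A/G) ratio: albumin divided by globulin."""
    return albumin / globulin_conc

# Hypothetical canine values (g/dl)
total_protein, albumin = 7.0, 3.9
glob = globulin(total_protein, albumin)
ratio = ag_ratio(albumin, glob)
# In dogs, horses, sheep, and goats the ratio is usually > 1.00;
# in cattle, pigs, and cats it is usually <= 1.00.
```

Note that a proportional drop in both fractions, as with hemorrhage, leaves the ratio unchanged even though total protein is reduced.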

Fibrinogen

Fibrinogen is synthesized by hepatocytes. It is the precursor of fibrin, the insoluble protein that forms the matrix of blood clots, and is one of the factors necessary for clot formation. If fibrinogen levels are decreased, blood does not form a stable clot or does not clot at all. Fibrinogen makes up 3% to 6% of the total plasma protein content. Because it is removed from plasma by the clotting process, no fibrinogen is found in serum. Acute inflammation or tissue damage may elevate plasma fibrinogen levels. The most common method of fibrinogen evaluation is the heat precipitation test described in Chapter 2. The fibrinogen value is calculated by subtracting the total plasma protein value of the heated tubes from that of the unheated tubes. (The heated value should be lower because fibrinogen has been removed from the plasma.) Plasma is the only sample that may be used because serum does not contain fibrinogen. Plasma collected with ethylenediaminetetraacetic acid (EDTA) is preferred. Heparinized plasma may yield falsely low results.
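The heat precipitation calculation described above (unheated total plasma protein minus heated total plasma protein) can be sketched as follows; the refractometer readings are hypothetical, and the conversion from g/dl to mg/dl is shown only because fibrinogen is conventionally reported in mg/dl:

```python
def fibrinogen_mg_dl(tp_unheated_g_dl, tp_heated_g_dl):
    """Estimate fibrinogen by heat precipitation: total plasma protein
    of the unheated tube minus that of the heated tube (heating
    precipitates fibrinogen, so the heated value should be lower).
    Inputs in g/dl; result converted to mg/dl."""
    difference = tp_unheated_g_dl - tp_heated_g_dl
    if difference < 0:
        raise ValueError("heated value should not exceed unheated value")
    return difference * 1000  # g/dl -> mg/dl

# Hypothetical plasma protein refractometer readings (g/dl)
fibrinogen = fibrinogen_mg_dl(7.2, 6.9)
```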

HEPATOBILIARY ASSAYS

The liver is the largest internal organ and is complex in structure, function, and pathologic characteristics. It has many functions, including metabolism of amino acids, carbohydrates, and lipids; synthesis of albumin, cholesterol, plasma proteins, and clotting factors; digestion and absorption of nutrients related to bile formation; secretion of bilirubin in bile; and elimination functions, such as detoxification of toxins and catabolism of certain drugs. These functions are driven by enzymatic reactions. The gallbladder is closely associated with the liver, both anatomically and functionally. Its primary function is as a storage site for bile. Malfunctions in the liver or gallbladder result in predictable clinical signs of jaundice, hypoalbuminemia, problems with hemostasis, hypoglycemia, hyperlipoproteinemia, and hepatoencephalopathy.

Hepatic cells exhibit extreme diversity of function and are capable of regeneration if damaged. As a result, more than 100 different types of tests are available to evaluate liver function. Liver disease is usually far advanced before clinical signs appear. Liver function tests are designed to measure substances that are produced by the liver, modified by the liver, or released when hepatocytes are damaged, or those enzymes with altered serum concentrations as a result of cholestasis. Liver cells also compartmentalize the work, so damage to one zone of the liver may not affect all liver functions. Liver function tests are usually performed as serial determinations, with several different types of liver tests completed to help verify the functional status of the organ. No single test is superior to any other for detecting hepatobiliary disease. New tests are being developed to allow detection of hepatic disease before the liver is severely damaged. The primary tests used in veterinary medicine for evaluation of the liver and gallbladder are summarized in Box 3-1.

BOX 3-1   Major Hepatobiliary Assays

Enzymes Released from Damaged Hepatocytes

Alanine Aminotransferase

Aspartate Aminotransferase

Sorbitol Dehydrogenase

Glutamate Dehydrogenase

Enzymes Associated with Cholestasis

Alkaline Phosphatase

Gamma Glutamyltranspeptidase

Hepatocyte Function Tests

Bilirubin

Bile Acids

Dye Excretion

Enzymes Released from Damaged Hepatocytes

With this type of liver disease, the hepatocytes are damaged and enzymes leak into the blood, causing a detectable rise in blood levels of enzymes associated with liver cells. These components, commonly referred to as the “leakage enzymes,” include the transferase enzymes alanine aminotransferase (ALT) and aspartate aminotransferase (AST) and the dehydrogenase enzymes sorbitol dehydrogenase (SDH) and glutamate dehydrogenase (GLDH). Transferases catalyze the reactions that transfer amine groups from amino acids to keto acids in the production of new amino acids. The enzymes are therefore found in tissues that have high rates of protein catabolism. Although other transferases are present in hepatocytes, the only readily available tests are for ALT and AST. Dehydrogenases catalyze the transfer of hydrogen groups, primarily during glycolysis. Transferases and dehydrogenases are found either free in the cytoplasm of hepatocytes or bound to the cell membrane. The serum levels of these enzymes vary in different species, and most also have nonhepatic sources.

Alanine Aminotransferase

ALT was formerly known as serum glutamic pyruvic transaminase (SGPT). In dogs, cats, and primates, the major source of ALT is the hepatocyte, where the enzyme is found free in the cytoplasm. ALT is considered a liver-specific enzyme in these species. Horses, ruminants, pigs, and birds do not have enough ALT in the hepatocytes for this enzyme to be considered liver specific. Other sources of ALT are renal cells, cardiac muscle, skeletal muscle, and the pancreas. Damage to these tissues may also result in increased serum ALT levels. Administration of corticosteroids or anticonvulsant medications may also lead to increases in serum ALT. ALT is used only as a screening test for liver disease because it is not specific enough to identify particular liver diseases. No correlation exists between the blood levels of the enzyme and the severity of hepatic damage. Increases in ALT are usually seen within 12 hours of hepatocyte damage, and peak levels are seen in 24 to 48 hours. The serum levels will return to reference ranges within a few weeks unless a chronic liver insult is present.

Aspartate Aminotransferase

AST was formerly known as serum glutamic oxaloacetic transaminase (SGOT). AST is present in hepatocytes, both free in the cytoplasm and bound to the mitochondrial membrane. More severe liver damage is required to release the membrane-bound AST. AST levels tend to rise more slowly than do ALT levels and return to normal levels within a day, provided chronic liver insult is not present. AST is found in significant amounts in many other tissues, including erythrocytes, cardiac muscle, skeletal muscle, the kidneys, and the pancreas. An increased blood level of AST may indicate nonspecific liver damage or may be caused by strenuous exercise or intramuscular injection. The most common causes of increased blood levels of AST are hepatic disease, muscle inflammation or necrosis, and spontaneous or artifactual hemolysis. If the AST level is elevated, the serum or plasma sample should be examined for hemolysis. Creatine kinase activity should also be assessed to rule out muscle damage before attributing an AST increase to liver damage.

Sorbitol Dehydrogenase

The primary source of SDH is the hepatocyte. Smaller amounts of the enzyme are found in the kidney, small intestine, skeletal muscle, and erythrocytes. SDH is present in the hepatocytes of all common domestic species but is especially useful for evaluating liver damage in large animals such as sheep, goats, swine, horses, and cattle. Large animal hepatocytes do not contain diagnostic levels of ALT, so SDH offers a liver-specific diagnostic test. The plasma level of SDH rises quickly with hepatocellular damage or necrosis. SDH assay can be used in all species to detect hepatocellular damage or necrosis, thus eliminating the need for other tests, such as the ALT assay. The disadvantage of SDH analysis is that SDH is unstable in serum and its activity declines within a few hours. If testing is delayed, samples should be frozen. SDH tests are not readily available to the average veterinary laboratory. Samples sent to outside laboratories should be packed in ice for transport.

Glutamate Dehydrogenase

GLDH is a mitochondrial-bound enzyme found in high concentrations in the hepatocytes of cattle, sheep, and goats. An increase in this enzyme is indicative of hepatocyte damage or necrosis in cattle and sheep. GLDH could be the enzyme of choice for evaluating ruminant and avian liver function, but no standardized test method has been developed for use in a veterinary practice laboratory.

Enzymes Associated with Cholestasis

Blood levels of certain enzymes become elevated with cholestasis (bile duct obstruction), metabolic defects in liver cells, and administration of certain medications and also as a result of the action of certain hormones, especially those of the thyroid. These enzymes are primarily membrane bound. The exact mechanism that induces increased levels of these enzymes in cholestasis is not well documented.

Alkaline Phosphatase

Alkaline phosphatase (AP) is present as isoenzymes in many tissues, particularly osteoblasts in bone, chondroblasts in cartilage, the intestine, the placenta, and cells of the hepatobiliary system in the liver. The isoenzymes of AP tend to remain in circulation for approximately 2 to 3 days, with the exception of the intestinal isoenzyme, which circulates for just a few hours. A corticosteroid isoenzyme of AP has been identified in dogs with exposure to increased endogenous or exogenous glucocorticoids. Because AP occurs as isoenzymes in these various tissues, the source of an isoenzyme or location of the damaged tissue may be determined by electrophoresis and other tests performed in commercial or research laboratories.

In young animals, most AP comes from osteoblasts and chondroblasts because of active bone development. In older animals, nearly all circulating AP comes from the liver as bone development stabilizes. The assays used for AP in a practice laboratory determine the total blood AP concentration. AP concentrations are most often used to detect cholestasis in adult dogs and cats. Because of wide fluctuations in normal blood AP levels in cattle and sheep, this test is not as useful for detecting cholestasis in these species.

Gamma Glutamyltranspeptidase

Gamma glutamyltransferase (GGT or γGT) is sometimes referred to as gamma glutamyltranspeptidase. GGT is found in many tissues, including renal epithelium, mammary epithelium (particularly during lactation), and biliary epithelium, but its primary source is the liver. Cattle, horses, sheep, goats, and birds have higher blood GGT activity than dogs and cats. Other sources of GGT include the kidneys, pancreas, intestine, and muscle cells. The blood GGT level is elevated with liver disease, especially with obstructive liver disease.

Hepatocyte Function Tests

Many substances are taken up, modified, produced, and/or secreted by the liver. Alteration in the ability to perform these specific functions provides an overview of liver function. Tests of hepatocyte function performed in veterinary practice include bilirubin and bile acids. Other substances produced by hepatocytes are less-sensitive indicators of liver function because test results may not show abnormalities until two thirds to three fourths of liver tissue is damaged. These less-sensitive tests include albumin, ammonia, and cholesterol.

Bilirubin

Bilirubin is an insoluble molecule derived from the breakdown of hemoglobin by macrophages in the spleen. The molecule is bound to albumin and transported to the liver. The hepatic cells metabolize and conjugate the bilirubin to the molecule bilirubin glucuronide. This molecule is then secreted from the hepatocytes and becomes a component of bile. Bacteria within the gastrointestinal system act on the bilirubin glucuronide and produce a group of compounds collectively referred to as urobilinogen. Urobilinogen is broken down to urobilin before being excreted in feces. Bilirubin glucuronide and urobilinogen may also be absorbed directly into the blood and excreted by the kidneys (Fig. 3-1).

image

Figure 3-1 Bilirubin metabolism. (From Sirois M: Principles and practice of veterinary technology, ed 2, St Louis, 2004, Mosby.)

Measurements of the circulating levels of these various populations of bilirubin can help pinpoint the cause of jaundice. Differences in the relative solubility of each of these molecules allow them to be individually quantified. In most animals, the prehepatic (albumin-bound, unconjugated) bilirubin comprises approximately two thirds of the total bilirubin in serum. Increases in this population indicate excessive erythrocyte breakdown or impaired hepatic uptake. Increases in conjugated bilirubin indicate bile duct obstruction.

Assays can directly measure total bilirubin (conjugated bilirubin plus unconjugated bilirubin) and conjugated bilirubin. Conjugated bilirubin is sometimes referred to as direct bilirubin because test methods directly measure the amount of conjugated bilirubin in the sample. Unconjugated bilirubin is sometimes referred to as indirect bilirubin because its concentration is indirectly calculated by subtracting the conjugated bilirubin concentration from the total bilirubin concentration of the sample.
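The subtraction described above can be sketched in a few lines of Python; the sample values are hypothetical, not reference data.

```python
def indirect_bilirubin(total_mg_dl: float, conjugated_mg_dl: float) -> float:
    """Unconjugated (indirect) bilirubin, obtained by subtracting the
    directly measured conjugated (direct) fraction from total bilirubin."""
    if conjugated_mg_dl > total_mg_dl:
        raise ValueError("conjugated bilirubin cannot exceed total bilirubin")
    return total_mg_dl - conjugated_mg_dl

# Hypothetical sample: total 1.2 mg/dl, conjugated 0.3 mg/dl
print(round(indirect_bilirubin(1.2, 0.3), 2))  # 0.9
```

Because the indirect value is calculated rather than measured, any analytical error in either the total or the conjugated result carries through to it.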

Bilirubin is assayed to determine the cause of jaundice, evaluate liver function, and check the patency of bile ducts. Blood levels of conjugated (direct) bilirubin are elevated with hepatocellular damage or bile duct injury or obstruction. Blood levels of unconjugated (indirect) bilirubin are elevated with excessive erythrocyte destruction or defects in the transport mechanism that allow bilirubin to enter hepatocytes for conjugation.

Bile Acids

Bile acids serve many functions. They aid in fat absorption (enabling the formation of micelles in the gastrointestinal system) and modulate cholesterol levels by bile acid synthesis. Bile acids are synthesized by hepatocytes from cholesterol and are conjugated with glycine or taurine. Conjugated bile acids are secreted across the canalicular membrane and reach the duodenum by way of the biliary system. The gallbladder stores bile acids (except in the horse) until contraction associated with feeding. When bile acids reach the ileum, they are transported to the portal circulation and travel back to the liver. Ninety percent to 95% of the bile acids are actively resorbed in the ileum. The remaining 5% to 10% is excreted in the feces. The reabsorbed bile acids are carried to the liver where they are reconjugated and excreted as part of the enterohepatic circulation of bile acids (Fig. 3-2).

image

Figure 3-2 Circulation of bile acids.

Spillover bile acids that escape from enterohepatic circulation may be detected in normal animals; serum concentrations of bile acids correlate with portal concentrations. As a result, postprandial serum bile acid (SBA) concentrations are higher than fasting concentrations. Any process that impairs the hepatocellular, biliary, or portal enterohepatic circulation of bile acids results in elevated SBA levels. The great advantage of SBA determinations as a liver function test is that they evaluate the major anatomic component of the hepatobiliary system and are stable in vitro.

The SBA level is normally elevated after a meal because the gallbladder has contracted and released increased amounts of bile into the duodenum. Paired serum samples, collected after a 12-hour fast and again 2 hours postprandially, are needed to perform the test. The difference in the bile acid concentration of the two samples is reported. In horses, a single sample is tested. Inadequate fasting or spontaneous gallbladder contraction can increase fasting bile acid levels. Exposing the patient to even the aroma of food can result in spontaneous gallbladder contraction. Prolonged fasting and diarrhea can decrease bile acid levels.
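The reported difference between the paired samples is simple arithmetic; a minimal sketch follows, with hypothetical concentrations in µmol/L.

```python
def postprandial_rise(fasting_umol_l: float, postprandial_umol_l: float) -> float:
    """Difference between the 2-hour postprandial and 12-hour fasting
    serum bile acid concentrations; this difference is what is reported."""
    return postprandial_umol_l - fasting_umol_l

# Hypothetical values; a small or even negative rise can reflect
# spontaneous gallbladder contraction before the fasting draw.
print(postprandial_rise(5.0, 20.0))  # 15.0
```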

Elevated SBA levels usually indicate liver diseases such as congenital portosystemic shunts, chronic hepatitis, hepatic cirrhosis, cholestasis, or neoplasms. Bile acid levels are nonspecific regarding the type of liver problem that exists and are therefore used as a screening test for liver disease. Bile acid levels may detect liver problems before an animal becomes icteric. They also may be used to follow the progress of liver disease during treatment. Increased bile acid concentrations can also result from extrahepatic diseases that secondarily affect the liver. Decreased bile acid concentration may be seen in intestinal malabsorptive diseases. In horses, increased bile acid concentrations can be the result of hepatobiliary disease or decreased feed intake. The reference ranges for bile acids in cows are widely variable. Bile acid testing is not a sensitive indicator of disease in cows.

Bile acids may be determined by several methods; the most commonly used is an enzymatic method. The 3-hydroxy bile acids react with 3-hydroxysteroid dehydrogenase and then with diformazan. Color generation is measured by end point spectrophotometry. Lipemic postprandial samples must be cleared by centrifugation to avoid interference with spectrophotometry. A bile acid test that uses immunologic methods (enzyme-linked immunosorbent assay) is now available for use in the veterinary clinic.

Cholesterol

Cholesterol is a plasma lipoprotein produced primarily in the liver and also ingested in food. Cholestasis causes an increase in serum cholesterol in some species. However, large differences exist in lipoprotein profiles of different species, and the clearance of lipoproteins is not well characterized in most veterinary species. A number of automated analyzers are available that provide cholesterol and other lipoprotein values. Hyperlipidemia is often secondary to other conditions (Box 3-2). Primary hyperlipidemia is rare and associated with inherited conditions in some breeds.

BOX 3-2   Causes of Secondary Hyperlipidemia

Cholestasis

Diabetes mellitus

Hepatic lipidosis

Hypothyroidism

Hyperadrenocorticism

Acute necrotizing pancreatitis

Nephrotic syndrome

Corticosteroid administration

Cholesterol assay is sometimes used as a screening test for hypothyroidism. Thyroid hormone controls synthesis and destruction of cholesterol in the body. Insufficient thyroid hormone, or hypothyroidism, results in hypercholesterolemia because the rate of cholesterol destruction is relatively slower than the rate of synthesis. Other diseases associated with hypercholesterolemia include hyperadrenocorticism, diabetes mellitus, and nephrotic syndrome. Dietary causes of hypercholesterolemia are rare but may include high-fat diets or postprandial lipemia.

Cholesterol by itself does not cause the grossly lipemic plasma seen after eating; triglycerides also are usually present. Administration of corticosteroids also may cause an elevated blood cholesterol concentration. Fluoride and oxalate anticoagulants may elevate enzymatic method results.

Other Tests of Liver Function

Although not commonly performed in veterinary practice, several additional tests are available at reference laboratories and research facilities. These tests are based on the ability of the liver to excrete waste and foreign substances and include the dye excretion tests, ammonia tolerance test, and caffeine clearance test.

Dye Excretion

The two available dye excretion tests are the bromsulfophthalein (BSP) excretion and indocyanine green (ICG) excretion tests. Both require administration of a dye that binds to a protein in serum. The dyes are taken up by the hepatocytes and excreted into the bile. The disappearance of the dye from the plasma requires functional hepatocytes, adequate hepatic blood flow, and bile flow. The dye used for BSP is no longer used in human medicine, and its availability is limited. The overall complexity and expense of the testing have relegated their use primarily to research facilities.

Bromsulfophthalein Excretion: BSP excretion is a sensitive hepatic function test that is especially useful for detecting chronic lesions or portosystemic shunts with no leakage of liver enzymes. Coagulation defects and mysterious anemias sometimes may be explained by BSP test results. In horses, the test is handy to differentiate the jaundice of hepatic disease from that of simple anorexia, and hepatoencephalopathy from “wobbler syndrome.” Specific indications in ruminants include fascioliasis, liver abscesses, ketosis, and photosensitization. BSP clearance may aid in diagnosis of aflatoxicosis in swine. Hepatic lesions delay BSP excretion. Delays caused by hepatocellular injury indicate a loss of at least 55% of the liver’s functional mass. The magnitude of delay is poorly correlated to both the extent of hepatic lesions, when mild, and the clinical signs of liver dysfunction.

Delayed BSP clearance can be erroneous. Slowed BSP excretion results from perivascular dye injection (which can be painful), poor hepatic perfusion (shock, heart failure, dehydration), and fever. Ascites also interferes with BSP clearance because the dye lingers in pooled fluids. Certain inborn defects of BSP metabolism occur without liver disease, notably in some Southdown and Corriedale sheep. Obesity prolongs BSP retention because the amount of BSP per unit of body mass is relatively increased. Finally, bilirubin competes with BSP for excretion by the liver. For this reason, the BSP test should not be performed in animals with hyperbilirubinemia of 3 mg/dl or more. The results reveal nothing more than the already apparent jaundice.

Two conditions may disguise liver disease by speeding BSP clearance. Because albumin carries BSP in the plasma, hypoalbuminemia (nephrotic syndrome, protein-losing gastroenteropathies, extreme liver disease) speeds clearance by increasing hepatic access to the free dye. Phenobarbital use also hastens BSP clearance. BSP clearance has no correlation with the extent of fatty infiltration of the liver.

Indocyanine Green Clearance: ICG is an organic dye similar to BSP. Introduced to human medicine because of occasional reactions to BSP, it is used to estimate hepatic blood flow.

Preinjection plasma is required to prepare a blank and a standard solution. This test can be used in both fed and fasted animals, but the latter is preferred. An intravenous dosage of 0.8 to 1.1 mg/kg body weight is recommended for horses, whereas 1 mg/kg body weight is recommended for dogs and 1.5 mg/kg body weight for cats. Usually five to six plasma samples are taken at 0, 5, 10, 15, and 30 minutes after injection. ICG concentration is measured photometrically at 805 nm, and the half-life is determined. Normal ICG clearances are as follows: dogs, 8.4 ± 2.3 minutes; horses (fed), 3.5 ± 0.67 minutes; and horses (fasted), 1.6 ± 0.57 minutes. Normal 30-minute retention is 14.7% ± 5% in dogs and 7.3% ± 2.9% in cats. Delayed ICG clearance has been reported in a variety of disorders, such as hyperbilirubinemia, hypoproteinemia, decreased hepatic blood flow, hepatic necrosis, and extrahepatic bile duct obstruction.
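If first-order (mono-exponential) elimination is assumed, the half-life and percent retention can be calculated from any two photometric readings. This sketch uses hypothetical concentrations, not reference data.

```python
import math

def dye_half_life(t1_min: float, c1: float, t2_min: float, c2: float) -> float:
    """Plasma dye half-life assuming first-order (mono-exponential)
    elimination between two photometric readings."""
    k = math.log(c1 / c2) / (t2_min - t1_min)  # elimination rate constant
    return math.log(2) / k

def retention_percent(c_at_t: float, c_zero: float) -> float:
    """Percentage of dye remaining relative to the time-zero concentration."""
    return 100.0 * c_at_t / c_zero

# Hypothetical readings: concentration falls from 8.0 to 2.0 mg/L between
# 5 and 15 minutes, i.e., two half-lives elapse in 10 minutes.
print(round(dye_half_life(5, 8.0, 15, 2.0), 1))  # 5.0
```

In practice, fitting all five or six sampling points gives a more robust estimate than any single pair of readings.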

Ammonia Tolerance

Ammonia is produced by enteric microflora and during amino acid metabolism and is transported to the liver through the portal circulation. Enzymes in the liver convert the ammonia to urea for excretion. Any condition that reduces the uptake of ammonia or the conversion of ammonia to urea can lead to an increased plasma ammonia concentration. However, normal compensatory mechanisms of the liver may result in normal fasting plasma ammonia concentrations. Impaired ammonia uptake is best identified with an ammonia tolerance test. A fasting sample is collected to provide baseline data. Patients identified with increased plasma ammonia in fasting samples should not undergo the tolerance test because of the high risk of nervous system damage. After collection of the fasting sample, ammonium chloride is administered either rectally or orally by stomach tube or in a gelatin capsule. An additional sample is collected 30 minutes later. In patients with adequate hepatic function, the postadministration results may be the same as the fasting sample or moderately increased. Patients with urea cycle enzyme deficiencies (e.g., argininosuccinate synthetase deficiency) and abnormal portal blood flow, particularly congenital portovascular anomalies, often demonstrate a threefold to tenfold increase in plasma ammonia concentrations above the baseline. The test’s chief limitation is in the handling of blood samples. Samples must be collected in ammonia-free heparin, and the blood must be centrifuged immediately. The plasma should be placed on ice and analyzed within 30 minutes of collection or frozen at −20° C. Ammonia levels are stable in frozen plasma for a few days. Whole blood cannot be tested because its ammonia content increases with storage. Although the test can dramatically elevate the blood ammonia level, it does not cause or worsen neurologic signs. Occasional vomiting with oral administration of the ammonium chloride is the only problem.

Caffeine Clearance

This is a specific assay of hepatic microsomal function. Demethylation of caffeine depends only on the specific P448 microsomal system of healthy hepatocytes; therefore it accurately reflects aberrations in hepatocellular function. This test is now used in human medicine; few experimental studies have been performed in canine species.

Caffeine sodium benzoate is dissolved in 2 ml sterile water (50:50 wt/wt), and 7 mg of caffeine/kg body weight are then injected intravenously. Plasma samples are collected from 15 to 480 minutes, and the caffeine concentration (in milligrams per milliliter) is measured by automated enzyme immunoassay. Plasma caffeine clearance and half-life are calculated from those values.

Normal half-life values in dogs are 6 ± 0.6 hours, and clearance is 1.7 ± 0.1 ml/min per kilogram of body weight. Elimination of caffeine is prolonged in hepatic insufficiency, with a clearance of 0.8 ± 0.1 ml/min per kilogram of body weight.

KIDNEY ASSAYS

The kidneys play a major role in maintaining homeostasis in animals. Their primary functions are as follows:

• Conserve water and electrolytes in times of a negative balance and increase water and electrolyte elimination in times of a positive balance

• Excrete or conserve hydrogen ions to maintain blood pH within normal limits

• Conserve nutrients, such as glucose and proteins

• Remove the end products of nitrogen metabolism, such as urea, creatinine, and allantoin, so that blood levels of these end products remain low

• Produce renin (an enzyme involved in controlling blood pressure), erythropoietin (a hormone necessary for erythrocyte production), and prostaglandins (fatty acids that stimulate contractility of uterine and other smooth muscle, lower blood pressure, regulate acid secretion in the stomach, regulate body temperature and platelet aggregation, and control inflammation)

• Aid in vitamin D activation

The kidneys receive blood from the renal arteries. The blood enters the glomerulus of each nephron, where nearly all water and small dissolved solutes pass into the renal tubules. Each nephron contains sections that function to reabsorb or secrete specific solutes. Resorption of glucose occurs in the proximal convoluted tubule. Secretion and reabsorption of mineral salts occur in the ascending limb of the loop of Henle and the distal convoluted tubule. The nephron has a specific resorptive capacity for each substance, called the renal threshold. Most water is reabsorbed as well. As a result of water reabsorption, the volume excreted is less than 1% of the volume that originally entered the kidney. Blood returns from the kidneys to the rest of the body through the renal veins, which connect to the caudal vena cava. Urine and blood may be analyzed to evaluate kidney function. Chapter 5 details urinalysis procedures. The primary serum chemistry tests for kidney function are urea nitrogen and creatinine. Other tests include various assays designed to evaluate the rate and efficiency of glomerular filtration.

Blood Urea Nitrogen

Some references use the term serum urea nitrogen (SUN) instead of blood urea nitrogen (BUN). Urea is the principal end product of amino acid breakdown in mammals. BUN levels are used to evaluate kidney function on the basis of the ability of the kidney to remove nitrogenous waste (urea) from blood. Under normal conditions, all urea passes through the glomerulus and enters the renal tubules. Approximately half of the urea is reabsorbed in the tubules, and the remainder is excreted in the urine. If the kidney is not functioning properly, sufficient urea is not removed from the plasma, leading to increased BUN levels.

Contamination of the blood sample with urease-producing bacteria (e.g., Staphylococcus aureus, Proteus spp., and Klebsiella spp.) may result in decomposition of urea and subsequently decreased BUN levels. To prevent this, analysis should be completed within several hours of collection or the sample should be refrigerated. A variety of photometric tests are available for measurement of urea nitrogen. All have an acceptable level of accuracy and precision. Chromatographic tests are also available and provide a semiquantitative serum urea nitrogen result. These methods tend to be less accurate and should be used only as quick screening tests.

Urea must be excreted in a high volume of water. Dehydration therefore results in increased retention of urea in the blood (azotemia). High-protein diets and strenuous exercise may cause an elevated BUN level because of increased amino acid breakdown, not because of decreased glomerular filtration. Differences in the rate of protein catabolism in male versus female animals, as well as in young and older animals, will also affect BUN levels.

Serum Creatinine

Creatinine is formed from creatine, which is found in skeletal muscle, as part of muscle metabolism. Creatinine diffuses out of the muscle cell and into most body fluids, including blood. If physical activity remains constant, the amount of creatine metabolized to creatinine remains constant and the blood level of creatinine remains constant. The total amount of creatinine is a function of the animal’s total muscle mass. Under normal conditions, all serum creatinine is filtered through the glomeruli and eliminated in urine. Any condition that alters the glomerular filtration rate (GFR) will alter the serum creatinine levels. Creatinine also may be found in sweat, feces, and vomitus and may be decomposed by bacteria.

Blood creatinine levels are used to evaluate renal function on the basis of the ability of the glomeruli to filter creatinine from blood and eliminate it in urine. Like BUN, creatinine is not a sensitive indicator of kidney function because nearly 75% of the kidney tissue must be nonfunctional before blood creatinine levels rise. Commonly used test methods for serum creatinine include the Jaffe method, as well as several enzymatic methods. Postprandial decreases in creatinine occur because of a transient increase in the GFR after a meal.

BUN/Creatinine Ratio

Because BUN and creatinine both have a wide range of reference intervals, their use as indicators of renal function is limited. The GFR may be decreased as much as four times below normal before changes are seen in the BUN or serum creatinine levels. In addition, healthy animals often have values below the reference ranges. In renal disease, hyperplasia of renal tissue may mask early signs of renal failure. The ratio of BUN to creatinine is used in human medicine for diagnosis of renal disease. Although this is not yet well established in veterinary species, it can be used to assess patient status during treatment.

BUN and creatinine each have an inverse logarithmic relationship with the GFR. The reciprocal of creatinine tracked over time can be used to follow the progress of disease and the effectiveness of treatment. A disproportionate increase in BUN can indicate dehydration, dietary treatment failure, or owner noncompliance with treatment regimens.
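The reciprocal-of-creatinine trend can be sketched as follows; the serial values are hypothetical, chosen only to illustrate the calculation.

```python
def reciprocal_creatinine(serial_mg_dl):
    """Convert serial creatinine values (mg/dl) to 1/creatinine; plotted
    against time, a steady decline in this series suggests progressive
    loss of glomerular filtration."""
    return [round(1.0 / c, 3) for c in serial_mg_dl]

# Hypothetical monthly values from a patient with worsening disease
print(reciprocal_creatinine([1.0, 1.6, 2.5, 4.0]))  # [1.0, 0.625, 0.4, 0.25]
```

Because the creatinine-GFR relationship is logarithmic, the reciprocal series changes roughly linearly and is easier to track than the raw values.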

Urine Protein/Creatinine Ratio

Quantitative assessment of renal proteinuria is of diagnostic significance in renal disease. In the absence of inflammatory cells in the urine, proteinuria indicates glomerular disease. For accurate determination of proteinuria, a 24-hour urinary protein value should be determined. This is a tedious task, and errors are common. A mathematical method that compares the urine protein level with the urine creatinine levels in a single urine sample is more accurate and comprehensive. This urine protein to creatinine (P/C) ratio is based on the concept that the tubular concentration of urine increases both the urinary protein and creatinine concentrations equally.

This method has been validated in dogs. Usually 5 to 10 ml of urine are collected between 10 am and 2 pm, preferably by cystocentesis. The urine sample should be kept at 4° C or stored at −20° C. The sample is centrifuged, and the supernatant is used. The protein and creatinine concentrations for each sample can be determined by a variety of photometric methods. The urine P/C ratio for healthy dogs should be less than 1. A urine P/C between 1 and 5 may have a prerenal (hyperglobulinemia, hemoglobinemia, myoglobinemia) or functional (exercise, fever, hypertension) origin, whereas a urine P/C greater than 5 indicates renal disease.
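The ratio and the canine interpretation bands just described can be sketched in Python; the measured concentrations in the example are hypothetical.

```python
def urine_pc_ratio(protein_mg_dl: float, creatinine_mg_dl: float) -> float:
    """Urine protein-to-creatinine ratio; tubular concentration of urine
    raises both analytes equally, so the ratio cancels that effect out."""
    return protein_mg_dl / creatinine_mg_dl

def interpret_canine(ratio: float) -> str:
    """Interpretation bands for dogs, taken from the text above."""
    if ratio < 1:
        return "normal"
    if ratio <= 5:
        return "possible prerenal or functional proteinuria"
    return "consistent with renal disease"

# Hypothetical single-sample measurements, both in mg/dl
print(interpret_canine(urine_pc_ratio(30.0, 60.0)))  # normal
```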

Uric Acid

Uric acid is a metabolic by-product of nitrogen catabolism and is produced mainly in the liver. Uric acid is usually transported to the kidneys bound to albumin. In most mammals, the compound passes through the glomerulus and is largely reabsorbed by the tubule cells. It is then converted to allantoin and excreted in the urine. In Dalmatian dogs, a defect in uric acid uptake into hepatocytes results in decreased conversion to allantoin. Therefore this breed excretes uric acid, and not allantoin, in urine.

Uric acid is the major end product of nitrogen metabolism in avian species. It constitutes approximately 60% to 80% of the total nitrogen excreted in avian urine and is secreted actively by the renal tubules. Measurement of plasma or serum uric acid is used as an index of renal function in birds. Uric acid can also be increased artifactually in samples from toenail clippings because of fecal urate contamination. Uric acid concentrations will increase after a meal in carnivorous birds. With renal disease, uric acid concentrations increase when the kidney has lost more than 70% of its functional capacity.

Tests of Glomerular Function

In patients with azotemia or those that are symptomatic for renal disease without azotemia, several additional tests can be performed to evaluate kidney function. These clearance studies require collection of timed, quantified urine samples along with concurrent plasma samples. Two primary types of clearance studies are performed: the effective renal plasma flow (ERPF) and the GFR. The ERPF uses test substances eliminated by both glomerular filtration and renal secretion, typically the amide p-aminohippuric acid. The GFR uses test substances eliminated only by glomerular filtration, typically creatinine, inulin, or urea. The test substance is administered, and urine and plasma samples are collected. The ERPF or GFR is then calculated as follows:

Clearance = (Ux × V)/Px

where Ux represents the concentration of the substance in urine (in milligrams per milliliter), V represents the volume of urine collected over a defined period (in milliliters per kilogram per minute), and Px represents the plasma concentration of the substance (in milligrams per milliliter).

Creatinine Clearance Tests

Endogenous Creatinine Clearance: Because creatinine appears in the glomerular filtrate with negligible tubular secretion, it is a natural tracer of glomerular filtration. Fortunately, its short-term blood concentrations are stable enough to satisfy the clearance formula used for steady infusion studies of inulin and p-aminohippuric acid. The test is relatively simple (Box 3-3). A measure of blood creatinine and an accurate, timed urine collection are required for this test. Precision is of the utmost importance; sloppy bladder catheterization and sampling ruin the results, especially with the briefer methods. The bladder must be rinsed before and after the test, and the after-rinses saved with the urine for creatinine analysis. Clearance is calculated by dividing urinary creatinine excretion (urine creatinine concentration × urine volume) by the plasma creatinine concentration. Although somewhat imprecise, the estimate is practical.

BOX 3-3   Overview of Endogenous Creatinine Clearance Test

• A pretest blood sample is obtained for plasma creatinine analysis.

• The urinary bladder is catheterized and rinsed several times with saline

• All voided urine is collected over a specified time frame (most commonly 24 hours)

• The urinary bladder is catheterized at the end of the specified time and the remainder of the urine is collected

• Saline bladder rinse is repeated and the creatinine concentration of the urine rinse is determined

• Creatinine clearance is calculated with the following equation:

Creatinine clearance = (Uc × Uv)/Pc

• Uv, urine volume (ml/min); Uc, urine creatinine concentration (mg/dl); Pc, plasma creatinine concentration (mg/dl)

• Normal clearance in dogs is 2.8 ± 0.96 ml/min/kg.
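The calculation in Box 3-3 can be sketched as follows; the patient values are hypothetical, and the reference comparison uses the canine mean ± 1 SD quoted in the box.

```python
def creatinine_clearance(uc_mg_dl: float, uv_ml_min: float,
                         pc_mg_dl: float, weight_kg: float) -> float:
    """Weight-normalized clearance: (Uc × Uv)/Pc divided by body weight,
    giving ml/min/kg as in Box 3-3."""
    return (uc_mg_dl * uv_ml_min) / pc_mg_dl / weight_kg

def within_canine_reference(value_ml_min_kg: float,
                            mean: float = 2.8, sd: float = 0.96) -> bool:
    """Crude screen against the canine value in Box 3-3 (mean ± 1 SD);
    interpretation of borderline results belongs to the clinician."""
    return abs(value_ml_min_kg - mean) <= sd

# Hypothetical 10-kg dog: Uc = 60 mg/dl, Uv = 0.5 ml/min, Pc = 1.0 mg/dl
cl = creatinine_clearance(60.0, 0.5, 1.0, 10.0)
print(cl, within_canine_reference(cl))  # 3.0 True
```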

To avoid errors, plasma creatinine should be determined by the combination creatinine PAP test instead of the Jaffe method. The combination creatinine PAP test is an enzymatic chromogenic method to determine creatinine concentration. The Jaffe method also determines noncreatinine chromogens in plasma, which do not appear in urine. Excess serum ketones, glucose, and proteins all falsely elevate GFR estimates because of chromatic interference and cross-reactivity.

Exogenous Creatinine Clearance: Exogenous creatinine clearance is an accurate method to measure GFR in small animals. The plasma creatinine concentration is increased by the injected dose, making the concentration of plasma noncreatinine chromogens negligible by comparison. This allows the Jaffe method to be used to determine creatinine concentrations (Box 3-4). Avoiding dehydration in the animal is critical in the performance of this test; free access to water must be ensured before any glomerular filtration tests.

BOX 3-4   Overview of the Exogenous Creatinine Clearance Test

• A subcutaneous injection of creatinine is administered

• A measured volume of water is administered per os by gastric intubation.

• The urinary bladder is catheterized and rinsed with saline after a specified time period (typically 40 minutes)

• A blood sample is obtained for plasma creatinine analysis

• All voided urine is collected for a specified time period and a second blood sample obtained

• Calculate creatinine clearance using the mean values of both samples.

• Normal values in the dog are 4.09 ± 0.52 ml/kg/min.

• A related procedure for evaluating GFR involves the use of an iohexol injection and does not require collection of urine samples

Single-Injection Inulin Clearance

Inulin is excreted entirely by glomerular filtration, without tubular secretion, reabsorption, or catabolism. As a result, inulin clearance tests that use a constant infusion rate and quantitative urine sampling may be considered the best method to evaluate GFR. Single-injection inulin clearance is a simpler method that may be used as an alternative. After a 12-hour fast (free access to water is permitted during the test), inulin is injected intravenously at a dosage of 100 mg/kg or 3 g/m2 (body surface calculation gives more accurate results); serum samples are then obtained at 20, 40, 80, and 120 minutes. Total inulin clearance is calculated from the decrease of serum inulin concentration by using a two-compartment model. Normal dogs have a GFR of 83.5 to 144.3 ml/min per square meter of body surface area.
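The two-compartment model described above requires curve-fitting software, but the underlying dose/AUC principle can be illustrated with a simplified one-compartment sketch. All numbers here are hypothetical, and this is a teaching stand-in, not the actual two-compartment calculation.

```python
def one_compartment_clearance(dose_mg: float, c0_mg_ml: float,
                              k_per_min: float) -> float:
    """Total clearance = dose/AUC. For a single-compartment,
    mono-exponential decline, AUC = C0/k, where C0 is the extrapolated
    time-zero concentration and k the elimination rate constant."""
    auc = c0_mg_ml / k_per_min  # area under the concentration-time curve
    return dose_mg / auc

# Hypothetical values: 1000-mg dose, extrapolated C0 of 0.5 mg/ml,
# elimination rate constant 0.01/min
print(one_compartment_clearance(1000.0, 0.5, 0.01))  # 20.0 ml/min
```

In the real test, C0 and the rate constants come from fitting the 20-, 40-, 80-, and 120-minute serum concentrations.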

Sodium Sulfanilate

Sodium sulfanilate is removed only by glomerular filtration in dogs; its disappearance from the plasma is an index of glomerular filtration. The test can detect unilateral nephrectomy and diminished renal function in dogs before azotemia develops. The half-life of sodium sulfanilate is prolonged up to five times in horses with glomerulonephritis. Although the test also is performed in cats, the mode of sodium sulfanilate excretion is not confirmed in this species. This test is no longer widely used.

Phenolsulfonphthalein Clearance

Phenolsulfonphthalein is an organic dye excreted by the renal tubules. Nonetheless, its clearance is accepted as a measure of renal blood flow because blood flow usually limits its excretion more than the tubular secretion rate does. Phenolsulfonphthalein clearance is decreased only when more than two thirds of the nephrons are nonfunctional or when renal perfusion is compromised. This test is no longer widely used.

Other Estimates of Glomerular Filtration Rate and Effective Renal Plasma Flow

These methods are performed at reference and research centers and require specialized equipment. The double-isotope method involves injection of a tracer solution and uses a gamma camera for nuclear imaging of kidneys at serial time intervals. The nonisotope method is similar, but after injection of contrast media serial blood samples are collected and analyzed with x-ray fluorescent laboratory assays.

Water-Deprivation Tests: Polyuria or polydipsia may lead to suspicions about the kidney, which may be erroneous. Diuresis and subsequent polydipsia may mean failing nephrons or kidney function disrupted by hyperadrenocorticism (Cushing’s disease), diabetes mellitus, or nephrogenic diabetes insipidus. The kidneys may be normal but not receive the signal to concentrate urine, as in neurogenic diabetes insipidus. Finally, the diuresis may be a totally appropriate renal compensation for pathologic water intake (psychogenic polydipsia).

Vasopressin or antidiuretic hormone (ADH), from the neurohypophysis, signals the kidneys to retain water by increasing the collecting duct’s permeability to water. Water in the urine passes out of the collecting duct and into the hypertonic renal medulla, concentrating the urine that remains behind in the collecting duct. If the system fails (e.g., inappropriate diuresis), either the neuroendocrine pathway that releases ADH in response to hypovolemia/plasma hyperosmolarity has been interrupted or the nephrons are unable to respond.

Water-Deprivation Test: This test is performed by observation of the response to endogenous or exogenous ADH. The basis for this test is to dehydrate the patient safely until a definite stimulus exists for endogenous ADH release (usually at approximately 5% body weight loss). That end point may vary. When denied water, patients dehydrate at different rates and must be monitored for weight loss, clinical signs of dehydration, and increased urine osmolarity or specific gravity. At the end point, the kidney should be under strictest endocrine orders to concentrate urine. Continued diuresis and dilute urine indicate lack of endogenous ADH or unresponsive nephrons. In dogs with kidney failure, this unresponsiveness precedes azotemia.
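The weight-loss end point described above is a simple percentage check; a minimal sketch with hypothetical weights follows. Body weight is only one of the monitored parameters, alongside clinical signs of dehydration and urine concentration.

```python
def percent_weight_loss(start_kg: float, current_kg: float) -> float:
    """Percentage of starting body weight lost during water deprivation."""
    return 100.0 * (start_kg - current_kg) / start_kg

def reached_endpoint(start_kg: float, current_kg: float,
                     threshold_pct: float = 5.0) -> bool:
    """The usual stopping point is approximately 5% body weight loss."""
    return percent_weight_loss(start_kg, current_kg) >= threshold_pct

# Hypothetical 20-kg dog weighed during the test
print(reached_endpoint(20.0, 19.5))  # False (2.5% loss)
print(reached_endpoint(20.0, 18.9))  # True (5.5% loss)
```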

Contraindications to this test include dehydration and azotemia. Dehydrated patients risk hypovolemia and shock. They already should have maximal ADH release; if they could concentrate urine, they would. The test then is useless and dangerous, especially in animals with nephrogenic or neurogenic diabetes insipidus. Azotemia already attests to kidney dysfunction. Again, the test reveals nothing new and adds a prerenal component to the azotemia.

Vasopressin Response: When patients demonstrate the above-mentioned signs or a previous water-deprivation test has failed, a vasopressin response test is indicated. The vasopressin response test is simply a challenge with exogenous ADH; it focuses on the kidneys’ ability to respond. Urine osmolarity or specific gravity is the index of function. Normal kidneys should concentrate urine with this technique despite the patient’s free access to water. Vasopressin must be handled carefully because it is a labile drug and settles out in oil suspensions. Test failures may result from the use of old or poorly mixed solutions. Also, intramuscular vasopressin injection causes pain. Because of vasopressin’s vasomotor activity, its use is theoretically contraindicated in pregnancy.

In both tests, even normal kidneys may be unable to concentrate urine to normal extremes. Diuresis quickly washes solutes from the renal medulla, weakening the osmotic gradient that draws water from the collecting ducts. Gradual water deprivation over a 3- to 5-day period before use of the water deprivation test is recommended to renew renal solutes and allow an evaluation of the impact of dehydration on the animal.

The basic water deprivation and vasopressin response tests may be combined in a single protocol that may differentiate several causes of polyuria/polydipsia (Box 3-5). The modified water-deprivation test is specifically contraindicated in patients with known renal disease, uremia resulting from prerenal or primary renal disorder, or suspected or obvious dehydration.

BOX 3-5   Overview of Water Deprivation-Vasopressin Response

• Water intake is gradually reduced over a 72-hour period before initiation of the test

• All food and water are then withdrawn and the urinary bladder emptied at the start of the test

• An accurate exact body weight is obtained at the start of the test and repeated every 30 to 60 minutes

• Urine specific gravity, osmolality, and serum urea nitrogen are recorded and hydration and CNS status are evaluated at the start of the test and repeated every 30 to 60 minutes and at the conclusion of the test

• The test is ended when the animal is clinically dehydrated, appears ill, or has lost about 5% of its body weight

• A final blood sample is obtained for determination of the vasopressin concentration before the vasopressin response test.

Vasopressin Response

• Aqueous vasopressin is administered by IM injection

• At 30 minute intervals for a maximum of 2 hours, the urinary bladder is emptied and body weight, urine specific gravity, osmolality, and serum urea nitrogen are recorded and hydration and CNS status are evaluated

After Testing

• Small amounts of water are provided every 30 minutes for 2 hours.

• If the patient shows no evidence of vomiting, dehydration, or CNS abnormalities 2 hours after the test, water is provided ad lib.
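The 5% weight-loss end point used in the protocol above is a simple percentage calculation. The following sketch is illustrative only; the function names are ours, not part of any veterinary software:

```python
def percent_weight_loss(start_kg, current_kg):
    """Percentage of starting body weight lost during the test."""
    return (start_kg - current_kg) / start_kg * 100

def reached_end_point(start_kg, current_kg, threshold_pct=5.0):
    """True when weight loss reaches the approximately 5% end point
    cited in the text; clinical dehydration or illness ends the
    test sooner regardless of weight."""
    return percent_weight_loss(start_kg, current_kg) >= threshold_pct
```

For example, a 20-kg dog that now weighs 19 kg has lost 5% of its body weight, so the deprivation phase would end.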

Fractional Clearance of Electrolytes: The fractional clearance (FC), also referred to as fractional excretion (FE), of electrolytes is a mathematical manipulation that describes the excretion of specific electrolytes (particularly sodium, potassium, and phosphorus) relative to the GFR. The most commonly used FE test is that of sodium. Bicarbonate and chloride FE testing is rarely performed. The tests can help differentiate prerenal from primary renal azotemia. Random, concurrent blood and urine samples are required. The FEX is calculated as follows:

FEX (%) = (UX / PX) × (PCR / UCR) × 100

where X is the electrolyte measured, which can be any of the four (sodium, potassium, phosphorus, and chloride); UX and PX are the urine and plasma concentrations, respectively, of that specific electrolyte; and UCR and PCR are the urine and plasma concentrations of creatinine, respectively. Normal results are as follows:

• Dogs: sodium, 1; potassium, 20; chloride, 1; phosphorus, 39

• Cats: sodium, 1; potassium, 24; chloride, 1.3; phosphorus, 73
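The calculation above can be worked through in a few lines; this is an illustrative sketch, and the function and variable names are hypothetical:

```python
def fractional_excretion(u_x, p_x, u_cr, p_cr):
    """Fractional excretion (%) of electrolyte X:
    (urine X / plasma X) x (plasma creatinine / urine creatinine) x 100.
    All four inputs are concentrations from concurrent samples."""
    return (u_x / p_x) * (p_cr / u_cr) * 100

# Example: urine Na 40 and plasma Na 140 (mEq/L), with urine
# creatinine 100 and plasma creatinine 1 (mg/dl)
fe_na = fractional_excretion(40, 140, 100, 1)  # about 0.29%
```

Note that because the result is a ratio of ratios, the electrolyte and creatinine units cancel as long as each pair is reported in the same units.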

Urethral Pressure Profilometry: Urinary incontinence is a common complaint in canine medicine. Most cases are caused by sphincter mechanism incompetence. The specific functional test to explore the urethral sphincter mechanism is the urethral pressure profilometry. This test requires appropriate equipment and is restricted to referral institutions. A double-sensor microtip pressure transducer catheter is used. The catheter is inserted through the urethra into the urinary bladder. The tip sensor measures intravesical pressure at the same time as the other sensor records urethral resistance to perform a continuous comparison.

Inorganic Phosphorus: Serum inorganic phosphorus (Pi) usually varies inversely with serum calcium. Normally, serum Pi is reabsorbed in the kidney tubules. This mechanism is under hormonal control (parathyroid hormone) and is affected by serum pH. Initially, renal damage that reduces the GFR leads to decreased urinary Pi and increased serum Pi. Subsequent alteration in calcium and Pi regulation leads to an increase in serum calcium and a decrease in serum Pi. See the electrolyte information later in this chapter for additional information on testing for Pi.

Enzymuria: Many of the chemical tests performed on serum or plasma can also be performed on urine samples. Enzymes that may be present in urine of patients with renal disease include urinary GGT and urinary N-acetyl-d-glucosaminidase (NAG). Urinary GGT and NAG are enzymes released from damaged tubule cells. Comparison of the units of GGT or NAG per milligram of creatinine can indicate the extent of renal damage. Both GGT and NAG increase rapidly with nephrotoxicity, and increases occur sooner than changes in serum creatinine, creatinine clearance, or fractional excretion of electrolytes.

PANCREAS ASSAYS

The pancreas is actually two organs, one exocrine and the other endocrine, held together in one stroma.

The exocrine portion, also referred to as the acinar pancreas, comprises the greatest portion of the organ. This portion secretes into the small intestine an enzyme-rich juice that contains the enzymes necessary for digestion. The three primary pancreatic enzymes are trypsin, amylase, and lipase. These digestive enzymes are released into the lumen of other organs through a duct system. Trauma to pancreatic tissue is often associated with pancreatic duct inflammation that results in a backup of digestive enzymes into the peripheral circulation.

Interspersed within the exocrine pancreatic tissue are arrangements of cells that, in a histologic section, take on the appearance of “islands” of lighter-staining tissue. These are called the islets of Langerhans. Four types of islet cells are present, but they cannot be distinguished on the basis of their morphologic characteristics. The four cell types are designated α, β, δ, and PP cells. The δ and PP cells comprise less than 1% of the islet cells and secrete somatostatin and pancreatic polypeptide, respectively. β-Cells comprise approximately 80% of the islet and secrete insulin. The remaining area, nearly 20%, consists of α-cells that secrete glucagon. The pancreas has little regenerative ability. When pancreatic islets are damaged or destroyed, pancreatic tissue becomes firm and nodular with areas of hemorrhage and necrosis. These islets are no longer able to function. Diseases of the pancreas may result in inflammation and cellular damage that causes leakage of digestive enzymes or insufficient production or secretion of enzymes.

Exocrine Pancreas Tests

The tests commonly performed to evaluate the acinar functions of the pancreas include amylase and lipase. Trypsinlike immunoreactivity and serum pancreatic lipase immunoreactivity are also available as tests for pancreatic function. In cats, serum amylase and lipase activities have been shown to have limited clinical significance in the diagnosis of pancreatitis. In experimentally induced pancreatitis in cats, serum amylase actually decreases. Serum activities of both enzymes are frequently normal in cats with pancreatitis.

Amylase

The primary source of amylase is the pancreas, but it is also produced in the salivary glands and small intestine. Increases in serum amylase are most often caused by pancreatic disease, especially when accompanied by increased lipase levels. The rise in blood amylase level is not always directly proportional to the severity of pancreatitis. Serial determinations provide the most information.

Amylase functions to break down starches and glycogen into sugars, such as maltose and glucose. Increased levels of amylase appear in blood during acute pancreatitis, flare-ups of chronic pancreatitis, or obstruction of the pancreatic ducts. Enteritis, intestinal obstruction, or intestinal perforation may also result in increased serum amylase from increased absorption of intestinal amylase into the bloodstream. In addition, because amylase is excreted by the kidneys, a decrease in GFR for any reason can lead to increased serum amylase. Serum amylase activity greater than three times the upper reference limit usually suggests pancreatitis.

Two amylase test methods are available: the saccharogenic method and the amyloclastic method. The saccharogenic method measures production of reducing sugars as amylase catalyzes the breakdown of starch. The amyloclastic method measures the disappearance of starch as it is broken down to reducing sugars through amylase activity. Calcium-binding anticoagulants, such as EDTA, should not be used because amylase requires the presence of calcium for activity. The presence of lipemia may reduce amylase activity. The saccharogenic method is not ideal for canine samples because maltase in canine samples may artificially elevate assay results. Normal canine and feline amylase values can be up to 10 times higher than those in human beings. Therefore samples may have to be diluted if tests designed for human samples are used.

Lipase

Nearly all serum lipase is derived from the pancreas. The function of lipase is to break down the long-chain fatty acids of lipids. Excess lipase is normally filtered through the kidneys, so lipase levels tend to remain normal in the early stages of pancreatic disease. Gradual increases are seen as disease progresses. With chronic, progressive pancreatic disease, damaged pancreatic cells are replaced with connective tissue that cannot produce enzyme. As this occurs, a gradual decrease in both amylase and lipase levels is seen.

Test methods for determination of lipase levels usually are based on hydrolysis of an olive oil emulsion into fatty acids using the lipase present in patient serum. The quantity of sodium hydroxide required to neutralize the fatty acids is directly proportional to lipase activity in the sample. Newer tests for lipase are available from some reference laboratories capable of detecting canine lipase by using immunologic methods.

Lipase assay may be more sensitive for detecting pancreatitis than is amylase assay. The degree of lipase activity, like amylase activity, is not directly proportional to the severity of pancreatitis. Determinations of blood lipase and amylase activities usually are requested at the same time to evaluate the pancreas.

Increased lipase activity is also seen with renal and hepatic dysfunction, although the exact mechanisms for this are unclear. Steroid administration is correlated with increased lipase activity with no concurrent change in amylase activity.

Amylase and Lipase in Peritoneal Fluid

Comparison of amylase and lipase activity in peritoneal fluid with that in serum may provide additional diagnostic information. A finding of higher amylase and lipase activity in peritoneal fluid than in serum strongly suggests pancreatitis, provided intestinal perforation has first been ruled out.

Trypsin

Trypsin is a proteolytic enzyme that aids digestion by catalyzing the reaction that breaks down the proteins of ingested food. Trypsin activity is more readily detectable in feces than in blood. For this reason, most trypsin analyses are done on fecal samples. Trypsin is normally found in feces, and its absence is abnormal.

Two fecal test methods are used in the laboratory: the test tube method and the x-ray film test. The test tube method involves mixing fresh feces with a gelatin solution. The test solution does not become a gel if trypsin is present in the sample to break down the protein (gelatin). If trypsin is absent, the solution becomes a gel. The x-ray film test uses the gelatin coating on undeveloped x-ray film to test for the presence of trypsin. A strip of x-ray film is placed in a slurry of feces and bicarbonate solution. If trypsin is present in the fecal sample, the gelatin coating is removed from the film upon rinsing with water. If no trypsin is present, the gelatin coating remains on the film after rinsing. The test tube method is considered more accurate than the x-ray film test in evaluating fecal trypsin proteolytic activity.

Only fresh feces should be used. Fecal trypsin activity may be decreased if the patient has recently ingested raw egg whites, soybeans, or lima beans, or if the sample contains heavy metals, citrate, fluoride, or some organophosphorus compounds. Calcium, magnesium, cobalt, and manganese in the feces may increase trypsin activity. Proteolytic bacteria in the fecal sample may result in false-positive or apparently normal results, especially in older samples.

Serum Trypsinlike Immunoreactivity

Serum trypsinlike immunoreactivity (TLI) is a radioimmunoassay that uses antibodies to trypsin. The test can detect both trypsinogen and trypsin. The antibodies are species specific. Trypsin and trypsinogen are produced only in the pancreas. With pancreatic injury, trypsinogen is released into the extracellular space and converted to trypsin, which diffuses into the bloodstream. The test is available only for the dog and cat.

TLI provides a sensitive and specific test for the diagnosis of exocrine pancreatic insufficiency (EPI) in dogs. Dogs with EPI have a serum TLI of less than 2.5 μg/L. Normal dogs have a range of 5 to 35 μg/L. Dogs with other causes of malassimilation may have normal serum TLI. Dogs with chronic pancreatitis may have normal TLI values or values between 2.5 and 5 μg/L. Normal cats have 14 to 82 μg/L TLI, whereas cats with EPI have less than 8.5 μg/L.
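The canine cut-points quoted above can be summarized as a small lookup. This is an illustrative sketch only (the function name is ours, and the thresholds are in the same units as the reference values quoted above):

```python
def interpret_canine_tli(tli):
    """Rough interpretation of canine serum TLI using the cut-points
    quoted in the text (2.5, 5, and 35)."""
    if tli < 2.5:
        return "consistent with EPI"
    if tli < 5:
        return "equivocal; possible chronic pancreatitis"
    if tli <= 35:
        return "within reference range"
    return "above reference range"
```

As the surrounding text notes, values must be interpreted in context: decreased GFR or active pancreatic inflammation can raise TLI independently of functional pancreatic mass.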

Serum TLI decreases in parallel with functional pancreatic mass. The inflammation associated with acute and probably chronic pancreatitis may enhance leakage of trypsinogen and trypsin from the pancreas and increase TLI. Also, decreased GFR increases TLI (trypsinogen is a small molecule that easily passes into the glomerular filter). Serum TLI is an important indicator of functional pancreatic mass. It is most informative if coupled with N-benzoyl-l-tyrosyl-p-aminobenzoic acid (BTPABA) and fecal fat results to characterize and diagnose malassimilation.

Serum TLI increases after eating (especially protein), but values remain within reference intervals. In addition, exogenous pancreatic enzyme supplementation does not alter TLI. Therefore food should be withheld for at least 3 hours, and preferably 12 hours, before taking a blood sample. The blood is allowed to clot at room temperature and the serum stored at −20° C until assay.

Serum Pancreatic Lipase Immunoreactivity

Serum feline pancreatic lipase immunoreactivity (fPLI) is specific for pancreatitis, and its use is now recommended instead of the previously validated serum feline trypsinlike immunoreactivity test as a serum test to diagnose cats with clinical signs of pancreatitis.

Endocrine Pancreas Tests

A variety of tests are available to evaluate the endocrine functions of the pancreas. In addition to the traditional blood glucose tests, other tests now available include fructosamine, β-hydroxybutyrate, and glycosylated hemoglobin. Urinalysis, serum cholesterol, and triglyceride tests also provide information on the function of the pancreas.

Glucose

Regulation of blood glucose levels is complex. Glucagon, thyroxine, growth hormone, epinephrine, and glucocorticoids are all agents favoring hyperglycemia. They boost blood glucose levels by encouraging glycogenolysis, gluconeogenesis, and/or lipolysis while discouraging glucose entry into cells. Insulin is the hypoglycemic hormone. Promoting glucose flux into its target cells, it also triggers anabolism, a process that converts glucose to other substances. This regulatory effect prevents the blood glucose concentration from exceeding the renal threshold and the spilling of glucose into the urine.

The pancreatic islets respond directly to blood glucose concentrations and release insulin (from the beta cells) or glucagon (from the alpha cells) as needed. Glucagon release also directly stimulates insulin release. Epinephrine is under direct sympathetic neural control; hyperglycemia is one aspect of the classic “fight or flight” state. The other hormones mentioned respond to hypothalamic/pituitary command. At any point in time, most of these agents are acting, shifting the blood glucose concentration up or down.

Because only insulin lowers blood glucose levels, aberrations of insulin action have the most obvious clinical effects. Hypofunction (diabetes mellitus) or hyperfunction (hyperinsulinism) can occur.

The blood glucose level is used as an indicator of carbohydrate metabolism in the body and may also be used as a measure of endocrine function of the pancreas. The blood glucose level reflects the net balance between glucose production, such as dietary intake and conversion from other carbohydrates, and glucose utilization, which is expended energy and conversion to other products. It also may reflect the balance between blood insulin and glucagon levels.

Glucose utilization depends on the amount of insulin and glucagon produced by the pancreas. As the insulin level increases, so does the rate of glucose utilization, resulting in decreased blood glucose levels. Glucagon acts as a stabilizer to prevent blood glucose levels from becoming too low. As the insulin level decreases (as in diabetes mellitus), so does glucose utilization, resulting in increased blood glucose concentration.

Many tests are available for blood glucose. Some of these react only with glucose, whereas others may quantitate all sugars in the blood. End point and kinetic assays are available. The kinetic enzymatic assays tend to be the most accurate and precise. Samples must be taken from a properly fasted animal. Serum and plasma for glucose testing must be separated from the erythrocytes immediately after blood collection. Glucose levels may drop 10% an hour if the sample of plasma is left in contact with erythrocytes at room temperature. Even the use of an SST may not be adequate to prevent this. Mature erythrocytes use glucose for energy and, in a blood sample, they may decrease the glucose level enough to give false-normal results if the original sample had an elevated glucose level. If the sample originally had a normal glucose level, erythrocytes may use enough glucose to decrease the level to below normal or to zero. If the plasma cannot be removed immediately, the anticoagulant of choice is sodium fluoride at 6 to 10 mg/ml of blood. Sodium fluoride may be used as a glucose preservative with EDTA at 2.5 mg/ml of blood. Refrigeration slows glucose utilization by erythrocytes.
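The roughly 10%-per-hour in vitro decline described above compounds with time. A minimal sketch, using the text's approximate figure (the function and parameter names are ours):

```python
def glucose_after_delay(measured_mg_dl, hours, hourly_loss=0.10):
    """Approximate glucose remaining after serum or plasma sits on
    erythrocytes at room temperature, assuming the ~10%/hour
    decline cited in the text."""
    return measured_mg_dl * (1 - hourly_loss) ** hours
```

For example, a sample drawn at 180 mg/dl and left on the cells for 2 hours could read near 146 mg/dl, which illustrates why an elevated value can be masked as false-normal.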

Fructosamine

Glucose can bind a variety of structures, including proteins. Fructosamine represents the irreversible reaction of glucose bound to protein, particularly albumin. When glucose concentrations are persistently elevated in blood as in diabetes mellitus, increased binding of glucose to serum proteins occurs. The finding of increased fructosamine indicates a persistent hyperglycemia. Because the half-life of albumin in dogs and cats is 1 to 2 weeks, fructosamine provides an indication of the average serum glucose over that period. Fructosamine levels respond more rapidly to alterations in serum glucose than does glycosylated hemoglobin. However, serum fructosamine may be artifactually reduced in patients with hypoproteinemia.

Glycosylated Hemoglobin

Glycosylated hemoglobin represents the irreversible reaction of hemoglobin bound to glucose. The finding of increased glycosylated hemoglobin indicates a persistent hyperglycemia. The test result is a reflection of the average glucose concentration over the lifespan of an erythrocyte—3 to 4 months in dogs and 2 to 3 months in cats. Patients that are anemic may have artifactually reduced levels of glycosylated hemoglobin.

β-Hydroxybutyrate

Ketone bodies can also be detected in plasma. The ketone produced in greatest abundance in ketoacidotic patients is β-hydroxybutyrate. However, many tests for serum ketones only detect acetone. Tests for β-hydroxybutyrate that use enzymatic, colorimetric methods are now becoming available for use in the veterinary clinic.

Glucose Tolerance

Glucose tolerance tests directly challenge the pancreas with a glucose load and measure insulin’s effect by evaluation of blood or urine glucose concentrations. If adequate insulin is released and its target cells have healthy receptors, the artificially elevated blood glucose level peaks 30 minutes after ingestion and begins to drop, reaching a normal value within 2 hours, and no glucose appears in the urine. A normal blood glucose level at 2 hours postprandial may rule out diabetes mellitus. Prolonged hyperglycemia and glucosuria are consistent with diabetes mellitus. Profound hypoglycemia after challenge may indicate a glucose-responsive, hyperactive beta-cell tumor of the pancreas. This test may be simplified by determining a single 2-hour postprandial glucose.

Oral glucose tolerance is affected by abnormal intestinal function, such as enteritis or hypermotility, and by excitement (as from gastric intubation); therefore, an intravenous glucose tolerance test is preferred. The intravenous test is the only practical option for ruminants. With the intravenous glucose tolerance test (Procedure 3-3), a challenge glucose load is injected after a 12- to 16-hour fast (except in ruminants). Blood glucose is subsequently checked and its progress mapped as a tolerance curve. Results are standardized as disappearance half-lives or glucose turnover rates expressed as percent per minute:

turnover rate (% per minute) = (0.693 / t½) × 100

where t½ is the glucose disappearance half-life in minutes.

Decreased glucose tolerance (increased half-life, decreased turnover rate) occurs in diabetes mellitus and less consistently in hyperthyroidism, hyperadrenocorticism, hyperpituitarism, and severe liver disease. Increased glucose tolerance (decreased half-life, increased turnover rate) is observed with hypothyroidism, hypoadrenocorticism, hypopituitarism, and hyperinsulinism. However, results may be erroneous. Normal animals on low-carbohydrate diets may manifest “diabetic curves.” The effect of this can be minimized by providing the animal with 2 to 3 days of high-carbohydrate meals before testing. The intravenous glucose tolerance test results are so variable in normal horses, depending on diet and fasting, that they are not useful.

PROCEDURE 3-3   Intravenous Glucose Tolerance Test

1. Evaluate the animal’s diet. For patients on low-carbohydrate diets, feed a high-carbohydrate diet (100 to 200 g/day for dogs) for 3 days before the test.

2. Fast the animal for 12 to 16 hours to lower the blood glucose level to 70 mg/dl in patients with suspected hyperinsulinism (do not fast ruminants or dogs with insulinoma).

3. Obtain a preinjection blood sample in a sodium fluoride tube for a baseline blood glucose determination.

4. Begin timing the trial at the start of infusion of glucose solution IV at 1.0 g/kg administered over a 30-second period.

5. Obtain blood samples at 5, 15, 25, 35, 45, and 60 minutes after glucose infusion, using sodium fluoride as an anticoagulant, and submit all blood samples for glucose assay. An additional blood sample is collected at 120 minutes for feline patients.

6. Plot glucose values on semilogarithmic graph paper and determine the time required for glucose levels to decrease by 50% (glucose half-life).

7. Results: The postinfusion blood glucose level should fall to approximately 160 mg/dl in 30 to 60 minutes and return to baseline values in 120 to 180 minutes.
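The semilog plot in step 6 amounts to fitting an exponential decline. With two post-infusion samples, the half-life and turnover rate can be estimated directly; this is an illustrative sketch (the function names are ours, and a real curve uses all the time points):

```python
import math

def glucose_half_life(t1, g1, t2, g2):
    """Glucose disappearance half-life (minutes) estimated from two
    post-infusion samples (time in minutes, glucose in mg/dl),
    assuming a single-exponential decline between them."""
    k = math.log(g1 / g2) / (t2 - t1)  # fractional turnover per minute
    return math.log(2) / k

def turnover_rate_pct_per_min(t1, g1, t2, g2):
    """Glucose turnover rate expressed as percent per minute."""
    return math.log(g1 / g2) / (t2 - t1) * 100

# Glucose falling from 300 to 150 mg/dl between 5 and 35 minutes
# post-infusion corresponds to a 30-minute half-life.
```

The two functions are linked by the relationship given in the text: turnover rate = (0.693 / t½) × 100.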

Glucose tolerance tests are usually unnecessary to obtain a diagnosis of diabetes mellitus. Persistent hyperglycemia and glucosuria, frequently with a history of polyuria, polydipsia, polyphagia, and weight loss, are sufficient to diagnose diabetes mellitus. The test may be of value in detecting hyperinsulinism because most beta-cell tumors of the pancreas are not rapidly responsive to glucose. They may even cause diabetic glucose tolerance curves because insulin-antagonist hormones are released as a result of the initial hypoglycemia. Patient stress and chemical restraint also affect glucose tolerance test results. Serum glucose measurements themselves may be erroneously low if blood samples are collected without a glycolysis inhibitor, such as sodium fluoride, and are allowed to sit at room temperature. However, the test is still used.

The best use for the glucose tolerance test is in animals with borderline hyperglycemia without persistent glucosuria. However, this test is not cost-effective for the owner and may not result in significant therapeutic change. This dilemma is most often seen in cats in which high renal thresholds for glucose and stress-induced hyperglycemia are common and misleading. Extra information may be obtained from the intravenous glucose tolerance test if immunoreactive insulin concentrations are followed simultaneously. This protocol may differentiate diabetes mellitus resulting from absolute lack of insulin (type 1) from that resulting from target-cell insensitivity (type 2) or inappropriate slow insulin release (type 3).

Insulin Tolerance

The insulin tolerance test also probes the causes of diabetes mellitus. Specifically, it checks the responsiveness of target cells to challenge with regular crystalline (short-acting) insulin 0.1 IU/kg subcutaneously or intramuscularly. Serum glucose levels are measured in blood samples obtained before insulin injection (fasting blood glucose) and every 30 minutes after injection for 3 hours. If the serum glucose level fails to drop to 50% of the fasting concentration within 30 minutes of insulin injection (insulin resistance), the insulin receptors are unresponsive or insulin action is being severely antagonized. The latter may occur in hyperadrenocorticism and acromegaly. Insulin resistance profoundly influences prognostic and therapeutic decisions. If the insulin-induced hypoglycemia persists for 2 hours (hypoglycemia unresponsiveness), hyperinsulinism, hypopituitarism, or hypoadrenocorticism should be suspected. Because the test may cause this hypoglycemia, with possible weakness and convulsions, a glucose solution should always be on hand for rapid intravenous administration.
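The two interpretive criteria above (failure to fall to 50% of fasting glucose by 30 minutes; hypoglycemia persisting at 2 hours) can be expressed as simple checks. This sketch is illustrative, with hypothetical names, and it treats "persistent hypoglycemia" as still being at or below half the fasting value at 120 minutes, an assumption on our part:

```python
def insulin_resistant(fasting_glucose, glucose_30min):
    """Suspect insulin resistance if glucose has not fallen to 50%
    of the fasting value within 30 minutes of insulin injection."""
    return glucose_30min > 0.5 * fasting_glucose

def hypoglycemia_unresponsive(fasting_glucose, glucose_120min):
    """Suspect hyperinsulinism, hypopituitarism, or hypoadrenocorticism
    if insulin-induced hypoglycemia persists at 2 hours."""
    return glucose_120min <= 0.5 * fasting_glucose
```

For example, a dog with a fasting glucose of 100 mg/dl that reads 80 mg/dl at 30 minutes would be flagged as insulin resistant.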

Glucagon Tolerance

The main indications for the glucagon tolerance test are repeated normal or borderline results with the amended insulin/glucose ratio test (see subsequent discussion) or lack of an insulin assay. The glucagon tolerance test gives another assessment of hyperinsulinism. Glucagon stimulates the pancreatic beta cells directly and indirectly to increase the blood insulin level. In normal animals, glucagon injection (0.03 mg/kg intravenously up to a total of 1.0 mg in dogs and 0.5 mg in cats) transiently elevates the blood glucose level to greater than 135 mg/dl; this concentration then returns to fasting levels. In normal cats, peak insulin occurs at 15 minutes, declining to basal concentration at 60 minutes. Type 1 diabetic cats present a flat insulin response. If the animal has a pancreatic beta-cell tumor, the serum glucose peak is lower than normal and is followed within 1 hour by hypoglycemia (serum glucose less than 60 mg/dl) because excessive insulin is secreted by the stimulated neoplasm.

To perform the test, the patient is fasted until the serum glucose level dips below 90 mg/dl (usually less than 10 hours). Glucagon is injected, and sodium fluoride anticoagulated blood samples are obtained before glucagon injection and 1, 3, 5, 15, 30, 45, 60, and 120 minutes after injection to monitor the glucose response. Unfortunately, the test is insensitive and may cause hypoglycemic convulsions up to 4 hours later. Patients must be fed immediately after the test and observed for several hours.

Insulin/Glucose Ratio

The cause of hyperinsulinism may be assessed by taking simultaneous measurements of serum glucose and insulin levels in a fasting animal. Hypoglycemia normally inhibits insulin secretion. Pancreatic beta-cell tumors, hyperactive and unresponsive to glucose, secrete an abundance of insulin inappropriate to the prevailing blood glucose concentration. Although fasting serum insulin concentrations are often normal in hyperinsulinism, ratios of insulin to glucose concentrations are usually aberrant.

The absolute ratio of insulin to glucose can be amended to increase diagnostic accuracy. The amended insulin/glucose ratio (AIGR) subtracts 30 from the serum glucose concentration. At a serum glucose level of 30 mg/dl or less, insulin is normally undetectable, so this adjustment puts the zero points of the glucose and insulin scales at the same physiologic place. Because abnormally high insulin concentrations are most obvious at low serum glucose concentrations, the AIGR is most valuable in animals with confirmed hypoglycemia of less than 60 mg/dl. Insulin and glucose must be measured from the same serum sample. Serial determinations may be performed to document an inappropriately high insulin concentration during hypoglycemia. However, the test is not totally dependable. If the results are unconvincing, the procedure should be repeated or other tests tried. Specifically, diagnostic imaging and insulinlike growth factor tests can help rule out or confirm paraneoplastic hypoglycemia.
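A sketch of the amended ratio as it is commonly computed, assuming the widely used form (insulin × 100 divided by glucose minus 30); the function name and the denominator-floor convention are ours:

```python
def amended_insulin_glucose_ratio(insulin, glucose_mg_dl):
    """AIGR = (serum insulin x 100) / (serum glucose - 30).
    When glucose is 30 mg/dl or less, the denominator is set to 1,
    a convention sometimes used to avoid division by zero."""
    denominator = glucose_mg_dl - 30
    if denominator <= 0:
        denominator = 1
    return insulin * 100 / denominator
```

For example, an insulin of 10 with a glucose of 50 mg/dl gives 10 × 100 / 20 = 50; the lower the glucose, the more an unchanged insulin value inflates the ratio, which is the discriminating behavior the text describes.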

Miscellaneous Tests of Insulin Release

When results of a glucagon response test or AIGR are equivocal, glucose, epinephrine, leucine, tolbutamide, or calcium challenges may be attempted. These substances, like glucagon, may provoke a hyperinsulinemic response from pancreatic islet cell tumors, resulting in decreased serum glucose levels. However, tumors vary in their sensitivity to these agents and false-negative results (no response) can occur. These tests are also dangerous because they can precipitate severe, prolonged hypoglycemia.

OTHER ENDOCRINE SYSTEM ASSAYS

In addition to the pancreas, a variety of organs and tissues release hormones that function in the endocrine system. The primary organs of the endocrine system are the adrenal glands, thyroid and parathyroid glands, and the pituitary gland. These glands produce and secrete hormones directly into capillaries and have a variety of target organs and effects.

Adrenocortical Function Tests

Adrenocortical function tests are commonly performed. Adrenal dysfunction is increasingly common, too often because of misuse of corticosteroids. The adrenal axis starts with the hypothalamus. Stimuli originating in the brain, such as from stress, cause the hypothalamus to secrete corticotropin-releasing factor (CRF). Under the influence of CRF, the adenohypophysis secretes adrenocorticotropic hormone (ACTH), the hormone that stimulates adrenocortical growth and secretion, particularly of glucocorticoid-synthesizing tissue. Cortisol is the major hormone released in domestic mammals. It, in turn, feeds back to inhibit both CRF and ACTH release, completing a balanced system.

True or mimicked hyperfunction of the system is the common complaint. Brain or pituitary tumors leading to secondary bilateral adrenal hyperplasia, idiopathic adrenal hyperplasia, or neoplasia (one or both glands) may cause excessive cortisol release and hyperadrenocorticism. Overenthusiastic glucocorticoid therapy is the most common cause of cortisol excess. Because exogenous, like endogenous, glucocorticoids inhibit adrenotrophic hormones, iatrogenic hyperadrenocorticism is accompanied by the paradox of atrophied adrenal glands. Sudden withdrawal of exogenous glucocorticoids leads to adrenal hypofunction. However, hypoadrenocorticism (Addison’s disease) by definition includes mineralocorticoid deficiency, which does not occur in iatrogenic disease from rapid withdrawal of glucocorticoids. Addison’s disease also may result from overuse of mitotane (Lysodren; for adrenal hyperplasia) or from idiopathic causes.

Screening tests for hyperadrenocorticism must be interpreted carefully because many dogs with nonadrenal disease, such as diabetes mellitus, liver disease, or renal disease, may have false-positive results. A final diagnosis of hyperadrenocorticism is made on the basis of clinical signs in conjunction with several of the various laboratory tests. Conversely, if laboratory results are negative but clinical signs are consistent and persist, the animal should be retested 1 or 2 months later.

ACTH and cortisol concentrations may be helpful diagnostic aids in differentiating primary (adrenal-dependent) from secondary (pituitary-dependent) adrenal disease. However, a single measurement has limited usefulness because levels can fluctuate on a diurnal cycle. More often, these measurements are taken as baseline data and compared with data obtained by challenging the adrenal gland with ACTH or dexamethasone. Animals with functioning adrenocortical tumors have low ACTH concentrations from the negative feedback effect, whereas animals with pituitary-dependent hyperadrenocorticism should have higher concentrations. Low to undetectable ACTH concentrations occur in secondary Addison’s disease, whereas normal (or increased) concentrations are expected in primary Addison’s disease. ACTH is a labile protein and requires special handling of the plasma sample. Aprotinin (a protease inhibitor) within the EDTA tube and/or immediate freezing of the plasma may be required. Tests for cortisol and ACTH are immunoassays, and some are available to the veterinary practice laboratory. Some tests are performed on serum; others can be performed only on plasma samples. A few tests can also be performed on urine samples. Urine cortisol/creatinine ratios have also been used as screening tests for adrenal function.

ACTH Stimulation Test

Animals with suspected hypoadrenocorticism (Addison’s disease) or hyperadrenocorticism (Cushing’s disease) may be evaluated with an ACTH response test. In addition, the test is indicated to distinguish between iatrogenic and spontaneous hyperadrenocorticism (Procedure 3-4). It also may test the efficacy of mitotane, ketoconazole, or metyrapone therapy. The ACTH stimulation test evaluates the degree of adrenal gland response to administration of exogenous ACTH. The glucocorticoid response to stimulation should be in proportion to the glands’ size and development. Hyperplastic adrenal glands have exaggerated responses, whereas hypoplastic adrenal glands show diminished responses. The test can detect these abnormalities but cannot reveal their ultimate cause. The ACTH response test is a screening test. Adrenal glands that are hyperactive from neoplasia may be insensitive to ACTH. Nonetheless, current figures indicate that the test is more than 80% accurate in diagnosing adrenocortical hyperfunction in the dog and more than 50% accurate in the cat.

PROCEDURE 3-4   ACTH Stimulation Test

1. Collect a plasma sample for determination of baseline plasma cortisol concentration.

2. Administer synthetic ACTH (Cosyntropin) by intravenous injection.

a. Dosage varies by species: 125 μg (cats), 250 μg (dogs), or 1 mg (horses)

3. Collect a second plasma sample for cortisol determination at 30 minutes post ACTH administration in dogs and cats and 2 hours post administration in horses.

4. Collect a third plasma sample for cortisol determination 1 hour post ACTH administration in dogs and cats and 4 hours post administration in horses.

5. Results:

a. Normal pretest cortisol concentration:

Dog: 0.5 to 4 μg/dl or 14 to 110 nmol/L

Cat: 0.3 to 5 μg/dl or 8.3 to 138 nmol/L

b. Normal post-ACTH cortisol concentration:

Dog: 8 to 20 μg/dl or 220 to 552 nmol/L

Cat: 5 to 15 μg/dl or 138 to 414 nmol/L

6. Interpretation:

a. An exaggerated post-ACTH cortisol concentration is observed in most dogs (80%) and in 51% of cats (borderline increase in 16%) with hyperadrenocorticism. Cats may show an increase in only one of the post-ACTH cortisol determinations.

b. A consistently reduced post-ACTH cortisol concentration is observed in Addison’s disease, in iatrogenic Cushing’s disease, and during mitotane, ketoconazole, or metyrapone therapy.

c. Normal post-ACTH cortisol concentration does not rule out Cushing’s disease (occurs in 50% of dogs with adrenal-dependent Cushing’s disease).
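The reference intervals and decision rules of Procedure 3-4 can be sketched as a small helper. This is an illustrative sketch only; the function name, dictionary, and structure are this example's own, not from the procedure.

```python
# Normal post-ACTH cortisol intervals in ug/dl, from Procedure 3-4
POST_ACTH_NORMAL = {"dog": (8.0, 20.0), "cat": (5.0, 15.0)}

def interpret_acth_stim(species: str, post_cortisol: float) -> str:
    """Classify a post-ACTH cortisol value (ug/dl) against the normal interval."""
    low, high = POST_ACTH_NORMAL[species]
    if post_cortisol > high:
        # Exaggerated response: seen in most dogs (80%) and 51% of cats
        # with hyperadrenocorticism.
        return "exaggerated"
    if post_cortisol < low:
        # Reduced response: Addison's disease, iatrogenic Cushing's disease,
        # or mitotane/ketoconazole/metyrapone therapy.
        return "reduced"
    # A normal response does not rule out Cushing's disease.
    return "normal"

print(interpret_acth_stim("dog", 25.0))  # exaggerated
print(interpret_acth_stim("cat", 3.0))   # reduced
```

As the interpretation notes state, a "normal" result from such a check still cannot exclude Cushing's disease.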

Dexamethasone Suppression

Dexamethasone suppression tests evaluate the adrenal glands differently by using the adrenal feedback loops. The low-dosage test confirms or replaces the ACTH response test for hyperadrenocorticism (Cushing’s disease). The high-dosage test goes further, differentiating pituitary from adrenal causes of hyperadrenocorticism (Procedure 3-5). In cats, only a high-dose dexamethasone suppression test is suitable.

PROCEDURE 3-5   Dexamethasone Suppression Tests

Low Dosage

1. Obtain a blood sample for baseline plasma cortisol determination at 8 am. (Some clinicians also perform a 2- or 3-hour determination.)

2. Immediately administer intravenous dexamethasone at 0.01 mg/kg for dogs and 0.1 mg/kg for cats.

3. Obtain a second plasma sample for cortisol determination 8 hours after dexamethasone injection.

4. Results:

Normal: pretest cortisol 1.1 to 8.0 μg/dl; posttest cortisol 0.1 to 0.9 μg/dl (<1.4)
Hyperadrenal: pretest cortisol 2.5 to 10.8 μg/dl; posttest cortisol 1.8 to 5.2 μg/dl (>1.4)

High Dosage

1. Use the same protocol as above, except the dexamethasone dosage is 0.1 mg/kg for dogs and 1.0 mg/kg for cats.

2. Results:

Pituitary-dependent hyperadrenocorticism: Normal values as above

Adrenal-dependent hyperadrenocorticism: As for hyperadrenal values above

Note: Successful suppression is defined as a 50% decrease in the plasma cortisol concentration from the baseline value. In 15% of dogs with pituitary-dependent hyperadrenocorticism, the plasma cortisol level is not suppressed by 50%. About 20% of dogs with adrenal-dependent hyperadrenocorticism have suppression of the plasma cortisol level by less than 50%, but all values remain above those considered adequate for suppression (greater than 1.5 μg/dl).
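The note's suppression criterion can be expressed as a simple check. This sketch assumes cortisol values in μg/dl and combines the 50% decrease rule with the quoted cutoff; the function name is hypothetical.

```python
SUPPRESSION_CUTOFF_UG_DL = 1.5  # cutoff quoted in the note above

def is_adequately_suppressed(baseline: float, post_dex: float) -> bool:
    """Successful suppression: at least a 50% drop from the baseline
    cortisol value and a posttest value below the quoted cutoff."""
    return post_dex <= 0.5 * baseline and post_dex < SUPPRESSION_CUTOFF_UG_DL

print(is_adequately_suppressed(4.0, 0.8))  # True
print(is_adequately_suppressed(4.0, 2.5))  # False
```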

Dexamethasone, a potent glucocorticoid, suppresses ACTH release from the normal pituitary gland, resulting in a drop in plasma cortisol concentration. Hyperadrenocorticism of any etiology is usually resistant to suppression by small dexamethasone doses because a diseased pituitary gland is abnormally insensitive to the drug and continues elaborating excessive ACTH, although 35% of dogs with pituitary-dependent hyperadrenocorticism have a 4-hour post-dexamethasone cortisol level lower than 1 μg/dl or 50% of the baseline concentration. Neoplastic adrenal glands secrete cortisol autonomously, independent of endogenous ACTH control. The excessive cortisol production suppresses secretion of ACTH by the normal pituitary gland through negative feedback inhibition, so small doses of dexamethasone do not affect plasma cortisol measurements in these animals. Small doses can therefore differentiate only normal animals from those with hyperadrenocorticism.

With larger dexamethasone doses, more differences appear. The insensitivity of a diseased pituitary gland to dexamethasone is incomplete; large dexamethasone doses overcome it, and the abnormally high plasma ACTH and cortisol concentrations fall. Abnormal adrenal glands, however, continue to secrete cortisol autonomously. Thus plasma cortisol concentrations unresponsive to all dexamethasone doses are probably caused by primary adrenal gland disease. Suppression by large but not small doses suggests pituitary gland disease. The test has 73% accuracy in differentiating pituitary from adrenal causes in dogs and 75% sensitivity in diagnosing hyperadrenocorticism in cats.

A dual high-dosage dexamethasone test and ACTH response test are described in Procedure 3-6. Although the combined protocol is a step saver, ambiguous results may necessitate more tests and expense. The ACTH response segment of the test is particularly prone to error. Because dexamethasone alters the adrenal responsiveness to ACTH (enhancing or inhibiting it, depending on the duration of activity), the timing of the test is crucial. Normal standards must be newly established for any changes in protocol.

PROCEDURE 3-6   Protocol for Combined Dexamethasone Suppression and ACTH Stimulation Test

1. Collect a plasma sample for cortisol determination.

2. Administer dexamethasone 0.1 mg/kg IV.

3. Collect a plasma sample for cortisol determination 4 hours post injection.

4. Immediately administer synthetic ACTH IV at a dose of 125 μg (cats) or 250 μg (dogs).

5. Collect a third plasma sample for cortisol determination at 30 minutes post ACTH administration in dogs and cats and 2 hours post administration in horses.

6. Collect a fourth plasma sample for cortisol determination 1 hour post ACTH administration in dogs and cats and 4 hours post administration in horses.

7. Results:

a. Normal pretest cortisol concentration:

Dog: 0.5 to 4 μg/dl or 14 to 110 nmol/L

Cat: 0.3 to 5 μg/dl or 8.3 to 138 nmol/L

b. Normal post-dexamethasone cortisol concentration:

1 to 1.4 μg/dl or 28 to 39 nmol/L

c. Normal post-ACTH cortisol concentration:

Dog: 8 to 20 μg/dl or 220 to 552 nmol/L

Cat: 5 to 15 μg/dl or 138 to 414 nmol/L

8. Interpretation:

a. Elevated post-dexamethasone and elevated post-ACTH cortisol concentration indicates hyperadrenocorticism.

b. Elevated post-dexamethasone and normal post-ACTH cortisol concentration indicates hyperadrenocorticism.

c. Normal post-dexamethasone and exaggerated post-ACTH cortisol concentration indicates pituitary-dependent hyperadrenocorticism.

Corticotropin-Releasing Hormone Stimulation

This test may be indicated to differentiate between pituitary-dependent and primary (adrenal-dependent) hyperadrenocorticism. Plasma cortisol and ACTH concentrations should not be elevated after corticotropin-releasing hormone (CRH) stimulation in dogs with adrenal-dependent Cushing’s disease.

The protocol for this test consists of obtaining a pretest sample for cortisol and ACTH determination, administering 1 μg/kg of CRH, and obtaining blood samples again 15 and 30 minutes later to evaluate cortisol and ACTH.

Thyroid Assays

Thyroid hormones have pervasive effects, influencing the metabolic rate, growth, and differentiation of all body cells. Because the clinical signs of thyroid malfunction are numerous and confusing, function tests are valuable. The thyroid glands are governed like the adrenal cortices. Thyrotropin-releasing factor (TRF) from the hypothalamus encourages the anterior pituitary to release thyrotropin, or thyroid-stimulating hormone (TSH). TSH enhances thyroid growth, function, and thyroxine release. Thyroid hormone is really composed of two varieties, triiodothyronine (T3) and thyroxine (T4), varying in their extent of iodination. T4 also is converted to the more active T3 in tissues. Thyroxine completes the regulatory cycle by inhibiting TRF and TSH release.

Thyroid disease is manifested primarily as hypofunction in dogs, horses, ruminants, and swine and as hyperfunction in cats. The cause may be dietary iodine deficiency, iodine excess, or goitrogens, most common in large animals. Primary glandular disease (neoplasia, autoimmune disease, idiopathic atrophy) comprises most cases, whereas pituitary disease (secondary hypothyroidism) accounts for 5% of hypothyroid dogs. In food animals, diagnosis is based on clinical signs (e.g., abortion, stillbirths, alopecia, and goiter in fetuses and neonates), serum T4 concentrations, serum protein-bound iodine concentrations, and pasture iodine analyses. Feeds may be examined for goitrogenic plants (Brassica spp.) or excess calcium, which decreases iodine uptake.

Baseline thyroxine concentrations are used diagnostically, but normal values vary dramatically. Semiquantitative immunologic tests are available to measure T4 concentrations. Their diagnostic inadequacy mirrors that of plasma cortisol determinations. Some drugs, such as insulin or estrogens, may increase T4 concentrations; others, such as glucocorticoids, anticonvulsants, antithyroid drugs, penicillins, trimethoprim-sulfonamides, diazepam, androgens, and sulfonylureas, may decrease T4 concentrations. In addition, total T4 (TT4) may be increased in hypothyroid dogs as a result of the presence of anti-T4 antibodies. Specific determination of the active, non-protein-bound form of thyroxine, or free T4 (FT4), by equilibrium dialysis is a more accurate approach to thyroid function. Therefore, determination of both endogenous TSH and T4 (TT4 or FT4) is suitable to diagnose canine hypothyroidism.

TSH Response

This test is used on small animals (except in cats with hyperthyroidism) and horses and provides a reliable diagnostic separation of patients with normal versus abnormal thyroid function (Box 3-6). Exogenous TSH challenge may sort out borderline cases and separate real hypothyroid patients from those with other illness or drug-depressed thyroxine concentrations and also may pinpoint the site of the lesions.

BOX 3-6   Overview of TSH response test

• A pretest blood sample is collected for baseline serum T4 determination.

• TSH is administered, and T4 determination is made on a second blood sample collected 4 to 6 hours after injection

• Results: T4 post-TSH stimulation in normal dogs should be approximately twice the baseline value or should exceed 2.0 μg/dl or 25 nmol/L
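The result criterion in Box 3-6 amounts to a two-part check, sketched below with T4 values in μg/dl (the function name is illustrative, not from the text):

```python
def normal_tsh_response(baseline_t4: float, post_t4: float) -> bool:
    """Post-TSH T4 in a normal dog should be roughly twice the baseline
    value or should exceed 2.0 ug/dl (25 nmol/L)."""
    return post_t4 >= 2.0 * baseline_t4 or post_t4 > 2.0

print(normal_tsh_response(1.2, 2.6))  # True: adequate response
print(normal_tsh_response(1.2, 1.3))  # False: inadequate response
```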

The test usually is used to explore canine hypothyroidism. After TSH is injected, thyroid response (usually serum T4 levels, the most reliable index) is followed. An increase in the serum T4 level occurs in normal animals. In primary hypothyroidism, exhausted or insensitive thyroid glands do not respond to exogenous TSH. Indeed, endogenous TSH concentrations are already high from failing T4 inhibition. Therefore the serum T4 level is not increased in these animals. With pituitary or brain disease, however, the thyroid glands remain responsive. Such lesions result in too little endogenous thyrotropin. Although an increase in the serum T4 level is expected in animals with pituitary lesions, 2 to 3 days of TSH challenge may be necessary before increased serum T4 levels are seen. The extra TSH is required to overcome chronic glandular atrophy, similar to “priming the pump.”

Glucocorticoids seem to inhibit both TSH and T4 secretion, so euthyroidism with low serum T4 levels often accompanies Cushing’s disease or vigorous glucocorticoid therapy. Fortunately, the TSH and ACTH response tests may be performed simultaneously. In such animals, the glands remain responsive to TSH, but the absolute values of prechallenge and postchallenge serum T4 are low, or resting values are low with normal post-TSH values. Feline hyperthyroidism is usually caused by functional thyroid adenomas. Oddly, with exogenous TSH challenge, little or no increase occurs in the serum T4 level, as in canine primary hypothyroidism. This phenomenon suggests that the neoplasm either functions independently of the trophic hormone or is already manufacturing and leaking T4 at maximum capacity. A lack of TSH responsiveness, appropriate clinical manifestations, and high baseline plasma T4 concentrations all attest to feline hyperthyroidism.

In horses, iodine-deficiency hypothyroidism is rare because iodized salt usually is offered free choice or in feeds. Overzealous iodine supplementation with kelp meal or vitamin-mineral mixes, however, provokes hypothyroidism and goiter. Excessive iodine intake inhibits thyroid function. In the assessment of thyroid function in horses, the normal serum T4 values are 1 to 3 μg/dl, which is lower than in other species. Hypothyroidism should be suspected only with serum T4 concentrations of less than 0.5 μg/dl.

Rare tumors of the pars intermedia of the pituitary, compressing the anterior pituitary, may cause secondary hypothyroidism in older horses. Because pituitary damage induces a plethora of signs, the TSH response test may be especially helpful.

Thyrotropin-Releasing Hormone (TRH) Response

TRH response is used on small animals and provides a reliable diagnostic separation of patients with normal versus abnormal thyroid function. FT4 is the fraction of thyroxine that is not bound to protein. FT4 levels are less influenced by nonthyroidal diseases or drugs than total T4 concentrations. Exogenous TRH challenge may sort out borderline cases and separate truly hypothyroid and hyperthyroid patients from those with other illness or drug-depressed thyroxine concentrations. The test usually is used to explore canine hypothyroidism when TSH is not available. Baseline serum TT4 and FT4 concentrations are determined. Four hours after 0.1 mg/kg or 0.2 mg (total dose) of TRH is injected intravenously, thyroid response (serum TT4 and FT4 levels) is followed. In normal animals, the serum TT4 concentration increases by 50% or 1 μg/dl (13 nmol/L), and the FT4 concentration increases approximately 1.9 times, compared with baseline concentrations. The evaluation of FT4 levels allows a clearer distinction between euthyroid and hypothyroid dogs when TT4 results are equivocal. The TRH response test also may be used to diagnose mild to moderate feline hyperthyroidism. Baseline serum TT4 and FT4 concentrations are determined. Approximately 4 hours after 0.1 mg/kg of TRH is injected intravenously, serum TT4 and FT4 levels are determined. An increase in serum TT4 of less than 50%, compared with baseline concentrations, occurs in hyperthyroid cats. Increases between 50% and 60% are borderline, and increases of more than 60% rule out hyperthyroidism.
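The feline interpretation thresholds above can be summarized in a short sketch (the percent rise is computed from baseline TT4; the function name is illustrative):

```python
def interpret_feline_trh(baseline_tt4: float, post_tt4: float) -> str:
    """Classify the percent rise in TT4 after TRH challenge in a cat."""
    rise_pct = (post_tt4 - baseline_tt4) / baseline_tt4 * 100.0
    if rise_pct < 50.0:
        # Blunted response is expected in hyperthyroid cats.
        return "consistent with hyperthyroidism"
    if rise_pct <= 60.0:
        return "borderline"
    # A rise of more than 60% rules out hyperthyroidism.
    return "hyperthyroidism ruled out"

print(interpret_feline_trh(2.0, 2.4))  # consistent with hyperthyroidism
print(interpret_feline_trh(2.0, 3.4))  # hyperthyroidism ruled out
```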

Triiodothyronine Suppression Test

Hyperthyroidism is common in middle-aged to old cats in the United States and Great Britain. Diagnosis may be based on resting thyroid hormone concentrations. Determination of both TT4 and FT4 may help distinguish a nonthyroidal disease. The combination of a high FT4 value with low TT4 is indicative of nonthyroidal illness, whereas a high FT4 concentration and high-normal TT4 concentration indicate hyperthyroidism. However, some cases may require a functional test to confirm or rule out the disease.

Thyroid suppression testing is based on the expected negative feedback regulation of TSH induced by high concentrations of circulating thyroid hormone. Hyperthyroid cats do not have normal pituitary-thyroid regulation. As a result, administration of exogenous T3 should induce a decrease in endogenous T4 unless feedback TSH regulation is altered.

To perform the test, a basal T3 and T4 determination is required. Seven T3 doses of 25 μg orally every 8 hours are administered at home. Approximately 2 to 4 hours after the seventh dose, a blood sample is obtained for T3 and T4 determination. Cats with hyperthyroidism have serum T4 concentrations higher than 1.5 μg/dl or 20 nmol/L, whereas nonhyperthyroid cats have lower values. Low posttest T3 concentrations indicate an invalid test resulting from failure of exogenous T3 administration.
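Under the stated cutoffs, interpretation of the T3 suppression test can be sketched as follows. Note that the validity comparison used here (posttest T3 versus baseline T3) is an assumption; the text says only that a low posttest T3 invalidates the test.

```python
T4_CUTOFF_UG_DL = 1.5  # posttest T4 above this suggests hyperthyroidism

def interpret_t3_suppression(post_t4: float, baseline_t3: float,
                             post_t3: float) -> str:
    # Validity check (assumed form): posttest T3 should exceed baseline
    # if the owner actually administered the seven oral T3 doses.
    if post_t3 <= baseline_t3:
        return "invalid (exogenous T3 apparently not administered)"
    return "hyperthyroid" if post_t4 > T4_CUTOFF_UG_DL else "not hyperthyroid"

print(interpret_t3_suppression(post_t4=2.2, baseline_t3=0.8, post_t3=2.5))
```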

Pituitary Function Tests

Diagnosis of canine acromegaly may be based on documentation of elevated growth hormone (GH). Serial GH determinations (three to five samples taken at 10-minute intervals) are performed because affected dogs have constant GH levels instead of fluctuating GH concentrations. In addition, affected dogs do not respond to stimulation with GH-releasing hormone (GHRH). This test requires intravenous administration of 1 μg/kg GHRH or 10 μg clonidine. Posttest plasma GH in normal dogs increases by 5 to 15 μg/L or 13 to 25 μg/L, respectively.

ELECTROLYTE ASSAYS

Electrolytes are the negative ions, or anions, and positive ions, or cations, of elements found in all body fluids of all organisms. Some of the functions of electrolytes are maintenance of water balance, fluid osmotic pressure, and normal muscular and nervous functions. They also function in the maintenance and activation of several enzyme systems and in acid-base regulation. Acid-base status depends on electrolytes, so these should be interpreted together. The major electrolytes in plasma are calcium, inorganic phosphorus, magnesium, sodium, potassium, chloride, and bicarbonate. Evaluation of electrolytes, such as sodium and potassium, was at one time not commonly performed in practice laboratories because of the special analytic instrumentation needed. The most common techniques for measurement of electrolytes use ion-specific electrochemical methods. Ion-specific methods require special instrumentation. Automated ion-specific instruments are now readily available and reasonably priced, so many veterinary practices have the ability to perform electrolyte testing. More information on electrochemical testing can be found in Chapter 1.

Volume displacement by lipid typically affects electrolyte measurement, although this is method dependent. An increased concentration of lipid results in plasma volume with decreased water content. Electrolytes are distributed in the aqueous portion of plasma and are not found in the lipid portion. Therefore procedures that measure electrolytes in total plasma volume (per unit of plasma), such as flame photometry or indirect potentiometry, will result in artifactually decreased electrolyte values. This will occur only in very lipemic samples (e.g., triglyceride concentrations greater than 1500 mg/dl). Procedures that measure electrolytes in the aqueous phase only (per unit of plasma water), such as direct potentiometry, will result in accurate electrolyte concentrations.
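The volume-displacement artifact described above can be shown with a worked example. The water fractions below are assumed illustrative values, not figures from the text.

```python
# Electrolytes are distributed only in the aqueous phase of plasma.
true_na = 145.0                 # mmol/L in plasma water (normal)
normal_water_fraction = 0.93    # assumed aqueous fraction of normal plasma
lipemic_water_fraction = 0.85   # assumed aqueous fraction of a lipemic sample

# Indirect methods (flame photometry, indirect potentiometry) report per
# unit of total plasma, so a reduced water fraction lowers the result.
indirect_na = true_na * (lipemic_water_fraction / normal_water_fraction)

# Direct potentiometry measures the aqueous phase itself.
direct_na = true_na

print(round(indirect_na, 1))  # ~132.5: artifactual "hyponatremia"
print(direct_na)              # 145.0: accurate
```

The patient's sodium status is normal in both cases; only the indirect measurement is distorted by the lipid volume.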

Calcium

More than 99% of the calcium in the body is found in bones. The remaining 1% or less has major functions in the body, which include maintenance of neuromuscular excitability and tone (decreased calcium can result in muscular tetany), maintenance of activity of many enzymes, facilitation of blood coagulation, and maintenance of inorganic ion transfer across cell membranes. Calcium in whole blood is almost entirely in plasma or serum. Erythrocytes contain little calcium.

Calcium concentrations are usually inversely related to inorganic phosphorus concentrations. As a general rule, if the calcium concentration rises, the inorganic phosphorus concentration falls. Hypercalcemia is an elevated blood calcium concentration. Hypocalcemia is a decreased blood calcium concentration.

Samples for calcium testing should not be collected using EDTA, oxalate, or citrate anticoagulants because these bind calcium and make it unavailable for assay. Hemolysis results in a slight decrease in the calcium concentration as fluid from the ruptured erythrocytes dilutes the plasma.

Inorganic Phosphorus

More than 80% of the phosphorus in the body is found in bones. The remaining 20% or less has major functions, such as energy storage, release, and transfer; involvement in carbohydrate metabolism; and composition of many physiologically important substances, such as nucleic acids and phospholipids.

Most of the phosphorus in whole blood is found within the erythrocytes as organic phosphorus. The phosphorus in plasma and serum is inorganic phosphorus and is the phosphorus assayed in the laboratory. Inorganic phosphorus levels in plasma and serum provide a good indication of the total phosphorus in an animal. Plasma or serum phosphorus and calcium concentrations are inversely related. As phosphorus concentrations decrease, calcium concentrations increase.

Hyperphosphatemia is an increased serum or plasma phosphorus concentration. Hypophosphatemia is a decreased serum or plasma phosphorus concentration. Hemolyzed samples should not be used. The organic phosphorus liberated from the ruptured erythrocytes may be hydrolyzed to inorganic phosphorus, which results in a falsely elevated inorganic phosphorus concentration. The serum or plasma should be separated from the blood cells as soon as possible after blood collection and before the sample is stored.

Sodium

Sodium is the major cation of plasma and interstitial, or extracellular, fluid. It plays an important role in water distribution and body fluid osmotic pressure maintenance. In the kidney, sodium is filtered through the glomeruli and resorbed back into the body through the tubules in exchange, as needed, for hydrogen ions. In this manner, sodium plays a vital role in pH regulation of urine and acid-base balance. Sodium concentrations are measured by flame photometry, which is usually not available in practice laboratories, or by dry reagent testing. Hypernatremia refers to an elevated blood level of sodium. Hyponatremia is a decreased blood level of sodium. The sodium salt of heparin should not be used as an anticoagulant because it can falsely elevate the results. Hemolysis does not significantly alter results, although marked hemolysis may dilute the sample with erythrocyte fluid, causing slightly lower results.

Potassium

Potassium is the major intracellular cation and is important for normal muscular function, respiration, cardiac function, nerve impulse transmission, and carbohydrate metabolism. In acidotic animals, potassium ions leave the intracellular fluid as they are replaced by hydrogen ions, resulting in elevated plasma potassium levels, or hyperkalemia. The plasma potassium level also may be elevated in the presence of cellular damage or necrosis, which causes release of potassium ions into the blood. Decreased plasma potassium levels, or hypokalemia, may be associated with inadequate potassium intake, alkalosis, or fluid loss resulting from vomiting or diarrhea.

Plasma is the preferred sample because platelets may release potassium during the clotting process, causing artificially elevated potassium levels. Hemolysis should be avoided because the concentration of potassium within erythrocytes is higher than the concentration in plasma. Hemolysis releases potassium into the plasma, resulting in artificially elevated potassium levels. The sample should not be refrigerated until the plasma has been separated from the cells because cooler temperatures promote loss of potassium from the cells without evidence of hemolysis. Samples should not be frozen without first separating the blood cells because the resulting hemolysis makes the sample unsuitable for testing.

Magnesium

Magnesium is the fourth most common cation in the body and the second most common intracellular cation. Magnesium is found in all body tissues. More than 50% of the magnesium in the body is found in bones, closely related to calcium and phosphorus. Magnesium activates enzyme systems and is involved in production and decomposition of acetylcholine. Imbalance of the magnesium/calcium ratio can result in muscular tetany from release of acetylcholine. Cattle and sheep are the only domestic animals that show clinical signs related to magnesium deficiencies. Hypermagnesemia refers to an elevated blood magnesium level. Hypomagnesemia is a decreased blood magnesium level. Anticoagulants other than heparin may artificially decrease the results. Hemolysis may elevate the results through liberation of magnesium from erythrocytes.

Chloride

Chloride is the predominant extracellular anion. It plays an important role in maintenance of water distribution, osmotic pressure, and the normal anion/cation ratio. Chloride is usually included in electrolyte profiles because of its close relationship to sodium and bicarbonate levels. Hyperchloremia is an elevated blood chloride level. Hypochloremia is a decreased blood chloride level. Hemolysis may affect test results by diluting the sample with erythrocyte fluid. Prolonged storage without first separating out the blood cells may cause slightly low results.

Bicarbonate

Bicarbonate is the second most common anion of plasma. It is an important part of the bicarbonate/carbonic acid buffer system and aids in transport of carbon dioxide from the tissues to the lungs. These functions help keep the body pH in balance as acids and bases are continually introduced into the body. The kidney regulates bicarbonate levels in the body by excreting excesses after it has resorbed all that is needed. Bicarbonate levels are frequently estimated from blood carbon dioxide levels. The bicarbonate level is approximately 95% of the total carbon dioxide measured. Arterial blood is the sample of choice for bicarbonate determinations. If plasma is used, lithium heparinate is the anticoagulant of choice. The sample should be chilled in ice water to prevent glycolysis from altering the acid-base composition. Freezing the sample results in hemolysis. Most test methods require incubation at 37° C.
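The rule of thumb stated above (bicarbonate is about 95% of the measured total carbon dioxide) reduces to a one-line estimate; the function name and example value are illustrative.

```python
def estimate_bicarbonate(total_co2: float) -> float:
    """Estimate bicarbonate (mmol/L) as 95% of the measured total CO2."""
    return 0.95 * total_co2

print(round(estimate_bicarbonate(24.0), 1))  # 22.8
```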

MISCELLANEOUS CHEMISTRY ASSAYS

Creatine Kinase

Creatine kinase (CK) was previously known as creatine phosphokinase (CPK). It is produced primarily in striated muscle cells and, to some extent, in the brain. CK is considered one of the most organ-specific enzymes available for clinical evaluation. When striated muscle, including cardiac muscle, is damaged or destroyed, CK leaks out of the cells and produces an elevated blood CK level. Although the brain produces some CK, how much CK from the brain actually enters the peripheral circulation is uncertain. CK is frequently assayed if an animal has an elevated blood AST level but shows no clinical signs of liver disease. CK is also evaluated in CSF because its measurement in CSF has been suggested as an ancillary diagnostic test for nonspecific damage to neural tissue (e.g., neural hypoxia, trauma, inflammation, or compression by a space-occupying lesion, such as a tumor). The CSF CK value therefore may be a useful guide to prognosis in canine neurologic cases and in premature foals. Increased values also may be observed after seizures.

Although the CK assay is organ specific, it cannot determine which muscle has been damaged or indicate the severity of the muscle damage. Anything that damages the muscle cell membrane can cause an increased blood CK level. This damage may stem from intramuscular injections, persistent recumbency, surgery, vigorous exercise, electric shock, laceration, bruising, and hypothermia. Myositis and other myopathies also cause elevated blood CK levels. CK levels in samples may be artificially increased by oxidizing agents (e.g., bleach), by EDTA, citrate, or fluoride, by exposure to sunlight, or by delay in assay.

Lactate

Lactate (lactic acid) is produced by anaerobic cellular metabolism. Its presence does not indicate any specific disease. However, increased lactate levels indicate hypoxia or hypoperfusion. Lactate levels may be measured in plasma, peritoneal fluid, and CSF. Hypoxia of a section of bowel wall results in increased lactate production, much of which diffuses into the peritoneal cavity before entering the circulation to be removed by the liver. Use of paired blood and peritoneal fluid lactate measurements has been advocated as a diagnostic aid in equine colic cases. The blood lactate concentration of normal horses is always greater than that of peritoneal fluid. Horses with gastrointestinal disorders generally have peritoneal fluid lactate concentrations greater than corresponding blood values. Less severe gastrointestinal disorders, such as impactions, tend to cause a smaller difference between peritoneal fluid and blood lactate concentrations than do serious ones, such as intestinal torsion (a twisted section of bowel). Peritonitis also increases peritoneal fluid lactate values.
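The paired-sample comparison described above can be written as a simple flag. This is a hypothetical helper; both values must be in the same units.

```python
def peritoneal_lactate_elevated(blood_lactate: float,
                                peritoneal_lactate: float) -> bool:
    """In normal horses, blood lactate exceeds peritoneal fluid lactate;
    the reverse pattern suggests a gastrointestinal disorder."""
    return peritoneal_lactate > blood_lactate

print(peritoneal_lactate_elevated(1.5, 4.0))  # True: consistent with GI disorder
print(peritoneal_lactate_elevated(1.5, 0.9))  # False: normal pattern
```

A larger peritoneal-to-blood difference tends to accompany serious lesions such as intestinal torsion, as the text notes.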

The sample for lactate measurement (blood or peritoneal fluid) should be collected in a fluoride oxalate or lithium heparin anticoagulant tube. The fluoride stops cellular metabolism of glucose and consequent production of lactate, and the oxalate prevents sample clotting.

Chemical Tests of Gastrointestinal Function

The principal functions of the gastrointestinal (GI) tract are the assimilation of nutrients (digestion and/or absorption) and excretion of waste products. Most nutrients are ingested in a form either too complex or insoluble for absorption. Within the GI tract these substances are solubilized and degraded enzymatically to simple molecules that may be absorbed across the mucosal epithelium.

Gastrointestinal diseases are common in veterinary practice. Specific diagnosis is essential, especially when the disease is chronic. In cases of malabsorption, intestinal biopsy tends to be required to obtain a definitive diagnosis. However, function tests are performed first to rule out other diseases and to confirm the need for more invasive diagnostic procedures. Therefore function tests are useful in guiding treatment.

Malassimilation may be classified by pathophysiologic process into maldigestive or malabsorptive forms. Maldigestion results from altered gastric secretion and lack of or decreased amounts of digestive enzymes, usually secreted by the pancreas and, less often, the intestinal mucosa. Malabsorption most often is caused by an acquired disease of the small intestinal wall or by bacterial overgrowth syndromes. Before clinical signs of maldigestion are seen, approximately 90% of the pancreas must be either nonfunctional or destroyed. The small intestine of a dog can function well with up to 85% loss, but greater than 50% loss may result in “short bowel syndrome” that cannot be compensated for by adaptive mechanisms.

Laboratory tests may evaluate gastric hydrochloric acid secretion, but most are directed at detecting malassimilation and its origin. Gastric acid secretion may be indirectly estimated by determining gastric juice pH; normal dogs have a fasting gastric pH of 0.9 to 2.5. Gastric juice pH may be continuously monitored by a radiotelemetric technique.

Malassimilation tests are based on examination of feces for undigested dietary nutrients and enzyme activities, examination of serum for concentrations of orally administered substrates or their metabolites, and specific tests for endogenous substances.

Gastrin Secretion

Zollinger-Ellison syndrome (gastrinoma) may be diagnosed after identification of gastrointestinal ulceration, an amine precursor uptake and decarboxylation (APUD) cell neoplasm (most often located in the pancreas), and high serum gastrin concentrations. However, some cases may require provocative tests to document the excessive gastrin secretion. Stimulation of gastrin secretion may be accomplished with a test meal, intravenous calcium infusion, or intravenous secretin infusion. Serum samples for gastrin determination must be obtained before and 2, 5, 15, and 30 minutes after intravenous administration of 2 to 4 IU/kg of secretin. The test is considered positive if a twofold increase in gastrin concentration occurs 2 and/or 5 minutes after administration.
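The secretin-stimulation criterion described above (a twofold gastrin increase at the 2- and/or 5-minute sample) can be sketched as a small helper. This is illustrative only; the function and variable names are not from the text.

```python
# Hypothetical helper for the secretin-stimulation criterion: the test is
# positive if serum gastrin at least doubles at the 2- and/or 5-minute sample.

def secretin_test_positive(baseline_gastrin, timed_gastrin):
    """baseline_gastrin: pre-secretin serum gastrin (pg/ml).
    timed_gastrin: dict mapping minutes post-secretin -> gastrin (pg/ml)."""
    for minute in (2, 5):
        value = timed_gastrin.get(minute)
        if value is not None and value >= 2 * baseline_gastrin:
            return True
    return False

# Example: baseline 150 pg/ml doubles by the 2-minute sample.
print(secretin_test_positive(150, {2: 320, 5: 280, 15: 190, 30: 160}))  # True
```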

Fecal Occult Blood

Blood loss into the gut is another cause of protein-losing gastroenteropathy. Dramatic bleeding is evident as black feces (melena) or frank fecal blood (hematochezia). Less-obvious, subtle bleeding is a significant sign of GI ulcers, neoplasia, or parasitism. Chronic, low-level bleeding may lead to iron-deficiency anemia.

Useful reagents to detect insidious bleeding include orthotoluidine (Occultest, Ames, Iowa) and benzidine (Hemoccult, Beckman Coulter Inc., Fullerton, Calif.). Impregnated strips or tablets are oxidized to a colored product by hemoglobin peroxidase activity in the feces. Both reagents are so sensitive that they respond to dietary hemoglobin and myoglobin; therefore the patient’s diet must be meat free for 3 days before the test. A cottage cheese and rice diet, 50% of each, is recommended for the patient before the test is performed. This precaution is less pertinent to herbivores, but the technician must check that the diet has not been supplemented with meat or bone meal. Another test for fecal occult blood is the guaiac test. The test is less sensitive but also less affected by diet. However, the reagents used for the test are not commonly found in the veterinary clinic.

Fecal Proteolytic Activity

The assay for fecal proteolytic activity is effective for the evaluation of exocrine pancreatic insufficiency (EPI) in cats and other species. Fecal proteolytic activity can be determined colorimetrically by using an azocasein or azoalbumin substrate or by the radial enzyme diffusion method. Dogs with EPI occasionally have normal fecal proteolytic activity, so a reliable radial enzyme diffusion result requires fecal samples from three different days. Normal digestion halo diameters are 10.7 ± 2.7 mm in dogs and 9.6 ± 3.4 mm in cats.
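As a rough illustration of how the reference values above might be applied, the sketch below flags a three-day mean halo diameter that falls below the species mean minus two standard deviations. The 2-SD cutoff is an assumption for illustration, not a published decision threshold.

```python
# Illustrative sketch only: flags a radial enzyme diffusion result as
# suspicious for EPI when the mean halo diameter over three days falls below
# the species mean minus 2 SD (the 2-SD cutoff is an assumption).

REFERENCE = {"dog": (10.7, 2.7), "cat": (9.6, 3.4)}  # (mean, SD) in mm

def halo_suspicious_for_epi(species, halo_diameters_mm):
    mean_halo = sum(halo_diameters_mm) / len(halo_diameters_mm)
    ref_mean, ref_sd = REFERENCE[species]
    return mean_halo < ref_mean - 2 * ref_sd

# Halo diameters from three consecutive-day samples from a dog
# (cutoff: 10.7 - 2 * 2.7 = 5.3 mm):
print(halo_suspicious_for_epi("dog", [3.0, 4.2, 3.8]))  # True
```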

Fecal α1-Protease Inhibitor (α1-Antitrypsin)

This plasma protease resists intraluminal proteolytic degradation. In dogs, α1-protease inhibitor may be detected by radial immunodiffusion or enzyme-linked immunosorbent assay. The assay is species specific and is appropriate for documenting protein-losing enteropathy or gastrointestinal hemorrhage.

The best method for diagnosing protein-losing enteropathy is quantitation of the loss of chromium-51-labeled albumin. The test is based on intravenous administration of radioactive chromium, which binds to circulating plasma albumin; albumin loss is proportional to the quantity of radioisotope (radioactivity) in the feces.

Lipid Absorption

Lipid absorption relies on bile, pancreatic lipase, and a healthy intestinal mucosa. Associated lesions cause steatorrhea and prevent the normal hyperlipemia that follows a fatty meal. Lipemia, as judged by plasma turbidity, is then a gauge of GI function.

The patient must be fasted for 12 hours and a baseline blood sample must be drawn and centrifuged. If fasting lipemia is encountered (e.g., in diabetes mellitus, starvation, feline steatitis, hypothyroidism, hyperadrenocorticism, liver/biliary disease, familial hyperlipemia of Schnauzers, white muscle disease of sheep, or hyperlipemia of ponies, among others), the test must be abandoned. If the plasma is clear, either corn oil (3 ml/kg) or peanut oil (2 ml/kg) is administered orally. Blood samples are drawn at hourly intervals for 4 hours. This plasma should be cloudy. Clear follow-up samples indicate disturbed absorptive function, but the site of disturbance may not be known. Some argue that this test reveals nothing more than the simple discovery of steatorrhea.

A variation of this method may better define the lesion. The test is repeated on another day; however, the oil is preincubated at room temperature for 20 minutes with pancreatic enzymes. A cloudy postchallenge plasma implies a pancreatic enzyme deficiency. If it is clear, inadequate bile production or intestinal malassimilation must be considered. The fat absorption test may yield false-negative results (no fat absorption) with delayed gastric emptying (fat induces this), gastric inactivation of added pancreatic enzymes, or enteritis.

Slight lipemia before oil administration indicates increased levels of circulating lipoproteins (mainly triglycerides and cholesterol). The enzyme lipoprotein lipase normally clears plasma of triglycerides; the activity of this enzyme is enhanced by insulin, thyroid hormones, glucagon, and heparin. Triglyceride determination on slightly lipemic plasma may reveal an increased concentration, which could be attributed to disease of hepatic, pancreatic, renal, or endocrine origin. Slight postprandial lipemia also may be seen in some canine patients with malabsorptive syndromes.

Monosaccharide Absorption Tests

These tests more specifically probe intestinal function. Again, the agent is given orally; blood concentrations are the measure of absorption.

d-Xylose Absorption: d-xylose is a five-carbon sugar absorbed passively in the jejunum and excreted rapidly by the kidneys. Because xylose absorption is simple and the agent is not metabolized, its fate is readily traced. Xylose absorption is inefficient and is affected by some intestinal diseases. The test is relatively insensitive, however, because control values are variable.

The test is performed in dogs and horses, as described in Box 3-7. Interference from rumen flora precludes use of the oral test in cattle and sheep; the alternative injection of monosaccharides into the abomasum is difficult enough to make its use rare.

BOX 3-7   Overview of monosaccharide absorption test in dogs and horses

Oral Xylose Absorption in Dogs

• The patient is fasted and a baseline xylose measurement is determined.

• Xylose solution is administered via stomach tube, and xylose measurements are obtained from blood samples collected 30, 60, 90, 120, 180, and 240 minutes after administration.

• Blood xylose concentrations are graphed over time.

• A peak of less than 45 mg/dl between 30 and 90 minutes is abnormal. Peak values of 45 to 50 mg/dl are possibly abnormal. A peak above 50 mg/dl is probably normal.

Glucose and Xylose Absorption in Horses

• Glucose and xylose tests are performed separately with the same general protocol.

• The patient is fasted for 12 to 18 hours, and water is then withheld.

• Baseline xylose (or glucose) concentrations are obtained.

• The xylose (or glucose) solution is administered via stomach tube, and blood samples are collected at 30-minute intervals for 4 to 5 hours.

• Maximum blood xylose level of 20.6 ± 4.8 mg/dl is expected at 60 minutes. The preadministration blood glucose concentration should be doubled by 120 minutes.
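The canine interpretation rule in Box 3-7 can be sketched as a short classifier. Function and variable names are illustrative, not from the text.

```python
# Sketch of the canine rule in Box 3-7: classify the peak blood xylose
# concentration reached between 30 and 90 minutes after administration.

def classify_canine_xylose(samples):
    """samples: dict mapping minutes after administration -> xylose (mg/dl)."""
    window = [value for minute, value in samples.items() if 30 <= minute <= 90]
    peak = max(window)
    if peak < 45:
        return "abnormal"
    if peak <= 50:
        return "possibly abnormal"
    return "probably normal"

# Example series with a 41 mg/dl peak in the 30- to 90-minute window:
results = {30: 22, 60: 38, 90: 41, 120: 35, 180: 20, 240: 12}
print(classify_canine_xylose(results))  # abnormal
```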

Abnormal xylose absorption indicates intestinal malassimilation, specifically malabsorption. However, only slight differences separate normal and abnormal ranges; diseased animals may have normal results. Animals with lymphangiectasia still may have normal results because the lymphatics do not participate in xylose absorption. The rate of xylose absorption depends only on the amount given, the size of the absorptive area, intestinal blood circulation, and gastric emptying. The latter may be delayed by cold or hypertonic solutions, pain, apprehension, or feeding. Fasting or radiographs to confirm an empty stomach are required. Vomiting, however, falsely lowers blood values, as does ascites (xylose enters pooled fluids). Bacteria have the ability to metabolize xylose; therefore bacterial overgrowth may be monitored by this test. If bacterial overgrowth is suspected (in cases of intestinal stasis or pancreatic enzyme deficiency), the test should be repeated after 24 hours’ use of oral tetracycline. Finally, renal disease falsely elevates blood xylose concentrations.

The fate of xylose also has been followed in dogs by collecting a 5-hour urine sample after a 25-g oral dose and determining the total xylose excreted. This method is more laborious but requires only one xylose assay.

Cats were thought to have plasma xylose concentrations and kinetics similar to those of dogs. Other studies found xylose uptake to be variable in cats; plasma concentrations did not increase to the levels found in dogs. Peak plasma xylose concentrations in normal cats ranged between 12 and 42 mg/dl at a dosage of 500 mg/kg body weight.

False-negative results may be caused by delayed gastric emptying, abnormal intestinal motility, reduced intestinal blood flow, bacterial overgrowth, and sequestration of xylose in ascitic fluid. False-positive results may be caused by decreased GFR; therefore ensuring the patient is fully hydrated and not azotemic at the time of testing is important.

Of the oral dose of xylose, 18% is excreted through the kidneys within 5 hours. The test has been improved by performing a combined d-xylose and 3-O-methyl-d-glucose (3MG) absorption test, comparing the differential absorption of the two sugars to eliminate nonmucosal effects on d-xylose absorption.

Serum Folate and Cobalamin

Serum concentrations of folate and cobalamin may be assessed by radioimmunoassay. Both concentrations tend to be decreased in malabsorption. Folate is absorbed in the proximal intestine, whereas cobalamin is absorbed in the ileum. Bacterial overgrowth also may alter these concentrations; folate synthesis is increased in bacterial overgrowth, whereas some bacteria may decrease the cobalamin availability.

Mucin Clot Test

Synovial fluid mucin forms a clot when added to acetic acid. The nature of the resultant clot reflects the quality and concentration of hyaluronic acid. The following is one method. To perform the test, 1 ml of nonanticoagulated synovial fluid is added to 7 N glacial acetic acid diluted 0.1:4. The synovial fluid/acetic acid solution is gently mixed and allowed to stand at room temperature for 1 hour before it is evaluated for the presence of a clot. The mucin clot generally is graded as good (large, compact, ropy clot in a clear solution), fair (soft clot in a slightly turbid solution), fair-poor (friable clot in a cloudy solution), or poor (no actual clot, but some large flecks in a turbid solution). Clot assessment is enhanced by gently shaking the tube; good clots remain ropy, whereas poor clots fragment. If only a few drops of synovial fluid are obtained at arthrocentesis, an abbreviated mucin clot test may be performed. If a drop of non-EDTA-preserved fluid remains after preparation of a cytologic smear (and possibly a total nucleated cell count), it is placed on a clean microscope slide, three drops of diluted acetic acid are added and mixed, and the resultant clot is graded after approximately 1 minute. Assessment may be easier against a dark background.

TOXICOLOGY

Numerous agents may be involved in common poisonings of dogs, cats, horses, and food animals, including herbicides, fungicides, insecticides, rodenticides, heavy metals (especially lead), household products (including phenols), automotive products (especially ethylene glycol), drugs (including medications), and various poisonous plants and animals. Often a presumptive diagnosis may be attained from an accurate history, including environmental factors, and a thorough clinical examination followed by response to therapy or by necropsy. However, establishing a specific etiologic diagnosis may be difficult in some cases.

A few simple tests may be performed in the veterinary practice laboratory. In such situations, personnel must be familiar and competent with the test procedure, reagents must not be outdated, and special equipment may be required. These requirements, together with a sporadic demand for such tests, frequently dictate that practitioners send all toxicologic specimens to a specially equipped laboratory for analysis.

Toxicologic Specimens

Suggestions on appropriate specimens and preferred methods of handling, packaging, and transport can be obtained by consultation with the toxicology laboratory. Such contact also ensures that the laboratory offers the procedures requested. Submitted specimens should be free from contamination by extraneous environmental compounds or debris. Specimens should not be washed, which may remove toxic residues. Samples of different fluids, tissues, and feeds must be submitted in separate leak-proof (airtight), clean plastic or glass containers. All containers should be individually identified by the owner’s and veterinarian’s names, animal’s name or identification number, and the nature of the specimen before packaging into a large container for submission to the laboratory.

Samples of whole blood (at least 10 ml, usually heparinized), serum (at least 10 ml), vomitus, gastric lavage fluid, feces, and urine (approximately 50 ml) may be submitted from live animals. Samples of feed (portions of at least 200 g), water, and suspected baits also may be helpful in some cases. In fatal poisoning, samples collected during a thorough necropsy should include whole blood or serum; urine; gut (especially stomach) contents (at least 200 g, noting site of collection); and organ or tissue samples, especially liver and kidney, but sometimes brain, bone, spleen, or fat (generally, where practical, at least 100 g of each tissue). Sending too large a sample is always better than not sending enough because excess can be discarded.

In general, serum or blood samples are best submitted refrigerated, whereas gut contents and tissues are best frozen. Preservatives are usually not required. An exception would be tissue samples submitted for histopathologic examination, which require fixation in 10% formalin and must not be frozen. If a preservative is used on a specimen submitted for chemical analysis, it is probably worthwhile also to submit an aliquot of preservative for reference analysis. Frozen samples should be insulated from other specimens and should arrive at the laboratory while still frozen. Dispatch to the laboratory by courier is recommended.

Because litigation may result from poisoning cases, accurate and detailed records should be kept from the outset of the case. Establishment of a good working relationship with the toxicology laboratory, including provision of a good case history (and necropsy findings in fatal poisonings) when samples are submitted, helps ensure the best results.

The main advantages of the following tests are that they can be performed reasonably quickly in the practice laboratory. Results are therefore available more rapidly than if the sample were sent to a toxicology laboratory. However, they are best viewed as screening procedures, suggesting appropriate avenues of investigation and treatment. Verification of findings (especially positive ones) by a reputable toxicology laboratory is advisable, especially if subsequent legal action by the client is a possibility.

Lead Poisoning

Lead is a fairly common environmental pollutant, in the air of cities and in old lead-based paints, lead shot (ammunition), linoleum, car batteries, solder, roofing materials, and petroleum products. Lead poisoning (plumbism) can occur in all species. Clinical signs vary with the species and are related chiefly to the gastrointestinal tract and nervous system. Hematologic examination of blood from an animal with lead poisoning may reveal basophilic stippling of some erythrocytes and increased numbers of circulating nucleated red blood cells (metarubricytosis) (see Chapter 2). Such findings in an animal that is not anemic and has clinical signs consistent with lead poisoning strongly suggest plumbism.

No simple, reliable in-house tests exist to detect lead in blood, feces, urine, milk, or tissues. Blood lead levels can be determined readily (by atomic absorption spectrophotometry) at a toxicology laboratory on whole blood, collected in EDTA, heparin, or citrate blood tubes. Tissue samples (especially liver and kidney) and feces also may be tested. Histopathologic examination of liver, kidney, or bone, stained by the Ziehl-Neelsen technique, may reveal characteristic eosinophilic, acid-fast intranuclear inclusion bodies in hepatocytes, renal tubular cells, and osteoclasts, respectively.

Nitrate or Nitrite Poisoning

Nitrate or nitrite poisoning may occur in ruminants, pigs, and horses ingesting feeds with high concentrations of these compounds. Such may be the case with cereals, grasses, and root crops heavily fertilized with nitrogenous compounds. Water, especially from deep wells filled with seepage from heavily fertilized ground, may contain large quantities of nitrate. Nitrates are converted to nitrites in the feed or in the intestinal tract. Nitrites absorbed from the gut decrease the oxygen-carrying capacity of the blood by oxidizing hemoglobin to methemoglobin in erythrocytes. Consequently, the animal's blood becomes dark red to brown. The severity of clinical signs is related to the quantity ingested. Death can be acute, and many animals may be affected.

A rapid, fairly specific, semiqualitative test uses diphenylamine, which is converted to quinoidal compounds with an intensely blue color by nitrates and nitrites. Diphenylamine (0.5 g) is dissolved in 20 ml distilled water and the solution made up to 100 ml with concentrated sulfuric acid. This stock solution may be used undiluted or diluted 1:1 with 80% sulfuric acid. The solution is applied to the inner portion of the plant’s stem. An intense blue color within 10 seconds of application of the undiluted solution suggests greater than 1% nitrate is present (and the feed is potentially toxic). False-positive results may occur with numerous substances, the most significant of which is iron. Such iron is generally on the outside of the stalk; therefore careful application circumvents this problem.

A more dilute diphenylamine solution (the previous stock solution diluted 1:7 with concentrated sulfuric acid) may be used to test for nitrates/nitrites in serum or plasma, other body fluids, and urine. Three drops of the diluted diphenylamine are added to 1 drop of the sample on a glass slide over a white background. Nitrate/nitrite produces an intense blue color immediately. Hemolysis may mask the color change.

Anticoagulant Rodenticides

Anticoagulant rodenticides (e.g., warfarin, diphacinone, pindone) act by inhibiting metabolism of vitamin K in the body. The latter is required for production of factors II, VII, IX, and X in the liver. Anticoagulant rodenticide poisoning initially prolongs the prothrombin time (PT) because factor VII is the first to be depleted. Subsequently, the partial thromboplastin time (PTT) and activated coagulation time (ACT) are prolonged as the other factors also become depleted. When an animal is bleeding as a result of such poisoning, both the PT and PTT (or ACT) are usually prolonged. Diagnosis of anticoagulant rodenticide poisoning is often based on these screening tests and the response to treatment with vitamin K.
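The screening logic above, in which PT prolongs first because factor VII is depleted first, can be summarized in a minimal sketch. The boolean inputs stand in for comparisons against laboratory reference maxima, an illustrative simplification.

```python
# Minimal sketch of the coagulation screening pattern for anticoagulant
# rodenticide poisoning: PT prolongs first (factor VII is depleted first);
# in a bleeding animal both PT and PTT/ACT are usually prolonged.

def rodenticide_screen(pt_prolonged, ptt_or_act_prolonged):
    if pt_prolonged and ptt_or_act_prolonged:
        return "consistent with established anticoagulant rodenticide toxicosis"
    if pt_prolonged:
        return "consistent with early toxicosis (factor VII depleted first)"
    return "screening tests do not support anticoagulant rodenticide toxicosis"

# Early case: PT prolonged, PTT/ACT still within reference interval.
print(rodenticide_screen(True, False))
```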

Chemicals that Denature Hemoglobin

A variety of compounds, when ingested, may result in damage to (oxidative denaturation of) hemoglobin in erythrocytes with the formation of Heinz bodies (see Chapter 2). Such substances include paracetamol and methylene blue (cats), onions (dogs), red maple leaves (horses), and onions and brassicas (ruminants). Demonstration of Heinz bodies on a blood film is diagnostic of such poisoning.

Selenium-deficient animals are more prone to such oxidative injury because of a deficiency of glutathione peroxidase (an enzyme in erythrocytes that helps protect them against such damage).

Ethylene Glycol

Ethylene glycol is the major constituent of most antifreeze solutions. Accidental ingestion can cause serious or fatal toxicosis, usually in dogs and cats. Ethylene glycol and its metabolites can be detected in whole blood or serum samples by a toxicology laboratory. Its presence is strongly suggested when urine sediments from poisoned dogs or cats contain masses of calcium oxalate monohydrate crystals (see Chapter 5). Histopathologic examination of the kidney of fatally affected animals reveals renal tubular nephrosis and numerous oxalate crystals.

Recommended Reading

Karselis, I. The pocket guide to clinical laboratory instrumentation. Philadelphia: FA Davis, 1994.

Meyer, DJ, Harvey, JW. Veterinary laboratory medicine: interpretation and diagnosis. St Louis: Elsevier, 2004.

Sodikoff, C. Laboratory profiles of small animal diseases: a guide to laboratory diagnosis. St. Louis: Elsevier, 2001.

Thrall, MA, Baker, DC, Lassen, ED. Veterinary hematology & clinical chemistry. Baltimore: Lippincott Williams & Wilkins, 2004.

Willard, MD, Tvedten, H. Small animal clinical diagnosis by laboratory methods. St Louis: Elsevier, 2003.