The purpose of this chapter is to identify production practices that may influence the risk of contamination of produce and exposure of the consumer to human pathogens. Key areas of concern are prior land use, adjacent land use, field slope and drainage, soil properties, crop inputs and soil fertility management, water quality and use practices, equipment and container sanitation, worker hygiene and sanitary facilities, harvest implement and surface sanitation, pest and vermin control, effects of domesticated animals and wildlife on the crop itself or the packing area, post-harvest water quality and use practices, post-harvest handling, transportation and distribution, and documentation and record-keeping. The role of water quality and manure management practices is particularly critical. This chapter focuses largely on practices and research originating in the United States; however, the issues of concern are most likely applicable worldwide, affecting domestic as well as imported products. Appendix A describes the results of a state survey on local requirements regarding manure and water quality management that may influence microbial contamination of produce.
This chapter is not intended to be a guide for producers of fresh fruits and vegetables but a compilation of evidence that provides a basis for identifying sources of contamination where they may exist. Much of this chapter is based on observation, experience, and common sense rather than scientific research, although a review and evaluation of the available literature is included. The reader is referred to recent guides on agricultural practices for a comprehensive identification of measures to minimize public health risks from consuming fresh and fresh-cut produce (Gorny and Zagory forthcoming; IFPA 1996; IFPA 1997; FDA 1998). These documents provide excellent guidance for all parties involved in the production chain, from growers to shippers, towards achieving a safe fresh produce supply.
The grower, packer, shipper, and handler of fresh-consumed horticultural products are often faced with a labyrinth of dynamic responses to weather, pests, market value and trends, labor, and customer requests. Each decision that veers from established practices may alter microbial food safety risks. The potential risk may be reduced or increased by seemingly minor deviations in timing, source of production input, degree of handling, method of cooling, or any of dozens of other interacting factors. The effects are generally unrecognized, and the scientific basis for the hazards is limited. The highly perishable nature of much of what is produced, the general volatility of the market, sudden swings in availability of product, and opportunities to add value by special handling or niche production create an industry that is at once conforming and highly individualized.
The diversity of cropping systems, scale of operation, use and design of equipment, regional and local practices, environmental influences, specifics of on-farm soil-related factors, and many other production factors defy any attempt to develop an encompassing assignment of microbial risk to commodities or to crop management practices. This was recognized in the evolution of the Guide to Minimize Microbial Food Safety Hazards for Fresh Fruits and Vegetables (FDA 1998), which has become a template for focusing on the key areas of presumptive risk potential for fruit and vegetable production and handling. Although the available scientific literature is adequate to identify sources of contamination and estimate microbial persistence on plants, the specific influences and interactions among production environments and crop management practices are not sufficiently understood to provide detailed guidance to growers and shippers. Climate, weather, water quality, soil fertility, pest and irrigation management, and other practices are difficult to integrate towards the development and implementation of microbial risk prevention and reduction programs on the farm.
Clearly, the risks associated with the purposeful introduction of pathogen-contaminated inputs (that is, inadequately aged manures, inadequately treated wastewater) or inadvertent contamination (that is, irrigation water quality, wildlife activity, adjacent land use, worker hygiene and sanitation) have long been recognized (Geldreich and Bordner 1971; Cherry and others 1972; Bryan 1977). Such risks have been thoroughly reviewed (Beuchat 1996; Tauxe and others 1997; Brackett 1999; NACMCF 1999) or summarized in the form of food safety guidelines for producers and handlers (IFPA 1997; FDA 1998; Cornell 2000). A recent overview of the anticipated persistence of pathogens on the surface of fruits and vegetables during production contrasted the characterized survival capacity of resident plant pathogens and benign microbial epiphytes (surface colonizers) with that of microorganisms responsible for foodborne illness (Suslow 2001).
Among the environmental and management factors, animal waste pollution of water and the application of managed fecal matter (that is, animal manure or human biosolids) to soil intended for edible crops are two issues of concern during the pre-harvest and post-harvest management of fresh fruits and vegetables. A potential hazard exists for persistent pathogen populations to be transferred to harvested crops indirectly through contaminated water, by direct cross-contamination due to proximity to animal production compounds or facilities, or from inadequately composted animal manure and biosolids (Committee on the Use of Treated Municipal Wastewater Effluents and Sludge in the Production of Crops for Human Consumption 1996; FDA 1998). These edible horticultural crops, including fruits, vegetables, edible flowers, seed sprouts, and a variety of recently domesticated wild species, are typically consumed without a treatment to inactivate any pathogenic microflora that may be present. The potential for contamination does not appear to depend on the size of the operation (small- versus large-scale operations). The extent of potential illness from contaminated produce is more likely to be broad in locations where large-scale animal production is immediately adjacent to edible crop production. Though not the norm, this type of situation does occur. No conclusive evidence of foodborne pathogen transfer or human illness is available that definitively links this perceived on-farm hazard to known outbreaks. Nonetheless, in light of current knowledge, awareness, and consumer perceptions and concerns regarding fresh produce, research directed towards characterizing and quantifying the risk seems prudent (Rose and Gerba 1991; Atwill and others forthcoming). Successful programs, such as training on pathogen prevention and control methods for all parties involved in the fresh produce production chain, are already being offered by experts in universities and trade associations.
2. Pre-harvest Operations
A schematic for pre-harvest operations is given in Figure II-1. Although each crop experiences particular conditions at each step of the process, some potential points need to be given careful consideration. These points are key elements in ensuring the safety of the final product because they can introduce pathogens that may persist and reach the consumer. It is important to recognize these points, obtain the appropriate information, and develop strategies to reduce the risks. The most important considerations are the quality of the water used during irrigation or various foliar applications and the application of manure, biosolids, or compost as fertilizers. These will be the focus of this chapter because they have the most direct potential implication in contamination (sections 2.1. and 2.2.).
While other pre-harvest circumstances may be equally significant, research to assess their impact on food safety is lacking. For instance, land that has no recent history of manure application, animal grazing, or animal farming activities would be a preferred location. Distance limits to adjacent animal farm activities should be considered, and the possibility of contaminated run-off should not be overlooked. These factors will be discussed in the context of water quality (section 2.2).
Finally, there are additional factors whose contribution to food safety risks may be more indirect. These include transfer of pathogens by wild and domestic animals, vermin, and aerosols. Section 2.3. describes some of the research that has been published in an effort to evaluate the risks from these sources.
Figure II-1. Schematic of pre-harvest field operations and their potential risk-reducing control points. The control points recoverable from the flowchart are: crop site selection (soil, water, wild and domestic animals, drift and run-off from adjacent farms, prior land-use history); irrigation water and irrigation method; water quality of foliar applications; wild and domestic animals and vermin; and dust control (spraying roads and paths with water).
2.1. Manure and biosolids
2.1.1. Description of the situation
Animal manure is commonly used as a crop fertilizer worldwide. In the United States, excess animal manure may be spread on land in the vicinity of animal or produce farms because of high transportation costs. According to a USDA survey (USDA 2001), organic sources of fertilizer are not commonly used by conventional fruit and vegetable growers, but they are readily used by organic producers.
A large number of factors influence the probability of human pathogens becoming established on produce (for example, location, soil microflora, rain, and irrigation). Sufficient scientific evidence suggests that human pathogens may be transferred to existing adjacent crops by a variety of physical routes prior to (for example, wind) or during soil incorporation of organic soil amendments (Ahmed and Muller 1984; Jones 1999; Gagliardi and Karns 2000; Hossain and others 2000; Abu-Ashour and Lee 2000; FengYu and others 2000; Warnemuende and Kanwar 2000). There are multiple potential sources of contamination (Beuchat and Ryu 1997): animal and human feces, contaminated raw manure, irrigation water, water used for pesticide application or other agricultural purposes, contaminated dust, vermin and insects as vectors of fecal matter, and transfer by or on farm equipment. Significant populations of Escherichia coli and related pathogens of fecal origin are probably limited to soil contaminated with fecal material and are widely held to be transitory. Soil per se is not an important source of enteric pathogens on plants (NACMCF 1999). Fecal contamination of irrigation water, however, may be an important source, and other pathogens may be present in feces and have a recognized soil residency. Also, Listeria monocytogenes is known to exist as a soil resident associated with decomposed organic matter (Dowe and others 1997; Porto and Eiroa 2001).
Animal manure has been used for thousands of years as a soil amendment to increase or maintain the organic matter content, biological diversity and activity, and soil aggregate stability of agricultural soils (Brady 1990). Animal manure contributes to the fertility management of soils, particularly nitrogen, and may be the primary source of plant nitrogen in organic farming systems. However, outlets to agricultural land are not always an option, and in many regionally concentrated animal production systems (including pig, dairy, and poultry) the disposal of manure has become a significant waste management problem (EPA 2000). In many areas of the United States, manure is accumulating in excess of that needed for application to on-farm or local forage production areas. Animal manure is often contaminated with human pathogens. This waste management issue is believed to be a key contributor to an intimately related potential source of produce contamination: water. For example, the annual animal waste production in the United States is estimated to be 1.36 billion tons, compared to a volume of human waste of over 64 million tons (EPA 2000). Animal production operations are increasingly concentrated, and large dairy operations are characteristic of states such as California, a major producer of fruits and vegetables for domestic and export markets. This large volume of waste must find an outlet as material for on-site bedding, on-site land application, and off-site land or farm distribution. Dairy and feedlot production facilities are the largest producers of waste volume and may be sources of Salmonella spp., E. coli O157:H7, Cryptosporidium parvum, and other potential human pathogens, although other farm animals may also be pathogen sources. As a result, the risk of water pollution and contamination from waste spills, run-off, seasonal flooding, and lagoon leakage is increased.
Although a significant number of outbreaks of human foodborne infections have been linked to consumption of raw fruit and vegetable products such as unpasteurized apple juice, pre-sliced melon, lettuce, and sprouts, the epidemiological association of outbreaks with the use of aged manure or compost in the United States is still speculative. Acquisition of pathogens from pathogen-containing soil amendments has been demonstrated (Bryan 1977 and references therein), but, although possible, there is no demonstrated evidence that pathogens incorporated as soil amendments prior to planting can persist until harvest and be transferred to the edible portion of the crop. Although direct evidence of food-associated illness due to contamination of produce, from any source, during commercial production is scant, compelling epidemiological evidence, primarily related to fresh leafy greens and fruit-vegetables, has implicated poor production practices and poor animal waste and manure process control practices. It is reasonable to believe that, as a result of sub-standard or even illegal agricultural practices, produce may be contaminated with human pathogens. The National Advisory Committee on Microbiological Criteria for Foods (NACMCF) lists 11 agents associated with produce-borne outbreaks. Foremost among them are E. coli O157:H7 and various Salmonella serotypes (see Chapter IV and Tauxe 1997). Health officials at a recent national food safety meeting disclosed preliminary data demonstrating that foodborne illness associated with fresh produce in the United States is related predominantly to pathogens of animal origin, whereas illness attributed to imported produce predominantly aligns with human sources of contamination. Comparable findings have been reported from earlier studies of environmental sources of Salmonella spp. and clinical salmonellosis (Dondero and others 1977; Jones and others 1980; Garcia-Villanova Ruiz, Cueto Espinar and others 1987).
This is an interesting and important area of future research that is critical to better identify the environmental sources of contamination. In addition to information on pathogen origin, there is an ongoing need for multidisciplinary research on industry practices, pathogen persistence, and pathogen reduction practices for manure intended for soil incorporation on fruit and vegetable farms or direct crop applications as foliar sprays.
2.1.2. Factors affecting contamination of fruits and vegetables
It has long been known that the improper use of manure can transfer pathogens onto crops, resulting in human disease. Raw manure should not be applied to crops. In addition to the hazard of pathogen transmission, it is well recognized that salt injury to sensitive vegetable crops and transfer of viable weed seed may result unless the manure is subjected, at least, to a period of unmanaged (no thorough mixing or pile inversion) composting. This "stacked" or "aged" manure is applied at various times and amounts to a variety of production soils. According to a USDA survey, among conventional growers, only 6% of fruit acres and 3% of vegetable acres were reported to have had manure applied in 1999. Only 2% of all fruit acres and 1% of all vegetable acres received sludge applications in 1999 (USDA 2001). The survey also reported that, of the manure users, 22% of fruit farms and 15% of vegetable farms use untreated manure. However, among the growing sector of organic producers, who cannot use synthetic fertilizers according to the USDA standards, the practice of using manure as a fertilizer is more widespread. Very few studies have been performed to address the microbiological safety of organic foods, and more comprehensive studies are needed to fully evaluate this concern.
The potential for pathogenic contamination of produce from the use of manure depends on numerous factors. As one looks at the various mechanisms for contamination, one should remember that the potential for a hazard results from a number of possible events and their probabilities of occurrence, for example, contamination, survival, and persistence during processing. Ultimately, pathogens must be present and consumed in sufficient quantities to become a public health hazard. The panel identified four main factors that are described in more detail below:
- Pathogen populations in the animal feces
- The treatment/storage/processing method of manure
- The biological activity and structural stability of the soil to which manure is applied
- Relative timing and location of manure in the crop rotation
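The multiplicative view of hazard described above, in which several events must all occur, can be sketched numerically. All probabilities below are hypothetical placeholders for illustration only, not measured values:

```python
# Illustrative sketch: the probability that a serving of produce carries a
# hazard is, at its simplest, the product of the probabilities of each
# necessary event in the chain. The numbers are assumptions, not data.
p_contamination = 0.01   # pathogen transferred to the crop (assumed)
p_survival      = 0.10   # pathogen survives until harvest (assumed)
p_persistence   = 0.20   # pathogen persists through processing/handling (assumed)

p_hazard = p_contamination * p_survival * p_persistence
print(f"Probability of a hazardous serving: {p_hazard:.6f}")  # 0.000200
```

Because the events are multiplicative, reducing the probability of any single link in the chain reduces the overall hazard proportionally, which is why control at several points is emphasized throughout this chapter.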
Some other factors worth mentioning are the existence of a possible pathway for transport of the pathogens onto or into the fruit or vegetable; cultural methods (for example, full-bed plastic mulch or type of tillage); the type of crop; and the method of manure application. Theoretically, three routes exist for a pathogen to contaminate a plant: contamination of the surface of the edible portion of the plant, transfer to the plant tissue through an injury, or transport through the root system. The probability of occurrence of any of these routes is not known.
2.1.2.1. Pathogen populations in animal feces
Pathogen populations in animal manure may be related to the state of animal health or to normal elimination in asymptomatic animals (Burton 1996; Hancock and others 1997; Shere and others 1998; Atwill and others forthcoming). Stressed animals, as in calving operations, have been shown to be more prone to disease and to excrete increased pathogen population densities (Williams and Newell 1970). The prevalence of E. coli O157:H7 and Salmonella spp. in manure also varies with the source animal. Escherichia coli O157:H7 colonizes cattle and other ruminants but generally not poultry. The prevalence of pathogen shedding by cattle varies among different studies. Cassin and others (1998) projected that the number of E. coli O157:H7-shedding animals varies from 0.3% to 0.8%, but may be considerably higher in a population consisting exclusively of young or stressed animals. In addition, survey results may be strongly influenced by regional and seasonal variation. For example, in the Midwest and East, summer samplings are more likely to result in positive detection, though still low, while similar studies conducted at the same time in California have yielded predominantly negative results (J. Cullor; unpublished data; unreferenced). The numbers shed by colonized animals reportedly vary from 3 to 50,000 colony-forming units (CFU) per gram of manure (Zhao and others 1995). A recent USDA survey found that, even in healthy dairy cattle, 37.7% of individual fecal samples were positive for Campylobacter jejuni. Factors associated with persistence included application of manure with broadcast spreading; feeding of whole cottonseed, hulls, or alfalfa; and accessibility of feed to birds. Arcobacter sp., possibly an emergent pathogen, was also present in 14.3% of individual fecal samples of healthy cattle (Wesley and others 2000).
Regardless of the high variation in shedding reported by farm surveys, there is no doubt that on-farm food safety would benefit from programs that identify animal production practices that minimize pathogens in the manure management system (EPA 2000).
Escherichia coli prevalence
More current data on shedding of Shiga toxin-producing E. coli O157 by cattle indicate that the prevalence may be higher than suspected, partly due to improved methods of detection (Gansheroff and O'Brien 2000). For example, from a total of 365 fecal samples of cattle on an intensive-management beef cattle farm in the Czech Republic, E. coli O157 was found in 20% of samples using selective enrichment followed by immunomagnetic separation (Cizek and others 1999). A Dutch study reported that 7 out of 10 cattle farms tested positive for verocytotoxin-producing E. coli O157, with the prevalence ranging from 0.8-22.4% (Heuvelink and others 1998). A 15-mo follow-up study showed that farms that were negative at the first visit could later become contaminated with the pathogen. Characterization methods implied that there was more than one source of verocytotoxin-producing E. coli O157 on the farms. The study also demonstrates that testing during a single visit cannot establish the absence of verocytotoxin-producing E. coli O157 on a farm. Age of cattle seems to be a factor in pathogenic E. coli shedding, in that younger cattle and calves at weaning appear to have a higher fecal prevalence than older cattle (Laegrid and others 1999; Heuvelink and others 1998). A recent USDA survey to estimate the frequency of enterohemorrhagic E. coli O157:H7 in feces at slaughterhouses showed that 72% of 29 lots had at least one positive fecal sample (Elder and others 2000). Among individual cattle the prevalence was 28% overall. High prevalence was also reported by a Canadian study of cattle fecal samples at the point of processing (van Donkersgoed and others 1999).
Salmonella spp. prevalence
Salmonella spp. may be detected in both cattle and poultry manure. The prevalence among dairy herds may range from 57 to 84% (Smith and others 1993; Atwill and others forthcoming and references therein). Shedding is predominantly discontinuous, even in herds with a high positive recovery frequency. Less than 4% of healthy cows that were determined to be asymptomatic carriers of Salmonella Dublin were found to be shedding at any sampling date (Smith and others 1993). A single assessment of feedlot cattle (Committee on Salmonella 1995) indicated that Salmonella could be recovered from 38% of a composite profile of feedlots, but the percentage of individual positive samples would be 5% or less, depending on the length of time on supplemental feed. Despite the high prevalence, there is insufficient information to predict the residual viable populations of Salmonella spp. shed (CFU/g) in manure. Estimated numbers in manure slurry from colonized livestock herds are reported to be from less than 2 to 5,000 CFU/g wet weight (Atwill and others forthcoming). By extension, a crude estimate of 20 to 50,000 CFU of Salmonella per gram of manure may represent a typical initial load before aging or composting. The frequency of shedding and numbers of Salmonella per gram of chicken manure are not readily available. In a California survey, Riemann and others (1998) found that chicken manure piles in 68% of layer houses were positive for Salmonella, with a broad range of detection (25 to 100% of replicate samples). The estimated numbers of Salmonella per gram of feces from layers reported in this study range from 0.68 to more than 340. Another survey, in the Netherlands and Belgium, investigating the microbiological contamination level of raw sludge at pig and chicken slaughterhouses revealed that Salmonella spp. were present in all samples.
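The "by extension" step above, scaling the reported slurry range to an estimate for undiluted manure, can be reproduced with simple arithmetic. The 10-fold manure-to-slurry dilution factor is an assumption introduced here to connect the two quoted ranges:

```python
# Back-of-the-envelope sketch of the crude estimate in the text: if slurry
# dilutes manure roughly 10-fold (an assumption for illustration), multiplying
# the reported slurry counts by 10 recovers the estimate for manure itself.
slurry_cfu_per_g = (2, 5_000)          # reported range, CFU/g wet weight
dilution_factor = 10                   # assumed manure-to-slurry dilution

manure_cfu_per_g = tuple(x * dilution_factor for x in slurry_cfu_per_g)
print(manure_cfu_per_g)                # (20, 50000)
```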
Prevalence of other human pathogens
Other human pathogens were also detected in high numbers, such as pathogenic Yersinia enterocolitica (detected in 7 out of 13 slaughterhouses) and Campylobacter jejuni/coli (2.8-7.3 log/g in 10 out of 14 slaughterhouses) (Fransen and others 1996). Gregory and others (1997) also reported a high prevalence of Campylobacter spp. in broiler cecal droppings (100% of 20 samples were positive), even for newly constructed houses. Similar data were presented by Stern and others (1995), where average levels of Campylobacter spp. were 5.44 log/g of cecal material in 9 out of 10 broiler farms and increased significantly after transportation.
Farm practices and Escherichia coli
In an effort to minimize the level of pathogenic organisms at the source, research is being increasingly directed to the identification of specific farm management practices that may be linked to the incidence of pathogens in animal manure or in the farm environment. For example, the prevalence of E. coli O157:H7 was investigated on 91 dairy operations in another USDA attempt to identify management practices associated with pathogen prevalence on farms. In 24.2% of the operations, 1.2% of samples were positive for verotoxin-producing E. coli O157:H7. Herds on farms that did not flush manure with water had significantly fewer positive samples for verotoxin-producing E. coli O157:H7 (Garber and others 1999). Other factors, such as chlorination of cows' drinking water and feeding practices, seemed to have an effect but were not statistically significant. A higher number of positive fecal samples for the pathogen was also associated with the summer months.
Several recent studies point out that, among farm practices, animal diet may influence pathogen shedding (Diez-Gonzalez and others 1998; Hovde and others 1999; Herriot 1998; Dargatz and others 1997; Buchko and others 2000). A grain diet may induce changes in the cow's digestive system that promote the survival of acid-resistant E. coli (Diez-Gonzalez and others 1998; Hovde and others 1999). Grain-fed cattle shed 1000-fold more acid-tolerant E. coli than hay-fed cattle and had a lower colonic pH. Similarly, from a total of 36 samples, herds fed corn silage had higher numbers of E. coli O157:H7 than those that were not. Other associated factors included the weaning method, protein level of calf starter, and feeding of ionophores (that is, lasalocid and monensin, feed supplements introduced in the 1970s), grain screenings, or animal by-products (Herriot 1998). Other studies also suggest the influence of animal diet on pathogen incidence (Dargatz and others 1997). Minimizing environmental dissemination of E. coli O157:H7 in conjunction with diet modification may reduce numbers of E. coli O157:H7-positive cattle, according to an inoculation study (feed was inoculated with 10^10 CFU of E. coli O157:H7) that found that corn-fed or cottonseed/barley-fed steers were less likely to be positive for E. coli O157:H7 than steers fed only barley. As in the study by Diez-Gonzalez and others (1998), a lower pH of the feces in the corn-fed animals was suggested to contribute to this difference (Buchko and others 2000).
Other studies suggested an association between verotoxin-producing E. coli strains and management practices other than feed (that is, a larger herd size, open-pile manure storage, and occasional use of equipment to handle both manure and feed) (Bormaneby and others 1993). In a Washington state study of 60 dairy cattle herds and 25 beef cattle herds, E. coli O157:H7 was found in 0.28, 0.71, and 0.33% of the fecal samples from dairy, pasture beef, and feedlot beef cattle, respectively. As many as 16% of the beef cattle herds and 8.3% of the dairy cattle herds were infected. In this case, management practices also seemed to be able to reduce human exposure to E. coli O157:H7 (Hancock and others 1994). However, in a larger study designed, with herds selected, to test this hypothesis (1.41% positive individual fecal samples out of 12,664 samples over 6 mo, and 75% positive herds out of 36), neither the application of manure to forage crops nor housing in dry lots (as opposed to grazing in pasture) was associated with the prevalence of E. coli O157:H7 (Hancock and others 1997).
Farm practices and other human pathogens
The influence of management practices on the prevalence of other pathogens in feces has also been investigated. For instance, in one study the probability of a pigeon fecal sample being positive for C. jejuni was decreased by using dry manure in nesting, cleaning shipping crates, and decreasing the frequency of chemical disinfection of water (Jeffrey and others 2001). Presumably, the use of manure as part of the nesting material protects the pigeons against C. jejuni infection. This would agree with the competitive exclusion principle demonstrated in poultry. The authors recognized that critical control points for food safety pathogens may vary widely, and the formulation of effective programs depends on science-based knowledge of diverse animal production systems. Likewise, as part of a large United Kingdom project, another survey indicated that the excretion of L. monocytogenes (including virulent strains) by farm animals (pig, poultry, sheep, and cattle) was associated with long-distance transportation of animals and their diet (Fenlon and others 1996). The study showed that animals fed on silage, commonly contaminated with the pathogen, shed L. monocytogenes, whereas in animals fed on hay or manufactured diets L. monocytogenes was not detected.
Eleven dairy farms from the northeastern region of the United States were sampled for Cryptosporidium in farm and stream water and feces over a 6-mo period (Sischo and others 2000). Ninety-one percent of the farms had Cryptosporidium on their premises. The single risk factor for detecting Cryptosporidium in surface water was increasing frequency of spreading manure in fields. Shedding of cryptosporidia was associated with young calves (15% of calves 0-3 wk of age) and with frequent contact and change of bedding.
Based on these studies of farm management practices, current knowledge does not allow a clear association to be drawn between particular farm practices and pathogen shedding. The results are difficult to interpret, mainly because of the variability in practices and interactions with other factors that might have been overlooked. Despite the variability, certain practices need to be investigated further. These include the influence of diet and the management and use of manure. With the current trend to minimize the incidence of pathogens at the point of origin, future research should focus on farm practices.
2.1.2.2. Manure storage and processing methods
Proper storage and management of manure include anaerobic digestion, aeration of sludge, and composting. These practices greatly reduce residual pathogen populations in contaminated manure. Proper manure management, often called aging, is essentially a heat-pasteurization process, generally targeted to maintain temperatures between 60 and 65 °C (140-149 °F) (Kudva and others 1998; Berg and Berman 1980; Burton 1996). There are, however, insufficient data from controlled studies in which the fate of foodborne pathogens has been determined. Although it was long thought that pathogen persistence did not usually become an issue of risk if animal bedding was ultimately mixed with or used for soil amendments (Strauch 1977), the current situation may require different manure management methods. The increase in biosolids production and waste disposal issues, as well as the emergence of new pathogens of concern, call for updated, scientifically validated procedures.
With minimal supporting data, EPA stated that this pathogen reduction step is best accomplished by composting, but that aging (stacking) for at least 3 d at 131-149 °F (55-65 °C) is effective, if done thoroughly (EPA 1993). Composting, as compared to aging, is a more directed aerobic fermentation which involves the same target degree of temperature increase, but includes the provisions that all material must be turned to maintain aeration, that moisture be added, and that the process be allowed to reach a peak microbial composition over a period of at least 3 mo. Scientific validation studies of pathogen reduction in manure are an imminent need, and consequently this is an active area of research.
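The time-temperature criterion quoted above (at least 3 d at 55-65 °C) can be expressed as a simple check of a pile's temperature log. The function below is an illustrative sketch only, not a regulatory tool; daily sampling and the exact temperature band are assumptions based on the text:

```python
# Minimal sketch of a record check against the aging guideline quoted in the
# text: at least 3 consecutive days within the 55-65 degC target band.
def meets_aging_guideline(daily_temps_c, min_temp=55.0, max_temp=65.0, min_days=3):
    """Return True if the pile log shows >= min_days consecutive daily
    readings within the target temperature band."""
    run = 0
    for t in daily_temps_c:
        run = run + 1 if min_temp <= t <= max_temp else 0
        if run >= min_days:
            return True
    return False

print(meets_aging_guideline([48, 57, 61, 63, 52]))  # True: 3 consecutive days in band
print(meets_aging_guideline([48, 57, 70, 63, 52]))  # False: the run is broken
```

Requiring the days to be consecutive reflects the thoroughness caveat in the EPA statement; an interrupted heating period would restart the count.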
Previous scientific studies that assess population fates and kinetics in pathogen-bearing manure are sparse. Porter and others (1997) detected E. coli O157:H7 only in manure with a high moisture content. Salmonella cultures dried on polyester sheets and buried in cattle manure survived at 5-30 °C (41-86 °F) for up to 105 d (Plymm-Forshell and Ekesbo 1993); at 1-15 °C (33.8-59 °F), detection was positive for up to 210 d. Simulating the temperature increase during aging, Salmonella survival on the inoculated sheets was detectable for at least 8 d at 50-62 °C (122-143.6 °F). The potential for reduced recovery of stressed cells within the recovery protocols was not specifically addressed. Plymm-Forshell and Ekesbo (1996) determined that Salmonella Dublin inoculated into dried fecal material on a stall surface survived for 68 mo at ambient temperatures. When applied as slurry, populations were reduced by 3.0 log, but not eliminated, following approximately 4 d of rapid drying conditions. In contrast, E. coli O157:H7 could not be recovered from the dry outer surface layer of pathogen-containing ovine and cattle manure piles (Kudva and others 1998). This study was done to determine the persistence of E. coli O157:H7 in manure under various experimental and environmental conditions. When ovine and bovine manure were inoculated and periodically aerated, E. coli O157:H7 remained detectable for 4 mo and 47 d, respectively. The pathogen survived best without aeration at 23 °C (73.4 °F). In frozen bovine manure, or at 4 or 10 °C (39.2 or 50 °F) in ovine manure, the microorganism survived for 100 d. Under other conditions (23, 37, and 70 °C; 73.4, 98.6, and 158 °F), the microorganism survived between 1 and 47 d (Kudva and others 1998). Another study reported that over a 12-wk period at 10 °C (50 °F), a 3.5- and 5.5-log reduction was observed in the E. coli O157:H7 population inoculated into slurry from cattle fed different diets. This persistence of E. coli O157:H7 indicates that the pathogen has the potential to be transmitted to the environment (McGee and others 2001). Such long-term survival ability emphasizes the need to include treatment of manure as a management practice to eliminate this pathogen as a primary source of food and water contamination and to minimize human health risks. The efficacy of the process, though, can be improved with different experimental protocols. For instance, composting under intermittent aeration with a blower control system and at least 60 °C (140 °F) killed E. coli in 24 h, and the organism was not detected after the fermentation period (30 d) (Mori and Sakimoto 1999).
The survival of other human pathogens, such as Salmonella, under different conditions and experimental protocols has also been investigated. Salmonella Typhimurium was not recoverable after 44 d at 10 °C (50 °F) and after 3 d at 35 °C (95 °F) in dairy lagoon water (McCaskey and Jaleel 1975). Slurry materials commonly remain in lagoons for storage periods that exceed 5 mo. Lagoon capacity and prevention of leakage or storm-related spills and run-off may become a significant concern if the production location would allow downstream crop or groundwater contamination. Compared to dairy manure, the higher ammonia content (0.2-0.4%) and alkaline pH (8.6) found in chicken litter would be predicted to accelerate Salmonella inactivation, with a 7.0-log reduction within 11 d (Turnbull and Snoyenboss 1973). Recent studies (Himathongkham, Bahari and others 1999; Himathongkham, Nuanualsuwan and others 1999; Himathongkham and Riemann 1999; Himathongkham 2000; Himathongkham, Riemann and others 2000) confirm that after a short period of multiplication in fresh moist manure, there is an approximately logarithmic decline for both E. coli O157:H7 and Salmonella Typhimurium. The rate of inactivation and the decimal reduction time (D-value) varied with temperature and origin of manure, and differed between manure and manure slurries. The fastest reduction of inoculated E. coli and Salmonella Typhimurium in cow manure and manure slurry occurred at 37 °C (98.6 °F). The decimal reduction time ranged from 6 d to 3 wk in manure and from 2 d to 5 wk in manure slurry (Himathongkham, Bahari and others 1999). According to the authors, the D-value could be used to predict the time and temperature needed to achieve a desired reduction in pathogen level.
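The authors' suggestion that D-values can predict required treatment times reduces to simple arithmetic: under log-linear die-off, the time needed for an n-log reduction is n × D. A minimal sketch (the D-value bounds are those reported above for cow manure; the 5-log target is an illustrative choice, not a regulatory requirement):

```python
def time_for_log_reduction(d_value_days: float, target_log_reduction: float) -> float:
    """Days required for a given log10 reduction, assuming first-order
    (log-linear) die-off with decimal reduction time D."""
    return d_value_days * target_log_reduction

# Himathongkham, Bahari and others (1999): D-values in cow manure
# ranged from 6 d to 3 wk (21 d). Time for an illustrative 5-log reduction:
print(time_for_log_reduction(6, 5))   # 30.0 d at the fastest reported D
print(time_for_log_reduction(21, 5))  # 105.0 d at the slowest reported D
```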
An additional issue that may become a concern is the existence or development of microorganisms tolerant to high temperatures. The high temperatures that evolve during aging and composting may induce resistance within the microbial population, or select for resistant microorganisms. Thermotolerant (including thermophilic mutant) E. coli and Salmonella Typhimurium have been reported from composting material. Brinton and Droffner (1994) reported maximum growth temperatures for enriched variants of E. coli and Salmonella Typhimurium of 48 and 54 °C (118.4 and 129.2 °F), respectively. Survival of both inoculated strains was at least 56 d in compost maintained at 60 °C (140 °F). Thermotolerance was inducible and reversible when the strains were grown at lower temperatures. This phenomenon, however, has not been reported during commercial production of manure.
Although phenotypic thermotolerance constitutes a potential concern during manure management, it is not required for pathogens to persist in manure piles. The outer layer of a manure pile may be as much as 35 °C (63 °F) cooler than the interior and, if the pile is not turned, is never exposed to temperatures sufficient to kill pathogens (Suslow and Meyer; unpublished data; unreferenced). Unless specifically prevented by a company operational policy, manure handlers may add new manure to an existing pile, and any mixing or inversion may be limited to pile movement with a front-loading scraper. Desiccation at the outer layer will result in some inactivation. Death of E. coli O157:H7 and Salmonella Typhimurium was limited to 1 to 2 logs in slowly drying manure over 24 h (Himathongkham and Riemann 1999), whereas fast drying resulted in up to a 3-log decline in just 6 h at approximately 22 °C (71.6 °F) (Himathongkham, Bahari and others 1999). However, residual populations remain viable for a long time despite low water activities (aw). A 6-log reduction was estimated to take 3 mo at very low aw and 20 °C (68 °F). When manure piles are turned, the previous surface layer is exposed to different conditions (for example, higher moisture content) in the deeper part of the stack, and remaining viable pathogens presumably die as the temperature rises. In piled poultry manure, ammonia generation causes as rapid a reduction and elimination of pathogens in the top layer as in the deeper layers (Himathongkham and others 2000).
Certain protozoa are also important disease-causing agents in Europe and North America, although the vast majority of protozoan infections are not lethal. Potential sources of contamination are water and biosolids for land application. Conventional aerobic and anaerobic wastewater treatment in Ottawa consists of screening, primary clarification, aeration, secondary clarification, and anaerobic digestion. One study compared the extent of Cryptosporidium and Giardia reduction to that of other pathogenic and indicator microorganisms during conventional treatment. During aerobic treatment, Cryptosporidium oocysts and Giardia cysts were reduced by 2.96 and 1.40 log (from initial populations of 3.68 and 3.92 log), respectively. After further anaerobic treatment, there was no significant reduction for any protozoan organism tested, while a 1-2 log reduction of fecal coliforms was observed (Chauret and others 1999). Another study showed that no Cryptosporidium muris oocysts were detectable after 44 d of fermentation of bovine feces and rice hulls at 73 °C (163.4 °F), from an initial load of 2.26 × 10^5 oocysts/g of compost (Furuya and others 1999).
Recently, new experimental data (Table II-1; Cliver and others 2001) were combined with previously published information to assess the risk to consumers of growing lettuce with manure as a soil amendment. Application of raw bovine or poultry manure to a lettuce field shortly before harvest presents a considerable risk to the consumer. Poultry manure stored for 2 mo at 20 °C (68 °F) and applied to the field 2 mo before harvest represents negligible risk with respect to Salmonella or E. coli O157:H7. In contrast, cattle manure used the same way appears to represent an unacceptable risk.
Table II-1. Reported pathogen survival times in manure, slurries, and soil.

| Matrix | Temperature (°C) | Survival (d) | Reference |
|---|---|---|---|
| Cattle manure | 5 | 70 | Wang and others 1996 |
| Cattle manure | 4 | 100 | Himathongkham, Nuanualsuwan and others 1999 |
| Cattle manure | 4 | 20 | Kudva and others 1998 |
| Cattle manure | 20 | 57 | Himathongkham, Nuanualsuwan and others 1999 |
| Cattle manure | 22 | 56 | Wang and others 1996 |
| Cattle manure | 23 | 40 | Kudva and others 1998 |
| Cattle manure | 37 | 49 | Wang and others 1996 |
| Cattle manure | 37 | 8 | Kudva and others 1998 |
| Cattle manure | 37 | 36 | Himathongkham, Nuanualsuwan and others 1999 |
| Poultry manure | 4 | 70 | Himathongkham and others 2000 |
| Poultry manure | 37 | 8 | Williams and Benson 1978 |
| Cattle manure slurry | 4 | 25 | Kudva and others 1998 |
| Cattle manure slurry | 20 | 20 | Porter and others 1997 |
| Cattle manure slurry | 20 | 5 | Plymm-Forshell and Ekesbo 1996 |
| Cattle manure slurry | 23 | 5 | Kudva and others 1998 |
| Slurry, fresh manure | 4 | 92 | Himathongkham, Nuanualsuwan and others 1999 |
| Slurry, old manure | 4 | 261 | |
| Slurry, poultry manure | 4 | 223 | Himathongkham and others 2000 |
| Soil | 20 | 36 | Rickle and others 1995 |
| Soil | 20 | 28 | Zhai and others 1995 |
| Soil | 20 | 94 | Himathongkham and others 2000 |
| Soil + manure | 20 | 66 | |
| Soil + straw | 4 | 69 | |
Reproduced from Cliver and others (2001) by permission of Dean O. Cliver.
The difference between poultry and cattle manure is mainly due to a more rapid die-off in poultry manure because it accumulates ammonia. Manure mixed with bedding, resulting in self-generated heat of 60 - 70 °C (140-158 °F), is assumed to present little risk. To estimate overall risk to consumers of raw produce, the data and models must be carefully scrutinized; data describing present manure handling practices are also needed.
This section clearly demonstrates the importance of developing manure treatment protocols that efficiently reduce the pathogenic population to a level that minimizes the risk of fresh produce-derived illness. A number of technical difficulties still need to be resolved before a manure treatment protocol can be suggested. Those include sampling protocols, aeration and turnover methods, and addition of fresh manure. Furthermore, because several other factors contribute to the further reduction or growth of the pathogens, the desired level of reduction is still a matter of discussion. When assessing the risk of contamination from pathogenic organisms in manure, one needs to consider a number of additional environmental parameters. A recently published book chapter (Suslow 2001) discusses the further reduction of residual pathogen populations following soil incorporation by desiccation on the plant surfaces, by ultraviolet (UV) exposure, or other environmental stresses. The reader is also referred to Chapter IV on pathogen growth and survival on produce.
The current regulations found in EPA 40 CFR Part 503 for use or disposal of domestic sewage sludge are derived from regression analyses of compost research and thermal inactivation analyses in model matrices (Farrell and others 1990, 1996). Beyond limitations in the ability to predict the environmental fate of key pathogens in the diverse physical and chemical environments of soil, varied climates, specific seasonal weather events, and soil management practices, pathogen elimination is predominantly an issue of uniformity and consistency of process controls (CCREF 2001). EPA 40 CFR Part 503 states that Class A sewage sludge can be used without restrictions and must contain <1,000 Most Probable Number (MPN) fecal coliforms/g total solids or <3 Salmonella sp. MPN/4 g total solids. In addition, the temperature of the sewage sludge must be maintained at a specific value for a minimum period of time, depending on the percentage of solids and the treatment method (usually 50-55 °C [122-131 °F] or greater for 4 h to 15 d). Different methods of pathogen reduction are described, such as aerobic and anaerobic digestion, air drying, and composting.
If sewage sludge does not meet the standards above, it may be classified as Class B. Class B sewage sludge must contain <2,000,000 MPN fecal coliforms/g total solids (Class B pathogen limit, Part 503 rule). It should be noted that in this regulation the fecal coliform and Salmonella standards are used as indicators of the process, not of the presence of pathogens. Strong statistical evidence, however, suggests that pathogens may be present when indicator microbes are present at this level (Farrell and others 1990). Therefore, this material cannot be used or sold for use on vegetable crops, or distributed to the general public, without management requirements and site restrictions. Class B product can be used on crops that will be consumed by humans or animals; however, there are requirements for waiting periods between the time of application and crop harvest and for restricting public access. The amount of waiting time required depends on various factors (for example, the proximity of the edible part of the plant to the soil), details of which are readily available in the Part 503 rule. States may have more restrictive and independent rules for biosolids use and reporting (see Appendix A). As a precaution, produce buyers are using market pressure to preclude growers from producing fruits and vegetables on ground with a prior history of biosolids application.
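The microbiological limits quoted above can be expressed as a simple classification check. This sketch captures only the fecal coliform and Salmonella limits named in the text; the actual Part 503 rule also imposes treatment-process and time-temperature requirements that are not modeled here:

```python
from typing import Optional

def sludge_class(fecal_coliforms_mpn_per_g: float,
                 salmonella_mpn_per_4g: Optional[float] = None) -> str:
    """Classify sewage sludge by the Part 503 microbiological limits
    quoted in the text (simplified; process requirements omitted)."""
    # Class A: <1,000 MPN fecal coliforms/g total solids,
    # or <3 Salmonella sp. MPN/4 g total solids.
    if fecal_coliforms_mpn_per_g < 1_000 or (
            salmonella_mpn_per_4g is not None and salmonella_mpn_per_4g < 3):
        return "Class A"
    # Class B: <2,000,000 MPN fecal coliforms/g total solids.
    if fecal_coliforms_mpn_per_g < 2_000_000:
        return "Class B"
    return "exceeds Class B limit"

print(sludge_class(500))        # Class A
print(sludge_class(50_000))     # Class B
print(sludge_class(5_000_000))  # exceeds Class B limit
```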
Manure, however, is exempt from these regulations. There are no federal or state rules regarding pathogen levels in aged manure used for land application. Although no such specific rules (federal or state) currently apply to aged or stacked manure use and distribution, the time-temperature criteria for pathogen reduction and elimination by composting are being broadly used. Also, certified organic growers must follow certain standards to satisfy the criteria for certification. For example, under the new USDA organic certification program, raw animal manure must either be composted, applied to land used for a crop not intended for human consumption, or incorporated into the soil at least 90 d before harvesting an edible product that does not come into contact with the soil and at least 120 d before harvesting an edible product that does come into contact with the soil. Composted plant or animal materials must be produced through a process that achieves a temperature between 131 °F (55 °C) and 170 °F (76 °C) for 3-15 d, depending on the composting system.
As previously mentioned, sublethal exposure due to inadequate time-temperature management in aged manure composting may yield soil amendments that have pathogen numbers similar to Class B biosolids compost. A high degree of uncertainty remains about the efficacy of the treatment and the usefulness of indicators as presumptive evidence of the absence of pathogens. Beyond this uncertainty, additional research on persistence in soil and on plant surfaces is needed to support science-based policy decisions on restrictive limits.
2.1.2.2. Current situation
In practice, detectable populations of nonpathogenic E. coli, which serve as indicators of survival potential, are commonly found in stored stacked manure piles and field-side piles prior to spreading. Populations of E. coli in aged piles or field-side windrows of dairy manure are reported to range from undetectable (less than 100 CFU/g by most enrichment-based methods) to greater than 1,000,000 viable bacteria per gram. Viable E. coli and Salmonella have been detected in manure piles over a broad range of collection-point temperatures, including sub-surface samples measured at 52 °C (125.6 °F) (Suslow, Meyer, and Cliver; unpublished data; unreferenced). Temperatures below the surface of manure piles, routinely taken at a depth of 1 m depending on the size of the pile, exceeded 65 °C (149 °F), while a layer just under the surface may be below 35 °C (95 °F). In addition to temperature fluctuations, other factors such as water activity, pH, ammonia concentration, and microbial activity affect the rate of loss of pathogen viability in stacked piles. Overwintering manure piles at the side of fields may harbor high populations of E. coli, although surveys for the presence of key pathogens have not been publicly reported. Studies with E. coli O157:H7 and Salmonella Typhimurium predict a survival period exceeding 100 d from a starting population of one million cells in both chicken and dairy manure (Himathongkham, Bahari and others 1999; Himathongkham, Nuanualsuwan and others 1999).
Aerobic composting is preferred for manure intended for fruit and vegetable production because it results in a stabilization of nutrients. It is important for the added manure to have nutrient release characteristics that meet the fertility management plan and projected sufficiency demand of crops throughout the season (Smith and others 1998; Lubke 1995; Nelson and Uhland 1955). For this reason, manure may be applied after short-term storage, just long enough to be manageable with a conventional spreading system, but without more intensively managed compost processing. In some regions of the United States, manure may be applied to land as slurry, often untreated or minimally treated, to avoid the construction costs and the nuisance-reduction and environmental-protection management requirements of large storage facilities and pits. Typically this slurry would not be applied to fruit and vegetable production ground (certainly not in major production areas), but it may be a source of contamination by run-off (see section 2.2.).
2.1.2.3. Biological and physical buffers
The survival of residual pathogenic bacteria from manure in the farm soil environment is thought to be largely an outcome of competition from the existing soil microflora (Lynch and Poole 1979; Killham 1995; Tate 1987). It is well established that populations of bacterial inocula introduced into soil are rapidly reduced by competition from the endemic microflora. Recent research with better methods of recovery, however, suggests that pathogens adapted to the gastrointestinal environment have an uncertain period of persistence in soils, depending on several factors. This is a primary area of focus in current food safety research, and related findings are just starting to become available (Atwill and others forthcoming). Soils with relatively low microbial activity are believed to allow the extended persistence of pathogens. Therefore, the application of large quantities of manure to soils with low existing microbial activity is thought to increase the ability of pathogens to persist in the soil environment. The soil type and matric potential (soil moisture level) also influence the survival of introduced microorganisms (Henschke and others 1991; Drahos and others 1992; Meikle and others 1995). It has been shown that well-aggregated soils with a high organic content support high soil microbial activity and generally poor persistence of introduced microorganisms (Killham 1995). Few direct quantitative data are available, but risk assessment studies and persistence data generated during the early years of concern regarding the release of recombinant soil bacteria and recombinant microbial pesticides for agriculture strongly support a low probability of persistence of enteric pathogens in soil.
In a study of potential indicator microbes, fecal coliforms, as a group, showed a biphasic logarithmic death curve in soil amended with poultry manure at a moisture content of 15% (Zhao and others 1995). The initial decimal reduction time (D-value) was approximately 4-5 d, followed by a D-value of approximately 15-20 d for the residual population. Rickle and others (1995) found that zeolite and similar materials that absorb water, ammonia, and other compounds had little impact on the survival of Salmonella Typhimurium in soil. Survivor curves over 29 d were approximately exponential, with a D-value of 2.3-3.6 d at 35% moisture. With increasing moisture, the D-value increased to 12 d. In a study by Zibilske and Weaver (1978), Salmonella Typhimurium was not recoverable after one week in dry soil at 39 °C (102.2 °F). At the time of manure incorporation, however, such high soil surface temperatures are limited to certain regions of the United States. Thus, the model data are not generally instructive in crop management decisions relative to food safety. Moreover, manure incorporation into pre-irrigated soil would be unlikely at these temperatures except under atypical conditions or practices. In model studies at more common soil temperatures of 5-22 °C (41-71.6 °F), soil survival for more than 50 d is widely reported. Himathongkham (2000) observed a D-value for E. coli O157:H7 and Salmonella Typhimurium of 14 d in clay soil (pH 8.9, moisture 22%) at 20 °C (68 °F). Mixing manure into the soil (1:5) did not change the D-value. Controlled studies that address the impact of soil matric potential cycling (wet-dry cycles) and subsequent field preparation activities (that is, discing, bed-shaping, pre-plant irrigation) on survival are not available.
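The biphasic death curve reported by Zhao and others (1995) can be expressed as two consecutive log-linear phases. In this sketch the D-values are the ranges reported above, but the breakpoint time and starting population are hypothetical illustration values, not figures from the study:

```python
def biphasic_log_survivors(t_days: float, log_n0: float,
                           d1: float, d2: float, t_break: float) -> float:
    """Log10 survivors under a biphasic log-linear death curve:
    decimal reduction time d1 up to t_break, then d2 afterwards."""
    if t_days <= t_break:
        return log_n0 - t_days / d1
    return log_n0 - t_break / d1 - (t_days - t_break) / d2

# D-values from Zhao and others (1995): ~4-5 d initially, ~15-20 d later.
# The 6-log starting population and 10-d breakpoint are hypothetical.
print(biphasic_log_survivors(10, 6, 5, 20, 10))  # 4.0 after the fast phase
print(biphasic_log_survivors(50, 6, 5, 20, 10))  # 2.0 after 40 d more
```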
When the persistence of E. coli O157:H7 in river water, cattle feces, and soil cores was investigated with model systems, survival was greatest in soil cores with rooted grass, decreasing only from 10^8 to 10^7-10^6/g soil after 130 d at 18 °C (64.4 °F). The organism also survived in feces for more than 50 d. In cattle slurry and river water, no E. coli O157:H7 was detected after 10 and 27 d, respectively (Maule 2000).
The incidence and survival of L. monocytogenes have also been the focus of a few research studies. As part of a Listeria spp. survey in the United Kingdom, it was reported that 93.9% of 115 sewage samples were positive for Listeria, 20% of which were identified as L. monocytogenes. In garden soil, only 0.7% of the samples contained L. monocytogenes (MacGowan and others 1994). In an effort to learn more about the environmental conditions or agricultural practices leading to increased produce contamination with L. monocytogenes, a laboratory experiment with different soil types, inoculation levels, and fertilizer sources was conducted. Clay loam soil, sandy loam soil, and soil amended with chicken manure resulted in higher survival of L. monocytogenes than sandy soil or soil fertilized with liquid hog manure or an inorganic fertilizer. Listeria monocytogenes levels declined slightly in clay soil and tended to increase in sandy loam soil, a discrepancy with other research showing that L. monocytogenes declined to undetectable levels within 2 mo in sandy loam soil (Van Renterghem and others 1991) or 6 mo in clay soil (Welshimer 1960). Such discrepancies could be due to differing soil moisture levels. The authors, however, concluded that there were no significant differences in L. monocytogenes populations among the tested soil types, but that field studies were needed to confirm these results (Dowe and others 1997). The organism has been found in soil at frequencies varying from 9 to 14% (Weis and Seeliger 1975). Using a variety of enrichment techniques, L. monocytogenes was detected in 16% and 20% of pig and cattle feces, respectively, but was not detected in stored liquid manure or manured soil samples. In fresh feces samples, L. monocytogenes died off after 3 wk, and after 2 mo of storage when manure or soil was inoculated.
As mentioned previously, protozoa are also pathogens of concern, and their survival in soil has been studied. The persistence of Giardia cysts and Cryptosporidium oocysts in water, cattle feces, and soil was investigated at -4, 4, and 25 °C (24.8, 39.2, and 77 °F). One week of freezing and 2 wk at 25 °C (77 °F) eliminated the infectivity of Giardia cysts. At 4 °C (39.2 °F), infectivity persisted longer in water (11 wk), soil (7 wk), and feces (11 wk). Cryptosporidium oocysts were more resistant, surviving in feces for up to 12 wk at 4 °C (39.2 °F). The results suggest that to minimize health risks from Cryptosporidium, contaminated feces should be spread during warmer weather and not earlier than 12 wk after storage; otherwise, an effective manure treatment needs to be performed (Olson and others 1999). A USDA study on Cryptosporidium parvum showed that vertical recovery of oocysts decreased rapidly in loam and sandy soils. Data from packed soil cores indicated that decomposition depends on the interactive effect of manure, soil structure, water flux, and time (USDA 1999). The number of oocysts in the leachate decreased exponentially on consecutive days after application. In general, oocysts do not appear to be readily transported through tilled soils. Another study monitored the potential for transfer of Cryptosporidium parvum through soil to land drains and water courses after the application of livestock waste to land, using simulated rainfall and intact soil cores. The authors reported that an initial load of 10^8 oocysts/core was reduced to undetectable or low numbers in leachate after 21 d, depending on the soil type (Mawdsley and others 1996).
2.1.2.4. Timing and location factors
All current guidance within organic and conventional agriculture strongly states that raw manure should not be applied directly to a field or immediately before harvesting an adjacent existing crop. Doing so may result in contact or indirect transfer to crops. This is most critical with produce typically eaten raw (for example, salad and leafy vegetables, herbs, soft fruit, and melons). Spreading inappropriately aged manure next to these crops should also be avoided, due to the potential for transmission of pathogens as dust aerosols. Specific distance limits that would ensure the safety of the produce have not been scientifically validated.
Predictive information on the persistence of key pathogens in aged, stacked dairy and cattle manure, a common handling method, is generally lacking. Limited time-temperature studies of Salmonella and E. coli survival in stacked piles support the effectiveness of current managed composting practices. In natural or artificially inoculated manure exposed to a temperature range of 45 to 50 °C (113-122 °F), detectable populations of these pathogens are eliminated in less than 3 wk. Current practices are targeted to exceed this limit.
The absolute window of time separation between the application of manure known to contain viable pathogens and safety of the harvested crop has not been sufficiently researched to account for all production, environmental, and crop-specific variables. Manure is predominantly applied to orchard floors or vegetable production ground in the fall, prior to leafing-out, bloom, seeding or transplanting. Rates and frequency of application vary widely but typically range from 4 to 6 tons/acre. Small-scale intensive, vegetable operations may apply the equivalent of as much as 12 to 14 tons/acre. Crops commonly produced in these systems, including many of the specialty greens, often have a short-rotation between seeding and harvest. In some areas, climatic conditions permit production well within the recommended temporal separation between aged manure incorporation and planting of 60 d. Some current recommendations or buyer specifications extend this period in excess of 100 d, effectively establishing this practice for growers in many marketing outlets. To the best of our knowledge, neither of these recommendations (60 or 100 d of temporal separation) has been evaluated, and currently there is no scientifically based determination of a safe temporal separation between aged manure incorporation and planting.
Although placing field-stored piles next to existing crops is not a common practice, surviving populations in such piles represent an undetermined risk. Current experimental models predict that once E. coli O157:H7 is incorporated into soil, 99% of the viable population is lost over a period of 60 to 120 d, depending on soil type, matric potential, and other factors yet to be determined (Himathongkham, Bahari and others 1999). In parallel field studies over a two-year period, generic E. coli was not detectable during the planting season following fall soil incorporation of manure (E. coli was detectable at the time of incorporation) in coastal California vegetable fields (R. Smith, K. Schulbach, and T. Suslow; unpublished data; unreferenced). Within the limits of the sampling and detection methodology, no E. coli was detectable in 20 soils or in the associated leafy vegetable crops and mesclun mix from these fields at the time of harvest. These preliminary data are consistent with the outcomes of proprietary product testing being conducted by individual shippers and packaged salad processors.
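Under the same log-linear assumption used elsewhere in this chapter (which, as the text notes, has not been validated across all production variables), the model prediction of a 99% loss in 60 to 120 d can be translated into D-values and back into expected reductions for the 60- and 100-d waiting periods discussed earlier. The figures below are illustrative arithmetic, not measured values:

```python
def implied_d_value(log_reduction: float, days: float) -> float:
    """D-value implied by an observed log reduction over a time span,
    assuming first-order (log-linear) die-off."""
    return days / log_reduction

def log_reduction_after(days: float, d_value: float) -> float:
    """Log10 reduction expected after a waiting period of `days`."""
    return days / d_value

# A 99% (2-log) loss over 60-120 d implies D-values of 30-60 d:
d_fast = implied_d_value(2, 60)    # 30.0 d
d_slow = implied_d_value(2, 120)   # 60.0 d
# Expected reductions for the waiting periods discussed above:
print(log_reduction_after(60, d_slow))   # 1.0 log (worst case, 60-d wait)
print(log_reduction_after(100, d_fast))  # ~3.3 log (best case, 100-d wait)
```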
A handful of laboratory studies have addressed fertilization with manure as a source of L. monocytogenes contamination of produce. In a laboratory experiment performed in Iraq, when sewage cake contaminated with L. monocytogenes (3-15 cells/g) was added to soil, 10% of the alfalfa crop was positive for the pathogen, although levels were low (<5 cells/g) (Al-Ghazali and Al-Azawi 1990). Similarly, some of the parsley samples growing in pots with the same fertilizer were positive for the pathogen 3 wk after fertilizer application. These researchers concluded that L. monocytogenes seems incapable of surviving for long periods in soil or liquid manure, which therefore cannot be considered reservoirs. A more likely reservoir is the plant-soil rhizosphere, since 50% of the radishes contained L. monocytogenes, but only 17% of the soil samples in which radishes had been growing contained the pathogen. Listeria monocytogenes was detected in radishes sown in inoculated soil (50% of samples) but was not found in carrots (Van Renterghem and others 1991). The potential for growth, however, exists when produce is subjected to refrigeration temperatures.
In conclusion, detailed, systematic, and large-scale testing of environmental fates of pathogens incorporated into soil and onto plant surfaces, within a controlled research facility, would be highly desirable.
2.1.3. Indirect contamination
During production, and in some harvest and post-harvest situations, agricultural water may be contaminated by pathogen-containing manure or compost. At this time, animal waste management specialists generally recommend 200 feet of separation between untreated manure and wells, although a smaller distance may be sufficient. Between untreated manure and surface water, at least 100 feet of separation is recommended for sandy soil and 200 feet for loamy or clay soil (assuming a slope of less than 6%; the distance should be increased to 300 feet if the slope is greater than 6%).
2.1.4. Use of compost and manure-teas in organic produce
Organic producers, based on philosophical preference and conviction, or in response to an increasing market opportunity, exclude or prohibit the use of conventional crop inputs common to modern farming. Synthetic pesticides and fertilizers are not allowed in current organic certification programs. To achieve optimal quality and economic returns, organic farming systems rely upon crop rotations, crop residues, animal manures, legumes, green manures, off-farm organic wastes, mechanical cultivation, mineral-bearing rock powders, and biological pest control (UC 2000a,b,c,d,e,f). These components maintain soil productivity and tilth, supply plant nutrients, and help to control insects, weeds, and other pests.
Plant disease control is a common objective of foliar treatments. Compost teas and liquid manures have been evaluated for their efficacy in the control of foliar diseases. Liquid manures are applied to establish and support biologically diverse and metabolically dynamic processes during production and extending to long-term land stewardship. The various liquid treatments are intended to serve, primarily, as a source of soluble plant nutrients, growth stimulants, and disease suppressors. Foliar-applied biotic extracts are believed to initiate a systemic response known as induced resistance, which may act as a repellant or reduce the severity of pest and disease activities on plants (Weltzien 1990). Various manure and compost extracts, such as horse, cattle, dairy and chicken, rabbit, goat, ostrich, and others alone or in combination with straw, cull vegetables, and other plant-based materials are reported to enable biologically based control of plant pathogens through their action on the phyllosphere (generally encompassing the leaf surface and associated foliar structures) (Blakeman 1981; Andrews and Hirano 1991; Suslow 2001 and references therein). A wide range of mechanisms including induced resistance (as mentioned above), delay or abortion of spore germination, other modes of antagonism, and nutrient and niche competition with pathogens contribute to the suppressive effects reported (Tranker 1992; Cronin and others 1996; Elad and Shtienberg 1994).
In the context of addressing potential sources of fresh produce microbial contamination, the practice of applying manure slurries or teas to existing crops deserves special attention. As mentioned above, manure-enriched brews of various composition have been used by growers and home gardeners around the world for many years for fertility management and plant disease control. Domestically, the extent of use on fresh vegetables is unclear, but the practice is popular among smaller-scale organic and biointensive producers. At this time, the numerous sources of instructional information on manure tea preparation and use are essentially devoid of any precautionary statements regarding human food safety. Typical preparation calls for the use of raw or aged manure in a 55-gallon drum (1:4 manure to water). After 2 to 3 wk, a strained tea or slurry is applied to soil or sprayed on foliage. The strained solids are applied to green waste compost piles. Himathongkham, Bahari and others (1999), Himathongkham, Nuanualsuwan and others (1999), and Himathongkham and others (2000) have determined a period of at least 10 d to greater than 70 d for the destruction of E. coli O157:H7 and Salmonella Typhimurium in liquid manure slurries held at temperatures between 4 and 20 °C (39.2-68 °F). Extrapolation from recent research reports strongly suggests that mixing other organic components (generally of plant origin) into the steeping drum water may increase the survival potential of E. coli O157:H7 (Hancock and others 1997; Buchko and others 2000; McGee and others 2001). Other potential pathogens and parasites, such as C. parvum, a serious water-borne pathogen, may survive the incubation period for manure tea. As manure teas are not uncommon in some regions, particularly for herb production (including fresh-consumed herbs), a greater effort at risk assessment of this practice is well justified.
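The gap between a typical steeping period and the reported pathogen die-off times can be made concrete with a simple back-of-the-envelope comparison. The sketch below is illustrative only, not a validated risk model; it simply encodes the 2- to 3-wk steep described above and the 10 d to greater than 70 d destruction range reported by Himathongkham and others for liquid manure slurries at 4 to 20 °C. The function name and threshold logic are assumptions introduced for illustration.

```python
# Illustrative comparison (not a risk model): does a typical manure-tea
# steeping period reach the worst-case pathogen die-off time reported
# for liquid manure slurries (>= 10 d to > 70 d at 4-20 degC)?

TYPICAL_STEEP_DAYS = (14, 21)   # 2-3 wk steep described in the text
DIEOFF_RANGE_DAYS = (10, 70)    # reported destruction range for
                                # E. coli O157:H7 / Salmonella Typhimurium

def steep_may_be_insufficient(steep_days,
                              worst_case_dieoff=DIEOFF_RANGE_DAYS[1]):
    """Return True if the steep could end before worst-case die-off."""
    return steep_days < worst_case_dieoff

# Even the longer 3-wk steep falls well short of the >70 d worst case:
print(steep_may_be_insufficient(TYPICAL_STEEP_DAYS[1]))  # True
```

By this crude comparison, even the longest commonly recommended steep covers less than a third of the worst-case survival period, which is one way to frame why the absence of food-safety precautions in manure tea instructions is a concern.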
2.1.5. Current research
Clearly, the climate, soil properties, site characteristics, and management practices (run-off control, buffer strips, and water collection ponds) at a land application site will strongly influence the fate and transport of manure and any accompanying microbes. The details and dynamics of these processes are the subject of an international research effort at present. Examples of recent published reports include transport of pathogens in run-off from soil (Abu-Ashour and Lee 2000); environmental survival in soil (Barwick and others 2000; Maule 2000); survival in manure slurries (McGee and others 2001); fecal shedding of pathogens and environmental and vector dissemination (Buchko and others 2000; Wesley and others 2000; Jeffrey and others 2001); and composting and biosolids process improvements (Chauret and others 1999). Manure treatments that reduce pathogen populations prior to and in conjunction with land application are increasingly being implemented and improved. Composting is one approach among many already in use and under scientific validation and optimization. Treatment technologies prior to land application are also the focus of an on-going effort by a USDA/ARS program on manure and byproduct utilization, with the objective of developing a guide to pathogen reduction practices.
Current agronomic practices and recommendations for the use of animal manures in cropping systems are being revised in many states to reflect the need for protection of water quality (see section 2.2.). These efforts, though, focus primarily on nutrient management and groundwater contamination by chemical constituents, and the emphasis on pathogen management is limited. There is, however, considerable awareness and discussion within the animal industry and among fresh fruit and vegetable producers, processors, and buyers regarding the potential for contamination of produce by pathogens in manure. This increased awareness is due to guidance and education documents such as the Guide to Minimize Microbial Food Safety Hazards for Fresh Fruits and Vegetables (FDA 1998), and to academic and extension activities regarding a shared commitment to improving microbial food safety.