Production Practices as Risk Factors in Microbial Food Safety of Fresh and Fresh-Cut Produce
2.2. Water for agricultural uses
2.2.1. Description of the situation
Serious deficits in fresh water availability and quality exist globally. The United States ranks third with an estimated 13 billion cubic meters of annual shortage (Postel 2000). In major fruit and vegetable producing areas of California, Florida, and Texas, aquifers are being seriously overdrawn. In addition to water availability, climate plays an important role in water quality and the potential for direct or indirect contribution to illness and outbreaks. Sewage spills, run-off from concentrated animal production facilities, storm-related contamination of surface waters, illicit discharge of waste, and other sources of pathogens threaten the quality of both surface water and groundwater used for fruit and vegetable production and therefore the safety of the consumed product. The magnitude of the problem and its relation to food safety and security has been recently reviewed (Postel 2000; Rose and others 2000).
At a local and regional level, water availability and multi-user water management planning affects the cost of agricultural water. Together with energy costs, water availability often drives growers to make choices about which crops to produce, sources of water for crop management, and methods of irrigation. An individual grower or packer may alternate water sources over the course of a season, periodically or sporadically using available surface water. Alternately, a high quality water source may be diverted into an impoundment or farm pond during periods of lower electricity costs for pumping. If not well protected, this water may be affected by the varied sources of contamination, as with any surface source.
The significance of irrigation or foliar applications with pathogen-contaminated water is largely speculative and must be evaluated on a case-by-case basis until more definitive data on persistence are available. An overview of the varied uses of agricultural water in the production of fruits and vegetables, and the speculative risk associated with each use, is given in Tables II-4 and 5. The risk associated with each water use was assigned based on experts' opinions; for certain practices, a level of risk could not be assigned. The available literature on water microbial quality and microorganism survival on relevant crops is discussed below.
2.2.2. Factors determining the risk of microbial contamination
2.2.2.1. Management choices
During the production of fresh fruits and vegetables, management decisions by growers, harvest managers, and packing operation supervisors or line workers are made in response to the perishable nature of crops and the dynamic nature of the markets. As mentioned in the introduction to this chapter, the choices made can do much to augment, or inadvertently undo, crop management practices that bear closely on the level of microbial food safety risk. It is worthwhile to better understand how seemingly small variations in crop management may impact microbial risk, and the underlying basis for the deviation. The following three examples illustrate the many factors that are weighed when a management decision is made. Although this section focuses on decisions over irrigation and other water uses during production, similar scenarios could be applied to other aspects of the production chain.
Many vegetable crops are established from seeds or seedling transplants using overhead sprinkler irrigation. Once established, in some areas of production such as California, it is common to switch to furrow-irrigation for the remainder of the season. In this scenario, depending on the pre-harvest interval and crop, several weeks or more may pass without intentional overhead irrigation. Occasionally, sprinkler lines are brought back in just before harvest to apply late-season fertilizer in a water-soluble form, typically calcium ammonium nitrate (CAN). For celery, this would occur at the point when field equipment can no longer be used to apply fertilizer due to the likelihood of serious crop damage. Under hot summer conditions, CAN is applied for nitrogen (needed when the crop is going through the most rapid increase in biomass) as well as for calcium (to prevent calcium deficiency due to soil uptake limitations resulting in a condition called Blackheart). Growers may not always need to switch back to overhead irrigation, depending on susceptibility to Blackheart but also on various market conditions and harvest timing decisions. Adjacent fields managed by the same grower may have different irrigation management profiles over the course of a season. The water quality used for late season overhead irrigation will have a significant influence on microbial populations and qualitative composition (Beuchat 1996; Robinson and Adams 1978).
In a similar manner, market conditions, equipment availability, and inadequate cold storage space may prompt a grower to postpone harvests and "store" a crop in the field. When the anticipated interval and conditions before the completion of harvest require a supplemental irrigation, it is typical to switch from furrow irrigation to overhead irrigation. The overhead sprinklers give farmers more control over the amount of water applied and result in less soil saturation, allowing harvest equipment and crews to complete the harvest more quickly. In some locations, an on-farm source of water (not typically used for irrigation) may be accessed for this special need. The microbial quality of this water may not be consistent with the grower's normal good practices and attention to food safety concerns.
Lastly, although drip irrigation, especially sub-surface drip delivery, has been shown to minimize foliar contamination with fecal coliforms (Bastos and Mara 1995; Sadovski and others 1978), this method of irrigation is not always a practical option, for economic or water conservation reasons. For example, in California furrow irrigation is common in the production of many leafy greens; for some of these specialty crops, the economics of production and profit margins precluded the use of sprinkler irrigation. As the markets for minimally processed salads grew, the development of mechanized harvesting caused some producers to modify standard bed spacing (the distance between rows) to a much wider arrangement. Although several advantages are gained by this modification, furrow or drip irrigation was no longer suitable, cost-efficient, or practical (the distance across a bed over which water must be uniformly provided became too great). Under these new conditions, furrow and drip irrigation were replaced with overhead irrigation. In these cases, full-season overhead irrigation may have increased the microbial prevalence on these crops, depending on water quality and the local environment.
Many case studies and examples like the ones described above may have unfavorably shifted the risk of a potential microbial hazard. The broad attention given to food safety in fruit and vegetable production has stimulated an unprecedented level of awareness among growers and shippers as to how their decisions can directly impact food safety. With the knowledge of the intricacies of crop management, growers and harvest operations managers are in the best position to work with the scientific community on developing priority research needs regarding pathogen environmental sources, persistence, and mitigation.
2.2.2.2. Modes of irrigation
Irrigated agriculture in the United States increased from 41 million acres in 1974 to over 55 million acres in 1997 (USDA 1998). Much of this irrigated acreage is not in edible horticultural crops. However, in California, Texas, Florida, and Arizona, the primary horticultural crop-producing states, the proportion of on-farm and off-farm water sourced for irrigation is high. In the 1998 USDA report (USDA 1998), the only survey data available, California had over 8 million irrigated acres, of which 71% was gravity flow (flood or furrow). In contrast, Texas had over 5 million acres under irrigation, with 61% applied by sprinkler and 38% by gravity flow. The distribution of irrigation modes in key producing states is given in Table II-2. The proportion of vegetable and fruit acreage under drip irrigation is significant but much smaller than typically assumed outside the industry. From Table II-2, it can be seen that Florida uses proportionally more drip irrigation and sub-irrigation (lateral distribution from internal canals). In Table II-3, the major types of overhead sprinkler irrigation are given. A breakout of sprinkler modes for fruits and vegetables is not available, but in general hand-moved pipe tends to be used for vegetables whereas solid-set sprinklers are used in orchard crops. In some larger operations, self-propelled linear overhead irrigation lines that span an entire field pull water from lateral canals. The water supplying the lateral canals may come from groundwater, on-farm impoundments, or an irrigation district supplier (delivered via main surface canals).
Many factors, such as water availability and cost, soil type, slope, depth of water table, economics, and cropping rotations, determine the choice of irrigation mode. At this time, it is not possible to report irrigation mode statistics for individual vegetable and fruit crops. Microbial food safety concerns have not been a primary determinant in selecting the type of irrigation water delivery system. Broader knowledge and awareness of the impacts of water quality on microbial risks during production may provoke a shift in some areas where good water quality is seasonally unavailable. There is a pressing need to establish what constitutes "good" water quality that can be safely used for irrigation of edible crops. Microbiological risk assessments need to be developed so that agricultural water standards can be established on a scientific basis.
|Table II-2. Distribution of Irrigation Methods (Survey, USDA 1998; 1998 response in 1,000s of acres)|
|Table II-3. Distribution of Sprinkler Irrigation Methods (Survey, USDA 1998; 1998 response in 1,000s of acres). HPP - High-Pressure Center Pivot; LPP - Low-Pressure Center Pivot; Raised Linear-Move; Self-Propelled Linear or Side Roll; Hand-Moved Pipe; Solid-Set System|
2.2.2.3. Protection of water sources
In a recently released survey on U.S. agricultural practices (USDA 2001), the most common source of irrigation water was deep ground wells, with 51% of the vegetable and 39% of the fruit growers reporting this source of water. Flowing surface water was the next most common source of irrigation water, with 38% of the fruit growers and 19% of the vegetable growers drawing water from this source. About 5% of produce growers reported using municipal water.
Protection of surface and groundwater sources from pathogen contamination is one critical area of production that is receiving great attention. Public health concerns about infectious disease agents from run-off, reclaimed water, and sludge (biosolids) are broad and well documented (Committee on the Use of Treated Municipal Wastewater Effluents and Sludge in the Production of Crops for Human Consumption 1996). Sources include domestic and industrial solid waste effluent and domestic and industrial water discharge and reuse (Gerba and others 1995). According to the 1998 National Water Quality Inventory, approximately 60% of the various types of water quality-impairing pollution in rivers and streams, and 45% in lakes, comes from agricultural sources. Agriculture, predominantly concentrated animal production operations, is cited as the major contributor of microbial pollution to the nation's waterways. There are currently an estimated 376,000 operations that, combined, generate approximately 128 billion pounds of manure each year; concentrated animal feeding operations (CAFOs) are the largest of these livestock operations. The recent outbreak of E. coli O157:H7 in Walkerton, Ontario, is an example of contaminated run-off water from farms contributing to a public health crisis. Although the contaminated run-off was only a minor contributing factor in that outbreak (the major factors were a water chlorination failure and heavy rainfall), it illustrates the need for adequate management of animal farm waste. Other outbreaks or cases due to contaminated water have been associated with faulty well designs (Jackson and others 1998) or run-off after torrential rain (Charatan 1999).
Pathogen contamination causes serious water quality problems that impact agricultural and domestic users. Pathogens such as Cryptosporidium have been linked to impairments in drinking water supplies and represent a frequent direct threat to human health. Agriculture, municipal point sources, urban run-off, residential septic system failures, industrial point sources, weather-related raw sewage spills, and illicit discharge are widely recognized as the leading sources of the problem. Although it is difficult to determine the exact contribution of any particular source on a national basis, it is widely recognized that each has posed or could pose risks to water quality and therefore to produce contamination during production and post-harvest handling. This clearly points to the need to establish microbiological standards for agricultural water.
Topographic features surrounding the water source, such as slopes and depressions, could lead to the introduction of run-off from an adjacent field, grazing land, animal production facility, septic system, waste spreading field, dairy lagoon, or other potential source of pathogen contamination. Run-off prevention and diversion structures, such as collection channels, diversion berms, and vegetated buffer areas, can help divert run-off away from the water source. To protect on-farm water sources, extension agent recommendations range from distances of 30 to 400 feet between potential contaminants and a water source; greater distances are strongly advised for sources near concentrated animal facilities and manure storage areas. According to USDA (2001), only 2% of vegetable acres and 1% of fruit acres were reported to be adjacent to a confined animal production facility. Seven percent of all fruit acres and 3% of all vegetable acres were reported to be adjacent to a residential area, and 1% or less of all fruit and vegetable acres were located next to industrial, waste disposal, landfill, or manure storage sites. Strategies to control run-off included ditch construction, buffer strips, retention systems, and drainage systems. Although only 40-75% of the fruit farms adjacent to potentially unsafe environments implemented some type of run-off control, the majority of vegetable farms had some control. The data suggest that small farms are less likely to have such strategies.
For groundwater sources, responsible production practices include properly designed, maintained, and regularly inspected wells. This includes inspecting well casings regularly and repairing them as needed. A properly designed system must have a well casing that extends to the water level in the well, together with a grout seal. Well casings should, in addition, extend more than 12 inches above the land surface and have a sloped crown at the base to prevent water accumulation at the seal. The height should be sufficient to minimize the chance of seasonal run-off reaching the well opening and having a direct route to the source water. While much is known from the scientific literature about the potential influence of surface water, local weather patterns, floodwaters, soil drainage, run-off, and soil percolation rates on groundwater contamination by toxic chemicals, far less information is available about the infiltration potential of bacteria and viruses. Older or shallow wells are typically more easily influenced by contamination; a sanitary well sleeve may be installed in these wells to help protect against flood contamination. It is also important to make sure that there are no cross-connections between clean water sources and contaminated sources, to prevent back-siphonage, which can cause contamination of one water source by another. If lines must be connected, the grower should install valves, legal air gaps, or other mechanisms that will prevent back-siphonage in the irrigation pipes and other water lines.
Limited reports that assess the risk of groundwater contamination are available. Haas and others (1996) conducted an analysis of the potential for human pathogenic viruses, from various fecal containing wastewater solids and municipal solid wastes, to contaminate groundwater and aquifers. Their comparative risk analysis emphasized the mathematical evaluation of contamination potential for drinking water supplies used by small communities and isolated facilities influenced by proximity to sources of contaminant leachate. The details of models and research-based assumptions of adsorption and inactivation are available within the paper cited, but their conclusion stated: "It was concluded that, even with conservative assumptions, the health risk to humans from exposure to microbial pathogens of fecal origin deposited in well-designed and operated sanitary landfills is below levels currently considered to be acceptable under U.S. drinking water regulations applicable to treated potable water supplies."
In contrast to this study, it is well known that untreated wastewater contains relatively high concentrations of a range of pathogens and that its use for agricultural purposes raises great concern. The contamination and persistence of infectious pathogens in water, in soil, or on crop plants associated with the discharge of untreated wastewater (Ahmed and Muller 1984; Armon and others 1994; Bastos and Mara 1995; Bell 1976; Downs and others 1999; Gallegos and others 1998; Garcia-Villanova Ruiz, Cueto Espinar, and others 1987; Katznelson and others 1977; Kovacs and Tamasi 1979; Libero 1989; Shuval 1993; Szabo 1976; Takayanagui and others 2000; Teltsch and Katznelson 1978; Tamasi and Winkler 1977; Vaz da Costa Vargas and others 1991), or elevated clinical disease in impacted areas (Ait Melloul and Hassani 1999; Downs and others 1999), is evident in various reports from outside the United States. For example, a dramatic decrease in typhoid fever and hepatitis A was reported when growers stopped using city sewage water for irrigation of vegetables in Santiago de Chile, Chile (Alcayaga 1993). These studies (not all in English, though English abstracts are retrievable) are cited as background information; they are considered relevant or comparable to the scope of this assessment only from the standpoint of predicted persistence following raw sewage spills or flood-related contamination.
The dissemination of human bacterial pathogens through agricultural soil and the potential for groundwater or surface water contamination has been evaluated or reviewed by many researchers (Ahmed and Muller 1984; Gagliardi and Karns 2000; Hossain and others 2000; Jones 1999). In general, mobility through soils is low, and transfer would not be expected in the absence of direct run-off drainage or percolation channels. In coarse-textured soils, however, percolation to shallow subsurface water is highest, especially in the absence of various forms of biofiltration, such as active wetland or grassland roots (Decamp and Warren 2000; Niewolak and Tucholski 1999). In a laboratory experiment with simulated rainfall, tillage practice, in combination with soil type and source of pathogen load, affected but did not prevent vertical transport of E. coli O157:H7 through soils (Gagliardi and Karns 2000). Multiplication and persistence following manure incorporation was reported in all soils tested, with the exception of a clay loam. Leaching of E. coli O157:H7 in run-off water was detected for at least 17 d. The data indicate that, provided there is a dispersion mechanism such as rainfall or irrigation, E. coli can travel below the top layers of soil for more than 2 mo after application.
Only a few studies on microbial migration through agricultural soils or down slopes have been conducted in the field. Abu-Ashour and Lee (2000) monitored the migration of bacteria on sloping surfaces by spraying water inoculated with a nalidixic acid-resistant E. coli as a biotracer. The extent of microbial migration during run-off after two heavy rains depended on the slope of the plots. Although most samples collected after 15 d had only small amounts of the bacteria, the authors concluded that the survival and migration of pathogenic E. coli due to run-off may increase the potential for contamination of produce.
Another field study, done in Iowa, determined the effect of manure application practices on bacterial populations in run-off water under simulated rainfall. Swine manure was used at two application rates and with two application methods. Escherichia coli and fecal streptococci were found in all run-off samples, but the bacterial populations decreased with time (FengYu and others 2000). Other work found no clear association between certain manure practices and the persistence of bacterial populations in subsurface drainage water and surface run-off (Warnemuende and others 1999). Because of the difficulties in assessing the impact of run-off in field studies, those authors then conducted a laboratory run-off study following manure application in the field. The data showed that higher application rates of swine manure resulted in higher leaching of E. coli and enterococci from intact soil columns after 4 weekly irrigation events, especially when the manure was applied in the spring. Lower bacterial densities were seen with the autumn application, which may be due to freezing of the soil between manure application and irrigation (Warnemuende and Kanwar 2000). An interaction was also reported between application rate and application season. The association of manure practices with the persistence of bacterial populations in run-off water thus appears complex, depending not only on the practices themselves but also on factors such as soil texture and climate; run-off levels of indicator organisms seem to be higher in warmer months. Although for obvious reasons these field studies were not performed by inoculating the soil with pathogenic organisms, the conclusions may serve as examples of the potential risk of contaminating crop fields by run-off waters.
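The time-dependent decline of indicator bacteria noted in these studies is often approximated by first-order (Chick's law) die-off, which appears log-linear when populations are plotted on a log10 scale. The sketch below illustrates the arithmetic only; the initial population and rate constant are hypothetical placeholders, not values from the studies cited.

```python
def log10_survivors(n0_log10, k, t_days):
    """First-order (Chick's law) die-off on a log scale:
    log10 N(t) = log10 N0 - k * t.

    n0_log10 : initial population, log10 CFU/100 ml (hypothetical)
    k        : die-off rate constant, log10 units per day (hypothetical)
    t_days   : elapsed time in days
    """
    return n0_log10 - k * t_days

# Hypothetical example: 6 log10 CFU/100 ml initially, k = 0.3 per day
for t in (0, 7, 17):
    print(t, round(log10_survivors(6.0, 0.3, t), 1))
```

Real die-off rates vary widely with temperature, soil texture, and moisture, which is one reason the field observations above resist simple generalization.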
2.2.2.4. Irrigation with run-off water
In addition to incentives for the use of reclaimed water (see section 2.2.2.5.), there are standard conventions in irrigation management and local or regional incentive programs for the collection and recycling of run-off water for on-farm or downstream irrigation. One potential area of concern is the influence of land-applied biosolids and sewage or animal manure effluent on subsequent irrigation run-off water. This issue has been recognized for some time (Bryan 1977). Although the risk of contamination of run-off water with pathogens of manure origin is not known, the laboratory and field research outlined above and in the manure section suggests that the potential for such a scenario exists. Regarding biosolids, Jones and others (1980) reported that salmonellae were recoverable from 68% (882 total samples) of biosolids and processed sewage effluent that was likely to be applied to agricultural soils. Although raw sewage was predominantly positive for salmonellae (85%), the frequency of post-processing positive detection depended on the treatment facility, ranging from 0 to 100% of replicated samples. The authors reported that direct enumeration was not possible due to the low sensitivity of the method, and estimates of initial populations by enrichment recovery were uniformly less than 200 MPN/100 ml. The authors correctly cautioned that the numbers may be biased towards enumerating those salmonellae most favored by the particular enrichment recovery technique. They concluded that such low numbers of pathogens in the biosolids and effluent would not persist significantly in soil or on plants after 7 to 10 d. As a precaution, however, a minimum period of 28 d post-application to grazing land was recommended.
Pathogen survival is more likely in mixed sediments in collection ponds or return-system pumps than in the overlying water between periods of mechanical turbulence. Therefore, monitoring of the collected water may not reveal the presence of pathogens or high numbers of fecal indicator microbes in water samples. Extending the conclusions developed by Goyal and others (1977), pumping operations, sudden high-volume flow during discharge from one area to the next, or dredging maintenance of a pump is likely to bring persistent populations of pathogens up from the sediments. Interestingly, their data, in agreement with several other reports, failed to establish any correlation between fecal coliform population densities and Salmonella recovery; the estimated density ratios for co-isolation of salmonellae and fecal coliforms varied from 1:9 to 1:2000. Water quality criteria based on sensitive, accurate, reliable, fast, and cost-effective methods are needed to guide the development of routine testing of irrigation and irrigation run-off water.
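The practical consequence of so wide a co-isolation ratio range (1:9 to 1:2000) can be sketched numerically: the same fecal coliform count is compatible with Salmonella densities spanning more than two orders of magnitude, which is why indicator counts alone predict pathogen levels poorly. The count used below is illustrative only, not data from the cited study.

```python
def inferred_salmonella(fc_per_100ml, ratio):
    """Back-calculate a Salmonella density from a fecal coliform count,
    assuming a fixed Salmonella:fecal-coliform co-isolation ratio of
    1:ratio. Both the count and the fixed-ratio assumption are
    illustrative simplifications."""
    return fc_per_100ml / ratio

fc = 1000  # fecal coliforms per 100 ml (illustrative value)
print(inferred_salmonella(fc, 9))     # upper bound of the reported range
print(inferred_salmonella(fc, 2000))  # lower bound of the reported range
```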
2.2.2.5. Irrigation with reclaimed water
One long-standing solution to both wastewater management and water availability needs has been the use of reclaimed water in agriculture, including irrigation of fruits and vegetables. Reclaimed water has been increasingly used for irrigation and to recharge groundwater since the 1980s, due partly to concerted efforts to promote reclaimed water use among growers. Various studies have demonstrated that if strict disinfection methods and microbiological standards are followed, public health risks are negligible. The World Health Organization developed guidelines for the use of reclaimed water for agriculture, with a recommendation of <1,000 fecal coliforms/100 ml (WHO 1989). The U.S. Environmental Protection Agency has also provided guidelines for water reclamation and urban, industrial, and agricultural reuse (EPA 1992). For irrigation, the guidelines recommend that fecal coliforms be undetectable in at least 50% of observations. The actual standards remain the responsibility of the states. For instance, Florida and California state regulations require that reclaimed water receive secondary treatment, filtration, and high-level disinfection, with no detectable levels of coliforms. In Florida, however, a limitation states that reclaimed water is not allowed for direct contact application methods (spray irrigation) on crops that will not be peeled, skinned, cooked, or thermally processed (FDEP 1999). For such products, such as salad crops, indirect contact methods may be used (ridge and furrow, drip, or subsurface irrigation). There is no scientific basis for the current prohibition on direct contact irrigation methods for the so-called salad crops, and removal of this prohibition has been recommended (York and others 2000; FWEA 2000).
In 2000, Florida used 35 million gallons per day of reclaimed water to irrigate over 14,000 acres of edible crops (FDEP 2001). While citrus represents the primary edible crop irrigated with reclaimed water, a wide range of other edible crops (tomatoes, cabbage, peppers, watermelon, corn, eggplant, strawberries, peas, beans, herbs, squash, and cucumbers) are also irrigated with reclaimed water in Florida (York and others 2000). In California, a major source of fresh produce, fruit and vegetable crops irrigated with reclaimed water include apples, asparagus, avocados, broccoli, cabbage, cauliflower, celery, citrus, grapes, lettuce, peaches, peppers, plums, and squash (California State Water Resources Control Board). Disinfected tertiary recycled water is applied to over 12,000 acres of vegetable-producing land in Monterey County, CA (Sheikh and others 1990).
A five-year study determined that the process controls required by California Code of Regulations Title 22 (coagulation, flocculation, filtration, and disinfection) were sufficient to exclude the possibility of residual pathogen content in the water, which was frequently used for overhead irrigation (Sheikh and others 1990; Sheikh and others 1999). In surveys of raw incoming wastewater on eight dates, Salmonella never exceeded 16 CFU/100 ml (mean 5), E. coli O157:H7 was never detected, and Cryptosporidium varied from undetectable to over 200 oocysts/L (mean 74). Cyclospora was reported on one date in December 1997 at 330 oocysts/L (details of confirmatory tests or species not provided); there was no indication that the human pathogen Cyclospora cayetanensis was present. Of interest is that, as in reports on the use of treated effluent for vegetable crop irrigation, the mean recovery of Salmonella following secondary treatment was essentially the same as in untreated incoming waste. Cryptosporidium, Giardia, and Cyclospora were effectively eliminated, essentially all appearing as empty, non-viable cysts. Following tertiary treatment, Salmonella was not detectable or recoverable from finished irrigation-quality water. In comparison, the fecal coliform count was progressively reduced by 1.5 and 6.1 log cycles (the latter below the detection limit) following secondary and tertiary treatment, respectively. Irrigation with reclaimed water was found to produce excellent yields of high-quality produce (artichokes, broccoli, cauliflower, celery, and lettuce) with no detectable levels of salmonellae, shigellae, or human parasites. Likewise, no pathogens were found in aerosols.
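The reported reductions of 1.5 and 6.1 log cycles translate into surviving fractions of the initial population as follows; this is a straightforward unit conversion, not additional data from the cited study.

```python
def surviving_fraction(log_reduction):
    """Fraction of the initial population remaining after a given
    log10 reduction: a 1-log reduction leaves 10%, 2-log leaves 1%, etc."""
    return 10 ** (-log_reduction)

# 1.5 log cycles (secondary treatment): ~3.2% of the population remains
print(surviving_fraction(1.5))
# 6.1 log cycles (tertiary treatment): well under one in a million remains
print(surviving_fraction(6.1))
```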
The need for compliance testing and for assuring consumer confidence in crops irrigated with reclaimed water prompted these types of studies. Interestingly, regulations provide strict microbial quality parameters for reclaimed water, but no such standards apply to general agricultural water for irrigation. Federal Worker Protection Standards require the availability of drinking water for workers and potable water for hand washing, but currently there are no standards in the United States for maximum contaminant levels in agricultural water. In California, the general approach has followed state law. Water delivered for irrigation or crop management uses must not exceed a median value of 2.2 total coliforms per 100 ml over a 7-day period. Orchards and vineyards may be surface irrigated with water quality equivalent to primary effluent, and irrigation water for seed crops may also be of primary effluent quality. For overhead or sprinkler irrigation, reclaimed wastewater must be treated to achieve the 2.2 coliforms/100 ml rule and, in addition, may not exceed 23/100 ml in more than one sample within a 30-day period.
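A rule of this shape combines a running 7-day median limit with a cap on single-sample exceedances over a 30-day window. The minimal sketch below shows how such a check could be expressed; the daily sampling schedule, function name, and example counts are our own illustrative assumptions, not language from any regulation.

```python
from statistics import median

def complies(daily_counts, median_limit=2.2, single_limit=23, max_exceed=1):
    """Check a 30-day series of total coliform counts (per 100 ml)
    against (a) a 7-day running median limit and (b) a cap on the number
    of single samples exceeding single_limit within the series.
    Assumes one sample per day (an illustrative simplification)."""
    medians_ok = all(
        median(daily_counts[i:i + 7]) <= median_limit
        for i in range(len(daily_counts) - 6)
    )
    exceedances = sum(1 for c in daily_counts if c > single_limit)
    return medians_ok and exceedances <= max_exceed

# One isolated spike: medians stay low and only one sample exceeds 23
print(complies([1.0] * 29 + [30.0]))  # True
# Persistently high counts fail both the median and exceedance tests
print(complies([25.0] * 30))          # False
```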
Limited assessments are available upon which to develop constructive guidelines for growers in setting microbial limits within a Good Agricultural Practices (GAP) program. A report by Shuval and others (1997) concluded that the risk of disease from regular consumption of vegetables irrigated with effluent that meets WHO guidelines (1,000 fecal coliforms/100 ml) is negligible. The risk assessment model was applied using hepatitis A virus and Vibrio cholerae. According to the authors, the additional health benefit that might result from a further reduction of risk gained by adhering to standards requiring no detectable fecal coliforms/100 ml appears to be insignificant. A more comprehensive risk assessment is needed, however, including other relevant pathogens such as E. coli O157:H7. York and Burg (1998) reviewed pathogen data for reclaimed water and for other waters commonly used for irrigation, notably high-quality surface water, ground water, and treated drinking water. The focus was on the protozoan pathogens Giardia spp. and Cryptosporidium spp. They concluded that reclaimed water was comparable to other high-quality sources of irrigation water and is an excellent source of water for landscape and agricultural irrigation. The National Research Council (NRC 1996) conducted a comprehensive evaluation of the use of reclaimed water and residuals in food crop production. The NRC concluded that: "Current technology to remove pollutants from wastewater, coupled with existing regulations and guidelines governing the use of reclaimed water in crop production, are adequate to protect human health and the environment." They also noted that "food crops thus produced do not present a greater risk to the consumer than do crops irrigated from conventional sources."
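Quantitative microbial risk assessments of this general type chain an exposure estimate to a dose-response model and then annualize over repeated exposures. The schematic below uses a simple exponential dose-response model to illustrate the structure of such a calculation; all parameter values are hypothetical placeholders, not those used by Shuval and others (1997).

```python
import math

def exponential_dose_response(dose, r):
    """Single-exposure infection probability under the exponential
    dose-response model: P = 1 - exp(-r * dose).
    r is a pathogen-specific parameter (hypothetical value used below)."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(per_event_risk, events_per_year):
    """Annual risk from independent, identically distributed exposures."""
    return 1.0 - (1.0 - per_event_risk) ** events_per_year

# Hypothetical scenario: 0.1 organism ingested per serving of irrigated
# vegetables, r = 0.01, 150 servings per year
p = exponential_dose_response(0.1, 0.01)
print(p, annual_risk(p, 150))
```

Dose-response parameters, ingestion estimates, and exposure frequencies would all have to be drawn from pathogen-specific literature before such a sketch could inform an actual standard.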
2.2.2. Other production uses of water
Foliar treatments include virtually all contact sprays applied to aerial plant parts: fertility management, pest and disease control, plant growth regulators, or microenvironment modification. These other, minor uses of water also constitute an area of primary concern for the potential to contaminate fresh produce with infectious pathogens (Tauxe and others 1997; FDA 1998). Tables II-4 and 5 outline many of the activities involved in crop production that utilize water.
Water contact close to harvest is, understandably, considered to be of greatest concern, as the specifics of persistence for various pathogens on plant surfaces are poorly characterized (Brackett 1999; Suslow 2001). Although pre-harvest pesticide applications, late-season fertility management sprays, or sunscald protectants are of low volume, evidence is building that foliar sprays near or at harvest have been involved in outbreaks. According to a CDC report, water used in a pesticide solution was the most likely source of contamination of raspberries that caused the Cyclospora outbreak in the United States and Canada in 1995. The report recommends the use of potable water for pesticide application (Anonymous 1997).
Foliar treatments are important common crop management activities for both conventional and organic producers and handlers. Speculatively, the nature of organic crop management practices, particularly with regard to the use of foliar treatments consisting of both simple and complex organic matrices, may predispose certain fruits and vegetables to a higher risk of persistent contamination, if human pathogens are present. Contamination may derive directly from the source material or may occur through secondary sources such as the use of contaminated diluent water. Although no documentation is provided, observation and personal experience with crop management practices over many years, regions, countries, and scales of operation allow the panel to state with certainty that the sources of water used to dilute and apply foliar sprays to edible crops are diverse. More importantly, water quality and microbial content may be highly variable. In addition, practical realities of water availability or proximity to sites of production/application may cause individual applicators to use non-potable surface water (that is, ponds, creeks, and canals) as the spray make-up source. Practices that may increase the risk of microbial growth include holding partial or full tank-mixes of expensive spray-on materials for 24 h or longer. Temperatures of the source water and tank water have not been specifically measured but would reasonably range from 5 to 37 °C (41 to 98.6 °F), depending on season, region, source, hold-time in exposed or tractor-mounted tanks, and other factors. Hypothetically, if the source water contained infectious pathogens, formulations with a high organic content, a reduced microbial load due to processing or an inactivating treatment, and a potential nutrient-base for pathogens capable of saprophytic growth may elevate the risk of illness associated with the consumption of uncooked organic produce. 
Conversely, the chemical, physical and biological properties of the formulation in non-potable water may limit survival of all or specific groups of microbial pathogens.
Research has been initiated to determine the survival and growth potential of Salmonella and E. coli O157:H7 in late-season fungicides used on fruits and vegetables. Thus far, when used at recommended rates and with or without adjustment to neutral pH, no products tested have had an inhibitory effect on pathogen survival within 4 to 5 h. Some of the products permit limited multiplication if held at 37 °C (98.6 °F) for 16 h (Suslow, unpublished data). In a related published study, the effect of diluting various pesticide products with water containing 10² CFU/ml of E. coli O157:H7, Salmonella Enteritidis, Shigella sonnei, Shigella flexneri, and L. monocytogenes was investigated. Pathogen survival and growth depended on the pesticide, pesticide concentration, microorganism load, and temperature. Consistent with the previous results, their findings showed that some commercial pesticide products, when diluted with contaminated water, may promote the growth of pathogens, which could increase the potential for contaminating produce when applied (Guan Tat and others 2001).
Several interacting factors determine the relative risk of the persistence of diverse pathogens known to be associated, at least transiently, with fruits and vegetables at production or during harvest and post-harvest handling operations (Beuchat 1996; Katznelson and others 1977; Suslow 2001). Plant anatomical traits, macro- and microclimatic effects, solar irradiance, the composition of the contaminant-carrying matrix, other crop management practices, the pre-existing plant microflora, and the pre-harvest interval of treatment are among these influential factors. Therefore, in addition to research on microbial survival and growth within the foliar application system (that is, pesticide or growth hormone), the post-application survival on plant surfaces needs to be addressed. As has been emphasized previously, important microbial reduction conditions, such as the potential for desiccation and osmotic shock-induced death, must be tested before any full assessment of risk is completed.
2.2.3. Microbial quality of water and contamination of produce
The microbial quality of agricultural water is a critical issue of concern in the pre-harvest and post-harvest food safety management of edible horticultural commodities and minimally processed produce. The potential hazard of persistent pathogen populations on harvested crops, arising from direct or indirect contamination by water or water aerosols, has long been recognized (Cherry and others 1972; Dondero and others 1977; Garcia Villanova Ruiz, Galvez Vargas and others 1987; Smith and others 1978; Teltsch and Katznelson 1978; Teltsch and others 1980). Though direct evidence of foodborne illness due to contamination of edible horticultural commodities during commercial production is scant, compelling epidemiological evidence involving these crops has implicated specific production practices, including the use of animal waste or manure, fecally contaminated agricultural water for irrigation or pesticide/crop management applications, and farm labor personal hygiene, as leading to direct contamination (Brackett 1999). Brackett (1999) categorically suggested that only clean, potable water should be used for irrigation of fruits and vegetables after planting. Though well intentioned, such an approach fails to take into account many aspects of water availability, water conservation programs, irrigation method, geographic diversity, crop diversity, temporal factors, and the significant difficulty inherent in monitoring water for microbial content during production.
The majority of risk assessment studies related to irrigation water quality have been conducted outside the United States. These studies evaluate the presence or persistence of pathogens conveyed to crops by spray irrigation, irrigation aerosols of sewage effluent (Garcia Villanova Ruiz, Galvez Vargas and others 1987; Garcia Villanova Ruiz, Cueto Espinar and others 1987; Teltsch and Katznelson 1978) or drip irrigation (Sadovski and others 1978). The general findings are consistent with the results of qualitative surveys in that detection is variable and depends upon the level and nature of environmental stress (Katznelson and others 1977; Teltsch and Katznelson 1978; Teltsch and others 1980). It has been reported that detection was correlated with population densities of target pathogens in the source water and spatial orientation relative to the point source. Survival was favored by increasing organic matter content in the water. Not surprisingly, key factors affecting the survival of Salmonella in aerosols are relative humidity and solar irradiation. The effect of solar irradiation is suggested by the greater recoveries at greater distances (up to 60 m from the point source) obtained during night irrigation. Persistence following deposition on plants was not reported. Aerosolized enteric viruses were found to be more resistant to environmental stress than bacterial indicators or bacterial pathogens (Shuval and others 1989). In another study that examined the presence of enteric organisms in the air, pathogen or indicator coliform detection in sprinkler aerosols ranged from 40 to 730 m downwind, depending on output volume and droplet size. Importantly, aerosolized E. coli was only detected when the level in the irrigation water before spraying was 10⁴ CFU/ml or greater (Katznelson and others 1977).
The persistence of pathogens on produce varies, and no general conclusion can be drawn from the surveys in the literature. The following results are indicative of the need to identify appropriate indicator organisms and to better define the interacting factors that may result in an unacceptable public health hazard. One study underscores the difficulty of selecting indicator organisms: post-irrigation survival of Salmonella on vegetables was reported, yet no correlation between the indicator E. coli and isolation of Salmonella was observed (Garcia Villanova Ruiz, Galvez Vargas and others 1987). In 31% of the diverse vegetables surveyed, Salmonella was recovered but E. coli was undetectable.
One report by Vaz da Costa Vargas and others (1991) indicated that in a semi-arid region of Portugal, Salmonella became undetectable on effluent-irrigated lettuce 5 d after irrigation was terminated, but generic E. coli indicator strains persisted. The 2-year study also compared the fecal contamination of spray-irrigated lettuce with that of locally marketed lettuces. Environmental conditions influenced the extent of the contamination, so that lettuce grown in a hot, dry year had fecal indicator bacteria loads similar to those of locally grown lettuce only 5 d after sprinkler irrigation with effluent from a conventional trickling filter plant. Escherichia coli levels were within the standards of the International Commission on Microbiological Specifications for Foods (ICMSF) (<10⁵ E. coli/100 g) and no Salmonella was detected. According to the authors, this suggests that the current World Health Organization (WHO) water guideline of 10³ E. coli/100 ml may be too stringent (Vaz da Costa Vargas and others 1991). A parallel study investigated E. coli and salmonellae contamination on field- and glasshouse-grown lettuces and radishes irrigated with waste stabilization pond effluent by drip and furrow irrigation. Again, both E. coli and salmonellae levels were lower when weather conditions were drier, confirming that the WHO water levels may have to be adjusted depending on weather conditions. Rainfall appears to increase contamination of vegetables, possibly due to better survival of bacteria under humid conditions.
Results of a survey of fruits and vegetables for salmonellae, shigellae, and enteropathogenic E. coli suggested that the frequency with which target pathogens could be isolated from irrigation water was inversely correlated with crop height (Velaudapillai and others 1969). Low plants, such as spinach and cabbage, had a higher frequency of confirmed positive isolation than taller chili peppers or tomatoes, while tree fruit had a negligible occurrence of contamination in this study. The sources of irrigation were reported to be from canal surface water and shallow wells of inadequate construction and design. Other factors related to the nature of the plant surface hydrophobicity and contours may also have affected the persistence of bacterial pathogens.
A survey of lettuce irrigated with sewage effluent revealed that E. coli, as an indicator of fecal contamination, would rapidly decline within 3 to 7 d in an open environment but could persist and be recovered for up to 21 d (Nichols and others 1971). Prior to this report, the standing recommendation had been to terminate overhead irrigation 21 d prior to harvest. Negative impacts on yield and quality made this an impractical advisory. In these studies, periods of natural rainfall did not eliminate E. coli from the lettuce. Non-irrigated plots and plots irrigated with deep well water had very infrequent positive detection of E. coli, and at very low predicted starting populations. Due to the uncertainty of survival under prevailing weather conditions, the authors advised improving water quality and implementing alternative solutions for lettuce irrigation. In the laboratory, viable cells of E. coli O157:H7 were detected on lettuce after 15 d at refrigeration temperature, even with an inoculum of <10 CFU/g using feces or peptone water as a carrier (Beuchat 1999). Although definite conclusions cannot be drawn from such laboratory studies, the results suggest that the pathogen can persist under refrigeration if contaminated water is used for irrigation.
Salmonella Typhimurium, Salmonella Kapemba, Salmonella London, and Salmonella Blockey were the most frequently isolated serotypes from 181 irrigation water samples and among 849 vegetables irrigated with these water sources in Spain (Garcia Villanova Ruiz, Cueto Espinar and others 1987). No correlation between these environmental and food isolates and clinical serotypes within the same period of the farm survey was observed. Similar results had previously been reported for lettuce and fennel (Ercolani 1976). The authors cautioned that the hazard of pathogen contamination of vegetables should not be underestimated despite the absence of strong linkage to human salmonellosis within a community. Dondero and others (1977), in a survey of surface waters in New York State, determined that the 144 Salmonella isolates from various known or potential sources of agricultural water possessed low or negligible virulence in a standard mouse infectivity study, as compared to human clinical isolates. As the authors point out, the ability to recover certain serotypes depends on the isolation methods employed, the reliance on mouse-based virulence testing, and the potential for reversion to virulence within the host. More recent studies have provided a more compelling connection between the use of wastewater for irrigation and Salmonella infection in those most at risk; namely, young children of agricultural families (Ait Melloul and Hassani 1999). Crop irrigation with untreated wastewater caused a significantly higher rate of infection with Salmonella in children of agricultural families (39%) than in children of non-agriculturists (24%). Also, the prevalence of Salmonella infection among children exposed to sewage irrigation was 32%, compared to 1% for children from an area that does not practice sewage irrigation.
In another study, during a seven-month microbiological survey of vegetables, higher total and fecal coliform counts were recorded on the dates when the sprinkler irrigation water source was of poor microbiological quality than when acceptable water was used (Armon and others 1994). In addition, the coliform levels depended on the product, possibly its structural features. Salmonella spp. were detected only on vegetables that had been irrigated with the poor quality water.
In summary, and as mentioned before, although the importance of using high quality water for irrigation cannot be overemphasized, multiple other factors need to be taken into account when developing water microbiological guidelines. The irrigation method, crop type, climate and other environmental conditions contribute to the persistence of the particular pathogen on the edible portion of the plant. Also, due to global issues such as fresh water shortages, the gains in public health safety need to be carefully weighed against the cost of strict water guidelines before any implementation occurs.
2.2.4. On-farm treatments of water
The practice of monitoring pathogens in agricultural source water to assure the safety of edible horticultural commodities is questionable and controversial. The costs of microbiological monitoring are high compared to the benefit of satisfying the criteria of an on-farm GAP and food safety program. The value of pathogen monitoring of bulk source water has been reviewed in many sources, including Allen and others (2000).
Measures that may be more successful at minimizing surface-water intrusion and contamination of groundwater supplies are the proper design, construction, and protection of well-heads. The value of periodic microbial monitoring of wells, generally for E. coli as an indicator of recent or persistent fecal contamination, is widely recognized, because shock treatment with disinfectants is possible. Prescriptive treatment of surface water to eliminate potential pathogen contamination, however, is less likely to be practical.
Scientific reports that document the feasibility and performance of various methods of on-farm water treatment are scarce. Research and empirical studies are in progress to evaluate various disinfectant treatments of surface or well water, including chlorination, pH shock, peroxyacetic acid, UV, and ozone treatment. They may prove to be effective and economical in specific situations (Suslow, personal observation). Generally, these efforts are restricted to water intended for drip or micro-sprinkler irrigation, due to the more limited volume of water needed. A few strawberry growers on production blocks that may receive water with high indicator coliform counts have routinely practiced ozonation of irrigation water (Suslow, personal observation).
Unfortunately, guidance to growers on water treatments for microbial reduction is based on dated reports. Robinson and Adams (1978) found that UV irradiation of river water used for overhead sprinkler irrigation of celery was effective in reducing total coliforms as well as the low levels of nonpathogenic E. coli present. Progressively, within their study, it was necessary to incorporate a suspended-solids filtering unit and a total of 8 UV source lamps to compensate for fluctuating water quality. The maximum microbial reductions achieved by irradiation and filtration, as compared to untreated river water, were 0.8 log CFU/ml for total aerobic mesophiles, 1.8 log MPN for coliforms, and 1.2 log MPN E. coli/100 ml. The difference in total coliforms between water qualities was not significant at 6 d before harvest, and the presence of E. coli was negligible. UV irradiation, using the systems of the time, had no significant impact on river-water populations of plant pathogenic bacteria with the potential to induce soft rot of celery. Science-based justification and research-based performance criteria for any buyer-imposed or government-mandated antimicrobial water treatment for irrigation of edible crops are clearly needed.
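For reference, microbial reductions of this kind are conventionally reported on a log10 scale, computed from paired counts for untreated and treated water. A minimal illustration in Python (the counts below are invented for the example, not Robinson and Adams' data):

```python
import math

def log_reduction(untreated_count, treated_count):
    """Log10 reduction between untreated and treated counts (same units)."""
    return math.log10(untreated_count / treated_count)

# An approximately 63-fold decrease corresponds to a 1.8-log reduction
print(round(log_reduction(6300, 100), 1))  # 1.8
```

The same arithmetic applies whether the counts are CFU/ml or MPN/100 ml, as long as both values share the same units.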
It is clear from the studies presented that the use of water of low microbiological quality on fresh produce poses a potential public health risk. The level of risk, however, varies depending on many other factors that need to be scientifically evaluated. A key variable is the persistence of pathogens on the edible part of the plant, which in turn is influenced by a number of interacting and unmanageable factors such as climatic conditions. The implementation of high-quality water standards as part of an integrated food safety program may result in lower public health risks. In light of serious concerns over global shortages of fresh water, however, the benefits gained by society from such an implementation need to be carefully evaluated and well justified. A substantial body of data describes the transport of fecal coliforms that may be relevant in estimating the transport of pathogenic bacteria through soils, but little is known about the transport of parasites. Data describing the effects of environmental parameters, such as topography, soil characteristics, vegetation, and climate, on the rate and extent of pathogen transport are scarce.
Scientific studies, using modern techniques, current and emerging technologies, and novel approaches are needed to fill information gaps and develop predictive risk relationships between current and developing production practices, irrigation methods, and water quality. Much more crop-specific and region-specific data, or predictive models for post-irrigation or post-foliar-application survival and dissemination, are needed. This research is needed for a broader range of pathogens in relation to a broader range of model crops and production environmental parameters. As with other areas of agricultural practices relevant to food safety, this is an active area of research by both government and academia, and much valuable data can be expected in the near future. For a more specific list of research needs, see the research needs section of this chapter.
Table II-4

| Application Mode | Root Crops | Leaf and Flower | Fruit on or Near Soil | Fruit Elevated above Soil | Common Source | Potential for Microbial Contamination |
| --- | --- | --- | --- | --- | --- | --- |
| Furrow by feeder ditch | C | C | C | CU | S, GW | Medium |
| Furrow by siphon tube | C | C | C | CU | S, GW | Medium |
| Furrow by gated-pipe | C | C | C | CU | S, GW | Medium |
| Furrow by lay-flat | U | C | C | U | S, GW, I | Medium |
| Overhead high pressure sprinkler | U | U | U | U | S, GW, I, RW, M | Very High |
| Linear drop-head sprinkler | C | C | U | U | S, GW, I, RW | Medium |
| Under canopy sprinkler | U | U | U | U | S, GW, I, RW | Medium |
| Suspended drip emitter | U | U | U | U | GW, I, M | Low |
| Surface drip tape | U | C | C | C | S, GW, I, RW | Medium |
| Subsurface drip tape | C | C | C | C | GW, I, RW | Low |
| Subsurface drip tape with mulch | U | C | C | C | GW, I, RW, TW | Very Low |
| Ebb and flow glasshouse | U | C | U | C | GW, RW, M | Low |
| Tractor mounted tanks | C | C | C | C | GW, M, ??? | ? |
| Aerial applicator | C | C | C | U | GW, M, ??? | ? |
| Trailer mounted tanker | C | C | C | C | GW, M, ??? | ? |
| Backpack applicator | U, CSF | U, CSF | C | C | GW, M, ??? | ? |
| Harvest (late season) | C | C | U | U | ? | Medium-High |
| Under canopy sprinkler | U | U | U | U | ? | ? |
| Growth Regulators Pre-harvest | U | U | U | U | ? | ? |
| Access Road Dust Control | C | C | C | C | ? | ? |
Table II-5

| Application Mode | On/Near Soil | Elevated above Soil (Bush) | Trellised | Standard Tree | Common Source | Potential for Microbial Contamination |
| --- | --- | --- | --- | --- | --- | --- |
| Flood | U | U | U | C | S | Low (in standard tree orchard during production) |
| Furrow by feeder ditch | U | C | C | U | S, GW | Low-Medium (depending on proximity to soil) |
| Furrow by siphon tube | U | C | C | U | S, GW | Low-Medium (depending on proximity to soil) |
| Furrow by gated-pipe | U | C | C | U | S, GW | Low-Medium (depending on proximity to soil) |
| Furrow by lay-flat | U | C | C | U | S, GW, I | Low-Medium (depending on proximity to soil) |
| Overhead sprinkler | U | C | C | U | S, GW, I, RW | Low-Medium (depending on proximity to soil) |
| Overhead high-pressure sprinkler | U | U | U | U | S, GW, I, RW, M | Very High |
| Linear drop-head sprinkler | U | U | U | U | S, GW, I, RW | Medium to High (depending on environment) |
| Under canopy sprinkler | U | U | C | C | S, GW, I, RW | Medium |
| Suspended drip emitter | U | U | C | U | GW, I, M | Low |
| Surface drip tape | C | C | C | U | S, GW, I, RW | Low to Medium (depending on proximity to soil) |
| Subsurface drip tape | C | C | C | C | GW, I, RW | Low |
| Subsurface drip tape with mulch | C | C | U | U | GW, I, RW, TW | Very Low |
| Drop emitter glasshouse |  |  |  |  |  |  |
| Ebb & flow glasshouse |  |  |  |  |  |  |
| Tractor mounted tanks | C | C | C | C | GW, M, ??? | ? |
| Aerial applicator | C | C |  |  | GW, M, ??? | ? |
| Trailer mounted tanker | C | C | C | C | GW, M, ??? | ? |
| Backpack applicator | U, CSF | U, CSF | U | U | GW, M, ??? | ? |
| Harvest (late season) | U | R |  |  |  | ? |
| Overhead sprinkler | C | U | C | C | GW, S | ? |
| Under canopy sprinkler | C | U | C | C | GW, S | ? |
| Growth Regulators | C | C | C | C | GW, S | ? |
| Harvest Aide | C | C |  |  | GW, S |  |
| Thinning Aide | C | U | C | C | GW, S | ? |
| Access Road Dust Control | C | C |  |  |  | ? |
2.3. Other indirect vectors of pathogens
An additional source of risk of contamination during all production operations is the transfer of microbial pathogens by rodents, insects, and birds, particularly gulls. The limited research available clearly supports the concern that gulls transmit human pathogens from sewage outfalls and solid waste (primarily waste foodstuff) to fresh surface water (Ferns and Mudge 2000; Hall and others 1977; Levesque and others 1993; Wallace and others 1997). Salmonella was the most frequently studied pathogen due to the higher rates of human salmonellosis in adjacent communities. The link between specific Salmonella in sewage, trapped gulls, and human clinical isolations, however, was not always confirmed. Other wild birds have the potential to act as vectors of pathogenic microorganisms and may contaminate fruits and vegetables in the field. It is well known that birds are hosts for several human pathogenic microorganisms, including Campylobacter spp. (Luechtefeld and others 1980), Salmonella spp. (Jones and others 1978), Vibrio cholerae (Ogg and others 1989), Listeria spp. (Fenlon 1985), and Escherichia coli O157:H7 (Wallace and others 1997). Rodents have also been investigated for their potential to acquire human pathogens from sanitary landfill operations (Harvey and Macneil 1984).
Insects, primarily house flies (Musca domestica), have been studied in laboratory feeding and transmission experiments and in field surveys for their capacity to vector human pathogens from various sources of waste solids (Radi and others 1988; Rahn and others 1997; Cohen and others 1991; Hancock and others 1998; Iwasa and others 1999; Kobayashi and others 1999). Not surprisingly, fly vectoring of pathogens was found not to be a simple mechanical association. Aggressive fly-control programs significantly reduced the number of visits to health clinics due to pathogens such as Shigella and enterotoxigenic E. coli. Field studies were conducted on stockpiled dairy manure, fields onto which human sewage waste had been spread, and dairy environments. Fruit flies were easily contaminated with E. coli and were able to transmit the bacterium to uncontaminated apple wounds (Janisiewicz and others 1999). When surveyed (USDA 2001), the majority of farms (>90%) reported using some type of pest control method in the field (insect and/or rodent traps and sprays, field or block maintenance, or pest exclusion).
These limited data suggest that wildlife vectors may contribute to produce contamination, but research is needed to establish the extent and magnitude of this phenomenon. Likewise, the role of aerial transport in water or food contamination is unknown.