Listeria monocytogenes Risk Assessment: VI. "What If" Scenarios
FDA/Center for Food Safety and Applied Nutrition
USDA/Food Safety and Inspection Service
September 2003
The revised FDA/FSIS Listeria monocytogenes risk assessment model, taken in its entirety, describes the current status of knowledge about listeriosis and provides predictions of disease incidence based on Listeria monocytogenes concentration in foods at retail, frequency of consumption, serving size, the microorganism’s growth/survival characteristics, and storage conditions. This risk assessment model can be used to estimate the likely impact of intervention strategies by changing one or more input parameters and measuring the change in the model outputs. These changes to the model, commonly referred to as ‘what if’ scenarios, can be used to test the likely impact of new or different processing parameters or regulatory actions. ‘What if’ scenarios can also be hypothetical, not necessarily reflecting achievable changes but designed instead to show how different components of the complex model interact. Modeling specific scenarios can assist in the interpretation of a complex risk assessment model by allowing a comparison of baseline calculations to new situations. The following scenarios are intended to simulate the consequences of putative regulatory policies (i.e., possible intervention strategies) that alter one or more of the input distributions. Post-retail, at-retail, and pre-retail interventions were evaluated.
Several simulations were constructed to illustrate the relationship between concentration at consumption or at retail and predicted disease rate. These simulations used exposure models with a range of fixed concentrations. Because a separate simulation was required for each concentration point in the range, a few food category/subpopulation pairs were selected to serve as examples.
Post-Retail Interventions
This risk assessment indicates that most cases of listeriosis result from consuming high levels of Listeria monocytogenes in foods that permit growth. For a specific food, the growth is dependent on the characteristics of the food matrix and on the temperature and time allowed for growth. Microbial growth is exponential with time (i.e., linear when plotted on a logarithmic scale) until the stationary phase is approached. The levels of a microorganism after a period of growth also depend upon the initial levels. The following scenarios show how refrigeration temperature and storage time are interrelated, using selected food categories and subpopulations. The relationships demonstrated with these examples would generally apply to other foods and to the other subpopulations. Cooking is another post-retail intervention; a scenario was run to evaluate how reducing the number of frankfurters consumed without adequate reheating changes the predicted number of illnesses. The rate of illness as a function of the concentration of Listeria monocytogenes in food at the time of consumption was also examined.
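The log-linear growth described above can be sketched as a simple function. The growth rate and stationary-phase ceiling below are illustrative assumptions for the sketch, not parameters taken from the risk assessment model.

```python
def log10_count(n0_log10, rate_log10_per_day, days, stationary_log10):
    """Log-linear growth: concentration in log10 cfu/g rises linearly
    with time until the stationary-phase ceiling is reached."""
    return min(n0_log10 + rate_log10_per_day * days, stationary_log10)

# Illustrative values only (not model parameters): start at 1 log10 cfu/g,
# grow 0.2 log10 per day, stationary phase at 8 log10 cfu/g.
after_10_days = log10_count(1.0, 0.2, 10, 8.0)   # still growing
after_60_days = log10_count(1.0, 0.2, 60, 8.0)   # capped at the ceiling
```

Note how the final level depends on the initial level only until the ceiling is reached, which is why long, warm storage erases differences in starting contamination.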
Refrigerator Temperature Scenarios
These scenarios evaluate the impact of controlling refrigerators to eliminate temperatures above various limits. The baseline model used the full empirical distribution of refrigerator temperatures reported by Audits International (1999). Two types of scenarios were run:

• Limit the range of refrigeration temperatures for two food categories. The baseline models for Deli Meats and Pasteurized Fluid Milk were modified by limiting the range of refrigeration temperature values to a series of maximums from 4 to 16 °C (39 to 53 °F) and calculating the resulting annual mortality.

• Truncate refrigeration temperatures for all food categories. The baseline model for all 23 food categories was modified by truncating the refrigeration temperature at 5 °C and at 7 °C (41 and 45 °F). This scenario allows a comparison of the impact on total cases of listeriosis if the maximum refrigerator temperature could be regulated at these two specific temperatures.
Figures VI-1 and VI-2 show the estimated annual predicted mortality in the elderly subpopulation as a function of maximum temperature for Deli Meats and Pasteurized Fluid Milk, respectively. The median number of annual cases of listeriosis predicted by the baseline assumption (which includes all refrigeration temperatures up to a maximum of 16 °C) is 228 for Deli Meats and 13 for Pasteurized Fluid Milk. As the refrigerators that have higher temperatures are removed from the distribution (i.e., moving from right to left on the curve), the number of predicted cases declines. This is a consequence of removing the higher-temperature refrigerators, in which the fastest growth of Listeria monocytogenes would occur. The refrigerators with temperatures between 12 and 16 °C represent only about 1% of the refrigerators in the Audits International survey; however, they account for approximately 10% of the deaths from consumption of deli meats. At 7 °C, the removal of 12.2% of the refrigerators reduces the median mortality from deli meat consumption to 71 cases (a 68.9% reduction). For milk, the decrease in mortality is more linear than for deli meats and occurs at higher limits. Removal of refrigerators above 7 °C reduces the predicted median number of cases from milk consumption from 13 to only 2 (an 84.6% reduction). It should be noted that the relationship between maximum temperature and case rate varies among food categories. However, both examples indicate that eliminating the minority of refrigerators operating above 7 °C would greatly reduce the incidence of listeriosis. The impact on the predicted total number of cases of listeriosis, from all 23 food categories and the total United States population, of eliminating the refrigerators operating above 5 and 7 °C is shown in Table VI-1.
Limiting the refrigerator temperature to 7 °C reduces the number of cases of listeriosis by 69%, from 2105 to 656, and limiting it to 5 °C further reduces the number of cases to 28 per year (a >98% reduction). These scenarios indicate that controlling refrigerator temperature is a potentially effective means of reducing listeriosis.
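The truncation scenarios can be mimicked with a toy Monte Carlo: drop every refrigerator above the limit and recompute the share of units at growth-favoring temperatures. The temperature distribution below is a made-up stand-in for the Audits International survey data, used only to show the mechanics.

```python
import random

random.seed(1)
# Hypothetical stand-in for the empirical refrigerator survey:
# most units near 3-5 degC, with a warm tail clamped at 16 degC.
temps = [min(max(random.gauss(4.5, 2.0), -1.0), 16.0) for _ in range(10_000)]

def warm_fraction(samples, threshold):
    """Share of refrigerators at or above a growth-favoring temperature."""
    return sum(t >= threshold for t in samples) / len(samples)

baseline_warm = warm_fraction(temps, 7.0)

# The 'what if' scenario: refrigerators above the limit are removed
# from the distribution entirely, not reset to the limit.
truncated = [t for t in temps if t <= 7.0]
```

Because the removed units are exactly the ones in which Listeria monocytogenes grows fastest, a small truncated fraction can account for a disproportionate share of predicted cases, as the deli meat example above illustrates.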
Figure VI-1. Predicted Annual Mortality in the Elderly Population Attributable to Deli Meat as a Function of Maximum Storage Temperature
[The solid line represents the median estimate. The dotted lines represent the 5th and 95th percentiles of the uncertainty distribution.]
Figure VI-2. Predicted Annual Mortality in the Elderly Population Attributable to Pasteurized Milk as a Function of Maximum Storage Temperature
[The solid line represents the median estimate. The dotted lines represent the 5th and 95th percentiles of the uncertainty distribution.]
Table VI-1. Estimated Reduction of Cases of Listeriosis from Limits on Refrigeration Temperatures
Maximum Refrigerator Temperature    Cases of Listeriosis^a
                                    Median    5th Percentile    95th Percentile
Baseline^b                          2105      —^c               —^c
7 °C (45 °F) maximum                656       331               761
5 °C (41 °F) maximum                28        1                 126
^{a}Values for the median, upper and lower uncertainty levels.
^{b}The baseline uses the full empirical distribution of refrigerator temperatures from the Audits International (1999) survey.
^{c}The baseline number of cases of listeriosis is fixed based on CDC surveillance data.
Storage Time Scenarios
These scenarios evaluate the impact of changing the maximum storage time (e.g., by labeling food with “consume-by” dates). In two scenarios using Deli Meats and Pasteurized Fluid Milk, the baseline model was modified by truncating the storage time at various maximum limits. In another scenario using Smoked Seafood, the impact of extending shelf life on the predicted risks was explored. The baseline distributions were modified BetaPert distributions defined by minimum, most likely, and maximum times.
Limited Storage Times. In these scenarios, when a simulation chose a storage time longer than the scenario maximum, that simulation was assigned the maximum storage time for that scenario. These simulations assume that the food is consumed during storage up to the maximum scenario storage time and is not discarded. Simulations were run for Deli Meats and Pasteurized Fluid Milk, and the predicted annual mortality attributable to each food category was calculated for the elderly subpopulation. The scenarios tested included seven maximum storage times for deli meats (4, 7, 10, 14, 17, 21, and 28 days) and four maximum storage times for milk (4, 7, 10, and 14 days). The baseline maximum storage time is 28 days for deli meats and 14 days for milk.
Results from the simulations are presented in Figures VI-3 and VI-4. The baseline risk assessment is shown at the right of each curve (28 days for deli meats and 14 days for milk). Limiting the storage time for deli meat from the 28-day baseline to 14 days reduces the median number of cases of listeriosis in the elderly population from 228 to 197 (13.6%), and shortening storage time to 10 days further reduces the cases to 154 (32.5%). For milk, reducing the maximum storage time from the 14-day baseline to 4 days reduced the annual number of listeriosis cases from 13.3 to 7.5 (43.6%). The dependence of predicted risk on storage time varies across food categories. Reducing maximum storage time appears to be less effective at reducing risk than reducing the refrigerator temperature for the deli meat and milk examples. Other storage time scenarios with other food categories would produce different results; for example, the reduction in cases of listeriosis might be greater if foods stored beyond the maximum scenario storage time were discarded instead of consumed on the last day.
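The clipping rule and the quoted percentage reductions can be verified with a minimal sketch; the storage-time list below is illustrative, while the case counts come from the text above.

```python
def clip_storage_times(times, max_days):
    """Scenario rule: any sampled storage time beyond the maximum
    collapses to the maximum (the food is consumed, not discarded)."""
    return [min(t, max_days) for t in times]

# Illustrative storage draws, in days; the 30-day draw is clipped to 14.
clipped = clip_storage_times([2, 9, 30], 14)

def percent_reduction(baseline, scenario):
    """Percentage drop from the baseline number of cases."""
    return 100 * (baseline - scenario) / baseline
```

Applying percent_reduction to the deli meat counts (228 to 197, and 228 to 154) and the milk counts (13.3 to 7.5) reproduces the 13.6%, 32.5%, and 43.6% figures cited above.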
Figure VI-3. Predicted Annual Mortality in the Elderly Subpopulation Attributable to Deli Meats as a Function of Maximum Storage Time
[The solid line represents the median estimate. The dotted lines represent the 5th and 95th percentiles of the uncertainty distribution.]
Figure VI-4. Predicted Annual Mortality in the Elderly Subpopulation Attributable to Pasteurized Milk as a Function of Maximum Storage Time
[The solid line represents the median estimate. The dotted lines represent the 5th and 95th percentiles of the uncertainty distribution.]
Extended Storage Times. A storage scenario was conducted using Smoked Seafood to estimate the impact of a lengthened storage time on the predicted risks per serving and cases per annum for the elderly subpopulation. The estimates from the current 2003 risk assessment used the best estimates of the expert panel for the variation and uncertainty in the home storage times. A modified BetaPert distribution for the 2003 risk assessment had minimum, most likely and maximum values, with uniform uncertainty ranges, of 0.5 days, 3 to 5 days, and 15 to 30 days, respectively. For the extended storage time scenario, the modified BetaPert distribution was defined as 0.5 days (minimum), 6 to 10 days (most likely), and 15 to 45 days (maximum).
The distribution for the extended storage scenario is the same one used in the 2001 draft risk assessment for Smoked Seafoods. However, the calculated values are not the same as in the draft risk assessment because other data sets that are part of the calculation (such as contamination and growth data) have been revised and updated for the 2003 risk assessment.
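A standard Beta-PERT sampler (lambda = 4) is sketched below as an approximation of the storage-time inputs. The assessment uses a *modified* BetaPert with uncertainty ranges on the mode and maximum; those refinements, and the single mode/maximum values chosen here, are assumptions of this sketch.

```python
import random

def pert_sample(minimum, mode, maximum, rng=random):
    """Draw one value from a standard Beta-PERT distribution
    (shape parameter lambda = 4); an approximation of the
    modified BetaPert used in the assessment."""
    span = maximum - minimum
    alpha = 1 + 4 * (mode - minimum) / span
    beta = 1 + 4 * (maximum - mode) / span
    return minimum + rng.betavariate(alpha, beta) * span

random.seed(0)
# Baseline-like smoked seafood storage: min 0.5 d, mode 4 d, max 20 d
# (single values picked from the stated ranges, for illustration).
baseline = [pert_sample(0.5, 4, 20) for _ in range(5000)]
# Extended-storage scenario: mode shifted to 8 d, max to 30 d.
extended = [pert_sample(0.5, 8, 30) for _ in range(5000)]
```

Shifting the mode and maximum upward moves the whole storage-time distribution to longer durations, which is what drives the increased risk per serving reported below.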
The median and mean risks per serving and cases per annum are given in Table VI-2, with 5th and 95th percentile values indicating the uncertainty distributions for the calculated risks. The median risk per serving for the elderly subpopulation increased from the baseline value of 1.9 x 10^-8 to the extended storage time value of 5.0 x 10^-8 cases per serving, an increase of about 2.5 times. The median storage time increased from 5.3 to 9.3 days, and the percentage of servings that exceeded 10 days of storage increased from 9 to 43%. The uncertainty range for the baseline scenario from the 5th to 95th percentile was approximately three orders of magnitude. The mean risk per serving increased about 58% with the longer storage times. The estimates of the cases per annum follow the changes in risks per serving because the same dose-response relationship and number of servings are used in each scenario. The median number of cases per annum increases from 0.8 with the baseline scenario to 2.1 with the extended storage time scenario, and the mean number of cases per annum increases from 10.6 to 17.
The difference between the median and mean reflects the skewed shape of the uncertainty distributions. The median indicates where the center of the distribution is and where the values tend to congregate. The mean is larger because it is more affected by the few high values than the median is; however, it does indicate the central tendency of repeated samplings of the distribution and can be viewed as the “average” value if the cases per annum were tracked over a number of years. The mean risk per serving and risk per annum for each food category are provided in Appendix 10.
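The median/mean gap described here is characteristic of right-skewed distributions. A toy lognormal sample shows the effect; the parameters are arbitrary stand-ins, not taken from the model's uncertainty distributions.

```python
import random
from statistics import mean, median

random.seed(42)
# Arbitrary right-skewed sample standing in for a cases-per-annum
# uncertainty distribution; not the model's actual output.
draws = [random.lognormvariate(0, 1.5) for _ in range(20_000)]

sample_median = median(draws)   # where values congregate
sample_mean = mean(draws)       # pulled up by the few large draws
```

For this lognormal, the few very large draws pull the mean well above the median, just as the per annum mean of 10.6 sits far above the median of 0.8 in Table VI-2.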
The comparison for Smoked Seafood agrees with the truncated storage time scenarios used in the Deli Meats and Pasteurized Fluid Milk examples. Extending the storage time of a food that supports growth increases the probability that listeriosis will occur.
Table VI-2. Impact of Home Refrigerator Storage Times on the Number of Predicted Cases of Listeriosis Attributed to Smoked Seafood for the Elderly Subpopulation
Parameter                        Number of Predicted Cases of Listeriosis
                                 Current 2003^a    ‘What if’ Scenario^b

Per Serving Basis
Median                           1.9 x 10^-8       5.0 x 10^-8
Lower bound (5th percentile)     9.7 x 10^-10      2.7 x 10^-9
Upper bound (95th percentile)    1.0 x 10^-6       1.8 x 10^-6
Mean                             2.6 x 10^-7       4.1 x 10^-7

Per Annum Basis
Median                           0.8               2.1
Lower bound (5th percentile)     <0.1              0.1
Upper bound (95th percentile)    43                74
Mean                             10.6              17
^{a}For the current 2003 risk assessment, the assumed storage time was a distribution with minimum of 0.5 days, most likely of 3 to 5 days, and maximum of 15 to 30 days.
^{b}For the ‘What if’ Scenario, the assumed storage time was a distribution with minimum of 0.5 days, most likely of 6 to 10 days, and maximum of 15 to 45 days. [Note this was the storage time used for the draft 2001 risk assessment.]
Storage Time and Temperature Interaction Scenario
As an example of the potential impact of dual interventions, the interaction of modifying both storage time and temperature on the predicted annual mortality in the elderly subpopulation attributed to Deli Meats was simulated. The baseline models were adjusted in the same manner as for the individual interventions. Results for the temperature and time interaction are shown in Figure VI-5.
The median estimates from the uncertainty distribution are plotted for each storage duration series. The baseline model estimated the upper right value, 228 cases, as shown in Figure VI-5. Each line represents a range of maximum storage times at a maximum refrigerator temperature. Achieving a 50% reduction in cases of listeriosis from consumption of deli meats would require eliminating all storage above approximately 8 °C or all storage times longer than 8 days. An example of a combination that would reduce cases of listeriosis by 50% is 10 °C and 11 days.
Figure VI-5. Predicted Annual Mortality in the Elderly Subpopulation Attributable to Deli Meats as a Function of Maximum Storage Time and Maximum Storage Temperature
Cooking Scenario
Cooking is a post-retail intervention. Because cooking is an effective method of killing Listeria monocytogenes, the risk from unreheated frankfurters is much greater than from adequately reheated frankfurters. A simulation was run to estimate the consequence of an intervention that reduces the number of frankfurters consumed without adequate reheating. The baseline assumption, a triangular distribution with an uncertainty range (minimum 4%, most likely 7%, and maximum 10% of frankfurters consumed without reheating), was replaced with values of 2%, 4%, and 6% (minimum, most likely, and maximum, respectively). The impact of this change was to reduce the predicted median number of cases of listeriosis by approximately 42%, from 31 to 18 (Table VI-3).
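Because the consumption input is a triangular distribution, its mean is simply (minimum + mode + maximum)/3, so the intervention's effect on the average unreheated fraction can be checked directly. The sampling call is a sketch using the standard library; note its argument order.

```python
import random

def triangular_mean(low, mode, high):
    """Mean of a triangular distribution."""
    return (low + mode + high) / 3

baseline_pct = triangular_mean(4, 7, 10)   # mean % unreheated, baseline
scenario_pct = triangular_mean(2, 4, 6)    # mean % unreheated, scenario

# Sampling sketch: random.triangular takes (low, high, mode).
random.seed(0)
draw = random.triangular(2, 6, 4)
```

The scenario cuts the mean unreheated fraction from 7% to 4%, a 4/7 ratio that is consistent with the drop in median predicted cases from 31 to 18.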
Table VI-3. Scenario Testing: Reducing the Estimated Consumption of Unreheated Frankfurters
Scenario                 Predicted Number of Cases of Listeriosis
                         Median    5th Percentile    95th Percentile
Baseline^a               31        3.3               250
Reduced Consumption^b    18        2.2               133
^{a}The baseline model uses a triangular distribution with a minimum of 4%, most likely of 7%, and maximum of 10% of frankfurters consumed without reheating.
^{b}The reduced consumption scenario assumes a triangular distribution with a minimum of 2%, most likely of 4%, and maximum of 6% of frankfurters consumed without reheating.
Disease Rate as a Function of Concentration Levels at the Time of Consumption
This simulation utilizes the main elements of the dose-response simulation and the serving size component from the exposure simulation. Figure VI-6 illustrates the relationship between Listeria monocytogenes concentration at the time of consumption and mortality for Deli Meats; it is derived from Figure IV-8 and the serving size distribution for deli meats. Since the only food category-specific component is serving size, a similar relationship would be expected for other food categories.
Pre-Retail and At-Retail Interventions
Reduction of the Number of Organisms Scenarios
Interventions might also be designed to reduce the number of Listeria in food before it is sold. There are a variety of ways in which this might be done. Effectively modeling a pre-retail intervention may require expanding the modeling effort to include the step at which the intervention takes place. However, a common method of representing or measuring an intervention that kills bacteria (e.g., pasteurization, cooking) is to calculate the number of surviving bacteria as a fraction of the initial number. Since the surviving fraction may be very small, the effectiveness of a kill step may be represented as a log reduction of cfu, where 10% survival represents a 1 log reduction, 1% survival a 2 log reduction, etc. To model an intervention that is measured this way, scenarios were run to calculate the predicted reduction in the number of cases in the elderly population attributable to deli meats as a function of the reduction in cfu prior to retail. This means that for a 1 log reduction, the distribution of servings containing a given number of Listeria monocytogenes at retail was shifted to values one log lower. For example, the 10^3 cfu/g level, which represented 0.5% of the servings, was shifted to 10^2 cfu/g. The contamination was not truncated at any specific cfu/g level, so high contamination levels could still occur, but they would be observed less frequently compared with the baseline simulations. The growth after retail was modeled in the same manner as in the baseline model. The results are displayed in Figure VI-7.
Figure VI-7. Reduction of Predicted Annual Mortality in the Elderly Subpopulation Attributable to Deli Meats as a Function of Log Kill Achieved by the Inclusion of a Lethal Intervention Prior to Retail
[The solid line represents the median estimate. The dotted lines represent the 5th and 95th percentiles of the uncertainty distribution.]
The scenarios shown in Figure VI-7 indicate that inclusion of a treatment that produced a 1 log reduction in contamination at retail would reduce the number of predicted deaths in the elderly population attributed to Deli Meats by nearly 50%, from 227 to 120. Reducing contamination by two logs would result in a 74% reduction. This reduction could be achieved by a number of different means, such as reduced contamination of raw materials, more effective sanitation, or a process step that results in some lethality.
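The log-kill bookkeeping used in these scenarios can be sketched directly: a k-log lethality step multiplies every contamination level by 10^-k, which on the log10 scale is a downward shift of the whole retail distribution. The example levels below are illustrative.

```python
def surviving_fraction(log_kill):
    """A k-log kill step leaves 10**-k of the original cfu."""
    return 10.0 ** -log_kill

def shift_retail_distribution(log10_levels, log_kill):
    """Shift each serving's log10 cfu/g down by the log kill.  The
    distribution is shifted rather than truncated, so high levels
    remain possible but become rarer."""
    return [level - log_kill for level in log10_levels]

# Under a 1 log kill, a serving at 10^3 cfu/g moves to 10^2 cfu/g.
shifted = shift_retail_distribution([3.0, -1.0], 1)
```

This shift-not-truncate behavior is why high contamination levels can still occur after the intervention, only less frequently.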
Estimations of Risk per Serving from Specific cfu/g at Retail Scenarios
The ability of Listeria monocytogenes to grow in a food is associated with the likelihood of that food causing illness. The following scenario provides insight on how the contamination level at retail in a food that supports growth affects the risk of listeriosis per serving. This example is based on Deli Meats and the elderly population, where the contamination level is a single value, not a distribution with variation and uncertainty as in the other examples (Figure VI-8). Since the actual number of cases depends on the number of servings, only the case rate per serving is used as the endpoint.
There is wide variation in growth resulting from the combination of exponential growth rate, temperature, time, and maximum levels, but some servings will grow to populations having a high likelihood of causing illness. The level of Listeria monocytogenes is the determining factor in the resulting risk per serving. For example, if a 56-g serving that has one Listeria monocytogenes per gram at retail (i.e., 0 log10 cfu/g, or approximately 56 total Listeria monocytogenes per serving) grows as described by the baseline model, it will result in a risk per serving of 1.1 x 10^-6 (–5.96 log10, or approximately 1 death per million servings). For a 56-g serving with 100 cfu/g at retail, the model predicts a modest increase in the likelihood of death (1.3 x 10^-6 deaths per serving). Conversely, if a 10-g serving has one Listeria monocytogenes per gram, the model predicts a risk of 1.0 x 10^-6 (–6.0 log10), and for a 100-g serving, the model predicts a reduction of the risk to 0.71 x 10^-6 (–6.15 log10). These relatively small changes in risk despite a tenfold change in contamination level are a consequence of the expected post-retail growth of Listeria monocytogenes in food before consumption.
Given the refrigerator temperature and storage time distributions, the relatively low numbers at retail have the potential to grow, in a sufficient fraction of servings, to levels at the time of consumption such that the overall risk is in the range of 10^-6 per serving. Reducing the levels from 10^3 to 10^2 and even to 10^0 cfu/g reduces the risk, but not by much. Only when the contamination level decreases to less than one Listeria monocytogenes per package does the risk fall in proportion to the frequency of contamination (10^-3 cfu/g decreases the risk to 0.02 x 10^-6 per serving). This implies that in foods that support growth, reducing contamination to some specified level (but not zero) is not adequate by itself to control the risk of listeriosis.
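A toy Monte Carlo illustrates why the risk per serving is so insensitive to the retail level in a food that supports growth: servings stored long and warm reach the stationary-phase ceiling regardless of whether they started at 1 or 100 cfu/g, and those servings dominate the risk. Every parameter below (ceiling, storage-time and growth-rate distributions) is invented for illustration, not taken from the model.

```python
import random

random.seed(7)
CEILING = 8.0   # assumed stationary-phase maximum, log10 cfu/g

def fraction_at_ceiling(start_log10, n=50_000, rng=random):
    """Fraction of servings that reach the stationary-phase ceiling
    under randomly drawn storage time and growth rate (toy
    distributions standing in for temperature/time variation)."""
    at_ceiling = 0
    for _ in range(n):
        days = rng.uniform(0, 28)
        rate = rng.uniform(0.0, 0.5)   # log10 per day, temperature proxy
        if start_log10 + rate * days >= CEILING:
            at_ceiling += 1
    return at_ceiling / n

low_start = fraction_at_ceiling(0.0)    # 1 cfu/g at retail
high_start = fraction_at_ceiling(2.0)   # 100 cfu/g at retail
# Despite a 100-fold difference at retail, the two fractions differ
# by well under an order of magnitude: growth, not the retail level,
# controls how many servings reach high doses.
```

Only when contamination falls below one organism per package, so that most servings have nothing to grow, does the risk drop in proportion, matching the conclusion above.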
Fresh Soft Cheese Made from Unpasteurized Milk Scenario
Unlike the 2001 draft risk assessment, the revised risk assessment indicates that the risk from Fresh Soft Cheese is low. This change is largely attributable to the inclusion of additional new data indicating a very low prevalence rate in this food category. However, there is a strong epidemiological correlation between Hispanic-style fresh soft cheese (Queso Fresco) and listeriosis. A likely explanation for this discrepancy is that the data collected for this category are not representative of the cheese linked to the disease (i.e., fresh soft cheese made from raw, unpasteurized milk). In particular, although most commercial sources of fresh soft cheese are manufactured from pasteurized milk, some sources of queso fresco are made from raw milk.
To characterize the risk from queso fresco made from raw milk, the exposure model was constructed using the same analog as in the 2001 draft risk assessment – soft unripened cheese made from raw milk (Loncarevik et al., 1995), in which 50% of the samples tested were positive. A data set for the contamination distribution was developed using the methodology described in the Exposure Assessment chapter, using the default range of 2 to 5 geometric standard deviations and applying a correction factor for overestimation from older data. The same growth and storage parameters were used as in the baseline estimation.
The estimated risk per serving for two sensitive populations is presented in Table VI-4. The risk per serving was 43 times greater for the perinatal population and 36 times greater for the elderly population when cheeses were assumed to be made from unpasteurized rather than pasteurized milk. The tested ‘high prevalence’ scenario increased the predicted risk on a per serving basis from a low to a high risk.
Table VI-4. Comparison of Baseline and High Prevalence Scenario Risks per Serving for Fresh Soft Cheese for Two Subpopulations
Population    Median Predicted Risk per Serving (5th and 95th percentiles)
              Baseline^a                                  High Prevalence^b
Perinatal     4.7 x 10^-9 (3.0 x 10^-11, 9.8 x 10^-8)     2.0 x 10^-7 (5.1 x 10^-9, 5.3 x 10^-6)
Elderly       2.8 x 10^-10 (1.3 x 10^-12, 4.5 x 10^-9)    1.0 x 10^-8 (3.2 x 10^-10, 2.3 x 10^-7)
^{a}Baseline uses a prevalence distribution based on available survey data.
^{b}The High Prevalence scenario assumes that 50% of the samples tested are positive.
Disease Rate as a Function of Concentration Levels Measured at Retail
To simulate the relationship between Listeria monocytogenes concentration at retail and public health, the growth component of the exposure assessment is also included. Since the growth model differs significantly across food categories, examples for both high (Deli Meats) and low (Hard Cheese) growth are shown in Figures VI-9, VI-10, and VI-11. Comparison of Figures VI-9 (elderly) and VI-10 (neonatal) suggests that similar dose-response relationships may be expected for different subpopulations. However, comparison of Figures VI-9 (Deli Meats) and VI-11 (Hard Cheese) indicates that the growth component of the model for a particular food category can have a large influence on the relationship between concentration at retail and the rate of listeriosis. Foods with high growth rates (such as Deli Meats) exhibit a relatively flat curve, suggesting that the number of cases is only slightly dependent on initial concentration. On the other hand, low-growth foods (such as Hard Cheese) show a substantial increase in the disease rate as the concentration increases. This suggests that for foods that support growth, above some minimum concentration the risk is largely determined by the growth that occurs subsequent to purchase.
Pasteurized Fluid Milk Scenarios
The primary intervention for milk is pasteurization. Differences in pasteurization requirements and handling practices among different countries could result in different frequencies and amounts of Listeria monocytogenes in milk at consumption. The Pasteurized Fluid Milk food category contains 30 studies, including 3 studies conducted in the United States. There are a total of 12,407 fluid milk samples, including whole milk, low fat milk, skim milk, and chocolate milk. All of the milk samples are from cows, except for a single sample of goat milk. The average percentage of positive samples across the 30 studies is 0.4%. As with all of the food categories, the data were weighted for location, study age, and study size.
A ‘what if’ analysis was conducted to evaluate the impact of including non-U.S. studies and chocolate milk in this food category. The results for the three subpopulations and the total U.S. population are presented in Tables VI-5 and VI-6. Excluding non-U.S. milk and chocolate milk has little impact on the predicted number of cases of listeriosis attributed to Pasteurized Fluid Milk on both a per serving and a per annum basis.
Table VI-5. Predicted Median Cases of Listeriosis per Serving Attributed to Pasteurized Fluid Milk

Scenario                                  Median Cases of Listeriosis per Serving
                                          Intermediate-Age    Perinatal       Elderly        Total
Baseline                                  4.4 x 10^-10        1.6 x 10^-8     3.4 x 10^-9    1.0 x 10^-9
Domestic Milk Only                        3.7 x 10^-10        8.0 x 10^-7     2.9 x 10^-9    8.8 x 10^-10
Domestic Milk (excluding chocolate milk)  3.8 x 10^-10        8.4 x 10^-7     3.0 x 10^-9    9.3 x 10^-10
Domestic Chocolate Milk Only              4.2 x 10^-10        9.4 x 10^-7     3.4 x 10^-9    5.8 x 10^-10
Table VI-6. Predicted Median Cases of Listeriosis per Annum Attributed to Pasteurized Fluid Milk

Scenario                                  Median Cases of Listeriosis per Annum
                                          Intermediate-Age    Perinatal    Elderly    Total
Baseline                                  31.4                8.0          49.8       90.8
Domestic Milk Only                        27                  6.7          43         77
Domestic Milk (excluding chocolate milk)  26                  6.9          45         78
Domestic Chocolate Milk Only              1.2                 0.3          0.2        1.7
Summary
In these scenarios, selected food categories (Deli Meats, Frankfurters, Fresh Soft Cheese, Pasteurized Fluid Milk, Smoked Seafood, and Hard Cheese) were used as examples. Other foods, which permit different rates of growth and are stored for different lengths of time, may give different results, but the general interrelationships are representative of other food categories. These scenarios, compared with the baseline estimations of risk, illustrate the impact of storage time, storage temperature, and contamination level on the risks per serving.

• Reducing the range of refrigerator temperatures by eliminating storage at the highest temperatures reduced the predicted cases of listeriosis by reducing growth of Listeria monocytogenes in the foods that permit growth.

• Eliminating the longest storage times reduced the number of cases of listeriosis, even with the full range of storage temperatures and contamination levels. However, reducing a percentage of the longest storage times appeared to be less effective than reducing the corresponding percentage of the highest storage temperatures, unless the storage time between retail and consumption is reduced to a very short duration.

• Reducing the overall frequency of high levels of contamination will reduce the number of cases, particularly when the frequencies of the highest contamination levels are reduced. However, growth can occur from relatively low contamination levels at retail to levels at consumption that are likely to cause illness. Thus, in foods that permit growth, reducing the Listeria monocytogenes at or before retail to less than some specified nonzero level will not eliminate the risk.