This report presents a review of risk characterisation, the final step in risk assessment of exposures to food chemicals. The report is the second publication of the project "Food Safety in Europe: Risk Assessment of Chemicals in the Food and Diet (FOSIE)". The science underpinning the hazard identification, hazard characterisation and exposure assessment steps has been published in a previous report (Food Safety in Europe, 2002). Risk characterisation is the stage of risk assessment that integrates information from exposure assessment and hazard characterisation into advice suitable for use in decision-making. The focus of this review is primarily on risk characterisation of low molecular weight chemicals, but consideration is also given to micronutrients and nutritional supplements, macronutrients and whole foods. Problem formulation, as discussed here, is a preliminary step in risk assessment that considers whether an assessment is needed, who should be involved in the process and in subsequent risk management, and how the information will provide the necessary support for risk management. In this step an evaluation is made of whether data are available, what level of resources is needed, and the timeline for completing the assessment. The report describes good evaluation practice as an organisational process and the necessary conditions under which risk assessment of chemicals should be planned, performed, scrutinised and reported. The outcome of risk characterisation may be quantitative estimates of the risks, if any, associated with different levels of exposure, or advice on particular levels of exposure that would be without appreciable risk to health, e.g. a guidance value such as an acceptable daily intake (ADI). It should be recognised that risk characterisation is often an iterative and evolving process. Historically, different approaches have been adopted for the risk characterisation of threshold and non-threshold effects. 
The hazard characterisation for threshold effects involves the derivation of a level of exposure at or below which there would be no appreciable risk to health if the chemical were to be consumed daily throughout life. A guidance value, such as the ADI, is derived from the no-observed-adverse-effect-level (NOAEL) or other starting point, such as the benchmark dose (BMD), by the use of an uncertainty or adjustment factor. In contrast, for non-threshold effects a quantitative hazard estimate can be calculated by extrapolation, usually in a linear fashion, from an observed incidence within the experimental dose-response range to a given low incidence at a low dose. This traditional approach is based on the assumption that there may not be a threshold dose for effects involving genotoxicity. Alternatively, for compounds that are genotoxic, advice may be given that exposure should be reduced to as low as reasonably achievable (ALARA) or practicable (ALARP). When a NOAEL can be derived from a study in humans, it would be used in the derivation of guidance values or advice. However, there may be uncertainties related to the possible role of confounders and the precision of both the incidence and exposure data. Individuals may be at increased risk because of their greater exposure or their greater sensitivity. Risk characterisation should include information not only on the general population, but also on any subpopulation considered to be potentially susceptible. Risk characterisation considers both individuals with average exposures and those with high exposures. High exposure may be related to life stage, cultural practices and/or qualitative and/or quantitative food preferences. Inter-individual differences in toxicokinetics are an important source of variability in response. 
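The two approaches just described can be sketched as simple arithmetic. The following is a minimal illustration, not from the report; the NOAEL, uncertainty factor and incidence values are invented for the example.

```python
# Hypothetical illustration of (a) deriving a guidance value (ADI) from a NOAEL
# using an uncertainty factor, and (b) linear low-dose extrapolation for a
# non-threshold effect. All numbers are invented for illustration only.

def adi_from_noael(noael_mg_per_kg_day: float, uncertainty_factor: float = 100.0) -> float:
    """ADI = NOAEL / uncertainty factor.

    The default factor of 100 reflects the commonly used 10-fold allowance for
    interspecies differences times 10-fold for inter-individual variability.
    """
    return noael_mg_per_kg_day / uncertainty_factor

def linear_extrapolated_risk(observed_incidence: float, observed_dose: float,
                             low_dose: float) -> float:
    """Linear extrapolation: assume risk is proportional to dose below the
    observed dose-response range."""
    slope = observed_incidence / observed_dose  # risk per unit dose
    return slope * low_dose

# NOAEL of 10 mg/kg bw/day with a 100-fold factor gives an ADI of 0.1 mg/kg bw/day.
adi = adi_from_noael(10.0)

# 10% incidence observed at 50 mg/kg bw/day, extrapolated down to 0.001 mg/kg bw/day.
low_dose_risk = linear_extrapolated_risk(0.1, 50.0, 0.001)
```

In practice the uncertainty factor is chosen case by case, and the starting point may be a benchmark dose rather than a NOAEL; the linear model is only one of several extrapolation models in use.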
This may arise from differences in genetic constitution or environmental influences including diet, nutritional status, physiological status such as pregnancy, as well as patho-physiological states. Studies undertaken for hazard identification and characterisation investigate a substance in isolation, and not in combination with other substances to which humans may be exposed at the same time. It is recognised that food represents an extremely complex mixture of substances. In general, the available data indicate that interactions between chemicals in food are unlikely to be a significant problem for health. However, attention needs to be focused during risk characterisation on substances that share a common mode of action. The patterns of human exposure to chemicals in food may be chronic (usually low-level), short-term (often at higher levels) or chronic low-level with occasional high intakes. This may necessitate the development of guidance values for acute exposures (the acute reference dose, ARfD) based on shorter-term studies, in addition to an ADI, which is usually based on chronic studies. The possibility of increased risks of chronic adverse effects associated with long-term low-level exposure, combined with occasional peak exposures, has generally been handled by averaging such exposures. The significance of intakes above the ADI is difficult to assess. Consideration in this respect has to be given to the nature of the effect, the magnitude of the excessive intake, as well as the duration of excessive intake, in relation to the half-life of the compound in the body and the associated body burden. An intake above the ADI may not necessarily be associated with significant adverse health outcomes, since the ADI is usually based on chronic intake and incorporates a safety margin. However, an intake above the ADI erodes the safety margin, reducing it in proportion to the ratio of the ADI to the predicted excess intake. 
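The averaging of occasional peak exposures and the erosion of the safety margin by intakes above the ADI can both be illustrated numerically. The sketch below uses invented values and is not taken from the report.

```python
# Hypothetical illustration of (a) the safety margin remaining at a given intake,
# and (b) time-weighted averaging of chronic low-level exposure with occasional
# peak intakes. All values are invented for illustration only.

NOAEL = 10.0          # mg/kg bw/day (hypothetical)
ADI = NOAEL / 100.0   # 0.1 mg/kg bw/day, using a 100-fold uncertainty factor

def remaining_safety_margin(intake: float, noael: float = NOAEL) -> float:
    """Margin between the NOAEL and the actual intake.

    At an intake equal to the ADI the margin here is 100; an intake above the
    ADI shrinks it in proportion to ADI / intake.
    """
    return noael / intake

def time_weighted_average(chronic_intake: float, peak_intake: float,
                          peak_days_per_year: int) -> float:
    """Average daily intake over a year, combining a chronic background level
    with a number of higher-intake days."""
    peak = peak_intake * peak_days_per_year
    background = chronic_intake * (365 - peak_days_per_year)
    return (peak + background) / 365.0

# An intake of 0.2 mg/kg bw/day (twice the ADI) halves the margin from 100 to 50.
margin = remaining_safety_margin(0.2)

# Background intake of 0.05 mg/kg bw/day with 10 days per year at 0.5 mg/kg bw/day.
avg_intake = time_weighted_average(0.05, 0.5, 10)
```

Such averaging is only appropriate where chronic effects are at issue; for acute effects, peak intakes are compared with the ARfD rather than averaged away.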
Alternative approaches to assessment of the significance of intakes above the guidance value are provided by categorical regression analysis and probabilistic methods. For non-threshold effects, such as some cancers, which have undergone risk characterisation by the use of quantitative low-dose hazard extrapolation, any increase in risk with increased exposure can be readily interpreted using the same mathematical model. The narrative that accompanies the risk characterisation should explain the strengths and limitations of the data. When risk characterisation is based on animal data, the validity of such data needs to be affirmed and reported. Also, uncertainties associated with the extrapolation of data from studies in animals to predict human risk should be presented. Uncertainty can be expressed numerically when intake assessment and hazard characterisation are based on mathematical calculations and/or empirical distributions. Such numerical analyses can also be subjected to sensitivity analyses, to test the contribution of different aspects of the database to overall uncertainty. Knowledge regarding the influence of human genetic polymorphisms on toxic responses is advancing rapidly. This has led to increasing concern that the default uncertainty factor may not provide adequate protection in the case of certain polymorphisms. The methods used for risk characterisation of low molecular weight chemicals are applicable in many respects to micronutrients. However, there are some unique aspects, the most obvious ones being that some intake is essential for life and that the margins between essential intakes and toxic intakes may, for a number of micronutrients, be small. Since both deficiency and excess of a micronutrient can cause health problems, two guidance values for a micronutrient may be expressed. 
The setting of a tolerable upper intake level (UL) includes consideration of what does not cause physiological perturbations as well as consideration of the probability of an adverse effect occurring at some specified level of exposure. Macronutrients, such as dietary lipids, proteins and carbohydrates, may be present in the food/diet in substantial amounts. Consideration needs to be given in hazard characterisation of macronutrients to tolerance and to toxicological and nutritional impact. Hazard characterisation using animal studies may not be possible because the addition of bulky macroingredients to experimental diets, in amounts that are exaggerated relative to the human diet, may render such diets unpalatable and/or cause nutritional imbalance. Because of this, human trials and observational studies are widely viewed as particularly important for macronutrients, addressing toxicokinetics, nutritional issues and tolerance. Observational epidemiological studies may also help to identify adverse effects. As for micronutrients, there may need to be more than one guidance value for a macronutrient. In certain instances, a margin of safety approach may be preferable, when it is not possible in animal studies to exaggerate the dosage sufficiently to accommodate the usual uncertainty factors. This project also addresses hazard and risk characterisation related to whole foods, except those based on GM technology, which are dealt with in another European Union (EU) project, ENTRANSFOOD. Whole foods may be foods currently on the market or novel foods for which approval is being sought. There is as yet no world-wide consensus on the most appropriate approaches to hazard and risk characterisation of whole foods, other than to recommend that a case-by-case consideration and evaluation is needed. 
The initial approach to novel foods requires consideration of the extent to which the novel food differs from any traditional counterparts, or other related products, and hence whether it can be considered as safe as traditional counterparts/related products (the principle of substantial equivalence). As for macronutrients, epidemiological data identifying adverse effects, including allergic reactions, may also exist. Human trials on whole foods, including novel foods, will only be performed when no serious adverse effects are expected.