Open Access
Issue
Radioprotection
Volume 55, May 2020
Coping with uncertainties for improved modelling and decision making in nuclear emergencies. Key results of the CONFIDENCE European research project
Page(s) S175 - S180
Section DECISION MAKING UNDER UNCERTAINTIES
DOI https://doi.org/10.1051/radiopro/2020029
Published online 20 May 2020

© The Authors, published by EDP Sciences 2020

Licence Creative Commons. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

1 Introduction

Emergencies inevitably involve uncertainties; threatened or actual offsite releases of radiation from nuclear facilities are no exception. In the threat and early release phases, the source term, its strength, time profile and composition are hugely uncertain (Mathieu et al., 2018a). Meteorological and hydrological uncertainties further confound predictions of the dispersion of the contamination (Wellings et al., 2018). Additional uncertainties enter into predictions of external dose and of dose via food chains, and the effectiveness of countermeasures and their implementation, including public compliance, increases the uncertainty further (Howard et al., 2005; Nisbet et al., 2005). However, these are only some of the uncertainties faced by emergency managers and recovery teams in responding to a nuclear accident. The modelling and analysis of dose, while reducing some of the uncertainties, introduces further ones related to model choices and computation (Haywood, 2010; Haywood et al., 2010). Then there are ambiguities and value uncertainties that arise when managers try to contextualise emergency plans, with their imperatives to minimise the risk to “human health” or some similar objective, to the specifics of the accident and those affected (French et al., 2016). In short, many uncertainties need to be addressed in the decision making.

This paper builds on existing literature on uncertainty types (French, 1997; Snowden, 2002; Walker et al., 2003), and discusses various forms of uncertainty, how they arise and how we might analyse them in the context of emergency management. Examples from CONFIDENCE work are included to contextualise the challenges and possible approaches to addressing different uncertainty types. We believe that we need to be clearer in our understanding and communication of uncertainties if we are to support emergency managers in their tasks and decision making.

2 Uncertainties in nuclear emergency management

The range of uncertainties that needs to be considered has been in focus for decades (French, 1997), but until recently they have not been addressed comprehensively in the development of tools and procedures for nuclear emergencies. Indeed, current decision procedures tend to ignore uncertainties and focus on an expected or reasonable worst case (French et al., 2016). The CONFIDENCE project had the overriding aim of identifying, addressing and communicating this myriad of uncertainties in a much more comprehensive and effective way.

There are many ways of categorising uncertainty (see, e.g., Berkeley and Humphreys, 1982; French, 1995; Snowden, 2002), and no real agreement on how to do so. This was also apparent during early discussions within the CONFIDENCE project. Some proposed that as scientists, we should use the term more tightly, restricting it to contexts in which probability modelling is appropriate. Recognising the many dimensions of uncertainty, as well as the fact that emergency managers and stakeholders use the term very widely (Perko et al., 2019), we adopted a broader but pragmatic approach. Drawing from existing literature on uncertainty categorisation, we briefly describe nine types of uncertainty, recognising that the list is not exhaustive and that the categories are not mutually exclusive.

2.1 Stochastic or aleatory uncertainty

Many uncertainties arise from randomness in physical behaviours or from natural variation in populations, and are referred to as stochastic or aleatory uncertainties. In such situations, we have some knowledge about the potential outcomes, but we are not certain about which of these will be realised. Whether the world is truly random or whether it is so complex that the slightest variation in conditions can dramatically affect the outcome of a deterministic behaviour does not matter. What matters is that we cannot predict an outcome with certainty: we need probability. There is general agreement across the scientific and lay communities that probability models are the appropriate means of describing uncertain behaviours in physical systems (Barnett, 1999). Similarly, probability is the accepted way of modelling natural variation in a population, for instance when estimating cancer risk from radiation exposure (Keil and Richardson, 2018). Examples of such types of uncertainties can be found throughout the CONFIDENCE project, including meteorological conditions and dispersion models, the impact of soil parameters on food-chain transfer, as well as the use of probability models within scenario based assessments (Duranova et al., 2020a, 2020b; Beresford et al., 2020).
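
The probabilistic treatment of aleatory uncertainty can be sketched in a few lines of code. The lognormal distribution, its parameters and the exceedance threshold below are purely illustrative assumptions, not values from the project; the point is only that sampling turns a stochastic description into quantities a decision maker can use, such as a mean and an exceedance probability.

```python
import random

random.seed(42)

# Toy sketch (all numbers hypothetical): aleatory variability in some
# transfer process represented by a lognormal distribution, propagated
# by Monte Carlo sampling.
N = 10_000
samples = [random.lognormvariate(0.0, 0.5) for _ in range(N)]

mean = sum(samples) / N
exceed = sum(s > 2.0 for s in samples) / N  # P(quantity exceeds a threshold)
print(f"mean ~ {mean:.2f}, P(>2.0) ~ {exceed:.3f}")
```

With more samples, both estimates stabilise; in a real assessment the distribution and its parameters would themselves carry epistemic uncertainty, as discussed in Section 2.3.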

2.2 Actor uncertainty

Decision makers need to consider how other actors will behave. For instance, after a nuclear accident many uncertainties relate to the behaviour and compliance of the local public. Within the CONFIDENCE project, examples included whether or not affected communities would follow the advice of authorities on countermeasures such as stable iodine intake or sheltering (Turcanu et al., 2020a), as well as consumer trust in food from affected areas (Duranova et al., 2020a). It is possible to model human behaviour using probability models, which assume that, over a population, variations in how people act can be described stochastically. Simulation models, including agent-based modelling, do this. So do many countermeasure models, though perhaps forgetting the variation and just using averages. However, humans think, and their behaviour is not random: it is driven by their wants and desires. There are models of human behaviour that may be more useful than simply using a probability distribution to predict the actions that people take, e.g. prospect theory (Kahneman and Tversky, 1979; Mercer, 2005; Barberis, 2013). Currently, there are developments in adversarial risk analysis, which models interactions between individuals, allowing that each may adopt a different level of “rationality” (Banks et al., 2015), though such models need more investigation before they might be applied. The mental model approach applied within the CONFIDENCE project is another example (Zeleznik et al., 2020). Ultimately, stakeholder engagement processes, both in the preparedness phase and in the emergency and recovery phases, are essential to understand each actor’s perspective, improve emergency and recovery plans and reach effective decisions.
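
The simplest stochastic treatment of compliance, as used implicitly by the countermeasure models mentioned above, can be sketched as follows. The per-person compliance probability and population size are invented for illustration; as the text stresses, such a model captures population-level variation but not the reasons behind individual choices.

```python
import random

random.seed(1)

# Toy sketch (probability and population invented): public compliance
# with sheltering advice treated as a stochastic quantity over a
# population, each person independently following the advice or not.
p_comply = 0.7        # assumed per-person probability of compliance
population = 10_000
complied = sum(random.random() < p_comply for _ in range(population))
print(f"{complied / population:.1%} sheltered in this realisation")
```

Agent-based models refine this by letting each simulated person's choice depend on their neighbours, information sources and circumstances rather than on a single fixed probability.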

2.3 Epistemic uncertainties

Some uncertainties relate to our lack of knowledge and have a different character to stochastic uncertainty. For instance, we may have a number of possible source term models but not know which represents the actual release better. Another example relates to health effects of low radiation doses: at present we do not have sufficient knowledge to estimate who among the exposed population will be affected and in which way. Such uncertainties are called epistemic. Statistical theory, which articulates the process of scientific inference or induction, has considered how epistemic uncertainty should be introduced and dealt with in analyses. Frequentist approaches, which once dominated statistical methods, eschew full quantification of epistemic uncertainty leaving the scientist to learn intuitively from the evidence displayed to them in the analyses through p values, confidence intervals and significance levels (Barnett, 1999). Bayesian approaches, based on quantifying epistemic uncertainty through probabilities (French and Rios Insua, 2000; Gelman et al., 2013), now dominate statistical thinking. Probability is taken as representing the uncertainty that an idealised rational person beginning with an agreed body of knowledge would hold in the light of the available empirical evidence. Bayesian methods now provide a coherent foundation to machine learning, decision modelling and artificial intelligence.
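
The Bayesian handling of epistemic uncertainty can be illustrated with a minimal sketch. The two candidate "source term models", the prior beliefs and the likelihoods below are all invented for illustration; the mechanics are simply Bayes' rule reweighting our belief in each candidate in the light of an observation.

```python
# Hedged sketch of Bayesian updating over competing hypotheses
# (models and numbers are hypothetical, not from the project).
priors = {"model_A": 0.5, "model_B": 0.5}        # equal prior belief
likelihood = {"model_A": 0.8, "model_B": 0.2}    # P(observed data | model)

# Bayes' rule: posterior proportional to prior times likelihood.
unnorm = {m: priors[m] * likelihood[m] for m in priors}
total = sum(unnorm.values())
posterior = {m: p / total for m, p in unnorm.items()}
print(posterior)  # model_A: 0.8, model_B: 0.2
```

As further measurements arrive, the posterior becomes the next prior and the update repeats, which is the data-assimilation loop underlying many of the model chains discussed here.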

Sensitivity and uncertainty analysis have a role in exploring and assessing the implications of epistemic uncertainty for the support of specific decisions. If all plausible explanations and models predict similar outcomes of potential actions, any epistemic uncertainty will not be significant for that decision (French, 2003).

Some uncertainties, particularly epistemic ones, may be deep (French, 2015) or severe (Comes et al., 2011) in that we know too little to assess them or to build a convincing probability model of our uncertainty in the time available. Relevant data may be sparse and there may be little agreement among experts about either what is happening or how to model the behaviour. The types of uncertainty prevalent during the early phase of a nuclear emergency, especially those related to the source term, the timing of release and future meteorological conditions, are well-recognised examples of epistemic uncertainty (Mathieu et al., 2018b). Other examples include behavioural factors (e.g. how many people will self-evacuate and which routes they will take, or whether everyone in the affected area will have timely access to iodine tablets) and factors affecting the long-term consequences of accidents, such as deposition, dispersion and health impacts (Duranova et al., 2020a).

2.4 Judgemental uncertainties

In any analysis, there are many judgements to be made: which model to use, what parameter values to set, among others. There may be many candidates (e.g. Galmarini et al., 2008). For example, there are many atmospheric dispersion models with many parameters, and little agreement on which should be used. So choices have to be made, leading to judgemental uncertainty. In some cases, this may be assessed using Monte Carlo methods (Evans and Olson, 2002) or a variety of more deterministic sensitivity analyses (Saltelli et al., 2000a, 2000b; French, 2003; Saltelli et al., 2004). Model intercomparison studies, such as the scenario comparisons carried out in WP1 of the CONFIDENCE project, are very useful too, providing examples of the impact of parameter choice on the predicted consequences during early release phases, such as the evolution of plumes and differences in prognosis frequency maps for countermeasures in affected regions (Hamburger et al., 2020; Duranova et al., 2020a).
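
A deterministic one-at-a-time sensitivity analysis, one of the simpler techniques cited above, can be sketched as follows. The toy "dose" model, its parameters and their plausible ranges are entirely hypothetical; the output shows which parameter choice the prediction is most sensitive to.

```python
# Hypothetical one-at-a-time sensitivity sketch: a toy multiplicative
# "dose" model with three parameters, each varied across an assumed
# plausible range while the others are held at nominal values.
def dose(release, dispersion, transfer):
    return release * dispersion * transfer  # stand-in for a model chain

nominal = {"release": 1.0, "dispersion": 0.1, "transfer": 0.5}
ranges = {"release": (0.5, 2.0),
          "dispersion": (0.05, 0.2),
          "transfer": (0.25, 1.0)}

base = dose(**nominal)
for name, (lo, hi) in ranges.items():
    outs = [dose(**{**nominal, name: v}) for v in (lo, hi)]
    print(f"{name}: dose spans {min(outs):.3f}..{max(outs):.3f} "
          f"(nominal {base:.3f})")
```

A Monte Carlo treatment would instead sample all parameters jointly from distributions, giving a full output distribution rather than per-parameter spans.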

2.5 Computational uncertainties

The algorithms that are used to analyse models and evaluate their predictions are not perfect. Computer codes are developed using approximations, iterations and enormous numbers of arithmetical calculations based on finite mathematics, inevitably introducing errors: these are computational uncertainties. We may find that the computations are intractable in the time available, so further approximations may be introduced to increase speed, at the cost of further computational uncertainty. Statistical emulation may be used to fit a complex model with a much simpler one, a sort of functional regression (Craig et al., 2001; O’Hagan, 2006; Conti et al., 2009; Goldstein, 2011), again increasing computational uncertainty. Numerical analysis provides bounds on computational errors in specific calculations; emulation algorithms provide some assessment of their own accuracy. Recently, Hennig et al. (2015) have promoted probabilistic techniques for representing overall computational uncertainty.
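
The emulation idea can be illustrated with a deliberately simple sketch. The "expensive" model below is a toy function, and the surrogate is piecewise-linear interpolation rather than the Gaussian-process emulators of the cited literature; the pattern, however, is the same: run the costly model at a few design points, answer all further queries from the cheap surrogate, and accept the resulting extra computational uncertainty.

```python
import bisect
import math

# Toy stand-in for a costly simulation (invented for illustration).
def expensive_model(x):
    return math.exp(-x) * math.sin(3 * x)

design = [i / 10 for i in range(11)]       # 11 "expensive" training runs
runs = [expensive_model(x) for x in design]

def emulator(x):
    # Cheap piecewise-linear surrogate built from the training runs.
    i = min(max(bisect.bisect_left(design, x), 1), len(design) - 1)
    x0, x1 = design[i - 1], design[i]
    y0, y1 = runs[i - 1], runs[i]
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# Assess the computational uncertainty the surrogate introduces.
err = max(abs(emulator(x / 100) - expensive_model(x / 100))
          for x in range(101))
print(f"max emulator error on [0,1] ~ {err:.4f}")
```

In practice the model is far too expensive to check against everywhere, which is why emulators that quantify their own predictive uncertainty, such as Gaussian processes, are preferred.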

2.6 Model uncertainty

However good the model and computations, the outputs will not fit the real world perfectly: the only true model of reality is reality itself. For instance, many radioecological models use empirical transfer factors to estimate soil-to-plant transfer of radionuclides and do not fully account for the variation in root uptake caused by variation in soil properties (Almahayni et al., 2019). Over the years, attempts have been made to model the gap between a model and reality (Draper, 1995; Goldstein and Rougier, 2009; O’Hagan, 2012). But the task, though informative in understanding the process of modelling, is fruitless, creating an infinite regress of models modelling errors of modelling error models. In some cases, modelling error may be significant, the model only giving broad indications of behaviour. The cited papers provide some techniques to allow for modelling error. However, in using models for prediction, one has to rely on the user’s experience and tests made on existing data to allow for “how good the model is” (Kuhn, 1961).

There are considerable overlaps between judgemental, computational and modelling uncertainty. We did warn that our categories were not mutually exclusive. The important point is that modelling and analysis introduce uncertainties over and above stochastic, actor and epistemic uncertainties.

2.7 Uncertainties resulting from ambiguity and lack of clarity

While judgemental, computational, model, actor and even stochastic uncertainties can be considered specific cases of epistemic uncertainty, since they relate to a lack of knowledge, ambiguity and lack of clarity are entirely different. They relate to a lack of clear understanding about what is meant by some wording or certain visual information. For instance, what is the meaning of maximal permitted levels of radiation in food: does it mean that food with radioactivity below these levels is safe while food above them is unsafe? (Charron et al., 2016). Other uncertainties of this type relate to the description of a consequence (French et al., 2017). Some researchers have suggested modelling such uncertainty with fuzzy concepts (Kacprzyk and Zadrozny, 2010), especially in natural language processing. For decision making, however, fuzzy methods are not an appropriate way forward (French, 1995). When making decisions, we do not need a model of some ambiguity or lack of clarity; rather we need to think more deeply about what we mean and resolve any lack of clarity by conscious deliberation. A common approach to this is via facilitated workshops in which the facilitator continually challenges participants to explore and define much more clearly what is meant by phrases such as “health effects” (Eden and Ackermann, 1998; O’Brian and Dyson, 2007; French et al., 2009). Resolving ambiguity and lack of clarity in modelling and analysis invariably requires value judgements. Indeed, it requires value-focused thinking (Keeney, 1992), in that we need to think about why we are undertaking the modelling and analysis: what do we want to learn from them? Work within CONFIDENCE on different ways of calculating and presenting radiation-induced health effects provides a good example of such ambiguities (Walsh et al., 2019), also highlighted in discussions at the CONFIDENCE workshop on how these might be communicated to the public (Duranova et al., 2020a).

2.8 Value, social and ethical uncertainties

Some uncertainties relate to questions that require value judgements in their resolution: e.g. how much to trade off a reduction in radiation exposure against an increase in cost, or at what dose levels countermeasures should be initiated. These differ from the judgemental uncertainties discussed above in that they do not relate to the choice of parameters used in models, but to the decisions and choices made on the basis of model results and predictions. Such value uncertainties stem mainly from value pluralism: an emergency affects different stakeholders in different ways, and these stakeholders may have different priorities. They are, however, important enough to consider separately from ambiguity and lack of clarity. Even if the science base were indisputable, the values underlying the priorities of different stakeholders may conflict with each other. For instance, the authorities may set as their priority reducing doses to the population below a certain threshold, while for the affected population the socio-economic revitalisation of the area may be at least as important. Again, modelling such uncertainties is unhelpful; they need to be resolved by thoughtful deliberation, perhaps supported by sensitivity analysis, since precision is irrelevant when there is no effect on the ultimate choice. Value uncertainties introduce social responsibilities and ethical concerns, particularly when acting on behalf of stakeholders. Social uncertainties in how expert recommendations are implemented in society may refer to public acceptance of and compliance with protective action advice; the social and economic consequences of the recommendations and actions, and uncertainties in those consequences; and the level of stakeholder and public engagement used or planned (Turcanu et al., 2020b).
Ethical uncertainties also refer to whether members of a population feel that they have control over, or have given consent to, being exposed to a particular level of risk, and to the need to be sensitive to inequalities in the distribution of risk (Oughton et al., 2004; Tomkiv et al., 2020). Again, debate and deliberation, where possible with the engagement of affected stakeholders, are the only useful way forward (Nisbet et al., 2010).

2.9 Uncertainty about the depth of modelling

When decisions are taken, a further uncertainty arises: are the analyses sufficient to justify the actions being taken? Are they requisite, i.e. “good enough”? Such uncertainty again can only be resolved by judgement and deliberation (Phillips, 1984; French et al., 2009), although in the emergency itself, the need to make timely decisions may supersede this. Discussions on the robustness of different food-chain models carried out within CONFIDENCE (Beresford et al., 2020) illustrate the importance of having this deliberation prior to an actual emergency.

A related issue here is how confident the decision makers feel when taking the decision. The analyses and results presented to emergency managers and recovery teams are based on long and complex model chains. Considerable judgemental, computational and model uncertainties may have accumulated in developing the analysis and results. Such uncertainties, as we have indicated, are extremely difficult to quantify and will not be fully represented in any uncertainty bounds and plots produced. Yet they need to be communicated to the decision makers so that they have as full an understanding as possible of the total uncertainty that they face. Within CONFIDENCE, robustness indicators have been developed which reflect the uncertainty levels of the different components and inputs of the model chain (Nagy et al., 2020).

3 Conclusions and broader groupings

We emphasise again that the list of uncertainties above is only one possible categorisation and that we make no claim that it is comprehensive or that the categories are non-overlapping. We have, however, found it useful in discussing how to handle uncertainties in nuclear emergency management. In Table 1, we organise these types of uncertainty into three groups. The first group relates to our knowledge of the external world; it might be called scientific uncertainty. The second relates to uncertainties and errors that are introduced by the models and techniques used to analyse the risks and possible interventions in an emergency. The third relates to uncertainties that the emergency managers, experts and stakeholders hold about themselves, their values and responsibilities, and how to account for these differences in decision making. In our experience, most attention and discussion in the past has focused on the first and some of the third group of uncertainties, while much less attention has been given to the second. The CONFIDENCE project made an important contribution to increasing reflection on, and communication of, the many different types of uncertainty inherent in nuclear emergency management, which to date had been poorly communicated to the decision-making process. However, this is only a start, and there is a need for further discussion of the different types of uncertainty and how to handle them within decision making, perhaps with the aid of MCDA within stakeholder engagement workshops (Hiete et al., 2010). Both the categories and the groupings can act as a check-list for analysts, to be sure that they have analysed and communicated all major uncertainties involved in predicting what might happen. Finally, extending the list of uncertainties beyond the purely technical is an attempt to adopt rather than correct the language used by actual emergency managers, as well as to recognise the real challenges faced by decision-makers.
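
The simplest form of the MCDA mentioned above, a weighted sum over criteria, can be sketched as follows. The countermeasure options, the 0-to-1 criterion scores and the weights are entirely invented for illustration; in a stakeholder workshop these value judgements would themselves be elicited and debated, which is precisely where the value uncertainties of Section 2.8 surface.

```python
# Hedged weighted-sum MCDA sketch (options, scores and weights are
# hypothetical). Scores are on a 0-1 scale where higher is better,
# so "disruption" and "cost" are scored as absence of disruption/cost.
options = {
    "sheltering": {"dose_averted": 0.6, "disruption": 0.9, "cost": 0.8},
    "evacuation": {"dose_averted": 0.9, "disruption": 0.2, "cost": 0.3},
    "do_nothing": {"dose_averted": 0.0, "disruption": 1.0, "cost": 1.0},
}
weights = {"dose_averted": 0.6, "disruption": 0.25, "cost": 0.15}  # sum to 1

scores = {name: sum(weights[c] * v for c, v in crit.items())
          for name, crit in options.items()}
best = max(scores, key=scores.get)
print(scores, "->", best)  # sheltering scores highest with these weights
```

Varying the weights and re-ranking is a direct form of the sensitivity analysis recommended above: if the best option is stable across plausible weightings, the value uncertainty does not matter for this decision.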

Table 1

Broad categories of uncertainty types.

Acknowledgement

The work described in this paper was conducted within the CONFIDENCE project, which was part of the CONCERT project. This project has received funding from the Euratom research and training programme 2014–2018 under grant agreement No 662287. Disclaimer (Art. 29.5 GA). This publication reflects only the author’s view. Responsibility for the information and views expressed therein lies entirely with the authors. The European Commission is not responsible for any use that may be made of the information it contains.

References

  • Almahayni T, Sweeck L, Beresford NA, Barnett CL, Lofts S, Hosseini A, Brown J, Thørring H, Guillén J. 2019. An evaluation of process-based models and their application in food chain assessments. CONCERT Deliverable D9.15. Available from https://concert-h2020.eu/en/Publications. [Google Scholar]
  • Banks DL, Aliaga JMR, Insua DR. 2015. Adversarial risk analysis. Boca Raton: CRC Press. [CrossRef] [Google Scholar]
  • Barberis NC. 2013. Thirty years of prospect theory in economics: A review and assessment. J Econ Perspect 27(1): 173–196. [Google Scholar]
  • Barnett V. 1999. Comparative statistical inference. Chichester: John Wiley and Sons. [CrossRef] [Google Scholar]
  • Beresford NA, Barnett CL, Chaplow J, Lofts S, Wells C, Brown JE, Hosseini A, Thørring H, Almahayni T, Sweeck L, Guillén J, Lind O-C, Oughton DH, Salbu B, Teien H-C, Perez-Sánchez D, Real A. 2020. CONFIDENCE Overview of improvements in radioecological human food chain models and future needs. Radioprotection 55(HS1). https://doi.org/10.1051/radiopro/2020019. [Google Scholar]
  • Berkeley D, Humphreys PC. 1982. Structuring decision problems and the “bias heuristic”. Acta Psychol. 50: 201–252. [Google Scholar]
  • Charron S. et al. 2016. Overview of the PREPARE WP3: Management of contaminated goods in post-accidental situation – Synthesis of European stakeholders’ panels. Radioprotection 51(HS2): S83–S91. [EDP Sciences] [Google Scholar]
  • Comes T, Hiete M, Wijngaards N, Schultmann F. 2011. Decision maps: A framework for multi-criteria decision support under severe uncertainty. Decis. Support Syst. 52(1): 108–118. [Google Scholar]
  • Conti S, Gosling JP, Oakley JE, O’Hagan A. 2009. Gaussian process emulation of dynamic computer codes. Biometrika 96(3): 663–676. [Google Scholar]
  • Craig PS, Goldstein M, Rougier JC, Seheult AH. 2001. Bayesian forecasting for complex systems using computer simulators. J. Am. Stat. Assoc. 96(454): 717–729. [Google Scholar]
  • Draper D. 1995. Assessment and propagation of model uncertainty (with discussion). J. R. Stat. Soc. B 57(1): 45–97. [Google Scholar]
  • Duranova T, Raskob W, Beresford NA, Korsakissok I, Montero M, Müller T, Turcanu C, Woda C. 2020a. CONFIDENCE dissemination meeting: Summary on the scenario based workshop. Radioprotection 55(HS1). https://doi.org/10.1051/radiopro/2020009. [Google Scholar]
  • Duranova T, van Asselt E, Müller T, Bohunova J, Twenhöfel CJW, Smetsers. 2020b. MCDA stakeholder workshops. Radioprotection 55(HS1). https://doi.org/10.1051/radiopro/2020032. [Google Scholar]
  • Eden C, Ackermann F. 1998. Making strategy: The journey of strategic management. London: Sage. [Google Scholar]
  • Evans JR, Olson DL. 2002. Introduction to simulation and risk analysis. Upper Saddle River, NJ: Prentice Hall. [Google Scholar]
  • French S. 1995. Uncertainty and imprecision: Modelling and analysis. J. Oper. Res. Soc. 46(1): 70–79. [Google Scholar]
  • French S. 1997. Uncertainty modelling, data assimilation and decision support for management of off-site nuclear emergencies. Radiat. Prot. Dosim. 73: 11–15. [CrossRef] [Google Scholar]
  • French S. 2003. Modelling, making inferences and making decisions: The roles of sensitivity analysis. TOP 11(2): 229–252. [CrossRef] [MathSciNet] [Google Scholar]
  • French S. 2015. Cynefin: Uncertainty, small worlds and scenarios. J. Oper. Res. Soc. 66(10): 1635–1645. [Google Scholar]
  • French S, Rios Insua D. 2000. Statistical decision theory. London: Arnold. [Google Scholar]
  • French S, Maule AJ, Papamichail KN. 2009. Decision behaviour, analysis and support. Cambridge: Cambridge University Press. [CrossRef] [Google Scholar]
  • French S, Argyris N, Layton H, Smith JQ, Haywood SM, Hort M. 2016. Presenting uncertain information in radiological emergencies. Available from https://admlc.wordpress.com/publications/. UK Atmospheric Dispersion Modelling Liaison Committee. [Google Scholar]
  • French S, Argyris N, Haywood S, Hort M, Smith J. 2017. Uncertainty handling during nuclear accidents. ISCRAM2017. Albi: ISCRAM. Available from www.iscram.org. [Google Scholar]
  • Galmarini S, Bianconi R, De Vries G, Bellasio R. 2008. Real-time monitoring data for real-time multi-model validation: Coupling ENSEMBLE and EURDEP. J. Environ. Radioact. 99(8): 1233–1241. [CrossRef] [PubMed] [Google Scholar]
  • Gelman A, Carlin JB, Stern HS, Dunson DB, Vehtari A, Rubin DB. 2013. Bayesian data analysis. London: Chapman and Hall. [CrossRef] [Google Scholar]
  • Goldstein M. 2011. External Bayesian analysis for computer simulators (with discussion). Bayesian Statistics 9 (JM Bernardo et al., Eds.). Oxford: Oxford University Press (in press). [Google Scholar]
  • Goldstein M, Rougier JC. 2009. Reified Bayesian modelling and inference for physical systems (with discussion). J. Stat. Plan. Inference 139: 1221–1256. [CrossRef] [Google Scholar]
  • Hamburger T, Gering F, Ievdin I, Schantz S, Geertsema G, de Vries H. 2020. Uncertainty propagation from ensemble dispersion simulations through a terrestrial food chain and dose model. Radioprotection 55(HS1). https://doi.org/10.1051/radiopro/2020014. [Google Scholar]
  • Haywood SM. 2010. A method for displaying imprecision in early radiological emergency assessments. J. Radiol. Prot. 30(4): 673. [Google Scholar]
  • Haywood SM, Bedwell P, Hort M. 2010. Key factors in imprecision in radiological emergency response assessments using the NAME model. J. Radiol. Prot. 30(1): 23–36. [Google Scholar]
  • Hennig P, Osborne MA, Girolami M. 2015. Probabilistic numerics and uncertainty in computations. Proc. R. Soc. A 471(2179): 20150142. [Google Scholar]
  • Hiete M, Bertsch V, Comes T, Schultmann F, Raskob W. 2010. Evaluation strategies for nuclear and radiological emergency and post-accident management. Radioprotection 45(5): S133–S147. [CrossRef] [EDP Sciences] [Google Scholar]
  • Howard BJ, Liland A, Beresford NA, Anderson K, Crout NMJ, Gil JM, Hunt J, Nisbet A, Oughton DH, Voight G. 2005. The STRATEGY Project: Decision tools to aid sustainable restoration and long-term management of contaminated agricultural ecosystems. J. Environ. Radioact. 83: 275–296. [CrossRef] [PubMed] [Google Scholar]
  • Kacprzyk J, Zadrozny S. 2010. Computing with words is an implementable paradigm: Fuzzy queries, linguistic data summaries, and natural-language generation. IEEE Trans. Fuzzy Syst. 18(3): 461–472. [Google Scholar]
  • Kahneman D, Tversky A. 1979. Prospect theory: An analysis of decisions under risk. Econometrica 47: 263–291. [CrossRef] [MathSciNet] [Google Scholar]
  • Keeney RL. 1992. Value-focused thinking: A path to creative decision making. Cambridge, MA: Harvard University Press. [Google Scholar]
  • Keil AP, Richardson DB. 2018. Quantifying cancer risk from radiation. Risk Anal. 38(7): 1474–1489. [PubMed] [Google Scholar]
  • Kuhn TS. 1961. The function of measurement in modern physical science. Isis 52(2): 161–193. [Google Scholar]
  • Mathieu A, Korsakissok I, Périllat R, Chevalier-Jabet K, Stephani F, Fougerolle S, Créach V, Cogez E, Bedwell P. 2018a. Guidelines ranking uncertainties for atmospheric dispersion, D9.1.3 Guidelines describing source term uncertainties. CONCERT Deliverable D9.1. Available from https://concert-h2020.eu/en/Publications. [Google Scholar]
  • Mathieu A, Korsakissok I, Andronopoulos S, Bedwell P, Chevalier-Jabet K, Cogez E, Créach V, Fougerolle S, Geertsema G, Gering F, Hamburger T, Jones AR, Klein H, Leadbetter S, Pázmándi T, Périllat R, Rudas C, Sogachev A, Stephani F, Szanto P, Tomas J, Twenhöfel C, de Vries H, Wellings J. 2018b. Guidelines ranking uncertainties for atmospheric dispersion. CONCERT Deliverable D9.1. Available from https://concert-h2020.eu/en/Publications. [Google Scholar]
  • Mercer J. 2005. Prospect theory and political science. Ann. Rev. Polit. Sci. 8: 1–21. [CrossRef] [Google Scholar]
  • Nagy A, Perko T, Müller T, Raskob W, Benighaus L. 2020. Uncertainty visualization using maps for nuclear and radiological emergencies. Radioprotection 55(HS1). https://doi.org/10.1051/radiopro/2020033. [Google Scholar]
  • Nisbet AF et al. 2005. Achievements, difficulties and future challenges for the FARMING network. J. Environ. Radioact. 83(3): 263–274. [Google Scholar]
  • Nisbet A et al. 2010. Decision aiding handbooks for managing contaminated food production systems, drinking water and inhabited areas in Europe. Radioprotection 45(5 Supplement): S23–S37. [CrossRef] [EDP Sciences] [Google Scholar]
  • O’Brian FA, Dyson RG (Eds.) 2007. Supporting strategy: Frameworks, methods and models. Chichester: John Wiley and Sons, Ltd. [Google Scholar]
  • O’Hagan A. 2006. Bayesian analysis of computer code outputs: A tutorial. Reliab. Eng. Syst. Saf. 91(10): 1290–1300. [CrossRef] [Google Scholar]
  • O’Hagan A. 2012. Probabilistic uncertainty specification: Overview, elaboration techniques and their application to a mechanistic model of carbon flux. Environ. Model. Softw. 36: 35–48. [Google Scholar]
  • Oughton DH, Bay I, Forsberg E-M, Kaiser M, Howard B. 2004. An ethical dimension to sustainable restoration and long-term management of contaminated areas. J. Environ. Radioact. 74: 171–183. [CrossRef] [PubMed] [Google Scholar]
  • Perko T, Tafili V, Sala R, Duranova T, Zeleznik N, Tomkiv Y, Hoti F, Turcanu C. 2019. Report on observational study of emergency exercises: List of uncertainties. CONCERT Deliverable D9.28. Available from https://www.concert-h2020.eu/en/Publications. [Google Scholar]
  • Phillips LD. 1984. A theory of requisite decision models. Acta Psychol. 56(1–3): 29–48. [CrossRef] [Google Scholar]
  • Saltelli A, Chan K, Scott EM (Eds.) 2000a. Sensitivity analysis. Chichester: John Wiley and Sons. [Google Scholar]
  • Saltelli A, Tarantola S, Campolongo F. 2000b. Sensitivity analysis as an ingredient of modelling. Stat. Sci. 15(4): 377–395. [Google Scholar]
  • Saltelli A, Tarantola S, Campolongo F, Ratto M. 2004. Sensitivity analysis in practice: A guide to assessing scientific models. Chichester: John Wiley and Sons. [Google Scholar]
  • Snowden D. 2002. Complex acts of knowing – Paradox and descriptive self-awareness. J. Knowl. Manag. 6(2): 100–111. [CrossRef] [Google Scholar]
  • Tomkiv Y, Perko T, Sala R, Zeleznik N, Maitre M, Schneider T, Oughton DH. 2020. Societal uncertainties recognised in recent nuclear and radiological emergencies in Europe. Radioprotection 55(HS1). https://doi.org/10.1051/radiopro/2020025. [Google Scholar]
  • Turcanu C, Perko T, Wolf HV, Camps J, Oughton DH. 2020a. Social uncertainties associated with stable iodine intake in a nuclear emergency. Radioprotection 55(HS1). https://doi.org/10.1051/radiopro/2020027. [Google Scholar]
  • Turcanu C, Perko T, Baudé S, Hériard-Dubreuil G, Zeleznik N, Oughton DH, Tomkiv Y, Sala R, Oltra C, Tafili V, Benighaus L, Maitre M, Schneider T, Crouail P, Duranova T, Paiva I. 2020b. Social, ethical and communication aspects of uncertainty management. Radioprotection 55(HS1). https://doi.org/10.1051/radiopro/2020024. [Google Scholar]
  • Walker WE et al. 2003. Defining uncertainty: A conceptual basis for uncertainty management in model-based decision support. Integr. Assess. 4(1): 5–17. [CrossRef] [Google Scholar]
  • Walsh L, Ulanowski A, Kaiser JC, Woda C, Raskob W. 2019. Risk bases can complement dose bases for implementing and optimizing a radiological protection strategy in urgent and transition emergency phases. Radiat. Environ. Biophys. 58: 539–552. [CrossRef] [PubMed] [Google Scholar]
  • Wellings J, Bedwell P, Leadbetter S, Tomas J, Andronopoulos S, Korsakissok I, Périllat R, Mathieu A, Geertsema G, De Vries H, Klein H, Hamburger T, Gering F, Pázmándi T, Szántó P, Rudas C, Sogachev A, Davis N, Twenhöfel C. 2018. Guidelines ranking uncertainties for atmospheric dispersion, D9.1.5 Guidelines for ranking uncertainties in atmospheric dispersion. CONCERT Deliverable D9.1. Available from https://concert-h2020.eu/en/Publications. [Google Scholar]
  • Zeleznik N, Benighaus L, Mitrakos D, Tafili V, Duranova T, Sala R, Benighaus C. 2020. Mental models of uncertainty management in nuclear emergency management. Radioprotection 55(HS1). https://doi.org/10.1051/radiopro/2020026. [Google Scholar]

Cite this article as: French S, Haywood S, Oughton DH, Turcanu C. 2020. Different types of uncertainty in nuclear emergency management. Radioprotection 55(HS1): S175–S180

