ASG 2014-2015 Values, Decision Making and Risk
Workshops and seminars will be organized around five themes, covering different topics related to values, decision making and risk. The main aim of the study group is to bring together Lund University researchers working on values, decision making and risk, and to formulate ideas for future interdisciplinary research in this field.
Theme 1: The Cultural Cognition Thesis
An intriguing finding has recently been reported (Kahan et al. 2012, 732): members of the public with the highest degrees of science literacy and technical reasoning capacity were not the most concerned about climate change. Rather, they were the ones among whom cultural polarization was greatest. In other words, the association between advocacy of cultural values and degree of concern about climate change was strongest among scientifically literate people. This looks like bias; at any rate, let us refer to it as a “culture bias”. Interestingly, it appears not to fit the heuristics-and-biases pattern developed in recent dual-process thinking (see e.g. Kahneman 2011). Normally, bias is held to result from a lack of cognitive resources; but culture bias becomes stronger with increased resources. Dual-process thinking does not appear to be straightforwardly applicable to this phenomenon.
The finding also casts doubt on the familiar “knowledge-deficit” model, which says that laypeople have limited concern about climate change because they are poorly equipped with scientific information and/or the capacity for scientific thinking. This explanation does not account for the cultural polarization among the scientifically literate. Nor does it account for the fact that less educated individuals are often more concerned about climate change (together with the scientists) than those with higher levels of education. As an alternative explanation, Kahan et al. (2012, 732) formulate the “cultural cognition thesis” (CCT): “... individuals, as a result of a complex of psychological mechanisms, tend to form perceptions of societal risks that cohere with values characteristic of groups with which they identify”. In itself, CCT does not explain why the risk perceptions of scientifically literate people deviate from those of others.
Something has to be added to the thesis, for instance that the former are better at forming coherent personal world-views, or that they form stronger (more certain, more stable) “cultural” values and preferences. Kahan et al. appear to prefer the first of these supplements. They say: “Fitting information to identity-defining commitments makes demands on all manner of cognition” (2012, 734).
Theme 2: Quantitative or qualitative risk analysis?
There is an ongoing debate on the relationship between qualitative and quantitative approaches to risk analysis. Sometimes a hierarchical relationship is assumed, where qualitative (and more rudimentary) analyses are considered preliminary or initial analyses, on which quantitative analyses might be based (Kaplan and Garrick, 1981). On this view, decision making based on the analysis is assumed to improve as the amount of quantitative data used increases. In other cases, quantitative approaches to risk analysis are considered inadequate, unable to capture all relevant aspects of risk in certain contexts (Slovic, 2001).
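Kaplan and Garrick's quantitative definition of risk, cited above, treats a risk as a set of triplets of scenario, likelihood and consequence. A minimal sketch of this idea follows; the scenarios and numbers are invented purely for illustration.

```python
# A toy sketch of Kaplan and Garrick's (1981) quantitative view of risk
# as a set of triplets <scenario, likelihood, consequence>.
# All scenarios and numbers are invented purely for illustration.

triplets = [
    # (scenario,           probability per year, consequence in lost days)
    ("minor coolant leak", 1e-2,                 5.0),
    ("power failure",      1e-3,                 50.0),
    ("major release",      1e-6,                 10000.0),
]

# One common quantitative summary: the expected consequence per year.
expected_loss = sum(p * c for _, p, c in triplets)
print(f"Expected loss: {expected_loss:.3f} lost days per year")

# The triplet set is richer than the single summary number: sorting
# scenarios by consequence gives the data behind an exceedance curve.
for scenario, p, c in sorted(triplets, key=lambda t: t[2]):
    print(scenario, p, c)
```

Note that collapsing the triplet set into a single expected value discards the scenario structure, which is part of what the qualitative critique of purely quantitative summaries points at.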
As risk analyses are often used in formal decision making processes, they have an applied, tangible side alongside their more philosophical and theoretical aspects. A current topic that could be used as a source of examples is the establishment of the research facilities ESS and MAX IV in Lund. How should the risks associated with their operations be understood, calculated, communicated, etc.? The discussions will cover such questions as: Who decides on what risk is? Who should decide? Who sets acceptable limits? Which perspective should be preferred: quantitative, qualitative or both? What are the advantages and limitations of each approach? When, if ever, are the two approaches interchangeable?
Theme 3: Rules of thumb and risk analysis
How do laypeople understand and estimate risk? This question has been a focus of research for a long time. Some of the notable suggestions are affect (as in Slovic’s affect heuristic, 1987) and availability (as suggested by Tversky and Kahneman 1973). Usually these heuristics have been tested separately, but recent research has begun to examine whether people change their risk estimation methods depending on the type of risk judgments they are asked to make. For instance, Pachur, Hertwig and Steinmann (2012) explored both of these rules of thumb in a controlled experiment, in which the effect of direct experience on risk judgments was also investigated. They found that the various rules of thumb predicted different parts of the data set best. In general, availability was a better predictor of risk judgments when these were presented in an “objective” manner (e.g. stating how many people die of a particular form of cancer each year). When tasks became more personal, however, such as when judging the likelihood of dying from a particular form of cancer, the predictive power of the affect heuristic improved. This indicates that how tasks are presented to individuals will to a great extent affect how risks are judged.
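The two rules of thumb discussed above can be sketched as simple prediction rules. The sketch below uses invented data, not Pachur et al.'s materials: “availability by recall” predicts a judged risk ordering from how many cases a person can recall in their own social network, while the affect heuristic predicts it from a feeling-based dread rating.

```python
# A toy illustration of two rules of thumb for risk judgment.
# All hazards and numbers are invented purely for illustration.

hazards = {
    # hazard: (cases recalled in one's social network, dread rating 0-1)
    "traffic accident": (4, 0.5),
    "homicide":         (0, 0.9),
    "diabetes":         (2, 0.2),
}

# Each heuristic yields its own predicted ranking, most to least risky.
by_availability = sorted(hazards, key=lambda h: hazards[h][0], reverse=True)
by_affect       = sorted(hazards, key=lambda h: hazards[h][1], reverse=True)

print("availability ranking:", by_availability)
print("affect ranking:      ", by_affect)
```

The point of the sketch is that the two heuristics can disagree: a hazard that is rare in one's own experience (here, homicide) may still be ranked highest when judged by affect.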
Theme 4: Confronting uncertainties beyond Bayesianism
The Bayesian approach to interpreting and assigning probabilities makes it possible to treat knowledge-based uncertainty in parameters and model structure. It rests upon coherent principles of logical reasoning and inference, which transfer readily into decision analysis, and it is put forward as the basic approach to assessing and communicating knowledge-based uncertainty in risk assessment. The discussion under this theme will revolve around the question of when the Bayesian approach is not sufficient to treat uncertainty, and what happens to the foundations of the Bayesian approach when new treatments of uncertainty enter the scene.
The growing demand for science-informed policies to handle climate change and environmental problems has led to increased attention to the treatment of knowledge-based uncertainty. Lack of scientific knowledge is recognized as important for the science-policy interface, but lack of knowledge is seldom an all-or-nothing matter. Which knowledge qualifies to inform decision making is context dependent, and a judgment influenced by the values and attitudes of the producers and users of knowledge. Qualitative aspects of knowledge are seen as important, and there is an ongoing debate concerning the characteristics against which such qualities can be evaluated and how to take them into account in risk assessment. For example, alternative measures of uncertainty have been suggested to deal with extreme situations involving lack of knowledge. The question is how these measures can be used in place of, or in combination with, Bayesian probabilities, and when the conditions for assigning Bayesian probabilities are to be seen as non-ideal and critical.
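The Bayesian treatment of parameter uncertainty mentioned above can be illustrated with a minimal conjugate update via Bayes' rule; the prior, the trial counts and all numbers below are invented purely for illustration.

```python
# A minimal sketch of Bayesian updating of knowledge-based uncertainty
# about a parameter: a Beta prior on a failure probability is updated
# with observed binomial data (the conjugate Beta-binomial update).
# All numbers are invented purely for illustration.

# Prior belief about a component's failure probability: Beta(a, b)
a, b = 1.0, 19.0           # prior mean a / (a + b) = 0.05

# Observed evidence: 2 failures in 20 trials
failures, trials = 2, 20

# Conjugate Bayesian update: the posterior is again a Beta distribution
a_post = a + failures
b_post = b + (trials - failures)

posterior_mean = a_post / (a_post + b_post)
print(f"Posterior mean failure probability: {posterior_mean:.3f}")
```

The sketch shows the coherence that the text appeals to: prior knowledge and data combine by a single rule, and the posterior feeds directly into decision analysis. The beyond-Bayesian question is what to do when no defensible prior or likelihood can be formulated at all.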
There are more types and sources of uncertainty to consider than those that can be addressed with a Bayesian approach. For example, the parameters and model structure in a risk assessment are just one part of a knowledge production process, which has been preceded by problem framing and by the assessor's choices of how to delimit the problem and which sources of information to use. Looking at the entire knowledge production process might reveal more types and sources of uncertainty to consider. It is relevant to ask if and how new types of uncertainty may affect the rationale and applicability of the Bayesian approach.
Theme 5: Vetenskap och beprövad erfarenhet (Scientific evidence and proven experience)
Since its first application in medicine in the 1890s, the concept of vetenskap och beprövad erfarenhet (scientific evidence and proven experience) has been pressed into service as a standard of evidence in Swedish law and public policy in areas as diverse as education, environmental risk assessment, veterinary care and social work. This notion explicitly recognizes the importance of both scientific evidence and evidence emerging from other sources, more oriented towards professional practice. Indeed, certain services in the Swedish public sector, such as health care and education, have a longstanding legal duty to show due regard for both kinds of evidence. For example, the Patient Safety Act (SFS 2010:659) states that medical practice must be based on “vetenskap och beprövad erfarenhet”. Similarly, the Education Act (SFS 2010:800) requires primary and secondary schooling to “vila på vetenskaplig grund och beprövad erfarenhet” (rest on a scientific foundation and proven experience). Moreover, the Higher Education Ordinance (SFS 1993:100) states that to be awarded a Bachelor's degree in disciplines such as Occupational Therapy, Audiology, Biomedical Laboratory Science and Prosthetics and Orthotics, the student must demonstrate knowledge of the links between vetenskap och beprövad erfarenhet and the significance of these links for professional practice. Finally, the Swedish Environmental Code (SFS 1998:808) states that investigations of the environmental risks associated with GM organisms must be conducted in accordance with vetenskap och beprövad erfarenhet.
We want to know more about the ways in which vetenskap och beprövad erfarenhet interact, and how they are integrated in actual decision making. The study group will look at relevant decision chains, asking how they function and what happens before, during, and after decisions based on vetenskap och beprövad erfarenhet.
References
Kahan, D.M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L.L., Braman, D. & Mandel, G. (2012). “The polarizing impact of science literacy and numeracy on perceived climate change risks”, Nature Climate Change, 2, 732–735.
Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux; London: Allen Lane.
Kaplan, S. & Garrick, B.J. (1981). “On the quantitative definition of risk”, Risk Analysis, 1(1), 11–27.
Pachur, T., Hertwig, R. & Steinmann, F. (2012). “How do people judge risks: Availability heuristic, affect heuristic, or both?”, Journal of Experimental Psychology: Applied, 18, 314–330.
SFS 1993:100. The Higher Education Ordinance. Swedish Code of Statutes.
SFS 1998:808. The Swedish Environmental Code. Swedish Code of Statutes.
SFS 2010:659. The Patient Safety Act. Swedish Code of Statutes.
SFS 2010:800. The Education Act. Swedish Code of Statutes.
Slovic, P. (1987). “Perception of risk”, Science, 236, 280–285.
Slovic, P. (2001). “The risk game”, Journal of Hazardous Materials, 86(1–3), 17–24.
Tversky, A. & Kahneman, D. (1973). “Availability: A heuristic for judging frequency and probability”, Cognitive Psychology, 5, 207–232.
The group consists of nine members representing six faculties of Lund University.
Annika Wallin, PhD, Cognitive science
Erik Persson, PhD, Philosophy
Johannes Persson, Professor, Philosophy
Jonas Borell, PhD, Ergonomics (coordinator of the group and main contact person)
Kerstin Eriksson, PhD, Risk management (coordinator of the group)
Lena Wahlberg, PhD, Law
Niklas Vareman, PhD student, Medical ethics
Ullrika Sahlin, PhD, Centre for Environmental and Climate Research (CEC)
Åsa Knaggård, PhD, Political science
Seminars and Workshops
28 October 2014, internal workshop
15 December 2014, seminar with Tord Kjellström, Australian National University, on Climate risks
12 March 2015, workshop with Rasmus Bååth, Peter Gärdenfors and Nils-Eric Sahlin, Lund University, on Beyond Bayesianism
27 March 2015, seminar with Ragnar Löfstedt, King’s College London
27 March 2015, open lecture with Ragnar Löfstedt, King’s College London, on The informal European Parliamentary working group on risk
9 April 2015, open lecture with Thorsten Pachur, Max Planck Institute for Human Development, on Rules of thumb and risk analysis
21 April 2015, seminar with Sten Anttila, Swedish Agency for Health Technology Assessment and Assessment of Social Services, and Ingrid Thernfrid, Chief Medical Officer at Adult Psychiatry, Lund, on Vetenskap och beprövad erfarenhet – om evidens för medicinskt beslutsfattande (Scientific evidence and proven experience – on evidence for medical decision making)
5 May 2015, open seminar with Kristie Ebi, University of Washington, on Uncertainty guidance for international assessments of climate change
19 May 2015, seminar with Paul Slovic, University of Oregon, on The psychology of risk
19 May 2015, open seminar with Paul Slovic, University of Oregon, on When (In)Actions Speak Louder than Words: Confronting the Collapse of Humanitarian Values in Foreign Policy Decisions