Research Article
The Scientific Approaches to Risk and Risk Management: A Critical Review
Macquarie Graduate School of Management, 99-Talavera Road, Macquarie Park, Sydney, 2113, NSW, Australia
Risk and risk management have become essential to the very functioning of many modern societies and contemporary organizations (Attar, 2009; Attar and Badham, 2007; Beck, 1992; Power, 2007). Likewise, in academic discourses and disciplines as diverse as business management, finance and economics, engineering, public administration, energy policy, psychology, sociology, anthropology, health care and education, it has become common for people to ask questions such as these (Attar et al., 2007, 2008): What are the risks in our area and how can we better understand them? Why do people in a culture or organization perceive something as risky? How can one say that a risk management model is capable of identifying, measuring and monitoring risks and of conquering future contingencies?
Such questions may originate in a variety of interests, which in fact reflect a growing recognition of risk and risk management in contemporary societies, to which individuals and organizations must adapt if they are to avoid the threat of uncertainty and disorder (Beck, 1992; Giddens, 1999; Power, 2007). Although there is a growing tendency to interpret events, facts and artifacts in terms of risk and risk management, a moment's reflection on such interpretations as a whole is sufficient to produce a sense of confusion. As Power (2004) argues, the term risk is used in many different, inconsistent and incongruent ways.
In this study, I attempt to describe the literature on risk and risk management from the standpoint of the scientific approaches. The structure of the study is as follows. I first examine the traditional meaning of risk and trace how risk and risk management have been shaped and reshaped by the scientific approach since their beginnings in the thirteenth century. I then discuss problems inherent in this tradition, briefly review more recent approaches and finally conclude with proposals for new directions.
A BRIEF HISTORY OF RISK
The history of risk goes back to the history of probabilities and to a Hindu/Arabic/Persian language of numbers that reached Western societies in the thirteenth century, a time when much of the earth was being systematically explored and its resources exploited by humanity as the servants of God, or God's gardeners (Bernstein, 1996). Many scholars, however, link the emergence and development of the contemporary concept of risk with the venturesome voyages and voyage insurance of medieval navigators and explorers who attempted to identify and grapple with the perils of the sea (Luhmann and Barrett, 1993). In earlier times, risk was regarded as a natural event carrying its own intrinsic meaning: what nature could do to human affairs. Following Calvinism and the Renaissance, however, one of the special contributions of contemporary understandings of risk was the notion that humanity could and should discover the gods' agenda and, if necessary, oppose that agenda and take responsibility for the consequences (Bernstein, 1996).
THE SCIENTIZATION OF RISK
Such instrumental views of risk gradually extended to social order and global trade arrangements during the eighteenth and nineteenth centuries when modern European states increasingly sought to harness society and nature using science, scientific deduction and positivism (including the laws of probability and statistics) (Foucault, 2007). This development paralleled the modernist belief of the sixteenth and seventeenth centuries that the social and natural world follows laws that can be measured, modeled and mastered by humankind (Burtt, 1939). As positivism and rationalism became the only source of positive knowledge of the world, positivist/rationalist methods became increasingly sophisticated in explaining scientific knowledge. Positivist/rationalist methods were then applied to morality, management, sociology and politics as well (Habermas, 1971).
It was in this light that people's minds were seen as being cleansed of mysticism, superstition and other forms of pseudo-science and the concept of risk was scientized, drawing upon probabilistic reasoning and conceiving of the future (assumed to be a mirror of the past) as fundamentally colonizable and programmable through rational/scientific measures of risk (Bernstein, 1996; Lupton, 1999a). The question 'how shall I act in situations of ambiguity and uncertainty?' was thus converted into a scientific one and the best risk management tools were seen as those to be selected by the use of science-based knowledge.
RISK MANAGEMENT AS A UNIVERSAL TOOL FOR PROGRAMMING THE FUTURE
Conventional wisdom, however, came along with scientized risk (Giddens, 1999). There was a belief that the modern market economy is a universal plan to be mapped and mediated by an instrumental/rational process of risk management as a value-neutral tool, which would bring humanity unparalleled access to the good things of life, such that nations around the world should hasten to join this plan (Bernstein, 1996). Bernstein (1996) defines the boundary between modernity and traditional society in terms of the mastery of risk and risk management, whereby humanity is no longer passive and the future is no longer subject to the whim of the gods. With its associated principles, techniques and assumptions about the contingencies in the world, proponents of the scientization of risk treat risk management as the modern means to discipline the order of nature and society. Modern societies use risk management as a guide over a vast range of decision making: from allocating wealth to safeguarding public health, from waging war to planning a family, from paying insurance premiums to wearing a seatbelt, from planting corn to marketing cornflakes (Bernstein, 1996). This view therefore sees a great divide between traditional and modern societies, in the latter of which risk management has acted as a device to model and master a utopian imagery of the future through the rational/scientific process of risk-taking, rather than letting rational gods dictate their agenda.
This approach has also been applied and promoted in academic disciplines such as economics (Bouchaud and Potters, 2003) and cognitive science (Tversky and Kahneman, 1974); natural science, engineering, operations and project management (Abkowitz, 2008; PMI, 2004); as well as medicine and epidemiology (Carroll, 2003), to name but a few. The focus is upon the acquisition and application of research-based knowledge, scientific methods and measurement as the most appropriate approach to managing risk and making decisions under conditions of uncertainty (Attar and Badham, 2009). Lay people's responses to risk are usually pictured as biased, unscientific and ill-informed (Lupton, 1999a) compared to what are seen to be objective scientific calculations.
The focus in scientific approaches to risk is on how well a risk is identified or measured, how serious its likely consequences are and how rigorous and comprehensive are the analyses that have been employed to measure, map and understand the relevant chain of consequences and responses (Lupton, 1999a). The individual, as the unit of analysis (particularly in the psychometric analysis of risk), is treated as an emotion-free, information-processing actor whose behavior in the face of a pre-defined danger is rational and hence can be modeled, measured and manipulated. The formulations and calculations produced by researchers tend to be regarded as objective scientific facts and absolute truths (Bradbury, 1989). This means that one can make observations, form hypotheses to explain them, deduce consequences from these hypotheses and benchmark them against given science-based standards to confirm or disconfirm the hypotheses. Hence, the process of construction and measurement of risks is taken to be value-free and tends to exclude the subjective role played by the selective world views, ways of seeing or frameworks of the researchers who design and develop risk management models (Lupton, 1999a).
There is, however, a balance-scale assumption in this approach: the likelihood of positive events (upside gains) is balanced against the likelihood of threatening or negative events (downside losses). Upside gains and downside losses are the two general dimensions of risk, such that one should find a trade-off between the likelihood of the risks and the rewards of making a decision or undertaking an activity (Sortino and Satchell, 2001).
Whether positive or negative, this view positions risk in the language of probabilities and quantitative reasoning. The risk of an event (positive or negative) is then the probability of the event multiplied by its estimated loss or gain if the event actually occurs (PMI, 2004). Risk can thus be dissected into its probability and impact components, as shown in Fig. 1.
By and large, the concept of risk in this view has become construed in terms of probabilities (Bernstein, 1996) and as such it lends itself to measurement, bringing the assessment of an event into decision-making frameworks using probabilistic likelihoods (Clegg and Bailey, 2007). As Hansson (2007) points out: 'In decision-making under risk, we know what the possible outcomes are and what are their probabilities.' Perhaps a more adequate term for this would be decision-making under knowable and measurable probabilities. The loss or gain can be measured in financial terms, time, corporate reputation and the like (Smith, 2003). In the scientific approaches to risk, on the whole, uncertainty lies in the likelihood of an event or threat causing future loss, not in the event per se.
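The expected-value definition of risk described above can be sketched in a few lines of Python. This is an illustrative sketch only; the function name and the figures used are hypothetical, not drawn from PMI (2004) or any other cited source:

```python
def risk_exposure(probability: float, impact: float) -> float:
    """Risk of an event: its probability multiplied by its estimated
    loss (negative impact) or gain (positive impact)."""
    if not 0.0 <= probability <= 1.0:
        raise ValueError("probability must lie in [0, 1]")
    return probability * impact

# A hypothetical threat: a 10% chance of a $50,000 loss
# yields an expected loss of $5,000.
print(risk_exposure(0.10, 50_000))
```

On this view, two quite different events (a likely small loss and an unlikely large one) can carry the same numerical risk, which is precisely the kind of reduction the broader approaches discussed later call into question.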
Fig. 1: The risk of an event Z
Fig. 2: The scientific/rational model of risk
Hence, explicit here is the assumption that in conventional risk management a threat can be knowable and its probability of occurrence estimated through probabilistic reasoning (e.g., based on previous experience, as in actuarial tables, or some other formulation). In such a rational process of risk management (Fig. 2), known risks are those that have been (1) systematically identified, analyzed and classified, (2) ranked and prioritized in terms of likelihood and likely impact and (3) resolved, i.e., pre-empted, avoided, mitigated or transferred and monitored continuously. Therefore, underlying this approach is the fundamental assumption that through a series of orderly judgments it is possible to list and manage selected risks proactively so as to secure a desired future state rather than leave them to fate (Giddens, 1999) or to the whim of the gods (Bernstein, 1996).
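The three-step rational process above can be illustrated with a minimal risk-register sketch in Python. The register entries, names and figures are hypothetical, invented purely to show step (2), ranking by likelihood and likely impact:

```python
# A hypothetical risk register: each identified risk (step 1) carries an
# estimated probability of occurrence and an estimated financial impact.
risks = [
    {"name": "supplier delay", "probability": 0.30, "impact": 20_000},
    {"name": "data breach",    "probability": 0.05, "impact": 500_000},
    {"name": "staff turnover", "probability": 0.50, "impact": 8_000},
]

# Step 2: rank and prioritize by expected impact (probability x impact),
# highest exposure first.
ranked = sorted(risks, key=lambda r: r["probability"] * r["impact"],
                reverse=True)

for r in ranked:
    print(f'{r["name"]}: expected impact {r["probability"] * r["impact"]:,.0f}')
```

Step (3), resolving each ranked risk by pre-empting, avoiding, mitigating or transferring it, then lies outside the calculation itself; the model only orders the contingencies it has already been told about, which is exactly the limitation the next section turns to.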
THE DILEMMA OF NON-RATIONAL GODS
There are, however, unknown risks that cannot be identified in advance. According to the Project Management Institute: 'unknown risks cannot be managed proactively and a prudent response by the project team can be to allocate general contingency against such risks, as well as against any known risks for which it may not be cost-effective or possible to develop a proactive response' (PMI, 2004).
Implicit in this statement is the possibility of unknowable events, whose character and associated loss or gain are both uncertain. This implies that tackling situations of uncertainty, where both the event and its likelihood are unknown, simply falls outside the logic of these instrumental treatments of risk. Apparently, those who develop contemporary risk management models are not very committed to engaging with the recent and broader sociocultural perspectives (Lupton, 1999a) or willing to develop theoretical frameworks that can integrate both narrow technical/rational and broader sociopolitical/cultural processes (Beck, 1999; Giddens, 2003; Wynne, 2002). For proponents of the scientization of risk, recognition of unknown unknowns (Hoffmann and Wynne, 2002), or reflection upon the depth and details of professional practice in handling uncertainty (the use of intuition, emotion, hunches, cues, etc.), usually means the termination of discussion rather than the opening up of inquiry.
It is common, therefore, for scientific approaches to risk to regard ambiguity and uncertainty in hypotheses as relevant only in so far as the logic of their standards and scientific methods is concerned. As Wynne (1988) argues, this school addresses only those uncertainties that are tractable to scientific analysis and that would otherwise introduce complexity. Contingencies that fall beyond the scope of scientific investigation, or that cannot be interpreted in terms of a cause-effect relationship, are considered irrational and hence overlooked or selectively screened out. Instead of emphasizing the uncovering of the limits to knowledge, scientific approaches to risk usually tend to prove existing knowledge of risk to be legitimate and correct (Hoffmann and Wynne, 2002).
It can be argued, particularly following Bernstein, that the idea of risk and risk management has gradually developed over the past centuries to help humanity, in its so-called sacred affairs as God's gardener, to discover the agenda of the rational gods and, if necessary, use it in opposition to received authority. The commonsense and dominant definitions of risk, therefore, revolve around the ways by which practitioners can handle future contingencies using risk management models as an analytic tool for intervention (Attar, 2010b).
However, there are a number of difficulties with this view. Many advocates of the scientific approaches to risk have become increasingly sensitive to the phenomenon of Knightian uncertainty (Knight, 2006) and it has become commonplace for them to speak of unmeasurable uncertainty, in which problems do not lend themselves to the application of clear-cut techniques and measures (Smith, 2003).
Seen from the scientific approach, the concept of risk is a statistical one and risk management in its most general sense finds its place in the practice of probabilistic reasoning. It is based on an assumption that there is or can be a clear definition of the problems, future events, alternatives, or the objects at stake. It is seen as possible to identify the likely outcomes, estimate the likelihood of their occurrence, assign probabilities and manage the selected risks (Attar, 2010a).
Situations of Knightian uncertainty, however, resist analysis in such risk management terms even though action is required (Schon, 1967). In such conditions, the phenomenon or situation faced is, as Dewey (1938) observes, inherently problematic. It does not easily lend itself to precise quantitative expression because possible outcomes or alternatives are unknown, vaguely defined, unmeasurable or only dimly apparent at the outset (Lester and Piore, 2004; Knight, 2006). Such situations can be both unique and pressing; at times something needs to be done quickly without a clear definition of the problems, because there is too much competing information, or too little, to make an informed decision (Schon, 1967). In such situations one must invent and reinvent received wisdom about what to do, given that the problems faced are multifaceted, means and ends are fuzzy, alternatives are ill-defined, outcomes are indeterminate and the smallest impulse may generate flaws or happy accidents which alter one's experience of the situation and ultimately the whole course of action (Dewey, 1930; Schon, 1983). There are often mismatches between what one intends (intention), what one can put into practice (implementation) and what emerges and how one perceives it (realization), which block the flow of the kind of systematic, orderly activity and rational problem-solving recommended in standard risk management methodologies. In these situations one usually has to set and reset the problems, as well as the likely relevant scenarios, again and again and only on occasion (or as an outcome) is one able tentatively to employ a calculus of probabilities (Attar, 2010b).
In addition to this argument, the problematic and inadequate nature of the scientific approaches to risk has more recently been recognized by social scientists (Beck, 1992; Giddens, 1999), sociologists of technology (Wynne, 2002), organization analysts (Flyvbjerg et al., 2003) and designing engineers and practitioners of innovation (Bucciarelli, 1994; Friend and Hickling, 2005), to name but a few. These scholars have become increasingly sensitive to uncovering broader dimensions of risk. In terms similar to Knightian uncertainty, they speak of chaotic, unique, puzzling and fluctuating environments in which problems and risks do not lend themselves to technical/scientific models of benefit-cost or probabilistic reasoning. Likewise, practitioners have become acutely aware that they are usually confronted with problematic and messy situations to which they must respond under conditions of anxiety and of limited time and budgets, which leave no room for probabilistic quantification and risk calculations. Some practitioners argue that surprises, ambiguities, material properties and the dread of failure are drowned in a sea of calculations as the risk management process fosters an exaggerated trust in the language of calculable risks (Flyvbjerg et al., 2003).
In response to these limitations, broader approaches to risk and risk management have been formulated including heroic firefighting (Bohn, 2000), system accident (Perrow, 1999), high reliability and mindfulness (Weick and Sutcliffe, 2001), cultural theory of risk (Douglas, 1966, 1992), governmental approaches to risk (Lupton, 1999a, b), the risk society perspective (Beck, 1992) and the new conservatism using risk (Power, 2004).
This study seeks to examine and explore why risk and risk management have become key concepts in many modern societies and organizations. It traces how the notion of risk was invented in the first place, why it was scientized and how risk management from the scientific approach has influenced many academic disciplines. The study then argues that, according to some recent and broader approaches to risk and its management, there are a number of difficulties with this dominant scientific view. Accordingly, the paper calls for a more detailed examination and elaboration of the nature of risk from social, cultural and political perspectives as well as the Knightian account of uncertainty.