In the risk assessment step, the enterprise identifies the critical risks to its strategy, analyses and evaluates them, and prioritizes them. Risk assessment has been the traditional focus of many risk managers for decades. However, in ERM critical risks include all risks, whether operational, competitive, financial, regulatory or from other sources. Finally, both positive and negative risks are considered in the context of their criticality, i.e. how they could affect the strategy.
Jean-Paul Louisot
Formerly Université Paris 1 Panthéon-Sorbonne, Directeur pédagogique du CARM Institute, Paris, France
Laurent Condamin, Ph.D
Consultant and CEO, ELSEWARE
Patrick Naim
Consultant and Partner, ELSEWARE
Enterprise-wide risk management (ERM) is a key issue for boards of directors worldwide. Its proper implementation ensures transparent governance with all stakeholders' interests integrated into the strategic equation. Furthermore, risk quantification is the cornerstone of effective risk management, at the strategic, tactical, and operational level, covering finance as well as ethics considerations. Both downside and upside risks (threats and opportunities) must be assessed to select the most efficient risk control measures and to set up efficient risk financing mechanisms. Only thus will an optimum return on capital and a reliable protection against bankruptcy be ensured, i.e. long-term sustainable development.
Within the ERM framework, each individual operational entity is called upon to control its own risks, within the guidelines set up by the board of directors, whereas the risk financing strategy is developed and implemented at the corporate level to optimize the balance between threats and opportunities, systematic and non-systematic risks, pre- and post-loss financing and finally retention and transfer.
However, those risk reduction measures, including risk avoidance, that entail substantial investments and financial impacts may have to be referred to top management for approval within the global financial strategy.
However daunting the task, each board member, each executive and each field manager must be equipped with a toolbox enabling them to quantify the risks within their jurisdiction to the fullest possible extent and thus make sound, rational and justifiable decisions, while recognizing the limits of the exercise. Beyond traditional probability analysis, used by the insurance community since the 18th century, the toolbox offers insight into new developments like Bayesian expert networks, Monte Carlo simulation, etc., with practical illustrations of how to implement them within the three steps of risk management: diagnostic, treatment and audit.
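The Monte Carlo simulation mentioned above can be sketched in a few lines. The following is an illustrative frequency-severity model, not taken from the text: annual event counts are drawn from a Poisson distribution and individual loss amounts from a lognormal, and every parameter value is an assumption chosen purely for the example.

```python
import numpy as np

# Frequency-severity Monte Carlo sketch: annual aggregate losses.
# All parameter values below are assumptions chosen for illustration only.
rng = np.random.default_rng(seed=42)

N_SIMULATIONS = 100_000
MEAN_EVENTS_PER_YEAR = 2.0                 # assumed Poisson frequency
SEVERITY_MU, SEVERITY_SIGMA = 12.0, 1.5    # assumed lognormal parameters

# One simulated year = a random number of events, each with a random severity.
event_counts = rng.poisson(MEAN_EVENTS_PER_YEAR, N_SIMULATIONS)
annual_losses = np.zeros(N_SIMULATIONS)
for i, n in enumerate(event_counts):
    if n:
        annual_losses[i] = rng.lognormal(SEVERITY_MU, SEVERITY_SIGMA, n).sum()

expected_loss = annual_losses.mean()
var_99 = np.quantile(annual_losses, 0.99)  # 99th percentile of annual loss
print(f"Expected annual loss: {expected_loss:,.0f}")
print(f"99th percentile:      {var_99:,.0f}")
```

The simulated distribution, rather than a single expected value, is what supports the decisions discussed in the rest of this chapter: the gap between the mean and the high percentiles is precisely the uncertainty to be controlled or financed.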
Recent progress in risk management shows that the risk-management process needs to be implemented in a strategic, enterprise-wide manner, and must therefore account for conflicting objectives and trade-offs. This means that risk can no longer be limited to the downside effect; the upside effect must also be taken into account. The central objective of global risk management is to enhance opportunities while curbing threats, i.e. driving up stockholders' value while upholding other stakeholders' expectations. Therefore, risk quantification has become the cornerstone of effective strategic and enterprise-wide risk management.
The volatile context within which organizations must operate today calls for a dynamic and proactive vision aimed at achieving the organization's mission, goals and objectives under any stress or surprise. It requires a new expanded definition of “risks”. The “new” risk manager must think and look beyond the organization's frontiers, more specifically to include all the economic partners and indeed all the stakeholders of the organization. Special attention must be devoted to the supply chain, or procurement cloud, and the interdependences of all parties.
The ISO 31000:2009 standard provides a very broad definition of risk as the effect of uncertainty on the organization's objectives. It provides a road map to effective enterprise-wide risk management (ERM) rather than a compliance reference; this is why its principles and framework provide a track to explore.
But whatever the preferred itinerary, all managers will need to develop a risk register and quantify the possible or probable consequences of risks to make rational decisions that can be disclosed to the authorities and the public. In many circumstances, the data available are not reliable and complete enough to permit traditional probability and trend analysis; other toolboxes may then be required to develop satisfactory quantification models that help decision makers include a proper evaluation of uncertainty in any strategic or operational decision.
As a reminder, we believe that the cornerstone of risk management is the risk management process, complemented by a clear definition of what constitutes a risk or exposure:
Therefore, quantification is the key element of strategic – or holistic – risk management, as only a proper evaluation of uncertainties allows for rational decision-making. Only a robust perspective on risk can support the design of a risk management program, at both tactical and strategic levels, for implementation at the operational level. One of the key tasks of the risk manager is to design a risk management program and have it approved.
Risks are situations where damaging events may occur but are not fully predictable. Recognizing some degree of unpredictability in these situations means that events must be considered random. But randomness does not mean that these events cannot be analyzed and quantified!
Most of the risks that will be considered throughout this book are partially driven by a series of factors, or drivers. These drivers are conditions or causes that would make the occurrence of the risk more probable, or more severe.
From a scientific viewpoint, causation is the foundation of determinism: identifying all the causes of a given phenomenon would allow prediction of the occurrence and unfolding of this event. Similarly, the probability theory is the mathematical perspective on uncertainty. Even in situations where an event is totally unpredictable, the laws of probability can help to envision and quantify the possible futures. Knowledge is the reduction of uncertainty – when we gain a better and better understanding of a phenomenon, the random part of the outcome decreases compared to the deterministic part.
Some authors introduce a subtle distinction between uncertainty and volatility, the latter being an intrinsic randomness of a phenomenon that cannot be reduced. In the framework of deterministic physics, there is no such thing as variability, and apparent randomness is only the result of incomplete knowledge. Invoking Heisenberg's "uncertainty principle" in a discussion on risk quantification seems disproportionate. But if we do, we understand the principle as stating that ultimate knowledge is not reachable, rather than that events are random by nature:
“In the sharp formulation of the law of causality (if we know the present exactly, we can calculate the future) it is not the conclusion that is wrong but the premise.” (W. Heisenberg, 1969)
Risk management is maturing into a fully-fledged branch of the managerial sciences, dealing with the uncertainty any organization confronts due to more or less predictable changes in the internal and external context in which it operates, as well as evolutions in its ownership and stakeholders that may modify its objectives.
Judgment can be applied to decision making in risk-related issues, but rational and transparent processes called for by good governance practices require that risks be quantified to the fullest extent possible. When data are insufficient, unavailable or irrelevant, expertise must be called upon to quantify impacts as well as likelihoods. This is precisely what this chapter is about. It will guide the reader through the quantification tools appropriate at all three steps of the risk management process: diagnostic to set priority; loss control and loss financing to select the most efficient methods with one major goal – long-term value to stakeholders – in mind; and audit to validate the results and improve the future.
The analysis of recent major catastrophes outlines three important features of risk assessment. First, major catastrophes always hit where and when no one expects them. Second, it is often inaccurate to consider they were fully unexpected, but rather that they were consciously not considered. Third, the general tendency to fight against risks that have already materialized leaves us unprepared for major catastrophes.
A sound risk assessment process should not neglect any of these points. What has already happened could strike again; and it is essential to remain vigilant. What has never happened may happen in the future, and therefore we must analyze potential scenarios with all available knowledge.
The Bayesian approach to probabilities can make an interesting contribution to this problem. The major contribution of Thomas Bayes to scientific rationality was to express clearly that uncertainty is conditioned on available information. In other words, risk perception is conditioned by someone's knowledge.
Using the Bayesian approach, a probability (i.e. a quantification of uncertainty) cannot be defined outside an information context. Roughly speaking, "what can happen" is meaningless. I can only assess what I believe is possible. And what I believe possible is conditioned by what I know. This view is perfectly in line with an open approach to risk management. The future is "what I believe is possible". And "what I know" is not only what has already happened but also all available knowledge about organizations and their risk exposure. Risk management starts with knowledge management.
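The conditioning of belief on information can be made concrete with Bayes' rule. The numbers below are purely hypothetical: a prior belief that a plant harbors a latent defect is updated after a near-miss is observed.

```python
# Hypothetical numbers, for illustration only: updating a belief about a
# risk ("the plant has a latent defect") after observing a near-miss.
p_defect = 0.01                      # prior: P(defect)
p_nearmiss_given_defect = 0.50       # likelihood: P(near-miss | defect)
p_nearmiss_given_no_defect = 0.05    # P(near-miss | no defect)

# Bayes' rule: P(defect | near-miss)
p_nearmiss = (p_nearmiss_given_defect * p_defect
              + p_nearmiss_given_no_defect * (1 - p_defect))
posterior = p_nearmiss_given_defect * p_defect / p_nearmiss
print(f"P(defect | near-miss) = {posterior:.3f}")  # rises from 1% to ~9%
```

The same evidence, held by someone with a different prior, would yield a different posterior: this is the sense in which "what I believe is possible" is conditioned by "what I know".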
Reducing the risks is the ultimate objective of risk management, or should we say reducing some risks. Because risks cannot be totally suppressed – as a consequence of the intrinsic incompleteness of human knowledge – risk reduction is a trade-off.
Furthermore, even when knowledge is not the issue, it may not be “worth it” for an organization to attempt a loss reduction exercise, at least not beyond the point when the marginal costs and the marginal benefits are equal. Beyond that point it becomes uneconomical to invest in loss control. Then two questions will have to be addressed:
Beyond the macro-micro distinction, there are individual variations in the perception of risk by each member of a given group, and the final decisions may rest heavily on the perception of risk by those in charge of the final arbitration. This should be kept in mind throughout the implementation of the risk management process. Why do people build in natural-disaster-prone areas without really taking all the loss reduction measures available, while at the same time failing to understand why an insurer will refuse to offer them the cover they want, or at a premium they are willing to pay?
Every individual builds his own representation of the world, which dictates his perception of risks, and the structural invariants in his memory help in understanding the decisions he reaches. His reasoning is based on prototypes or schemes that will influence those decisions. In many instances, decisions rest on a thinking process based on analogies: decision makers try to recall previous situations analogous to the one they are confronted with. Therefore, organizing systematic feedback at the unit level and conducting local debriefings should lead to a better grasp of the local risks and a treatment more closely adapted to the reality of the risks to which people are exposed.
This method should partially solve the paradox we have briefly described above, as the gradual construction of a reasonable perception of risk in all should lead to more rational decisions.4
It remains to take into account the pre-crisis situation, when the deciders are under pressure and where the time element is key to understanding sometimes disastrous decisions. Preparing everyone to operate under stress will therefore prove key to the resilience of any organization.
From a quantitative point of view, the implementation of any risk control measure will:
Therefore, the cost of risks is the sum of three elements: accident losses, loss control cost, and opportunity cost.5 These elements are of course interrelated. Reaching the best configuration of acceptable risks is therefore an optimization problem, under budget and other constraints. From a mathematical point of view, this is a well-defined problem.
Of course, since loss reduction actions have an intrinsic cost, there is no way to reduce the cost of risks to zero. Sometimes, the loss control action is simply not worth implementing. The opportunity cost is also essential: ignoring this dimension of the loss control would often result in a very simple optimal solution – reducing the exposure to zero, or in other words, stopping the activity at risk! This loss control method is called avoidance, and will be discussed further.
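The trade-off described above can be illustrated with a toy optimization. Every curve below is an assumption invented for the example: spending on loss control reduces expected accident losses but adds a direct cost and an opportunity cost, and the optimum lies where the total cost of risk is minimal, not where losses are zero.

```python
import numpy as np

# Toy cost-of-risk optimization; every curve below is an assumption made
# for illustration, not a real cost model.
spend = np.linspace(0.0, 10.0, 1001)           # loss-control budget (M EUR)
expected_losses = 8.0 * np.exp(-0.5 * spend)   # assumed: losses fall with spend
opportunity_cost = 0.3 * spend                 # assumed: capital tied up

# Cost of risks = accident losses + loss control cost + opportunity cost.
total_cost = spend + expected_losses + opportunity_cost
best_spend = spend[np.argmin(total_cost)]
print(f"Optimal loss-control spend: {best_spend:.2f} M EUR")
# The optimum is interior: spending enough to drive losses to (near) zero
# would cost more than the losses it avoids.
```

With these assumed curves, the marginal condition of the text (stop where marginal cost equals marginal benefit) gives the same interior optimum the grid search finds.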
As we will see, the quantitative approach to risks is a very helpful tool for selecting the appropriate loss control actions. But here we must be very careful as four categories of drivers can be identified:
When a first set of risk models is created during the risk assessment phase, the use of observable and hidden drivers would generally be limited to the initial risk assessment, simply because they cannot assist in the evaluation of the impact of proposed loss reduction measures.
For instance, when dealing with terrorist risks, the hostility of potential terrorists cannot be measured. When dealing with operational risks, the training level and the workload of the employees certainly impact the probability of a mistake. However, this dependency is very difficult to assess. But should these drivers be ignored in risk reduction? Should a state ignore the potential impact of a sound diplomacy or communication to reduce terrorist exposure? Should a bank neglect to train its employees when striving to improve the quality of service and reduce the probability of errors?
Simply said, we must recognize that causal models of risks are partial. And, although using this type of model is a significant improvement when dealing with risk assessment, such models should only be considered a contribution when dealing with risk reduction.
Risk financing is part of the overall medium- and long-term financing of any organization. Therefore, its main goal is derived from the goals of the finance department, i.e. maximizing return while avoiding bankruptcy: obtaining the maximum return on investments for the level of risk acceptable to the directors and stockholders. In economic terms, that means riding on the efficient frontier.
To reach this goal the organization can use a set of tools aimed at spreading through time and space the impact of the losses it may incur, and more generally taking care of the cash flows at risk. However, deciding whether it can retain or must transfer the financial impact of its risks cannot be based merely on a qualitative assessment of risks. A quantitative evaluation of risks is necessary to support the selection of the appropriate risk financing instruments, to negotiate a deal with an insurer or understand the cost of a complex financing process.
The question is to identify the benefits of building a model which quantifies the global cost of risks, thus providing the risk manager with a tool to test several financing scenarios: this is where quantification enhances the process of selecting a risk financing solution. Financing is the third leg of risk management: it builds on the initial diagnostic and comes after all reasonable efforts at reducing the risks have been selected. Risk financing, even more than risk diagnostic or reduction, requires an accurate knowledge of your risks. "How much will you transfer?" and "How much will you retain?" are questions about quantities, and answering them obviously requires fairly precise figures.
Insurance premiums are set on the basis of quantitative models developed by the actuaries of insurance and reinsurance companies. Thus, insurance companies presumably have an accurate evaluation of the cost of your (insurable) risks. The problem is to ensure a balanced approach at the negotiation table. You cannot expect to hold a strong position when negotiating your insurance premiums equipped with only a qualitative knowledge of your risks. You may try, but it will be difficult to convince an insurer.
A complex financing program is usually expensive to set up, and sometimes to maintain; therefore the organization must make sure that the risks to be transferred are worth the effort. As part of their governance duties, the board of directors will expect from the finance director a convincing justification of the proposed program both in terms of results and efforts.
Any decision concerning the evaluation or the selection of a financing tool must be based on a quantified knowledge of your risks. Defining the appropriate layers of risk to be retained, transferred, or shared involves a clear understanding of the distribution of potential losses. Deciding whether you are able to retain a €10 million loss requires that you at least know the probability of occurrence of such a loss.
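As a sketch of the kind of figure required, the following estimates the probability of exceeding a €10 million retention from a simulated loss distribution; the lognormal parameters are assumptions, not market data.

```python
import numpy as np

# Sketch: probability of a loss exceeding a candidate retention level,
# estimated from a simulated loss distribution (parameters are assumptions).
rng = np.random.default_rng(seed=7)
losses = rng.lognormal(mean=14.0, sigma=1.2, size=200_000)  # per-event losses, EUR

RETENTION = 10_000_000  # candidate 10 million euro retention
p_exceed = (losses > RETENTION).mean()
print(f"Estimated P(loss > 10M EUR): {p_exceed:.2%}")
```

Whether that exceedance probability is acceptable is then a question of risk appetite and available capital, but it can at least be debated on a number rather than on an impression.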
Before developing any risk financing programs, the first decision concerns what risks must be financed. This issue should be addressed during the diagnostic step. Diagnostic has been extensively developed in a previous article. This step provides a model for each loss exposure and sometimes a global risk model. This model quantifies:
Developing and implementing a risk financing solution involves being able, at least, to measure beforehand the cost of retention and the cost of transfer, and this is possible only by combining the risk model with a mathematical formalization of the financing tool's cost.
Risk quantification is essential for risk diagnostic, risk control and risk financing. Indeed, all steps of the risk management process require an accurate knowledge of the risks an organization faces, of the levers it could use to control them, and finally of what remains to be financed.
However, under certain circumstances, an organization could still rely on qualitative assessment to identify and control its risks. For risk financing, by contrast, qualitative assessment is definitely not adequate for evaluating premiums, loss volatility, etc. An accurate quantification of risks is necessary for rational risk financing.
Several motivations lead the risk manager to address the quantification of risks:
However, modeling the risks may prove insufficient if we want to address the three challenges listed above. We also have to model the financing program. A general framework can be developed in which any financing program is considered as a set of elementary financing blocks, which then serve as a base model for the present financing program. This model should suffice for many of the classical financing tools – self-insurance and informal retention, first-line insurance, excess insurance, retro-tariff insurance, captive insurers, cat bonds – but it should be adapted to take a complex financing set-up into account.
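The idea of elementary financing blocks can be illustrated by the most basic one: a layer paying losses in excess of an attachment point, up to a limit. The program below, with a €1 million retention, a €4 million first line and a €10 million excess layer, is entirely hypothetical.

```python
def layer_payout(loss, attachment, limit):
    """Payout of an insurance layer covering `limit` excess of `attachment`."""
    return min(max(loss - attachment, 0.0), limit)

# Hypothetical program (all figures in euros, chosen for illustration):
# 1M retained, a 4M first line above it, a 10M excess layer above that.
def split_loss(loss):
    first_line = layer_payout(loss, 1_000_000, 4_000_000)
    excess = layer_payout(loss, 5_000_000, 10_000_000)
    retained = loss - first_line - excess  # below retention + above the program
    return retained, first_line, excess

for loss in (500_000, 3_000_000, 20_000_000):
    retained, first, xs = split_loss(loss)
    print(f"loss {loss:>12,.0f} -> retained {retained:>12,.0f}, "
          f"first line {first:>12,.0f}, excess {xs:>12,.0f}")
```

Applying such payout functions to every simulated loss from the risk model yields the distribution of retained versus transferred cost, which is exactly the comparison the financing decision requires.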
But even if an organization did its best and built an accurate model of its risks and financing tools, and even if it is able to evaluate the theoretical premium it should pay, the market will decide the actual price the organization pays to transfer its risks. This market may be unbalanced for some special risks. When insurance supply is tight, actual premiums could differ from the theoretical premiums calculated by models. This does not invalidate the need for accurate quantification: even if the final cost of transfer depends on the insurance market, the organization should be aware of that fact and assess the difference. Also, the liquidity of the insurance markets is likely to increase as they become connected with the capital markets. The efficiency of these markets leads us to expect that the price to be paid for risk transfer will tend to be the "right" one.
W. Heisenberg (1969), Der Teil und das Ganze. Munich: Piper. English: Physics and Beyond: Encounters and Conversations. A.J. Pomerans, trans. (New York: Harper & Row).
Georges-Yves Kervern
Ancien Élève de l'École Polytechnique, and founder of Cindynics
Jean-Paul Louisot
Formerly Université Paris 1 Panthéon-Sorbonne, Directeur pédagogique du CARM Institute, Paris, France
One of the major difficulties for a risk manager is not only to identify and quantify the emerging risks, the known-unknowns, but also to imagine those that are not yet emerging, the unknown-unknowns. For that brainstorming exercise there is no relying on past events, on databases, or on mathematical models, which have no basis on which to be developed. Even systems safety approaches fall short of a total vision: they incorporate human elements as components of the system with their own failure rates, but fail to really take account of what is now known as the "human factor". The human element is part of the system, but also acts to modify it for his own benefit. Understanding everyone's motivation and point of view is essential to foresee what may contribute to future risks, opportunities and threats.
It is with this objective in view that some French scientists and executives, led by Georges-Yves Kervern, developed a new approach to foreseeing, rather than forecasting, future developments when they imagined the “hyperspace of danger” and founded what they called Cindynics.
Since the early 1990s, a group of practitioners gathered around Jean-Luc Wybo, a professor at the École Nationale Supérieure des Mines de Paris, to develop practical examples of using Cindynics to understand past complex events and project their findings for future action and decision making. Their applications included "Explosion", for example the explosion at the AZF factory in France on September 21, 2001; "Pollution", like that on the beaches of Brittany; and "Social Unrest" events, as occurred in the French suburbs, to name but a few.
Some trace the first step in Cindynics6 to the earthquake in Lisbon. Science starts where beliefs fade. The earthquake in Lisbon in 1755 was the source of one of the most famous polemical battles between Voltaire and Jean-Jacques Rousseau. Its main result was the affirmation that mankind should refuse fate. This is reflected in Bernstein's7 comment: "Risk is not a fate but a choice."
In a way, the Lisbon episode may well be the first public manifestation of what is essential in managing risks: a clear refusal of passively accepting “fate”, a definite will to actively forge the future through domesticating probabilities, thus reducing the field of uncertainty.
However, since the financial crisis of 2008, black swans or fat tails have represented a major challenge to all professionals in charge of the management of organizations. Clearly, the traditional approaches to identifying and quantifying uncertainties based on probability or trend analysis are at a loss in a world that changes fast and may be subject to unexpected, and sometimes unsuspected, ruptures.
As a matter of fact, these “dangerous or hazardous” situations can develop into opportunities or threats depending on how the leadership can anticipate them and exploit them for the benefit of their organization, and its growth in a resilient society.
Human factors are a key factor in the anticipation and development of such situations. Although it is essential that decision-makers learn to make decisions under uncertainty, it is far from sufficient to prepare for the black swans. Furthermore, system safety approaches that consider the human component as a physical element fall short of taking into account the fact that humans are part of a complex system that they influence and try to change to their benefit; and the system can be affected and modified even through a simple act of observation.
In such a volatile situation, the concepts developed as early as the late 1980s could prove very valuable if properly used and translated into practical tools, even though they may appear at first too conceptual for practical application. As a matter of fact, the concepts of "Cindynic situation" and "hyperspace of danger" allow for the identification of divergences between groups of stakeholders in a given situation, and thus for the anticipation of "major uncertainties", making it possible to work on them to reduce their likelihood and/or their negative consequences (threats) while enhancing the positive consequences (opportunities).
This scientific approach to perils and hazards was initiated in December 1987, when a conference was called at the UNESCO Palace. The name "Cindynics" was coined from the Greek word "kindunos", meaning hazard. Many industrial sectors were in a state of shock after major catastrophes like Chernobyl, Bhopal and Challenger; these offered an open field for experience and feedback loops. Since then, Cindynics has continued to grow through teaching in many universities in France and abroad. The focal point is a conference organized every other year. Many efforts have been concentrated on axiology and attempts at objective measures. Before his death in December 2008, Professor Georges-Yves Kervern reviewed the presentation that follows (see bibliography) in the light of the most recent developments in Cindynics through the various conferences, up to September 2008.
The first concept, situation, requires a formal definition. This in turn can be understood only in the light of what constitutes a perils and hazards study. According to the modern theory of description, a hazardous situation (Cindynic situation) can be defined only if:
The field of "hazards study" is clearly identified by the perspective of the observer studying the system. At this stage of the development of the sciences of hazards, the perspective can follow five main dimensions.
First dimension: Memory, history – Statistics (the space of statistics)
This consists of all the information contained in the data banks of large institutions and feedback from experience (Électricité de France power plants, Air France flight incidents, forest fires monitored by the Sophia Antipolis Centre of the École des Mines de Paris, claims data gathered by insurers and reinsurers).
Second dimension: Representations and models drawn from facts – Epistemic (the space of models)
This is the scientific body of knowledge that allows for the computation of possible effects using physical and chemical principles, material resistance, propagation, contagion, explosion and geo-Cindynic principles (floods, volcanic eruptions, earthquakes, landslides, tornadoes and hurricanes, for example).
Third dimension: Goals and objectives – Teleological (the space of goals)
This requires a precise definition, by all the actors and networks involved in the Cindynic situation, of their reasons for living, acting and working.
In truth, it is an arduous and tiresome task to express clearly why we act as we act, and what motivates us. However, it is only too easy to identify an organization that "went overboard" only because it lacked a clearly defined target. For example, there are two common objectives for risk management: "survival" and "continuity of customer (public) service". These two objectives lead to fundamentally different Cindynic attitudes. The organization, or its environment, will have to harmonize these two conflicting goals. This is what we call a "social transaction", which is hopefully resolved democratically.
Fourth dimension: Norms, laws, rules, standards, deontology, compulsory or voluntary, controls, etc. – Deontological (the space of rules)
This includes all the normative sets of rules that make life possible in a given society. For example, the need for a highway code was felt as soon as there were too many automobiles to make it possible to rely on courtesy of each individual driver: the code is compulsory and makes driving on the road reasonably safe and predictable. The rules for behaving in society, like how to use a knife or a fork when eating, are aimed at reducing the risk of injuring one's neighbor as well as a way to identify social origins.
On the other hand, there are situations in which the codification is not yet clarified. For example, skiers on the same slope may be of widely different expertise, thus endangering each other. In addition, some use equipment not necessarily compatible with the safety of others (cross-country skis, snowboards, etc.). How can a serious analysis of accidents on ski slopes be conducted? Should experience-drawn codes be enforced? How can rules be defined if objectives are not clearly defined beforehand? Should we promote personal safety or freedom of experimentation?
Fifth dimension: Value systems – Axiological (the space of values)
It is the set of fundamental objectives and values shared by a group of individuals or other collective actors involved in a Cindynic situation.
As an illustration, when the forefathers declared that "the motherland is in danger", the word motherland, or "patria" (hence the word patriot), meant the shared heritage that, after scrutiny, can best be summarized in the fundamental values shared. The integrity of this set of values may lead the population to accept heavy sacrifices. When the media use the word apocalyptic or catastrophic, they often mean a situation in which our value system is at stake.
These five dimensions, or spaces, can be represented on a five-axis diagram and Figure 2.1 is a representation of the “hyperspace of danger”.
By combining these five dimensions – these five spaces – in different ways, one can identify some traditional fields of study and research.
Combining facts (statistics) and models gives the feedback loop so crucial to most large corporations' risk managers.
Combining objectives, norms and values leads to practical ethics. Social workers have identified authority functions in this domain. These functions are founded on values that frame the objectives and define the norms they thereafter enforce. If there is no source of authority to enforce the norms, daily minor breaches will soon lead to major breaches, and soon the land will dissolve into a primitive jungle.
This new extended framework provides a broader picture that makes it possible to visualize the limitations of actions too often conducted with a narrow scope. Any hazard study can be efficient only if complete, i.e. extended to all the actors and networks involved in the situation. The analysis must then cover all five dimensions identified above.
The first stage of the diagnostic described above consists of identifying the networks and their state in the five dimensions, or spaces, of the Cindynic model. The next step is to recognize the incoherencies, or dissonances, between two or several networks of actors involved in a given situation.
These dissonances must be analyzed from the point of view of each of the actors. It is therefore necessary to analyze dissonances in each dimension and between the dimensions. In this framework, the risk control instrument we call prevention is aimed at reducing the level of hazard in any situation. In a social environment, for example, some actors may feel that an “explosion is bound to occur”. This is what is called the Cindynic potential. The potential increases with the dissonances existing between the various networks on the five spaces.
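As a purely illustrative formalization, our own and not drawn from the Cindynics literature, one might describe each actor network by the items it holds in each of the five spaces, measure dissonance per dimension as the share of items the two networks do not share, and aggregate across dimensions:

```python
# Minimal formalization sketch (our own, not from the Cindynics literature):
# each actor network is described by the items it holds in each of the five
# spaces; dissonance is measured per dimension, then averaged.
DIMENSIONS = ("statistics", "models", "goals", "rules", "values")

def dissonance(net_a, net_b, dim):
    a, b = net_a[dim], net_b[dim]
    union = a | b
    return len(a ^ b) / len(union) if union else 0.0  # Jaccard distance

def aggregate_dissonance(net_a, net_b):
    return sum(dissonance(net_a, net_b, d) for d in DIMENSIONS) / len(DIMENSIONS)

# Two hypothetical networks in the same situation (all items invented).
management = {"statistics": {"incident database"}, "models": {"fault tree"},
              "goals": {"continuity of service"}, "rules": {"safety code"},
              "values": {"public safety"}}
operators = {"statistics": {"incident database"}, "models": set(),
             "goals": {"job security"},
             "rules": {"safety code", "informal practices"},
             "values": {"solidarity"}}
print(f"aggregate dissonance = {aggregate_dissonance(management, operators):.2f}")
```

On this toy measure, a prevention campaign that builds a shared "minimum platform" in a dimension shrinks the symmetric difference there and so lowers the aggregate score, mirroring the reduction of Cindynic potential described in the text.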
A prevention campaign will apply to the dissonances: an attempt at reducing them without trying to homogenize all five dimensions for all the actors. A less ambitious goal will be to attempt to develop for each dimension a “minimum platform” shared by all the actors' networks thus ensuring a common set of values as a starting point. In other words, it is essential to find:
The minimum foundation is to establish a list of points of agreement and points of disagreement. Developing a common list of points of disagreement is essential.
The definition of these minimum platforms is the result of:
The “defiance” between two networks, face to face, has been defined as a function of the dissonances between these two networks following the five dimensions. Establishing confidence, a trusting relationship, will require the reduction of the dissonances through negotiations, which will be the task of the prevention campaign. This process can be illustrated by three examples.
Family systemic therapy: Dr. Catherine Guitton8 focused her approach on dissonances between networks:
When healing is achieved for the patient singled out by the family, the result is obtained through work on the dissonances rather than through a direct process on the patient.
Adolescents and violence: Dr. M. Monroy's9 research demonstrates that violence typically found in the 15–24 age group is related to a tear, a disparity along the five dimensions. This system can be divided into two sub-systems between which a tremendous tension builds up.
These dissonances can lead the adolescent into a process of negotiation and aggression with violent phases in which he will play his trump card: his own life. From this may stem aggressions, accidents and even, sometimes, fatal outcomes of this process of scission, specific to adolescence.
The case of the religious sects: It is during this process of scission that the success of some sects in attracting an adolescent following may be found. Their ability to conceal from the adolescents their potential dangers comes from the fact they sell them a ready-made “turn-key” hyperspace. The kit, involving all five dimensions, is provided when the adolescent is ripe. As a social dissident, any adolescent needs to develop his own set of references in each of the five dimensions.
Violence in the sects stems from the fact that the kit thus provided is sacred. The sacredness prevents any questioning of the kit, and any escape is a threat to its sacredness. Therefore, escape must be repressed through violence, including brainwashing and/or physical abuse or destruction, as befits totalitarian regimes, which have become masters of large-scale violence.
In a recent book on the major psychological risk (see Bibliography), where the genesis of danger in the family is analyzed according to the Cindynic framework, Dr. M. Monroy tries to grasp all the issues by enumerating the actors involved in most of these situations.
Network I: Family
Network II: Friends and peers
Network III: Schooling and professional environment
Network IV: Other risk takers or stakeholders (bike riders, drug users, delinquents)
Network V: Other networks embodying political and civilian society (sources of norms, rules and values)
Network VI: Social workers and therapists
This list of standard networks makes it possible to spot the dissonances between them that build the Cindynic potential of the situation.
In the case of exposures confronting an organization, an analysis of the actors' networks according to the five dimensions facilitates the identification of the “deficits” specific to the situation. For example, the distance between what is and what should be provides an insight into the changes a prevention campaign should bring about. These deficits should be identified through a systemic approach to hazardous situations. They can be:
These deficits always appear in the reports of commissions established to inquire into catastrophes. It is striking to realize how all these reports' conclusions narrow down to a few recurring explanations.
How do these situations change? Situations with their dissonances and their deficits “explode” naturally unless they change slowly under the leadership of a prevention campaign manager.
In the first case, non-intentional actors of change are involved. The catastrophic events taking place bring about a violent and sudden revision of the content of the five dimensions among the networks involved in the “accident”. Usually all five dimensions are modified: revised facts, new models, new goals, implicit or explicit, new rules, and new values.
In the second case, that all organizations should prefer, the transformer chooses to act as such. He is the coordinator of the negotiation process that involves all the various actors in the situation. Deficits and dissonances are reduced through “negotiation” and “mediation”. The Cindynic potential is diminished so that it is lower than the trigger point (critical point) inherent to the situation.
Exchanges between different industrial sectors, Cindynic conferences and the research on complexity by Professor Le Moigne10 (University of Aix en Provence, derived from the work of Nobel Prize winner, Herbert A. Simon11) have developed some general principles. The Cindynic axioms explain the emergence of dissonances and deficits.
CINDYNIC AXIOM 5 – AMBIGUITY REDUCTION: Accidents and catastrophes are accompanied by brutal transformations in the five dimensions. When the ambiguity (or contradictions) in the content of the five dimensions becomes excessive, it will be reduced. This reduction can be involuntary and brutal, resulting in an accident, or voluntary and progressive, achieved through a prevention process.
The theories by Lorenz on chaos and Prigogine on bifurcations offer an essential contribution at this stage. It should be noted that this principle is in agreement with a broad definition of the field of risk management. It applies to any event generated or accompanied by a rupture in parameters and constraints essential to the management of the organization.
The main benefit of the use of these principles is to reduce the time lost in fruitless unending discussions on:
In a Cindynic approach, hazard can be characterized by:
Dissonances and deficits follow a limited number of “Cindynic principles” that can be broadly applied. They also offer fruitful insights to measures to control exposures that impact the roots of the situation rather than, as is too often the case, reduce only the superficial effects.
For more than a decade now, the approach has been applied with success to technical hazards, acts of God and more recently on psychological hazards in the family and in the city. It can surely be successfully extended to situations of violence (workplace, schools, neighborhoods, etc.). In some cases, it will be necessary to revisit the 7 principles to facilitate their use in some specific situations.
The objective is clear: Situations that could generate violence should be detected as early as possible, they should then be analyzed thoroughly, their criticality reduced and, if possible, eliminated.
Cindynics offer a scientific approach to anticipating risks, acting on them and improving their management. They thus offer a new perspective to risk management professionals, dramatically enlarging the scope of their action in line with the trend towards holistic or strategic risk management, while providing an enriched set of tools for rational action at the roots of danger.
Kervern, G.-Y. (1993) La Culture Réseau (Éthique et Écologie de l'entreprise). Paris: Éditions ESKA.
Kervern, G.-Y. (1994) Latest Advances in Cindynics. Economica.
Kervern, G.-Y. (1995) Éléments fondamentaux des cindyniques. Economica.
Kervern, G.-Y. and Rubise, P. (1991) L'archipel du danger: Introduction aux cindyniques. Economica.
Kervern, G.-Y. and Boulenger, P. (2008) Cindyniques: Concepts et mode d'emploi. Economica.
Kervern, G.-Y. (2000) “The Evil Genius in Front of the Risk Science: The Cindynics”, Risque et génie civil colloquium, Paris, France, 8 November 2000.
Wybo, J.-L. et al. (1998) Introduction aux Cindyniques. Paris: ESKA.
Jean-Paul Louisot
Formerly Université Paris 1 Panthéon-Sorbonne, Directeur pédagogique du CARM Institute, Paris, France
The purpose of this article is to zoom in on a subject not really addressed in the RM standards published worldwide, including ISO 31000; it proposes a practical tool for identifying, analysing and prioritizing the portfolio of exposures, opportunities as well as threats, that confronts any organization that envisions its future.
The “space of exposures” will prove a powerful tool for all embarking on the ERM (enterprise-wide risk management) journey, helping to “lift the fog of uncertainties in decision making and implementing”, to paraphrase a recurring theme in Felix Kloman's12 conferences and presentations.
The future is never known with certainty, “Who knows what tomorrow will bring?” but managing organizations means making decisions, enlightened by information drawn from different methods that shed light on the future.
For a long time, men have tried to improve tomorrow by influencing the forces that guide the future, or by offering sacrifices to the gods. It was only in the middle of the seventeenth century that Pascal and Fermat, followed by their successors, including Bernoulli, started developing ways to open the gates to the future by drawing on past and present experiences. Probability and trend analysis were the first approaches to see through the “cloud of unknowing”.13
During the last decade of the twentieth century, the development of risk management, resting on more elaborate forecasting models, seems to have focused on only the downside aspect of risk, the threats, and has slowly put aside the upside, usually called “opportunities”. Confronted with the uncertainties of the future, organizations are rediscovering that “threats” and “opportunities” – the yin and the yang of risk – represent two sides of the same coin.
It has never been more important that directors and officers, as well as investors, remember the basics of economic and financial theories. Risk is inherent to the undertaking of any human endeavor. Indeed, it is the acceptance of a significant level of risk that provides the type of return on investment that is expected by investors. The theory of finance defines the expected rate of return as the sum of two components:
The basic return of the riskless investment (usually measured by the US Treasury bond rate of similar maturity); and
The risk premium, i.e. an additional return that the investor deserves for having accepted a higher volatility of profit to enhance some societal goal, like improved technology, a new drug, etc.
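The two components above reduce to a one-line formula. The sketch below uses purely illustrative rates, not market data:

```python
# Sketch of the two-component expected-return formula described above.
# The rates used are illustrative assumptions, not market data.

def expected_return(risk_free_rate: float, risk_premium: float) -> float:
    """Expected rate of return = riskless basic return + risk premium."""
    return risk_free_rate + risk_premium

# Hypothetical example: a 3% treasury yield plus a 5% premium for
# accepting a higher volatility of profit.
rate = expected_return(0.03, 0.05)
print(f"Expected rate of return: {rate:.1%}")
```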
Of course, all volatilities are not “equal”. Traditionally, scientific authors distinguish between probabilistic future (risk) and non-probabilistic future (hazard).
Most of the time, deciders are in the first situation (risk): they have enough reliable data to compute a probability law or draw a trend line for future events, and can define confidence intervals, i.e. limits between the likely and the unlikely future. For example, by analysing past economic conditions, it should be possible to have a reasonable idea of the number of cars to be sold in the EU, the US or Australia. The recent, booming Chinese market may not lend itself easily to this type of reliable trending. While an automobile company can predict its mature markets with a fair degree of precision, forecasts are far more volatile in emerging markets. Therefore, there is a higher risk in marketing cars in emerging markets, but the reward may be much higher in case of success.
On the other hand, when launching a new model, especially if defects are revealed in the first year of sales, it is much more difficult to justify the investments than for existing models, whose profitability can be supported by reliable forecasts. Banks experienced a similar situation when they embarked on the management of operational risks to comply with the new Basel II14 requirements: when no data bank is at hand, experts' opinions have to be formalized using a Bayesian network15 approach and scenarios.
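The Bayesian network approach mentioned above can be sketched in miniature: one control node and one loss node, with expert-elicited probabilities combined through the law of total probability and Bayes' rule. All figures and node names here are hypothetical assumptions for illustration, not Basel II data:

```python
# Minimal sketch of formalizing expert opinion as conditional
# probabilities, in the spirit of a Bayesian-network approach.
# All probabilities are hypothetical expert estimates.

# Expert estimates: probability that an internal control fails, and
# probability of an operational loss given the control's state.
p_control_fails = 0.10
p_loss_given_failure = 0.40
p_loss_given_ok = 0.02

# Marginal probability of a loss (law of total probability).
p_loss = (p_control_fails * p_loss_given_failure
          + (1 - p_control_fails) * p_loss_given_ok)

# Bayes' rule: given that a loss occurred, how likely is it that the
# control had failed?  This is the diagnostic reading of the network.
p_failure_given_loss = p_control_fails * p_loss_given_failure / p_loss

print(f"P(loss) = {p_loss:.3f}")
print(f"P(control failed | loss) = {p_failure_given_loss:.3f}")
```

Scenario analysis then amounts to re-running such a model with the experts' probabilities stressed one node at a time.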
As a matter of fact, the above examples might be considered incorrect for looking only at the negative side of risks. However, operational risks also represent a significant opportunity for competitive advantage for the banks that invest more than others in this endeavor: not only are such banks likely to “save” on internal funds, they may even gain expertise that could benefit their clients in the longer term.
A current trend in risk management is to minimize risk silos in order to reach a real global optimization of the management of risk, taking into account for each unit, each process and each project both threats and opportunities. The organization is analysed as a portfolio of risks with an upside and downside that must be optimized, much as an investor would optimize a personal portfolio of shares.
In practice, this integration of all risks is achieved more easily for the financial consequences at the risk financing level. More and more economic actors consider their risk financing exercise as part of their overall long-term financial strategy. However, it is possible to integrate risk assessment and loss control provided all in charge (at whatever level) are included in the risk management process. This integrated approach is now called ERM and within ERM the managers become “risk owners”. The globalization of risk management is ensured through the principle of subsidiarity: the directors and officers should deal only with the exposures that have impact on the strategy, assured that the risk management process implemented throughout the organization will take care of “minor” threats and seize “tactical” opportunities. That should sound familiar to many in Australia, as this is a fundamental tenet of management guidelines in the Australian/New Zealand Framework (companion to AS/NZ 4360 used in various versions since 1995 and finally replaced since 2009 by the adoption of ISO 31000 under the name AS/NZ 31000:2009).
Risk management, risk mitigation and risk financing – indeed the word “risk” is used by all risk management professionals as well as by many others in their daily life. But do we really know what we mean by risk? The Australian RM standard states, “…the chance of something happening that will have an impact on objectives”, whereas ISO 31000 proposes an even wider definition: “the effect of uncertainties on objectives”. But this is not the final word because there are other common understandings of the word risk.
This definition, most commonly used by specialists, is compatible with the definition of risk in the ISO 31000:2009 standard, depending on the nature of the consequences involved.
Systematic risk (i.e. non-diversifiable) is the result of non-random causes that may affect all actors simultaneously; it therefore does not lend itself to diversification. As an example, all economic actors can be affected by a downturn in the economy or a rise in interest rates.
Unsystematic risk (i.e. diversifiable) is the result of random causes and lends itself to probabilistic approaches. It is specific to each individual economic entity and offers the possibility of building a “balanced portfolio” of risk sharing. Therefore, insurance cover can be designed for it.
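Why unsystematic risk lends itself to pooling can be illustrated with a short simulation, a sketch under assumed (hypothetical) loss figures: the volatility of the average loss over a pool of independent exposures falls roughly as one over the square root of the pool size, which is what makes a “balanced portfolio” of risk sharing, and hence insurance, possible.

```python
# Illustration of diversification: the standard deviation of the
# *average* annual loss over n independent, identical exposures
# shrinks as the pool grows.  All figures are hypothetical.
import random
import statistics

random.seed(42)

def average_loss(n_exposures: int, trials: int = 2000) -> list[float]:
    """Simulate the average annual loss over a pool of independent risks."""
    results = []
    for _ in range(trials):
        # Each exposure: 10% chance of a loss uniformly between 0 and 100.
        losses = [random.uniform(0, 100) if random.random() < 0.10 else 0.0
                  for _ in range(n_exposures)]
        results.append(sum(losses) / n_exposures)
    return results

for n in (1, 10, 100):
    sd = statistics.stdev(average_loss(n))
    print(f"pool of {n:>3} exposures -> std dev of average loss ~ {sd:.2f}")
```

A downturn in the economy, by contrast, hits every exposure in the pool at once, so no such averaging effect applies to systematic risk.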
This is a more restrictive definition as it refers to an event for which there is an insurance market. Furthermore, the word “risk” is commonly used by insurance specialists to refer to the entity at risk, the peril covered, the quality of the entity's risk management practices (level of risk), and the overall assessment of a site (“a good risk for its category”).
One must be cautious because this commonly used word may have totally different meanings for different individuals. This requires all involved to be aware of that diversity when communicating and consulting in the boardroom or with different stakeholders.
This reality must be kept in mind when communicating about risks, whatever the media or audience. Any risk management professional should always be aware of one of the main challenges of risk awareness and understanding: how risk is perceived by stakeholders and decision makers matters more than any “scientific” assessment of risk. Recommendation: whenever possible, avoid using such an ambiguous word and substitute a less common one: exposure, threat, opportunity, peril, impact, etc. are acceptable alternatives, but with a caveat: the appropriate term depends on which facet of “risk” is under discussion! Sometimes a specifically crafted professional word needing some explanation may prove safer than a commonly used word that may be understood differently within a group. A common taxonomy of terms is vital not only for understanding but also for developing consistent data across the organization.
The word “risk” has several meanings and can be misleading when used in a professional context, especially in the case of an organization communicating on risks with a broader audience. Therefore, practitioners and academics in risk management need to define a more precise concept. This is what George Head, who developed the Associate in Risk Management designation, attempted with the word “exposure” as early as 1975. However, Head's definition considered only the downside of risks. A new definition, more appropriate for today's global approach, is required:
An exposure is characterized by the consequences for the organization's objectives resulting from the occurrence of an unexpected (random) event that modifies its resources.
This definition allows an exposure to be clearly identified by three factors: the resource at risk, the peril (the random event that may modify it), and the impact, i.e. the consequences on the organization's objectives.
However, it should be noted that not all consequences touch only the organization under study; therefore, especially for the downside risk, it is essential to distinguish:
In a complete analysis, the upside risk should be included, as the threats to one organization may well create an opportunity for another organization!
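The three-factor exposure concept lends itself to a simple data structure. The sketch below is purely illustrative: the field and class names are assumptions, not a standard schema, and since only the information, partners' and free-resource classes are named explicitly in this text, the remaining class names are placeholders.

```python
# Hypothetical sketch of the resource/peril/impact triplet as a data
# structure; names are illustrative, not a standard schema.
from dataclasses import dataclass
from enum import Enum

class ResourceClass(Enum):
    HUMAN = "human"              # placeholder class name
    TECHNICAL = "technical"      # placeholder class name
    INFORMATION = "information"
    PARTNERS = "partners"
    FREE = "free"                # non-transactional resources from the environment

@dataclass
class Exposure:
    resource: ResourceClass      # the resource at risk
    peril: str                   # the random event that may modify it
    impact: float                # consequence on objectives (+ opportunity, - threat)

# The organization seen as a portfolio of exposures:
portfolio = [
    Exposure(ResourceClass.INFORMATION, "data breach", -2.5),
    Exposure(ResourceClass.PARTNERS, "key supplier failure", -4.0),
    Exposure(ResourceClass.FREE, "new public infrastructure", +1.2),
]
threats = [e for e in portfolio if e.impact < 0]
print(f"{len(threats)} threats out of {len(portfolio)} exposures")
```

Registering exposures this way makes the “risk register” directly filterable and sortable, the precondition for treating it as a portfolio.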
Once the concept of “exposure” is clearly mastered, it provides a model for developing a systematic approach to managing risks. As a result, any organization can be seen as a portfolio of exposures, with special attention to those that represent challenges to the optimal implementation of a strategy. The risk register suggested by the Australian standards appears as a list of “risk assets”. Therefore, the decision tools developed in finance for the management of investment portfolios, i.e. portfolio theory, are pertinent to implementing a rational decision-making process in risk management that will ultimately lead to sound governance.
Once again, the concept of exposure constitutes a step towards the integration of risk management in an organization's strategic process by leveraging opportunities and mitigating threats, not only as they are anticipated at the development stage but also as they materialize along the path of the implementation towards achieving strategic goals.
Any organization can be defined as a dynamic combination of resources pulled together to reach its goals and objectives. Therefore, developing and communicating these objectives is at the heart of any risk management, indeed any management, exercise. This is the reason why, for risk management purposes, we have defined an organization as a portfolio of exposures, both threats and opportunities, to be managed in the most efficient manner to reach these goals and objectives under any circumstances. Within the context of a competitive economy, efficiency means either reaching the most ambitious objectives with the available resources or reaching the assigned goals with as few resources as possible.
While many would agree on this simple approach, it must be determined how many classes of resources should be considered. The model proposed here is limited to a small number of classes, five, which together cover practically all the resources involved in the management of an organization and can be used to list the resources of a specific organization. This permits a systematic and global identification, because the classes of exposures are directly linked to the classes of resources. Each of these classes calls for specific forms of loss control measures. The five classes of resources are as follows:
I = Information: All the information that flows throughout the organization, in whatever form (electronic, paper, and human brains). This may cover information concerning the organization itself, information regarding others (medical files for patients in a hospital) but also what others may want to try to know, and what the organization wants to know about others (economic intelligence of different forms). Also included are intangibles like goodwill, credit score, rating agency evaluation, and other financial or cross-discipline metrics.
Furthermore, the ability to do business depends on the trust established with others: the perception that all the stakeholders have of the organization is an essential “asset”, the risks to reputation have become an important item in boards' risk agendas.
Market globalization has generated ever more complex webs that link many organizations worldwide through the externalization process. The large conglomerate has become more and more focused on conception, marketing and assembling parts from all over the world. Many smaller or medium size organizations are only one cog in a very complex supply network.
In most situations, we are confronted with a network of partners rather than a chain, indeed a cloud when the frontiers are not completely defined. This is the reason why procurement risk management has become the backbone of most organizations producing goods and services, while their production relies on an ever-expanding number of outsourced tasks.
Therefore, “partners' resources” encompass raw materials, parts, equipment and services, as well as the distribution networks on which organizations depend daily for their own operations. These resources can be grouped into three distinct categories:
It is essential to clearly identify all the elements of this class of resources while conducting a risk management assessment. The same principles apply, but with a major difference that the three categories have in common: the organization depends for its own security on actions and attitudes that it cannot monitor daily, as it can for the resources under its direct command. In other words, consciously or unconsciously, the organization has “transferred” to a third party an essential part of its risk management activity. Therefore, the crucial question in procurement risk management is to find ways to ensure the organization's overall resilience should one of the “partners' resources” fail to be delivered in time and to the quality desired.
The basic rule of thumb is not to be too dependent on any given partner, be it a supplier or a customer. Basic common sense applies here: “Don't put all your eggs in the same basket.” Most recommend having at least two or three sources at all times. However, this is not always possible, especially when very advanced technology, patents, or some very specific know-how can only be obtained from one source. Furthermore, the benefit of multiple sources competing to retain the organization's business must be balanced against the cost of maintaining several suppliers. Finally, there is an increased risk of information leaks if the relationship involves sharing trade secrets of any sort.
Beyond this basic principle, the same rule applies upstream and downstream, which could be called the “3 Cs” rule.
As far as lateral resources are concerned, unless you are a project leader, which is rarely the case for small or medium-size organizations, they are typically partners you have not chosen and may have no contract with, whereas each of them has a contract with the leader. Therefore, the only way to ensure the quality of their risk management is through your common partner: the project leader, a large firm, etc. In your dealings with the team leader, you should have access to the list of all those involved in the project you share.
Finally, remember that when you transfer risks, you are still socially responsible for the well-being of those who are stakeholders in the overall process. If a member of the team betrays their trust, all the members of the team will suffer. In other words, risks to reputation are never transferred!
In our global economy, who would dare claim that there are resources we do not pay for? Clearly, any organization needs both internal and external resources, as detailed above. By external we mean those exchanged with economic partners both up- and downstream. These resources are paid for. However, there are also non-transactional exchanges with the environment that are essential for the organization's development, even its survival. These resources received from the environment without direct financial compensation are labeled “free resources” insofar as they do not appear in the organization's accounting documents. However, the term environment is too broad, and in each situation the following should be investigated:
These exchanges represent what economists call “externalities” that are not part of any contractual transaction with economic partners. Remember that these externalities can be positive (society receives a benefit from a private transaction, which is additional to the private transaction), or negative (society incurs a cost additional to the private transaction). It should be noted that the domain of these externalities may vary from country to country; in terms of pollution, the development of codes to protect the environment has forced the “internalization” of the costs of cleaning sites or restricting contaminant releases as private producers have seen some “social costs” transferred to their operation.
It is crucial for any organization planning to diversify or enter new markets in any locale to be aware of its needs for “free resources”, as they may not be available in the prospective locations and/or countries involved. More precisely, a very successful SME might well be unaware of the specific circumstances that led to success in its original location but may not be found in the proposed sites, or may be lost in the case of a merger or acquisition.
The concept can be illustrated with some specific situations keeping in mind that these are only common cases and that each individual organization must conduct a systematic analysis of its circumstances:
An organization has been defined as a portfolio of risks or exposures, i.e. threats and opportunities. Each exposure is defined by three dimensions – resource at risk, peril or hazard and impact. Thus the peril is the second of these parameters.
Some define a peril as that which gives rise to a loss whereas a hazard would be that which influences the operation of the peril, i.e. fire would be a peril, a house that could burn a hazard. For others the peril is commonly defined as the cause of loss, whereas the hazard is commonly defined as a condition that creates or increases the chance that a peril will occur.
Here for management purposes, a “peril” is an event that may or may not happen, the occurrence of which would change in a drastic manner the level of one of the organization's resources: for the downside, the resource would be destroyed partially or totally, permanently or temporarily, for the upside a sudden increase of the resource would become available. In most organizations, for risk management purposes, only the downside impact would be assessed.
The two dimension vector, resource/peril, identifies the exposure, and is the foundation for the analysis phase that investigates the impact, quantifying the financial or other consequences, without any consideration for reduction methods.
“Known” perils are quantified with a probability estimated from historical data and/or mathematical models. In other cases, the “known unknowns”, only a qualitative approach is possible for lack of reliable data, as with emerging risks, and even more so for the “unknown unknowns”: fat tails or black swan events. Under such circumstances a qualitative scale (exceptional, rare, infrequent, or frequent) can prove useful, provided the group in charge of evaluating the probabilities has a common definition for these terms (once a decade, once a year, twice a year, once a month, etc.).
Many phenomena follow normal distributions (bell-shaped curves), which are completely defined by two parameters: the mean and the standard deviation.
For instance, from historical evidence, it is “practically” certain that Florida will be hit by at least one hurricane every year and no more than seven.
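The two-parameter point can be made concrete: given an assumed mean and standard deviation for the annual number of hurricanes (the figures below are illustrative, not historical data), the “practically certain” band quoted above corresponds to a central confidence interval:

```python
# Sketch: a normal distribution is fully specified by its mean and
# standard deviation, from which confidence intervals follow.
# The hurricane figures are illustrative assumptions.
from statistics import NormalDist

annual_events = NormalDist(mu=4.0, sigma=1.5)  # hypothetical mean / std dev

# Central ~95% confidence interval: mean +/- about 1.96 standard deviations.
low = annual_events.inv_cdf(0.025)
high = annual_events.inv_cdf(0.975)
print(f"95% of years: between {low:.1f} and {high:.1f} events")
```

With these assumed parameters the interval runs from roughly one to roughly seven events per year, in line with the statement above.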
For phenomena more easily controlled than natural events, for example, the number of accidents per year in a given large fleet of vehicles, or the number of fires in the plants of a large multinational firm with over 2,000 sites worldwide, the occurrence of a number of events significantly outside the confidence interval is valuable information for forecasting the long-term number of losses. Depending on the sign of the deviation, improvement or deterioration, the situation will call for an explanation: a check on the deep root causes.
When the peril lends itself to a quantitative probability distribution, the uncertain future is deemed “probabilistic”; in other cases it is called “non-probabilistic”. This distinction is essential, as the tools available for making decisions about an uncertain future rely heavily on the quality of the available information, and the risk diagnostic is aimed at improving that information to make “sound” decisions.
In any case, the probability distribution of occurrence coupled with the probability distribution of impact or severity is the key to rational decision making as it allows for justifying the investments or recurring costs of proposed loss control measures as well as the premium quote by insurers.
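The decision logic described here, weighing the expected annual loss (occurrence combined with severity) against the cost of a proposed loss control measure, can be sketched with hypothetical figures:

```python
# Sketch of the decision rule: expected annual loss (frequency x
# severity) before and after a control measure, compared with the
# measure's annual cost.  All figures are hypothetical.

def expected_annual_loss(frequency_per_year: float, avg_severity: float) -> float:
    return frequency_per_year * avg_severity

before = expected_annual_loss(0.5, 200_000)   # one loss every two years
after = expected_annual_loss(0.1, 150_000)    # after the control measure
annual_cost_of_measure = 40_000

savings = before - after
justified = savings > annual_cost_of_measure
print(f"Expected savings: {savings:,.0f} per year -> justified: {justified}")
```

In practice both frequency and severity are distributions rather than point estimates, but the comparison of expected savings with the cost of the measure (or the insurance premium) keeps the same structure.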
It should be stressed that for extremely infrequent events, the average number of occurrences “in the long run” (law of large numbers) has little meaning for the decider. In these situations, the decision is based mostly on the impact level, the severity, and consists in reducing the probability of occurrence below a level deemed acceptable by the major stakeholders. For example, when the officers of an organization managing nuclear power plants make a decision on “nuclear risks”, they will assess the likelihood of a major accident occurring “tomorrow” rather than the average cost in the long run, a “long run” of 1 or 10 million years! They must take into account the level of probability above which the population would no longer be prepared to live close to one of their plants.
The nature of the peril will dictate the type of loss control measures that could be implemented. For instance, in the case of a vehicle fleet, if the peril is “drivers' skills”, the remedy will call for training the drivers to modify their attitude behind the wheel – i.e. defensive driving.
There are many ways to classify the events that may occur to alter the state of affairs on which an organization formed its strategic decisions. Some would look at the causes, others at the consequences. An analogy that is useful to consider here is the knot of the bow tie rather than the two wings. The classification proposed has no scientific pretension; rather it focuses on providing the risk management professional with a first approach to what loss control instruments might prove appropriate to mitigate the exposure at hand. Perils and hazards are classified under two criteria:
Where the hazard or peril originates (its nature); and
The consequences of the hazard or peril.
1 THE NATURE OF THE HAZARD OR PERIL
However, the human origin is not enough to understand the root cause of the phenomenon. It must be further differentiated:
The voluntary human peril: “Malicious acts” occur when an individual or a group of individuals embark on a mission to appropriate third parties' belongings or assets, be they tangible or intangible. Therefore, they are usually illegal acts, punishable in most countries where they would be performed, for instance, industrial spying, arson, forgery, assault, etc.
It is further essential to qualify the acts as to whether the individuals try to get wealthy (lucrative malice) or to further a political, religious, or ideological agenda (non-lucrative malice).
In the first case we are dealing with an enterprise, illegal but still governed by profit seeking. Therefore, these individuals make their decisions like legitimate entrepreneurs, and they can be deterred by lowering the "return" on their investment (time, effort, and/or money). Strategies such as the following could work: increase the costs (prison terms, security efforts, etc.) or reduce the value of the target (lower inventories, information and know-how already made public).
In the second case we are dealing with individuals who work for a cause (from vandals scratching cars in the wealthy section of town to outright terrorists) and their motivations transcend economic issues. Their reasoning is much harder to crack, and they are much harder to punish or deter. The tragic events in the USA on 9/11/2001, and also in Madrid (2004) and London (2005), are all reminders of how difficult it is to fight terrorism within the framework of a democratic society.
In any case, “voluntary human perils” are always the most elusive to fight. It is important to recognize that we are confronting an intelligent mind that can and will adapt to whatever new form of loss control measure we will imagine and wait patiently to strike when our guard is lowered. One illustration would be information systems: new worms and viruses are created every day and firewalls and other protections are to be updated all the time. Furthermore, employees may become complacent if not always reminded of the uphill battle to be fought every day.
2 THE CONSEQUENCES OF THE HAZARD OR PERIL
The third dimension of an exposure is its impact or consequences on the organization's objectives, and these consequences can be good, creating an opportunity, or bad, generating a threat. In principle, an unexpectedly high level of a resource creates an opportunity, and a sudden depletion a threat, but unexpected constraints can provide a path to higher efficiency, whereas a sudden influx of resources could be squandered without careful prior planning.
Therefore, the “impact” can be positive or negative and be seen at three levels as defined earlier:
Tertiary impact: This covers the impact on third parties, including economic partners and other stakeholders, society and other impacts on the environment.
When the consequences are negative, they may engage the organization's contractual, professional or other civil, as well as penal, liability. If the third parties are not involved in a transaction, they represent an externality in the economic sense, and it can be positive or negative.
Figure 2.2 summarizes the elements that are included in the exposure diagnostic, or risk assessment, to retain the expression used in the ISO 31000:2009 standard. The risk identification step means mapping the first two dimensions that define any exposure (resource and event), whereas the third (impact quantification) constitutes the risk analysis step. The risk evaluation step consists of comparing the quantified impact under current circumstances to the risk criteria defined as acceptable by the leadership (risk appetite or risk tolerance).
Understand the probability of loss, adjusted for the severity of its impact, and you have a sure-fire method for measuring risk.
Sounds familiar and seems on point; but is it? This actuarial construct is useful and adds to our understanding of many types of risk. But if we had these estimates down pat, then how do we explain the financial crisis and its devastating results? The consequences of this failure have been overwhelming.
However, a new concept has developed to describe how quickly risks create loss events. “Velocity of risk” provides an additional insight into the concept of risk through an evaluation of “time to impact” that implies proactively assessing when the chain of events will actually strike. While it is still relatively new and not yet widely used, it is gaining momentum in professional circles, as it is a valuable concept to understand and more so to apply.
Whereas it is necessary to know how likely it is that a risk will manifest itself as a loss or a gain, and the impact of that manifestation, this is not enough to make the most of the opportunity, or to limit the loss. Therefore a better way to generate a more comprehensive assessment of risk is to estimate how much time will be available to prepare a response or make some other risk treatment decision about an exposure when its occurrence becomes imminent. This allows you to prioritize more appropriately between exposures: those that require immediate preventive action and those that can be treated when the event is becoming imminent. An efficient allocation of limited resources is the key to robust risk management.
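The prioritization logic above can be sketched numerically. This is an illustrative model only, not a standard formula: the exposure names, the figures, and the simple score (expected loss divided by time to impact) are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Exposure:
    name: str
    likelihood: float           # annual probability of occurrence (0-1)
    impact: float               # estimated financial impact if it occurs ($)
    time_to_impact_days: float  # lead time between first warning and the loss

def priority_score(e: Exposure) -> float:
    """Higher score = more urgent: likely, severe, and fast-moving."""
    expected_loss = e.likelihood * e.impact
    # A short time to impact leaves little room to react, so it raises priority.
    return expected_loss / max(e.time_to_impact_days, 1.0)

exposures = [
    Exposure("cyber intrusion", 0.30, 5_000_000, 2),
    Exposure("supplier failure", 0.10, 8_000_000, 60),
    Exposure("regulatory change", 0.50, 2_000_000, 365),
]
for e in sorted(exposures, key=priority_score, reverse=True):
    print(f"{e.name}: {priority_score(e):,.0f}")
```

With these invented figures, the fast-moving cyber exposure ranks first even though the supplier failure has a larger potential impact, which is exactly the insight "velocity of risk" adds to frequency and severity alone.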
As a matter of fact, expending limited resources on identification and assessment really doesn't buy much more than awareness; and awareness, from a legal and governance perspective, creates an additional risk that could prove quite costly if reasonable action is not taken in a timely manner. Not every exposure will result in this incremental risk, but a surprising number do.
Even five years into the crisis, there's a substantial number of actors in the financial services sector who wish they'd understood risk velocity and taken some form of prudent action that could have perhaps altered the course of loss events that triggered the situation in which the developed world has since been engulfed.
To better understand the velocity of risk there are several avenues to follow:
Collectively, these efforts can provide a few of the many data points that can help in piecing together a picture of emerging risks and give some context around the speed with which they could develop and cause loss.
The more of these elements can be assessed, the more opportunity will be offered to develop and implement loss prevention plans that could lead to the avoidance of the loss altogether. Between efforts at prevention and protection, it may prove possible to avoid or mitigate the next crisis, an experience any organization would be better off not going through.
There is a growing trend to consider BIA (Business Impact Analysis) as a new discipline that consultants promote heavily in the wake of the huge contingent loss of profit experienced by many organizations as a consequence of the various natural disasters that hit Asia in 2011.
However, if we refer to ISO 31000, clearly a growing reference worldwide, the risk assessment part of the risk management process calls for a longer view of the impact of a given event or change in situation. It would be a very poor practice to envision only the immediate consequences at the site of the event. Furthermore, risk should be reassessed as soon as there are significant changes in the organization, its internal and external context, and/or its missions or strategic goals. I trust that encompasses the entire scope of the BIA, which appears only as a tool in the risk assessment process and a reminder to look beyond the immediate and local effect.
Back quickly on the assessment/impact issue: personally I prefer the word “diagnostic” and the analysis/evaluation stage of this process clearly must encompass all of the potential impacts: primary damages, secondary damages (loss of profit), tertiary damages (to third party and society), and “quaternary”, i.e. impact on reputation or “social license to operate”.
It seems to me that the “invention” of impact analysis is linked to insufficient attention to a thorough risk analysis (unless it provides a consultant with a competitive advantage and the actuaries a legitimacy to invade the RM field!).
Within the ERM framework, as per the ISO 31000:2009 standard, risk management objectives can only be derived from the organization's strategic objectives that they are meant to serve with a major focus on pulling through difficult situations.
Clearly, the mission is to plan for all the resources that will be needed to achieve the organization's goals whatever the circumstances may be, more specifically when it is hit by a severe event. Among the resources vital to the organization are the cash flows needed to compensate for the losses whatever their origin may be. This will be the role of risk financing that will remain a corporate function, part of the overall finance strategy, no matter how decentralized risk management may be to the operational risk owners.
Even when approached from both the perspective of threats and opportunities, the essence of risk management is to plan for situations of high volatility; otherwise it would interfere directly with the cost control mission of any manager. The essential volatility of risk management performance has driven the professionals to contrast clearly two sets of objectives that some call “pre-loss” and “post-loss” objectives. However in a global approach, it would be better to call them “pre-” and “post-” event, thus not pre-judging the nature of the impact of the event, which also could be positive.
It is necessary therefore to set the objectives prior to any unexpected event but with a clear understanding of the different timing with regards to major events occurring.
In any case, the first objective is the organization's survival, which could be achieved if there is enough cash at hand when the demands from the event are due. But there is a continuum of objectives depending on how resilient the organization should be. If we look at it from the perspective of the resources at risk:
Financial: Even when there is enough cash to survive, that may not prove enough for the investors' community, which may require proof of the executives' foresight whatever the circumstances:
Financial markets do not take kindly to a publicly traded company showing erratic results. In such a case the share price may sink, providing takeover specialists with a tempting target, which is a risk for the independence of the organization and the security of the "executive team"!
With respect to these objectives, like any other department in any organization, risk management is expected to be efficient and to deliver its service to the organization while consuming as few resources as feasible, i.e. to contain ongoing costs as much as possible. However, the level of the desired post-event objectives will govern the level of resources to be allocated to risk management.
The core mission of risk management in any organization is to maintain some degree of activity through any type of turbulence in order to allow it to reach its strategic objectives, no matter what happens. The risk manager's core job is to make sure that the vital resources needed to achieve this will be available to reach the level of post-event objectives set by the board.
However, this mission must also be fulfilled with as few resources as possible. To reach the optimal level of economic efficiency, risk-management operations must be measured against some objective yardstick. Minimizing the long-term cost of risk represents such a standard. However, what is the "cost of risk" in a given organization?
Traditionally, the cost of risk has been broken into these four components:
Retention risk financing costs: These represent all the claims and fractions of losses that remain in the books of the organization. Clearly that includes those that are not insured or not insurable, but also the deductibles, self-insured retentions, and/or the portion above policy limits, especially in the liability area. For the losses outside the realm of insurance, retention risk financing costs will be reported only if the RM accounting practices provide for their tracking.
However this breakdown of the “cost of risk” concentrates on the downside, the threats, and does not take into account the upside of risk, the opportunities. A “fifth” component should be added:
In all organizations, executives are always tempted to benchmark, with competitors as well as partners. A typical question would be “Is our cost of risk in line with the competition?” This is a very difficult game to play as no two organizations have the same risk profile (each has its own specific portfolio of exposures) and no two executives' teams have the same risk appetite.
Once again it should be stressed that, excluding the high frequency losses that should be treated as quality failures, an organization with no risk management in place might seem more efficient in the short run, as it incurs little "cost of risk". This will last until a catastrophic event happens that kills it off, for it has no means to rebound. Therefore the cost of risk should always be assessed taking into consideration the company's resilience.
The underlying objective of any risk management program is the survival of the organization under whatever stress level, but it has to include within its scope the very important concept of sustainable development, not only in terms of preserving the natural resources for future generations but also as a way to provide the investors with a reasonable long-term return on investment. This may call for measures far beyond available sources of cash to pay for the damages that would ensure survival in the aftermath of a major claim.
However, even this minimum objective may prove elusive should some extraordinary circumstance take place; of course this may be the case essentially for liability and environment exposures, e.g. the costs of the Exxon Valdez and BP Gulf of Mexico oil rig disasters, which would have threatened the survival of smaller firms. For extremely catastrophic events, it may be necessary to engage with the stakeholders to find an acceptable compromise where the perception of risk is not excessive, in order to grant the organization its "licence to operate" in view of the positive societal advantages: the opportunities it brings to all.
On the other hand, this “survival” approach to risk management may fall short of stakeholders' expectations especially in the “moderate risk class”, i.e. those scenarios that are due to happen with a good degree of certainty over a 5–10 year horizon, but whose annual impact may fluctuate significantly. The objective of survival would not treat situations that take the financial results of a company through a true rollercoaster that would not please the investor community. Furthermore, employees and managers, as well as customers and suppliers, might question the long-term viability of the organization and seek employment and/or partnership elsewhere to protect their own interests. In the same way, the government, the local authorities and the citizen consumers might well be impatient when confronted with what they would perceive at best as short-sighted management.
It is under these circumstances, and with the capital “reputation” in mind, that both the directors and the executives might well consider a higher level of post-event objectives to assign to the risk management operations in the organization, like limiting service or production interruptions to a level compatible with the partners' interests, or imposing a minimum level of profits and/or growth even in the case of unexpectedly large losses. Clearly, they will expect the organization to rebound faster and higher than mere survival would allow, even after a serious loss.
Without undue developments, it is all too clear that these higher post-event objectives are going to require investing more resources, financial in particular, into the risk management process than “absolutely” necessary. This “additional investment” is in conflict with the pre-event objectives of economic efficiency, which call for containing the cost of risk.
Once again, risk management efficiency has to be assessed in the long term. However, "long term" must be defined. For a CEO whose tenure is going to be anything from three to six years, this is long term; for a government it should be the well-being of this and at least the next generation of human beings (and not the next election); for an environmental specialist, is a millennium enough (think nuclear waste)? Pension funds, such important players in the financial markets, should look at a 20–40 year horizon for the benefit of their "investors", not just the next three quarters, which may represent the best interest of their fund managers.
Therefore, if purely financial returns do not provide a clear view of the long-term sustainability of the firm, a new concept, a new measure, is needed to provide a comparison tool. Resilience was forged as a social concept for organizations by analogy with the property of a metal that regains its original form after a stress. It is often used in modern risk management literature, not always with a clear understanding of its underlying meaning. It was used for the first time in an audit context by the Canadian Auditors Association, only too happy to find a word that is the same in English and in French. The definition can be summarized as follows: "The capacity of an organization to rebound even after the most severe stress and still maintain its strategic objectives in the interest of its main stakeholders."
Therefore the resilience must be assessed by each stakeholder in view of the preservation of its stakes or interests in the organization. As an illustration, the expectations of the organization from the points of view of different groups are:
Establishing a diagnostic of the exposures for a given organization is the first step in the risk management process. It can be split into three different phases: identification, analysis and assessment.
One of the problems of the emerging risk management science is that, in spite of the definitions offered in the ISO Guide 73 vocabulary for risk management, so many concepts remain ill defined. Let us therefore state that here we mean by “identification” the recognition that some undesirable event may occur; “analysis” means quantifying the consequences for the organization and hence also its stakeholders, without any consideration of the control measures in place; while “evaluation” takes into account the best possible outcome if all current control measures operate at full capacity, i.e. an analogy with the insurance concepts of MPL (maximum possible loss) and ERL (expected reasonable loss). It should be noted that what we call exposures diagnostic is also referred to in ISO 31000 as “risk assessment”.
It is obvious that the good risk manager envisions the probability of the event occurring (the frequency) and the importance of the impacts (the severity of potential losses). However, this would fall short of an understanding of the phenomenon without giving due consideration to a measure of the uncertainty of the consequences, i.e. the dispersion of the consequences. At the end of the day, what the risk management professional should be concerned with are the consequences on the ability of the organization to achieve its goals and missions, no matter what, i.e. what we have called its level of resilience.
The diagnosis is the cornerstone of the risk management process: our decisions are only as good as the information we base them on; if a risk is not identified, the opportunity will be missed or the threat will not be curbed. On the other hand, when a risk is identified and properly quantified, the risk management treatment may just pop out for the experienced risk manager. Later in the book we will come back to how the diagnosis is transformed in an ongoing process of risk mapping through the feedback loop.
A single event may impact a number of organizations. For example, a used tire facility burns in the vicinity of a populated town. Toxic fumes from the fire are blown by the wind in the direction of a nearby industrial and commercial park. The impact of this event must be analysed from different perspectives, but first of all from that of the company that owns the inventory and manages the site. The analysis will cover property and equipment damages, loss of personnel, and loss of net revenues. Of course the consequences for all economic partners should be assessed, if only to analyse the potential contractual liability losses. However, each of the partners should also conduct their own evaluation of the consequences. If the context has been defined accurately, then the eventuality of this exogenous event should be included in the exposure diagnosis of each of them as an "external" hazard. Finally, for the tire owner the impact on reputation and image should not be forgotten as economic partners may also suffer. Nevertheless, the impacted organizations may also experience reputation loss even if "innocent", especially if they cannot demonstrate a proper preparedness for such a "foreseeable" event!
But this analysis in a semi-open system, i.e. a system that is not totally self-sufficient, is not enough, as many other stakeholders, some of whom the organization has no contractual ties with, may be impacted, e.g. the neighbors, the city and surrounding communities, as well as regional and national authorities, and even other private entities. Therefore, an investigation into the potential impacts on all stakeholders will have to be conducted at the organization level to further assess the tort and criminal liabilities that the fire might induce. This is after all an endogenous hazard that may have consequences on an extended environment.
Furthermore, the risk manager of a healthcare organization located within a mile radius should have identified the “tire inventory operation” as a potential hazard as part of its community wellness program, especially in the context of its pulmonary patients, children, and other at-risk persons.
To summarize, the exposure diagnosis consists of a recurring exercise to keep an updated register of the exposures confronting the organization, as exhaustive and current as possible.
The second phase develops a quantified evaluation of the impacts on the organization's resources and objectives, as far as possible. This can be achieved through the use of probability and trend analysis for the “frequency risks” for which there is a data bank. Other quantification methods such as expert advice and Bayesian networks can be used for the median risk category, and scenario analysis is appropriate for “catastrophic” events, especially when many stakeholders could be impacted.
It is a “no brainer” that the more open the system, the more delicate the diagnosis process. Such is the case of malls, healthcare establishments and local authorities where most of the stakeholders have no direct subordination or contractual links with the organization. It is clearly easier to manage risk within the limit of a manufacturing plant where most actors can be trained and educated to recognize risks and act responsibly to limit their consequences.
Finally, there is the case of managing the risks in a project. Projects usually involve different partners whose goals and objectives are not necessarily entirely convergent, and whose interests may sometimes even diverge. Project risk management in such a case will require a common approach to managing the risks in the most efficient way while satisfying the needs of all participants.
The realm of possibilities that may arise in a situation of uncertainty for a given organization is practically without borders. And even if risk management tends to look principally at downside risk when referring to risk diagnostic, it is essential to have a broad perspective on their potential impact, if only to prioritize risk control efforts.
Too often the risk impact evaluation is limited to the two traditional variables used by the insurance industry to calculate an expected value of the claims:
As a matter of fact, if the portfolio is large enough and underwritten cautiously, the multiplication of the frequency by the severity, or expected value, is indeed what the insurer can expect to pay in claims on his portfolio in the long run.
However, when applied to a single organization, large as it may be, the formula will not give insight into the adverse conditions that it may be faced with, except for very limited and frequent claims like physical damage to a fleet of vehicles. A major variable is missing, which can be summarized as the volatility of the annual losses and the negative cash flows that stem from the realization of the hazardous events. An example drawn from the nuclear industry will illustrate this. If any nuclear power plant has a probability of 1 in 10,000 million of suffering a major catastrophe with a $2,000 billion loss, then the long-term annual "cost of risk" for Electricité de France (EDF), which operates fewer than 100 sites, is less than $20,000 a year. That is a quite acceptable burden for EDF; however, does this answer the question for the public: "Is the nuclear industry in France safe enough to ensure both a permanent supply of electricity and safety for those living close to one of the power plants?" Who cares if the annual long-term cost over 10 million years is negligible? At the end of the day the question is linked to sustainable development, not the expected value of cost. In other words, any exposure assessment will have to take into account the dispersion of the force of the impact, be it in financial terms or in human or environmental evaluation.
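The EDF arithmetic can be checked directly, using the figures as given in the text:

```python
# Expected annual cost of a major nuclear catastrophe across a fleet of plants,
# with the illustrative figures used in the text.
probability_denominator = 10_000_000_000  # 1 in 10,000 million per plant-year
loss = 2_000_000_000_000                  # $2,000 billion major catastrophe
sites = 100                               # EDF operates fewer than 100 sites

# Integer arithmetic: (loss x sites) / probability
expected_annual_cost = loss * sites // probability_denominator
print(expected_annual_cost)  # 20000 -> the "$20,000 a year" in the text
```

The point of the passage is precisely that this $20,000 expected value, however exact, says nothing about the acceptability of a $2,000 billion catastrophe to the public.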
For the high frequency risk, the “non risk” exposures in terms of financial implications like health cover for a large population or vehicle fleets, the historical data will provide a good start to forecast the future losses provided trend variables are properly inserted in the model. Furthermore, the “expected value” of annual cost may even prove a reasonable base for a budget exercise, as it will be relatively non-volatile.
For exceptional exposures, on the other hand, it is more the severity and the spread of consequences that will guide the decisions on risk treatment, even to the ultimate option of discontinuing or not engaging in an activity deemed too risky. In those situations, what is essential is to determine the stakeholders' appetite for risk. To summarize, the parameters should not simply be multiplied; it is the three-dimensional vector (F, S, σ) that must be assessed: "At a given level of confidence, are the consequences socially and economically acceptable?"
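A toy illustration of why the dispersion σ matters alongside frequency F and severity S: the two loss histories below are invented so that both exposures have the same long-term expected cost, yet very different volatility.

```python
import statistics

# Hypothetical annual loss histories ($k) for two exposures with the same
# long-term expected cost but very different dispersion.
histories = {
    "vehicle fleet": [95, 110, 100, 105, 90, 100],  # frequent, stable losses
    "major fire":    [0, 0, 0, 0, 0, 600],          # rare, severe loss
}

# (expected annual cost, dispersion) for each exposure
profile = {name: (statistics.mean(h), statistics.pstdev(h))
           for name, h in histories.items()}

for name, (expected, sigma) in profile.items():
    print(f"{name}: expected {expected:.0f}k/year, sigma {sigma:.0f}k")
```

Expected value alone would rank the two exposures as identical; the σ component is what flags the fire exposure as a very different risk treatment problem.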
Some authors have recently advocated looking at the worst-case scenario and have defined, for financial institutions, the "value at risk" and, for non-financial ones, the "cash flow at risk" as a measure of the stress that the system, the organization, can endure without permanent damage or impairment. More recent authors place these among the QBRM (quantile-based risk measures) and argue that the value at risk is fundamentally flawed and does not represent a coherent risk measure. It is not within the scope of this book to fully address the debate, which will be left to the FRM (financial risk management) specialists and the actuaries. Suffice it to say that this is a fast-developing field that risk management professionals must be aware of.
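A minimal sketch of the quantile idea behind value at risk: the VaR at a given confidence level is simply a high percentile of the loss distribution. The lognormal shape, its parameters, and the 99% level are arbitrary assumptions made for the illustration.

```python
import random
import statistics

random.seed(42)
# 10,000 simulated annual loss outcomes ($m); the lognormal distribution and
# its parameters are assumptions chosen only for this sketch.
losses = sorted(random.lognormvariate(2.0, 1.0) for _ in range(10_000))

confidence = 0.99
# The 99% VaR is (approximately) the 99th percentile of simulated losses.
var_99 = losses[int(confidence * len(losses)) - 1]
print(f"mean loss: {statistics.mean(losses):.1f} $m, 99% VaR: {var_99:.1f} $m")
```

The gap between the mean and the 99th percentile is exactly the information that an expected-value figure hides, which is why quantile-based measures were introduced in the first place.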
In any event, assessing severity requires developing an “extreme scenario” according to Murphy's Law (everything that can go wrong, will go wrong). Clearly, from the organization's point of view, recognizing all the negative (and positive?) impact is essential. The worst-case scenario will have to address the question of the stakeholders' confidence and the loss of reputation.
Finally, when dealing with risk assessment, one must exercise caution as some consequences are not readily measurable in monetary terms (long-term impact on the environment) and cannot always be positioned in the financial model of the firm, even with a long-term view of value creation for the stockholders. Other objectives have to be taken into account, such as sustainable development, assistance to distressed persons, etc.; whether they are true missions or merely constraints is for each board of directors to answer. It is probably necessary to broaden the assessment tools to include other variables than the three mentioned here.
One final piece of advice: as all is not quantifiable in monetary terms, all that can be must be, with utter care and application, be it only to limit the residual uncertainty!
The “risk centre” method for assessing risks, exposure diagnostic, is founded on the model that describes an organization as a dynamic combination of five classes of resources (Human, Technical, Information, Partners, and Financial).
The crux of the method is to split the organization defined as a “complex system”, i.e. a system of systems surrounded by environments, into as many sub-systems as needed to make the smaller entities “user-friendly”. Then the sub-system is analyzed as a combination of the same resources.
This identification of sub-systems within an organization is totally compatible within the system safety approach. It must stop when the level of “elementary sub-system” or “micro firm” is reached. It is still a living cell with a manager aware of the missions he/she is to fulfil with the help of the necessary resources in the five classes. It is then possible to identify, analyze and mitigate the risk at the risk centre level, which is a “micro-organization” that can be defined as a set of resources, combining to reach an identified sub-goal, contributing to the organization's overall goals and missions.
The boundaries of the risk centre and the forces of its environment should be apprehensible by the risk centre manager who should have the initiative to manage this “micro enterprise” and navigate at best within the threats and opportunities identified. This is precisely the “risk owner” often referred to in all ERM (Enterprise-wide Risk Management) presentations.
We have already listed and explained the seven risk identification tools; however, once the risk centres have been identified, the main tool to use is the questionnaire or, preferably, the interview of the identified risk owners. The information gathered during the interviews will be the material used for the workshop organized thereafter with the management team to appropriate the risks and agree on priorities and actions.
The interview should be conducted according to the main points listed in the box below:
The desired output of the diagnosis exercise is a list of exposures, as exhaustive as possible and a classification by order of priority. It is customary to assess a priority on the basis of the “long-term economic impact” measured by the two variables: frequency and severity.
The risk matrix, that some consultants still insist on calling a “risk-map”, is a two-axis table: on one axis the probability of the event taking place (or frequency) and on the other the potential impact (usually in monetary terms). This matrix does not have the permanency of a physical geography map: it is merely transitory help to decision makers, whose decisions will immediately alter the risk profile of the organization, not to mention the evolution in the external and internal context. However, this function as an information tool for managers and executives must provide them with some insight into the risk; therefore the classes of risks thus described must be in measures that make sense to them in the light of the decision to be made:
Combining frequency and severity provides the long-term weight of the risk, but judgment must be exercised, especially where improbable catastrophic events are concerned. At this stage the traditional green, yellow and red zones are drawn, depending on the acceptability of the risk level dictated by the deciders' or stakeholders' risk appetite. If an event whose potential impact is 1 per mil of the profit happens only once a year, it can be ignored; if the same event could occur once a week, the situation calls for treatment. On the other hand, a millenary flood, even a potentially catastrophic one, may be left untreated.
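The green/yellow/red zoning described above could be sketched as follows. The cut-off values are purely illustrative stand-ins for a real risk appetite statement, and, as the text warns, a pure frequency-times-severity weight must be tempered with judgment for improbable catastrophic events.

```python
def zone(frequency_per_year: float, impact_fraction_of_profit: float) -> str:
    """Place an exposure in a green/yellow/red zone of a simplified risk
    matrix. The long-term weight is frequency times severity; the thresholds
    below are assumptions made for the example."""
    weight = frequency_per_year * impact_fraction_of_profit
    if weight < 0.01:
        return "green"   # acceptable as is
    if weight < 0.05:
        return "yellow"  # monitor, treat when resources allow
    return "red"         # calls for treatment

print(zone(1, 0.001))    # 1 per mil of profit once a year -> green, ignore
print(zone(52, 0.001))   # the same event once a week -> red, treat
print(zone(0.001, 1.0))  # millenary flood -> green, may be left untreated
```

Note how the millenary flood lands in the green zone despite its catastrophic severity, which is precisely where the judgment mentioned in the text must override the arithmetic.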
Clearly the key to efficient risk management is an in-depth understanding of all the exposures an organization is confronted with, their characteristics and their root causes, so as to infer their potential economic impact. The risk matrix provides an appropriate tool for classifying these risks.
However, more important than the transitory output, the risk matrix, however essential to the diagnosis exercise, is the permanent process through which each risk owner in the organization takes ownership of his or her risks. The approach described here mirrors any project management exercise and will engage all operational managers (be they local risk owners or the chief risk officer). It is a three-stage process:
During the course of the last decade, risk management has clearly become a system for gathering, processing and communicating information, as is illustrated by the development of analytics and “big data”. At each step of the process, from the diagnosis, or risk assessment, to the audit of the program, there is a constant need to communicate (to obtain the necessary information), to manage information (gather and explain), and to communicate again (to present the results and draw practical consequences). This is precisely why installing a risk management information system (RMIS), i.e. a set of hardware and software to gather and process all data relevant to making and implementing decisions, is essential to managing risks efficiently in any organization. Its main attributes are:
The decisions made with, and all along, the risk management process are based on systems that efficiently link data and people. The following illustrate what a RMIS can yield:
One of the most difficult challenges for any risk management professional is to narrow down the range of possible outcomes in any decision-making circumstance, i.e. to limit the uncertainties to a level that the “stakeholders can live with”. However, there remains the daunting question of defining and measuring uncertainty. One definition could be: “The doubt concerning the capacity to forecast future events.” In financial terms, most models take the standard deviation of the probability distribution of potential outcomes as a measure of risk or uncertainty. Clearly, improving the quality of information translates into a reduced standard deviation; avenues of possible futures are drawn through “the cloud of the unknown.” Enhancing the decision processes within the organization is probably the main contribution of an efficient RMIS.
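As a minimal illustration of the standard deviation as a measure of uncertainty, consider two invented sets of outcome scenarios for the same exposure, before and after the quality of information has been improved:

```python
# Minimal sketch of the point made above: better information narrows the
# distribution of potential outcomes, i.e. reduces its standard deviation.
# The outcome figures are invented for illustration only.
import statistics

# Plausible next-year loss scenarios (same mean, different spreads):
before = [100, 400, 900, 1600, 2500]   # wide range: high uncertainty
after  = [700, 900, 1100, 1300, 1500]  # narrower range after better data

print(statistics.stdev(before))  # larger spread
print(statistics.stdev(after))   # smaller spread: reduced uncertainty
```

Both scenario sets have the same mean (1,100), so only the uncertainty, not the expected outcome, differs between them.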
In addition to improving the decision process, the RMIS impacts other aspects of risk management. Among other things, it improves productivity and plays a key role in the swift and efficient implementation of risk treatment programs in the following areas:
Analysis and reporting functions can be used to inform both staff and management on the progress of risk management within the organization, the major trends and individual departments' contributions to risk management. The RMIS should be linked to the information system capabilities of the organization to facilitate interaction and to produce clear, synthetic documentation illustrating the impact of risk management on all the activities of the organization.
The visibility of risk management is greatly improved where operations receive accurate information on risks together with other information on management. This puts risk management on an equal footing with other sources of controllable costs within the organization, and managers can take it as seriously as other disciplines.
At the foundation of enterprise-wide risk management, the RMIS plays a key role in instilling a risk management culture throughout the organization and among its main partners.
In reality, the question of a separate system for managing risk information is nowadays superseded by top management's need for a complete view of the information flows within and outside the organization, breaking the silos of applications specific to individual functions or departments. This overall strategic information system is often referred to as the business information system; it involves the collection of information throughout the organization and includes gathered intelligence, so that both the inside-out and the outside-in perspectives are available to make or revise strategic, tactical and operational decisions in a coherent “risk intelligence”.
Sophie Gaultier-Gaillard
Assistant Professor, Université Paris 1 Panthéon-Sorbonne, Paris
An ERM program's path to maturity rests heavily on the buy-in of both internal and external stakeholders, which some call “instilling a risk management culture”. Furthermore, the ISO 31000:2009 standard recognizes the importance of stakeholders' trust and confidence by stressing, in its proposed risk management process, “communication and consultation with stakeholders”.
On the other hand, if we agree with Felix Kloman's view of risk management as “lifting somewhat the fog of uncertainties about the future” to enhance the decision process at all stages, collecting relevant data and transforming it into information is key to any risk management program.
Behind the development of efficient data banks, many different tools can be used to evaluate stakeholders' perception of risk and to measure their trust and confidence in the organization, so as to optimize risk-taking by enhancing opportunities and curbing threats.
However, in many situations, building and administering questionnaires will prove to be the only efficient way to gather or develop these data, both to assess the current situation and to model possible outcomes depending on the course of action chosen, taking into account the potential evolution of the internal and external context in which the organization operates. There is a systematic approach to developing and implementing questionnaires that will help ensure optimal data gathering. Creating an efficient questionnaire is not as simple as it may appear: many designers have been disappointed by the results of their enquiries only because they did not devote enough time and energy to a preliminary thought process.
The questionnaire development process is split into four stages: conception, construction, administration and analysis. The first two stages should represent about two-thirds of the time devoted to the whole process. Administration and treatment essentially require rigor but should not prove very time consuming: somewhat less than a third of the total duration of the study.
Before writing any question it is essential to define the study/project objectives and the issue on which the survey is to shed light. This stage will allow the development of questions that will lead to answers that will effectively address the study's scope and produce the types of results desired, which is the key to a successful questionnaire administration.
While there are many ways to survey people, and many will be touched upon in the analysis, this paper will explore the in-person interview in more depth as a way of gathering data from stakeholders in a risk management context. Interviewees can be employees, management, the board, key shareholders, customers, regulators and others. The nature of the interview and the persons interviewed are directly related to the goals and objectives of the survey.
This first step, as for any IT program, is essential for success, as the questionnaire will provide information only as good as the specifications that have been developed:
One way to define clear survey objectives is to conduct a feasibility study to explore, describe, explain and predict what is being pursued. The feasibility study aims at understanding the context of the situation without attempting to validate results. The current lack of hard evidence or data is the very reason the survey is conducted: to produce enough information for decision makers to make qualified decisions. A feasibility study helps to better define the information the survey is to gather. In the three other situations (describing, explaining and predicting) it is possible to refine the question and answer formats to obtain a more granular analysis.
To better formalize the issues the questionnaire will address, write down four or five questions that the survey should answer, in no more than four or five sentences.
All survey participants have to be carefully chosen according to criteria that depend on the aim of the survey. They have to be targeted to specific characteristics studied by the questionnaire. The main socio-economic characteristics of the target interviewees (organization, position in the organization, seniority in the position, geographic location, age, education level, etc.) should be calibrated to provide exploitable answers for the survey objectives. This means that the participants must represent a credible sample of the targeted population so that the results are not biased. This specific targeting will help to match participants' profiles to the objectives and ultimately the results of the questionnaire.
Information to be collected may be qualitative or quantitative. This initial choice is essential, as the two categories require significantly different statistical treatments and question formats. Most often verbal data are used. How the data will be analyzed must be thought through at the time the questions are being formulated. Several methods can be used depending on the level of directivity sought in administering the questionnaires.
Depending on the level of precision expected for the results, question formats may differ. For exploratory studies, it is recommended to adopt a binary format (yes/no); the type of result expected is then “for or against” the survey subject. If more detailed information is needed, then more answers should be offered through the use of a multi-level scale. It can be a numeric scale (e.g. boxes marked 1 to 4, in increasing preference order) or a qualitative scale (e.g. four boxes labelled “never”, “rarely”, “sometimes”, “often”). The scale may comprise two to ten graduations or boxes: two is the minimum that allows comparison, ten the maximum number of items an individual can order mentally. Between these two limits, the choice is dictated by the level of detail expected from the answers:
A major flaw in survey technique is impatience. Those who want data want it now, and sloppy question writing is often the result, leading to misleading or even unusable data and frustration at all levels. For example, if interviewees ask, “why are you asking ME this?”, you know there is a disconnect between the interviewee and the content of the survey.
Treatment is dictated by the way the interview is conducted and how the questions are formatted. If the interviewer, or his/her statistical team, has limited statistical competency, it is advisable to limit the format to a 3- or 4-point scale and to rely only on descriptive statistics. If the interviewer, or his/her statistical team, is more statistics-savvy, then more extended scales can be used.
It is essential that the individuals selected to conduct the interviews (when the questionnaire is administered by direct interview) be fully aware of the importance of their mission, i.e. that the quality of the conclusions drawn from the process relies heavily on their accomplishing it with integrity and professionalism. The questionnaire protocol must spell out specifically what the interviewer can and cannot do: for example, absolutely refrain from faking interviews and filling out the questionnaires themselves. Interviewers must understand the importance of allowing adequate time for interviewees to structure their answers. In most instances, some training will be needed to make interviewers aware of their responsibilities and of the ultimate importance of the quality of data collection.
The conceptual stage of the questionnaire process will allow the study manager to ensure that all interviewers will collect all the necessary data for the successful completion of the study through appropriate implementation and treatment.
At the beginning of the questionnaire, the interviewee must be made confident that the sponsoring group is interested only in his/her opinion. This requires explaining to the interviewee that this is not a test, and interviewers must be trained to build the interviewee's trust. A sentence specifying that there are no good or bad answers may prove useful in all cases. All interviewees should be offered the opportunity to review a summary of the survey's results. These two measures help put the interviewee at ease and should enable him/her to provide more thoughtful answers.
The person in charge of writing the questionnaire faces a choice, as he/she has the option of:
If the interviewer is experienced in developing questionnaires, and if he/she is convinced that through the sequence of the questions he/she can guide the interviewee to answer the question being asked, then he/she can afford not to describe the context at the beginning. Otherwise, if the context is presented to the interviewee, it should be kept in mind that the value of the results will be overestimated: the interviewee, having become aware of a risk he/she might otherwise have ignored, usually becomes more afraid of that risk, which inflates the results.
Several points have to be considered when it comes to managing the questions. It is advisable to:
Whatever the technique chosen to administer the questionnaire, the administrator must provide the interviewee with a brief presentation of himself/herself and the organization, in order to demonstrate his/her legitimate right to conduct the survey. The interviewer should try to engage the interviewee and explain why he/she should take the time to provide answers. It must always be kept in mind that the interviewee may have been asked many times to participate in surveys and must prioritize his/her attention and time. This is especially true when surveying executives or individuals in positions of power, who are a frequent target for surveys. A letter or a short presentation limited to a few minutes may capture the attention of the selected target respondents and raise their trust level, which will help improve the quality of the results. Emphasizing the importance of the survey may also assist in getting the attention of busy executives and employees.
Individual interviews may take place at the workplace of the interviewee, in the risk manager's office, or consultant's offices or even on the street. In any case, the key to the validity of the findings is to ensure the professional integrity and the reliability of the interviewer who must vouch for compliance with the rules for administering the questionnaire. This can be both a strength, as the interviewer is in a position to help individuals better understand the questionnaire, and a weakness as the interviewer might influence their answers.
The facilitator who took part in the development of the questionnaire must conduct the group interviews, so that he/she can lead the discussion in the right direction to fulfil the objectives of the study. Typically the groups gather a minimum of 10 participants. The leader may need to focus the interviewees on the subject at hand and not hesitate to add open questions if the group gets off track or provides inadequate information. In most instances group interviews will be recorded, and this needs to be explained to the group up front. Group answers may prove difficult to analyze, as they tend to be more extreme, and sometimes more emotional, than individual interviewee answers; participants may be more optimistic or more pessimistic depending on the presence, or not, of a leader in their midst. The upside of focus groups is that they can open debates and exchanges where very different points of view are confronted. The downside is that the results tend to be multiple and complex and require careful analysis. The interviewer may also need to address directly those individuals who are shy in the group, especially when some dominant participants might monopolize the conversation. The leader of the focus group should, in this case, interrupt the dominating voice and give the other members of the group the right to answer.
This method is often used with panels followed over several months. Otherwise its use is rare, as mail surveys are expensive (costs of sending and returning the questionnaire) and response rates tend to be low. The upside is that each individual has ample time to answer the questions. The downside is that respondents may not answer the questions in the order they appear on the questionnaire, which may influence the answers to questions taken out of the intended order. Participants may also review the questionnaire later in the day, or the day after, to finish up, or even days later if it has not yet been sent or if a reminder is sent.
This technique is gaining momentum, as the cost is very low once a proper names database has been built. The upside is swift collection and easy exploitation of the data; sending reminders to potential interviewees who have not responded is also facilitated. The downside is that the increase in such solicitations tends to lower the response rate among individuals who feel harassed. Many organizations may also filter these questionnaires out as spam, and others may have strict guidelines about whether employees may answer such questionnaires, whether related to the organization or not.
This technique allows contact with individuals who may be geographically dispersed. However, it presupposes that the questionnaires are simple and short, as the time respondents are prepared to invest in answering telephone questionnaires is much more limited than in individual meetings. In the United States the “Do Not Call” statute prohibits solicitation by for-profit entities (including surveys) when the phone number has been entered into the Do Not Call database.
Data treatment must take into account the competencies of the developers and consultants involved in the study.
This section provides only a brief overview of some of the categories of analytic tools available to the researcher. In most cases, proper statistical analytic techniques may require professional competencies to apply the appropriate technique to produce quality results. It may also be necessary to hire such talent during the preliminary stages of questionnaire development and delivery to test subjects to facilitate discovering and fixing early in the process potential problems that might lead to problematic or even invalid results.
Whatever the level of detail sought in the study, this step is essential, as it provides a precise description of what has been studied. It therefore improves the understanding of the results subsequently derived from most questionnaires. Descriptive statistics lend themselves to graphic presentations such as histograms and sector (pie) charts.
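A hypothetical sketch of this descriptive step, for answers collected on the four-level qualitative scale mentioned earlier; the answer data are invented, and a simple text bar stands in for the histogram or pie chart:

```python
# Descriptive statistics on answers from a four-level qualitative scale
# ("never" .. "often"). The answers below are invented for illustration.
from collections import Counter

answers = ["often", "sometimes", "rarely", "often", "never",
           "sometimes", "often", "rarely", "sometimes", "often"]

counts = Counter(answers)
total = len(answers)
for level in ["never", "rarely", "sometimes", "often"]:
    share = counts[level] / total
    # Counts, percentages and a crude text histogram for each level.
    print(f"{level:>9}: {counts[level]:2d}  {share:5.1%}  {'#' * counts[level]}")
```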
This step allows for the sorting and cleaning of data that may be vague, inadequate or misleading. The proper use of data analysis techniques is important at this stage. These techniques consider each answer as a potential explanatory variable: with “n” answers the data occupy an “n”-dimensional space, but an individual cannot reason in more than three dimensions. The technique therefore consists in projecting these “n” dimensions onto a two-dimensional space, to make them more readable and allow comparison. After this projection, each dimension corresponds to an axis, also called a vector, more or less explanatory of the subject studied. The developer may then choose the amount of information he/she wishes to see explained by the variables retained, and thus determine the number of explanatory variables to be used in the follow-up analyses.
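The projection described here is, in effect, a principal component analysis. The sketch below, built on invented survey data (six respondents, four questions on a 1–4 scale), projects the responses onto the two most explanatory axes; only numpy is assumed to be available:

```python
# Sketch of projecting "n"-dimensional survey answers onto two axes
# (principal component analysis via the SVD). Data are invented.
import numpy as np

# 6 respondents x 4 questions, answers on a 1-4 scale (hypothetical).
X = np.array([[1, 2, 1, 4],
              [2, 2, 1, 3],
              [4, 3, 4, 1],
              [3, 4, 4, 2],
              [1, 1, 2, 4],
              [4, 4, 3, 1]], dtype=float)

Xc = X - X.mean(axis=0)                  # centre each question (dimension)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)  # principal axes in Vt
coords_2d = Xc @ Vt[:2].T                # project onto the two main axes

explained = (s ** 2) / (s ** 2).sum()    # share of variance per axis
print("variance explained by the two axes:", explained[:2].sum())
print(coords_2d)                         # one (x, y) point per respondent
```

The `explained` vector is what lets the developer choose how much information the retained axes should account for, as described in the paragraph above.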
Thanks to econometric analysis, and depending on the data selected, the developer may determine which type of individual (age, sex, occupation, income level, etc.) is more likely, for example in a risk-taking survey, to take risks related to the subject of the study, or uncover which variables influence risk taking. Inferential statistics allow testing the significance of the series determined before, or testing their cohesiveness or homogeneity. This step should be performed by a statistician.
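As an illustration of such a significance test, the following sketch runs a chi-square test of independence between an invented respondent characteristic (seniority) and a binary risk-taking answer; the contingency table is hypothetical, and the 5% critical value for one degree of freedom (3.841) is the standard tabulated figure:

```python
# Chi-square test of independence between a respondent characteristic
# and a yes/no risk-taking answer. The contingency table is invented.
import numpy as np

# Rows: junior / senior; columns: "would take the risk" yes / no.
observed = np.array([[30.0, 10.0],
                     [15.0, 25.0]])

row = observed.sum(axis=1, keepdims=True)      # row totals (2 x 1)
col = observed.sum(axis=0, keepdims=True)      # column totals (1 x 2)
expected = row @ col / observed.sum()          # counts under independence
chi2 = ((observed - expected) ** 2 / expected).sum()

# Tabulated 5% critical value of chi-square with 1 degree of freedom.
CRITICAL_5PCT_DF1 = 3.841
print(f"chi2 = {chi2:.2f}")
if chi2 > CRITICAL_5PCT_DF1:
    print("dependence is significant at the 5% level")
```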
When these first four stages of the process have been completed, the final step consists in developing and writing up the results in a language and tone that the target audience will understand (board of directors, management, staff, economic partners, and any other internal or external stakeholders when dealing with risks). Graphic presentations may help visualize statistical results. It is often essential that the interviewees themselves be sent the results, perhaps in the form of an “executive summary”, especially if they wished to be informed and might need to be interviewed for further studies. This is particularly true in studies concerning an organization's risk management practices, as all interviewees are likely to be stakeholders, and monitoring their perception through time is essential to managing the risk management efforts, not least in terms of risk to reputation. (However, there are situations where the organization may not want to disseminate the results too widely, because they could help competitors and create a competitive disadvantage; the media might also put the information to uses detrimental to the organization.)
The model to manage risk to reputation (see Section 3.1 on risk to reputation in this book), stresses the importance of consulting stakeholders to define a reputation index that can be monitored. As an illustration of the process a risk to reputation questionnaire has been developed. To better illustrate the process we have selected one specific driver and one specific stakeholder, chosen among the list provided in the previous article. This questionnaire presents a common model for a study, in the pole “action”, of the perception of risks to reputation of the staff (stakeholder) through a “leadership/governance” approach (driver). It might be written the following way:
In Your Opinion:
Essential | Very important | Moderately important | No impact
YES | NO | DON'T KNOW
a. Specific field evaluation | |
b. Informal observation | |
c. Financial results | |
d. Media coverage | |
e. Published ranking | |
f. Others |
DRIVERS | KEY (?) |
a. Ability to attract and retain best talents | |
b. Quality of management | |
c. Corporate social responsibility (community) | |
d. Sustainable development (environment and future generations) | |
e. Innovation | |
f. Cover extensions and quality | |
g. Claims handling and insured satisfaction | |
h. Efficient use of corporate assets | |
i. Financial soundness | |
j. Long-term investment value | |
k. Effectiveness in doing business globally |
DRIVERS | RANK |
a. Ability to attract and retain best talents | |
b. Quality of management | |
c. Corporate social responsibility (community) | |
d. Sustainable development (environment and future generations) | |
e. Innovation | |
f. Cover extensions and quality | |
g. Claims handling and insured satisfaction | |
h. Efficient use of corporate assets | |
i. Financial soundness | |
j. Long-term investment value | |
k. Effectiveness in doing business globally |
Factor | Essential | Very important | Moderately important | No impact |
Clients | ||||
Employees | ||||
CEO's reputation | ||||
Stockholders | ||||
Public Officials | ||||
Media – press | ||||
Media – Radio & TV | ||||
Media – social media | ||||
Financial Analyst | ||||
Activity Analyst | ||||
Trade Union | ||||
Internet | ||||
Plaintiff Attorneys |
Question | Essential | Very important | Moderately important | No impact |
Internet strategy management | ||||
Control negative information on Internet | ||||
Internet monitoring (blog and forum) |
PLEASE RETURN COMPLETED QUESTIONNAIRE BY EMAIL, MAIL or FAX
BY precise deadline
Richard Connelly, Ph.D.
Founder and Director, Business Intelligence International
Jean-Paul Louisot
Formerly Université Paris 1 Panthéon-Sorbonne, Directeur pédagogique du CARM Institute, Paris, France
“Risk doesn't mean danger – it just means not knowing what the future holds.” (Peter L. Bernstein)
This quote is at the heart of what risk management should be for any organization, whether managing the potential downside of an investment or putting a value on the option of waiting before making irreversible decisions. The ISO 31000:2009 RM standard points to the need to apply sound enterprise business intelligence analysis to risk management programs through the alignment of the GRC (governance / risk management / compliance) triangle.
Long before “analytics” came to be known as the reference for the compilation of decision information within and outside an organization, Howard Dresner in 1989 proposed a definition of “business intelligence” as an umbrella term to describe “concepts and methods to improve business decision making by using fact-based support systems” (Dresner, 2009).
Enterprise-wide risk management maturity means that auditors can assert that decision making is based on reliable data management processes that comply with governance requirements and legal matter management controls. This can be achieved only by maintaining documented assurance that international standards for evaluating performance accountability, reporting transparency and audit integrity are embedded in the roots of the organizational culture.
The value of an investment in enterprise risk management information, as requested in the ISO 31000 standard, may be assessed by the depth of the risk factor disclosures reported to stakeholders and credit analysts. Credit agencies need “reasonable assurance” that the risk assessment data are reliable and consistent throughout the organization. Technical advances in enterprise information technology provide the means to connect management's performance guidance statements to predictive analytics that correlate financial results with asset–liability reserves and risk mitigation response plans. These are the key elements that investors, underwriters and regulators assess when stress testing economic forecasts and applying valuation models to capital liquidity analysis.
ERM – Enterprise wide risk management – is a global and integrated approach to risk management program implementation:
Enterprise risk management and loss control programs are efficient only if they are based on consistent, complete and reliable risk management documentation. This is why every organization can improve risk management decision making at all levels, from strategic to tactical, by applying business intelligence analytics to generate risk exposure insights from their information system assets.
Enterprise risk analytics (ERA) systems integrate enterprise-wide data flows for management reporting, business planning, internal controls testing, and credit evaluation. The evidence of ERA systems use helps to fulfil regulatory oversight needs for transparent reporting. Data management logs that show IT Governance practices are orchestrated across risk management collaboration groups support enterprise corporate governance objectives for performance reporting accountability, regulatory compliance fulfilment, and audit assertions integrity.
The processing capacity of enterprise information architecture has expanded to accommodate “big data” transaction file sizes and “in-memory processing” of complex calculations. The business intelligence technique of ETL (extraction, transformation and loading into functional data marts) has been supplanted by the computational capability to run analytic calculations directly against transaction files. This reduces storage requirements and processing time, and allows risk detection to trigger more immediate notifications across a broad monitoring array of operational and financial system mappings.
Enterprise business intelligence analytics systems ensure that there are consistent definitions and calculations in the data management foundations of business reporting and analysis. These are the primary business intelligence functions that apply to documenting enterprise-wide risk management decision analysis practices:
Risk management culture auditing is reflected in assessing how information that passes IT governance standards is used by directors and officers to confirm there is enterprise-wide oversight of risk management performance roles delegation. The chief executive officer (CEO) and chief financial officer (CFO) of public corporations have fiduciary oversight responsibility to certify the integrity of internal risk controls. The application of enterprise risk analytics information to specific ISO quality standards illustrates to stakeholders that risk management culture principles originate at the top of the business's hierarchy. ISO implementation status reviews set the “tone at the top” that reinforces the organization's commitment to managing risk enterprise-wide through each person's job activities.
The consolidated documentation of how enterprise analytics systems, IT governance and risk monitoring outputs are orchestrated by senior management forms the basis of enterprise risk factor case reviews; these confirm the decision-making accountability of organizational roles for achieving risk management culture goals.
Management agendas for enterprise risk analytics reviews are set by prioritizing specific risk factors that may rise to the strategic impact level. The ERA review process covers all issues that may have a material impact on financial results. Each risk factor is associated with programs or projects where managers are assigned to assess upside opportunities and downside loss exposures.
The ERA agenda always includes core topics such as Cyber Risk Exposures, Stage 1 Disaster Recovery Plans, Business Continuity Risk Mitigation Plans and Systemic Investment Market Risk Exposure Response Plans. The chief risk officer (CRO) also puts significant Insurance Coverage/Catastrophic Loss potential/Risk Retention decisions on the agenda for active discussion of financial and operational risk treatment plans.
Enterprise risk analytics systems' information can solve the problem of connecting risk management program plan objectives across organization departmental silos and external supplier/vendor networks. Outcomes from enterprise risk analytics reviews provide assurance that risk management plans fulfil fiduciary oversight of six key risk management orchestration factors:
Financial and Credit Reporting
Legal Matter Management
Audit Programs Synchronization
IT Governance
Risk Exposure Analytics
Risk Management Programs
Chief risk officers and other staff members who prepare enterprise risk analysis cases for senior management reviews maintain enterprise-wide oversight of the organization decision making roles that contribute to evaluating resource time and expense budget line items. These oversight steps are critical to confirm risk management programs are operationally viable.
Risk management program implementation plans include ongoing assessment of enterprise risk management indices. The index includes inherent risk assessments of potential maximum-loss severity events, which address catastrophic risk planning. The residual risk assessment shows how COSO (Committee of Sponsoring Organizations of the Treadway Commission) standards for mitigation control maturity are applied to calculating loss frequency and severity probabilities. The active use of enterprise risk analytics distinguishes legal entities that are able to link each risk management program decision owner to the specific risk assessments and risk mitigation plans for his/her areas of job responsibility.
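The inherent/residual distinction can be illustrated with a toy expected-annual-loss calculation. All figures and control reduction factors below are invented for illustration, not COSO prescriptions: the inherent assessment is made before controls, and the residual assessment rescales frequency and severity once mitigation controls are applied.

```python
# Hypothetical sketch: inherent vs. residual risk assessment via
# expected annual loss (frequency x severity). All figures are invented.

def expected_annual_loss(frequency, severity):
    """Long-term annual weight of a risk: frequency times severity."""
    return frequency * severity

# Inherent assessment of a hypothetical catastrophic event.
inherent_freq, inherent_sev = 0.05, 2_000_000   # once in 20 years

# Assumed control effects: prevention cuts frequency, protection
# cuts severity (reduction factors reflect control maturity).
freq_reduction, sev_reduction = 0.5, 0.4

residual_freq = inherent_freq * (1 - freq_reduction)
residual_sev = inherent_sev * (1 - sev_reduction)

print("inherent EAL:", expected_annual_loss(inherent_freq, inherent_sev))
print("residual EAL:", expected_annual_loss(residual_freq, residual_sev))
```

Linking each decision owner to such before/after figures for his or her own exposures is precisely what the risk analytics systems described above make auditable.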
Insurance and reinsurance companies and banks have regulatory guidelines in place to frame ERM program goals for managing business process risks in the wake of Solvency II and Basel banking standards enforcement. Responsible board members in all companies need to understand how the use of enterprise risk analytics systems relates to assessing decisions that strengthen management's capability to predict opportunities for growth and prevent potential losses.
ERM, or enterprise-wide risk management, topics are now high on the boardroom agendas of financial committees, human resources/compensation committees, and risk committees. Whatever board committee structure is in place, board consensus on both upside opportunities and downside threats is clearly the cornerstone of directors' fiduciary monitoring responsibility for the GRC triangle's principles connecting governance, risk and compliance oversight standards.
Managing risk uncertainty is essential to develop and execute enterprise performance strategies that can adapt to the unexpected – and still deliver company value expectations to all stakeholders. The main driver is to target the application of enterprise risk analytics systems based on directors' understanding of the complexities of enterprise risk case prioritization decisions, whether the subject is maintaining desired credit rating documentation or applying risk analysis information to business continuity contingency planning.
Directors and officers recognize that each entity must develop its own risk management program plans. They must assess whether internal resources that do not have access to risk analysis systems have adequate capacity to maintain reporting program accountability and their risk mitigation treatment plan responsibilities. As risk information reporting requests from regulators and business contract counterparties expand, the need increases to monitor fulfilment of risk management-related reporting responsibilities in support of the organization's goals and strategic performance management missions.
Regulatory changes in global financial risk information reporting from “systemically important financial institutions” provide test cases for all directors and officers to assess how they are adjusting banking and insurance information exchange practices with “SIFIs” that may be “too big to fail.”
US regulators are linking the systemically important financial institutions' (SIFIs) capital adequacy calculations with bankruptcy stress testing analysis reports. SIFIs are required to file corporate living wills (CLWs): documentation that shows how their simulations of global asset value meltdowns are linked to risk mitigation plans and ultimately to bankruptcy trustee filings. SIFIs' CLW reports provide inputs for all company treasurers to assess how their banks' and investors' stress test scenarios affect Treasury management contingency plans.
Intraday credit monitoring regulatory changes are now in place to reduce the threat that asset liquidity meltdowns can freeze global banking relationships. Over-the-counter derivative instruments now require three-way settlement reconciliation with clearing banks to assure adequate margins are on deposit to match derivatives transactions. Investor cash balance adequacy assurance now requires that enterprise total collateral valuation reports support financial guarantees. Portfolio securities price value aging must be disclosed to investors. Companies that are near credit cap requirements must prioritize daily securities trades in advance of a common (US East Coast) afternoon settlement time. Maintaining securities settlement fiduciary documentation requires risk analytics mastery to exchange information and risk notifications with investment counterparties.
The case for enterprise analytics systems starts at the top of the organization with assessment of the largest risk exposures. The following box provides a partial explanation of enterprise risk analytics systems benefits that are relevant across all organization levels.
All US 10-K and 10-Q filing companies are on an XBRL detailed reporting implementation timetable to apply disclosure analytics to financial statement account line balances and footnotes. The filings are used by investors and regulators for industry peer group performance analysis. (XBRL is a global data-tagging standard for exchanging information through accounting taxonomy schemas and linked databases that are tested with financial reporting validation checks.) The XBRL analytics tagging framework has also been extended to cover all investment securities' corporate actions events that relate to changes in the valuation of capital and equity.
If your company has not yet had the ERM discussion in the boardroom, it will happen soon. Obtaining the board's support is necessary but not sufficient for successful ERM program implementation.
Table 2.1 shows a set of enterprise risk analysis cases that have risen to the strategic review level in many companies. Case review issues and functional leadership highlight topics covered and organization participants who are typically involved in risk management analysis and risk treatment program planning. All companies will add a priority rating and risk readiness evaluation to the specific issues that apply to their legal entities.
Table 2.1 Enterprise risk analysis case issues
| Risk Management Programs | Case Review Issues | Functional Leadership |
| --- | --- | --- |
| Cyber Risk Treatment Plans | | |
| Property – Environment Treatment Plans | | |
| Accident and Safety Treatment Plans | | |
| Health & Wellness Treatment Plans | | |
| Pension & Savings Programs Treatment Plans | | |
| Investment Market Risk Plans | | |
| Country Risk Recovery | | |
Risk treatment plan leaders use risk analysis metrics dashboards to plan and control risk management programs. Table 2.2 shows examples of organizational job roles that maintain key information that goes into risk analysis decisions and implementation activities for specific programs.
Table 2.2 Risk analysis metrics dashboard
| Enterprise Risk Analysis Metrics | Metric Type | Risk Management Collaboration Group Roles |
| --- | --- | --- |
| Inherent Risk Financial Impact | Currency Value | |
| Risk Factors Impact – Stakeholder Reporting Footnotes | Count # | |
| Relevant Insurance Coverage Policies/Premiums – Financial Guaranty Contracts/Amount | Count #, Currency Value | |
| Claims Pending/Paid/Reserves | Count #, Currency Value | |
| Loss Adjustment Services Providers/Expense | Count #, Currency Value | |
| Related Legal Matter Documents/Legal Matter Expense | Count #, Currency Value | |
| Residual Risk Control Tests/Audit Documents/Business Process Maturity (COSO) Assessments | Count #, Assessment Score | |
| Regulatory Authority Filings/Regulatory Filing Documents/Regulatory Penalties | Count #, Currency Value | |
| Risk Management Collaboration Group Members | Count # | |
| Risk Documents Access Privileges | Count # | |
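A dashboard row of the kind shown in Table 2.2 pairs a metric name with a metric type (a count, a currency value, or an assessment score) and the collaboration group roles that own it. The following minimal sketch shows one way such records could be structured; the field names, example metrics, and role labels are hypothetical illustrations, not a specification of any particular analytics product.

```python
# Minimal sketch of risk-metrics dashboard records mirroring the table
# structure above. All names and values are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class RiskMetric:
    name: str
    metric_type: str              # "count", "currency", or "score"
    value: float
    owner_roles: list = field(default_factory=list)

dashboard = [
    RiskMetric("Claims Pending", "count", 12, ["Claims Manager"]),
    RiskMetric("Claims Reserves", "currency", 2_400_000, ["CFO", "Risk Manager"]),
    RiskMetric("Residual Risk Control Tests", "count", 38, ["Internal Audit"]),
]

# Roll up all currency-valued exposures for an executive summary line.
total_currency = sum(m.value for m in dashboard if m.metric_type == "currency")
print(total_currency)  # 2400000
```

Keeping the owning roles on each metric record is what lets the analytics system answer the oversight question in the text: who is accountable for each number feeding a risk decision.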
Risk management action plan preparedness metrics provide the keys to assessing the difference between paper-based programs and executable actions that reduce ultimate losses. Experience shows that success is correlated with how risk owners are empowered and trained for their enterprise risk analysis roles in ERM programs (Table 2.3).
Table 2.3 Risk management action plan
| Enterprise Risk Analysis Metrics | Metric Type | Risk Management Collaboration Group Roles |
| --- | --- | --- |
| Risk Response Resources | Full Time Equivalent Staff (FTE) Commitment | |
| Risk Response First Stage Recovery Budget Expense Estimate | Currency Value | |
| Risk Treatment Plan Workshops | Count # | |
| Risk Treatment Plan Workshop Participants | Count # | |
Higher percentages of company managers using analytics systems lead to measurable success in loss experience ratings that lower the cost of risk and increase performance goal forecasting accuracy. The information in the Enterprise Risk Analysis Case Issues and Enterprise Risk Analysis Metrics Dashboard examples provides baselines to evaluate how well current IT assets support risk management planning and control. Understanding Risk Information Cycle Management “gaps” between current risk decision information reliability and desired risk analytics mastery targets is the foundation of risk management planning leadership.
Robert L. Snyder, BA, JD, ARM
Professional risk advisor, member of the Texas Bar, and Adjunct Lecturer in the College of Business at the University of Houston – Downtown
Healthcare delivery is one of the most complex industries in modern American society. Taken as an “enterprise”, healthcare comprises a diverse set of service providers and stakeholders, beginning with direct healthcare providers: physicians, nurses, therapists and an array of other clinicians and allied health professionals. There are institutional care facilities such as hospitals, long-term care facilities (e.g. nursing homes, assisted living facilities, senior living communities), rehabilitation centers, ambulatory surgery centers, diagnostic imaging centers, and other facilities. On the supply side there are pharmaceutical and medical device manufacturers, and medical research facilities. On the business end there are private and institutional investors and shareholders in many of these businesses. Finally, there are the payers for services, including governmental programs, such as Medicare and Medicaid, health insurance companies and self-insured employers.
Healthcare is an enterprise that touches every individual throughout life in material and profound ways that other industries do not. There are many products and services one might elect to purchase or not purchase in the course of a lifetime (a house, an automobile, a personal computer, a vacation, a college education – all “elective” purchasing decisions). However, it is virtually 100% certain every person will need healthcare within the delivery structure that exists for providing it at a given point in time during his or her life.
The business, technological, political and societal influences on healthcare are also complex and interrelated. “Healthcare reform” efforts undertaken in the United States in the late twentieth and early twenty-first centuries (to be addressed further below) have revealed many challenges in identifying and addressing “risks” associated with healthcare.
Enterprise level risks, broadly stated, apply within the healthcare industry as they do in a host of other settings, along the following lines:
Financial risks can be generally defined as risks affecting profitability, and/or economic efficiency in the case of not-for-profit institutions. Financial risks include those that impact the enterprise's cash position, access to capital or favorable financial ratings, business relationships with other parties, such as suppliers, and the timing of recognition of revenues and expenses.
With respect to healthcare, while the “system” overall is comprised of an amalgamation of both for-profit and not-for-profit sectors, “profitability” applies to both. In for-profit endeavors, it is easy enough to understand that investors seek a return on their capital that is at risk in the enterprise. Although much healthcare is provided through not-for-profit institutions, these entities likewise must normally earn a financial margin (a surplus that is akin to profit) to be sustainable. For instance, a common motto among faith-based, not-for-profit healthcare organizations is, “no margin, no mission.”
The government at various levels has a major and increasing role in delivering and managing healthcare. Arguably the government does not seek to, and need not be concerned about, operating at a “profit.” However, if entities under governmental control accumulate large deficits over time, the burden falls on the taxpayers, which has significant political consequences, including the continuation (or not) of certain programs.
Hazard Risks, generally speaking, are risks resulting in loss or damage to physical assets of the business, or injury or property damage to other parties, including customers, patients, employees, business trading partners or other third parties, arising from the actions or alleged negligence of the business. Hazard risks are sometimes thought of as “insurable risks,” in that they involve the types of damage or injury for which most businesses can readily purchase insurance. Examples in the healthcare setting include medical malpractice and product liability lawsuits, and natural disasters (e.g. hurricanes, tornadoes, floods) causing damage to facilities such as hospitals or nursing homes.
Operational Risks refer to risks to the ongoing conduct of the business that result from changes in business practices, allocation of entity resources, effects of external regulations or requirements, inadequate or failed internal processes, people or systems. Operational risk is sometimes referred to as the risk associated with “doing the (strategic) thing right.”
Strategic Risks are risks that impact the organization's ability to achieve its broader goals and objectives, such as risks to market position or reputation, or the risk that a business plan to which major resources and effort are committed will ultimately not be successful due to lack of acceptance in the marketplace. Strategic risk is sometimes referred to as the risk associated with “doing the right thing.”
In fact, it is important to understand strategic risk management, in particular as a critical component ultimately driving “enterprise” risk management. “Strategic risk” is associated with adopting or not adopting the correct strategy for the organization in the first place, or, once adopted, not adapting the chosen strategy in response to competition or other forces. Strategic risk management contemplates the integration of strategic planning, the setting of organizational objectives and the identification of “risk” with the organization's enterprise risk management program.
Enterprise risk management addresses risks to strategy at its core. ERM significantly looks for critical risks (and, as noted below, opportunities) associated with the defined strategy. In the context of healthcare reform (to be discussed further), for instance, an important strategic shift for both providers and payers is the realignment from “fee for service” medicine (i.e. the more services and procedures provided the more revenue generated) to “global” type payments that will generate rewards, presumably for all parties, including patients, through wellness and quality metrics associated with managing the health of certain defined populations, especially including population groups characterized by common chronic conditions, such as hypertension, obesity and diabetes.
It is further important to note that “risk” does not merely denote the likelihood of failure. Risk also represents opportunity, and in fact, from a business perspective in healthcare, or other industries, any opportunity worth pursuing is likely to entail risk. A chairman of Lloyd's of London phrased it this way:
But risk management is not simply about preparing for the worst. It's also about realizing your full potential. With a clear understanding of the risks they face, businesses can maximize their performance and drive forward their competitive advantage.17
Further, it will be obvious that while the “four quadrants” represent a convenient manner for broadly categorizing risks, risks within each quadrant do not exist in isolation from the risks in other quadrants. There is significant overlap, and an area of convergence, where particular risks may be regarded concurrently as financial, strategic, operational or hazard risks in various combinations. “Enterprise-wide” risk management essentially focuses on the overlapping risks. These risks might be thought of by the managers or leaders of the enterprise in the form of the question, “What keeps you up at night?”
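The observation that enterprise-wide risk management focuses on risks spanning more than one quadrant can be sketched as a simple tagging exercise. The risk names and quadrant tags below are hypothetical healthcare examples chosen for illustration, not a definitive taxonomy.

```python
# Illustrative sketch: tagging risks with the "four quadrants" and
# surfacing the overlapping risks that ERM focuses on. All risk names
# and tags are hypothetical examples.

QUADRANTS = {"financial", "hazard", "operational", "strategic"}

risks = {
    "Medical malpractice claims": {"hazard", "financial"},
    "EMR system outage": {"operational"},
    "Reimbursement model shift": {"strategic", "financial"},
    "Hurricane damage to hospital": {"hazard"},
}

# ERM attention goes first to risks spanning more than one quadrant.
overlapping = sorted(name for name, tags in risks.items() if len(tags) > 1)
print(overlapping)  # ['Medical malpractice claims', 'Reimbursement model shift']
```

Even this toy version makes the text's point concrete: a single-quadrant view would assign each of the overlapping risks to one silo and lose the converging exposure.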
To illustrate within a specific segment of healthcare, consider for a moment the risks associated with a Managed Care Organization (MCO). The MCO is typically a third-party payer for medical or other healthcare services, such as a health insurance company or Health Maintenance Organization (HMO). These are licensed, regulated entities, business enterprises generally subject to a specific set of laws and regulations promulgated on a state-by-state basis. Within the context of “managed care,” not only do these entities negotiate contracts to pay healthcare providers, typically on behalf of employer-funded health insurance programs, they establish the parameters for coverage, such as which tests and procedures will be covered, what treatments will be accepted, based on a particular diagnosis, and what preventive services will be offered.
Examples of risks within each of the “quadrants” appear within Figure 2.3. For many MCOs these risks (or some subset thereof) have been the focus of attention for quite a long time. In the current environment, where new or significantly modified risks will arise under recently enacted federal and state laws, consider how the illustrated risks might change.
Similar matrices can be created for other business sectors, which collectively represent the sweep of entities comprising the healthcare industry.
One overriding factor impacting enterprise risk in the healthcare industry has been the evolving healthcare reform. During the past generation the first serious reform at the federal level was the effort actively promoted during the first term of the Clinton administration in the early 1990s. “Reform” objectives had long been debated both in society and in Congress. Generally, these objectives related to proposed measures for increasing access to healthcare by a large and growing segment of the population lacking health insurance, coupled with measures designed to control costs and improve outcomes for those receiving healthcare. For many years, influenced by many factors, medical cost inflation had outstripped inflation in the general economy. At the same time, despite an ever-growing proportion of the United States' gross domestic product consumed by the cost of healthcare, the country actually began to lag behind other developed economies in a number of quality indicators.
Ultimately, the healthcare reform effort of the 1990s did not succeed, but the nation's attention was focused on the issue in a way that it had not been in many years. Coming into the 21st century, the effort was rekindled after the election of Barack Obama as president in 2008. Although there was great political and public debate, in 2010 Congress passed the Patient Protection and Affordable Care Act (PPACA), the most sweeping set of reform measures relating to healthcare in many years. The new law was challenged in federal court, ultimately leading to the 2012 decision by the U.S. Supreme Court18 upholding the major provisions of the law.
The Supreme Court decision, coupled with the re-election of President Obama to a second term in November, 2012, have made it clear healthcare reform is here to stay for the foreseeable future. Profound implications are created at the enterprise level for entities and providers involved in the delivery of healthcare.
Thus, enterprise risk management stands to be a topic of increasing importance in healthcare. It remains to be seen exactly what risks will emerge and how they will be managed, but the following are suggested as major risk drivers, which will impact different sectors of healthcare:
PPACA provides for the formation and licensing at the federal level of “umbrella” entities, accountable care organizations (ACOs), within geographic areas that will contract with healthcare providers and manage the delivery of care to designated population groups participating in governmentally funded programs, particularly Medicare (primarily focused on the elderly) and Medicaid (primarily focused on the poor). ACOs will be financially incentivized by the government to contain costs and improve quality through a shared savings program and a set of metrics relating to health outcomes of the population served. ACOs are required to be independent legal entities and may be not-for-profit or investor owned. Health systems, for instance, will be challenged to determine whether there is a particular competitive advantage (or disadvantage) to them in ACO development or participation.
PPACA also directs the creation of state run health insurance “exchanges,” which will serve to provide insurance to segments of the population that might otherwise be unable to obtain it. States are incentivized within the law to participate in the exchange program on an optional basis through enhanced funding from the federal government to support the Medicaid program, which has long been administered at the state level. For states that reject the option and refuse to establish exchanges, the federal government is authorized to set up the exchanges.
An important aspect of healthcare transformation will be the management of care within specific “populations” in ways not previously utilized. This might include, for instance, new collaborations among providers targeting specific chronic diseases, such as high blood pressure or diabetes, affecting a certain population segment. Care will require coordination among specialists, which may or may not be formalized by specific contractual agreements allocating legal and financial liabilities. Revenue sharing among collaborating providers will have to be agreed. Population health management is likely to transcend medical care and also involve “quality of life” assistance from home aides, for instance, running errands or assisting with household needs.
Over a period of years the federal government will substantially decrease the level of reimbursements for many healthcare services provided primarily by physicians and hospitals. Presumably, the providers will have access to a larger insured population and thus the reduced reimbursements will be offset by greater volume generating additional revenue for the providers, as opposed to them having to “write off” a certain proportion of uncompensated care provided to an indigent population.
The complexities associated with complying with the new law, and the need to be part of provider networks, are likely to have the practical effect of forcing many physicians out of their traditional independent practitioner roles and into either direct employment by hospitals or health systems, or into various contractual relationships that will result in a high degree of control over their practices. This trend is well underway in many parts of the country. The acquisition of physician practices and the negotiation of physician employment contracts are both complex undertakings.
Going beyond the 1996 HIPAA act, the new law effectively mandates that healthcare providers adopt and implement systems for electronic medical records, in order to facilitate the timely and accurate exchange of patient information among an array of providers, such as primary care physicians, specialists and hospitals. The federal government has allocated “stimulus” funds available to providers to encourage their adoption of EMR systems. At the same time, providers are subject to strict constraints and penalties under a prior federal law, the Health Insurance Portability and Accountability Act of 1996 (HIPAA), for security breaches resulting in the intentional or unintentional disclosure of “personal health information.”
Both pharmaceutical companies and medical device manufacturers stand to be impacted by material changes in the healthcare business environment. Pharmaceutical companies will be under pressure to document the effectiveness of medications in producing positive clinical outcomes at acceptable cost for various populations under medical management. Insurers may refuse to include medications falling outside certain parameters on their approved formularies. Funding for development of new therapies may become more difficult to generate. Device manufacturers, for their part, are impacted by a new 2.3% excise tax called for by the Affordable Care Act as of January 1, 2013. Small start-ups in particular, which represent a material proportion of all medical device development, may be disproportionately impacted and placed at a competitive disadvantage.
As the impact of many aspects of the healthcare reform law at the federal level, and various counterparts at the state level, takes hold the financial viability of many healthcare entities will become tenuous. In a competitive environment inevitably there will be consolidation and contraction in various forms, such as through mergers and acquisitions. This activity logically will increase the probability that investors, such as shareholders, and other stakeholders will find themselves financially disadvantaged and will seek legal redress.
All of these risks, as well as others undefined, can be seen as emerging “enterprise” risks for healthcare organizations. Enterprise risk management, therefore, stands to serve a major role in development of business strategies for the healthcare industry in the years to come.