Monthly Archives: January 2017
Behavioral economics has only recently begun to gain acceptance among mainstream economists as a rigorous discipline that offers an alternative perspective on decision-making. Its growing adoption, together with advances in computational firepower, presents opportunities to develop practical applications for improving risk management practice. The goal of this article is to develop a contextual model of a cognitive risk framework for enterprise risk management, one that frames both the limitations and the possibilities of enhancing enterprise risk management by combining behavioral science with a more rigorous analytical approach. The thesis of this paper is that managers and staff are prone to natural limitations in Bayesian probability judgments, as well as to errors of judgment due in part to insufficient experience or data to draw reliably consistent conclusions with confidence. In this context, a cognitive risk framework helps to recognize these limitations in judgment. The Cognitive Risk Framework for Cybersecurity and its Five Pillars are offered as guides for developing an advanced enterprise risk framework to deal with complex and asymmetric risks such as cyber risk.
“A major task in organizing is to determine, first, where the knowledge is located that can provide the various kinds of factual premises that decisions require.” – Herbert Simon
In a 1998 critique of Amos Tversky's contributions to behavioral economics, Laibson and Zeckhauser discussed how Tversky systematically exposed the theoretical flaws in the assumption that individual actors behave rationally in pursuit of perfect optimality. Tversky and Kahneman's "Judgment under Uncertainty: Heuristics and Biases" (1974) and "Prospect Theory" (1979) demonstrated that actual decisions involve systematic error. "The rational choice advocates assume that to predict these errors is difficult or, in the more orthodox conception of rationality, impossible. Tversky's work rejects this view of decision-making. Tversky and his collaborators show that economic rationality is systematically violated, and that decision-making errors are both widespread and predictable. This now incontestable point was established by two central bodies of work: Tversky and Kahneman's papers on heuristics and biases, and their papers on framing and prospect theory."
Much of Tversky and Kahneman's work is less well known to the general public and is misinterpreted by some risk professionals as a purely theoretical treatment. As researchers, Tversky and Kahneman were well versed in mathematics, which helped them shine a light on systematic errors in complex probability judgments and on the use of heuristics in inappropriate contexts. As groundbreaking as behavioral science has been in challenging economic theory, Tversky and Kahneman's work centers on a narrow set of heuristics: representativeness, availability and anchoring as universal sources of error. The authors used these three foundational heuristics broadly to describe how decision-makers substitute mental shortcuts for probabilistic judgments, resulting in biased inferences and a lack of rigor in making decisions under uncertainty.
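The base-rate neglect associated with the representativeness heuristic can be made concrete with a short calculation. The numbers below are purely illustrative, not drawn from Tversky and Kahneman's papers: a minimal Python sketch of Bayes' rule showing how an apparently accurate signal still yields a low posterior probability when the base rate is small.

```python
def posterior(prior, sensitivity, false_positive_rate):
    """Bayes' rule: probability the condition holds given a positive signal."""
    true_pos = prior * sensitivity
    false_pos = (1 - prior) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# Hypothetical numbers: a 1% base rate, a 90% hit rate, a 5% false-alarm rate.
p = posterior(prior=0.01, sensitivity=0.90, false_positive_rate=0.05)
print(f"P(condition | positive signal) = {p:.3f}")  # about 0.154, not 0.90
```

Decision-makers who judge by resemblance anchor on the 90% hit rate and neglect that false alarms from the 99% of cases where the condition is absent swamp the true positives.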
Cognitive Risk Framework: Harnessing Advanced Technology for Decision Support
In the decades since Prospect Theory, data analytics expertise and computational firepower have made significant progress in addressing the weaknesses in Bayesian probability judgments recognized by Tversky and Kahneman. Additionally, the automotive industry and Apple Inc., among others, have successfully incorporated behavioral science into product design to reduce risk, anticipate human error and improve the user experience, adding value to financial results. This paper assumes that these early examples of progress point to untapped potential if applied in constructive ways. There are detractors, and even Tversky and Kahneman admitted to inherent weaknesses that are not easy to solve. For example, skeptics note that laboratory results may not replicate real-life situations, that arbitrary frames may not reflect reality, and that the theory lacks mathematical predictive accuracy.
Since Laibson and Zeckhauser's (1998) critique of Tversky's contributions to economics, a large body of research in cognition has evolved to include Big Data, Computational Neuroscience, Cognitive Informatics, Cognitive Security, Intelligent Informatics, and rapid early-stage advances in machine learning and artificial intelligence. A Cognitive Risk Framework is proposed to leverage the rapid advancement of these technologies in risk management; however, technology alone is not a panacea. Many of these technologies are still evolving, and progress will continue in stages, requiring risk professionals to begin formalizing steps to incorporate these tools into an enterprise risk management program in combination with other human elements.
The Cognitive Risk Framework anticipates that, as promising as these new technologies are, they represent only one pillar of a robust and comprehensive framework for managing increasingly complex threats such as cyber and enterprise risks. The Five Pillars are Intentional Controls Design, Intelligence and Active Defense, Cognitive Risk Governance, Cognitive Security Informatics, and Legal "Best Efforts" Considerations. A cognitive risk framework does not supplant other risk frameworks such as COSO ERM, ISO 31000 or the NIST standards for managing a range of risks in the enterprise. Rather, it is presented to build on the progress made in risk management and to provide a pathway to demonstrably enhance enterprise risk management, using advanced analytics to inform decision-making in ways only now possible. At the core of the framework is an assumption about data.
One of the core tenets of Prospect Theory is the recognition of decision-making errors derived from small sample sizes or poor-quality data. Tversky and Kahneman noted several instances in which even very skilled researchers routinely made errors of inference derived from poor sampling techniques. Many recognize the importance of data; however, organizations must anticipate that a cross-disciplinary team of experts is needed to actualize a cognitive risk framework. Data will become either the engine of a cognitive risk framework or its Achilles' heel, and it may be the most underestimated investment in ramping up a cognition-driven risk program. A cognitive risk framework anticipates a much more diverse set of skills than currently exists in risk management and IT security.
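The small-sample errors described above, what Tversky and Kahneman called belief in the "law of small numbers", can be simulated in a few lines. This is an illustrative sketch, not an analysis from their papers: repeated small "studies" of a fair coin scatter far more widely around the true rate than large ones, so inferences drawn from them are correspondingly less reliable.

```python
import random
import statistics

random.seed(7)  # fixed seed so the simulation is repeatable

def sample_means(sample_size, trials=500):
    """Mean of `trials` repeated samples of fair-coin (Bernoulli 0.5) draws."""
    return [
        sum(random.random() < 0.5 for _ in range(sample_size)) / sample_size
        for _ in range(trials)
    ]

small = sample_means(10)     # ten observations per "study"
large = sample_means(1000)   # a thousand observations per "study"

# Small samples swing far more widely around the true rate of 0.5.
print(f"spread of estimates at n=10:   {statistics.pstdev(small):.3f}")
print(f"spread of estimates at n=1000: {statistics.pstdev(large):.3f}")
```

A researcher who treats one n=10 result as representative of the population is making exactly the sampling error the paragraph describes.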
Data is but one of the considerations in developing a robust cognitive risk framework. Others include developing structures and processes that allow ease of adoption by practitioners across multiple industries and in organizations of different sizes. While it is anticipated that a cognitive risk framework can be implemented successfully in both large and small organizations, risk professionals may decide to adopt a modified version of the Five Pillars or to develop solutions that address specific risks, such as cybersecurity, as a standalone program. It is also anticipated that, if cognitive risk frameworks are adopted more broadly, technology firms and standards organizations will take an active role in developing complementary programs that leverage these frameworks to advance enterprise risk management using advanced analytics and cognitive elements.
Protiviti and North Carolina State University’s ERM Initiative are pleased to provide this report focusing on the top risks currently on the minds of global boards of directors and executives. This report contains results from our fifth annual risk survey of directors and executives to obtain their views on the extent to which a broad collection of risks are likely to affect their organizations over the next year.
“Never let the facts get in the way of a good argument”
Facts, or more precisely our understanding of facts and the truth, seem to have become more transient in the information age. Or have they? The Internet has radically changed how we access information in ways that few appear to challenge or even understand. Today, anyone can Google a fact, story or news event about any topic imaginable and "learn" about it instantly with only a few keystrokes. We are bombarded with opinion pieces, rumors, false news stories and innuendo, without bothering to check the validity of the stories. In fact, depending on the viewer of said data, the facts are easily dismissed when the "information" disagrees with one's views or beliefs about the topic. So the question here is: has the information age inhibited critical thinking? Risk managers are not immune to these same biases, and the implications may help explain why risk management is at risk of failing.
It turns out that the definition of "truth" does not answer the question of what a truth really is. Here are a few examples. Merriam-Webster states that truth is "sincerity in action, character, and utterance". Or "the state of being the case: a fact". Or "the body of real things, events, and facts". Or "a transcendent fundamental or spiritual reality". Or "a judgment, proposition, or idea that is true or accepted as true". Or my favorite, "the body of true statements and propositions." Dictionary.com has ten different definitions, each in contrast with Merriam-Webster's. In other words, truth is what we believe it is. You know you are in trouble when truth and transcendent or spiritual reality appear in the same definition. Apparently we have no idea what a truth is, or we are simply more confused than ever as we are bombarded with different truths.
But why is this important for risk professionals? If the truth changes based on evolving norms, opinions, perceptions and biases, how does a risk professional manage emerging risks in an environment where old truths conflict with new ones? Operating models change as new leadership dictates its view of the old operating models, requiring risk professionals to ask how these new risks should be assessed. What was once indisputable no longer applies, and old assumptions are considered impediments to progress. Or are they?
In the age of Big Data, corporations are in search of the truth about customer behavior, buying preferences, and the risks of strategic plans. However, even with the assistance of advanced analytics, we are more "archaeologists" than true scientists. Archaeologists apply a body of knowledge and a great deal of conjecture in constructing their view of the past. Each new discovery has the potential to disrupt, or partially validate, assumptions about what ancient civilizations or animals were really like. We don't have enough information to confirm these conjectures; we believe them in the absence of data that contradicts them. This is the crude method by which humans learn: trial and error. If something is proven to work reasonably well over time, it becomes the truth. If it fails miserably, it is considered not to be the truth. But we know from scientific experiments that truth can be derived from failures, even massive failures like the space shuttle catastrophe or major battles in war. We "learn" from mistakes and vow never to repeat them.
The truth is we seldom, if ever, have perfect information. Imperfect information is uncertainty, not risk. Risk is a known quantity: it can be measured, and we know to avoid it or accept it, which is why we call it a risk. The failure in risk management is not knowing the difference. Fear, confusion, and hope are signs of uncertainty, emotional signals that we have crossed the Rubicon of not knowing whether the outcomes will result in losses or gains. This is when risk managers become archaeologists. Archaeological risk managers try to develop stories from past experience and imperfect information to describe the new truths using old methods. This happens in every industry, from insurance to financial services and beyond, and it partly explains why we miss really big emerging risks until a "learning" experience teaches us what a risk really looks like.
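The distinction drawn here between measurable risk and uncertainty can be sketched numerically. The loss and probability figures below are invented for illustration: under risk the probability is known and exposure collapses to a single expected value, while under uncertainty only a range of plausible probabilities exists, so the honest answer is an interval rather than a number.

```python
def expected_outcome(p_loss, loss, gain):
    """Expected result when the probability of loss is known (risk)."""
    return p_loss * loss + (1 - p_loss) * gain

# Risk: the loss probability is known (say 10%), so exposure is one number.
risk_case = expected_outcome(p_loss=0.10, loss=-1_000_000, gain=50_000)

# Uncertainty: the probability is unknown; the best we can do is bound the
# outcome over a hypothetical plausible range (5% to 40%) and report an interval.
plausible = [p / 100 for p in range(5, 41, 5)]
outcomes = [expected_outcome(p, -1_000_000, 50_000) for p in plausible]
bounds = (min(outcomes), max(outcomes))

print(f"under risk, expected outcome: {risk_case:,.0f}")
print(f"under uncertainty, outcome lies between {bounds[0]:,.0f} and {bounds[1]:,.0f}")
```

The point is not the arithmetic but the shape of the answer: a point estimate signals a measurable risk, while a wide interval signals that the risk manager is in uncertainty and should be gathering information rather than reporting false precision.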
Fear, confusion and hope are natural responses of our primitive brain's "fight or flight" survival mechanism. These emotional responses are also signals that we must tread lightly, gather information gradually and take measured risks without betting the farm on a shiny new thing that may be a train coming through the tunnel of darkness.
How can risk professionals avoid the freight train? Don't be afraid to say you don't know. When worry, fear, and confusion permeate communications, that is a signal that a freight train may be barreling down the tracks. Instead, use this time to separate what you know from what you don't. Understanding the difference is critical because it gives risk managers direction: where to gather information, where to perform deeper assessments, and where to define the boundaries within which risks may be lurking. It is also important to understand that huge potential is the other side of uncertainty. Big rewards can be found when uncertainty is at its highest, but risk professionals must have a measured approach to understanding the upside of uncertainty.
This is not the time to follow the crowd.
The upside of uncertainty requires risk managers to seek opportunity where others are fleeing, or where others cannot see how new rules may benefit organizations poised to leverage change. What risk professionals must avoid during uncertainty is becoming archaeologists. Old methods may help to tell a compelling story, but the real risks and the upside of uncertainty will be lost as the new rules obscure what the truth really is.