Monthly Archives: January 2016
Simplicity may conjure up thoughts of inner peace and contemplative musings about self-actualization, but that is not the kind of simplicity I mean. The concept I find genuinely interesting is the challenge of making the complex simple: the kind of simplicity Steve Jobs imagined when he changed how we use technology.
Jobs redesigned how our brains interact with technology without our realizing we were participating in a brain hack! The ecosystem Apple created with the Mac and mobile devices via the Apple Store is a stroke of genius and an answer to an interesting problem: how do you make technology so simple that everyone on the planet can use it right out of the box?
“The reason that Apple is able to create products like the iPad is because we’ve always tried to be at the intersection of technology and the liberal arts. To be able to get the best of both. To make extremely advanced products from a technology point of view, but also have them be intuitive, easy to use, fun to use, so that they really fit the users. The users don’t have to come to them, they come to the user. And it’s the combination of these two things that I think has let us make the kind of creative products like the iPad,” Steve Jobs said at the iPad’s 2010 introduction.
Supposedly, the idea of a smartphone had been discussed long before Apple created the first iPhone, but no one was able to put all the pieces together the way Jobs did. The lesson from Apple’s success should not be that simplicity is too hard to conceive; instead, we must reframe simplicity as the end goal.
By reframing simplicity as the end goal, Jobs was able to see how multiple devices, such as the Walkman (remember those?), cameras, and phones, could be integrated seamlessly into one device. Jobs showed how focusing on the root causes of poor customer experience helps create higher profits, customer loyalty, and shareholder value; not the other way around. More important than the technology, Jobs chose not to control how the devices were used, which harnessed yet another ecosystem of developers and innovators who shared in Apple’s ascent to become the most profitable company on the planet. In other words, simplification led to organic iterations of new services, spawning global demand for all things Apple.
This raises very interesting questions about how we deal with risks or solve complex problems that appear to be intractable. If complexity is a product of our own design, what can we learn from Apple’s lessons in simplicity? You might be surprised to learn that simplicity is a topic being studied and tested in real-world scenarios.
About five years ago, Siegel+Gale, a global brand strategy, design, and experience firm, created the Simplicity Index to understand the role simplicity plays in brand awareness and loyalty. The Simplicity Index measures how customers perceive the simplicity of a company’s products and services along five attributes: easy to understand; transparent and honest; making customers feel valued; innovative and fresh; and useful to customers. These simplicity attributes lead to measurable benefits: higher profitability, customer loyalty, and premium pricing justified by perceived value.
Simplicity can be quantified and measured in real returns to organizations!
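To make the idea of measurement concrete, here is a minimal sketch of how such an index might roll attribute ratings into a single score. The attribute names follow the article; the 0–10 scale and equal weighting are my own assumptions, not Siegel+Gale’s actual methodology:

```python
# Hypothetical sketch: rolling five attribute ratings into one simplicity
# score. Attribute names are from the article; the 0-10 scale and equal
# weighting are assumptions, not Siegel+Gale's methodology.

ATTRIBUTES = [
    "easy_to_understand",
    "transparent_and_honest",
    "makes_customers_feel_valued",
    "innovative_and_fresh",
    "useful_to_customers",
]

def simplicity_score(ratings: dict) -> float:
    """Equally weighted average of the five attribute ratings (0-10)."""
    return sum(ratings[a] for a in ATTRIBUTES) / len(ATTRIBUTES)

brand = {
    "easy_to_understand": 8.5,
    "transparent_and_honest": 7.0,
    "makes_customers_feel_valued": 9.0,
    "innovative_and_fresh": 8.0,
    "useful_to_customers": 9.5,
}
print(simplicity_score(brand))  # 8.4
```

A real index would presumably weight the attributes by their measured impact on loyalty or willingness to pay rather than averaging them equally.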
The power of simplicity is much bigger than a product strategy! Consider how risk management could be transformed if internal controls and compliance were redesigned to make it simple for employees to get their work done and follow the rules. Simplicity requires that we ask counterintuitive questions: Why must we continue to operate the way we do? What barriers to simplicity exist for customers and employees? Is it time to treat the attributes of the Simplicity Index as the end game rather than as a mission statement with no real strategy for execution?
While you ponder those questions, we should also ask why complexity, not simplicity, is the norm. There are no simple answers, but network engineering offers examples of how overly complex network security design leads to cybersecurity vulnerabilities.
The concept of Robust Yet Fragile
Engineers of computer networks are well versed in how complexity accumulates as well-meaning security professionals add controls and policies in response to threats and weaknesses without considering the impact on network fragility over time.
John Doyle, the John G Braun Professor of Control and Dynamical Systems, Electrical Engineering, and BioEngineering at the California Institute of Technology, introduced the “Robust Yet Fragile” (“RYF”) paradigm to explain the five components of network design used to build a robust system.
Each design component adds robustness so that networks can handle today’s evolving business needs: “Reliability is robustness to component failures. Efficiency is robustness to resource scarcity. Scalability is robustness to changes in the size and complexity of the system as a whole. Modularity is robustness to structured component rearrangements. Evolvability is robustness of lineages to changes on long time scales.”
The graph in the above exhibit describes the optimal point of robust network design. “Like all systems of equilibrium, the point at which robust network design leads to unnecessary complexity is the paradox faced by security professionals and systems architects. Systems such as the Internet are robust to a single point of failure yet fragile to a targeted attack. As networks bolt on more stuff to build scale, the weight of all that stuff becomes more risky,” according to Doyle.
Doyle’s warnings about internet security also apply to enterprise risk management. Does anyone really believe that every employee understands how to operationalize the myriad policies and procedures put into effect each year? If so, you may be operating in the Domain of the Fragile, unaware of the vulnerabilities lurking around the corner.
How does an organization reframe simplicity?
The answer to that question differs by industry and organizational culture. A better approach is to pose new questions for you to consider in your organization. For example: Has the cost of critical operational functions increased at a higher rate than the benefits? How difficult is it for management to get timely answers about customer profitability, enterprise risk, or financial performance? Are you losing customers because you are difficult to do business with? Are your employees empowered to solve risks on their own or given the tools to improve the customer experience? As you can see, the list of possibilities is endless; however, if you do not know the answers to these questions, you are operating in the Domain of the Fragile.
Board governance is one place where the example of simplicity can be modeled from the top down. Directors have an opportunity to reframe success and reduce risk with a focus on simplicity. Simplicity is not just a focus on Less but a renewed focus on Better. Simplicity is not about doing “more” with “less”; it’s about doing less to achieve more!
As you consider new strategies for 2016 and beyond, how you reframe simplicity may be the difference between success and failure for years to come.
Whether you turn on your television or pick up your iPad, smartphone, or other mobile device, the cacophony of news around us has become more confusing and unsettling. The never-ending wars in the Middle East, cybersecurity breaches, global market rallies and capitulations, natural disasters, corporate layoffs ... you get the picture!
If you are like me, you want nothing more than a return to a quieter time when things were better! But the truth is that the past is seldom what we remember, nor something we can return to. We filter out the bad and remember the good. Our ability to move forward in the face of uncertainty depends on our brain’s ability to discount the negative and remain optimistic about the future.
Welcome to the new world of Asymmetric Risks!
The world around us has changed in ways most could not have contemplated 10 years ago. Mobile technology and social media have delivered enormous benefits but have also facilitated an explosion of cyber warfare. Likewise, the euphoria of democracy espoused in the Arab Spring has deteriorated into a cauldron of warring factions and terrorists. These events epitomize asymmetric risks: situations where the gains realized from one or more events differ significantly from the losses incurred by events in the opposite direction.
Asymmetric risks are different, but the phenomenon is as old as mankind. Asymmetry has been the basis for managing a variety of risks, from derivatives trading, healthcare research, and military warfare to cybersecurity. Asymmetric risks are pervasive but lie below the surface of awareness ... until some major event jars us from complacency.
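A toy example makes the payoff asymmetry concrete. The probabilities and amounts below are invented purely for illustration: a position that wins small most of the time can still carry a negative expected value because of one rare, outsized loss in the opposite direction.

```python
# Illustrative only: the probabilities and payoffs are invented.
# A 95% win rate still loses money on average when the rare loss
# is thirty times the size of the routine gain.

outcomes = [
    (0.95, +1.0),   # most of the time: a small gain
    (0.05, -30.0),  # rarely: a disproportionate loss
]

win_rate = sum(p for p, payoff in outcomes if payoff > 0)
expected_value = sum(p * payoff for p, payoff in outcomes)

print(f"win rate: {win_rate:.0%}")              # win rate: 95%
print(f"expected value: {expected_value:.2f}")  # expected value: -0.55
```

This is why asymmetric risks lie below the surface of awareness: day to day, the position looks like a steady winner.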
To better understand asymmetry, it is important to explain how these risks go unnoticed until we face the consequences of a crisis we have not planned for. Asymmetric risks are harder to anticipate primarily because these events occur so infrequently. Quantitative analysts call them tail risks! Nassim Nicholas Taleb described these events in his seminal book, The Black Swan: “a black swan is a highly improbable event with three principal characteristics: it is unpredictable; it carries a massive impact; and, after the fact, we concoct an explanation that makes it appear less random, and more predictable, than it was.”
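A quick simulation shows why tail risks defeat intuition trained on “normal” data. Comparing a thin-tailed (Gaussian) sample with a heavy-tailed (Pareto) sample of the same size, the heavy-tailed sample’s single worst draw dwarfs anything the Gaussian produces; the distribution parameters here are arbitrary choices for illustration.

```python
import random

# Thin tails vs. heavy tails, with arbitrary illustrative parameters.
# The Gaussian sample's largest draw stays within a few sigma; the
# Pareto sample (tail index 1.5) routinely produces extremes that
# are orders of magnitude larger.

random.seed(42)  # reproducible
N = 100_000

gaussian = [random.gauss(0.0, 1.0) for _ in range(N)]
heavy = [random.paretovariate(1.5) for _ in range(N)]

print(f"Gaussian max: {max(gaussian):.1f}")  # a handful of sigma
print(f"Pareto max:   {max(heavy):,.0f}")    # vastly larger
```

A risk model calibrated only on the Gaussian-looking middle of the data will systematically understate how bad the worst case can be.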
Asymmetric risks build quietly and gradually but present themselves in sudden, seemingly inexplicable ways. The most recent example is the drop in oil prices coinciding with the decline of China’s economic growth. These two events are unrelated, but both were predictable and should not have shocked global markets. Anyone who has studied historical oil-patch boom-and-bust cycles could have predicted that oversupply would eventually lead to lower prices and economic decline. Correspondingly, China’s economic miracle has no historical precedent; as economies mature and grow, a reversion to the mean is inevitable. The fact that these two unrelated events happened at the same time is less remarkable than the fact that the markets did not anticipate either of them.
The Efficient Market Hypothesis, now largely discredited, postulated that all market participants receive and act on all relevant information as soon as it becomes available. We are now more aware of the fallacy of this theory and the thinking behind it, which in part led to the Great Recession of 2008. Oil price declines and economic growth models have significant historical data from which to model how these events, though not their timing, would eventually resolve themselves. That these two events (the oil price decline and China’s slowing) occurred at the same time may or may not be a coincidence; but therein lies the risk of asymmetry.
The last oil boom-and-bust cycle ended more than 50 years ago, and who can remember the last time an emerging economy, such as China, rose to compete with America for the title of largest economy in the world? We fail to anticipate these events because of a phenomenon known as availability bias: our tendency to overweight the most recent information in front of us each day. Examples include a focus on the latest poll results for presidential candidates, dips in the stock market, or the latest terrorist attack.
Availability bias distracts us from the kinds of asymmetric risks organizations must now prepare to manage in a world where emerging economies, technology, and scientific discovery will accelerate, driven by advances in artificial intelligence, esoteric financial products, and computing power.
Unlike Nassim Taleb, I am optimistic that we will learn to manage these risks more effectively as our ability to harness the power of historical data continues to develop. In the meantime, organizations must begin to consider asymmetric risks within an Enterprise Risk framework. Too often, strategic corporate objectives emphasize the positive outcomes of plans for sales growth or market-share acquisition without fully contemplating the asymmetry of events that might offset the proposed gains if the plans fail. A lack of high-quality data for peering into the future is no longer an excuse. Much of the data exists to model one or more outcomes with varying degrees of precision; however, firms fail to model how these events might evolve in opposite directions, offsetting gains with losses.
Managing asymmetry in an Enterprise Risk framework requires a new mental model, one less dependent on formulaic internal-controls frameworks. Asymmetric risk requires that the Enterprise Risk framework be reimagined as an approach that makes us more aware of the blind spots in our thinking about risk.
Over the last seven years, the one constant corporate executives have complained about has been uncertainty: in the economic recovery, geopolitical risks, global competition, and expanding government regulation. Yet each of these perceived risks has paled in comparison to human behavior in the executive suite. In other words, while concerns have been externally focused, the true cause of corporate pain has been self-inflicted by bad corporate behavior.
The recent examples of internal control weakness at Walmart and Toshiba were cited as a wake-up call for external audit firms in a Fortune magazine article. A letter dated Sept. 9, addressed to Securities and Exchange Commission Chair Mary Jo White and the SEC Commissioners, was no ordinary letter. Its signers included heavy hitters such as former Federal Reserve Chairman Paul Volcker, Vanguard founder Jack Bogle, and former SEC Chairmen Arthur Levitt and Richard Breeden. Former Comptroller General of the U.S. Charles Bowsher, former board member and acting chair of the Public Company Accounting Oversight Board (PCAOB) Chuck Niemeir, and former Chair of the International Accounting Standards Board Sir David Tweedie, along with a host of other luminaries, also signed the letter.
In it they wrote that they have “an interest in the auditing and financial reporting quality of companies listed in the U.S. and internationally … Our purpose in sending this letter is to express our support for Chair [James] Doty’s reappointment [as Chair of the PCAOB] and to explain the reasons for this support.”
With very minor exceptions, no firm fails to attain a “reasonable assurance” attestation from its external auditors when internal controls are evaluated, even firms that eventually experience massive financial fraud, financial restatements, or internal control weakness findings after the fact. Why does this happen so frequently, and what prevents external auditors from detecting fraud?
After debating this issue with auditors and researching the PCAOB website for answers, it has become clear that the standard for “reasonable assurance” is one key contributor to failure. To find answers, I looked at accounting standards in the U.K. and the U.S. to better understand the guidance given to external auditors for formulating reasonable assurance. Here is what I found:
‘Reasonable assurance’ is the level of confidence that the financial statements are not materially misstated that an auditor, exercising professional skill and care, is expected to attain from an audit. The confidence that an auditor attains is subjective and is the basis for offering an audit opinion. Users of financial statements derive their own confidence in the audited financial statements from many sources, including a knowledge that the auditors work to professional standards within a framework of regulation and that the auditors have felt sufficiently confident that the financial statements are not materially misstated to issue an opinion.
As a consequence of their confidence that financial statements are not materially misstated, users of financial statements may also gain confidence that the management of the entity are conducting its affairs in the knowledge that the financial consequences of their actions will be reported.
The assurance the auditor obtains from performing procedures and the assurance the auditor expresses in the report on the financial statements vary based on the type of service the auditor provides. An audit is the highest level of service an auditor can provide. An audit allows the auditor to express an opinion about whether the financial statements are free of material misstatement. In contrast, the objective of a review of interim financial information is to provide the auditor with a basis for communicating whether, as a result of the procedures performed, the auditor became aware of any modifications that should be made to the interim financial information for it to conform with generally accepted accounting principles (“GAAP”).
The procedures performed in a review do not provide the auditor with a basis for expressing an opinion on the financial statements. Thus, the assurance the auditor provides to financial statement users based on a review is more limited than the assurance that can be provided as a result of an audit.
In both cases, the standards boards in the U.K. and the U.S. have punted on “reasonable assurance,” even though boards of directors and senior executives, not to mention regulators at the SEC, the Department of the Treasury, and other agencies, depend on these assessments. The U.K. standard specifically states that reasonable assurance is “subjective,” while the U.S. standard is more muddled, suggesting that the assurance attained varies depending on the level of service provided.
Basically, these standards are legal cover for whatever a firm wants them to mean. If one external audit firm concludes that “reasonable assurance” has been attained and another firm, using more advanced audit procedures, comes to a completely different conclusion, both opinions are acceptable; that is, until the company restates earnings and lays off thousands of employees to fix the problem.
Auditors frequently argue that “reasonable assurance” is well established by corporations and broadly accepted. This may be true, but blood-letting was also an accepted medical practice in the 17th and 18th centuries, until more patients died from the procedure than were cured! Is reasonable assurance the 21st century’s version of blood-letting? Just because something is accepted as standard practice does not mean that the practice is efficacious!
Blood-letting has since been discredited and would now be considered malpractice in medical circles, thanks to scientifically advanced procedures for treating illness. Isn’t it time the accounting industry subjected its practice to more advanced procedures, using a combination of cognitive and analytical processes, to give corporate boards and senior executives confidence in its work product? Otherwise, reasonable assurance should be treated more like collusion with management: an appearance of compliance with little substance whatsoever to demonstrate that internal controls are operating properly.