Tag Archives: big data

2015-03-28 by: James Bone Categories: Risk Management Are basic Human Rights being violated by Big Data?

The Edward Snowden insider threat has been played out in a variety of scenarios, but to date the questions have been posed as theoretical “what if” tabletop exercises at best.  No research has been conducted to analyze the potential unintended consequences that could lead to real harm to society and basic human rights if big data is misused.

That is, until now.  A research grant of nearly 5 million pounds ($7.4 million) from the Economic and Social Research Council (ESRC) has been awarded to the University of Essex, which will lead the first global project investigating the human rights implications of the collection, storage and use of big data.

The project will explore how big data gathered through social media can be used against us, and whether human rights violations can be documented through the exploitation of this information.

Essex University will conduct the research through its Human Rights Centre using a multidisciplinary team of experts from law, social science, computer science and technology.  Principal investigator Professor Maurice Sunkin of the Essex School of Law noted, “Rapid technological developments, and our engagement with things like social media, enable unprecedented collection and analysis of big data.

“The data can be used to provide comprehensive profiles of our societies, and of identifiable individuals.  While offering huge potential benefits, this poses significant threats to fundamental human rights.”

“Adopting a global perspective, our research will improve understanding and assist the development of law to enable us to benefit from big data, while limiting the potential harm and ensuring human rights are protected.”

For more information please contact the University of Essex Communications Office telephone:  01206 872400 or email: comms@.

2014-12-06 by: James Bone Categories: Risk Management Cicero: Why we kill the messenger

“When you wish to instruct, be brief; that men’s minds take in quickly what you say, learn its lesson, and retain it faithfully. Every word that is unnecessary only pours over the side of a brimming mind.”
— Marcus Tullius Cicero, 106-43 BC

Marcus Tullius Cicero was murdered by decree on December 7th in the year 43 BCE. A lawyer, statesman, politician and philosopher, he came to be known as one of Rome’s greatest orators. Cicero was an avid thinker and writer, and his texts include political and philosophical treatises, orations and rhetoric, the latter of which has come to be known as “Ciceronian rhetoric,” and a vast body of letters.

How is a Roman philosopher relevant to 21st century risk professionals?  Even the most educated and articulate practitioner of the art of risk management can be fooled by randomness.  Philosophical thought between 150 and 20 BC evolved during a time of change brought on by war and upheaval in intellectual life. Alliances were formed and dissolved through marriage, assassination, or political arrangement by the ruling class to maintain power.

The ability to persuade one’s audience through effective rhetoric, one of the most prized skills in the legal and political arena, helped to build and sustain influence during periods of relative stability.  It is fair to say that probability was not mathematically advanced in Cicero’s era, yet the intellectual pursuit of understanding random events was no less important in the Roman empire than it is in today’s business or political setting.

The practice of probability was best described by the school of thought called the “Skeptics” in their pursuit of truth, ethical behavior, and the proper role of civic life.  These words and ideas did not exist before Socrates, Cicero and other philosophers “invented” names for them in an attempt to establish “ideal” societal behavior amid the less-than-ideal lawlessness that was often the norm of the day.

“Cicero was most aligned with the Academy Skeptics and the general view that nothing can be known with certainty and that ‘truth’ is essentially relative probability. The skeptic approach appealed to him especially as an effective strategy in law and politics. The skeptic must seek as many perspectives as possible and tease out as many probabilities in order to present a valid argument. As well, it also accepts and advocates malleability as probabilities and perspectives fluctuate over time, and ‘evidence’ proves otherwise.”

The skeptics’ school of thought is still prevalent in today’s scientific approach to probability, mathematics, physics, and applied quantitative big data.  But what led to Cicero’s untimely and violent end?  He was a victim of the same error many make in the pursuit of the ideal of finding truth.  “Truth”, like risk, is in the eye of the beholder; the person in power gets to determine what truth is and how to manage the risk that threatens the truth they wish to maintain.

Human nature has changed very little in over 2,000 years!

Cicero was first exiled, then unceremoniously murdered by Roman soldiers, and his body parts were displayed in the Roman Forum as a message to others whose narrative was not aligned with the current leadership.

This is why corporate governance is so challenging to address effectively.  Early retirement, job reassignment and staff reorganizations have displaced summary executions, but the effect is the same.

Is there a silver lining?  Cicero’s writings and philosophical teachings have influenced leaders through the centuries and continue to be a cornerstone of regulatory guidance, but the challenges remain.  Human nature is hard to overcome.

2013-05-29 by: James Bone Categories: Risk Management Remodeling Risk: Trends impacting Risk


The banking industry has led the way in developing and adopting technology to manage financial risk.  According to a survey by the American Banker Executive Forum, large Wall Street banks made significant investments in risk systems during the Great Recession in anticipation of new regulation.

Now it appears that “72% of all banks are planning to make purchases of risk systems over the next 12 – 18 months.”  This sentiment is being echoed at conferences across the country and is a trend that may have room to grow.  Mid-tier banks are now jumping into the market for technology to manage risk, and a few interesting trends have emerged.

According to the survey results, “banks with more than $10 billion of assets are satisfied with their enterprise risk systems yet 36% of small banks plan to implement an enterprise system.” “Firms continue to want to chop down the silos and provide more information across disciplines,” says Michael Versace, research director at IDC.

The first trend driving increased investment in risk tools is improved management of regulatory risk.

Secondly, revamping credit models to account for counter-cyclical trends in the economy is forcing risk professionals to rethink capital management, counterparty risk and adjustments to risk-weighted assets.

Thirdly, social media is becoming a bigger factor in how firms use technology to manage risk.  Social media now presents a great deal of potential for mining market intelligence.

Trend #4 involves how risk is priced.  Banks are looking for ways to improve how they adjust to changes in customer behavior.  Integrating capital requirements into their systems for evaluating credit risk allows firms to allocate capital more efficiently as customer accounts change over time.

Model validation is trend #5.  Banks have incorporated a host of new risk models which are costly to maintain and update frequently.  The Office of the Comptroller of the Currency has issued guidelines for best practice in model validation, requiring banks to devise cost-effective approaches for ensuring existing models, and their assumptions, are still relevant given changes in the business environment.

Trend #6 is right-sizing dashboards, with key metrics customized for each firm.  Customized dashboards give senior executives the information they need in the way they expect to see it.

Real-time and continuous risk monitoring is trend #7.  Risk solutions must be fine-tuned to monitor critical risks, allowing for human intervention in a just-in-time manner while ensuring that routine limits remain in line with expectations (a minimal sketch of such limit monitoring follows the list of trends).

Trend #8 looks at the integration of risk systems.  Many of the silos within firms have been artificially created by the technology silos that exist to address spot solutions or isolated business processes.  Integrating risk systems breaks down those silos, allowing a richer line of sight to enterprise risks.

Trend #9 is Big Data and the use of in-memory computing to extend existing database technology.
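To make trend #7 concrete, here is a minimal sketch of what continuous limit monitoring can look like.  The metric names, limits and numbers are invented for illustration; a production system would poll live feeds on a schedule rather than check a static snapshot.

```python
# Each limit is (threshold, direction): "max" metrics must stay below it,
# "min" metrics must stay above it. All names and numbers are hypothetical.
RISK_LIMITS = {
    "var_99_usd":            (5_000_000, "max"),
    "counterparty_exposure": (25_000_000, "max"),
    "liquidity_ratio":       (1.10, "min"),
}

def check_limits(snapshot):
    """Return alert strings for every metric breaching its limit."""
    alerts = []
    for metric, value in snapshot.items():
        threshold, direction = RISK_LIMITS.get(metric, (None, None))
        if threshold is None:
            continue  # metric has no configured limit
        breached = value > threshold if direction == "max" else value < threshold
        if breached:
            alerts.append(f"BREACH {metric}: {value} vs limit {threshold}")
    return alerts

# One snapshot shown here; a real monitor would loop over a live feed.
snapshot = {"var_99_usd": 6_200_000, "counterparty_exposure": 12_000_000,
            "liquidity_ratio": 1.05}
for alert in check_limits(snapshot):
    print(alert)  # escalate to a human for just-in-time intervention
```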

The American Banker Executive Forum study demonstrated that a variety of vendors are being used today to address these trends.

What these trends represent is the growing importance firms are placing on the management of risk and the tools needed to assist them in tackling real business problems.

TheGRCBlueBook’s mission is to be a global risk and compliance community site and resource portal for sharing best practices across all highly regulated industries: a one-stop source for all things risk and compliance.

2013-05-28 by: James Bone Categories: Risk Management Aligning strategic value with risk management


The vision of risk management contributing as a strategic partner in the executive suite has long been a dream of most serious risk professionals, and now that vision may be coming into focus.  Senior managers now view risk managers as strategic partners in the execution of corporate objectives, assessing and identifying the key risks arising from strategic plans.  That’s the good news!

However, according to a study by Marsh and RIMS, “only 15% of the risk professionals and 20% of the C-Suite respondents said the risk manager is a full member of the strategic planning and/or execution teams, suggesting that risk management has yet to be fully integrated strategically.”

The study does not attempt to explain why risk managers have not made the leap to equal partners in guiding the organization to successful outcomes, but one key factor may be the relevance of the risk information brought to the table.  This raises the question: what defines strategic value in risk terms?  Increasingly the answer is data and the analysis of the risks impacting an organization.

It is hard to argue with the collective wisdom forming around the quest for a better understanding of data and better techniques for its analysis.  Senior management has begun to define the value proposition in the form of data analytics; therefore, risk management must be responsive to these expectations.

The problem with these surveys is the generic use of the term “data analytics” and the lack of specificity regarding what firms actually expect.

Blindly conducting fishing expeditions for the sake of “doing” risk management may backfire and fail to produce the results firms are seeking.  Many obvious risks lie in plain view needing attention but are ignored because there is no systematic approach to investing in risk mitigation.  Other risks are the unknown risks inherent in the uncertainty of launching a new and unproven initiative or line of business.

What appears to be missing is a clear and balanced approach to risk management, with a focus on setting the context for discussing risks and the tools that should be employed to understand and address them.  Risk management is not a science project in which data analysis alone will uncover some universal truth.  Good risk management is the implementation of a clear baseline from which to judge changes in the environment that may create risks and opportunities alike.

Risk, in all its forms, evolves as the business environment evolves, requiring senior management and the risk manager to think about risk as a natural byproduct of business objectives.  Risk practice, no matter how quantitatively proficient, will not eliminate risk.  Therefore, risk management should be perceived as a learning process informed by data and adjusted in response to new information as it becomes available.
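As a toy illustration of that learning process (the numbers are invented, and nothing here comes from the Marsh/RIMS study), a risk estimate can be treated as a prior belief that is revised as new loss data arrives:

```python
# Toy illustration: revising an estimated incident rate as new data arrives.
# A Beta(alpha, beta) prior over the incident probability is updated with
# each period's observed (incidents, trials). All numbers are made up.

alpha, beta = 2.0, 38.0  # prior belief: incidents are rare (about 5%)

observations = [(1, 20), (0, 20), (3, 20)]  # (incidents, opportunities) per period
for incidents, trials in observations:
    alpha += incidents
    beta += trials - incidents
    estimate = alpha / (alpha + beta)
    print(f"after {trials} new trials: estimated incident rate = {estimate:.3f}")
```

The point is not the arithmetic but the posture: the baseline estimate is never final, and each period of new information moves it.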

When everyone understands that risk management is a process, like all good business processes, risk managers will have earned their place in the executive suite alongside other senior managers.


2013-05-13 by: James Bone Categories: Risk Management When Big Data Doesn’t Work


With few exceptions, articles about Big Data start off with promises to be smarter, run more efficiently, or make more money.  As proof, each article cites the standard examples: data analytics and robotics transforming warehouse operations, IBM’s Watson mastering the game show Jeopardy!, and promises that firms will make decisions more effectively.

Examples of success may be far fewer than we realize, given that most describe a hoped-for future state as opposed to the few actual cases cited above.  Real or not, we may learn more from stories of failure in gauging how much progress we have yet to achieve.

Big Data requires an infrastructure that does not yet exist in its entirety.  The infrastructure of Big Data is evolving rapidly but sits at the lower end of the S-curve of development and sophistication.  In other words, it is still an immature concept.  What is this infrastructure?

A robust Big Data infrastructure requires the following:
  • Skilled knowledge workers – quantitative and qualitative
  • A set of business standards succinctly defining Big Data
  • Well-defined data sets of structured and unstructured data
  • Data scrubbing capabilities, in-house or vendor-based (a rough sketch follows this list)
  • Efficacious and repeatable operating standards allowing for industry adoption as opposed to one-off solutions
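As an illustration of the data scrubbing item above, here is what a minimal scrubbing pass might look like.  The records, field names and rules are invented for the example:

```python
import re
from datetime import datetime

# Invented example records: inconsistent case, whitespace, and date formats.
raw_records = [
    {"name": "  Acme Corp ", "signup": "03/28/2015"},
    {"name": "ACME CORP",    "signup": "2015-03-28"},
    {"name": "Widget Inc",   "signup": "12/06/2014"},
]

def parse_date(text):
    """Try a few common date formats; return ISO yyyy-mm-dd or None."""
    for fmt in ("%m/%d/%Y", "%Y-%m-%d", "%d %b %Y"):
        try:
            return datetime.strptime(text.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    return None  # flag for manual review rather than guessing

def scrub(records):
    """Normalize names and dates, then drop exact duplicates."""
    seen, clean = set(), []
    for rec in records:
        name = re.sub(r"\s+", " ", rec["name"]).strip().title()
        key = (name, parse_date(rec["signup"]))
        if key not in seen:
            seen.add(key)
            clean.append({"name": name, "signup": key[1]})
    return clean

print(scrub(raw_records))  # the two Acme variants collapse into one record
```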

This framework is not intended to be exhaustive.  Its purpose is to suggest that Big Data may evolve along a path parallel to that of cloud computing, an industry also in its infancy.

There is a major race to gear up and develop talent for what may become one of the largest growth industries of the 21st century.  At a recent business conference in Boston, real case studies demonstrated both the successes and the obstacles to realizing the potential of Big Data.

Case story #1

A major tax preparer sets out to learn more about its customers’ needs for new product development.

Opportunity: a high velocity/volume business (data rich); high-security IT (demographic data); high contact (good historical data).

Challenges: complex software (multiple versions); multiple SKUs (inconsistent data); high levels of text data (unstructured); data set definition (lack of taxonomy defining key data); recycled results (continuous trial-and-error cycles)

Outcome: long-cycle project; steep learning curve; continuous restarts

Case story #2

Web-based start-up for mothers focused on child development.

Opportunity: multiple data collectors (suite of apps used to collect a variety of data); baby social network (user-generated data); adaptive learning (behavioral patterns discernible)

Challenges: lack of real-time processing of data (suboptimal feedback); missing data (gaps in clean data); lack of end-to-end clarity (cause and effect of change); big data projects costly and time-consuming (start small); lack of specialists to code scrubbing scripts (business acumen)

Outcome: costs exceeded budget; redundant processes; lack of appropriate skills to complete the project

These case stories represent a small sample of the not-so-successful implementations of Big Data, and small samples should never be used to predict outcomes.  They do, however, provide useful and sobering information that should be weighed alongside the benefits of Big Data.

 Here are a few additional observations:

  • The cost of storing Big Data is large
  • What is the net present value of Big Data? ROI may be hard to quantify
  • The tools for developers to process Big Data effectively are still very immature
  • Redundancy of effort is a problem, but may be unavoidable given immature processes
  • The gap between technical expertise, which exists, and a well-defined business vision for Big Data has yet to be bridged
  • Specialist skills such as bioinformatics are in short supply
  • Understanding which data are right for a specific business problem is hard
  • Decide early on whether the right data exists to solve a business problem
  • Start small
  • Organize around small data upfront to ensure that Big Data produces reliable outcomes
  • The legal and regulatory environment may not keep up with technical product cycles – limits on trademark and intellectual property will be challenged

Looking backwards from the future, these observations may turn out to be mere speed bumps on the road to Big Data.  New industries, as yet unimagined, may follow, but much work is needed to build a sustainable framework in support of Big Data.  Failures in Big Data warn us not to become complacent.

The art and science of Big Data, whether transformational or not, is here to stay as a tool for converting data into information.  How we use and build the tools of Big Data will ultimately depend on the infrastructure that supports these efforts.


2013-05-12 by: James Bone Categories: Risk Management Algorithmic staff recruiting


While you may not know the term “work-force science,” your next job may be determined by Big Data.  A small but growing trend has recruiters and Silicon Valley start-ups finding top talent using analytics based on publicly available data.

The search for top talent using data and social media has been revolutionized by sites such as LinkedIn, and the bar is being raised by new start-ups seeking to cash in.  Luca Bonmassar is the founder of Gild, a new entrant in the talent search industry that uses proprietary analytics to find highly sought-after computer programmers.

Bonmassar and others in this field are turning the traditional metrics of recruiting on their head by developing algorithms to determine how well someone will perform on the job.

The traditional markers of top talent, such as the college you attended, referrals from colleagues or your past career path, may become less relevant, at least for high-tech talent.  Gild searches for other clues to job performance by scouring the web and social media for test scores, relevance in the blogosphere, and other soft skills that may not be apparent in a resume.
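Gild’s actual model is proprietary and undisclosed; purely to show the shape of the idea, a scorer of this kind might combine weighted public signals.  Every feature name and weight below is invented:

```python
# Illustrative only: Gild's real model is proprietary. The features and
# weights below are invented to show the shape of such a scorer.
WEIGHTS = {
    "code_quality":   0.40,  # e.g., derived from public code repositories
    "community_rank": 0.25,  # e.g., standing on technical Q&A sites
    "blog_relevance": 0.20,  # e.g., topical writing in the field
    "test_scores":    0.15,  # e.g., results on open coding challenges
}

def talent_score(signals):
    """Weighted sum of normalized (0-1) public signals; missing signals count as 0."""
    return sum(weight * signals.get(name, 0.0) for name, weight in WEIGHTS.items())

candidate = {"code_quality": 0.9, "community_rank": 0.7,
             "blog_relevance": 0.4, "test_scores": 0.8}
print(f"composite score: {talent_score(candidate):.2f}")  # weighted composite in [0, 1]
```

The hard part, as the skeptics quoted below point out, is not computing such a score but proving it predicts actual job performance.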

Gild is not alone.  According to New York Times reporter Matt Richtel, competitors “such as TalentBin, RemarkableHire, and Entelo” all perform their own version of data analytics to uncover hard-to-find top talent for firms.

Not everyone is convinced that Big Data is a huge improvement over the current process performed by human resources.  “The big hole is actual outcomes,” said Susan Etlinger, a data analyst with Altimeter Group. “What I’m not buying yet is that probability equals actuality.”  However, Etlinger concedes, “it’s worth a try.”

The potential and risks associated with Big Data will no doubt have a profound impact on our lives in ways not yet contemplated.  The concept of privacy is evolving, along with the inherent risks and opportunities of sharing data to open new possibilities.

But don’t be surprised if, in the near future, the email you receive about a new career opportunity comes from a computer program instead of a human.

Original article written by Matt Richtel / The New York Times
Read more: 
http://www.post-gazette.com/stories/business/technology/how-big-data-is-playing-recruiter-for-specialized-workers-685275/#ixzz2Ryh4Xqay


2013-05-08 by: James Bone Categories: Risk Management Decoding Hit Movies with Data

Hollywood is quietly using data analysis to decode the secret formula behind hit movies.  Data analytics has been used in medical research, Wall Street financial reports and a host of other industries; now movie scripts are being mined for the “hits” and “misses” of Hollywood.

“This is my worst nightmare,” said Ol Parker, a writer whose film credits include “The Best Exotic Marigold Hotel.” But Hollywood executives are plowing ahead with data analysis to mitigate the risk of a box office flop.

Vinny Bruzzese, a chain-smoking professor who has taught statistics at the State University of New York at Stony Brook on Long Island, claims to be a distant relative of Albert Einstein.  Bruzzese is but one of a cadre of analytical scientists and students who may well form an entire industry preparing to find gold in the data haystack.

Hollywood, long the bastion of creative talent, has come to respect and fear Dr. Bruzzese’s success.  The New York Times article by Brooks Barnes notes that movie executives are paying as much as $20,000 per script to compare a script and its genre with those of recently released box office hits.

Bruzzese sees a market on Broadway and with TV producers as well, which suggests that data analytics will become a critical risk mitigation tool across a variety of entertainment industries.

“Solving Equation of a Hit Film Script, With Data,” by Brooks Barnes

http://www.nytimes.com/2013/05/06/business/media/solving-equation-of-a-hit-film-script-with-data.html?hp&_r=0

2013-04-20 by: James Bone Categories: Risk Management Beyond GRC


A bold new experiment is taking place across a number of federal agencies to identify and address systemic risk before the next financial collapse occurs.  You may be familiar with the Securities and Exchange Commission’s Division of Risk, Strategy, and Financial Innovation.

Over the last three years, the S.E.C. has revamped this office into a “think tank” with a multidisciplinary team of professionals from a variety of academic disciplines.  This is not your father’s SEC; the team is made up of 35 PhD financial economists, financial engineers, programmers, MBAs and other experts.

Likewise, the Treasury Department has set up a new Office of Financial Research, created under the Dodd-Frank Act of 2010 to support the Financial Stability Oversight Council – the group responsible for coordinating the efforts of the top financial regulators.

Richard Berner, the newly appointed head of the OFR, is tasked with finding threats to financial markets BEFORE they occur.  Berner, a trained economist, has some experience looking around corners: as chief economist for Morgan Stanley, he and a colleague revised their forecast of economic growth in 2007 to predict the coming recession before many on Wall Street saw the signs of economic trouble.

There is an arms race in data analytics unfolding among economists and researchers to create tools to recognize and, hopefully, avoid the next crisis.  Berner is leading this charge and is now building a new forecasting model with the help of academics and financial engineers.  Many market watchers give Berner kudos for these efforts; however, some question whether a financial model is capable of capturing the complexity of global financial markets.

Berner faces the same challenge as the providers of Big Data solutions.  How do you standardize all sorts of records into a common data set that everyone agrees on, so that the numbers are comparable?  There is no common taxonomy for data across different firms!
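The article does not say how the OFR tackles this, but the mechanics of the problem are easy to sketch: each firm’s field names must be mapped onto one agreed taxonomy before the numbers become comparable.  All field names below are invented:

```python
# Illustration of the standardization problem (all field names invented):
# firms report the same economics under different labels, so records must
# be mapped onto one agreed taxonomy before numbers are comparable.
COMMON_TAXONOMY = {
    # firm-specific field name -> common field name
    "ntnl_amt":       "notional_usd",
    "notional_value": "notional_usd",
    "cpty":           "counterparty_id",
    "counter_party":  "counterparty_id",
}

def to_common(record):
    """Rename known fields; surface unknown ones for later gap analysis."""
    out = {}
    for field, value in record.items():
        out[COMMON_TAXONOMY.get(field, f"UNMAPPED::{field}")] = value
    return out

firm_a = {"ntnl_amt": 10_000_000, "cpty": "C-017"}
firm_b = {"notional_value": 8_500_000, "counter_party": "C-017", "desk": "rates"}
print(to_common(firm_a))
print(to_common(firm_b))  # "desk" surfaces as UNMAPPED:: for the gap analysis
```

Agreeing on the mapping table is the hard, political part; the code is trivial once the taxonomy exists.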

The Office of Financial Research may not be able to see the future and avoid all risk events to financial markets but it does mark a new era in how risk management will be conducted going forward. 

What role does GRC play in a world dominated by predictive analytics?  What new skills will risk practitioners need in the future?  Berner didn’t see or understand the systemic risks inherent in a correlated global market and missed how risks in US markets might impact our European counterparts overseas.  “There are still pretty big gaps in our knowledge,” Berner said in his interview for the article.

What is becoming clear is that, regardless of your business, the expectation to understand data and develop a governance model for it is growing.  Attempting to tackle this effort alone in isolated silos would be self-defeating.  The best course of action is to begin to socialize the need for data management with key stakeholders in your firm.  Agreeing on a common set of definitions and a taxonomy helps create a framework for defining important data and understanding where the gaps exist.

Resist the temptation to discuss risks at this stage of discovery.  Trust the process to reveal new information and potential risks as you learn more about how data is used and managed across your firm.  Rushing to define risks may predetermine outcomes and prevent you from discovering gaps you would not have anticipated beforehand.

You may not be able to “see around corners” when you complete this exercise, but you may begin to ask new questions and better understand the data bottlenecks that prevent you from achieving higher levels of performance.  Early success is the key to how far you decide to push the envelope in your data analysis.

Regulators are building a formidable store of information on organizations, one that will grow and become more sophisticated.  Risk professionals should be prepared with an equally robust set of data to demonstrate that they are building the same level of proficiency in understanding their own business.

http://www.garp.org/risk-news-and-resources/risk-headlines/story.aspx?newsid=60935

Original story written by Jim Tankersley (jim.tankersley@washpost.com)

2013-04-15 by: James Bone Categories: Risk Management How to Implement and Align Technology within Your GRC Framework

by James Bone, Executive Director TheGRCBlueBook

GRC Summit panel

Panelists:

GRC Summit – Michael Rasmussen (GRC 20/20), Norman Marks (SAP), Lance J. Freedman (Lockheed Martin Corporation)

Norman Marks’s introduction of the Day Two keynote speaker, Michael Rasmussen, demonstrated the divergent views evolving in GRC.  Norman set up the introduction with an overview of his State of the Industry address.  Marks’s view is informed by developments in predictive analytics and the promise of big data.

“GRC stands for Governance, Risk and Confusion”, half joked Marks.  “The GRC solution remains elusive as does agreement on definitions and a common taxonomy for implementing an effective framework.”  So how does one align GRC with technology? 

According to Marks, “there is no informed approach that has proved effective in deciding how to purchase a GRC solution.”  The available analyst reports from leading consulting firms were deemed insufficient to give prospective users the tools needed to make an informed choice among risk solutions.  “[Analysts’] reports are based on a generic set of business outcomes intended to address the preconceived needs of risk managers,” according to Marks.  Even Michael Rasmussen admits that risk managers need more than three client references from GRC vendors.  “Do you expect to receive a bad reference from a GRC vendor?” questioned Michael.

Rasmussen has broadened his view of GRC beyond a strict definition of the features embedded in the platform to now include a focus on GRC architecture.  In Michael’s view, “GRC is about organizing the manual processes, data and accountability to solve for the complexity inherent in today’s business environment”.  

This is what Rasmussen calls “GRC 3.0: Enterprise Architecture.”  Rasmussen has adopted the OCEG Red Book framework as his operating model, which advocates aligning business objectives and performance with GRC.  “Effective enterprise architecture will require half a dozen or more GRC solutions in order to address the full complement of risks outlined in Michael’s framework.”

What both evangelists agree on is that the end solutions must have a positive impact on the performance of business objectives.  One of the best lines came from Norman Marks as he described the cause of the diluted successes of GRC to date: “These random acts of improvement lead to uncoordinated progress,” according to Marks.  “The key is aligning GRC for business value from strategy to operations.”

Each of the panelists provided a comprehensive set of examples of why risk tools are needed to manage increasingly challenging regulatory and business objectives, while leaving the audience with no more clarity on a prescription for moving forward.  The missing piece of the puzzle remains elusive: how does one determine which solution is appropriate given the unique risk challenges each firm faces?

Will there be a convergence of approaches after a critical mass of firms adopts a systemic solution to manual processes and begins to see the benefit of Big Data analytics?  Will predictive analytics make today’s subjective risk assessments irrelevant?  Will a disparate set of solutions be needed, as Rasmussen suggests, once a clear data management program has been implemented with the requisite ability to query data for the business answers one is seeking?

The panelist debate prompted more questions than answers.  What is clear is that a prospective buyer of these tools has very few reliable options for choosing the appropriate risk solution.  Given the number of available GRC solution providers, finding the tool that fits your needs is a daunting task, made harder by a lack of transparency in the market, generic standards for defining GRC implementation, and the absence of professional consultative services, independent of the solution providers, to develop a strategic plan before choosing a solution.

2013-04-05 by: James Bone Categories: Risk Management Science vs the Art of Risk Management

Jesse Eisinger, a reporter at Times partner ProPublica, recently sat down with John Breit, former head of market risk at Merrill Lynch, to discuss whether the human factor is being lost in risk management.  Data analytics and the hype around Big Data have become the central focus for finding value and improving risk management.  Breit, a physicist by training, was part of the early wave of “quants” to leave academia, government and the military to work on Wall Street.

Breit soon became disillusioned with his role when he realized the limits of building financial risk models and found his focus shifting to that of a glorified hall monitor for the trading desk. He now believes that risk managers should “develop what spies call humint — human intelligence from flesh and blood sources. They need to build networks of people who will trust them enough to report when things seem off, before they become spectacular problems. Mr. Breit, who attributes this approach to his mentor, Daniel Napoli, the former head of risk at Merrill Lynch, took people out drinking to get them to open up. He cultivated junior accountants.”

A focus on data alone may be misleading.  Cars are designed with windshields for seeing the road and the risks that jump out in front of you as you drive.  The dashboard serves as an indicator of other variables that may impact the condition of the vehicle such as gas level, driving speed, and outside temperature.

Breit’s lesson is that risk managers have been squeezed into a box.  “Regulators have reduced risk managers to box checkers, making sure they take every measure of risk and report it dutifully on extensive forms. It just consumes more and more staff, turning them into accountants and rotting brains.”

The promise of data analytics is still evolving, and risk managers have an opportunity to lead by creating context for data analysis; however, there is a real possibility that risk management may be further marginalized by a purely data-driven mindset.

Jesse Eisinger is a reporter for ProPublica, an independent, nonprofit newsroom that produces investigative journalism in the public interest. Email: jesse@propublica.org. Twitter: @Eisingerj