ISACA Journal Author Blog

Dispelling Concerns Regarding Quantitative Analysis

Jack Jones, CISA, CRISC, CISM, CISSP
| Published: 1/23/2017 3:18 PM | Category: Risk Management

In my recent Journal article, I stated that our profession needs to adopt quantitative methods of risk analysis to enable well-informed executive stakeholder decisions. Common reactions to this notion include:

  • Quantitative risk measurement is too time-consuming.
  • There are not enough data to support quantitative analysis.

I will be the first to admit that quantitative analysis will always take more time than sticking a wet finger in the air and proclaiming high risk. Then again, you get what you pay for. In my own experience working with numerous organizations, I have found that between 70% and 90% of high-risk issues in risk registers and top 10 lists do not, in fact, represent high risk. So the question becomes, how much value is there in effectively prioritizing and understanding the cost-benefit of risk management investments? 

My experience as a three-time chief information security officer (CISO) who has been using quantitative methods for a decade is that the time spent in “getting risk right” is well worth the effort. The executives I have served have shared that opinion. In addition, performing quantitative risk analysis with proven methods, e.g., Monte Carlo simulation, program evaluation and review technique (PERT) distributions, and calibrated estimation, is much less time-consuming than most people imagine.
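
For readers who have not seen these methods in action, here is a minimal sketch of the idea in Python. The minimum/most-likely/maximum figures are hypothetical calibrated estimates chosen only for illustration (none of them come from the article): event frequency and loss magnitude are each drawn from a beta-PERT distribution, and Monte Carlo simulation combines them into an annualized loss picture.

```python
# Minimal Monte Carlo sketch using beta-PERT distributions.
# All min/most-likely/max values below are hypothetical calibrated
# estimates used purely for illustration.
import numpy as np

rng = np.random.default_rng(seed=42)

def pert_sample(low, mode, high, size, lam=4.0):
    """Draw samples from a beta-PERT distribution defined by a
    minimum, most-likely and maximum estimate."""
    alpha = 1 + lam * (mode - low) / (high - low)
    beta = 1 + lam * (high - mode) / (high - low)
    return low + (high - low) * rng.beta(alpha, beta, size)

N = 100_000  # number of simulated years

# Hypothetical calibrated estimates: how often the loss event occurs
# per year, and how much each occurrence costs.
frequency = pert_sample(0.1, 0.5, 2.0, N)
magnitude = pert_sample(50_000, 250_000, 2_000_000, N)

annual_loss = frequency * magnitude  # simple annualized loss exposure

print(f"Median annual loss:   ${np.percentile(annual_loss, 50):,.0f}")
print(f"90th percentile loss: ${np.percentile(annual_loss, 90):,.0f}")
print(f"Mean annual loss:     ${annual_loss.mean():,.0f}")
```

The output is a loss exceedance view (median, 90th percentile, mean) rather than a single high/medium/low label, which is what makes prioritization and cost-benefit comparison possible.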

But what about the concern regarding data? Here, again, there is legitimacy to the concern, but it is tainted with misunderstanding. A common belief is that to do quantitative risk analysis, you must have a statistically significant volume of data regarding both likelihood and impact. Although that would be ideal, the methods I mentioned above are specifically designed to enable quantitative analysis with uncertain data. For example, if you are trying to estimate the likelihood of a specific attack, but you lack significant historical data and/or the threat landscape is in a state of flux, then you express your likelihood estimate as a wider, flatter distribution than you would if you had high-quality data. This faithfully represents the lower confidence in your data, which can be a crucial data point in and of itself to decision makers. 
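
As a rough illustration of that point, the sketch below (again with purely illustrative numbers) expresses the same most-likely likelihood estimate twice: once as a wide, flat beta-PERT range reflecting sparse data and a shifting threat landscape, and once as a narrow range reflecting high-quality data, then compares the resulting intervals.

```python
# Illustrative only: the same most-likely likelihood expressed with
# low confidence (wide range) versus high confidence (narrow range).
import numpy as np

rng = np.random.default_rng(seed=7)
N = 100_000

def pert(low, mode, high):
    """Beta-PERT samples from a minimum / most-likely / maximum estimate."""
    a = 1 + 4 * (mode - low) / (high - low)
    b = 1 + 4 * (high - mode) / (high - low)
    return low + (high - low) * rng.beta(a, b, N)

# Sparse history, shifting threat landscape: a wide, flat range.
low_conf = pert(0.05, 0.30, 0.80)
# Solid historical data: a narrow range around the same most-likely value.
high_conf = pert(0.25, 0.30, 0.40)

for label, samples in (("low confidence", low_conf), ("high confidence", high_conf)):
    p10, p90 = np.percentile(samples, [10, 90])
    print(f"{label}: 80% interval for annual likelihood = {p10:.2f} to {p90:.2f}")
```

The wider interval does not hide the uncertainty; it communicates it explicitly, so decision makers can see how much confidence sits behind the estimate.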

The bottom line is that the means of doing high-quality quantitative risk analysis exist and are being applied successfully and pragmatically. If we hope to evolve as professionals, we need to at least be aware of these methods so that we can leverage them when appropriate.

Read Jack Jones’ recent Journal article:
“Evolving Cyberrisk Practices to Meet Board-level Reporting Needs,” ISACA Journal, volume 1, 2017.

Comments

Re: Dispelling Concerns Regarding Quantitative Analysis

A wake-up call pointing out the importance of quantitative analysis with uncertain data to support qualitative judgement and assessment.
Antonius Ruslan at 1/24/2017 7:19 PM