Welcome to the SecAware blog

I spy with my beady eye ...

19 May 2018

NBlog May 19 - PRAGMATIC security metrics

This week, a newcomer to the ISO27k Forum asked about metrics for vulnerability management:  

"[I] Would like to take your view on metrics from great vulnerability management perspective which may have integration with asset, patch, application and risk management databases.  Can you share [your] experience from security and business metrics based on vulnerability management - security metrics intended for technical management and business metrics for Board?"

The first respondent offered a stack-dump of possible metrics in three groups:
Security Metrics for Technical Management:
  1. Total number of Critical, High, Medium and Low vulnerabilities found on each asset.
  2. Vulnerabilities repeated from the previous assessment.
  3. Total number of false-positive vulnerabilities; this is essential for evaluating the effectiveness of your vulnerability management solution.
Whenever a technical change or new launch happens you can present a report to management, because technical management should be aware of the potential risk.
Business Metrics for the Board:
  1. Top 10, 20 or 25 assets and vulnerabilities.
  2. When new vulnerabilities are identified by your CISO or security team, how long did the team take to find them in your infrastructure? How long did they take to patch them?
Over time, you can show trend reports on the effectiveness of your vulnerability management process. This is one of the metrics Board-level management are most interested in: when a ransomware attack occurred, I presented this metric and it proved very helpful to senior management. As a security team, you can also show the ROI of the vulnerability management solution: had the solution not been available, we might have suffered downtime or loss of data costing $$$$, which the solution prevented.
Ad-hoc metric:
There is one metric common to both categories: certain vulnerabilities will become potential risks to the organization. You need to run a risk management exercise against those vulnerabilities so they can be highlighted to management. Example: legacy applications (such as XP, 16-bit applications or end-of-life applications) or insecure protocols used because of business requirements or technical limitations; most of the time the business will not understand the risk involved. So consider feeding vulnerability management findings into your risk assessment, show management the residual risk after taking existing controls into account, and ask them to treat the risk. This is one proven way to meet your security needs.

I make that at least seven metrics so far. The second respondent suggested four more:
  1. Potential consequences (including financials) when shit becomes real because of the vulnerabilities that are left open;
  2. Number of open, closed and in-progress vulnerabilities;
  3. Time to address high, medium and low vulnerabilities;
  4. Number of vulnerabilities that are assessed by the risk management process.
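As a rough sketch of how a couple of those counts might be derived in practice, here is a minimal Python fragment working over a hypothetical vulnerability-tracking export (the record fields, dates and severities are all invented for illustration):

```python
from datetime import date

# Hypothetical vulnerability records: (severity, status, opened, closed)
records = [
    ("High",   "closed",      date(2018, 4, 1),  date(2018, 4, 11)),
    ("High",   "open",        date(2018, 4, 20), None),
    ("Medium", "in-progress", date(2018, 4, 25), None),
    ("Low",    "closed",      date(2018, 3, 15), date(2018, 5, 1)),
]

# Metric 2: number of open, closed and in-progress vulnerabilities
by_status = {}
for severity, status, opened, closed in records:
    by_status[status] = by_status.get(status, 0) + 1

# Metric 3: mean time (in days) to address closed vulnerabilities, by severity
days_by_severity = {}
for severity, status, opened, closed in records:
    if closed is not None:
        days_by_severity.setdefault(severity, []).append((closed - opened).days)
mean_days = {sev: sum(d) / len(d) for sev, d in days_by_severity.items()}

print(by_status)   # {'closed': 2, 'open': 1, 'in-progress': 1}
print(mean_days)   # {'High': 10.0, 'Low': 47.0}
```

The raw counts are easy; the harder part in real life is agreeing what "closed" actually means and keeping the tracking data clean enough to trust.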

I then waded in with yet another: “Proportion of harmful incidents that resulted from novel vulnerabilities” where 'novel' means something like ‘previously unrecognized or unknown and hence not specifically treated’.
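Computing that metric is trivial once incidents have been classified, which is where the real analytical effort lies. A minimal sketch, assuming a hypothetical incident log in which each harmful incident has been tagged as novel or not:

```python
# Hypothetical incident log: True = the exploited vulnerability was novel
incidents_novel = [True, False, False, True, False, False, False, False]

proportion = sum(incidents_novel) / len(incidents_novel)
print(f"{proportion:.0%} of harmful incidents arose from novel vulnerabilities")
# prints "25% of harmful incidents arose from novel vulnerabilities"
```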

I imagine the newcomer was feeling a bit overwhelmed with a dozen metrics on the table already and little in the way of explanation or justification about why he might want to use any of them ... so I decided to "help" by using the PRAGMATIC method to examine the metric I had just proposed:
  • 80% Predictability – strongly indicative of the information security status going forward, since it covers both identifying and resolving vulnerabilities;
  • 80% Relevance – highly relevant to information risk and security management, particularly the information risk identification element of information risk management;
  • 65% Actionable – there may be ways to improve the identification of risks but the metric alone doesn’t indicate how, just ‘room for improvement’;
  • 70% Genuine – there may be some discussion/dispute over whether incidents were both harmful and novel, in which case the criteria might be clarified/specified;
  • 80% Meaningful – not hard to understand, some nuances might usefully be explained (e.g. we can also learn something useful from the proportion or number of incidents resulting from vulnerabilities that were not novel); 
  • 80% Accurate – little doubt over the numbers, especially if they are fully specified;
  • 90% Timely – although the metric uses historic data, those data can be obtained and analyzed rapidly following incidents with little delay;
  • 80% Independent – the metric is based on factual data that are readily obtained and verified;
  • 80% Cost-effective – a relatively cheap-to-generate metric with a lot of business value, I believe.
A simple unweighted mean gives an impressive PRAGMATIC score of 78%, making this a strong candidate for inclusion in the organization's suite of information risk and security metrics.

Don’t worry too much about the scores I have given: they are clearly subjective and make a bunch of assumptions, particularly about the organizational context and maturity in this area. I leave it as 'an exercise for the reader' to score all 12 metrics on a comparable basis - your homework this weekend maybe?

The PRAGMATIC method is a rational, systematic basis for considering, assessing and comparing possible metrics. Aside from helping us decide between the 12, the analytical process generates deeper insight into the measurement objectives, extending the brief request originally posted. 

It's a creative process: you can probably think up variants or derivatives of the 12 metrics, including combinations and perhaps some totally different approaches. 

The method is also useful for refining metrics, both before and after they are implemented. The lowest PRAGMATIC scores are obvious candidates for improvement. How might the suggested metric be modified to increase, say, its Actionability? And would the modified metric score differently on the other PRAGMATIC criteria?

Finally, I'll point out that I'm focusing here on the inherent qualities, the strengths and weaknesses of the metrics. There are lots of possibilities concerning their generation, presentation and use which further complicate the matter. And as if that's not enough already, the security metrics must align with the organization's other metrics, enabling and supporting various governance and management activities. I'm talking about a small part of a complex management system.
