Welcome to NBlog, the NoticeBored blog

You don't need eyes to see: you need vision

Jun 20, 2013

Security Metric of the Week #62: security policy management maturity

As with the other ‘maturity metric’ examples given in the book (e.g. those for asset management, physical security, HR, business continuity and compliance), we envisage this metric as a scoring scale using predefined criteria against which the organization's security policy management practices are assessed and rated.

Here's the first of four rows from the example policy maturity metric in Appendix H:

0% (no information security policy management): There is nothing even remotely resembling a security policy as such.

33% (basic information security policy management): There is a security policy of sorts, although probably of poor quality (e.g. badly worded or inconsistent), incomplete and/or out-of-date, with some elements undocumented.

67% (reasonable information security policy management): The information security policy is documented, reasonably complete, accurate and up-to-date, reflecting most corporate and external obligations, albeit somewhat stilted and difficult to read and apply in places, and perhaps with limited coverage of topical issues such as cloud computing.

100% (excellent information security policy management): The information security policy materials are formalized: entirely complete, accurate, up-to-date, consistent and readable, explicitly reflecting a documented set of high-level security principles, fully reflecting all corporate and external obligations, and promoting generally accepted good security practices.

On each row, there are four scoring criteria denoting scores of 0, 33, 67 and 100 points on the percentage scale.  There is also a fifth, implied point: 50% marks the boundary between unacceptable (scores less than 50%) and acceptable (scores greater than 50%).

The scoring criteria are written to give the assessor a good steer for the kinds of information security policy management practices to look out for at each maturity level, yet these are merely examples rather than firm requirements.  For instance, the 33% scoring point on this row clearly refers to the presence of something resembling a 'security policy' (in marked contrast to the 0% point), but calls into question the quality and status of the document (again, distinguishing it from the higher scoring points).  If that is a fairly accurate description of the situation, the assessor can simply award a score of 33% for that row and move on to the next, but he/she has the discretion to award slightly higher or lower scores to reflect the unique way that the organization manages its security policies.  This allows some leeway to acknowledge strengths and weaknesses that may not be shown in the scoring criteria, or that may appear at different points on the scoring scale (e.g. if the security policy is formally documented but the quality of the document is poor, it might merit a score of, say, 40 or 50%).
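To make the mechanics concrete, here is a minimal Python sketch of the row-based scoring just described: each row is scored somewhere on the 0-100% scale (anchored by the 0/33/67/100 criteria but nudged up or down at the assessor's discretion), the row scores are averaged into an overall maturity percentage, and 50% marks the acceptability threshold. The row names and scores below are purely illustrative, not Acme's actual assessment.

```python
# Illustrative sketch of a row-based maturity metric.
# The four predefined scoring criteria anchor each row's scale:
ANCHOR_POINTS = (0, 33, 67, 100)

def overall_maturity(row_scores):
    """Average the per-row scores (each 0-100) into one maturity percentage."""
    if not row_scores:
        raise ValueError("at least one row must be scored")
    for name, score in row_scores.items():
        if not 0 <= score <= 100:
            raise ValueError(f"{name}: score must be 0-100, got {score}")
    return sum(row_scores.values()) / len(row_scores)

def is_acceptable(score):
    """Scores above 50% are acceptable; below, unacceptable."""
    return score > 50

# Hypothetical assessment: the assessor scores four rows, departing from
# the anchor points where the criteria don't quite fit (e.g. 40% for a
# documented but poor-quality policy, as discussed above).
scores = {
    "policy documentation": 40,
    "review and update cycle": 67,
    "policy awareness": 33,
    "compliance checking": 67,
}
total = overall_maturity(scores)
print(f"maturity: {total:.0f}%, acceptable: {is_acceptable(total)}")
# prints "maturity: 52%, acceptable: True"
```

Averaging the rows with equal weight is an assumption here; an organization could just as easily weight rows by importance.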

Although this was not the top-scoring policy metric, it is clear from the metric's PRAGMATIC score that Acme's management were impressed with this one:
P    R    A    G    M    A    T    I    C    Score
90   95   70   80   88   85   90   82   88   85%
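For clarity, the overall PRAGMATIC score works out as the simple mean of the nine criterion ratings above, rounded to the nearest whole percentage point, a quick sketch:

```python
# The nine PRAGMATIC criterion ratings for this metric, as tabled above
# ("A1"/"A2" distinguish the two A criteria, which share a letter):
ratings = {"P": 90, "R": 95, "A1": 70, "G": 80, "M": 88,
           "A2": 85, "T": 90, "I": 82, "C": 88}

# Overall score = simple (unweighted) mean of the nine ratings
score = round(sum(ratings.values()) / len(ratings))
print(f"PRAGMATIC score: {score}%")  # prints "PRAGMATIC score: 85%"
```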

The scoring process and/or the Meaning of the final score may need to be explained when the metric is reported, for instance highlighting particular rows in the table against which the organization scored relatively strongly or weakly to demonstrate how the final score was determined.  Doing so would be an opportunity to address the Actionability issue, since the detailed findings indicate particular things that Acme could be doing to improve its maturity score.

By the way, the very act of drawing up or refining the scoring criteria used in maturity metrics like this is itself a sign of maturity in the organization’s approach to security metrics.  It takes some thought and effort to prepare the criteria, including research into good practices.  Gray-beard IT auditors and information security management professionals have generally experienced a wide variety of good and bad practices in past assignments, and there is plenty more advice in information security standards and methods concerning the kinds of things that the organization ought to be doing.