"Information security is not the easiest of things to manage. The lack of suitable metrics makes it even harder in many organizations. Security management decisions are generally made on the strength of someone’s gut feel (an important but fallible and potentially biased approach), or for external compliance purposes (seldom aligned with the organization’s risk appetite). Metrics are the only way to tell whether best practices are truly good enough, and provide the data to make informed choices, identify improvement opportunities, and drive things in the right direction."
That's the executive summary of a new management paper on security metrics for our Information Security 101 security awareness module, which we are currently revising and updating. The current module was released at the end of 2010 and, despite being a relatively superficial overview of a selection of general-interest information security topics for new hires, it's surprising how much has changed over the past three years. BYOD, cloud computing, ransomware and SIEM, for example, were barely on the radar back then, while the whole Big Brother NSA thing was still under wraps.
That set me thinking about the rate of change of information security. Infosec pros like me often spout off about ours being a 'highly dynamic field'. Are we justified in saying so? On what basis do we assert that? What do we even mean? Is infosec any more or less dynamic than other fields, in fact? The questions keep coming!
Being a self-confessed metrics freak, I can't help wondering whether and how we might actually measure this, ideally in such a way as to be able to compare different areas on a common basis. Let's simplify things down to a comparison of infosec against, say, risk management or perhaps management as a whole. That train of thought suggests the idea of inviting managers and subject matter experts to rate a bunch of activities or concerns on the basis of their perceived changeability or dynamism. A straightforward survey would suffice, asking respondents to rank maybe 5 to 10 areas, perhaps allowing them to add additional areas as they see fit.
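To make the survey idea concrete, here's a minimal sketch of how the responses might be aggregated. Everything here is hypothetical: the field names, the ranking data, and the choice of mean rank as the summary statistic are illustrative assumptions, not part of the original proposal.

```python
# Hypothetical aggregation of survey rankings of perceived "dynamism".
# Each respondent ranks the fields from 1 (most dynamic) downwards.
from statistics import mean

responses = [
    {"infosec": 1, "risk management": 2, "general management": 3},
    {"infosec": 1, "general management": 2, "risk management": 3},
    {"risk management": 1, "infosec": 2, "general management": 3},
]

# Mean rank per field: a lower score means the field is perceived
# as more dynamic, on average, across respondents.
fields = responses[0].keys()
mean_rank = {f: mean(r[f] for r in responses) for f in fields}

for field, score in sorted(mean_rank.items(), key=lambda kv: kv[1]):
    print(f"{field}: mean rank {score:.2f}")
```

A real survey would of course need a larger, less self-selected sample, and something more robust than a simple mean (median rank, or a Borda-style count) if respondents are allowed to add their own areas.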
Meanwhile, an even more pragmatic metric is staring us in the face: two paragraphs ago I mentioned that our Information Security 101 awareness module needs revision after just three years. How does that review period compare to the equivalent awareness/training materials covering things such as HR, compliance, health & safety etc.? You might argue that there are several factors driving the review and update process aside from changes in those fields, and indeed there are, but we could potentially address that issue by surveying numerous organizations, somehow avoiding the self-selection bias by, for instance, polling the readers of a general management website or magazine, or members of groups such as the Institute of Directors. Supplemental survey questions could help us identify and sift out biased responses.
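The review-period comparison is simple enough to sketch in a few lines. The topic names and revision dates below are invented for illustration (only the 2010 infosec module release is mentioned in the post); the metric is just the average gap between successive revisions of each module.

```python
# Hypothetical comparison of revision intervals for awareness/training
# modules across topic areas. All dates are illustrative assumptions.
from datetime import date
from statistics import mean

revisions = {
    "information security": [date(2007, 11, 1), date(2010, 12, 1), date(2013, 11, 1)],
    "health & safety": [date(2005, 6, 1), date(2010, 6, 1), date(2013, 11, 1)],
}

def mean_interval_years(dates):
    """Average gap in years between successive revision dates."""
    gaps = [(b - a).days / 365.25 for a, b in zip(dates, dates[1:])]
    return mean(gaps)

for topic, dates in revisions.items():
    print(f"{topic}: revised roughly every {mean_interval_years(dates):.1f} years")
```

A shorter average interval would be (weak) evidence of a more dynamic field, once you control for the other drivers of the review cycle mentioned above.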
OK, well it's all starting to look a bit difficult and expensive at this point, and things are just as awkward on the other side of the cost-benefit equation. What would we gain by measuring the dynamics of information security? A facile reason would be to put some meat on the bones of the bland assertions by us infosec pros, but that's hardly a valid business driver. A more useful purpose for the metric would be to help drive the strategy, for instance emphasizing the need for more rapid and tactical responses to emerging information security issues. I can imagine a number of governance, strategy and policy decisions in various organizations being guided by the numbers ... or not.
At the end of the day, gut feel, presumptions, assertions and perceptions appear to be sufficient to drive strategy right now, so I'm not entirely convinced there is a clear value case for a metric concerning the rate of change in information security. However, if you are an infosec manager or CISO pondering how to argue your next budget or investment proposal, this rambling piece might just spark a novel approach.