Welcome to NBlog, the NoticeBored blog

Like the finer things in life, quality trumps quantity.

Mar 21, 2014

Avoiding metrics myopia

Anything being measured is, by definition, being observed in a specific, focused manner. Things that are observed so closely tend to be presumed by others to matter, at least to the measurer/observer. Things under the measurement ruler therefore assume a certain significance simply by dint of being closely observed, regardless of any other factors.

We see this 'fascination bias' being actively exploited by cynical promoters, marketers and advertisers on a daily basis through an endless stream of largely banal and unscientific online polls and infographics, their true purpose made all the more obvious by the use of bright, eye-catching primary-color graphics. They are manipulating readers into believing that since something has been measured, it must be important. How many of us take the trouble to think about the quality of the metrics, or about all the other aspects that haven't been measured? Like bunnies in the headlights, we stare at the numbers.

In PRAGMATIC Security Metrics, we outlined 21 observer biases selected from an even longer list compiled by Kahneman, Slovic and Tversky in Judgment Under Uncertainty (1982): what I'm calling 'fascination bias' has some resemblance to what Kahneman et al. described as 'attentional bias', the tendency to neglect relevant data when making judgments of a correlation or association.

Fascination bias raises a genuine concern: we tend to measure things that are relatively easy to measure, and place undue faith in those metrics relative to other factors that are not being measured. Back in 2011, Michal Zalewski said in his blog:
"Using metrics as long-term performance indicators is a very dangerous path: they do not really tell you how secure you are, because we have absolutely no clue how to compute that. Instead, by focusing on hundreds of trivial and often irrelevant data points, they take your eyes off the new and the unknown."
While we don't entirely accept that we 'have no clue how to compute security performance', his point about neglecting other risks and challenges through being inordinately focused on specific metrics is sound. It's only natural that what gets measured gets addressed (though not necessarily improved!). The unfortunate corollary is that what doesn't get measured gets neglected.

The upshot is a subtle obligation on those who choose metrics to find ways to measure all the important matters, even if some of those metrics are expensive, complex, qualitative or otherwise awkward. It's simply not good enough to measure the easy stuff, such as the numbers that assorted security systems constantly pump out 'for free'. It's inappropriate to disregard harder-to-measure issues such as culture, ethics, awareness and trust, just as it is inappropriate to restrict your metrics to IT or cybersecurity rather than information security as a whole.

That's one of the key reasons why we favor the systematic top-down GQM approach: if you start by figuring out the Goals or objectives of security, expand on the obvious Questions that arise, and only then pick out a bunch of potential Metrics to answer those questions, it's much harder to overlook important factors. As to figuring out the goals or objectives, conceptual frameworks for information security such as BMIS and ISO27k, based on fundamental principles, are an obvious way to kick off the thinking process and frame the initial discussion.
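To make the top-down flow concrete, here is a minimal sketch of a GQM hierarchy in Python. The goal, question and metric texts are purely illustrative examples of my own, not prescriptions from any framework; the point is that walking the structure from goals downward exposes anything important that has been left unmeasured.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of a Goal-Question-Metric hierarchy.
# All example strings below are illustrative assumptions, not
# recommendations from PRAGMATIC, BMIS or ISO27k.

@dataclass
class Question:
    text: str
    metrics: List[str] = field(default_factory=list)

@dataclass
class Goal:
    objective: str
    questions: List[Question] = field(default_factory=list)

goal = Goal(
    objective="Maintain stakeholder trust in the handling of personal data",
    questions=[
        Question(
            text="How well do employees understand their privacy obligations?",
            metrics=["Awareness survey score", "Policy acknowledgment rate"],
        ),
        Question(
            text="How quickly are privacy incidents detected and resolved?",
            metrics=["Mean time to detect", "Mean time to resolve"],
        ),
    ],
)

# Walking the tree top-down makes gaps visible: a question with no
# metrics flags something important that is going unmeasured.
for q in goal.questions:
    status = "covered" if q.metrics else "UNMEASURED"
    print(f"{q.text} -> {status}")
```

Starting from the goal rather than from whatever numbers the tooling already emits is what keeps the 'easy stuff' from crowding out the harder-to-measure issues.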