“Made with the highest attention to the wrong detail”
It set me thinking about the waste of effort that goes into oh-so-carefully measuring and reporting the wrong things, for reasons that include:
- Failing to determine the information actually required, or making mistaken assumptions about what is really being asked (and, perhaps, reporting to the wrong audience)
- Naivety and a lack of understanding of metrics, measurement, decision-making and/or statistics in general
- Using certain measures simply because the base numbers and/or the charts, tables and reports are readily available (so they must be useful, right?)
- Presuming the need for a high level of accuracy and precision when in fact rough-and-ready indicators would be perfectly adequate (and cheaper) (and quicker)
- Analytical errors, e.g. believing that the measured item is well correlated with or predictive of something of interest when in fact it isn't
- Brain-in-neutral: no discernible thought patterns or reasoning behind the choice of metrics, at least nothing concrete that we can recall if pressed
- Falsely equating voluminous data with information or knowledge (knowing everything about nothing)
- Adopting metrics used, recommended, mentioned or discussed by others in conversations, articles, standards, metrics catalogs, websites or blogs (yes, including this one!) without considering one's own information needs and the differing contexts
- Giving the appearance of Being On Top Of Things ("The value is 27.435 ... but ... OK I'm not entirely sure what that means")
- Generating chaff, deliberately distracting the audience from genuine issues by bombarding them with spurious data, theories, arguments and presumptions
- Being "too clever by half" - an obsessive-compulsive fascination with particularly complex or obscure yet strangely intriguing metrics that carry a whiff of magic
- Being required to spout nonsense either by some authority who perhaps doesn't understand the issues but wants to be seen to be Doing Something, or in accordance with a poorly-considered contract, Service Level Agreement or reporting system
- Continuing to use (= failing to challenge and revise/withdraw) old metrics long after they should have been dispatched to a rest home for the sadly bewildered
- Re-using metrics that have proven worthwhile elsewhere/in other contexts on the mistaken assumption that they are 'universal'
- Filling spaces on a fancy dashboard or management report with "pretty" metrics (eye-candy), triumphs of form over substance
- Desperation stemming from an apparent lack of alternatives due to limited capabilities and skills, a lack of imagination/creativity and/or no will or opportunity to design or select more suitable metrics
That's some list! If I'm honest, I am personally guilty of some of them. I've been there, done that, and still I see others treading the same route. I think I'm wiser now, but only time will tell if the effort required to think more deeply about metrics and write the book has led to a breakthrough, or whether (or, more accurately, in which respects) I am still deluding myself.
Think about this issue the next time you find yourself poring over a management report or survey, trying to make sense of the content. Even more so if you are skimming and discarding metrics without making any real effort to understand them. Ask yourself whether you actually need to know whatever it is you are being told, and why it concerns you. In GQM (Goal-Question-Metric) parlance, consider the Goals and the Questions behind the Metrics on the table. Understand also that there may be hidden agendas, false assumptions and plain errors in the numbers. Consider the methods used to gather, analyze and present the metrics, including their mathematical/scientific validity (e.g. was the population sampled randomly or selectively? Is the observed difference significant relative to the variance? Are there uncontrolled factors?). Be dubious, cynical even, if it makes you contemplate the true meaning and worth of the information presented.
For therein hides genuine insight.
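To make the "is the variance significant?" question concrete, here is a toy sanity check of the sort suggested above. The numbers are invented for illustration (they are not from any real report): a headline average appears to improve month on month, but a quick Welch's t statistic, computed with nothing more than the Python standard library, suggests the change may be indistinguishable from noise.

```python
# Hypothetical example: a report claims the average resolution time
# "improved" from 27.4 to 25.1 hours. Before celebrating, check whether
# that difference is plausibly real given the spread in the samples.
# All figures below are invented for the sketch.
import math
import statistics

last_month = [22, 31, 28, 35, 26, 30, 24, 29, 33, 16]
this_month = [21, 27, 25, 30, 22, 28, 23, 26, 29, 20]

def welch_t(a, b):
    """Welch's t statistic for two independent samples of possibly
    unequal variance: (mean_a - mean_b) / standard error."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    std_err = math.sqrt(var_a / len(a) + var_b / len(b))
    return (mean_a - mean_b) / std_err

t = welch_t(last_month, this_month)
# A |t| well below ~2 suggests the "improvement" could easily be
# sampling noise rather than evidence of a genuine change.
print(f"t = {t:.2f}")  # prints "t = 1.10"
```

Nothing about this replaces a proper analysis, of course; the point is that even a two-minute check like this is often enough to puncture a confidently presented but meaningless number.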