Welcome to the SecAware blog

I spy with my beady eye ...

25 Nov 2013

SMotW #81: control count

Security Metric of the Week #81: number of different information security controls

We're not entirely sure why anyone would feel the need to count their security controls, unless perhaps they suspect there are either too many or too few, raising the question "How many controls should we have?". Nevertheless, somebody proposed this as an information security metric, and ACME's managers explored, discussed and scored it through the PRAGMATIC process:


They felt that counting security controls would be tedious, error-prone and laborious, hence the metric's depressed ratings for Timeliness, Accuracy and Cost-effectiveness. The 88% rating for Meaningfulness suggests that they believed this metric would provide useful information, provided the following issues were addressed.

The word "different" in the full title of the metric could be misleading: different in what sense? Does it mean separate, as in counting the antivirus installation on each IT system as a different control, or does it indicate different kinds or types of control? If the latter, how different do two controls need to be to count separately? Failing to define the metric would probably lead to inconsistencies, particularly if various people were involved in counting controls.

ACME would also need to be careful about what does or doesn't constitute an 'information security control'. For instance, the door locks on an office, a media storeroom, a toilet and a janitor's closet have quite different implications for protecting ACME's information assets: do any of them qualify as 'information security controls'? Do they all count?

That said, the metric could prove a useful way to manage the overall suite of security controls if the issues were bottomed-out. 'Getting a handle on things' through metrics means not just measuring stuff, but using the numbers both as a means to determine what adjustments to make and to determine that the adjustments do in fact lead to the anticipated changes in the numbers, thus supporting the implied cause-effect linkages.

The graph above illustrates a more sophisticated version of the metric that distinguishes preventive, detective and corrective controls, showing baseline and custom control counts for each type. This is just one of many ways the numbers might potentially be counted, analyzed and presented. If you are thinking seriously about this metric, you might also like to consider variants that distinguish:
  • Confidentiality, integrity and availability controls;
  • Free, cheap, mid-price and expensive controls;
  • Controls that have been fully, partially or not yet implemented (established, new or proposed controls);
  • Basic, intermediate and advanced controls;
  • Old fashioned/traditional and novel/cutting-edge controls;
  • Control counts within different departments, operating units, countries, businesses etc.;
  • Fail-safe/fail-closed versus fail-unsafe/fail-open controls; 
  • Automated, manual and physical controls;
  • Controls required for compliance with externally-imposed obligations versus those required for internal business reasons;
  • Counts versus proportions or percentages;
  • Trends or timelines versus snapshots;
  • Other parameters (what do you have in mind?  What matters most to your organization?).
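However you slice the counts, consistency matters more than sophistication. As a minimal sketch (the control register entries and category names here are invented for illustration), a few lines of Python can tally controls by whatever taxonomy you settle on:

```python
from collections import Counter

# Hypothetical control register: (control name, category) pairs.
# The preventive/detective/corrective split is just one of the
# breakdowns suggested above - substitute whichever you adopt.
control_register = [
    ("Antivirus on servers",         "preventive"),
    ("Door lock on media storeroom", "preventive"),
    ("SIEM alerting",                "detective"),
    ("Monthly log review",           "detective"),
    ("Nightly backups",              "corrective"),
]

counts = Counter(category for _, category in control_register)
for category in ("preventive", "detective", "corrective"):
    print(f"{category}: {counts[category]}")
```

The point is that a written-down register plus an agreed taxonomy removes most of the counting inconsistencies discussed above.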

22 Nov 2013

Roughly right trumps precisely wrong

Inspired by a rant against information overload, I looked up Sturgeon's Law, which might be paraphrased as "90% of everything is crap". That in turn got me thinking about the Pareto principle (a.k.a. the 80/20 rule: 80% of the effects relate to 20% of the causes). The numbers in both statements are arbitrary and indicative, not literal. The 80% or 90% values are meant to convey "a large proportion" and bear no special significance beyond that. Adding the phrase "of the order of" would not materially affect either statement.

I'm also reminded that (according to comedian Steven Wright) "42.7% of statistics are made up on the spot", while "lies, damned lies, and statistics", famously attributed to Benjamin Disraeli, reminds us that numbers can be used to mislead as much as to inform.

So how does this relate to PRAGMATIC security metrics?

It is especially pertinent to the Accuracy and Meaningfulness criteria.

Most metrics can be made more Accurate by taking a greater number of measurements and/or being more careful and precise in the measurement process. The number of readings is statistically relevant when we are sampling from a population: the more samples we take, the more accurately we can estimate the total population. Measurement precision depends on factors such as the quality of the measuring instruments and the care we take to determine and record each value. Taking repeated measurements on the same sample is another way to increase the Accuracy. However, that extra Accuracy comes at a Cost in terms of the time, effort and resources consumed.
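To put a number on the sampling point: the standard error of a sample mean shrinks with the square root of the sample size, so quadrupling the measurement effort only halves the uncertainty. A quick illustrative simulation (the population and numbers are invented, not from the post):

```python
import random
import statistics

random.seed(42)

# Pretend population of 100,000 measurements (mean 100, sd 15).
population = [random.gauss(100, 15) for _ in range(100_000)]
true_mean = statistics.mean(population)

mae = {}  # mean absolute estimation error for each sample size
for n in (10, 100, 1000):
    # Average the error over 200 repeated samples of size n
    errors = [abs(statistics.mean(random.sample(population, n)) - true_mean)
              for _ in range(200)]
    mae[n] = statistics.mean(errors)
    print(f"n={n:4d}  typical error in estimated mean: {mae[n]:.2f}")
```

Each tenfold increase in sample size buys only a roughly threefold reduction in error - extra Accuracy at rising Cost, exactly the trade-off described above.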

Greater Accuracy may increase the validity and precision of the metric, but is that valuable or necessary?

Regarding Meaningfulness, the very fact that we have special terms for rules of thumb implies that, despite their imprecision, they are valuable. Rough approximations give us a useful starting point, a default frame of reference and a set of assumptions that are in the right ballpark and close enough for government work.

A long long time ago, I dimly recall being taught arithmetic at school, back in those cold dark days before electronic calculators and computers came to dominate our lives, when digital calculations meant using all ten fingers. We learnt our 'times tables' by rote. We were shown how to do addition, subtraction and division with pencil and paper (yes, remember those?!). We looked up logarithms and sines in printed tables. When calculators came along, difficult calculations became even easier and quicker, but so too did simple errors, hence we were taught to estimate the correct answer before calculating it, using that to spot gross errors. It's a skill I still use to this day, because being 'roughly right' often trumps being 'precisely wrong'. To put that another way, there are risks associated with unnecessary precision. At best, being highly accurate is often - though not always - a waste of time and effort. Paradoxically, a conscious decision to use the rounding function or reduce the number of significant figures displayed in a column of numbers can increase the utility and value of a spreadsheet by reducing unnecessary distractions. Implicitly knowing roughly how much change to expect when buying a newspaper with a $20 note has literally saved me money.
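Deliberately limiting precision is trivial to automate, too. The helper below is a common recipe (my own sketch, not from the original post) for rounding a column of figures to a chosen number of significant figures:

```python
import math

def sig_round(x: float, figures: int = 2) -> float:
    """Round x to the given number of significant figures."""
    if x == 0:
        return 0.0
    # Position of the leading digit decides how many places to keep.
    return round(x, figures - 1 - int(math.floor(math.log10(abs(x)))))

raw = [10483.7, 9917.2, 1042.6, 87.31]
print([sig_round(v) for v in raw])  # -> [10000.0, 9900.0, 1000.0, 87.0]
```

Two significant figures is usually plenty for a management report; the digits you drop were mostly noise anyway.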

The broader point concerns information in general, not just numbers or security metrics of course. A brief executive summary of an article gives us just enough of a clue to decide whether to invest our valuable time in reading the entire thing. A precis or extract is meant to portray the flavor of the piece, not condense its entirety. 

So, to sum up this ramble, don't dismiss imprecise, inaccurate, rough measures out of hand. As the name suggests, "indicators" are there to indicate things, not to define them to the Nth degree. A metric that delivers a rough-and-ready indication of the organization's security status, for instance, that is cheap and easy enough to be updated every month or so, is probably more use to management than the annual IT audit that sucks in resources like a black hole, and reports things that are history by the time they appear in the oh so nicely bound audit report.

20 Nov 2013

PCI, meet Security Awareness

While current and previous versions of PCI DSS, the standard for securing credit card data, have mentioned the need for security awareness, the forthcoming PCI DSS 3.0 release will be more forthright about the need for security education and awareness.

According to the official change notice, “Lack of education and awareness around payment security, coupled with poor implementation and maintenance of the PCI Standards, gives rise to many of the security breaches happening today. Updates to the standards are geared towards helping organizations better understand the intent of requirements and how to properly implement and maintain controls across their business. Changes to PCI DSS and PA-DSS will help drive education and build awareness internally and with business partners and customers.”

The underlying issue is that, without adequate awareness, other information security controls are more or less pointless. I suspect PCI 3.0 will focus on ensuring that PCI security requirements are very clearly expressed to the user organizations’ management ... leaving it to them to cascade the security requirements down to the relevant staff and IT professionals as they see fit.

A cynic might argue that PCI-DSS is more a legal device for the credit card companies to avoid liabilities arising from security failures at merchants and other card processors, than a security standard. Preventing a loss of confidence and consequent collapse of the credit card industry is, not surprisingly, the industry's overriding concern. Actually protecting members of the public against identity theft/fraud, privacy breaches etc. is a secondary consideration.

In reality, PCI compliance is just part of an organization’s information security concerns: a well-rounded information security awareness program helps protect all information assets, including credit card data, trade secrets, other personal and proprietary information, business strategies, financial data, printed/written information, even intangible forms such as knowledge, experience, expertise and ideas. There’s far more at stake here than mere PCI compliance!

I'll go a step further. Effective information security awareness programs secure business advantage, going well beyond merely avoiding nasty stuff such as the contractual liabilities and adverse publicity arising from PCI failures. They position and promote information security as a business enabler, a tool that enables the organization to conduct business activities that would otherwise be too risky. Awareness is the oil that slips security quietly into place. A security-aware workforce appreciates the need for security, understands the purpose, and behaves securely by default - even without the PCI auditors breathing down their necks.

Doing security awareness purely for compliance reasons is a very myopic approach. You're missing out on the business benefits of a strong security culture that encompasses information security as a whole, at all levels of the organization. It's needed for governance and risk management reasons.

Get it right and PCI compliance is an incidental side-effect, not a driver.

19 Nov 2013

On being cast adrift in a sea of metrics

With a spot of brainstorming and Googling around, it's not hard at all to come up with hundreds of candidate security metrics, often in fact entire families of potential metrics based on any starting point such as the 150 metrics in our book (we'll show you how that works with our next 'example metric of the week', here on the blog). There are loads of information-security-related things that could be measured, and loads of ways to measure them. This is a point we discussed in chapter 3, describing many potential sources of metrics inspiration. 

If you don't perceive a vast ocean
of possible security metrics before you,
you're either lacking in experience
or you need to look harder!

Having come up with a big bunch of possible security metrics, the PRAGMATIC method is a great way to filter out the few that are actually worth putting into production. Metrics with relatively low PRAGMATIC scores naturally gravitate to the bottom of your list while the high-achievers gently rise to the top. Instead of feeling overwhelmed with a confusing mass of possibilities, your job is simply to cream off the floaters, perhaps revisiting a few more that show promise but don't quite make the grade.
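Mechanically, the shortlisting is simple once each candidate has its nine criteria ratings: the overall PRAGMATIC score in the worked examples is the arithmetic mean of the nine, so ranking is a one-liner. In the sketch below the metric names and ratings are illustrative, apart from the Psychometrics row, whose ratings come from the quarterly league table elsewhere on this blog:

```python
# Nine PRAGMATIC criteria ratings (0-100 each) per candidate metric.
# "patch deployment lag" and its ratings are invented for illustration;
# the Psychometrics ratings are from the quarterly league table.
candidates = {
    "patch deployment lag": [90, 80, 70, 85, 75, 95, 60, 88, 72],
    "Psychometrics":        [40, 24,  0, 79, 15, 55, 10, 42,  5],
}

# Overall score = arithmetic mean of the nine criteria ratings.
scores = {name: round(sum(r) / len(r)) for name, r in candidates.items()}

for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{score:3d}%  {name}")
```

The high-achievers bubble to the top of the printout, ready for the creaming-off described above.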

Aside from quietly contemplating your shortlist, metrics workshops* work well in many organizations, bringing the people who will generate and use the metrics together to consider and discuss their objectives, requirements and constraints, and to pore over a set of candidate metrics. Another suggestion is to run trials or pilot studies, trying out a few metrics and comparing them side-by-side for a few months to discover which ones work best in practice. Don't forget to ask your audiences what they make of the metrics, which ones they prefer, and why. 

The GQM (Goal -> Question -> Metric) approach is yet another way to figure out what to measure. GQM doesn't necessarily lead to particular metrics, but it emphasizes the business needs and priorities first, using those to focus attention on particular questions or issues of concern in the management of information security risks. The strategic perspective is well worthwhile, at least in suggesting what kinds of security metrics are needed and why i.e. the areas or aspects that ought to be controlled and hence measured.
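GQM also lends itself to a simple traceability structure: each goal owns its questions, and each question owns the metrics proposed to answer it. A minimal sketch follows; the goal, question and metric texts are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)

@dataclass
class Goal:
    statement: str
    questions: list = field(default_factory=list)

# Hypothetical example of a GQM breakdown.
goal = Goal(
    statement="Reduce security incidents caused by staff error",
    questions=[
        Question("How security-aware is the workforce?",
                 ["awareness quiz scores", "phishing simulation click rate"]),
        Question("Are human-error incidents trending down?",
                 ["incidents per month attributed to human error"]),
    ],
)

# Every metric traces back, via a question, to a business goal.
for q in goal.questions:
    for m in q.metrics:
        print(f"{m!r} answers {q.text!r} in support of {goal.statement!r}")
```

A metric that cannot be traced back to a question, and thence to a goal, is a prime candidate for the reject pile.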

Furthermore, the manner in which your metrics are analyzed and presented is another opportunity for creative expression: the graphs and images that illustrate this blog are deliberately bright and arguably a bit weird in order to catch your eye. The more formal corporate reporting situation may be different although we would advise against using monochrome line/bar/pie charts unless you have to, for some reason. Security metrics needn't be as dry, dull and boring as a badly-delivered statistics lecture. Try a splash of color at the very least. Let your passion for the subject shine through.  You never know, it might just rub off on the audience ...

Kind regards,
Gary Hinson

If you'd like me or Krag to lead your metrics workshop, do drop us a line. Actually using the PRAGMATIC method for real is an obvious next step once you have read the book. As well as sharing our passion, knowledge and experience in this field with you and your management, we'd welcome the chance to bring you quickly up to speed on PRAGMATIC as well as helping you address your security metrics issues.

18 Nov 2013

SMotW #80: quality of system security

Security Metric of the Week #80: Quality of system security revealed by testing

Our 80th Security Metric of the Week concerns [IT] system security testing, implying that system security is in some way measured by the testing process.  

The final test pass/fail could be used as a crude binary metric. It may have some value as a measure across the entire portfolio of systems tested by a large organization in a period of a few months, but despite being so simple, underneath lurks a raft of potential issues. If, for instance, management starts pressuring the business units or departments whose software most often fails security testing to 'pull their socks up', an obvious but counterproductive response would be to lower the security criteria or reduce the amount or depth of security testing.  

The number of security issues identified by testing would also be a simple metric to gather but again not easy to interpret. If the metric is tracking upwards (as seen on the demo graph above), is that good or bad? It's bad news if it means that there are more security issues to be found, but good if testing is finding more of the security issues that were there all along. Taken in isolation, the metric does not distinguish these, or indeed other possibilities (such as changes in the way the measurements are made - a concern with every metric unless there is strong change management).

Rather than the simple count, more sophisticated metrics could be designed, perhaps analyzing identified issues by their severity (which is really another way of saying risk) and/or nature (e.g. do they chiefly affect confidentiality, integrity or availability?). ACME managers were quite keen on such a metric, judging by the PRAGMATIC score:


The metric would have been a no-brainer if it were not for the 10% rating on Cost-effectiveness. In the managers' opinion, the metric would need to measure and take account of a number of factors relating to system security, making it fairly expensive. However, with a bit more work up-front, some or all of the data collection processes might perhaps be automated in order to reduce the costs. This, then, is an obvious avenue to explore in developing the metric.  

A pilot study would be a good way to take this forward, trialing the metric and perhaps comparing a number of variants side-by-side, systematically eliminating the weakest over several months until there were just one or two remaining, or until management decided that the metric does not make the grade after all.

11 Nov 2013

Seven design goals for security awareness

Excellent security awareness programs satisfy the following seven design goals:

1) Inclusivity

Information security is and should be perceived as everyone’s responsibility, hence the awareness program should reach everyone in the organization i.e. employees (staff and managers) in every location, business unit, function or department, plus contractors, consultants, temps and other third party employees working for the organization. Ideally, the awareness activities should start with a new employee induction or orientation module covering the basics of information security for newcomers. Most of all, the awareness program should be visibly supported and endorsed by management - which in turn implies that managers themselves need to be security-aware, and to understand the business value of security awareness. Involving managers in the formulation of security policies, rather than just presenting them as a fait accompli, is one way to ensure that they have their say in molding information security to the organization's overall needs, not the other way around.

2) Comprehensiveness

The program should cover a broad range of topics, elements and aspects such as: accountability and responsibility; auditing; authentication and identity management; bugs and other technical security vulnerabilities; business continuity including business impact analysis, resilience, disaster recovery, contingency and survivability; BYOD and portable computing; cloud computing; compliance and enforcement; cryptography; databases; email, Skype, instant messaging, Twitter etc.; ethics and trust; fraud; governance; hacking; human error and human factors; incident management and digital forensics; information protection; insider threats; IPR; knowledge and other intangible information; malware; network and Internet security; office security; oversight; privacy; physical security; risk management; SCADA/ICS and embedded systems security; security design; secure software development and acquisition; social engineering; surveillance; third parties; and trade secrets. The sheer number of relevant subjects makes it impossible to do justice to them all in a single annual awareness event. In my experience, covering a different topic every month strikes the right balance between breadth and depth of coverage (and, on a personal note, it fits my own attention span! After a month on any one topic, I need to move on).

3) Currency and topicality

While some of the topics listed in the previous point are relatively static, others are evolving rapidly and every so often something truly novel appears on the security horizon. An awareness program that is only updated infrequently or sporadically is likely to be as stale as old socks, hardly a recipe for success! What's more, if it fails to keep up with changes in the security landscape, as well as changes in the business and regulatory environment, there is a distinct risk that employees will be woefully unprepared for new threats. Learning from others' security failures sure beats the indignity of becoming a competitor's case study, or being pilloried in the press! This is another justification for a relatively rapid turnover of topics. Old news is an oxymoron.

4) Motivational

The emphasis on motivation takes awareness beyond merely presenting information and hoping that people pay attention: it means the awareness and training materials must be relevant, topical, pragmatic and useful. The program has to engage with its audiences, and persuade them to respond by changing their ways. Rather than simply chastising workers for not complying with their security obligations, for example, management should reward and encourage those who do the right thing. Being crystal clear about the true purpose of security awareness makes a huge difference: remember that 'security awareness' is merely an interim objective, not the ultimate goal. Its true purpose is to make people behave more securely and drop risky behaviors, thereby reducing the number and severity and hence costs of information security incidents.

5) Satisfy audience needs 

The program should take account of the range of information security interests and competences within the organization. For example, most employees will not understand the technical content that IT professionals need. Basic awareness materials may be suitable for junior employees whereas experienced colleagues are likely to appreciate more advanced materials. Governance, risk management and compliance matters are of greater concern to management than to staff. This implies the need to recognize distinct target audiences and provide suitable awareness materials specifically for each of them. One size definitely does not fit all! 

6) Creativity

Creative expression can breathe life into an otherwise drab, monotonous and frankly ineffective awareness program. Variety is the key here. Limiting the awareness program to one format, mode or style of delivery cuts out those who, for whatever reason, don't appreciate that particular approach. Some people, for instance, respond better to pictures than to words. Some prefer to be told stuff, others like to be shown, others need to find out for themselves. Canned online/electronic delivery has little to no impact on those who learn best by considering, discussing and even arguing about the topic. Bright sparks catch on to new content quickly, whereas others need more time and perhaps quiet reflection or repetition. Cartoons and games catch some people's imagination, while coming across as childish and condescending to others (we're talking about adult education, don't forget). A relatively dry, formal style suits some but not all materials, topics and recipients. There is room to be contentious and challenging (within reason) and sometimes raising but not directly answering rhetorical questions can do wonders: an audience that discusses a security challenge and figures out its own response is more likely to internalize and follow-through on the response than one that is flatly instructed on what to do.

7) Value

The awareness program should, of course, be an asset with a net positive value to the organization. Suitable metrics should demonstrate the program’s costs and benefits, as well as facilitating continuous improvement of the program’s design and delivery processes. As it happens, security metrics also make a fascinating awareness subject! Neglecting this aspect is a recipe for budget cuts and lackluster management support. Recall that security awareness is not an end in itself, therefore it doesn't automatically justify its own existence. From our perspective as suppliers, the low cost of our security awareness products works against us: managers who don't appreciate the true benefits tend to dismiss them as trivial and hence optional expenses, whereas in reality security awareness is the oil that slips other information security control mechanisms into place and keeps them running sweetly. Putting that another way, without awareness, the investment in security technologies is sub-optimal if not entirely wasted. If you know of a purely technical solution to social engineering, for instance, the market is wide open.

What have I missed?

Gary (Gary@isect.com)

8 Nov 2013

SMotW #79: Employee turn vs account churn

Security Metric of the Week #79: Employee turn versus account churn

This week's metric is typical of the kind of thing that often crops up in security metrics workshops and meetings. Whenever someone invents or discovers a metric like this, they are often enthusiastic about it, and that enthusiasm can be infectious. 

The rhyme in 'employee turn versus account churn' is eye-catching: for some reason buried deep in the human psyche, we find the phrase itself strangely attractive, hence the metric is curiously intriguing.

We've fallen into a classic trap: the metric sounds 'clever' whereas, in reality, this is a triumph of form over substance. It is far from clear from the cute phrase what the metric is actually measuring, how, why, and for whom. What are 'employee turn' and 'account churn', exactly, and why would we want to compare them? What would that tell us about information security anyway?

In practice, someone at the workshop would probably have asked questions along those lines of the person who proposed the metric, and in turn they would have made a genuine attempt to explain it. In a field as complex as this, it's really not hard for an enthusiastic and influential person to concoct an argument justifying almost any security metric.  Combine that with a team exhausted by discussing dozens of metrics candidates, and it's easy to see why rogue metrics might slip through to the next stage of the process: management review.

By forcing this metric through the PRAGMATIC sausage machine, ACME's managers stripped back the gloss to consider its potential as a means of measuring information security:


Strangely, despite marking the metric down on Predictiveness, Relevance, Actionability, Accuracy, and Cost-effectiveness, they thought it had some Meaning. Perhaps they too were intrigued by the rhyming phrase! Nevertheless, the metric's poor overall score sealed its fate since there were many stronger candidate metrics on the table.

Remember this example whenever someone proposes a 'clever' security metric. Is it truly insightful, or is it simply obtuse and perplexing? By the same token, think twice about your own pet security metrics - and yes, we all have them (ourselves included!). 

Taken in the proper sequence, the Goal-Question-Metric approach forces us to start by figuring out what concerns us and then pose the obvious questions before finally considering possible metrics. Rogue metrics are less likely to crop up and harder to explain and justify. PRAGMATIC filters out any that make it through the earlier screening, despite their being pushed by influential people who are infatuated with their pets. This may seem rather cold and sterile, but think about it: metrics are all about bringing cool rationality, precision and facts to the management of complex processes. There's no room for rogues.

6 Nov 2013

New listing of ISO27k standards

I have rewritten my listing of the ISO27k standards.  It's now in a tabular format and as up to date as I can make it.  Although the descriptions are brief, there are hyperlinks to the relevant information pages on each of the standards at ISO27001security.com. 

In addition to the 21 already available, several more ISO27k standards are at DIS or FDIS stage.  Some may well be published before the end of the year.

I am waiting patiently for the ANSI INCITS versions of ISO/IEC 27001:2013 and ISO/IEC 27002:2013. I can't find them on the ANSI site as yet, but it has always been a bit of a mission to search the ANSI site. The 2005 versions were just US$30 for single-user PDFs direct from ANSI. IT Governance Ltd. in the UK is selling the 2013 versions at 60 quid a go, so I'll bide my time. Meanwhile, I'm making do with late drafts, hoping that not much changed when they were published.


5 Nov 2013

Apple, Boeing & Disney socially engineered

The fifth annual social engineering capture-the-flag competition at DEFCON once again graphically illustrated the social engineering risk. Take 5 minutes to download and read this year's report: the results, particularly the implications for information security, are truly shocking.

In short, the contestants were invited to socially-engineer a number of items of information from ten victim companies - all high-profile US brands (Apple, Boeing, Chevron, Exxon, General Dynamics, General Electric, General Motors, Home Depot, Johnson & Johnson and Walt Disney). Contestants had some time ahead of the event to research their targets using published information and to prepare their pretexts. During the competition period, live on stage at DEFCON, they attempted to trick and persuade employees of the victim companies into parting with the "flags", pieces of information that should not have been disclosed - and they generally succeeded.

Listed on page 8 of the report, most of the flags would no doubt have seemed quite innocuous and trivial to the employees targeted, hence one social engineering technique is to provide the pretext or context in which disclosing them seems entirely natural.  In reality, the flags go towards building up an information dossier or toolkit, paving the way for ever more serious attacks such as physical site intrusion and network/system hacking. Think of them as the snowflakes and little snowballs that form the giant, unstoppable snowball which rolls down the hill smashing things in its wake.  

In the real world, of course, malicious social engineers would have been unconstrained by the rules of the competition or ethics, so could undoubtedly have captured much more sensitive and valuable proprietary information using the same techniques, plus those that they were not permitted to use at DEFCON (e.g. personal threats or coercion).  

All of this raises the obvious questions: "How can we guard against social engineering?" and equally "How can we use social engineering?"

Given that the report was no doubt written by social engineers whose skills are more offensive than defensive, it offers just three suggestions on preventive techniques: policies, awareness and tests.  On awareness, the report says: 
"2. Consistent, Real World Education 
One of the areas that appear to be lacking across the board is quality, meaningful, security awareness education. In our experience, there is a definite relationship between companies that provide frequent awareness training and the amount of information that company surrenders. An organization that places a priority on education and critical thinking is sure to possess a workforce that is far more prepared to deal with malicious intrusions, regardless of the attack vector. 
Security awareness training needs to be consistent, frequent and personal. It doesn’t require that a company needs to plan large events each month, but annual or biannual security reminders should be sent out to keep the topic fresh in the employees’ minds. Often, the difficulty lies in businesses making training and education a priority to the extent that appropriate resources are allocated to ensure quality and relevance. Security education really cannot be from a canned, pre-made solution. Education needs to be specific to each company and in many cases, even specific to each department within the company. Companies who truly understand the challenges and rewards associated with high quality training and education will find themselves most prepared for the inevitable."
Naturally, I completely agree that security awareness is a vital part of the solution, and "frequent awareness training" is definitely needed, but "annual or biannual reminders" aren't nearly frequent enough to be effective, in my considered opinion. Our preferred approach is continuous, using monthly security awareness topics as a means to remind employees throughout the entire year about social engineering, hacking, password hygiene, backups, business continuity and a million other information security things. For a small fraction of what it would cost you to research and prepare the materials from scratch, our NoticeBored service delivers fresh, creative, camera-ready awareness content every month.   

It just so happens that we are currently revising the NoticeBored security awareness module on social engineering.  Aside from policies, awareness and tests, the module offers several other social engineering controls. I particularly like the idea of helping general employees first of all recognize the signs that they may be dealing with a social engineer, and then pass the inquiries through to employees who have been specially trained to deal with the threat.  The neat part about this control is that virtually every organization already has employees who are competent to take on the specialist role - in fact they practice their skills on a daily basis.  The NoticeBored awareness module provides suitable awareness/training materials for both the general employees and the specialists, along with materials to bring management up to speed on the social engineering risks and controls.

Check out NoticeBored to kick-start your awareness program on social engineering, but don't hang about: social engineers are already making snowballs from your information.  Sign up this month to guarantee delivery of the newly-revised social engineering awareness module before the end of year holiday period, 'peak season' for social engineering attacks. 

By the way, I have already addressed the question "How can we use social engineering?" on this blog.  Social engineering is definitely a 'dual-use' weapon.  Are you tooled-up yet?

Gary (Gary@isect.com) 

3 Nov 2013

PRAGMATIC Security Metric of the Quarter #6

The league table for another three months' worth of information security metrics shows a very close race for the top slot:

Metric           P   R   A   G   M   A   T   I   C  Score

                81  69  89  92  80  99  98  90  98    88%
                95  97  70  78  91  89  90  85  90    87%
                75  75  90  73  84  76  80  77  93    80%
                65  76  91  73  83  77  70  61  78    75%
                80  85  40  66  72  75  80  80  80    73%
                88  86  88  65  78  60  26  90  70    72%
                72  80  10  80  80  80  61  80  79    69%
                86  80  51  40  65  39  55  95  60    63%
                80  70  72  30  75  50  50  65  65    62%
                58  55  82  73  86  47  64  66  17    61%
                75  70  66  61  80  50  35  36  50    58%
                85  85  67  40  77  40  48  16  40    55%
Psychometrics   40  24   0  79  15  55  10  42   5    30%

[Click any metric to visit the original blog piece that explained the rationale for ACME's scoring.]
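As a sanity check on the league table, each overall score is simply the arithmetic mean of the nine criteria ratings, rounded to the nearest whole percent. Taking the Psychometrics row, for example:

```python
psychometrics = [40, 24, 0, 79, 15, 55, 10, 42, 5]
score = round(sum(psychometrics) / len(psychometrics))
print(f"{score}%")  # -> 30%
```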

Hopefully by now you are starting to make out themes or patterns in the metrics that score highly on the PRAGMATIC scale.

Having so far discussed and scored more than half of the example metrics from the book, plus a bunch more metrics from other sources, there's a fair chance we have covered some of the security metrics that your organization currently uses. How did they do? Do the PRAGMATIC scores and the discussion broadly reflect your experience with those metrics?  

We would be amazed if your metrics rated exactly the same as ACME's, but if any of your scores are markedly higher or lower, that itself is interesting (and we'd love to hear why - feel free to comment on the blog or email us directly). The most likely explanation is that you are interpreting and using the metric in a way that suits your organization's particular information security management needs, whereas ACME's situation is different. Alternatively, it could be that you are applying the PRAGMATIC criteria differently to ACME (and us!). To be honest, it doesn't matter much either way: arguably the most important benefit of PRAGMATIC is that it prompts a structured analysis, and hopefully a rational and fruitful discussion of the pros and cons of various security metrics.