Welcome to the SecAware blog

I spy with my beady eye ...

20 Sept 2016

CIS Critical Security Controls [LONG]

Today I've been nosing through the latest 6.1 version of the CIS Critical Security Controls for Effective Cyber Defense, described as "a concise, prioritized set of cyber practices created to stop today’s most pervasive and dangerous cyber attacks".

In reality, far from being concise, it is a long shopping list of mostly IT/technical security controls, about 100 pages of them, loosely arranged under 20 headings. There are literally hundreds of controls, way more than the '20 critical controls' mentioned, although obviously 'Implement the 20 critical controls' sounds a lot more feasible than 'Implement hundreds of tech controls, some of which we believe are critical for cyber defense (whatever that is)'!

The selection of controls is evidently driven by a desire to focus on what someone believes to be the key issues:
The CIS Controls embrace the Pareto 80/20 Principle, the idea that taking just a small portion of all the security actions you could possibly take, yields a very large percentage of the benefit of taking all those possible actions
There is no backing or evidence behind that bald assertion in the document, nor on the introductory page on the CIS website - but, hey, it's a nice idea, isn't it? "We only need to do these 20 things, possibly only the first 5, to be cyber-secure!". 

Yeah, right. Welcome to cloud-cuckoo land. Security doesn't work that way. Assuming that the bad guys are going to give up and go away if at first they don't succeed is almost unbelievably naive. They are persistent buggers. Some see it as an intellectual challenge to find and exploit the chinks in our armor. If anything, closing the gaping holes makes it more fun to spot the remaining vulnerabilities ... and if all you have done is to implement 20 'critical controls', you are asking for trouble.

As to the implication that CIS has identified precisely the 'small portion of all the security actions' which will generate 'a very large percentage of the benefit', well, I leave you to ponder the meaning of the Pareto principle, and whether we are being duped into thinking 20 is a magic number. Personally, I doubt it's even remotely similar to the true value.

Naturally I looked to see what they advise in the way of security awareness, and duly found Critical Security Control 17:
CSC 17: Security Skills Assessment and Appropriate Training to Fill Gaps
For all functional roles in the organization (prioritizing those mission-critical to the business and its security), identify the specific knowledge, skills, and abilities needed to support defense of the enterprise; develop and execute an integrated plan to assess, identify gaps, and remediate through policy, organizational planning, training, and awareness programs.
The recommendation is for training to fill gaps in knowledge, skills and abilities, implying specific/targeted training of specific individuals addressing particular technical security weaknesses. That, to me, is appropriate for those relatively few workers with designated security responsibilities, but does not work well for the majority who have many other responsibilities besides information security, let alone "cyber security".

Yes, I'm ranting about "cyber", again. Here we have yet another product from the US cyber defense collective that fails to clarify what it actually means by "cyber", unless you class this as definitive:
We are at a fascinating point in the evolution of what we now call cyber defense. Massive data losses, theft of intellectual property, credit card breaches, identity theft, threats to our privacy, denial of service – these have become a way of life for all of us in cyberspace. 
It's about as vague and hand-waving as "cloud". The CIS describes itself in similarly vague terms, again liberally sprinkled with cyber fairy-dust:
The Center for Internet Security, Inc. (CIS) is a 501c3 nonprofit organization whose mission is to identify, develop, validate, promote, and sustain best practices in cyber security; deliver world-class cyber security solutions to prevent and rapidly respond to cyber incidents; and build and lead communities to enable an environment of trust in cyberspace.
Anyway, back to CSC 17. The control is broken down into 5 parts:
[17.1] Perform gap analysis to see which skills employees need to implement the other Controls, and which behaviors employees are not adhering to, using this information to build a baseline training and awareness roadmap for all employees.
Hmmm, OK, a gap analysis is one way to identify weak or missing skills (and knowledge and competencies) that would benefit from additional training, but I'm not clear what they mean in reference to behaviors that employees are 'not adhering to'. In what sense do we 'adhere to' behaviors? I guess that might mean habits??
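To be fair, the gap-analysis part of 17.1 is sound enough in principle: list the skills each role needs, compare against what each person actually has, and the shortfall is your training roadmap. A minimal sketch of that idea in Python - the role names and skills below are invented for illustration, not taken from the CIS document:

```python
# Hypothetical sketch of a 17.1-style gap analysis: compare the skills
# each role requires against the skills each employee actually has, and
# emit a per-person list of missing skills to feed a training roadmap.
# All names, roles and skills here are invented examples.

REQUIRED_SKILLS = {
    "sysadmin": {"patch management", "secure configuration", "log review"},
    "developer": {"secure coding", "dependency hygiene"},
}

employees = [
    {"name": "Alice", "role": "sysadmin",
     "skills": {"patch management", "log review"}},
    {"name": "Bob", "role": "developer",
     "skills": {"secure coding"}},
]

def skills_gaps(staff, required):
    """Return {name: set of missing skills} for everyone with a shortfall."""
    gaps = {}
    for person in staff:
        missing = required.get(person["role"], set()) - person["skills"]
        if missing:
            gaps[person["name"]] = missing
    return gaps

print(skills_gaps(employees, REQUIRED_SKILLS))
# Alice is missing secure configuration; Bob is missing dependency hygiene.
```

Of course, the hard part is nothing like this tidy: deciding what the required skills actually are, per role, per organization - which is exactly where a one-size-fits-all list falls down.
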
[17.2] Deliver training to fill the skills gap. If possible, use more senior staff to deliver the training. A second option is to have outside teachers provide training onsite so the examples used will be directly relevant. If you have small numbers of people to train, use training conferences or online training to fill the gaps. 
This part is explicitly about training for skills. There is no explanation for recommending 'more senior staff' or 'outside teachers' providing 'training onsite': there are of course many different ways to train people, many different forms of training. I see no reason to be so specific: surely what is best depends on the context, the trainees, the subjects, the costs and other factors? 

I'm hinting at what I feel is a significant issue with the entire CIS approach: it is prescriptive, with little recognition of or accounting for the huge variety of organizations and situations out there. Information risks differ markedly between industries and in different sizes or types of organizations, while their ability and appetite to address the risks also vary. A one-size-fits-all approach is very unlikely to suit them all ... which means the advice needs to be tempered and adapted ... which raises questions about who would do that, and how/on what basis. [I'll hold my hand up here. I much prefer the ISO27k approach, which supplements its lists of controls with advice on identifying and analyzing information risks, explicitly introducing a strong business imperative for the security.] 
[17.3] Implement a security awareness program that (1) focuses on the methods commonly used in intrusions that can be blocked through individual action, (2) is delivered in short online modules convenient for employees (3) is updated frequently (at least annually) to represent the latest attack techniques, (4) is mandated for completion by all employees at least annually, (5) is reliably monitored for employee completion, and (6) includes the senior leadership team’s personal messaging, involvement in training, and accountability through performance metrics.
Oh dear. I would quarrel with every one of those six points:
  1. Why would you 'focus on the methods commonly used in intrusions' (specifically) rather than, say, protecting intellectual property, or spotting and correcting mistakes? We know of well over 50 topics within information risk and security that benefit from heightened awareness. This point betrays the prejudices of CIS and the authors of the document: they are myopically obsessed with Internet hackers, neglecting the myriad other threats and kinds of incident causing problems.

  2. Why 'short online modules'? It is implied that convenience rules, whereas effectiveness is at least as important. 'Online modules' only suit some of the workers who use IT systems, and we all know just how useless some online training and awareness programs can be in practice. If all you want is to be able to tick the box on some compliance checklist, then fine: force workers to click next next next while skimming as rapidly as possible through some mind-numbingly dull and boring, not to say cheap-and-nasty cartoon-style or bullet-point drivel, and answer some banal question to "prove" that they have completed the "training", and Bob's yer uncle! If you actually want them to learn anything, to think differently and most of all to change the way they behave, you are sadly deluded if 'short online modules' are your entire approach. Would you teach someone to drive using 'short online modules'? Can we replace the entire educational system with 'short online modules'? Of course not, don't be daft. 

  3. I agree that a security training and awareness program needs to be 'updated frequently', or more accurately I would say that it needs to reflect current and emerging information risks, plus the ever-changing business environment, plus recent incidents, plus business priorities, plus learning from past issues, incidents and near-misses (including those experienced by comparable organizations). Updating those 'short online modules' on 'the methods commonly used in intrusions' and 'the latest attack techniques' misses the point, however, if it all comes down to a cursory review and a bit of titivation - worse still if the materials are only updated annually. The mere suggestion that annual updates might be sufficient is misleading in the extreme, bordering on negligent: things are moving fast in this domain, hence the security awareness and training program needs to be much more responsive and timely to be effective. [Again, if all you want is that compliance tick, then fine, suit yourself. Ignore the business benefits a culture of security would bring you. Do the least amount possible and pretend that's enough - like Sony and Target might have done ...] 

  4. 'Mandated for completion' harks back to the bad old days when we were all forced to attend some tedious annual lecture on health and safety, dental hygiene or whatever. We know that is a badly broken model, so why push it? Modern approaches to education, training and awareness are much more inclusive and responsive to student needs. The process caters for differing styles and preferences, uses a range of materials and techniques, and most of all seeks to hook people with content that is useful, interesting and engaging, so there should be no need to force anyone through the sausage machine. Wake up CIS! The world has moved on! How about the crazy notion, for instance, of rewarding people for being aware, demonstrating their understanding by doing whatever it is you want them to do? If your awareness and training materials are not sufficiently creative and engaging to drive demand, if your people need to be dragged kicking and screaming into the process, then you might as well break out the cat-o-nine-tails. "The floggings will continue until morale improves"!

  5. Mere 'completion' of those 'short online modules' is trivial to determine as I mentioned above: simply count the clicks and (for bonus marks) set an arbitrary passmark on that final 'assessment' - albeit allowing students to try as many times as they can be bothered to keep on guessing, just to escape the tedium and get back to work. Do you honestly think that has any value whatsoever, other than (once again) ticking the compliance box like a good little boy? The same can be said for attendance at awareness sessions, courses, events or whatever. It's easy to count page views on the intranet Security Zone, for instance, and from there to claim that x% of employees have participated, but how many of them have taken the slightest bit of interest or actually changed their behaviors in any meaningful and positive way? You won't find that out by measuring 'completion' of anything. In short, 'completion' metrics are not PRAGMATIC.
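    To labor the point with a toy example (invented names and numbers, of course): a workforce can score a perfect 100% on 'completion' while hardly anyone demonstrates the behavior the training was supposed to produce.

```python
# Invented records illustrating why 'completion' is a weak metric: every
# employee dutifully clicked through the module, yet only one of them
# actually reported the next phishing lure - the behavior we wanted.

records = [
    {"name": "Alice", "completed_module": True,  "reported_phish": True},
    {"name": "Bob",   "completed_module": True,  "reported_phish": False},
    {"name": "Carol", "completed_module": True,  "reported_phish": False},
    {"name": "Dave",  "completed_module": True,  "reported_phish": False},
]

def rate(rows, key):
    """Fraction of rows where the given flag is true."""
    return sum(r[key] for r in rows) / len(rows)

print(f"completion rate: {rate(records, 'completed_module'):.0%}")  # 100%
print(f"behavior rate:   {rate(records, 'reported_phish'):.0%}")    # 25%
```

Measure the first number and you declare victory; measure the second and you discover you have a problem.
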

  6. Part 6 is a vague mish-mash of concepts, depending on how one interprets the weasel-words. 'Personal messaging' from the 'senior leadership team' is all too often an excuse for a few (as few as possible!) carefully-chosen words on those dreadfully trite corporate motivational posters: "Make It So" says the boss. "Do it ... or else!" Likewise, 'getting involved in training' might be restated as "Turn up at the odd event" or "Make a guest appearance, say a few words, press-the-flesh". What's completely missing from the CIS advice is the revolutionary idea that managers - at all levels from top to toe - should actively participate in the security awareness and training program as students, not just whip-crackers and budget-approvers. Managers need to be well aware of information risks, security, compliance, governance, control And All That, just as staff need to know how to avoid becoming cyber-victims. How do you expect managers to know and care about that stuff if they are not participating in the security awareness program? What kinds of strategies are they likely to support if they lack much of a clue? [Hint: "Implement the 20 controls" has a certain ring to it.]

    The final clause about 'accountability through performance metrics' once again needs careful interpretation. Along with responsibility, duty, obligation and so on, accountability is a crucially important concept in this field, yet one that is more often misinterpreted than correctly understood. We like to sum it up in the hackneyed phrase "The buck stops here" which works in two ways: first, we are all personally accountable for our actions and inactions, our decisions and indecisions, our good and bad choices in life. The person who clicks the phishing link and submits their password (the very same crappy password they use on all the places they can get away with it) leading to a major incident can and should be held to account for that obvious lapse of judgment or carelessness. At the same time, the person's managers, colleagues, support network and - yes - their security awareness and training people all share part of the blame because information security is a team game. I would also single out for special attention those who put the person in the situation in the first place.

    There are almost always several immediate issues and a few root causes behind security incidents: teasing out and addressing those root causes is the second angle to stopping-the-buck. Why did the person ignore or misread the signs? Why didn't the systems identify and block the phishing attack? Why wasn't this kind of incident foreseen and avoided or mitigated? ... leading ultimately to "What are we going to do about this?" and sometimes "Who will swing for it?"!

    Performance metrics are of tangential relevance in the sense that we are accountable for meeting defined and measurable performance targets, but holding people to account for information security is much more involved than counting how many widgets they have processed today. Performance is arguably the most difficult aspect to measure in information security, or cyber-security for that matter. 
It's all very well to measure the number and consequences of incidents experienced, but how many others were avoided or mitigated?
OK, moving along, let's take a squint at the remaining parts of CSC 17:
[17.4] Validate and improve awareness levels through periodic tests to see whether employees will click on a link from suspicious email or provide sensitive information on the telephone without following appropriate procedures for authenticating a caller; targeted training should be provided to those who fall victim to the exercise. 
Of the 50+ topics in information security awareness and training, why pick on email and phone phishing, specifically? Is nothing else important? I appreciate that phishing is a current concern, but so too are ransomware, privacy, human errors, industrial or national espionage, piracy and many, many others, ALL of which benefit from targeted awareness and training. What's more, the situation is dynamic and differs between organizations, hence it is distinctly misleading to pick out any one topic for special attention unless it is phrased merely as an example (it wasn't). Oh dear. It gets even worse at the end with the suggestion that 'targeted training' should be administered to victims: is that punishment? It sounds like punishment to me. We're back to flogging again. How about, instead, rewarding those who did not fall for the exercise, the ones who spotted, resisted and reported the mock attack? Hey, imagine that!
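For what it's worth, shifting the emphasis from punishing the clickers to rewarding the reporters takes no extra data at all - the very same exercise results support both. A hypothetical sketch, with invented names:

```python
# Invented results from a mock-phishing exercise. The same records that
# identify clickers for follow-up coaching also identify the people who
# spotted and reported the lure - the ones worth rewarding publicly.

results = [
    {"name": "Alice", "clicked": False, "reported": True},
    {"name": "Bob",   "clicked": True,  "reported": False},
    {"name": "Carol", "clicked": False, "reported": False},
]

to_reward = [r["name"] for r in results if r["reported"] and not r["clicked"]]
to_coach  = [r["name"] for r in results if r["clicked"]]

print("reward:", to_reward)  # ['Alice']
print("coach: ", to_coach)   # ['Bob']
```

Which list the program chooses to celebrate says a lot about whether it is building a security culture or merely administering floggings.
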
[17.5] Use security skills assessments for each of the mission critical roles to identify skills gaps. Use hands-on, real world examples to measure mastery. If you do not have such assessments, use one of the available online competitions that simulate real-world scenarios for each of the identified jobs in order to measure mastery of skills mastery.
'One of the available online competitions'?? Well, I suppose that is an approach, but somehow I doubt its effectiveness - and (for good measure) it definitely raises security concerns. Instead of being tacked on the bottom like the donkey's tail, 17.5 should probably have been included in 17.1 since it relates back to the identification of 'gaps'. As to measuring 'mastery of skills mastery', let's assume that is just a typo, a human error, one of those 50+ other things that best practice broad-spectrum information security awareness and training programs cover besides phishing or cyber-wotsits. 

Bottom line: sorry, CIS, but if control #17 is representative of the remaining 19, I'm disappointed. There are too many flaws, errors and omissions, and it is very biased towards IT and hacking. It is prescriptive and too far from good practice, let alone best practice, to recommend. 


PS  Remember these distinctly cynical comments the next time you read or hear someone extolling the virtues of the CIS 20 critical controls. If they think the CIS advice is wonderful, what does that tell you about their standards and expectations? 

PPS  And if you disagree with me, the floor's yours. I'm happy to discuss. Put me right if you will.
