Welcome to the SecAware blog

I spy with my beady eye ...

17 Feb 2017

28 days of awareness: day 16

Today I've mostly been reading and thinking about the official launch of the NCSC - that's the UK's new National Cyber Security Centre as opposed to the US National Counterintelligence and Security Center (presumably an unfortunate clash of acronyms ... or was that deliberate?  They do have remarkably similar aims). 

Reading between the lines of the formal speeches and usual puffery on the website, I get the feeling this is more than just the latest in a long line of government re-orgs. In particular, I noticed the chief's stated intent to demonstrate the value of cybersecurity advice by proving it on Her Majesty's Government first. Using HMG as a guinea pig is a bold move and an interesting one. I wish them well - seriously, I have more than just a passing interest in their success.

Skipping deliberately past the thorny issue of what they actually mean by "cyber", one of their glinting advisories caught my beady eye because it concerns security awareness. "10 Steps: User Education and Awareness" is part of the "10 Steps to Cyber Security" series ... which so far has 13 steps (!) since in addition to 10 'technical advice sheets' it includes 3 introductory sheets aimed at senior management. Bear with me: I'll come back to that point towards the end.

First let's take a closer look at their advice on "user education and awareness". 

The advisory starts out well by explaining the [information] risks it covers, providing a useful context for the recommended controls that follow. The stated risks include: 

  • "Removable media and personally owned devices" - that's actually two distinct if related categories of risk. I'm uncomfortable to see this listed first, since precedence implies higher priority;
  • "Legal and regulatory sanction" is really an impact, not a risk, but fair enough: compliance is bound to be a strong driver for any authority, something they are naturally keen to promote more widely. There's more to say about this later;
  • "Incident reporting culture" is a control, not a risk - one that is quite rightly covered below, leaving the corresponding risks unstated;
  • "Security Operating Procedures" is again a control, not a risk, although confusingly the supporting comments refer vaguely to 'imbalance', where [excessive or inappropriate] security impedes legitimate business activities - another separate issue, and hardly a major information risk in any reasonably well-managed, risk-driven outfit. I wonder if someone has misinterpreted SOP (Standard Operating Procedure) and garbled this point;
  • "External attack" is a broad class of incidents, refined by the following comments to mean attacks by outsiders using insiders i.e. social engineering. This is patently a key risk to tackle through awareness and education since it revolves around people. I wonder why they chose not to say "social engineering": is it too scary perhaps?;
  • "Insider threat" is technically a class of threats but is broadly understood to mean the information risks arising from and relating to workers. Again, awareness is a key control in this area, so that's cool. Personally, I'd have put this along with the previous bullet up top.
Other relevant risks are missing from the list e.g.:

  • Genuine errors, accidents and mistakes that compromise information and hence the organization: this is by far the biggest category of human-related incidents by volume and probably by value ('death by a thousand cuts'), hence it is a significant omission from the list. It's also a nice risk to mention because of the lack of associated blame in most cases (aside from carelessness and negligence, anyway) and the fact that collective and individual behaviours can make a real difference in this area;
  • Poor quality work: this is an information risk stemming partly from carelessness along with other factors such as people settling for incomplete, out of date or inadequate information, not spotting and correcting errors, failing to help or encourage others to do things right, and not speaking up when they see things going wrong or think of better ways to work. It's another area where awareness can drive behavioural and cultural changes, with obvious payoffs for the organization (and the people!) if done well. I guess this one would suffer from rampant political correctness since at first glance it appears to be accusing workers of shoddy work, so it probably ought to be expressed more carefully ... on the other hand, 'rampant political correctness' may be part of the problem! Sometimes, straight talking achieves more, especially if supported with positive suggestions on how to make things better for everyone;
  • Fragility, by which I mean situations where the organization gets thrown out of kilter by even fairly small issues or incidents. This risk arises from a deep-seated vulnerability in many mature organizations (such as governments): a profound reluctance to change and, worse still, a premium on stability, conservatism and maintaining the status quo over creativity, innovation, responsiveness and resilience. It is a particular issue in security and technology, both fast-moving areas. We are constantly facing fresh challenges, new threats, new vulnerabilities, new impacts ... and new opportunities, some of which are well worth adopting. Maturity and stability have their moments too, so this can be a tricky issue to express. I guess the risk relates to striking the right balance between opposing forces.

Notice that I'm straying beyond the generally-accepted cybersecurity sphere here, quite deliberately. And I have picked out just three of many information risks that are intimately associated with people. 

Anyway, moving swiftly on, next comes their security advice:

  • "Produce a user security policy": hmmmmm, 'a policy' is likely to be quite a beast given the breadth of issues to be covered, although the advice also mentions procedures so maybe I'm being too picky here. However, even assuming they are actually talking about a coherent suite of policies and procedures (hopefully including guidelines and other supporting materials, in various formats and styles, professionally written, readable and engaging, motivational as well as informational ... all of which remains unsaid) the phrase 'user security policy' hints at another concern (see below).
  • "Establish a staff induction process": again I could quarrel with 'establishing' a process (unless that happens to mean designing, operating, managing, measuring and systematically improving it!), and specific mention of 'staff' is another issue (what about managers? How are they supposed to get up to speed on this stuff?) but it gets worse. The text refers to compliance responsibilities being formally acknowledged for disciplinary purposes: fair enough, the formalities are important for new starters, but that's primarily an HR or Legal issue rather than information or cyber security: what about helping newcomers understand the corporate culture and attitudes in this area, and appreciate why information risks are of concern? Even simple things like how to get help, who to call if something doesn't seem right, where to find the infosec policies etc. are all worthwhile topics.
  • "Maintain user awareness of the security risks faced by the organisation": I'm relieved to read 'maintain' and 'regular refresher training' in the notes, although I'm surprised they only mention "security risks to the organisation" - not controls? Not governance? Not information risks?  Not personal concerns? Not compliance? I realise this is a succinct piece of advice but it could be read very narrowly, missing an opportunity to spread good practice.
  • "Support the formal assessment of security skills": this one is not bad but also misses the mark. 'Formal assessment' and 'certification' are fine but the real value comes from personal development, competence, knowledge and motivation, not the courses or the parchment on the wall.
  • "Monitor the effectiveness of security training": monitoring - and measuring - are good, provided the information is used positively to drive improvements that benefit the organization, which the following advice sort of says. However, the term 'security training' raises yet another concern, as it implies courses in security, either in a classroom setting or as focused online study. What about all the other forms of awareness and education? Shouldn't they be monitored, measured and improved as well? Oh, hang on a moment, those other forms are barely even hinted at, let alone promoted, in the document.
  • "Promote an incident reporting culture": well OK, the most obvious question is "How?" and brief advice follows. I'd also ask "Why?" but that's not covered. Prompt reporting of incidents, near-misses and concerns is a valuable part of information risk management, although it was not mentioned among the risks listed earlier.
  • "Establish a formal disciplinary process": I still trip up on that weasel-word 'establish', and struggle with the implication that a 'formal disciplinary process' is sufficient in this area - necessary, yes, but not enough. This is definitely an old-skool approach to compliance, focused on hammering those who don't comply, rather than rewarding those who do. Both approaches have their place, with positive reinforcement being much more powerful and valuable (in my experience) in terms of driving the culture in the right direction. It's better that workers willingly and readily do the right thing because they understand and support the organization's objectives, than because they fear disciplinary action. At the very least, there is a fighting chance they will behave appropriately even when nobody is watching over their shoulder like a hawk, waiting to pounce.

I can think of other controls in this area, in fact I talk about them often on this very blog. It's disappointing that the official advice is so lame, so far behind current practice. I would not have been the least bit surprised to spot that old saw "annual awareness training" in there (thankfully not).

But wait, I'm not done moaning yet. Those repeated references to 'users', as in 'computer users', concern me. We are talking about people, not keyboard-jockeys or key-pressing automata, nor illegal-drug-users (an even more common expansion of the term). We, the people, have personalities, desires, constraints, flaws, priorities, prejudices, biases, creativity and many other biological and sociological characteristics that make us uniquely human. To ignore all that is to disrespect us, or at least demonstrates an apparent lack of empathy towards the audiences for awareness and education. Saying 'users' reflects a distinctly computer- or cyber-centric view of the world, and probably unrealistic expectations about how people behave, react, respond to and learn from this kind of stuff. It is demeaning. We can't be rebooted, and the big red off-switch ... well, let's not go there. 

Finally, and most disappointingly of all if I'm reading this right, is the suggestion that awareness is something that management does to staff. Managers, it seems, are above all that. It's for their underlings, the hoi polloi, the sheep. And, just as bad, IT and other pros are above it too. They are expected simply to know all about information risks and security and all that, having presumably picked it up by some magical sixth-sense parallel-universe out-of-body learning experience - Vulcan mind-melds, perhaps. Apparently they are expected to be the teachers and gurus, and yet how do they learn the ropes? Speaking for myself as a career infosec pro with three decades' experience under my belt and clearly a keen interest in awareness, I definitely don't know it all, and I learn new things every day. The day I stop learning is the day my big red switch gets flicked to OFF and the lights go out for the very last time.

So, to end on a more positive note, I am gratified that the three management-level briefings I mentioned earlier have been included, albeit seemingly tacked on as an afterthought. Those, to me, are at least as important as the tech briefings - in fact more so, in the sense that without management's understanding in this area, their support is going to be lacklustre at best. If your management is obsessive about "compliance", for instance, that is a huge hint that they Just Don't Get It. Surely it makes sense to tackle management's security awareness and understanding first, and to maintain that impetus thereafter? More please!


Gary (Gary@isect.com)
