Welcome to the SecAware blog

I spy with my beady eye ...

8 Jun 2017

NBlog June 8 - frame the problem to find the solution

Today we're exploring and elaborating on the information risks associated with the wide variety of modern-day workplaces I mentioned yesterday.

The risk-control spectrum diagram is a convenient way to get our thoughts - as well as the risks - in order. It's straightforward to present and discuss the risks along with the corresponding security controls, in a priority sequence that sort of makes sense. 

'Sort of' hints at an underlying issue that I'd like to discuss today. 

Whereas we strive to make the NoticeBored security awareness materials reasonably complete and accurate, we cannot entirely reflect any specific customer organization and its particular business context or needs, not least because we simply don't know what they are.

At the same time, that ambiguity presents an awareness opportunity. It opens the way for customers to consider, discuss, challenge, adapt and extend the generic content. Take for instance our placement of "Working from home" on the left (lower-risk) side of "Office fire/flood". "Working from home" is not actually an information risk - rather it's a commonplace scenario with several associated information risks ... which aren't called out explicitly on the diagram but will be expanded upon in the accompanying notes. Likewise "Office fire/flood" is not intended as an explicit description of the risk, so much as a prompt or cue for the audience to consider that kind of situation from the information risk and security perspective. How you would describe the risks, and where you would place them on the spectrum (both in absolute terms and relative to others), is down to you ... but the diagram is a good starting point for contemplation and discussion, "close enough for government work" as it were.

There are limits to the generic approach though. Much of the security awareness content doing the rounds on the Interweb is so bland and of such poor quality that the authors' experience and expertise are called into question (to put it politely). A lot of it is myopically concerned with IT systems and data, neglecting broader aspects such as - well - workplace information security, to name just one of many related topics. More perniciously, the free security awareness content in almost all those free slideshares and vendor white papers usually tops out around the middle of the risk spectrum. In other words, it only covers low to middling cybersecurity risks and baseline controls, without even acknowledging that there are more significant issues out there and other control options worth considering.

It's something that springs to mind whenever I see those 'top N' lists recommending someone's chosen subset of favorite controls. The well-meaning authors believe they are helping matters with their naive checklist approach. The implicit message is "If only everyone would adopt these N things, we'd all be better off!". Nice idea but unfortunately, experience tells us that hardly anyone has the will, patience and resources to complete the list - and that goes for any generic list presented as good or (worse still) best practice.

Some cybersecurity/awareness pros might argue that there's no point even mentioning the high-risk end of the scale because the corresponding controls are infeasibly expensive and complex, 'only suited to the government and defense industry' - but that's their judgment call, one which pre-empts management. Applied consistently, it systematically biases the entire security awareness program and, in fact, information security management as a whole. 

Look at it this way. If we were to offer managers these three options, which are they most likely to accept?
  1. Baseline, basic or trivial security controls addressing only the lowest risks, and not very well at that; 
  2. Top-N best practice controls addressing the low and middle-range risks;
  3. Serious controls addressing most of the risks, including the uncommon but potentially disastrous high-end bet-the-farm ones.

Most likely they would prefer options 1 or 2, possibly even option 0 - the do nothing, bury head in sand, la-la-la can't hear you option. Only the bravest and most far-sighted (well OK, risk-averse) managers would seriously consider, let alone choose, option 3. However, look what happens if we silently drop option 3 from the table: we're left with options 0 (which may well remain unspoken) plus 1 and 2, and once again it is human nature to go for the middle. By lopping off the top end, the entire frame of reference has been lowered. The high-end risks and security controls become the elephant in the room, with the awkward and inconvenient twist that only those people who are already security-aware even sense its ghostly presence.  

Uh-oh. We have a problem, Houston.

Bottom line: it is entirely appropriate to bring up the high-end risks and controls in the security awareness program even if, in practice, they are likely to be discounted. Framing the problem space broadly is necessary to avoid distorting the field and creating bias. If nothing else, it's a matter of integrity.
