Welcome to NBlog, the NoticeBored blog

I may meander but I'm 'exploring', not lost

Oct 16, 2017

NBlog October 16 - is privacy a lost cause?

Today I've been thinking and writing about privacy risks, comparing the differing perspectives of individual people and organizations.

Something that stands out from the risk analysis is that, despite journalists, authorities, privacy pro's and victims being aghast when privacy breaches occur, we all gladly accept significant privacy risks as a matter of course. In a few cases (e.g. tax), we have virtually no choice in the matter, but mostly we choose to share our personal information, trusting that the recipients will protect it on our behalf.

To be honest, privacy doesn't even enter our minds most of the time. It doesn't occur to us because our attitudes are so blasé.

Admittedly, it would take extreme measures to be reasonably assured of complete privacy, and even then there would still be risks: consider people in 'witness protection schemes' for example, or moles, spies, criminals and terrorists doing their level best to remain anonymous, below the radar. We know they don't always succeed.

Extremists aside, ordinary people like you and me mostly pay scant attention to our privacy. We use the Internet, and cellphones, and all manner of government and commercial services either under our own names, or with superficial efforts to conceal our identities. We share or post selfies online, email and text others, and wander about in public spaces under the full gaze of myriad CCTV cameras. We use our credit and debit cards to buy stuff, register for various services, and generally anticipate nothing untoward ... which in turn places even more pressure on the organizations and individuals to whom we disclose our personal information, which is why privacy laws such as GDPR are so important in a societal sense.

Attitudes have changed markedly within a generation or three. Way back when I was a naive young lad, the very concept of taking, let alone sharing, explicit selfies was alien to me. Porn was available, of course, but access was discreet, guilt-ridden and exceptional, despite the raging hormones. As Victorian values have relaxed, we've been through "free love", page 3 girls, Hugh Hefner, tolerated or legalized prostitution, gay rights and other largely sexual revolutions - in most Western nations anyway: clearly there are cultural discrepancies, with distinct differences of opinion on decorum and propriety. Scandinavian attitudes to nudity are part of the enjoyment of saunas, for me: the naked human body is something to be revered and celebrated, as it was in the original Olympic games. I still smile when I remember a male American guest at a sauna party in the '80s, already feeling distinctly awkward about the men enjoying their collective nakedness, quite unable to cope with an influx of naked women when 'their' sauna went cold: he left hurriedly, all in a fluster.

Privacy, then, is just as much a cultural phenomenon as it is a question of personal information, informed disclosure, security and so on. The underlying issue is more to do with control of personal information than protection. Whether I choose to reveal my secrets to others, or to withhold them, is the key point - a dynamic concern with cultural as well as personal overtones, making privacy a deeper, more involved and more interesting awareness topic than it might appear.

Oct 14, 2017

NBlog October 14 - a different tack

There are several good reasons for protecting personal information, of which compliance with privacy laws and regulations is just one. 

For example, personal information can be extremely valuable in its own right - a business asset in fact. 

Consider the adverse consequences of personal information being lost or corrupted, perhaps the result of a system/hardware failure, a software bug, an inept or malicious system administrator, malware, ransomware or ... well, anything that can damage/destroy or deny legitimate access to information could of course affect personal information. In a sense, it is "just" information. 

At the same time, its commercial value is strongly linked to its confidentiality. This is why we are invited to pay $thousands for various mailing lists - offers which we either ignore or robustly decline since we are strongly ethical and most certainly not spammers! It's why sales professionals jealously guard their personal contacts. They are truly concerned about identity theft, as opposed to identity fraud.

Treating personal information as a business asset worth protecting and exploiting puts an unusual slant on privacy. In particular, it emphasizes the commercial value of controls securing personal information, beyond the begrudging 'avoidance of fines' angle. It's also, I believe, a way to increase the pressure on senior management to do what needs to be done in order to secure personal information, even if they are not that fussed about privacy laws - a carrot-and-stick approach.

We'll expand on this and other good reasons to take privacy seriously in November's awareness module. 

Oct 13, 2017

NBlog October 13 - data breach reality check

In searching for information relating to GDPR and privacy for next month's awareness module, I bumped into the Business Continuity Institute's Horizon Scan 2017 report.

The report's headline data come from a survey of 666 business continuity and risk management professionals from Europe and North America (mostly), concerning their perceptions about threats and incidents ... and immediately a few issues spring out at me.

First of all, the survey population is naturally biased given their field of expertise: although sizable, this was clearly not a random sample. As with all professionals, they probably overemphasize the things that matter most to them, meaning serious incidents that actually threaten, or are believed to threaten, to disrupt their organizations. It's no surprise at all that 88% of BC pro's are concerned or extremely concerned about "cyber attack" - if anything, I wonder what planet the remaining 12% inhabit! On the other hand, BC pro's ought to know what they are talking about, so their opinions are credible ... just not as credible as hard, factual data concerning actual incidents.

On that score, this year's report provides information on actual incidents:
"A new metric introduced in the BCI Horizon Scan Report measures actual disruption levels caused by the threats listed in figure 1 in order to provide a comparison against organizations’ concerns. Figure 2 shows a contrast between the levels of disruption caused by a particular threat and how concerned an organization is about it. The study shows the actual causes of business disruption slightly differ from the threats practitioners list as significant concerns. The top causes of business disruption according to the same respondents include unplanned IT and telecommunications outages (72%), adverse weather (43%), interruption to utility supply (40%), cyber attacks (35%) and security incidents (24%)."
The discrepancy between BC pros' perceptions and reality is quite marked. I'll come back to that in a moment.

Second, the way incidents (and/or threats - the report is somewhat ambiguous over the difference) are described puzzles me.  Here are the top 7, ranked according to the proportion of respondents who claimed to be "extremely concerned":
  1. Cyber attack (e.g. malware, denial of service) 
  2. Data breach (i.e. loss or theft of confidential information) 
  3. Unplanned IT and telecom outages 
  4. Security incident (e.g. vandalism, theft, fraud, protest) 
  5. Adverse weather (e.g. windstorm, flooding, snow, drought) 
  6. Interruption to utility supply (i.e. water, gas, electricity) 
  7. Act of terrorism

These are indistinct, overlapping categories - for example #1 and #2 often occur together, and both often accompany other categories such as #3, #5 and #6. #2 "Data breach" is a specific type of incident outcome with a huge variety of causes, ranging from deliberate attacks by outsiders or insiders, to accidental disclosures and ineptitude, plus thefts of IT equipment and storage media ... speaking of which #4 "Security incident" in fact refers to physical security incidents, judging by the examples.

#7 "Act of terrorism" seems way too high on the list for me ... but whether that's because I am fortunate enough to live and work in a tranquil backwater, or because the terrorists are winning (creating terror, even among supposedly level-headed BC pro's!), or is a genuine reflection of the threat level, I can't easily tell.

The top 7 actual causes of incidents tell a rather different story to the list above:
  1. Unplanned IT and telecom outages 
  2. Adverse weather (e.g. windstorm/tornado, flooding, snow, drought) 
  3. Interruption to utility supply (i.e. water, gas, electricity, waste disposal) 
  4. Cyber attack (e.g. malware, denial of service) 
  5. Security incident (e.g. vandalism, theft, fraud, protest) 
  6. Transport network disruption 
  7. Availability of talents/key skills (e.g. ‘bench strength’)

"Cyber attack", the #1 perceived threat, turns out to be #4 on the actual causes.  "Data breach" drops way down from #2 perceived to #8 in actuality, while transport disruption and lack of talents/key skills appear to be significant risks that are not perceived as such. "Act of terrorism" comes in at a more realistic (but still far too high, as far as I'm concerned) #13 on the actual causes.

Those discrepancies seem to indicate serious problems with the risk identification and assessment processes used by BC pro's for BCM purposes, which in turn are presumably being used to plan and prioritize BC activities ... or do they? One could argue that actual incidents are historically based, while BC pro's are paid for their expertise in predicting the future - professional soothsayers you could say. Hmmm.  Food for thought there.

Moving to the report's conclusions, I'm impressed to see this issue picked out in black and white as the first item:
"1. Organizations need to focus on the objective appraisal of threats and their particular impacts.
This year’s report has highlighted some gaps between the level of concern and actual disruptions caused by various threats. For example, the study noted significantly high levels of concern over cyber attacks and data breach which may be influenced by increased media coverage. Business disruptions nonetheless are still mainly driven by other threats such as unplanned IT and telecom outages and adverse weather. As such, organizations need to continually look at the business impacts of various threats and deploy appropriate tactics to become more resilient."
Well said! It would be interesting to explore why there are such marked discrepancies between perception and reality among BC pro's, since that would be an obvious handle to improve the alignment if appropriate (conceivably the BC pro's are right after all - perhaps we'll see changes in the actual causes in future reports!).

Anyway, back to the plot, the survey inspired the following graphic that we'll include in the awareness content (citing the source, of course):

Oct 7, 2017

NBlog October 7 - privacy update

This month we are updating the privacy awareness module for delivery in November, with a particular focus on GDPR just six months away. 

By the time it comes into force in May 2018, compliance with the EU General Data Protection Regulation will be a strategic objective for most organizations, thanks to the potential for massive fines and adverse publicity for any who are caught in contravention. Provided they are aware of it, we believe managers will welcome assurance either that everything is on track to make the organization compliant by the deadline, or that GDPR is definitely not applicable to them. 

Our job is to make managers aware of GDPR, emphasizing the governance and compliance plus information risk and security management aspects - updating corporate privacy policies for example, and ensuring that suppliers and business partners are on-track as well as the organization itself. If cloud service providers were struggling to meet the compliance deadline, for instance, there would be implications for their customers - another thing for management to consider. A GDPR compliance checklist would therefore be a worthwhile and timely addition to the NoticeBored materials.

The task of achieving GDPR compliance largely falls to IT and compliance specialists. Our awareness objectives for that audience are more tactical in nature, relating to project management, technical challenges and change management. The compliance checklist may help them consider the compliance project status from management's perspective, perhaps re-prioritizing and re-energizing the remaining activities.

For the general worker awareness audience, we plan to tackle the personal angle, addressing rhetorical questions such as "What's all the fuss?", "What's GDPR?" and "What's in it for me?" ... suggesting three awareness posters similar to the one above. We'll be developing those and other ideas into a brief for the graphics team this weekend.

GDPR and privacy are already making appearances in the professional media and will increasingly hit the general news outlets in the run-up to May - albeit mostly as fillers for slow news days. The first major organizations to be fined for GDPR non-compliance will surely be headline fodder, for a few days at least. Our customers' employees will hopefully have the background to notice privacy-related news and appreciate what's behind the headlines, linking the general media coverage with their corporate awareness programs. There's a broad educational purpose to November's module, in addition to the more direct awareness role. 

Oct 2, 2017

NBlog October 2 - a 2-phase approach to bolster the security culture

We've just updated the NoticeBored website to describe the new awareness module on security culture and delivered the latest batch of security awareness materials to subscribers. 

Culture is a nebulous, hand-waving concept, hard to pin down and yet an important, far-reaching factor in any organization. 

The new module (the 63rd topic in our bulging security awareness portfolio) is essentially a recruitment drive, aimed at persuading workers to join and become integral parts of the Information Security function. The basic idea is straightforward in theory but in practice it is a challenge to get people to sit up and take notice, then to change their attitudes and behaviors. 

During September, we developed a two-phase approach:

  1. Strong leadership is critically important, which means first convincing management (all the way up to the exec team and Board) that they are the lynch-pins. In setting the tone at the top, the way managers treat information risk, security, privacy, compliance and related issues has a marked effect on the entire organization. Their leverage is enormous, with the potential to enable or undermine the entire approach, as illustrated by the Enron, Sony and Equifax incidents.

  2. With management support in the bag, the next task is to persuade workers in general to participate actively in the organization's information security arrangements. Aside from directly appealing to staff on a personal level, we enlist the help of professionals and specialists since they too are a powerful influence on the organization - including management. 

October's awareness materials follow hot on the heels of the revised Information Security 101 module delivered in September. That set the scene, positioning information security as an essential part of modern business. Future modules will expand on different aspects, each one reinforcing the fundamentals ... which is part of the process of enhancing the security culture. Consistency is key, along with repetition. The trick, though, is for the awareness program to maintain interest levels: simply saying the same thing over and over is counterproductive, since people soon tune out and glaze over.

Another factor to take into account is that changing the culture inevitably takes time. Lots of time. This is a   s l o w   process. We've provided a survey form with a strong hint that the security culture should be measured on an ongoing basis since improvements may not be immediately obvious. The awareness effort may appear to have been wasted unless changes can be demonstrated through suitable metrics. There's another more subtle purpose to the survey though, getting management to determine what's sufficiently important to be worth surveying. There's value in the process of designing the metric, as well as the survey results - a little bonus.

Get in touch to bolster
your organization's security culture
through creative security awareness



That's it, October's module is done and dusted. So what next? 

With just six months from November until GDPR comes into force, we will be revising the privacy module to help subscribers pave the way through awareness. Once again, November's materials will build upon the same foundations, boosting understanding in the privacy area specifically while gently maintaining the undercurrent of information risk, security and compliance in general.

Right now, I have a more immediate goal in mind. After a month's hard work and the weekend's tech nightmare, I think we've earned ourselves lunch in town. 

Oct 1, 2017

NBlog October 1 - security culture module

Well, despite Finagle's Law, we've limped home over the finishing line. Another tidy stack of NoticeBored security awareness content is packaged up and will shortly be ready for our subscribers to download, customize and deploy.

'Security culture' is the 63rd awareness topic we've covered, among the most challenging modules to develop and yet also the most rewarding: it's clear, in retrospect, what an important topic this is for any organization that takes information security seriously enough to run an awareness program. In short, there is no better mechanism than an effective security awareness program with which to foster a security culture. How on Earth have we ducked the issue for so long?

Perhaps it's a maturity thing. Perhaps it's cultural: we are forging new paths, heading way off the track well-beaten by more conventional security awareness programs. 

Just in case you missed it,
there's so much more to
security awareness than phishing!

I pity organizations that rely solely on their security and privacy policies. 'Laying down the law' is undoubtedly an important part of the process, necessary but not sufficient. If it were, speed limit signs coupled with the threat of prosecution would have long since curbed driving incidents: we'd be left dealing with genuine accidents, mechanical failures and so forth, but excess speed would hardly ever be an issue. Patently, it is not ... and that's despite the parallel investment in awareness, training and education. 

It doesn't take much to imagine the carnage on our roads if 'laying down the law' was all that happened.

Turns out it's not too hard to elaborate on the business benefits of a corporate security culture. There are genuine business reasons for managers, in particular, to take this seriously, something that Enron, Sony and Equifax management and stakeholders might appreciate more than most.

We'll complete the delivery and update the website tomorrow, once the final stages of the computer rebuild are completed. It has been a long weekend!

Sep 30, 2017

NBlog September 30 - complying with Finagle's Law

Finagle's law elaborates on Sod's law: not only will anything that can go wrong, go wrong, but it will do so at the worst possible time.

With our self-imposed end of month deadline fast approaching, October's awareness module was close to being completed ... until a hardware failure caused a day's delay. A solid state disk drive gave up the ghost without warning last night. Naturally being highly security-aware we have backups, lots of backups, but rebuilding/restoring the system on a new disk inevitably takes time. Bang went my Saturday!

October's module covers an entirely new awareness topic for us, so it has taken longer than normal to prepare, leaving little slack in our schedule. Such is life. Tomorrow I'll be slogging through what remains of the weekend, doing my level best to catch up and complete the materials for delivery on Monday, hopefully.

On the upside, our backups worked! We had enough spare hardware to survive this incident with relatively little impact except a day's lost work and elevated stress levels. An unplanned business continuity exercise.

Sep 29, 2017

NBlog September 29 - strategic alignment

On the ISO27k Forum this morning, a member from a financial services company asked for some advice on aligning IT and Security with overall corporate/business strategies.  He said, in part: 
"Organizational level strategic plan, covering its core business, has been derived. And it includes what is expected form Technology and Security departments,  I.E. to keep customers, shareholders happy and to provide safe and secure technology services.   
[I need] to prepare a strategic plan decoded from organization's strategy, specifically for Technology and Security department, with goals, objectives, principles etc.  So for achieving this, my approach is to understand each business strategy and determine the possible ways that Technology and Security team can help it. 
Business strategy -> Technology strategy -> Security Strategy"
I strongly support the idea of explicitly linking 'our stuff' with corporate/business strategies (plus initiatives, projects and policies) but 'our stuff' is more than just technology security, or IT security, or cybersecurity, or data security .... I encourage everyone to refer to information risk, defined as 'risk pertaining to information', an all-encompassing term for what we are managing and doing. Especially in the strategic context, we should all be thinking beyond securing bits and bytes.  

[The mere fact that they have a department, team or whatever named "Security" that he and presumably others consider a part of, if not very closely tied to, "Technology", strongly suggests a very IT-centric view in the organization. To me, there's the merest whiff of a governance issue there: treating this as 'IT's problem', with the emphasis on security (as in controls, restrictions and prohibitions, as much as protection and safety) is a common but, in my view, sadly misguided and outdated approach - a widespread cultural issue in fact.]

Identifying information risk aspects of the corporate strategies is a creative risk assessment activity. In stark contrast to financial risks, information risks tend to be largely unstated, if not unrecognized, at that level but can generally be teased out from the assumptions (both explicit and implicit). For instance, if a business strategy talks about "Expanding into a new market", consider what that actually means and how it will be achieved, then examine each of those proposed activities for the associated information risks - including for instance the information risk that the 'new market' opportunity has been misunderstood or misstated (often by whoever is eagerly promoting the approach, an obvious bias that experienced managers are adept at discounting in others, yet curiously reluctant to admit in themselves!). If it goes ahead, management are making significant assumptions that the market exists and is profitably exploitable using the proposed strategic approach, but what if they are wrong? What if the projections are unrealistic (overly optimistic or pessimistic: remember risk cuts both ways)? What if the assumptions turn out to be unfounded? What if 'something else happens'? These are just some of the information risks concerning a proposal that is being used as the basis for strategic business decisions - a high-stakes situation for sure. In addition, there are the more obvious implications for Security of going ahead with the strategy (e.g. finding the information risk and security specialists needed to support and guide the new market activity) plus other more subtle effects (e.g. diverting attention and resources from more mundane but potentially just as risky stuff).

Doing that kind of risk assessment properly and thoroughly is a lot of work - a major and potentially difficult and costly undertaking, involving business managers plus specialists from Security, Risk Management, IT, HR, Compliance, Business Continuity, Audit etc. It's a team effort, supporting and enabling each other and negotiating for the best overall outcome for the business as a whole. If that's not feasible given the current circumstances, maturity level and resources, then I recommend at least focusing on and clearly prioritizing risk associated with the organization's most valuable and/or vulnerable information assets. In financial services, customer financial data is undeniably worth protecting, so there should be little argument if the strategy lays out whatever that involves. Other things may dangle from that handy hook, within reason, but still it's better to be able to show that every single item from the strategy or plan relates to something that the business has identified as a driver, goal, objective etc. [OK, some of those relationships might be tenuous in practice, but still it's hard for management to resist or block activities that relate to strategic goals. Possibly career-limiting in fact.]

Especially if we are able to do this properly, a significant advantage is that the business drivers for information risk and security form an excellent basis for security metrics: if our metrics measure those things, we can reasonably expect management to take notice and use them. If not, why are we wasting their time with irrelevancies? In other words, we can squeeze extra mileage out of the strategy development process by picking out the associated metrics that will help achieve the strategy. It's a win-win.

Don't forget that strategy is relatively long-term big-picture stuff. This is our big chance to plan the foundations for the future development and maturity of information risk and security management in the organization: it's not just about tagging dutifully along behind whatever the business is doing, but also setting things up so the business has more, better options going forward. It's part of 'business enablement'. If, for example, I would love (for sound business and professional reasons, you understand) to set up a superb Security Operations Centre but have so far been denied the opportunity, are there things we can do over the next year or so to set things up and get the process running so that, maybe in a few years time, the SOC is more likely then to be approved? The strategy development process is like a chess game: we need to think several moves ahead, and consider what the other players are doing and how they will respond to our moves. It's also a competitive team game: as much give as take. Call it back-scratching or horse-trading if that helps.

Sep 28, 2017

NBlog September 28 - safe & secure

The Coming Software Apocalypse is a long, well-written article about the growing difficulties of coding extremely complex modern software systems. With something in the order of 30 to 100 million lines of program code controlling fly-by-wire planes and cars, these are way too large and complicated for even gifted programmers to master single-handedly, while inadequate specifications, resource constraints, tight/unrealistic delivery deadlines, laziness/corner-cutting, bloat, cloud, teamwork, compliance assessments plus airtight change controls, and integrated development environments can make matters worse. 

Author James Somers spins the article around a central point. The coding part of software development is a tough intellectual challenge: programmers write programs telling computers to do stuff, leaving them divorced from the stuff - the business end of their efforts - by several intervening, dynamic and interactive layers of complexity. Since there's only so much they can do to ensure everything goes to plan, they largely rely on the integrity and function of those other layers ... and yet despite being pieces of a bigger puzzle, they may be held to account for the end result in its entirety.

As if that's not bad enough already, the human beings who actually use, manage, hack and secure IT systems present further challenges. We're even harder to predict and control than computers, some quite deliberately so! From the information risk and security perspective, complexity is our kryptonite, our Achilles heel.

In the article, Somers brings up numerous safety-related software/system incidents, many of which I have seen discussed on the excellent RISKS List. Design flaws and bugs in software controlling medical and transportation systems are recurrent topics on RISKS, due to the obvious (and not so obvious!) health and safety implications of, say, autonomous trains and cars.

All of this has set me thinking about 'safety' as a future awareness topic for NoticeBored, given the implications for all three of our target audiences:
  1. Workers in general increasingly rely on IT systems for safety-critical activities. It won't be hard to think up everyday examples - in fact it might be tough to focus on just a few!
  2. With a bit of prompting, managers should readily appreciate the information risks associated with safety- and business-critical IT systems, and would welcome pragmatic guidance on how to treat them;
  3. The professional audience includes the programmers and other IT specialists, business analysts, security architects, systems managers, testers and others at the sharp end, doing their best to prevent or at least minimize the adverse effects when (not if) things go wrong. By introducing the integration and operational aspects of complex IT systems in real-world situations, illustrated by examples drawn from James Somers' article and RISKS etc., we can hopefully get them thinking, researching and talking about this difficult subject, including ways to bring simplicity and order to the burgeoning chaos.

Well that's the outline plan, today anyway. No doubt the scope will evolve as we continue researching and then drafting the materials, but at least we have a rough goal in mind: another awareness topic to add to our bulging portfolio.

Sep 27, 2017

NBlog September 27 - compliance culture

A discussion thread on CISSPforum about the security consequences of (some) software developers taking the easy option by grabbing code snippets off the Web rather than figuring things out for themselves (making sure they are appropriate and, of course, secure) set me thinking about human nature. We're all prone to 'taking the easy option'. You could say humans, and in fact all animals, are inherently lazy. Given the choice, we are inclined to cut corners and do the least amount possible, making this the default approach in almost all circumstances. We'd rather conserve our energy for more important things such as feeding and procreating.

Yesterday, Deborah mentioned being parked at a junction in town near a one-way side road. In the few minutes she was there, she saw at least 3 cars disregard the no-entry signs, breaking the law rather than driving around the block to enter the side road from the proper direction. Sure they saved themselves a minute or so, but at what cost? Aside from the possibility of being fined, apparently there's a school just along the side road. It's not hard to imagine kids, teachers and parents rushing out of school in a bit of a hurry to get home, looking 'up the road' for oncoming vehicles and not bothering to look 'down the road' (yes, they take the easy option too).

The same issue occurs often in information security. 'Doing the right thing' involves people minimizing risks to protect information, but there's a cost. It takes additional time and effort, compared to corner-cutting. 

Recognizing that there is a right and a wrong way is a starting point - easy enough when there are bloody great "No entry" signs on the road, or with assorted warning messages, bleeps, popup alerts and so forth when the computer spots something risky such as a possible phishing message. Informing people about risks and rules is part of security awareness, but it's not enough. We also need to persuade them to act appropriately, making the effort that it takes not to cut the corner.

You may think this is a purely personal matter: some people are naturally compliant law-abiding citizens, others are naturally averse to rules (sometimes on principle!), with a large swathe in the middle who are ambiguous or inconsistent, some plain ignorant or careless. How they react depends partly on the particular circumstances, including their past experience in similar situations ... which hints at another aspect of security awareness, namely the educational value of describing situations, explaining the consequences of different courses of action, guiding people in how they should respond and ideally getting them to practice until 'doing the right thing' becomes the default.

However, there is also a cultural aspect to this: social groups vary in their compliance. Compare driving standards in, say, Sweden and Italy for a clear demonstration of cultural differences at a national level. In practice, traffic lights, signs, rules and laws are at best advisory (derisory, you might say!) in much of the Mediterranean.

In the information security context, such cultural distinctions can make a huge difference to the way we express and enforce the rules necessary to protect information. Management in compliant organizations can develop, publish and mandate security policies and procedures, knowing employees will respect them (most of the time anyway), whereas in noncompliant organizations that approach alone would be inadequate - barely even the first stage. Additional activities would be needed to both reinforce and enforce compliance. That's potentially a large hidden cost arising from noncompliance, especially if it applies equally to all sorts of rules: tax laws, bribery and corruption, driving, privacy, intellectual property rights and so on.

Having just made a case for a culture of compliance, I should say that compliance per se is not the ultimate goal. One could argue that safety - not compliance - is the true objective of road signs, speed limits etc. From that perspective, compliance is merely a way to achieve the objective. So long as most of the drivers in Rome play the same game and stay reasonably safe, compliance with the road laws is incidental. [Judging by the proportion of beaten-up cars on the roads, I don't think the collision avoidance and hence safety objective is being met either, but that's a subjective opinion based on my cultural background!].

Sep 25, 2017

NBlog September 24 - five-step bulletproofing?

In the course of searching for case study materials and quotations to illustrate October's awareness materials, I came across 5 ways to create a bulletproof security culture by Brian Stafford. Brian's 5 ways are, roughly: 

  1. Get Back to Basics - address human behaviors including errors. Fair enough. The NoticeBored InfoSec 101 awareness module we updated last month is precisely for a back-to-basics approach, including fundamental concepts, attitudes and behaviors.
  2. Reinvent the Org Chart - have the CISO report to the CEO. Brian doesn't explain why but it's pretty obvious, especially if you accept that the organization's culture is like a cloak that covers everyone, and strong leadership is the primary way of influencing it. The reporting relationship is only part of the issue though: proper governance is a bigger consideration, for example aligning the management of information risks and assets with that for other kinds of risk and asset. Also security metrics - a gaping hole in the governance of most organizations.
  3. Invest in Education - "Any company that seeks to have a strong security culture must not only offer robust trainings to all employees—including the c-suite—but also encourage professional development opportunities tailored to their unique focus areas." Awareness, training and education go hand-in-hand: they are complementary.
  4. Incentivize & Reward Wanted Behavior e.g. by career advancement options. Again, the InfoSec 101 module proposes a structured gold-silver-bronze approach to rewards and incentives, and I've discussed the idea here on the blog several times. Compliance reinforcement through rewards and encouragement is far more positive and motivational than the negative compliance enforcement approach through pressure, penalties and grief. Penalties may still be necessary but as a last resort than the default option.
  5. Apply the Right Technology - hmm, an important consideration, for sure, although I'm not sure what this has to do with security culture. I guess I would say that technical controls need to work in concert with non-tech controls, and the selection, operation, use and management of all kinds of control is itself largely a human activity. The fact that Brian included this as one of his 5 ways betrays the widespread bias towards technology and cybersecurity. I'd go so far as to call it myopic.
Personally, and despite our obvious efforts in this area, I'd be very reluctant to state or imply that an organization's security culture could ever be considered bulletproof, not even in the purely rhetorical sense. It's an important part of a bigger set of things, one that happens to be relevant to most of information risk, security, privacy, compliance, governance and so on, but culture, alone, won't deflect bullets: knowing that, and being ready and willing to handle the consequences of incidents, is itself characteristic of a robust security culture.

Sep 23, 2017

NBlog September 23 - security culture sit rep

October's awareness module is gradually taking shape. The management and professionals' seminar slide decks and notes are about 80% done. They're quite intense, earnest and rather dull though, so we need something inspiring to liven things up a bit. More thinking and digging around required yet.

Meanwhile, the staff/general materials are coming along too. The next 7 days will be busy, systematically writing, revising, aligning and polishing the content until it gleams and glints in the sun - talking of which, we set the clocks forward an hour tonight for summer time: it has been a long, wet NZ Winter this year.


Sep 22, 2017

NBlog September 22 - cultured security

Aside from concerning the attitudes and values shared within groups, or its use in microbiology (!), there's another meaning of 'culture' relating to being suave and sophisticated. 

In the information risk and security context, it's about both being and appearing professional, exuding competence and quality - and that can be quite important if you consider the alternative. 

Given the choice, would you be happy interacting and doing business with an organization that is, or appears to be, uncultured - crude, slapdash, unreliable etc.? Or would you be somewhat reluctant to trust them?

There are some obvious examples in the news headlines most weeks: any organization that suffers a major privacy breach, hack, ransomware or other incident comes across as a victim and arguably perhaps culpable for the situation. It's hardly a glowing endorsement of their information risk, security, privacy and compliance arrangements! Contrast their position against the majority of organizations, particularly the banks that exude trustworthiness. Corporate cultures, brands and reputations are bound strongly together.

The two meanings of 'culture' are linked in the sense that the overall impression an organization portrays is the combination of many individual factors or elements. Through marketing, advertising and promotions, public relations, social media etc., management naturally strives to present a polished, impressive, business-like, trustworthy external corporate image, but has limited control over all the day-to-day goings on. Myriad interactions between workers and the outside world are largely independent, driven by the individuals, individually, and by the corporate culture as a whole.

Management may try to control the latter, espousing 'corporate values' through motivational speeches and posters, but in most organizations it's like herding cats or plaiting fog. Much like managing change, managing the corporate culture is a tough challenge in practice. Realistically, the best management can hope for is to influence things in the right direction, perhaps rounding-off the sharpest corners and presenting a more consistently positive front.  

I'm talking here about the organization's integrity, one of the three central information properties alongside confidentiality and availability. Protecting, enhancing and exploiting the organization's culture is a core issue for information security, one that includes but extends well beyond the very limited domain of cybersecurity.

That in turn makes 'security culture' a valuable topic for the security awareness program, and makes the program a valuable part of running the business. Through NoticeBored, the awareness materials and activities are not just meant to inform and influence individuals one-by-one, but to mold the overall corporate culture in a more generalized way. We're not just addressing 'users', computer systems, networks and apps. A NoticeBored awareness program deliberately envelops everyone in all parts and at all levels of the organization. 

The awareness stream aimed at management will be particularly important in October's module. Our intention is to convince managers that:
  1. Although they may never have considered it before, the corporate security culture really matters to the organization - it's very much a business issue;
  2. While culture is largely an emergent property of dynamic social groups and interactions, it can be influenced, if not actually controlled, through sustained and deliberate actions - it's a strategic business issue;
  3. The security awareness program is a viable and valuable mechanism to influence the corporate security culture;
  4. Managers themselves are part of the strategic approach e.g. not merely mandating staff compliance with security and privacy rules through directives, policies and procedures, but walking-the-talk, demonstrating their personal concerns and proactively supporting information risk, security, privacy, compliance etc. - in other words showing leadership.
[Get in touch soon to subscribe to NoticeBored and receive October's awareness materials.]

Sep 20, 2017

NBlog September 20 - Phishing awareness & cultural change


This plopped into my inbox last evening at about 8pm, when both ANZ customers and the ANZ fraud and security pros are mostly off-guard, relaxing at home. It's clearly a phishing attack, obvious for all sorts of reasons (e.g. the spelling and grammatical errors, the spurious justification and call to action, the non-ANZ hyperlink, oh and the fact that I don't have an ANZ account!) - obvious to me, anyway, and I hope obvious to ANZ customers, assuming they are sufficiently security-aware to spot the clues.

I guess the phishers are either hoping to trick victims into disclosing their ANZ credentials directly, or persuade them to reveal enough that they can trick the bank into accepting a change of the mobile phone number presumably being used for two-factor authentication, or for password resets.

Right now (8 am, 12 hours after the attack) I can't see this particular attack mentioned explicitly on the ANZ site, although there is some basic guidance on "hoax messages" with a few other phishing examples. The warnings and advice are not exactly prominent, however, so you need to go digging to find the information, which means you need to be alert and concerned enough in the first place, which implies a level of awareness - a classic chicken-and-egg situation. I presume ANZ has other security awareness materials, advisories and reminders for customers. If not, perhaps we can help!

Aside from the authentication and fraud angle, I'm interested in the cultural aspects. Down here in NZ, people generally seem to be quite honest and trusting: it's a charming feature of the friendly and welcoming Pacific culture that pervades our lives. Given its size and history, things may be different in Australia - I don't know. But I do know that phishing and other forms of fraud are problematic in NZ. The Pacific culture is changing, becoming more careful as a result of these and other scams, but very slowly. Increasing distrust and cynicism seems likely to knock the corners off the charm that I mentioned, with adverse implications for tourism and commerce - in other words cultural changes can create as well as solve problems. 

The same issue applies within organizations: pushing security awareness will lead (eventually, if sustained) to changes in the corporate culture, only some of which are beneficial. It's possible to be too security-conscious, too risk-averse, to the point that it interferes with business. October's awareness seminar and briefings for management will discuss a strategic approach aiming to settle the organization's security culture in the sweet spot somewhere between the two extremes, using suitable metrics to guide the process.

Sep 19, 2017

NBlog September 19 - what is 'security culture'?

For some while now, I've been contemplating what security culture actually means, in practice. 

Thinking back to the organizations in which I have worked, they have all had it to some extent (otherwise they probably wouldn't have employed someone like me!) but there were differences in the cultures. What were they?

Weaknesses in corporate security cultures are also evident in organizations that end up on the 6 o'clock news as a result of security and privacy incidents. In the extreme, the marked absence of a security culture implies more than just casual risk-taking. There's a reckless air to them with people (including management - in fact managers in particular) deliberately doing things they know they shouldn't, not just bending the rules and pushing the boundaries of acceptable behavior but, in some cases, breaking laws and regulations. That's an insecurity culture!

The strength of the security culture is a relative rather than absolute measure: it's a matter of degree. So, with my metrics hat on, what are the measurable characteristics? How would we go about measuring them? What are the scales? What's important to the organization in this domain?

A notable feature of organizations with relatively strong security cultures is that information security is an endemic part of the business - neither ignored nor treated as something special, an optional extra tacked onto the side (suggesting that 'information risk and security integration' might be one of those measurable characteristics). When IT systems and business processes are changed, for instance, the information risk, security and related aspects are naturally taken into account almost without being pushed by management. On a broader front, there's a general expectation that things will be done properly. By default, workers generally act in the organization's best interests, doing the right thing normally without even being asked. Information security is integral to the organization's approach, alongside other considerations and approaches such as quality, efficiency, ethics, compliance and ... well ... maturity.

Maturity hints at a journey, a sequence of stages that organizations go through as their security culture emerges and grows stronger. That's what October's NoticeBored security awareness content will be addressing, promoting good practices in this area. Today I'll be exploring and expanding on the maturity approach, drawing conceptual diagrams and thinking about the governance elements. What would it take to assemble a framework facilitating, supporting and strengthening the corporate security culture? What are the building blocks, the foundations underpinning it? What does the blueprint look like? Who is the architect?

Where does one even start? 

I've raised lots of rhetorical questions today. Come back tomorrow to find out if we're making progress towards answering any of them! 

Sep 15, 2017

NBlog September 15 - symbolic security


An article bemoaning the lack of an iconic image for the field of “risk management” (e.g. the insurance industry) applies to information risk and security as well. We don’t really have one either. 

Well maybe we do: there are padlocks, chains and keys, hackers in hoodies and those Anonymous facemasks a-plenty (a minute's image-Googling easily demonstrates that). Trouble is that the common images tend to emphasize threats and controls, constraints and costs. All very negative. A big downer.

Information risk and security may never be soft and cuddly ... but I'm sure we can do more to distance ourselves from the usual negative imagery and perceptions. I really like the idea of information security being an enabler, allowing the organization to do stuff (business!) that would otherwise be too risky. So I'll be spending idle moments at the weekend thinking how to sum that concept up in an iconic image. Preferably something pink and fluffy, with no threatening overtones.



Sep 13, 2017

NBlog September 13 - surveying the corporate security culture

Inspired perhaps by yesterday's blog about the Security Culture Framework, today we have been busy on a security culture survey, metrics being the first stage of the SCF. We've designed a disarmingly straightforward single-sided form posing just a few simple but carefully-crafted questions around the corporate security culture. 

Despite its apparent simplicity, the survey form is quite complex with several distinct but related purposes or objectives:

  • Although the form is being prepared as an MS Word document with the intention of being self-completed on paper by respondents (primarily general staff), the form could just as easily be used for an online survey on the corporate intranet, a survey app, or a facilitated survey (like shoppers being stopped in the shopping mall by friendly people with clipboards ... and free product samples to give away).
  • The survey form is of course part of our security awareness product, linking-in with and supporting the other awareness content in October's module on 'security culture', and more broadly with the ongoing awareness program.  The style and format of the form should be instantly familiar to anyone who has seen our awareness materials. 
  • A short introduction on the form succinctly explains what 'security culture' means and why it is of concern and value to the organization, hence why the survey is being carried out. I'm intrigued by the idea of positioning the entire organization as a ‘safe pair of hands’ that protects and looks after information: a reasonable objective given the effort involved in influencing the corporate security culture. Even the survey form is intended to raise awareness, in this case making the subtle point that management cares enough about the topic to survey workers' security-related perceptions and behaviors including their attitudes towards management. 
  • Conducting the survey naturally implies that management will consider and act appropriately on the results. We take that implied obligation seriously, and will have more to say about it in the module's train-the-trainer guide. The survey is more than just a paper exercise or an awareness item: respondents will have perfectly reasonable expectations merely as a result of participating.
  • The survey questions themselves are designed to gather measurable responses i.e. data on a few key criteria or aspects of 'security culture'. We have more work to do on the questions, and even when we're done we hope our customers will adapt them to suit their specific needs (e.g. if there is an organization-wide issue around compliance, it might be worth exploring attitudes and perceptions in that area to tease out possible reasons for that). For starters, though, the questions are extremely simple - at face value, very quick and easy to read and answer - and yet given sufficient responses, the survey is a powerful, statistically valid and meaningful metric measuring a complex, multi-faceted and dynamic social construct (see the sketch after this list). No mean feat that!
  • It would be feasible to develop further forms to survey populations other than 'general employees'. I'm thinking particularly of management and perhaps third parties: how does the corporate security culture appear from their perspectives? What concerns them? Are there issues that deserve concerted action? We may not have the time to prepare forms for October's NoticeBored module ... but we might pose that suggestion to our subscribers, again in the train-the-trainer guide.
  • Beneath each of the questions are spaces for respondents to comment, plus we encourage respondents to make their views known either on the reverse or (to maintain their anonymity) on a separate sheet, web page or email. We take the interactive approach quite deliberately and routinely because there's a lot of value to be gained by getting workers to open up a little and mention things that concern or interest them, from their perspectives and in their terms. In the particular context of the survey, we want to give respondents the opportunity to explain, expand or elaborate on the numeric responses if they feel the need. It's surprising just how powerful and insightful quotes direct from the horse's mouth can be. Pithy quotations make excellent content to illustrate and pep-up management reports and further awareness materials.
  • Mentioning 'free product samples' and 'sufficient responses' suggests the possibility of offering some sort of inducement for people to complete the survey, other than the opportunity to express their opinions and hopefully influence management. I have previously mentioned the gold-silver-bronze 'award menu' included in the Information Security 101 module: bronze level rewards would be ideal for this purpose. [Provided the anonymity aspect is addressed, a more attractive silver or gold award could be offered in, say, a prize draw: given the potential business value of the information generated by a well-designed survey, that's not a bad investment.]
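
As a taste of how the numeric responses might be crunched once they're in, here's a hypothetical sketch in Python - the questions and scores are invented for illustration, not taken from our actual survey form - turning 1-to-5 ratings into a per-question culture score with a rough confidence interval:

  import math
  import statistics as stats

  # Hypothetical 1-5 Likert ratings collected from respondents per question.
  responses = {
      "Management takes information security seriously": [4, 5, 3, 4, 4, 2, 5],
      "I know how to report a security incident": [3, 2, 4, 3, 3, 3, 2],
      "Security rules help rather than hinder my work": [2, 3, 3, 2, 4, 3, 3],
  }

  for question, scores in responses.items():
      mean = stats.mean(scores)
      # Rough 95% confidence interval; reasonable once n is large enough.
      ci95 = 1.96 * stats.stdev(scores) / math.sqrt(len(scores))
      print(f"{question}: {mean:.2f} +/- {ci95:.2f} (n={len(scores)})")

Tracked survey by survey, even a crude score like this makes a slow cultural shift visible, which is precisely why we hint at measuring on an ongoing basis.
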
So there we go. All we have to show for a whole day's work is a single page survey form (oh, and this blog piece!), illustrating once again the key point I made in relation to the elevator pitch for InfoSec 101: the shortest, pithiest awareness pieces are often the hardest to prepare. Less really is more!

Sep 12, 2017

NBlog September 12 - Security Culture Framework


In preparing for our forthcoming awareness module on security culture, I've been re-reading and contemplating Kai Roer's Security Culture Framework (SCF) - a structured management approach with 4 phases.

1. Metrics: set goals and measure

Speaking as an advocate of security metrics, this sounds a good place to start - or at least it would be if SCF explored the goals in some depth first, rather than leaping directly into SMART metrics: there's not much point evaluating or designing possible metrics until you know what needs to be measured. In this context, understanding the organization's strategic objectives would be a useful setting-off point. SCF talks about 'result goals' (are there any other kind?) and 'learning outcomes' (which implies that learning is a goal - but why? What is the value or purpose of learning?): what about business objectives for safely exploiting and protecting valuable information?

SCF seems to have sidestepped more fundamental issues. What is the organization trying to achieve? How would what we are thinking of doing support or enable achievement of those organizational objectives? Security awareness, and information security as a whole, is not in itself a goal but a means to an end. I would start there: what is or are the ends? What is information security awareness meant to achieve? 

Having discussed that issue many times before, I'm not going to elaborate further here today, except to say that if the Goals are clear, the Questions arising are fairly obvious, which in turn makes it straightforward to come up with a whole bunch of possible Metrics (the GQM method). From there, SMART is not such a smart way to filter out the few metrics with a positive value to the organization, whereas the PRAGMATIC metametrics method was expressly designed for the purpose.
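
To make the GQM chain concrete, here's a minimal sketch in Python; the example goal, questions and candidate metrics are my own illustrative assumptions, not SCF content:

  # Goal-Question-Metric: derive candidate metrics from questions, and
  # questions from a stated goal, rather than picking metrics first.
  gqm = {
      "goal": "Workers routinely report suspected phishing messages",
      "questions": {
          "Are suspect messages actually being reported?": [
              "Phishing reports received per month",
              "Reports as a proportion of simulated phish delivered",
          ],
          "How quickly are they reported?": [
              "Median time from message delivery to first report",
          ],
      },
  }

  print(f"Goal: {gqm['goal']}")
  for question, metrics in gqm["questions"].items():
      print(f"  Q: {question}")
      for metric in metrics:
          print(f"     candidate metric: {metric}")

Each candidate metric would then be scored and filtered - PRAGMATIC rather than SMART, in my view - before anything actually gets measured.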

SCF further muddies the waters by mentioning a conventional Lewin-style approach to change management (figure out where you are, identify where you want to be, then systematically close the gap) plus Deming's Plan-Do-Check-Act approach to quality assurance. I'm not entirely convinced these are helpful in setting goals and identifying measures. I would have preferred SCF to elaborate on the process of analyzing the organization's core business, teasing out the 'hooks' in the business strategies on which to hang information security and hence security awareness. Those are powerful drivers, not least because only a fool would seriously resist or interfere with something that explicitly supports or enables strategic business objectives - a career-limiting move, to be sure!

2. Organization: involve the right people

Involving the right people makes sense for any activity including the previous step in SCF - in other words, the right people need to be involved in defining and clarifying the organization's objectives, which means these two activities overlap. Despite the numbering, they are not entirely sequential. The right people must be actively engaged in setting goals initially, and in deciding who else needs to be involved.

Sequencing issues aside, the second module of SCF discusses ways to identify 'the right people' for two distinct purposes: (1) those who will run the 'security culture program' (whatever that is! It is undefined at this stage); and (2) the target audience for security awareness (again, part of the vague 'security culture program').  

I fully support the idea of identifying awareness audiences, which is why NoticeBored delivers three parallel streams of content aimed at workers in general, managers and professionals. While we don't subdivide those audiences ourselves, we recommend that the security awareness professionals to whom the materials are delivered subdivide them further - it's standard advice in the train-the-trainer guide in virtually every awareness module to identify who has an interest in the monthly topic, and work with them to customize, communicate, inform and persuade. In many cases that comes down to business departments or functions, and sometimes individual people (e.g. the Privacy Officer clearly needs to be actively engaged in privacy awareness, along with the Legal/Compliance function - or their equivalents, since titles, responsibilities and interests vary).
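As a toy illustration of that kind of segmentation, here's a sketch in Python - the role names and classification rules are entirely hypothetical, not how NoticeBored or its customers actually work:

    # Toy audience segmentation: assign each worker to one of the three
    # parallel awareness streams, then flag topic-specific stakeholders.
    # The role names and rules are invented for illustration only.
    def stream_for(role: str) -> str:
        r = role.lower()
        if "manager" in r or "director" in r or "officer" in r:
            return "management"
        if "security" in r or "legal" in r or "compliance" in r:
            return "professional"
        return "general"

    PRIVACY_STAKEHOLDERS = {"Privacy Officer", "Legal/Compliance"}

    for person in ("Privacy Officer", "Sales Manager", "Help Desk Analyst"):
        note = " (privacy stakeholder)" if person in PRIVACY_STAKEHOLDERS else ""
        print(person + ": " + stream_for(person) + " stream" + note)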

SCF picks out executives, HR and Marketing as obvious examples of groups you would probably want to involve, and fair enough ... although I can think of many more (such as the two mentioned above). In fact it's hard to think of any part of the organization that could safely be excluded, given that information flows throughout the entire organization like a nervous system.

SCF mentions the idea of nominating ambassadors or champions, hinting at the process we call 'socializing information risk and security'. It also mentions the need for regular communications of tailored messages - good stuff.

3. Topics: choose activities

The advice here is to "Build culture that works by choosing relevant topics and activities". I'm confused by 'culture that works' but, in practice, determining the security awareness and training topics is the focus of this module, and that's quite straightforward. There's sound advice here:
"One thing to note about topics is that it is highly unlikely, and usually not something you would want, to cover all topics in one year. Long-term results are created by carefully crafting a plan to build the security culture you want over the course of several years."  
True, for two reasons: (1) given a broad perspective on information risk and security, there are lots of topics to cover, hence a lot of information to impart; and (2) cultural changes are inevitably slow. People need time to receive and internalize information, and change their ways. They need gentle encouragement and support, motivation and, in some cases, enforcement of the security rules.
"Some topics are relevant at different stages of an employee lifecycle. One example is introducing new employees to policies and regulations when they begin working. Another is during relocation, when it may make sense to train the employee in local security routines."
The need to include information risk and security in induction or orientation training is obvious - no problem there. Relocation, though, is not a strong example: in 'employee lifecycle' terms, what about internal moves and promotions, and eventually leaving the organization? Those are almost universal activities that do indeed have information risk and security implications that the awareness program might usefully cover. Hmmm, perhaps we should put that idea into practice in the NoticeBored awareness materials. We already cover some aspects (such as periodically reviewing and adjusting workers' information access rights).
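In code-sketch terms, the lifecycle idea might look something like this - a hypothetical outline of lifecycle-triggered awareness topics, my invention rather than anything NoticeBored currently ships:

    from enum import Enum

    class LifecycleEvent(Enum):
        JOINING = "induction or orientation"
        INTERNAL_MOVE = "internal move or promotion"
        RELOCATION = "relocation to another site"
        LEAVING = "resignation, retirement or dismissal"

    # Hypothetical mapping of lifecycle events to the awareness topics worth
    # raising at that point - illustrative only.
    AWARENESS_TRIGGERS = {
        LifecycleEvent.JOINING: ["policies and regulations", "acceptable use"],
        LifecycleEvent.INTERNAL_MOVE: ["reviewing and adjusting information access rights"],
        LifecycleEvent.RELOCATION: ["local security routines"],
        LifecycleEvent.LEAVING: ["returning assets", "revoking access",
                                 "ongoing confidentiality obligations"],
    }

    print(AWARENESS_TRIGGERS[LifecycleEvent.INTERNAL_MOVE])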

Some of the advice in SCF has become lost in translation e.g.:
"To map down topics that builds up under goal and matches an organizational map is one method to get a good overview. The easiest one is those who targets the whole organization and builds up under the overall goals in the goal hierarchy. Those who only target segments of the organization demands mostly more work."
Que?

SCF mentions a few forms or styles of awareness and training - mostly training, in fact, with an emphasis on computer-based methods.

4. Planner: plan and execute

SCF's advice in this area is straightforward and conventional, quite basic though helpful for someone just getting into security awareness for the first time, or at least the first time in a structured, planned way. 

Aside from defining goals, audiences and topics, and establishing metrics, there's little discussion of project or program management as a whole, including:
  • Risk management: what are the risks to your awareness program? What could go wrong, and what should you be doing to mitigate those risks? And what about opportunities - can you take advantage of business/organizational situations, or for that matter novel information risk and security situations such as the recent ransomware outbreaks and the forthcoming changes in privacy as a result of GDPR?
  • Resource management: recruiting, training and developing the awareness team, plus the extended team taking in those awareness ambassadors mentioned earlier.
  • Change management: it's ironic that change is noted earlier in SCF, but not in the sense of managing changes to the awareness program itself - aspects such as changes in management support and perceptions, personnel changes, changes of focus and approach as old ways lose their impact and new ideas emerge, growing maturity, and changes prompted by the security metrics.
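To illustrate the first of those, a bare-bones risk-and-opportunity register for the awareness program itself might start out like this - the entries are invented examples, not drawn from SCF:

    from dataclasses import dataclass

    @dataclass
    class RegisterEntry:
        description: str
        kind: str      # "risk" or "opportunity"
        response: str  # planned mitigation or exploitation

    # Invented example entries for an awareness-program register.
    register = [
        RegisterEntry("Loss of executive sponsorship", "risk",
                      "Report metrics and business value to management quarterly"),
        RegisterEntry("Awareness content goes stale", "risk",
                      "Refresh topics on a rolling schedule; retire tired formats"),
        RegisterEntry("Ransomware in the news", "opportunity",
                      "Run a topical briefing while attention is high"),
        RegisterEntry("GDPR deadline approaching", "opportunity",
                      "Piggyback privacy awareness on the compliance effort"),
    ]

    for entry in register:
        print("[" + entry.kind.upper() + "] " + entry.description
              + " -> " + entry.response)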

Conclusion

SCF has some good points, not least focusing attention on this important topic. The advice is fairly basic and not bad overall, although the sequencing and the references to other approaches are a bit muddled and confusing.

Of more concern are the omissions - important considerations conspicuously absent from the website's overview of SCF, e.g. business value, psychology, adult education, compliance, motivation and maturity. I'm disappointed to find so little discussion of security culture per se, given the name of the framework: it mostly concerns the mechanics of planning and organizing security awareness and training activities, barely touching on the before and after stages. Perhaps Kai's training courses go further.

That said, both the Security Culture Framework website and Kai's book "Build a Security Culture" are succinct, and patently I have been sufficiently stimulated to write this critique. I prefer Rebecca Herold's "Managing an Information Security and Privacy Awareness and Training Program" but you may feel differently. There's something to be said for getting to know both of them, plus other approaches too such as David Lacey's "Managing the Human Factor in Information Security" - another excellent book.