Welcome to NBlog, the NoticeBored blog

I may meander but I'm 'exploring', not lost

Nov 17, 2018

NBlog Nov 17 - all quiet? TOO quiet?

Don’t just hoard your feedback and metrics: use them! Squeeze every last drop of value from them!

It is all too easy to downplay or dismiss comments and especially criticisms about the awareness program. Resist your natural defensive tendencies. Collate and take another, dispassionate look at your awareness metrics and the feedback you have received in recent months concerning information security and/or the awareness and training program. Try to identify common threads or themes that might have escaped your attention previously, or that seem to crop up repeatedly.
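Even a crude tally can surface those recurring themes. Here's a minimal sketch - the comments and the keyword list are entirely hypothetical, stand-ins for your own collated feedback:

```python
from collections import Counter

# Hypothetical feedback comments and theme keywords - substitute your own
comments = [
    "The phishing session was great but too short",
    "Too many emails, the newsletter is too long",
    "More phishing examples please",
    "Newsletter is hard to read on mobile",
]
themes = ["phishing", "newsletter", "email", "mobile"]

counts = Counter()
for comment in comments:
    lowered = comment.lower()
    for theme in themes:
        if theme in lowered:
            counts[theme] += 1

# Most frequently mentioned themes first
for theme, n in counts.most_common():
    print(f"{theme}: {n}")
```

A spreadsheet pivot table does the same job, of course; the point is to count mentions dispassionately rather than react to the loudest voice.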

This kind of review is best conducted as a team exercise, better still if you persuade some of your most vocal/persistent critics to get actively involved (invite them to your review meetings, give them the floor and listen hard to what they have to say!). SWOT analysis and brainstorming techniques can help tease out genuine concerns and novel ways to tackle them. For example, if your budget is a serious constraint on the awareness program, there may be free/cheap alternatives and more efficient and effective ways of using whatever you have. 

Metrics and verbatim comments from your audience demonstrating demand for and appreciation of your awareness and training activities should make your status reports more positive and budget requests more compelling.

If you aren't getting much in the way of feedback, don't rest on your laurels. Perhaps the awareness program is going extremely well, but are you really doing enough to encourage feedback, or are people too lazy or too intimidated to respond? Consider commissioning an independent third party to conduct an anonymous survey on your behalf, or at least set aside a few minutes every day to call or visit people to find out what they truly think. Write yourself a basic script if it helps e.g. start by asking questions about current or recent awareness topics and activities/events.

Nov 16, 2018

NBlog Nov 16 - trust awareness

Among other findings, PwC's "The Journey to Digital Trust" report picks up on inadequate attention to awareness and training:
"Many businesses could do more to raise employee awareness and accountability around cybersecurity and privacy. Only 34% of respondents say their company has an employee security awareness training program. Only 31% say their company requires employee training on privacy policy and practices."
Less than a third of companies require training on their privacy policies and procedures? Wow! The other two thirds presumably expect their people to 'just know' this stuff. Perhaps it gets into their heads through osmosis, Vulcan mind melds or magic crystals. Perhaps management is over-reliant on the general news media and public awareness activities, forgetting that we are all awash in a vast ocean of information. Picking out the Stuff That Matters is getting harder and harder by the second.

Is it any surprise, then, that privacy breaches and other information incidents occur so often? I suspect a good proportion of the organizations that do provide privacy awareness have suffered already - they've learnt the hard way, whereas the rest of us can and should learn from their mistakes.

It's hardly rocket surgery: if workers are expected to do stuff and not do other stuff in order to secure information, maintain privacy and satisfy all the other requirements to minimize information risks and meet compliance obligations, surely they need to know what's expected of them. Just as kids need to be told and shown, repeatedly, what's right and what's wrong, adults need instruction and guidance in this area.

PwC offers the following 'Actionable advice for business leaders':
  • "Prioritize raising workforce awareness about cybersecurity and privacy to support business objectives. Use messaging that avoids invoking security fatigue and is memorable enough to influence behavior when busy employees later face phishing schemes and other sophisticated threats.
  • Establish corporate policies governing access to IT assets and data. Enforce the policies at all levels of the company to drive accountability for cybersecurity and privacy."
Well said, PwC! I agree with emphasizing business objectives, although they might also have mentioned personal, team and social objectives: information security and privacy are not just important for our organizations. Protecting the interests of customers, for instance, by adequately protecting their personal information is not purely a business matter. Influencing employee behavior is an important goal ... and I might add that influencing decisions (especially management decisions made by business leaders) is one of the most powerful changes that an effective awareness and training program can achieve.

PwC's mention of policies and accountability smacks of the compliance-driven culture which is particularly strong in America and increasing elsewhere in the world - GDPR being a topical example. Noncompliance with the privacy regulations can seriously damage the bottom line and be career-limiting for those held to account for their failures, including management's bad decisions I just mentioned. It's a governance matter. Duck and cover is not a viable response.

Nov 14, 2018

NBlog Nov 14 - lack of control =/= vulnerability

A common misunderstanding among infosec professionals is that vulnerabilities include the lack or inadequacy of various infosec controls e.g. 'the lack of security awareness training'.

No     No!    NO!

Vulnerabilities are the inherent weaknesses that may be exposed and exploited by the threats, leading to impacts. 

In the lack-of-awareness example, people's naivete and ignorance are inherent weaknesses that may be exposed in various situations (e.g. when someone receives a phishing email) and exploited by threats (the phishers in this case i.e. fraudsters using social engineering techniques to mislead or misdirect victims into clicking dubious links etc.) leading to various impacts (malware infection, identity fraud, blackmail or whatever), hence risk. Naivete and ignorance are vulnerabilities. There are others too, including human tendencies such as greed and situations that distract us from important points, such as security warnings from our email and browser software, or that little voice in our head whispering "Too good to be true!".

Vulnerabilities exist with or without the controls. Sure, well-designed and implemented controls mostly reduce vulnerabilities but the lack of a control is not itself a vulnerability. It's a lack of control, something fundamentally different. 

Effective infosec awareness and training compensate for and reduce the naivete and ignorance, in part, and give people the skills and motivation to spot and deal appropriately with threats to information, such as phishing. The control is imperfect, though - we know that - hence the risk is not totally eliminated, merely reduced ('mitigated' in the lingo). The limitations are two-fold: (1) those inherent issues run deep, and (2) the threats are constantly morphing.
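The conceptual distinction can be expressed as a toy model. In this sketch (illustrative only - the class names, the single likelihood number and the multiplicative mitigation are all my simplifying assumptions, not a real risk-quantification method), the vulnerability exists in the model whether or not any controls do; controls merely reduce the residual risk:

```python
from dataclasses import dataclass, field

@dataclass
class Vulnerability:
    """An inherent weakness, e.g. people's naivete about phishing."""
    name: str

@dataclass
class Threat:
    """Something that may exploit a vulnerability, e.g. phishers."""
    name: str

@dataclass
class Control:
    """A mitigation, e.g. awareness training. Imperfect by nature."""
    name: str
    effectiveness: float  # 0.0 (useless) .. 1.0 (perfect - unattainable!)

@dataclass
class Risk:
    threat: Threat
    vulnerability: Vulnerability   # present with or without controls
    likelihood: float              # inherent likelihood, before controls
    controls: list = field(default_factory=list)

    def residual_likelihood(self) -> float:
        # Each control reduces the risk; none eliminates it, and
        # removing all controls leaves the vulnerability untouched.
        l = self.likelihood
        for c in self.controls:
            l *= (1.0 - c.effectiveness)
        return l

phishing = Risk(Threat("phishers"), Vulnerability("naivete"), likelihood=0.8)
phishing.controls.append(Control("awareness training", effectiveness=0.6))
print(round(phishing.residual_likelihood(), 2))  # reduced, not eliminated
```

Note that an empty `controls` list changes the residual risk, not the `vulnerability` field: lack of control is modelled as exactly that, a lack of control.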

I've blogged about this before and was reminded of it yet again today when checking out some 'infosec threat catalogs' on the Web. There are some potentially useful generic infosec threat lists out there but most also list non-vulnerabilities such as lack of awareness, catching my beady eye and distracting me. Those hijack my attention and wind me up, to the point that I refuse to recommend the associated threat catalogs even if those bits are sound. I won't propagate the misconception that lack of control is vulnerability.

Yes, I'm vulnerable too. I'm human. Allegedly. My button is hot.

To complicate matters further, controls can contain or be associated with vulnerabilities. Controls sometimes fail to work as designed. They break or are broken, get bypassed, misconfigured or turned off, or are simply overwhelmed - a genuine concern for phishing given the sheer number and growing variety of attacks. Nevertheless, I maintain that control weaknesses are not vulnerabilities. They are conceptually distinct.

Weak or missing controls result from inherent weaknesses or flaws in our information security practices, which are vulnerabilities. Misunderstanding "vulnerability" is both a vulnerability and a threat, at which point I'm going to leave this top a-spinning as I stagger back to my morning coffee.

Nov 13, 2018

NBlog Nov 13 - what to ask in a security gap assessment (reprise)

Today on the ISO27k Forum, a newly-appointed Information Security Officer asked us for "a suitable set of questions ... to conduct security reviews internally to departments".

I pointed him at "What to ask in a gap assessment" ... and made the point that if I were him, I wouldn't actually start with ISO/IEC 27002's security controls as he implied. I'd start two steps back from there:
  1. One step back from the information security controls are the information risks. The controls help address the risks by avoiding, reducing or limiting the number and severity of incidents affecting or involving information: but what information needs to be protected, and against what kinds of incident? Without knowing that, I don't see how you can decide which controls are or are not appropriate, nor evaluate the controls in place.
  2. Two steps back takes us to the organizational or business context for information and the associated risks. Contrast, say, a commercial airline company against a government department: some of their information is used for similar purposes (i.e. general business administration and employee comms) but some is quite different (e.g. the airline is heavily reliant on customer and engineering information that few government departments would use, if at all). Risks and controls for the latter would obviously differ ... but less obviously there are probably differences even in the former - different business priorities and concerns, different vulnerabilities and threats. The risks, and hence the controls needed, depend on the situation.
I recommend several parallel activities for a new info sec pro, ISO, ISM or CISO – a stack of homework to get started:
  • First, I find it helps to start any new role deliberately and consciously “on receive” i.e. actively listening for the first few weeks at least, making contacts with your colleagues and sources and finding out what matters to them. Try not to comment or criticize or commit to anything much at this stage, although that makes it an interesting challenge to get people to open up! Keep rough notes as things fall into place. Mind-mapping may help here.
  • Explore the information risks of most obvious concern to your business. Examples:
    • A manufacturing company typically cares most about its manufacturing/factory production processes, systems and data, plus its critical supplies and customers;
    • A services company typically cares most about customer service, plus privacy;
    • A government department typically cares most about ‘not embarrassing the minister’ i.e. compliance with laws, regs and internal policies & procedures;
    • A healthcare company typically cares most about privacy, integrity and availability of patient/client data;
    • Any company cares about strategy, finance, internal comms, HR, supply chains and so on – general business information – as well as compliance with laws, regs and contracts imposed on it - but which ones, specifically, and to what extent?;
    • Any [sensible!] company in a highly competitive field of business cares intensely about protecting its business information from competitors, and most commercial organizations actively gather, assess and exploit information on or from competitors, suppliers, partners and customers, plus industry regulators, owners and authorities;
    • Not-for-profit organizations care about their core missions, of course, plus finances and people and more (they are business-like, albeit often run on a shoestring);
    • A mature organization is likely to have structured and stable processes and systems (which may or may not be secure!) whereas a new greenfield or immature organization is likely to be more fluid, less regimented (and probably insecure!);
  • Keep an eye out for improvement opportunities - a polite way of saying there are information risks of concern, plus ways to increase efficiency and effectiveness – but don’t just assume that you need to fix all the security issues instantly: it’s more a matter of first figuring out you and your organization’s priorities. Being information risk-aligned suits the structured ISO27k approach. It doesn’t hurt to mention them to the relevant people and chat about them, but be clear that you are ‘just exploring options’ not ‘making plans’ at this stage: watch their reactions and body language closely and think on;
  • Consider the broader historical and organizational context, as well as the specifics. For instance:
    • How did things end up the way they are today? What most influenced or determined things? Are there any stand-out issues or incidents, or current and future challenges, that come up often and resonate with people?
    • Where are things headed? Is there an appetite to ‘sort this mess out’ or conversely a reluctance or intense fear of doing anything that might rock the boat? Are there particular drivers or imperatives or opportunities, such as business changes or compliance obligations? Are there any ongoing initiatives that do, could or should have an infosec element to them?
    • Is the organization generally resilient and strong, or fragile and weak? Look for examples of each, comparing and contrasting. A SWOT or PEST analysis generally works for me. This has a bearing on the safe or reckless acceptance of information and other risks;
    • Is information risk and security an alien concept, something best left to the grunts deep within IT, or a broad business issue? Is it an imposed imperative or a business opportunity, a budget black hole (cost centre) or an investment (profit centre)? Does it support and enable the business, or constrain and prevent it?
    • Notice the power and status of managers, departments and functions. Who are the movers and shakers? Who are the blockers and naysayers? Who are the best-connected, the most influential, the bright stars? Who is getting stuff done, and who isn’t? Why is that?
    • How would you characterize and describe the corporate culture? What are its features, its high and low points? What elements or aspects of that might you exploit to further your objectives? What needs to change, and why? (How will come later!)
  • Dig out and study any available risk, security and audit reports, metrics, reviews, consultancy engagements, post-incident reports, strategies, plans (departmental and projects/initiatives), budget requests, project outlines, corporate and departmental mission statements etc. There are lots of data here and plenty of clues that you should find useful in building up a picture of What Needs To Be Done. Competent business continuity planning, for example, is also business-risk-aligned, hence you can’t go far wrong by emphasizing information risks to the identified critical business activities. At the very least, obtaining and discussing the documentation is an excellent excuse to work your way systematically around the business, meeting knowledgeable and influential people, learning and absorbing info like a dry sponge.
  • Build your team. It may seem like you’re a team of 1 but most organizations have other professionals or people with an interest in information risk and security etc. What about IT, HR, legal/compliance, sales & marketing, production/operations, research & development etc.? Risk Management, Business Continuity Management, Privacy and IT Audit pro’s generally share many of your/our objectives, at least there is substantial overlap (they have other priorities too). Look out for opportunities to help each other (give and take). Watch out also for things, people, departments, phrases or whatever to avoid, at least for now.
  • Meanwhile, depending partly on your background, it may help to read up on the ISO27k and other infosec standards plus your corporate strategies, policies, procedures etc., not just infosec. Consider attending an ISO27k lead implementer and/or lead auditor training course, CISM or similar.  There’s also the ISO27k FAQ, ISO27k Toolkit and other info from ISO27001security.com, plus the ISO27k Forum archive (worth searching for guidance on specific issues, or browsing for general advice).  If you are to become the organization’s centre of excellence for information risk and security matters, it’s important that you are well connected externally, a knowledgeable expert in the field. ISSA, InfraGard, ISACA and other such bodies, plus infosec seminars, conferences and social media groups are all potentially useful resources, or a massive waste of time: your call. 
Yes, I know, I know, that’s a ton of work, and I appreciate that it’s not quite what was asked for i.e. questions to ask departments about their infosec controls. My suggestion, though, is to tackle this at a different level: the security controls in place today are less important than the security controls that the organization needs now and tomorrow. Understanding the information risks is key to figuring out the latter.

As a relative newcomer, doing your homework and building the bigger picture will give you an interesting and potentially valuable insight into the organization, not just on the information risk and security stuff … which helps when it comes to proposing and discussing strategies, projects, changes, budgets etc. How you go about doing that is just as important as what it is that you are proposing to do. In some organizations, significant changes happen only by verbal discussion and consensus among a core/clique (possibly just one all-powerful person), whereas in some others nothing gets done without the proper paperwork, in triplicate, signed by all the right people in the correct colours of ink! The nature, significance and rapidity of change all vary, as do the mechanisms or methods.

So, in summary, there's rather more to do than assess the security controls against 27002. 

PS  For the more cynical among us, there’s always the classic three envelope approach.

Nov 7, 2018

NBlog Nov 7 - risk awareness (more)

The controls suggested in Annex A of 27001 and the other ISO27k standards are typical, commonplace, conventional, good practice … whatever. Mature organizations often use them and find them useful. They have evolved into being over decades of experience with IT and millennia of experience with the use of information in a business context, and they are still evolving today. Cloud, BYOD and IoT, for example, are all relatively new hence the associated risks are still emerging and the controls are a work in progress. Fraud, espionage and hacking are always going to remain challenging because of the ongoing arms-race between defenders and attackers: as fast as the controls are improved, the threats change.

The published ISO27k standards present a fraction of the accumulated knowledge of thousands of ISO/IEC JTC1/SC27 committee members and helpers around the world with experience in myriad organizations and situations. Most committee members accept the advice is valid, useful and worthwhile, on the whole. The standards development process reaches consensus among the committee, or as close as we can get with the occasional stalemate, truce, abstention or objection!

The standards are generic – deliberately so since they are meant to be applicable to any and all organizations. They need to be interpreted and applied sensibly according to the specific context of each organization. Key to doing that in the ISO27k way is first for the organization to figure out its information risks, consider and evaluate them, then use the advice in the standards (and/or elsewhere) as guidance on how those risks might be treated.

Sounds straightforward in theory, but in practice? 

Any individual organization, manager, or information risk and security professional, may not have experienced all the issues that led to the controls being included in the standards - in other words, some of the information risks have not eventuated for them. Some may have occurred but not been recognized as such (e.g. the risk of losing valuable intellectual property when knowledge workers leave the organization may not be apparent, at least not for some time). Therefore, those risks may not feature at all on their risk landscape, or may be downplayed and perhaps lost among the weeds.

Hopefully, though, the exercise of reviewing the controls outlined in the standards leads to the corresponding risks being considered, although this is far from guaranteed, especially if those using the standards are inexperienced in the field. I would prefer the ISO27k standards themselves to be risk-driven in the same way as the ISO27k approach, explaining what information risks are addressed by the standards and, ideally, each of the controls within.

Failing that, we routinely document the information risks associated with each of the security awareness topics in our portfolio for the same reason: helping our customers' awareness audiences understand the purposes or objectives of the suggested controls in each area.

At the moment, the risks are integrated and discussed within various NoticeBored awareness materials - the presentations, briefings, newsletters etc. Maybe for 2019 we might produce a discrete deliverable for each module specifically on the risks. Hmmmm. That's a thought. I can already picture the format. Drafting the first 'Information risk profile' (or whatever we call it) will be the chance to generate a template to stabilize the format.

That's another thing for my lengthy to-do list. Talking of which, must dash ...

Nov 5, 2018

NBlog Nov 5 - end of year awareness and training review

As we plummet towards the end of another year, now is an opportunity to take a long hard look at your awareness and training program as a whole, thinking forward to next year and beyond. Here are some things to bear in mind.

Is the program pitched appropriately? Is your awareness and training approach polished in appearance? Does it look good? Is it professional? Is the branding and presentation up to scratch? Is it attracting sufficient interest and engagement? Is it reaching all the right people across the organization?

What about the delivery mechanisms and awareness activities: are you making good use of the available corporate communications and training facilities? Consider your Learning Management System, intranet, notice boards, seminar and training rooms, email circulations, newsletters, company magazines, courses, briefing sessions, lunchtime updates, security clubs and so on. By all means focus on the methods that achieve the most benefit for the least effort, but don't completely discount the others including novel approaches. Look around for additional opportunities. Remember, you have a diverse audience with differing personalities and preferences. A diverse comms approach takes more effort but increases the reach.

How well is your security awareness and training program working out, in fact? Is it well-respected and popular with punters? Is it adequately funded and proactively supported by management? 

Critically review relevant metrics such as awareness test results and attendance figures, and study evaluation feedback comments to see things from the perspectives of the awareness and training participants. Look at training records and skills profiles. Run an impromptu survey if you need more data.

As your experience and maturity grows, you will undoubtedly find ways to tweak and refine your awareness and training program, possibly making substantial improvements (such as subscribing to NoticeBored!). Talk to colleagues in HR, Health and Safety, Risk etc. about how their awareness and training programs and activities are doing. Share good ideas and novel approaches. Collaborate and work as a team to address common issues and collectively raise your game.

What about the awareness and training program management and governance arrangements: are there rough edges that need attention? Can the metrics and reporting be improved to deliver better value and efficiency (better outputs from less work!)? Do you have sufficient resources - not just budget but people, skills, sources, systems and so on? If you could wave a magic wand, what would you most like to do with additional resources?

Use all of this to review/update your strategy and plan your awareness and training program for 2019. Make notes on what you intend to:
  • Keep: the most effective bits, the approaches, activities etc. that are working well and delivering real business value;
  • Drop: the low-value, outdated stuff that no longer earns its keep, is unpopular and frankly not worth the effort any more;
  • Change: the things that need revision. Clarify the need or justification for change, elaborate on the anticipated improvements and (for the plan) at least outline how the changes are to be made;
  • Innovate: innovation helps keep the awareness and training program topical, engaging and relevant. As well as updating the content, updating the delivery mechanisms etc. can breathe new life into it.

Regarding innovation, for example, millennials just joining the payroll are likely to be more familiar with mobile devices and social media than the average worker, and being new they are obvious targets for awareness and training ... so ... how can you exploit their interests and technological mastery?  

We, too, are enthusiastically reviewing our services in preparation for the new year. No matter how good we are, we can always do better. That hunger for quality improvement is part of our passion for security awareness and training. We can't help it. We love this stuff!

Nov 1, 2018

NBlog Nov 1 - cloud computing security awareness module released

Cloud computing is a strong and still growing part of the IT industry. It’s a hit!
However, the relative novelty of cloud computing puts inexperienced or naive managers, staff and professionals at something of a disadvantage: lacking appreciation of the technology and the commercial/business context, the information risks and especially the security and other cloud-related controls aren’t exactly obvious.
Information security (in the broadest sense – not just IT or cybersecurity) is a major concern with cloud computing, a source of aggravation and costs for the unaware. The organization's professionals/specialists in areas such as IT, risk, compliance and business continuity should have a deeper understanding of the pros and cons of clouds but have you ever wondered how that level of knowledge is achieved?
Simply put, securing the anticipated business benefits of cloud computing involves addressing the information risks that are associated with it.  If the risks are simply ignored, the benefits may be reduced or destroyed by costly security incidents. 

Learning objectives

We have thoroughly updated/rewritten the awareness materials originally delivered back in 2014 - eons ago in Internet time! So what has changed since then? 
Peer through the fog to learn how to avoid the pitfalls and secure the business benefits of cloud computing, with NoticeBored. The refreshed module:
  • Introduces and outlines cloud computing, providing general context and background information (e.g. explaining why so many organizations are eagerly adopting it) with as little techno-babble as we can get away with;
  • Informs workers in general about the information risk and security issues and concerns relating to or arising from cloud computing (e.g. the organization’s partial loss of control over its information), plus the business benefits (e.g. reduced costs, greater resilience and flexibility, plus access to cloud specialists). We’re promoting a balanced view;
  • Encourages those considering, specifying, evaluating, contracting for, using or managing cloud computing to identify, analyze and address the information risks, typically through appropriate controls that secure the business benefits as much as the data;
  • Promotes information risk and security management as a business enabler, without which cloud computing would be unacceptably risky.
Review your organization’s use of cloud computing - the apps, dependent business processes, strategies, policies and incidents. Are there any cloud-related risks on the corporate radar? How well are they understood and treated? What’s missing? What stands out? Talk to the relevant experts about it. Flush any issues and ideas into the open, incorporating them where appropriate into your awareness delivery.

And talk to us about subscribing to turbo-charge your awareness program. 

Oct 27, 2018

NBlog Oct 27 - what is integrity?

‘Integrity’ is a fascinating property of information, multi-faceted, more complex and more widely applicable in information security than it might seem.

It involves aspects and issues such as:
  • Factual correctness of information (objectivity versus subjectivity, plus the huge grey area in between and issues arising such as impartiality and perspective);
  • Relevance of information to the matter/s at hand and the substantiality or weight of evidence (e.g. 'contemporaneous notes' recorded in the policeman’s pocket book at the time of an alleged offence may carry more weight in court than later, verbal or written accounts and recollections, but audio/video footage and other evidence captured at the scene with all the right controls in effect tends to be even stronger, even weightier);
  • Completeness of information (which also touches on context and scope issues, and practicalities in a legal setting: there isn't time to present, consider and take into account absolutely everything, so someone has to select the most valuable bits, introducing their judgement into the process); 
  • Timeliness and up-to-date-ness of information (not being too outdated or stale, being applicable to and valid within the specific context);
  • Impact of information (some things are inherently notable and more important than others, perhaps having shock value or otherwise eliciting strong emotional reactions ... which has implications on what information is provided, how it is expressed, to whom, when, in what manner, with what emphasis etc.);
  • Proof and provability (the ability to demonstrate, confidently and convincingly, that everything is in order, with sufficient strength to resist challenges, hence the importance of ‘chain of custody’, for instance, and all manner of physical and logical controls to prevent or at least detect tampering, substitution etc. in forensics);
  • Trust and trustworthiness, confidence, credibility etc. of the information, plus the associated activities, systems, storage, analytical methods, analysts and so on (goes hand-in-hand with proof and provability, includes aspects such as compliance with applicable rules concerning how evidence may be obtained or captured in the first place);
  • Presentation, discussion, interpretation and ultimately the perceived meaning and value of information (that part of information integrity around communicating things properly in a manner that leads to them being correctly understood: communication involves both sending and receiving, remember, plus other issues such as interception, duplication, interruption, modification, delays, mis-routing, redirection etc.);
  • Competence, capability, credibility and suitability of various witnesses, analysts and advisors, lawyers, judges etc. involved in cases (e.g. what does it really mean to be an “expert witness”? What are the criteria and obligations of that role? Who determines whether a judge is competent to judge, and how?) ... and similar issues in other contexts (e.g. in business, managers rely on sound, expert advice from competent professional specialists);
  • IT systems, communications and data integrity (e.g. cyclic redundancy checks, cryptographic methods such as digital signatures using hashing, database/referential integrity and more - the technological and mathematical basis for ICT), plus the whole area of digital or eForensics as opposed to the more traditional forms of forensics;
  • Fairness and equitability (e.g. treating similar crimes on a similar basis, and protecting the rights of the weak against the might of the strong – with the interesting consequence that even low-weight ‘circumstantial’ evidence may be valuable if there is nothing better and simply discounting it would be ‘unfair’);
  • Ethics, plus all manner of frauds and scams, social engineering, manipulation, deception and more (human integrity failures! This, arguably, makes integrity the ultimate challenge in politics).
I realise this is a brain dump ... but it's clear that there is a lot of stuff here, more than enough to fill a month's awareness module on 'integrity'. The same is true of 'confidentiality' and 'availability', two closely-related core concepts in information security. 
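
To make the IT-integrity bullet above a little more concrete, here's a minimal Python sketch of hash-based tamper detection (the 'log entries' are invented purely for illustration; real chain-of-custody tooling involves far more than this):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Return the SHA-256 digest of some data as a hex string."""
    return hashlib.sha256(data).hexdigest()

# Record a baseline digest when the evidence is first captured ...
original = b"server log entries, captured 17 Nov 2018"
baseline = sha256_of(original)

# ... then any later change, however small, alters the digest,
# making tampering or substitution detectable.
tampered = b"server log entries, captured 18 Nov 2018"
print(sha256_of(original) == baseline)  # True - the unmodified copy matches
print(sha256_of(tampered) == baseline)  # False - tampering detected
```

Digital signatures build on the same idea, signing the hash so the verifier can also check who vouched for the data.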

But should we go down this route at all or is it all too 'academic'?

I'm undecided at the moment. Even if we don't produce C, I and A awareness modules as such, we routinely cover C, I and A in the course of our other topics anyway since these are fundamental to all that we do. However, I find that long shopping list of things above intriguing: there's lots we could say in this area, and plenty of real-world examples we could use to illustrate and explain the topic pragmatically. It would be educational ... but would it be sufficiently interesting and motivational for the majority of our audience?

The list above was prompted by a question on the ISO27k Forum about integrity in forensics ... which suggests another awareness topic. I guess the endless stream of TV shows in this area has set the scene for us, and would provide an opportunity to poke fun at gross inaccuracies such as detectives wandering willy-nilly through crime scenes that are being, or have yet to be, forensically examined. Hmmm, "fun" is something that everyone enjoys, so an awareness module on forensics is a definite possibility. I guess I should start watching those CSI programs and taking notes.

Meanwhile, the jury's out.

Oct 25, 2018

NBlog Oct 25 - risk awareness

In a discussion thread on the ISO27k Forum about engaging corporate Risk Management functions in information security work, Nigel Landman mentioned that ‘Everything becomes a business risk’ ... which set me thinking.

Managing risks to the organization is a significant element of business management – in fact it is possible to express virtually everything about management in terms of managing risks and opportunities (upside risks). It's a very broadly-applicable and fundamental concept.

Given the importance and value of ‘information’ in any business, it’s hard to imagine any full-scope Risk Management function failing to be concerned about information risk and security. The exception would be a function that is, for some reason, limited to specific categories or types of risk (e.g. financial, strategic, compliance, competitive etc.) and hasn’t (yet!) made the connection with information risks in those areas … in which case exploring, explaining and elaborating on the information risk and security aspects in conjunction with the Risk Management function would seem to be a worthwhile activity early on in the ISO27k implementation.

The same goes for various other corporate functions that are currently disengaged, unaware or reluctant to get involved in information risk and security. The usual excuse is that “it's an IT thing”, a myth perpetuated by crudely labeling it “IT risk”, “IT security” or “cybersecurity”. Of course there are risks to or involving IT, but that’s just the tip of the iceberg of information risks, business risks, and risk in general. It's fine to focus in, but it makes little sense to attempt to manage individual categories or types of risk (including information risk, by the way) in isolation from the rest. You could even say that failing to manage information risks within the broader business context is itself a business risk - or an opportunity for improvement!

At a deeper psychological level, lack of understanding and fear of the unknown may well be factors behind the reluctance of some business people to engage with the ISO27k implementation, the Information Security Management System and information risk management. Some of the issues we are dealing with are complex and scary even for us, let alone those without a background and professional interest in the field. Couple that with our profession's almost obsessive focus on harmful, downside risks and it's easy to see why business managers might be reluctant to engage. We're making it easy for them to drop it in the "bad news" bin, leaving it to someone else. Hopefully. Fingers crossed.

I recommend making security awareness an integral part of the ISO27k implementation project as well as the ISMS. Specifically, I'm suggesting explaining information risk and security patiently to managers and other business people using business language and concepts. I gave an example here yesterday in the piece about preparing an elevator pitch on cloud security: rather than blabbering on about virtual systems and network security, we're emphasizing the business implications of cloud-related risks and opportunities. "Cloud services can be cost-effective and reliable, provided the associated risks are treated appropriately" may be just a single sentence, but it's one-tenth of the elevator pitch, a key point worth emphasizing.

Oct 24, 2018

NBlog Oct 24 - cloud security elevator pitch

Imagine that you bump into a senior manager - an executive, maybe the CEO or MD or someone else who sits at the helm of your organization - presenting you with a fleeting opportunity to communicate.

Imagine that you have concerns about the organization's approach to cloud computing - what it is doing or not doing, the way things are going, the strategies and priorities, objectives and resources, that sort of thing.

Now imagine how you might put across your concerns and interests in that moment that either just occurs (a chance meeting in the elevator, perhaps), or that you engineer in some way (maybe targeting and snaring your prey en route to or from the Executive Suite, or lunch).

What would you say?  I'm not asking 'what would you talk about' in a sweeping hand-waving cloudy sort of way but more precisely what are the few key points you want to express, and exactly how would you do that?  

The challenge is similar to writing an executive summary on a management report, or preparing the introduction and conclusion of a management presentation, essentially getting yourself in the zone to make the most of the brief opportunity. Less is more, so condensing or collapsing all the things you'd quite like to say down to those particulars that you need to say is a management skill that takes practice. It's almost triage: when the elevator doors open and your prey heads into the distance, what is or are the messages you most want to leave them with, above all else?

It's a challenge for us, too, to generate generic security awareness materials for exactly that kind of situation. What are the key issues for senior management relating to the monthly topic (i.e. cloud security for November's module)? What thoughts or impressions or action points are likely to be the most important for most if not all our clients? And how can we communicate those as efficiently and effectively as possible, as succinctly and yet poignantly as we can?

We have the luxury of time to contemplate and help prepare our clients for the possibility of that chance meeting. They have the benefit of the awareness materials as a whole, the research and thinking that goes into the NoticeBored awareness module as well as the 'elevator pitch' itself. In less than 150 words, we're encouraging them to get in the zone, prepared for whatever situation occurs - a form of contingency preparation really. We can help them get at least one step ahead of the game, ready, set and willing to seize the moment.

Oct 18, 2018

NBlog Oct 18 - intentions to actions

"Asking for a Friend: Evaluating Response Biases in Security User Studies" is a lengthy scientific research paper exploring consumer software update behavior. Authors Elissa M. Redmiles, Ziyun Zhu, Sean Kross, Dhruv Kuchhal, Tudor Dumitras, and Michelle L. Mazurek conclude, in part, that people don't in fact update their systems as promptly as they say they do, or should do.

The study is primarily concerned with the methods used to survey human behaviors. The authors acknowledge the extensive body of scientific research concerning survey methods and common biases. In respect of discrepancies between lab tests and real-world results, they acknowledge typical reasons such as: 
  • Sub-optimal study designs;
  • Inadequate survey population sampling;
  • Cognitive biases by respondents, including a reluctance to admit to socially unacceptable behavior; and 
  • Other issues with some approaches (e.g. online surveys).

They actively countered some of the biases in this study, for example by:
  • Carefully framing and wording each survey question and the responses (e.g. asking how respondents would advise a friend on speed of updates, in contrast to how they report their own update speeds);
  • Randomizing the sequence of some questions;
  • Comparing online against interview-based surveys. 

My interest is more pragmatic than academic: why is it that people don't update as promptly as they think they do, or should? Is there anything we might do to close that gap between intention and action? 

Awareness efforts (including ours!) typically emphasize the importance of rapid patching of vulnerable systems for security reasons ... but it would be helpful if our approach were even more motivational.

To be fair, it would also help if the process of patching systems was less arduous, disruptive and risky in its own right. Automating the new-version checks, patch downloading and installation reduces the effort but increases the risk, especially on today's relatively complex IT systems with numerous applications sharing, and sometimes conflicting over, the same resources. There's a lot to be said for the IoT-type approach, simplifying things (and things) through specialization. Why install a networked Windows or Linux PC to control an elevator when a dedicated and isolated control system can do the job with much less complexity and risk? 
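
As a trivial sketch of the 'new-version check' part of that automation (the version numbers here are hypothetical, and real schemes such as semantic versioning have extra wrinkles like pre-release tags):

```python
def parse_version(version: str) -> tuple:
    """Split a dotted version string into a tuple of integers for comparison."""
    return tuple(int(part) for part in version.split("."))

def update_available(installed: str, latest: str) -> bool:
    """True if the published version is newer than the installed one."""
    return parse_version(latest) > parse_version(installed)

# Hypothetical example: the installed client lags the published release.
print(update_available("2.4.1", "2.10.0"))  # True - tuple comparison: 10 > 4
print(update_available("3.0.0", "3.0.0"))   # False - nothing to do
```

Comparing tuples of integers avoids the classic string-comparison bug where "2.10.0" sorts before "2.4.1".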

And one more thing: if software was better specified, designed, developed and quality-assured in the first place, there would be less need for security patches at all! Dream on.

Oct 13, 2018

NBlog Oct 13/2 - CERT NZ goes phishing

CERT NZ (apparently) has once again circulated an email warning about phishing, containing a distinctly phishy link to "READ MORE INFORMATION". The hyperlink leads from there to certnz.cmail20.com with a tracker-type URL tail.

Unlike most of the intended audience, I guess, I'm cyber-smart enough to check out the whois record: the cmail20.com domain is registered to Campaign Monitor Pty Ltd of New South Wales - presumably a legitimate mass emailer/marketing company whose services are being used by CERT NZ to circulate the warnings - but that's not the point: the fact is that the embedded link target is patently not CERT NZ's own domain.
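
That sanity check is easy to automate; here's a rough Python sketch (the tracker path is made up, and real-world checks also need to handle lookalike domains, IDNs and redirects):

```python
from urllib.parse import urlparse

def link_matches_sender(link: str, expected_domain: str) -> bool:
    """True only if the link's host is the expected domain or a subdomain of it."""
    host = urlparse(link).hostname or ""
    return host == expected_domain or host.endswith("." + expected_domain)

# The embedded link resolves to a third-party tracking domain ...
print(link_matches_sender("https://certnz.cmail20.com/t/abc123", "cert.govt.nz"))  # False
# ... whereas a link to CERT NZ's own site would pass the check.
print(link_matches_sender("https://www.cert.govt.nz/guidance", "cert.govt.nz"))   # True
```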

What's more, the body of the email is a rather vaguely-worded warning, not entirely dissimilar to many a classic phishing message. "Nasty stuff is going to happen unless you do something" just about sums it up. It isn't even addressed to me by name, despite me being required to supply my name and email address when I signed up for CERT NZ's "updates". They know who I am.

I've notified CERT NZ about this kind of thing privately before, to no avail, so this time around I'm going public, here on the blog.

CERT NZ, you are perpetuating the problem. Wake up guys! It's simply not good enough. I expect more of you. Your sponsors, partners and taxpayers expect more of you. NZ expects more of you.

Is it really that difficult to either drop the marketing tracking, or at least to route clickers via cert.govt.nz first, with a redirect from there to the tracker?

Is there nobody in CERT NZ with sufficient clue to appreciate and respond to such an obvious concern? 

Am I wasting these bytes? Hello, CERT NZ! Anyone home?

Ironically, CERT NZ has allegedly been promoting the past five days as "Cyber Smart Week 2018", which as far as I can make out appears to consist of a single web page on CERT NZ's website expanding a little on these four simple tips:
  1. Use unique passwords
  2. Turn on 2FA
  3. Update your apps
  4. Check your privacy

Admirably brief ... but there's nothing explicit about phishing or business email compromise, nor social engineering, scams and frauds. No obvious links to further information. 

Ironically, again, the Cyber Smart page ends: 
"Report any cyber security issue you experience to CERT NZ. We’ll help you identify it and let you know what the next steps are to resolve it. We’ll also use the information to create advice and guidance for others who might be experiencing the same issue."
Been there, done that, got precisely nowhere. I despair.

Next time I receive a phishing-like email from CERT NZ, I'll take it up with the news media. Maybe they care as much as me.

NBlog Oct 13 - little boxes, little boxes

In preparation for a forthcoming NoticeBored security awareness and training module on business continuity, I'm re-reading The Power of Resilience by Yossi Sheffi (one of the top ten books I blogged about the other day). 

It's a fascinating, well-written and thought-provoking book. Yossi uses numerous case studies based on companies with relatively mature approaches to business continuity to illustrate how they are dealing with the practical issues that arise from today's complex and dynamic supply chains - or rather supply networks or meshes.

Risk assessment is of course an important part of business continuity management, for example:
  • Identifying weak, unreliable or vulnerable parts of the massive global 'system' needed to manufacture and supply, say, aircraft or PCs;
  • Determining what if anything can be done to strengthen or bolster them; and 
  • Putting in place the necessary arrangements (controls) to make the extended system as a whole more resilient.
Yossi covers the probability plus impact approach to risk analysis that I've described several times on this blog, with (on page 34) a version of the classic Probability Impact Graph:

[Figure: example Probability Impact Graph, from page 34 of The Power of Resilience]

The dotted lines divide the example PIG into quadrants forming the dreaded 2x2 matrix much overused by consultants and politicians. He discusses more involved versions including the 5x5 matrix used by 'a large beverage company' with numbers arbitrarily assigned to each axis - not the obvious 1,2,3,4,5 linear sequence but (for some barely credible reason) 1,3,7,15 and 31 along the impact axis and 1,2,4,7 and 11 for likelihood or probability, with the implication that they then multiply the values to generate their risk scores.

That appears straightforward but is in fact an inappropriate application of mathematics, since the numbers are not cardinal numbers or percentages denoting specific quantities but category labels (ordinals). The axes could just as well have been labeled green and red, or Freda and Fred: it makes no sense to multiply them together ... but that's exactly what happens, often.
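
A quick Python sketch shows how arbitrary the resulting scores are - the two hypothetical risks below swap rank depending purely on which numbers are chosen as the category labels:

```python
# Ordinal category labels for a 5x5 PIG; index 0 is the lowest category.
impact_scale     = [1, 3, 7, 15, 31]   # the beverage company's impact labels
likelihood_scale = [1, 2, 4, 7, 11]    # ... and its likelihood labels
linear_scale     = [1, 2, 3, 4, 5]     # the 'obvious' linear labelling

def score(impacts, likelihoods, impact_cat, likelihood_cat):
    """Multiply whatever numbers happen to label the two categories."""
    return impacts[impact_cat] * likelihoods[likelihood_cat]

# Two hypothetical risks, placed in fixed cells of the matrix:
risk_a = (4, 0)  # top impact category, lowest likelihood category
risk_b = (1, 3)  # second impact category, fourth likelihood category

# Same cells, different arbitrary labels - and the ranking flips:
print(score(impact_scale, likelihood_scale, *risk_a),
      score(impact_scale, likelihood_scale, *risk_b))  # 31 21 -> risk_a 'wins'
print(score(linear_scale, linear_scale, *risk_a),
      score(linear_scale, linear_scale, *risk_b))      # 5 8   -> risk_b 'wins'
```

The risks haven't moved; only the labels changed, yet the 'scores' tell a different story.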

Yossi's example PIG above demonstrates another problem with the approach: "Earthquake" is shown across the middle of the impact axis, spanning the Light and Severe categories. So which is it? If it must be in a box, which box?

The obvious response is either to shift "Earthquake" away from the boundary, arbitrarily, or to add another central category, dividing that axis into three ... which simply perpetuates the issue, since the PIG offers so few columns between which to draw clear lines. Likewise with the rows.

What's more, earthquakes vary from barely detectable up to totally devastating in impact, way more range than the PIG shows. Those barely-detectable quakes happen much more frequently than the devastating ones (fortunately!), hence a more accurate representation would be a long diagonal shape (a line? An oval? A banana? Some irregular fluffy cloud, maybe?) mostly sloping down from left to right, crossing two or three of the four quadrants and extending beyond the graph area to the left and right. A single risk score is inappropriate in this case, and in fact in almost all cases, since most risks show the same effect: more significant and damaging incidents typically occur less often than relatively minor ones. We can't accurately determine where they fall on the PIG since the boundaries are indistinct, and we seldom have reliable data, especially for infrequent incidents or those that often remain somewhat hidden, perhaps totally unrecognized as such (e.g. frauds).

As if that's not enough already, the whole situation is dynamic. The PIG is a snapshot representing our understanding at a single point in time ... but some of the risks may have materially changed since then, or could materially change in an instant. Others 'evolve' gradually, while some vary unpredictably over the time horizons typical in business. Some of them may be related or linked, perhaps even inter-dependent (e.g. "Computer virus", or more accurately "Malware", is one of many causes of "IT system failure", hence is it appropriate to show those as two distinct, separated risks on the PIG?).

The possibility of cascading failures is one of Yossi's core messages: it is not sufficient or appropriate to consider individual parts of a complex system in isolation - "the straw that broke the camel's back" or "the butterfly effect". A seemingly insignificant issue in some obscure part of a complex system may trigger a cascade that substantially magnifies the resulting impact. System-level thinking is required, a wholly different conceptual basis.

Given all the above complexity, and more, it makes sense (I think) to dispense with the categories and quadrants, the dodgy mathematics and the pretense at being objective or scientific, using the PIG instead as a tool for subjective analysis, discussion and hopefully agreement among people who understand and are affected by the issues at hand. An obvious yet very worthwhile purpose is to focus attention first and foremost on the "significant" risks towards the top right of the PIG, plus those across the diagonal from top left to bottom right, while downplaying (but not totally ignoring!) those towards the bottom left.

That's the reason the NoticeBored PIGs have no specific values on the axes, no little boxes, and a variety of sizes and shapes of text indicating the risks, overlaid on a background simplistically but highly effectively colored red-amber-green. We're not ignoring the complexities - far from it: we're consciously and deliberately simplifying things down to the point that experts and ordinary people (managers, mostly) can consider, discuss and decide stuff, especially those red and amber zone risks. Are they 'about right'? What have we missed here? Are there any linkages or common factors that we ought to consider? It's a pragmatic approach that works very well in practice, thank you, as both an awareness and a risk management tool.

I commend it to the house.