Welcome to NBlog, the NoticeBored blog

I may meander but I'm exploring, not lost

Mar 26, 2009

Pop Mechanics does infrastructure security

Popular Mechanics gives the US national infrastructure a once-over from the perspective of its resilience to cyberwarfare, asking "How Vulnerable is U.S. Infrastructure to a Major Cyber Attack? Could hackers take down key parts of our infrastructure? Experts say yes. They could use the very computer systems that keep America's infrastructure running to bring down key utilities and industries, from railroads to natural gas pipelines. How worried should we be about hacking, the new weapon of mass disruption?"

It starts with a pop culture doomsday scenario to grab the readers' attention: "The next world war might not start with a bang, but with a blackout. An enemy could send a few lines of code to control computers at key power plants, causing equipment to overheat and melt down, plunging sectors of the U.S. and Canadian grid into darkness. Trains could roll to a stop on their tracks, while airport landing lights wink out and the few traffic lights that remain active blink at random."

Referring to the "hodgepodge" of Industrial Control Systems controlling elements of the critical infrastructure such as power and water supplies, the author at one point claims that "a good rule of thumb is that any device that is computer-controlled and networked is vulnerable to hacking". That's true I guess, for undefined values of 'vulnerable'. But SCADA/ICS devices that are connected to wireless/microwave control links or use phone lines and modems are also vulnerable to hacking: are these 'networked' I wonder?

I would disagree with the author on one point. He says "Infrastructure is meant to last a long time, so upgrades to existing systems tend to occur at a glacial pace." The glacial pace is not because infrastructure is meant to last a long time, but because changing such complex, safety-critical systems in any way (even to implement security patches) creates additional risks that may outweigh the need to make the change. It's a risk management decision, of course, and a delicate one given that leaving the systems open to cyberwarfare attackers does not necessarily lead to cyberwarfare, whereas creating a power cut or safety incident is bound to hit the headlines.

The article covers the usual range of headline incidents and scare stories with a little expert commentary, and as such is fine as a general security awareness piece. There's nothing of much use here, though, for security or general management at critical infrastructure organizations.

Mar 24, 2009

Revised NIST security awareness/training standard

I've been reading and thinking today about a revised NIST Special Publication SP800-16, currently released for public comment. If you are genuinely interested in making security awareness more effective, I recommend setting aside an hour or three to read and consider the draft document.

To whet your appetite, here are just a few short paragraphs from one section of the draft, with my own thoughts and comments cited below.

Under section 2.2.1 of SP800-16, NIST says:
"Awareness is not training (1). Security awareness is a blended solution of activities (2) that promote security, establish accountability, and inform the workforce of security news (3). Awareness seeks to focus an individual’s attention on an issue or a set of issues (4). The purpose of awareness presentations is simply to focus attention on security (4). Awareness presentations are intended to allow individuals to recognize information security concerns and respond accordingly. (2)

In awareness activities the learner is a recipient of information, whereas the learner in a training environment has a more active role. (2) Awareness relies on reaching broad audiences with attractive packaging techniques. Training is more formal, having a goal of building knowledge and skills to facilitate job performance. (5)

A few examples of information security awareness materials/activities include:
• Events, such as an information security day,
• Briefings (program- or system-specific or issue-specific)
• Promotional/specialty trinkets with motivational slogans,
• A security reminder banner on computer screens, which comes up when a user logs on,
• Security awareness video tapes, and
• Posters or flyers. (6)

Effective information security awareness efforts must be designed with the recognition that people tend to practice a tuning-out process called acclimation. If a stimulus, originally an attention-getter, is used repeatedly, the learner will selectively ignore the stimulus. (6) Thus, awareness delivery must be on-going, creative, and motivational, with the objective of focusing the learner's attention so that the learning will be incorporated into conscious decision-making. This is called assimilation, a process whereby an individual incorporates new experiences into an existing behavior pattern. (3 & 5)

Learning achieved through a single awareness activity tends to be short-term, immediate, and specific. For example, if a learning objective is “to facilitate the increased use of effective password protection among employees,” an awareness activity might be the use of reminder stickers for computer keyboards. (7)

The fundamental value of information security awareness programs is that they set the stage for awareness training and role-based training by bringing about a change in attitudes which should begin to change the organizational culture. The cultural change sought (8) is the realization that information security is critical because a security failure has potentially adverse consequences for everyone. Therefore, information security is everyone’s job. (9)"

My comments:

(1) The terms "awareness", "training" and "education" are often used interchangeably and sometimes combined, as in "awareness training". However, they are different activities with different mechanisms and purposes. SP800-50 “Building an Information Technology Security Awareness and Training Program” covers this point rather eloquently, better in fact than SP800-16 and FISMA which tie themselves in knots over the terminology.

(2) If you can read past the much abused second word of "blended solution of activities", the real point is that awareness requires a range of separate but complementary activities - and by "activities" I mean things that involve physical actions by both the information givers and the information receivers. I am talking about proactive learning, not passive entertainment or "edutainment". The most important part of a training course is not the presentation slides or other materials, the presenter, the facility or the audience: it's the engagement, interest and interaction that happens when members of the audience become inspired to change what they do thereafter.

(3) Informing people, in other words providing relevant facts about information security risks and controls, is an important element of awareness, training and education but is not in itself sufficient, in most cases. Erudite but boring and dry factsheets have limited impact and can be counterproductive. News stories are just one way to bring information security to life, reminding people that we are not talking purely hypothetically about security incidents. They are really happening around us, and not just Out There in the news headlines but much closer to home, affecting us, our colleagues, friends and families, and of course our organization and society. Getting personal on information security matters is a good way to engage with people.

(4) Focus is important. Generic, bland "be more secure" messages are a total waste of brain cycles. People need to know what, specifically, they should be worried about and what they should do ... but first they need to open up in order to even receive the message. Making people "wake up and smell the coffee" is one option but is not the only way (I'll speak about other techniques another time). Focus, to me, includes getting straight to the point - being direct and avoiding unnecessary fluff or irrelevancies. It also includes picking on specific information security topics, providing more depth than is typical of those rushed security induction training classes.

(5) Building knowledge and skills to enhance job performance is all very well but has little value unless people actually use the knowledge and skills when they get back to work. Achieving this is the crux of effective awareness, training and educational activities. Unless people are taken beyond the point of being mere receptacles for facts and are motivated to behave more securely, the program is not going to earn its keep.

(6) Notice that "forcing employees to sit down en masse in a stuffy meeting room or lecture theatre while some boring IT geek or clueless manager spouts off about information security" does not feature in NIST's list of worthwhile activities, but is not far from the truth in some organizations! Awareness, training and education take creativity and passion. It's not that hard really. [For lots more ideas, things such as case studies with role plays, crosswords, competitions etc. see NoticeBored!]

(7) Taking focus to the extent of a single awareness activity covering just a single information security control might perhaps be necessary if that one control is conspicuously failing but seems unlikely to cover the full breadth of security controls that employees should understand and respect, in any reasonable timeframe. Coupling this point with comments about keeping the content interesting implies to me the need to run quite rapidly through a sequence of topics, moving ahead at or just before the point that eyelids start to droop. This idea of a rolling awareness program, in my experience, makes all the difference but there's one more little point to bear in mind. "Sequences" can be random or directed. A random assortment of information security topics may achieve the coverage desired but misses the opportunity to link together successive topics into a more coherent security story. Being smart about the sequence and scope of the topics leads to a more subtle form of the old teacher's saw "Tell them what you are going to tell them, tell them, then tell them what you told them". We can introduce future topics and refer back to previous topics, all while delivering the present topic. The interrelatedness of information security topics makes this quite easy to achieve with just a bit of thought and planning. The advantage is a level of coherence and reinforcement that random assortments don't achieve.

(8) Now there's a thought: we are seeking "cultural change" are we? Great idea, one I thoroughly endorse ... but unfortunately for many managers, security awareness is less about achieving cultural change than about "being seen to be Doing Something" or, even worse, "doing it for compliance reasons". Health and safety training finds itself in the same pickle. Effective H&S training has a lasting impact on what employees do as they go about their normal business activities, long after the ink has dried on the training evaluation forms. It's about putting on the ear muffs and safety goggles even when there's nobody else looking. It means taking a moment to deal with a trip hazard in a public thoroughfare even when you yourself have clearly spotted and avoided the hazard. Achieving cultural change to create a "culture of security" is a fabulous objective, one that's much easier to say than to do. For me, it goes somewhat beyond the rather simplistic if important ideas noted in section 2.2.1, picking up concepts such as:
  • Providing continuity - planning awareness activities over the long term (and I don't mean 'scheduling next year's security awareness session'!);
  • Addressing the entire organization (staff and managers), in fact the scope can usefully cover the extended organization including friends and relatives of employees, contractors/consultants, outsource suppliers, customers, suppliers, business partners, other stakeholders and, to some extent, society at large;
  • Using creativity to create interest and engage people with the program, and retaining that interest indefinitely;
  • Being sensitive to cultural norms, communications preferences and so forth for the audiences - notice the plural: it makes little sense to focus all the security awareness activities on one homogeneous audience when we know full well that business units, departments, teams and individuals vary markedly in many key respects. "Selling" copyright compliance to, say, an Indian or Chinese business unit is a rather different prospect to getting the same point across to a Scandinavian organization. For some people, the 3 minute high level overview is more than enough: for others, 3 minutes would not be nearly enough for the briefest of introductions;
  • Taking audience engagement to the extent of active audience participation, for example encouraging managers, IT professionals and employees to converse on the same information security topic, putting their respective points of view in the context of a shared understanding of the terms and concepts involved.
(9) If "information security is everyone's job", it ought to be in everyone's job descriptions - not a bad idea in itself but I feel there's a bit more to it. "Information security is everyone's responsibility" takes it a step further since it is not purely a job-related thing, and hints at a vital security concept, that of ownership, accountability and responsibility. "Information security is what we do" might be a bit excessive, but I prefer the word "we" in there since it is clearly a shared responsibility. [Arguing about the specific meaning and nuance of every word smacks of the crazy process of developing corporate mission statements. However, the discussion is at least as valuable as the product, if not more so, rather like planning and plans. Discussing such security principles leads to a common understanding and is a good way to engage senior managers with the awareness program.]

Right, that's section 2.2.1 duly considered. I'll stop there for now, leaving consideration of the remaining 156 pages as an exercise for you, dear reader - homework if you will. NIST welcomes comments on the draft SP800-16 until June 26th 2009 by email to 800-16comments@nist.gov.

How to fix SCADA security [not]

In "A cautionary tale about nuclear change management" ComputerWorld blogger Scott McPherson discusses a few security incidents that have been linked to SCADA systems, picking out two causes: poor change management and problems with the IT architectures. If only things were so simple in Real Life.

According to Scott, the change management problem can be solved by adequate pre-release testing of patches. Mmm. OK, well let's assume a SCADA-using organization has the resources to invest in an IT test jig comprehensive enough to model the live SCADA/ICS systems, complete with real-time data feed simulators and control panels, or at least a sufficient part of the complete live system to allow representative and realistic testing. Presumably they could test the patches and software upgrades thoroughly enough to reduce the possibility of unintended consequences, but how far can or indeed should they go? Anyone who has actually tried to do exhaustive software testing, even in a very simple laboratory setting, knows that it is literally impossible to test everything in practice. With the best will in the world, the fanciest test jig that money can buy and the most competent, skilled and diligent professional testers on the job, there is always a residual risk at the declared end of testing. In real life, the end of testing is almost always declared by management well before the testers are truly happy, not least because the issues and risks that the planned software changes are supposed to fix inevitably persist at least until the fix is applied, so there are clearly competing pressures. Damned if we do, damned if we don't.
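To put rough numbers on why exhaustive testing is a practical impossibility, consider the input space of even a trivially small interface. The figures below are illustrative back-of-envelope assumptions, not anything from Scott's article:

```python
# Back-of-envelope: size of an input space vs. achievable test throughput.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_test(num_inputs: int, bits_per_input: int, tests_per_second: float) -> float:
    """Years needed to enumerate every combination of the given inputs."""
    combinations = 2 ** (bits_per_input * num_inputs)
    return combinations / tests_per_second / SECONDS_PER_YEAR

# Just three 32-bit inputs, at a wildly optimistic million test cases per second:
print(f"{years_to_test(3, 32, 1e6):.3e} years")
```

Three 32-bit parameters already give 2^96 combinations - millions of times the age of the universe at any plausible test rate - and a real SCADA system has vastly more state than that. Hence testing is always about sampling the risk, never eliminating it.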

OK, I'm certainly not arguing that pre-release software testing is a waste of time on SCADA or any other IT systems, far from it. But the reality is that no matter how much testing and fixing is done, the eventual decision to implement implicitly if not explicitly accepts the residual risk. In my experience, the operational, safety and commercial risks associated with system failures on SCADA systems are so significant that the opposite situation is more of a problem, namely that SCADA systems are not patched at all, or at least not promptly, due to the extreme risk aversion. Legacy systems are the norm not the exception in SCADA/ICS-land. In the case of safety-relevant and certified systems, plus the highly specialized bespoke systems typical of controllers for complex machinery (such as, oh er, a nuclear power station), the inertia problem is even worse.

Scott's second point about IT architectural issues also seems rather glib to me. "The fact that some utilities -- including nuclear utilities -- are stupid enough to attach the servers that control and manage SCADA systems to the same Internet that runs porn and Nigerian scams and MySpace is ludicrous. It is also dangerous." That statement seriously denigrates the highly competent IT and business managers in the utilities, manufacturing and engineering companies where I have worked. Such people are far from stupid. As I said already, they are highly risk averse and do not take such decisions lightly. But again there are competing priorities. The Internet is a convenient, cheap way to access SCADA/ICS systems, networks, devices etc. for remote diagnostics and support purposes, for example, and often glues together critical business processes throughout the supply chain. Connecting the SCADA/ICS network to any other network (even the internal corporate LAN) is clearly fraught with danger so security is always a concern.
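The usual mitigation, short of a full air gap, is strict segmentation: a default-deny policy at the SCADA/corporate boundary that permits only explicitly named hosts and services, such as a vendor's remote-diagnostics connection. Here is a minimal sketch of the logic such a boundary rule set embodies - every address, port and rule entry is invented for the example:

```python
# Illustrative default-deny boundary policy for a SCADA zone.
# All addresses, ports and entries below are made up for the example.
ALLOWED = {
    ("10.0.0.5", 502),   # engineering workstation -> Modbus/TCP
    ("10.0.0.9", 22),    # vendor jump host -> SSH for remote diagnostics
}

def permit(src_ip: str, dst_port: int) -> bool:
    """Default deny: only explicitly listed (source, port) pairs get through."""
    return (src_ip, dst_port) in ALLOWED

print(permit("10.0.0.5", 502))    # a listed pair is allowed
print(permit("203.0.113.7", 80))  # anything else is dropped
```

The point of default deny is that every permitted flow is a documented, conscious risk-acceptance decision rather than an accident of connectivity - exactly the risk-managed posture the managers Scott derides are actually taking.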

The main beef I have with you, Scott, is that you have over-simplified the problems and provided trivial solutions, as if simply saying these things will make a difference. Calling the people who are actually dealing with the risks "stupid" is hardly going to make friends and influence people.

Mar 19, 2009

SCADA stories of 2008

SCADA security specialists Digital Bond run an annual summary of the top SCADA security stories of the year before. Here are their lists for 2008, 2007 and 2006.

In 2007, the story about successfully hacking and taking control of an electricity generating plant was hot news, along with NERC's moves to improve information security for the US electricity industry. In 2008, the US water industry seems to have followed NERC's lead with their own security roadmap.

Worming the Internet

Unprecedented collaboration between ICANN, antivirus vendors, other malware security professionals and domain name registrars in the US, China and elsewhere is seeking to neutralize the Conficker/Downadup worm. The worm's authors evidently intended the worm to download payloads from any of a long list of domains, so the security community has been busily registering or regaining control of those domains to prevent them being abused.
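The defensive tactic works because the worm's candidate domains are computable in advance by anyone who has reverse-engineered the algorithm: the list is typically derived from the current date, so defenders can generate and register tomorrow's names today. A deliberately simplified sketch of the idea - this is not Conficker's actual algorithm:

```python
import hashlib
from datetime import date

# Deliberately simplified date-seeded domain generator (not Conficker's real DGA).
# Attacker and defender run the same function and get the same list for any day.
def candidate_domains(day: date, count: int = 5, tld: str = ".example") -> list:
    domains = []
    for i in range(count):
        seed = f"{day.isoformat()}-{i}".encode()
        digest = hashlib.sha256(seed).hexdigest()
        domains.append(digest[:10] + tld)  # first 10 hex chars as the label
    return domains

# Defenders compute the list ahead of time and register the names first.
print(candidate_domains(date(2009, 4, 1)))
```

Because the output is fully determined by the date, whoever registers the names first wins - which is precisely the race the registrars and security community are running against the worm's authors.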

Microsoft has offered $250k for information leading to the arrest and prosecution of those behind Conficker/Downadup, a sign that Internet security issues are bad for all Internet users, not least the big businesses that depend on it.

Meanwhile, a third variant of the worm has been detected with a trigger date of April 1st. This could be big.

Mar 4, 2009

Scared of SCADA?


Our latest product is a brand new security awareness module on SCADA, ICS, DCS and related acronyms - essentially industrial process control systems. I suspect few employees outside of IT will have heard of SCADA and hardly any will have considered the security requirements associated with keeping the lights on, both literally (SCADA systems are heavily used by the electricity generators and grid) and figuratively (modern factories are packed with all manner of computerized industrial machinery). For those who work not in manufacturing industry but in ordinary offices, we point out that elevators and other facilities are typically managed by a Building Management System, itself a form of SCADA. For those who don't even work in an office, the Engine Management System in their car is another example.

In addition to the potential for unplanned production outages and disruption to critical infrastructures, the health and safety plus environmental protection aspects make SCADA security impacts potentially horrific. Simply being obscure is no defence against some hackers and, potentially, their terrorist masters. Governments and managers at major utilities are worried about SCADA security risks, so all in all this is an important awareness topic.