Welcome to the SecAware blog

I spy with my beady eye ...

31 Jul 2010

Raising awareness of industrial espionage

We often read about security incidents involving personal information in the newspapers or online.  Multi-million dollar credit card and social security number exposures grab the headlines and consume many column inches.  There are even websites dedicated to totting-up the sordid numbers.  There are laws and regulations to protect personal data, and most of us accept that our privacy is inherently worth protecting, no question.

When it comes to protecting confidential proprietary information belonging to corporations, however, the situation is less clear.  Someone taking, say, their former employer’s customer list to a new job may be ‘frowned upon’ but evidently this practice is often tolerated and is probably fairly common in practice.  Indeed professional résumés boast of prior work experiences and major projects, with the implication that proprietary knowledge and expertise gained on prior assignments is effectively for sale to the highest bidder.  

News stories involving industrial espionage are few and far between.  Why is that?  It’s conceivable that there are not many incidents, but it seems far more likely that most simply don’t see the light of day – in other words, they are kept under wraps or quietly hushed-up, or perhaps they are just not identified as such.  As with personal data breaches, organizations are understandably reluctant to admit their security failures and discuss the vulnerabilities that were exploited, knowing that they reflect badly upon them and detract from their brands.  Possibly some fear that revealing incidents risks disclosing yet more of the proprietary information in question, or encouraging further attacks.  Without the legal pressures that force disclosure of many privacy breaches, organizations are within their rights to say nothing, and evidently this is the most favored option in practice.

Our latest NoticeBored security awareness module explains the value of the information assets at risk and the myriad ways in which they may be threatened, and calmly describes the corresponding security controls.   We use diagrams, mind maps, photos, news cuttings and motivational writing to encourage people (specifically staff, managers and IT professionals) to take this seriously and change the way they behave.  Please contact us for more information about the NoticeBored awareness subscription service.  And hurry up before your competitors steal your trade secrets thanks to unaware employees.

26 Jul 2010

Book review: Managing the Human Factor in Information Security

David Lacey’s book concerns the influence of people in protecting information assets and is excellent value.  

It covers a surprisingly wide range of topics relating to the human aspects of information security, mostly from management and operational perspectives.  The book has depth too, while remaining generally pragmatic in style.

I highly recommend the book for all information security professionals, particularly CISOs and Information Security Managers who are not entirely comfortable with the social elements of information security, and for information security MSc students who want to boost their understanding in this area.  The book is particularly valuable also for information security awareness and training professionals who necessarily deal with human factors on a daily basis, and need to understand how best to work with and influence their organizational cultures.  

Read our book review for more.

Regards, Gary

21 Jul 2010

Business continuity during the holiday period

An email from Garrison Continuity pointed me to a neat 2-page Adobe PDF file with tips to ensure that business continuity arrangements won't falter as many employees will soon be on holiday.

Truth is, the holiday period thing is just a timely prompt to ensure the arrangements are sound: the plans should be checked and exercised periodically throughout the year.  It's one of the regular activities for the Business Continuity Manager, providing additional assurance that the plans will function properly whenever a major incident strikes.


16 Jul 2010

Human factors conspicuously absent

A new 'how to' piece on eHow.com titled Information Security Awareness & Training is curiously deficient.  I'm puzzled that someone who presumably feels they have expertise in the subject would write such a piece that refers almost exclusively to technical IT security controls.  There is no significant mention of human factors, nor any pragmatic help on how to plan, organize, develop, deliver, measure and maintain an infosec awareness & training program.  It's so bad, I hardly know where to start criticizing it.

Blog comments are welcome but, unlike eHow.com, there's no need to register your details or commit to writing more content.

Regards, Gary

12 Jul 2010

Rejuvenating a security awareness program

Regardless of whether your security awareness program is barely off the ground or has been running for a while, we all come up against barriers from time to time.  It can be very dispiriting for those of us tasked with “doing awareness”, leading to a drop in our morale and energy.  But fear not, brave awareness person!  With a bit of creative or lateral thinking, there are all sorts of things you can do to bring your program back on track.  Here are six ways to tackle those barriers.

1.  Hit the barrier head-on
This is exactly what we normally do.  We ‘try harder’ and ‘have another go’.  Sometimes it works but occasionally, when we’ve hit our heads against the barrier and bruised our ego once too often, we realize it is no longer working and something has to change.  This is the trigger to take stock of the situation and plan something different – whether subtly or radically different is up to you.

2.  Overwhelm the barrier
This involves more than simply ‘more of the same’.  Be prepared to experiment, trying different approaches in areas where things have not gone to plan previously.  Think back on what has gone well and what hasn’t (and review your feedback forms if you have been using them), and learn from your experiences.  For example, if your awareness seminars often seem to fall flat with poor attendance, try organizing “Q&A sessions” or “brown bag lunches” or whatever instead.  Consider inviting outside speakers or charismatic insiders to speak on security topics.

3.  Call in the cavalry
Did your CEO or Board of Directors originally support the proposal to invest in an awareness program?  Have they since quietly disappeared into the background?  Now’s the time to call on their proactive support!  Explain that you believe the program is flagging because they are not playing an active part in it, and suggest some practical ways in which they can help.  Appeals of this nature are best put face-to-face.  In conjunction with your line manager, see if you can schedule a short meeting with the CEO to explain what is going on and seek her assistance.  Prepare a shortlist of specific suggestions to answer the inevitable question, “What do you want me to do about it?”

4.  Undermine the barrier
Be realistic.  Are you overstretching yourself?  Maybe it’s too much work to keep up with the variety of topics you need to cover.  Perhaps you are spreading your effort too thinly.  Take another look at your awareness plan: are there topics you can combine or set aside for a while as you gather your strength?   Do you need help from other experts in internal communications, training or security?  Even if you cannot secure permanent resources, a heart-felt appeal to your colleagues (including your ‘awareness ambassadors’) may only secure you an hour or two of their time but that may be just enough to set you back on the road to success.  Students, contractors and consultants all have their place so don’t get fixated on permanent headcount.  Are there pieces of work that eat your time but could be packaged up for someone else to do (like, for example, NoticeBored)?

5.  Go around the barrier
Sometimes the problem stems from an individual person or function that always seems to be in the way.  Can you identify the blockage?  Have you tried to discover the reason they are blocking you?  Have you discussed things openly with them?  Have you tried reasoning and bartering (“If you’ll agree to promote the security awareness program, I’ll give you an hour a month to help with your intranet website”)?

6.  Take another route
Think parachute drops into enemy territory.  Try picking up on other awareness and communications activities in the organization and learning from them.  Is there maybe a safety or legal compliance program in place?  Are there opportunities to combine efforts on specific topics of common interest?  How about planning a joint seminar, or back-to-back seminars?  Have they got expertise and ideas you could capitalize on, and vice versa?

5 Jul 2010

Disastrous lack of policy?

A DarkReading article caught my attention today:
Demolition firm Ferma nearly failed because its employees lacked a proper security policy.

In mid-2009, an employee at the California firm clicked on a link in an e-mail message and ended up at a malicious website. The site, run by online thieves, used a vulnerability in Internet Explorer to load a Trojan horse on the employee's system. With control of the machine, which was used for much of the firm's accounting, the thieves gathered data on the firm and its finances. A few days later, the thieves used 27 transactions to transfer $447,000 from Ferma's accounts, distributing the money to accounts worldwide.

"They were able to ascertain how much they could draw, so they drew the limit," said Ferma president Roy Ferrari in an interview at the time.
It was that opening line that stood out for me.  Was this incident truly due to the lack of a "proper security policy", in fact?  If so, what would that "proper security policy" have said?

I would dispute the article's claim that:
For Ferma, a security policy that forbid surfing on computers used for accounting or resulted in stronger security for such computers would likely have stopped the attack cold.
No policy would have stopped the attack unless (a) employees fully complied with it, and (b) the controls it mandated were sufficiently strong to eliminate all the risks.  'Not surfing on computers used for accounting' would reduce but not eliminate the risk, and it would only provide that limited protection if in fact accounting users never surfed the Interweb on their normal PCs.  It would not have prevented incidents that involved other modes of attack, such as social engineering, network worms or Trojan-infected USB sticks. The incident might have been identified if not blocked by antivirus software, firewalls and network monitoring, and additional business controls over the authorization and release of large value transfers.  Anti-fraud and money laundering controls at the bank/s could have made the criminals' job harder too.

The truth is that information security almost invariably requires multiple overlapping or complementary controls.  To say that this incident was the result of a lack of policy is distinctly misleading.
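The defense-in-depth point can be illustrated with a back-of-the-envelope calculation.  This is my own toy sketch, not anything from the DarkReading article, and the failure probabilities are made-up numbers purely for illustration: if each control has some chance of failing to stop an attack, and (simplifying heavily) the controls fail independently, the chance that an attack defeats every layer is the product of the individual failure probabilities.

```python
def residual_risk(failure_probs):
    """Probability that an attack slips past every layered control,
    assuming the controls fail independently (a big simplification)."""
    risk = 1.0
    for p in failure_probs:
        risk *= p
    return risk

# A lone 'no surfing' policy that employees ignore 30% of the time:
print(round(residual_risk([0.30]), 4))

# Policy + antivirus + network monitoring + payment authorization
# (hypothetical failure rates of 30%, 20%, 50% and 10% respectively):
print(round(residual_risk([0.30, 0.20, 0.50, 0.10]), 4))
```

Even with individually imperfect controls, stacking several complementary layers drives the residual risk down by orders of magnitude, which is exactly why pinning the Ferma incident on a single missing policy misses the point.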

Comments welcome.  Gary.

1 Jul 2010

Applying the Cooper Color Code to information security

A throwaway comment in a convoluted machine-translated blog led me to a fascinating Wikipedia piece about Jeff Cooper, father of the "modern technique" of handgun shooting, in particular the concept of "condition white".  Condition white describes the state of mind of someone who is totally oblivious to a serious threat to their personal safety.  Cooper used it in relation to situations involving violent assault where the potential victims don't even appreciate that they are in danger and hence are not in the least bit alert to the signs of impending attack.  The attacker therefore has the element of surprise.

The Wikipedia piece describes four levels recognized by Cooper:

  • White - Unaware and unprepared. If attacked in Condition White, the only thing that may save you is the inadequacy or ineptitude of your attacker. When confronted by something nasty, your reaction will probably be "Oh my God! This can't be happening to me."
  • Yellow - Relaxed alert. No specific threat situation. Your mindset is that "today could be the day I may have to defend myself." You are simply aware that the world is a potentially unfriendly place and that you are prepared to defend yourself, if necessary. You use your eyes and ears, and realize that "I may have to SHOOT today." You don't have to be armed in this state, but if you are armed you should be in Condition Yellow. You should always be in Yellow whenever you are in unfamiliar surroundings or among people you don't know. You can remain in Yellow for long periods, as long as you are able to "Watch your six." (In aviation 12 o'clock refers to the direction in front of the aircraft's nose. Six o'clock is the blind spot behind the pilot.) In Yellow, you are "taking in" surrounding information in a relaxed but alert manner, like a continuous 360 degree radar sweep. As Cooper put it, "I might have to shoot."
  • Orange - Specific alert. Something is not quite right and has gotten your attention. Your radar has picked up a specific alert. You shift your primary focus to determine if there is a threat (but you do not drop your six). Your mindset shifts to "I may have to shoot HIM today," focusing on the specific target which has caused the escalation in alert status. In Condition Orange, you set a mental trigger: "If that goblin does 'x', I will need to stop him." Your pistol usually remains holstered in this state. Staying in Orange can be a bit of a mental strain, but you can stay in it for as long as you need to. If the threat proves to be nothing, you shift back to Condition Yellow.
  • Red - Condition Red is fight. Your mental trigger (established back in Condition Orange) has been tripped: "If 'X' happens, I will shoot that person."
It occurred to me that it might be illuminating to reinterpret Cooper's color code in the information security context:
  • White - Unaware and unprepared. If attacked in Condition White by, say, some malware, a social engineer or hacker, the only thing that may save you is the inadequacy or ineptitude of your attackers. When confronted by something nasty, your reaction will probably be "Oh my God! This can't be happening to me", a state psychologists call 'denial'.
  • Yellow - Relaxed alert. No specific threat situation.  You are simply aware that the virtual world is a potentially threatening place.  You use your eyes, ears and security software to look out for digital threats, realizing that "There are almost certainly threats out there." You should be in Condition Yellow whenever you are in unfamiliar surroundings, such as exploring different websites or handling email from people you don't know. You can remain in Yellow indefinitely, just as long as you are mentally able to stay sufficiently alert.  In Yellow, you are constantly "taking in" details about information flows and situations that unfold before you in a relaxed but alert manner, like a continuous 360 degree radar sweep. If you are too tired or distracted to keep up your guard, you should avoid risky behaviors, becoming more conservative in your online activities.  With practice, however, Yellow gradually becomes your default state of mind.
  • Orange - Specific alert. Something is not quite right and has gotten your attention. Your radar has picked up a specific information security risk. You adjust your primary focus, assessing the threat to determine what is going on, whether you are vulnerable, what might be the outcome if so and hence whether it is a genuine risk. Your mindset shifts to "Looks like I am being scammed and/or my information is being compromised," focusing on the specific target which has caused the escalation in alert status. In Condition Orange, you set a mental trigger: "If that goblin does 'x', I will definitely need to report a security incident and seek help." Staying in Orange requires some concentration but you can stay in it for as long as you really need to. If the threat comes to nothing, you shift down to Condition Yellow, though you may still call the Help Desk to report your suspicions.
  • Red - Condition Red is fight-or-flight. Your mental trigger (established back in Condition Orange) has been tripped. You and your information assets are definitely under attack, or have already been compromised.  You definitely need to call the Help Desk urgently to report the incident and take their advice on what to do about it.
Our security awareness materials aim to bring the whole organization up to Condition Yellow and maintain it at that minimum level, while at the same time giving people the ammunition (the knowledge, understanding and skills) to (a) appreciate when things might be turning Orange or Red, and (b) react appropriately if they do.
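For the programmers among you, the escalation ladder above can be caricatured as a tiny state machine.  This is purely my own construction (not Cooper's framework, nor anything in the NoticeBored materials), with hypothetical function and state names, but it captures the idea that observations move you up or down the ladder and that Red means reporting an incident:

```python
from enum import IntEnum

class Condition(IntEnum):
    WHITE = 0   # unaware and unprepared
    YELLOW = 1  # relaxed alert - the recommended baseline
    ORANGE = 2  # specific alert: something has your attention
    RED = 3     # under attack: report the incident

def assess(suspicious=False, confirmed_attack=False):
    """Return the next condition given what you just observed."""
    if confirmed_attack:
        return Condition.RED
    if suspicious:
        return Condition.ORANGE
    # Nothing specific going on: settle back to the Yellow baseline,
    # never all the way down to White.
    return Condition.YELLOW

state = assess(suspicious=True)        # an odd-looking email -> ORANGE
state = assess(confirmed_attack=True)  # malware confirmed -> RED
if state is Condition.RED:
    print("Call the Help Desk and report a security incident")
```

Note that the sketch never returns to White: once people are aware, Yellow is the floor, which is precisely the aim of an awareness program.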

As an information security professional, I find myself at Orange or Red most of the time yet, despite the occasional tinges of paranoia, it's a relatively happy place for me.  This phenomenon seems to set infosec pros apart from the crowd.  I guess hackers are also comfortable in Orange or Red, with the additional motivation of creating or exploiting vulnerabilities rather than just finding and fixing them. 

Oh and by the way, hackers have virtual enemies too.  There's no honor among thieves.


Social engineering contest sparks a reaction

News that DEFCON, a hacker conference, will include a Capture The Flag contest using social engineering techniques has sparked a fearful reaction from a US financial services industry regulator, warning their clients to be on their guard during the contest.

In fact, all organizations must be constantly on their guard against social engineering attacks, contest or no contest.  If the contest serves to raise awareness of the widespread, easily exploited vulnerabilities created by naive and inattentive people, then I am all in favor of it.  Good on yer!  There should be one every month!  A big one, with headline coverage in all the news media!  With special prizes for the organizations that successfully resisted the social engineering attacks for a specified period!

Social engineering is of course one of the central issues in this month's NoticeBored security awareness materials on human factors in information security.  With people attacking people, it's self evidently about the human factors.

The announced contest is very restrained, with pre-set rules that limit the target organizations, the nature of the attacks and the types of information to be exploited.  Anyone who believes criminal hackers using social engineering techniques outside of the artificial contest situation would respect such arcane rules is deluded.  That's the real take-away lesson from this contest and the furore that surrounds it: if a bunch of social engineers really threatens your corporate information assets under the strict rules of the contest, then oh boy are you vulnerable to unethical attackers.

To give them their due, the FFIEC does advise clients to run security awareness and training activities:
"Financial institutions need to educate users regarding their security roles and responsibilities.  Training should support security awareness and strengthen compliance with security policies, standards, and procedures.  Ultimately, the behavior and priorities of senior management heavily influence the level of employee awareness and policy compliance, so training and the commitment to security should start with senior management.  Training materials for desktop and workstation users would typically review the acceptable-use policy and include issues like desktop security, log-on requirements, password administration guidelines, etc.  Training should also address social engineering and the policies and procedures that protect against social engineering attacks.  Many institutions integrate a signed security awareness agreement along with periodic training and refresher courses."
I'm relieved that they don't actually say "annual awareness training courses" there at the end, but unfortunately I'm sure that's how many of their more naive clients will interpret the advice.  Annual courses are patently NOT the way to raise security awareness.  They have never worked as intended, as anyone who has either run them or been forced to attend will surely agree.  The change to ongoing/rolling security awareness programs makes all the difference.  So if "periodic" actually meant "continuous", I'd support the FFIEC advice.

What do you make of the social engineering contest?  Do you think it helps or hurts the cause for better information security?  Comments are very welcome.