Welcome to the SecAware blog
I spy with my beady eye ...
30 Aug 2007
29 Aug 2007
Beware free l(a)unches
Skimming through my inbox and spam box today, I've seen a few phisher emails like the following example:

The emails vary slightly in the names of the "beta software" (e.g. Investment Developer, Cooking Helper, Home Reno Planner etc.) and of course the senders and subject lines vary.
They all seem to point to an executable file at a numeric IP address, which is most likely another Trojan dropper.
This looks to me like another generation of the STORM worm.
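For what it's worth, here is a minimal, purely illustrative Python sketch of the crude filter that pattern suggests: flag any link pointing to an executable hosted at a bare numeric IP address. The regular expression and sample message are my own invention, and this is no substitute for a proper mail gateway or antivirus scanner, but it shows how distinctive the pattern is.

import re

# Hypothetical heuristic: URLs that fetch an executable directly from a numeric IP address
SUSPICIOUS_LINK = re.compile(
    r'https?://\d{1,3}(?:\.\d{1,3}){3}(?::\d+)?/\S*\.(?:exe|scr|pif|bat|com)\b',
    re.IGNORECASE,
)

def dropper_links(message_text):
    """Return any links in the message text that match the dropper-style pattern."""
    return SUSPICIOUS_LINK.findall(message_text)

# Example (invented) message body
print(dropper_links("Try our beta software: http://203.0.113.7/applet.exe"))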
Full disclosure on Wall Street Journal
I've been watching the brouhaha over the article in WSJ for most of a month now, with some bemusement. Essentially, 95% of the 'informed opinion' in the infosec blogosphere has been along the following lines:
- The WSJ is irresponsible to have published this piece;
- The journalist is even more irresponsible to have penned it;
- It is outrageous!! Something Must Be Done!! Prepare the noose!!
What I haven't seen anyone cover in depth as yet is the concern that information security controls on the corporate desktop are so pathetic that an editorial piece in WSJ can blow them wide open. Que? Aren't the bloggers completely missing the point?
I've never bought the argument of 'security by obscurity' which they seem to be arguing for. We in the infosec profession should be redoubling our efforts to design and apply sound desktop security controls, not bleating at the journalist who says "The King has no clothes". As to those 'infosec pros' who are baying for her blood, shame on you. Shooting the messenger won't alter the fact that desktop security stinks.
Isn't this just the same argument as with full disclosure of security vulnerabilities? Most of the profession are outraged that someone would even consider posting an exploit in a public forum, let alone doing so without giving the relevant party time to analyse it, create and test a fix, and then wait N months for everyone to implement the patch. Hackers, meanwhile, argue very convincingly that if they do not at least disclose exploits "responsibly", they will never be fixed because vendors are far too busy adding new bells and whistles. They say that crackers, the criminal underground and 'terrists' will eventually discover the self-same vulnerabilities and exploit them for criminal purposes and the world as we know it will come to a sticky end. Both points of view have merit but the real issue is that FAR TOO MUCH SOFTWARE HAS BLATANT BUGS THAT CREATE SECURITY VULNERABILITIES BECAUSE SECURITY IS NOT A DEVELOPMENT OR SALES IMPERATIVE. In that context, the full/responsible disclosure argument is simply irrelevant bickering.
I'm looking forward to the WSJ's forthcoming editorials blowing open web security, multifactor authentication, database security and all those other oxymorons so beloved of the 'infosec profession'.
Go ahead, shoot me if you like.
25 Aug 2007
Awareness and training surveys in EU and US
Two survey reports into information security awareness and training practices offer insights into the state of the art.
The first report from the European Network and Information Security Agency ENISA is Information security awareness initiatives: current practice and the measurement of success.
Although the survey and case studies are European in origin, I'm sure the general discussion and ideas on the thorny issue of measuring information security awareness programs, and in fact measuring information security as a whole, are broadly applicable. Three-quarters of the Europeans surveyed said they have to do security awareness as a compliance requirement. I didn’t realize it was such a high proportion.
References in the report to the lack of consensus and evolving good practices indicate the variety of awareness and metrics techniques in use. I was interested to see markedly different opinions on the value of CBT (Computer Based Training) or posters, for example, and ambiguity throughout the report about "training" vs "awareness" (NIST SP800-50 speaks to the difference, as does the NASCIO report noted below). I heartily agree with the implication that security awareness should be a rolling, year-round activity, continually updated to reflect current issues, rather than a sporadic/once-a-year training course (the dreaded 'sheep dip'!) or, even worse, the once-a-career induction course, no matter how effective the classroom-based training may be.
The awareness topic list on page 5 of the report seems 'about right' to me although there are many other topics perhaps worth covering (e.g. software development, database security, privacy ...) if you are creative about it, which also helps keep the program fresh and interesting. All in all, it's 20 pages well worth reading.
The second report from NASCIO (an organization representing chief information officers, information technology executives and managers from US state governments) is IT Security Awareness and Training: Changing the Culture of State Government. The authors promote security awareness as a preventive control that can help to avert major crises caused by serious information security incidents.
"Since a holistic approach to security revolves around people, cultural change is needed to truly ensure that employees and contractors understand their IT security responsibilities and take them seriously."The report promotes the value of continuous, long-term, broad-based security awareness activities in addition to more narrowly focused and spasmodic training activities.
"Continuous and ongoing awareness and training activities for state employees (and contractors) could help prevent a major state crisis ... Cultural change to the fabric of the state government workforce is needed to make IT security and the ethical use of state IT resources as ubiquitous as technology. Since that cultural change involves changing the way that state employees perceive IT security, consistency and patience are necessary ingredients. Isolated presentations or training sessions, while a good start, will not lead to the creation of a long-term culture of IT security. After all, state employees, like everyone else, have many plates to juggle and may not retain the entirety of the aweareness and training content to which they hjave been exposed, expecially upon the passage of months or years. Hence, regularized and constant reminders in mand forms are needed the enact this cultural shift ... Consistency is a key factor. One isolated presentation does not make for adequate awareness. Presentations on a more frequent basis can help to keep IT security at the forefront of government officials' agendas so that executive and legislative support does not wane over the long term."Absolutely! This is probably the key reason that old-fashioned "security awareness" programs (usually consisting of sporadic and uncoordinated security training sessions in fact) do not achieve the instant results that are anticipated. People who naively expect security awareness to turn things around within a few weeks or months are missing the point: genuine cultural change takes continuous gentle pressure in the right direction over years not weeks.
"Innovative approaches may serve to spark IT security awareness in the minds of many state employees. By starting with a marketing campaign of sorts for IT security, a state can start to build a culture of IT security vigilance."Again, I agree wholehartedly. With the marketer's hat on, NoticeBored's security awareness posters (for example) are efffectively 'advertizing' information security as a whole, with a touch of humor and a little information on the monthly awareness topics for good measure. A distinctive logo on all the materials helps bind them into a whole, while the underlying messages in all the materials reinforce the fundamental core values in information security such as: confidentiality, integrity and availability; risk and control; and prevention, detection and correction. This is quite clearly a branding technique. [By the way, that idea suggests to me a novel way of measuring the effectiveness of security awareness programs, namely using the same techniques that marketers use to assess the effectiveness of advertising programs. Surveys might for example assess the recall of key program images, sayings and messages by representatives of the target audiences, and measure the retention of information security concepts compared to 'competing' awareness initiatives such as health-and-safety or legal compliance.]
As you read the report, do check out the sidebars with numerous examples of security awareness activities from several states. Many of them have a public outreach element, with security awareness activities targeted beyond state employees.
The NASCIO report quotes Insider Security Threats: State CIOs Take Action Now! published earlier this year from which the graph above is taken. The obvious increase in incidents on the graph presumably reflects better incident reporting processes (otherwise there seems to have been a severe lapse of security since 2005) but the proportion of insider vs external hacker attacks is interesting. Insiders, of course, have ready access to the information required to do their jobs and often much wider access to information due to the practical problems of trying to enforce 'need to know' outside of a military context. When insiders go bad, therefore, they can cause a lot of damage without triggering the intruder alerts that (some) hackers trip. Other insiders are often best placed to identify and report internal security incidents, provided they are aware of their responsibilities and know what to look out for - in other words, security awareness is a very important element of control against the insider threat.
The report also touches on the difficulties of getting executive support for security awareness and offers some practical tips, essentially starting with specific high-level security awareness activities targeting the very executives who should understand and fund awareness.
Go ahead: print out both reports, sit yourself down somewhere quiet with a cup of coffee, red-pen them and cogitate. There are good ideas and complementary approaches in both of them. I certainly came away with a number of interesting thoughts and quotations that will appear on the NoticeBored site and our awareness materials in due course.
24 Aug 2007
Audio-visual fun for awareness seminars
WatchGuard has released some short audio clips making fun of IT help desk callers and IT pros alike. If you're into multimedia presentations, they might liven up your security awareness seminars, along with their video clips on drive-by malware and rootkits.
22 Aug 2007
Malware spam spewed forth
We've received loads of similar malware spams today, all basically the same structure with minor differences and spelling mistakes (see above).
The links vary but we understand that one (at least) attempts to infect visitors' PCs with a downloader Trojan. Good up to date antivirus software should trap it but do not rely on this as your sole control: it is not recognized by all antivirus programs.
A quick search of my spam/deleted box for emails containing the string "account number" reveals a whole bunch of them received so far today (a sketch of this kind of mailbox search follows the lists below).
Senders include
Bartenders Guide
Cat Lovers
Cool Pics
Dog Lovers
Downloader Heaven
Entertaining Pros
Free Web Tools
Fun World
Funny Files
Game Connect
Internet Dating
Job Search Pros
Joke-A-Day
Mobile Fun
MP3 World
Net Gambler
Net-Jokes
Office Antics
Online Gamers
Online Hook-Up
Poker World
Pet World
Resume Hunters
Ringtone World
Web Connects
Web Cooking
Wine Lovers
Subject lines include
Dated confirmation
Internal Support
Internal Verification
Internet Techincal Support [sic]
Login info
Login information
Login Verification
Member Confirm
Member Details
Membership Details
Membership support
New member confirmation
New User Details
New User Letter
New User Support
Registration confirmation
Registration Details
Tech Department
Thank you for joining
User Info
User services
User Verification
Welcome new member
There are other variants in circulation too.
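For anyone who fancies repeating that little exercise, here is a minimal sketch of the kind of mailbox search described above, assuming the suspect messages have been exported to a local mbox file; the file name and search string are examples only.

import mailbox

def message_text(msg):
    """Concatenate the decoded text of all parts of an email message."""
    parts = msg.walk() if msg.is_multipart() else [msg]
    chunks = []
    for part in parts:
        payload = part.get_payload(decode=True)
        if payload:
            chunks.append(payload.decode("utf-8", errors="ignore"))
    return " ".join(chunks)

def find_matches(mbox_path, needle):
    """List (sender, subject) for every message whose body contains the search string."""
    needle = needle.lower()
    return [(msg.get("From", ""), msg.get("Subject", ""))
            for msg in mailbox.mbox(mbox_path)
            if needle in message_text(msg).lower()]

for sender, subject in find_matches("spam.mbox", "account number"):
    print(sender, "|", subject)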
The spams are believed to be the result of a new mutant of the Storm worm that has been very active for weeks. SANS Internet Storm Centre has some technical info on it and there's more on F-Secure's blog.
The usual advice "Don't click on dubious links" applies here. Now might be a good time for your security awareness person to inform your fellow employees in calm, helpful tones about the threat. PLEASE do not add to the problem by circulating wild warning emails with "Please tell everyone you know!" or similar - leave the job to the professionals and the news media. Oh and don't forget to check that your antivirus software is updating itself regularly.
*UPDATE* Download a security awareness 'alert' about this, suitable for circulating to your fellow employees. NoticeBored customers: please contact us for the editable MS Word version.
Security metrics for the Bored
The CSO Executive Council is running a series of surveys to assess security metrics practices. The latest survey report revealed that two-thirds of respondents do not gather security program data in order to create statistical reports to present to senior management, and followed up with the following sample of explanations:
"Lack of interest from senior management" caught my eye and "No demand from The Clueless" made me smile but rather than simply accepting this sad state of affairs, how about running some security awareness activities to give senior managers a clue? If information is seen as a valuable organizational asset, the need to protect it is a natural and easy step (and if not, you have more fundamental issues!). If protecting information assets is important, measuring the extent of protection and identifying improvement opportunities is also important, isn't it? So there we are: an executive security awareness program in one paragraph.
I have more sympathy with other comments about the difficulties of designing an objective metrics scheme for information security. It's hard to figure out security metrics that are both simple/cheap to gather and meaningful/useful. My discussion paper published in ISSA Journal in July 2006 might help, as may a paper written by members of the ISO27k Implementers' Forum at ISO27001security.com that derives pragmatic security metrics from ISO/IEC 27002.
Take the CSO Executive Council's third Security Program Scorecard survey to be eligible for a drawing for a copy of Measures and Metrics in Corporate Security
- Not requested, and no value to security program at this point.
- Lack of management interest in seeing security metrics.
- Lack of interest by senior management.
- No funding.
- Embryonic.
- Information is gathered and presented to senior IT security management.
- Security organization is not established due to budget constraints.
- Nobody is asking, and I would not know what to prepare.
- Time, not sure what to measure.
- No good collection method.
- Didn't start to do it yet. We plan to do it in the near future.
- Data points too qualitative.
- No manpower.
- No formal security program.
- Not my responsibility.
- No demand from The Clueless.
- Not my role.
- Narrative reports are provided, not statistical.
- Not needed for awareness, budgeting, etc.
- Haven't developed metrics.
- Management doesn't know that they want this.
- Not requested.
- Don't have the requisite systems in place.
- Insufficient resources to gather automatic and consistent metrics.
"Lack of interest from senior management" caught my eye and "No demand from The Clueless" made me smile but rather than simply accepting this sad state of affairs, how about running some security awareness activities to give senior managers a clue? If information is seen as a valuable organizational asset, the need to protect it is a natural and easy step (and if not, you have more fundamental issues!). If protecting information assets is important, measuring the extent of protection and identifying improvement opportunities is also important, isn't it? So there we are: an executive security awareness program in one paragraph.
I have more sympathy with other comments about the difficulties of designing an objective metrics scheme for information security. It's hard to figure out security metrics that are both simple/cheap to gather and meaningful/useful. My discussion paper published in ISSA Journal in July 2006 might help, as may a paper written by members of the ISO27k Implementers' Forum at ISO27001security.com that derives pragmatic security metrics from ISO/IEC 27002.
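By way of illustration only, here is a trivially simple coverage-style metric in Python of the sort those papers discuss; the control areas and scores below are invented for the example rather than taken from ISO/IEC 27002 itself.

# Invented control areas and implementation scores (1.0 = fully implemented)
controls = {
    "security policy": 1.0,
    "access control": 0.5,
    "incident management": 0.0,
    "awareness and training": 0.75,
}

coverage = sum(controls.values()) / len(controls)
print(f"Overall control coverage: {coverage:.0%}")
for name, score in sorted(controls.items(), key=lambda kv: kv[1]):
    print(f"  {name}: {score:.0%}")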
Take the CSO Executive Council's third Security Program Scorecard survey to be eligible for a drawing for a copy of Measures and Metrics in Corporate Security.
21 Aug 2007
Awareness through incidents
Educational Security Incidents (ESI) is a blog comprising brief summaries of (mostly privacy-related) security incidents culled from the news media. These are intended to be used for security awareness purposes: analysis and deconstruction of the incidents can indeed be used for case studies or just to pep up other awareness materials.
There are of course zillions of similar sources on the Web, from the regular news media to assorted blogs, mailing lists (such as RISKS-List) and discussion fora (such as CISSPforum and Security Catalyst), plus books such as Dear Valued Customer, You Are A Loser and those by Ira Winkler and Kevin Mitnick.
Stories of security incidents from within the organization are even more powerful, although in highly political organizations they are quite likely to be suppressed by those involved. I know of at least one Internal Audit function that uses incidents in this way, regardless of the company politics: they produce an annual booklet describing chosen incidents, in each case outlining the background to the situations and the impacts, and usually they add some subsequent commentary about how the controls were (belatedly) changed for the better. The booklet becomes a control, governance, security and fraud education resource for management. Nice!
18 Aug 2007
Security awareness success in the US Army
Details of audits of US Army websites and blogs run by soldiers, disclosed under the Freedom of Information Act, reveal that far more security policy breaches occur on official Department of Defense websites than on blogs. The audits were conducted by the Army Web Risk Assessment Cell, a special unit with a remit (evidently) to minimize unauthorized disclosure of sensitive military information via the Web.
It seems to me the Army was absolutely right to highlight the information security risks relating to blogging. I believe the audit results reflect the outcome of a highly successful security awareness program. If this issue had not been addressed so effectively, I'm convinced there would have been far more noncompliance issues; in other words, this is a lesson to us all.
In respect of security awareness programs and policy compliance, 'the military' have a significant advantage over most of us in that the workforce is specifically trained to respect authority and follow orders - or at least, that is the classical view. In fact, I understand modern soldiers are increasingly being taught to think for themselves and operate autonomously, albeit within a highly structured (literally 'regimented') operating framework and, when necessary, at gunpoint. The traditional approach to the blogging security issue would presumably have been to send out an order banning blogging. What actually happened was more subtle: Army bloggers were instructed to register their blogs with commanding officers and pre-clear what they publish. 'Blog responsibly' is a rather softer message but the audit results seem to indicate its effectiveness in this situation.
16 Aug 2007
Prehistoric ISO27k
I have been researching the origins of ISO27k, particularly the bit before it was launched as BS7799 in 1995, to complete the 'definitive history' on ISO27001security.com.
I dimly recall using an A5/booklet version of the Code of Practice for Information Security released by BSI DISC as PD003 in 1993, and an accompanying informational booklet PD005. I have also heard about but can't quite remember a "Users code of practice for security" released by the UK's National Computing Center (NCC) in the late 80s/early 90s, which I believe was largely derived from a Royal Dutch/Shell information security policy manual.
Does anyone reading this have copies of PD003, PD005, the NCC document or Shell's original policy manual, please, or other relevant information from that pre-1995 period? If so, please contact me (gary@isect.com). I'd really appreciate your help to set the record straight.
Failed redaction reveals trade secrets
Lawyers acting for the US Federal Trade Commission in an anti-trust case against a food company released inadequately-redacted documents, thereby disclosing highly sensitive proprietary information about the company's competitive strategies - "dozens of trade secrets" according to the Washington Post article. The failed redaction attempt involved pasting black blocks over the relevant text but since the original text was there 'underneath', it was a simple matter to remove the blocks from the electronic documents published. After being alerted to the gaffe, the lawyers printed and scanned the redacted documents: the hidden text cannot be revealed from the published scanned images, but of course it's all too late since Associated Press got the originals.
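If you ever need to check a 'redacted' PDF before it goes out the door, one crude but useful test is simply to extract whatever text the file still carries and search it for the terms that were supposed to have been removed; drawing black boxes over text, as happened here, does not remove the underlying text. The sketch below assumes the third-party pdfminer.six Python library, and the file name and search terms are examples only.

from pdfminer.high_level import extract_text  # pip install pdfminer.six

def leaked_terms(pdf_path, sensitive_terms):
    """Return any supposedly-redacted terms still extractable from the PDF's text layer."""
    text = extract_text(pdf_path).lower()
    return [term for term in sensitive_terms if term.lower() in text]

leaks = leaked_terms("redacted_filing.pdf", ["trade secret", "pricing strategy"])
if leaks:
    print("Redaction failed; still extractable:", leaks)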
14 Aug 2007
12 Security Features and Rules Most Likely to Mess Up
"3. Most likely to be ignored: Security awareness posters"
That's the third on a list of a dozen observations on security failures by a bunch of Gartner security consultants. The list is highly cynical but most of the observations ring true. Here's another dozen.
Why is it, I wonder, that 'security awareness' has come to be so firmly equated with 'posters' and/or [generally annual] 'training sessions'? It's such a lame paradigm and does a huge disservice to those of us working on creative security awareness programs.
What we need is a security awareness awareness program. I'm just off to the printers to get some posters done. Anyone want to sign up for a security awareness awareness training session next August?
Businessman scammed for AU$1.7m
An Australian businessman chasing an AU$100m deal with some Nigerian businessmen has lost AU$1.7m in what sounds like a classic 419 advance fee fraud.
"[T]he scam started a year ago in Japan before spreading to other countries, and then ended in Amsterdam where he came for an appointment with his alleged business partners. After advancing large sums of money, supposedly for such things as notary fees, the Australian man finally started getting the idea that he was being ripped off, police said. He alerted Dutch police who were then able to arrest the three suspected swindlers in an Amsterdam hotel where they had arranged to meet the Australian with a suitcase full of money claiming it would soon be his."
Being a businessman, I guess he assessed the potential reward and decided that a 1.7% advance was worth the risk, but no more.
13 Aug 2007
Lessons Learned in Software Testing
Through a series of nearly 300 “lessons”, the authors of Lessons Learned in Software Testing (~$27 from Amazon) share around 60 years of accumulated wisdom about how to test application systems - not so much which buttons to press but more how to establish and manage a test team, plan the work and dynamically adjust the testing process according to what is found and how much time is left.
Read our book review here.
9 Aug 2007
Five nines = a stretch target
In a shining example of integrity, transparency and customer service, 365 Main, a data center company that promises extremely high levels of availability, has published details of a serious power failure that took out service to over 40% of its San Francisco colocation clients for as much as 45 minutes. The diary of events describes the frantic investigative engineering work required to analyze and resolve a problem in the backup power systems, finally traced to a timing issue (one of the nastiest forms of software bug!) in a PLC (Programmable Logic Controller, a type of Supervisory Control and Data Acquisition or SCADA subsystem) that failed to clear the memory reliably when the diesel generator control units reset. Although I'm not a SCADA security expert, the fact that the failure occurred after a number of set/reset events sounds like a memory leakage and buffer overflow problem to me, but then I'm reading another textbook about software security testing at the moment so it's on my mind.
In the course of explaining the failure, the company outlines the design of its "N+2" standby power system using ten 2.1MW diesel generators, two of which are backups in case of maintenance or failure of the remaining eight. This level of power system investment is evidently sufficient to deliver 99.99% availability ("four nines") in an area subject to "dozens of surges and utility failures" during the last five years, although it is patently insufficient to reach five nines. Close but no cigar.
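For anyone who hasn't done the arithmetic lately, here is a quick back-of-envelope Python sketch showing how little downtime each extra 'nine' allows per year, and where a single 45-minute outage leaves you.

MINUTES_PER_YEAR = 365 * 24 * 60

# Allowed annual downtime at three, four and five nines of availability
for nines in (3, 4, 5):
    availability = 1 - 10 ** -nines
    allowed = MINUTES_PER_YEAR * (1 - availability)
    print(f"{nines} nines ({availability:.3%}): about {allowed:.0f} minutes of downtime per year")

outage = 45  # minutes, as reported for the worst-affected customers
print(f"One {outage}-minute outage caps the year at {(1 - outage / MINUTES_PER_YEAR):.4%} at best")

Roughly 53 minutes a year is tolerable at four nines but barely 5 minutes at five nines, so a single 45-minute outage sinks any five nines claim for the year.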
Describing the rapid sequence of five power surges as a "unique event" implies that they had not previously tested the power systems under the specific conditions that led to the failure. This is known as Sod's Law or Murphy's Law, I'm not sure which. The preventive maintenance and testing regime looks reasonable by most standards i.e. "preventative maintenance logs on the Hitec generators are currently available for customer review. All generators in San Francisco pass weekly start tests and monthly load tests where diesels are started and run at full load for 2 hours. Both of these tests simulate a loss of utility and the auto start function is accurately tested." That said, however, if I were advising them [which I am not!], I would probably suggest running occasional on-load tests for much longer - perhaps 24 to 48 hours or more - to ensure that the diesel tanks, pumps/valves and pipes are clear, to confirm their capacity for exceptional long-term outages, and to refresh the diesel in the tanks. One of our clients experienced a backup generator on-load failure due to a blockage between the diesel header tank and the main diesel tank: the header capacity was sufficient for short on-load tests but not for a multi-hour power failure.
Reading between the lines of the diary a little, it looks as if the company had 'full and frank exchanges' with senior management at Hitec, the supplier of the no-break diesel generators and controls. The fact that they name the supplier is perhaps indicative of a frosty chill in the business relationship, but equally could imply their confidence in the way the supplier responded to the incident.
Anyway, this is all fascinating and will probably form the basis of a case study in our forthcoming awareness module on physical security and environmental services for IT, due for release in October, or perhaps a later as-yet-unplanned module on application security. As with this month's case study based on the ongoing Ferrari-McLaren spying incident, real world cases often make more convincing classroom assignments. The trick is to summarize and crystallize the key factors into a format suitable for discussion.
5 Aug 2007
Boys toys
Thanks to stumbling across a list of 101 cool freeware apps compiled by PC World, I've now got virtual sticky notes on my screen, I'm monitoring the temperature of my CPU and I've rediscovered Belarc Advisor, a tool that interrogates the PC to find out what hardware and software are installed. Since I last used it, Belarc has evidently been upgraded to provide an assessment of the PC against the CIS security benchmarks. Nice touch!
Google shines in the PC World list with strong entries in several categories including Gmail, Google Reader, Google Blogger, Google Docs, Google Notebook, Google Picasa, YouTube (now owned by Google) and Google Desktop all listed. If free search engines were listed, I'm quite confident Google would easily top the list.
My favourite discovery in the 101 is a neat little tool called SyncToy. At last I can replicate the files and directories from my desktop on the laptop, work for a while on the laptop and then re-synchronize to put the altered files back on the desktop. It works well. Having never quite got the hang of the Windows functions for the same thing, it's good to find a tool so easy to configure and use.
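In the same spirit, here is a rough Python sketch of a one-way mirror along the lines of that workflow. The source and destination paths are placeholders, and it only copies new or changed files (no deletion handling or conflict resolution), so treat it as a toy rather than a replacement for SyncToy.

import shutil
from pathlib import Path

def mirror(src, dst):
    """Copy files from src to dst when they are missing, newer or a different size."""
    src, dst = Path(src), Path(dst)
    for path in src.rglob("*"):
        if not path.is_file():
            continue
        target = dst / path.relative_to(src)
        if (not target.exists()
                or path.stat().st_mtime > target.stat().st_mtime
                or path.stat().st_size != target.stat().st_size):
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(path, target)

mirror("C:/Work/Projects", "D:/LaptopSync/Projects")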
Get me the IRS Security Manager!
A majority of US Internal Revenue Service (IRS) employees failed a social engineering penetration test recently, despite their "security awareness training" warning them of the threat:
"In a test sample, nearly 60 percent of 102 IRS employees were duped into handing over their access information, the IG said in a report released today. TIGTA auditors used social-engineering methods to survey the degree of compliance with data security. Posing as help-desk representatives, they called IRS line employees, including managers and contractors, and asked for their assistance to correct a computer problem. They requested that the employee provide a user name and temporarily change his or her password to one TIGTA callers suggested. TIGTA test callers convinced 61 of the 102 employees to comply with the requests. Only eight of the 102 employees in the sample contacted the appropriate offices to report or validate the test calls, the report said. The sample employees were from across IRS’ business units and geographic regions."
I'm uncertain exactly what is meant by "security awareness training" - is it security awareness (ongoing, continuous awareness activities), security training (periodic training courses/classes) or some hybrid? The full report only refers to compulsory annual security awareness activities. Anyway, whatever they are doing is evidently having some effect (some tested employees did attempt to verify the calls) but pushing that proportion towards 100% will be very tough. ['Course I know a company that could help ... we'll be releasing an updated social engineering awareness module later this year.]
[UPDATE] The Treasury Department report is here.
Following previous reports, the IRS said it:
"Would update its security awareness program to include training on computer intrusions and unauthorized access and use existing media, such as the annual security training and security awareness week, to communicate IRS security standards on password protection procedures."
and also that it
"Had incorporated the topic of social engineering into its mandatory annual Online Security Awareness Training, which included examples and scenarios of attempts used to gain access to IRS systems. In addition, the IRS stated periodic reminders would be issued in the forms of (1) all-employee notices that would be included with employees’ Earnings and Leave statements and (2) articles in the computer security newsletter."
Obviously I would agree with the emphasis on security awareness but find the notion of an annual awareness event something of a joke. Putting notes on payslips and circulating a "computer security newsletter" are reasonable ideas but still nothing like enough to raise awareness. Is it any wonder so many IRS employees remain clueless?
4 Aug 2007
CA Blasts Rocket For Code Theft in $200M Suit
CA is seeking $200m (!) from Rocket Software, claiming they used CA's intellectual property for their own database management system.
"The management software giant said in the complaint that Rocket hired programmers and software developers formerly employed by CA or Platinum technology International, which CA acquired in 1999. These employees (Mark Pompeii, Robert Schulien, Michael Skopec, and David Rowe), used CA's source code and development environment to fashion Rocket's software tools for the IBM DB2 relational database management system, CA alleged."
The transfer of intangible intellectual property in the form of employees' accumulated knowledge and experience is a frequent cause of trade secret disputes. The courts have a tough time differentiating deliberate theft and abuse of trade secrets from application of general knowledge, experience, competencies and skills. Employees who move to a new employer inevitably find it hard to stop thinking about their previous position and unintentionally transferring proprietary information to the new. Former employers have problems proving that proprietary information was disclosed or taken by leavers, especially without hard evidence (e.g. data transfer media, email records etc.)
2 Aug 2007
The light goes on?
"Agency computer systems are vulnerable because many lack basic controls,
and one of the best ways to improve information technology security is
to improve the metrics for how departments measure how these basic
controls are implemented."
Golly. Those in charge of rewriting FISMA have figured out that they probably need information security metrics to track government departments' performance.
OK guys, the next baby step is to work out what metrics are needed.
I'll put money on "number of security incidents" being one of the 'cutting edge security metrics' about to be proposed, followed shortly by some bright spark noticing and promoting NIST SP 800-55 as The Answer.
With that and the news about the hacking of three well-known US electronic voting systems, I'm glad I don't live in the Good Ol' US of Eh?
VA people sacked or trained
It's old news but I've just been reading about the Department of Veterans Affairs' response to the theft of a laptop containing Personally Identifiable Information from a VA employee's home. The response is in several parts:
1. "The VA announced the appointment of a special adviser for information security.
2. "Members of the senior management team were forced to retire or resign and the hapless employee and his line managers were all sacked."
3. The Secretary of State has ordered all VA employees "to complete an annual data privacy and cyber-security awareness training course immediately".
4. Senior officials at the VA have been ordered "to compile an inventory of all workers and contractors who need access to sensitive data."
5. Senior department managers have been told "to remind staff to protect sensitive information".
6. A "security review of all laptops" has been ordered.
The 'annual data privacy and cyber-security awareness training course' caught my eye. An annual course?! VA employees are likely to be riding high on a wave of security awareness arising from bad publicity about the incident but putting them through some sort of training course once a year is more or less pointless. Imagine if roads only had one speed sign every 1,000 miles. Or if big trucks only beeped once when reversing. Or if Coca Cola put all its budget into one advertisement per year. It's crazy. The VA (or rather their US Government bosses) are missing a trick. Telling managers to 'remind staff' about their responsibilities to protect sensitive information, if that's all they said, is hardly proactive.
Perhaps we ought to send them a link to NoticeBored?
Elsewhere I've been reading that "Your common sense is the world's best firewall. Just make sure that you turn it on." Likewise Ira Winkler is constantly telling us that common sense is not common, but that people need "common knowledge", i.e. information to make sound judgments about information security.
Citigroup seems to 'get it', judging by their advertisement for a London-based information security engineer: "The candidate will be a good communicator, capable of producing viable solutions by distilling the assumptions and requirements of the customers. The candidate will be required to promote information security awareness through interaction with technology peers and customers."
1 Aug 2007
IT professional accused of hacking former employer
An IT professional has been accused of hacking into a former employer's server to steal over 4,000 confidential documents:
"A press note issued by S. Balu, Deputy Superintendent of Police, Cyber Crime Cell, said police had arrested M.S. Ramasamy, a 37-year-old software engineer from Avadi, on charges of hacking and stealing confidential and proprietary information from the server of Caterpillar, a US-based construction and mining company ... When contacted, Mr. Balu said the accused had gained access to the company’s server headquartered at Peoria in Illinois, US, using another employee’s user ID and password and downloaded over 4,000 confidential documents. A closed circuit camera had visuals of him accessing the server at the time when the files were downloaded. "