Aside from the obvious effects on the agency and the US government, today's Wikileaks disclosure concerning the CIA's capabilities to hack various technologies is a global concern for our "industry", or rather our profession, our craft: the information risk, security and related fields (such as business continuity, privacy and compliance) as a whole.
At a high level, major incidents reflect badly on all of us and are embarrassing … and yet, scratching beneath the surface, things invariably get more complex and convoluted in practice. There are reasons why things happened and were not avoided, identified, blocked or mitigated. We are where we are in the industry as a result of all that has gone before, including the long-term cumulative effects of a gazillion decisions, events and developments along the way, not all of which were ours (cloud computing, BYOD and IoT/IIoT are three classic examples of areas where information security pros are openly concerned about the information risks, but our voices are drowned out by the incessant demand for shiny new toys). Ours is and will always be a developing, evolving field. Maturity is the journey, not the end point. We can always do better.
One more thing to bear in mind is the inherent imbalance: we in the industry have to defend all points at once, while our adversaries need only find and exploit individual weaknesses to gain a foothold, and then perhaps seize the advantage. This is not an excuse but an acknowledgement that, in the long run, we are "bound" to fail from time to time; hence dealing with incidents large and small is an inevitable and important part of our brief.
Oh, and there’s another: Big Brother is a genuine concern in this sphere. Official secrecy is a dark cloak that covers all manner of goings-on, not all of them legitimate or in the best interests of society. Some “incidents” are not what they appear: the shocking part of some incidents stems from the disclosure itself rather than from what was disclosed.
The nice thing about major, well-publicised incidents (such as the Yahoo! and Sony hacks) is the insight they give us all, enabling us to explore those difficult questions such as “Could it have happened to us?” and “What makes us confident that we’d be any better off if it did?”. So, in a strange way, incidents are good. They are learning and improvement opportunities, at least.
The aviation industry’s approach to safety is, for me, a shining example of how we could deal with this issue globally through our industry, partly through the professional/trade bodies, partly through standards and agreements, partly through international collaboration, but mostly through a widespread acceptance among us information security professionals that we share common interests. We all suffer when one of our number gets hit, hence we all benefit by developing and sharing good security practices.
A significant milestone on the road to aviation-style global collaboration and information sharing is when organizations that have suffered major incidents honestly admit their faults and fully disclose what went wrong. We’re sort of at that point right now, partly through privacy breach disclosures, partly through investigative journalism and whistleblowers. We still have an issue, I believe, with delayed disclosures, with cover-ups and secrecy, and with 'challenged integrity' in general. We're not getting the whole truth, for sure. Meanwhile, organizations such as CERT are doing an excellent job of lining things up.
A bit further down the track, I foresee a rôle for itinerant teams of independent experts with their grab-bags pre-packed, ready to fly in and get stuck into dealing with the immediate aftermath of major incidents, drawing out the general learning points, sharing and extending good practices for the benefit of the global infosec community.
I look forward to the day that changes to IT systems and information processes are, as a rule, (a) properly specified in a true engineering sense; (b) professionally developed, tested and proven, using certified materials, methods, processes, tools and workers; (c) certified competently and independently; (d) implemented formally; (e) documented, managed and maintained formally, in perpetuity.
Now that's a security innovation I would be glad to support.