ISACA Journal Author Blog


Parallels Between Safety and Security Failures

Published: 8/1/2011 9:23 AM | Comments (2)
Peter English
 
My vol. 4 article is about the need for organisations to be self-aware if they want to be secure. As set out in the ISACA Business Model for Information Security™ (BMIS™), security must be viewed in the wider organisational context, and information security managers must be cognisant of that context.
 
The importance of organisational context is further underlined by the scrutiny that follows a breach. If 25 million people’s financial records go missing in the post, it is not just the individual who sent them who comes under scrutiny, but also the culture, the governance arrangements, the failure of safeguards and perhaps even the gaps in legislation that may have contributed to the breach.
 
I have recently been reading about Accident Theory, which seeks to understand the root causes of accidents, and I have been struck by the parallels between the management of occupational health and safety and that of information security. One striking example is the report into the crash of the British Nimrod surveillance aircraft XV230 over Afghanistan. The immediate cause of the crash is believed to have been a fire caused by fuel leaking onto a hot pipe, but the government-commissioned report highlights a number of organisational factors and criticises individuals.
 
In addition, the Nimrod report highlights parallels with other disasters, in particular the loss of the US National Aeronautics and Space Administration (NASA) space shuttle Columbia. The 12 contributing organisational factors highlighted as being common to these disasters should give us all pause for thought, as they could well apply, albeit with less tragic consequences, to information security.
 
Common organisational causes of the losses:
  1. The ‘can do’ attitude and ‘perfect place’ culture (‘Rules schmules. Get it done’.)
  2. Torrent of changes and organisational turmoil
  3. Imposition of ‘business’ principles (in spheres where they do not belong)
  4. Cuts in resources and manpower
  5. Dangers of outsourcing to contractors
  6. Dilution of risk management processes
  7. Dysfunctional databases
  8. ‘PowerPoint engineering’ (oversimplification)
  9. Uncertainties as to out-of-service date (‘It will be fine for another year or two’.)
  10. ‘Normalisation of deviance’ (‘That error always happens; do not worry’.)
  11. ‘Success-engendered optimism’ (‘It worked out OK last time’.)
  12. ‘The few, the tired’ (i.e., the additional work burden/pressures on those left)
Read Peter English’s recent Journal article:
“Rethinking Physical Security in the Information Age,” ISACA Journal, volume 4, 2011

Comments

Safety and Security are not always similar

I agree; there are a lot of similarities in how design or implementation flaws are introduced into security and safety-related systems.

I don't think human failures in organisations have changed that much. The Challenger disaster had similar problems in the behaviour of the booster-rocket subcontractor organisation.

Safety and security incidents, though, can have quite different probability models. Wear-out or random failures associated with safety and reliability don't tend to have malicious intelligence behind them, nor do they learn from previous exploitation of vulnerabilities.

It would be interesting to further analyse the similarities and differences.
B Hunter at 8/8/2011 4:10 AM

Re: Safety and Security are not always similar

If you could post links to any security models, especially ones that include dependent probabilities, I'd be interested in looking at them.

'Bow-tie' diagrams (see link below) can be useful in safety management; I used them to analyse security issues in my last role.

http://www.theirm.org/members/D04-Businesscontinuitymanagement-thebowtieapproachBradEccles
InfoSecMgr101 at 8/16/2011 5:19 PM