This Should Not Be Happening 

 

I recently published my thoughts about cyberattacks in this space, in a piece titled “The Train of Danger.”1 In it, I gave my paranoia free rein and suggested that organizations are unprepared for the danger of such attacks and that security professionals, in particular, are at risk. I received a thoughtful series of messages regarding that column from Stan Dormer of Cheshire in the UK. He led me to see the problem from a few other angles, which I would like to explore here.

The Problems Persist

Mr. Dormer wrote:

IBM celebrated its centenary some months ago; commercial security consultancies and anti-malware companies are ten a penny; every software vendor provides voluminous advice on security; ISACA provides quality advice and highly qualified professionals and has developed schemes such as COBIT. More formal methodologies such as SABSA accompanied by standards such as ISO 2700x abound.

We do pen testing, security certification testing, deploy ‘unbreakable’ cryptographic schemes…and we have the defense and other government agencies that employ some of the best security professionals on the planet.

And an individual or group permeates through all of this stuff like a knife going through butter!

This should not be happening.

Quite so, Mr. Dormer, quite so. If we understand the problem and have developed the solutions, then why do we still have the problem? Mr. Dormer goes on to suggest some reasons:

  • Employees may be leaking personal data, security data and security credentials to outsiders for gain.
  • Alleged cyberwar attacks are fewer than reported and are being exaggerated for political reasons.
  • Software vendors may still be leaving ‘backdoors’ in their software, just as they did in the 1960s and 1970s, and these are communicated to a select few who, in turn, leak the knowledge.
  • Perhaps it is that we are pathetic at deploying security and most security software achieves little.
  • Dorothy Denning2 may have been right—‘All software contains fatal weaknesses, and you cannot develop a formal system that is secure’—so we have to live with it.
  • Software may be over-complex and too interconnected to be able to lock it down.

The Culture We Deserve

These are all plausible specifics; putting them together leads me to think that there is a general explanation. I believe that cultural issues in our society and in our organizations are the greatest impediment to true security despite, as Mr. Dormer says, all the countermeasures we have deployed. Jacques Barzun said we get the culture we deserve.3 Perhaps we get the security our culture deserves, as well.

It is safe to say that everyone is in favor of security. Who can be against it? However, we do not value security, or at least we do not value it as highly as other attributes. We do not applaud risky business, but we do look up to people described as risk takers. There simply is not the same cachet in being really secure. The praise for risk taking is deserved because risk is rewarded with profit. But, I suggest, what we really favor is prudent risk taking. That qualifier gets lost until markets crash, needless wars begin or a system gets hacked.

When bad things occur, or at least out-of-the-ordinary bad things, there is often a response that “no one could have anticipated” such an event.4 In the limited space of information security, we have had no lack of Cassandras telling the world about potential dangers. But, they (aw, heck, we) cannot say specifically what will happen, nor when it will occur.5 In the competition for budgets, it is easier to demonstrate that money invested in, say, a sales promotion will lead to higher revenue than it is to show that funds spent on security will result in lower losses. Now, the vice president (VP) for sales can no more predict which sales will be made because of a promotion than the chief information security officer (CISO) can tell which hack will be prevented by a new firewall, but when the cash does come in, the VP has something to point to. The CISO can only claim that something that might have happened failed to occur, an impossibly difficult position to defend.

Is this, then, the answer to Mr. Dormer’s question: that we simply do not care enough about security to pay for it? There is some truth to that, but I do not believe it is entirely the case. As a society, particularly in these parlous times, we pay quite a lot for security at the national, corporate and individual levels. But, the reality of the threats we face almost always outruns our willingness to spend. We are only willing to pay for security when we are convinced that a bad thing will indeed occur if we do not provide enhanced protection. Those bad things must happen often enough, be big enough and strike close enough to spur us to action.

Selling Security

To accelerate investments in security, we security professionals must do a better job of communicating the reality of the threats that our organizations face. Put simply, we must sell security ourselves, rather than letting the hackers, the leakers and the forces of nature sell it for us. I am speaking of much more than a security awareness program, which merely points out that there are threats in the world and that individuals need to do their part to combat them. I am suggesting a campaign that demonstrates the value of security not only to the company or agency but also to individuals, their families and their communities. This campaign should have spokespersons, a warm and fuzzy mascot, and a carefully crafted message that makes security desirable, if not sexy.6

Selling security calls for a different set of skills than managing or auditing it. It is often the case that the people responsible for security in any given organization are specialists in implementing and maintaining technologies; they have been neither recruited nor rewarded for their ability to sell a concept. This too is a manifestation of our society’s perception—our culture—of security. Until we get the right security professionals doing the right jobs in the right ways, Mr. Dormer’s last question to me will go unresolved:

Are there other explanations [for poor security] that we’re not exploring?

Author’s Note

ISACA publishes my email address because I like to hear from you, as I did from Mr. Dormer. You can also leave comments on the ISACA web site. I promise to read and respond to those as well.

Endnotes

1 Ross, Steven J.; “The Train of Danger,” ISACA Journal, USA, vol. 5, 2011
2 Dorothy Denning is a distinguished professor in the Department of Defense Analysis at the US Naval Postgraduate School and one of the most noted proponents of information security of our time.
3 Barzun, Jacques; The Culture We Deserve, Wesleyan University Press, 1989. Jacques Barzun is emeritus university professor at Columbia University and one of the most noted cultural historians of our time.
4 See Taleb, Nassim Nicholas; The Black Swan, Random House, 2011, p. xix (and the entire book, for that matter).
5 See Watts, Duncan J.; Everything Is Obvious: Once You Know the Answer, Crown Business, 2011, chapter 6, for an explanation of the impossibility of prediction.
6 Ross, Steven J.; Creating a Culture of Security, ISACA, USA, 2011, p. 77–80. More shameless self-promotion.

Steven J. Ross, CISA, CISSP, MBCP, is executive principal of Risk Masters Inc. Ross has been writing one of the Journal’s most popular columns since 1998. He can be reached at stross@riskmastersinc.com.


