This is a story about researching a simple question: Why are there so many vulnerabilities in information systems? One answer that might strike a chord with ISACA members is: “failure to listen to experts.”
Many of us have spent years advising companies to adhere to the principles of security by design and privacy by design, yet some still ship products with holes in them: vulnerabilities that leak sensitive data or act as a conduit to unauthorized system access. We’ve been teaching cyber-hygiene to end users since before it was called that, and we’ve all encountered organizations that don’t listen to our warnings about the risks inherent in their deployment of digital technologies.
But why do some people not listen to experts? I decided to study this question with help from my research colleague at ESET, Lysa Myers. We found an established body of research that examines the way people perceive risk and explores the ways in which risk communication can become more effective. Many of these studies centered on the rejection of warnings about risks inherent in successive waves of technology. For example, some were funded back when people argued about the risks from nuclear power and radioactive waste disposal. More recent research has explored why so many people don’t heed the warnings of climatologists.
Many studies used survey questions phrased like this: “How much risk do you believe [this hazard] poses to human health, safety, or prosperity?” where the hazard might be global warming, genetically modified foods, and so on. Responses to these questions revealed interesting patterns when subjected to demographic analysis, particularly when that analysis included profiles derived from the cultural theory of risk perception (CT for short). According to this theory, we tend to perceive risk in a way that affirms our understanding of social structures and our place within them.
People who see society as a hierarchy of individuals rather than as a community of equals typically rate global warming as less risky than people who are more egalitarian and communitarian. Studies also found that, as a group, white males rated risks from a variety of technologies lower than white females, non-white males, and non-white females. Dubbed “the white male effect” by the researchers who first observed it in 1994, this phenomenon appears to be driven by a subset of white males who drastically under-rate risk relative to the mean (these men are predominantly hierarchical individualists with above-average education and income).
What we didn’t find in our literature review was comparable surveying around risks arising from digital technology, so we conducted our own. We mixed six digital hazards in with nine risks unrelated to information systems, such as air pollution. Using Survey Monkey, we polled more than 700 adults in the US. Our first surprise when analyzing responses was that “criminals hacking into computer systems” rated higher than any other risk, ahead of air pollution and hazardous waste disposal. A second digital hazard, theft or exposure of private data, rounded out the top four.
These results suggest that a significant portion of the American public now “gets” that digital technology brings serious risks, but what did our survey tell us about communicating with those who don’t “get” it? We did find a white male effect in our sample, but it was less pronounced for digital risks. The cultural alignment of respondents followed earlier studies for global warming, but looked quite different for digital risks. That tells me there is more work to do in this field, but we can improve our risk communication skills by learning from the work of those studying how cultural theory informs the science of science communication.
I encourage you to read Dan Kahan’s articles on this at CulturalCognition.net, and hope to see more people studying why the advice of information security experts is not universally embraced.
For more of our results, see our slides on SlideShare: https://www.slideshare.net/secret/j6a7vyrtlEgzOf.