In 2017, I wrote: “the digital technologies that enable much of what we think of as modern life have introduced new risks into the world and amplified some old ones. Attitudes towards risks arising from our use of both digital and non-digital technologies vary considerably, creating challenges for people who seek to manage risk.”
This is still true today, the 29th day of Cybersecurity Awareness Month, 2020; and, as the month draws to a close, I think it is helpful to reflect on how we feel about "cyber" risks, those created by our use of connected devices and the rest of the digital infrastructure that supports so many facets of life in the 21st century. You may have found that not everyone seems to be as concerned about some risks as you are.
Conversely, you might not be as worried about some things as some of your friends are. For some reason this makes me think of a Chief Information Security Officer cycling to work: she's more aware of, and concerned about, the risks posed by a new operating system vulnerability than most people, but she's less concerned than her friends and family about the risks of cycling to work.
The reasons for differences in risk perception are many and complex, and there's not enough room in this article to address them all in a fully documented fashion. What I do have room for is a short account of my considered opinions on this, followed by some sources at the end. The underlying theme of what I have to say is this: the failure of some people to heed expert advice, particularly warnings from experts that something is a problem and poses risks that need to be taken more seriously.
The Way I/We/You/They See Certain Risks
Consider a survey question that offers the following choices for your answer: Low risk, Between low and moderate risk, Moderate risk, Between moderate and high risk, High risk. Suppose the question is this: How much risk do you believe global warming poses to human health, safety, or prosperity? What is your answer?
Over the last decade or so, numerous surveys have asked that question and the most frequent response is High risk. Almost all climate scientists agree that High risk is the "correct" answer, based on the science. But not everyone agrees, and that is clearly hampering efforts to slow down global warming.
So guess what happens when you analyze the survey responses by gender: you find that men are more likely to rate this risk Low. In fact, whenever you ask people to rate risks pertaining to a bunch of different technologies, you tend to find that men see less risk than women. Furthermore, white males tend to see less risk than white females, non-white males, and non-white females.

And this is not a new phenomenon. There is a long history of failure to heed the warnings sounded by experts on a wide range of issues. Consider the 1994 survey results graphed on the right. The grey line with the round data points represents white males, who saw less risk than everyone else in nuclear waste, chemical pollution, motor vehicle accidents, outdoor air quality, nuclear power plants, and medical X-rays.
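To make the kind of comparison I am describing a bit more concrete, here is a minimal sketch of how such a cross-tabulation might be done. The column names and the handful of responses in it are invented purely to illustrate the computation; they are not data from any actual survey.

```python
# Hypothetical sketch: cross-tabulating risk ratings by demographic group.
# The rows below are invented for illustration only, not survey data.
import pandas as pd

# Map the five response options onto a 1-5 scale.
SCALE = {
    "Low risk": 1,
    "Between low and moderate risk": 2,
    "Moderate risk": 3,
    "Between moderate and high risk": 4,
    "High risk": 5,
}

# Placeholder responses, invented for illustration only.
responses = pd.DataFrame({
    "group":  ["white male", "white female", "non-white male", "non-white female",
               "white male", "white female", "non-white male", "non-white female"],
    "rating": ["Low risk", "High risk", "Moderate risk", "High risk",
               "Between low and moderate risk", "High risk", "High risk", "Moderate risk"],
})

responses["score"] = responses["rating"].map(SCALE)

# Mean perceived risk per group, and each group's gap relative to the overall mean.
by_group = responses.groupby("group")["score"].mean()
print(by_group - responses["score"].mean())
```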
To be a bit more precise, the implication is that, in aggregate, white men in America tend to under-estimate technology risks, relative to the mean. And if you think, like I do, that the technologies we have been talking about so far present serious risks to human health, safety, or prosperity, then those men are wrong. What is more, their opinions act as a brake on efforts to address the risks that others are concerned about. And not only are they wrong, history has shown it is hard to persuade them of this, and of the need to raise awareness of these risks. All of which could have serious implications for cybersecurity awareness if it turns out that this pattern of findings extends to cyber risks.
Guess what? The pattern does extend to the digital realm, as you can see from this chart based on research I did a few years ago, working with my good friend Lysa Myers who was on my research team at ESET at the time, with some assistance from Dan Kahan of the Cultural Cognition Project at Yale Law School.
See that White Male line undercutting the others across a wide range of risk categories? The yellow highlighting picks out the “digital risks,” and it shows that white males tend to see less risk from digital technology than the other groups, although the gap is smaller than with some other risks (and there is one notable exception: government data monitoring seems to trouble non-white males even less than white males—there could be several explanations for this, but that is a subject for a different blog post).
"Not All White Men"
Of course, the story here is not as simple as it appears from these graphs. If you watched the TEDx talk on Day 8 of this month's cybersecurity awareness blog posts, you will know that the first time I got excited about this White Male Effect in technology risk perception, my wife pointed out that I am a white male; and I don't, in her professional opinion, under-estimate risk. And in fact, research shows that significantly less than half of white males are what I would call the "problem" here: refusing to accept expert opinion as to how serious the risks of technology are to human health, safety, and prosperity.
One of the pioneers in risk perception research, Dan Kahan, collaborated in a 2007 study that found a certain type of white male was "so extremely skeptical of risks involving, say, the environment ... that they create the appearance of a sample-wide 'white male' effect."
As Kahan puts it, "that effect 'disappears' once the extreme skepticism of these individuals (less than 1/6 of the white [male] population) is taken into account." (see Kahan's discussion here). This makes a lot of sense when you look at cybersecurity. I think we can safely assume that most cybersecurity professionals perceive the risks from digital technology abuse to be high rather than low. And we know that for decades the cybersecurity profession has been dominated by white males. So what distinguishes them from the "certain type of white male" to which Kahan refers?
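A toy calculation, using numbers I made up purely for illustration, shows how this can happen: if a small, extremely skeptical subset pulls one group's average down, removing that subset can make the apparent group-level gap all but disappear. This is only a sketch of the arithmetic behind the point, not Kahan's actual analysis.

```python
# Toy illustration (not Kahan's analysis): a small, extremely skeptical
# subset of one group can create the appearance of a group-wide effect.
import numpy as np

rng = np.random.default_rng(0)

# Invented scores on a 1-5 risk scale, for illustration only.
others = rng.normal(3.6, 0.6, size=500).clip(1, 5)
typical_white_males = rng.normal(3.6, 0.6, size=420).clip(1, 5)  # similar to everyone else
extreme_skeptics = rng.normal(1.4, 0.3, size=80).clip(1, 5)      # under 1/6 of this sample
white_males = np.concatenate([typical_white_males, extreme_skeptics])

print("gap, full sample:     ", round(others.mean() - white_males.mean(), 2))
print("gap, skeptics removed:", round(others.mean() - typical_white_males.mean(), 2))
```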
The answer lies in something called the Cultural Theory of Risk. In the language of that theory, the white men in question, the ones drastically underestimating technology risks, are hierarchical and individualistic white males. According to this theory, "structures of social organization endow individuals with perceptions that reinforce those structures in competition against alternative ones" (Wikipedia). A hierarchical individualist is inclined to agree with statements like: it's not the government's business to try to protect people from themselves, and this whole "everyone is equal" thing has gone too far.
This blog post does not have room for a discussion of the Cultural Theory, but the diagram on the right helps put the terms hierarchical and individualistic into context. To grossly over-simplify, the folks who see as much risk as I do in technology tend to be in the lower right: egalitarian and community-minded (we're all equal and in this together). A lot of women tend to be in that quadrant.
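To give a rough sense of how such worldviews can be scored, here is a hypothetical sketch of grid-group scoring: agreement with a few statements is averaged into two axes, and the two axes place a respondent in one of the four quadrants. The statements and the scoring rule are paraphrased from the examples above for illustration only; they are not the actual Cultural Cognition scales.

```python
# Hypothetical grid-group scoring sketch. Items and thresholds are
# illustrative paraphrases, not the actual Cultural Cognition scales.

HIERARCHY_ITEMS = [
    "This whole 'everyone is equal' thing has gone too far.",
]
INDIVIDUALISM_ITEMS = [
    "It's not the government's business to try to protect people from themselves.",
]

def quadrant(answers: dict) -> str:
    """Place a respondent in one of the four worldview quadrants.

    answers maps each statement to an agreement score from 1 (strongly
    disagree) to 5 (strongly agree); 3 is the neutral midpoint.
    """
    hierarchy = sum(answers[i] for i in HIERARCHY_ITEMS) / len(HIERARCHY_ITEMS)
    individualism = sum(answers[i] for i in INDIVIDUALISM_ITEMS) / len(INDIVIDUALISM_ITEMS)
    grid = "hierarchical" if hierarchy > 3 else "egalitarian"
    group = "individualistic" if individualism > 3 else "communitarian"
    return f"{grid} {group}"

# Strong agreement with both statements lands in the quadrant discussed above.
print(quadrant({HIERARCHY_ITEMS[0]: 5, INDIVIDUALISM_ITEMS[0]: 5}))
# -> hierarchical individualistic
```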
For much more on this, you can read about the ground-breaking research that Lysa and I did on this theory in the context of digital risks in this two-part report: Adventures in cybersecurity research: Risk, Cultural Theory, and the White Male Effect, part one, and part two. (Kudos to ESET for supporting this work.) There is also a summary here on Medium.
Lysa and I gave a talk about this work at the (ISC)2 Security Congress in 2017, describing how the failure to listen to experts, rooted in these differences in risk perception, impacts cybersecurity. The main points are as follows:
- The security of digital systems (cybersecurity) is undermined by vulnerabilities in products and systems.
- Failure to heed experts is a major source of vulnerability.
- Failure to heed experts is a known problem in technology.
- The Cultural Theory of risk perception helps explain this problem.
- Cultural Theory exposes the tendency of some males to underestimate risk (White Male Effect or WME).
- Our research assessed the public’s perceptions of a range of technology risks (digital and non-digital).
- The findings provide the first-ever assessment of WME in the digital or cyber realm.
- Additional findings indicate that cyber-related risks are now firmly embedded in public consciousness.
- Practical benefits from the research include pointers to improved risk communication strategies and a novel take on the need for greater diversity in technology leadership roles.
We suggested several ways in which our findings, and those of other experts researching risk perception, might help improve risk communication. Here is the relevant slide from the talk.
#BeCyberSmart