Friday, July 12, 2019

The big news from where I am, which will soon be somewhere else

Dateline San Diego, California, July 12:
Today is my last day at ESET, the company that I have worked for since 2011, and from which I am now retiring.

But wait, there's more news! In early September, Chey and I will be relocating to the city of Coventry, England, birthplace of the pedal-chain bicycle, Jaguar cars, the turbojet, my parents, my brother, and me.

I will be writing more about this move as time permits, with the latest developments signposted on this blog (scobb.net).

If you want to stay in touch, and I hope you do, you can use email to reach me (use scobb at scobb dot net). You can also find me on Twitter, where I am @zcobb. I'm on LinkedIn as well and you may even spot me on Facebook - where my profile is stcobb - but I don't go there very often. In the past I have published on Medium and I may write some more articles there in the future.

So, that's the news of the day from where I am. What follows are a few random thoughts on the occasion of my departure, retirement, and relocation.

For the record, we will be flying to England, not sailing. I say this because I have twice moved from North America to England on ships. Once when I was six, and again in 1975 on the TSS Stefan Batory.

Postcard of TSS Stefan Batory from the collection of VMF at http://vmf-cruiseshipsandliners.blogspot.com/

Also for the record, I am leaving ESET with very positive feelings. I have never worked this long for anyone other than myself. In my opinion, ESET continues to set the standard for technical excellence, customer support, and dedication to helping the world enjoy safer technology. It was a privilege to work with such a great team of security researchers and I know that they will carry on the mission with courage, integrity, reliability, and passion. (Disclaimer: nobody's paying me to say this, I don't own stock or have any other financial stake in ESET.)

My relationship with ESET began exactly eight years ago this week, with a phone call about a job. The company wanted someone to do vendor neutral security research and education, which was great for me because that's been a passion of mine since the late 1980s. Adding to the appeal: the company wanted me to be based in California, my favorite state. (Chey and I met in California over 30 years ago, but left in the late 1980s to live in Scotland.)

As for my future, who knows? I do know I will keep researching and opining, mainly about technology. I will continue to blog, and there is a book I want to write. Coventry is home to a pair of excellent universities and there are more in the surrounding area - often referred to as "The Midlands" - including my alma mater, the University of Leicester. Doing some form of teaching is a possibility.

So, when Chey and I get properly settled into our new home, it is possible that I will reemerge, maybe as something like a part-time, semi-retired, independent researcher and public-interest technologist. (I have been watching fellow security veteran Bruce Schneier move in this direction.)

At this point, and if this were a press conference, I would take questions. But I only have time for one right now, so I will answer the one I've been asked many times in recent weeks: Do you think you will miss San Diego?

Yes, I will miss San Diego, and not just because of the weather and the views. We have met so many wonderful people here, many of whom I have worked with in a business climate that is unique in my experience: San Diego has to be the Capital of Collaboration. This is a great place to work on technology projects that benefit the community, the nation, and the world. I have often said that cybersecurity is the healthcare of IT, and San Diego is a center of excellence in both meatspace healthcare and cyberspace security. (The cuisine is pretty awesome too.)

On that note, I thank you for reading this far and wish you all the best. As the saying goes:

So long, and thanks for all the fish tacos!

Stephen

(Note: Image of ESET/Coventry combines a photo that I took plus photography by Si Chun Lam. Some rights reserved. This image is licensed under the Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) Licence.)

Monday, April 15, 2019

Dark markets, threat cumulativity, siegeware, and a cybercrime barometer

This is an update on five parts of my research and writing so far this year. The first part built on a suggestion from ESET PR Manager Anna Keeve: help people better understand the cybercrime threat by showing them the "dark markets" that are used to sell stolen information and buy the tools with which to steal it. So I decided to highlight their “evolution” into mainstream online services for enabling cybercrime.

1. Next Generation Dark Markets? Think Amazon or eBay for the criminally-inclined
In addition, Anna set up a session with the wonderful folks at Marketplace on NPR. So, if you want to hear more about the dark web, close your eyes and take this audio tour: Exploring the dark web with Kai Ryssdal on Marketplace

Wednesday, February 20, 2019

It's official: I'm an award-winning technologist

Earlier this month I was delighted to receive the CompTIA Tech Champion Award, "recognizing leaders focused on driving innovation, job growth and advancements for the information technology (IT) industry." There was even a press release and a video!


To put this award in context, CompTIA is the Computing Technology Industry Association: "the leading voice and advocate for…industry and tech professionals who design, implement, manage, and safeguard the technology that powers the world’s economy."

Saturday, February 16, 2019

Risk assessment and situational awareness: minding the gender gap

Man and woman in elevator icon
Consider this: a man and a woman get into an elevator.

Which one is doing risk assessment:

the man or the woman?

I've been posing this question to random groups of people on the fringes of information security and cyber-workforce events for about a year now, and the results have been very interesting, to say the least. Almost without exception, women respond by saying "the woman." And while I can honestly say that this is what I had expected, I continue to be surprised by two things.
  • How quickly that response is voiced, usually within a few seconds. 
  • How many women, after answering, proceed to share - without any encouragement - their personal elevator strategies (more on these later).
Also interesting: I have not yet heard a woman say: "I've never really thought about it."

How do men answer? A lot of them do eventually say "the woman" and I take that as a positive sign. It suggests that those men understand one of the fundamental realities of gender inequality in our society: women have had to adapt to living with a higher base level of fear for their personal safety than men.

But there are some men who hesitate before answering. You see quite interesting facial expressions when someone in mixed company answers "the woman" very quickly and decisively. And yes, some men seem genuinely puzzled. For those in doubt, I suggest some reading, like Rage Becomes Her.

Fear, risk perception and social science 

My original motivation in asking this question was to get a quick sanity check on a hypothesis that I had formed while researching risk perception as it relates to technology: women tend to see more risk in technology than men, and so increasing female participation in technology development and cybersecurity may reduce risk and increase security.

Some results from the more formal research into risk perception as it relates to gender and technology are illustrated in the graph below - read more about the work here.


Of course, posing the elevator question to random groups of people does not count as formal social science. The reactions that I get may be influenced by the uncontrolled demographics of the group (all male, all female, mixed). That said, I'd love to hear from anyone who is in a position to do a more formal study.

What the graph above illustrates is the gender gap in technology-related risk perception. Numerous studies have documented this over the course of several decades (see the 1994 paper "Gender, race, and perception of environmental health risks" by Flynn, Slovic, and Mertz for early references: Risk Analysis, 14, pp. 1101-1108).

As far as I know, it was studies of public sentiment around environmental issues that led to the first documentation of a gender gap in technology-related risk perception. The research that I did with my colleague at ESET, Lysa Myers, was to the best of my knowledge the first to show that this gender gap also exists with respect to risks related to digital technologies. That finding led me to hypothesize that women - on average or in the aggregate - are more risk aware than men when it comes to technology.

A counter-argument might be that men are more realistic in their assessment of risk because the true level of risk is lower than women think and closer to the population mean. However, it is my opinion that many technology risks are higher than the mean, and therefore I would argue that women are more accurate in their technology risk perception than men (on average or in the aggregate).

Research into the gender and ethnic variations in risk perception has shown that white males, as a whole, see less risk in technology than black males, white females, or black females (these were the names of the categories used by the researchers). But that score - which has been dubbed the white male effect - is the result of a subset of white males seeing drastically less risk than anybody else. This group, possibly 30% of white males, lowers the overall risk scores for all white males, creating the gap you see in this chart from the 1994 Flynn, Slovic, and Mertz study (adapted):
As I indicated earlier, this study was not an outlier: other studies point in the same direction, and I am not aware of any that point in the opposite direction (I did look for them). You can find quite a few studies, as well as deep dives into why some people see less risk in technology than others, at the Cultural Cognition Project at Yale Law School.
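To make the arithmetic behind that subgroup effect concrete, here is a minimal sketch in Python. The numbers are hypothetical, chosen only to illustrate the mechanism, and are not taken from any of the studies mentioned above:

    # Hypothetical illustration: 30% of a group rate a technology hazard very low
    # (2 on a 1-10 scale) while the other 70% rate it the same as the comparison
    # group (6). The low-scoring subgroup pulls down the mean for the whole group.
    low_share = 0.30      # proportion seeing drastically less risk
    low_score = 2.0       # their (hypothetical) risk rating
    typical_score = 6.0   # rating given by everyone else

    group_mean = low_share * low_score + (1 - low_share) * typical_score
    comparison_mean = typical_score

    print(f"Group mean:      {group_mean:.1f}")       # 4.8
    print(f"Comparison mean: {comparison_mean:.1f}")  # 6.0

The whole group appears markedly less risk-aware, even though seven out of ten of its members perceive exactly the same level of risk as everyone else.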

What does it all mean? As I suggested in my TEDx talk a few years ago, I think it means that the rate at which new technology risks are created would go down if decision-making roles in tech companies were more evenly distributed between genders.

Back then I said "we need more women in decision-making roles" and some surveys suggest that there are now more women in such roles than there used to be; but I think we are nowhere near the level of gender equality needed to put the brakes on fresh technological blunders.

In the coming months and years I will continue to articulate these views. In the meantime, I have another study concept you might want to consider. Document what happens when you ask women this question: "What goes through your mind if you're alone in an elevator and a man gets on?"

I think you will hear some interesting personal elevator strategies. The ones that I have heard certainly gave me a better sense of just how different life still is for women and men.

Thursday, January 24, 2019

How serious is the cybercrime problem in America?

The short answer to "how serious is the cybercrime problem in America?" is: Way more serious than our government seems to realize. That is one of the conclusions that can be drawn from recent ESET research into public attitudes to cybercrime, cybersecurity, and data privacy.

To check out the details, please visit this article I wrote at WeLiveSecurity, which is where you can download the full report. It has some pretty solid data that may help us persuade policy makers to move cybercrime deterrence up the public policy agenda and make it the #1 priority that it should already be.

Frankly, as a student of criminology I was shocked to see that respondents thought cybercrime was a more important challenge than drug trafficking or money laundering. Almost equally worrying was the finding that less than half of Americans surveyed think that the authorities, including law enforcement, are doing enough to fight cybercrime.

So here is the conclusion that I wrote for the survey report: unless cybersecurity initiatives and cybercrime deterrence are made a top priority of government agencies and corporations, the rate at which systems and data are abused will continue to rise, further undermining the public’s trust in technology, trust that is vital to America’s economic well-being, now and in the future.

Please take a moment to share this information...thank you!

Sunday, August 12, 2018

What does threat cumulativity mean for the future of digital technology and cybersecurity?

In recent years, most of my presentations about cybersecurity have included a slide titled "Security is cumulative". I made the slide when a group of business people asked if I would speak to them about cybersecurity. As usual, I said I would be delighted to do so, but it would help me to know what aspects of the subject they wanted me to address. The conversation continued like this:
  • Them: “You’ve been at this for a long time, right?” 
  • Me: “Yes, I guess I’ve been researching security for about 30 years.“ 
  • Them: “Well, why not talk about the top five or six things that you’ve learned.” 
Why not, indeed. The idea appealed to me and so I created a new slide deck to capture my thoughts and my first thought was this: security is cumulative. Beneath it I wrote words to this effect: To protect information systems and the data they process you have to anticipate and defend against new threats while also defeating old threats.

Ever since I wrote that, I have seen confirmation after confirmation that it is correct. Of course, there’s probably some confirmation bias at work, but consider these recent news stories:
That is five examples in 10 days – July 26 to August 4, 2018 – five headlines that reflect the reality that “security is cumulative”. While many information security professionals have, over the years, stressed the need to learn from history, I think this aspect of cybersecurity, this need to defend against an accumulating list of threats, deserves a name, so I am suggesting this one: threat cumulativity.

Here is my proposed definition of threat cumulativity: the tendency of new technologies to spawn new threats that do not displace old threats but add to them.
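To illustrate the definition, here is a minimal sketch in Python; the technology generations and threat names are purely illustrative, not a taxonomy:

    # Illustrative only: each new technology generation adds threats to the list,
    # but none of the earlier threats are removed. Defenders face the full set.
    threats = set()

    generations = [
        ("floppy disks",   {"boot-sector viruses"}),
        ("email",          {"macro viruses", "phishing"}),
        ("the web",        {"drive-by downloads"}),
        ("cryptocurrency", {"ransomware", "cryptojacking"}),
    ]

    for technology, new_threats in generations:
        threats |= new_threats   # new threats accumulate on top of the old ones
        print(f"After {technology}: defending against {sorted(threats)}")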

Of course, there will be objections to this term, starting with "cumulativity is not a word" and "everybody knows this already." Well, cumulativity is a word, as I will explain in a moment. As for "everybody knows this already" let me be blunt: that is one of the most persistent errors in security thinking, kept alive by security experts who are out of touch with the relationship between technology and people.

To be clear, if you are a security expert, you probably do know that threats are cumulative. But there are a whole bunch of people whose work impacts security who have not internalized the implications of this phenomenon. I think that having a term to describe the phenomenon will help to spread awareness of its implications.

Another objection to "threat cumulativity" is likely to be: "you mean risks, not threats, so you should be talking about risk cumulativity." This is a non-trivial point and so I am going to address it in a separate article. But I think there are good strategic reasons for using 'threat' here rather than 'risk'.

As for cumulativity, it is a term used in linguistic semantics to describe an expression (X) for which the following holds: "If X is true of both of a and b, then it is also true of the combination of a and b. Example: If two separate entities can be said to be "water", then combining them into one entity will yield more "water"." (Wikipedia)
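For readers who want the formal version, the property quoted above can be written along the following lines, where a ⊕ b stands for the combination of a and b (my notation, chosen only for illustration):

    \forall a, b \; \bigl( X(a) \wedge X(b) \rightarrow X(a \oplus b) \bigr)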

Now, I am not an expert in linguistic semantics, but I do happen to have a decent degree in English Language and Literature. To my way of thinking, appropriating cumulativity for the security lexicon is a valid use of the word, one that can help people understand - and defend against - the phenomenon it purports to describe.

I will be writing more about threat cumulativity and furnishing examples of how it appears - to my eyes at least - to spell trouble for new technologies, some of which are the object of much hope for future prosperity.

Note: the illustration at the top of the article is from the works of Vauban, a pioneer in physical security, namely fortifications.

Sunday, May 06, 2018

Conversation starter for cybersecurity and workforce networking events

Elevator icon, created and released to public domain by Stephen Cobb
Suppose you work in cybersecurity and/or workforce development and you find yourself talking to other people in those fields, maybe during a networking break between conference sessions, or at one of those randomly seated lunches. Everyone has been introduced, said where they're from and for whom they work, but now there's a lull in the conversation. Try restarting it with this question:

A man and a woman get into an elevator; which one is doing risk assessment?

I have been asking strangers this question for a while now and the responses are very interesting. I don't want to tell you what they are at this point - that is a separate blog post. (I'm trying to devise a more formal study of responses from a range of audiences.)

But if you know me, or know of my research into gender and risk perception, you might be able to imagine where I'd like to see the conversation go after this icebreaker: toward a deeper understanding of how our sense of risk varies based on who we are, and how our experiences in life have led us to differing levels of concern about potential threats to our wellbeing.

You might also want to ask this question outside of cybersecurity circles. Maybe in class? And you could change it up a little. For example, I have used "which one is more likely to be doing risk assessment?"

You could also ask this question on Twitter or Facebook (feel free to use the image above - frankly I think it's a daft sign, but I made it based on a real one that I saw recently in a very new office building in San Francisco).

Thursday, February 01, 2018

Cybersecurity and data privacy research: a modest eight piece portfolio

Research that I have done in cybersecurity and data privacy over the last few years has borne fruit in a number of different places so I wanted to provide a centralized reference point for eight of the main outputs. This should make it easier for folks to find them. I have annotated the items for context and relevance. (Note: I have formatted all the PDFs for Letter size paper but some of them use UK English spelling, others are US English.)

1. Code as a weapon

Document: Malware is Called Malicious for a Reason: The Risks of Weaponizing Code (PDF)

History: Published in the 6th International Conference on Cyber Conflict (CyCon) Proceedings, P. Brangetto, M. Maybaum, J. Stinissen (Eds.) IEEE, 2014.

Context:  I worked with my friend and colleague Andrew Lee, who was then CEO of ESET North America, to articulate several arguments against using code as a weapon. In the world of companies and consumers, program code that you run on someone else's system without permission is typically referred to as malicious software or malware. A single "infection" can cost an enterprise hundreds of millions of dollars' worth of damage (as in the WannaCry and NotPetya attacks of 2017, which used code developed by the NSA). We argued that the development of "righteous malware" by the military and intel communities, a process sometimes referred to as weaponizing code, has proceeded with insufficient input from the people who defend against, and clean up after, real world malware attacks. The consensus of this community is that military deployment of malicious code is at best a very risky proposition.

(While I was delighted that the paper was accepted for publication, and enjoyed traveling to NATO's Cycon event in Estonia in May of 2014 to present it, one of the reviewer's comments - "not very academic" - stung a little. Consequently, in August of 2014 I enrolled in a Master of Science program at the University of Leicester in England.)

2. Cybercrime and criminology

Document: The main problem with Situational Crime Prevention is that it fails to address the root causes of crime: a critical discussion

History: This 4,000 word essay, which includes an extensive reference list, was the first piece of work that I produced for my MSc in the Department of Criminology at the University of Leicester.

Context:  The essay received a good grade and writing it required me to think hard about some of the fundamental issues in criminology. Presented in the traditional English academic essay format, a proposition is argued for and against. In this case, the idea of practical crime prevention is set against the need to understand and address crime's root causes. My argument was framed in the context of cybercrime, aspects of which - such as attribution, scale, and geography - challenge traditional approaches to crime reduction. Of particular value to my evolving analysis of cybercrime was the early work on Routine Activity Theory performed by Felson and Cohen. Way back in 1979 they warned that: "the opportunity for predatory crime appears to be enmeshed in the opportunity structure for legitimate activities".

3. Measuring cybercrime

Document: Sizing cybercrime: incidents and accidents, hints and allegations

History: Paper selected for publication and presentation at Virus Bulletin, 2015. There is actually a video of the presentation that you can watch here.

Context: Just as defense of an information system means you first need to map and measure it, we need to know the scope and scale of cybercrime before we can effectively fight it. In many countries, the government tracks the number of murders, car thefts, bank robberies, and other crimes. This data helps inform budgeting and resource allocation while enabling the measurement of efforts to reduce crime. Unfortunately, few countries, if any, have been tracking cybercrime. I argue that this abdication of governmental responsibility severely hampers efforts to fight cybercrime and do the work of cybersecurity. In the US, the federal government now directs inquiries about the level of cybercrime towards surveys performed by commercial organizations that have a vested interest in selling security-related products and services. My review of the literature and the surveys themselves shows that many lack academic rigor and all are open to claims of bias.

4. Cyber futures and diversity: a TEDx talk

Document: Ones and Zeroes: a tale of two futures (video)

History: Talk given at TEDx San Diego, 2015, in which I drew on three things I learned while studying criminology, plus the inspiring young women of Securing Our eCity's Cyber Boot Camp.

Context: The organizers invited speakers to look to the future. I suggested that the future looks bleak if we don't step up our game in the realm of cybersecurity. I referenced crime deterrence and sentencing, Routine Activity Theory, Cultural Theory of Risk Perception, and White Male Effect. I ended by arguing that security would improve if we increased diversity in decision-making roles in technology companies.

5. The cybersecurity skills gap

Document: Mind this Gap: cybercrime and the cybersecurity skills gap

History: Paper selected for publication and presentation at Virus Bulletin, 2015.

Context: The more closely I looked at the growth in cybercrime, the more apparent it became that organizations were having great difficulty staffing cybersecurity positions.

6. Data privacy versus data protection in the US

Document:  Data privacy and data protection: US law and legislation

History: This white paper is based on an essay I wrote for my MSc in Security and Risk Management.

Context: As an essay, the document did not receive a great grade (it was deemed "not argumentative enough"). However, the underlying research was sound and, when formatted as a white paper, it has proved to be very useful for anyone trying to understand the American approach to data privacy in general, and more specifically, how this differs from the European notion of data protection, as embodied in the EU's General Data Protection Regulation or GDPR.

7. What it takes to be an effective CISO

Document: Getting to know CISOs: Challenging assumptions about closing the cybersecurity skills gap.

History: This is my MSc dissertation, all 18,000 words and 84 pages of it.

Context: From the abstract: "Pervasive criminal abuse of information and communication technologies has increased the demand for people who can take on the task of securing organizations against the increasing scope and scale of threats. With demand for these cybersecurity professionals growing faster than the supply, a problematic “cybersecurity skills gap” threatens the ability of organizations to adequately protect the information systems upon which they, and society at large, are now heavily reliant. This dissertation focuses on one barrier to closing the cybersecurity skills gap: the current paucity of knowledge about key work roles within the cybersecurity workforce – such as Chief Information Security Officer or CISO – and questionable assumptions about what it takes to perform such roles effectively."

8. Risk perception in cyber: a gendered perspective

Document: Adventures in cybersecurity research: risk, cultural theory, and the white male effect

History: A two-part article, published online, to present the results of the first ever survey of cyber risks relative to gender, ethnicity, and non-cyber hazards.

Context: If you are an information security professional, chances are you will have spent a fair amount of time and effort trying to get people and organizations to do more to protect their computers and data from abuse; and you will know that not everyone takes the risks of digital technology as seriously as you do. I asked myself why some people don't listen to experts, and why some people see less risk than others. Aided by my ESET research colleague, Lysa Myers, I conducted a survey to measure the white male effect and related phenomena. Along the way we found that criminal hacking is now perceived as a serious risk to health and prosperity by a significant section of the population.

Thursday, December 21, 2017

Cybersecurity, risk perceptions, predictions and trends for 2018

A quick update on research into Americans' perception of risks related to digital technology, as well as some predictions for cybersecurity in 2018.

Risk perception and cybersecurity

Over the summer I conducted some research with my ESET colleague (@LysaMyers) on the topic of risk perception as it relates to hazards arising from the use of digital technologies, which can be termed "cyber risks" for short. Our goal was to better understand why different people see different levels of risk in a range of hazards, and why some people listen to experts when it comes to levels of risk, but others do not.

For the past few months we have been analyzing and reporting on this work. Several of our findings proved newsworthy, such as the extent to which concern about criminal hacking has permeated American culture. This was the subject of an ESET press release.

We also documented evidence of a phenomenon that others have dubbed the "White Male Effect" in risk perception. First documented in 1994 with respect to a range of hazards, the effect can be seen in our 2017 survey results here:


You can see more results of our research in several formats, from long to short:
For background on the cultural theory of risk perception that we used in our research, I encourage you to check out Dan Kahan's papers at the Cultural Cognition Project at Yale Law School. Prof. Kahan was very helpful to us as we designed our survey instrument (which is available to anyone who would like to repeat the survey).

Cybersecurity trends and predictions

As usual, I participated in ESET's annual review of security trends, this year contributing a chapter on critical infrastructure hacks, new malware for which was discovered by my colleagues. The Trends report is available here: https://www.welivesecurity.com/2017/12/14/cybersecurity-trends-2018-the-costs-of-connection/

Another annual ritual is my predictions webinar. A full recording of the December 2017 webinar that looks ahead to 2018 is available to watch on demand. Access is gated, but I think it is worth registering and should not result in a bunch of spam. Here is the agenda, click to access:


Note that regulatory risks were the top theme. And the regulation that tops them all is GDPR, the General Data Protection Regulation that comes into effect in May of 2018. I wrote about GDPR several times this year. In fact, the following article was my most widely read contribution to WeLiveSecurity in 2017: https://www.welivesecurity.com/2017/05/23/gdpr-is-world-ready-cybersecurity-impact/

Here's to all of us enjoying a safer year in 2018!

Saturday, September 09, 2017

Steps to take now the Equifax breach has affected almost half of US adults

The Equifax security breach, in which "identity theft enabling data" was stolen from a company that sells identity theft protection products, may well surpass the Target breach as one of the most impactful ever, at least from a consumer perspective.

As Lysa Myers, my ESET colleague, has noted, this breach appears to have occurred between mid-May and July. It was discovered by Equifax on July 29 and the scale is staggering: 143 million people affected, almost half of all adults in the US!

For those wondering how to identify or mitigate problems caused by this breach, Lysa has some good advice. Unfortunately, the response from Equifax has not been exemplary and there are concerns that it might be trying to restrict consumer rights of redress as part of its "help" process (see this Atlantic article and the update below).

For those wondering how such a thing could happen, I suggest "stay tuned" to your favorite cybersecurity news feeds. We have some information already (Equifax may have fallen behind in applying security updates to its Internet-facing Web applications). However, I am sure there will be more details to come.

In the meantime, I leave you with this weird fact: A share of Equifax (EFX) stock was worth about $143 before the breach, which affected 143 million people. It dropped dramatically after news of the breach broke, closing on Friday at $123. That's a drop of about 14%. Yet all the indications are that preventing the breach could have been as easy as, you guessed it: 1-2-3.

Update: Thanks to Brian Krebs for flagging the change that Equifax made to its breach alert page. This makes it clear that "the arbitration clause and class action waiver included in the Equifax and TrustedID Premier terms of use does not apply to this cybersecurity incident."

I am providing the address of the breach alert page below, but stress that you use it at your own risk. The fact that I feel compelled to say that is a reflection of how badly, in my opinion, Equifax has been handling the breach response so far: https://www.equifaxsecurity2017.com/