Thursday, August 08, 2019

DEFCON III flashback: why hacking sucks

My session at DEFCON III back in 1995 has lived on as an audio recording (.m4b). Just scroll down this page: DEFCON III Archive. The title was intentionally provocative:

The Party's Over: Why Hacking Sucks

The idea was to generate dialogue about the ethics of hacking, and I think I succeeded. In fact, the audio captures that quite well.

(Bear in mind that this was 1995 and I've been to events in 2019 where organizers seemed incapable of capturing audio this well.)

As someone who had been working on the computer security problem since the 1980s, I have to say I learned a lot from this session and really appreciated everyone's input.

I was invited back the next year and I will post a link to that DEFCON IV session when I find it again. My topic was how to go from hacker to infosec professional, but like many early DEFCON talks it went in several other directions as well (steam trains?).

Here is a link to initiate the audio file download for the DEFCON III talk, and yes, it is safe to do so. The audio is about 49 minutes long and while the sound starts out rough, it gets better quickly. The file is 18.2MB and the filename is: DEF CON 3 Hacking Conference Presentation By Stephen Cobb - Why Hacking Sucks - Audio.m4b

Monday, August 05, 2019

Experienced vendor-neutral panelist available to talk cybersecurity, cybercrime, data privacy, and more

Has this happened to you? You have a great idea for a conference panel, but you need to find great panelists - preferably subject matter experts who are not employed by a vendor, yet have experience on panels.

Well, I am one such person: a completely independent researcher specializing in cybersecurity and data privacy who is also an award-winning technologist with 30 years of industry experience. And yes, I have a track record of well-received panel appearances.

So, if you're putting together a panel proposal, or your proposed panel was accepted but now you need panelists, take a look at my areas of expertise. If you think I might be right for your panel, let's discuss - you can reach me on LinkedIn and DMs are open on Twitter.

Here are some of my areas of expertise and interest:
  • Cybercrime and cybercrime metrics
  • Cybersecurity education, skills gap, and workforce issues
  • Cyber-war and cyber-conflict
  • Data privacy and data abuse
  • Risks and attacks arising from new technology (e.g. AI, IoT)
  • Public-interest technology and public policy related to the above
Here I am on video:

Friday, July 12, 2019

The big news from where I am, which will soon be somewhere else

Dateline San Diego, California, July 12:
Today is my last day at ESET, the company that I have worked for since 2011, and from which I am now retiring.

But wait, there's more news! In early September, Chey and I will be relocating to the city of Coventry, England, birthplace of the pedal-chain bicycle, Jaguar cars, the turbojet, my parents, my brother, and me.

I will be writing more about this move as time permits, with the latest developments signposted on this blog.

If you want to stay in touch, and I hope you do, you can use email to reach me (use scobb at scobb dot net). You can also find me on Twitter, where I am @zcobb. I'm on LinkedIn as well and you may even spot me on Facebook - where my profile is stcobb - but I don't go there very often. In the past I have published on Medium and I may write some more articles there in the future.

So, that's the news of the day from where I am. What follows are a few random thoughts on the occasion of my departure, retirement, and relocation.

For the record, we will be flying to England, not sailing. I say this because I have twice moved from North America to England on ships. Once when I was six, and again in 1975 on the TSS Stefan Batory.

Postcard of TSS Stefan Batory from the collection of VMF

Also for the record, I am leaving ESET with very positive feelings. I have never worked this long for anyone other than myself. In my opinion, ESET continues to set the standard for technical excellence, customer support, and dedication to helping the world enjoy safer technology. It was a privilege to work with such a great team of security researchers and I know that they will carry on the mission with courage, integrity, reliability, and passion. (Disclaimer: nobody's paying me to say this, I don't own stock or have any other financial stake in ESET.)

My relationship with ESET began exactly eight years ago this week, with a phone call about a job. The company wanted someone to do vendor-neutral security research and education, which was great for me because that's been a passion of mine since the late 1980s. Adding to the appeal: the company wanted me to be based in California, my favorite state. (Chey and I met in California over 30 years ago, but left in the late 1980s to live in Scotland.)

As for my future, who knows? I do know I will keep researching and opining, mainly about technology. I will continue to blog, and there is a book I want to write. Coventry is home to a pair of excellent universities and there are more in the surrounding area - often referred to as "The Midlands" - including my alma mater, the University of Leicester. Doing some form of teaching is a possibility.

So, when Chey and I get properly settled into our new home, it is possible that I will reemerge, maybe as something like a part-time, semi-retired, independent researcher and public-interest technologist. (I have been watching fellow security veteran Bruce Schneier move in this direction.)

At this point, and if this was a press conference, I would take questions. But I only have time for one right now, so I will answer the one I've been asked many times in recent weeks: Do you think you will miss San Diego?

Yes, I will miss San Diego, and not just because of the weather and the views. We have met so many wonderful people here, many of whom I have worked with in a business climate that is unique in my experience: San Diego has to be the Capital of Collaboration. This is a great place to work on technology projects that benefit the community, the nation, and the world. I have often said that cybersecurity is the healthcare of IT, and San Diego is a center of excellence in both meatspace healthcare and cyberspace security. (The cuisine is pretty awesome too.)

On that note, I thank you for reading this far and wish you all the best. As the saying goes:

So long, and thanks for all the fish tacos!


(Note: Image of ESET/Coventry combines a photo that I took plus photography by Si Chun Lam. Some rights reserved. This image is licensed under the Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) Licence.)

Monday, April 15, 2019

Dark markets, threat cumulativity, siegeware, and a cybercrime barometer

This is an update on five parts of my research and writing so far this year. The first part built on a suggestion from ESET PR Manager Anna Keeve: help people better understand the cybercrime threat by showing them the "dark markets" that are used to sell stolen information and buy the tools with which to steal it. So I decided to highlight their “evolution” into mainstream online services for enabling cybercrime.

1. Next Generation Dark Markets? Think Amazon or eBay for the criminally-inclined
In addition, Anna set up a session with the wonderful folks at Marketplace on NPR. So, if you want to hear more about the dark web, close your eyes and take this audio tour: Exploring the dark web with Kai Ryssdal on Marketplace

Wednesday, February 20, 2019

It's official: I'm an award-winning technologist

Earlier this month I was delighted to receive the CompTIA Tech Champion Award, "recognizing leaders focused on driving innovation, job growth and advancements for the information technology (IT) industry." There was even a press release and a video!

To put this award in context, CompTIA is the Computing Technology Industry Association: "the leading voice and advocate for...industry and tech professionals who design, implement, manage, and safeguard the technology that powers the world's economy."

Saturday, February 16, 2019

Risk assessment and situational awareness: minding the gender gap

Man and woman in elevator icon
Consider this: a man and a woman get into an elevator.

Which one is doing risk assessment:

the man or the woman?

I've been posing this question to random groups of people on the fringes of information security and cyber-workforce events for about a year now and the results have been very interesting to say the least. Almost without exception women respond by saying "the woman." And while I can honestly say that this is what I had expected, I continue to be surprised by two things.
  • How quickly that response is voiced, usually within a couple of seconds. 
  • How many women, after answering, proceed to share - without any encouragement - their personal elevator strategies (more on these later).
Also interesting: I have not yet heard a woman say: "I've never really thought about it."

How do men answer? A lot of them do eventually say "the woman" and I take that as a positive sign. It suggests that those men understand one of the fundamental realities of gender inequality in our society: women have had to adapt to living with a higher base level of fear for their personal safety than men.

But there are some men who hesitate before answering. You see quite interesting facial expressions when someone in mixed company answers "the woman" very quickly and decisively. And yes, some men seem genuinely puzzled. For those in doubt, I suggest some reading, like Rage Becomes Her.

Fear, risk perception and social science 

My original motivation in asking this question was to get a quick sanity check on a hypothesis that I had formed while researching risk perception as it relates to technology: women tend to see more risk in technology than men and so increasing female participation in technology development and cybersecurity may reduce risk and increase security.

Some results from the more formal research into risk perception as it relates to gender and technology are illustrated in the graph below - read more about the work here.

Of course, posing the elevator question to random groups of people does not count as formal social science. The reactions that I get may be influenced by the uncontrolled demographics of the group (all male, all female, mixed). That said, I'd love to hear from anyone who is in a position to do a more formal study.

What the graph above illustrates is the gender gap in technology-related risk perception. Numerous studies have documented this over the course of several decades (see the 1994 paper "Gender, race, and perception of environmental health risks" by Flynn, Slovic, and Mertz for early references: Risk Analysis, 14, pp. 1101-1108).

As far as I know, it was studies of public sentiment around environmental issues that led to the first documentation of a gender gap in technology-related risk perception. The research that I did with my colleague at ESET, Lysa Myers, was to the best of my knowledge the first to show that this gender gap also exists with respect to risks related to digital technologies. That finding led me to hypothesize that women - on average or in the aggregate - are more risk aware than men when it comes to technology.

A counter-argument might be that men are more realistic in their assessment of risk because the true level of risk is lower than women think and closer to the population mean. However, it is my opinion that many technology risks are higher than the mean, therefore I would argue that women are more accurate in their technology risk perception than men (on average or in the aggregate).

Research into gender and ethnic variations in risk perception has shown that white males, as a whole, see less risk in technology than black males, white females, or black females (these were the names of the categories used by the researchers). But that result - which has been dubbed the white male effect - is driven by a subset of white males who see drastically less risk than anybody else. That group, possibly 30% of white males, lowers the overall risk scores for all white males, creating the gap you see in this chart from the 1994 Flynn, Slovic, and Mertz study (adapted):
As I indicated earlier, this study was not an outlier: other studies point in the same direction, and I am not aware of any that point in the opposite direction (I did look for them). You can find quite a few studies, as well as deep dives into why some people see less risk in technology than others, at the Cultural Cognition Project at Yale Law School.

What does it all mean? As I suggested in my TEDx talk a few years ago, I think it means that the rate at which new technology risks are created would go down if decision-making roles in tech companies were more evenly distributed between genders.

Back then I said "we need more women in decision-making roles" and some surveys suggest that there are now more women in such roles than there used to be; but I think we are nowhere near the level of gender equality needed to put the brakes on fresh technological blunders.

In the coming months and years I will continue to articulate these views. In the meantime, I have another study concept you might want to consider. Document what happens when you ask women this question: "What goes through your mind if you're alone in an elevator and a man gets on?"

I think you will hear some interesting personal elevator strategies. The ones that I have heard certainly gave me a better sense of just how different life still is for women and men.

Thursday, January 24, 2019

How serious is the cybercrime problem in America?

The short answer to "how serious is the cybercrime problem in America?" is: Way more serious than our government seems to realize. That is one of the conclusions that can be drawn from recent ESET research into public attitudes to cybercrime, cybersecurity, and data privacy.

To check out the details, please visit this article I wrote at WeLiveSecurity, which is where you can download the full report. It contains some pretty solid data that may help us persuade policy makers to move cybercrime deterrence up the public policy agenda and make it the #1 priority that it should already be.

Frankly, as a student of criminology I was shocked to see that respondents thought cybercrime was a more important challenge than drug trafficking or money laundering. Almost equally worrying was the finding that less than half of Americans surveyed think that the authorities, including law enforcement, are doing enough to fight cybercrime.

So here is the conclusion that I wrote for the survey report: unless cybersecurity initiatives and cybercrime deterrence are made a top priority of government agencies and corporations, the rate at which systems and data are abused will continue to rise, further undermining the public's trust in technology, trust that is vital to America's economic well-being, now and in the future.

Please take a moment to share this information...thank you!

Sunday, August 12, 2018

What does threat cumulativity mean for the future of digital technology and cybersecurity?

In recent years, most of my presentations about cybersecurity have included a slide titled "Security is cumulative". I made the slide when a group of business people asked if I would speak to them about cybersecurity. As usual, I said I would be delighted to do so, but it would help me to know what aspects of the subject they wanted me to address. The conversation continued like this:
  • Them: “You’ve been at this for a long time, right?” 
  • Me: “Yes, I guess I’ve been researching security for about 30 years.“ 
  • Them: “Well, why not talk about the top five or six things that you’ve learned.” 
Why not, indeed. The idea appealed to me and so I created a new slide deck to capture my thoughts and my first thought was this: security is cumulative. Beneath it I wrote words to this effect: To protect information systems and the data they process you have to anticipate and defend against new threats while also defeating old threats.

Ever since I wrote that, I have seen confirmation after confirmation that it is correct. Of course, there's probably some confirmation bias at work, but consider these recent news stories:
That is five examples in 10 days – July 26 to August 4, 2018 – five headlines that reflect the reality that “security is cumulative”. While many information security professionals have, over the years, stressed the need to learn from history, I think this aspect of cybersecurity, this need to defend against an accumulating list of threats, deserves a name, so I am suggesting this one: threat cumulativity.

Here is my proposed definition of threat cumulativity: the tendency of new technologies to spawn new threats that do not displace old threats but add to them.

Of course, there will be objections to this term, starting with "cumulativity is not a word" and "everybody knows this already." Well, cumulativity is a word, as I will explain in a moment. As for "everybody knows this already" let me be blunt: that is one of the most persistent errors in security thinking, kept alive by security experts who are out of touch with the relationship between technology and people.

To be clear, if you are a security expert, you probably do know that threats are cumulative. But there are a whole bunch of people whose work impacts security who have not internalized the implications of this phenomenon. I think that having a term to describe the phenomenon will help to spread awareness of its implications.

Another objection to "threat cumulativity" is likely to be: "you mean risks, not threats, so you should be talking about risk cumulativity." This is a non-trivial point and so I am going to address it in a separate article. But I think there are good strategic reasons for using 'threat' here rather than 'risk'.

As for cumulativity, it is a term used in linguistic semantics to describe an expression (X) for which the following holds: "If X is true of both of a and b, then it is also true of the combination of a and b. Example: If two separate entities can be said to be "water", then combining them into one entity will yield more "water"." (Wikipedia)

Now, I am not an expert in linguistic semantics, but I do happen to have a decent degree in English Language and Literature. To my way of thinking, appropriating cumulativity for the security lexicon is a valid use of the word, one that can help people understand - and defend against - the phenomenon it purports to describe.

I will be writing more about threat cumulativity and furnishing examples of how it appears - to my eyes at least - to spell trouble for new technologies, some of which are the object of much hope for future prosperity.

Note: the illustration at the top of the article is from the works of Vauban, a pioneer in physical security, namely fortifications.

Sunday, May 06, 2018

Conversation starter for cybersecurity and workforce networking events

Elevator icon, created and released to public domain by Stephen Cobb
Suppose you work in cybersecurity and/or workforce development and you find yourself talking to other people in those fields, maybe during a networking break between conference sessions, or at one of those randomly seated lunches. Everyone has been introduced, said where they're from and for whom they work, but now there's a lull in the conversation. Try restarting it with this question:

A man and a woman get into an elevator; which one is doing risk assessment?

I have been asking strangers this question for a while now and the responses are very interesting. I don't want to tell you what they are at this point - that is a separate blog post. (I'm trying to devise a more formal study of responses from a range of audiences.)

But if you know me, or know of my research into gender and risk perception, you might be able to imagine where I'd like to see the conversation go after this icebreaker (places like a deeper understanding of how our sense of risk varies based on who we are and how our experiences in life have led us to differing levels of concern about potential threats to our wellbeing).

You might also want to ask this question outside of cybersecurity circles. Maybe in class? And you could change it up a little. For example, I have used "which one is more likely to be doing risk assessment?"

You could also ask this question on Twitter or Facebook (feel free to use the image above - frankly I think it's a daft sign, but I made it based on a real one that I saw recently in a very new office building in San Francisco).

Thursday, February 01, 2018

Cybersecurity and data privacy research: a modest eight piece portfolio

Research that I have done in cybersecurity and data privacy over the last few years has borne fruit in a number of different places so I wanted to provide a centralized reference point for eight of the main outputs. This should make it easier for folks to find them. I have annotated the items for context and relevance. (Note: I have formatted all the PDFs for Letter size paper but some of them use UK English spelling, others are US English.)

1. Code as a weapon

Document: Malware is Called Malicious for a Reason: The Risks of Weaponizing Code (PDF)

History: Published in the 6th International Conference on Cyber Conflict (CyCon) Proceedings, P. Brangetto, M. Maybaum, J. Stinissen (Eds.) IEEE, 2014.

Context:  I worked with my friend and colleague Andrew Lee, who was then CEO of ESET North America, to articulate several arguments against using code as a weapon. In the world of companies and consumers, program code that you run on someone else's system without permission is typically referred to as malicious software or malware. A single "infection" can cost a single enterprise hundreds of millions of dollars worth of damage (as in the WannaCry and NotPetya attacks of 2017, which used code developed by the NSA). We argued that the development of "righteous malware" by the military and intel communities, a process sometimes referred to as weaponizing code, has proceeded with insufficient input from the people who defend against, and clean up after, real world malware attacks. The consensus of this community is that military deployment of malicious code is at best a very risky proposition.

(While I was delighted that the paper was accepted for publication, and enjoyed traveling to NATO's Cycon event in Estonia in May of 2014 to present it, one of the reviewer's comments - "not very academic" - stung a little. Consequently, in August of 2014 I enrolled in a Master of Science program at the University of Leicester in England.)

2. Cybercrime and criminology

Document: The main problem with Situational Crime Prevention is that it fails to address the root causes of crime: a critical discussion

History: This 4,000 word essay, which includes an extensive reference list, was the first piece of work that I produced for my MSc in the Department of Criminology at the University of Leicester.

Context:  The essay received a good grade and writing it required me to think hard about some of the fundamental issues in criminology. Presented in the traditional English academic essay format, a proposition is argued for and against. In this case, the idea of practical crime prevention is set against the need to understand and address crime's root causes. My argument was framed in the context of cybercrime, aspects of which - such as attribution, scale, and geography - challenge traditional approaches to crime reduction. Of particular value to my evolving analysis of cybercrime was the early work on Routine Activity Theory performed by Felson and Cohen. Way back in 1979 they warned that: "the opportunity for predatory crime appears to be enmeshed in the opportunity structure for legitimate activities".

3. Measuring cybercrime

Document: Sizing cybercrime: incidents and accidents, hints and allegations

History: Paper selected for publication and presentation at Virus Bulletin, 2015. There is actually a video of the presentation that you can watch here.

Context: Just as defense of an information system means you first need to map and measure it, we need to know the scope and scale of cybercrime before we can effectively fight it. In many countries, the government tracks the number of murders, car thefts, bank robberies, and other crimes. This data helps inform budgeting and resource allocation while enabling the measurement of efforts to reduce crime. Unfortunately, few countries, if any, have been tracking cybercrime. I argue that this abdication of governmental responsibility severely hampers efforts to fight cybercrime and do the work of cybersecurity. In the US, the federal government now directs inquiries about the level of cybercrime towards surveys performed by commercial organizations that have a vested interest in selling security-related products and services. My review of the literature and the surveys themselves shows that many lack academic rigor and all are open to claims of bias.

4. Cyber futures and diversity: a TEDx talk

Document: Ones and Zeroes: a tale of two futures (video)

History: Talk given at TEDx San Diego, 2015, in which I drew on three things I learned while studying criminology, plus the inspiring young women of Securing Our eCity's Cyber Boot Camp.

Context: The organizers invited speakers to look to the future. I suggested that the future looks bleak if we don't step up our game in the realm of cybersecurity. I referenced crime deterrence and sentencing, Routine Activity Theory, Cultural Theory of Risk Perception, and White Male Effect. I ended by arguing that security would improve if we increased diversity in decision-making roles in technology companies.

5. The cybersecurity skills gap

Document: Mind this Gap: cybercrime and the cybersecurity skills gap

History: Paper selected for publication and presentation at Virus Bulletin, 2015.

Context: The more closely I looked at the growth in cybercrime, the more apparent it became that organizations were having great difficulty staffing cybersecurity positions.

6. Data privacy versus data protection in the US

Document:  Data privacy and data protection: US law and legislation

History: This white paper is based on an essay I wrote for my MSc in Security and Risk Management.

Context: As an essay, the document did not receive a great grade (it was deemed "not argumentative enough"). However, the underlying research was sound and, when formatted as a white paper, it has proved to be very useful for anyone trying to understand the American approach to data privacy in general, and more specifically, how this differs from the European notion of data protection, as embodied in the EU's General Data Protection Regulation or GDPR.

7. What it takes to be an effective CISO

Document: Getting to know CISOs: Challenging assumptions about closing the cybersecurity skills gap.

History: This is my MSc dissertation, all 18,000 words and 84 pages of it.

Context: From the abstract: "Pervasive criminal abuse of information and communication technologies has increased the demand for people who can take on the task of securing organizations against the increasing scope and scale of threats. With demand for these cybersecurity professionals growing faster than the supply, a problematic “cybersecurity skills gap” threatens the ability of organizations to adequately protect the information systems upon which they, and society at large, are now heavily reliant. This dissertation focuses on one barrier to closing the cybersecurity skills gap: the current paucity of knowledge about key work roles within the cybersecurity workforce – such as Chief Information Security Officer or CISO – and questionable assumptions about what it takes to perform such roles effectively."

8. Risk perception in cyber: a gendered perspective

Document: Adventures in cybersecurity research: risk, cultural theory, and the white male effect

History: A two-part article, published online, to present the results of the first ever survey of cyber risks relative to gender, ethnicity, and non-cyber hazards.

Context: If you are an information security professional, chances are you will have spent a fair amount of time and effort trying to get people and organizations to do more to protect their computers and data from abuse; and you will know that not everyone takes the risks of digital technology as seriously as you do. I asked myself why some people don't listen to experts, and why some people see less risk than others. With help from my ESET research colleague, Lysa Myers, I conducted a survey to measure the white male effect and related phenomena. Along the way we found that criminal hacking is now perceived as a serious risk to health and prosperity by a significant section of the population.