Meltdown, Spectre, and user stories from the new security “normal”

When I talk to people about how to protect themselves against security problems, often the first feeling they express is guilt or shame. That’s what I heard from my friend Lindsay the other day when I sounded the alarm about Meltdown and Spectre to my old high-school crew.

Lindsay is a trained opera singer, a mom, and the wife of a pastor — about as refreshingly far as you can get from the daily grind of the tech industry. She thanked me for letting her know about the urgency of patching. "I’ve been putting it off," she said, "but not for good reasons: my internet often cuts off in the middle of an update, and I don’t continue. And on my phone, I have so many toddler photos and so much music that I don’t have space for the new system."

It’s striking to think about: Why would she feel these are not good reasons? They’re very common reasons for not updating — so ordinary that you could see a user-experience designer putting her story forward as a central use case for a new storage-management product. Viewed alongside the distractions of running around after a toddler, helping manage a church community, and running music classes, managing phone storage and system updates looks like a low-priority chore — it’s not critical to anything she does. Setting up automatic backups to the cloud might behave unpredictably, or spur concerns about privacy or losing access to her files. And the poor connectivity she faces is the norm for a majority of internet users worldwide.

Lindsay’s story is one of millions suddenly cast in a new light by the revelation of these chip vulnerabilities, which affect billions of devices around the world. Update, update, update, goes the drumbeat in articles on the problem. But updates have usability consequences, which on the surface seem unrelated to the primary security issue (information leaks). Patches have slowed down processors and caused errors, even bricking some machines.

Meltdown and Spectre should push security advisors to change the calculus of their recommendations to users and corporations. Our thinking about threat models is going to have to balance risk against usability in a more nuanced way.

In this post, I’ll expand on some of the use cases I see, where updating may not even be possible. I’ll outline some reasons why the patches will further hinder the tech industry’s understanding of the usability of their products. And I’ll touch on the theme I led with: the psychology of security behavior, and why we need to work harder to improve everyone’s fundamental security understanding and their sense of security empowerment.

Not updating because of functionality

There are two common reasons why people do not update: an update may make it impossible to do what they need to do on their machines, or they may not be able to afford it. Ultimately, both boil down to one thing: in many cases, the software they need to get the job done is bespoke or out of date, and it would cost a prohibitive amount of money and/or time to get that software updated.

The use case that is perhaps the most fun to think about here is NASA’s. Shortly after the announcement of the defects, a former NASA employee pointed out that updates may render the specially-crafted software systems of the space agency — say, optics or laser systems — unable to function. (I say NASA’s problem is “fun” because of a unique, unavoidable problem it faces that many other bespoke-software users don’t: some of its devices are way, way out of the range of human ability to patch them, in the far-flung reaches of our solar system and beyond. Fortunately, there are no human users whose lives will be negatively impacted by broken software.) The systems that remain on Earth, therefore, are limited in how they can communicate with those systems.¹ Patching, in this case, might not be an option — the continuation of a mission is likely worth the risk of criminals mucking around in data leaked by the chips.

Back on Earth, those of us in creative fields often have to weigh security concerns against the evolution of the software packages we use. One of my favorite dance musicians works with a Rube-Goldberg-looking custom rig he built himself, which uses shoes and a steering wheel to trigger a homebrew system he wrote years ago. It runs on Mac OS 9. He can’t update it. Myself, I produced a YouTube series that I edited primarily in Final Cut Pro 7. When I work with our archives, I have to run FCP7 on an outdated machine. I’m not doing enough video work right now to justify paying for updated software, so I run it on my elderly Mac laptop and keep it air-gapped from my network.

Organizations using custom-built systems will face a choice: take a risk with the chip vulnerabilities by continuing to run their old software, invest in building new software that meets their needs, or go with an off-the-shelf solution that may not do precisely what they need. Healthcare systems — for record-keeping, monitoring, diagnostics, etc. — pose a particular challenge. According to FDA guidelines, patches to medical systems don’t necessarily require FDA review, but they may if a patch changes “how [the system] works” or makes the system “less effective.” Given the slowdown the Meltdown/Spectre patches incur and the errors they have caused, it remains to be seen how the FDA will address the significant patching required in this case — and then, whether and how hospitals will be able to comply.

A wide swath of companies has bespoke systems that run one part or another of their business. I was speaking with a software architect the other day whose large media organization was going to have to balance overhauling its existing content-management system — incurring not only the cost of the new system but also training and changes to workflows — against the possibility that the vulnerability would be exploited. For his personal computers, though, he said he would rather take the security risk than see his machine run more slowly.
 

Not updating because of cost

We already began to see healthcare organizations impacted over the past year by malware aimed at their unpatched, older systems; WannaCry held their systems hostage and demanded ransom. Nobody leaves their devices unpatched out of malice. If hospitals aren’t updating systems, they’re likely weighing the significant cost of updating, building new software, and/or upskilling their workers against other things their patients need — like the services provided by their bespoke software.

Schools do the same: the public school system in Grand Rapids, MI, still relies on a 1980s-era Amiga, programmed by a former student, to control the heating and air conditioning in its buildings. The district estimates it will cost between $1.5 and $2 million to replace.

And in that, they’re like most of us when we’re at home. Worldwide, huge numbers of people use outdated devices and software because they can’t justify the costs of new ones. It’s that fact that casts digital divides in a new light in this particular security event.

Spectre and Meltdown are vulnerabilities at the level of chips, the basic hardware of the devices we use. Operating system and browser updates are offered to users at no cost these days. But new devices aren’t given out for free. And that means the ecosystem of known vulnerable machines just got much, much larger.

At the Internet Freedom Festival, we work with free-speech activists and journalists around the world to help them protect their work from repressive governments by using security software. Many of these people live in communities where common operating systems are frequently out of date, if not pirated and shared from potentially dubious sources. Bootleg Windows boxes are what they can access and afford. (Lest my Linux-loving colleagues insist that these users are foolish for not running that freely available operating system, be aware that the One Laptop Per Child project, which had students using a Linux variant, was criticized by communities where it was deployed for asking them to teach their students a less-polished, harder-to-maintain system instead of Windows, which they understood as the international common coin of commerce.)

[Photo caption: A stained glass artist in rural Vermont receives his first text message — from Google, for password recovery. He purchased the computer pictured here during the G.W. Bush administration. Jessamyn West, the librarian who took this picture, has championed awareness of digital divides.]

The advice my colleagues give activists and journalists is to do sensitive work in free, hardened, compartmentalized operating systems like Qubes or Tails, which can run on their usual machines, instead of Windows or Mac. (Qubes and Tails have both addressed what they can about Spectre and Meltdown, though they acknowledge some vulnerabilities may remain.)

But herd immunity is important in our networks. The persistence of vulnerable machines around us leaves a door open for attackers. Compromised older machines have long been harnessed by malicious actors into botnets: they’re used to send spam, launch distributed denial-of-service (DDoS) attacks that take down websites, or serve as stepping stones to machines with more valuable contents on the same networks.

Given the persistence of older machines in our networks, Spectre and Meltdown vulnerabilities will be with us for a long time. How will our security advice need to change to be relevant to people who can’t afford to upgrade?

How tech producers will miss the effects of this sea change

Tech industry employees have a massive blind spot about digital divides. When you work in the tech industry, your company supplies you with top-of-the-line devices and the fastest internet it can procure. And this puts the difficulty many people have with their online experience out of sight and out of mind.

As a user-experience designer, I can sometimes see my colleagues’ privileged perspective online. It appears when, on my smaller personal laptop, I have to scroll far to the right or bottom of the page to find salient information or buttons — which I’m sure looked perfectly accessible when the page was designed on the massive monitors our companies provide us. We literally have a broader view than our users.

Facebook encouraged its employees to think beyond the privilege of their usual working conditions by starting “2G Tuesdays.” Through that program, employees could make their internet connections simulate the slower wireless connections that are often available in developing countries.
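You don’t need a corporate program to try this yourself. As a rough sketch — assuming a Linux machine with an interface named eth0 and root access; the delay and bandwidth numbers here are illustrative, not Facebook’s actual settings — the kernel’s traffic-control tool tc with the netem qdisc can impose a 2G-like connection:

```shell
# Throttle outbound traffic on eth0 to approximate a 2G connection:
# ~400 ms of added latency and ~50 kbit/s of bandwidth (requires root).
sudo tc qdisc add dev eth0 root netem delay 400ms rate 50kbit

# ...browse your own product under these conditions...

# Remove the throttle when you're done:
sudo tc qdisc del dev eth0 root
```

Even a few minutes of this is humbling: pages that feel instant on an office connection can take tens of seconds to become usable.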

[Photo caption: A dance camp organizer in rural Vermont learns new spreadsheet tricks. Jessamyn West's original caption for this picture noted that she had just taught the organizer how to use "undo." Jessamyn reports on her weekly tech-help drop-in sessions on Twitter and in her newsletter.]

The Spectre and Meltdown vulnerabilities threaten to worsen the digital divide. Tech industry employees are, by and large, already using recent-model machines with fast processors. For them, the slowdown that patching caused won’t be noticeable as they continue to develop new and more complex software and websites. But consider your local school system, bank, grocery chain, or hospital. How old are the devices they’re running? When they load sites and run software, will their experience on patched devices be as speedy as it appeared to the producers in the tech industry? Developers and experience designers: will we be testing our software’s performance in ways that capture the fact that our users’ machines may be slower or buggier because of patches?
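One way to start baking that concern into a test suite — a minimal sketch of my own, where SLOWDOWN_FACTOR and render_page are hypothetical stand-ins, not measured values or a real rendering pipeline — is to scale measured timings by an assumed patch slowdown before checking them against a performance budget:

```python
import time

# Assumption: patched, older machines run some fraction slower than the
# developer machine the test runs on. 1.3 here is illustrative, not a
# benchmark of the actual Meltdown/Spectre patch overhead.
SLOWDOWN_FACTOR = 1.3

def render_page():
    """Stand-in for real rendering work (replace with your own code)."""
    time.sleep(0.01)

def test_render_within_budget():
    budget_seconds = 0.1  # the experience we want users to have

    start = time.perf_counter()
    render_page()
    elapsed = time.perf_counter() - start

    # Scale the developer-machine timing to approximate an older,
    # patched machine before comparing against the budget.
    assert elapsed * SLOWDOWN_FACTOR < budget_seconds, (
        f"estimated patched-machine time {elapsed * SLOWDOWN_FACTOR:.3f}s "
        f"exceeds budget {budget_seconds}s"
    )
```

It’s crude — a constant multiplier is no substitute for testing on real aging hardware — but it at least forces the question of whose machine the budget is for.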

Peace of mind

It’s an unavoidable fact that the ecosystem of vulnerable machines just got significantly bigger. Our challenge as technology builders is to figure out what we need to say now to people who use what we make. The security solution to this issue is to urge everyone to patch their systems. But from a usability and business perspective — a human perspective — the advice has to be more nuanced.

And as in my conversation with Lindsay, people’s feelings about staying secure have to be taken into account. I don’t know about you, but when I know I’m working on a vulnerable or compromised machine, I end up feeling… dirty. If so much as a weird-looking popup shows up on my machine, I frantically cast around to figure out where it came from. I worry I’m going to communicate some kind of disease to other machines I connect to. And that makes me feel guilty, the way Lindsay does about having a phone full of toddler pictures, or the way a Zimbabwean activist friend of mine did when he confided to me that he didn’t encrypt his email — which is notoriously hard to use! — because he couldn’t get the client to work.

It’s a known fact in the infosec community that human beings are the weakest link in security. Our brains aren’t good enough to remember strong passwords. Most people alive today just haven’t had the time or resources to learn how digital systems work; their vague mental models of the systems lead them to misunderstand how to stay secure. Our emotions lead us to make bad decisions — say, about emails that show up in our inboxes — out of panic, greed, lust, curiosity, or just reflexive response to what looks like a message from the boss.

But perhaps one of the worst feelings those of us in the tech industry can inflict is a feeling of helplessness. Given the complexity and the newness of the technology, and the non-stop revelation of new vulnerabilities, it can be hard not to feel helpless. And recurring feelings of helplessness can lead to learned helplessness, in which people give up on taking any action to protect themselves at all.

So now is not the time to make people feel bad about not patching. As we talk to those we know about Meltdown and Spectre, it behooves us not to be overdramatic about the risk or to talk down to people because they don’t know as much as we do.

The Internet Freedom Festival’s community has developed awareness that the emotional state of those we talk to about security is critical to ensuring that they adopt better security behaviors — particularly important when working with activists and journalists whose lives may be at stake, but certainly also relevant when working with executives who worry their jobs may be threatened by security breaches. Or indeed anyone who may have gone through the stress of having their identity stolen.

Fear, anxiety, and stress can all impair people’s ability to absorb and weigh security information in a healthy way. The Level UP security-trainer curriculum, developed for activists and journalists, offers guidance on managing the psychological impact of security training. It’s worth considering anywhere security discussions are happening.

Those of us who build software or work in security need to acknowledge that we, too, make tradeoffs when realistically weighing security risk against what we need to get done. We need to walk management, the businesses we contract with, and the public through our thinking about the likelihood of risks and the tradeoffs with usability — beyond patch! patch! patch! Doing so will help them build their own ability to reason about the dimensions of security threats.

Insecurity is the new normal. We’ll be stronger as a network if we help each other understand the social and emotional ramifications of that fact.
 

1. For more information on how JPL scientists adapted obsolete software to keep squeezing the last drops of value out of the Cassini mission, see Marisa Leavitt Cohn’s amazing research.