This week's paper is a draft from Ross Koppel, Sean Smith, Jim Blythe, and Vijay Kothari titled Workarounds to Computer Access in Healthcare Organizations: You Want My Password or a Dead Patient? First of all, great title. This paper is a work of ethnography, where the authors sat and studied how people in medical settings interacted with computers to do their work, and noted all sorts of workarounds they'd take to bypass security rules they judged to be a hindrance to that work.
The idea behind the paper is that, clearly, the people behind the computer systems are not working from a realistic understanding of what medical professionals have to contend with to do their job. And maybe, just maybe, if they sat down and figured out how said professionals actually do their work, things might be different:
Cyber security efforts in healthcare settings increasingly confront workarounds and evasions by clinicians and employees who are just trying to do their work in the face of often onerous and irrational computer security rules. These are not terrorists or black hat hackers, but rather clinicians trying to use the computer system for conventional healthcare activities. These “evaders” acknowledge that effective security controls are, at some level, important—especially in the case of an essential service, such as healthcare. [...] Unfortunately, all too often, with these tools, clinicians cannot do their job—and the medical mission trumps the security mission.
Mostly, the idea is that computer and security experts rarely happen to also be clinical care experts. What the paper finds through observations, interviews, and reports is that:
workarounds to cyber security are the norm, rather than the exception. They not only go unpunished, they go unnoticed in most settings—and often are taught as correct practice.
They break the workarounds down into categories, and they're just amazing.
Authentication
They note endemic circumvention of password-based auth. Hospitals and clinics write down passwords everywhere, to the point where "sticky notes form sticky stalagmites on medical devices and in medication preparation rooms". They've noted things like:
- entire hospitals sharing a password for a medical device (the password is taped on the device)
- emergency rooms' supply rooms with locked doors, where the entry code is written right on the door
- vendors that distribute stickers to put your password on your monitor
- computers with all employees' passwords in a Word document, with a shortcut to it on the desktop

In general, this happens because no one wants a clinician unable to get emergency supplies, and a patient dying, because a code slipped their mind. In some cases, passwords are shared so everyone can read the same patient charts, even when they each already have access of their own. In some cases, bad actors can use this to tamper with data.
But really, even the passwords themselves are worse in healthcare. The paper states, for example, that "the US Inspector General notes that NIST will certify EHR systems as secure even if passwords are only one character long".
Password expiry also gets a slam:
one physician colleague lamented that a practice may require a physician to do rounds at a hospital monthly—but that unfortunate expiration intervals can force the physician to spend as long at the help desk resetting an expired password as he or she then spends treating patients.
De-Authentication
This one is neat. After you've authenticated someone, you need to de-authenticate them when they walk away, so their session ends and nobody surfs on their login. In some cases, forgetting to log out can lead to abuse, or to mistakes where people enter information for the wrong patient. Unfortunately, de-authentication is often undesirable to the staff as well, and so the authors note the following workarounds:
- defeating proximity sensors by putting styrofoam cups over detectors
- asking the most junior person on staff to keep pressing the space bar on everyone's keyboard to prevent timeouts
- clinicians offering their logged-in sessions to the next clinician as a "professional courtesy" (even during security training sessions)
- nurses marking their seats with sweaters or large signs with their name on them, hiding computers, or lowering laptop screens to mark them as busy
One clinician mentioned that his dictation system has a 5-minute timeout that requires a password, and that during a 14-hour day he spends almost 1.5 hours logging in. In other cases, an auto-logout feature exists on some systems but not all of them, so staff sometimes expect to be logged out when they are not.
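As a quick sanity check on that figure, here's a back-of-the-envelope sketch; the per-login time and the number of logins per shift are my own assumptions, not numbers from the paper:

```python
# Back-of-the-envelope check on the "almost 1.5 hours" figure.
# Assumed numbers (not from the paper): each login takes ~30 seconds,
# and the 5-minute idle timeout means nearly every return to the
# dictation system needs a fresh login.

shift_minutes = 14 * 60      # a 14-hour day
seconds_per_login = 30       # assumed
logins_per_shift = 180       # assumed: roughly one every 4-5 minutes

minutes_lost = logins_per_shift * seconds_per_login / 60
print(f"{minutes_lost:.0f} minutes lost to logins")        # 90 minutes
print(f"{minutes_lost / shift_minutes:.0%} of the shift")  # 11%
```

Under those assumptions, losing about an hour and a half of a 14-hour shift to password prompts is entirely plausible.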
One specific example of such a usability problem is:
A nurse reports that one hospital’s EMR prevented users from logging in if they were already logged in somewhere else, although it would not meaningfully identify where the offending session was. Unfortunately, the nursing workflow included frequent interruptions—unexpectedly calling a nurse away from her COW. The workflow also included burdensome transitions, such as cleaning and suiting up for surgery. These security design decisions and workflow issues interacted badly: when a nurse going into surgery discovered she was still logged-in, she’d either have to un-gown—or yell for a colleague in the non-sterile area to interrupt her work and go log her out.
Which is an interesting way to see how compliance requirements can interact oddly with the reality on the ground.
Breaking the Representation
Usability problems often result in medical staff working around the system in ways that create mismatches between reality and what the system has on record.
One example given is that one Electronic Health Record (EHR) system forces clinicians to prescribe blood thinners to patients meeting given criteria before they can end their session, even if the patient is already on blood thinners. So clinicians have to do a risky workaround where they order a second dose of blood thinners so they can log out (a dose that could be lethal if the patient actually received it), quit the system, then log back in to cancel the second dose.
Another example comes from a city hospital where creating a death certificate requires a doctor's digital thumbprint. Unfortunately for that hospital, only a single doctor has thumbs the digital reader manages to scan, so that doctor ends up signing all the death certificates for the hospital, regardless of whose patient the deceased was.
There's yet more for these mismatches:
- the creation of shadow notes, paper trails that get destroyed because they are not wanted in an official formal record
- "nurses brain" notes that list all tasks for a patient for their shift (something the computer does not support)
- the creation of shadow notes because the computer doesn't allow enough precision
- needing to note the operating room (OR) admission time precisely when the computer is a two-minute walk away and won't accept future times (on paper, nurses simply wrote now()+2 minutes); so the nurse logs in, turns off the monitor, wheels the patient into the OR, then runs back out to mark the record with a more accurate time (see the small sketch just after this list)
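As an aside on that last item, the whole dance exists because the system rejects any timestamp in the future. Here's a tiny illustrative sketch of a more forgiving check with a small grace window; the function name and the window size are mine, not anything from the paper or a real EHR:

```python
from datetime import datetime, timedelta, timezone

# Illustrative only: accept admission times slightly in the future, so a nurse
# can record "we'll be in the OR in two minutes" from the computer she's at now.
GRACE_WINDOW = timedelta(minutes=5)  # assumed value

def admission_time_is_valid(entered, now=None):
    now = now or datetime.now(timezone.utc)
    return entered <= now + GRACE_WINDOW

# The strict rule described above amounts to GRACE_WINDOW = 0, which is
# exactly what pushes nurses back to paper and to gaming the record.
```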
None of this is really surprising to me; any inadequate system seems to have a tendency to create its own shadow workflow that hides problems by working around them.
Permission Management
Access control plainly sucks, if I can be allowed the editorial tone:
Clinicians often have multiple responsibilities—sometimes moving between hospitals with multiple roles at each one, but accessing the same back-end EHR. Residents change services every 30 days during their training. If access is limited to one service, it needs to be reconfigured that often. However, a resident may be consulted about a former patient, to which he/she no longer has access. More frequent are clinicians who serve in multiple roles: the CMIO may need access to every patient record, not only those in her/his specific medical sub-discipline. A physician who focuses on infectious disease may also be on the committee that oversees medication errors, and thus requires access to the pharmacy IT system and the nurses medication administration system. In some hospitals, nurses sometimes authenticate as nurses and sometimes as doctors.
Also not surprised.
Undermining the Medical Mission
Many health IT systems are so bad they're seen as harming the medical objectives of practitioners.
The example given here is that some hospitals have a tele-ICU, where patients must be monitored from distant nursing stations that get a video feed and all the vitals relayed to them. However, when bathing patients, the nurses have to cover the cameras to protect the patients' privacy, and so the tele-ICU can't monitor them adequately anymore.
There's also a case where a doctor couldn't find the required medication in the software. He found a custom free-text field where he noted the prescription, but that field was not visible on the other end, so the prescription was never given and the patient lost half his stomach.
Finally, the authors circle back to the value of ethnographic investigation for properly adapting tools to the work. They end by stating:
in the inevitable conflict between even well-intended people vs. the machines and the machine rule makers, it’s the people who are more creative and motivated.
I do also appreciate the 'well-intended' qualifier, and I felt like this sentence was a good way to end given current events.
one of the biggest problems i observed in corporate IT, which (from what I've heard) seems to be endemic everywhere, is one simple, very straightforward problem with a very simple solution which absolutely nobody implements, and it is precisely, and nothing more than, this:
There is no central list of accounts that each person requires, and any given password reset only applies to a single account at a time.
the particular reason I highlight this is that when I joined on at my last employer, they had solved this problem, and by the time i left, they had unsolved it.
on day one, I received The Spreadsheet, a single .xls containing passwords for every conceivable system that I could ever need access to. every employee received this. it was generated by a single helpdesk worker who knew how to reset every single system in the entire company. that list also provided the internal URLs of all of those systems.
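to make it concrete, here's a rough sketch of the kind of per-employee account inventory The Spreadsheet amounted to; the systems, usernames, and URLs below are made up for illustration, not the real ones:

```python
# rough sketch only: invented systems, usernames, and URLs.
ACCOUNT_INVENTORY = {
    "jdoe": [
        {"system": "corporate AD domain", "username": "CORP\\jdoe",
         "url": "https://intranet.example/ad"},
        {"system": "ticketing",           "username": "jdoe",
         "url": "https://tickets.example"},
        {"system": "payroll",             "username": "jdoe",
         "url": "https://payroll.example"},
    ],
}

def accounts_for(employee):
    # the question nobody downstream could answer: which accounts do i even have?
    return [entry["system"] for entry in ACCOUNT_INVENTORY.get(employee, [])]

print(accounts_for("jdoe"))  # ['corporate AD domain', 'ticketing', 'payroll']
```

the format doesn't matter; what matters is that a reset request can name the exact system, and onboarding can enumerate every account a person needs up front.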
was this secure? i don't care, because every single employee at every single company has something like it. you can deliver their passwords however you like; they are putting them into something like this. you're lucky if it's an xls on their computer, which is at least protected behind a domain password. if you aren't, it's a sheet of paper in their desk drawer.
for the first five years that i was at the company, nobody ever had password issues. it simply did not happen. we never got compromised, either. everyone had access to everything at all times, and if you requested a password reset, you knew exactly which system to request it for, and the helpdesk person knew how to reset that because they needed to know it in the first place to do onboarding.
then we got acquired by someone who acted like Every Other Corporation. they did away with this. new employees received one password, and were not told what it was for. it was usually for the "corporate" AD domain - we had five domains due to a multiway merger and every single employee needed access to all of them.
none of the IT staff understood this. requesting a password reset invariably resulted in the wrong password getting reset. employees did not understand how many accounts they had, or what password went with what.
in the latter five years of my employment at that company, i watched as everyone simply gave up. they all just stopped doing their jobs. after five years of working with a shockingly determined and flexible team, I found that, fairly quickly after the merger, everyone became completely paralyzed by password resets. the IT staff were outsourced, and now new employees were faced with weeks of requesting passwords for accounts they couldn't identify and had never used, from IT techs who had never heard of our company before that day.
what i learned is very simply this: if someone is met with a password prompt, they will try as hard as they can to simply avoid needing to fill it out, up to and including just walking away from the task in question. they will request a password reset the first time, but the second time they will spend an hour trying to get someone else to do it, and the third time they'll just find a way to weasel out of doing it at all. they recognize password resets - accurately - as a stupid, demeaning waste of their time. if you don't solve this problem, it will either get "solved", or people will simply not do their jobs. nobody will tolerate daily password resets, and millions of systems demand them.
by the way: we got compromised twice after the merger. after we stopped giving people .xls files. after the "best practices" were applied. never before.
