Cariad
@Cariad

I decided to ask ChatGPT a question I once had during a job interview before being hired on at my current company:

Tell me what to do as a cyber security professional working in an industrial control network when an HMI is compromised and human safety is possibly at risk

And its immediate response was to give me this answer:

If you're a cybersecurity professional working in an industrial control network and you suspect that an HMI (Human Machine Interface) is compromised with potential risks to human safety, it's crucial to take immediate and systematic steps to address the situation. Here's a general guideline:

  1. Isolate the Compromised System:
  • Identify the compromised HMI system and disconnect it from the network immediately to prevent further spread of the attack.
  • If possible, isolate the compromised system physically to ensure it cannot communicate with other networked devices.
  2. Notify Relevant Personnel:
  • Alert the appropriate personnel, including operations, management, and IT security teams, about the compromise.
  • Activate the incident response team and emergency procedures as outlined in your organization's cybersecurity and safety policies.

This has me quite pissed off.

The thing that ChatGPT did here was assume that industrial control security is handled no differently than enterprise or general computing; this is dead wrong.

Industrial control involves human safety, and taking an HMI out of the equation can actually make matters worse. There have been documented situations where an arbitrary decision such as the recommended "disconnect it from the network" triggered cascading effects or other undesired outcomes.

Unplugging an HMI from a PLC and a control network as a whole could cause a ripple effect that puts human lives in danger. An HMI is basically a computer (but built to withstand a rugged environment) and a PLC (programmable logic controller) is the interface the HMI interacts with to make machines do things. I am simplifying things greatly here, but the point is that they're not regular computers.

ChatGPT correctly says to notify relevant personnel, but it fails to put that step first and doesn't mention anyone like a plant manager or a similar role. You need people who are familiar with the equipment to handle things like lockout, shutdown, and so on; in my field we call them "subject matter experts" or "SMEs".

If I hear of a compromised HMI, my instinct is to talk to the person overseeing operations first and get them to help mitigate it before even dreaming of disconnecting anything unilaterally. I do not want to be responsible for creating increased risk of human casualties.

This scares the fuck out of me.



in reply to @Cariad's post:

safety-related shit is exactly where generative bullshit's "correct-sounding" output is going to make things so, so much worse. because on its face, if you don't know anything, it sounds plausible! like oh yeah if something's broken of course you want to remove it!


well usually there is always a safety switch to stop the actual machine... but yeah, it's like removing the dashboard, steering wheel, and pedals from a moving car: good luck, have fun. still got your keys and the handbrake!

i didn't mean to press that in case the hmi is compromised or something, i meant: there is always a way to stop the thing from doing its thing.
but that should not be done by a layperson, at least not unsupervised. ... chill, i got you, i just thought the no-controls-in-a-moving-car thing would be an apt example of what a missing hmi might feel like... whatever.

pardon me, it felt like you read into my comment that i was saying one should emergency off in that situation or something.

well, in the neighbourhood. my employer makes cnc machines, so yeah: plc, hmi, the drives and what-not. our team makes software for postprocessing of machining data.

i suppose your scenario is usually more of a processing line, perhaps with multiple machines feeding each other?

I deal with things spinning and tossing objects that weigh as much as transit buses. Unless you’re an operator, you don’t hit the emergency stop unless you want to deal with momentum taking over.

If you’re dealing with Rube Goldberg machines especially, you do not make these decisions. CNC machines are literal toys in contrast to the giant dryers the size of houses I worry about.

I also deal with self driving vehicles and not the type you see on the roads.

I asked ChatGPT to write me a DNS client in C and it produced the most hilariously dangerous code (no bounds checking on buffer reads from the network).
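for context on what "no bounds checking on buffer reads" means here: a DNS response arrives as a raw UDP payload, and before you read even the fixed 12-byte header out of it you have to verify the buffer is actually that long. a minimal sketch of that check (hypothetical helper, not the code ChatGPT produced):

```c
#include <stddef.h>
#include <stdint.h>

/* The fixed 12-byte DNS message header (RFC 1035, section 4.1.1). */
struct dns_header {
    uint16_t id, flags, qdcount, ancount, nscount, arcount;
};

/* Parse the header out of a raw UDP payload.
 * Returns 0 on success, -1 if the buffer is too short --
 * the check a naive parser skips before indexing buf[0..11]. */
static int parse_dns_header(const uint8_t *buf, size_t len,
                            struct dns_header *out)
{
    if (buf == NULL || len < 12)   /* bounds check BEFORE any read */
        return -1;
    out->id      = (uint16_t)((buf[0]  << 8) | buf[1]);
    out->flags   = (uint16_t)((buf[2]  << 8) | buf[3]);
    out->qdcount = (uint16_t)((buf[4]  << 8) | buf[5]);
    out->ancount = (uint16_t)((buf[6]  << 8) | buf[7]);
    out->nscount = (uint16_t)((buf[8]  << 8) | buf[9]);
    out->arcount = (uint16_t)((buf[10] << 8) | buf[11]);
    return 0;
}
```

a truncated or malicious packet fails the length check instead of walking off the end of the buffer; the same discipline has to apply to every subsequent read of the variable-length question and answer sections.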

it's great. i'm so looking forward to the future. i'm fine. not dying on the inside at all.

I am currently working a contract where large swaths of the deployment and build systems were, admittedly, the result of the lead putting prompts into chatGPT, and I suspect a fair bit of the codebase is the same, though he denies it.

How likely do you think this is to Actually Poison institutional safety procedures? Like, where is the point of failure where a dipshit inserts this into a manual, powerpoint, or placard? Genuine question, the closest I get to any of this was being the temp worker feeding a cardboard folding machine and my MO was "do what the operator tells you exactly". I would have been fooled by this but I also wasn't being let anywhere near any of the buttons.