dukecarge
@dukecarge

Given the recent AI fervor, I'd like to offer a perspective from the medical device world on how the current narrative will end in outright rejection of AI for most uses.

I've had this in my head for a while, and writing it down is just a way to get it out.


When you release a new product for deployment in hospitals (that is, patient wards, both general and ICU), there is always an initial uptick of complaints from nurses. Are some of those down to bugs in your product? Yeah. But another set of complaints is usually rooted in distrust of the device, an inherently defensive reaction from nurses: they think the device is there to replace some of their functions.

While hospital administrators (and admins in many other industries) would love to be able to replace most of their staff to lower costs, at the end of the day medical devices are designed with nurses and doctors in mind, not admins. We see a complication in their workflow and want to address it. Blood pressure taken manually can take 5-7 minutes (including putting the cuff on and instructing the patient on what to do); blood pressure taken by a machine can take 1-3 minutes (slap the cuff on, ask the patient not to move). This example shows three things: a reduction in the nurse's day-to-day time spent, a reduction in the need to teach the patient proper form for the measurement, and a reduction in the complexity of the task.

Overall, nurses end up gaining in the process: it becomes an automated task, and they can very easily hop in and out of a room after just setting the machine to record. Nurses hated these machines at first; some claimed they knew better than the machines, despite the fact that the machines are calibrated to the same accuracy as the manual measurement. The messaging employed by the device industry was then clear: it's not here to replace you, it's here to help.

Now imagine trying to get nurses to use a blood pressure monitor while there is a group of people outside the hospital ranting about how they're going to replace every single nurse with these machines. These fans of the monitor are not doctors, nurses, or medical staff at all. For some reason they hold resentment toward those professionals for being able to carry out their craft. They think it is an easy job (it's not) and that it could be relegated to machines. This is what AI promoters are like. They do not understand the fabric of the industry they are trying to automate and fundamentally misunderstand what people need in order to actually improve their workflow.

If the nurses' example is anything to go by, having these disruptive voices in the room will just push professionals to swear off most uses of AI. As long as the pushback is bigger than the movement (helped along by the question of whether these tools have any real utility at all), AI will be relegated to niche utilities and forgotten. You cannot sell your product and insult your customer at the same time, and the failure to recognize that will be AI's downfall.



in reply to @dukecarge's post:

this is a really insightful way to look at the issue, thank you! i was trying to figure out why the AI art promoters were starting to pull out words like "gatekeeping" in response to the pushback. the medical comparison works because being a nurse or a doctor is a trained profession, like being an artist, that people have worked hard to specialize in, and yet these skills/careers are being devalued by uninformed people who think they know better despite being complete outsiders to the industry.

Like, even if they were not selling it, technology adoption happens by the path of least resistance! And legitimately simplifying things for artists (like Clip Studio Paint's features that all make it easier to get things done) will get them to pick it up! And they do so organically, based off others' recommendations, and the CSP folks pick up on it by having sales.

If CSP suddenly got the same sort of fans AI art generators have, it would not reach its audience, because those fans would convince said audience that they were helping to destroy their own future.

I’ve also been thinking about this from a medical device perspective. My experience is working with radiologists and surgeons, not nurses, but you’re spot on. Computer-aided diagnosis (CAD) and robotic surgery don’t eliminate the docs; they make them faster and better. (CAD actually achieves that; I’m not sure we’re there yet with robots.)

Artists are very much not the customers here, though. A handful of clueless software vendors aimed at artists have gotten caught up in the hype and seem confused about why their customers are mad at them, but they're not the ones developing or promoting ML. There's a reason so many people are out there excitedly gesturing at the empty space where a genuine use of this stuff for artists would be, yet struggle to describe anything more compelling than a content-aware fill that exposes you to plagiarism charges. There's a reason the case for how this will instead undermine and replace the bottom tier of creative jobs is, by contrast, so clear and straightforward, and why so few of these guys exhibit any actual artistic ambitions.

The "AI will replace creatives" drumbeat is for people like the waste-of-space CEO at the organization I work at, who kept suggesting to our staff writer that it'd be cheaper to replace her with a chatbot until she up and quit. C-suite dipshits who view any work but receiving payments as a cost to be cut, actually have money and are easy to sell on the newest gadget, unlike, uh, that big lucrative freelance illustrator/novelist market silicon valley has had its sights on for decades. They're not on Twitter as much but they're reacting, very strongly, to the hype, and they're not insulted. You might develop tools for the actual workers but that doesn't mean nobody out there's making anything for the hedge fund that'd like to just liquidate everything and move on to the next institution.

Well put!

It's also worth mentioning that, from my experience doing entry-level graphic design at small startups, the employers want you to steal. They get frustrated when you explain why you can't just take shit from Google Images willy-nilly. And it doesn't take long to realize that their real frustration is that, now that you've brought it up, they don't have deniability.

AI doesn't have ethics. I don't have a chance in hell competing against that, because it's a way to steal with built-in plausible deniability. It's a godsend for all of the folks who've told me, "you know that one specific competitor across town? I want you to give me that thing they do."