delan

god of no trades, master of none

dog. ao!!

Ⓐ{DHD,utistic} doggirl • bird photography, retrocomputing, speedrunning, osu, rust, (insert special interest here) • 1/6 of the servo team at @igalia • 🏳️‍⚧️ <3 @ariashark @bark



web (plus atom feeds)
shuppy.org/
you may also know me as
www.azabani.com/

tjc
@tjc

I participated in a study on ethics and technology being run by the CS department at my alma mater, and as compensation, received an Amazon gift card.


tjc
@tjc

I keep thinking about this study and about the broader idea of teaching "ethics in technology". And the only way I can think of to express my thoughts is: imagine you have a university department whose purpose is to train people to work in the puppy-kicking factory. What would teaching ethics in that department look like? I suppose what it would look like is to prepare students to be good little Boy Scouts and Girl Scouts who will blow the whistle when they spot any ethical transgressions in the puppy-kicking industry -- and to reassure them that yes, absolutely, there's someone watching over the industry who cares very much about those ethical transgressions. To make sure they leave the university believing that while the puppy-kicking factories have their excesses here and there, those are all due to deviant individuals, and that as long as you're not one of them, you are morally blameless. To make sure, above all, that students never ask "why do we have puppy-kicking factories to begin with?"


tjc
@tjc

It's like, conservatives aren't wrong when they say that "ethics" instruction is ideological indoctrination. It's just that what they're wrong about is the type of ideology involved, which in reality is fundamentally conservative. It is fundamentally conservative to teach that capitalism is a good system for addressing human needs, whose catastrophes all originate from deviant individuals and not from its basic norms. Even if that precept is never named explicitly and only makes an appearance as the thing that needs to be accepted or else nothing makes any sense.


invis
@invis
This page's posts are visible only to users who are logged in.

tjc
@tjc

Indeed: don't kill people you didn't intend to kill, or don't kill the people who matter.

I've been told in apparent sincerity that working for arms dealers is good because you can help them bomb only the people who were supposed to get bombed. And that's entirely in line with what professors have in mind when they teach ethics: to them, ethics means adapting oneself perfectly to the rules of an immoral system, in ways that must never threaten the system itself.


tjc
@tjc

Also, note that in this framing, ethics mainly consists of making technology better. It's obviously better not to kill patients with the radiation device you're manufacturing. Safety is also what's best for business; it increases others' trust in your business and saves money in the long term if you don't cheap out. There's no content here that's fundamentally about ethics or morality. In situations unlike the Therac-25 incident, ones where there is a fundamental conflict between morality and either the welfare of a business or the abstraction of technological progress in general, you are likely to be taught that it's not your job to ask questions. Where there is accidental interest convergence between business interests, technological improvement, and the safety of human beings, there is nothing to learn aside from "be a better engineer".

Situations where there is a fundamental conflict between the advancement of technical skill and scientific knowledge, and the well-being of some or all humans -- well, those are the ones that professional ethics training not only can't address, but is also designed to keep you from noticing. If technology can't follow through on its promise to put us in possession of all knowledge, to leave nothing unknown or un-dominated, to free us from the fear of the unknown -- then what? If improving the lives of the immiserated requires more than just better-functioning machines, then what? The system is structured to keep you from asking.


ireneista
@ireneista

software, as a profession, is just beginning to grapple with ethics. anyone who cares about its future should be trying to make sure we wind up with a real ethics system that prevents real harm

you know how when academia did ethics they wound up with the institutional review board (IRB) system? ever notice how IRBs sign off on war machines, studies that are all but guaranteed to hurt marginalized groups, all sorts of shit? yeah - that's because certain limits to their scope got baked into the system when it was created. so they aren't even allowed to consider stuff like harm to society as a result of the study, only harm to participants and experimenters.

it's not that that stuff is unchangeable - we'd encourage anyone in academia to try to change it - but now anyone arguing for change has to contend with the argument that this is how it's been for a long time

computer companies have recognized that they can't keep closing their eyes and pretending they don't see ethical problems, so they're trying to create a system they control, which will sign off on all the bad shit

at the risk of stating the obvious - we should resist


tjc
@tjc

I can't disagree with "we should resist", and Irenes' analysis of university IRBs is completely right. What I don't expect is that software will end up with an ethics system any more ethical than that of any other field of engineering. Can you name a field of engineering whose practitioners uniformly refuse to participate in weapons manufacturing? In a way, engineering is by definition a profession that serves as an auxiliary to weapons manufacturing. I've tried to say before how information technology itself constitutes weaponry; if you doubt that, ask yourself why the funding for computers and communication systems comes from the same people who fund the bombs.

Engineering provides tools and techniques for accelerating whatever the underlying society is already doing. This is not to say that engineering is value-neutral -- nothing could be further from the truth. To be value-neutral is to be an accessory to power. When most of the resources created by our labor are directed towards organized violence, engineering can be nothing but a force multiplier for violence. As long as we either enthusiastically consent to the investment of most of what we have into violence, and the reinvestment of the proceeds of that violence into more violence; or implicitly consent to it through our silence; engineering will only be the practice behind the theory created by our consent. To change things requires openness to the possibility that in a world not organized around violence, engineering might not exist at all; we might not need endlessly more and more efficient ways to do things, given that we already know how to solve every problem other than "how can we kill each other faster?" Resistance to violence doesn't mean resistance to misuses of engineering, any more than it means resistance to misuses of guns.



in reply to @tjc's post:

notably, what is ostensibly "ethics" training is these days mostly a rundown of the various ways financial regulators or your employer will be mad at you if you violate racketeering laws, accept bribes, embezzle etc. so don't do that please
