waverly

Trans Rights are Human Rights

  • she/her or they/them or it/its

M.A. Linguistics, B.Sc. Computer Science. Also interested in art and music theory. Tumbling through life. Navigating the universe. Laying on the floor. Profile Picture by ikimaru on Tumblr; Header courtesy NASA/JPL-Caltech.



AtFruitBat
@AtFruitBat

Research has shown that some emotional AIs disproportionately attribute negative emotions to the faces of black people, which would have clear and worrying implications if deployed in areas such as recruitment, performance evaluations, medical diagnostics or policing.



in reply to @AtFruitBat's post:

not-fun fact: when self-driving cars were tested for their pedestrian tracking (i.e., "That pedestrian is about to cross, stop" or even "Someone jumped in front of us, slam on the brakes"), they were less accurate on darker-skinned people than on lighter-skinned individuals. I got this information from Joy Buolamwini, a computer scientist and digital activist who's described on the MIT Media Lab website as a "poet of code", which is an awesome description for a person who's doing great work (she has a book on AI talking about these issues that I highly recommend). Anyways, self-driving cars could be more likely to hit darker-skinned people because they were trained on majority "Pale Male" data, which is what Buolamwini calls data that's mostly light-skinned white men. That's not terrifying and dystopian at all