minecraft

dragongirl funky fresh

april et al, several flavors of therian (plural edition)
engineer in training by day, furry artist and "web developer" also by day
adult, occasionally nsfw on main
💜@dragongirlcloaca💜
💜@8akesale💜
avatar by karvakera

find me elsewhere
floofy.tech/@starlight

AtFruitBat
@AtFruitBat

Research has shown that some emotional AIs disproportionately attribute negative emotions to the faces of black people, which would have clear and worrying implications if deployed in areas such as recruitment, performance evaluations, medical diagnostics or policing.


bytebat
@bytebat

...which would have clear and worrying implications if deployed in...

stop pretending zhis is a future problem zhat will happen when we use AI more. it has clear and worrying implications now

like seriously in what universe is it reasonable to see zhat your automated systems are saying "hmm black people bad" and respond wizh "well i guess we better not use zhis model in zhis specific area"



in reply to @AtFruitBat's post:

not-fun fact: when self-driving cars were tested on their pedestrian tracking (i.e., "that pedestrian is about to cross, stop" or even "someone jumped in front of us, slam on the brakes"), they were less accurate on darker-skinned people than on lighter-skinned people. I got this from Joy Buolamwini, a computer scientist and digital activist who's described on the MIT Media Lab website as a "poet of code", which is an awesome description for someone doing great work (she has a book on AI covering these issues that I highly recommend). Anyway, self-driving cars could be more likely to hit darker-skinned people because they were trained on majority "Pale Male" data, Buolamwini's term for datasets made up mostly of light-skinned men. That's not terrifying and dystopian at all
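and for anyone wondering what actually catching this looks like before deployment: the check itself is simple. here's a minimal, purely illustrative sketch in Python (made-up numbers, hypothetical group labels, no real detector or dataset) just to show the shape of a per-group miss-rate audit like the ones behind these findings:

```python
# Toy audit: compare pedestrian-detection miss rates across groups.
# Everything here is hypothetical; in a real audit, `examples` would come
# from a labeled evaluation set and `detected` from the model under test.

from collections import defaultdict

# Each entry is one ground-truth pedestrian: (group label, whether the
# detector found them).
examples = [
    ("lighter-skinned", True), ("lighter-skinned", True),
    ("lighter-skinned", True), ("lighter-skinned", False),
    ("darker-skinned", True), ("darker-skinned", False),
    ("darker-skinned", False), ("darker-skinned", True),
]

hits = defaultdict(int)
totals = defaultdict(int)
for group, detected in examples:
    totals[group] += 1
    hits[group] += detected  # True counts as 1

for group in totals:
    recall = hits[group] / totals[group]
    print(f"{group}: recall={recall:.2f}, miss rate={1 - recall:.2f}")
```

the numbers are fake, but the structure is the whole audit: split the eval set by group, compare error rates, and treat a big gap as a blocker rather than a reason to dodge "zhis specific area"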