Research has shown that some emotional AIs disproportionately attribute negative emotions to the faces of black people, which would have clear and worrying implications if deployed in areas such as recruitment, performance evaluations, medical diagnostics or policing.
Earlier this year, associate professor Matt Coler and his team at the University of Groningen's speech technology lab used data from American sitcoms including Friends and The Big Bang Theory to train an AI that can recognise sarcasm.
Absolute standout here for picking the whitest, most middle-class, WASP training data possible. I've already had 15 years of battling with voice recognition technology, and I can't wait to be forced to use my 'English voice' on job applications in the future. backslash fucking s
We are currently living in the timeline where tech simpletons are absolutely choking to invent Skynet and turn it into a Karen. I hate it here.