if you weren't paying attention to it during the Clear Channel et al. takeover of radio, everything got very mild very fast because of appeals to reach broad audiences, who of course were modeled as white Christian suburbanites. it kinda killed indie music for a while.
"AI" is gonna be worse because recommendation systems trend towards the average, but by moving that average and then recommending towards it again, you spiral the drain
One of Cathy O'Neil's main theses in her Weapons of Math Destruction is this particular failure mode of machine-learning algorithms: self-reinforcement. Their output gets fed back into the system as input, providing more backing data for whatever conclusion they first arrived at.
This is a problem in itself, but it also multiplies other problematic effects, like the fact that their original conclusion is based on training data that reflects human biases. This story keeps repeating, over and over:
- Banks make loan decisions based on data from previous loans. This means the algorithms select for people "similar" to those who have gotten loans before. The result is that blatant racism from a decade ago becomes algorithmically codified as structural racism, with less accountability.
- Search engines (including algorithmic DJs) base their decisions about what content to promote on "engagement" data from previous results. This means the algorithms will keep recommending content similar to what has already been recommended, as you say.
- College admissions.
- Anything related to the justice system: predicting recidivism, sentencing guidelines.
- It goes on.
- It does not stop.
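
To make the feedback loop concrete, here's a toy simulation, purely a sketch and not any real recommender's code (the item count, `true_appeal`, and `recommend` are all made up for illustration). Whichever items happen to rank highest at the start are the only ones that can ever accumulate engagement, so they stay on top regardless of what the rest of the catalogue looks like.

```python
import random

random.seed(0)

NUM_ITEMS = 20
engagement = [1] * NUM_ITEMS                               # seed counts: every item starts equal
true_appeal = [random.random() for _ in range(NUM_ITEMS)]  # hidden "quality" users would respond to

def recommend(k=3):
    """Promote the k items with the most recorded engagement."""
    return sorted(range(NUM_ITEMS), key=lambda i: engagement[i], reverse=True)[:k]

for _ in range(10_000):
    for item in recommend():
        # Users only see what gets recommended, and every click is fed
        # straight back into the same signal that ranked the item.
        if random.random() < true_appeal[item]:
            engagement[item] += 1

top = recommend(5)
share = sum(engagement[i] for i in top) / sum(engagement)
print(f"Top 5 of {NUM_ITEMS} items hold {share:.0%} of all recorded engagement")
```

After a few thousand rounds the top handful of items hold essentially all of the engagement, and items that never got an early recommendation never get a chance at all: the spiral-down-the-drain dynamic in miniature.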