I've talked about it before, but it's insane that as a black person in America, you're taught from an early age that people don't like you and don't trust you, yet somehow think they know everything about you (from stereotypes and pop culture), and you're just supposed to accept it as a part of life
Especially as someone with mental illness!
