My impression is that doctors seemingly hate most women, given the horror stories I've heard about women trying to get adequate health care and to be taken seriously about their health.
This is a microblogvember post! Here's the list of prompts if you want to participate.
