Yesterday I had a routine medical checkup. The doctor wasted most of his time in the room trying to talk me into the Covid "vaccine".
Thinking back on this later, I had a realization: Covid, or at least the social turmoil surrounding it, destroyed my trust in doctors. But at the same time, recent events bolstered my trust in veterinarians.
There was a huge difference between how the vet spoke to me and how the doctor did.
The vet gave me what she believed were the facts to the best of her knowledge, told me the options, and accepted my decision. I didn't feel judged.
The doctor told me anecdotes with giant obvious (to me) holes in them, tried to scare me, assumed I get medical advice from F***book, told me the experts say it's a good idea, and didn't want to take "no" for an answer. I felt I was being manipulated and judged when I didn't succumb.
One gave me respect and the other treated me like a stubborn, stupid child.
That's how it felt, anyway.
Was this the real situation, or just my perception? Is it just a difference in their personalities? Or is it understandable that a doctor "cares" more because his patients are human, so he tries harder to convince you of what he believes is the right course? Maybe. I know I have biases that will color everything.
I come away from the vet feeling good about the interaction, even if she says something I don't want to hear. I came away from the doctor's appointment feeling terrible about life and about myself.