I don't know if it's something that only I happen to be unlucky enough to experience, but I find that a lot of the doctors (particularly specialists) -- mostly male -- I've been to are supremely arrogant, rude, and unprofessional. What's your take on this? Maybe I'm just generalising. They're not all bad, but some of them really are!
Precisely. I hate it when they don't listen.
In medical school, I don't think they have an arrogant bone in their bodies. They grow into that person sometimes, depending on their personality. They call the shots (well, I let them think so), and success in medical practice can go to their heads.
For the most part, physicians love the work they do. They're not in it for the money (very few make the large salaries that incredibly successful cardiothoracic surgeons do). Many need strong egos, which can give rise to arrogance; still, most are kind and caring.
Yes, you're right. I think a lot of them need to be brought back down to earth and stop thinking of themselves as gods among men.