Why do many people, even Democrats, seem ashamed of the term "liberal," especially if it is applied to them? Are there any people out there who are liberals, and are proud to admit it? And do conservatives sometimes demonize liberals?
It's the same in the US. I think the word liberal should go back to being an adjective, instead of a noun. One may be liberal on one issue and authoritarian on another issue. I'm sure no one is 100% liberal on everything.
You were never liberal minded. I find libertarians to be some of the least charitable of the bunch.
Winston Churchill's quote, yes. I admit it has a certain sort of logic to it, but I don't really think that answers the question. Why is it seen as a "bad" term by some? Is there any specific reason why liberals are portrayed the way they are?
I do think they are portrayed as tree huggers, whiners, etc., probably because many are. The same argument could be made for the very conservative and their views. I think you can go too far, be it left or right.
Actually, Churchill never said that.
Then who did?