Some people (particularly in the Southeastern US) believe that tanning makes you look "healthy." I think this goes back to a time when many people worked on farms or at other outdoor jobs, so if a person was tanned, they had obviously done hard physical work and were assumed to be healthy. My father was one of these people, and now he has skin cancer.
I'm proud to be pale and have been most of my life (I'm Scots-Irish with a little Cherokee).
Fortunately, through modern chemistry, a person can now spend a large portion of the day outside without having to damage their skin.