Do you think "life will always be like this" or do you sense a major shift coming to America? Some say evil of great proportions is in store for the very near future. Others are optimistic and believe things will turn around. Could it be both? Is this country still great or have we lost our true identity? What road do you believe we are headed down and what role do we, the people, need to play to make things better?
I like the metaphor, but what if a higher percentage of people go down the wrongful road, or if those who go down it are the very ones shaping policy and influencing culture? What accountability does the individual have?