By this question, I don't mean to ask about biological evolution, but rather the philosophical aspect. What I mean is: do you think humans are born good and society makes them evil, or could it be vice versa? In either case, at this point in time, do you think that overall we are growing more selfish and more focused on money? Ultimately, I guess the question is: what happened to the love and care humans used to have? Has money changed us that much over the years?