Hi. Good question. This is a question that so many are divided on. Since 9/11, I think a lot of America has forgotten. I don't know if you believe in the Bible or are a Christian, but the Bible says that in the end days there will be wars and rumors of wars. I think war is necessary to defend your homeland and country. I am not one for a war of revenge, but it is hard to say. Wars come in so many shapes and forms, whether it is at home against an enemy or abroad defending our homeland. I would not want to be in our nation's leaders' shoes, making decisions about whether or not we should be at war, and I won't criticize them either, for I know we do not have all the facts.