So we blame the problems of our nation on religion. If that were the case, things should be getting better, since the church is losing its influence. We've taken God out of everything for the last 60 years, and we're still seeing a decline in morals. Shouldn't that trend be going the other way if removing it was the right move? Maybe if we pass a law against religion it would help.