Originally Posted by TheLastLemming76
Originally Posted by stevelyn
Western Christianity hasn't done the world many favors.
Watered-down Christianity probably hasn’t, but solid fundamentalist Christian beliefs are the building blocks of most of what’s shaped the Western world. Most of today’s problems are the product of a lack of Christian social norms, not a product of them.

Put another way: as we’ve moved toward being a mostly secular society, do you believe American society has become better or worse off?
Well put.