Originally Posted by RockyRaab
If they truly believed, Christians would never celebrate anything. They'd live their entire lives in masochistic self-denial, sacrifice, and suffering to be worthy of the infinite rewards they expect when they're dead. Enjoying anything repudiates their faith.



It’s unfortunate that the American version of Christianity gives that impression. A right understanding of the Christian faith is that we are free to enjoy all the good things God has given us. Unfortunately, American evangelical and fundamentalist thought left Christianity behind decades ago and replaced it with varying forms of moralism.

Last edited by IZH27; 10/24/21.