Originally Posted by J23
Originally Posted by mtnsnake
Hollywood cannot do anything right anymore.


You got that right. Likely because Hollywood is run by liberals and social justice warriors.

Yeah, that's one way of referring to them, LOL.