I'm getting to the point where I see all professional sports as nothing more than a woke, Hollywood-inspired reality show, like what the NFL has turned into. I feel better about life in general, with less stress and aggravation, if I don't see, hear, or know anything about whining leftist millionaires playing for the cameras and acting out their social justice theater production.