IMHO it's not a question of whether the climate is changing... it is, it always is, and it always has been. My questions are: Why is climate change inherently bad? Which areas actually stand to benefit from a changing climate? And why aren't the people who own oceanfront property selling and moving to higher ground, or at least building their beach houses on stilts?
I think the topic is a huge scam to scare and control people for the gain of a few hucksters.