Originally Posted by Steve Redgwell
Are there any doctors that Americans trust?

I am confused by the US sometimes. No one trusts the media, yet many people quote the media to make their points. From what I have seen around here at least, folks seem to cherry-pick who they believe. I don't mean Dems vs. Republicans. I mean stuff like medical statistics, certain newspapers, magazines and other sources.

Edited to add: I spent a lot of time in the US when I was in the service. The NRA was trusted. So were many of the major newspapers. Even, dare I say it, CNN back in their early days. It seems that people, regardless of political affiliation, are less trusting of their governments and institutions. It often seems like if you do not believe what I say, then you are my enemy. Opinions are polarized, with no middle ground. What happened?


Of course there are doctors Americans trust. Lots of them. It's kind of like global warming: the MSM has drilled it into the masses that it's "settled science." Nothing could be further from the truth. Since the beginning of the global warming debate, thousands of world-class scientists have disagreed with the narrative. With COVID, it's even worse. The censorship is unbelievable, but there have been some very good doctors trying to get the truth out for a long time. They have had to fight the MSM and big tech every step of the way. Watch the following video just for kicks. Would you suggest this bunch is uneducated or loony? How anyone can just dismiss what they have to say, and the data they provide, is beyond me.

https://www.bitchute.com/video/RqyafQHKY9Iy/