Are there any doctors that Americans trust?

I am confused by the US sometimes. No one trusts the media, yet many people quote the media to make their points. From what I have seen around here at least, folks seem to cherry-pick who they believe. I don't mean Dems vs. Republicans. I mean things like medical statistics, certain newspapers, magazines and other sources.

Edited to add: I spent a lot of time in the US when I was in the service. The NRA was trusted. So were many of the major newspapers. Even, dare I say it, CNN back in its early days. It seems that people, regardless of political affiliation, are less trusting of their governments and institutions. It often seems like if you do not believe what I say, then you are my enemy. Opinions are polarized, with no middle ground. What happened?


Safe Shooting!
Steve Redgwell
www.303british.com

Get your facts first, then you can distort them as you please. - Mark Twain
Member - Professional Outdoor Media Association of Canada