So many things are true in some sense, yet it might be better not to say them. There is a sense in which science is based on trusting human reason rather than, say, divine revelation. But saying so puts reason and revelation on a falsely equal footing. What is “reason” in such a case? It’s largely about being critical, about questioning things, about not taking them on trust, or faith. But it goes further than that: the scientific method is in a significant sense based on a deep mistrust of human reasoning.
Let’s start with an analogy outside of science proper. Say you have a holy book, and two people, Alice and Bob. Both are taught to believe in the holy book at first. Alice never goes beyond this; she goes on accepting the holy book as it was taught to her. Bob, on the other hand, starts questioning it. He does some research and finds that the way he was taught about the book is not historically accurate. Maybe he believes in the original book then, or maybe he starts questioning it as a whole because he sees things are not as simple as he was shown.
Is it now that Alice has faith in revelation and Bob has faith in reason? If you say yes, then what is revelation? Is it whatever people happen to teach you about your holy book, or is it what the book originally meant? Alice here may not have faith in reason, but if she never looks beyond what she was told, her faith is really in the idea that what she heard in the first place, what people told her rather than God, is true. The object of her faith is what she already believes. If she were intent on finding out the original truth, she’d go out to find it like Bob instead of just sitting there in faith.
So what about Bob’s faith in reason? He doesn’t need any. He only needs a lack of complete faith in what he was told. If you question what you thought was true and you find evidence that it’s not true, then to stop thinking it must be true is not faith; it’s the opposite. You do need a basic faith in your own senses and ability to reason, but you need that to come to almost any view. You’d need some form of it even to listen to what someone tells you and believe it. Here, that basic faith is enough by itself.
So, about actual science, then. Modern science has not existed all that long, even though you had respected people doing empirical study of nature before. Modern science began when mistrust in individual human capacity to reach the truth was institutionalised — when the scientific method became all about keeping people from just, in all sincerity, “proving” what they wanted to believe was true.
Certainly there are stereotypes of science: that it’s about reaching certainty, that it’s done by such clever people using such good methods that it just tells the truth. You go and study something, and then you know how it is. Well, that’s not how it really works. Science is about reaching certainty only in the sense that uncertainty is given so many chances to destroy an idea that it either does so or gives up.
An example may clarify the point. For many things, such as testing new medicines, you might be required to do a double-blind randomised controlled study. What does this mean? Well, you don’t just give the medicine to one group and see if they get better. You take two groups that are sufficiently large and comparable in every other respect (you assign people to them at random precisely to ensure there are no systematic differences) and give one group the medicine, the other a placebo. Otherwise you can’t tell whether people just got better naturally, or because of the placebo effect, rather than because of the drug. You have to make sure the groups are otherwise similar so that any difference in recovery doesn’t come from how the people already differed. And the groups must be large enough that the difference is unlikely to be due to chance. Further, you don’t let the people receiving the drug, or those giving it out, know which doses are real and which are placebos (“double-blind”): the researchers or doctors might unconsciously act differently when handing out placebos than when handing out the real drug, and even this might have an effect.
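The trial design above can be sketched in a few lines of Python. This is a toy simulation with made-up recovery rates, not a model of any real trial: subjects are shuffled at random into two groups, one gets a (simulated) drug effect, and a simple permutation test then asks how often a gap in recoveries this large would appear by pure chance.

```python
import random

random.seed(42)  # fixed seed so the toy example is reproducible

def run_trial(n_per_group, base_rate=0.30, treatment_effect=0.15):
    """Simulate one randomised controlled trial (all numbers made up)."""
    # Randomisation: shuffle subjects so neither group differs systematically.
    subjects = list(range(2 * n_per_group))
    random.shuffle(subjects)
    treatment = subjects[:n_per_group]
    control = subjects[n_per_group:]
    # Each subject recovers with some probability; the drug adds a bit on top.
    t_rec = sum(random.random() < base_rate + treatment_effect for _ in treatment)
    c_rec = sum(random.random() < base_rate for _ in control)
    return t_rec, c_rec

def permutation_p_value(t_rec, c_rec, n_per_group, n_perm=5000):
    """Estimate how often a recovery gap at least this large would appear
    if the group labels had been handed out purely by chance."""
    observed = t_rec - c_rec
    outcomes = [1] * (t_rec + c_rec) + [0] * (2 * n_per_group - t_rec - c_rec)
    at_least_as_extreme = 0
    for _ in range(n_perm):
        random.shuffle(outcomes)  # reassign the same outcomes to random groups
        diff = sum(outcomes[:n_per_group]) - sum(outcomes[n_per_group:])
        if diff >= observed:
            at_least_as_extreme += 1
    return at_least_as_extreme / n_perm

t_rec, c_rec = run_trial(n_per_group=200)
p = permutation_p_value(t_rec, c_rec, 200)
print(f"treatment recoveries: {t_rec}/200, placebo: {c_rec}/200, p = {p:.3f}")
```

A small p-value here means chance alone rarely produces such a gap, which is exactly the “large enough groups” requirement in action; blinding, of course, can’t be simulated, since it exists to control the humans running the trial.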
And if a treatment hasn’t passed such a test (even if that just means it was tested and seemed to work, but the experiment wasn’t quite airtight), we assume it doesn’t work, no matter how many anecdotes it might have in its favour. If it does pass one such study, then great: replicate that in a couple of other studies and we might start believing in it. Meanwhile, other scientists stand ready to criticise any flaw in the original study and to reject it if any such reason turns up.
This shows a heck of a distrust in unaided human reason, because unaided human reason would have jumped to conclusions much earlier and with much less evidence. Indeed, it’s a well-established fact that (as the satirical novel Good Omens put it) some people will get better from anything. That is, basically any supposed treatment can and will be thought to work if you sell it to people well. This is because of the placebo effect, but also other factors such as selective remembering of evidence and the fact that symptoms tend to improve right after they’ve been at their worst (regression to the mean), so they often get better just after people try the snake oil.
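Regression to the mean is easy to demonstrate with a toy simulation (the severity scale and threshold below are invented for illustration): if symptom severity just fluctuates randomly around a stable level, and people reach for a remedy on their worst days, the next day tends to look better with no treatment at all.

```python
import random
import statistics

random.seed(0)  # fixed seed for reproducibility

# Hypothetical illustration: daily symptom severity fluctuates around a
# stable mean of 5.0 (on an invented 0-10-ish scale), with no trend and
# no treatment anywhere in the data.
days = [random.gauss(5.0, 2.0) for _ in range(50_000)]

# Suppose people try the snake oil only on their worst days (severity > 8)...
after_bad_days = [days[i + 1] for i in range(len(days) - 1) if days[i] > 8]

# ...then the day after "treatment" is, on average, much better anyway:
# it sits near the overall mean of 5, far below the > 8 days that
# prompted the remedy. Pure regression to the mean.
mean_after = statistics.mean(after_bad_days)
print(f"average severity the day after a bad day: {mean_after:.2f}")
```

Anyone keeping anecdotal score here would conclude the remedy works nearly every time, which is precisely why the control group exists.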
Human “reason” might indeed fight against such claims, as they seem to contradict what we see with our own eyes. But once you understand how the method works and why it’s needed, it’s nothing but rational: you can’t say that eliminating all other possible causes as carefully as possible makes observations less reliable. Still, this is a highly sophisticated application of reason, based on hundreds of years of experience of being wrong. It’s not just reason. Science has little trust in any one person’s reason, not even an individual scientist’s.
That’s why you can more or less trust the things that science trusts. Science trusts almost nothing, and demands evidence of a rigour that laypeople can barely imagine before it will assert anything. At least when it’s doing it right; every ideal gets violated at some point, and this one is no exception. But critics of scientific truths, unless they just believe in nothing at all, are almost inevitably a lot less critical themselves.