It’s probably common knowledge that people can, on the one hand, deceive themselves and interpret evidence so that it always seems to support their established ideas, and, on the other, lie or distort in order to make their claims look more convincing to others. If only that were all; in truth, still other complicating factors affect (and effect) the adoption of false beliefs.
Let’s take as an example the article on organic food and farming in Robert T. Carroll’s The Skeptic’s Dictionary. (Here “organic farming” refers to the practice of using only pesticides and fertilisers of natural origin. The term may mean something else elsewhere, and in fact I am not certain of this, but I won’t go into it further, since this is only an example.) Carroll argues that, contrary to what its proponents claim, there is no evidence that organically grown vegetables are any healthier or better for the environment. Under “coddling by the media”, he gives an example of how BBC News reported misleadingly on the matter, presenting a certain study as having proven the superiority of “organic” vegetables, even though a closer look at the details reveals that the results point rather in the opposite direction.
What BBC News did, then, was present facts misleadingly, possibly because of the biases of a given reporter. In this case we can therefore accuse the author of the article of either shoddy thinking or dishonest reporting.
But what happens once the text has been written? Someone may read this piece of news, and suddenly they will have a reason to believe that organic vegetables are superior. BBC News reports that a scientific study has proven it; isn’t that a fairly good reason to believe it? In other words, a “good reason” to believe something has been generated out of nowhere.
It would appear that when a group of people believe in something, they automatically start to produce such “reasons”. Supposed empirical evidence really is often born out of nothing. It only takes someone misinterpreting a scientific result once, or conducting an incompetent experiment and spreading its results as reliable, for the new claim to start circulating among believers. Or the original claim can arise purely out of nowhere in the manner of a rumour, and come to be presented as ever more firmly proven as it spreads. (This also seems to happen without any motive to prove a particular view, as with the claim that we only use ten per cent of our brains.)
The problem doesn’t apply only to empirical proof, but also to arguments. Even the craziest conclusions acquire complicated justifications that distract attention from their absurdity. This is because those who hold on to a claim have to defend it and answer counterarguments, and naturally they also get feedback from one another and develop each other’s ideas further.
Besides arising more or less spontaneously, such structures of proof are also created on purpose, sometimes as outright propaganda. However, deliberate construction should not be taken to imply deliberate lying. After all, those who are telling the truth must also be able to defend their claims.
Incidentally, even if you do not believe Carroll is right in the example above, the beauty of the example is that it works either way. If Carroll’s claims about organic food and farming are untrue, then his article is itself an example of “evidence” generated to support untrue claims. People can read it and gain a reason to think that organic food is no better.
The Internet only helps in spreading such “information”. Anyone can claim anything on the Internet, where in principle anyone can see it, and a slightly naïve person may take whatever they find there as if it were the experts’ final word on the matter. On the other hand, one need not be especially gullible to find credible-looking justifications for the most unlikely of claims. As long as an idea has, say, enough money and supporters behind it, very convincing-looking arguments for it can be created, even entire supposedly scientific organisations or publications, such as the Discovery Institute, which promotes “scientific” creationism and attacks evolution. It is worth noting that one of the most important features of real science is that only results rigorously obtained through tested methods are accepted, and their origin really is examined and scrutinised.
From all this it follows that people need not deceive themselves or think terribly illogically in order to believe many untrue claims, or even bad arguments. There are grounds for believing all kinds of things, and people who believe the true and the untrue can quite well have equally good reasons. Consider an American who has heard from her parents from the start that biological evolution is a proven scientific fact, learnt about it at school, and afterwards read and watched some popular science presentations on the subject, and who for these reasons believes evolution to be a proven fact and a sensible theory, even though she’s no biologist. Compare her with another who was told by her parents that God created all species as they are, whose homeschooling mother taught her that evolution is really an unproven and religious theory that makes no sense, and who afterwards read some materials from, say, the Discovery Institute arguing the same notion, and for those reasons believes it. I’d say that since neither of them truly understands the subject, both have, as far as each can tell, equally good reasons for believing what they do, with no need for either to deceive herself or be particularly stupid. The first one just happens to be right, because she happened to grow up in an environment where truer things were taught.
Of course, people also tend to look for evidence supporting their views rather than against them, which is especially understandable in this context. If you already think you know quite well what the facts of some matter are, then it’s much easier to look for, or invent, reasons why dissenting views are wrong than to start learning the entirely unfamiliar way of thinking behind those objections, just in case it happens to be right, since you don’t think it is. This is natural, but if you’re going to do it, just consider what the consequences will be if you actually are wrong. And considering all of the above, the risk of being wrong is much greater than one might have thought.
So what can one do to avoid getting tangled in such cancerous tumours of information? Frankly, there’s just no easy way to avoid them. If you are not willing to really work to gain background understanding of a topic, you simply can’t claim to know anything reliable about it. (That is why I don’t write here about, for example, politics or economics, even though I have opinions of a sort about them. I am in fact currently trying to learn to understand those topics better, but it’s not easy, and it takes time.) And let it be said, though it should be clear by this point, that even a great deal of “learning” about a topic can produce nothing but false information if it’s done one-sidedly and from poor sources. Critical thinking and examination of your sources are necessary, and they help up to a point: for example, a careful reading of the abovementioned BBC article on organic food would apparently reveal how distorted it is. But false information spreads in so many forms that this is not enough (direct lies, for example, are harder to detect this way), and frankly, the topic is far too broad to be covered here. The links below may be of indirect help…
Oh yes, and you should also stop assuming those who disagree with you are stupid.
- The Skeptic’s Dictionary. I can only recommend reading this website extensively. It offers both information that appears to be generally very well researched and supported, and examples of how to obtain information and how critical thinking works. If you want to learn to be a critical thinker (one who’s good at telling when it’s a good idea to believe something and when it’s not), Robert Carroll sets a very good example.
- TalkOrigins. Here you can see in practice how the battle between information and misinformation (or, from another point of view, simply between different views and structures of information) rages. The site answers countless claims, or rather accusations, made by creationists against the theory of evolution. It in fact makes it possible to compare the claims of the different sides, not just by reading the site itself but by checking the sources it cites directly. (I doubt many bother to do this, though. I have only a little.) This is how it should be done, and how it is done in scientific writing.
- Peter L. Berger and Thomas Luckmann: The Social Construction of Reality. This book covers (among other things) much the same ground as I described above, but from a different perspective. Social structures need the support of beliefs and arguments that justify them and rule out the alternatives, and, what do you know, such beliefs do appear, whether reality agrees or not. In the perspective Berger and Luckmann adopt, the “real” reality is in fact fairly irrelevant.