Bill brings up one of the recurring topics on this blog: how should we form opinions on subjects about which we are not experts? However knowledgeable we think we are, this is a live problem for all of us. He writes
Everybody can’t be an expert on everything. Most people aren’t bright enough to be an expert on any scholarly topic at all. The heuristic “The Left is vigorously asserting A; therefore, not-A” seems like a very reasonable heuristic to me. Do you have some alternative heuristic which can be applied by an IQ 90 member of the public lacking specialized knowledge? One that will approximate the truth better? Certainly it can’t be trust the experts.
Drieu replies
In areas related to the natural sciences, yes you should defer to what the vast majority of scientists in the field claim, especially if your own amount of scientific knowledge is limited. If you have a problem stomaching this, then another option is to *keep your mouth shut*. In the social sciences, I can understand having some qualms….
In a previous post on the topic “When is one entitled to an opinion?”, I said
I would say that opinions come in different weights, each with a different cost. On the one hand, there are impressions, the less expensive but not free opinions. A man may express an opinion at a party that life in the universe is common, that Christianity brought down the Roman Empire, that we should go back to the gold standard, or whatever. I would expect such a man to be able to give me some reason why he holds this belief. I would also expect him to be able to give me a reasonable account of the alternative views on the subject–what they are and why one might hold them. If he couldn’t do that, I would consider that he had just made an ass of himself by mouthing off on a topic about which he obviously knows nothing. On the other hand, I wouldn’t think less of him if he were unfamiliar with the specialized literature on the subject, or even if there were some major arguments on the subject with which he was unfamiliar. Life is short; nobody has time to be an expert on everything. If our man at the party can give some justification for his impressions, I think he’s entitled to them.
I have different expectations for the serious advocate–the man who writes books, gives lectures, or publicly debates on a certain topic. I think it’s fair to expect such a man to know his stuff–all the major arguments both on his own side and on the others, the technicalities of his subject, its history, and so forth. I think we’re entitled to expect advocates to have done their homework, and they should be embarrassed if it comes out that they haven’t….
Here’s another of my opinions about opinions, one that might be less welcome to some of my readers. It is said that the argument from authority is the weakest of arguments, but I say that the strength of this argument grows with one’s own ignorance. If you don’t know anything about a subject, you’d better trust the experts. “Because the experts say so” is a pretty strong reason, if the subject in question is one where there is real expertise (e.g. the hard sciences). I hold people who doubt evolution, the big bang, or [the existence of] global warming to higher epistemic standards than I do people who accept the consensus opinion. If you’re going to disagree with all the experts in a field, you’d better know your stuff, or you’re going to make a fool of yourself, and I don’t want to be standing next to you when it happens.
What about Bill’s point that experts are biased and so not to be trusted? For the social sciences, I think this is a big issue, so much so that I routinely dismiss sociological “findings” that sound too implausible and ideologically motivated. For example, whenever a study purports to demonstrate white racism, I assume the authors are PC hacks. We all know perfectly well the worshipful, self-abasing attitude toward negroes that pervades all political factions of white society. Like Drieu, I don’t regard social scientists as real experts; they’re just propagandists. Still, I try not to let my world-view rely too heavily on my opinions on subjects about which I am uninformed. I don’t have to disprove demonstrations of black oppression, because none of my major beliefs would be challenged even if they were true.
In the natural sciences, there is real expertise. That’s one reason I’m more inclined to trust natural scientists when they’re talking about natural science. Another reason they’re probably more trustworthy is that their findings almost never have direct political significance. To take our most recent example, nothing climate scientists could find would really address the correctness of liberalism vs. conservatism as political ideologies.
Filed under: Climate, philosophy of science
Important topic! – especially in a world where it is mandatory to have an opinion on everything.
My impression is that your advocated attitude is suitable only for a world in which most expertise is mostly right – i.e. a world where experts are genuine authorities; but in a world where most expertise is wrong, you will be led astray badly. You would, in fact, be a dupe of propaganda (and scientists – along with everyone else – generate mostly propaganda nowadays – mostly propaganda for the continued and expanded funding of research).
The vital background is a judgment of what kind of society we live in, its social mores.
For example, is it important whether people habitually try to be truthful, or not? Is it important whether people adhere to natural morality as an ideal, or aspire to inversions of natural law? Is it important whether people aim to create and appreciate beauty, or prefer to mock and destroy beauty for socio-political reasons?
Do these matters affect trust in authority? – it seems obvious that they should.
It is easy to imagine a totalitarian society like Soviet Russia or Mao’s China in which all authority is tainted, and where the most strongly asserted beliefs are those with no truth, or the opposite of truth.
To what extent is public discourse in the modern West like this? I would say ‘quite a lot, and increasingly so’ – my impression is that you regard things differently, and you feel that most scientists are (for example) trustworthy authorities on their own subjects.
But I have worked in areas of science such as epidemiology where there was essentially no good work being done, and no good people at all – where essentially the whole of the output was at best worthless but mostly actively misleading. There were no authorities – except of a negative type. To be a moderate was to be deluded.
I think almost all science nowadays is of this type – non-self-correcting bureaucratic careerism. There are many reasons, but the core one is that it is very unnatural for people to seek truth and be honest, and when this is not enforced, people relapse to the default state. Science was a small elite activity when it worked as real science – tenfold expansion made scientists less able and less honest; micro-specialization made their work meaningless.
So, why should we defer to such ‘authorities’? No reason: we should not.
Hi Bruce,
I can think of two biographical differences between us that might be having some effect.
1) You seem to have in-depth knowledge in a lot of fields. I’m less knowledgeable and so must lean on expert opinion more heavily (no doubt incurring danger in the process).
2) I’m part of the problem you describe! I must spend a significant amount of time producing propaganda to augment my funding (i.e. research proposals). I make no apologies. Without money, I can’t support my graduate students, and I won’t get tenure.
Climate science could be very useful politically.
Tracking, transparency, certification, eco-taxes, environmental excellence, and the policing of water, all give us an idea of the coming state of ecological emergency. Everything is permitted to a power structure that bases its authority in Nature, in health and in well-being.
“I must spend a significant amount of time producing propaganda to augment my funding (i.e. research proposals). I make no apologies. Without money, I can’t support my graduate students, and I won’t get tenure.”
Indeed, but that is ‘research’ (which is a sub-type of bureaucracy) and has *nothing* to do with science – indeed, it is anti-science.
The big question is whether real science can *ever* be done under such circumstances.
The best bet is to do real science in a different field using the money raised by a research career.
There are two different distinctions floating around in your discourse: natural/social and political/nonpolitical. I endorse the latter. Furthermore, I can’t think, offhand, of why anyone would endorse the former except as a convenient proxy for the latter. More carefully, it is not politics, per se, but incentives to come to particular conclusions (rather than whichever conclusion happens to be the truth) that are the problem, and politics/ideology just happens to be a common source of such incentives.
Biology presumably qualifies as a natural science. Biologists appear to me to have an obviously false consensus on race. Soviet evolutionary biology was useless because it was political. Biologists are strangely sanguine about lying when that lying is in service of their battle against the eeeeebil creationists. (Note: I believe in evolution)
By contrast, psychometricians and physical anthropologists (social scientists) have a much more honest approach to race, though even here there is a lot of distortion from politics. Linguists are usually thought to be social scientists or even humanists (and, furthermore, are a bunch of dirty commies), and I trust them when they are talking about Linguistics.
You can see the toxic effect of incentives on science “in the small” in expert witnessing. Courts often accept testimony on technical matters from academics. As long as each side in a case has deep pockets, each side has no trouble finding an academic expert willing to say what needs to be said for their side. This often leads to really “creative” claims being made by acknowledged experts in open court testifying under oath. In disciplines where expert witnessing is common, we are socialized to react to this phenomenon by chuckling, rolling our eyes, and then carefully not drawing any further conclusions.
AGW is another example demonstrating the non-trustworthiness of natural scientists when ideology is involved. The hockey stick was the Margaret Mead moment of climate “science,” and they failed even more spectacularly than cultural anthropology did. It has been apparent for some time that the hockey team are charlatans, and this has had no discernible impact on their careers or on the seriousness with which their work is taken. Similarly for the various keepers of the surface temperature record. They are plainly not (even minimally competent) scientists, and yet the respect accorded scientists continues to be accorded to them. The ideology prevalent in this discipline is quite out and in your face.
In short, you should defer to experts when they have no particular incentive to deceive you. The proviso excludes many, many politically relevant cases. So, as long as we are talking about public policy, “experts are useless” is a pretty good rule.
Finally, if you want to completely destroy the natural sciences, you could do no better than to get society to adopt the rule “defer to natural scientists and only natural scientists in their area of expertise.” You see the incentive this sets up for the powerful and the wannabe powerful, right?
Dr Charlton,
You said,
“Indeed, but that is ‘research’ (which is a sub-type of bureaucracy) and has *nothing* to do with science – indeed, it is anti-science.
The big question is whether real science can *ever* be done under such circumstances.
The best bet is to do real science in a different field using the money raised by a research career.”
These assertions fascinate me. Have you elaborated on them elsewhere, or, if not, can you elaborate?
There is one problem with Drieu’s admonition to defer to what the “vast majority” of experts in a field believe. When we do this, we are actually deferring, not to the vast majority of experts, but to the individual who claims to know what the vast majority believes. I’m not denying that there is a consensus in such cases, only pointing out that, as an outsider, my knowledge of the consensus is entirely dependent on the authority of the person who reports the consensus. I must have grounds for confidence in the man who reports a consensus before I can think about having confidence in the consensus. And here we’re back to Bruce Charlton’s point, that we must know that the man who reports the consensus is habitually truthful.
Bonald: I think you rely too much on the cocktail party scenario when thinking about opinions. The man who offers evidence and displays knowledge of contrary arguments exhibits knowledge of the “discourse,” not of the object of the discourse. He clearly has plenty of experience talking about X, but this is no guarantee he has any experience with X. A little research (proper sense) normally reveals that the subtle and learned argument that was advanced at last night’s cocktail party was also advanced, in more or less the same terms, in last week’s New York Review of Books. A description of the “present state of the question” is not an answer to the question.
Being a sort of social scientist, I’d like to advance a very modest apology for the evident bias of social science experts. Much social science has a direct bearing on social policy, and all social science rests on some particular beliefs about human beings. Because we are members of a society, we have opinions about social policy, and because we are humans, we have some knowledge of what humans are like. This makes the bias of social science experts more obvious than the bias of natural scientists. Ordinary people can legitimately claim to know a great deal about how society and human personality work, and they are perfectly correct in believing that their experience is part of the data. The same is not true of neutrinos or DNA. This does not mean that natural scientists are more honest or data-driven, only that I can’t know whether they are, because they, the natural scientists, have all the data.
Bonald, you are sounding like a technocrat. You are promoting the rule by experts that is a bane of modern life. You may have a point on this or on that where it makes sense, but when you add it all up, you end up exactly where we are. It’s hell.
The reasoning behind the rule of experts is that if you are not an expert, you must yield to the experts. If you can’t make the rationalistic scientific case against something, you are defeated. Why else do you suppose that all the fluffy subjects have been made into “science” (I speak of social science)? It’s to clear the way of opposition. Technocracy in action.
Neil Postman’s Technopoly is good reading on the matter.
A staunch traditionalist would simply resist the experts. Since he is not an expert, he will lean on the tradition. Yes, by doing this he knows better than the experts. He is holding to the tried and true, and not to the theoretical speculations of the cultural Marxists.
Well, we’re all trying to do something like that. We spend a lot of time doing non-science (grant writing, committee work, teaching, student training) to get the resources and environment we need for our actual science work. I don’t lie on proposals, i.e. promising to do stuff I don’t intend to do, or in publications, i.e. bending results to fit funding agencies’ desires. But it does indeed affect my work, because of all the pressure to publish regularly. This pushes me towards making incremental steps in the same direction as what I’ve already been doing rather than breaking off in entirely new directions. The latter is risky. More often than not, new ideas don’t pan out (mine often don’t, anyway), and even if they do, it usually takes a long time. I’m not happy about this, so I’m looking at ways to restructure my research group to give me more freedom, while always having some “safe” projects going on the side: small, exploratory projects with students, more long-term planning with collaborators, etc.
I think a staunch traditionalist would say that society needs to be organized so that communities don’t need much expert knowledge to run their affairs. This is one of the reasons we push for localism. Arguably one does need a lot of expert knowledge to run the United States as a whole, so the US shouldn’t be run as a whole. To run my hometown of Pana, Illinois (population 6000), you have to know the locality instead of abstractions, and local traditions are a good guide. If your society is set up so that major decisions depend on esoteric knowledge, the game is already up. You either go blind or accept expert rule; either way is a disaster.
Good points, all.
Alan Roebuck – I’ve written such a lot on this topic I hardly know where to start!
Maybe here:
http://thestoryofscience.blogspot.com/
And browse my articles from when I used to edit Medical Hypotheses:
http://medicalhypotheses.blogspot.com/
Good luck…
When I was younger, I was often annoyed by the fact that my colleagues always referred to ideas by the proper names of the people who had written papers about them recently (Baker et al. say . . .). At that time, it annoyed me because I have no innate tendency to remember who said interesting things. I just remembered the interesting things. So, for a while I tried hard to keep ideas and authors hooked up in my mind. This pretty quickly led to the discovery that my interlocutors generally had the sort of depth of knowledge of the ideas they were discussing that you could get by quickly reading the abstract of the paper. The fact that Baker said X in this paper was, for them, a fact like Baker has brown hair or Baker cheats on his wife.
It is very hard to judge whether someone has expertise by listening to them talk (though you can often spot total incompetence this way). You need to interrogate them or, better yet, watch them solve problems requiring that they apply the things they are allegedly expert at.
Dr. Charlton,
Thanks for the links. I’m reading them with interest.
A great essay on the topic: “The Layman’s Predicament,” Chapter 1 in Basil Mitchell, How to Play Theological Ping Pong: Collected Essays on Faith and Reason (http://www.amazon.com/How-Play-Theological-Ping-Pong-Essays/dp/0802805442/ref=sr_1_1_title_0_main?s=books&ie=UTF8&qid=1314920648&sr=1-1).