There’s a piece in Inside Higher Ed this morning from Michael Schwalbe, a sociologist from NC State, on teaching students about the nature of scientific expertise. This is a topic I’ve written about before, in both a formal academic context and a general-audience book, so I obviously have Thoughts about it, which leads directly to this post.
Schwalbe suggests five key characteristics of scientific expertise that basically function as a way to identify which experts can be trusted:
1. Expertise takes years to acquire.
2. Expertise is domain specific.
3. Expertise is active, not static.
4. Expertise is rooted in community.
5. Expertise knows its limitations.
These are generally pretty good, which is why I quoted them (click the link above to see all these fleshed out more). I’ve said some similar things in the past, so while I might give a bit less weight to some of these features than Schwalbe does, I would generally agree with this list as at least a useful heuristic for deciding whether a particular talking head is worth listening to.
I am a bit put off by one aspect of the piece, though, namely that it’s framed in a way that makes it seem like an algorithm for deference. That is, it’s presented as a way of evaluating experts, and the suggestion is that if somebody checks the boxes, then you should go along with what they say to do.
And that actually gives me a bit of pause, because of a point that I mostly associate with Josh Barro (I think this podcast episode may have been the first place I heard it stated as succinctly as he does). As he puts it, the role of experts, particularly in technical fields, is to give advice, but making decisions requires consideration of values and priorities that are not determined by scientific or technological means, and thus not the province of experts. Expert advice is an important input to decision-making, but not necessarily the decisive factor.
(This is closely related to my personal hobbyhorse regarding “critical thinking” as explained in the first substantive post on this Substack, but I like Barro’s framing enough that I’ve adopted a lot of it.)
Put another way, subject-matter experts can tell you (with varying degrees of accuracy) what will happen if particular actions are taken (or not taken)— what are the likely benefits, and what are the likely costs. They can’t tell you whether those costs are worth it. That’s a very different question, and comes down to a mix of personal and societal values and priorities, which can lead different people to reach different conclusions about how best to weigh the costs against the benefits for any particular course of action.
In a probably futile attempt to steer away from the hottest controversies of the day, consider the matter of diet. There are numerous public health authorities out there who will be happy to explain at great length that eating red meat has negative health consequences— increasing the risk of heart disease and other problems. These are people who check all the boxes on Schwalbe’s list: they’re credentialed doctors, speaking in their area of expertise, up on the latest research, part of a broad community, etc. I have every reason to believe they know what they’re talking about, at least as well as anybody can in a life-science field.
All that said, like a lot of other people who are fully capable of making the same assessment of those experts’ credibility, I somewhat regularly cook and eat rare steak, because their expert advice is only one input to the decision-making process. I accept that they’re probably correct in saying that eating steak brings with it certain costs in the form of elevated risk, but I consider that a reasonable trade-off for the pleasure of eating well-prepared beef. I probably eat a bit less steak than I would were there less of a noisy expert consensus against it, but I’m not willing to give it up entirely, because there’s more to the decision than that.
Looking at the general process of decision-making and expert input in this light makes clear the limitations of Schwalbe’s approach. While I agree that the heuristic above can be useful for assigning a level of confidence to the predictions made by particular experts, it’s not a sure method for achieving consensus in decision-making, because that’s a question of what costs individuals and groups are willing to incur to achieve particular benefits. And that is orthogonal to the question of whether any particular expert can be trusted when they make a prediction about what will happen.
It also clarifies the nature of a lot of the nastier arguments that take place online and off, particularly in areas that touch on scientific matters. A substantial fraction of calls to “listen to the experts” or “trust the science” are actually disguised calls for people to value things differently than they do. There are, of course, plenty of examples of people questioning or outright denying the validity of expert predictions for reasons that aren’t particularly good, but a sizeable fraction of these arguments are in fact disagreements over whether a particular benefit is worth the associated cost. That is, at a fundamental level, not a problem that you’re going to solve with critical thinking or better tools for assessing expertise.
On the bright side, I do think it’s possible to shift people’s positions in these kinds of cost-benefit tradeoffs, but it requires recognizing what’s actually going on and addressing it directly, not accusing people on the other side of being too ignorant or deluded to understand the expert consensus. That means understanding what they value and why, which can then be used to shape arguments that they should shift their priorities in ways that lead to better decisions.
So, while I like the general idea behind Schwalbe’s piece, and mostly agree with the heuristic, I think it’s fundamentally incomplete, because it doesn’t address the question of values. It’s still worthwhile to do a better job of teaching people how science works and how to assess expertise, but it’s not going to cure the many and manifest problems with our political #discourse.
Today is the “reading day” between classes and final exams, so as promised (threatened?) last week, I’m getting some time freed up to write more. If you want to see what I do with that, here’s a button:
And if you want to take issue with anything I say here, the comments will be open: