Me: Ban advertising.
Mercier: No. People are smart enough not to let themselves be influenced (unless they want to be).
Levy: Yes, ban advertising. People are smart precisely and only in the sense that they do let themselves be influenced by it.
Levy says nothing about advertising specifically. What he says is that Mercier et al. are wrong when they claim that individuals are individually rational. For individual rationality to have any positive effect, it must be part of a collective operating in an environment designed so that the heuristics we use are ecologically sound. To the extent that we can do anything more than simply follow our heuristics (and congratulate ourselves on being smart), it is precisely to anticipate and prevent mismatches between them and the epistemic climate.
On the other hand, nudging remains paternalistic in a negative sense if it confines itself to moving the candy to the back of the store.
The dual commitment of the right to dynamism and to stability ensures that being a Republican has no determinate policy implications. From a contradiction, anything follows.
p. 33
Mercier is right to emphasize that beliefs take hold only under certain conditions. The inhabitants of Orleans, or of Kishinev where an infamous pogrom resulted in the death of 49 Jews in 1903, accepted outlandish stories because they were already disposed to feel ill-will toward Jews. They embraced a handy justification for their ill-will and (in the latter case) of the atrocities they went on to commit. But the justifications don’t seem to have been inert: without them, the acts may not have taken place, or may have been less widespread or less serious. For some people they functioned as an excuse, but for those on the fringes they may have functioned as a reason. In fact, it seems impossible to explain the events at Orleans or Kishinev except by citing the kinds of rumors and misinformation Mercier claims to be inconsequential. We can’t explain these events by citing envy or worries about competition alone, because that explanation leaves certain important facts mysterious. Why the Jews and not other successful shopkeepers, or government officials? Handy scapegoats are handy because they are stigmatized, and stigmatization involves belief. The rumors may have taken hold only because people were already disposed to hate and despise Jews, but they were disposed to hate and despise Jews in very important part due to centuries of previous rumors and propaganda.
At the same time, Mercier is persuasive that it is much harder to shift people’s firm beliefs than is usually thought. Propaganda, advertising, and rumor have very limited power to move people to reject a belief that is entrenched to any significant degree (on the other hand, as we will see, it is often trivially easy to shift people from one belief to an opposed belief when the first is not entrenched, even when people take themselves to be fervently committed to it). He is also persuasive that we are very much less likely to accept rumors and conspiracy theories when they are seen by us as having “serious practical consequences” for our lives (as opposed to the lives of others for whom we have little sympathy). But the limited power of propagandists is sometimes enough to bring about serious consequences.
Propagandists can play to our prejudices and make our beliefs more extreme and us more likely to act on them. They can also sometimes take advantage of decreased vigilance with regard to beliefs we see as inconsequential for ourselves by dissimulating the consequences, and they can have spectacular success when the consequences are distant and abstract. Were the consequences of our beliefs about climate change more easily perceptible and more personal, we might be less apt to accept conspiracy theories about it. Because the consequences are far removed from individual behavior, however, we aren’t vigilant. Our relative credulity on this topic helps to explain why we face a climate crisis with little political will to address it.
p. 34-35
We are, as I’ve already noted, epistemic individualists, and we tend to be confident of our intellectual powers. Readers of a book like this one are particularly likely to have a high (and probably well-founded) opinion of their capacities. Surely I exaggerate the degree to which epistemic pollution is an obstacle to belief? Surely you (dear reader) can, with sufficient effort and application, sort through the lies and the fog, and come to an accurate assessment of the evidence? You are (very probably) in a much better epistemic position than most people. It’s not just that you are well-educated and (again, very probably) more intelligent than average. It’s not just that you probably have research skills that most people lack. You are also (very probably) epistemically luckier than most. As a consequence of your socialization (from family through to prestigious academic institution), you have acquired dispositions to trust reliable sources. You know enough to distinguish legitimate institutions from diploma mills; you have some idea of the degree of legitimacy conferred by a publication in Nature or Science. You are alert to signs of predatory publishers and on the lookout for industry funding. You are therefore protected, to some degree, from epistemic pollution.
For all these reasons, you’re indeed more likely than most to get things right when you (attempt to) judge for yourself. But that’s not because you’re a counterexample to my claims: it’s because you fit my model so well. It’s because you defer well that you do well. When you attempt to judge for yourself, you actually engage in social cognition; and that’s why you tend to get things right. You can reliably adjudicate between David Irving and his many critics, between climate scientists and denialists, between anti-vaxxers and genuine experts. But while it may seem to you that you do so well (epistemic individualist that you are) through the power of your unaided reason, a very important part of the explanation for your success is that you defer so fluently and appropriately. You owe your success to the way in which you are embedded in epistemic networks. Even so, I bet even you sometimes go wrong. Your capacities, and your disposition to defer, only get you so far. You live in an environment that is unreliable, in which frauds and fakes mimic the cues to reliability you rely on. Sometimes—I bet—you fall for their tricks. I certainly have.
p. 122-123
If we’re to bring people to believe better, it won’t be by asking them to behave more responsibly or by inculcating the epistemic virtues in them; not primarily and—I bet—not very importantly either. Epistemic humility, open-mindedness, care in evidence-gathering—these are all good things (in their place). But they’re no solution to the problem of believing better, largely because it’s extremely difficult, and perhaps impossible, reliably to judge when they’re called for and when they’re not. They’re dispositions that can as easily lead away from the truth as toward it (Levy & Alfano 2019). More pointedly, it’s simply false that the epistemic virtues and their responsible application enable the person reliably to track truths. To the extent she succeeds, it is her embedding in appropriate epistemic and social networks that enables her success.
p. 125
Once we see that (most) nudges work by offering implicit testimony to agents, we’re in a good position to see that many of their opponents have got things completely backwards. They demand, in effect, that we leave things as they are so that people are offered misleading testimony, rather than change the context of choice so that people are offered testimony that genuinely tracks option quality. Though they don’t recognize it, they’re advocating the deception of others, rather than taking steps to ensure that they’re told the truth. That’s not respectful of agency: quite the opposite. Nudging well is offering honest testimony, and refusing to nudge is refusing to ensure that bad testimony is no longer offered.
p. 145
we needn’t see the legacy of the Enlightenment as exhausted by this heavy emphasis on individual rationality, in what Kant regards as its mature form (“without the guidance of others”). Rationality, in its fullest sense, is, roughly, the deployment of cognition in the effective service of truth by appropriate response to the evidential content of information. Many psychologists and philosophers see us as rationally irrational: we deploy our cognition in the effective service of truth but we do so through the use of heuristics and other fast and frugal processes that do not respond appropriately to the content of our information. We respond irrationally—in ways that are not warranted by our evidence—but we’re rational to do so. On my account, we are rationally rational. We respond to the higher-order evidence encoded in our environment and in the assertions of others, by deferring to them or even self-attributing beliefs. We do so in the service of truth. We’re rational animals after all, even if our rationality is somewhat different to how we imagined it. We need to have the courage to use one another’s understanding as well as our own.
p. 153
The concluding chapter 6 deals with nudging. (You can safely skip directly to it, but you may also want to read the whole thing).
Two things (at least) about nudging have always troubled me: 1) the fact that it is done without regard to (and often in what is perceived to be direct opposition to) the nudgees' wishes -- both regarding the nudging itself and its intended result; and 2) that it aims to preserve a false image of what individual rationality entails (many on the right seem to insist on the natural and universal raw power of individual rationality and its conduciveness to long-term beneficial collective outcomes, while at the same time insisting on the need for concealed manipulation).
To this I would add that 3) to the extent we are (individually and/or collectively) rational and/or humble, we should design our nudges consciously, openly and collectively, thus exercising a kind of Odyssean self-control of which I think we are in fact capable -- even though we often shirk it.
And 4) that traditional nudging never really seems to take as its starting point individual or collective well-being, truth, and long-term sustainability, but rather aims to put a benevolent gloss on essentially short-term economic imperatives.
Anyway, Levy argues that nudging, properly done, needn’t suffer from any of these criticisms. His main point is that, to the extent we are individually rational at all, we are so because we are social, heuristic-using animals with the capacity to take part in a potentially constructive cultural evolution. Mismatch and “pollutants” are always a problem and a risk, and nothing is ever guaranteed, but by managing the epistemic environment we can harness the power of our quirks and mitigate the risks. (Cue Jürgen Habermas and Danish epistemologist Klemens Kappel.)
Actually, they’re not even “quirks” (not just, at best, ecologically valid, as Mercier and Gigerenzer argue) but rational considerations in the truest sense. The conformity “bias” and the prestige “bias” are wholly rational, even if they are sometimes confined to System 1.
This leads Levy to (ever so subtly) advocate coercive measures. We know we need the proper scaffolding. Let’s not fool ourselves into thinking (or pretending) that we don’t, and let us together design the ropes that we would want (beforehand) to tie us to the mast.
More specifically, we should make sure that indicators of popularity and prestige actually track utility, competence and truth.
This is pretty much what I felt before, but could not articulate.
But Levy has taught me one thing: I, too, have been too optimistic about the potential for individual rationality. Definitely not in the way Mercier, Sperber, Gigerenzer, Scott Alexander (?) and others are -- i.e. saying that if we each just “do our thing” most things will work out according to some cosmic plan for the best (possible outcome). Rather in the sense that I am, or have been, a staunch believer in the power of (individual) epistemic virtues. I now tend to accept that these are impotent in comparison to the more mundane (but possibly both ecologically and rationally sound) heuristics that we all share as (unwitting) collaborators in cumulative culture-building.