As part of the International Year of Statistics (by the way, it’s also the International Year of Water Cooperation, and the International Year of Quinoa, so good quinoa recipes in the comments please), Ipsos Mori recently conducted a survey looking at people’s factual beliefs about the UK. I’m sure you’ll all be shocked to hear that (among other things) the British people heroically overestimate:
- The amount we spend on Jobseeker’s allowance (29% of the sample believed we spend more on JSA than pensions, whereas the opposite is true – to the tune of about 1500%).
- The scale of welfare benefit fraud (estimated by the sample at £24 out of every £100; the actual figure is more like 70p).
- How much we would save by capping benefits at £26,000 (twice as many people in the sample thought we would save more by doing this than by stopping child benefit for high earners or by raising the retirement age to 66. The real savings are estimated to be £260 million, £1,700 million, and £5,000 million, respectively).
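To put the scale of these gaps in perspective, here’s a quick back-of-the-envelope calculation using only the figures quoted above (the numbers are from the survey write-up; the factor-of-overestimation framing is just my own arithmetic):

```python
# Benefit fraud: the public's average guess vs the official estimate,
# both expressed per £100 of benefit spending.
perceived_fraud = 24.0   # £24 in every £100 (sample's average estimate)
actual_fraud = 0.70      # about 70p in every £100 (official figure)
fraud_overestimate = perceived_fraud / actual_fraud
print(f"Fraud overestimated by a factor of about {fraud_overestimate:.0f}")  # prints 34

# Estimated annual savings from the three policies, in £ millions.
savings = {
    "capping benefits at £26,000": 260,
    "stopping child benefit for high earners": 1700,
    "raising the retirement age to 66": 5000,
}
cap_savings = savings["capping benefits at £26,000"]
for policy, amount in savings.items():
    print(f"{policy}: £{amount}m ({amount / cap_savings:.1f}x the cap's savings)")
```

In other words, the policy most people picked as the biggest saver actually saves roughly a twentieth of what raising the retirement age would.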
The Ipsos sample was pretty small (only 1,015 people), and this was an online study so all the usual caveats apply. But these results are consistent with a lot of existing research, including numbers from the British Social Attitudes survey, this YouGov survey I’ve mentioned previously, and this study by our very own Ben Baumberg (et al.).
The survey prompted a moderate amount of media coverage – mostly of the “sigh, people don’t know anything about politics” variety. Ipsos’ own write-up is also pretty bloodless – “these misperceptions present clear issues for informed public debate and policy making”. No kidding! On the most basic figures that anyone would need to reach an opinion about welfare, people are not just a bit off-base; they’re living in an entirely different universe! This is an even bigger problem when you see that people aren’t just guessing all over the map. They are consistently wrong in the conservative direction (overestimates of benefits spending, overestimates of fraud, etc.). This has big and obvious implications for people’s views on government policy when it comes to benefit cuts.
This is illustrated quite nicely by a study carried out in the U.S. by James Kuklinski and colleagues. They surveyed around 1,000 people in Illinois and found the usual large, conservative-biased errors in people’s beliefs about welfare (their respondents overestimated the number of people on welfare, how much money welfare recipients get, how much of the federal budget goes on welfare, and so on). They also found that these beliefs were one of the strongest predictors of people’s opinions on welfare policy – the higher people thought federal welfare spending was, for example, the more likely they were to think it should be cut.
So far, so unsurprising. But the authors went a couple of steps further. First they asked people how confident they were that their estimates (of spending etc.) were correct. You might think that people making evidence-free guesses would at least be aware of it. Sadly not – the people making the least accurate guesses were also the ones who were most confident they were right.
Next, using a smaller sample of around 70 students, they tried to see how resistant people were to being confronted with the real facts (this time just about how much money the government spends on welfare). They asked participants to first estimate what proportion of federal spending went on welfare, and then to say what they thought this figure should be. As in the larger sample, people routinely overestimated how much was spent. However, the researchers then confronted a random half of the sample with the real number. This figure was often not only lower than people’s estimates, but also substantially lower than their preferred amount. This had a strong effect on people’s opinions about policy. Not surprising really – if you said that, say, 30% of government spending went on welfare, but only 5% should, you’d feel pretty silly still calling for a cut after being told that actual spending is really only 1%.
Even though this is a small student sample, these results seem pretty heartening. Maybe presenting people with the real facts can actually make a difference – at least if you do it the right way. The researchers, however, were less optimistic. Drawing on previous research they predicted that these effects would probably be short-term; and I’m sorry to say I probably agree. Can you imagine that a fervent opponent of welfare, after being cornered in this study into taking the opposite position, would come away ‘converted’ for good? Or is it more likely they’ll find some way to circle back around to their original opinion? People’s attitudes are complicated, and they don’t just come from misinformation about raw numbers. They come from personal experience, anecdotes, media narratives and so on – built up over a whole lifetime.
So does that mean I was over-reacting earlier, in my outrage over people’s ignorance of benefit numbers? Maybe a little. But I’d still rather have people start from a baseline of something approaching reality. It might not change their underlying feelings about the issue, but it might at least give them a better sense of proportion.