Strange though it is to say, alternatives to GDP are becoming fashionable. This week saw the launch of a new measure of ‘social progress’ on which to rank countries – and perhaps surprisingly, Britain did really rather well, beating not just the USA but also Germany and Japan. As the Telegraph’s headline put it, ‘Britain ranked second only to Sweden in table of most advanced countries’.
I’m keen on looking beyond GDP – and I’m also keen on Britain doing well. So what’s making me unhappy here?
A cynic might say that this is a left-wing bias against any study that claims to show our own country doing well – we’re primed to see our own failings and to try to make our country better, which doesn’t sit well with a feeling of general satisfaction and contentment. (It’s no surprise that the headline trumpeting Britain comes from the right-wing Telegraph, while the left-wing Guardian ‘led’ – to the extent that it covered the story at all – with the least attention-grabbing headline imaginable, ‘Michael Porter unveils new health and happiness index’.)
But my problems with the index aren’t about its results per se – they’re about what it’s based on.
The Social Progress Imperative
So what is the study based on? Well, the Social Progress Index (SPI) comes from an organisation called the ‘Social Progress Imperative’, which is chaired by Michael Porter – an expert on business strategy, apparently (says Wikipedia) the most cited author in business and economics today, and a Professor at Harvard Business School.
The SPI was created by Porter in collaboration with some MIT economists, and is funded by a mixture of businesses and nonprofits interested in the crossover between business strategy and social goals (on the one hand Deloitte & Cisco, on the other hand the Skoll Foundation & Fundacion Avina, which are both in turn funded by corporate philanthropists).
In other words, this is social progress as defined by business people with an interest in more than just the bottom line. Which makes it important – apparently Paraguay has already agreed to adopt it (says the Guardian), Porter teaches the CEOs of large businesses on a special course at Harvard, and it has been covered by the business press, including the Wall Street Journal – but also perhaps explains some of its problems.
Unpacking the numbers
The trouble with any scale is that it’s easy to lose sight of what’s being measured – it gets called ‘social progress’, so everybody thinks ‘this is a measure of social progress’. But what IS social progress, exactly, and does this really measure it?
The SPI is admirably clear on the overall structure of the index, which is as follows (from p44 of their report):
Their justification for defining social progress in this way is primarily given on p43 of the report, citing Amartya Sen’s capabilities framework and research on the role of institutions in economic and social performance. (On p42 they also stress the importance of looking at outcomes rather than inputs – i.e. measuring what you care about, rather than the things that might possibly lead to the things you care about). They think (i) basic human needs, (ii) building blocks for wellbeing & (iii) opportunity are the key over-arching dimensions of social progress, each containing the four components listed above.
Having decided on these components, they then get a series of indicators to measure them, restricting them to indicators that are measured pretty comparably across different countries, and making sure that the measures in each component seem to be measuring similar things (i.e. that the internal validity is OK). So on these grounds, some things fell by the wayside – like the existence of public libraries in ‘Access to Basic Knowledge’ (p49).
The 2-6 indicators per component are all standardized to make them comparable, then averaged to form the component score; the components are averaged to form the dimension score, and the dimensions are averaged to form the overall SPI score. Their argument for giving everything an equal weighting is that “there is no clear theoretical or empirical reason to weight any of the components more highly” (p44, see also p42).
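As a rough sketch, the aggregation described above looks something like the following. The structure and all the numbers here are made up for illustration – they are not the real SPI data or its full set of 52 indicators, and I’ve ignored the sign adjustments you’d need for ‘bad’ indicators like obesity.

```python
# Sketch of the SPI aggregation: z-score each indicator across countries,
# then average equally up each level of the hierarchy.
# (Sign conventions for 'bad' indicators are ignored in this sketch.)

def z_scores(values):
    """Standardize one indicator's raw values across countries."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / n) ** 0.5 or 1.0
    return [(v - mean) / sd for v in values]

def average(xs):
    return sum(xs) / len(xs)

# Hypothetical hierarchy: dimension -> component -> indicator -> value per country
raw = {
    "Wellbeing": {
        "Health and Wellness": {
            "life_expectancy": [81.0, 78.7, 80.9],
            "obesity_rate":    [24.9, 35.7, 23.0],
        },
        "Access to ICT": {
            "broadband_subscriptions": [33.0, 28.0, 34.0],
        },
    },
}

def spi_scores(raw, n_countries):
    """One overall score per country: indicators -> components -> dimensions."""
    scores = []
    for c in range(n_countries):
        dim_scores = []
        for components in raw.values():
            comp_scores = []
            for indicators in components.values():
                standardized = [z_scores(vals)[c] for vals in indicators.values()]
                comp_scores.append(average(standardized))   # equal weights
            dim_scores.append(average(comp_scores))         # equal weights
        scores.append(average(dim_scores))                  # equal weights
    return scores

print(spi_scores(raw, 3))
```

Note that because the averaging is unweighted at every level, the number of indicators in a component quietly determines how much any one indicator matters – which is where the next problem comes in.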
So what does that leave us with?
The troubles with this approach, though, become evident when we start to look at exactly what these indicators are (p45; to see the detail of how these are defined, go to the ‘Indicator definitions’ link on the publications page). Rather than going through all 52 indicators, let’s focus on two of the components within the Wellbeing dimension, shown below.
Now I can completely see how Access to ICT is important here, both in terms of accessing social goals and if you wanted to set up your own business. But this means that ‘fixed broadband subscriptions’ gets a bigger weight in the index than ‘life expectancy’! (LE is 1/6th of one component, broadband is 1/4th of one component.) I’m not sure that anyone reporting on the index will have noticed this – and it would be a struggle to justify…
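To spell out the arithmetic behind that claim (the dimension and component counts come from the report; the calculation itself is just my back-of-the-envelope working):

```python
# Implicit weight of a single indicator in the overall SPI, under equal
# averaging at every level: 1/dimensions x 1/components x 1/indicators.
N_DIMENSIONS = 3        # basic needs, wellbeing, opportunity
COMPONENTS_PER_DIM = 4  # four components in each dimension

def effective_weight(indicators_in_component):
    return (1 / N_DIMENSIONS) * (1 / COMPONENTS_PER_DIM) * (1 / indicators_in_component)

life_expectancy = effective_weight(6)  # 1 of 6 indicators in Health and Wellness
broadband = effective_weight(4)        # 1 of 4 indicators in Access to ICT

print(life_expectancy)              # 1/72, about 0.014
print(broadband)                    # 1/48, about 0.021
print(broadband / life_expectancy)  # broadband counts 1.5x life expectancy
```

So under the equal-weighting rule, fixed broadband subscriptions end up counting one and a half times as much as life expectancy in the final score.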
I also have quibbles with the inclusion of some of these indicators and the absence of others. Why are ‘cancer death rate’ and ‘CVD/diabetes deaths’ on this list? We already have life expectancy on there, and everyone has to die of something – I don’t see any reason to regard cancer deaths as any worse than anything else.
The UK comes top of the ‘Health and Wellness’ strand, but from looking at their nifty interactive visualisation tool, this is despite coming 10th for life expectancy – and 42nd for the cancer death rate. (The UK is pulled up the rankings particularly for being 2nd for the ‘availability of quality healthcare’ – which is based on the % answering satisfied to “In your city or area where you live, are you satisfied or dissatisfied with the availability of quality healthcare?” It’s not clear which year this is from. But if you trust that this question is comparable, then the NHS is obviously doing something right!)
Such puzzling indicators are a sign of a deeper problem – they haven’t really gone to the trouble of justifying each of these indicators, so I simply don’t know what was going through their minds when they chose them. Moreover, the same is true for the components themselves: why do we have obesity, and why don’t we have other chronic illnesses? Why is there nothing about the provision of a basic level of security against unemployment, disability or old age, which is generally considered an essential part of social citizenship? Why is there no consideration of economic inequality and the impacts that this has on getting on in life?
And at the deepest level, even the dimensions are questionable. It’s fine to have a vague nod to Amartya Sen, but if we compare this to the Equalities Measurement Framework in the UK – the product of a vast amount of research and consultation by the Equality & Human Rights Commission (see my previous post on this) – the SPI seems arbitrary and ill-thought-out. And I say this despite my reservations about the bank of different indicators in the EHRC framework, as I describe in that post.
It’s hard to escape some disciplinary tribalism here, which isn’t pretty for any of you to look at – but I find it frustrating when a talented team of incredibly smart people wade into an area they don’t seem to understand, all guns blazing. (It’s not just economists that do this – I wish all sociologists who write about the economy had a better understanding of it too. There are plenty of areas where my opinion isn’t worth anything, as my friends know only too well…) That said, there are some knowledgeable people who’ve been involved, so it’s not a car crash – but Stiglitz, Sen & Fitoussi seemed to build a much better-qualified team in moving towards a similar goal.
So let’s be clear: it is definitely valuable to introduce measures beyond GDP into the discussion, and spread these throughout business schools, the business press and to CEOs worldwide. And there’s lots of great stuff in the index – it tries to think about multiple different components of social progress, and includes everything from the ‘stillbirth rate’ to ‘tolerance for homosexuals’ to ‘freedom of speech’. The press release also does a good job of highlighting where countries performed badly in one area despite doing well overall, and the website is fantastic – a model of clear communication for the rest of us to aspire to.
But to my mind, the value of an index lies more than anything in its ability to measure what people think it measures – so that when you look at the small print, you’re not surprised. And I don’t think it does that. Some of the things it measures are puzzling. Some of the things it doesn’t measure are crucial parts of ‘social progress’. And if you’re going to build an index and try to persuade everyone in the world with any influence to use it, I think you should justify every decision you’re making.