Have you ever wondered about all these international surveys that are quoted so authoritatively?
The latest, from the Legatum Institute – a think tank founded by “a private investment firm headquartered in Dubai” – has already been widely reported, finding that “Free markets, free people, and the world’s strongest society ensure that New Zealand takes the top spot in [Legatum’s] Prosperity Index.”
This is great news, isn’t it?!
New Zealand has ranked first in the Prosperity Index for six of the last ten years [say the authors of the Prosperity Index]. Bar a small prosperity drop as a result of the 2008 global financial crisis and the immediate impact of the Canterbury Earthquakes, New Zealand’s prosperity has been on an upward trend, particularly since 2012. This rise has been driven by concerted efforts by policymakers, especially in economic and health policy. New Zealand’s Business Environment performance has seen it rise nine ranks to 2nd, and in Health it has risen eight ranks to 12th. Underlying strengths include Economic Quality, particularly free and open markets, where New Zealand ranks 1st, Governance (2nd), Personal Freedom (3rd), and Social Capital (1st).
So break out the bubbly! (As much of the commentariat indiscriminately do whenever any of these surveys are released.)
This follows many other surveys showing New Zealand and its cities to be among the world’s happiest, most prosperous and most free, all just as breathlessly reported. (Oh, and also the most unaffordable. So there’s that.)
So, great news, if true, right? But if you catch yourself wondering about the degree to which we and our markets are truly free, and how these authors would really know that, especially in this detail, then you’re not alone. So am I.
It’s not just this survey. Have you ever wondered from where exactly all these folk derive their data for all these things about the world’s happiest this and the freest the-other? I pressed one fellow once whose “freedom index” showed New Zealand at the time to be the world’s freest -- earning us their “gold medal for freedom” with scores like 9.6 out of 10 for property rights only a few years after the Resource Management Act had taken most of them away.
After a whole riot of wriggling to try to avoid the questioning, he eventually conceded that much of their data is based on subjective surveys sent out to selected “leaders” in each country. And from that news it didn’t take much more to learn that most of those surveys were completed by local cheerleaders desperately keen to trumpet the virtues of their hometown. (Q: Is your place a hell of place to do business? A: [Big tick] Hell, yes!! You’re darn tootin’!)
So, garbage in, and garbage out.
But what about the Legatum Institute’s lovely-looking Prosperity Index? On what basis precisely do we earn a 68.95 for something called “Social Capital” and an apparently whopping 84.27 for “Governance,” yet only a 75.52 for education and a confusing 74.09 for “Natural Environment,” I wondered? Where do these numbers, to that many significant figures, come from? And just how much of this country’s and the many others’ natural and business environments they’ve surveyed have the authors actually walked through? Or know anything about?
Anyone wondering about any of these things will just be left to continue wondering, it seems. The report’s methodology section does have a nice diagram full of words like “selecting the variables,” “standardisation” and “variable weights,” all making things sound very sciency, and even a nice picture of some kind of machinery and an assertion that it is all “methodologically sound.” But as to where all the numbers being crunched actually come from, we are simply left to scratch our heads and wonder. There is not even a section in the diagram for “gather and assess quality of data.”
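For readers curious what “standardisation” and “variable weights” actually do to the numbers, here is a minimal, entirely hypothetical sketch of the kind of pipeline those diagram boxes describe. The countries, figures and weights below are all invented for illustration; the Institute does not publish its working code, and this is not it:

```python
# Hypothetical sketch of a composite-index pipeline: the kind of
# "standardisation" and "variable weights" a methodology diagram names.
# None of these figures or weights are Legatum's actual data.
from statistics import mean, stdev

# Raw indicator values for three countries (made-up numbers).
raw = {
    "NZ":  {"gdp_growth": 3.1, "survey_trust": 0.82},
    "AUS": {"gdp_growth": 2.4, "survey_trust": 0.75},
    "UK":  {"gdp_growth": 1.8, "survey_trust": 0.69},
}
weights = {"gdp_growth": 0.6, "survey_trust": 0.4}  # chosen arbitrarily

def standardise(values):
    """Z-score each value against the cross-country mean and stdev."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

countries = list(raw)
scores = {c: 0.0 for c in countries}
for var, w in weights.items():
    zs = standardise([raw[c][var] for c in countries])
    for c, z in zip(countries, zs):
        scores[c] += w * z  # weighted sum of standardised variables

# Rescale to a 0-100 range so the result *looks* precise.
lo, hi = min(scores.values()), max(scores.values())
index = {c: 100 * (scores[c] - lo) / (hi - lo) for c in countries}
print(index)
```

Note that the decimal places in the output are purely mechanical: the arithmetic will happily carry four significant figures no matter how subjective the survey answers fed into `raw` were.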
Yes, one may read this:
For each country, the latest data available … were gathered on the 104 independent variables.
But how?
And from whom?
One should note the careful use of the passive verb there. “Were gathered.” For the fact is we are never fully told by whom, or even how – nor what the data sources are, from whom particular data was gathered, how reliable (or not) the numbers in those sources may be, nor how numbers from (presumably) several thousand different data sources right around the world are correlated to all appear on the same scale. It is true that you can download all their data (and all neatly tabulated to up to seven significant figures!) yet you will never be able to determine, for instance, how that 74.08751 for NZ’s “Natural Environment” was made up, or what objective method was used to give NZ an 84.274 for “Governance.”
In their “2016 Methodology Report” we are told in general terms that data for all their 104 variables “are drawn from a wide range of sources including intergovernmental organisations such as the United Nations, World Bank, International Monetary Fund, and World Health Organization; independent research and non-governmental organisations (NGOs) such as Freedom House, Amnesty International, and Transparency International; and databases compiled by academics.” And “for the subjective variables, two major global surveys are used: the Gallup World Poll and the Executive Opinion Survey organised by the World Economic Forum.” (Pages 12 to 14 of the report are where the explanations you seek live, and that is literally all of them.)
Yet when one does drill down into a few of these sources, even those they themselves call “objective,” one encounters a similar feeling of falling through quicksand with no means of support: the sources there are often of the same style as this one, with data gathered either subjectively, or not at all, or from surveys like this one relying on each other, simply preferring different data or weighting it all differently. And about “executive opinion surveys” we’ve written before: if there’s a better way of getting a good score than inviting local boosters to talk up their marketplaces as places to invest, then we haven’t yet discovered it.
So garbage in.
Yet they string all these subjective numbers together to four significant figures, and then total them up to issue press releases and hand out awards; but just how these rankings are really earned, and where and how the numbers are actually gathered and formed, we are never fully and truly told.
We are however given “a detailed description of the imputation techniques” utilised to produce missing data, which the reader may delight in finding in their “2016 Methodology Report.”
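For readers unfamiliar with the term, “imputation” just means manufacturing a plausible value where none was collected. The simplest common variant is mean imputation – whether Legatum’s techniques are this crude or more sophisticated, the effect is the same: an invented number enters the index alongside the measured ones. A hypothetical sketch, with made-up figures:

```python
# Mean imputation: a common, simple way to fill in missing data points.
# (A hypothetical illustration; the report's actual techniques may differ.)
from statistics import mean

# One variable across five countries; None marks "no data collected".
values = {"NZ": 74.1, "AUS": 71.3, "UK": None, "CAN": 69.8, "US": None}

observed = [v for v in values.values() if v is not None]
fill = mean(observed)  # (74.1 + 71.3 + 69.8) / 3

imputed = {c: (v if v is not None else fill) for c, v in values.items()}
print(imputed["UK"])  # the UK's "score" is now an invention
```

The imputed value then flows through standardisation and weighting just like real data – which is part of why tidy figures quoted to seven significant digits can mislead.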
So, garbage out.
Are we happy and prosperous down here at the bottom of the South Pacific? Are we the most prosperous place on the planet? You will learn more by opening the window and looking around you than you will by reading junk like this.
3 comments:
I expect they've done the best that can be done with this kind of data. The bigger problem is that somebody has to weight things across the categories, and any ranking will be a function of that weighting.
Eric, couldn't they just have looked at the degree to which individuals are taxed, regulated and controlled by laws? For example compare the degree to which individuals are free to marry who they wish, say what they like, do as they please (assuming they don't harm others), exchange goods or services without restriction, and claim true ownership of their property (without rates or taxes) or themselves (free to use drugs and associate with whoever they please).
Trying to compare vague opinion-based surveys, or even to compare mathematically derived variables between countries, is relatively pointless given the different demographics, locations, historical trends, recent events, etc. What matters is the degree to which individuals in that country are free to do as they please, whether it is starting a business, marrying a same-sex partner, using recreational drugs, or owning a firearm.
Except I think there is a bigger problem: the data they gather is subjective, and the method of its gathering is not given. At all. It could be, but it's not -- and it's not unique to this study. If they can give figures to seven significant digits, which they do, then why not show how each of those significant digits is derived? They can't, because the figures are less derived than they are simply conjured up. Showing the working would expose the lack of any real rigour.
Further, which is never reported, the data is gathered as a means to an end, and that end is not an objective assessment of one country with another. It's generally just a means to get these organisations in the headlines, which helps donations; to get some countries some good headlines (which helps investment and hinders political reform); and generally to bash places like the US (where so many of the organisations making up the sources are based).