Wednesday 4 January 2017

#TopTen | No. 5: But how do they *know* NZ is so prosperous? [updated]

 

Last year I wrote and posted 801 posts (“we are the 801”). And here at EnZed’s fourth-most read political blog this was the fifth most popular post, asking all those organisations that regularly rank NZ in their top tens, quite seriously: where exactly do they get all their information from?


Have you ever wondered about all these international surveys always quoted so authoritatively?

I have one before me that finds NZ to be the most prosperous country in the world. Issued by something called the Legatum Institute – a think tank founded, we are told, by “a private investment firm headquartered in Dubai” – the survey has been widely reported over the last few days by everybody who would appreciate the headline that “Free markets, free people, and the world’s strongest society ensure that New Zealand takes the top spot in Legatum’s Prosperity Index.”

Great! Right? They say: 

New Zealand has ranked first in the Prosperity Index for six of the last ten years. Bar a small prosperity drop as a result of the 2008 global financial crisis and the immediate impact of the Canterbury Earthquakes, New Zealand’s prosperity has been on an upward trend, particularly since 2012. This rise has been driven by concerted efforts by policymakers, especially in economic and health policy. New Zealand’s Business Environment performance has seen it rise nine ranks to 2nd, and in Health it has risen eight ranks to 12th. Underlying strengths include Economic Quality, particularly free and open markets, where New Zealand ranks 1st, Governance (2nd), Personal Freedom (3rd), and Social Capital (1st).

So break out the bubbly – as many of the commentariat do quite indiscriminately whenever any of these sorts of surveys are released, just as they did when several other surveys declared New Zealand and its cities to be among the world’s happiest, the most prosperous and the most free. All just as breathlessly reported. (Oh, but also the most unaffordable. So there’s that too.)

So, it would all be great news if it were all true, right? But if you catch yourself wondering about the degree to which we and our markets are truly free, or NZ really prosperous, and how these authors would really know any of this – especially in the detail they claim, down to several significant figures – then you’re not alone.

Me too.

It’s not just this survey. It’s all of them. I pressed one fellow once whose regularly cited “freedom index” alleged at the time that New Zealand was the world’s freest – earning us their “gold medal for freedom” with scores like 9.6 out of 10 for property rights only a few years after the Resource Management Act had taken most of them away.

After a whole riot of wriggling to try to avoid the questioning, he eventually conceded that much of their data is based on subjective surveys sent out to selected “leaders” in each country. And from that news it didn’t take much more to learn that most of those surveys were completed by local cheerleaders desperately keen to trumpet the virtues of their hometown. (Q: Is your place a hell of a place to do business? A: Big tick! Hell, yes!! You’re darn tootin’!)

So, garbage in, and garbage out.

But what about the Legatum Institute’s lovely-looking Prosperity Index? On what basis precisely do we earn a 68.95 for something called “Social Capital” and an apparently whopping 84.27 for “Governance,” yet only 75.52 for “Education” and a confusing 74.09 for “Natural Environment,” I wondered?

Where specifically do these numbers to that many significant figures come from?

And just how much of the natural and business environments of this country, and of the 260 others they’ve surveyed, have the authors actually walked through? Or know anything about?

Anyone wondering about any of these things will just be left to continue wondering, it seems. The report’s methodology section does have a nice diagram full of words like “selecting the variables,” “standardisation” and “variable weights” – all things you can do at your own desktop once you have your numbers, and all of which make things sound very sciency – and there’s even a nice picture of some kind of machinery (there it is on the right) accompanying an assertion that it is all “methodologically sound.” But of information on where all the numbers being crunched actually come from, we are simply left to scratch our heads and wonder. There is not even a section in the diagram for “gather and assess quality of data.” Which, if you are simply getting your data from subjective surveys of at least 260 very different groups of people, you’d think would really, really matter.
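To make plain how little of the hard part that diagram covers, here is a minimal sketch of what “standardisation” and “variable weights” typically amount to once you already have numbers in hand. Everything in it – the variable names, the weights, the raw scores – is invented for illustration; it is not Legatum’s data or their actual formula:

```python
# A toy index pipeline: min-max standardisation plus author-chosen weights.
# All names, weights and raw scores below are invented for illustration only.

def standardise(values):
    """Rescale raw values to a common 0-100 scale (min-max standardisation)."""
    lo, hi = min(values.values()), max(values.values())
    return {k: 100 * (v - lo) / (hi - lo) for k, v in values.items()}

# Hypothetical raw inputs for one "pillar", one value per country.
trust_raw        = {"NZ": 8.2, "CH": 8.0, "AU": 7.1, "GB": 6.3}      # e.g. a 0-10 survey answer
volunteering_raw = {"NZ": 0.45, "CH": 0.40, "AU": 0.38, "GB": 0.30}  # e.g. a participation rate

# "Variable weights" are chosen by the index's authors, not derived from the data.
weights = {"trust": 0.6, "volunteering": 0.4}

trust = standardise(trust_raw)
volunteering = standardise(volunteering_raw)

# The published "pillar score" is just the weighted sum -- quoted to as many
# decimal places as you like, none of which says anything about input quality.
pillar = {c: weights["trust"] * trust[c] + weights["volunteering"] * volunteering[c]
          for c in trust}

for country, score in sorted(pillar.items(), key=lambda kv: -kv[1]):
    print(f"{country}: {score:.5f}")
```

Note that nothing in that sketch touches the question this post keeps asking: where the 8.2 and the 0.45 came from in the first place.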

Yes, one may read this:

For each country, the latest data available … were gathered on the 104 independent variables.

But how?

And from whom?

One should note the careful use of the passive voice there. “Were gathered.” For the fact is we are never fully told by whom, or even how, any particular data was gathered, what data sources were relied on, how reliable (or not) the numbers in those sources may be, nor yet how numbers from right around the world, from (presumably) several thousand different data sources, are normalised to all appear on the same scale. It is true that you can download all their data (all neatly tabulated to up to seven significant figures!), yet you will never be able to determine, for instance, how that 74.08751 for NZ’s “Natural Environment” was made up, or what objective method was used to give NZ an 84.274 for “Governance.”

I know. I’ve tried.

In their “2016 Methodology Report” we are told in general terms that data for all their 104 variables “are drawn from a wide range of sources including intergovernmental organisations such as the United Nations, World Bank, International Monetary Fund, and World Health Organization; independent research and non-governmental organisations (NGOs) such as Freedom House, Amnesty International, and Transparency International; and databases compiled by academics.” Which is all very nice as far as it generally goes. And for the subjective variables, we are told, two major global surveys are relied upon: the Gallup World Poll and the Executive Opinion Survey organised by the World Economic Forum. (Pages 12 to 14 of the report are it when it comes to the explanations you seek – literally all the explanation you are going to find.)

Yet when one does drill down into a few of these sources, even those they themselves call “objective,” one encounters a similar feeling of falling through quicksand with no means of support: the sources there are often of the same style as this one, with data gathered either subjectively, or not at all, or from surveys relying upon each other for support (Legatum cites the Fraser Institute, who cite Freedom House, who cite Legatum, who cite … and on and on). It starts to look less like objective information and more like a circle jerk within which each organisation simply prefers different data, or weights all the same data slightly differently. And about the “executive opinion surveys” that seem to be the source of much of this stuff we’ve written before at length: if there’s a better way of generating a good score than inviting local boosters to talk up their marketplaces as places to invest, then we haven’t yet discovered it.

So garbage in.

Yet they string all these subjective numbers together to four significant figures, then total them up to issue press releases and hand out awards; and just how these rankings are really earned, and where and how the numbers are actually gathered and formed, we are never fully and truly told.

We are, however, given “a detailed description of the imputation techniques” utilised to fill in missing data, which the reader may delight in finding in their “2016 Methodology Report.”
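For readers unfamiliar with the term, “imputation” simply means manufacturing a plausible value where none was collected. A minimal illustration of the simplest such technique, mean imputation (the numbers are invented, and the report describes its own, more elaborate methods):

```python
# Mean imputation: where a country has no value for a variable at all,
# substitute the average of the countries that do have one.
# The figures below are invented; the report uses its own techniques.

scores = {"NZ": 74.1, "CH": 71.3, "AU": None, "GB": 68.9}

observed = [v for v in scores.values() if v is not None]
mean = sum(observed) / len(observed)

filled = {country: (v if v is not None else mean) for country, v in scores.items()}
print(filled)  # AU now carries a precise-looking number that was never measured
```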

So, garbage out.

Are we happy and prosperous down here at the bottom of the South Pacific? Are we the most prosperous place on the planet? The truth is you’ll never know by reading junk like this. You will learn more by simply opening the window and looking around.

UPDATE: See, for what it’s worth, this is a whole lot more rigorous than any of these alleged indices:

New Zealand has been named the world's best country by readers of Britain's ‘Daily Telegraph.’ The winners of the annual ‘Telegraph’ Travel Awards were voted on by more than 75,000 readers, with New Zealand emerging as the clear winner for the fourth consecutive year.

At least with these awards there’s no pretence that the judgement is anything but subjective. And with a sample size of 75,000, albeit self-selecting, it’s undoubtedly larger than that used to compile any of these bogus indices.


Tomorrow, I post last year’s fourth-most popular post here at EnZed’s fourth-most read political blog asking … if Greenpeace has the truth on their side, why do they need to lie so much?


1 comment:

MarkT said...

These indexes that rank NZ highly are certainly subjective, but the subjectivity would presumably apply to the other countries' scores as well. The different studies ranking NZ at or near the top are consistent, so for it not to be true there would need to be a greater degree of bias or over-stating of the positives for NZ compared to other countries. The question is whether NZ's scores are more positively biased than others, and if so why? Yes our scores may be boosted by local cheerleaders, but won't every country have its local cheerleaders?

As I type this I'm currently in Switzerland, and as I look around me it's hard to believe that NZ is more prosperous than here. Part of my perception may well be the tendency for tourists to mainly see the positives of a place and not the negatives. It's something I clearly see from my Australian-based father when he comes to NZ, and thinks most things in NZ are better. I on the other hand have lived in both NZ and Aus and have a more balanced view. I think that's because it's only when you live in a place that you encounter any underlying negatives that a tourist doesn't see.

So my perception of Switzerland may be overly positive, because my tendency as a tourist is to focus primarily on the things that are different and better. But in acknowledging that I can see why NZ's scores could end up being over-stated more than others. NZ is small and the most isolated and hardest to get to of any of the top countries, therefore harder for any outsider to see the negatives. Combine that with the narrative about the 80's free market reforms, a narrative that continues to be told today by organisations such as FEE, almost implying the reforms have been ongoing to the present day (in reality they stopped 20 years ago), and you can clearly see how the 'legend' about how good things are on an island at the bottom of the South Pacific could be propagated.

One example of how NZ's score could be over-stated: it's only when you've gone through the process of getting a complex resource consent for a simple project do you understand how the RMA makes our high ranking for 'property rights' a sham. Based on the tunnels, railways, and cable cars I see going up mountains everywhere here I doubt the same constraints exist here. But is this a difference these indexes pick up on? I doubt it.