Have you ever wondered about all these international surveys that are quoted so authoritatively?
The latest, from the Legatum Institute – a think tank founded by “a private investment firm headquartered in Dubai” – has already been widely reported, finding that “New Zealand [is] ranked one of the happiest and safest places in the world,” at which we “beat Australia.”
Break out the bubbly! (As most of the commentariat have indiscriminately done.)
This follows many other surveys showing New Zealand and its cities to be among the world’s happiest, most prosperous and free, and just as breathlessly reported. (Anything to fill those column inches, eh.)
So, great news, if true, right? But have you ever wondered where exactly all these folk derive their data? I once pressed one fellow whose “freedom index” showed New Zealand at the time to be the world’s freest (earning us their “gold medal for freedom”), with scores like 9.6 out of 10 for property rights – only a few years after the Resource Management Act had taken most of them away.
After a whole riot of wriggling to avoid the question, he eventually conceded that much of their data is based on subjective surveys sent out to selected “leaders” in each country. And from there it didn’t take much more digging to learn that most of those surveys were completed by local cheerleaders desperately keen to trumpet the virtues of their hometown. (Q: Is your place a hell of a place to do business? A: [Big tick] Hell, yes!! You’re darn tootin’!)
So, garbage in, and garbage out.
But what about the Legatum Institute’s lovely-looking Prosperity Index? On what basis precisely do we earn a 3.82 for something called “Social Capital” and an apparently whopping 4.09 for “Governance,” yet only a 2.29 for “Economy” and a dire 2.11 for “Health,” I wondered? And how does the fact that “One of [NZ’s] notable tech companies, Orion Health, [has] recently established a presence in the Philippines as a base for ASEAN investment” – one of the few actual facts about this country cited in the Index – bear on any of those figures?
Anyone wondering about any of these things will just be left to continue wondering, it seems. The report’s methodology section does have a nice diagram full of words like “selecting the variables,” “standardisation” and “variable weights,” all making things sound very sciency, and even a nice picture of some kind of machinery. But as to where all the numbers being crunched come from, we are simply left to scratch our heads and wonder. There is not even a section in the diagram for “gather and assess quality of data.”
Yes, one may read this:
For each country, the latest data available in 2013 were gathered on the 89 independent variables.
And one should note the nice use of the passive verb there. “Were gathered.” But we are never told by whom, or even how – nor what sources (or whose) the data was gathered from, nor how reliable (or not) the numbers may be. Just how are these rankings earned, and where and how are the numbers formed that make them? We are never told.
We are, however, given “a detailed description of the imputation techniques” used to produce missing data, which the reader may delight in finding in the report’s “Technical Index.”
So, garbage out.
Are we happy and prosperous down here at the bottom of the South Pacific? You will learn more by opening the window and looking around you than you will by reading junk like this.