Research methodology scares a lot of people. There’s this idea that you need an advanced degree and very specialized education to design and conduct a research study. That isn’t always true – a lot of times, it’s just a matter of thinking logically about how to get and use meaningful data to help you understand a situation.
But researchers with advanced degrees and very specialized education, and working for hugely influential international policy-making and governance organizations – they know how to collect accurate data and analyze it appropriately. Right? Right??
Ummm… maybe not.
In March of this year, the International Labour Organization – a tripartite United Nations agency representing governments, employers and workers – released a working paper entitled Deregulating Labour Markets: How Robust is the Analysis of Recent IMF Working Papers? The paper, by economist Mariya Aleksynska, analyzed the methodology in four recent working papers published by the International Monetary Fund. The IMF, in its own words, “oversees the international monetary system and monitors the economic and financial policies of its 188 member countries”; it also regularly releases reports, working papers and discussion notes on various aspects of economic policy and practice.
The four papers that the ILO analyzed (which are available here, here, here and here) all looked at whether labour market regulation affected labour market flexibility. Broadly defined, “labour market flexibility” is how easily employers can respond and adapt to changes in their environment, such as the supply of qualified workers, wage expectations, and workplace rules like health and safety laws. All four papers concluded that “strong empirical evidence” showed lower unemployment rates in countries with less strongly regulated labour markets and fewer employment regulations. Thus, the papers suggested, “structural policies” to increase labour market flexibility may “reduce unemployment in both the short and the long term”.
The ILO analysis reveals some major problems with the data used in these studies, and some major problems with how the data were used to reach those conclusions. And it’s really troubling that these problems apparently were missed by the IMF researchers, by the IMF itself, and by the editors and reviewers at the peer-reviewed academic journals that published articles based on two of the papers.
The first problem with these papers’ methodology is that the data measuring labour market flexibility were taken from a single source: the Fraser Institute’s Economic Freedom of the World database. A single source of data should always be used with caution, because if there is something wrong with that data, there are no other data in the analysis to offset or cancel out those errors.
But not only are these studies relying on data from a single source – they are relying on a single source that has a clear ideological bias. The Fraser Institute makes no secret that it exists to promote “a free and prosperous world where individuals benefit from greater choice, competitive markets, and personal responsibility”. Although the Fraser Institute claims to provide “objective, empirical research”, its research consistently produces anti-regulation, anti-government outcomes – such as its annual declaration of “Tax Freedom Day”, and its (in)famous school rankings in which private and independent schools almost always magically rank higher than publicly-funded schools. And while the Fraser Institute claims to be politically “independent”, it has reportedly accepted large donations from the right-wing American Koch Brothers.
All of this information about the Fraser Institute, and lots more, is easily accessible through a quick Google search. So there’s no reason why anyone involved with these studies shouldn’t have been aware of the potential for bias in Fraser Institute data.
The analyses in all four IMF papers use a composite measure of “labour market flexibility” that is provided within the Fraser Institute’s Economic Freedom of the World databases. The databases contain data from 1970 to 2010 on over 140 countries (not every country is included in every year’s data). Separate databases are posted for the years 1998/99 through 2012, with the new data for the current year added to the previous years’ data.
Here are just some of the problems that the ILO report found with the Fraser Institute data, and with how the IMF research used this data.
- Three of the six components of the Fraser Institute’s “labour market flexibility” measure are taken from the World Bank Employing Workers Index (EWI). This index, part of the World Bank’s Doing Business set of indicators, has been extensively criticized for being selective in what it measures – for example, focusing on flexible employment policies while not including measures of laws that protect workers. As a result of this criticism, in 2009 the World Bank issued an advisory to its own staff to not use EWI data in formulating policy, to not use it in assisting with labour market reforms, and to not use it as a “target or performance monitoring indicator”. So, in other words, half of the Fraser Institute’s “labour market flexibility” data are flawed data that the source organization has told its own employees not to use. And these are the data that the IMF papers use in their analysis.
- In 2002, the Fraser Institute changed some of the data sources it used to calculate the “labour market flexibility” measures, and said that the earlier information in its database was revised to reflect these changes. However, the Fraser Institute apparently did not make these revisions. The 2010 database, which the IMF researchers appear to have used for their analyses, contains some of the uncorrected labour market data that were in the original 2002 database. Thus, not only are there incorrect data for these variables in some years, but the values of the variables were not calculated the same way in every year. So comparisons of changes in these variables across time may not be reliable.
- Some of the variables in the Fraser Institute database are based on very questionable data sources. For example, according to Aleksynska, prior to 2002 the “mandated cost of hiring” variable was calculated by averaging responses to a single question in the World Economic Survey. The question asked respondents to rate their agreement with this statement: “The unemployment insurance program [in my country] strikes a good balance between social protection and preserving work incentives”. This is a question about perceptions and opinions; it is clearly not intended to collect numerical data about the costs of meeting legal requirements for hiring employees.
- The IMF researchers have apparently interpreted changes in the Fraser Institute data as evidence of “large-scale reforms of labour markets toward flexibility”. One paper identified 52 such “episodes”, taking place between 2000 and 2008. Aleksynska asked the authors of the paper for their list of these reforms; the list she was given indicated that 30 of these “episodes” took place in 2002. However, as mentioned above, the Fraser Institute changed the calculation of some of its variables in 2002 – so it is entirely possible that the observed “episodes” were actually changes caused by differences in how the variables were calculated, not by any actual labour market reforms.
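This last pitfall is easy to reproduce. Below is a minimal sketch, with entirely made-up numbers and a hypothetical scoring formula, of how a change in an index's methodology can masquerade as a policy "reform episode" even when the underlying regulations never change:

```python
# Hypothetical "labour market flexibility" scores for one country,
# on a 0-10 scale. The underlying policy never changes over the period.
years = list(range(1998, 2006))
underlying_policy = [5.0] * len(years)  # the laws stay the same every year

def index_value(year, policy_score):
    # Assume the index methodology changes in 2002: the same unchanged
    # policies now score 1.5 points higher under the new formula.
    methodology_shift = 1.5 if year >= 2002 else 0.0
    return policy_score + methodology_shift

scores = [index_value(y, p) for y, p in zip(years, underlying_policy)]

# A naive "reform detector": flag any year-on-year jump above a threshold.
THRESHOLD = 1.0
episodes = [years[i] for i in range(1, len(scores))
            if scores[i] - scores[i - 1] > THRESHOLD]

print(episodes)  # -> [2002]: a "reform episode" that never happened
```

A researcher looking only at the scores, without knowing the formula changed, would record 2002 as a year of major deregulation – which is exactly the kind of artifact Aleksynska suspects behind the cluster of "episodes" in that year.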
- When large-scale regulatory labour market reforms did happen, the Fraser Institute data do not accurately reflect them. For example, in 2006 the country of Georgia had major reforms in its labour laws, as described in this document. But the Fraser Institute data for Georgia seem to indicate changes occurring between 2004 and 2005. Additionally, Aleksynska points out that when data values change in the Fraser Institute databases, these changes are not clearly linked to specific regulatory changes – and that makes it difficult to determine whether the data accurately reflect the impact of any documented policy changes.
In addition to identifying specific methodological problems with the IMF papers’ analyses, Aleksynska also makes some very good observations about the assumptions underlying those analyses. The IMF researchers interpreted higher values for the Fraser Institute variables – higher values associated with less regulation – as representing “better quality” labour markets. However, in Aleksynska’s words, “this confounds the analysis of quantity with the analysis of quality”. That is, the impact of labour regulations should not just be assessed on how many regulations exist, but also on other criteria: for example, whether the regulations protect basic human rights in work and employment. She also points out that it’s a mistake to assume that all labour regulations are unduly restrictive or unfair to employers. These regulations can actually benefit employers – for example, by improving stability in employment relationships, by lowering hiring costs, and by facilitating investments in job training.
So why should people who aren’t statisticians or researchers care about these problems? Well, since Aleksynska’s paper was published, she and co-author Sandrine Cazes have produced another ILO paper identifying other research that has also used the flawed Employing Workers Index data. The newer paper states:
The World Bank Employing Workers Index series [is] being systematically reproduced in other datasets – the Fraser Institute, the World Economic Forum, and the International Institute for Management Development – but without properly acknowledging the debate [around the index] and the methodological changes in the data series. Furthermore, these databases use the EWI data to construct their own aggregate indices and to rank countries, thus disregarding significant recommendations and the decisions taken by the World Bank itself. (p.2)
So these are not just problems involving obscure statistical issues. They are problems of flawed data with an anti-worker and anti-labour bias being used by international organizations with significant influence on labour and employment policies around the world.
The IMF has a disclaimer on its working papers, stating that “the views expressed in this Working Paper are those of the author(s) and do not necessarily represent those of the IMF or IMF policy. Working Papers describe research in progress by the author(s) and are published to elicit comments and to further debate”. However, these particular “research in progress” papers are focused on a very important and difficult issue. Unemployment is a major social problem in many countries, particularly among young workers. If IMF-sponsored research suggests that reducing labour market regulation may reduce unemployment rates, the impact of those potentially flawed suggestions could be devastating.
And finally, it’s also very troubling how little attention was paid to the ILO paper, despite its stunning findings. I only learned about the paper through a tweet from Press Progress linking to its summary of the findings. A quick Google search on the paper’s title turned up only two other tweets mentioning the report, and two other summaries. Now, admittedly, some of the discussions in the paper are fairly technical discussions of statistical techniques. But it doesn’t take an advanced understanding of statistics to understand the problem of using a single source for data, and to understand how research results could potentially be inaccurate if the data themselves are not completely accurate. We should pay attention to what the ILO is saying in these reports, because its findings are very disturbing.