Articles and Posts from ISQ

Amid the teeth-gnashing by international relations (IR) scholars who worry that their discipline is becoming irrelevant, Paul Avey and Michael C. Desch wisely decided to ask policy officials whether and how academic research matters to their work. Avey and Desch's survey provides much-needed evidence for the growing consensus that the IR field is not giving policymakers what they want. Where Avey and Desch examine the demand side of the academic-policy relationship, data from the Teaching, Research, and International Policy (TRIP) Project allow us to scrutinize the supply side. The TRIP Project houses the results of four major surveys of IR faculty over eight years, as well as data (on methodology, issue area, paradigm, policy recommendations, and 23 other variables) on every article published in the field's twelve leading journals from 1980 to the present. [1] At first glance, TRIP research suggests that Avey and Desch are right: IR scholars are not producing the kind of research that policymakers say they want. But the data also suggest that the authors may be overstating the size of the gap between theory and practice; in fact, policymakers may be getting more of what they need than Avey and Desch allow.

The policymakers in Avey and Desch's study echo a number of critiques of the discipline. First, critics claim, the field is overly abstract (Gallucci, 2012; Nye, 2009; Walt, 2005). Such critics, however, often describe and disparage the IR field of the 1980s and 1990s and do not consider recent changes, including some that have led scholars to lament the death of grand theory (Mearsheimer and Walt, 2013). One possible test of such claims examines the percentage of articles in the major journals that explore features of IR or IR theory but include no significant empirical content. Rather than increasing, these theoretical ("analytical/non-formal") articles peaked in 1995 at 24% of all published work and have since declined to less than 10% in 2012. The dreaded paradigm wars are in retreat as well (Lake, 2013): while 57% of U.S. faculty surveyed by TRIP in 2011 described themselves as realist, liberal, or constructivist, 26%, the largest single group of respondents, said their work did not fall within any major theoretical school. This trend is accelerating: 55% of 25- to 34-year-olds describe their research as non-paradigmatic, compared to just 21% of 55- to 64-year-olds. The trend is even more striking in published IR articles: 65% were non-paradigmatic in 2012, compared to only 39% in 1980. When U.S. officials look to the academy for consultants, they draw heavily on realists (22% of consultants are self-described realists), but the largest group of academic advisors (26%) calls its work non-paradigmatic.

Second, policymakers in Avey and Desch's study seem none too happy with what others have called the "mathematicization" of the field (Miller, 2001). In reality, IR scholars are not overly mathematical in their approach: in 2011, 56% of U.S. respondents described their work as qualitative, only 23% said their primary approach was quantitative, and a scant 2% used formal models. Qualitative methods have, however, declined over time; in 2006, 69% of respondents used case studies and other qualitative approaches.

As Avey and Desch note, IR scholars share policymakers' views on the most useful methods for informing policy debates, with area studies and case studies leading the way and formal models and quantitative analyses bringing up the rear, but this consensus does not always inform academic publications. Between 1980 and 2012, 36% of articles used statistical approaches and 11% employed formal models, while only 34% used qualitative methods, despite the dominance of qualitative approaches among scholars themselves. Quantitative approaches gained ground steadily, overtaking qualitative methods in 2001. This trend will only accelerate: the median age of scholars whose primary approach is statistical is 46, compared to 52 for those who use either qualitative or formal methods.


Figure 1

If policymakers are not getting what they want from IR scholars in terms of research methods, they may be getting at least some of what they need. Among IR scholars who had consulted within two years of the 2011 survey, 55% primarily use qualitative methods and only 21% primarily employ statistics. Despite the overrepresentation of formal and quantitative methods in published research, in other words, U.S. government officials rely on qualitative scholars roughly in proportion to their representation in the discipline.

Finally, Avey and Desch join a chorus of voices (Gallucci, 2012; Kristof, 2014; Walt, 2005) suggesting that contemporary IR research is not sufficiently problem-driven. TRIP data show that IR scholars in the U.S. want their research to matter to policymakers: 92% say there should be more links between the academic and policy communities. Nevertheless, in 2011 only 23% described their work as applied, down from 31-35% in previous surveys. Even these numbers overestimate the policy relevance of most published work in the discipline's leading journals. Only 9% of all articles from 1980 to 2012 contain policy prescriptions, and the figure has declined from 16% in 1980 to less than 4% in 2012. Policy analysis is also in retreat: there is a significant gap between the percentage of scholars who say they do policy analysis and the representation of this method in the top journals, and the median age of scholars using policy analysis is 62, compared to 50 for the rest of the field.


Figure 2

In short, evidence of the gulf between the theory and practice of IR is mixed. Scholars, especially younger faculty, are relatively uninterested in doing applied research, and very little published work contains explicit policy prescriptions. At the same time, purely theoretical articles make up a small percentage of research, and a growing share of scholars and published work eschews paradigmatic analysis. While IR publications disproportionately use statistics, the field is not nearly as quantitative or formal as some scholars claim and policymakers fear. When officials turn to the academy for advice, moreover, they favor scholars who are qualitative and non-paradigmatic. Nor is all the news from the policy side bleak: 69% of policymakers believe that academic arguments provide useful intellectual background; more than 70% find academic books and articles useful or somewhat useful in their work; and 72% use scholarly arguments at least a few times a month, with 45% using them a few times a week or more. Even if most IR scholars do not intend their work to be policy relevant, and policymakers complain that it is not, it is clear that officials are getting some of what they need, if not all that they want, and that IR scholarship is influencing policy.

[1] The journal article database is not yet complete. The data presented here are based on 5,154 articles (issues 1, 2, and 3) from 1980-2012, or more than 72% of all IR articles. The survey data reported here include the results of the 2011 U.S. survey.

