Wednesday, March 28, 2012

On IR Professionalization, Grad School, and What I Think I've Been Doing for the Past Four Years


Dan Nexon has written a lengthy diatribe opposing what he calls the "over-professionalization" of IR, particularly as it involves graduate student instruction. Over-professionalization seems to reduce to, basically, extensive training in neopositivist approaches (mostly but not only quantitative) to the study of international relations, with less space for idiosyncratic approaches. Nexon believes this has led to, and is leading to, an impoverishment of IR theory. It's well worth reading, as is Erik Voeten's critique, as are the comments on both posts.

No one has responded to Nexon from my perspective -- a grad student about to go on the job market -- and I've been thinking a decent amount about these issues lately, so I'm going to chime in. Before I get into more specific discussion, let me note that it's been very interesting to watch the ways in which my approach to grad school has evolved over the past four years, and the ways in which my fellow grad students have taken different approaches to their time in grad school. For me, having an academic career is only valuable if I can do it more or less my way. I'm not a "careerist" in that sense... if what it takes to get a job at a good school is to do a bunch of work that I don't enjoy or find interesting, then I don't want a job at a good school. I'd much rather have a place at a lesser school or in a non-academic setting. Therefore, I'm somewhat risk-acceptant regarding both my professionalization (evidence: my willingness to pick blog fights with tenured faculty) and my research program (which carries a nonzero chance of being an abject failure). Some of my fellow graduate students don't seem to feel the same way, and they make their choices accordingly. There's nothing at all wrong with that; it's just not the approach I'm comfortable taking.

At this point I have almost no idea what the values of "the discipline" at large are, and I'm somewhat skeptical that there is any such thing.* It seems to depend very much on which departments we're talking about, and even then it's somewhat contingent upon the balance of recent hires/turnover within departments. Same with the journals. While different journals (naturally) tend to publish different types of work, it's not clear whether that is because authors are submitting strategically, editors are dedicated to advancing their preferred research paradigms, both, or neither. There are so many journals that any discussion of them as doing any one thing -- or privileging any one type of work -- seems like painting with much too wide a brush. Nevertheless, on to Nexon's claims.

Perhaps ironically, I think the phenomenon that Nexon is trying to describe (to the extent that it exists at all) is best understood using language taken from formal bargaining theory. That is, in an environment where competition for any decent academic job is fierce -- much less a so-called "top tier" job -- there is no possible way that grad students will not be "over-professionalized". Grad students will always be incentivized to game the job market as best they can, and their advisors and departments are incentivized to help them do it. The job market rewards graduate students who can strongly signal an ability to maintain a productive research agenda into the indefinite future. Even if the efficient outcome is a less professionalized discipline (if, e.g., it leads to more interesting theorizing) -- and Nexon does little to establish that it is (more below) -- those who defect and become more professionalized are more likely to get jobs, because they will be the ones who can better signal their quality to hiring departments. In politics it is often said that you can't enact the policy without being elected; likewise, you can't have a long career doing interesting research in IR without first getting hired somewhere. Getting a job is therefore the first goal of every student, advisor, and department. There is no other measure of success, nor can there be.
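
To make that logic concrete, here's a minimal sketch in Python of the stylized game I have in mind. Everything in it is hypothetical: the payoff numbers are invented purely for illustration, and only their ordering matters. That ordering is what makes professionalizing a dominant strategy even though mutual restraint would be collectively better.

```python
# A stylized two-candidate "professionalization game". All payoffs are
# hypothetical; only their ordering matters. (restrain, restrain) is the
# collectively efficient outcome, but professionalizing dominates.

PAYOFFS = {
    # (my choice, their choice): (my payoff, their payoff)
    ("restrain", "restrain"): (3, 3),                # space for theory
    ("restrain", "professionalize"): (0, 4),         # they out-signal me
    ("professionalize", "restrain"): (4, 0),
    ("professionalize", "professionalize"): (1, 1),  # arms race
}

def best_response(their_choice):
    """Return my payoff-maximizing choice given the other candidate's choice."""
    return max(("restrain", "professionalize"),
               key=lambda mine: PAYOFFS[(mine, their_choice)][0])

for theirs in ("restrain", "professionalize"):
    print(f"If they {theirs}, my best response is to {best_response(theirs)}")
# Both lines print "professionalize": it's a dominant strategy, so the
# inefficient (1, 1) outcome is the unique equilibrium of this game.
```

The point isn't the numbers; it's that no amount of exhortation changes a player's best response when the payoffs are set by hiring committees.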

From where I sit, as a grad student there appear to be only so many ways you can signal your value to hiring committees that receive tons of applications for each job -- particularly if you're coming from a department outside of the top 10. One way is to demonstrate mastery over difficult material and/or techniques for inquiry that will be useful in conducting research over a long career. Another is by having work published -- indicating that you actually can do research -- and in progress -- indicating that you don't have only one decent idea. A third is getting well-regarded scholars to write strong letters on your behalf. For the first, taking an extra methods course adds to the "toolkit" in ways that taking an extra class out of your major field does not. Or hell, even within your major field, if it's on a topic that isn't directly relevant to your dissertation. Sure, the substantive class will expose you to new ideas, but nothing's stopping you from reading those books on your own. The value added of discussing the substantive work in seminars with other grad students who only had time to skim the reading is almost surely lower than the value of spending that time learning a new method. Having more training in more methods gives you a greater ability to ask a wider range of questions and a greater ability to answer them. We're always being told not to let method determine the research question, but if you only know one method you have no other option. Moreover, without some methodological chops it is difficult to conduct research that is publishable (the second signal) or that is interesting enough to excite jaded faculty who are writing letters for you (the third). In short, it's important for graduate students in political science to be able to do political science. Methods training is a prerequisite for that.

Unlike Nexon, I don't see this as a very bad thing.** I've taken more methods classes in my graduate education than substantive classes. I don't regret that. I've come to believe that the majority of coursework in a graduate education, in most disciplines, should be learning methods of inquiry. Theory development should be a smaller percentage of classes and (most importantly) come from time spent working with your advisor and dissertation committee. While there are strategic reasons for this -- signaling to hiring committees, etc. -- there are also good practical reasons for it. The time I spent on my first few substantive classes was little more than wasted; I had no way to evaluate the quality of the work. I had no ability to question whether the theoretical and empirical assumptions the authors were making were valid. I did not even have the ability to locate what assumptions were being made, or to see why it was important to know what they were. Those questions, which are central to any study in political science (or should be, anyway), can only be answered once one has some sense of how theory is constructed, how models perform, how data is collected and analyzed, and why every choice made by the analyst is important. Coming into grad school I didn't have that ability despite having been fairly comfortable with basic statistics (what Nexon lumps into GLR -- "general linear reality") and models as an undergraduate economics major. (In my opinion it's still a weakness for me, and for nearly everyone I come across in academia.)

The fact remains: students at architecture schools are not asked to design a skyscraper on the first day. Someone taking their first saxophone lesson is not encouraged to abandon traditional chord structures in favor of free improvisation. Nor would it be productive for most graduate students to be "thinking outside the box" during their early years in graduate school. At that point I didn't even know where the box was. I couldn't possibly have been any less professionalized, and so I was incapable of producing anything of theoretical or substantive value. I don't think a single one of my grad student colleagues was any different.

More importantly, I don't think that learning a bunch of methods closed off avenues of research for me; on the contrary, it opened them up! Learning that not every real-world variable has a Gaussian distribution, that assumptions regarding the data-generating process are very important, that (yes) not every process in the world is linear, that bias can come in all shapes and sizes (regardless of what method one uses, and I don't just mean quant here)... this is important knowledge to possess! I could not acquire it without "professionalizing" myself.
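
For a hypothetical illustration of why this matters, here's a short numpy sketch. The data-generating process is invented for the example (nothing here comes from any real study): fit the usual straight line to data that are nonlinear with heavy-tailed noise, and the residuals immediately betray the misspecification.

```python
import numpy as np

rng = np.random.default_rng(42)

# An invented data-generating process that violates "general linear
# reality": y depends on x nonlinearly, and the noise is log-normal
# (heavy-tailed), not Gaussian.
x = rng.uniform(0, 10, size=1000)
y = np.exp(0.4 * x) + rng.lognormal(mean=0.0, sigma=1.0, size=1000)

# Fit the standard straight line anyway (OLS via polyfit).
slope, intercept = np.polyfit(x, y, deg=1)
resid = y - (slope * x + intercept)

# The linear fit under-predicts at both ends of x and over-predicts in
# the middle -- exactly what assumptions about the DGP should flag.
masks = {"low x": x < 2, "mid x": (x > 4) & (x < 6), "high x": x > 8}
for label, mask in masks.items():
    print(f"mean residual at {label}: {resid[mask].mean():+.2f}")
```

Nothing in the fitted line's coefficients screams "wrong model"; you have to know to interrogate the residuals and the distributional assumptions, and that knowledge is exactly what methods training buys you.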

So with the caveat that he has much more experience and knowledge than I do, I don't think I can agree with Nexon's core argument: that the professionalization of graduate students has led to a lack of interesting theorizing. I believe a decent amount of professionalization is a prerequisite for understanding how good theories are constructed and how they may be evaluated. I do think that there is not all that much interesting theory being produced in IR currently, but I think that's because it's really hard to come up with original theory. Not just in IR... theoretical development moves slowly in all sciences. Most of the work is in pushing at the edges of existing theory in more-or-less straightforward empirical ways.*** It takes something approaching genius to develop original theories, and let's face it... most graduate students (and professors, practitioners, and laypeople) just don't possess the necessary quality, whether they've been professionalized or not.

If there is one thing that I wish I had spent more time on in my graduate classes, it would be taking the philosophy of science more seriously. I suspect, from other things he's written, that Nexon would agree. Other than the first-semester scope and methods class -- which came too soon to be of much use, and hardly seems to be conducted rigorously anywhere -- the attention paid to the philosophy of science has been spotty, sporadic, and mostly ad hoc. This is, I believe, a problem with social science more generally, not just with graduate instruction at the top 20 (or so) IR grad programs. But such a focus would almost necessarily lead to more professionalization rather than less.

*When we discuss what "IR" wants in terms of hiring decisions, it seems relevant to note that hiring decisions are generally made by departments as a whole, not just by the IR people. I'm sure that in many cases faculty from other subfields will defer to the IR people when making IR hires, but I imagine there's quite a lot of variation there.

**I'm not sure he does either. He seems to be saying different things at different times. I think what he'd like to say is that strong methods training would be a very good thing if it didn't lead to the sort of professionalization that he doesn't like. But he doesn't quite say that, and I can't tell whether it's really true.

***On a somewhat related point, tangential to my post as well as Nexon's: if I were more familiar with the literature, I think I'd like to write a post arguing that political "science" must mean something specific. Political practice is different from political science, and I think there is a lot of use in keeping the distinction between normative and positive theory intact. The falsifiability criterion seems to me to be as good as any. I think there are good reasons for arguing that certain types of theory are not political "science" without diminishing their utility for people who are interested in politics, theories about the world, etc. The word "science", if it is to have any meaning, must distinguish between styles of inquiry based on their approach, their methodology. I'm not sure where that line falls, but it must exist.

9 comments:

Zachary Jones said...

Definitely agree about the philosophy of science part. Really liked the essay. I definitely (still?) didn't know what I was doing in my first year.

Simon Frankel Pratt said...

I'm going into a phd programme in political science (U of T) this upcoming fall, and I'm quite interested in this ongoing debate. I have almost no ability to use and abuse statistical methods, though I'm quite conversant in my philosophy of science. I've got a couple publications in decent but field-specific journals (terrorism-related) but these are all based off of fairly simple historiographical and comparative work.

I worry that I'll end up some awful sort of theorist whose obtuse work bears no relevance to anyone other than those within my own echo-chamber.

I will take methods courses like they are going out of style. Why is that even a figure of speech?

Dan Nexon said...

Nice critique. Sorry I didn't see it sooner. I'd go further: we're in real danger of a mixed-methods equilibrium trap in which people combine statistical and narrative inference for no good reason other than getting a job.

The difficulty that I think you are having is the same one that Erik had: understanding that my argument is about configurational effects. So there's no tension in my saying that there should be more methods training and saying that graduate students should be given more space to think and less pressure to produce publishable work. Right! Now! It is when you combine lots of emphasis on relatively narrow methods training, hubris about the nature of "social science," and pressure to professionalize as quickly as possible that theorization suffers.

I'm troubled by your claim that you had no ability to evaluate substantive courses before you had lots of "methods training." Plenty of work stipulates the terms by which its authors intend it to be judged. Not all work can be best judged by the kind of training you discuss. Indeed, if you think it can, that's probably indicative of what I'm concerned about.

And yes, get as much training as possible in grad school. It gets much harder to find time later in your career.

AnonProf said...

I have to agree with Dan Nexon, that comment about not being able to judge substantive work or even do things like identify the assumptions in that work without having had a lot of stats training seemed odd. I've had quite a bit of stats myself but the problems with inference in an article far more often lie in the basics of research design, defining and measuring variables, or whether a quantitative approach makes any sense than they do in the details of model estimation.

Unless of course we all decide that we're more concerned with playing games with models and datasets than real-world inference.

I'm reminded of a junior colleague's recent project, someone just out of a top program with a heavy methods emphasis. He was doing some interesting, even innovative stats work, but with some really fundamental problems in whether his empirical data actually measured the concepts of interest. Sure enough, when he presented it the audience nodded at the stats and then beat him bloody on conceptual mismatches and measurement problems in his variables. His reply was basically to shrug and say they were probably right, but nonetheless this dataset was "what everyone uses". And he's right on that, within the JCR/JPR/CMPS/etc. community. What struck me as illustrating Nexon's point was a real sense from this new guy of, "Look, people, how will I get my manuscript submitted this semester if I can't just 'use what everyone uses'?"

Kindred Winecoff said...

Dan -

The problem is that the incentive structure in the discipline does not allow for both "more methods training" and "less pressure to publish". If you learn methods that will help you publish, then using them to publish is a dominant strategy. Moreover, there's no good way to signal that the time you spent not publishing was actually spent thinking in a way that will lead to better future theorizing.

Again, it's about signaling. And I don't think there's any way to change the structure of this game.

AnonProf -

Thanks for commenting. I'm not sure I understand what you're saying. For example: "the problems with inference in an article far more often lie in the basics of research design, defining and measuring variables, or whether a quantitative approach makes sense" to me sounds like a lack of good methods training. I'm not *just* referring to "the details of model estimation" when I refer to methods training, which is why I specifically mentioned things like assumptions regarding the data-generating process, knowledge about non-Gaussian distributions, taking potential bias seriously, etc. That's all part of methods, to me, as are the things you identified as being problematic.

So your colleague who had "fundamental problems in whether his empirical data actually measured the concepts of interest" was making a methodological mistake, not a theoretical mistake. It sounds like the audience picked up on that and criticized his methodological approach rather than his theoretical argument.

See what I mean?

AnonProf said...

Kindred--

In my example, what was interesting was that this overwhelmingly non-quantitative audience focused on the fact that the easy-to-quantify dependent variable my colleague was using is a bad proxy for the underlying concept of interest. I agree. But that hasn't stopped a number of published articles in JCR, ISQ, and such from using the same (bad) measure, and indeed my colleague remains confident that reviewers for the journals he'd submit to will understand that "this is what people use".

This gets to Dan's larger point, in that GOOD methods training would indeed fight against that. But I worry that the pressures of narrow professionalization lead to poor methods training and poor methods evaluation, where we collectively suspend disbelief and applaud work that applies a new estimation tweak to the usual X from POLITY correlated with Y from MIDS, because that's the kind of work that lets you crank out enough articles per year. Studies that use "better" methods might yield better results from a scientific standpoint, but because those studies would be more difficult and, especially, more time-consuming (e.g., collecting original data), they produce a lower publication count, and so that way is an inferior career strategy.

Kindred Winecoff said...

Anon -

I took Dan's point to mean that professionalization was leading to a lack of good *theory*, not to poor execution of methods used to evaluate theory. What you're suggesting is something else, which I'd characterize as "We need *more* professionalization so that we stop seeing so many crap methods". I might agree with that but it's a different kind of argument.

I do agree that the discipline incentivizes article count over quality. (The quality is *supposed* to be ensured by peer review.) It also incentivizes innovative new approaches and new ways of thinking, but there's a higher risk/reward there. I think Dan's career so far demonstrates that. If you can do it well then you'll do well in the discipline, as he has. But not everybody can do that sort of work well (particularly early in their careers), and for them it's less risky to tweak POLITY on MIDS.

I think there's some value in the latter too. I'm very much okay with purely empirical studies of the sort that Voeten is describing in his Monkey Cage post.

