Michael F. Shaughnessy
Senior Columnist, EdNews.org
Eastern New Mexico University

1) First of all, what seems to be wrong with current educational research?

It's really a tale of two cities. There is a body of valuable, careful, and insightful work that tackles pressing issues in useful ways. Unfortunately, I see a much larger body of activity that is ideological, poorly designed and executed, jargon-laden, and of little interest to anyone except the researcher conducting it.

There is good and important research being done on teacher quality, school leadership, assessment, pedagogy, educational finance, choice-based reform, and any number of other pressing concerns in education. There is also provocative and high-caliber work that may be less immediately relevant but that promises to provide new insights and pose important new questions. However, when one surveys the hundreds of peer-reviewed educational journals, one sees that many or most emphasize trivia and appear to have remarkably lax standards regarding the quality, rigor, and clarity of the scholarship that they opt to publish.
Whether that is due to inadequate attention to the review process, an unruly proliferation of lower-tier journals, the tenure and promotion requirements of faculty at hundreds of education departments and schools, insufficient training and preparation of too many scholars, or to some other factor, I really can't say.

2) Are you unfairly picking on education researchers?

Some have expressed concern that the piece unfairly picks on educational research, noting that every field has its embarrassments and that highlighting mediocre scholarship may undercut support for the high-quality work that is getting done. I see it differently. In fact, the desire to maintain peaceable relations, avoid conflict, and mind one's own business can make it easy for serious scholars and organizational leaders to turn a blind eye to silliness in their ranks. Frequently, it is only when a spotlight is shined on some of the hijinks that it becomes possible to muster the will to address it.

The kind of half-hearted monitoring that the column represents is indeed conducted in a number of fields—most famously with the Modern Language Association. Our intention is not to single out education researchers as a class but to bring to educational research the illumination that can prompt researchers and observers to take a good look at a seemingly unexceptional status quo. There are severe limits to how much this kind of exercise can do, but those of us who have faith in transparency and public debate believe that this kind of attention can prompt valuable discussion and reassessment.

3) Are you just having fun in this piece, or is there a serious argument here?

We obviously had fun with the piece, and the nature of the exercise dictates that the column not take itself too seriously. That said, there is a substantive argument here. As I just mentioned, we are trying to raise attention and change behavior, and there are some modestly encouraging signs on that front. I have received notes from a few scholars who were spurred by either last year's column or this year's to probe more deeply into finding some way to assess the quality of AERA presentations or of educational research more generally. Similarly, the 2006 column was credited by USA Today's Greg Toppo with helping to prompt his careful and extended article on the state of educational research during this year's AERA.

The aim is to focus on the larger issue rather than on personalities; to that end, we don't name presenters or authors and we avoid ridiculing any individuals. And we tried to write the piece with some affection, rather than the lacerating or ad hominem language that might be favored by those who do not regard themselves as members of the educational research community. The ultimate object is to chide all of us in the education research community to think more aggressively about policing the quality of the work being presented under our auspices.

4) Let's face it… Many conferences and conventions allow small-sample work, minuscule literature reviews, and case studies to be presented, with less than stellar conclusions. Is it simply a matter of money and propagation of the status quo?

It's really a question of incentives. In most cases, faculty members need to participate in a formal session in order for their universities to fund part or all of their research travel. The result is implicit pressure to create enough opportunities to accommodate all interested scholars in some form or other. Meanwhile, there is little countervailing pressure to police the quality of presentations or scholarship.

This state of affairs would be relatively harmless if it weren't for the fact that many schools and departments of education treat AERA presentations as significant evidence of academic accomplishment when it comes to awarding tenure, pay raises, and research support. It also has the unfortunate effect of diluting the more rigorous and substantial work that is being presented. We all understand that conferences are a useful opportunity for professionals in any field to network, connect with colleagues and collaborators, discuss the state of their profession, and catch up on new developments. This gathering process is a useful thing and valuable in its own right. The concern arises when that convening winds up providing a forum for troubling scholarship—especially when that scholarship seems reflective of much of the research in the field and not infrequently later shows up being published in peer-reviewed journals.

5) Whatever happened to large-scale studies conducted over time? Or are the "publish and present" demands simply too onerous for new faculty members?

Such work is indeed taking place. In fact, there is good reason to believe that there's more of it today, and that it's being pursued in a more methodologically sophisticated fashion than it was ten or twenty years ago. Unfortunately, the required resources are so substantial, the technical challenges so severe, and the timelines for such data collection so long, that this work can never address more than a fraction of the pressing questions we want to see addressed.
That's not necessarily a problem—simply a reality—but it does suggest just why intellectually serious and probing smaller scale work by education researchers can be so valuable.

6) Do you think the Internet has contributed to the state of affairs where just about anything, on any topic, can be "posted"?

The internet has had a really powerful impact on academic culture, and one that we haven't yet done a very good job of sorting out. On the whole, the internet has had two contradictory effects. On the one hand, it has greatly accelerated communication among scholars, made research much more widely available, created vast new opportunities for collaboration and scholarly exchange, and made it possible to disseminate data and findings more easily and successfully. On the other hand, it has enabled researchers to bypass the conventional "vetting" mechanisms that have sought to maintain quality control, and it has advantaged those individuals and institutions who are savvier, better equipped, and more focused on using the technology to market their ideas and research. Of course, the questionable state of quality control at AERA—and in many education journals—raises the question as to whether such gatekeeping is actually maintaining standards of quality, or whether it has more frequently served to advantage favored points of view and disadvantage others.

7) In the current educational climate, of how much value is a presentation, say, at the American Educational Research Association?

As I mentioned above, there are many schools and departments of education where an AERA presentation is regarded as a substantial accomplishment. For instance, I have observed or been contacted as part of more than a few tenure or hiring cases where AERA presentations were deemed credible evidence of scholarly activity. So, in the current climate, these presentations have real value in the academy.

Of course, in the article, I make the argument that this is problematic. Some of the work presented at AERA is important and significant. However, I would suggest that if all we know about a piece of work is that it was presented at AERA, then we know nothing about its quality, rigor, value, or fair-mindedness. As I suggested before, this would be more acceptable if it were not imagined that presenting at AERA conferred some degree of legitimacy or signaled some measure of credibility.

8) Are our doctoral level institutions really training doctoral students nowadays to do creative, innovative studies with quality methodology and rigorous statistics?

That's a great question. Certainly, there is good evidence that upper-tier economics, political science, sociology, and public policy programs are producing PhDs with quantitative skills and methodological sophistication that dramatically surpass those of earlier generations. This has been the pattern of the social sciences for several decades, and nothing has changed on that score. Whether some programs are emphasizing formal theory or econometric training to the degree that fewer graduates may have an aptitude for or interest in field work is a question some have posed. But I don't know that anyone has any good answers to that.

With regard to doctoral-level training in education, I'm in no position to pass judgment on the quality of instruction being offered at the hundreds of institutions offering education doctorates. I can say, however, that the education policy work I find most compelling consistently seems to be produced by young scholars trained in the disciplines. Whether that judgment is a product of my own tastes as a reader, self-selection on the part of doctoral candidates, the quality of preparation, or some other factor, I really can't say.

9) I believe that last year, I interviewed you about the sorry state of affairs in educational research. Have things gotten better, worse, or stayed about the same?

These things tend not to change much from one year to the next. When we're talking about educational research in America, we're talking about tens of thousands of individuals in thousands of academic, nonprofit, and government institutions. Movement in any direction is going to be extraordinarily slow and gradual.

That said, it is hard to know much about what kind of quiet changes may be happening. For all the attention that education researchers have devoted to critiquing measurement and assessment in schools, little effort has been devoted to determining empirically what research gets done and how rigorous or relevant it actually is. For instance, while the AERA annually releases data on the size of the conference and on the gender and race of attendees and presenters, no one in the profession makes any effort to consider the quality of the research being conducted. One hopes that my writing might, in a small way, contribute to encouraging some scholar or funder to initiate such an effort.

10) Given that there are so many relevant issues out there to be explored, why are conferences and conventions allowing minuscule topics of esoteric interest to be presented?

I am certainly not encouraging scholars to narrow the range of topics that they address. After all, we're remarkably bad at knowing what line of inquiry is going to foster new insights or what seemingly marginal field of study today might prove relevant tomorrow. Explorations of K–12 schooling in Saudi Arabia or Iraq might have seemed peripheral in 1996, only to prove a matter of urgent concern five years later. That said, I think that little of the work that we flag in the article could fit into that rubric. There are a number of potentially "esoteric" and qualitative lines of inquiry that I would not dream of poking fun at.

I'd love to see more evidence that scholars were asking hard questions and pursuing disciplined research on the effects of teacher gender on student outcomes, the impact of early sexual activity on learning, educational quality and equality in the former Soviet republics, strategies to build AIDS awareness in Africa, the content of instruction in madrassas, the extraordinary growth of post-secondary education in China, peer dynamics among the children of illegal immigrants, and any number of other questions that might seem far away from a focus on topics like teaching, learning, pedagogy, or assessment. My concern is not that seemingly esoteric topics are being tackled, but that these inquiries so often seem ideological, frivolous, poorly executed, and jargon-laden in order to obscure the fact that they have little or no substance to report.

Published April 23, 2007