By Adolfo Guzman-Lopez | KPCC | http://bit.ly/wy2c3U
Jan. 4, 2012 | A Washington, D.C.-based think tank on Wednesday released its list of the nation’s most influential education scholars. The rankings consider a blogosphere presence a must.
In baseball a player who can run, field, throw, hit and hit with power is valuable indeed. The American Enterprise Institute’s rankings of education scholars measure mentions in newspapers, blogs and publications. Scholars also receive a Google score for articles, books and papers cited online, and an Amazon ranking.
Number one on the list of influential education scholars is Stanford’s Linda Darling-Hammond; number two is New York University’s Diane Ravitch, whose book, “The Death and Life of the Great American School System,” is a favorite among public school teachers.
●● smf: List follows
Southern California scholars show up at number 11 (Gary Orfield, UCLA), 58 (Julian Betts, UCSD) and 61 (Eva Baker, UCLA). The American Enterprise Institute says the ranking is meant to push universities to encourage scholars to get their ideas and research to a lay audience and policymakers.
A lot of this research, the conservative think tank says, is falling on deaf ears these days as presidential candidates focus on the economy in their political platforms.
●● smf: This is a slow education news week!
The list is like the US News &amp; World Report lists of the “Best” Colleges/High Schools/Skateboard Parks; the who’s-in/who’s-out of American Idol, the People’s Choice Awards, or the bizarre formulae that determine the Bowl Championship Series rankings: It’s entertainment, not scholarship.
There are names of actual influential education scholars conspicuously missing from the list (Deborah Meier, UCLA’s John Rogers, Alfie Kohn, etc.) - let alone the think-tankers and foundation types who actually run the show. Or the bloggers and Ed press writers. (This list is the work product of a blogger …with too much time and not enough content on his mind!)
It’s interesting that Michael Kirst (#20) is there – he’s actually in a policy-making position as President of the CA State Bd of Ed …but that didn’t earn him any points in the ranking!
…as a cranky commenter complained in the EdWeek comments: Why weren’t Twitter retweets factored in? Cheese Louise!
The 2012 RHSU Edu-Scholar Public Presence Rankings
January 4, 2012 7:46 AM :: Today, RHSU (Rick Hess Straight Up) unveils the 2012 Edu-Scholar Public Presence rankings. The metrics, as explained yesterday, are designed to recognize those university-based academics who are contributing most substantially to public debates about schools and schooling. The rankings offer a useful, if imperfect, gauge of the public impact edu-scholars had in 2011, factoring in both long-term and shorter-term contributions. The rubric reflects both a scholar's body of academic work--encompassing books, articles, and the degree to which these are cited--and their 2011 footprint on the public discourse. The table above reports the 2012 rankings.
The scores reflect, in roughly equal parts, three things: articles and academic scholarship, book authorship and current book success, and presence in new and old media. (See yesterday's post for the specifics.) The point of measuring quotes and blog presence is not to tally sound bites but to harness a "wisdom of crowds" sense of a scholar's footprint on the public debate--whether that's due to their current scholarship, commentary, larger body of work, media presence, or whatnot. We worked hard to be careful and consistent, but there were inevitable challenges in determining search parameters, dealing with common names or quirky diminutives, and so forth. Bottom line: this is a serious but inevitably imperfect attempt to nudge universities, foundations, and professional associations to consider the merits of doing more to cultivate, encourage, and recognize contributions to the public debate.
The top scorers? All are familiar edu-names, with long careers featuring influential scholarship, track records of comment on public developments, and outsized public and professional roles. In order, the top five were Linda Darling-Hammond, Diane Ravitch, Eric Hanushek, Larry Cuban, and Richard Arum. Darling-Hammond and Ravitch lapped the field, cracking 200 points on a scale where only a handful of scholars topped 100. Rounding out the top ten were Terry Moe, Paul Peterson, Pedro Noguera, Daniel Koretz, and David Cohen. Notable, if not too surprising, is that the top ten are all veteran, accomplished scholars. This reflects the nature of the scoring, which heavily weights the influence of a scholar's body of work and not simply whether a scholar collected a bunch of press clippings or blog mentions in 2011.
Stanford University fared very well, claiming three of the top five scholars (and six of the top fifteen). Harvard University claimed four of the top fifteen, and NYU claimed another three.
By category: Darling-Hammond posted the top Google Scholar score, at 83; Cuban topped the books category at 37.5; Ravitch topped the Amazon rankings with a 19.7; she also posted the high score in the education press category, at 41.5; twelve scholars topped the blog mentions by maxing out at 50 points (although, without the cap, Hanushek would have taken the prize quite handily); and Arum topped the general press mentions with a 26.8.
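The capping described above (blog mentions maxing out at 50 points) can be sketched in code. This is a minimal illustration, not the actual RHSU formula: the function name, parameter names, and the assumption that categories simply sum are mine, and the only inputs drawn from the post are the reported category high scores.

```python
def composite_score(google_scholar, books, amazon, ed_press,
                    blog_mentions, general_press, blog_cap=50.0):
    """Hypothetical composite: categories sum, but blog mentions are capped.

    Only the cap (50 points on blog mentions) is described in the post;
    everything else here is an illustrative assumption.
    """
    return (google_scholar + books + amazon + ed_press
            + min(blog_mentions, blog_cap) + general_press)

# A scholar with 120 raw blog mentions gets credit for only 50 of them,
# which is why twelve scholars could tie at the category maximum.
score = composite_score(83, 37.5, 19.7, 41.5, blog_mentions=120,
                        general_press=26.8)
```

Under this sketch, extra blog mentions beyond the cap add nothing, which matches the post's note that Hanushek "would have taken the prize quite handily" without the cap.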
A number of top scorers, like Ravitch, have books of recent vintage. For instance, among the top ten, just in the past two years, Moe published Special Interest, his unflinching critique of teacher unions; Darling-Hammond published The Flat World and Education; Peterson published Saving Schools; Cohen published Teaching and Its Predicaments; and Noguera published Creating the Opportunity to Learn. And Arum doubtless benefited from the continuing outsized impact of his oft-cited Academically Adrift.
As with any such ranking, this exercise ought to be interpreted with appropriate caveats and caution. That said, it's revealing that a number of sober, less-controversial scholars--like Arum, Cohen, Dan Koretz, and Bob Pianta--dotted the top twenty. Meanwhile, less senior scholars who punched above their weight included Roland Fryer, Sara Goldrick-Rab, and Patrick McGuinn.
Given that professional norms vary (note that few economists crack the top twenty), it's interesting to eyeball the results discipline by discipline (admittedly, there's a bit of fuzziness when it comes to pigeonholing some scholars). The top-ranked economists were Hanushek, Hoxby, Roland Fryer, Hank Levin, and Tom Kane. The top-ranked political scientists were Moe, Peterson, Richard Elmore, Mike Kirst, and Bruce Fuller. The top-scoring sociologists were Arum, Noguera, Gary Orfield, Adam Gamoran, and Tony Bryk. Top scorers in the area of teacher education and curriculum and instruction were Darling-Hammond, Gloria Ladson-Billings, David Berliner, Ken Zeichner, and Carol Tomlinson.
The emphasis accorded to an established body of work advantages senior scholars at the expense of junior academics. And, given that the ratings are a snapshot of 2011, the results obviously favor scholars who penned a successful book or big-impact study this year. But both of these also accurately reflect how thinkers can disproportionately impact public discussion--so I'm disinclined to see problems in such a "bias."
There's also the challenge posed by bloggers like Jay Greene, Goldrick-Rab, Bruce Baker, and Sherman Dorn, whose own blogging or think tank critiques mean that they are publishing with great frequency. The key: the aim was not to measure how much a scholar writes, but how much resonance their work has. Flagging blog entries and newspaper mentions in which a scholar is identified by university affiliation here serves a dual purpose: avoiding confusion caused by common names while also ensuring that scores aren't unduly padded by a scholar's own blogging (since those posts generally don't include an affiliation). If bloggers are provoking discussion, the figures will reflect that. If a scholar is mentioned sans affiliation, that mention is omitted here; but that's true across the board. If anything, that probably tamps down the scores of well-known scholars for whom university affiliation may seem unnecessary. C'est la vie.
If readers want to argue the relevance, construction, reliability, or validity of the metrics, I'll be happy as a clam. I'm not sure that I've got the measures right, that categories have been normed in the smartest ways, or even how much these results can or should tell us. That said, I think the same can be said about U.S. News college rankings, NFL quarterback ratings, or international scorecards of human rights. For all their imperfections, I think such efforts convey real information--and help to spark useful discussion. That's what I've sought to do here.
I'd welcome suggestions regarding possible improvements--whether that entails adding or subtracting metrics, devising smarter approaches to norming, or what have you. I'd welcome critiques, concerns, questions, and suggestions. Take a look, and have at it.