Today, we unveil the 2013 RHSU Edu-Scholar Public Presence rankings. The metrics, as explained yesterday, are designed to recognize those university-based academics who are contributing most substantially to public debates about K-12 and higher education.
The top scorers are familiar edu-names with long careers, bodies of influential scholarship, track records of commenting on public developments, and outsized public and professional profiles. In order, Linda Darling-Hammond and Diane Ravitch are tied for first (in the case of ties, scholars are listed alphabetically by last name), followed by Howard Gardner, Rick Hanushek, and Paul Peterson. Rounding out the top ten are Larry Cuban, Gary Orfield, Yong Zhao, Richard Elmore, and Tony Wagner. These results reflect the nature of the scoring, which recognizes the influence of a scholar's body of work and not simply whether a scholar garnered press clippings or blog mentions in 2012.
With that said, here are the 2013 rankings:
[Chart: the full 2013 RHSU Edu-Scholar Public Presence rankings]
The RHSU Edu-Scholar Rankings are restricted to university-based researchers and exclude think tankers (e.g., Checker Finn or John Chubb) whose job description is to influence the public discourse. After all, the point is to nudge what is rewarded and recognized at universities. (The term "university-based" provides a bit of useful flexibility. For instance, Tony Bryk currently hangs his hat at Carnegie. However, he is an established academic who retains a university affiliation and campus digs. So he's included.)
In calculating scores, we sought to be careful and consistent. That said, there were inevitable challenges in determining search parameters, dealing with common names or quirky diminutives, and so forth. Bottom line: this is a serious but inevitably imperfect attempt to nudge universities, foundations, and professional associations to consider the merits of doing more to cultivate, encourage, and recognize contributions to the public debate.
In terms of affiliations, Harvard fared impressively, claiming two of the top five spots and six of the top 20. Stanford also claimed two of the top five and four of the top 20, while the University of Virginia and NYU each nabbed two top 20 spots. Other institutions with faculty placing in the top 20 were UCLA, the University of Oregon, Columbia (Teachers College), the University of Wisconsin, Johns Hopkins, UC Berkeley, and Arizona State.
A number of top scorers have penned influential books of recent vintage. In just the past two years, Terry Moe published Special Interest, Paul Peterson Saving Schools, Howard Gardner Multiple Intelligences, Daniel Willingham Why Don't Students Like School?, David Cohen Teaching and Its Predicaments, Yong Zhao World Class Learners, Pedro Noguera Creating the Opportunity to Learn, Tony Wagner Creating Innovators, and Richard Murnane Whither Opportunity.
As with any such ranking, this exercise ought to be interpreted with appropriate caveats and caution. Given that the ratings are a snapshot of 2012, the results obviously favor scholars who penned a successful book or big-impact study this year. But that's how the world works. And that's why we do this every year.
Because of the scoring caps, a number of scholars hit the ceiling in one category or another:
➢ Maxing out on Google Scholar were Darling-Hammond, Gardner, Hanushek, Levin, Robert Slavin, Bob Pianta, Helen Neville, David Berliner, Kenneth Zeichner, John Bransford, Deborah Ball, Marilyn Cochran-Smith, Lynn Fuchs, and Camilla Benbow.
➢ When it came to book points, Ravitch, Gardner, Nel Noddings, Cuban, and Peterson each maxed out. Ravitch scored the highest Amazon ranking at 19.7, as well as the highest Klout score at 8.2.
➢ With regard to mentions in the education press, only Ravitch hit the cap, while Ravitch, Darling-Hammond, Gardner, Hanushek, Cuban, Wagner, Pedro Noguera, and Roland Fryer each hit the cap when it came to blog mentions. A similar cast of characters maxed out on newspaper mentions--Ravitch, Darling-Hammond, Gardner, Hanushek, Fryer, and Jonathan Zimmerman.
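For readers curious how a capped composite works in practice, the mechanism described above can be sketched as follows. This is a minimal illustration only: the category names and cap values here are invented for the example and are not the actual RHSU categories or parameters.

```python
# Illustrative sketch of capped category scoring: each scholar earns raw
# points per category, but no category can contribute more than its cap.

CAPS = {  # hypothetical caps, NOT the actual RHSU values
    "google_scholar": 50,
    "book_points": 20,
    "blog_mentions": 30,
    "newspaper_mentions": 30,
}

def composite_score(raw_scores):
    """Sum each category's points after applying that category's cap."""
    return sum(min(raw_scores.get(cat, 0), cap) for cat, cap in CAPS.items())

# A scholar far over the Google Scholar cap earns only the capped 50 points,
# so a huge citation count can't swamp the other categories.
example = {"google_scholar": 72, "book_points": 20, "blog_mentions": 14}
print(composite_score(example))  # 50 + 20 + 14 = 84
```

The cap is what lets several scholars "max out" in a category, as the bullets above note: once the ceiling is reached, additional citations or mentions add nothing.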
If readers want to argue the relevance, construction, reliability, or validity of the metrics, I'll be happy as a clam. I'm not sure that I've got the measures right, that categories have been normed in the smartest ways, or even how much these results can or should tell us. That said, I think the same can be said about U.S. News college rankings, NFL quarterback ratings, or international scorecards of human rights. For all their imperfections, I think such efforts convey real information--and help spark useful discussion. That's what I've sought to do here.
I'd welcome suggestions for possible improvements--whether that entails adding or subtracting metrics, devising smarter approaches to norming, or what have you. I'd be interested in hearing your critiques, concerns, questions, and suggestions. So, take a look, and have at it.