February 6, 2013

New Hewlett Foundation Working Paper on Evaluation

The Hewlett Foundation has placed its cards on the table with admirable transparency as to how it seeks to carry out its work. Other foundations would help themselves and the field by following suit. Bridgespan partner Daniel Stid poses two questions for them to reflect on as they do so.

By: Daniel Stid

The William and Flora Hewlett Foundation has just released a working paper entitled "Evaluation Principles and Practices." With this document and a prior companion publication, "Outcome Focused Grantmaking: A Hard-headed Approach to Soft-hearted Goals," the Hewlett Foundation has—with admirable transparency—placed its cards on the table with respect to how it seeks to carry out its work. Other foundations would help themselves and the field by following suit.

(In the interest of disclosure, I should note that the Hewlett Foundation is a funder of The Bridgespan Group, and that Fay Twersky, who co-authored the paper with Karen Lindblom, is a friend from whom I have learned much about evaluation over the years.)

The first two principles articulated in the paper are especially noteworthy:

  1. "We lead with purpose. We design evaluation with actions and decisions in mind. We ask, 'How and when will we use the information that comes from this evaluation?' By anticipating our information needs, we are more likely to design and commission evaluations that will be useful and used. It is all too common in the sector for evaluations to be commissioned without a clear purpose, and then to be shelved without generating useful insights. We do not want to fall into that trap."
  2. "Evaluation is fundamentally a learning process. As we engage in evaluation planning, implementation, and use of results, we actively learn and adapt. Evaluative thinking and planning inform strategy development and target setting. They help clarify evidence and assumptions that undergird our approach. As we implement our strategies, we use evalu¬ation as a key vehicle for learning, bringing new insights to our work and the work of others."

These two principles are reciprocal. The first emphasizes that information gleaned from evaluations will be used for decisions and actions; the second, that the point of evaluation is learning, adaptation, and course correction. These two points alone, if systematically followed, would help foundations avoid what in my experience are the biggest pitfalls in this area: undertaking evaluations that, for all the expense and effort involved, have no practical impact, and treating an initial theory of change as sacrosanct rather than as something to be tested, refined, and strengthened with data and reflection.

The authors present the paper "not as a blueprint, but in a spirit of collegiality and an interest in contributing to others’ efforts and continuing our collective dialogue about evaluation practice." To honor this spirit, let me flag two questions that came to mind as I read it.

First, as foundations become more strategic and develop increasingly elaborate outcome goals, logic models, and evaluation plans, how can they resist the temptation to see and treat their grantees instrumentally, simply as means to the ends the foundation is advancing? The Hewlett Foundation has a track record and philosophy of providing general operating support when grantees' goals are highly aligned with its own, one major way for a foundation to help ensure that "the field" it needs this year will still be there when needed in years to come. But Hewlett is an outlier in using this approach.

Second, how can foundations mitigate the powerful incentives that grantees have to share information in ways that maximize the appearance of impact and ROI for the funding they have received, thereby undermining genuine learning, reflection, and improvement—for foundations and grantees alike? Hewlett started a trend among foundations a few years ago by publicizing at least some self-identified failures in grant-making. For grantees, however, acknowledging failure is typically not a recipe for securing more funding from foundations.

I’d welcome your comments on the Hewlett paper and the challenges I’ve flagged above.

This work is licensed under a Creative Commons Attribution 4.0 International License. Permissions beyond the scope of this license are available in our Terms and Conditions.