
“The Power of Dialogue on Nonprofit Data and Evaluation.”

Calvin and Hobbes do a happy dance

This is a blog article about a blog article. I’m doing a happy dance, because the Foundation Center’s GrantSpace blog has published my article on “The Power of Dialogue on Nonprofit Data and Evaluation.”

Please feel free to read the article and give me your feedback!

How grant makers and nonprofit grant recipients can do great things together with data and evaluation

This is not actually a photo from the dialogue series. We refrained from taking photos, because we wanted to foster an atmosphere of candor and comfort as grantors and grantees engaged in conversation about a difficult topic. However, it is a favorite photo from another recent Tech Networks of Boston event.

 

Oh, my!  It took Tech Networks of Boston almost two years to organize and implement a series of candid dialogues about data and evaluation for grantors and nonprofit grantees, and now it’s complete.  The process was a collaboration in itself, with TSNE MissionWorks and Essential Partners serving as co-hosts. An advisory group and a planning group gave crucial input on the strategy and tactics for the series.

What you see here are a few notes that reflect my individual experience. In this article, I am not speaking on behalf of any organization or individual.

As far as I can ascertain, this series was the first in which grant makers and nonprofit grant recipients came together in equal numbers and met as peers for reflective structured dialogue. World-class facilitation and guidance were provided by Essential Partners, with the revered Dave Joseph serving as facilitator-in-chief.

Here’s how I’d characterize the three sessions:

  • June 2017:  Let’s get oriented. What is the heart of the matter for grantors and grantees?
  • September 2017:  You know, we really need to address the imbalance of power in the grantor/grantee relationship.
  • January 2018:  OK, can we agree on some best practices for how to address this as grantors and grantees? Why, yes. We can.

The plan is to make the recommendations that came out of the final dialogue publicly available online, to provide a starting point for a regional or even national conversation about data and evaluation.

Meanwhile, I’d like to offer my own recommendations.  Mine are based on what I learned during the dialogue series, and also on untold numbers of public and private conversations on the topic.

 

_____________________________________________________________________________

 

My Recommendations

 

Funders can help by: 

  • Understanding that nonprofits perceive funders as having not just money but also much more power.
  • Asking nonprofits to define their goals, their desired outcomes, and their quantitative measures of success – rather than telling them what these should be.
  • Factoring in the nonprofit organization’s size, capacity, and budget – making sure that the demand for data and evaluation is commensurate.
  • Understanding the real cost in dollars to grantees who provide the data reporting and evaluation that you request.  These dollar amounts might be for staff time, technology, training, an external consultant, or even for office supplies.
  • Providing financial support for any data or evaluation that the funder needs – especially if the nonprofit does not have an internal need for that data or evaluation. Items to support might include staff time, technology, training, or retaining an external consultant with the necessary skill set.
  • Putting an emphasis on listening.

 

Nonprofits can help by: 

  • Engaging in a quantitative analysis of their operations and capacity, and sharing this information with funders.
  • Understanding that grant makers are motivated to see nonprofit grant recipients succeed.
  • Understanding that grant makers are often under pressure from donors and their boards to deliver a portfolio of outcomes.
  • Integrating the use of data and evaluation into most areas of operation – this means building skills in data and evaluation across the entire organization.
  • Gathering with other nonprofits that have similar desired outcomes and comparing notes on failures and best practices.
  • Fostering a data-friendly, continuous learning culture within nonprofit organizations.

 

Both groups can help by: 

  • Engaging in self-scrutiny about how factors such as race and class affect how data is collected, categorized, analyzed, and reported.
  • Talking frankly about how power dynamics affect their relationships.
  • Engaging in ongoing dialogue that is facilitated by a third party who is experienced in creating a safe space.
  • Talking about and planning the evaluation process well before the grant begins.
  • Creating clear definitions of key terms pertaining to data and evaluation.
  • Making “I don’t know” an acceptable response to a question.
  • Measuring what you really value, rather than simply valuing what you can easily measure.
  • Working toward useful standards of measurement.  Not all programs and outcomes are identical, but very few are entirely sui generis.
  • Sharing responsibility for building the relationship.
  • Speaking with each other on a regular basis.
  • Studying (and implementing) community-based participatory research methods.

 

_____________________________________________________________________________

 

And now, because I can insert a poll here, I’m going to.

 

 

 

_____________________________________________________________________________

 

And now, because I can insert a contact form here, I’m going to.  Please feel free to let me know if you’re interested in being part of a regional or national conversation about how grantors and grantees can move forward and work constructively with data and evaluation.

 

 

_____________________________________________________________________________

 

Creative Commons License
Some rights reserved. This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.

 

_____________________________________________________________________________

How grantmakers and nonprofits use information about outcomes

State of Evaluation 2012: Evaluation Practice and Capacity in the Nonprofit Sector, a report by the Innovation Network

I’m sitting here, reflecting on the Innovation Network’s “State of Evaluation 2012” report.

I encourage you to download it and read it for yourself; start with pages 14 and 15. These two pages display infographics that summarize what funders (also known as “grantors,” or if you’re Bob Penna, as “investors”) and nonprofits (also known as “grantees”) are reporting about why they do evaluation and what they are evaluating.

Regardless of whether you call it evaluation, impact assessment, outcomes management, performance measurement, or research – it’s really, really difficult to ascertain whether a mission-based organization is delivering the specific, positive, and sustainable change that it promises to its stakeholders. Many organizations do an excellent job at tracking outputs, but falter when it comes to managing outcomes. That’s in part because proving a causal relationship between what the nonprofit does and the specific goals that it promises to achieve is very costly in time, effort, expertise, and money.

But assuming that a mission-based organization is doing a rigorous evaluation, we still need to ask:  what is done with the findings, once the analysis is complete?

What the aforementioned infographics from the “State of Evaluation 2012” tell me is that both grantors and grantees typically say that the most important thing they can do with their outcome findings is to report them to their respective boards of directors.  Considering the depth of the moral and legal responsibility that is vested in board members, this is a pretty decent priority.  But it’s unclear to me what those boards actually do with the information.  Do they use it to guide the policies and operations of their respective organizations?  If so, does anything change for the better?

If you have an answer, based on firsthand experience, to the question of how boards use this information, then please feel free to post a comment here.
