Tag Archives: data

“The Power of Dialogue on Nonprofit Data and Evaluation.”

Calvin and Hobbes do a happy dance

This is a blog article about a blog article.  I’m doing a happy dance, because the Foundation Center’s GrantSpace blog has published my article on “The Power of Dialogue on Nonprofit Data and Evaluation.”

Please feel free to read the article and give me your feedback!

How grant makers and nonprofit grant recipients can do great things together with data and evaluation

This is not actually a photo from the dialogue series. We refrained from taking photos, because we wanted to foster an atmosphere of candor and comfort as grantors and grantees engaged in conversation about a difficult topic. However, it is a favorite photo from another recent Tech Networks of Boston event.

 

Oh, my!  It took Tech Networks of Boston almost two years to organize and implement a series of candid dialogues about data and evaluation for grantors and nonprofit grantees, and now it’s complete.  The process was a collaboration in itself, with TSNE MissionWorks and Essential Partners serving as co-hosts. An advisory group and a planning group gave crucial input about the strategy and tactics for the series.

What you see here are a few notes that reflect my individual experience. In this article, I am not speaking on behalf of any organization or individual.

As far as I can ascertain, this series was the first in which grant makers and nonprofit grant recipients came together in equal numbers and met as peers for reflective structured dialogue. World-class facilitation and guidance were provided by Essential Partners, with the revered Dave Joseph serving as facilitator-in-chief.

Here’s how I’d characterize the three sessions:

  • June 2017:  Let’s get oriented. What is the heart of the matter for grantors and grantees?
  • September 2017:  You know, we really need to address the imbalance of power in the grantor/grantee relationship.
  • January 2018:  OK, can we agree on some best practices for how to address this as grantors and grantees? Why, yes. We can.

The plan is to make the recommendations that came out of the final dialogue publicly available online, to provide a starting point for a regional or even national conversation about data and evaluation.

Meanwhile, I’d like to offer my own recommendations.  Mine are based on what I learned during the dialogue series, and also on untold numbers of public and private conversations on the topic.

 

_____________________________________________________________________________

 

My Recommendations

 

Funders can help by: 

  • Understanding that nonprofits perceive funders as having not just money but also much more power.
  • Asking nonprofits to define their goals, their desired outcomes, and their quantitative measures of success – rather than telling them what these should be.
  • Factoring in the nonprofit organization’s size, capacity, and budget – making sure that the demand for data and evaluation is commensurate.
  • Understanding the real cost in dollars to grantees who provide the data reporting and evaluation that you request.  These dollar amounts might be for staff time, technology, training, an external consultant, or even for office supplies.  (A rough cost sketch appears after this list.)
  • Providing financial support for any data or evaluation that the funder needs – especially if the nonprofit does not have an internal need for that data or evaluation.  Items to support might include staff time, technology, training, or retaining an external consultant with the necessary skill set.
  • Putting an emphasis on listening.
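
To make that cost point concrete, here is a back-of-the-envelope sketch with entirely hypothetical numbers (they are placeholders of mine, not benchmarks from the dialogue series), showing how quickly a single grant’s reporting requirements can add up for a grantee:

```python
# Hypothetical estimate of what one funder's reporting and evaluation
# requirements might cost a grantee in a year. All figures are
# illustrative placeholders, not benchmarks.

staff_hours_per_report = 30     # data collection, cleanup, narrative writing
reports_per_year = 4            # e.g., quarterly reports
staff_hourly_cost = 35.0        # fully loaded hourly cost of program staff

staff_time = staff_hours_per_report * reports_per_year * staff_hourly_cost
technology = 1200.0             # database or survey-tool subscription
training = 800.0                # staff training on data collection
consultant = 2500.0             # external evaluator for one outcome study

total = staff_time + technology + training + consultant
print(f"Estimated annual reporting cost for this grant: ${total:,.0f}")
# -> Estimated annual reporting cost for this grant: $8,700
```

Even with modest assumptions like these, the total can be a significant fraction of a small grant, which is exactly why a funder who asks for the data should also expect to underwrite it.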

 

Nonprofits can help by: 

  • Engaging in a quantitative analysis of their operations and capacity, and sharing this information with funders.
  • Understanding that grant makers are motivated to see nonprofit grant recipients succeed.
  • Understanding that grant makers are often under pressure from donors and their boards to deliver a portfolio of outcomes.
  • Integrating the use of data and evaluation into most areas of operation – this means building skills in data and evaluation across the entire organization.
  • Gathering with other nonprofits that have similar desired outcomes and comparing notes on failures and best practices.
  • Fostering a data-friendly, continuous learning culture within nonprofit organizations.

 

Both groups can help by: 

  • Engaging in self-scrutiny about how factors such as race and class affect how data is collected, categorized, analyzed, and reported.
  • Talking frankly about how power dynamics affect their relationships.
  • Engaging in ongoing dialogue that is facilitated by a third party who is experienced in creating a safe space.
  • Talking about and planning the evaluation process well before the grant begins.
  • Creating clear definitions of key terms pertaining to data and evaluation.
  • Making “I don’t know” an acceptable response to a question.
  • Measuring what you really value, rather than simply valuing what you can easily measure.
  • Working toward useful standards of measurement.  Not all programs and outcomes are identical, but very few are entirely sui generis.
  • Sharing responsibility for building the relationship.
  • Speaking with each other on a regular basis.
  • Studying (and implementing) community-based participatory research methods.

 

_____________________________________________________________________________

 

And now, because I can insert a poll here, I’m going to.

 

 

 

_____________________________________________________________________________

 

And now, because I can insert a contact form here, I’m going to.  Please feel free to let me know if you’re interested in being part of a regional or national conversation about how grantors and grantees can move forward and work constructively with data and evaluation.

 

 

_____________________________________________________________________________

 

Creative Commons License
Some rights reserved. This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.

 

_____________________________________________________________________________

My ever-expanding theory of change for nonprofit data and evaluation

Drowning in data, drafting a data checklist, and asking “WHY?”

Two poster boys of nonprofit data sanity: Bob Penna (l) and Steve Pratt (r).

Now that TNB Labs is up and running, we’re receiving a lot of requests from nonprofit organizations that are perplexed about how to manage the data they already have, before they plunge any further into data analytics or think about acquiring a new data analysis tool.  This has given me a lot of opportunities to reflect on how difficult it can be for people whose expertise lies elsewhere to orient themselves to data governance.

Steve Pratt’s blog article “Drowning in Data?” has been a huge inspiration.  In it, he explains the importance of data inventories, and offers to send the Root Cause template to anyone who requests it.  I highly recommend that you send an email to info@rootcause.org, and ask for a copy.

At the same time, as I went over Steve’s template, I had a nagging feeling that we needed something even more elementary.  Remembering my friend Bob Penna’s exhortation of a few months before, about asking “who, when, where, what, how, and why,” I quickly drafted a data checklist that focused on those basic questions.  When I sent it to Bob, he very quickly returned it with some excellent enhancements; the most brilliant one was to start the checklist with the question “WHY?”  As he very sensibly pointed out, if you can’t come up with a good reason why you are collecting, analyzing, reporting, and archiving information, you might as well stop there.  In the absence of a persuasive answer to the question “why?”, there’s no need to ask “who, when, where, what, and how”; in fact, there’s no reason to collect the data at all.
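
Just to illustrate that why-first logic, here is a minimal sketch of how a checklist entry for a single data element might be structured. This is my own rough illustration with hypothetical field names, not a substitute for the actual Penna/Finn spreadsheet described just below:

```python
# Rough, hypothetical sketch of a "why-first" data checklist entry.
# The real checklist is a spreadsheet; this only illustrates the logic of
# starting with WHY and stopping when there is no persuasive answer.

CHECKLIST_QUESTIONS = [
    ("why",   "Why are we collecting, analyzing, reporting, and archiving this?"),
    ("who",   "Who collects it, who enters it, and who uses it?"),
    ("when",  "When is it collected, and how often is it reviewed?"),
    ("where", "Where is it stored, and where is it backed up?"),
    ("what",  "What exactly is recorded (fields, definitions, units)?"),
    ("how",   "How is it collected, validated, and reported?"),
]

def review_data_element(name, answers):
    """Walk through the checklist for one data element, WHY first."""
    if not answers.get("why"):
        # Bob Penna's point: with no good answer to WHY, stop right here.
        print(f"{name}: no persuasive WHY, so stop collecting it.")
        return
    for key, question in CHECKLIST_QUESTIONS:
        print(f"{name} | {question} -> {answers.get(key, 'unknown')}")

# Example with a made-up data element and an empty WHY:
review_data_element("volunteer sign-in sheets", {"why": ""})
```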

With that wisdom in mind, I have tweaked the draft of the data checklist, and herewith present it to you for feedback. This version is the result of a Penna/Finn collaboration:

You can view it by clicking on this link.

Before you take a look at it, I recommend reading “Drowning in Data?”  After you’ve perused the spreadsheet, I recommend reading Bob Penna’s book, “The Nonprofit Outcomes Toolbox.”

 

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.

TNB Labs has launched, and its first priority is data services for nonprofits!

TNB Labs logo, July 2016

 

Greg Palmer and I are very pleased to announce the launch of TNB Labs, a company dedicated to providing nonprofit organizations with high quality data services and program management.  TNB Labs works with organizations to audit and assess data management methodology, develop and implement data standards, and provide structural oversight through data governance to organizations of all sizes.

Through our partnership with Tech Networks of Boston, we have had a unique opportunity to listen to hundreds of stakeholders at TNB’s Roundtables, and those conversations have uncovered a need to elevate the role of data in nonprofit organizations.  TNB Labs is focused on improving the capacity of nonprofit organizations at every stage of the outcome management process.

The immediate priorities of TNB Labs are:

1) Providing Master Data Management (MDM) services to nonprofit organizations in support of their missions, focusing on data governance, data quality, data modeling, data visualizations, and program evaluation.

2) Providing workforce program management for Desktop Support Technicians (DST), Data Support Analysts (DSA), and entry-level Data Analytics/Data Evaluation professionals.

3) Managing the TNB Roundtable series, which is now jointly owned by Tech Networks of Boston and TNB Labs.

TNB Labs is led by Greg Palmer (chief executive officer), and Deborah Elizabeth Finn (chief strategic officer).  The other co-founders are Bob Master (former CEO of Commonwealth Care Alliance) and Susan Labandibar (founder of Tech Networks of Boston).

TNB Labs is here to solve your problems.  Please contact us with any questions and comments you have about TNB Labs, or to learn more about data management or program management services that might be helpful to your organization.

Best regards from Deborah and Greg

Greg Palmer
gpalmer@tnblabs.org
508.861.4535

Deborah Elizabeth Finn
definn@tnblabs.org
617.504.8188

TNB Labs, LLC
PO Box 2073
Framingham, MA 01703
www.tnblabs.org

 

 

It’s not just a half-day outcomes management training for nonprofit executives – it’s an occasion for rejoicing!

Snoopy happy dance

For more than two years, I have been worrying aloud about the lack of training for nonprofit professionals who want to lead their organizations in implementing outcomes management and data visualization.  Today I’m rejoicing, because Tech Networks of Boston opened registration for a free (and sales-pitch-free) half-day outcomes management training for nonprofit executives.

It’s happening in April because some wonderful allies have stepped up – such as TNB’s co-hosts, the Mel King Institute and the College of Public and Community Service at the University of Massachusetts, and the wonderful Kathryn Engelhardt-Cronk of Community TechKnowledge, who will serve as our trainer.

This isn’t the full series of three day-long trainings on outcomes management and outcomes data visualization that I had originally envisioned, and that I still hope we can organize.  If we are able to do that, the other trainers will be the equally wonderful Beth Kanter and Georges Grinstein.  Right now, I’m looking at plans for Kathryn’s half-day outcomes management training as a miracle in itself, but also as the thin edge of the wedge.  (If you prefer more up-to-date jargon, you can call it a “proof of concept.”)

Of course, my thinking has become even more grandiose since I originally came up with the idea of a three-day outcomes/data viz training series.  Now I’m thinking in terms of a “Massachusetts Institute of Nonprofit Technology,” in which the first initiative would be a degree program in nonprofit data analysis.

Let’s take this training opportunity, which will be brief in comparison to the more elaborate programs that I’ve envisioned, and build on it!

 

 

How much fun is the Nonprofit Technology Conference? This much fun. (Plus some thoughts about shifting from tactical to strategic support of nonprofit organizations.)

Deborah is delighted by the artist’s rendition of a concept of Tech Networks of Boston’s. The photo was taken at the Netsuite.Org booth, at the 2015 Nonprofit Technology Conference.

Photo by Peggy Duvette of Netsuite.Org.

The good folks of Netsuite.Org had a great idea for their exhibit area at the Nonprofit Technology Conference this year.  They asked attendees to describe their technology visions in three words.  I chose “shared,” “data,” and “outcomes,” and an artist quickly drew up a visual to express this.  (Unfortunately, I did not note down her name; I hope I can find it in order to give her proper credit for her work.)  The photo shown above was taken by Peggy Duvette, and as you can see, I was delighted to see this concept, which is part of Tech Networks of Boston’s strategic thinking, become part of the patchwork quilt of ideas that were being expressed.

Here’s a close-up of the TNB concept:

I took this photo at the Netsuite.Org booth, at the 2015 Nonprofit Technology Conference. Alas, I did not note down the name of the artist who did this drawing.

At TNB, we are thinking more and more about collaborative technology management – not just in terms of how we work with our nonprofit clients, but also in terms of how clusters of NTAPs and nonprofits can work together toward a shared long-term goal.  We have great relationships (and in many cases, shared nonprofit clients) with some excellent local nonprofit technology assistance providers, such as Annkissam* and 501Partners.  The three NTAPs are already collaborating on a series of sales-pitch-free evenings in which local nonprofit professionals are offered pro bono tech consultations.

However, the potential exists to do so much more, especially considering how many clients we share.

Wouldn’t it be great if the three NTAPs could offer their shared clients the following:

1) Seamless integration of TNB, AK, and 501P’s services.

2) Shared best practices for clusters of nonprofits with similar programs, operations, or missions.

3) Coordinated outcomes measurement and management for nonprofits that have overlapping constituencies.

The joy of #15NTC is in realizing that although we are just three NTAPs in one region, we are part of a wider movement.  In fact, if you were to look at the entire collection of artist’s renderings that were done at the Netsuite.Org exhibit area, you’d see that many nonprofit organizations are on the cusp of dreaming this dream.  Most of us in the nonprofit sector understand that for lasting positive change in the world, one program at a single nonprofit organization is not enough.  The future is in sharing and coordinating our work.  What if nonprofit technology assistance providers started with that challenge, rather than the challenge of keeping a network server from crashing?  The emphasis would shift from the tactical support of nonprofits to the strategic support of their missions.  And by “missions,” I don’t mean vague statements; I mean specific (and even quantifiable) positive changes that nonprofit organizations have committed themselves to delivering to their stakeholders.

Because mission achievement is why we all get up in the morning to do our jobs.

And because building a nonprofit technology movement that supports mission achievement is the best possible reason for participating in the Nonprofit Technology Conference.

 

* I also serve Annkissam directly as a consultant.

 

 

 

NPtech Labor Market Alert: The Big Job Title of 2015 Will Be “Data Analyst”

 

Disclaimer: This illustration is for entertainment purposes only. I am not a professional data analyst.

 

My training, such as it is, is heavily skewed toward qualitative methods; at the same time, I have a lot of respect for quantitative analysis.  However, my favorite form of research consists of staring off into space and letting ideas float into my head.  Sometimes I validate my findings by engaging in conversations in which I talk louder and louder until everyone agrees that I’m right.  It seems to work.

Lately, I’ve had a little time to stare off into space and let ideas float into my head; by this, I mean that I traveled to Austin, Texas for the Nonprofit Technology Conference (also known as #15ntc) and had some down time on the plane.  By the time I arrived in Austin, I had become convinced that “Data Analyst” would be this year’s standout job title in the field of nptech.  At the conference, I was able to confirm this – by which I mean that I didn’t meet anyone there who talks more loudly than I do.

What are the takeaways?  It depends on who you are:

  • For data analysts who are now working in the field of nonprofit technology:  prepare to be appreciated.
  • For data analysts now working in other sectors: think about whether this is a good moment to make a career shift in which you use your geek powers for good. But make sure you know what you’re getting into.
  • For nonprofit executives: don’t kid yourselves. Brilliant data analysts who want to work in the nonprofit sector aren’t going to be attracted by job announcements that indicate that the successful candidate will also be responsible for network administration, hands-on tech support, social media, and web development.
  • For workforce development professionals:  this is your cue. It’s time to put together a program for training computer science graduates to be nonprofit data geeks.
  • For donors, grantmakers, and other funders:  if you want reports from nonprofits that are based on reliable and valid methods of analysis, then you will need to underwrite data analysts at nonprofits.  That means money for training, for salaries, and for appropriate technology.

If you don’t agree with my findings, please take a moment to share yours in the comments section.

Visualizing the role of data for mission-based organizations – Round II

I am much obliged to all the good folks who have posted suggestions and feedback about my first attempt to create an image that would represent my thinking on the role of data in mission-based organizations.  Likewise, those who emailed me their thoughts deserve thanks!

I’ve created a revised version that incorporates some of the feedback.  Before you take a look at it, please bear in mind that:

  1. I am not a graphic designer.
  2. I am not attempting to create a graphic that illustrates everyone’s ideas about the role of data in a mission-based organization.  I am merely trying to illustrate my ideas.

Visualizing the role of data for mission-based organizations – Round II

Item #2 on the list notwithstanding, I am very much enjoying the opportunity to learn more about what others in the field think about (and visualize) when they ponder the role of data in our sector.  Once again, I invite you to post your reflections, suggestions, and questions in the comments section here on this blog.

Data Day 2013 in Boston

Data Day 2013:  I'll be offering pro bono strategic tech consultations

 

I’m excited about Data Day at Northeastern University tomorrow, which is being co-hosted by the Metropolitan Area Planning Council and the Boston Indicators Project.

I’ll be offering pro bono strategic tech consults at this event; my time is being underwritten by Tech Networks of Boston. If you’re planning to attend, please come say hello to me! Just look for this sign.

 

Chris Zibailo: A hero in ICT and expectation management

Chris Zibailo, DSCI

This morning, I ran into a long-lost colleague whom I remember as a hero.  Or rather, Chris Zibailo recognized my voice, and ran over to reintroduce himself to me.

Chris and I met in 1999, when I was the information systems manager at Family Service of Greater Boston (FSGB).  FSGB was in the middle of a big geographic transition; we had sold our headquarters on Beacon Hill, and moved our information systems, plus everything else, to temporary quarters in Downtown Crossing. We were now facing, for the second time in just under a year, a move to our permanent headquarters in Jackson Square.

Fortunately, I was reporting to the world’s best chief administrative officer for a nonprofit human service organization, Bill Chrisemer.  I should take a moment and acknowledge Bill as a hero as well, because he always did his utmost to help me succeed in supporting FSGB.

It was the right time for Bill and me to think about state-of-the-art voice and data lines.  Enter Chris, with a promise on behalf of his firm that got our attention:  “we suck less.”

Chris is my hero, because he delivered extraordinary service; he not only managed our expectations perfectly, but exceeded them.  We not only received the information and communication technology components that were critical for our operations, but also all the personal care that Chris could give us in a difficult move.  I remember a particularly harrowing moment, while planning the weekend cut-over of all services for the entire organization, when we realized that someone had to be at our Quincy satellite office to wait for and let in the Bell Atlantic workers.  It was a thankless task, and one that might have entailed hours of waiting around, and our information systems team had already been assigned critical tasks.  Just as I remember the harrowing moment of that realization, I also remember my overwhelming feeling of gratitude and relief when Chris volunteered for the job, which most definitely was not in the contract for services that we signed with him.  We gave him the keys, he did this tedious task, and all was well.

Later that year, Bill Chrisemer left, I was diagnosed with cancer (and had successful surgery), and DSCI underwent some significant changes. It was a very tough time, partly because Family Service of Greater Boston’s organizational culture had changed. In 2000, I left FSGB to take a job as TechFoundation’s national nonprofit liaison officer, and in 2002, I left TF to become a solo consultant.  I had lost touch with Chris, and heard a rumor that he had left his firm, but I still thought of him as the gold standard whenever I dealt with telephone and internet service providers on behalf of my clients.

Fast forward to this morning.  Imagine my delight when Chris caught up with me!  Delight was piled on delight when Chris told me that the acquisition of his firm, those many years ago, was not satisfactory, so he and his colleagues banded together to invest in DSCI and turn it into a hosted communication and connectivity service provider for the 21st century.

Kudos to you, Chris.  You’re still my hero.

My current daydream: The marriage of outcomes management apps with data visualization apps

The marriage of outcomes management with data visualization

Given my current preoccupation with both outcomes management and data visualization for mission-based organizations, perhaps it’s not a surprise that I’m daydreaming about integrating applications that were designed for these two tasks.

This daydream was inspired by a recent conversation with Patrice Keegan, executive director of Boston Cares (a HandsOn Network affiliate).  She is keenly interested in both outcomes and data visualization, and she leads a nonprofit of modest size that collaborates not only with many local partners but also with a national network of sister organizations that facilitate short-term volunteering.  In other words, Boston Cares provides a gateway to volunteerism for individuals, corporations, and community-based nonprofits, and then shares best practices with its counterparts across the United States.

What better poster girl could there be than Patrice, for my Cause, which is making it not only possible but easy for her to take her outcomes analyses and turn them into visuals that tell the story of the social impact of Boston Cares?

Moreover, what good is a cause and a poster child, without a poster?  Here’s mine:

Patrice Keegan of Boston Cares
Special note to software developers in the nonprofit sector:  please take a look at that bright, shining face, and give your efforts to the cause.

Measuring what we value, and presenting the findings more interactively than ever

Boston Indicators Project logo

First of all, a personal resolution: I will not whine.

The Boston Indicators Project, which is an initiative by the Boston Foundation and the Metropolitan Area Planning Council, relaunched its web site in November, and I was not invited to the event.  I will subdue my inclination to pout, and move on to praising the new web site.

Fortunately, a fellow Boston Technobabe, Kat Friedrich, did attend; you therefore have the option of skipping my blog article and going straight to hers.  Kat’s focus is on “How Nonprofits Can Earn News Coverage Using Data Visualization,” which is certainly a great take-away for mission-based organizations.

My interest is slightly different.  Here are a few things that are especially striking:

The new Boston Indicators web site is a great example of nonprofit technology in the service of a mission that is much greater than any one community foundation or specific region.  I happen to live in the greater Boston area, so I’ve been more easily drawn to it than I would be if I were living elsewhere.  But it’s an example to any individual or organization of the power of universal access to significant data, and of the importance of analyzing it in ways that benefit the community.

What I learned about outcomes management from Robert Penna

Robert Penna

Yesterday, along with a number of colleagues and friends from Community TechKnowledge, I had the privilege of attending a training by Robert Penna, the author of The Nonprofit Outcomes Toolbox.

As you probably  know, I’ve been on a tear about outcomes measurement for a few months now; the current level of obsession began when I attended NTEN’s Nonprofit Data Summit in Boston in September.  I thought that the presenters at the NTEN summit did a great job addressing some difficult issues – such as how to overcome internal resistance to collecting organizational data, and how to reframe Excel spreadsheets moldering away in file servers as archival data.  However, I worked myself into a tizzy, worrying about the lack, in that day’s presentations, of any reference to the history and literature of quantitative analysis and social research.  I could not see how nonprofit professionals would be able to find the time and resources to get up to speed on those topics.

Thanks to Bob Penna, I feel a lot better now.  In yesterday’s training, he showed me and the CTK team just how far you can go by stripping away what is superfluous and focusing on what it really takes to use the best outcomes tools for the job.  Never mind about graduate-level statistics!  Managing outcomes may be very, very difficult because it requires major changes in organizational culture – let’s not kid ourselves about that.  However, it’s not going to take years out of each nonprofit professional’s life to develop the skill set.

Here are some other insights and highlights of the day:

  • Mia Erichson, CTK’s brilliant new marketing manager, pointed out that at least one of the outcomes tools that Bob showed us could be easily mapped to a “marketing funnel” model.  This opens possibilities for aligning a nonprofit’s programmatic strategy with its marcomm strategy.
  • The way to go is prospective outcomes tracking, with real-time updates allowing for course correction.  Purely retrospective outcomes assessment is not going to cut it.  (See the small sketch after this list.)
  • There are several very strong outcomes tools, but they should be treated the way we treat a software suite that comprises both gem applications and junk applications.  We need to use the best of breed to meet each need.
  • If we want to live in Bob Penna’s universe, we’re going to have to change our vocabulary.  It’s not “outcomes measurement” – it’s “outcomes management.”  The terms “funder” and “grantmaker” are out – “investor” is in.
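
To put a little flesh on the prospective-tracking point above, here is a tiny sketch of flagging a program that is falling behind its outcome target while there is still time to adjust. The numbers and the 90 percent threshold are invented for illustration, not anything from Bob’s training:

```python
# Hypothetical sketch of prospective outcome tracking: compare progress
# against a target while the program is still running, so that course
# correction is possible. Figures and threshold are invented.

annual_target = 200              # clients expected to reach a defined outcome
actuals_by_quarter = [38, 41]    # results observed so far (Q1 and Q2)

quarters_elapsed = len(actuals_by_quarter)
expected_so_far = annual_target * quarters_elapsed / 4
actual_so_far = sum(actuals_by_quarter)

if actual_so_far < 0.9 * expected_so_far:
    print(f"Behind pace: {actual_so_far} vs. {expected_so_far:.0f} expected; adjust the program now.")
else:
    print(f"On track: {actual_so_far} vs. {expected_so_far:.0f} expected.")
```

A purely retrospective assessment would run the same comparison only after the year was over, when it is too late to do anything about it.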

Even with these lessons learned, it’s not a Utopia out there waiting for nonprofits that become adept at outcomes management.  Not only is it difficult to shift to an organizational culture that fosters it, but we have to face continuing questions about how exactly the funders (oops! I should have said “investors”) use the data that they demand from nonprofit organizations.  (“Data” is of course a broad term, with connotations well beyond outcomes management.  But it’s somewhat fashionable these days for funders to take an interest in data about programmatic outcomes.)

We should be asking ourselves, first of all, whether the sole or primary motivation for outcomes management in nonprofits should be the demands of investors.  Secondly, we should be revisiting the Gilbert Center’s report, Does Evidence Matter to Grantmakers? Data, Logic, and the Lack Thereof in the Largest U.S. Foundations.  We need to know this.  Thirdly, we should be going in search of other motivations for introducing outcomes management.  I realize that most nonprofits go forward with it when they reach a point of pain (translation: they won’t get money if they don’t report outcomes).

During a break in Bob’s training, some of my CTK colleagues were discussing the likelihood that many nonprofit executives simply hate the concept of outcomes management.  Who wants to spend resources on it, if it subtracts from resources available for programmatic activities?  Who wants to risk finding out (or to risk having external stakeholders find out) that an organization’s programs are approximately as effective as doing nothing at all?  Very few – thus the need to find new motivations, such as the power to review progress and make corrections as we go.  I jokingly told my CTK colleagues, “the truth will make you free, but first it will make you miserable.”  Perhaps that’s more than a joke.

The state of nonprofit data: Uh-oh!

The Nonprofit Technology Network (NTEN) has released a report prepared by Idealware on the current state of nonprofit data.  Highly recommended!

Some of the news it contains is scary.  In our sector, we currently aren’t very successful at collecting and analyzing the most crucial data.  For example, only 50% of the respondents reported that their nonprofit organizations are tracking data about the outcomes of clients/constituents.

According to the survey respondents, there are daunting barriers to tracking and using data:

  • issues related to collecting and working with data (27 percent of responses)
  • lack of expertise (24 percent of responses)
  • issues of time and prioritization (22 percent of responses)
  • challenges with technology (23 percent of responses)

Page 13 of the report features a chart that I find especially worrisome.  It displays the types of data that nonprofit organizations should or could be using, with large chunks falling into three chilling categories:

  • we don’t know how to track this
  • we don’t have the technology to effectively track this
  • we don’t have the time/money to effectively track this

In the case of data about outcomes, 17% lack the knowledge, 20% lack the technology, and 22% lack the time or money (or both) to track it.

Are you scared yet?  I confess that I am.  Perhaps half of all nonprofits surveyed don’t know – and don’t have the resources to find out – whether there is any causal relationship between their activities and the social good that they are in business to achieve.

And that’s just programmatic outcomes.  The news is also not very encouraging when it comes to capturing data about organizational budgets, constituent participation in programs, and external trends in the issue areas being addressed by nonprofit organizations.

So much for the bad news.  The good news is that now we know.

It takes some courage to acknowledge that the baseline is so low.  I applaud Idealware and NTEN for creating and publishing this report.  Now that we know, we can address the problem and take effective action.