This is not actually a photo from the dialogue series. We refrained from taking photos, because we wanted to foster an atmosphere of candor and comfort as grantors and grantees engaged in conversation about a difficult topic. However, it is a favorite photo from another recent Tech Networks of Boston event.
Oh, my! It took Tech Networks of Boston almost two years to organize and implement a series of candid dialogues about data and evaluation for grantors and nonprofit grantees, and now it’s complete. The process was a collaboration in itself, with TSNE MissionWorks and Essential Partners serving as co-hosts. An advisory group and a planning group gave crucial input on the strategy and tactics for the series.
What you see here are a few notes that reflect my individual experience. In this article, I am not speaking on behalf of any organization or individual.
June 2017: Let’s get oriented. What is the heart of the matter for grantors and grantees?
September 2017: You know, we really need to address the imbalance of power in the grantor/grantee relationship.
January 2018: Ok, can we agree on some best practices for how to address this as grantors and grantees? Why, yes. We can.
The plan is to make the recommendations that came out of the final dialogue publicly available online, to provide a starting point for a regional or even national conversation about data and evaluation.
Meanwhile, I’d like to offer my own recommendations. Mine are based on what I learned during the dialogue series, and also on untold numbers of public and private conversations on the topic.
Funders can help by:
Understanding that nonprofits perceive funders as having not just money, but also much more power.
Asking nonprofits to define their goals, their desired outcomes, and their quantitative measures of success – rather than telling them what these should be.
Factoring in the nonprofit organization’s size, capacity, and budget – making sure that the demand for data and evaluation is commensurate.
Understanding the real cost in dollars to grantees who provide the data reporting and evaluation that you request. These dollar amounts might be for staff time, technology, training, an external consultant, or even for office supplies.
Providing financial support for any data or evaluation that the funder needs – especially if the nonprofit does not have an internal need for that data or evaluation. Items to support might include staff time, technology, training, or retaining an external consultant with the necessary skill set.
Putting an emphasis on listening.
Nonprofits can help by:
Engaging in a quantitative analysis of their operations and capacity, and sharing this information with funders.
Understanding that grant makers are motivated to see nonprofit grant recipients succeed.
Understanding that grant makers are often under pressure from donors and their boards to deliver a portfolio of outcomes.
Integrating the use of data and evaluation into most areas of operation – this means building skills in data and evaluation across the entire organization.
Gathering with other nonprofits that have similar desired outcomes and comparing notes on failures and best practices.
Fostering a data-friendly, continuous learning culture within nonprofit organizations.
Both groups can help by:
Engaging in self-scrutiny about how factors such as race and class affect how data is collected, categorized, analyzed, and reported.
Talking frankly about how power dynamics affect their relationships.
Engaging in ongoing dialogue that is facilitated by a third party who is experienced in creating a safe space.
Talking about and planning the evaluation process well before the grant begins.
Creating clear definitions of key terms pertaining to data and evaluation.
Making “I don’t know” an acceptable response to a question.
Measuring what you really value, rather than simply valuing what you can easily measure.
Working toward useful standards of measurement. Not all programs and outcomes are identical, but very few are entirely sui generis.
Sharing responsibility for building the relationship.
Speaking with each other on a regular basis.
Studying (and implementing) community-based participatory research methods.
And now, because I can insert a contact form here, I’m going to. Please feel free to let me know if you’re interested in being part of a regional or national conversation about how grantors and grantees can move forward and work constructively with data and evaluation.
Two poster boys of nonprofit data sanity: Bob Penna (l) and Steve Pratt (r).
Now that TNB Labs is up and running, we’re receiving a lot of requests from nonprofit organizations that are perplexed about how to manage the data they have, before they plunge any further into data analytics or think about acquiring a new data analysis tool. This has given me a lot of opportunities to reflect on how difficult it can be for people whose expertise lies elsewhere to orient themselves to data governance.
Steve Pratt‘s blog article “Drowning in Data?” has been a huge inspiration. In it, he explains the importance of data inventories, and offers to send the Root Cause template to anyone who requests it. I highly recommend that you send an email to firstname.lastname@example.org, and ask for a copy.
At the same time, as I went over Steve’s template, I had a nagging feeling that we needed something even more elementary. Remembering my friend Bob Penna‘s exhortation of a few months before, about asking “who, when, where, what, how, and why,” I quickly drafted a data checklist that focused on those basic questions. When I sent it to Bob, he very quickly returned it with some excellent enhancements; the most brilliant one was to start the checklist with the question “WHY?” As he very sensibly pointed out, if you can’t come up with a good reason why you are collecting, analyzing, reporting, and archiving information, you might as well stop there. In the absence of a persuasive answer to the question “why?” there’s no need to ask “who, when, where, what, and how”; in fact, there’s no reason to collect the information at all.
With that wisdom in mind, I have tweaked the draft of the data checklist, and herewith present it to you for feedback. This version is the result of a Penna/Finn collaboration:
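To make the ordering principle concrete, here is a rough sketch in code of a “why-first” checklist. This is not the actual Penna/Finn document; the questions, prompts, and function names are illustrative assumptions only.

```python
# Hypothetical sketch of a "why-first" data checklist -- not the actual
# Penna/Finn checklist, just an illustration of its ordering principle.

DATA_CHECKLIST = [
    ("WHY?",   "Why are we collecting, analyzing, reporting, and archiving this?"),
    ("WHO?",   "Who collects this data, and who uses it?"),
    ("WHEN?",  "When is it collected, and when is it reviewed?"),
    ("WHERE?", "Where is it stored and archived?"),
    ("WHAT?",  "What exactly is being recorded?"),
    ("HOW?",   "How is it collected, validated, and reported?"),
]

def review(dataset_name, answers):
    """Walk the checklist in order; stop immediately if 'WHY?' has no answer."""
    for question, prompt in DATA_CHECKLIST:
        answer = answers.get(question)
        if question == "WHY?" and not answer:
            # Bob's point: with no persuasive "why," the rest is moot.
            return f"{dataset_name}: no persuasive 'why' -- stop collecting."
        print(f"{dataset_name} / {question} {prompt} -> {answer}")
    return f"{dataset_name}: checklist complete."

print(review("volunteer hours", {"WHY?": "required for our grant report"}))
```

The key design point is simply that “WHY?” comes first and acts as a gate: everything else on the checklist is skipped when it fails.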
TNB Labs offers two lines of service:
1) Providing Master Data Management (MDM) services to nonprofit organizations in support of their missions, focusing on data governance, data quality, data modeling, data visualizations, and program evaluation.
2) Providing workforce program management for Desktop Support Technicians (DST), Data Support Analysts (DSA), and Data Analytics/Data Evaluation entry level professionals.
TNB Labs is led by Greg Palmer (chief executive officer), and Deborah Elizabeth Finn (chief strategic officer). The other co-founders are Bob Master (former CEO of Commonwealth Care Alliance) and Susan Labandibar (founder of Tech Networks of Boston).
TNB Labs is here to solve your problems. Please contact us with any questions and comments you have about TNB Labs, or to learn more about data management or program management services that might be helpful to your organization.
Best regards from Deborah and Greg
Deborah Elizabeth Finn
Of course, my thinking has become even more grandiose since I originally came up with the idea of a three-day outcomes/data viz training series. Now I’m thinking in terms of a “Massachusetts Institute of Nonprofit Technology,” in which the first initiative would be a degree program in nonprofit data analysis.
Let’s take this training opportunity, which will be brief in comparison to the more elaborate programs that I’ve envisioned, and build on it!
The good folks of Netsuite.Org had a great idea for their exhibit area at the Nonprofit Technology Conference this year. They asked attendees to describe their technology visions in three words. I chose “shared,” “data,” and “outcomes,” and an artist quickly drew up a visual to express this. (Unfortunately, I did not note down her name; I hope I can find it in order to give her proper credit for her work.) The photo shown above was taken by Peggy Duvette, and as you can see, I was delighted to see this concept, which is part of Tech Networks of Boston’s strategic thinking, become part of the patchwork quilt of ideas being expressed.
The joy of #15NTC is in realizing that although we are just three NTAPs in one region, we are part of a wider movement. In fact, if you were to look at the entire collection of artists’ renderings that were done at the Netsuite.Org exhibit area, you’d see that many nonprofit organizations are on the cusp of dreaming this dream. Most of us in the nonprofit sector understand that for lasting positive change in the world, one program at a single nonprofit organization is not enough. The future is in sharing and coordinating our work. What if nonprofit technology assistance providers started with that challenge, rather than the challenge of keeping a network server from crashing? The emphasis would shift from the tactical support of nonprofits to the strategic support of their missions. And by “missions,” I don’t mean vague statements; I mean specific (and even quantifiable) positive changes that nonprofit organizations have committed themselves to delivering to their stakeholders.
Because mission achievement is why we all get up in the morning to do our jobs.
And because building a nonprofit technology movement that supports mission achievement is the best possible reason for participating in the Nonprofit Technology Conference.
* I also serve Annkissam directly as a consultant.
Disclaimer: This illustration is for entertainment purposes only. I am not a professional data analyst.
My training, such as it is, is heavily skewed toward qualitative methods; at the same time, I have a lot of respect for quantitative analysis. However, my favorite form of research consists of staring off into space and letting ideas float into my head. Sometimes I validate my findings by engaging in conversations in which I talk louder and louder until everyone agrees that I’m right. It seems to work.
Lately, I’ve had a little time to stare off into space and let ideas float into my head; by this, I mean that I traveled to Austin, Texas for the Nonprofit Technology Conference (also known as #15ntc) and had some down time on the plane. By the time I arrived in Austin, I had become convinced that “Data Analyst” would be this year’s standout job title in the field of nptech. At the conference, I was able to confirm this – by which I mean that I didn’t meet anyone there who talks more loudly than I do.
What are the takeaways? It depends on who you are:
For data analysts who are now working in the field of nonprofit technology: prepare to be appreciated.
For nonprofit executives: don’t kid yourselves. Brilliant data analysts who want to work in the nonprofit sector aren’t going to be attracted by job announcements that indicate that the successful candidate will also be responsible for network administration, hands-on tech support, social media, and web development.
For workforce development professionals: this is your cue. It’s time to put together a program for training computer science graduates to be nonprofit data geeks.
For donors, grantmakers, and other funders: if you want reports from nonprofits that are based on reliable and valid methods of analysis, then you will need to underwrite data analysts at nonprofits. That means money for training, for salaries, and for appropriate technology.
If you don’t agree with my findings, please take a moment to share yours in the comments section.
I am not attempting to create a graphic that illustrates everyone’s ideas about the role of data in a mission-based organization. I am merely trying to illustrate my ideas.
Visualizing the role of data for mission-based organizations – Round II
Item #2 on the list notwithstanding, I am enjoying very much the opportunity to learn more about what others in the field think about (and visualize) when they ponder the role of data in our sector. Once again, I invite you to post your reflections, suggestions, and questions in the comments section here on this blog.
I firmly believe that if your organization is driven by data, you’re stopping too soon.
It’s important to roll that data (which is raw material) into information (which has been sorted and analyzed), to roll that information into knowledge (which has been enhanced by an understanding of context), and to roll that knowledge into wisdom (which has been enhanced by experience and intuition). From there you can proceed to good decisions, and ultimately to mission success; moreover, at that point you have more data. From there, it’s an opportunity for continuous improvement and possibly even further innovation.
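As a rough sketch, the progression described above could be modeled as a simple pipeline. All function and field names here are my own illustrative assumptions, not anyone’s canonical model:

```python
# A minimal sketch of the data -> information -> knowledge -> wisdom
# progression, ending with decisions that generate fresh data.
# Function names and fields are illustrative only.

def to_information(data):
    """Sort and analyze raw data."""
    return sorted(data)

def to_knowledge(information, context):
    """Enhance information with an understanding of context."""
    return {"findings": information, "context": context}

def to_wisdom(knowledge, experience):
    """Enhance knowledge with experience and intuition."""
    return {**knowledge, "lessons": experience}

def mission_cycle(data, context, experience):
    """One turn of the continuous-improvement cycle: each pass through
    the pipeline yields a decision -- and the decision yields more data."""
    wisdom = to_wisdom(to_knowledge(to_information(data), context), experience)
    decision = "act on " + str(wisdom["findings"][0])
    new_data = data + [decision]  # acting generates fresh data for the next turn
    return wisdom, new_data

wisdom, new_data = mission_cycle([3, 1, 2], "regional trends", "prior programs")
print(wisdom["findings"])  # [1, 2, 3]
print(len(new_data))       # 4
```

Note that the cycle’s output feeds its own input, which is exactly why a cyclical illustration may capture this better than a linear one.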
My challenge right now is to come up with a clear image to convey this. The one that springs naturally to my mind is linear, but trusted advisors seem to favor a more cyclical illustration.
Please take a look at these two logos (which were created by yours truly), and tell me which one gets my message across most effectively:
And please feel free to post comments here (or send me email) to elaborate on your thoughts about this!
I’ll be offering pro bono strategic tech consults at this event; my time is being underwritten by Tech Networks of Boston. If you’re planning to attend, please come say hello to me! Just look for this sign.
This morning, I ran into a long-lost colleague whom I remember as a hero. Or rather, Chris Zibailo recognized my voice and ran over to reintroduce himself to me.
Chris and I met in 1999, when I was the information systems manager at Family Service of Greater Boston (FSGB). FSGB was in the middle of a big geographic transition; we had sold our headquarters on Beacon Hill, and moved our information systems, plus everything else, to temporary quarters in Downtown Crossing. We were now facing, for the second time in just under a year, a move to our permanent headquarters in Jackson Square.
Fortunately, I was reporting to the world’s best chief administrative officer for a nonprofit human service organization, Bill Chrisemer. I should take a moment and acknowledge Bill as a hero as well, because he always did his utmost to help me succeed in supporting FSGB.
It was the right time for Bill and me to think about state-of-the-art voice and data lines. Enter Chris, with a promise on behalf of his firm that got our attention: “We suck less.”
Chris is my hero, because he delivered extraordinary service; he not only managed our expectations perfectly, but exceeded them. We not only received the information and communication technology components that were critical for our operations, but all the personal care that Chris could give us in a difficult move. I remember a particularly harrowing moment, while planning the weekend cut-over of all services for the entire organization, when we realized that someone had to be at our Quincy satellite office to wait for and let in the Bell Atlantic workers. It was a thankless task and one that might have entailed hours of waiting around, and our information systems team had already been assigned critical tasks. Just as I remember the harrowing moment of that realization, I also remember my overwhelming feeling of gratitude and relief when Chris volunteered for the job, which most definitely was not in the contract for services that we signed with him. We gave him the keys, he did this tedious task, and all was well.
Later that year, Bill Chrisemer left, I was diagnosed with cancer (and had successful surgery), and DSCI underwent some significant changes. It was a very tough time, partly because Family Service of Greater Boston’s organizational culture had changed. In 2000, I left FSGB to take a job as TechFoundation’s national nonprofit liaison officer, and in 2002, I left TechFoundation to become a solo consultant. I had lost touch with Chris, and heard a rumor that he had left his firm, but I still thought of him as the gold standard whenever I dealt with telephone and internet service providers on behalf of my clients.
This daydream was inspired by a recent conversation with Patrice Keegan, executive director of Boston Cares (a HandsOn Network affiliate). She is keenly interested in both outcomes and data visualization, and she leads a nonprofit of modest size that collaborates not only with many local partners but also with a national network of sister organizations that facilitate short-term volunteering. In other words, Boston Cares provides a gateway to volunteerism for individuals, corporations, and community-based nonprofits, and then shares best practices with its counterparts across the United States.
What better poster girl could there be than Patrice, for my Cause, which is making it not only possible but easy for her to take her outcomes analyses and turn them into visuals that tell the story of the social impact of Boston Cares?
Moreover, what good is a cause and a poster child, without a poster? Here’s mine:
Special note to software developers in the nonprofit sector: please take a look at that bright, shining face, and give your efforts to the cause.
An enormous added value of taking the MetroBoston DataCommon training is that they walk you through the process of creating a free Weave account. This means that the version of Weave that you will be using is already loaded with crucial data sets from sources such as the Census Bureau and the Bureau of Labor Statistics. You will be able to analyze, understand, and communicate your organization’s mission and impact, while using hard data about regional conditions to provide a context.
My interest is slightly different. Here are a few things that are especially striking:
Measuring what we value. This principle is prominently displayed on the relaunched web site, and is one that I learned in 2002 from the Boston Indicators Project’s co-founder and director, Charlotte Kahn. (I worked on the 2003 indicators report, which was the very first to be webified.) The version I heard from her own lips is “we should measure what we value, rather than only valuing what we can measure.” It’s not enough to throw together a lot of data about our region simply because it’s available. We have to think about what it means, why it’s important, and how it helps us understand the most effective strategies for positive change.
The new Boston Indicators web site is a great example of nonprofit technology in the service of a mission that is much greater than any one community foundation or specific region. I happen to live in the greater Boston area, so I’ve been more easily drawn to it than I would be if I were living elsewhere. But it’s an example, for any individual or organization, of the power of universal access to significant data, and the importance of analyzing it in ways that benefit the community.
Thanks to Bob Penna, I feel a lot better now. In yesterday’s training, he showed me and the CTK team just how far you can go by stripping away what is superfluous and focusing on what it really takes to use the best outcomes tools for the job. Never mind about graduate-level statistics! Managing outcomes may be very, very difficult because it requires major changes in organizational culture – let’s not kid ourselves about that. However, it’s not going to take years out of each nonprofit professional’s life to develop the skill set.
Here are some other insights and highlights of the day:
Mia Erichson, CTK’s brilliant new marketing manager, pointed out that at least one of the outcomes tools that Bob showed us could be easily mapped to a “marketing funnel” model. This opens possibilities for aligning a nonprofit’s programmatic strategy with its marcomm strategy.
The way to go is prospective outcomes tracking, with real time updates allowing for course correction. Purely retrospective outcomes assessment is not going to cut it.
There are several very strong outcomes tools, but they should be treated as we would treat a software suite that comprises both gems and junk: use the best of breed to meet each need.
If we want to live in Bob Penna’s universe, we’re going to have to change our vocabulary. It’s not “outcomes measurement” – it’s “outcomes management.” The terms “funder” and “grantmaker” are out – “investor” is in.
Even with these lessons learned, it’s not a Utopia out there waiting for nonprofits that become adept at outcomes management. Not only is it difficult to shift to an organizational culture that fosters it, but we have to face continuing questions about how exactly the funders (oops! I should have said “investors”) use the data that they demand from nonprofit organizations. (“Data” is of course a broad term, with connotations well beyond outcomes management. But it’s somewhat fashionable these days for them to take an interest in data about programmatic outcomes.)
During a break in Bob’s training, some of my CTK colleagues were discussing the likelihood that many nonprofit executives simply hate the concept of outcomes management. Who wants to spend resources on it, if it subtracts from resources available for programmatic activities? Who wants to risk finding out (or to risk having external stakeholders find out) that an organization’s programs are approximately as effective as doing nothing at all? Very few – thus the need to find new motivations, such as the power to review progress and make corrections as we go. I jokingly told my CTK colleagues, “the truth will make you free, but first it will make you miserable.” Perhaps that’s more than a joke.
Some of the news it contains is scary. In our sector, we currently aren’t very successful at collecting and analyzing the most crucial data. For example, only 50% of the respondents reported that their nonprofit organizations are tracking data about the outcomes of clients/constituents.
According to the survey respondents, there are daunting barriers to tracking and using data:
issues related to collecting and working with data (27 percent of responses)
lack of expertise (24 percent of responses)
issues of time and prioritization (22 percent of responses)
challenges with technology (23 percent of responses)
Page 13 of the report features a chart that I find especially worrisome. It displays types of data that nonprofit organizations should or could be using, with large chunks falling into three chilling categories:
we don’t know how to track this
we don’t have the technology to effectively track this
we don’t have the time/money to effectively track this
In the case of data about outcomes, 17% lack the knowledge, 20% lack the technology, and 22% lack the time or money (or both) to track it.
Are you scared yet? I confess that I am. Perhaps half of all nonprofits surveyed don’t know – and don’t have the resources to find out – whether there is any causal relationship between their activities and the social good that they are in business to achieve.
And that’s just programmatic outcomes. The news is also not very encouraging when it comes to capturing data about organizational budgets, constituent participation in programs, and external trends in the issue areas being addressed by nonprofit organizations.
So much for the bad news. The good news is that now we know.
It takes some courage to acknowledge that the baseline is so low. I applaud Idealware and NTEN for creating and publishing this report. Now that we know, we can address the problem and take effective action.