Tag Archives: stakeholders

Workforce development for the nonprofit tech professionals of the future: It will be a consortium, not a building with a dome!

We don't need an edifice; we need a consortium!

 

It’s been about a year and a half since I started agitating for a Massachusetts Institute of Nonprofit Technology, an initiative that will kick off by training the nonprofit data analysts of the future.

The concept has morphed and evolved a great deal in that time, thanks to great input not only from Massachusetts stakeholders but also from a team of ELP fellows from the Center for Collaborative Leadership.

One thing that is quite clear is that there is no need to create a new institution, or raise up a building with a splendid dome.  (The Massachusetts Institute of Technology can rest easy, without fear of competition, or brand encroachment.)  I believe that all of the necessary institutions exist already here in the Bay State.  What is needed is a consortium that can knit them together for this purpose, some funding, and some candidates.

It’s a pipeline, or perhaps a career ladder that the consortium needs to build – not an edifice.  Although I love the splendid domes of MIT, we can simply admire them, and hope that eventually some of the people who work and study under those domes will become part of the consortium.

Here’s what I think we need:

  1.  Allies from workforce development, job readiness, and college readiness programs.  These are the folks who will raise awareness of the coming need for technology professionals who can provide data analysis and other data services to nonprofits, and who will guide candidates to the next rung of the career ladder. Examples include Economic Mobility Pathways (EMPath), Shriver Job Corps, International Institute of New England, JFYnet, Jobs For the Future, National Fund for Workforce Solutions, SkillWorks, Boston PIC, YearUp, and the Massachusetts Department of Labor and Workforce Development.
  2. Allies who provide relevant training and education to candidates who aspire to careers in data services and data analytics for nonprofits.  Examples include Bunker Hill Community College and Tech Foundry.
  3. An organization that is able to place, mentor, and coach candidates in entry-level data services positions at local nonprofit organizations.  That’s TNB Labs.  These entry-level workers will be known as “data support analysts,” or DSAs.
  4. Allies from local nonprofit organizations who are willing to host (and pay for the services of) a DSA for a period of one or two years.  TNB Labs will be the official employer of these workers, providing them with a salary, benefits, a modest sum for further professional development, coaching, and mentoring.  The DSAs will work on site at the nonprofit organizations, dedicating themselves to tasks assigned by the nonprofits.  Examples of distinguished nonprofits that could play this role are Community Servings, Saint Francis House, Community Catalyst, Health Care For All, Massachusetts Association of Community Development Corporations, Perkins School, City Year, Jewish Family & Children’s Services, Cambridge Health Alliance, Family Service of Greater Boston, Massachusetts League of Community Health Centers, Greater Boston Food Bank, the Boston Foundation, AIDS Action Committee, and the Home for Little Wanderers.  (They haven’t actually signed on for this, but they would be great members of this consortium.)

At the conclusion of the one or two year placement at a nonprofit organization, I think that any of the following outcomes would count as a win:

  • The host nonprofit hires the DSA (with a raise and a promotion) as a long-term regular employee.
  • The DSA lands a job providing data services at another nonprofit organization.
  • The DSA lands a job in a different field or sector that is congruent with his/her/their career aspirations.
  • The DSA is able to apply to a four-year degree program, transferring the course credits, on-the-job experience, two-year degrees, or certifications that he/she/they have earned.

The last scenario – advancing in higher education – brings us to the final category of allies needed for our consortium.  The best example of this kind of ally is UMass-Boston, which offers programs in several related areas.

In addition, our consortium has a great ally in an individual UMass-Boston faculty member, Michael Johnson, whose research focus is decision science for community-based organizations.  He has expressed a generous desire to be a mentor to community college students in this career ladder, and to encourage those who are qualified to apply to be Ph.D. students in this field.

And that’s just UMass-Boston!  I’m not as familiar with the offerings of other distinguished colleges and universities in the area, but the Boston University program in nonprofit management and leadership, the Nonprofit Leadership program at Wheelock, and the Institute for Nonprofit Practice at Tufts come to mind immediately as potential allies.

So here we are. The need is there for data service providers who can serve the missions, programs, and operations of nonprofit organizations.  If we can weave all these allies together into a network, we can meet these needs.

All that we require is:

  • Allies who are ready, willing, and able to pitch in.
  • Public awareness that this career ladder is available.
  • Funding to assist candidates who cannot afford tuition for college coursework and other forms of training.
  • Funding to assist nonprofits that would like to host a data support analyst from this program, but lack the (modest) funding to support one.

Let’s do this!

Every nonprofit needs a theory of change for its technology . . . and for its evaluation process


I’ve spent a lot of my professional life (thus far) thinking about the missions of nonprofit organizations, and about information/communication technologies for nonprofits.

In the past few years, it’s become fashionable to talk about the importance of a “theory of change” for nonprofits.  This is merely a way of underlining the importance of making an explicit statement about the causal relationship between what a nonprofit organization does and the impact that it has promised to deliver.  I applaud this!  It’s crucial to say, “if we take all of the following resources, and do all of the following actions, then we will get all of the following results.”  An organization that lacks the capacity to marshal those resources and take those actions needs to reconsider, because it is on track to fail. If its capacity is not aligned with its commitment, it should acquire the resources or change its commitment to results.  Of course, in some cases, it will merely need to revise its theory of change.  In any case, it will have to work backward from its mission, and understand how each component contributes to achieving it.

This kind of thinking has led to a lot of conversations (and a lot of anxiety) in the nonprofit sector about performance measurement, outcomes management, evaluation, and impact assessment.

I’d love to have some of this conversation focus on the information/communication technologies that nonprofit organizations are using.  In other words, it’s time to be explicit about a theory of change that explains in detail how every component of the technology an organization uses contributes (directly or indirectly) to its ability to deliver a specific kind of social, cultural, or environmental impact.

Likewise, I’d love to have the conversation address the ways in which the efforts of a nonprofit organization’s performance measurement, outcomes management, evaluation, or impact assessment team contributes (directly or indirectly) to its ability to deliver the kind of impact that it promised its stakeholders.

 

 

Kathryn Engelhardt-Cronk outlines a necessary factor in successfully implementing a nonprofit technology project

Kathryn Engelhardt-Cronk, CEO and founder of Community TechKnowledge

 

I’ve learned a lot from my buddy Tom McLaughlin, but the moment I first became a devoted fangirl of his was when I heard that he had quipped, “organizational culture eats strategy for breakfast.”

It’s true.  It’s so true in nonprofit technology that it hurts every time I think about it. However, I was immediately and immensely grateful to Tom for articulating so succinctly and eloquently what had been merely tacit knowledge for me.

One of the biggest problems in any nonprofit technology implementation is the difficulty in reconciling it with the organization’s culture.  It’s not just that individuals within it may not want to learn new things or do things differently – it’s that every organization is a delicate ecosystem of incentives, disincentives, alliances, and hostilities. A change in information and communication technology systems can easily upset the organization’s equilibrium.  Just the same, new implementations may become necessary, and at that point the challenge is not to arrive at an abstract understanding of group dynamics, but to gain the good will and participation of all the stakeholders by demonstrating that the potential benefits of the change are far greater than the threats to the status quo.

In other words, getting buy-in becomes a crucial goal; it’s a necessary (but not sufficient) condition for the success of the implementation.  This is a cost-benefit analysis that takes place at a very emotional level at a nonprofit organization.

That’s where Kathryn Engelhardt-Cronk can help.  She’s just published a white paper on “Getting 100% Buy-In for Your Next Nonprofit Technology Adoption.”  You can download it for free from the Community TechKnowledge web site.  I strongly recommend it!

(And now for a full disclosure of financial relationship:  I’ve served as a paid consultant to Kathryn’s organization, Community TechKnowledge, for some time.  However, she did not ask me to endorse this white paper, and she certainly is not paying me to recommend it.)

 

 

 

 

Peter Miller on what nonprofit organizations need to know about community technology centers


At the Tech Networks of Boston Roundtable on November 7th, Peter Miller will be the featured guest, and the topic will be what nonprofit organizations need to know about community technology centers.  Third Sector New England will be playing cohost, and the session will be held at the Boston NonProfit Center.

If you’re wondering why you, as a nonprofit professional, need to know at all about community technology centers (CTCs), here are a few points to consider:

1) If your organization offers advocacy or direct services to the community, then it’s important to know that CTCs are powerful resources for your constituents.  They provide access to online tools and information, skills training, and a focal point for community members who are interested in bridging the digital divide.

2) Some CTCs are based in community access television organizations, and are key places for community members to learn about the overlap between online communications and other forms of media.

3) Some CTCs are based in libraries, and it’s clear that professional librarians can be powerful allies for nonprofits and their constituents.  Librarians understand free access to information and knowledge for the public good; they can bring their skills to bear in bridging not only the digital divide but also the knowledge divide.

4) Some CTCs are based in housing developed by community development corporations.  They can be crucial in assisting residents with online education, with finding and applying for jobs, and with online organizing for local needs.

5) CTCs can help your nonprofit with its internal professional development needs, if they are offering courses or certification in software or hardware skills that are crucial to your operations.

In general, the worldwide community technology movement is a force for social good, and you should at least be briefed on what it’s all about!

“Power corrupts. PowerPoint corrupts absolutely.” (Redux)

A slide from the PowerPoint version of Abraham Lincoln's Gettysburg Address

This is another article, salvaged with help from the Wayback Machine, from my now-defunct first blog. I think that the points I made then are as valid in 2013 as they were in 2005.  What do you think?

Mon 14 Feb 2005 06:41 AM EST

Most days of the week, I tend to think of information technology as morally neutral.  It isn’t inherently good or evil; the applications of a technology are good or evil.

But I do find some forms of information technology irritating or counter-productive – especially as they are often used in the nonprofit/philanthropic sector.

PowerPoint happens to be in that category.

I came to this conclusion through my favorite research method.  (I.e., staring off into space for about half an hour.)  During this strenuous research, I asked myself two questions:

  1. When have I enjoyed giving a presentation based on PowerPoint?
  2. When have I enjoyed or learned a lot from someone else’s PowerPoint presentation?

Although I try to avoid giving PowerPoint presentations these days, I had no trouble answering Question #1 on the basis of previous experience.  I almost always liked it.  It’s great to have my talking points, my graphic displays, and my annotations packaged in one document.  Assuming that there’s no equipment failure on the part of the projector, the screen, the computer, or the storage medium that holds the PowerPoint document, it’s very convenient – although it’s not very safe to assume that none of these will fail.

In short, PowerPoint is designed to make presenters reasonably happy.  (Except in cases of equipment failure.)

The answer to Question #2 is a little more difficult.  I can be an exacting judge of how information is presented, and of whether the presenter is sensitive to the convenience and learning styles of the audience.

Perhaps the presenter put too many points on each slide, or too few.  Perhaps I was bored, looking at round after round of bulleted text, when graphic displays would have told the story more effectively.  Perhaps I wondered why the presenter expected me to copy the main points down in my notebook, when he/she knew all along what they were going to be.  Perhaps the repeated words, “next slide, please,” spoken by the presenter to his/her assistant seemed to take on more weight through sheer repetition than the content under consideration.  Perhaps there were too many slides for the time allotted, or they were not arranged in a sequence that made it easy to re-visit specific points during the question and answer period.

In short, PowerPoint as a medium of presentation does not tend to win friends and influence people.  (Of course, the best designed PowerPoint presentations succeed spectacularly, but the likelihood of creating or viewing one is fairly low.)

However, all is not lost.  If you have struggled to attain some high-level PowerPoint skills, and your role in a nonprofit/philanthropic organization calls for you to make frequent presentations, I can offer you advice in the form of the following three-point plan:

  1. Knock yourself out.  Create the PowerPoint presentation of your dreams.  Include all the bells and whistles.  Be sure to write up full annotations for each slide.
  2. Print out this incredible PowerPoint presentation in “handout” format, and give a paper copy to each person at the beginning of your talk.  As a bonus, you can also tell your audience where they can view or download it on the web.
  3. Cull out all but five or six slides for each hour of your planned presentation.  These should only include graphics that must be seen to be believed, and text that is more effective when read silently than when spoken.  This severely pared-down version will be the PowerPoint document that you will actually use during your presentation.

I realize that this will probably not be welcome advice, but the interests of your organization will undoubtedly dictate that you deploy a PowerPoint strategy that will, at the very least, not alienate the audiences at your presentations.

If you have any lingering hopes that PowerPoint is the best tool for engaging stakeholders in your mission, my final advice to you is to review the PowerPoint version of Abraham Lincoln’s Gettysburg Address.

 




A note on the title of this article:

I wish I had invented this aphorism, but I didn’t.

In 1887, John Dalberg-Acton (1st Baron Acton) wrote, “Power tends to corrupt and absolute power corrupts absolutely.”

In 2003, Edward Tufte wrote, “Power corrupts.  PowerPoint corrupts absolutely.”

The telephone analogy (Redux)

This is another article salvaged from my now-defunct first blog.  (Many thanks are due to the Wayback Machine, which enabled me to retrieve a copy.) It was first published in 2005, well before smart phones were prevalent among non-geeks. 

An inherent flaw in the analogy at the time was that telephones, once installed, caused much less trouble to nonprofit executives than the typical IT infrastructure. 

As we flash forward to 2013, with a culture in which smart phones are not only prevalent but offer functions previously associated with information systems, it’s interesting to reflect on how well the telephone analogy has stood the test of time. 

So many of us, inside and outside of the nonprofit sector, devote an inordinate amount of time looking forward to upgrading our phones, and that’s a shocking change. 

One thing that hasn’t changed enough is the failure of many nonprofit organizations to think through the budgetary and operational implications of acquiring new technologies.

The telephone analogy

Fri 11 Feb 2005 10:52 AM EST

Are you a nonprofit/philanthropic professional who is having trouble making the case that your organization needs to bring its technology infrastructure into the 21st century – or at least into the 1990s?

Please allow me to acquaint you with the telephone analogy.*

First of all, can you think of a functioning nonprofit/philanthropic organization whose board, chief executive officer, or chief financial officer would ever say…

  • “… we don’t need to find or raise the money to install telephones or pay our monthly phone bill.”
  • “…we don’t need to dedicate staff time to answering the phone or returning phone calls.”
  • “…we don’t need to orient staff and volunteers about personal use of the phones, about what statements they can make on our behalf to members of the media and the public who call our organization, or about how queries that come into the main switchboard are routed to various departments, or about how swiftly high-priority phone calls are returned.”
  • “…we don’t need to make sure that when donors, stakeholders, constituents, and clients call our main number they can navigate the automated menu of choices.”
  • “…we don’t need to show staff members how to put callers on hold, transfer calls, or check voice-mail now that we have an entirely new phone system.”

Apparently, most mission-based organizations have resigned themselves to the fact that telephone systems are an operational necessity.  Somehow, the leadership finds the money, time, and motivation to meet the organization’s telephony needs.

If only we could get the same kind of tacit assumption in place for every mission-based organization’s technology infrastructure!

I propose two possible strategies, either of which would of course need to be tailored to your organization’s culture:

  • Encourage your board, CEO, and CFO to see your technology infrastructure as analogous to your telephone system.
  • Persuade them that your telephone system is an information and communication technology system – and then encourage them to regard other components of the system (such as computers, networks, and web sites) with the same kind of tacit support and acceptance.

I look forward to hearing from anyone who has tried this strategy – or developed one that is even more persuasive.



* N.B.:  I need to warn you in advance that all analogies eventually break down, but this is a pretty useful one, especially since a telephone these days really is the front end of an information and communications technology system.

How grantmakers and how nonprofits use information about outcomes

State of Evaluation 2012: Evaluation Practice and Capacity in the Nonprofit Sector, a report by the Innovation Network

I’m sitting here, reflecting on the Innovation Network’s “State of Evaluation 2012” report.

I encourage you to download it and read it for yourself; start with pages 14 and 15. These two pages display infographics that summarize what funders (also known as “grantors,” or if you’re Bob Penna, as “investors”) and nonprofits (also known as “grantees”) are reporting about why they do evaluation and what they are evaluating.

Regardless of whether you call it evaluation, impact assessment, outcomes management, performance measurement, or research – it’s really, really difficult to ascertain whether a mission-based organization is delivering the specific, positive, and sustainable change that it promises to its stakeholders. Many organizations do an excellent job at tracking outputs, but falter when it comes to managing outcomes. That’s in part because proving a causal relationship between what the nonprofit does and the specific goals that it promises to achieve is very costly in time, effort, expertise, and money.

But assuming that a mission-based organization is doing a rigorous evaluation, we still need to ask:  what is done with the findings, once the analysis is complete?

What the aforementioned infographics from the “State of Evaluation 2012” tell me is that both grantors and grantees typically say that the most important thing they can do with their outcome findings is to report them to their respective boards of directors.  Considering the depth of the moral and legal responsibility that is vested in board members, this is a pretty decent priority.  But it’s unclear to me what those boards actually do with the information.  Do they use it to guide the policies and operations of their respective organizations?  If so, does anything change for the better?

If you have an answer, based on firsthand experience, to the question of how boards use this information, then please feel free to post a comment here.

What I learned about outcomes management from Robert Penna


Yesterday, along with a number of colleagues and friends from Community TechKnowledge, I had the privilege of attending a training by Robert Penna, the author of The Nonprofit Outcomes Toolbox.

As you probably  know, I’ve been on a tear about outcomes measurement for a few months now; the current level of obsession began when I attended NTEN’s Nonprofit Data Summit in Boston in September.  I thought that the presenters at the NTEN summit did a great job addressing some difficult issues – such as how to overcome internal resistance to collecting organizational data, and how to reframe Excel spreadsheets moldering away in file servers as archival data.  However, I worked myself into a tizzy, worrying about the lack, in that day’s presentations, of any reference to the history and literature of quantitative analysis and social research.  I could not see how nonprofit professionals would be able to find the time and resources to get up to speed on those topics.

Thanks to Bob Penna, I feel a lot better now.  In yesterday’s training, he showed me and the CTK team just how far you can go by stripping away what is superfluous and focusing on what it really takes to use the best outcomes tools for the job.  Never mind about graduate-level statistics!  Managing outcomes may be very, very difficult because it requires major changes in organizational culture – let’s not kid ourselves about that.  However, it’s not going to take years out of each nonprofit professional’s life to develop the skill set.

Here are some other insights and highlights of the day:

  • Mia Erichson, CTK’s brilliant new marketing manager, pointed out that at least one of the outcomes tools that Bob showed us could be easily mapped to a “marketing funnel” model.  This opens possibilities for aligning a nonprofit’s programmatic strategy with its marcomm strategy.
  • The way to go is prospective outcomes tracking, with real time updates allowing for course correction.  Purely retrospective outcomes assessment is not going to cut it.
  • There are several very strong outcomes tools, but they should be treated the way we would treat a software suite that comprises both gem and junk applications.  We need to use the best of breed to meet each need.
  • If we want to live in Bob Penna’s universe, we’re going to have to change our vocabulary.  It’s not “outcomes measurement” – it’s “outcomes management.”  The terms “funder” and “grantmaker” are out – “investor” is in.

Even with these lessons learned, it’s not a Utopia out there waiting for nonprofits that become adept at outcomes management.  Not only is it difficult to shift to an organizational culture that fosters it, but we have to face continuing questions about how exactly the funders (oops! I should have said “investors”) use the data that they demand from nonprofit organizations.  (“Data” is of course a broad term, with connotations well beyond outcomes management.  But it’s somewhat fashionable these days for them to take an interest in data about programmatic outcomes.)

We should be asking ourselves, first of all, whether the sole or primary motivation for outcomes management in nonprofits should be the demands of investors.  Secondly, we should be revisiting the Gilbert Center’s report, Does Evidence Matter to Grantmakers? Data, Logic, and the Lack Thereof in the Largest U.S. Foundations.  We need to know this.  Thirdly, we should be going in search of other motivations for introducing outcomes management.  I realize that most nonprofits go forward with it when they reach a point of pain (translation: they won’t get money if they don’t report outcomes).

During a break in Bob’s training, some of my CTK colleagues were discussing the likelihood that many nonprofit executives simply hate the concept of outcomes management.  Who wants to spend resources on it, if it subtracts from resources available for programmatic activities?  Who wants to risk finding out (or to risk having external stakeholders find out) that an organization’s programs are approximately as effective as doing nothing at all?  Very few – thus the need to find new motivations, such as the power to review progress and make corrections as we go.  I jokingly told my CTK colleagues, “the truth will make you free, but first it will make you miserable.”  Perhaps that’s more than a joke.

The state of nonprofit data: Uh-oh!

The Nonprofit Technology Network (NTEN) has released a report prepared by Idealware on the current state of nonprofit data.  Highly recommended!

Some of the news it contains is scary.  In our sector, we currently aren’t very successful at collecting and analyzing the most crucial data.  For example, only 50% of the respondents reported that their nonprofit organizations are tracking data about the outcomes of clients/constituents.

According to the survey respondents, there are daunting barriers to tracking and using data:

  • issues related to collecting and working with data (27 percent of responses)
  • lack of expertise (24 percent of responses)
  • issues of time and prioritization (22 percent of responses)
  • challenges with technology (23 percent of responses)

Page 13 of the report features a chart that I find especially worrisome.  It displays types of data that nonprofit organizations should or could be using, with large chunks falling into three chilling categories:

  • we don’t know how to track this
  • we don’t have the technology to effectively track this
  • we don’t have the time/money to effectively track this

In the case of data about outcomes, 17% lack the knowledge, 20% lack the technology, and 22% lack the time or money (or both) to track it.

Are you scared yet?  I confess that I am.  Perhaps half of all nonprofits surveyed don’t know – and don’t have the resources to find out – whether there is any causal relationship between their activities and the social good that they are in business to achieve.

And that’s just programmatic outcomes.  The news is also not very encouraging when it comes to capturing data about organizational budgets, constituent participation in programs, and external trends in the issue areas being addressed by nonprofit organizations.

So much for the bad news.  The good news is that now we know.

It takes some courage to acknowledge that the baseline is so low.  I applaud Idealware and NTEN for creating and publishing this report.  Now that we know, we can address the problem and take effective action.