Nonprofit Impact: Data Is More Than a Four-Letter Word


“In God we Trust…all others, bring data.” Funny, sure, but daunting if you need to show a nonprofit’s accomplishments.

“The sad fact for me is that some people are afraid of it,” said Edie Steele of Nonprofit Navigator, an expert in data collection, measurement and evaluation. Fortunately for the fearful, Steele’s clear explanations and easy-to-follow planning model give nonprofits a reassuring path to communicating their impact. She outlined her approach in a special program at Foundation Center in Washington, DC, “Selling and Telling: Impact and Your Organization,” co-sponsored by the Grant Professionals Association National Capital Area Chapter.

Here’s some of Steele’s plain-English advice:

What’s your purpose?

You’re trying to make a change. Put simply, your purpose is the results you’re after. Try to write a purpose statement in one sentence. Make it succinct and clear. “It’s that old idea—you don’t want to make the reader work too hard,” Steele said.

Your mission and message

The bottom line of a business is to make money; the bottom line of a nonprofit is to achieve its mission. Nonprofits “sell” a change, and your mission describes the “what and why” of that change. Think about how a person’s life will be different because of your organization, ask yourself “change compared to what?” and make sure that comparison is appropriate. Focus on your values, not money. “At many organizations, the tendency is to have mission creep,” Steele said. “The problem with that is then you lose the public’s understanding of what you do best in the world.”

Measuring change

Your “story of change” describes your impact. Your “theory of change” is the conviction you have about why you’ll succeed. Steele pared change down to three types: knowledge, behavior and attitude. Anyone who knows a toddler or teenager knows attitudes fluctuate, she said, so track change in knowledge and behavior. “You have a place you want to take this, and that would be your target,” Steele said. “It expresses what you want to do, specifically, measurably.” How do you know you can get there? “Think of an indicator as a signpost along the way that helps you collect information to make sure you’re headed in the right direction,” she said.

Don’t:

  • Use ambiguous or hard-to-measure concepts like “self-esteem.” Better, she said, to tie your evaluation to competence, which can be demonstrated.
  • Rely on satisfaction surveys. “Did anybody have a purpose statement saying we are out to satisfy people?”
  • Pre-test people for areas in which they may have little knowledge. “That’s not nice,” Steele said.

Do:

  • Use rubrics. They measure specific behaviors that can be observed over time, and they can also help guide learners.
  • Set baselines. They help establish reasonable targets. If you don’t have a baseline, reach out to other organizations that have already asked the same questions.
  • Plan carefully. If your inputs don’t work out as expected, you’ll be able to explain why you didn’t reach your target.
  • Learn from your data. Correct your course and start again.

“You don’t have to be perfect,” Steele said. “But you do have to have a consistent process that people have faith in.”

Edie Steele invites your questions. Contact her at Edie@NavigatorNY.com. Her planning model and glossary are available online.

Barbara Cornell

BARBARA CORNELL is a social sector outreach librarian at Foundation Center in Washington, DC. She has volunteered at nonprofits in the U.S., Portugal, Italy and Cambodia. Barbara worked for Congressional Quarterly before becoming a staff reporter for daily newspapers. She has been nominated for the Pulitzer Prize by the Kansas City (Mo.) Star and El Nuevo Dia in San Juan, Puerto Rico. She has freelanced for Reuters, Time Magazine and other publications.