Analytics

Questions you should never ask if you’re in the business of making an impact

This post originally appeared on Medium.

In my last job, I was asked questions like these more times than I can count:

“Our nonprofit is collecting all this data about our client outcomes. How do these relate to our demographics?”
“What is our funding mix? Is it good?”
“How do we know which donors are likely to give again next year?”

I’ve worked in and around many nonprofits and social enterprises, enough to know that questions that can be answered with data come up all the time. Much of my work over the last several years has been helping social sector leaders answer questions like these. This work can be tricky. As David Henderson pointed out over five years ago, “Data helps answer questions, it does not determine what questions should be answered.”

Asking good questions of the data at hand is an important skill for teams and organizations making the world a better place. But all this questioning and analysis has a shadow side: asking the wrong questions wastes time and distracts your team from your most important goals. Here are five common traps social sector leaders fall into when asking questions, and how to avoid them.

1. Questions without an objective: Those of us who work in the nonprofit or social enterprise space often give up financial gain to pursue work we find interesting, challenging, and meaningful. But when we’re thinking about investing resources to answer questions about the work we do, “interesting” is not good enough. Is it clear to your staff why your question is important? For example:

OBSCURE OBJECTIVE: “I just saw one of my friends share a really interesting article from the Huffington Post about the link between family stressors and early childhood health outcomes. Can you tell me how these are related in our clients?”

CLEAR OBJECTIVE: “As we talked about last year, one of our top learning goals is to understand and promote positive health outcomes for our clients. Can you help us analyze whether our children’s health indicators have any relationship to their parents’ reported stress levels?”

2. Questions that have low strategic value: A second criterion for considering questions to answer through data analysis is, “Does the answer help us improve our ability to fulfill our mission?” For example, questions that test your assumptions or your logic of how change happens are often strategically important, even if they make you or your staff uncomfortable. So too are questions that have clear implications for your budget or resource allocation.

LOW VALUE: “How many clients expressed satisfaction in our financial counseling program last year? Is there a better number that demonstrates how well our program is working?”

HIGH VALUE: “Our program assumes that because clients set unique financial goals and have unique circumstances, the amount of support each will receive must vary widely. However, are there any cutoffs where additional support from us is unlikely to help a client increase his or her financial well-being?”

3. Questions where you don’t have expectations: A third important aspect of a good analysis question is: Is the answer testable? If you don’t have any internal expectations or external points of comparison, you can’t interpret whether the answer you get is good or bad.

NOT TESTABLE: “What was the effect of our program to increase parental resilience among the at-risk families we serve?”

TESTABLE: “Other programs like ours using the same assessment tools find they have a small positive effect on parental resilience, as measured by an effect size (Cohen’s d) of 0.3 or greater. What’s our effect size?”
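
To make the testable version concrete, here is a minimal sketch of how an analyst might compute Cohen’s d from two groups of assessment scores. The scores, group sizes, and names below are invented for illustration; the sketch assumes you can export a numeric resilience score for a program group and a comparison group.

    import statistics

    def cohens_d(group_a, group_b):
        """Cohen's d: difference in group means divided by the pooled standard deviation."""
        n_a, n_b = len(group_a), len(group_b)
        mean_diff = statistics.mean(group_a) - statistics.mean(group_b)
        # Pool each group's sample variance, weighted by its degrees of freedom.
        pooled_var = ((n_a - 1) * statistics.variance(group_a) +
                      (n_b - 1) * statistics.variance(group_b)) / (n_a + n_b - 2)
        return mean_diff / pooled_var ** 0.5

    # Hypothetical parental-resilience scores for program and comparison families.
    program_scores = [14, 18, 16, 21, 17, 19, 15, 20]
    comparison_scores = [13, 15, 14, 17, 12, 16, 14, 15]

    d = cohens_d(program_scores, comparison_scores)
    print(f"Effect size (Cohen's d): {d:.2f}")  # Compare against the 0.3 benchmark.

With a benchmark like 0.3 in hand, the number the analysis returns is immediately interpretable: at or above it, you are in line with peer programs; well below it, you have something to investigate.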

4. Questions that have been answered before: In the world of nonprofits and social enterprises, there is a commonly held belief that each organization is unique beyond the point of comparison. While this is true to a certain extent, chances are the issue you’re addressing has been around for a while. Please, before you start mining your own data, spend a few hours on Google Scholar or at your local college or university checking out the research literature on the issue you’re addressing. You may not find the answer you need, but you may learn how to ask a better question.

ALREADY ANSWERED: “Is the Head Start approach to early childhood education really effective?”

NOT YET ANSWERED: “As a startup early childhood education provider, how does the social and emotional development of our children compare to that seen in similar Head Start programs?”

5. Questions where the opportunities for action are unclear: Perhaps the most important aspect of a good analysis question is whether the results are actionable. Will the results potentially cause you to change your strategy, execution, or culture? If not, the question may not be worth exploring.

NOT ACTIONABLE: “Can you give me a summary of client outcomes for each of my direct reports? I’m not sure we have enough data to talk to them about the results, but I think looking at this could be interesting.”

ACTIONABLE: “Can you tell me which workshops last year were most engaging for our clients based on their feedback? We only have budget this year for three workshops and I need to decide which ones we should focus on.”

One final thought — as a changemaker in the social sector, timing is everything. Sometimes it might seem that you can’t make a decision without more analysis; other times you don’t have time for any analysis; often you’ll have these thoughts at the same time! One strategy is to set aside time for answering critical questions about your finances, programs, and operations every year. As questions come up, save them for your Annual Analysis project. Prepare your staff to be engaged, and if you need outside experts or volunteers, get them lined up. Then focus your attention on your most important, timely, and strategic questions.

Do you have other tips to help social sector leaders ask smarter questions? Send them to me and I’ll share your thoughts in an upcoming post.

Why nonprofit leaders need habits for measuring performance

This post originally appeared on the American Evaluation Association’s blog.

Over the last two years I worked as the Data and Evaluation Manager at the San Francisco Child Abuse Prevention Center (SFCAPC), a mid-size nonprofit focused on ending child abuse in San Francisco. This was a fantastic learning experience — I worked with 50+ staff members ranging from policy advocates to social workers, helping them use our data to serve clients better.

About two months into this job, I read a book that changed how I approach this work entirely:

I believe great people to be those who know how they got to where they are, and what they need to do to go where they’re going. They go to work on their lives, not just in their lives… They compare what they’ve done with what they intend to do. And when there’s disparity between the two, they don’t wait very long to make up the difference. — Michael E. Gerber, The E-Myth Revisited

Reading Gerber’s book convinced me of the importance of systems and habits in helping people succeed at their jobs. This particular nonprofit had a database that held client information (Efforts to Outcomes) but few habits for improving that system and using that data to serve our clients better. So, I set out to create habits for how we do our “data” work.

I read through books, blogs, and websites, and talked to mentors and friends to get a sense of what other organizations do. I found many resources that shaped my approach to creating these new “data habits”. Some of the best are:

  • The books, whitepapers, and newsletter from the Leap of Reason Institute — The free e-books on this site by Mario Morino and David Hunter, and the institute’s recent whitepaper, The Performance Imperative, are the resources I recommend most often to others.
  • Sheri Chaney Jones’ book Impact & Excellence — Jones’ book contains many useful strategies from her consulting practice for helping clients generate insights from their data when resources are scarce.
  • The Root Cause Blog — Root Cause is another consulting firm that supports nonprofits in creating effective data and evaluation habits; they share some of their tricks on their blog.
  • The Data Analysts for Social Good professional association — Members get access to 25+ webinars on topics ranging from justifying the return on investment of data analysis to introductory analytical techniques in Excel, R, and other platforms.

After reading these, I was convinced that developing regular habits would be critical to my organization’s ability to use its data to improve the lives of clients. But exactly what these habits should be remained unclear.

So with the support of our programs staff, I began experimenting. Over the last 18 months, we arrived at six specific habits that helped me provide consistent value to our staff.

  1. Keep a reporting calendar: Organizations are often required to submit detailed program participant and activity data for certain government contracts. I created a comprehensive calendar of when to submit these uploads and who needed to review the data before it was uploaded.
  2. Define data integrity controls: Data integrity controls minimize the risk that information in a database is incorrect. For example, we scrubbed our database of test data monthly, and each quarter we audited a sample of new families to verify the accuracy of data entry. We summarized these controls in a spreadsheet outlining each procedure, its information source, performer, reviewer, and results. (A sketch of what such controls can look like in code appears after this list.)
  3. Review dashboards: The first week of each month, I sent a performance dashboard to each of the program managers. Managers discussed these metrics with their teams, then shared explanations for variances, along with any action items they planned to take, at the managers’ meeting the following week.
  4. Schedule time for troubleshooting and report development: To build staff buy-in I needed to be responsive to database troubleshooting and report development needs. I tracked time for these tasks and blocked out time for them weekly. In an average week, I spent 2–10 hours troubleshooting and training, and 5–15 hours developing self-service reports staff could use to access program data themselves.
  5. Automate annual development data pulls: A significant “data” responsibility was pulling data for our development team, including demographics and unduplicated client counts. Working with our development team, I developed a self-service report designed to answer 80% of their “stats” asks, saving everyone time.
  6. Have a data analysis process: Stakeholders across our organization came to me with more good questions to explore in our data than I had time to answer. I created a master tracker of these questions and set aside several weeks each year to explore the most critical ones. This “Annual Data Analysis” process set expectations and focused our limited analysis time on what mattered most.
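
To make habit 2 concrete, here is a minimal sketch of what a few data integrity controls might look like in code. This is an illustration, not the procedure we actually ran: it assumes the pandas library, and the file name and column names are hypothetical.

    import pandas as pd

    # Hypothetical export of client records from the case-management database.
    clients = pd.read_csv("clients_export.csv", parse_dates=["intake_date"])

    # Control 1: flag leftover test records for the monthly scrub.
    test_records = clients[clients["last_name"].str.contains("test", case=False, na=False)]

    # Control 2: flag records missing fields required for funder reporting.
    required = ["intake_date", "zip_code", "program"]
    incomplete = clients[clients[required].isna().any(axis=1)]

    # Control 3: draw a random sample of newly enrolled families for the quarterly
    # data-entry audit, where a staff member checks each record against its source file.
    quarter_start = pd.Timestamp.today() - pd.DateOffset(months=3)
    new_families = clients[clients["intake_date"] >= quarter_start]
    audit_sample = new_families.sample(n=min(20, len(new_families)), random_state=0)

    print(f"{len(test_records)} possible test records, {len(incomplete)} incomplete records")
    audit_sample.to_csv("quarterly_audit_sample.csv", index=False)

Each check like these maps onto a row in the control spreadsheet: the procedure, its information source, who performs it, who reviews it, and the results.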

These habits helped us save time, set expectations, and create lasting systems. They’re far from perfect, but they were a huge step forward for us. Now that I’m no longer working for this organization, I realize these habits not only helped organize the work but also enabled continuity between me and my successor.

If you’ve read this and are working in the social sector, I’d love to know — What are the data & evaluation habits you’ve found valuable at your organization?