The Evidence Working Group members have compiled general principles to consider when thinking through how to collect and use information about your activities:

  • Use indicators already developed within existing frameworks, particularly the UN Sustainable Development Goals (SDG) indicators. In addition to the indicators and information your group uniquely needs to collect, consider integrating existing indicators into your measurement framework.
  • Be critical in your analysis of your activities – document challenges alongside successes. Challenges are an expected and understood part of any project, and reporting both raises the credibility of your analysis.
  • Aim for rigor in understanding your activities’ impact, while recognizing that causally linking activities and outputs to outcomes is difficult for everyone; it may not be necessary to assess your program using the most rigorous methods. For example, the Randomized Controlled Trial is considered the gold standard in impact evaluation, but sound qualitative interviewing using accepted techniques is also useful (and accepted by policymakers, donors, academics, and program managers) for understanding your activities’ impact.
  • Match the right methods for data collection and analysis to the purpose of the enquiry. Consider that greater strength of evidence produces more certainty in the results, but also comes with increased cost and complexity of collection.
  • CART Principles: When designing a data collection system, keep the CART principles in mind (adapted from Poverty Action Lab):
    • Credible – Collect high quality data and analyze them accurately.
    • Actionable – Commit to act on the data you collect.
    • Responsible – Ensure the benefits of data collection outweigh the costs.
    • Transportable – Collect data that generate knowledge for other programs and organizations.
  • Prepare to write up the results and analysis for stakeholders and/or for publication in the peer-reviewed literature. Design program evaluations with this goal in mind to enhance quality and impact. A standard format for research reporting includes an Introduction (with literature review), Materials and Methods, Results, Discussion, Conclusion, and References. Further guidance on preparing to publish can be found here.