The curriculum is organized into eight technical areas, each with supporting modules that cover core knowledge and skills in evaluation design, data, analysis, interpretation, and effective communication. Each module includes specific content (presentations, activities, handouts, and a learning guide) and recommendations on how to introduce and reinforce the skill through formal workshops, small-group mentorship, or other training approaches. The modularized curriculum framework is designed to be responsive to user needs and is aligned with NEP’s cycle-based approach. Modules can be combined based on the evaluation design and team or individual capacity.
General Evaluation Principles and Developing Research Questions
Why is evaluation important? How do we develop impact models that guide meaningful evaluation questions for program and policy improvement?
1.1 Why Evaluate?
1.2 Impact model
1.2.1 Impact models
1.2.2 Common evaluation framework
1.2.3 Contextual factors
1.2.4 Outputs and outcomes
1.3 Writing evaluation questions
1.4 Developing an evaluation plan
1.5 Developing an analysis plan
Core Data Concepts
What types of data are available in countries for use in evaluation? How do equity and contextual factors influence health outcomes?
2.1 Introduction to survey data
2.1.1 Introduction to household survey data
2.1.2 Facility surveys
2.2 Introduction to health information systems and routine data
2.3 Introduction to key indicators
2.3.1 Quality of care indicators
2.3.2 Nutritional status indicators
2.3.3 Mortality indicators
2.3.4 Coverage indicators
2.4 Introduction to equity
Data Mapping
How do we identify appropriate routine, survey, and other data sources to answer our evaluation questions?
3.1 Data mapping process
Data Quality Assessment
How do we assess the quality of survey and routine data that we use for evaluations?
4.1 Five dimensions of data quality
4.2 WHO DQA Tool
4.3 Using survey data to assess routine data quality
4.4 Assessing what data to use based on quality
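One core technique in module 4.3 is comparing coverage estimated from routine data against an independent household survey estimate for the same indicator and period. The sketch below is a minimal, hypothetical illustration of that comparison; the district names, figures, and the 10% agreement threshold are invented for demonstration and are not taken from any real dataset or tool.

```python
# Illustrative sketch: comparing routine (e.g. HMIS) coverage against a
# household survey estimate to flag districts whose routine data may
# over- or under-report. All names and figures are hypothetical.

def coverage_ratio(routine_coverage: float, survey_coverage: float) -> float:
    """Ratio of routine-data coverage to survey coverage (1.0 = agreement)."""
    return routine_coverage / survey_coverage

# (routine, survey) coverage proportions for the same indicator and year
districts = {
    "District A": (0.92, 0.88),
    "District B": (1.15, 0.81),  # routine > 100% suggests a denominator problem
    "District C": (0.64, 0.70),
}

for name, (routine, survey) in districts.items():
    ratio = coverage_ratio(routine, survey)
    flag = "check" if abs(ratio - 1.0) > 0.10 else "ok"
    print(f"{name}: ratio = {ratio:.2f} ({flag})")
```

A ratio well above 1 often points to over-reporting or an underestimated target population in the routine system; a ratio well below 1 may indicate under-reporting.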
Data Management
How do we prepare for analysis? How do we appropriately capture and organize data so it is available for ongoing use?
5.1 Good data management principles
5.2 NEP Data System
5.3 Handling changes in administrative boundaries
5.4 Calculating indicators
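Module 5.4 covers turning raw counts into indicators, which in the simplest case means dividing a numerator (reported service events) by a denominator (the estimated target population). The sketch below illustrates this with made-up immunization figures; the numbers are hypothetical examples only.

```python
# Illustrative sketch of calculating a coverage indicator from routine data:
# coverage = (service events reported) / (estimated target population).
# The figures below are invented for demonstration.

def coverage(numerator: int, denominator: int) -> float:
    """Coverage as a percentage; denominator is the estimated target population."""
    if denominator <= 0:
        raise ValueError("target population must be positive")
    return 100.0 * numerator / denominator

# e.g. third doses of DTP administered vs. the estimated number of
# surviving infants in a district for the year
dtp3_doses = 4_120
surviving_infants = 4_800
print(f"DTP3 coverage: {coverage(dtp3_doses, surviving_infants):.1f}%")
# Coverage above 100% usually signals a numerator or denominator data problem.
```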
Data Analysis
What types of statistical analyses can we perform to answer our evaluation questions and build a story about what is going well and areas for improvement?
6.1 Point estimates
6.3 Changes over time
6.4 Equity analysis
6.4.1 Basic measures of inequalities
6.4.2 Complex measures of inequalities
6.4.3 Inequalities in non-ordinal groups
6.4.4 Trends in inequalities
6.5 Lives Saved Tool
6.6 Bayesian estimation
6.6.1 District assignment
6.6.2 Empirical Bayes
6.7 Regression Analysis
6.7.1 Introduction to regression
6.7.2 Linear regression need not be linear
6.7.3 Interpreting simple linear regressions
6.7.4 Causality v. correlation
6.7.6 Confounders and effect modifiers
6.7.7 Interpreting multiple linear regressions
6.7.8 Logistic regression
6.7.9 Regression Considerations
6.7.10 Guidelines for an Effective Data Analysis Report
6.8 Qualitative analysis
6.9 Stats Report
New Data Collection
When is it necessary to collect new data? How can we design a study to complement existing data and contribute to answering our evaluation questions?
7.1 When to collect new data
7.2 Principles of new study design
Interpretation and Reporting
How can we effectively communicate our findings across stakeholder audiences to promote policy change?
8.1 Communicating for advocacy
8.1.1 Know your audience
8.1.2 Clear and simple
8.1.4 Creating an advocacy strategy
8.2 Data visualization
8.2.1 Data visualization introduction
8.2.2 Displaying quantitative data
8.2.3 Displaying qualitative data
8.2.4 Orientation to graphs in Excel and Stats Report
8.3 Data interpretation
8.3.1 Key messages and findings
8.4 Oral presentation
8.4.1 PowerPoint tips
8.5 Written communication
8.5.1 Knowledge brief
8.5.2 Policy brief
8.5.3 Report writing
8.5.4 Scientific writing