Practical Advice About Evaluations for Program Managers
Disclaimer: The summaries and interpretations provided on this page are unofficial and have not been reviewed, endorsed, or approved by the Canada School of Public Service (CSPS).
Summary
- Program evaluation in the Government of Canada affects most aspects of government operations, from policy implementation to service delivery, and many public servants are directly or indirectly involved in evaluating whether programs work as intended.
- Evaluations should be viewed as learning tools that promote accountability and improve programs serving Canadians, rather than merely compliance exercises or burdens that are disconnected from day-to-day program operations.
- Strong evaluation functions help program managers understand how to reach clients more effectively, run programs more efficiently, and provide evidence to support requests for additional resources.
- When evaluators understand program managers’ needs, deadlines, and objectives, they can adapt evaluation methodologies accordingly, and the Treasury Board of Canada Secretariat encourages evaluators to consider a wide range of approaches and methods.
- Evaluation methodologies include both quantitative and qualitative approaches, and program managers should be aware of the full toolbox of methods available rather than assuming evaluations are only quantitative exercises.
- From a strategic policy perspective, evaluation work can identify systemic issues across multiple programs by synthesizing findings from numerous evaluations conducted over several years, including thousands of key informant interviews.
- At Indigenous Services Canada, synthesis of evaluations over seven to eight years revealed common trends such as high staff turnover and lack of flexibility in terms and conditions, which inform medium-term planning and human resources strategy.
- Evaluations of newer, innovation-type initiatives provide opportunities to step back, ask fundamental questions, and get objective advice about what is working well and what needs improvement, particularly when initial assumptions about how change would occur turn out to be incorrect.
- Effective evaluation requires collaboration and partnership between program evaluators, program managers, and clients or partners who deliver programs and services, while maintaining evaluator objectivity and neutrality.
- The common goal shared by evaluators and program managers is improving services to Canadians and ensuring financial resources are used to best effect.
- Seeking community participation and co-developing evaluation terms of reference helps ensure the right questions are asked and respects what program clients have experienced.
- Program managers' relationships with partners and key stakeholders make their close involvement essential; engaging the groups a program targets is particularly important in the Indigenous Services context but relevant for all Government of Canada programs.
- The people who benefit from programs, whether Indigenous communities, new immigrants, Employment Insurance recipients, seniors, or other groups, are the experts with the strongest voices and should be centered in evaluation work.
- Good data collection remains challenging but essential: when programs come up for renewal, decision-makers want concrete statistics about how many clients were served, what difference the program made, and whether it was the best approach.
- Designing data requirements early in program development and integrating them into the program structure, including updating performance information profiles, makes a significant difference in evaluation effectiveness when evaluation time arrives.
- Evaluations serve the important but often unrecognized function of documenting a program’s complete history and purpose, creating a comprehensive written record that may not exist elsewhere across Government of Canada programs.
- Quantitative data is critical for telling a program’s performance story, and working closely with Chief Data Officers and results and delivery teams at the front end ensures appropriate performance indicators are established in Treasury Board submissions.
- Providing input on data requirements at the program design stage gives evaluators a realistic prospect of having solid data five years later, when the program evaluation is conducted.
- New technologies offer significant potential to improve the ability to link data sources and analyze unstructured data, building on foundational pieces like the census of population and surveys of Indigenous peoples for longitudinal studies.
- Qualitative data gathered through individual and small-group conversations with partners and key informants generates very rich information that is as important as quantitative metrics.
- Evaluation methods centered in Indigenous worldviews recognize and appreciate the cultural significance of programs, including culturally important elements like clean water, requiring different ways of gathering data and accepting different forms of evidence.
- Artistic expressions and cultural artifacts created by program clients represent valid and meaningful sources of evidence that should be incorporated into evaluation work alongside traditional data collection methods.
- Evaluators must be mindful of their own biases about what constitutes good data and recognize that credible evidence comes in many forms beyond conventional quantitative measures.
- The Mi’kmaq concept of Etuaptmumk or two-eyed seeing, developed by a Mi’kmaw elder, represents an approach that combines Indigenous knowledge systems with Western evaluation methods to see the best of both perspectives.
Actionable Advice
- View evaluations as learning and improvement opportunities rather than merely compliance exercises or burdens disconnected from your day-to-day work.
- Explain to evaluators what you want to use the evaluation findings for, including how you plan to reach clients, improve efficiency, or support resource requests.
- Collaborate closely with evaluators to help them understand your program needs, deadlines, and objectives so they can adapt their methodology accordingly.
- Explore the full range of evaluation methodologies available, including both quantitative and qualitative approaches, rather than assuming evaluations are purely quantitative.
- Use evaluation findings to identify opportunities to improve how you reach program clients and run programs more efficiently.
- Gather evidence from evaluations to support requests for additional resources when needed.
- Participate in evaluation synthesis efforts to identify systemic issues and trends across multiple programs that can inform strategic planning.
- Use evaluation insights to inform medium-term planning and human resources strategy, such as addressing high staff turnover or inflexible terms and conditions.
- For newer or innovation-type programs, approach evaluations as opportunities to test initial assumptions about how change would occur and make necessary adjustments.
- Build partnerships with evaluators while ensuring they maintain objectivity and neutrality throughout the evaluation process.
- Work with evaluators to co-develop evaluation terms of reference to ensure the right questions are asked and stakeholder experiences are respected.
- Leverage your relationships with partners and key stakeholders to facilitate evaluator access and engagement throughout the evaluation process.
- Actively seek community participation in evaluations, particularly from those who benefit from your programs, recognizing them as experts with the strongest voices.
- Design your data requirements early in program development rather than waiting until evaluation time.
- Integrate data collection requirements into your program structure from the beginning and keep performance information profiles updated.
- Work closely with Chief Data Officers and results and delivery teams when developing Treasury Board submissions to ensure appropriate performance indicators are established.
- Prepare concrete statistics about client numbers served and program impacts to support decision-making and program renewals.
- Explore new technologies to improve your ability to link data sources and analyze unstructured data for evaluation purposes.
- Use foundational data sources like census data and relevant surveys to conduct longitudinal studies of program impact over time.
- Incorporate qualitative data collection methods, including individual and small-group conversations with partners and key informants.
- Consider evaluation methods centered in Indigenous worldviews when working with Indigenous communities or culturally significant programs.
- Be open to different ways of gathering data and different forms of evidence, including artistic expressions and cultural artifacts from program clients.
- Examine your own biases about what constitutes good data and remain open to diverse forms of credible evidence.
- Apply concepts like two-eyed seeing to combine Indigenous knowledge systems with Western evaluation methods for more comprehensive assessments.
- Recognize and use evaluation documentation as a valuable record of your program's history, purpose, and evolution.