
13 Steps to Develop a Monitoring and Evaluation Plan

Developing a monitoring and evaluation (M&E) plan can be a complex process. The 13 steps below will help you get started: define your project, identify key performance indicators, set targets, determine data collection methods, establish a timeline, analyze and interpret the data, and use the findings to inform future decisions. A solid M&E plan helps ensure project success and improves resource allocation. Let’s get started!

Table of contents

  • Step 1: Identify Your Evaluation Audience or Stakeholders
  • Step 2: Identify Program Goals and Objectives
  • Step 3: Define the Evaluation Questions
  • Step 4: Develop Evaluation Objectives
  • Step 5: Identify the Monitoring Questions
  • Step 6: Define Indicators to Include in the Evaluation Plan
  • Step 7: Define Data Collection Methods and Timeline (Create a Methodology)
  • Step 8: Identify M&E Roles and Responsibilities
  • Step 9: Identify Who Is Responsible for Data Collection and Timelines
  • Step 10: Create an Analysis Plan and Reporting Templates
  • Step 11: Review the M&E Plan
  • Step 12: Implement and Monitor the Evaluation Plan
  • Step 13: Use Results to Make Informed Decisions (Plan for Dissemination and Donor Reporting)

Step 1: Identify Your Evaluation Audience or Stakeholders

The audience or stakeholders for an evaluation can vary depending on the program or project being evaluated. However, some common stakeholders that may be involved in an evaluation include:

  • Program managers and staff: Program managers and staff are responsible for implementing the program and are often the primary users of evaluation findings. They may use evaluation findings to make program adjustments or improvements, and to report on program progress to other stakeholders.
  • Funders: Funders are often the primary source of funding for a program and may require evaluations to ensure that the program is meeting its intended goals and outcomes. Evaluation findings can be used to inform funding decisions and may be included in funding reports or proposals.
  • Beneficiaries: Beneficiaries are the individuals or communities that are directly impacted by the program. Their perspectives and feedback are important in evaluating the effectiveness of the program and can help to identify areas for improvement.
  • Other stakeholders: Other stakeholders may include partners, collaborators, policymakers, and the broader community. They may have an interest in the program and its outcomes and may use evaluation findings to inform their own work or decision-making.

It is important to consider the needs and perspectives of all stakeholders when conducting an evaluation. Evaluation findings should be communicated in a way that is clear and accessible to all stakeholders and should be used to inform decision-making and improve program effectiveness.

Step 2: Identify Program Goals and Objectives

Program goals and objectives are critical components of program design and evaluation. Program goals are broad statements that describe the overarching purpose or intended outcome of the program. Objectives, on the other hand, are more specific and measurable statements that describe the steps that will be taken to achieve the program goals.

Here are some examples of program goals and objectives:

Program Goal: To improve access to clean water in rural communities.

Program Objectives:

  • To install 50 new water filtration systems in rural communities by the end of the year.
  • To provide training on water sanitation and hygiene practices to 500 community members by the end of the year.

Program Goal: To reduce food insecurity in the local community.

  • To establish a community garden program that will provide fresh produce to 100 families by the end of the year.
  • To distribute food baskets to 200 families in need each month.

Program Goal: To improve academic performance of at-risk students.

  • To provide after-school tutoring services to 50 at-risk students each week.
  • To increase the graduation rate of at-risk students by 10% within the next 2 years.

Program Goal: To increase access to healthcare services in underserved communities.

  • To establish 3 new health clinics in underserved communities within the next 3 years.
  • To provide health education and screening services to 500 community members within the next year.

Program goals and objectives should be specific, measurable, achievable, relevant, and time-bound (SMART). These criteria help to ensure that the program is focused, achievable, and measurable, and that progress towards the goals and objectives can be tracked and evaluated.

Step 3: Define the Evaluation Questions

Key questions to ask in determining whether an M&E plan is working include the following:

  • Are the M&E activities progressing as planned?
  • Are M&E questions being answered sufficiently? Are other data needed to answer these questions? How can such data be obtained?
  • Should the M&E questions be re-framed? Have other M&E questions arisen that should be incorporated into the plan?
  • Are there any methodological or evaluation design issues that need to be addressed? Are there any practical or political factors that need to be considered?
  • Are any changes in the M&E plan needed at this time? How will these changes be made? Who will implement them?
  • Are appropriate staff and funding still available to complete the evaluation plan?
  • How are findings from M&E activities being used and disseminated? Should anything be done to enhance their application to programs?


Step 4: Develop Evaluation Objectives

Developing evaluation objectives is a critical step in creating a comprehensive monitoring and evaluation plan. Evaluation objectives specify what the evaluation will assess and define the criteria for success.

Evaluation objectives should be aligned with the project’s overall goals and objectives, and they should be specific, measurable, achievable, relevant, and time-bound.

Here are some examples of evaluation objectives, organized by the SMART criteria:

  • Specific: To assess the effectiveness of a new training program by measuring the increase in job performance and knowledge among participants.
  • Measurable: To determine the impact of a community development project by measuring the increase in access to essential services such as healthcare, education, and clean water among the target population.
  • Achievable: To increase website traffic by 20% within six months by optimizing website content and implementing a targeted digital marketing campaign.
  • Relevant: To improve employee retention by 15% within one year by implementing new professional development opportunities and increasing employee engagement.
  • Time-bound: To reduce customer complaints by 25% within the next quarter by improving customer service response times and implementing a customer feedback system.

These are just a few examples of evaluation objectives, and they can be adapted to fit the specific needs and goals of each project or program.

By developing SMART evaluation objectives, you can ensure that the evaluation process is focused, achievable, and aligned with the project’s overall goals and objectives. This helps ensure that the monitoring and evaluation plan is effective in assessing progress and providing valuable insights for project improvement and decision-making.
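As an illustration, a SMART evaluation objective can be captured in a small data structure with one field per criterion. This is a hypothetical sketch for illustration only — the class, field, and function names are invented, not part of any standard M&E toolkit:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class EvaluationObjective:
    """One SMART evaluation objective (illustrative structure)."""
    description: str  # Specific: what will change, for whom
    metric: str       # Measurable: the quantity to track
    target: float     # Achievable: the level to reach
    relevance: str    # Relevant: link to the program goal
    deadline: date    # Time-bound: when it must be reached

def is_smart(obj: EvaluationObjective) -> bool:
    """Rough completeness check: every SMART element is filled in."""
    return all([obj.description, obj.metric, obj.relevance,
                obj.target > 0, obj.deadline is not None])

# Example based on the website-traffic objective above (values illustrative)
traffic = EvaluationObjective(
    description="Increase website traffic through content optimization",
    metric="percent increase in monthly visits",
    target=20.0,
    relevance="Supports the program's outreach goal",
    deadline=date(2025, 6, 30),
)
print(is_smart(traffic))  # → True
```

Writing objectives this way makes gaps obvious: an objective without a metric or deadline fails the completeness check before the evaluation begins.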

Step 5: Identify the Monitoring Questions

Monitoring questions are an important part of any project, as they help to ensure that the project is moving forward in the right direction. In the case of a project to improve customer service, the monitoring questions might include:

  • How satisfied are customers with the current level of service?
  • What areas need improvement?
  • What resources are available to support customer service improvement?
  • What processes are in place to ensure customer service is consistently meeting customer needs?
  • What metrics are being used to measure customer service performance?

These questions will help to identify areas for improvement, and provide guidance on how to best implement changes. By regularly monitoring these questions, the project team can ensure that customer service is always improving and meeting customer needs.

Step 6: Define Indicators to Include in the Evaluation Plan

Indicators are specific, measurable variables or metrics that can be used to assess progress towards achieving the objectives of a project or program. In an evaluation plan, indicators are essential components that help determine whether a project is achieving its intended outcomes and objectives. To define indicators to include in an evaluation plan, follow these steps:

  • Identify the objectives of the project: Review the project’s goals and objectives to determine what specific outcomes the project is intended to achieve.
  • Determine the data needed to measure progress: Identify the data needed to assess progress towards achieving each objective.
  • Develop measurable indicators: Develop specific, measurable indicators that will allow you to track progress towards achieving each objective.
  • Ensure that the indicators are relevant: Ensure that the indicators selected are relevant to the objectives of the project and that they provide meaningful information that can be used to inform decision-making.
  • Consider data availability and collection methods: Ensure that data is available for the selected indicators and that collection methods are practical and cost-effective.
  • Establish a baseline: Establish a baseline measurement for each indicator to determine the starting point for tracking progress.

Here are some examples of indicators that could be included in an evaluation plan:

Program Objective: To install 50 new water filtration systems in rural communities by the end of the year.

Indicators:

  • Number of water filtration systems installed
  • Number of community members with access to clean water
  • Water quality test results

Program Objective: To establish a community garden program that will provide fresh produce to 100 families by the end of the year.

  • Number of families participating in the community garden program
  • Number of pounds of fresh produce harvested
  • Number of families reporting improved food security

Program Objective: To provide after-school tutoring services to 50 at-risk students each week.

  • Number of at-risk students attending tutoring sessions
  • Average increase in grades of at-risk students
  • Percentage of at-risk students passing core subjects

Program Objective: To establish 3 new health clinics in underserved communities within the next 3 years.

  • Number of new health clinics established
  • Number of community members served by the new health clinics
  • Number of community members reporting improved access to healthcare services

Overall, it is important to choose indicators that are meaningful, measurable, and aligned with program goals and objectives. This will help to ensure that the evaluation is able to accurately assess program effectiveness and identify areas for improvement.

By defining indicators to include in an evaluation plan, you can ensure that the evaluation process is focused and effective in providing valuable insights for project improvement and decision-making.

Step 7: Define Data Collection Methods and Timeline (Create a Methodology)

Creating a methodology is a crucial step in developing a monitoring and evaluation plan. In this step, you define the data collection methods and timeline for the evaluation. To do this, you need to identify the data that needs to be collected to assess progress towards achieving each objective. Then, you can select data collection methods that are appropriate for the data being collected and the resources available. Common methods include surveys, interviews, focus groups, observation, and document review.

Once you have identified the data and the appropriate data collection methods, you can establish a timeline for data collection and evaluation that aligns with the project’s overall timeline and key milestones. This timeline should be realistic and consider the availability of resources and the time required to collect and analyze the data.

It is also important to assign responsibility for each aspect of the methodology, including data collection, analysis, and reporting. You should ensure that the individuals responsible have the necessary skills, resources, and support to carry out their assigned tasks effectively.

To ensure data quality, you should develop strategies to ensure that data is accurate, reliable, and valid, and that any biases or errors are minimized. Establishing a baseline measurement for each indicator is crucial to determine the starting point for tracking progress.

By creating a methodology that defines data collection methods and timeline, you can ensure that the monitoring and evaluation plan is effective in assessing progress and providing valuable insights for project improvement and decision-making.
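One lightweight way to record the methodology is a schedule that maps each indicator to its collection method, frequency, and responsible party. The layout below is a hypothetical sketch; the indicator names, methods, and roles are examples only:

```python
# Hypothetical data-collection schedule; all entries are illustrative.
collection_schedule = {
    "Water filtration systems installed": {
        "method": "site observation and installation records",
        "frequency": "monthly",
        "responsible": "field coordinator",
    },
    "Community members trained in hygiene practices": {
        "method": "training attendance sheets",
        "frequency": "after each session",
        "responsible": "training officer",
    },
}

# Print the schedule as a quick reference for the M&E team
for indicator, plan in collection_schedule.items():
    print(f"{indicator}: {plan['method']} "
          f"({plan['frequency']}, owner: {plan['responsible']})")
```

Even this simple table forces the questions the step asks: is a method assigned to every indicator, is the frequency realistic, and does a named person own each task?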

Step 8: Identify M&E Roles and Responsibilities

This step involves defining the roles and responsibilities of individuals or teams involved in the monitoring and evaluation process to ensure that everyone knows what is expected of them.

To begin, it is important to identify the key stakeholders involved in the project and determine their respective roles and responsibilities in the M&E process. This includes identifying the project manager or coordinator, data collectors, data analysts, and decision-makers who will use the M&E findings to inform project decisions.

Once the stakeholders have been identified, it is necessary to define their specific roles and responsibilities. For example, the project manager or coordinator may be responsible for overall project management and ensuring that the M&E plan is implemented as intended. Data collectors may be responsible for collecting and managing data, while data analysts may be responsible for analyzing and interpreting data. Decision-makers may be responsible for using the M&E findings to inform project decisions.

It is also important to establish communication channels and protocols for sharing information and M&E findings among stakeholders. This includes defining the frequency and format of progress reports, as well as procedures for addressing any issues or challenges that arise during the M&E process.

By identifying M&E roles and responsibilities, you can ensure that everyone involved in the monitoring and evaluation process understands their roles and responsibilities, which helps to ensure the effective implementation of the M&E plan and the project’s success.

Step 9: Identify Who Is Responsible for Data Collection and Timelines

This step involves determining the individuals or teams responsible for collecting and managing the data required to assess progress towards achieving project objectives, as well as defining the timelines for data collection.

To identify who is responsible for data collection, it is important to review the project goals and objectives and determine what data needs to be collected to assess progress. The individuals or teams responsible for data collection may include project staff, external consultants, or other stakeholders with relevant expertise.

Once the individuals or teams responsible for data collection have been identified, it is necessary to establish a timeline for data collection that aligns with the project’s overall timeline and key milestones. This timeline should be realistic and consider the availability of resources and the time required to collect and analyze the data.

In addition to establishing a timeline, it is also important to define the specific data collection methods that will be used and ensure that those responsible for data collection have the necessary resources and support to carry out their assigned tasks effectively. This may include providing training on data collection methods, ensuring access to necessary equipment or software, and establishing protocols for data management and quality control.

By identifying who is responsible for data collection and timelines, you can ensure that data is collected in a timely and efficient manner, which is crucial for the effective implementation of the M&E plan and the success of the project.

Step 10: Create an Analysis Plan and Reporting Templates

This step involves determining how data will be analyzed and reported, including the selection of appropriate methods and the development of reporting templates to ensure that data is presented in a clear and concise manner.

To create an analysis plan, it is important to review the project objectives and identify the key performance indicators that will be used to assess progress. This will help determine what data needs to be analyzed and what statistical methods will be used to analyze the data. The analysis plan should include a detailed description of the statistical methods that will be used, including any assumptions or limitations associated with these methods.

Once the analysis plan has been developed, it is necessary to create reporting templates to ensure that data is presented in a clear and concise manner. Reporting templates should include the key performance indicators and the specific data that will be reported, as well as any graphs, charts, or tables that will be used to present the data. Reporting templates should be designed to provide a clear picture of progress towards achieving project objectives, and should be easy to read and understand.

It is also important to establish protocols for data sharing and reporting to ensure that data is shared in a timely and effective manner. This may include establishing a timeline for reporting, identifying the stakeholders who will receive the reports, and determining the format and level of detail required for each report.

By creating an analysis plan and reporting templates, you can ensure that data is analyzed and reported in a systematic and standardized manner, which is crucial for the effective implementation of the M&E plan and the success of the project.
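A reporting template can be as simple as a tabulation of each indicator against its baseline and target. The sketch below assumes indicators are tracked as (baseline, target, current) tuples; the function name, layout, and example values are all illustrative:

```python
indicators = {
    # name: (baseline, target, current) — example values only
    "New health clinics established": (0, 3, 1),
    "Community members screened":     (0, 500, 320),
}

def progress_report(data: dict) -> str:
    """Render a fixed-width progress table for a donor or staff report."""
    lines = [f"{'Indicator':<35}{'Baseline':>9}{'Target':>8}"
             f"{'Current':>9}{'Progress':>10}"]
    for name, (baseline, target, current) in data.items():
        pct = (current - baseline) / (target - baseline)
        lines.append(f"{name:<35}{baseline:>9}{target:>8}"
                     f"{current:>9}{pct:>9.0%}")
    return "\n".join(lines)

print(progress_report(indicators))
```

Because the template is generated from the same data structure used for monitoring, every report presents the same indicators in the same format, which is exactly the standardization this step calls for.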

Step 11: Review the M&E Plan

This step involves a comprehensive review of the M&E plan to ensure that it is aligned with the project’s overall goals and objectives, and that it is practical and feasible to implement.

To review the M&E plan, it is necessary to first review the project’s goals and objectives and ensure that the M&E plan is aligned with these. This includes reviewing the performance indicators and ensuring that they are relevant, measurable, and appropriate for tracking progress towards achieving project objectives.

Next, it is necessary to review the data collection methods and analysis plan to ensure that they are practical and feasible to implement. This includes reviewing the data collection timeline, the individuals or teams responsible for data collection, and the resources required to collect and manage the data.

In addition to reviewing the M&E plan itself, it is also important to review the communication and reporting protocols to ensure that they are effective and efficient. This may include reviewing the reporting templates, the stakeholders who will receive the reports, and the frequency and format of progress reports.

Overall, the goal of reviewing the M&E plan is to ensure that it is practical, feasible, and effective in assessing progress towards achieving project objectives. This step is critical for the success of the project, as it ensures that the M&E plan is aligned with the project’s overall goals and objectives and that it is designed to provide valuable insights for project improvement and decision-making.

Step 12: Implement and Monitor the Evaluation Plan

This step involves carrying out the data collection, analysis, and reporting activities defined in the M&E plan, as well as monitoring progress towards achieving project objectives.

To implement the evaluation plan, it is necessary to follow the protocols and procedures defined in the M&E plan. This may include assigning responsibilities to individuals or teams involved in data collection, analysis, and reporting, and providing training and support as needed.

Monitoring progress towards achieving project objectives involves tracking the performance indicators defined in the M&E plan and comparing them to the baseline measurements to determine progress. This may involve conducting regular progress reports and reviewing data for any trends or issues that may require further attention.

Throughout the implementation and monitoring of the evaluation plan, it is important to maintain open lines of communication among stakeholders and to address any issues or challenges that arise in a timely manner. This may involve revising the M&E plan as needed to ensure that it remains aligned with project objectives and is effective in assessing progress towards achieving them.

Overall, implementing and monitoring the evaluation plan is critical for the success of the project, as it provides valuable insights for project improvement and decision-making. By following the protocols and procedures defined in the M&E plan and monitoring progress towards achieving project objectives, you can ensure that the project is on track to achieve its intended outcomes and objectives.

Step 13: Use Results to Make Informed Decisions (Plan for Dissemination and Donor Reporting)

Using results to make informed decisions is a critical step in the monitoring and evaluation process. This step involves analyzing the data collected through the evaluation plan and using it to make informed decisions about the project’s future direction. It also involves disseminating the findings to relevant stakeholders, including donors, to demonstrate the project’s impact and to inform future funding decisions.

To plan for dissemination and donor reporting, it is important to first analyze the data collected through the evaluation plan and identify the key findings and insights. This includes identifying any successes or areas for improvement and determining what actions can be taken to address these.

Once the key findings have been identified, it is necessary to develop a plan for disseminating the findings to relevant stakeholders. This may include developing reports, presentations, or other materials that summarize the key findings and insights in a clear and concise manner. It may also involve identifying the stakeholders who will receive the findings and determining the best way to reach them.

In addition to disseminating the findings, it is important to report the results to donors to demonstrate the project’s impact and to inform future funding decisions. This may involve developing donor reports that summarize the key findings and insights and provide an overview of the project’s progress towards achieving its objectives.

Overall, using results to make informed decisions is critical for the success of the project, as it ensures that the project is on track to achieve its intended outcomes and objectives. By planning for dissemination and donor reporting, you can demonstrate the project’s impact and ensure that it receives continued support from donors and other stakeholders.

To Conclude

The 13 steps outlined in this article provide a comprehensive framework for developing a robust monitoring and evaluation plan that can effectively assess progress towards achieving project objectives and provide valuable insights for project improvement and decision-making.

Key steps in developing a monitoring and evaluation plan include defining the purpose and scope of the evaluation, identifying stakeholders and their roles and responsibilities, developing SMART evaluation objectives, identifying indicators to measure progress, defining data collection methods and timelines, creating an analysis plan and reporting templates, reviewing the M&E plan, implementing and monitoring the evaluation plan, using results to make informed decisions, and planning for dissemination and donor reporting.

By following these steps, organizations can ensure that their monitoring and evaluation plans are practical, feasible, and effective in assessing progress towards achieving project objectives. Additionally, the insights gained from monitoring and evaluation can help to inform decision-making and improve project outcomes, leading to greater success and impact. Ultimately, investing in monitoring and evaluation is crucial for any organization that wants to achieve its goals and have a meaningful impact in the world.

In conclusion, it’s important to remember that no plan will be effective without the right monitoring and evaluation strategies in place. Without an effective plan, you won’t be able to track the progress of your project or measure its success. Take the time to plan out your monitoring and evaluation plan, as this will help ensure that your project is successful and that it meets the desired outcomes.



Monitoring and Evaluation: Tools, Methods and Approaches

The purpose of this M&E Overview is to strengthen awareness and interest in M&E, and to clarify what it entails. You will find an overview of a sample of M&E tools, methods, and approaches outlined here, including their purpose and use; advantages and disadvantages; costs, skills, and time required; and key references. Those illustrated here include several data collection methods, analytical frameworks, and types of evaluation and review. The M&E Overview discusses:

  • Performance indicators
  • The logical framework approach
  • Theory-based evaluation
  • Formal surveys
  • Rapid appraisal methods
  • Participatory methods
  • Public expenditure tracking surveys
  • Cost-benefit and cost-effectiveness analysis
  • Impact evaluation

This list is not comprehensive, nor is it intended to be. Some of these tools and approaches are complementary; some are substitutes. Some have broad applicability, while others are quite narrow in their uses. The choice of which is appropriate for any given context will depend on a range of considerations. These include the uses for which M&E is intended, the main stakeholders who have an interest in the M&E findings, the speed with which the information is needed, and the cost.



The Compass for SBC

Helping you Implement Effective Social and Behavior Change Projects

How-To-Guide

How to Develop a Monitoring and Evaluation Plan


Introduction


What is a Monitoring and Evaluation Plan?

A monitoring and evaluation (M&E) plan is a document that helps to track and assess the results of the interventions throughout the life of a program. It is a living document that should be referred to and updated on a regular basis. While the specifics of each program’s M&E plan will look different, they should all follow the same basic structure and include the same key elements.

An M&E plan will include some documents that may have been created during the program planning process, and some that will need to be created new. For example, elements such as the logic model/logical framework, theory of change, and monitoring indicators may have already been developed with input from key stakeholders and/or the program donor. The M&E plan takes those documents and develops a further plan for their implementation.

Why develop a Monitoring and Evaluation Plan?

It is important to develop an M&E plan before beginning any monitoring activities so that there is a clear plan for what questions about the program need to be answered. It will help program staff decide how they are going to collect data to track indicators, how monitoring data will be analyzed, and how the results of data collection will be disseminated both to the donor and internally among staff members for program improvement. Remember, M&E data alone is not useful until someone puts it to use! An M&E plan will help make sure data is being used efficiently to make programs as effective as possible and to be able to report on results at the end of the program.

Who should develop a Monitoring and Evaluation Plan?

An M&E plan should be developed by the research team or staff with research experience, with inputs from program staff involved in designing and implementing the program.

When should a Monitoring and Evaluation Plan be developed?

An M&E plan should be developed at the beginning of the program when the interventions are being designed. This will ensure there is a system in place to monitor the program and evaluate success.

Who is this guide for?

This guide is designed primarily for program managers or personnel who are not trained researchers themselves but who need to understand the rationale and process of conducting research. It can help managers advocate for research and ensure that research staff have adequate resources to conduct it, so that the program is evidence based and results can be tracked over time and measured at the end of the program.

Learning Objectives

After completing the steps for developing an M&E plan, the team will:

  • Identify the elements and steps of an M&E plan
  • Explain how to create an M&E plan for an upcoming program
  • Describe how to advocate for the creation and use of M&E plans for a program/organization

Estimated Time Needed

Developing an M&E plan can take up to a week, depending on the size of the team available to develop the plan, and whether a logic model and theory of change have already been designed.

Prerequisites

How to Develop a Logic Model

Step 1: Identify Program Goals and Objectives

The first step to creating an M&E plan is to identify the program goals and objectives. If the program already has a logic model or theory of change, then the program goals are most likely already defined. If not, the M&E plan is a good place to define them.

Defining program goals starts with answering three questions:

  • What problem is the program trying to solve?
  • What steps are being taken to solve that problem?
  • How will program staff know when the program has been successful in solving the problem?

Answering these questions will help identify what the program is expected to do, and how staff will know whether or not it worked. For example, if the program is starting a condom distribution program for adolescents, the answers might look like this:

  • The problem: high rates of unintended pregnancy and sexually transmitted infection (STI) transmission among youth ages 15-19
  • The steps being taken: promote and distribute free condoms in the community at youth-friendly locations
  • How success will be known: lowered rates of unintended pregnancy and STI transmission among youth ages 15-19, and a higher percentage of condom use among sexually active youth

From these answers, it can be seen that the overall program goal is to reduce the rates of unintended pregnancy and STI transmission in the community.

It is also necessary to develop intermediate outputs and objectives for the program to help track successful steps on the way to the overall program goal. More information about identifying these objectives can be found in the logic model guide.

Step 2: Define Indicators

Once the program’s goals and objectives are defined, it is time to define indicators for tracking progress towards achieving those goals. Program indicators should be a mix of those that measure process, or what is being done in the program, and those that measure outcomes.

Process indicators track the progress of the program. They help to answer the question, “Are activities being implemented as planned?” Some examples of process indicators are:

  • Number of trainings held with health providers
  • Number of outreach activities conducted at youth-friendly locations
  • Number of condoms distributed at youth-friendly locations
  • Percent of youth reached with condom use messages through the media

Outcome indicators track how successful program activities have been at achieving program objectives. They help to answer the question, “Have program activities made a difference?” Some examples of outcome indicators are:

  • Percent of youth using condoms during first intercourse
  • Number and percent of trained health providers offering family planning services to youth
  • Number and percent of new STI infections among youth.

These are just a few examples of indicators that can be created to track a program’s success. More information about creating indicators can be found in the How to Develop Indicators guide.

Step 3: Define Data Collection Methods and Timeline

After creating monitoring indicators, it is time to decide on methods for gathering data and how often various data will be recorded to track indicators. This should be a conversation between program staff, stakeholders, and donors. These decisions will have important implications for what data collection methods will be used and how the results will be reported.

The source of monitoring data depends largely on what each indicator is trying to measure. The program will likely need multiple data sources to answer all of the programming questions. Below is a table that represents some examples of what data can be collected and how.

Data to be collected | Possible data source(s)
Implementation process and progress | Program-specific M&E tools
Service statistics | Facility logs, referral cards
Reach and success of the program intervention within audience subgroups or communities | Small surveys with primary audience(s), such as provider interviews or client exit interviews
The reach of media interventions involved in the program | Media ratings data, broadcaster logs, Google Analytics, omnibus surveys
Reach and success of the program intervention at the population level | Nationally representative surveys, omnibus surveys, DHS data
Qualitative data about the outcomes of the intervention | Focus groups, in-depth interviews, listener/viewer group discussions, individual media diaries, case studies

Once it is determined how data will be collected, it is also necessary to decide how often it will be collected. This will be affected by donor requirements, available resources, and the timeline of the intervention. Some data will be continuously gathered by the program (such as the number of trainings), but these will be recorded every six months or once a year, depending on the M&E plan. Other types of data depend on outside sources, such as clinic and DHS data.

After all of these questions have been answered, a table like the one below can be made to include in the M&E plan. This table can be printed out and all staff working on the program can refer to it so that everyone knows what data is needed and when.

Indicator | Data source | Frequency
Number of trainings held with health providers | Training attendance sheets | Every 6 months
Number of outreach activities conducted at youth-friendly locations | Activity sheet | Every 6 months
Number of condoms distributed at youth-friendly locations | Condom distribution sheet | Every 6 months
Percent of youth receiving condom use messages through the media | Population-based surveys | Annually
Percent of adolescents reporting condom use during first intercourse | DHS or other population-based survey | Annually
Number and percent of trained health providers offering family planning services to adolescents | Facility logs | Every 6 months
Number and percent of new STI infections among adolescents | DHS or other population-based survey | Annually
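A schedule like this can also be kept as structured data so that staff can query it, for example to see which indicators are due in a given reporting period. A minimal Python sketch using three of the indicators above (the field names and the `due_indicators` helper are illustrative assumptions, not part of the guide):

```python
# Illustrative sketch: part of the indicator schedule above as structured
# data, so it can be filtered by data source or collection frequency.
# The field names ("indicator", "source", "frequency") are assumptions.
indicator_schedule = [
    {"indicator": "Number of trainings held with health providers",
     "source": "Training attendance sheets", "frequency": "Every 6 months"},
    {"indicator": "Number of condoms distributed at youth-friendly locations",
     "source": "Condom distribution sheet", "frequency": "Every 6 months"},
    {"indicator": "Percent of youth receiving condom use messages through the media",
     "source": "Population-based surveys", "frequency": "Annually"},
]

def due_indicators(schedule, frequency):
    """Return the indicators collected at the given frequency."""
    return [row["indicator"] for row in schedule if row["frequency"] == frequency]

print(due_indicators(indicator_schedule, "Annually"))
```

Keeping the schedule in one shared place, whether a printed table or a small data file, is what matters; the format is secondary.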

Step 4: Identify M&E Roles and Responsibilities

The next element of the M&E plan is a section on roles and responsibilities. It is important to decide from the early planning stages who is responsible for collecting the data for each indicator. This will probably be a mix of M&E staff, research staff, and program staff. Everyone will need to work together to get data collected accurately and in a timely fashion.

Data management roles should be decided with input from all team members so everyone is on the same page and knows which indicators they are assigned. This way when it is time for reporting there are no surprises.

An easy way to put this into the M&E plan is to expand the indicators table with additional columns for who is responsible for each indicator, as shown below.

Indicator | Data source | Frequency | Person responsible
Number of trainings held with health providers | Training attendance sheets | Every 6 months | Activity manager
Number of outreach activities conducted at youth-friendly locations | Activity sheet | Every 6 months | Activity manager
Number of condoms distributed at youth-friendly locations | Condom distribution sheet | Every 6 months | Activity manager
Percent of youth receiving condom use messages through the media | Population-based survey | Annually | Research assistant
Percent of adolescents reporting condom use during first intercourse | DHS or other population-based survey | Annually | Research assistant
Number and percent of trained health providers offering family planning services to adolescents | Facility logs | Every 6 months | Field M&E officer
Number and percent of new STI infections among adolescents | DHS or other population-based survey | Annually | Research assistant

Step 5: Create an Analysis Plan and Reporting Templates

Once all of the data have been collected, someone will need to compile and analyze them to fill in a results table for internal review and external reporting. This is likely to be an in-house M&E manager or research assistant for the program.

The M&E plan should include a section with details about what data will be analyzed and how the results will be presented. Do research staff need to perform any statistical tests to get the needed answers? If so, what tests are they and what data will be used in them? What software program will be used to analyze data and make reporting tables? Excel? SPSS? These are important considerations.

Another good thing to include in the plan is a blank table for indicator reporting. These tables should outline the indicators, data, and time period of reporting. They can also include things like the indicator target, and how far the program has progressed towards that target. An example of a reporting table is below.

Indicator | Baseline | Current status | End-of-program target | Percent of target achieved
Number of trainings held with health providers | 0 | 5 | 10 | 50%
Number of outreach activities conducted at youth-friendly locations | 0 | 2 | 6 | 33%
Number of condoms distributed at youth-friendly locations | 0 | 25,000 | 50,000 | 50%
Percent of youth receiving condom use messages through the media | 5% | 35% | 75% | 47%
Percent of adolescents reporting condom use during first intercourse | 20% | 30% | 80% | 38%
Number and percent of trained health providers offering family planning services to adolescents | 20 | 106 | 250 | 80%
Number and percent of new STI infections among adolescents | 11,000 (22%) | 10,000 (20%) | 10% reduction over 5 years | 20%
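The "percent of target achieved" column is simple arithmetic. A minimal sketch, assuming (as most rows in the example table suggest) that progress is the current value as a share of the end-of-program target; the `percent_of_planned_change` variant is an alternative convention for non-zero baselines, not something the guide prescribes:

```python
def percent_of_target(current, target):
    """Current value as a share of the end-of-program target,
    e.g. 5 of 10 planned trainings is 50%."""
    return round(100 * current / target)

def percent_of_planned_change(baseline, current, target):
    """Alternative convention (an assumption, not from the guide):
    progress relative to the planned change from baseline."""
    return round(100 * (current - baseline) / (target - baseline))

print(percent_of_target(5, 10))
print(percent_of_target(35, 75))
print(percent_of_planned_change(20, 30, 80))
```

Whichever convention is chosen, it should be stated in the analysis plan so the column is calculated the same way in every reporting period.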

Step 6: Plan for Dissemination and Donor Reporting

The last element of the M&E plan describes how and to whom data will be disseminated. Data for data’s sake should not be the ultimate goal of M&E efforts. Data should always be collected for particular purposes.

Consider the following:

  • How will M&E data be used to inform staff and stakeholders about the success and progress of the program?
  • How will it be used to help staff make modifications and course corrections, as necessary?
  • How will the data be used to move the field forward and make program practices more effective?

The M&E plan should include plans for internal dissemination among the program team, as well as wider dissemination among stakeholders and donors. For example, a program team may want to review data on a monthly basis to make programmatic decisions and develop future workplans, while meetings with the donor to review data and program progress might occur quarterly or annually. Dissemination of printed or digital materials might occur at more frequent intervals. These options should be discussed with stakeholders and your team to determine reasonable expectations for data review and to develop plans for dissemination early in the program. If these plans are in place from the beginning and become routine for the project, meetings and other kinds of periodic review have a much better chance of being productive ones that everyone looks forward to.

After following these 6 steps, the outline of the M&E plan should look something like this:

  • ​Program goals and objectives
  • Logic model/logical framework/theory of change
  • Table with data sources, collection timing, and staff member responsible
  • Description of each staff member’s role in M&E data collection, analysis, and/or reporting
  • Analysis plan
  • Reporting template table
  • Description of how and when M&E data will be disseminated internally and externally

M&E Planning: Template for Indicator Reporting

M&E Plan Indicators Table Template

M&E Plan: Data Sources Table Example

Tips & Recommendations

  • Avoid over-promising what data can be collected: it is better to collect fewer data well than a lot of data poorly. Program staff should take a careful look at the staff time and resource costs of data collection to see what is reasonable.

Glossary & Concepts

  • Process indicators track how the implementation of the program is progressing. They help to answer the question, “Are activities being implemented as planned?”
  • Outcome indicators track how successful program activities have been at achieving program goals. They help to answer the question, “Have program activities made a difference?”

Resources and References

Evaluation Toolbox. Step by Step Guide to Create your M&E Plan. Retrieved from: http://evaluationtoolbox.net.au/index.php?option=com_content&view=article&id=23:create-m-and-e-plan&catid=8:planning-your-evaluation&Itemid=44

infoDev. Developing a Monitoring and Evaluation Plan for ICT for Education. Retrieved from: https://www.infodev.org/infodev-files/resource/InfodevDocuments_287.pdf

FHI360. Developing a Monitoring and Evaluation Work Plan. Retrieved from: http://www.fhi360.org/sites/default/files/media/documents/Monitoring%20HIV-AIDS%20Programs%20(Facilitator)%20-%20Module%203.pdf

Banner Photo: © 2012 Akintunde Akinleye/NURHI, Courtesy of Photoshare

ABOUT HOW TO GUIDES

SBC How-to Guides are short guides that provide step-by-step instructions on how to perform core social and behavior change tasks. From formative research through monitoring and evaluation, these guides cover each step of the SBC process, offer useful hints, and include important resources and references.


Planning for Monitoring and Evaluation

Philanthropy University and fhi360

How will you measure your project’s success? This course will help you answer this question by introducing the basics of monitoring and evaluation (M&E). In this course, you will learn how successful projects plan for data collection, management, analysis, and use. As you complete the course assignments, you will create an M&E plan for your own project.

  • Module 1: Introduction to Monitoring and Evaluation
  • Module 2: Linking M&E to Project Design (link M&E to your project’s design)
  • Module 3: Identifying Indicators & Targets (define the indicators that you will measure)
  • Module 4: Data Collection (choose appropriate data collection methods; create clear, useful data collection tools)
  • Module 5: Roles & Responsibilities (assign M&E roles and responsibilities)



tools4dev Practical tools for international development


How to write a monitoring and evaluation (M&E) framework


Note: An M&E framework can also be called an evaluation matrix.

One of the most popular downloads on tools4dev is our M&E framework template. We’ve had lots of questions from people on how to use specific parts of the template, so we’ve decided to put together this short how-to guide.

Choose your indicators

The first step in writing an M&E framework is to decide which indicators you will use to measure the success of your program. This is a very important step, so you should try to involve as many people as possible to get different perspectives.

You need to choose indicators for each level of your program – outputs, outcomes and goal (for more information on these levels see our articles on how to design a program and logical frameworks). There can be more than one indicator for each level, although you should try to keep the total number of indicators manageable.

Each indicator should be:

  • Directly related to the output, outcome or goal listed on the problem tree or logframe.
  • Something that you can measure accurately using either qualitative or quantitative methods, and your available resources.
  • If possible, a standard indicator that is commonly used for this type of program. For example, poverty could be measured using the Progress Out of Poverty Index. Using standard indicators can be better because they are already well defined, there are tools available to measure them, and you will be able to compare your results to other programs or national statistics.

For example, an education program might measure its outcome with the indicator “percentage of Grade 6 students continuing on to Grade 7”, with separate indicators at the goal and output levels.

Some organisations have very strict rules about how the indicators must be written (for example, it must always start with a number, or must always contain an adjective). In my experience these rules usually lead to indicators that are convoluted or don’t make sense. My advice is just to make sure the indicators are written in a way where everyone involved in the project (including the donor) can understand them.

Define each indicator

Once you have chosen your indicators you need to write a definition for each one. The definition describes exactly how the indicator is calculated. If you don’t have definitions there is a serious risk that indicators might be calculated differently at different times, which means the results can’t be compared.

For example, the education program indicator “percentage of Grade 6 students continuing on to Grade 7” might be defined as: the number of students enrolled in Grade 7 at the start of this school year, divided by the number of students who completed Grade 6 in the previous school year, multiplied by 100.

After writing the definition of each indicator you also need to identify where the data will come from (the “data source”). Common sources are baseline and endline surveys, monitoring reports, and existing information systems. You also need to decide how frequently it will be measured (monthly, quarterly, annually, etc.).
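A well-defined indicator maps directly onto a calculation. A hypothetical Python sketch of the education example (the function name, figures, and exact numerator/denominator are illustrative assumptions, not from the article):

```python
def grade7_continuation_rate(completed_grade6, enrolled_grade7):
    """Hypothetical definition of "% of Grade 6 students continuing on
    to Grade 7": students enrolled in Grade 7 this school year, divided
    by students who completed Grade 6 last school year, times 100.
    A real definition would also fix the time period and data source."""
    if completed_grade6 <= 0:
        raise ValueError("Grade 6 completions must be positive")
    return round(100 * enrolled_grade7 / completed_grade6, 1)

print(grade7_continuation_rate(200, 150))
```

Because the numerator and denominator are fixed in the definition, the indicator is calculated the same way at baseline and endline, so the results remain comparable.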

Measure the baseline and set the target

Before you start your program you need to measure the starting value of each indicator – this is called the “baseline”. In the education example above that means you would need to measure the current percentage of Grade 6 students continuing on to Grade 7 (before you start your program).

In some cases you will need to do a survey to measure the baseline. In other cases you might have existing data available. In this case you need to make sure the existing data is using the same definition as you for calculating the indicator.

Once you know the baseline you need to set a target for improvement. Before you set the target it’s important to do some research on what a realistic target actually is. Many people set targets that are unachievable, without realising it. For example, I once worked on a project where the target was a 25% reduction in the child mortality rate within 12 months. However, a brief review of other child health programs showed that even the best programs only managed a 10-20% reduction within 5 years.

Identify who is responsible and where the results will be reported

The final step is to decide who will be responsible for measuring each indicator. Output indicators are often measured by field staff or program managers, while outcome and goal indicators may be measured by evaluation consultants or even national agencies.

You also need to decide where the results for each indicator will be reported. This could be in your monthly program reports, annual donor reports, or on your website. Indicator results are used to assess whether the program is working or not, so it’s very important that decision makers and stakeholders (not just the donor) have access to them as soon as possible.

Put it all into the template

Once you have completed all these steps, you’re ready to put everything into the M&E framework template.

Download the M&E framework template and example


About Piroska Bisits Bullen


The Importance of Monitoring and Evaluation for Decision-Making

  • First Online: 08 June 2021

Nadini Persaud (ORCID: orcid.org/0000-0003-1827-2867) and Ruby Dagher (ORCID: orcid.org/0000-0001-5211-2125)

With a discussion of the role of monitoring and evaluation (M&E), the differences between monitoring and evaluation, and the various types of evaluations that can be undertaken, this chapter provides a rich assessment of the tools that evaluators can use to assess advancements in the SDGs, lessons learned, accountability, and the power of the interconnected nature of the SDGs. It also provides a critical assessment of the challenges that the evaluation domain faces.



Kusek, J. S., & Rist, R. C. (2004). Ten steps to a results-based monitoring and evaluation system: A handbook for development practitioners. Washington, DC. https://www.oecd.org/dac/peer-reviews/ World Bank 2004 10_Steps_to_a_Results_Based_ME_System.pdf.

Love, A. (1991). Internal evaluation: Building organizations from within . Newbury Park: Sage.

Marinic, S. (2012). Emergent evaluation and educational reforms in Latin America. New Directions for Evaluation, 134 , 17–27.

Menon, S. (2013). Evaluation and turbulence: Reflections. In R. C. Rist, M. H. Boily & F. R. Martin (Eds.), Development evaluation in times of turbulence: Dealing with crises that endanger our future (pp. 25–30). Washington, DC. The World Bank.

Morra-Imas, L. G., Morra, L. G., and Rist, R. C. (2009). The road to results: Designing and conducting effective development evaluations . Washington, DC: The World Bank. https://issuu.com/world.bank.publications/docs/9780821378915?layout=http%253A%252F%252Fskin.issuu.com%252Fv%252Flight%252Flayout.xml&showFlipBtn=true.

Morra-Imas, L. G., & Rist, R. C. (2009). The road to results: Designing and conducting effective development evaluations . Washington, DC: The World Bank. https://openknowledge.worldbank.org/bitstream/handle/10986/2699/52678.pdf?sequence=1&isAllowed=y

Neirotti, N. (2012). Evaluation in Latin America: Paradigms and practices. New Directions for Evaluation, 134, 7–16. https://doi.org/10.1002/ev .

Organization for Economic Cooperation and Development. (2020). OECD better criteria for better evaluation: Revised and updated evaluation criteria . https://www.oecd.org/dac/evaluation/evaluation-criteria-flyer-2020.pdf .

Parsons, D. (2017). Demystifying evaluation: Practical approaches for researchers and users . Bristol: Policy Press.

Patton, M. Q. (2016). A transcultural global systems perspective: In search of blue marble evaluators. Canadian Journal of Program Evaluation , Special Issue, 374–390.

Patton, M. Q. (2020). Blue marble evaluation: Premises and principles . New York, NY: The Guildford Press.

Persaud, N. (2019). An exploratory study on public sector program evaluation practices and culture in Barbados, Belize, Guyana, and Saint Vincent and the Grenadines: Where are we? Where do we need to go? Journal of MultiDisciplinary Evaluation, 15 (32), 17–27.

Persaud, N. (in press). Strengthening evaluation culture in the English Speaking Commonwealth Caribbean: A guide for evaluation practitioners and decision-makers in the public, private, and NGO sectors . Kingston, Jamaica: Arawak Publications.

Persaud, N., & Dagher, R. (2020). Evaluations in the English-speaking Commonwealth Caribbean region: Lessons from the field. American Journal of Evaluation, 41 (2), 255–276. https://doi.org/10.1177/1098214019866260 .

Reichert, J., & Gatens, A. (2019). Demystifying program evaluation in criminal justice: A guide for practitioners. IL. https://icjia.illinois.gov/researchhub/files/Demystifying_Evaluation191011T20092818.pdf .

Rossi, P. H., Lipsey, M. W., & Henry, G. T. (2018). Evaluation: A systematic approach (8th ed.). Thousand Oaks: Sage.

Rotondo, E. (2012). Lesson learned from evaluation capacity building. New Directions for Evaluation, 134 , 93–101.

Scriven, M. (1991). Prose and cons about goal-free evaluation. Evaluation Practice, 12 (1), 55–62.

Scriven, M. (2016). Roads to recognition and revolution. American Journal of Evaluation, 37 (1), 27–44.

Shepherd, R. (2016). Deliverology and innovation in evaluation: Canada as a case study . Paper presented at collaborative conference between The University of the West Indies, The Caribbean Development Bank, and Carleton University, on Strengthening the role of evaluation in the Caribbean: Lessons from the field, Bridgetown, Barbados.

United Nations Development Programme [UNDP]. (2016). From the MDGs to sustainable development for all: Lessons from 15   years of practice . New York. https://www.undp.org/content/undp/en/home/librarypage/sustainable-development-goals/from-mdgs-to-sustainable-developmentforall.html .

Weber, H. (2015). Reproducing inequalities through development: The MDGs and the politics of method. Globalizations, 12 (4), 660–676. https://doi.org/10.1080/14747731.2015.1039250 .

Download references

Author information

Authors and Affiliations

University of the West Indies, St. Michael, Barbados

Nadini Persaud

University of Ottawa, Ottawa, ON, Canada

Ruby Dagher


Corresponding author

Correspondence to Nadini Persaud.


Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Persaud, N., Dagher, R. (2021). The Importance of Monitoring and Evaluation for Decision-Making. In: The Role of Monitoring and Evaluation in the UN 2030 SDGs Agenda. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-030-70213-7_4


DOI : https://doi.org/10.1007/978-3-030-70213-7_4

Published : 08 June 2021

Publisher Name : Palgrave Macmillan, Cham

Print ISBN : 978-3-030-70212-0

Online ISBN : 978-3-030-70213-7

eBook Packages : Political Science and International Studies (R0)


Designing Qualitative Research for Monitoring and Evaluation (M&E)


In this 2.5-hour Designing Qualitative Research for Monitoring and Evaluation (M&E) e-course, you will learn how to design and plan for a rigorous qualitative study or routine monitoring activity using the step-by-step guidance provided by the Qualitative Inquiry Planning Sheet (QuIPS). 

After completing this course, you will be able to:

  • design a qualitative inquiry that aligns with your project’s existing knowledge and evidence gaps;
  • identify the appropriate qualitative methodology and sampling for your research;
  • develop a plan for data collection, management, and analysis;
  • address research limitations, risks, and your ethical review; and
  • create a plan for engaging key stakeholders in the design and dissemination of findings.

The course includes: 

Module 1: Pre-Design Work

  • Identify evidence gaps based on source documents.
  • Identify key people who can support, advise, and review your inquiry work.
  • Identify key stakeholders who would benefit from the results of your qualitative inquiry.

Module 2: Defining the Purpose, Objectives, and Inquiry Questions

  • Identify the purpose and objectives of your inquiry.
  • Define the questions you hope to answer.
  • Determine the type of data you will collect to answer these questions.

Module 3: Designing the Methodology

  • Learn about a variety of qualitative inquiry methods.
  • Identify which methods and tools are most appropriate for your qualitative inquiry.
  • Learn how to choose a sampling strategy.

Module 4: Planning for Data Collection and Analysis

  • Develop an implementation plan for data collection.
  • Understand the process for qualitative data analysis and be able to articulate your data analysis plan.
  • Understand the best way to share and use your qualitative findings.

Module 5: Anticipating Limitations and Risks and Ethical Review

  • Document the constraints and ethical research protocol of your qualitative inquiry.

Developed by IDEAL in close partnership with the USAID Bureau for Humanitarian Assistance (BHA), this course is based on the Qualitative Design Toolkit, which focuses its guidance on completing the QuIPS. You can learn more about the e-course on the FSN website or access the course directly on the Humanitarian Leadership Academy's Kaya platform.

Page last updated August 23, 2024

7 steps for setting-up a Monitoring & Evaluation system

Designing a Monitoring and Evaluation system, or M&E system, is a complex task that usually involves staff from different units. This article describes the development of such a system in 7 steps (1). Each step is linked with key questions, which are intended to stimulate a discussion of the current state of the M&E system in a project or in an organization. Therefore, the 7 steps do not represent a strict chronological sequence for the development of an M&E system. All steps should be considered from the beginning:

  • Step 1 : Define the purpose and scope of the M&E system
  • Step 2 : Agree on outcomes and objectives - Theory of change (including indicators)
  • Step 3 : Plan data collection and analysis (including development of tools)
  • Step 4 : Plan the organization of the data
  • Step 5 : Plan the information flow and reporting requirements (how and for whom?)
  • Step 6 : Plan reflection processes and events
  • Step 7 : Plan the necessary resources and skills

Download the tool “M&E gap analysis” to work on these 7 steps.

Do you wish to get more information on designing a project level database for your M&E activities? Then, take a look at "The complete guide for a project level M&E database" too.

Take a look at what a database for development assistance projects could look like via this database template.

Explore a database for indicators tracking for global M&E in this database template .

If you like this article don't forget to register to the ActivityInfo newsletter to receive new guides, articles and webinars on various M&E topics !

This guide is also available in French and Spanish.

Step 1: Define the purpose and scope of the M&E system

It is crucial to define the scope of the M&E system at the very beginning. A question that will likely need to be answered is whether the system should be impact-oriented, monitoring higher-level impacts, or whether the project team is satisfied with simply recording the proper implementation of activities and their results. Both can make sense, depending on the circumstances and what is needed to improve the project in the best possible way. High-level, results-oriented monitoring is usually preferable, but it can also fail if the necessary capacity is not available. Moreover, it should be clear from the beginning who will continue to work with the M&E findings later on.

A challenge in this first step is engaging staff and convincing them that the additional time and effort to set up an M&E system is worthwhile in order to improve project steering and thus the quality of project or program results. There are many and varied activities that can be carried out to this end. For some project/program teams, a workshop or a presentation may be helpful to make the case for monitoring; in other cases, various face-to-face discussions may be more appropriate. The approach must ultimately be decided by the person responsible for M&E and depends on the resources available and the key people involved; consulting them beforehand can be useful.

Step 2: Agree on outcomes and objectives - Theory of change (including indicators)

A Theory of Change (ToC) is a description of how and why activities are expected to lead to short-, medium-, and long-term outcomes over a period of time. This is more than just identifying outcomes and objectives. In fact, a ToC is a set of impact assumptions or hypotheses that can be described as a visual diagram, a narrative, or both. Once there is a ToC or something similar, staff will know better which M&E data to collect.

ToCs are not necessarily complex but they do provide a way to summarize the complexity of a situation and bring clarity to it. At best, this allows a wide range of stakeholders to come to a shared understanding of why and how activities will lead to desired results. Further information and examples of ToC are provided by Culligan & Sherriff in " A Guide to the MEAL DPro ".

It is helpful to involve a variety of stakeholders when developing the ToC – this could include staff, beneficiaries, partners, funders and even other experts who are familiar with the technical theme. The development process, and the thinking involved, are often as important as the diagram or narrative produced. However, if this seems too time-consuming, a common good practice is to produce a first draft, which can then be discussed with other key stakeholders. The result of this work should be a complete but not over-complicated description of the activities and their results, with prioritized outcomes for measurement and SMART indicators (2) to collect data against them.
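To make the idea of SMART indicators concrete, the sketch below (in Python, with hypothetical field names and an example indicator that are not taken from this guide) stores each indicator as a structured record and checks that all five SMART dimensions are filled in before the indicator enters the plan:

```python
from dataclasses import dataclass, fields

# Illustrative sketch only: the field names and the example indicator are
# hypothetical, not a standard M&E schema.
@dataclass
class SmartIndicator:
    specific: str     # what exactly is measured
    measurable: str   # unit and data source
    achievable: str   # why the target is realistic
    relevant: str     # which ToC outcome it measures
    time_bound: str   # deadline or reporting period
    target: float     # numeric target value

def is_complete(indicator: SmartIndicator) -> bool:
    # An indicator is only usable when every SMART field is filled in.
    return all(getattr(indicator, f.name) not in ("", None)
               for f in fields(indicator))

jobs_indicator = SmartIndicator(
    specific="Number of new jobs created by supported SMEs",
    measurable="Count, from a quarterly enterprise survey",
    achievable="In line with baseline growth of comparable programs",
    relevant="Outcome: improved local employment",
    time_bound="By the end of project year 2",
    target=250,
)
```

A check like `is_complete` is a cheap way to catch half-specified indicators before data collection starts.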

Of course, ToCs need regular review, because they must change as context and needs change. But high-level outcomes and impacts are usually valid for some years.

It’s worth noting that an important critique of ToC is that it can neglect social realities and possible negative project effects, and that it might narrow the view to planned project/program goals.

Step 3: Plan data collection and analysis (including development of tools)

For this step, it is recommended to create an M&E work plan, an M&E matrix, or a combination of both. Depending on the needs of a program or project, the design of such a document may vary greatly. However, to ensure that the M&E activities are implemented, it is advisable to set out clear responsibilities in such a matrix, together with timelines or frequencies of data collection. Evaluation Toolbox provides a template that can be customized according to these needs.
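As a rough sketch of what such an M&E matrix could hold (the rows, column names and frequencies below are hypothetical, not taken from the Evaluation Toolbox template), each row ties an indicator to a collection method, a responsible person and a frequency:

```python
# Hypothetical M&E matrix rows; column names and entries are illustrative.
me_matrix = [
    {
        "indicator": "Number of new jobs created",
        "method": "standardized enterprise survey",
        "responsible": "M&E officer",
        "frequency": "quarterly",
    },
    {
        "indicator": "Reasons for behavioral change in supported groups",
        "method": "qualitative interviews",
        "responsible": "project team",
        "frequency": "annually",
    },
]

def collection_schedule(matrix, frequency):
    # List who collects which indicator at the given frequency.
    return [(row["responsible"], row["indicator"])
            for row in matrix
            if row["frequency"] == frequency]
```

Filtering the matrix by frequency gives a simple data-collection to-do list per reporting period.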

The methods to collect the data depend on the information needed. For example, if quantitative information on jobs created is needed, then a survey with a standardized questionnaire may be useful whereas if information on the reasons for behavioral changes of supported groups is required, qualitative interviews or a combination of interviews and a standardized survey may be more useful. Data collection tools (e.g. interview guidelines and questionnaires) should be pre-tested before they are actually used. Important guidance on how to develop such tools can also be found in the aforementioned Evaluation Toolbox and at the websites of INTRAC . Some of the staff involved probably need to have analytical skills (e.g. statistical skills in the case of analysis of questionnaires). If such skills are missing or there is no time to develop the required tools or to analyze the collected data, hiring an external M&E expert could be considered.

For some programs or projects, there might be an M&E officer who coordinates such M&E activities. If this is not the case, then it will be necessary that the staff coordinate the activities among themselves. It will then be even more important to assign clear responsibilities within the team.

Are you trying to decide whether to build an internal tool, work with external contractors or use an off-the-shelf tool such as ActivityInfo? Then, take a look at our White Paper "Off the shelf, external contractors or building your own application: When off-the-shelf is your best option for your information management needs"

Step 4: Plan the organization of the data

To use the collected data, the information needs to be stored and shared with the people involved, regardless of their location. One can store data physically or digitally using an information system. This means that M&E systems and data management go together. The data management system should be designed according to the needs, size and complexity of the project or program. Staff engaged in M&E activities may need to liaise with the IT support of their organization. In any case, it is important to label and organize items in storage clearly (chronologically, by location, by content or any other category considered useful).

Good data management includes storing data securely to avoid unauthorized access, theft, or unintentional destruction of data and to comply with any legal requirements, such as data protection legislation. This often involves IT protection methods, such as passwords, firewalls and virus checks, but it might also simply mean having a lock on a filing cabinet. The global collaboration organization BetterEvaluation, synthesizing advice from the UK Data Archive (3), recommends:

  • not storing digital data on externally networked servers or computers;
  • installing firewalls and security systems to protect against malware and viruses;
  • password-protecting computer systems;
  • encrypting sensitive materials (even when transferring data by email);
  • signing non-disclosure agreements.

If it is important for a survey to include personal data, such as address or name, it is essential to obtain the permission of the respondent beforehand.
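One of the IT protections mentioned is password protection. As a minimal illustration of the underlying principle (store a salted hash, never the password itself), the sketch below uses only the Python standard library; a real system would rely on a vetted authentication component rather than hand-rolled code:

```python
import hashlib
import hmac
import os

# Illustrative sketch of salted password hashing; the parameters are
# reasonable defaults, not a security recommendation.
def hash_password(password, salt=None):
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, digest):
    _, candidate = hash_password(password, salt)
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("hypothetical-M&E-passphrase")
```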

When using external software for data management needs, the terms of use, the data protection and confidentiality provisions, and the servers' location should be checked.

Data management is linked with data quality assurance too: it is important to avoid gathering data of low quality and to ensure that data is "cleaned" of any errors, since the collected data may be the basis for further decisions. Data quality methods may include triangulation across multiple data sources, as well as interviewer training and supervision. It should be clarified among the staff who is responsible for data quality assurance and how it will be carried out.
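As a small illustration of what "cleaning" can mean in practice (the records and validity rules below are hypothetical), a few mechanical checks already remove duplicates, impossible values and incomplete entries:

```python
# Hypothetical survey records with typical quality problems.
records = [
    {"id": 1, "age": 34, "jobs_created": 2},
    {"id": 2, "age": -5, "jobs_created": 1},     # impossible age
    {"id": 2, "age": 41, "jobs_created": 0},     # duplicate id
    {"id": 3, "age": 29, "jobs_created": None},  # missing value
]

def clean(rows):
    # Keep the first valid record per id: plausible age, no missing values.
    seen, kept = set(), []
    for row in rows:
        if row["id"] in seen:
            continue
        if not 0 <= row["age"] <= 120:
            continue
        if any(value is None for value in row.values()):
            continue
        seen.add(row["id"])
        kept.append(row)
    return kept
```

Checks like these do not replace triangulation or interviewer training, but they catch the errors that are cheapest to find.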

Step 5: Plan the information flow and reporting requirements (how and for whom?)

To be useful, information gained through M&E needs to be communicated to different stakeholders. Most likely, there are certain reporting requirements set by donors. However, it is good practice to disseminate and to discuss findings among other stakeholder groups so that learning from M&E has a wider reach. At the very least, the M&E results should also be discussed with the supported communities and groups.

There are many ways to communicate M&E information with stakeholder groups. The best communication method will depend on the audience and how the information will be used. For example, project managers may require much more detailed information on the progress made; program directors may require regular, short summary reports across different projects and programs, with aggregated tables and statistics; policy-makers might benefit from a short brief summarizing the main issues and making recommendations for change; and a member of the public who supports an organization through donations might prefer a story of change, a photograph or a short video that enables them to connect with beneficiaries on an emotional level.

Sometimes, especially when communicating information to partners or supported groups, it is useful to discuss communication methods with the audience beforehand. This is a core element of participatory M&E. Consideration should also be given to how information can be communicated to people with audio or visual disabilities, or whether stakeholders are able to access the venues for meetings. When communicating information to illiterate or semi-literate people, presenting information in written form is of little use.

It is also important to know when information needs to be communicated. For example, if decision-making meetings occur on a quarterly basis then it is important to communicate M&E findings before those meetings are held. Similarly, when seeking to influence a government policy, it is important to supply information at the right time so that it has the maximum chance of achieving its purpose.

The use of communication strategies or dissemination plans will facilitate the organization of the information flow. The key point is to be very clear about who needs what M&E information, when and where. Narratives (formal reports, case studies, newsletters, press releases, policy briefs) are the most common way of communicating M&E findings. Other means of communication are photographs, videos, pictures and cartoons. The big advantage of these visual channels is that they can communicate information from supported communities and groups directly to different audiences, without being filtered through a report. In addition, M&E findings can be communicated verbally in meetings and workshops, through feedback sessions and even through informal conversations. Speaking directly to a target audience allows messages to be tailored to the individual or group, and allows for some discussion of findings as well.

Also, more artistic and traditional methods of communication such as poems, drama, mime and song can be used to share M&E information with others. Using such activities can help prevent M&E becoming a sterile exercise, and can foster a broader understanding and discussions about change.

Recent technological advances offer another way of communicating M&E information. Websites and social media sites, podcasting, and webinars have made it much easier to present and communicate information in new and innovative ways. Communication via mobile phones and tablets offer further opportunities in the communication of M&E information, although to date, this has mainly been used for data collection (for example surveying through text messages) rather than for communication of M&E findings (4).

The dissemination plan below, taken from the website of the Technical Centre for Agricultural and Rural Cooperation, provides a good example of what such a plan could look like. Other examples are offered by the websites of BetterEvaluation and the MEAL DPro initiative.

Dissemination Plan (template)

  • Audience: Who do you want to reach? Who needs to learn about your experience?
  • Purpose: For each target audience, what is the purpose of sharing with them?
  • Message: For each target audience, what are the lessons that you want to share with them?
  • Products and channels: For each target audience, what are the best ways to reach them?
  • Timeline: For each product/channel, when do you plan to share, and which steps need to be taken?

Source: Website of the CTA. Technical Centre for Agricultural and Rural Cooperation, licensed under a Creative Commons Attribution-Share Alike 3.0 Unported License

In some organizations, there is a communications officer responsible for external communications and organizing the flow of information; in others, especially smaller organizations, this task is the responsibility of an M&E officer or the project team itself.

Step 6: Plan reflection processes and events

This step goes further: the aim is not only to communicate results but to discuss M&E findings with stakeholders so that everyone learns from each other. Again, the discussion and learning formats can vary widely; these could include workshops, exchange visits, seminars, conferences and After-Action Reviews (AAR) (5), to name a few. However, learning does not happen in one sitting. It is important that moments of reflection take place regularly throughout the life of the project or program. The incorporation of learning events in the project/program cycle is key. In this regard, annual or bi-annual reviews are critical learning opportunities for reaching conclusions about achievements and failures. The optimal sequence of learning events follows the reporting lines of decision-making. It should be ensured that the right people are involved in such reviews. Therefore, it may sometimes be important to include decision-makers in the reviews (so that they learn at the same time as their staff and therefore make appropriate decisions), but it can be a challenge to ensure that this does not affect the openness of the conversations. It can be very helpful if staff are trained in facilitating intentional group learning processes.

Regular team meetings are another important opportunity for reflection. Team members may include project staff, implementing partners, and primary stakeholder representatives – this depends on how the project is structured. Weekly meetings are common but if other stakeholders are involved this may be needed less frequently. In each project context, there are usually forums where implementing partners interact with each other. These events offer another chance for reflection.

It is recommended to assign roles and responsibilities for leading the learning events. In addition, the learning and resulting conclusions for further actions should be documented well, with a focus on documenting “action needed”, “person responsible for implementation”, “deadline”, and “persons responsible for follow-up”. Such documentation could be tabular (see the example below), but any other form is fine as long as it records the most important items.

Documenting learning and conclusions (template columns)

  • Weakness identified
  • Improved action suggested
  • Person(s) responsible for action
  • Timeline
  • Unit/person responsible for follow-up

Source: own composition

Lastly, it is worth mentioning here that learning in the context of M&E is about having a culture that encourages intentional reflection and processes that support this culture. All teams learn as they implement project activities. But to take advantage of this learning and consistently translate it into improved practice, learning must be planned and managed.

Step 7: Plan the necessary resources and skills

It is good to start planning the M&E budget in the project/program design phase, so that adequate funds are allocated and later available for M&E activities. There is no standard formula to determine the budget for a project/program’s M&E system. An industry standard is that between 3 and 10 percent of a project/program’s budget should be allocated to M&E (6). A planning table for key M&E activities can be useful in this regard. It is particularly important to budget for any expensive items, such as baseline surveys and evaluations.
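Applying the 3-10 percent rule of thumb, the M&E envelope for a given project budget can be bracketed with a one-line calculation (the USD 500,000 project budget below is hypothetical):

```python
# Bracket the M&E budget using the 3-10% industry standard cited in the text.
def me_budget_range(project_budget, low_share=0.03, high_share=0.10):
    return project_budget * low_share, project_budget * high_share

low, high = me_budget_range(500_000)  # hypothetical project budget in USD
```

For a USD 500,000 project this gives roughly USD 15,000 to 50,000 for M&E.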

Moreover, an effective M&E system requires capable people. Therefore, when defining roles and responsibilities for M&E, specific consideration should be given to the M&E qualifications and expectations, including the approximate proportion of time for each person to support the system. A first step in planning for M&E human resources is to determine the available M&E experience within the project/program team, partner organizations, target communities and any other potential participants in the M&E system. This will inform the need for capacity building or outside expertise. For long-term and larger projects/programs, it may be useful to create an M&E training schedule. Ideally, data collection, analysis and M&E training involve the people to whom these processes and decisions most relate (7).

One key planning consideration is who will have overall responsibility for the M&E system. It is important to clearly identify who will coordinate all these M&E activities and to whom others will turn for M&E guidance. The responsible person or team should supervise the M&E functions and have an overview of any problems that might arise.

This article was written with the intention to support especially smaller organizations in their M&E activities. Hopefully the information has been helpful and of practical use in setting up M&E systems. The author welcomes suggestions, additions and comments.

The team of ActivityInfo would like to warmly thank Ms Susanne Neymeyer for this insightful and detailed guide on setting up a Monitoring and Evaluation system. Ms Neymeyer has been an ActivityInfo Education Partner since July 2020.

Susanne Neymeyer is an independent M&E consultant with more than 15 years’ experience in the field of development cooperation and humanitarian aid. Her academic background is in social work, adult education and evaluation. Susanne started working as an independent M&E consultant in 2009. Since then, she has evaluated and supported a wide range of development and humanitarian projects and programs all over the world. Before she became an international consultant, she worked for various international development and humanitarian organizations as project coordinator and manager.

Footnotes, references and further reading

(1) : In other guidance notes the development of an M&E-System is described in 6 or 10 steps. I have a preference for 7 steps, but a representation of the process with fewer or more steps is of course just as good.

(2) : SMART indicators are Specific, Measurable, Achievable, Relevant and Time-Bound. More information about SMART indicators and how to design them is provided by the INGO People in Need (PIN).

(3) : Van den Eynden, V., Corti, L., Woolard, M., Bishop, L. and Horton, L. (2011). Managing and sharing data: Best practice for researchers. UK Data Archive, University of Essex: Essex.

(4) : More information on innovative tools can be found at Glenn O’Neil. (2017) A Guide: Integrating Communication in Evaluation

(5) : An After-Action Review (AAR) is a simple process used by a team to capture the lessons learned from past successes and failures, with the goal of improving future performance. More information on AARs and other knowledge-sharing methods is provided by the Knowledge Sharing Toolkit, developed among others by the Food and Agriculture Organization of the United Nations (FAO) and the United Nations Children's Fund (UNICEF).

(6) : IFRC. (2011). Project/programme monitoring and evaluation (M&E) guide

(7) : Ultimately, the degree of participation of supported communities will vary according to the project/program and context. Some examples of M&E participation include: vulnerability capacity assessments; involvement of local representatives in the project/program design and the identification of indicators; participatory monitoring, where elected community representatives report on key monitoring indicators; and sharing monitoring and evaluation findings with community members for participatory analysis and identification of recommendations.

Project Monitoring, Evaluation, & Control: Ultimate Checklist

If keeping projects on track was easy, there wouldn’t be a whole field of project management dedicated to it. Every project needs to have a system in place to monitor and evaluate its progress and put it back on track when plans start to veer.

Project monitoring is the process within project management that allows you to check in on progress and find a path to make meaningful changes wherever necessary. It is a phase of project management in its own right, just as important as all the other stages.

In this blog post, we’ll look at what makes project monitoring so important, its different substages, best practices, and how platforms like monday.com help project managers with all the heavy lifting.

What is project monitoring?

Project monitoring is one of the key steps of the project management process, which includes initiation, planning, executing, monitoring and controlling, and closing. In the project monitoring stage, teams track a project's progress by looking at different metrics that could affect a project's outcome, such as:

  • Verifying a project’s scope
  • Timeline for deliverables
  • Budgets in relation to schedules
  • Quality control
  • Team workload

Essentially, at this stage of the project management process, a project manager or other stakeholder is measuring a project’s current performance against the project plan and overall goals and identifying potential risks to project performance. If you’re not hitting milestones (e.g., delivering a prototype within a specified time), the project has a high chance of failure. This stage is essential to identifying additional roadblocks that could affect a project’s outcome.

Project monitoring involves tracking a project's metrics, progress, and associated tasks, making sure that everything is completed on time, within the project budget, and according to project requirements and standards. It also means making sure that work doesn't go beyond the initial project scope.
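The article doesn't prescribe any particular calculation, but one standard way to quantify "on time" and "within budget" is earned value management. The sketch below is purely illustrative: the function names and dollar figures are made up for the example.

```python
# Illustrative sketch: quantifying schedule and budget health with
# earned value metrics (SPI/CPI). All figures are hypothetical.

def schedule_performance_index(earned_value: float, planned_value: float) -> float:
    """SPI > 1.0 means ahead of schedule; < 1.0 means behind."""
    return earned_value / planned_value

def cost_performance_index(earned_value: float, actual_cost: float) -> float:
    """CPI > 1.0 means under budget; < 1.0 means over budget."""
    return earned_value / actual_cost

# Example: work worth $40k is complete, $50k was planned by now,
# and $44k has actually been spent.
spi = schedule_performance_index(40_000, 50_000)  # 0.80 -> behind schedule
cpi = cost_performance_index(40_000, 44_000)      # ~0.91 -> over budget
print(f"SPI={spi:.2f}, CPI={cpi:.2f}")
```

Indices like these give a monitoring dashboard a single number per dimension to track over time, rather than a wall of raw task data.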

Pro tip: want to automate your monitoring and keep everyone on track? You need project management software like monday.com work management to get it all done seamlessly.

What is project evaluation?

Project evaluation works alongside project monitoring, and while they have similar goals, their processes are a little different. With project monitoring, managers may look at a project’s progress compared to initial project goals. However, in the project evaluation portion of this stage, they’d take factors like feedback and data collected in the monitoring stage to evaluate why a project is veering off its course, scope, or budget.

In the evaluation part, you look at the information gathered from monitoring and make decisions based on it. For example, taking scope creep into consideration, you can consider whether you need to adjust schedules or fast-track certain processes to meet deadlines.

The evaluation process happens throughout the project, not only after project objectives are met. There may also be more in-depth evaluations at big milestones, like the retrospective at the end of a sprint.
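The decision logic described above can be sketched as a simple set of tolerance checks. This is not a method the article defines; the thresholds, function name, and recommendation strings below are all hypothetical, just to show how monitored variances might feed an evaluation decision.

```python
# Hypothetical sketch: turning monitored variances into a coarse
# evaluation recommendation using tolerance thresholds.

def evaluate(schedule_slip_days: int, budget_variance_pct: float,
             slip_tolerance: int = 5, budget_tolerance: float = 10.0) -> str:
    """Flag which kind of corrective action the variances suggest."""
    if schedule_slip_days > slip_tolerance and budget_variance_pct > budget_tolerance:
        return "escalate: re-plan schedule and budget"
    if schedule_slip_days > slip_tolerance:
        return "fast-track critical tasks or revise schedule"
    if budget_variance_pct > budget_tolerance:
        return "review spending and re-allocate resources"
    return "on track: continue monitoring"

# 8 days behind schedule but only 3% over budget:
print(evaluate(schedule_slip_days=8, budget_variance_pct=3.0))
# fast-track critical tasks or revise schedule
```

Running checks like this at every milestone, rather than only at project close, matches the idea that evaluation happens throughout the project.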

What is project control?

Finally, as the last piece of the puzzle, project control is about implementing corrective action to keep a project on track, in scope, and on budget while still maintaining high quality.

In this part of the project monitoring stage, you might want to take action such as:

  • Re-allocating resources
  • Revising and resetting schedules, milestones, and project timelines
  • Updating project plans and goals
  • Redistributing tasks to team members
  • Updating budget expectations

Project control acts as the final substage of monitoring a project, where after you’ve collected data and evaluated how to make changes, you finally put your findings to work by making tangible changes to your project method, strategies, and course of action.

Importance of project monitoring, evaluation, and control

Companies waste an average of 5.2% of their project investment due to poor performance. Project monitoring, evaluation, and control are three facets of the same toolkit, used together to improve a project's overall efficiency by catching and resolving issues before it's too late.

Looking at the substages of project monitoring, it's easy to see how the process benefits a project overall and leads to more successful outcomes. Project monitoring, evaluation, and control are essential aspects of project management for a few reasons:

  • Ensures project success by verifying that a project stays on track and meets its objectives and deliverables
  • Identifies issues earlier so project managers can make plans to address and correct them
  • Optimizes resources by ensuring they’re continuously used efficiently
  • Encourages accountability within teams by solidifying each individual’s tasks and responsibilities
  • Enables project managers to identify variances and project deviations more quickly, before they evolve into bigger issues

Best practices in project monitoring, control, and evaluation

After getting a better idea of what project monitoring entails, it’s important to implement your unique monitoring process with certain best practices in mind. While every team and company’s project monitoring process will look different, if you follow these tips, you’ll be able to get the most out of it and ensure the continued success of your project.

Set expectations early

From the start of your project, you always want to clearly outline project goals as well as expectations at the team and individual levels. Establish what you expect from everyone on your team and make sure each member knows how to clearly communicate their updates or roadblocks when they’re not meeting deadlines, so that the monitoring process goes more smoothly. You should also decide in advance on mitigation strategies for the potential issues or project risks you may encounter throughout the project lifecycle.

Use project monitoring tools

While you should be using project management tools throughout an entire project, they become particularly helpful during the monitoring process. An all-encompassing project monitoring tool like monday.com will help you streamline workflows like data and feedback collection, allow you to gather reports on performance and key metrics, and let you view team workloads to better assess where your project is and what needs to be changed or fixed.

Pro tip: you can also use monday.com to see your data and project progress in different views, such as calendar, kanban board, chart, or timeline views.

Encourage ongoing communication and feedback

While the process of monitoring, evaluating, and controlling a project may lie with the project manager, it’s a lot easier on the person in charge when there’s effective communication throughout a project’s lifecycle. Getting regular updates from team members and key stakeholders on deliverables, roadblocks, feedback, and timelines allows you to monitor progress on an ongoing basis without having to constantly request new information.

Decide monitoring frequency

Some projects may benefit from regular monitoring on an ongoing basis, while others may only need to go through this process once or twice. Consider how frequently you want to go through this entire process, whether weekly, monthly, or quarterly. You can also implement monitoring with every sprint or completed project phase instead of having it on a timeline.

Automate data collection

Since project monitoring and evaluation rely on a lot of data that track performance and progress, like key performance indicators (KPIs), use automation tools to collect this information regularly. Automated reports based on real-time information reduce the time and resources needed to gather data manually. Software like monday.com allows you to automate reporting so that you can spend less time gathering data and more time analyzing and acting on it.
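As a rough illustration of what an automated report rolls up, the sketch below aggregates raw task records into a KPI summary. This is not monday.com's API: the task records, field names, and `kpi_summary` helper are all hypothetical.

```python
# Illustrative sketch (not monday.com's API): rolling raw task records
# up into the kind of KPI summary an automated report might produce.
from datetime import date

tasks = [  # hypothetical export from a task tracker
    {"name": "Design mockups",  "due": date(2024, 9, 1), "done": True,  "hours": 24},
    {"name": "API integration", "due": date(2024, 9, 5), "done": False, "hours": 40},
    {"name": "QA test pass",    "due": date(2024, 9, 9), "done": False, "hours": 16},
]

def kpi_summary(tasks: list[dict], today: date) -> dict:
    """Compute completion rate, overdue count, and remaining effort."""
    done = sum(t["done"] for t in tasks)
    overdue = sum(1 for t in tasks if not t["done"] and t["due"] < today)
    return {
        "completion_rate": round(done / len(tasks), 2),
        "overdue_tasks": overdue,
        "hours_remaining": sum(t["hours"] for t in tasks if not t["done"]),
    }

print(kpi_summary(tasks, today=date(2024, 9, 6)))
# {'completion_rate': 0.33, 'overdue_tasks': 1, 'hours_remaining': 56}
```

Scheduling a summary like this to run daily is what turns manual status-chasing into automated data collection.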

Challenges in project monitoring, evaluation, and control

Monitoring a project is just one of the many things to do on a long list of project management tasks. However, like many other elements of project management, it’s an important step to ensure work is moving along smoothly. That said, monitoring a project can come with its own challenges along the way that can make it more difficult to accurately evaluate roadblocks and provide solutions. Let’s look at some of the most common challenges in project monitoring.

  • Data accuracy : Collecting accurate and up-to-date data is important in complex projects, but can be difficult to do when there’s more than one source of data
  • Scope creep: Projects are prone to changes along the way, making it a challenge to monitor the progress against an original plan when plans are always fluid or if a project’s scope isn’t clearly defined from the start
  • Resource constraints: Having limited resources like personnel, budget, and even time can make project monitoring less effective, leading to inaccurate assessments
  • Reporting: Project managers need to be on top of multiple reports during the monitoring process to gather different types of data, and consolidating all this data can be time-consuming and prone to error, especially when done manually
  • Miscommunication: Using too many work tools can lead to a break in communication and make performance assessments harder to track and less reliable, especially if reporting is on one tool, chats are on another, and you’re tracking tasks on a third

The best way to avoid these roadblocks and others is to prepare your project from the start, including making a plan for how you’ll continuously monitor it by setting clear objectives and trackable metrics. Additionally, using a platform that allows you to do everything in one place, like monday.com, will help you avoid the pitfalls of broken communication, manual reporting, and a lack of accurate data.

How to monitor, control, and evaluate projects with monday.com

Project monitoring and evaluation enable you to make better decisions about ongoing and future projects, but without an all-encompassing platform like monday.com, it can be tricky to keep everything organized. monday.com helps teams and project managers organize data, track performance, follow project progress, and generate reports all in one place.

With easy-to-use automation, communication, and collaboration tools, you can rest assured that your entire project team can use one platform for multiple aspects of planning and executing a project. Here’s a closer look at some of monday.com’s features that make it ideal for project monitoring purposes.

Generate advanced reports

Using a customizable dashboard, you can automatically generate reports and visual representations based on the data on your monday.com platform. Choose how you visualize data in your report with charts, timeline and schedule overviews, graphs, and widgets to see all your insights at a glance.

Monitor workloads

With over 27 different work views, you can choose how to best manage your project. The workload view allows you to quickly see which tasks employees are working on, how full their plates are, and what their workloads look like in the future. This allows you to more accurately assess performance, revise schedules, and allocate tasks based on available resources.

Improve workflow management

Implementing change that will positively affect your project is the goal of project monitoring, and monday.com makes that easy. With pre-made templates and boards to get started quickly, you can improve your team’s workflows with personalized automations to keep tasks moving quickly and multiple collaboration tools to make it easy to communicate, send and receive updates, and highlight the status of a task in real time.

The ultimate checklist for project monitoring, evaluation, and control

Project Monitoring

Set expectations early:

  • Clearly outline project goals.
  • Define team and individual expectations.
  • Establish regular communication of updates and roadblocks.
  • Plan mitigation strategies for potential issues.

Use project monitoring tools:

  • Utilize an all-encompassing tool for project monitoring like monday.com.
  • Streamline workflows for data and feedback collection.
  • Gather reports on performance and key metrics.
  • Monitor team workloads.

Encourage ongoing communication and feedback:

  • Facilitate regular updates from team members and stakeholders.
  • Address deliverables, roadblocks, feedback, and timelines consistently.

Decide monitoring frequency:

  • Determine whether to conduct monitoring weekly, monthly, quarterly, by sprint, or by project phase.

Automate data collection:

  • Use automation tools for real-time data reporting.
  • Reduce manual data collection efforts.

Project Evaluation

Use gathered data:

  • Evaluate progress compared to initial project goals.
  • Analyze feedback and data from monitoring.

Identify causes of deviations:

  • Assess reasons behind scope, budget, or timeline issues.

Make informed decisions:

  • Adjust schedules and fast-track processes as needed.
  • Conduct evaluations throughout the project and at major milestones.

Project Control

Implement corrective actions:

  • Re-allocate resources as required.
  • Revise and reset schedules, milestones, and project timelines.
  • Update project plans and goals.
  • Redistribute tasks to team members.
  • Adjust budget expectations.

Mitigating Challenges

Ensure data accuracy:

  • Integrate data from reliable sources.

Manage scope creep:

  • Define project scope clearly from the start.
  • Adjust plans consistently as changes occur.

Address resource constraints:

  • Optimize resource usage.
  • Monitor personnel, budget, and time limitations.

Simplify reporting:

  • Use consolidated tools for data gathering and reporting.
  • Automate reporting processes.

Improve communication:

  • Use a unified platform to minimize communication gaps.
  • Track performance and updates in a single tool.

Tools & Features

Generate advanced reports:

  • Use customizable dashboards for data visualization.

Monitor workloads:

  • Utilize various work views (for example, calendar, waterfall, and other real-time views) to better track project performance.

Improve workflow management:

  • Apply pre-made templates and personalized automations.
  • Use collaboration tools for real-time updates and communication.

By following this checklist, you can ensure effective project monitoring, evaluation, and control, leading to better project outcomes and optimized processes.

Project monitoring at your fingertips

In a perfect world, all your projects would go according to plan. Everything would be completed on time and within budget, but that’s not always the reality. Employees may miss deadlines, external project stakeholders may back out, budgets can be exceeded, and project plans may not go smoothly.

Project monitoring and evaluation enable you to identify and mitigate issues that may impact the project scope, quality, timeline, or budget. You can then take those insights and use them to optimize processes for future projects. Using a platform like monday.com ensures that you have all the information you need at your fingertips to best monitor your project, evaluate where you are and what can be improved, and take control to steer a project back on course.

Monitoring and Evaluation Analyst (for Tunisian nationals only)

Advertised on behalf of.

Tunis, TUNISIA

Type of Contract: Service Contract

Starting Date: 23-Sep-2024

Application Deadline: 09-Sep-24 (Midnight New York, USA)

Languages Required: Arabic, English, French

UNDP is committed to achieving workforce diversity in terms of gender, nationality and culture. Individuals from minority groups, indigenous groups and persons with disabilities are equally encouraged to apply. All applications will be treated with the strictest confidence. UNDP does not tolerate sexual exploitation and abuse, any kind of harassment, including sexual harassment, and discrimination. All selected candidates will, therefore, undergo rigorous reference and background checks.

UN Women, grounded in the vision of equality enshrined in the Charter of the United Nations, works for the elimination of discrimination against women and girls; the empowerment of women; and the achievement of equality between women and men as partners and beneficiaries of development, human rights, humanitarian action and peace and security.

UN Women in Tunisia is currently focusing on three areas – Governance and Women’s Political Participation, Women’s Economic Empowerment, and Women, Peace and Security – and working with various stakeholders from Government, CSOs and Parliament to ensure gender is mainstreamed in public policies and strategies, legislative reforms, and political and economic participation processes. Since the 2011 revolution, UN Women Tunisia has supported the integration of equality provisions in the 2014 constitution and parity conditions in electoral laws. These efforts were consolidated during the post-revolution process by the revision of various discriminatory laws, the establishment of high-level gender mechanisms such as the “Conseil des Pairs”, the adoption of the National Action Plan on Resolution 1325, and the integration of full parity into the local elections law. Specifically, the office has supported research on the role of women in the prevention of violent extremism, the socio-economic factors of women’s engagement in violent extremism, and the link between public violence against women and violent extremism, all of which aims to inform decision-making on violent extremism as well as UNCT programming on this topic.

UN Women Libya has strengthened its programme presence in Libya, responding to complex governance, humanitarian, development and security challenges through its triple mandate. UN Women Libya’s premise is that peacebuilding efforts, including political dialogues, conflict resolution and humanitarian efforts, will be more effective and have a higher chance of success if they are inclusive and respond to the gendered experiences, needs, capacities and interests of Libyan women across their diversities; this includes supporting women’s participation in peace processes at all levels. In addition, the programme is working to strengthen women’s political participation, ensure gender-responsive economic recovery, coordinate gender mainstreaming within the UNCT in Libya, integrate women’s needs into the current humanitarian response, and ensure their full participation in humanitarian consultations and processes. The programme is aligned with the UN Sustainable Development Cooperation Framework (UNSDCF) 2023-2025, the UN Women Global Strategic Plan (2022-2025), and UN Women Libya’s Strategic Note (SN 2023-2025), as well as international conventions on women’s and human rights such as the Convention on the Elimination of All Forms of Discrimination Against Women (CEDAW) and the Beijing Platform for Action.

Duties and Responsibilities

Reporting to the Programme Management Specialist, the Monitoring and Evaluation Analyst will provide support to the Tunisia and Libya Cluster Office’s colleagues in incorporating monitoring and reporting into programme formulation, as well as tracking progress against Strategic Plan targets and reporting results to internal and external audiences.

Facilitate and substantively contribute to the incorporation of monitoring and reporting into programme formulation

  • Facilitate and substantively contribute to the development of monitoring indicators, monitoring calendars, and field monitoring plans and quality assurance processes;
  • Include inputs from relevant evaluation findings, conclusions and recommendations into programme formulation;
  • Contribute to annual work plan monitoring, reviews and reporting;
  • Provide technical support to partners in developing Performance Monitoring Frameworks (PMFs), systems and plans, and Baseline Surveys;
  • Facilitate the clearance of donor agreements and Programme Cooperation Agreements with the CCO.

Contribute substantively to the monitoring and tracking of results against country/ regional level targets and UN Women Strategic Plan

  • Coordinate with Programme Team to ensure that data collection and analysis from field visits are coordinated and standardized across programmes;
  • Monitor data from partners and programmes on a quarterly basis and provide substantive inputs to regular management briefs to inform decision making;
  • Visit partners, along with the Programme Team, to support monitoring of results and planning processes as required;
  • Monitor the spending of donor funds and other programme expenditures and disbursements;
  • Draft and monitor the CCO Monitoring, Evaluation, and Research Plan.

Facilitate the reporting of results to internal and external audiences

  • Facilitate the process of the CCO meeting internal and external reporting requirements and deadlines, including annual reporting process;
  • Support programme team in drafting donor and programme reports;
  • Identify relevant evaluation findings, conclusions and recommendations and input them into programme reporting;
  • Review progress reports submitted by partners and provide feedback to improve quality and timeliness of reporting;
  • Collect and maintain data for country, regional and global corporate reports, mid-term reviews, and final evaluations.

Provide technical support to the CCO in the implementation of the UN Women Evaluation Policy

  • Coordinate the implementation of UN Women’s Evaluation Plan in the CCO;
  • Provide guidance to programme staff on evaluations;
  • Ensure communication between the CCO and RO regarding Evaluations;
  • Coordinate the completion of management’s response to the UN Women Global Accountability and Tracking of Evaluation Use (GATE).

Contribute to knowledge building and capacity building

  • Identify and disseminate good practices, lessons and knowledge, as identified through programme implementation, monitoring and evaluation activities;
  • Contribute to the development of capacity development tools, including training materials and packages;
  • Facilitate capacity building opportunities for staff and partners in the areas of Results Based Management (RBM), Monitoring and Evaluation;
  • Promote the awareness and understanding of the shared responsibility of Monitoring and Evaluation (M&E) among all staff members through communication, training, learning and development activities.

Consultant’s Workplace and Official Travel: This is an office-based consultancy.

Competencies

Core Values

  • Respect for Diversity 
  • Integrity 
  • Professionalism 

Core Competencies

  • Awareness and Sensitivity Regarding Gender Issues;
  • Accountability;
  • Creative Problem Solving;
  • Effective Communication;
  • Inclusive Collaboration;
  • Stakeholder Engagement;
  • Leading by Example.

Please visit this link for more information on UN Women’s Core Values and Competencies:  

https://www.unwomen.org/en/about-us/employment/application-process#_Values  

FUNCTIONAL COMPETENCIES 

  • Good knowledge of programme formulation and implementation and Results Based Management;
  • Good knowledge of monitoring and evaluation, evaluation design, data collection and analysis, and reporting;
  • Ability to synthesize program performance data and produce analytical reports;
  • Good analytical and report writing skills;
  • Knowledge of UN programme management systems.

Required Skills and Experience

Education and Certification

  • Master’s degree (or equivalent) in Political or Social Science, Economics, International Development Studies, Gender/Women's Studies is required;
  • A first-level university degree in combination with two additional years of qualifying experience  may be accepted  in lieu of the advanced university degree;
  • A project/programme management certification (such as PMP®, PRINCE2®, or MSP®) would be an added advantage

Experience:

  • At least 5 years of progressively responsible experience at the national or international level in monitoring and reporting of development projects/programmes;
  • Experience in the United Nations system is an asset;
  • Field experience is an asset.
  • Fluency in English and French is required;
  • Working knowledge of Arabic is requested;
  • Knowledge of another official UN language is desirable (Portuguese, Chinese, Russian or Spanish).

How to Apply

  • Personal CV or P11 (the P11 form can be downloaded from: https://www.unwomen.org/sites/default/files/Headquarters/Attachments/Sections/About%20Us/Employment/UN-Women-P11-Personal-History-Form.doc)
  • A cover letter (maximum length: 1 page)

Managers may ask (ad hoc) for any other materials relevant to pre-assessing the candidate’s experience, such as reports, presentations, publications, campaigns, or other materials.

UN Women has a zero-tolerance policy on conduct that is incompatible with the aims and objectives of the United Nations and UN Women, including sexual exploitation and abuse, sexual harassment, abuse of authority, and discrimination. All selected candidates will be expected to adhere to UN Women’s policies and procedures and the standards of conduct expected of UN Women personnel and will therefore undergo rigorous reference and background checks. (Background checks will include the verification of academic credential(s) and employment history. Selected candidates may be required to provide additional information to conduct a background check.)

Monitoring & Evaluation Specialist, NO-C, Fixed Term, Brasilia, Brazil # 100374

Brasília | Brazil

  • Organization: UNICEF - United Nations Children’s Fund
  • Location: Brasília | Brazil
  • Grade: Mid level - NO-C, National Professional Officer - Locally recruited position
  • Monitoring and Evaluation
  • Closing Date: 2024-09-09

UNICEF Brazil is looking for an experienced professional in Monitoring & Evaluation. Based in Brasília, under the guidance of the Chief of Planning, Monitoring, Research and Evaluation, the M&E Specialist ensures that the Country Office and national partners are strengthened and receive the necessary support for research, monitoring and evaluation activities that will provide the most relevant and strategic information to manage the Country Programme, including tracking and assessing UNICEF’s distinct contribution. UNICEF’s active commitment to diversity and inclusion is critical to delivering the best results for children. This position is office-based and open to Brazilian nationals only, or to internal candidates with a work permit for Brazil.

UNICEF works in over 190 countries and territories to save children’s lives, defend their rights, and help them fulfill their potential, from early childhood through adolescence.

At UNICEF, we are committed, passionate, and proud of what we do. Promoting the rights of every child is not just a job – it is a calling.

UNICEF is a place where careers are built: we offer our staff diverse opportunities for personal and professional development that will help them develop a fulfilling career while delivering on a rewarding mission. We pride ourselves on a culture that helps staff thrive, coupled with an attractive compensation and benefits package.

Visit our website to learn more about what we do at UNICEF.

For every child, commitment

Since 1950, UNICEF has supported the most important transformations for children and adolescents in Brazil. With its main office in Brasília and nine field offices, UNICEF Brazil works in close partnership with national and sub-national government, corporate partners, youth, and civil society organizations. For information on the work of our organization in Brazil, please visit our website: UNICEF Brazil

How can you make a difference?

To ensure that the UNICEF Country Office has useful, valid and reliable information on the situation of children’s and women’s rights; the performance of UNICEF-supported programmes including their relevance, efficiency, effectiveness, and sustainability, and in emergency contexts, their coverage, coordination and coherence.

Under the guidance of the Chief of Planning, Monitoring, Research and Evaluation, the M&E Specialist ensures the Country Office and the national partners are strengthened and receive the necessary support for research, monitoring and evaluation activities that will provide the most relevant and strategic information to manage the Country Programme, including tracking and assessing UNICEF’s distinct contribution. The Monitoring & Evaluation Specialist will be responsible to:

1.  Integrated Monitoring, Evaluation & Research Plan (IMEP) 

  • Ensure that the Country Office and national and sub-national partners use a well-prioritized and realistic plan for research, monitoring and evaluation activities that will provide the most relevant and strategic information to manage the Country Programme, and support the UNICEF Goal Area sections and decentralized Field Offices in their efforts at evidence and data generation. The specialist is expected to consider and put in place knowledge, data and evidence partnerships that can effectively reduce existing data gaps on child rights deprivations, especially taking equity levels into account.

2.  Situation Monitoring and Assessment 

  • Ensure that the Country Office and national partners have timely and accurate measurement of changes in conditions in the country or regions, including monitoring of socio-economic trends and the country’s wider policy, economic or institutional context, to facilitate planning and to draw conclusions about the impact of programmes or policies on the country’s trajectory towards the Sustainable Development Goals (SDGs) and the objectives of UNICEF’s Strategic Plan. This includes leading the updating of information platforms and periodic updates of the SITAN. Where necessary, the M&E Specialist is also asked to support evidence generation and situation monitoring during emergencies and to work with broader UN partners on data and evidence.

3. Programme Performance Monitoring 

  • Ensure that the Country Office has quality information to assess progress towards expected results established in annual work plans and that results are in line with the greater objectives defined in the CPD and through the Theory of Change. Support sections to monitor and understand the distinct UNICEF contribution to results, including during emergency response.

4.  Evaluation

  • Ensure that UNICEF-supported evaluations (as described in the CEP) are designed and implemented in accordance with UN quality standards, and that the results are disseminated in a timely fashion to stakeholders in order to improve programme performance and contribute to wider learning. This includes ensuring a fluent communication and coordination with the Evaluation units both at regional and headquarter level and providing agile and efficient support to the sections that are leading the evaluations.

5.  M&E Capacity Building

  • Ensure that the monitoring and evaluation capacities of Country Office staff and national partners – government and civil society – are strengthened, enabling them to increasingly engage in and lead monitoring and evaluation processes.

6. Coordination and Networking

  • Ensure that the UNICEF office is effectively linked to wider UNICEF M&E developments in a way that both contributes to and benefits from organizational learning on effective M&E management. Brazil Country Office is uniquely positioned both to pilot several of the new innovative M&E approaches as well as share expertise and best practices with other smaller UNICEF offices in the region and through South-South Cooperation. At the same time, a lot of the research and evaluations that are generated will be leveraged for communication and advocacy and the M&E specialist will play an important role in coordinating with these sections.

7. Team Management

  • Provide oversight and guidance to the M&E colleagues that work in the decentralized field offices and in Brasilia ensuring they are providing adequate support to the sections and offices and making sure that information flows freely.

To qualify as an advocate for every child you will have…

Minimum requirements:

Advanced university degree (master’s) in Social Sciences, International Relations, Political Science, International Finance, Public Policies and Relations, Statistics, or Development Studies.

Work Experience:

A minimum of five years of relevant professional work experience in programme development and implementation including monitoring and evaluation activities.

Language: Fluency in English and Portuguese is required.

Desirables:

  • Experience working in a developing country.
  • Exposure to emergency programming, including preparedness planning; active involvement in a humanitarian crisis response programme is preferred.
  • Knowledge of Spanish.

For every Child, you demonstrate...

UNICEF’s Core Values of Care, Respect, Integrity, Trust, Accountability and Sustainability (CRITAS) underpin everything we do and how we do it. Get acquainted with Our Values Charter: UNICEF Values

The UNICEF competencies required for this post are…

(1) Builds and maintains partnerships

(2) Demonstrates self-awareness and ethical awareness

(3) Drive to achieve results for impact

(4) Innovates and embraces change

(5) Manages ambiguity and complexity

(6) Thinks and acts strategically

(7) Works collaboratively with others 

(8) Nurtures, leads and manages people

Familiarize yourself with our competency framework and its different levels.

UNICEF is here to serve the world’s most disadvantaged children and our global workforce must reflect the diversity of those children. The UNICEF family is committed to including everyone, irrespective of their race/ethnicity, age, disability, gender identity, sexual orientation, religion, nationality, socio-economic background, or any other personal characteristic.

We offer a wide range of measures to support a more diverse workforce, such as paid parental leave, time off for breastfeeding purposes, and reasonable accommodation for persons with disabilities. UNICEF strongly encourages the use of flexible working arrangements.

UNICEF does not hire candidates who are married to children (persons under 18). UNICEF has a zero-tolerance policy on conduct that is incompatible with the aims and objectives of the United Nations and UNICEF, including sexual exploitation and abuse, sexual harassment, abuse of authority, and discrimination. UNICEF is committed to promoting the protection and safeguarding of all children. All selected candidates will undergo rigorous reference and background checks and will be expected to adhere to these standards and principles. Background checks will include the verification of academic credential(s) and employment history. Selected candidates may be required to provide additional information to conduct a background check.

UNICEF appointments are subject to medical clearance.  Appointments may also be subject to inoculation (vaccination) requirements, including against SARS-CoV-2 (Covid). Should you be selected for a position with UNICEF, you either must be inoculated as required or receive a medical exemption from the relevant department of the UN. Otherwise, the selection will be canceled.

As per Article 101, paragraph 3, of the Charter of the United Nations, the paramount consideration in the employment of the staff is the necessity of securing the highest standards of efficiency, competence, and integrity.

UNICEF’s active commitment to diversity and inclusion is critical to delivering the best results for children. For this position, eligible and suitable Afro-descendant, indigenous, LGBTQIA+ and other minority candidates are encouraged to apply.

Government employees who are considered for employment with UNICEF are normally required  to resign from their government positions before taking up an assignment with UNICEF. UNICEF reserves the right to withdraw an offer of appointment, without compensation, if a visa or medical clearance is not obtained, or necessary inoculation requirements are not met, within a reasonable period for any reason. 

UNICEF does not charge a processing fee at any stage of its recruitment, selection, and hiring processes (i.e., application stage, interview stage, validation stage, or appointment and training). UNICEF will not ask for applicants’ bank account information.

All UNICEF positions are advertised, and only shortlisted candidates will be contacted and advance to the next stage of the selection process. An internal candidate performing at the level of the post in the relevant functional area, or an internal/external candidate in the corresponding Talent Group, may be selected, if suitable for the post, without assessment of other candidates.

This job opportunity is presential and for Brazilians only or internal candidates with a work permit for Brazil.

Additional information about working for UNICEF can be found here .


Performance Monitoring and Evaluation Analyst (GMG/SEG 2)

(Salary: $4,266,270.00 per annum)

JOB PURPOSE

Under the general direction of the Director, Corporate Planning and Performance Management, the Performance Management Analyst is responsible for assisting with the development and management of the performance management of the Ministry’s policies, programmes and projects to ascertain the attainment of established objectives and performance standards. 

KEY OUTPUTS

  • Ministry/Departments/Agencies Performance reports produced
  • Trend Analysis conducted 
  • Research conducted and findings compiled
  • Quantitative and qualitative analysis conducted  
  • Annual/Quarterly/Periodic Reports prepared
  • Advice and interpretation provided 
  • Individual Work plan developed

KEY RESPONSIBILITY AREAS INCLUDE:

  • Contributes to the development of a Performance Monitoring and Evaluation Results Measurement (RM) Framework, guided by the Ministry’s Strategic/Corporate Plan, as the machinery for managing the performance of the Ministry and its portfolio agencies’ plans, programmes and projects;
  • Assists with the formulation of Performance Indicators for use in the assessment of the ministry’s and its portfolio agencies’ policies, programmes and projects;
  • Monitors and evaluates newly implemented plans, policies and procedures to analyze effectiveness and progress;
  • Liaises with all departments/ branches/units and portfolio agencies to gather status data, conducts analyses and makes recommendations;
  • Provides support in the setting of Operational Objectives to guide the operations of the performance management function;
  • Organizes and convenes Corporate Planning Coaching sessions for assigned programmatic areas;
  • Assists with the design and conduct of secondary research into the impact, relevance and effectiveness of the Ministry’s policies, departments/agencies, programmes and projects to inform and update the Ministry’s policies and planning process;
  • Establishes evaluation schedules and guides Heads of Divisions/Units and Programme Managers on the importance of the evaluation exercise in the decision making process;
  • Evaluates Divisions/Branches/Sections/Units Strategic Corporate and operational plans against set performance targets ensuring that these plans are based on key outputs and objectives, and are linked to budget forecasts;
  • Advises Heads of Divisions/Branches/Sections/Units on significant variance from targets in strategic corporate and operational plans and programmes, and recommends alternative strategies;
  • Monitors projects/ programmes and the implementation of decisions taken in respect of policy issues, and offers solutions for the handling of constraints and procedural bottlenecks;
  • Supports the monitoring of the quality and completeness of data for documenting project performance, ensures project data supports evidence-based decision making, and solves data problems when they arise;
  • Collaborates with key stakeholders in the development, implementation and maintenance of standard operating procedural manuals on the Performance Management process;
  • Monitors and assesses the ministry policies, programmes and projects against established objectives and performance criteria;
  • Conducts qualitative and quantitative analysis of Ministry’s programmes, policies and projects;
  • Contributes to the strategic planning process of the Ministry with team members;
  • Liaises with the Finance and Accounts Division in the ongoing monitoring of expenditure on programmes and projects, assesses capital and operational budgets against ministry policies and priorities, and proposes adjustments where appropriate;
  • Liaises with the Risk Management Branch in the sharing of data/information to aid in the mitigating of prospective risks;
  • Prepares reports/findings on performance management exercises;
  • Prepares official papers and submissions on monitoring and evaluation results in order to inform and update planning and policy development;
  • Evaluates and updates measures designed to improve the methods and standards used in developing performance indicators for the ministry’s policies, programmes and projects;
  • Liaises with monitoring and evaluation divisions within central government and related entities, to support the strengthening of the performance management and evaluation process;
  • Provides technical advice to internal and external stakeholders;
  • Develops, implements and maintains standard operating procedural manuals on the Performance Management and Evaluation process. 

PERFORMANCE STANDARDS

  • The Performance Management Plan provides a sound framework for effective monitoring and evaluation of ministry policies, programmes and projects
  • Performance indicators are measurable, reliable and valid  
  • Performance monitoring and evaluation exercises are conducted in accordance with established procedures
  • Plans, policies, programmes and operations of the Ministry are monitored and assessed according to agreed timelines to ensure conformity to Ministry objectives and established standards of performance
  • Performance Management findings provide sound bases for decision-making
  • Performance management reports are prepared and provided within allotted timeframes
  • Technical advice and recommendations provided are sound and supported by qualitative/quantitative data
  • Individual work plans conform to established procedures and are implemented according to established rules
  • Reports are evidence-based and submitted in a timely manner
  • Confidentiality, integrity and professionalism displayed in the delivery of duties and interaction with staff.

Minimum Required Education and Experience   

  • Bachelor’s Degree in Management Studies, Public Administration, Business Administration or a related discipline; 
  • Specialized training in Corporate/Strategic Planning, Performance Monitoring and Project Management;
  • Three (3) years related experience. 

Kindly submit a cover letter and resume along with the names, telephone numbers, and email addresses of two (2) references, one of whom must be a former or current supervisor.

   

Applications with résumés are to be submitted no later than Wednesday, September 11, 2024, to:

  Senior Director

Human Resource Management & Development

Ministry of Health & Wellness

10a Chelsea Avenue

Kingston 10

Email: [email protected]

The Ministry of Health & Wellness thanks all applicants for their interest, but only those shortlisted will be contacted.

  • Organization: UNDP
  • Country: Sierra Leone
  • City: Freetown
  • Office: UN Women Sierra Leone in Freetown
  • Follow @UNjobs

Description of assignment title: Monitoring & Evaluation (M&E) Specialist

Assignment country: Sierra Leone

Expected start date: 09/19/2024

Sustainable Development Goal: 17. Partnerships for the goals

Volunteer category: International UN Volunteer Specialist

Host entity: UNDP

Type: Onsite

Duration: 12 months

Number of assignments: 1

Duty stations: Freetown

Mission and objectives

The UNDP Sierra Leone Country Office was established in 1978 and has over the years grown to a total staff strength of over 140, with an average annual delivery of USD 25 million. UNDP has established a Planning and Support Unit (PSU) to provide effective and efficient management advisory support services to senior management and to programme and operations Units. It specifically leads on programme quality assurance, results-based management, resource mobilization, and donor relations and reporting.

Under the overall guidance of the Resident Representative and direct supervision of the Team Leader of the PSU, the Donor Relations and Reporting Specialist leads on partnership building and ensuring donor compliance and high-quality donor reporting. He/she will: (i) identify and explore potentials for strategic partnership and programme synergies, (ii) coordinate donor relations and development assistance, (iii) ensure donor reporting compliance and consistency with UNDAF/CPD and other corporate documents, (iv) develop and enhance partnerships, and (v) mobilize resources. He/she will also support knowledge management and information dissemination in a coordinated, systematic, client-oriented and timely manner for all programmes. The Donor Relations and Reporting Specialist works in close collaboration with Senior Management and Team Leaders in the Programme and Operations Units. He or she is expected to coordinate with relevant colleagues in other UN Agencies.

The UN Secretary-General launched on 1 January 2019 a bold and new global reform that repositioned the UN Development System to deliver more effectively and efficiently with the achievement of the 2030 Agenda and the Sustainable Development Goals. As part of this reform, UN Resident Coordinator Offices (UN RCO), under the leadership of an empowered and independent UN Resident Coordinator - the highest-ranking official of the UN Development System and Representative of the UN Secretary-General at the country level - support countries in the achievement of their development priorities and the attainment of the SDGs.

The Peacebuilding Fund (PBF) has helped catalyze critical peacebuilding processes over the past twenty years, helping to restore democracy, rebuild state institutions, sustain post-war reconstruction, and rebuild social cohesion after the country’s civil war. Thanks in part to PBF assistance, and with the technical support of UN agencies alongside the government’s commitment and participation, progress has been made in sustaining peace and moving forward some of the recommendations of the Truth and Reconciliation report. Since the end of the civil war, the PBF has invested about USD 84 million.

Since 2016, the UN Peace and Development Advisor has helped to further develop the PBF portfolio, currently about USD 16 million, providing assistance to UN Agencies, NGOs, and Government to plan and design projects, undertaking monitoring, helping to unblock bottlenecks, advising on conflict sensitivity aspects, and providing dedicated training on conflict analysis to local stakeholders. Considering the number of tasks ahead, it has been decided to establish a small dedicated PBF Secretariat within the RC Office, consisting of the PBF Coordinator and the PBF M&E expert, to help in the coordination and quality assurance of the PBF portfolio of projects in Sierra Leone. In addition, a dedicated PBF Joint Steering Committee (JSC) will be established to strengthen high-level oversight of the PBF-supported projects and overall coordination of the portfolio. The JSC will be co-chaired by the United Nations Resident Coordinator and the Minister of Planning and Economic Development of Sierra Leone’s Government. The PBF Secretariat will help the JSC in its coordination and monitoring tasks.

The PBF Secretariat personnel will be recruited by UNDP, as the administrative Agency for this purpose.

Under the overall guidance of the Resident Coordinator (RC/O), reporting directly to the PBF Coordinator, the Monitoring & Evaluation (M&E) Specialist of the PBF Secretariat will work in close collaboration with the RCO M&E Officer and oversee development and implementation of a monitoring, evaluation, and reporting framework for the PBF portfolio in Sierra Leone. He/she will contribute to strengthening PBF recipient agencies and partners' M&E capacities.

The M&E Specialist will oversee the development of a joint integrated M&E system for PBF projects (including cross-border and gender and youth promotion initiative projects) to promote synergies between projects. He/she will ensure the formulation of common results and indicators at the macro level to help measure achievements based on priorities defined in the future PBF eligibility application.

Task description

1. Design and implement monitoring and evaluation frameworks, systems, and tools, and as applicable, conduct programme quality assurance in keeping with corporate policies.

  • Develop integrated monitoring and evaluation (M&E) and results-based management (RBM) frameworks and systems for overseeing Peacebuilding Fund (PBF) projects. This will help provide a portfolio level sense of progress and will help the Resident Coordinator (RC), the Joint Steering Committee (JSC) and Peacebuilding Support Office (PBSO) in making informed decisions.
  • Provide M&E-related analysis to the JSC and PBSO, including monitoring and analysis of PBF-supported project implementation progress, and facilitating potential field visits by the JSC.
  • Monitor the impact of peacebuilding projects by organizing field visits and joint M&E missions to project sites, in close collaboration with the Agencies and project teams.
  • Facilitate joint M&E activities with national partners and JSC as appropriate.
  • Identify any challenges to project implementation and, with the PBF Coordinator and project implementing teams, propose solutions or corrective measures in a timely manner.
  • Conduct a joint annual review of work plans as part of reviewing implementation progress.
  • Prepare briefings to communicate project results to the RC(O), JSC and PBSO.

2. Lead coordination of and support to monitoring, reporting, evaluation and results-based management of PBF portfolio

  • Support the implementing agencies of PBF projects in collecting and analyzing data.
  • Ensure that data collection aligns with future PBF Strategic Result Framework.
  • In collaboration with project teams, ensure that the results of different projects are documented, monitored, evaluated, and analyzed.
  • Support PBF projects to commission baseline and endline surveys as well as project evaluations in close collaboration with the PBF Secretariat Coordinator and the peacebuilding team.
  • Regularly review individual project implementation through monitoring visits.
  • Review PBF agency project reports and support their quality assurance by providing timely feedback.
  • Support the PBF Coordinator to draft the PBF strategic annual report.

3. Provide substantive contributions to capacity building and facilitate knowledge building and sharing on monitoring, evaluation, and results-based management.

  • Contribute to assessing M&E capacity needs and elaborating capacity development plans, tools, and methodologies for implementing Agencies.
  • Support the project coordinator to identify, document, and disseminate key lessons learned and good practices derived from monitoring and evaluation activities.
  • Create a database of interventions and stakeholders involved in peacebuilding to promote synergies.
  • Support the PBF coordinator to promote visibility of the PBF portfolio.

4. The incumbent performs other duties within their functional profile as deemed necessary for the efficient functioning of the Office and the Organization.

Eligibility criteria

Age: 18 - 80

Nationality

Candidate must be a national of a country other than the country of assignment.

Requirements

Required experience

3 years of experience in M&E, working with quality assurance instruments, results-based management models, and multiple stakeholders.

  • Proven technical knowledge of the results-based management cycle and monitoring and evaluation is required.
  • Proven experience in designing monitoring and evaluation systems is required.
  • Proven ability to work with quantitative and qualitative data, and appropriately use qualitative data/information collection tools in innovative ways is desirable.
  • Experience in the use of computers, office software packages (MS Word, Excel, etc.), and web-based management systems, as well as up-to-date, advanced knowledge of data management tools and database packages.
  • Experience in designing, implementing and/or managing assessments and/or data collection exercises/processes is desired.
  • Experience conducting applied research projects and reporting on findings is desired.
  • Understanding of development and peacebuilding contexts and specific challenges and approaches to M&E in such contexts is required.
  • Experience in peacebuilding settings is desirable.

Area(s) of expertise

Development programmes, Information technology

Driving license

Languages

English, Level: Fluent, Required

French, Level: Working knowledge, Desirable

Required education level

Master’s degree or equivalent in Development/Development Studies, Economics, Statistics, Business Management, Social Policy, or a related field is required;

a Bachelor’s degree in Social Sciences, Business Administration, Development Studies, Economics, Political Sciences, Peace and Conflict Studies or a related field, in combination with two additional years of qualifying experience, may be accepted in lieu of the Master’s degree.

Competencies and values

Business Direction & Strategy

System Thinking: Ability to use objective problem analysis and judgement to understand how interrelated elements coexist within an overall process or system, and to consider how altering one element can impact on other parts of the system

Business Management

Portfolio Management: Ability to select, prioritize and control the organization's programmes and projects, in line with its strategic objectives and capacity; ability to balance the implementation of change initiatives and the maintenance of business-as-usual, while optimizing return on investment

Monitoring: Ability to provide managers and key stakeholders with regular feedback on the consistency or discrepancy between planned and actual activities and programme performance and results

Evaluation: Ability to make an independent judgement based on set criteria and benchmarks. Ability to anticipate client's upcoming needs and concerns

Knowledge Generation: Ability to research and turn information into useful knowledge, relevant for context, or responsive to a stated need

Partnership Management

Relationship Management: Ability to engage with a wide range of public and private partners, build, sustain and/or strengthen working relations, trust and mutual understanding

Result Orientated

Think Innovatively

Adapt with Agility

Act with Determination

Engage and Partner

Enable Diversity and Inclusion

Other information

Living conditions and remarks

Freetown is the capital city of Sierra Leone, situated on the Atlantic coast of West Africa. The country recently suffered the worst Ebola virus outbreak in its history, and probably in Africa, leading to economic and social shocks.

The country is, however, slowly recovering from the joint shocks of the Ebola Virus Disease and a collapse in world iron ore prices. Sierra Leone is generally safe, the security level is low, and the society is very religiously tolerant. Freetown is a family duty station.

The cost of living for expatriates is generally fair, and there are good supermarkets, restaurants, and hotels available. Good communication services, including mobile and internet, are available at fair prices. Local and regional banks deal in both local and foreign currencies.

Inclusivity statement

United Nations Volunteers is an equal opportunity programme that welcomes applications from qualified professionals. We are committed to achieving diversity in terms of gender and other protected characteristics. As part of their adherence to the values of UNV, all UN Volunteers commit themselves to combat any form of discrimination, and to promoting respect for human rights and individual dignity, without distinction of a person's race, sex, gender identity, religion, nationality, ethnic origin, sexual orientation, disability, pregnancy, age, language, social origin or other status.

Note on Covid-19 vaccination requirements

Selected candidates for certain occupational groups may be subject to inoculation (vaccination) requirements, including against SARS-CoV-2 (Covid-19), in line with the applicable host entity policy.


Program Officer (Climate Adaptation, Monitoring and Evaluation Advisor), FL-0343-02


  • Agency: U.S. Agency for International Development
  • Organization: Bureau for Resilience, Environment and Food Security, Feed the Future Policy, Analysis, and Engagement Office (REFS/FTF-PAE)
  • Location of Position: Washington, DC
  • Open Period: August 29, 2024 - September 20, 2024
  • Appointment Type: This is an excepted service, time-limited appointment, not to exceed five initial years; it may be considered for two two-year extensions, depending on the needs of the service.
  • Telework Eligibility: Telework eligible for up to four days per pay period (two days per week).
  • Remote Eligibility: The position is NOT remote eligible.
  • Salary: (USD) $132,860 - $191,900 per year
  • Number of Vacancies: One

Description of Organization: The Bureau leads USAID’s integrated and inclusive approach to confront the crises of food security and climate change. The Bureau coordinates across USAID Missions, Bureaus, and Independent Offices to foster a world in which all people have sustained access to safe water, nutritious food, a healthy environment, and improved livelihoods as a result of inclusive, locally driven development strategies that protect the planet and our future.

Duties and Responsibilities: The Program Officer serves as the Agriculture and Climate Adaptation Monitoring and Evaluation Advisor and will support USAID and other organizations by providing technical leadership and guidance in the monitoring and evaluation of Feed the Future investments in general, with specific attention to monitoring and evaluating climate adaptation and agriculture investments. The Program Officer will work with a variety of USAID Operating Units, international organizations, universities, and other institutions, including through participation in technical meetings, conferences, seminars, and other activities. In addition, the incumbent will monitor financial and administrative aspects of program implementation; provide operational oversight of programs to ensure that USAID's responsibilities and obligations are met; lead project and activity management teams, including monitoring, evaluation and learning teams; and oversee technical and non-technical contractor activities that support the program. As a Program Officer, you will:

  • Provide interim evaluation reports and findings as data points to be used in strategic portfolio and strategic performance reviews;
  • Serve as a key point of contact with persons and groups both inside and outside of USAID for climate adaptation-related monitoring, evaluation, and analysis;
  • Manage contracts and/or grants that collect monitoring data, conduct evaluative research on programs with climate adaptation objectives, or conduct climate related analyses; and
  • Provide high-level technical and analytical leadership and guidance on monitoring and evaluation metrics and methods for REFS-supported climate portfolios, strategies, and initiatives, particularly in relation to climate adaptation programming and mainstreaming and including adaptation, mitigation, and development co-benefits.

Specialized Experience: For the FL-02 (GS-14 equivalent), you must have a minimum of one year of specialized experience at the FL-03 (GS-13) grade level or equivalent and an advanced degree in a relevant area of study. Specialized experience is defined as providing high-level technical support related to monitoring, evaluation, measurement, or analytics related to climate change and/or climate adaptation, especially in the context of agriculture.

Education Requirement:

  • At the FSL-02 level, at least a graduate degree is required.

Conditions of Employment: 

  • Must be a U.S. citizen. Candidates must indicate citizenship on their application. If citizenship is not stated, the application will not be considered; and
  • Must be eligible to obtain and maintain a Secret-level security clearance.

Desired Knowledge, Skills, and Abilities: 

  • Development of indicators and measurement techniques to monitor and evaluate climate adaptation results within food systems; 
  • Strong data analysis skills;
  • Developing methodological guidance for monitoring and evaluation in general and in the context of agriculture, resilience, and climate adaptation in particular;
  • Strong familiarity with the application of USAID’s  2023 Feed the Future Indicator Handbook  and USAID’s  2023 Climate Change Standard Indicator Handbook ; and
  • Strong familiarity with leading global climate models, their outputs, and the ways these data can appropriately be used.

Required Documents: 

  • A cover letter expressing interest and clearly addressing the stated requirements for the position;
  • A resume/CV that must include the month, year, and the number of hours worked per week for each position listed; and
  • At least three professional references.

Please submit your application package to  [email protected] . Please use the subject line “ REFS_174 FSL APPLICATION PACKAGE: Program Officer, FL-0343-02 (Agriculture and Climate M&E Specialist, PD# 40649) .” Application submissions are required by  11:59 p.m. ET, September 20, 2024.  Packages not submitted by the deadline with the specified subject line—or incomplete packages—will not receive consideration. This notice may be used to fill additional vacancies, as the workforce needs of the Bureau may change.

USAID DEI Commitment

USAID’s Bureau for Resilience, Environment and Food Security (REFS) is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will be considered for employment without regard to their race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status. REFS remains committed and compliant with all fair employment practices to ensure that our place of work is diverse, equitable, and inclusive.

Additional Resources:

  • EEO Policy Statement
  • Reasonable Accommodation Policy
  • Foreign Service Salaries
  • Healthcare & Insurance

Any questions concerning this notice may be directed to Mousumi Sarkar at  [email protected] .


County Monitoring and Evaluation Officer – OVC

  • Lodwar, Turkana County
  • Posted 24 hours ago

Amref Health Africa in Kenya

The County Monitoring and Evaluation Officer – OVC will be responsible for the day-to-day tracking and reporting of Sub-Purpose 3 (OVC) project activities to measure the project’s impact and progress, ensuring that data is readily available as a foundation for programmatic adjustments and evidence-based decision-making. S/he will implement and regularly update the project M&E Plan, Indicator Tracking Table (ITT), and Performance Monitoring Plan (PMP), including identifying appropriate indicators and designing data collection tools so that the project can collect and report data measuring performance and achievement of project objectives.
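The core mechanics of an Indicator Tracking Table — each indicator carrying a target and period-by-period actuals that roll up into a percent-achieved figure — can be sketched in a few lines. This is a minimal illustration only; the indicator code and field names are hypothetical, not taken from the project's actual ITT.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """One row of a simplified, hypothetical Indicator Tracking Table (ITT)."""
    code: str                 # illustrative identifier, e.g. "OVC_SERV"
    description: str
    target: float             # planned value for the reporting year
    actuals: list = field(default_factory=list)  # one value per reporting period

    def actual_to_date(self) -> float:
        return sum(self.actuals)

    def percent_achieved(self) -> float:
        if not self.target:
            return 0.0
        return round(100 * self.actual_to_date() / self.target, 1)

# Track one indicator across two reporting periods
ovc_served = Indicator("OVC_SERV", "OVC receiving at least one service", target=400)
ovc_served.actuals += [150, 130]
print(ovc_served.percent_achieved())  # 280 of 400 -> 70.0
```

In practice an ITT also records disaggregations (sex, age band, sub-county) and data sources per indicator, but the target-versus-actual comparison above is the figure that feeds programmatic adjustment decisions.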


