Monitoring strategy and data collection

Module 5: Monitoring tools

5.1 Strategies to address challenges of monitoring PVE programming

This tool offers examples of challenges encountered in monitoring PVE programming, the potential impact these may have, and an example strategy for monitoring each challenge.
Why use it? To recognise challenges early on and to stimulate ideas for potential monitoring strategies.

This tool is most useful at the implementation, monitoring and adaptation stage. If used for baseline or context analysis, it is also useful at the design stage. Note that data collection methods should be considered at the design stage to ensure the right methods are selected for what the tool seeks to measure. It can be used together with:

1.1 Understanding the VE challenge

1.4 Prioritisation of factors

2.2 Articulating change

Table 8: Strategies to address challenges of monitoring PVE programming

Challenge: Limited evidence base/data for impact.
Impact: Inability to evidence hypotheses, contribution and attribution.
Monitoring strategies:
  • Unpack assumptions around prevention. Use evidenced ToCs.
  • Targeted research and analysis into PVE hypotheses.
  • Use contribution analysis tools and processes.

Challenge: Rapidly changing and dynamic VE context (PVE priorities change, e.g. a change in focus to reintegration of returned fighters).
Impact: Intervention strategies, ToC logic and baseline data may be out of date or need adaptation.
Monitoring strategies:
  • Regularly review conflict analyses and needs assessments (macro and micro level).
  • Review and adapt evidence-based ToCs specific to the conflict dynamics and to likely conflict scenarios.
  • Monitor for conflict sensitivity.
  • Conduct scenario planning.

Challenge: Insecurity and threats to the safety of M&E personnel (e.g. from VE groups, ‘at-risk’ groups).
Impact: Lack of access to or oversight of areas of implementation. Inability to verify or triangulate data.
Monitoring strategies:
  • Third-party monitoring by trusted partners and community groups with access to hard-to-reach groups.
  • Develop M&E processes which consider the conflict dynamics and the risks to implementers and M&E teams.
  • Remote evaluation techniques, e.g. comments boxes, SMS or telephone reporting, web-based monitoring or surveys, regular verbal reports and peer observations. See ECHO guidance on remote management; GSDRC remote management of projects in fragile states.

Challenge: Difficulties accessing those most ‘at risk’ of VE.
Impact: Unable to reach, or to monitor the project’s ability to work with, those most at risk of VE.
Monitoring strategies:
  • Context analysis to develop an in-depth, contextualised understanding of who is most at risk of VE.
  • Consultation with local experts to develop an understanding of vulnerability within the communities/areas identified. Working through intermediaries (CSOs, community leaders, etc.) to access these groups.
  • Remote M&E or third-party M&E.

Challenge: Bias in feedback from M&E participants/respondents.
Impact: Unreliable data and inaccurate reporting and analysis.
Monitoring strategies:
  • Use anonymised data-collection techniques (such as online, SMS and remote survey techniques).
  • Test and validate questions to identify and reduce the risk of bias.

Challenge: Reliance on partners’ M&E systems.
Impact: Reduced oversight. Weak systems result in a lack of data, learning and evidence, and a failure to integrate learning into project design.
Monitoring strategies:
  • Partner DM&E capacity needs assessment.
  • Tailored development of M&E tools for partners.
  • Accompanied capacity-building process for partners (direct, remote or through a local third party).
  • Budget for M&E capacity-building support within project budgets.

Challenge: Lack of reliable and verifiable publicly held data (such as national statistics).
Impact: Creates data gaps – difficult to monitor objective and variable national-level indicators. Difficult to triangulate data.
Monitoring strategies:
  • Identify alternative data sources (e.g. other implementing agencies or academic institutions).
  • Triangulate existing data from various sources. Community audits and self-assessments (with a range of actors for comparison).
  • Consult external/independent experts.

Monitor the context focusing on priority factors and dynamics related to VE identified in your analysis.

Monitor the intervention in terms of its stated PVE goals, capturing both negative and positive unintended outcomes.

Monitor the interaction between the PVE context and the intervention:

  • identify indicators that track the interaction between the project and the context during implementation;
  • develop monitoring plans for the context and interaction indicators and their implementation; and
  • conduct a regular review of monitoring information with appropriate project adjustment.

  • Ensure the monitoring process is sensitive to the context.
  • Consider the framing of questions and the make-up of the evaluation team, and use data collection and analysis methods that are participatory, transparent and involve feedback.
  • Use complementary indicators – beyond output-level quantitative measures.
  • Include indicators which track impact and outcome-level change, both quantitative and qualitative, as well as indicators which track the PVE context and risk, the interaction between the context and the intervention, process, and gender sensitivity.