Challenge: Limited evidence base/data for impact.
Implication: Inability to evidence hypotheses, contribution and attribution.
Mitigation strategies:
- Unpack assumptions around prevention. Use evidenced theories of change (ToCs).
- Targeted research and analysis into PVE hypotheses.
- Use contribution analysis tools and processes.
Challenge: Rapidly changing and dynamic VE context (PVE priorities change, e.g. a shift in focus to reintegration of returned fighters).
Implication: Intervention strategies, ToC logic and baseline data may be out of date or need adaptation.
Mitigation strategies:
- Regularly review conflict analyses and needs assessments (macro and micro level).
- Review and adapt evidence-based ToCs in line with the conflict dynamics and likely conflict scenarios.
- Monitor for conflict sensitivity.
- Conduct scenario planning.
Challenge: Insecurity and threats to the safety of M&E personnel (e.g. from VE groups, ‘at-risk’ groups).
Implication: Lack of access to or oversight of areas of implementation. Lack of ability to verify or triangulate data.
Mitigation strategies:
- Third-party monitoring by trusted partners and community groups with access to hard-to-reach groups.
- Develop M&E processes that consider the conflict dynamics and the risks to implementers and M&E teams.
- Remote evaluation techniques, e.g. comments boxes, SMS or telephone reporting, web-based monitoring or surveys, regular verbal reports and peer observations. See ECHO guidance on remote management; GSDRC remote management of projects in fragile states.
Challenge: Difficulties accessing those most ‘at risk’ of VE.
Implication: Inability to reach those most at risk of VE, or to monitor the project’s ability to work with them.
Mitigation strategies:
- Context analysis to develop an in-depth, contextualised understanding of who is most at risk of VE.
- Consultation with local experts to develop an understanding of vulnerability within the communities/areas identified. Working through intermediaries (CSOs, community leaders, etc.) to access these groups.
- Remote M&E or third-party M&E.
Challenge: Bias in feedback from M&E participants/respondents.
Implication: Results in unreliable data and inaccurate reporting and analysis of data.
Mitigation strategies:
- Use anonymised data-collection techniques (such as online, SMS and remote survey techniques).
- Test and validate questions to identify and reduce the risk of bias.
Challenge: Reliance on partners’ M&E systems.
Implication: Reduced oversight. Weak systems result in a lack of data, learning and evidence. Failure to integrate learning into project design.
Mitigation strategies:
- Partner DM&E capacity needs assessment.
- Tailored development of M&E tools for partners.
- Accompanied capacity-building for partners (direct, remote or through a local third party).
- Budget for M&E capacity-building support within project budgets.
Challenge: Lack of reliable and verifiable publicly held data (such as national statistics).
Implication: Creates data gaps, making it difficult to monitor objective and variable national-level indicators and to triangulate data.
Mitigation strategies:
- Identification of alternative data sources (e.g. other implementing agencies or academic institutions).
- Triangulation of existing data from various sources. Community audits and self-assessments (with a range of actors for comparison).
- Consultations with external/independent experts.