ALMM monitoring and evaluation tools (draft)


1. ALMM Monitoring and Evaluation Tools
EUNES IPA Project: Technical Assistance to Enhance Forecasting and Evaluation Capacity of the National Employment Service
EuropeAid/128079/C/SER/RS

2. Monitoring: Definition
Monitoring aims:
- to highlight strengths and weaknesses in implementation;
- to enable responsible personnel to deal with problems, improve performance, build on success, and adapt to changing circumstances;
- to provide the mechanism by which relevant information is channelled to the right people at the right time.

3. Monitoring
The types of information necessary:
- programme inputs;
- progress against objectives and against the Implementation Plan;
- results of activities and outputs achieved;
- impact on the target group;
- the way the programme is managed and the style of work.
The means of gathering information:
- site visits to local offices and projects;
- interviews with staff, project personnel and beneficiary groups;
- observation of project activities;
- analysis of activity reports, statistical reports and other documents;
- analysis of financial documents.

4. Performance Monitoring System: Levels
Process monitoring:
- reviewing and planning work on a regular basis;
- assessing whether activities are carried out as planned;
- identifying and dealing with problems;
- building on strengths; and
- assessing whether the style of work is the best way to achieve the programme objectives.
Impact monitoring:
- progress towards objectives is measured continuously;
- implementation is modified in response to changing circumstances without losing sight of overall objectives and aims;
- the need to change objectives (if necessary) can be identified;
- the need for further research can be identified;
- assumptions can be verified.

5. Performance Monitoring System
...provides the basis for the kind of management information system which is essential for programme operations, especially in situations where implementation is delegated or decentralised to local level.
Steps:
- establishing/confirming programme goals;
- developing performance indicators corresponding to programme goals;
- collecting data concerning the indicators;
- analysing the data;
- presenting the information appropriately;
- using the findings to improve activities.

6. Performance Monitoring System
The monitoring system for an ALMM-funded project rests on key documents:
- QUARTERLY MONITORING REPORT (covering staffing levels; activities engaged in over the reporting period; achievements/products; seminars and events, etc.);
- QUARTERLY FINANCIAL REPORT;
- PARTICIPANT START AND COMPLETION DETAILS (broken down by gender, age, target group, etc.).
Monitoring of an ALMM would focus on:
- the number of participants (broken down by gender, age, etc.) from the target group(s) within a specified period;
- the cost of the programme over the same period;
- the completion rate;
- qualifications obtained as a result of participation (if applicable);
- employment status (in the short run) immediately after completion.

7. Monitoring vs. Evaluation
Monitoring:
- assesses the success or failure of the programme;
- provides rapid information about the programme;
- programmes are expected to monitor for quality control and procedural purposes;
- in monitoring the effectiveness of ALMPs, secondary effects should be taken into account.
Evaluation:
- provides explanations;
- is a longer-term process.

8. Evaluation
- determines whether and why a programme is successful;
- assesses implementation and outcomes from a wider viewpoint;
- can take place at all stages of the programme;
- makes use of monitoring data.

9. Programme Evaluation
- individual systematic studies assess how well a project/programme has worked and what lessons can be learned;
- conducted by external experts;
- a programme evaluation examines achievement of programme objectives in the wider context;
- based on the statistics collected, or addressing important questions of a more qualitative nature;
- evaluation as a learning process.
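The completion-rate indicator from slide 6 can be sketched in code. This is a minimal illustration, not part of any actual reporting system: the record fields (`gender`, `age`, `completed`) and the sample data are hypothetical stand-ins for the PARTICIPANT START AND COMPLETION DETAILS report.

```python
from collections import defaultdict

# Hypothetical participant records, standing in for the quarterly
# PARTICIPANT START AND COMPLETION DETAILS report: each entry records
# gender, age, and whether the participant completed the measure.
participants = [
    {"gender": "F", "age": 24, "completed": True},
    {"gender": "F", "age": 31, "completed": False},
    {"gender": "M", "age": 45, "completed": True},
    {"gender": "M", "age": 29, "completed": True},
]

def completion_rates(records, key):
    """Completion rate (completions / starts), broken down by `key`."""
    starts = defaultdict(int)
    completions = defaultdict(int)
    for r in records:
        starts[r[key]] += 1
        completions[r[key]] += r["completed"]  # bool counts as 0 or 1
    return {group: completions[group] / starts[group] for group in starts}

print(completion_rates(participants, "gender"))  # {'F': 0.5, 'M': 1.0}
```

The same breakdown function works for any of the attributes the report tracks (gender, age band, target group), which is how a monitoring system can serve several indicator tables from one participant file.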
10. Evaluation Objectives: Find Out, Assess, Recommend
To find out:
- whether the programme is making progress towards achieving its objectives;
- who has benefited from the intervention;
- what the impact has been on the beneficiaries;
- whether there have been changes to the target group due to external factors.
To assess:
- whether the impact, if there is one, is due to the programme or to other factors;
- whether the aims and objectives of the programme are still relevant, or whether there is a better way of achieving them;
- whether the work is being carried out efficiently, and what major problems and constraints have arisen;
- whether the resources allocated were used efficiently and effectively;
- how changes in the needs of the target group affect future programmes.

11. Evaluation Objectives: Find Out, Assess, Recommend
To make recommendations about:
- how the programme could be improved;
- how the aims and objectives should be modified or revised;
- how the work can be monitored and evaluated in the future;
- how the work could be made more cost-effective.

12. Programme Evaluation
- partly a statistical exercise;
- statistics do not provide a full assessment;
- good evaluation involves evidence-based interpretation;
- elements of qualitative analysis.

13. Evaluation Elements: Outcomes and Process
Outcomes: what was achieved, and with what results?
Impact evaluation is a three-step process:
- What are the estimated impacts of the programme on the individual?
- Are the impacts large enough to yield net social gains?
- Is this the best outcome that could have been achieved for the money spent (effectiveness)?
The feasibility of replicating programme outcomes might also arise under this heading.

14. Evaluation Elements: Outcomes and Process
Process: how the outputs were achieved and how the programme was managed:
- programme design and methodology;
- programme management;
- service delivery mechanisms;
- the quality of co-operation with partner organisations;
- innovation (if any).
15. Relevance of Evaluation
The results of an evaluation exercise will:
- identify what worked well and what worked less well (outputs and processes);
- assist in the planning of current and future programmes;
- help to build on success, develop good practice, and avoid repeating mistakes;
- assist in the monitoring of the programme's future phase;
- help to shape the dissemination and mainstreaming strategy.
Qualitative analysis helps in judging the outcomes of the approach, i.e. the learning and process 'successes' of a programme that are not necessarily captured by Labour Market Information System (LMIS) statistics alone, but require complementary feedback from beneficiaries, employers and other stakeholders, gathered through interviews, focus group sessions, questionnaires, etc.
Evaluating the multiple contexts of a project may also point to situations that limit a project's ability to achieve anticipated outcomes, or lead to the realisation that specific interventions and their intended outcomes may be difficult to measure or to attribute to the project itself.

16. Evaluation: Who and How?
Four key issues for planning and undertaking an evaluation:
- Who should undertake the evaluation?
- When should evaluation take place?
- What should be evaluated?
- How is an evaluation conducted?
Who should undertake the evaluation?
- Self-evaluation: an evaluation exercise conducted by the programme sponsor or any other (partner) organisation involved; and
- external evaluation: an evaluation undertaken by an individual or organisation from outside the programme.

17. Self-Evaluation
Most programmes will probably conduct a self-evaluation; it is important to ensure that it is done properly. This requires:
- skilled staff, independent of the programme's management;
- time and other resources;
- available data;
- an understanding of research methods and data analysis;
- the ability to reflect on the progress of the programme against its stated objectives.
Skills to interpret this information and to report it in a clear and useful manner are also required.
18. External Evaluation
- The external evaluator can offer expert services; the levels of expertise and resources required for a thorough evaluation are likely to be greater.
- An expert can often provide a more cost-effective solution than self-evaluation.
- Objective evaluation: the external evaluator may also elicit more honest information from the staff and the beneficiaries of your programme.
- The perceived independence of the external evaluator can help to ensure that other organisations take the results more seriously.
- They may also (depending on your programme) be able to evaluate the programme within a wider context; this may help you to address evaluation questions relating to mainstreaming and multiplier effects.
- Decision: Terms of Reference.

19. When Should Evaluation Take Place?
Two intervals:
- at an interim stage;
- at the end of the approved period for the programme.
The monitoring data you collect are a key source of information for both interim and final evaluations.

20. Aims of Interim and Final Evaluation
Interim evaluation addresses whether the programme:
- has achieved its objectives by the dates set out in the work plan; and
- is on track to achieve its objectives by the end of the programme.
Final evaluation:
- draws conclusions on the design, implementation and degree of success of your programme in the light of your objectives and indicators;
- informs funding bodies and other stakeholders of your results, and of the actual and potential impact of your programme;
- stimulates support for transfer and mainstreaming of your innovation;
- forms the basis of the final report and other publications; and
- stimulates new ideas for innovation.

21. How Is an Evaluation Conducted?
- PLAN
- ANALYSE INDICATORS
- ANALYSE DATA
- REPORT

22. Planning the Evaluation: Key Points
- Focus of the evaluation: a clear, focused brief should go some way towards ensuring that you get the information you need from the exercise.
- Programme objectives.
- Programme products.
- Programme processes: processes related to partnership arrangements and decision-making, pro