
Transcript of UCB i190 Spring 2014 ICTD in Practice_Lect22_21Apr14

Page 1

i190 Spring 2014: Information and Communications Technology for Development (ICTD) in Practice

University of California Berkeley, School of Information

LECTURE 22: 21 Apr 2014

Instructor: San Ng (www.sanng.com)

Class Website: i190spring2014.sanng.com

Page 2

i190 Framework

Conceptual
* Week 1: Introduction to Course
* W2: What is Development?
* W3: What is ICTD?
* W4: Who Does What in Practice? Mapping the ICTD Landscape

i190 ICTD in Practice: Core Skills

Technical (eApplications)
* W5: Overarching Issues of eApplications
* W6: Infrastructure, Telecenters, Agriculture
* W7: Revisiting Agriculture
* W8: e-Health, Education
* W9: eGovernance, Microfinance

Management
* W10: Break
* W11: Intro to Project Management: Planning and Assessment
* W12: Initiating: Design, Scheduling, Budgeting, HR
* W13: Implementation
* W14: Monitoring and Evaluation / Next Cycle
* W15: Final Projects & Wrap Up

Page 3

Introduction to Project Management

Planning

Initiation

Implementation

Monitoring & Evaluation

Next Phase? Transformation?

Page 4

Implementation: Best Practices

Case: ITC e-Choupal

What are the needs/problems that this ICTD project is trying to address?

Page 5

Implementation: Best Practices

Case: ITC e-Choupal

What made implementation successful?

* Trust: choice
* Meets Needs: clear value
* Appropriate Tech: simplicity of technology, new and old tech
* Local structures and systems
* Incremental roll-out
* Mission-based

Page 8

Implementation- Complex Environments

Case: Competing for Development (A)

* If you were Ghazialam, would you go ahead with the $65,000 investment?

* What are the key tradeoffs? What would an 'ideal' outcome look like?

Page 9

Implementation- Complex Environments

Case: Competing for Development (B1-6)

6 groups

Role play exercise: Make the case within your role for the $65,000 investment

Page 10

Introduction to Project Management

Planning

Initiation

Implementation

Monitoring & Evaluation

Next Phase? Transformation?

Page 11

Different Types of Evaluation and Performance Measurement

Program Level / Organization Level / Community and Societal Level

Wikipedia Case

Page 12

Evaluation Purpose

Measuring program effectiveness

Determining if a program meets its objectives

TYPES

Formative
* baselines
* ongoing
* feedback
* changing the program

Summative
* look at final outcomes
* impacts
* cut or keep

Page 13

OTHER PURPOSES

* compliance
* legitimacy
* certification
* lessons learned
* check for unintended consequences
* benchmarking
* more money
* white wash and eye wash
* kill a project
* political attack
* new opportunities
* protection and self interest
* melt down indicators

Page 14

EVALUATION CHECK LIST

WHEN IS EVALUATION WORTH DOING?

* Who Wants This and What Decision Do They Want to Make? (lessons learned)

* Are the Impediments Manageable? (resources, objectives, agreement, special issues)

* Is There Political Support? (general support)

THE REGULAR COMPONENTS OF THE EVALUATION PROCESS

* Purpose and Objectives

* Indicators

* Design

* Data and Utilization

* Problems

Page 15

EVALUATION ACTOR MAPPING

BOARD OR LEGISLATORS

OTHER STAFF OR DEPTS.

TOP MGMT.

PROGRAM PARTICIPANTS

PROGRAM STAFF

PRESS OR COMMUNITY

DONORS

MIDDLE MGMT.

OUTSIDERS FOR TEACHING PURPOSES

Page 16

RANGE OF INDICATORS

* FEELINGS: do you trust these?

* INPUTS: people/

* PROCESS: involvement/coordination

* OUTCOMES: (intermediate) increased income; (final) X level of contamination

* EFFICIENCY AND PRODUCTIVITY: lbs. of fish/$, $/lbs. of fish (see the sketch after this list)

* SPAN AND SCOPE OF COVERAGE: % of target population served

* SATISFACTION: customer satisfaction/commercial

* IMPACTS (PROGRAM-CAUSED OUTCOME): sustainable, measurable change; broader, longer term
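
Since several of these indicators boil down to simple ratios, a minimal sketch in Python (using made-up fishery figures purely for illustration; none of these numbers come from the lecture) shows how efficiency and coverage indicators might be computed from raw monitoring data:

# Hypothetical monitoring figures, for illustration only.
pounds_of_fish = 12_000       # total output attributed to the program
program_cost_usd = 30_000     # total program spending
people_served = 450           # participants actually reached
target_population = 600       # intended beneficiaries

# Efficiency and productivity: output per dollar, and dollars per unit of output.
productivity = pounds_of_fish / program_cost_usd   # lbs. of fish per $
unit_cost = program_cost_usd / pounds_of_fish      # $ per lb. of fish

# Span and scope of coverage: share of the target population served.
coverage = people_served / target_population

print(f"Productivity: {productivity:.2f} lbs of fish per $")
print(f"Unit cost: ${unit_cost:.2f} per lb of fish")
print(f"Coverage: {coverage:.0%} of target population served")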

Page 17

* HOW WOULD YOU ASSURE THAT YOUR RESULTS WERE VALID AND RELIABLE?

RELIABILITY -- DO YOU GET THE SAME RESULT TIME AFTER TIME?

VALIDITY-- UNBIASED COMPARED TO A STANDARD
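
To make the "same result time after time" test concrete, here is a small sketch, under the assumption that the same instrument is administered twice to the same respondents and that a simple Pearson correlation between the two rounds is an acceptable test-retest reliability statistic (the scores below are hypothetical):

import math

def pearson_correlation(xs, ys):
    # Pearson correlation between two paired rounds of measurement.
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / math.sqrt(var_x * var_y)

# Hypothetical: the same survey question scored twice, two weeks apart.
round_1 = [3, 4, 2, 5, 4, 3, 1, 5]
round_2 = [3, 5, 2, 4, 4, 3, 2, 5]

r = pearson_correlation(round_1, round_2)
print(f"Test-retest reliability (Pearson r): {r:.2f}")

A value of r near 1 suggests the instrument is reliable; validity (absence of bias relative to a standard) still has to be checked separately, for example against an independent benchmark measurement.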

Page 18

What is the purpose of a research design?

* tailored to each problem

* to answer very specific questions

* weigh benefits and costs and resources

* can allow cross comparison

* was it the program that made the difference?

Page 19

Using Evaluation Results-- Style Differences

Academic Style

* Slow

* Scientific Method

* Clear Objectives

* Careful Study

* Written Communication

* Precision

* Academic Reference Group

Managerial Style

* Pressure to Decide

* Many Simultaneous + Fragmented Tasks

* Competing Objectives

* Action

* Verbal Communication

* Incomplete Data

* Managerial Reference Group

PROBLEMS

* TIME
* VERY RIGOROUS
* IRRELEVANT
* FORMAT
* NOT RIGOROUS
* COMMUNICATION

Page 20

DESIGN TYPES

PRE-EXPERIMENTAL (PE)

* Goals versus Performance

* Before and After: O X O

QUASI-EXPERIMENTAL (QE)

* Time Series: O1 O2 X O3 O4

* Non-Equivalent Control Group: O1 X O2 / O3 O4 (see the sketch after this list)

* Multiple Time Series: O1 O2 X O3 O4 / O5 O6 O7 O8

EXPERIMENTAL

* Two Group Pre and Post Test (R): O1 X O2 / O3 O4

* Post Test Only (R): X O1 / O2

* Solomon Four (R): O1 X O2 / O3 O4 / X O5 / O6

(O = observation, X = intervention, R = random assignment of groups)
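
For the non-equivalent control group design (O1 X O2 over O3 O4), the practical question is whether the treated group changed more than the comparison group. A minimal difference-in-differences sketch, using invented scores purely for illustration, might look like this:

# Hypothetical pre/post scores: O1, O2 = treated group; O3, O4 = comparison group.
treated_pre  = [52, 48, 55, 60, 47]   # O1: before the intervention (X)
treated_post = [61, 57, 66, 70, 55]   # O2: after the intervention
control_pre  = [50, 49, 54, 58, 46]   # O3: comparison group, first round
control_post = [53, 51, 57, 60, 49]   # O4: comparison group, second round

def mean(values):
    return sum(values) / len(values)

# Change within each group, then the difference between those changes.
treated_change = mean(treated_post) - mean(treated_pre)
control_change = mean(control_post) - mean(control_pre)
did_estimate = treated_change - control_change

print(f"Treated group change: {treated_change:+.1f}")
print(f"Comparison group change: {control_change:+.1f}")
print(f"Difference-in-differences estimate: {did_estimate:+.1f}")

The comparison group's change stands in for what would likely have happened anyway, which is why this design answers "was it the program that made the difference?" better than a simple before-and-after (O X O) comparison.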

Page 21

Some of the More Common Methods

* Balanced Score Cards and Other Overall General Assessments

* Goals vs. Performance and also Cost and Efficiency

* Outcome Assessment

* Benchmarking

* Best Practice

* Rapid Assessment Tools (quicker and dirtier rather than deeper)

* More based on sampling than 100% study

Page 22

DATA COLLECTION METHODS

* General Statistical Analysis

* Cost Benefit/ Rates of Return

* Simulations

* Content Analysis

* Record Reviews

* Unobtrusive Measures

* Group Observation

* Surveys and Testing

* Personal Interviews

* Participation Observation

* Case Studies

(The methods above are arranged along a scale from LESS INTRUSIVE to MORE INTRUSIVE.)

THE ETERNAL TRIANGLE: Precision, Cost, Complexity

Page 23

Sample Logical Framework template (to fill in):

                         | Measurable Indicators | Types of Data Needed | Data Collection Methods/Frequency for M&E
Overriding Goal          |                       |                      |
Objectives (at least 4)  |                       |                      |

Instructions: Everyone was pleasantly shocked by these successful results. However, the founder Jimbo Wales intuitively knows that the number of articles per se does not fully measure Wikipedia's success, especially since Wikipedia began with a completely different set of goals/activities and became 'successful' only organically. He wants to hire you to determine a sound methodology for evaluating Wikipedia's success. He wants you to design a Logical Framework for Wikipedia (based on what we have already learned in class), with indicators he can measure to determine Wikipedia's progress and success. He has given us a sample template (shown above) that we will discuss and brainstorm in class.
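
As one way to start the brainstorm, the template's rows and columns can be held in a machine-readable structure so that indicators can later be tracked over time. The sketch below is hypothetical in every detail (the goal wording, the objectives, and the indicator names are placeholders for class discussion, not an endorsed framework for Wikipedia):

from dataclasses import dataclass, field

@dataclass
class LogframeRow:
    # One row of the sample template: a goal or objective plus its M&E columns.
    statement: str
    measurable_indicators: list[str] = field(default_factory=list)
    types_of_data_needed: list[str] = field(default_factory=list)
    collection_methods_frequency: list[str] = field(default_factory=list)

# Hypothetical entries only, for discussion in class.
overriding_goal = LogframeRow(
    statement="Free access to reliable encyclopedic knowledge for everyone",
    measurable_indicators=["reader reach across languages", "assessed article quality"],
    types_of_data_needed=["page-view statistics", "article quality ratings"],
    collection_methods_frequency=["automated site analytics, monthly",
                                  "sampled editorial review, quarterly"],
)

objectives = [
    LogframeRow(statement="Grow and retain the active editor community"),
    LogframeRow(statement="Improve article accuracy and sourcing"),
    LogframeRow(statement="Broaden coverage of under-represented topics and languages"),
    LogframeRow(statement="Sustain funding and infrastructure"),
]

for row in [overriding_goal, *objectives]:
    print(row.statement, "->", row.measurable_indicators or "indicators to be defined")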