The Good, the Bad, and the Ugly:
Collecting and Reporting Quality Performance Data


Presentation Plan

Definition of data

Uses of data

Data Quality Assessments (DQAs)

The five elements of a DQA

Practical tips for assessing data quality


What is Data?

Data are observed and recorded information

If it is not in writing, it does not exist


Data Can Be Used to…

Make important decisions. Ask questions such as:

Were targets met?

How did performance compare with last year?

What is the trend from the baseline year?

How do actual data compare with targets?


Data Can Be Used to…

Address implementation problems. Ask:

Are the activities on track and outputs on time?

Are outputs being produced in sufficient quantity?

Is the quality of the activities and outputs adequate?

Are our partners achieving critical results and outputs for which they are responsible?


Data Can Be Used to…

Share results widely

Hold individuals and partners accountable

Encourage and emphasize activities that contribute to results


Data Quality Assessments


Data Quality Assessments Are…

Procedures for determining whether a data set is suitable for its intended purpose.

A DQA determines the data’s “fitness for use.”


How Good Do Data Have to Be?

When deciding about data quality, ask:

What level of quality is needed for confident decisions?

How critical are the data for decision-making?

What are the consequences of decisions based on the data?


The Elements of Data Quality

Validity: Do the data clearly and directly measure what is intended?

Reliability: Can the same results be replicated using the same measurement procedures?

Timeliness: Are data sufficiently current and available for decision-making?

Precision: What margin of error is acceptable?

Integrity: Are mechanisms in place to protect data from manipulation for political or personal reasons?


Element 1: Validity

“Data are valid to the extent that they clearly, directly, and adequately represent the result they are intended to measure.”


Face Validity:

Simply looking at a measure to see whether, on the face of it, it captures the intended concept.

If it looks like a duck and quacks like a duck,

then it must be a duck!


Construct Validity:

The extent to which a measure assesses specific traits or concepts (e.g., IQ).

A more rigorous test of validity than face or content validity; instruments with high construct validity are used to test theory.

Content Validity:

The extent to which a measure represents all facets of a social concept (e.g., self-esteem, ethnic identity, attitudes).


Validity Questions:

Are samples representative?

Are the instruments, instructions, and questions clear?

If translated, are instrument questions equivalent and culturally appropriate?

Were steps taken to limit transcription errors?

Were the data collectors trained?

For raw data, were computation/compilation formulas written down and consistently applied? (See the sketch below.)
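As an illustration (not part of the original slides), one way to keep a computation formula in writing and consistently applied is to encode it as a small documented function; the indicator and field names below are hypothetical:

# Hypothetical indicator: percent of tested teachers who passed a
# post-training test. Keeping the formula in version-controlled code
# keeps it "in writing" and applies it identically every period.
def pass_rate(records):
    """records: list of dicts with boolean fields "tested" and "passed"."""
    tested = [r for r in records if r["tested"]]
    if not tested:
        return None  # undefined when no one was tested
    passed = sum(1 for r in tested if r["passed"])
    return round(100.0 * passed / len(tested), 1)

reports = [{"tested": True, "passed": True},
           {"tested": True, "passed": False},
           {"tested": False, "passed": False}]
print(pass_rate(reports))  # 50.0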


Reliability:

“Data should reflect stable and consistent data collection processes and analysis methods over time. Progress toward performance targets should reflect real changes rather than variations in data collection methods.”

Are the same sampling and data collection methods, procedures, and instruments used from year to year, location to location, and among contributing IPs?

Are internal data quality control procedures written down and followed?

Are data collection, computation, analysis, and reporting procedures written down and followed?


Reliability Test

Can an independent researcher obtain the same results from the same raw data? (See the sketch after this list.)

Are written internal data quality control procedures in place and followed?

Are data collection, computation, analysis, and reporting procedures written down and followed?
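As an illustration (not from the original slides), the replication test can be automated: recompute the reported figure from the raw data and flag any mismatch. The numbers and the computation rule below are hypothetical:

# Hypothetical replication check: recompute a reported indicator from
# raw data and flag any discrepancy.
raw_data = [12, 30, 18, 25]     # e.g., clients served, by clinic
reported_value = 85             # figure that appeared in the report

recomputed = sum(raw_data)      # the documented computation rule
if recomputed != reported_value:
    print(f"MISMATCH: recomputed {recomputed}, reported {reported_value}")
else:
    print("Reported figure replicated from the raw data.")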


Timeliness:

Are the data timely (current and available on a regular basis)?

Is the timing good (available when needed) for the decisions and/or reports they support?


Precision:

Are the data of sufficient detail to support decision-making?

Are data accurate enough to present a fair picture of performance?

If a survey was used, was the margin of error calculated? Is the margin of error acceptable to decision makers? Are the data within the margin of error?

Remember: the smaller the margin of error, the greater the cost of data collection. (A sketch of the standard calculation follows below.)
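As an illustration (not part of the original slides), for a simple random sample the margin of error for a proportion at 95% confidence is commonly computed as z * sqrt(p * (1 - p) / n), with z ≈ 1.96. The survey figures below are made up:

import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a sample proportion.
    p: observed proportion (0-1); n: sample size;
    assumes a simple random sample from a large population."""
    return z * math.sqrt(p * (1 - p) / n)

# Made-up example: 62% of 400 sampled households report improved access.
moe = margin_of_error(0.62, 400)
print(f"62% +/- {moe * 100:.1f} percentage points")  # about +/- 4.8

This also explains the cost point above: halving the margin of error requires roughly quadrupling the sample size, because the error shrinks with the square root of n.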


Integrity:

Are mechanisms in place to prevent corruption of the data for personal, political, or professional interests? (One such mechanism is sketched below.)

Has an independent DQA been done?

Do the data pass the “burden of proof” test? (They pass unless manipulation can be proven.)
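As one illustration of such a mechanism (not from the original slides), recording a cryptographic fingerprint of a data file when it is first submitted makes later manipulation detectable, because any change to the file changes the fingerprint. The file name below is hypothetical:

import hashlib

def file_fingerprint(path):
    """SHA-256 hash of a file; any later edit changes this value."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Record this at submission time, then recompute it during the DQA to
# confirm the raw data file was not altered (file name is hypothetical).
print(file_fingerprint("survey_raw_data.csv"))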


In Summary, a DQA Answers…

How good your data are for audit

Whether you need further disaggregation for meaningful use

How you will use the data

How aware IPs are of the data’s strengths and weaknesses


Practical Tips for Assessing Data Quality

Build assessment into normal work processes

Use software checks and edits of data on computer systems (see the sketch after this list)

Get feedback from users of the data

Compare the data with data from other sources

Obtain verification by independent parties
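As an illustration (not from the original slides), “software checks and edits” can be as simple as a script that screens each incoming record for missing fields, out-of-range values, and duplicate IDs before the data are accepted; the field names and limits below are hypothetical:

# Hypothetical automated edit checks run on each incoming record.
records = [
    {"id": "A1", "site": "Kano",  "clients": 42},
    {"id": "A1", "site": "Lagos", "clients": -3},    # duplicate ID, bad count
    {"id": "B2", "site": "",      "clients": 15000}, # missing site, outlier
]

seen_ids = set()
for r in records:
    problems = []
    if r["id"] in seen_ids:
        problems.append("duplicate ID")
    seen_ids.add(r["id"])
    if not r["site"]:
        problems.append("missing site name")
    if not 0 <= r["clients"] <= 10000:  # plausible-range check
        problems.append(f"client count out of range: {r['clients']}")
    if problems:
        print(f"Record {r['id']}: " + "; ".join(problems))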


Thank you and please ask questions!

Please visit our website: www.nigeriamems.com