Performance Measurement: A Reference Guide

What gets measured gets done.
If you don’t measure results, you can’t tell success from failure.
If you can’t see success, you can’t reward it.
If you aren’t rewarding success, you’re probably rewarding failure.
If you can’t see success, you can’t learn from it.
If you can’t recognize failure, you can’t correct it.
If you can demonstrate results, you can win public support.

A Reference for Ministries ⏐ March 2005


© Queen’s Printer for Ontario, 2004.

Please do not reproduce this document without the permission of the Queen’s Printer for Ontario.


Acknowledgements

Kevin Perry, Sheila DeCuyper, Barbara Adams and Godfrey Edwards at Management Board Secretariat would like to thank all of those involved in the development of the Performance Measurement Guide.

Nilam Bedi, Nancy Elliot, Sam Erry, Rob Glaister, Rian Pienaar, Sara Sandhu, Simon So, Michel Theoret, Scott Tyrer, Henric Wiegenbroeker, and Bev Wolfus formed the inter-ministerial committee who kindly read the early drafts and helped to define the structure and contents of the final product. A number of programs offered examples of their work in performance measurement for the appendices. Thanks to Geoff Steers of the OPP for the article on the PRISM system, Bohdan Wynnycky for the description of the Municipal Performance Measurement Program and Bev Wolfus for sample program logic models. Our thanks to Russ Obonsawin and Simon Trevarthan of the Ministry of Finance for ensuring consistency between the guide and Modern Controllership’s performance measurement training. A number of people took the time to provide detailed feedback on the guide as part of our consultation with CAOs across government. Thank you to Richard Clarke, Angela Faienza, David Field, Chris Giannekos, John Kerr, Robert Montgomery, Sheila McGrory, Mercedes Nolan, Anne Premi, Elizabeth Rogacki, Len Roozen, Deborah Ross, Carol Townsend, Lisa Watson and Kim White.

Thanks to central agency staff who generously contributed their expertise and creativity to make the guide interesting and align the framework offered in the Guide with the direction of government. Deputy Ministers Kathy Bouey and Carol Layton and Assistant Deputy Minister, Peggy Mooney provided direction and encouragement at each stage of development.

Special thanks to the members of the Performance Measurement and Evaluation Network without whose experience developing performance measures and systems across government the guide could not have been written. We hope that you will find the guide useful in your ongoing work to develop results-based management practices across the Ontario Public Service.


Table of Contents

Executive Summary
   Performance measurement is the method for demonstrating results
   Performance Measures: Six Steps to Creating Performance Measures
   Logic Model
   Meaningful Measures
   Performance Targets
   Reporting Performance
      Internal reporting
      Public Reporting
   Measures Checklist
   Conclusion

SECTION 1: The Big Picture
   Key Terms
   1.0 Context: Results-Based Management
   1.1 Performance Measurement as part of Results-Based Management
   1.2 Performance Measurement Systems
   1.3 Reporting Performance Measures
   1.4 Performance Measurement Terminology
   1.5 Summary of Key Points
   Checklist

SECTION 2: The Who, What and Why of Performance Measurement
   Key Terms
   2.0 What is a Performance Measure?
   2.1 Why Measure Performance?
   2.2 Who Uses Performance Measurement Information and Why?
   2.3 Levels and Types of Measurement
   2.4 Summary of Key Points
   Checklist

SECTION 3: How to Develop Performance Measures
   Key Terms
   3.0 Overview
   3.1 Developing Performance Measures
      STEP 1: Define the strategy in relation to government priorities
      STEP 2: Identify and consult on cross-ministry and horizontal initiatives
      STEP 3: Identify the performance measures
      STEP 4: Check the measures for pitfalls
      STEP 5: Establish baselines
      STEP 6: Set performance targets
   3.2 Summary of Key Points
   Checklist

SECTION 4: Reporting Performance Measures
   Key Terms
   4.0 Introduction
   4.1 Canadian Comprehensive Auditing Foundation / La Fondation canadienne pour la vérification intégrée (CCAF/FCVI) Principles for Reporting

Appendix 1: Performance Measurement Resources in the Ontario Public Service; Other Resources
Appendix 2: Glossary of Performance Measurement Terms (Introduction; Definitions; References)
Appendix 3a: Examples of How Ontario is Using Performance Measurement
Appendix 3b: Municipal Performance Measurement Program
Appendix 4: Outputs; Outcomes; High-Level Indicators
Appendix 5: Sample Logic Models
Appendix 6: Step-by-Step Instructions for Creating a Logic Model
Appendix 7: Measures Checklist
Appendix 8: Example of How Benchmarking Can Be Used


Executive Summary

This guide describes the Ontario Public Service’s approach to performance measurement and explains how performance information is used in decision-making and business planning.

Performance measurement is the method for demonstrating results. The diagram below illustrates the relationship of performance measurement to other parts of the results-based management process. The focus is on ministry strategies for meeting government priorities or serving another important public interest. Performance measures for these strategies demonstrate the contribution that their results make to government priorities.

The results achieved at the activity level, in the form of outputs or short-term outcomes, will be used to support project plans, quarterly reporting and Management Board of Cabinet and Cabinet policy submissions. However, the latter submissions should also be supported by evidence that demonstrates intermediate-term outcomes.

The performance measures that ministries report centrally will be a combination of:

• measures related to public reporting; and

• intermediate level outcome measures that demonstrate the contributions of ministry strategies to meeting government priorities or serving other important public interests.

Figure 1: Performance Measurement is a key to results-based management

[Figure: a hierarchy from Government Priorities down through Results, Ministry Strategies and Ministry Activities (alongside Agency Activities, Broader Public Sector Activities and all other government activities), annotated with the measures used at each level: public results measures and high-level indicators; intermediate-term outcome measures; outputs and short-term outcomes; and project plans and milestones.]


Performance Measures: Six Steps to Creating Performance Measures

Developing performance measures is not easy. Poorly integrated performance measurement systems can be worse than no system at all and may actually support poor decision-making. There are six steps to establishing good performance measures.

Step 1: Use a logic model to define the ministry’s strategies in relation to government priorities. Ministries should use logic models to illustrate relationships among activities, strategies and the results to be achieved.

Step 2: Identify and consult on cross-ministry or horizontal initiatives. Ministries should consult with third party service providers, broader public sector organizations and other ministries to align performance measurement systems, to promote greater coordination and avoid duplication.

Step 3: Identify individual performance measures. Developing performance measures is an iterative process and it is rare to get a satisfactory product the first time. It may be helpful to research measures that have been developed for similar activities elsewhere.

Step 4: Check the measures for pitfalls. A number of pitfalls can compromise the value or usefulness of a performance measure. The most common pitfalls are attribution and measurement corruption.

Step 5: Establish baselines. A baseline is the level of results at a given time that provides a starting point for assessing changes in performance and establishing objectives or targets for future performance.

Step 6: Set performance targets. A target is a clearly stated objective or planned result (which may include outputs and/or outcomes) to be achieved within a stated time, against which actual results can be compared. Every ministry is asked to use comparative data to set targets based on its own performance, established industry standards, articulated customer preference and/or performance of a comparable organization. Reasonable targets are challenging but achievable.


Once established, measures are used to measure progress, take corrective actions and adjust targets where applicable.
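To make Steps 5 and 6 concrete, here is a minimal sketch (in Python, with invented numbers and a hypothetical measure; nothing below comes from the guide itself) of how a baseline and comparative data might be turned into a target and a simple quarterly progress check.

    # Illustrative only: a hypothetical "cases resolved within x weeks" measure.
    baseline = 0.62                   # level of results at the starting point (62%)
    own_trend = [0.58, 0.60, 0.62]    # the ministry's own past performance
    comparable_org = 0.70             # performance of a comparable organization
    industry_standard = 0.75          # established industry standard

    # A reasonable target is challenging but achievable: above the baseline and
    # own trend, moving toward (not necessarily matching) the comparators.
    target = min(industry_standard, baseline + 0.05)

    def on_track(actual: float, target: float, tolerance: float = 0.02) -> bool:
        """Quarterly check: is actual performance within tolerance of the target?"""
        return actual >= target - tolerance

    print(f"Target: {target:.0%}; Q2 actual of 64% on track? {on_track(0.64, target)}")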

Let’s look at Step 1 – “Using a logic model” in more detail, because the logic model provides the building blocks for good performance measures.

Logic Model

The logic model provides a foundation for developing performance measures that will support decision-making. A logic model is a tool that can help define strategies and activities in relation to government priorities. It clearly shows the relationships among government priorities, ministries’ strategic objectives, and how ministry activities contribute to achieving those objectives and priorities through their expected outcomes. The process of creating a logic model and making the linkages among inputs, outputs and outcomes can help build common understanding of what is expected, prioritize activities and identify appropriate performance measures.

Many people will be familiar with program logic models, but they can also be used at a strategic level or created for the work of an entire organization. Everyone who uses logic models adjusts them to meet their own purposes, but a standard logic model is shown below.

The important points to remember when creating a logic model are:

• developing a logic model is an iterative process that involves many people working together; it is not a product that one person can produce;

• the information entered into the logic model at the early stages may need to be revised as new information is entered;

• it's not just the information in the boxes that counts, but the relationships between the boxes.

Figure 2: Standard Logic Model Diagram

[Figure: Inputs → Activities → Outputs → Desired Short-Term Outcomes → Desired Intermediate Outcomes, framed by the Objectives of the Strategy (what the strategy hopes to accomplish), the Customers (who benefits), and the Larger Public Interest (where government priorities usually sit). The boxes read:]

• Inputs: list all inputs
• Activities: list all activities
• Outputs: list the tangible products of the activities
• Desired Short-Term Outcomes: list the changes in participation, awareness, behaviour, compliance or capacity that are expected to result from the activities in the short term
• Desired Intermediate Outcomes: list the benefits or changes in conditions, attitudes and behaviour that are expected to result from the activities in the intermediate term


Figure 3: Logic Model Process – How Each Component Leads To The Next Stage

[Figure: Inputs and Activities make up “Your Planned Work”; Outputs, Outcomes and High-Level Changes show progress towards “Achieving Objectives”. Each component leads to the next.]

A logic model will help identify all the potential measures for a strategy or activity, but we don’t want to measure everything, just those things that are meaningful for decision-making.

See Appendix 6 for step-by-step instructions for creating a logic model.
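For teams that keep measure inventories in software, the boxes of a logic model translate naturally into a small data structure. The Python sketch below is one possible representation, not a prescribed format; the example strategy, its contents and the candidate-measure naming are all hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class LogicModel:
        """One strategy's logic model, following the boxes in Figure 2."""
        objective: str                  # what the strategy hopes to accomplish
        inputs: list[str] = field(default_factory=list)
        activities: list[str] = field(default_factory=list)
        outputs: list[str] = field(default_factory=list)
        short_term_outcomes: list[str] = field(default_factory=list)
        intermediate_outcomes: list[str] = field(default_factory=list)

        def candidate_measures(self) -> list[str]:
            # Outputs suggest "Number of ..." measures; outcomes suggest
            # relative measures that add a level of analysis.
            return ([f"Number of {o}" for o in self.outputs] +
                    [f"Percentage change in {o}" for o in
                     self.short_term_outcomes + self.intermediate_outcomes])

    model = LogicModel(
        objective="Protect green space",            # hypothetical strategy
        inputs=["program funding", "staff"],
        activities=["review development applications"],
        outputs=["acres of green space protected"],
        short_term_outcomes=["developer compliance"],
        intermediate_outcomes=["new growth on existing developed land"],
    )
    print(model.candidate_measures())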

Meaningful Measures

Meaningful performance measures should meet the following criteria. They should:

• show how ministry activities contribute to achieving results;

• use reliable, verifiable and consistent data collection methods;

• provide key information for decision-making;

• capture all areas of significant spending;

• identify and track impact as well as progress towards meeting desired outcomes;

• incorporate consideration of risks and act as “thermometers” for risk management.

The Ontario government uses three levels of performance measurement:

Output measures should be developed to demonstrate the short-term progress that ministry activities make towards achieving the objectives of ministry strategies.

Outcome measures (short-term and intermediate-term) should be developed to demonstrate the achievement of ministry strategies and/or the contribution of ministry strategies to meeting government priorities.

High-level indicators measure social, environmental or economic conditions for which government alone is not accountable, but which reflect the extent to which the government's priorities are being achieved.


Identifying output measures and high-level indicators is relatively easy but identifying good outcome measures can be difficult. The Ontario government uses three types of outcome performance measures.

Efficiency: The extent to which a strategy is producing its planned outputs in relation to use of inputs.

Effectiveness: The extent to which a strategy is producing its planned outcomes and meeting intended objectives. At least one outcome effectiveness measure is required for each ministry strategy.

Customer satisfaction: The degree to which the intended recipients or beneficiaries of a product or service indicate that the product or service meets their needs and expectations for quality and efficiency.

Ministries should use outcome measures of effectiveness, efficiency, and customer satisfaction wherever possible.

In addition to using performance measurement information to support internal decision-making, ministries will be asked to report their progress to support central decision-making and public reporting.

Performance Targets

Every ministry is asked to use comparative data to set targets based on its own performance, established industry standards, articulated customer preference and/or performance of a comparable organization. Reasonable targets are challenging but achievable.

Reporting Performance

Government and ministries need to be able to report on performance so that the public and stakeholders can make informed judgements about achievements with public resources. Reporting performance internally to Ontario Public Service management and staff is also important in order to guide decision-making and support continuous improvement efforts. There are two types of reporting: internal and public.

Internal reporting

• Annual reporting on the results achieved to date;

• Quarterly reporting includes performance measurement information to show how short-term objectives are being achieved and to identify risks to achievement.


Public Reporting

• Evidence to support reports to the public

Ministries will also use performance measurement in a variety of other reporting relationships. For example, Deputy Ministers and Ministers will require ongoing performance reporting for internal ministry management and many ministries also contribute data to sector-specific federal, provincial and territorial bodies.

The 2004 Ontario Budget referred to the nine principles for public reporting developed by the Canadian Comprehensive Auditing Foundation (CCAF/FCVI) as a model for improving public reporting.

Measures Checklist

Relevance: Is the measure actually a good measure of the objectives the strategy intends to achieve?
Validity: Does the measure actually measure what it is supposed to?
Reliability: Do different users of the same measure report the same result?
Verifiable: Will the measure produce the same data if measured repeatedly?
Attribution: Does the measure relate to factors that the ministry can affect?
Clarity: Is the measure clearly defined and easily understood?
Accuracy: Does the measure provide correct information in accordance with an accepted standard?
Cost Effectiveness: Is the value of the measure greater than the data collection costs?
Sensitivity: Is the measure able to measure change?
Timeliness: Can data be collected and processed within a useful timeframe?
Comparability: Can the data be compared with either past periods or with similar activities?
Consistency: Does the data feeding the measures relate to the same factors in all cases at all times?
Integrity: Will the measure be interpreted to encourage appropriate behaviours?
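One way to apply the checklist consistently is to encode it as data and screen every proposed measure the same way. The Python sketch below is illustrative only; the review function and the sample answers are assumptions, not a Management Board Secretariat requirement.

    # Each checklist criterion becomes a named yes/no question.
    CHECKLIST = [
        "Relevance", "Validity", "Reliability", "Verifiable", "Attribution",
        "Clarity", "Accuracy", "Cost Effectiveness", "Sensitivity",
        "Timeliness", "Comparability", "Consistency", "Integrity",
    ]

    def review(answers: dict[str, bool]) -> list[str]:
        """Return the criteria a proposed measure fails (unanswered counts as a fail)."""
        return [c for c in CHECKLIST if not answers.get(c, False)]

    # Hypothetical review of a draft measure that has not yet been field-tested:
    answers = {c: True for c in CHECKLIST}
    answers["Reliability"] = False      # different users report different results
    failures = review(answers)
    print("Passed" if not failures else f"Needs work on: {', '.join(failures)}")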


Conclusion

In order to demonstrate results we need to measure our performance and use that performance information for planning and on-going management of government activities. Reporting performance is important to guide decision-making and support continuous improvement efforts.

For resources to support performance measurement see Appendix 1.


SECTION 1: The Big Picture

Key Terms:

Accountability: The obligation to answer for results and the manner in which responsibilities are discharged. Accountability cannot be delegated. Responsibility is the obligation to act whereas accountability is the obligation to answer for an action.

Activity: An activity is the work performed by ministries to implement public policy and provide services to the public. All activities consume resources and produce products and/or services. One or more activities will be critical to the achievement of overall public policy objectives. Ministries must be able to demonstrate a direct causal link between the activity and the outcome(s).

Monitoring: The process of collecting and analyzing information to track program outputs and progress towards desired outcomes.

Outcome: The actual effects/impacts or results of the outputs. Outcomes provide relative information, for example: “Percentage of total cases resolved to client satisfaction within x weeks”.

Output: The products or services that result from activities. Outputs are often expressed as “Number of x”, for example: “Total number of cases”.

Priority: Higher order goals of government that reflect its commitment to citizens and contribute to enduring human, economic, civic and environmental benefits.

Result: A condition (outcome) or product (output) that exists as a consequence of an activity.

Strategy: Plan outlining how specified ministry activities and programs contribute to a government priority and results.


This guide:

• describes the Ontario Public Service’s approach to performance measurement;

• explains how performance information is used in decision-making and business planning;

• provides guidance on how to use a logic model to develop performance measures;

• establishes criteria for good performance measures;

• identifies pitfalls to performance measurement;

• details central performance measurement and reporting requirements.

The appendices will be of particular interest to staff with ministry leadership roles for performance measurement and the guide may be used as a support to internal ministry training.

1.0 Context: Results-Based Management

Transparency is an important part of government accountability and governments all over the world are increasingly expected to demonstrate what results they achieve with taxpayers’ money. It is not enough to know what government is spending; decision makers want to know the outcome or impact of government decisions. The process of identifying objectives and desired outcomes, measuring progress and making decisions on the basis of results is called results-based management.

Results-based management is a comprehensive, government-wide approach that informs results-based decision-making, ensuring that all government-funded activities are aligned with strategies that contribute to meeting government priorities or serve an important public interest.

Results-based management incorporates the following principles:

• government priorities drive planning processes across all government functions;

• effective accountability – within ministries and between ministries and third party service providers and broader public sector organizations – requires that performance measures be built into programs and activities so that expectations are clearly articulated, performance is monitored and evaluated on a regular basis, and corrective action is taken where necessary;

• horizontal integration of strategies and activities across ministries and the broader public sector helps to demonstrate how wide ranging activities complement each other in achieving government priorities;

• demonstrable results drive the development of strategies and activities, policy and legislative agendas, investment decisions and public accountability.

Results-based management requires reliable, objective information at all levels of government. This information is gathered through performance measurement. Strong performance measures that demonstrate whether the objectives of government-funded services are being met are critical to implementing results-based management successfully.

The vehicles for implementing results-based management are:

• Ministries’ results-based plans, which are the key government documents guiding ministry decisions, and

• Performance measurement, which is how results are demonstrated.


1.1 Performance Measurement as part of Results-Based Management

Performance measurement has been used in the Ontario Public Service for over 20 years but has not always actively informed ongoing decision-making. The critical changes to performance measurement that are introduced by a results-based management approach are:

• In addition to being used by ministries internally, performance measurement is integrated into all key government decision-making processes including:

o budget setting and printed estimates;

o Management Board of Cabinet and Cabinet Policy Committee decisions;

o broader public sector contract management;

o ongoing internal government management (results-based planning, quarterly reporting, expenditure and risk management, report-backs).

• Performance measurement becomes part of a larger management process that aligns ministry strategies and activities with broad government priorities and specific results.

• Broader public sector organizations are accountable to the government for the results that they achieve. Ministries’ performance measures for contract and service management must be aligned with ministry strategies to meet government priorities and specific results.

Figure 1: Performance Measurement is a key to results-based management

[Figure: as in the Executive Summary, Government Priorities link through Results and Ministry Strategies down to Ministry Activities, Agency Activities, Broader Public Sector Activities and all other government activities. Annotations: measures associated with results reported to the public may be drawn from either strategies or activities; intermediate-term outcome measures are reported annually in the Results-based Plan, and some may be linked to results; outputs and short-term outcomes support quarterly reporting, release from hold-back and activity-based costing; project plans and milestones sit at the activity level, and some may be linked to results.]


Some performance measurement information will be used for both internal management and for publicly reporting on progress towards achieving government priorities.

The diagram above illustrates the relationship of performance measurement to other parts of the results-based management process. The focus is on ministry strategies for meeting government priorities or serving another important public interest. Government-funded activities – whether delivered directly, through contracts, agencies or the broader public sector – should measure results that demonstrate the contribution they make to ministry strategies. The results achieved at the activity level, in the form of outputs or short-term outcomes, will be used to support project plans, quarterly reporting and Management Board of Cabinet and Cabinet policy submissions. However, the latter submissions should also be supported by evidence that demonstrates intermediate-term outcomes. The performance measures that ministries report centrally will be a combination of: a) measures related to public reporting; and b) intermediate level outcome measures that demonstrate the contributions of ministry activities and strategies to meeting government priorities or serving other important public interests.

Figure 4: Identifying the right performance measures means knowing how activities align with strategies

A number of activities may be needed to meet the objectives of a strategy, and it is important to know what portion of an activity is contributing to a strategy. Intermediate-term outcome performance measures demonstrate that the objectives of the strategy are being met; short-term outcome and output performance measures demonstrate the progress that activities are making towards meeting the objectives of the strategy.

[Figure: five activities mapped to three strategies, with percentages showing the portion of each activity that contributes to each strategy – for example, one activity contributing 100% to a single strategy, another splitting 40%/60% between two strategies.]


As Figure 4 illustrates, it will be advantageous for ministries to develop integrated sets of performance measures that can be combined or disaggregated to serve a variety of purposes.

A ministry strategy usually will include numerous activities. A single activity may contribute to meeting the objectives of more than one ministry strategy. Knowing how and to what degree ministry activities contribute to meeting the objectives of ministry strategies is a precondition to identifying the key performance measures that demonstrate success and provide information to support decision-making.
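The apportionment in Figure 4 can be made concrete with a little arithmetic. The Python sketch below, using invented activities, shares and spending figures, rolls activity-level resources up to the strategy level, weighted by each activity's contribution share.

    # Hypothetical contribution map: activity -> {strategy: share of activity}.
    contributions = {
        "Activity 1": {"Strategy 1": 1.00},
        "Activity 2": {"Strategy 2": 0.40, "Strategy 3": 0.60},
        "Activity 3": {"Strategy 1": 0.70, "Strategy 3": 0.30},
    }
    activity_spend = {"Activity 1": 2.0, "Activity 2": 5.0, "Activity 3": 3.0}  # $M

    # Weight each activity's spending by its share to see the resources
    # standing behind each strategy's intermediate-term outcome measures.
    strategy_spend: dict[str, float] = {}
    for activity, shares in contributions.items():
        for strategy, share in shares.items():
            strategy_spend[strategy] = (strategy_spend.get(strategy, 0.0)
                                        + share * activity_spend[activity])

    print({s: round(v, 2) for s, v in strategy_spend.items()})
    # {'Strategy 1': 4.1, 'Strategy 2': 2.0, 'Strategy 3': 3.9}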

For more information on the relationship between performance measurement and results-based management, see PIL - Results-Based Planning: Further Reading

http://opspolicy.gov.on.ca/scripts/index_.asp?action=31&N_ID=5&PT_ID=14198&U_ID=32832

1.2 Performance Measurement Systems

In consultations about making the shift to a results-based management approach within government, an Ontario Public Service staff member observed that, as public servants, we have always been conscious of meeting a need and providing a service, but it is also important to think about the impact those activities have and the results they achieve. The results-based management approach helps us measure our achievements.

Results-based management focuses on outcomes, not activities; thus, a systematic approach to performance measurement is needed to demonstrate the results that government-funded activities produce. A performance measurement system is a comprehensive, integrated set of measures (of various types and levels) that provides a multi-faceted picture of the ministry’s progress toward its targeted outputs (products and/or services) and outcomes. A good system contains strong performance measures and is easy to access, update and adapt so that individual pieces of information can be grouped for different purposes. Performance measurement systems enable ministries to manage their strategies and demonstrate they are achieving their own, and government, objectives.

Rather than simply stating facts, performance measurement will be used to provide a full picture of results including a:

• description of the overall context for which performance is being assessed;

• statement of what was expected to be accomplished and at what cost;

• description of what was accomplished in light of the expectations;

• discussion of what was learned and what will be done next;

• description of what was done to assure the quality of the data.[1]

Ministries will need to assess existing performance measurement systems to:

• ensure existing performance measurement systems are working as intended;

• coordinate with transfer payment recipients and other ministries to ensure horizontal integration and agree on roles and responsibilities for data collection and reporting;

• identify gaps in performance measurement and create appropriate measures and systems for consistent data collection and reporting.

[1] Adapted from Mayne, J. 2003. “Reporting on Outcomes: Setting Performance Expectations and Telling Performance Stories,” Ottawa: Office of the Auditor General of Canada.


1.3 Reporting Performance Measures

There are two types of reporting: internal and public.

Internal reporting

• Annual reporting on the results achieved to date (primarily outcome measures, linked to ministry strategies)

• Quarterly reporting includes performance measurement information to show how short-term objectives are being achieved (primarily output measures, linked to ministry activities)

Public Reporting

• Evidence to support reports to the public

1.4 Performance Measurement Terminology

There is an extensive vocabulary associated with performance measurement that is often a subject for debate. In order to ensure consistency in the way we talk about performance measures in the Ontario Public Service, Management Board Secretariat has developed a glossary of terms, some of which are found at the beginning of each section in this guide. Appendix 2 contains the full Glossary of Performance Measurement Terms.

1.5 Summary of Key Points

• Results-based planning is an integrated, government-wide approach that aims to ensure all government activities are aligned with strategies that contribute to achieving government priorities.

• Results are demonstrated through performance measurement and may be reported publicly.

• A systematic approach to performance measurement needs to be developed in order to demonstrate the various ways that government-funded activities contribute to achieving results.


Checklist

Ministries are expected to use performance measurement to:
• Manage activities internally
• Monitor the performance of broader public sector organizations via contract and service management
• Report progress on a quarterly basis to Management Board of Cabinet
• Support requests for resources or policy change
• Report to the public on results

Ministries’ performance measures for contract and service management must be aligned with ministry strategies to achieve government priorities and specific results.

Ministries will need to assess existing performance measurement systems to:
• Ensure they are working as intended
• Coordinate with transfer payment recipients and other ministries to ensure horizontal integration and agree on roles and responsibilities for data collection and reporting
• Identify gaps in performance measurement and create appropriate measures and systems for consistent data collection and reporting


SECTION 2: The Who, What and Why of Performance Measurement

Key Terms

Customer Satisfaction: The degree to which the intended recipients or beneficiaries of a product or service indicate that the product or service meets their needs and expectations for quality and efficiency.

Effectiveness: The extent to which an organization, policy, program or initiative is producing its planned outcomes and meeting intended objectives.

Efficiency: The extent to which an organization, policy, program or initiative is producing its planned outputs in relation to expenditure of resources.

High Level Indicator: A measure of changes in social, environmental or economic conditions.

Inputs: The resources (human, material, financial, etc.) allocated to carry out activities, produce outputs and/or accomplish results.


2.0 What is a Performance Measure?

A performance measure is quantifiable information that provides a reliable basis for assessing achievement, change or performance over time.

2.1 Why Measure Performance?

We measure performance because:

• Managers need evidence to help improve activities to meet ministry strategies.
• The public wants to know – and has a right to know – what government does with their money.
• Government needs to report accurately to the public on what it has done.
• Ministers (and Deputy Ministers) need to tell Cabinet what their ministry has accomplished on an annual basis.
• Government-funded activities need evidence to support their recommendations to their Minister, Management Board of Cabinet or Cabinet for changes to ministry activities.
• Management Board of Cabinet and Cabinet need evidence to support their decisions.

Underlying all of these reasons are two questions: “How well are we doing?” and “Can we do better?”

Systematic measurement and assessment of performance supports results-based planning by generating evidence of:

• how well existing government activities perform;

• the extent to which these activities are in the public interest and meet client needs;

• whether these activities are consistent with government priorities and expectations.


2.2 Who Uses Performance Measurement Information and Why?

Performance measurement information has four uses:

• demonstrating accountability for expenditure;

• supporting informed decision-making;

• providing accurate information about government-funded activities to the public;

• promoting continuous improvement of government-funded activities and the administration of government itself.

Different users have different information needs:

• external users like clients of government programs and citizens might use performance information to better understand government’s accomplishments and as a way to be more involved in the democratic process;

• internal users such as Ministers, senior management and central agency staff use performance information to support continuous improvement of ministry activities and strategies and in making strategic and resource allocation decisions.

The charts below summarize the different uses of performance measurement information by external and internal government users. For examples of effective use of performance measurement information in Ontario see Appendix 3.

Table 1: How government performance information is used externally

For citizens and stakeholders, performance measurement information serves three purposes:

Accountability
• To understand what the government thinks is important
• To see what results government produces with the money spent

Decision-Making
• To understand what the government thinks is important
• To see what results government produces with the money spent

Public Reporting
• Enables analysis, interpretation and evaluation of government performance
• Public reports demonstrate that government is making good use of performance information (e.g., improvements to programs/services and operational efficiencies)
• Citizen education about performance information can help citizens and stakeholders understand performance data and the various ways to use it, from how to ask questions and evaluate performance to how to influence public decisions

Table 2: Levels of organization at which performance measures are used internally

Accountability for Expenditure
• Across government: performance measures support the development of the Fiscal Plan and the Printed Estimates.
• Ministry level: performance measures support accountability for expenditure via Results-Based Plans and quarterly reporting to Management Board of Cabinet.
• Activity level: performance measures support accountability for expenditure via the Integrated Financial Information System, Management Board of Cabinet and periodic audits by the Internal Audit Division and the Ontario Provincial Auditor.
• Individual level: annual Deputy Minister performance contracts and individual performance plans.

Decision-Making
• Across government: performance measures are used at the level of Management Board of Cabinet and other Cabinet Committees to support expenditure and policy decisions in the context of broader government priorities.
• Ministry level: performance measures help identify or adapt ministry strategies to serve vital public interests and deliver on government priorities.
• Activity level: performance measures inform decision-making by providing evidence to managers within the ministry about how well objectives are being met, and by helping to identify gaps in service or manage risks to activity delivery by ministries or other broader public sector organizations.
• Individual level: performance measurement information guides individual decisions at all levels of the organization, from program managers to Cabinet Committee members.

Continuous Improvement
• Across government: performance measures promote horizontal coordination, communication and integration among government-funded services to improve public access and maintain a high quality of service.
• Ministry level: performance measures support continuous improvement by ensuring activities align with ministry strategies to meet objectives and that opportunities for integration among activities are maximized.
• Activity level: performance measures help staff and managers adapt activities to meet the changing needs of clients.
• Individual level: all Ontario Public Service employees can use performance measures to identify opportunities for continuous improvement.


2.3 Levels and Types of Measurement

There are different types and levels of performance measures for different uses. Later sections of this guide will explain how to develop the right type of performance measure for each purpose.

There are three levels of performance measurement used in the Ontario Public Service:

Output measures measure the tangible products or services that result from activities and are often raw data expressed in terms of frequency. “Number of (x)” is an output measure. For example, “Number of acres of green space protected through legislation” is an output.

Outcome measures (short-term and intermediate-term) measure the effects/impacts or results of the outputs. Outcomes provide information in relation to other information, or the broader context. Thus, outcomes contain a level of analysis. For example, “Percentage of new growth on existing developed land, e.g., infill, brown- or grey-fields" is an outcome measure because it measures the outputs (new growth) in relation to the existing developed land base.

High-level indicators measure social, environmental or economic conditions. High-level indicators provide important information about the broader context to which government must respond and are therefore useful to support decision-making. However, they are rarely influenced solely by government initiatives. For example, “Percentage of households living below the national poverty line” provides important information about the economic security of families, which is important to know when government is developing social policy. However, government programs in and of themselves are unlikely to effect a change in this measure.
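A worked numeric example may help fix the three levels. In the Python sketch below all figures are invented: the output is a raw count, the outcome relates a count to a broader base (the level of analysis described above), and the high-level indicator is a condition that government influences but does not control.

    acres_protected = 12_000           # output: a raw count ("Number of acres protected")
    total_new_growth = 50_000          # units of new growth this year
    growth_on_developed_land = 18_000  # units built as infill or on brown-/grey-fields

    # Outcome: the output placed in relation to a broader base.
    pct_growth_on_developed_land = growth_on_developed_land / total_new_growth

    # High-level indicator: a societal condition shaped by many factors beyond
    # any single program, reported for context rather than attribution.
    pct_households_below_poverty_line = 0.11

    print(f"Output: {acres_protected:,} acres protected")
    print(f"Outcome: {pct_growth_on_developed_land:.0%} of new growth on developed land")
    print(f"Indicator: {pct_households_below_poverty_line:.0%} of households below the poverty line")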

Short-term and intermediate-term outcome performance measurement information is required to support decisions concerning legislation, policy, resource requests and budget and allocations. In some cases, output information may also be valuable. Output or short-term performance measurement information is required for quarterly reports of progress towards meeting the objectives of ministry strategies and may be included in public reports.

Appendix 4 contains detailed descriptions of output measures, outcome measures and high-level indicators for ministries’ internal training purposes.

Identifying output measures and high-level indicators is relatively easy. Identifying good outcome measures can be difficult. The Ontario Public Service uses three types of outcome measures:

Efficiency: The extent to which an organization, policy, program or initiative is producing its planned outputs in relation to expenditure of resources.

Effectiveness: The extent to which an organization, policy, program or initiative is producing its planned outcomes and meeting intended objectives.

Customer Satisfaction: The degree to which the intended recipients or beneficiaries of a product or service indicate that the product or service meets their needs and expectations for quality and efficiency.
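Each of the three types reduces to a simple ratio. The Python sketch below shows one plausible calculation of each, with invented figures; it illustrates the definitions rather than any prescribed methodology.

    inputs_spent = 4_000_000   # dollars spent on the strategy
    actual_outputs = 9_500     # cases actually processed
    planned_outcome = 0.80     # planned share of cases resolved to client satisfaction
    actual_outcome = 0.74      # share actually achieved
    survey_satisfied = 412     # survey respondents reporting their needs were met
    survey_total = 520

    efficiency = actual_outputs / inputs_spent        # outputs per dollar of input
    effectiveness = actual_outcome / planned_outcome  # share of planned outcome achieved
    customer_satisfaction = survey_satisfied / survey_total

    print(f"Efficiency: {efficiency * 1000:.2f} cases per $1,000")
    print(f"Effectiveness: {effectiveness:.0%} of planned outcome")
    print(f"Customer satisfaction: {customer_satisfaction:.0%} of respondents satisfied")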


Ministries are already required to conduct customer surveys and, if adapted, these tools may also be used to support measurement of customer satisfaction. (See http://intra.pmed.mbs.gov.on.ca/ for more information on Ontario Public Service quality standards).

2.4 Summary of Key Points

• The Ontario Public Service uses performance measurement information at all levels of the organization to demonstrate accountability for expenditure, to support decision-making and continuous improvement, and to report to the public.


Checklist

Each outcome measure should measure one of the following: efficiency, effectiveness or customer satisfaction. Ministries should have a range of outcome measures.

Ministries are expected to use performance measurement information to support internal decision-making, policy submissions and requests for resources.

External and internal users of performance measurement information have different needs, and these uses need to be considered in the development of performance measurement systems.

The three levels of performance measurement used in the Ontario Public Service are output measures, outcome measures and high-level indicators. Output measures illustrate short-term progress; outcome measures demonstrate the impact of that progress.


SECTION 3: How to Develop Performance Measures

Key Terms

Attribution: The demonstrable assertion that a reasonable connection can be made between a specific outcome and the actions and outputs of a government policy, program or initiative.

Baseline: The level of results at a given time that provides a starting point for assessing changes in performance and for establishing objectives or targets for future performance.

Benchmarking: The process of measuring and comparing one’s own processes, products or service against a higher performing process, product or service and adapting business practices to improve.

Target: A clearly stated objective or planned result [which may include output(s) and/or outcome(s)] to be achieved within a stated time, against which actual results can be compared.


3.0 Overview

Developing good performance measures is not easy. Poorly integrated performance measurement systems can be worse than no system at all and may actually support poor decision-making.

"There is nothing so useless as doing efficiently that which should not be done at all." – Peter F. Drucker

There are six steps to establishing good performance measures:

1. Define the strategy in relation to government priorities.

2. Identify and consult on cross-ministry initiatives.

3. Identify the performance measures.

4. Check the measures for pitfalls.

5. Establish baselines.

6. Set performance targets.

The following sections take the reader through each step in some detail.

3.1 Developing Performance Measures

STEP 1: Define the strategy in relation to government priorities

The first step involves articulating how ministry strategies contribute to meeting government priorities or serving other important public interests. This takes time, involves many people and requires tools to help organize information logically.

A logic model is a tool that describes ministry activities as part of ministry strategies to meet government priorities. The framework for organizing the logic model presented in this guide is the same as that used by ministries to develop their results-based plans.

A logic model clearly shows the relationships among government priorities, ministries’ strategic objectives, and how ministry activities contribute to achieving those objectives and priorities through their expected outcomes. The process of creating a logic model and making the linkages among inputs, outputs and outcomes can help build common understanding of what is expected, prioritize activities and identify appropriate performance measures.

A standard logic model diagram is shown below. For examples of real logic models that ministries have created or found particularly useful see Appendix 5. Step-by-step instructions for how to develop a logic model are provided in Appendix 6.


Figure 2: Standard Logic Model Diagram

[Figure: a standard logic model. Across the top are the Objectives of the Strategy (what the strategy hopes to accomplish), the Customers (who benefits) and the Larger Public Interest (government priorities are usually here). Beneath, "Your Planned Work" (Inputs → Activities → Outputs) leads to "Achieving Objectives" (Desired Short-Term Outcomes → Desired Intermediate Outcomes → High-Level Changes).]

The logic model provides a foundation for developing performance measures that will support decision-making and is also the basis for meeting Management Board Secretariat reporting requirements.

The important points to remember when creating a logic model are:

• developing a logic model is a process that involves many people working together rather than a product which one person can produce;

• the information entered into the logic model at the early stages may need to be revised as new information is entered;

• it’s not just the information in the boxes that counts, but the relationships between the boxes.

Figure 3: Logic Model Process – how each component leads to the next stage

[Figure: what to list for each component. Inputs: list all inputs. Activities: list all activities. Outputs: list the tangible products of the activities. Desired Short-Term Outcomes: list the changes in participation, awareness, behaviour, compliance or capacity that are expected to result from the activities in the short term. Desired Intermediate Outcomes: list the benefits or changes in conditions, attitudes and behaviour that are expected to result from the activities in the intermediate term. High-Level Changes follow in the larger public interest.]
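As a rough companion to Figures 2 and 3, a ministry team could sketch a logic model as a simple data structure before drawing the diagram. Everything below, the field names and the example program alike, is an assumption for illustration, not a prescribed format.

    # An illustrative logic model for a hypothetical training program.
    logic_model = {
        "objective": "Improve access to employment for people facing barriers",
        "customers": ["unemployed residents", "employers"],
        "larger_public_interest": "A stronger, more inclusive labour market",
        "inputs": ["program funding", "staff", "training facilities"],
        "activities": ["deliver job-skills training", "match clients to placements"],
        "outputs": ["training sessions delivered", "clients placed in jobs"],
        "short_term_outcomes": ["increased participation and awareness"],
        "intermediate_outcomes": ["higher employment rates among participants"],
        "high_level_changes": ["reduced long-term unemployment"],
    }

    # The relationships between the boxes matter as much as their contents.
    chain = ["inputs", "activities", "outputs", "short_term_outcomes",
             "intermediate_outcomes", "high_level_changes"]
    for earlier, later in zip(chain, chain[1:]):
        print(f"{earlier} -> {later}")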


Is there agreement about what will be measured and how?

"If every general contractor defined a foot as the length of his own foot, our buildings and the construction industry in general would be in a state of disarray. Yet that is precisely what we have done in health care. Every hospital, practice and specialty society has traditionally measured key processes of healthcare in unique and differing ways." – Gene Beed, M.D., 1996 Conference of the (US) National Committee for Quality Assurance on Health Data

STEP 2: Identify and consult on cross-ministry and horizontal initiatives

The next step is to identify and consult with other ministries and broader public sector organizations that are contributing to the same result, or that are working to serve a common public interest, to:

1. Share performance measures for like activities.

2. Reduce the number of performance measures used.

3. Increase the value of each measure.

For example, both the ministries of Natural Resources and Municipal Affairs and Housing are involved in activities that protect natural areas or prime agricultural land from certain kinds of development. However, Natural Resources developed its activities with respect to “green space”, while Municipal Affairs and Housing developed its activities in relation to clearly defined “green belts”.

Rather than have two measures measuring slightly different things related to the same public interest, the ministries worked together to develop a mutually acceptable definition of “green space” and performance measures to which both will contribute data.

STEP 3: Identify the performance measures

Performance measures should:

• show how ministry activities contribute to achieving government objectives;

• provide key information for decision-making;

• capture all areas of significant spending;

• identify and track impact as well as progress towards meeting desired outcomes;

• incorporate consideration of risks and act as “thermometers” for risk management.

Data collection methods should ensure that measures are:

• Valid (actually measure what they are intended to measure);

• Reliable (different users of the same measure will report the same results); and

• Sensitive (able to measure change).


The SMART rule provides a way to test the strength of a performance measure:

Specific: Performance measures state clearly and concisely what will be measured. They focus on significant activities and capture all major areas of spending. They are outcome focused and not a list of activities.

Measurable: Performance measures should be quantified, even if based on qualitative data.

Achievable and Attributable: Performance measures relate to things that the ministry can influence and achieve.

Realistic: Performance measures are based on reliable, verifiable data that reflect the ministry/activity contribution to achieving government priorities and results.

Timely: Performance measurement data can be collected, processed and distributed within a useful timeframe and at reasonable cost.

Appendix 7 contains a complete checklist for testing performance measures.
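As a purely illustrative aid (no substitute for the Appendix 7 checklist), a review team might record its SMART judgements about a candidate measure in a structured form. The measure and the judgements below are invented.

    # Invented SMART screening of a candidate measure.
    criteria = ["specific", "measurable", "achievable_and_attributable",
                "realistic", "timely"]

    candidate = {
        "measure": "Percentage of inspected sites meeting safety standards",
        "specific": True,
        "measurable": True,
        "achievable_and_attributable": True,
        "realistic": True,
        "timely": False,  # assumed: data arrive 18 months after year-end
    }

    failed = [c for c in criteria if not candidate[c]]
    if failed:
        print(f"Revise '{candidate['measure']}': fails {', '.join(failed)}")
    else:
        print(f"'{candidate['measure']}' passes the SMART screen")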


What does the measure really measure?

In Ontario, in 2000, it was illegal to drag a dead horse down a main street. Was the fact that no dead horses were dragged down main streets in Ontario in 2000 proof that the law was working?

STEP 4: Check the measures for pitfalls

The most common pitfalls in performance measurement are attribution problems and measurement corruption, and the only way to detect and avoid them is to know how they may arise. Ask enough questions to be reasonably confident that the pitfalls don’t undermine the usefulness of the measure.

Attribution

Attribution is the assertion that a connection can be made between an outcome and the actions of a government policy, program or initiative. Determining attribution for outputs is relatively straightforward as outputs are the tangible products produced through activities. Demonstrating attribution for outcomes is more complicated because a number of intervening factors, in addition to the activities, may contribute to the outcome. Creating a logic model is the strongest method for identifying the contribution of an activity to the achievement of intended results.

To test for attribution, ask the question, "Is the result we are measuring produced by our actions?" Sometimes the answer is clear, as in the example above: the fact that no dead horses were dragged down main streets of Ontario in 2000 is a better indicator that the law is no longer relevant than that the law is working. Other times, the answer is less clear.

Government activities are often not solely responsible for the results achieved in relation to their objectives. There may be factors or events over which the government has no control that affect the outcome. For example, in 2003, the tourism industry in Ontario was negatively affected by the outbreak of SARS, despite ongoing government-funded programs to increase tourism in the province.

Thus, when designing performance measures, it is important to measure the contribution that government-funded activities make, or the influence they exert, rather than measuring only those things over which government has direct control. At the same time, it is important to maintain an awareness of changes that could affect the results. These changes are external risk factors and include broader economic, environmental and sector-specific trends. Clearly identifying the risk factors also qualifies the performance measure so the results reported can be interpreted correctly.

When we think of problems of attribution, we usually think about how performance measurement information is used to claim undue success. But problems of attribution can also make success appear as failure, as illustrated in the example below.


If you can’t see success, you can’t reward it

In the US, the Social Services department was responsible for assuring the welfare of children who were "wards of the state." Though the state did not directly deliver services to children, it remained accountable, and it established a whole range of output measures to ensure children were well cared for. For example, third-party providers of care to children had to report to the Department of Social Services on how many children received their annual medical and dental examinations, were in full attendance at school, and so on. The results were rolled up into a Social Services Department performance measure of "compliance."

Compliance rarely reached 60 per cent and the Social Services department was concerned. On further examination it learned that children might have missed a dental examination because they had the flu (and therefore could not be treated safely) or had missed school because they were in transition to adoption. In other words, low compliance by the third-party delivery agent was not necessarily an indicator of poor care; sometimes the lack of compliance was an indicator of high-quality service to children that went unrecognized. The Social Services department re-examined its compliance indicators in consultation with third-party service providers. Not only did it obtain higher-quality information, rates of compliance increased, as did public confidence in Social Services.

The example above also raises an important area in which attribution issues arise – in situations where a broader public sector organization or agency delivers service on behalf of the government, but for which the government remains accountable for the result. Where such organizations also have a high degree of independence, ministries sometimes have reservations about their ability to demonstrate attribution and therefore accountability for the results achieved by broader public sector organizations. This is why it is essential to involve broader public sector organizations and agencies in the creation of logic models for ministry strategies to which their activities contribute. Obtaining agreement about the objectives and strategies and involving third party organizations in the identification of appropriate performance measures is critical to resolving problems of attribution.


Figure 5: Good performance measures measure the contribution of activities to government priorities

[Figure: ministry strategies and activities produce Outputs 1–3, which contribute, across an "attribution barrier", to Outcomes 1 and 2 and the desired results, and ultimately to government priorities. Factors outside the ministry’s control also influence the outcomes.]


What gets measured gets done

The Russian furniture industry, in an attempt to increase efficiency, adopted a pay-for-performance policy in which workers were paid a commission on every pound of furniture produced. Production rates remained stable, but Russian furniture became the heaviest furniture in the world.

If you aren’t rewarding success, you’re probably rewarding failure

In the UK, the national government provided funding to support training programs targeted to people who faced significant employment barriers and had been unemployed for more than two years. The training was provided on a four-year contractual basis by private sector companies whose contracts were renewed on the basis of performance – the number of successful graduates. After four years, the training companies had met their targets and their funding was renewed. However, research showed that there had been no significant change in the target group. This was because the training companies had not admitted members of the target group to the training: they were more difficult to train, less likely to complete within the contract time, and would therefore jeopardize the company’s ability to meet its targets.

Good measurement illustrates the contribution that activities make without falling into the trap of incorrectly attributing all success or failure to the activity.

Even when all appropriate steps have been taken, attribution can remain an issue. This is why providing a qualitative description and analysis of performance information is valuable. It explains the larger context and data collection methods, the limitations on the interpretation of the data, and the key risk factors so that attribution issues are clearly identified. In this way, users of information are less likely to use the performance measurement information inappropriately.

For a more detailed discussion of attribution and how to avoid attribution problems, see John Mayne’s article, “Addressing Attribution Through Contribution Analysis: Using Performance Measures Sensibly,” (Office of the Auditor General of Canada, 1999), which can be found at the following internet address:

http://www.oag-bvg.gc.ca/domino/other.nsf/html/99dp1_e.html/$file/99dp1_e.pdf.

Measurement Corruption

The second pitfall of performance measurement is measurement corruption: people adapt their behaviour in relation to the thing being measured, as in the furniture example above. It is important to try to anticipate any undesirable behaviours the measure could inspire and revise the measure accordingly. The UK training example above shows how measurement corruption can also arise in the context of target setting.

The risk of measurement corruption in performance measurement is relatively high, due to our natural desire to be successful and our boundless creativity and adaptability. It is rarely intentional and is usually produced by the measurement itself.

Performance measures are less likely to become corrupted if all those involved understand the measure and what outcome it is intended to capture, and agree that the choice is relevant and appropriate. This is another reason why everyone affected should be involved in the development of the logic model and the identification of performance measures.

Another way to deal with the effects of measurement corruption is to ensure performance measures are part of a system of measurement. If one measure becomes corrupted, other measures may signal that corruption by their lack of change. The UK training example illustrates this point: the lack of change in employment rates of the target population provided a clue that the measurement of training completion may have been corrupted, by raising the question, "Why isn’t there change for the target group as a result of the government-funded program to improve its access to employment?"
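The same signal can be seen in miniature with invented numbers: when an output measure keeps improving while the related outcome measure stays flat, the system of measures flags a possible corruption problem. The figures below are assumptions for illustration only.

    # Invented figures: completion rates rise while the outcome stays flat.
    completion_rate = [0.62, 0.71, 0.80, 0.88]   # output measure, by year
    employment_rate = [0.31, 0.31, 0.32, 0.31]   # outcome measure, by year

    output_gain = completion_rate[-1] - completion_rate[0]
    outcome_gain = employment_rate[-1] - employment_rate[0]

    # A large output gain with no outcome movement warrants a closer look.
    if output_gain > 0.10 and abs(outcome_gain) < 0.02:
        print("Outputs improved but the outcome did not move; "
              "check the output measure for corruption.")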


If you can’t recognize failure you can’t correct it

A transitional employment (TE) program was designed to place clients with mental illness – particularly those with little work history – in short-term, entry-level jobs on a rotational basis. At any given time there were 20 jobs and 300 clients. Program staff were certain that all clients were being given the opportunity to participate. When a new performance measurement system was established, staff were surprised to learn that most of the time the same 20 clients were being offered the TE jobs as they became available. The fast pace of the program, the heavy demands on staff time and the lack of objective information had resulted in available jobs being offered to clients with the strongest employment record – a total contradiction of the program’s fundamental purpose. In response to the performance information, the program changed its job allocation practices to ensure that everyone on the wait list, particularly the most disabled clients, was given access to TE jobs.

Unintended Consequences

Performance measures can also help to identify unintended consequences of implementing a policy or program. In the transitional employment example above, staff assumed they were treating all clients equally, but performance measurement demonstrated that the fast pace of the program led to jobs being given to those clients who were already the most prepared for employment.

Measuring performance will not “save the world.” Performance measures are a tool. They help us to:

• discover quickly if progress is being made toward objectives as expected and to tell the difference between success and failure;

• make ongoing decisions about how to best use resources, when to make changes to services and identify existing gaps;

• identify when corrective action may be needed and guide what that action should be;

Most government activities are on a large scale and delivered over vast regions. Performance measurement helps us to know when we are successful and to identify potential problems quickly.

STEP 5: Establish Baselines

A baseline is the level of results at a given time that provides a starting point for assessing changes in performance and establishing objectives or targets for future performance.

Ministries are expected to have identified baselines for existing performance measures. When a ministry introduces a new performance measure, it should establish the baseline during the first year the performance measure is in use. Every time performance measures or data collection methods are changed, new baselines need to be set.

STEP 6: Set Performance Targets

A target is a clear and concrete statement of planned results (including outputs and outcomes) to be achieved within the time frame of parliamentary and departmental planning and reporting, against which actual results can be compared. Every ministry is asked to use comparative data to set targets based on its own past performance, established industry standards, customer preference and/or the performance of a comparable organization. Reasonable targets are challenging but achievable.
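A minimal sketch of how a baseline, a target and an actual result fit together; the figures are invented for illustration.

    # Invented figures: a compliance-rate measure.
    baseline = 0.78   # level of results when measurement began
    target = 0.85     # planned result for the reporting period
    actual = 0.81     # result actually achieved

    improvement = actual - baseline
    progress_to_target = improvement / (target - baseline)

    print(f"Improvement over baseline: {improvement:+.0%}")
    print(f"Progress toward target: {progress_to_target:.0%}")
    print("Target met" if actual >= target else "Target not yet met")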


Setting annual and long-term targets

Performance targets set expectations for results and are the basis from which measurement takes place and improvement begins. Without targets, it isn’t possible to know whether the organization is improving or falling behind in achieving results in priority areas.

Targets show whether the ministry proposes to meet or exceed the standards for performance. They should be clear and quantifiable and expressed in absolute number, percentage or ratio terms. They should also define the timeframe within which they will be achieved. Targets are used as a key tool to drive, measure, improve and control performance.

Benchmarking

Benchmarking is the process of measuring and comparing one’s own processes, products or services against a higher performing process, product or service and adapting business practices to improve performance. Where possible, ministries should benchmark their performance information against other Ontario Public Service ministries and public or private sector standards. Benchmarking performance information against another internal or external high performing organization may help ministries to identify more effective and efficient processes for achieving intended results. For an example of how one jurisdiction used benchmarking to demonstrate its performance see Appendix 8.

There are three commonly accepted forms of benchmarking:

• Standards Benchmarking: Setting a standard of performance in relation to the performance of other organizations, which an effective organization could be expected to achieve. The publication of a challenging standard can motivate staff and demonstrate a commitment to improve services;

• Results Benchmarking: Comparing the performance of a number of organizations providing a similar service. In the public sector, this technique can allow the public to judge whether their local provider makes effective use of resources, compared to similar providers (a minimal sketch follows this list);

• Process Benchmarking: Undertaking a detailed examination within a group or organization of the process that produces a particular output. This is done to understand the reasons for the variances in performance and incorporate best practices.
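To make results benchmarking concrete, the sketch below compares the unit cost of a similar service across providers and takes the lowest as the benchmark. The providers and figures are invented for illustration.

    # Invented example of results benchmarking across similar providers.
    providers = {
        "Provider A": {"cost": 1_200_000, "units": 40_000},
        "Provider B": {"cost": 950_000, "units": 38_000},
        "Provider C": {"cost": 1_500_000, "units": 42_000},
    }

    unit_cost = {name: p["cost"] / p["units"] for name, p in providers.items()}
    benchmark = min(unit_cost, key=unit_cost.get)

    for name in sorted(unit_cost, key=unit_cost.get):
        print(f"{name}: ${unit_cost[name]:.2f} per unit of service")
    print(f"Benchmark (lowest unit cost): {benchmark}")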

While there is currently no formal requirement for ministries to benchmark their performance, it is expected that such comparisons will be incorporated into all public reporting by 2007.

3.2 Summary of Key Points

There are six steps to establishing good performance measures:

1. Define the strategy in relation to government priorities.

2. Identify and consult on cross-ministry initiatives.

3. Identify the performance measures.

4. Check the measures for pitfalls.

5. Establish baselines.

6. Set performance targets.


Checklist

Ministries should use logic models to illustrate relationships among activities, strategies and the results to be achieved.

Ministries should consult with third party service providers, broader public sector organizations and other ministries to align performance measurement systems to promote greater coordination and avoid duplication.

Performance measures should follow the "SMART" rule and subscribe to the following criteria:

• Show how ministry activities contribute to achieving government priorities and results;

• Use reliable, verifiable and consistent data collection methods;

• Provide key information to support decision-making;

• Capture all areas of significant spending;

• Identify and track impact as well as progress towards meeting desired outcomes.

Outcome measures should be developed to demonstrate the achievement of ministry strategies.

Output measures should be developed to demonstrate the progress that ministry activities make towards achieving the objectives of the strategies.

Targets are required for all performance measures.

Baselines are required for all performance measures.


SECTION 4: Reporting Performance Measures

Key Terms

Public Performance Reporting: The formal mechanisms that a government uses to communicate with the public and legislatures in accordance with agreed guidelines.

CCAF: Canadian Comprehensive Auditing Foundation is a national non-profit research and education foundation that has researched public sector governance, accountability, management and audit.

Quarterly Reporting: Ministries report quarterly to Management Board of Cabinet on their expenditures, progress towards meeting objectives and the risks to achievement.

Non-Financial Information: Information about the quality and effectiveness of government services; for example, levels of access to health care in remote areas.

Financial Information: Information that assists government in accounting for the costs of operations for citizens and the legislature.


4.0 Introduction

Ministries are expected to report performance measures as part of:

• Results-based planning: Ministries will be required to report annually through the results-based planning process on their progress in meeting the performance targets associated with the strategies identified in their previous year’s results-based plan.

• Requests for resources or policy change and in Management Board of Cabinet report backs. In these cases, performance measures are used to demonstrate the need for resources or policy, and/or to provide evidence of progress to date towards meeting objectives. Ministries will be expected to demonstrate performance in all requests for resources or policy approval.

• Quarterly reporting. Ministries report quarterly to Management Board of Cabinet on their progress towards meeting objectives and on the risks to achievement.

• Public Reporting: The government may use performance measurement results to report to the public on achievements.

• Deputy Ministers’ performance contracts: Performance measurement continues to be a key part of Deputy Ministers’ performance contracts and ministries will be required to report annually on progress towards the achievement of annual targets.

4.1 Canadian Comprehensive Auditing Foundation / La Fondation canadienne pour la vérification intégrée (CCAF/FCVI) Principles for Reporting

The format of public reporting on performance measures is changing. The emphasis is increasingly on providing the larger context to explain how and why particular results were achieved. This change in emphasis can be seen through efforts to:

• Ensure ministry performance measures are aligned with strategies and government priorities;

• Create cross-ministry measures, reducing the number of measures and increasing the coordination of ministries contributing data to the outcome measure;

• Incorporate the CCAF/FCVI Principles for Public Reporting into all government reporting vehicles.

The Canadian Comprehensive Auditing Foundation (CCAF/FCVI) has developed a set of nine principles that represent the “next generation” of reporting for modernizing governments. These principles have been officially adopted by the federal government, the governments of British Columbia (BC), Saskatchewan and others, and were referred to in the 2004-05 Ontario Budget as a model for improving our reporting on performance.

The performance measure template and the quarterly reporting requirements are being adjusted to incorporate a more comprehensive analysis of the results achieved. These changes will facilitate the adoption of the CCAF/FCVI principles. The following chart provides a summary of the nine principles. A full discussion of the CCAF/FCVI principles is located at: http://www.ccaf-fcvi.com/english/documents/executive_summary.pdf.


Focus on the few critical aspects of performance

This principle is intended to add focus to reporting. By focusing on the priorities of an organization and the needs of the users, the usefulness and the quality of performance reports is greatly improved. The focus of reporting should be driven by the likely use of the information. Since performance reports should focus on a few critical aspects of performance rather than present volumes of information, it is especially important to tell the reader why the reported measures were selected and how they relate to the overall strategic direction of the organization. As part of the results-based planning process, ministries are identifying the critical performance measures that align strategies and activities with the government’s priorities.

Disclose the basis for reporting

This principle deals with providing disclosure on the choices made and the items to be reported, as well as changes in measurement or presentation and steps taken to ensure the reliability of data. This is necessary to increase user confidence in the information being reported.

Present credible information, fairly interpreted

Public performance reporting should be based on credible quantitative and qualitative information fairly interpreted and presented. Publicly reported information should be consistent, relevant, reliable, understandable and fair.

Provide comparative information

This principle encompasses two distinct types of comparative information: historical trend information for the organization over time, and comparison of the organization’s performance to other similar entities.

Integrate financial and non-financial information

Meaningful public reporting must put results information in context with their associated costs. Only when results are presented in relation to the resources consumed can the user assess and evaluate results. Relating resources to strategies and to results is essential if the public is to understand the value that government gets for its money. Ministries are asked to report on the costs of delivering strategies in relation to the achievement of government priorities.

Explain other factors critical to performance

Other factors that could impact on results include changing economic, social or demographic conditions; standards of conduct, ethics or values; public perception of performance or acceptance of objectives; and unintended impacts. This information is essential to assessing the degree to which the results achieved can be attributed to government actions. Where attribution is less than complete, an explanation of other critical factors serves to guide interpretation of the results and qualifies the assessment of performance.

Explain key capacity considerations

The term "capacity" refers to an organization’s ability to achieve results at a specified level. A number of internal factors can impact on an organization’s results, including the quantity and quality of human, financial, intellectual, technological or physical resources available. Disclosure and discussion of capacity are important to meaningful public reporting. For example, starting up an activity may mean that the development of capacity is the single most significant aspect of performance. Introducing new delivery approaches may introduce new risks, require new techniques or processes, or call for different skill sets. Consideration of capacity has been built into the new performance measurement template.

Explain key risk considerations

Public performance reporting should identify key strategic risks, explain their influence on policy choices and performance expectations, and relate results achieved to the risks and the level of risk accepted. The identification and evaluation of risks has been incorporated into both the quarterly reporting requirements and the new performance measurement template. Ministries are asked to use Modern Controllership risk assessment tools and provide a summary of both internal and external risk factors in all future reporting.

Look forward as well as back

This principle focuses on the use of performance measures as predictive tools (identifying what the desired outcomes and accompanying targets are) and on reporting achievements. By identifying government priorities and results, the government is "looking forward." Reports to the public will report on achievement towards addressing the priorities. Reporting results in the context of a previously released plan is a major transition that supports the "look back" portion of this principle.


Appendix 1: Performance Measurement Resources in the Ontario Public Service

Performance Measurement and Program Evaluation Team, Program Management and Estimates Division, Management Board Secretariat

• Sets Ontario Public Service policy on performance measurement;

• Offers portfolio-based advice to ministries on performance management;

• Maintains a database of Ontario Public Service performance measures and coordinates performance measurement reporting;

• Coordinates the Performance Measurement and Program Evaluation Network for ministry staff involved in the above activities.

The PMED website http://intra.pmed.mbs.gov.on.ca includes links to:

• the Performance Measurement Guide;

• web links to international Performance Measurement literature, resources, and cross-jurisdictional comparisons;

• web links to Ontario Public Service Ministry materials about their own performance measurement and management systems.

The Modern Controllership Training Unit, Fiscal and Financial Policy Division, Ministry of Finance offers free courses on performance measurement to support ministries in their efforts to develop performance measurement systems and monitor and report on performance. Information can be found on their website: http://intra.mc.fin.gov.on.ca.

Other Resources

Policy Innovation and Leadership (PIL) supports the policy community by building knowledge networks, promoting continuous learning, sharing innovative tools and practices, and profiling excellence. Enhancing our capacity to deliver cross-cutting and well-informed policy analysis in support of the government’s priorities will put the Ontario Public Service at the leading edge of public policy innovation. (PIL – About Us)


Appendix 2: Glossary of Performance Measurement Terms

Introduction

The following definitions are based on a number of sources and have been tailored for a variety of Ontario Public Service purposes such as program evaluation, performance measurement, risk management, results-based planning and controllership. These definitions are intended to clarify and encourage a common, consistent language within the Ontario Public Service governance, accountability and planning frameworks.

Definitions

Accountability

The obligation to answer for results and the manner in which responsibilities are discharged. Accountability cannot be delegated. Responsibility is the obligation to act whereas accountability is the obligation to answer for an action.

Activity

An activity is the work performed by ministries to implement public policy and provide services to the public. All activities consume resources and produce products and/or services. One or more activities will be critical to the achievement of overall public policy objectives. Ministries must be able to demonstrate a direct causal link between the activity and the outcome(s).

Attribution

The demonstrable assertion that a reasonable connection can be made between a specific outcome and the actions and outputs of a government policy, program or initiative.

Baseline

The level of results at a given time that provides a starting point for assessing changes in performance and for establishing objectives or targets for future performance.

Benchmarking

The process of measuring and comparing one’s own processes, products or services against a higher performing process, product or service and adapting business practices to improve.

Cost Benefit Analysis

A process that assesses the relation between the cost of an undertaking and the value of the resulting benefits.

Cost Effectiveness

The extent to which an organization, program, etc. is producing its planned outcomes in relation to use of resources (inputs).


Cross-ministry Measure

Measure with a desired result to which more than one ministry, but not all ministries, contribute. (See also horizontal measure).

Customer

The person, whether inside or outside the organization, to whom services or products are delivered.

Customer Satisfaction

The degree to which the intended recipients or beneficiaries of a product or service indicate that the product or service meets their needs and expectations for quality and efficiency.

Effectiveness

The extent to which an organization, policy, program or initiative is producing its planned outcomes and meeting intended objectives.

Efficiency

The extent to which an organization, policy, program or initiative is producing its planned outputs in relation to expenditure of resources.

Evaluation

The systematic collection and analysis of information on the performance of a policy, program or initiative to make judgements about relevance, progress or success and cost-effectiveness and/or to inform future programming decisions about design and implementation.

High Level Indicator

A measure of changes in social, environmental or economic conditions.

Horizontal Measure

Measure with a desired result to which most or all ministries contribute. (See also Cross-ministry measure).

Indicator

A quantitative or qualitative ratio or index used to signal and indirectly measure the performance of a program over time.

Inputs

The resources (human, material, financial, etc.) allocated to carry out activities, produce outputs and/or accomplish results.

Intermediate Outcomes

Benefits and changes in behaviour, decisions, policies and social action attributable to outputs to demonstrate that program objectives are being met, e.g., increased employability as a result of a training program.

Long-term outcomes

The ultimate or long-term consequences for human, economic, civic or environmental benefit, to which government policy or legislation contributes, e.g., life expectancy rates, overall economic performance.


Monitoring

The process of collecting and analyzing information to track program outputs and progress towards desired outcomes.

Objective

Achievable and realistic expression of a desired result.

Outcome

The actual effects/impacts or results of the outputs. See Short-term Outcomes, Intermediate Outcomes, Long-term Outcomes.

Output

The products or services that result from activities.

Performance-Based Budgeting

A method of allocating resources that connects inputs provided to the achievement of pre-defined objectives. Performance-based budgeting is focused on what can be achieved for a given amount and allows for comparisons between expected and actual progress.

Performance measure

Quantifiable information that provides a reliable basis for directly assessing achievement, change or performance over time.

Performance measurement

The process of assessing results, e.g., the degree to which objectives are being achieved.

Priority

Higher-order goals of government that reflect its commitment to citizens and contribute to enduring human, economic, civic and environmental benefits.

Qualitative data

Non-numeric information collected through interviews, focus groups, observation and the analysis of written documents. Qualitative data can be quantified to establish patterns or trends, e.g., improvement in a child’s reading level as observed by parents and teacher.

Quantitative data

Information that is counted, or compared on a scale, e.g., improvement in a child’s reading level as measured by a reading test.

Reliability

The extent to which measurements are repeatable and consistent under the same conditions each time.

Result

A condition (outcome) or product (output) that exists as a consequence of an activity.

Results-based Management

A comprehensive, government-wide approach that informs results-based decision-making, ensuring that all government-funded activities are aligned with strategies that contribute to meeting government priorities or serve an important public interest.


Risk

The chance of something happening that will impact on the achievement of objectives.

Risk Management

The active process of identifying, assessing, communicating and managing the risks facing an organization to ensure that an organization meets its objectives.

Short-term Outcomes

First-level effects of, or immediate response to the outputs, e.g., changes in compliance rates or degree of customer satisfaction.

Standards

Pre-defined quantifiable levels of performance that are commonly understood and agreed upon and are the basis for judging or comparing actual performance.

Strategy

Plan outlining how specified ministry activities and programs contribute to a government priority and results or other important public interest.

Target

A clearly stated objective or planned result [which may include output(s) and/or outcome(s)] to be achieved within a stated time, against which actual results can be compared.

Validity

The extent to which a measurement instrument accurately measures what it is supposed to measure. For example, a reading test may be a valid measure of reading skills, but it is not a valid measure of total language competency.

Vote

The numerical designation of a program or a group of activities within the Government of Ontario’s Printed Estimates.


References

Canadian Comprehensive Auditing Foundation (CCAF/FCVI), 2002. Reporting Principles: Taking Public Performance Reporting to a New Level.

Canadian Evaluation Society, 1999. Evaluation for Results.

Gaster, C., 2000. "Business Plan 101," Behavioural Healthcare Tomorrow, February issue.

Governmental Accounting Standards Board of the Financial Accounting Foundation (GASB), 2003. Reporting Performance Information: Suggested Criteria for Effective Communication.

Management Board Secretariat, 2000. "Glossary," in Performance Measurement Guide.

Management Board Secretariat, 2000. Business Planning and Allocations Directive.

Management Board Secretariat, 2000. Performance Management Operating Policy.

Management Board Secretariat, 2003. Guide to Program Evaluation.

Management Board Secretariat, 2003. Guidelines and Instructions: Deputy Ministers and Senior Management Group Performance Management Plan 2003–2004.

Management Board Secretariat, 1999. Making Measures Count: Integrating Performance Measurement into Decision Making.

Management Board Secretariat, 2000. Performance Measurement in the Business Planning Process.

Mayne, John, 2003. Reporting on Outcomes: Setting Performance Expectations and Telling Performance Stories. Ottawa: Office of the Auditor General of Canada.

Ministry of Education, 2003. 2004/05 Business Planning Training slides.

Ministry of Finance, 2003. "Modern Controllership Glossary" (MC – Glossary).

Ministry of Finance, Modern Controllership Training Unit, 2003. "Governance and Accountability in the Public Sector."

Schacter, Mark, 2002. "What Will Be, Will Be." Ottawa: Institute on Governance.

Treasury Board Secretariat, 2001. Guide for the Development of Results-based Evaluation and Accountability Frameworks: Lexicon. Ottawa: Treasury Board Secretariat.

UK Evaluation Society, 2003. "Glossary of Evaluation Terms," http://www.evaluation.org.uk/Pub_library/Glossary.htm.

UNDP (United Nations Development Programme), 1997. Results-Oriented Monitoring and Evaluation Handbook.

UNESCO (United Nations Educational, Scientific and Cultural Organization) Internal Oversight Service, 2003. Glossary of Main Evaluation Terms.


Appendix 3a: Examples of How Ontario is Using Performance Measurement

PRISM: The Ontario Provincial Police’s Performance Reporting Information Systems Manager

Forthcoming in Chief Information Officer magazine, 2004

Senior level decision makers in the Ontario Provincial Police were interested in finding a way to gain direct access to the knowledge they required to make decisions.

As an organization, the OPP knew that they were providing outstanding service (according to a 2001 survey completed by Leger Marketing for the Canadian Press Association, 88% of Ontarians are satisfied with the work of the OPP), but decision makers felt that, to be able to continue to improve and provide better services, they needed better access to the information they collected.

If this was really going to work, the OPP understood that they required a solution that:

• was a business enabler, not just a piece of technology;

• addressed the priorities of the organization (which are rarely technology driven);

• could be used by everyday "non-techie" people;

• provided an accountability framework;

• was designed to allow sophisticated analysis and score-carding without requiring users to have expertise in statistical analysis;

• could be developed and implemented quickly and within limited budgets that reflect today’s realities;

• was built on a technology that addressed the concerns and priorities of the IT community.

To meet these requirements, the OPP developed PRISM – Performance Reporting Information Systems Manager. PRISM is a performance measurement application designed to give senior-level decision makers direct access to critical data required for decision-making purposes. PRISM provides the OPP, specifically members of the Commissioner’s Committee, with the ability to intuitively plan, monitor, evaluate and control expenditures and activities.

Under the leadership of Deputy Commissioner Bill Currie, members of the OPP’s Strategic Services Command worked with an Ontario-based company, ABS System Consultants, to develop PRISM.

The OPP used ABS’ performance measurement and decision support platform, Metrics3D, as the foundation for PRISM. Unlike virtually all other business intelligence technologies, Metrics3D is a comprehensive "all-in-one" package, combining numerous components that normally require several different and often incompatible software products, including:

• Performance Measurement

• Decision Support

• Risk Management

• OLAP

• Benchmarking

• Scorecard


• Advanced Analysis and Reporting

• Dashboards

• Meta Data Dictionary Software

PRISM works by accessing data the OPP maintains in a number of disparate internal operational systems and cross-referencing it with other sources of data, including data from other police jurisdictions and socio-demographic census data. Users can explore their organization’s data in a variety of ways, including comparative analysis, peer analysis, cross-tab (OLAP), time-series detailed graphs, benchmarking, and dashboards.

"PRISM allows our organization to analyze a number of critical factors as they relate to performance, quickly," stated Currie. "I now have the ability, on my desktop, to instantly look at our organization from a variety of different perspectives and evaluate the performance and effectiveness of our services," he continued.

According to Staff Sgt. Jeff Steers, PRISM’s project manager, "PRISM can be used by everyone in our organization, including those who are not very IT literate. PRISM provides our people with more than the ability to just drill up and down in an existing report. The point-and-click interface allows them to easily create new reports and what-if scenarios from hundreds of indicators."

Why did the PRISM implementation work so well? According to Staff Sgt. Steers, as with any successful implementation, there are several reasons.

First, the OPP chose software that was designed specifically for the public sector and could be tailored to address the organization’s unique priorities. Second, the OPP teamed up with experienced consultants who understood how to balance the complex relationship between the business priorities of the organization and what was technically feasible. Third, an executive leader was willing to think "outside the box" and champion a vision of what users truly required.

Deputy Commissioner Currie noted that the OPP is currently working to roll out the application to the rest of the organization via the web, including regional and local detachments. "PRISM is changing the way the OPP provides services and demonstrates how the organization is accountable to the Province of Ontario," Currie said.


Appendix 3b

Municipal Performance Measurement Program


For more information, call (416) 585-6841, e-mail [email protected] or visit www.mah.gov.on.ca.

Performance Matters!

Welcome to the first edition of Performance Matters! This newsletter, produced by the Ministry of Municipal Affairs and Housing, is the first in a series on performance measurement in the municipal sector. We trust it will help readers better understand the concept and benefits of performance measurement. In the short term, we hope it provides opportunities for learning about how other municipalities use the practice and for sharing lessons learned. In the long term, we’d like it to contribute to the growing movement of performance measurement in the public sector.

Performance measurement is not a new concept. Numerous Ontario municipalities measure their performances in key service areas. Many have participated in performance measurement studies, sharing and comparing their results to learn new ways of improving service delivery. This newsletter highlights the initiatives of two municipalities: Thunder Bay and Guelph. Future editions will showcase the equally successful efforts of other Ontario municipalities.

Guelph

In January 1998, the City of Guelph established a performance measurement system. The city views performance measurement as a way to permit valid comparisons that lead to new and improved ways of doing business. David Kennedy, Guelph’s director of finance and head of its performance measurement project, believes that performance cannot be judged as absolutely good or poor. For municipalities, the comparison of performances – over time or between organizations – is what really provides useful information on trends and alternative process systems. With its new system, the city wanted to find out whether it was providing the right services at the right time and in the right quantity.

The first steps were to establish a project team and set its objectives, which were to

• run pilot projects in select service areas in 1998 and, from these, develop "lessons learned" to use in a subsequent city-wide rollout of the new performance measurement system;

• roll out the new system city-wide in the 1999 fiscal year;

• tie the performance measurement system to the city’s new financial-management and budgeting system; and

• tie the performance measurement system to the city’s financial and non-financial strategic goals.

The first pilot projects were done in the areas of transit, planning, public works and museum services. Additional projects were developed for waste-water treatment, waterworks, solid-waste management, building inspection and parks and recreation services. Staff working in these areas decided what activities or indicators to measure and how to collect the data. They also chose time frames for measuring, methods for presenting and displaying the results and the goals they hoped to achieve. During the balance of 1998, staff collected and plotted monthly data.

The city’s project team met every second week to swap experiences, share information and solve any problems. Early in 1999, the team identified lessons learned and recommended a process for rolling out the system to other service areas. Among the lessons was the need to evaluate performance measurement initiatives in other municipalities as well as in the private sector.

(continued on page 2)


Thunder Bay

In 1994, Thunder Bay created a division called management studies to help other city divisions and departments maximize their operating potential, improve their efficiency and effectiveness in service delivery and seek out best practices.

In 1996, this division helped the finance department examine payment options for municipal water, property tax and telephone bills. This review was prompted by three factors: the growing number of choices available in the private and public sectors; the technological advances that reduced the cost of previously cost-prohibitive alternatives; the demands from customers for a broader range of options.

The division of management studies looked at what was involved in paying bills through banks, credit cards, debit cards, pre-authorized payments, telephone and mail. By benchmarking, or comparing processes used by other North American cities, it discovered that 93 per cent of municipalities accepted payments for tax and water bills through banks and that payment by credit card was the method used least.

It also learned that

• the most efficient method was to pay bills through a bank;

• mail payments processed through banks cost less than mail payments processed through the city; and

• pre-authorized payment was the most cost-effective method if processing was done through the city.

It was evident that allowing customers to make payments at banks would improve service and save the city an estimated $92,723 a year. Local banks, however, were not keen on taking on this business because they were trying to reduce the volume of transactions made over the counter.


Guelph (continued from page 1)

The pilot projects went well and taught participants a great deal. As Kennedy says, "The reason we had pilot projects was to learn from them. We documented the good, the bad and the ugly and shared that information to allow other departments a smoother transition into their performance measurement systems."

Good results included improved productivity, communication and work-flow scheduling. The team also identified new ways to make service delivery more efficient and effective. Overall, service delivery improved.

For example, the city's fleet-services division is responsible for maintaining more than 500 vehicles. It measured its performance in shop effectiveness by tracking the time spent on redoing repairs to get them right. It found that almost six per cent of work done in 1999 was spent redoing repairs. As a result, the division made improvements that reduced the number of additional repairs by more than 50 per cent.

Guelph has since made its performance measurement system integral to all its operations. It views the system as providing an excellent focus for identifying problems and effecting improvements. The city is also involved in forming a new association, the Municipal Performance Improvement Association, to encourage the development of new performance measures and the sharing of knowledge and experiences. For more information, contact David Kennedy by phone at (519) 837-5610 or by e-mail at [email protected].

(continued on page 3)



Thunder Bay (continued from page 2)

Faced with this obstacle, the city expanded its acceptance of pre-authorized payments. In measuring its performance with this new process, the city showed positive results. The cost per transaction fell to $0.19, from $0.56 by mail and $0.60 at city wickets – a reduction of roughly two-thirds.

The city also added debit-card terminals at three locations to make the service more accessible. While the cost of processing payments through terminals was higher than most other payment options, the total annual cost was relatively low, and the added customer convenience outweighed the dollar costs.

The city continues to measure costs per transaction and review customer payment options to continually improve its payment services. It has since introduced telephone banking and is considering credit-card and online-payment options. The results of this project are cost savings and improved customer service, thus greater efficiencies and more effective services.

The division of management studies measures how well it does at cutting costs for its clients. Its target is to find potential savings in excess of 200 per cent of its annual operating budget. It reports yearly on the number of projects completed, the savings identified and the overall level of client satisfaction.

Since 1994, this division has met or exceeded its performance target. For example, in 1999, it received a client satisfaction rating of 87 per cent and identified more than $300,000 in cost savings, revenue and productivity opportunities – more than three times its operating budget of $89,500.

For more information, contact Kathleen McFadden by phone at (807) 625-3658 or by e-mail at [email protected].


Did You Know? The Municipal Performance Measurement Program

The Ministry of Municipal Affairs and Housing recently introduced the Municipal Performance Measurement Program. This program requires all Ontario municipalities to collect data on 35 broad performance measures in nine core service areas and to report to taxpayers on their performances. The service areas are garbage, water, sewage, transportation, fire, police, local government, land-use planning and social services, which, according to 1998 data, make up about 80 per cent of municipal operating costs. The measures indicate how effective and efficient municipalities are in delivering these services.

The program's objectives include improving services and strengthening municipal accountability to taxpayers. Opportunities for improvement exist because this program will provide a common set of performance measures and a comprehensive, Ontario-based data inventory. Municipalities can access this information to share and compare performance results and to learn better practices.

This year, municipalities must submit 2000 data through the ministry's financial information return by April 30, 2001, and report to their taxpayers by June 30, 2001.

For more information and materials on the program, contact your local Municipal Services Office:
Central 1-800-668-0230
Southwest 1-800-265-4736
Northwest 1-800-465-5027
East 1-800-267-9438
Northeast 1-800-461-1193



Other Performance Measurement Programs

Thunder Bay and Calgary are members of the International City/County Managers Association's (ICMA) Centre for Performance Measurement. The ICMA is a non-profit professional and educational organization based in Washington, DC. Some 120 cities and counties participate annually in the centre's comparative performance measurement program. They collect, analyse and compare their results on selected municipal performance measures. For more information, visit the ICMA Web site at www.icma.org.

The Cordillera Institute is an independent research and public policy organization. One of its projects is called the Sharing and Comparing Initiative, a joint effort with the Ontario Good Roads Association and ten municipalities: North Bay, Burlington, Markham, Belleville, Kingston, Cornwall, Peterborough, Kitchener, Waterloo and Chatham-Kent.

This project aims to give municipalities practical methods for achieving measurable savings in their operations while enhancing their effectiveness. Participating municipalities compare their performances and share best practices in the areas of general government and road maintenance. For more information, contact the institute at [email protected].

Performance Measurement Web Sites

Grande Prairie, Alberta
www.city.grande-prairie.ab.ca/perform.htm#top
Grande Prairie's site includes issues and principles of performance measurement as well as a bibliography and links to resources.

Governmental Accounting Standards Board (GASB)
www.rutgers.edu/Accounting/raw/gasb/
From the GASB home page, click on performance measures to reach the GASB Performance Measurement Project. This site contains information about the use and reporting of performance measures for government services.

Government Performance Project (GPP)
www.governing.com/gpp/gpintro.htm
The GPP rates the performance management of local and state governments and selected federal agencies in the United States from 1996.

Performance Measurement Books

Performance Measurement: Getting Results
Harry Hatry, Urban Institute Press (Washington, DC, 1999). Information on this book can be found on the Urban Institute's Web site at www.urbaninstitute.org/pubs/pm/index.htm.

Also from the Urban Institute:
How Effective Are Your Community Services? Procedures for Monitoring the Effectiveness of Municipal Services
2nd edition, Hatry, Blair, Fisk, Greiner, Hall, and Schaenman, ICMA and The Urban Institute (Washington DC, 1992).

Measuring Up: Governing's Guide to Performance Measurement for Geniuses (and other Public Managers)
Jonathan Walters, Governing Books (Washington DC, 1998). This book, written in plain language, takes a light-hearted approach to performance measurement. For more information, see the Governing Magazine Web site at www.governing.com.

References to any organization, publication or Web site are for information purposes only. The Ministry of Municipal Affairs and Housing neither endorses nor condones the information that these sources provide.

Issued by the Ministry of Municipal Affairs and Housing, March 2001


Appendix 4

Outputs

Outputs are the tangible items produced through activity. Because they are tangible, they can be recorded and counted. For example, number of customers served, number of cases closed and number of facilities built or serviced are all outputs. Outputs are essential to operational management; they may tell managers:

• If a project is progressing on time and on budget

• Whether the “right” customers are being served

Output measures are very useful for projecting expenditures and assessing resource needs within specific time frames. However, output measures have limited value in assessing the degree to which the overall objectives of a project are being met. For example, “Number of customers served” does not tell a manager whether the needs of customers were met. That is why each output should be linked to a short-term outcome.


Outputs in a Nutshell

Advantages
• Easy to identify because they are tangible.
• Easy to monitor because they involve counting, which means data can be readily available (if good data collection systems are in place).
• Useful for creating ratios (e.g., time per client served) that demonstrate efficiency or act like a "thermometer," giving operational managers timely signs that the activity is "on track" or "going off the rails" (see the sketch following this box).
• Useful for demonstrating progress towards an objective. For example, if health care in a region is poor and one reason is low access to health services, then "number of health care providers" may be a useful output to monitor to demonstrate progress toward increased access before the desired changes in the provider-patient ratio can be demonstrated.

Limitations
• Output measures tell us something was done, but they aren't enough to assess whether the thing done was the "right" thing, or if it was done well.

Purpose
• To demonstrate accountability for expenditures, because they tell what tangibles were produced for what cost.
• To help inform short-term decision-making, for example related to the timelines and costs for project implementation, or to monitor policy processes.
• They may be used for public reporting to demonstrate that progress towards an objective is being made even though the desired results are not yet apparent.

Who uses them?
• Output measures are most useful for operations.
• They may also be used by central agencies, for example, as part of quarterly reporting, or to demonstrate that progress towards an objective is being made.

Cautions
• Measuring a few critical outputs is more valuable than measuring many that are not useful. For example, a school-based activity aiming to reduce the number of school drop-outs may measure: number of students in the activity, ongoing attendance rates in the activity, student-teacher ratios, cost per student of extra support, etc., but activity managers may find that "ongoing attendance" is the real gauge of progress toward achieving the objective. Monitoring the other outputs may be unnecessary.
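To make the "thermometer" idea concrete, the minimal sketch below turns counted outputs into a monthly cost-per-client ratio and an on-track/off-track signal. The monthly figures, target and tolerance are invented for illustration; they are not from the guide:

```python
# Minimal sketch: turning counted outputs into an operational "thermometer".
# All figures and thresholds here are illustrative assumptions.

monthly_outputs = {
    "Jan": {"clients_served": 410, "cost": 12_300},
    "Feb": {"clients_served": 395, "cost": 13_900},
    "Mar": {"clients_served": 260, "cost": 13_100},  # volume drop: worth a look
}

TARGET_COST_PER_CLIENT = 33.00   # assumed planning figure
TOLERANCE = 0.10                 # flag anything more than 10% over target

for month, data in monthly_outputs.items():
    ratio = data["cost"] / data["clients_served"]  # efficiency ratio (cost per client served)
    on_track = ratio <= TARGET_COST_PER_CLIENT * (1 + TOLERANCE)
    print(f"{month}: ${ratio:.2f} per client served - "
          f"{'on track' if on_track else 'going off the rails'}")
```

Run monthly, a signal like this tells an operational manager where to look; it does not say whether the service met customers' needs, which is why each output should still be linked to a short-term outcome.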


Outcomes

Outcome performance measures help to demonstrate the extent to which each ministry's activities contribute to the achievement of ministry objectives and government priorities. Outcome measures may be short-term or intermediate-term. Short-term outcome measures track the immediate impact of activities; e.g., changes in compliance rates or degree of customer satisfaction. Intermediate-term measures track changes in behaviour, decisions, policies and social action that result from government-funded activities; e.g., increased employability as a result of a training activity.

Outcome measures cannot be separated from objectives because they are intended to measure the degree to which objectives are being met. Therefore, in order to ensure that the performance measures identified are the right ones, it is important to have clearly articulated objectives.

Outcome measures provide the information needed for planning, setting policy direction and allocating resources among competing priorities. Outcome measures should also be used by ministries to highlight activity successes and to help manage risks that may affect the ability to meet objectives.

Outcomes in a Nutshell

Advantages
• Outcomes are focused on the impact of an activity, so they provide the strongest evidence of whether or not objectives are being met.
• Outcome measures provide the right level of information to support strategic planning and decision-making.

Limitations
• Developing useful outcome measures is a challenge.
• Outcome measures are more useful for strategic decision-making than for time-sensitive project management.

Purpose
• Outcome measures serve all four purposes of measurement: accountability for expenditure, informing decision-making, public reporting and continuous improvement.
• Outcome measures demonstrate accountability for the results achieved with expenditure, not just the products produced.

Who uses them?
• Outcome measures are used at all levels of the organization.
• Activity managers use outcome measures to support continuous improvement, to ensure that activities continue to meet the needs of customers and to align activities with government priorities.
• Ministries use outcome measures to justify to central agencies what resources should be allocated and where.
• Cabinet committees and the Cabinet use outcome measures to inform policy decisions and government priorities.

Cautions
• Make sure that the outcome measure relates to objectives and is integrated into the design of the strategy.


High-Level Indicators

High-level indicators measure societal-level change; for example, life expectancy or economic growth. Many factors affect high-level indicators and it is rarely possible to identify a single cause for change. In other words, it is difficult to attribute change to a single cause or solely to the activities of government initiatives. There are too many stakeholders and variables involved to hold ministries accountable for high-level indicators.

However, because high-level indicators provide an overview of socio-economic conditions and government has a role to positively affect those conditions, it is important that ministries track high-level indicators to which their strategies contribute. Furthermore, government priorities, such as “Better Student Achievement” may require that high-level indicators be tracked in addition to other ministry-based performance measures.

High-Level Indicators in a Nutshell

Advantages
• Have often been identified already, have been broadly accepted as useful and can be adopted by a jurisdiction or government with little or no adjustment.
• Most data and data collection issues have been identified and articulated, which facilitates the adoption of such measures.
• Data is often collected by another organization, e.g., Statistics Canada, which reduces the resources required to monitor the high-level indicator.
• Provides ability to compare with other jurisdictions/governments.
• Helps managers ensure that their activities are aligned with higher-level objectives and strategies (e.g., government priorities).

Limitations
• In many (but not all) cases it is difficult to attribute the result to any particular action. Rather, the result is frequently produced by interactions among a variety of activities.
• Can only rarely be used to demonstrate the success of a particular activity.

Purpose
• Primarily used as part of public reporting, particularly to make general cross-jurisdictional comparisons about the impact of government activities.
• May also be used to highlight the need for government intervention in a particular area.

Who uses them?
• Ministers and others use them in identifying government priorities, for making comparisons among jurisdictions and for highlighting positive change during their administrations.
• Senior management in ministries may use them to support re-focusing the strategies or activities of a ministry.

Cautions
• It is sometimes tempting to use high-level indicators to attribute greater success than is warranted to an activity. For example, one year there was six per cent growth in a particular economic sector (a high-level indicator). The same year, the government had targeted spending to improve growth in that sector. However, the six per cent sector growth may have been spurred by a depreciation of the domestic currency, rather than government investment. Therefore, the ministry would have difficulty claiming that, "as a result of government investment, economic growth in 'x' sector was six per cent."


Appendix 5

Sample Logic Models


Domestic Violence Services of Greater New Haven, Inc. 24-Hour Hotline

Program Logic Model

Clients & Program Objectives: The hotline provides victims of domestic violence with 24-hour, seven-day-a-week access to a trained Domestic Violence staff person who provides free, confidential telephone counselling focused on domestic violence issues. This approach works because it protects a woman's confidentiality while providing her with an opportunity to be heard and validated, which is essential in order for her to take any next steps toward a life without violence.

Inputs: (1) 5 Certified Battered Women's Counselors with annual certification training; (2) 1 Master's level supervisor; (3) telephone lines, answering service, cell phones, pagers; (4) office supplies – paper, pens, clips, files, rolodex; (5) listings of community resources; (6) standards review by CT Coalition Against Domestic Violence.

Activities: Staff are available to speak to victims of domestic violence 24 hours/day, 7 days/week. Immediate telephone counselling is provided to assess the caller's need for emotional support, safety planning and referrals.

Outputs: 2,500 annual calls received/responded to.

Initial Outcomes: Caller's anxiety is lowered [EMOTIONAL]. Caller's perception of herself as victim validated [COGNITIVE]. Caller provided immediate support [SOCIAL].

Intermediate Outcomes: Caller manages feelings about immediate crisis. Caller identifies that she has rights and choices. Caller reports decreased sense of isolation. Caller able to think more clearly. Caller identifies next steps. Caller identifies other social supports. Caller gains more emotional stability. Caller develops a safety plan. Caller develops a support network.

Long-term Outcome: Caller exercises more control in her significant relationship.


Chart 2: Logic Model for Service System Management

Goal: Ministry-funded programs are delivered through effective systems of services.

Overall strategy: Service system management.

Inputs (all strategies): Policies, funding strategies, protocols, guidelines, directives, FTEs, $s.

Strategy: Accessible services
• Activities: Market services, arrange transportation, provide services at home, locate services in convenient locations, schedule work, provide translation.
• Outcomes: Clients are able to access services – clients are able to easily find out about services; clients are able to easily get to services, or services are brought to the client; clients receive services in a language they understand.

Strategy: Client-focused services
• Activities: Assess clients, provide direct services, refer clients to other needed supports and services; determine cultural needs, modify practices, provide products that are culturally acceptable.
• Outputs: Service plans that address the range of client needs, referrals, case conference notes; products that are culturally acceptable.
• Outcomes: Services are client-focused – clients receive services in a way that respects their beliefs and practices; clients are offered the range of services and supports they need.

Strategy: Coordination of services
• Activities: Providers communicate on a regular basis; providers develop common tools; providers develop shared goals.
• Outputs: Coordination protocols, joint agency activities.
• Outcomes: Services are coordinated – within the system of services, service providers know about each others' services, involve other providers when appropriate, and have good working relationships.

Strategy: Horizontal management
• Activities: Jointly develop policy, make decisions and plan services; coordinate communications.
• Outputs: Joint policies, decisions, plans, communications.
• Outcomes: Coordinated management – funders have a shared understanding of overarching goals; policies, plans and funding strategies complement each other; communications across ministries are consistent with each other.

Appendix 5a: United Way of Greater New Haven, http://www.uwgnh.org/docs/dvslogicmodel.pdf

Appendix 5b: Ministry of Community and Social Services, Corporate Policy and Intergovernmental Affairs Branch, 2004.


Chart 3: Logic Model for Governance and Management

Goal: As a result of good governance and management, clients and communities receive affordable and effective services that reflect provincial priorities.

Sub-goal 1: The ministry produces high-quality policy and operations products.
Sub-goal 2: Government-funded organizations demonstrate accountability to funders and to the public.
Sub-goal 3: Service system managers achieve value for money.

Inputs: Government policy, directives, guidelines and resourcing decisions. Players include the ministry (MCSS/MCYS); other ministries and the federal government; municipal-level CMSMs and DSSABs; and agencies.

Activities
• Provincial/regional level: Consult stakeholders. Develop policy products. Define expectations and risks. Plan services at the regional/provincial level. Identify risks. Negotiate contracts with the municipal level and agencies. Allocate resources. Deliver direct services. Monitor/evaluate performance. Implement corrective action. Provide support to service providers. Information management; public information.
• Municipal level: Define expectations. Plan services at the municipal level. Identify risks. Negotiate contracts with the province and agencies. Allocate resources to agencies/providers. Deliver services. Monitor/evaluate performance. Implement corrective action. Provide support to service providers. Information management; provide information to the provincial level; public information. Corrective action.
• Community/agency level: Plan agency services. Negotiate contracts with the province, municipal level and providers. Allocate resources within the agency. Deliver services. Monitor/evaluate performance. Implement corrective action. Information management; provide information to the provincial/municipal levels. Corrective action.

Outputs
• Steering: Transformation vision. Policy documents. Strategic plans. Guidelines and resources. Planning tools for agencies.
• Service delivery: Service contracts. Budgets. Directly-delivered services. Information resources. Performance/evaluation information. Local plans.
• Information: Performance/evaluation reports. Consultation information.

Outcomes and measures
• Ministry-funded services are delivered effectively; service targets are achieved or exceeded. (Performance measures: volume measures; quality measures of business processes and products, including compliance.)
• Policy and operations products and strategic plans are informed by results, environmental information and stakeholders. (Outcome measures.)
• The public, stakeholders and partners receive information about programs and results. (Outcome measures.)


Appendix 6

Step-by-Step Instructions for Creating a Logic Model

Figure 2: A Standard Logic Model Diagram

Larger Public Interest (government priorities are usually here)

Customers: who benefits

Objectives of the Strategy: what does the strategy hope to accomplish

Inputs: list all inputs.

Activities: list all activities.

Outputs: list the tangible products of the activities.

Desired Short-Term Outcomes: list the changes in participation, awareness, behaviour, compliance or capacity that are expected to result from the activities in the short term.

Desired Intermediate Outcomes: list the benefits or changes in conditions, attitudes and behaviour that are expected to result from the activities in the intermediate term.

There are seven steps to creating a logic model:

1. Defining the public interest served by the ministry’s strategy.

2. Defining who the strategy's primary customers are.

3. Defining the objectives of the strategy.

4. Identifying the inputs.

5. Defining activities.

6. Defining outputs.

7. Defining the desired short-term and intermediate-term outcomes.
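As a working aid, the seven elements above can be held in a simple structure while the model is being drafted. The sketch below is one possible arrangement; the field names and the sample content (drawn loosely from the ESL reading example used later in this appendix) are illustrative assumptions, not part of the guide:

```python
# Minimal sketch of a logic model as a data structure, mirroring the seven
# steps above. Field names and example content are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class LogicModel:
    public_interest: str                                    # step 1
    primary_customers: list[str]                            # step 2
    objectives: list[str]                                   # step 3
    inputs: list[str] = field(default_factory=list)         # step 4
    activities: list[str] = field(default_factory=list)     # step 5
    outputs: list[str] = field(default_factory=list)        # step 6
    short_term_outcomes: list[str] = field(default_factory=list)         # step 7
    intermediate_term_outcomes: list[str] = field(default_factory=list)  # step 7


model = LogicModel(
    public_interest="Better student achievement",
    primary_customers=["ESL children aged 6-12"],
    objectives=["Children who complete the activity will read in English "
                "at a grade nine level by the time they complete the activity"],
    inputs=["Teaching staff", "Out-of-school facilities"],
    activities=["Out-of-school reading support over six years"],
    outputs=["Number of children enrolled", "Sessions delivered"],
    short_term_outcomes=["Improved conditions to support learning"],
    intermediate_term_outcomes=["All ESL children entering grade nine "
                                "read at a grade nine level"],
)
print(model.objectives[0])
```

Writing the model down this way makes the alignment checks in the steps below mechanical: any activity or output that cannot be tied back to an objective stands out immediately.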


Define Public Interest (Government Priorities and Other Important Public Interest)

Answering the question, “what is the public interest?” involves thinking about the higher purpose of the strategy. Some people will think of this as the “vision,” others will think of it as the “mission.” Vision and mission are different things, and that’s why they need to be discussed - to ensure that everyone involved really does agree about the overall purpose of the strategy and the public interest being served.

Where the strategy clearly relates to a government priority, this work is made easier because the priority provides the wording for what public interest is served. Where the strategy serves other important public interest, the ministry will need to identify that purpose and may need to involve other ministries in the discussion of how that purpose will be articulated, for example, on government’s regulatory role.

Define Primary Customers

The next step is to clearly define who is going to benefit from, or be directly impacted by, the strategy. All government strategies will have a long list of internal and external customers – but it is important to clearly identify which one(s) are intended to benefit from the strategy. Keep in mind that there should be a clear link between who benefits and what public interest is being served.

Define Objectives

An objective is a statement that says what the strategy wants to accomplish. Clear objectives:

• provide purpose and direction;

• motivate activity delivery and action;

• help to provide guidance and plan training;

• increase productivity and efficiency;

• make the link between the initiative and the public interest.

Good statements of objectives are specific, include an action and leave little room for interpretation. To use a very simple example: An activity provides out-of-school support over six years to ESL children aged 6-12, to ensure they are reading at a high school level by the time they enter high school.

“Reading at a high school level” is only the beginning of a good statement of the objective.

“Children who complete the activity will be able to read in English at a grade nine level by the time they complete the activity” is specific, contains an action and leaves little room for interpretation. Staff or customers new to the activity, other organizations and Management Board of Cabinet will know what the activity is trying to achieve. Notice that clear objectives make identifying performance measures much easier!

CAUTION: Make sure that the objectives of the strategy align with the public interest and customer groups that have been identified.


Identify Inputs

Inputs are all the resources available to achieve the objectives of the strategy. In developing a logic model, it is always useful to list the inputs available because:

• There may be resources available that haven't been considered, such as external research, because they aren't used on a continual basis.

• Resources limit what is possible to achieve, so they force the question, "Can the objectives be met with the resources available?" If the answer is no, the objectives may need to be reconsidered or inputs may need to be used more creatively.

Define Activities

Activities are the actions undertaken by the ministry directly or by broader public sector organizations on behalf of the ministry to meet the objectives of the strategy. Identify as activities any programs that help to meet the objectives of the strategy. There may be a number of activities related to each objective and there may be activities that contribute to meeting more than one objective.

This is the stage when the value of the logic model becomes clear. If there are activities that don't align with an objective, then perhaps those activities aren't really necessary. Similarly, if there are several activities contributing to an objective, perhaps those activities can be better aligned to avoid duplication and free up resources for other purposes. The logic model helps to ensure that:

• All activities that contribute to meeting objectives are listed

• All activities listed do meet objectives

Define Outputs

Outputs are the tangible things produced by the activities. Outputs might include, for example, fines, books, beds, waiting lists, inspections, consultations, customers or response times. In other words, outputs are things that can be counted easily. List all the outputs that each activity is expected to produce. Then go through the list of outputs and decide which outputs really help to meet objectives.

If an output isn’t necessary to meet the objectives, then perhaps it is not needed and the inputs used to create it could be better used in another way. Or, perhaps the output is necessary, but contributes to meeting the objectives of a different strategy. Then perhaps the activities that produce the output need to be reconsidered to better align them with the strategy.

HINT:

Because outputs can be counted, they are easily converted into performance measures. When the logic model is completed, identify which outputs really demonstrate that progress is being made towards meeting the objectives. Those outputs are the ones that should be tracked as output performance measures.

Define the Desired Short-term and Intermediate-term Outcomes

In some ways, stating the desired outcomes is similar to stating the objectives, and in this way, a logic model is less a linear process than a cyclical one. However, desired outcomes may be staged elements of the objectives. Let us return to the earlier example of an activity targeted to ensuring that the English reading skills of ESL children aged six to twelve reach a grade nine level by the time they complete the activity. Educators creating a logic model for the activity may know that the children in the activity face barriers to reading other than language skills – learning disabilities, poor physical health or low self esteem. Thus, in order to achieve the objective, a number of precursors to learning may also need to be achieved.

Therefore, the desired short-term outcomes may relate to improved conditions to support learning. The desired intermediate-term outcome of the activity, implemented over a number of six-year cycles, may be to ensure that all ESL children entering grade nine are able to read at a grade nine level.

Notice that, in this case, the desired short-term outcome needs to be reached before the objective can be met. The desired intermediate-term outcome relates to meeting the objective on a continuous basis. In other words, desired short-term outcomes relate to the impact of the activity and desired intermediate-term outcomes relate to the achievement of the objectives. Achievement of the objectives, in turn, contributes to achievement of government priorities, which may be measured through high-level indicators.

SHORT-TERM OUTCOME: impact of the activities.

INTERMEDIATE-TERM OUTCOME: meeting the objectives.

Some people find it helpful to apply timeframes to "short-term" (1-3 years) and "intermediate-term" (3-5 years), but time frames are only a guide and will vary across activities.

HINT:

Identifying desired outcomes helps to identify outcome-based performance measures. The difference between a desired outcome and an outcome measure is that:

• A desired outcome is a statement of what the activity wants to achieve, related to the objectives. (e.g., All ESL children entering grade 9 will be able to read in English at a grade 9 level)

• An outcome measure is a statement of what will be measured to demonstrate progress towards the desired outcome. (e.g., Percentage of ESL children entering grade 9 who pass a standardized English reading test)

These are examples used for explanatory purposes only and not suggestive of practices or measures to be adopted.
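To see how such an outcome measure might be computed once data are in hand, here is a minimal sketch; the cohort numbers are invented for illustration:

```python
# Minimal sketch: computing the outcome measure from the HINT above.
# The cohort figures are invented for illustration.
entering_grade_9 = 180       # ESL children entering grade 9 (assumed)
passed_reading_test = 153    # of those, number passing the standardized test (assumed)

outcome_measure = 100 * passed_reading_test / entering_grade_9
print(f"{outcome_measure:.1f}% of ESL children entering grade 9 "
      "passed a standardized English reading test")  # -> 85.0%
```

Reported against the desired outcome (all such children reading at a grade nine level), the 85 per cent figure shows progress while making the remaining gap explicit.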


Appendix 7

Measures Checklist

Relevance: Is the measure actually a good measure of the objectives the strategy intends to achieve?

Validity: Does the measure actually measure what it is supposed to?

Reliability: Do different users of the same measure report the same result?

Verifiable: Will the measure produce the same data if measured repeatedly?

Attribution: Does the measure relate to factors that the ministry can affect?

Clarity: Is the measure clearly defined and easily understood?

Accuracy: Does the measure provide correct information in accordance with an accepted standard?

Cost Effectiveness: Is the value of the measure greater than the data collection costs?

Sensitivity: Is the measure able to measure change?

Timeliness: Can data be collected and processed within a useful timeframe?

Comparability: Can the data be compared with either past periods or with similar activities?

Consistency: Does the data feeding the measures relate to the same factors in all cases at all times?

Integrity: Will the measure be interpreted to encourage appropriate behaviours?
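One way to apply the checklist systematically is to record a yes/no judgment for each criterion and list the gaps. The sketch below does this; the criterion names come from the checklist above, while the function, sample measure and answers are illustrative assumptions:

```python
# Minimal sketch: applying the measures checklist to a candidate measure.
# Criterion names come from the checklist above; the sample answers are invented.

CHECKLIST = [
    "Relevance", "Validity", "Reliability", "Verifiable", "Attribution",
    "Clarity", "Accuracy", "Cost Effectiveness", "Sensitivity",
    "Timeliness", "Comparability", "Consistency", "Integrity",
]

def review(answers: dict[str, bool]) -> list[str]:
    """Return the checklist criteria the measure fails or has not yet addressed."""
    return [criterion for criterion in CHECKLIST if not answers.get(criterion, False)]

# Hypothetical review of one candidate measure.
answers = {criterion: True for criterion in CHECKLIST}
answers["Attribution"] = False  # e.g., result driven by factors outside the ministry
print("Criteria needing work:", review(answers))  # -> ['Attribution']
```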


Appendix 8

Example of How Benchmarking Can Be Used

Performance Measures Count in Tight Budget Times

Bob Layton, city manager, and Don Gloo, assistant to the city manager of Urbandale, Iowa, recently used performance measurements to support a request for additional financial resources during a tight budget season. According to Layton, a request for additional police officers had been declined when the performance measurement information that they submitted to support the request - on crime rates, case clearance rates, and citizen survey data - did not justify additional staffing.

Urbandale used benchmarking to compare its performance on crime rates and clearance rates to more than 125 other cities in North America. The benchmarking results showed that Urbandale's performance on case clearance was outstanding.

"Throughout the entire budget process, financial constraints forced us to make difficult decisions and considerable compromises on priorities. Having good performance measures makes it easier to defend our resource allocation decisions," Gloo said.
