
Intended for: Canadian Evaluation Society
Document type: Conference Paper
Date: May, 2010

Key words: credentialing, certification programs, evaluation capacity building

Conference Theme: Going Gold

MEGA TRENDS, MACRO REFLECTIONS, MESO RESPONSES AND MICRO ADAPTATIONS.
SOME REFLECTIONS ON THE CANADIAN CREDENTIALING INITIATIVE


Revision: 1
Date: 2010/04/20
Made by: SNI
Checked by: SETL
Approved by: SNI
Description: Conference paper for 2010 CES Conference

Contact information:
Dr. Steffen Bohni Nielsen, Director of Evaluative Knowledge
Ramboll Management Consulting
Nørregade 7A, DK-1165 Copenhagen
Denmark
[email protected]
Cell +45 2948 8103, Fax +45 3397 8103


CONTENTS

Abstract
1. The Mega Trend
  1.1 Certification as a mega trend
2. Macro Reflections
  2.1 The Canadian PDP scheme is likely to spread
3. Meso Responses and Micro Adaptations
  3.1 A knowledge management strategy
    3.1.1 Managing knowledge - the case of Ramboll
  3.2 The Inputs: The Evaluation Capability Development Program
    3.2.1 A designation policy
    3.2.2 An independent body administering credentials
    3.2.3 Professional standards for program evaluation
    3.2.4 A defined set of competencies
    3.2.5 Levels of evaluation competency
      3.2.5.1 The Depth Chart - Managing HRD
    3.2.6 A formal written/oral examination
4. Some reflections on the future of the PDP
5. References
6. Biography
7. Appendices

TABLE OF FIGURES

Figure 1 – Elements in the knowledge management of evaluation delivery
Figure 2 – Evaluation Capability Ladder
Figure 3 – Evaluation Capability Depth Chart, Q1 2010
Figure 4 – Monitoring the Evaluation Depth Chart - Dummy
Figure 5 – Screen Dump PEQUAT


ABSTRACT

For several years it has been a mega trend in the global village that organizational service delivery and quality must be documented by elaborate accreditation or certification programs. Likewise, various disciplines have reinforced this trend by fortifying service monopolies through licensure or by carving out a niche for themselves through certification. Documentation of individual skills through certification is also in demand. This increased level of systematic certification, the saying goes, will help both the demand side and the supply side in the marketplace. Arguably, then, the Canadian Evaluation Society's credentialing is not only a macro reflection of local demand but also a response to a global trend.

The fact that program evaluation standards, more or less similar in content, are now adopted in the US, Canada, Germany, Austria, and Switzerland makes it likely that the Canadian credentialing standards for program evaluators may spread across the Western hemisphere. For peripheral onlookers the question beckons: how should and can organizations and individuals engaged in evaluation practice respond to the path set out by CES? In other words, what may be the meso responses and micro adaptations to the CE scheme? This author presents the organizational response made by Europe's largest evaluation consultancy to the CE scheme and discusses its implications at the meso and micro levels.

1. THE MEGA TREND

On the Canadian Evaluation Society website, CES president Francois Dumaine writes:

"Any new path needs a brave soul to engage in it, not fully aware of what to expect, but confident of where it's leading. In the case of professional designation in the field of program evaluation, the community of evaluators in Canada is that brave soul. CES will be the first organization in the world to offer a Credentialed Evaluator (CE) professional designation!"

There is little doubt that the CES PDP is indeed pioneering in the evaluation field globally, but certification is by no means a path never walked before.

1.1 Certification as a mega trend

For several years it has been a mega trend in the global village that organizational service delivery and quality must be documented by elaborate accreditation or certification programs. Organizations in the education and health sectors undergo rigorous accreditation procedures, and corporations strive to achieve quality management system certifications such as ISO 9001. Likewise, various disciplines have reinforced this trend by fortifying service monopolies through licensure or by carving out a niche for themselves through certification. In fact, professions have a long tradition of fencing off their practice areas through certification. This trend has been evident from medicine and law to more recent professional certifications such as those of internal auditors, management consultants and others.

Documentation of individual skills through certification is also in demand. This, the saying goes, will help both the demand side and the supply side in the marketplace. An example is the European Commission's "EUROPASS" initiative, a programme that essentially standardizes competency profiles in a curriculum vitae format and aims to cover more than 500 million inhabitants in Europe. Further, a host of new diplomas for various professional developments has surfaced in order for the workforce to document its hard skills.

As such, the Canadian Evaluation Society's Professional Designations Program is only one of many reflections of a mega trend towards professional or corporate licensure.


2. MACRO REFLECTIONS

That said, the Canadian Evaluation Society's credentialing program is also a macro reflection of an emergent pressure from the demand side to professionalize evaluation practice (Government of Canada, 2001, 2004, 2009).

In this context, I gather that it would be superfluous for most participants to venture into a detailed discussion of the content of the Canadian Evaluation Society's Professional Designations Program. Suffice it here to say that the PDP, in my view, relates to six essential components. It entails:

(i) A designation policy;
(ii) A set of professional standards for program evaluation;
(iii) A defined set of competencies;
(iv) An independent body administering and ascertaining the credentials.

And, also importantly, it does not entail:

(v) A formal written/oral examination of whether CE applicants meet the standards;
(vi) Levels of evaluation competency.

I will not go through how the CE scheme differentiates itself from the credentialing schemes of other disciplines, as this has already been done (Huse & McDavid, 2006).

2.1 The Canadian PDP scheme is likely to spread

In the context of this presentation, I assume that everyone is aware of the Canadian Evaluation Society's 2008 adoption of the evaluation standards established by the US Joint Committee on Standards for Educational Evaluation.

The American Evaluation Association (AEA) was represented in the Joint Committee and officially supports its work. However, the AEA has not formally ratified and adopted the standards but, perhaps due to institutional politics, developed its own set of generic guiding principles for evaluators in 2004.

In the meantime, the Joint Committee's standards have been adopted almost verbatim by the Swiss Evaluation Society (SEVAL) in 2000 and the German Evaluation Society (DeGEval) in 2001. Very similar standards are equally promoted by a number of organizations engaged in the evaluation of development aid.

While the PDP is, in the short term, a national Canadian initiative, it has every opportunity as a first mover to become an international standard.

3. MESO RESPONSES AND MICRO ADAPTATIONS

Looking at the PDP from across the Atlantic, and from the distinct position of a for-profit external evaluation provider, the scheme poses a number of intriguing questions:

1. Outside in: Will the Credentialed Evaluator (CE) become a marketable quality differentiator in and, perhaps more importantly from the author's perspective, outside Canada?

2. Inside out, meso: Can the CE scheme be used strategically to develop competencies among evaluation consultants?

3. Inside out, micro: Can the CE scheme be used to advance careers and document skills for individual consultants?

The crucial point underlying all of the above questions is that, for an organization with scale in its evaluation services, the Credentialed Evaluator scheme must serve a dual purpose. Firstly, it must meaningfully be part of the organization's communication with the outside world (through client engagement, branding, participation in evaluation communities etc.).


Secondly, it must be an inherent part of the organization's knowledge management strategy.

The relative simplicity of the Canadian Evaluation Society's PDP scheme makes it immediately and potentially valuable for organizations wishing to communicate that they are serious about the business of evaluation (in the wider, non-monetary, sense of the word). This is the outside-in perspective.

I remain more sceptical as to its ability to serve the other part of the equation, the inside-out perspective. In the remainder of the presentation I shall elaborate on this point and potentially provide some input to further developing the PDP.

3.1 A knowledge management strategy

Certification is ultimately a signifier for mastering a certain set of qualifications. In the context of service firms, this translates into the ability to apply knowledge to solve some (client) problem. My point is, therefore, that the CE scheme must be seen in relation to an organization's wider knowledge management strategy. One definition I tend to use is:

"Knowledge Management is the intentional process that links the organization's existing knowledge capacity (the sum of our knowledge base and human capabilities) with future services (knowledge application)."

In essence knowledge management is about how an organization:

1. Develops knowledge;

2. Shares knowledge;

3. Captures knowledge;

4. Reduces risks of losing knowledge;

5. Creates value from knowledge.

A knowledge management strategy, then, is a strategy that supports the execution of a business strategy, whether the organization is public, not-for-profit, or for-profit. In the context of the "measurement industry", a knowledge management strategy is the overall approach by which a company aligns its knowledge base and capabilities with the intellectual requirements of its business strategy and model.

It is important, then, to note that knowledge management is more than just content management in IT systems; it also entails elements otherwise coined as organizational learning, human resource development etc. (see also Zack, 1999; Fahey & Prusak, 1998). In essence it is about managing the organization's intellect in such a way that its premier level of knowledge is applied. As Ted Hall of McKinsey phrases it:

"Knowledge is only valuable when it is between the ears of consultants and applied to clients' problems."

Ted Hall, quoted in Bartlett (1996:6).

In the context of this presentation, the element of knowledge management to be discussed is the development of knowledge among an organization's professionals.

Despite the current economic crisis, demographic change, changes in workforce employment patterns etc., it is likely that corporations, universities, and other professional service organizations will increasingly compete on how well they manage intellectual capital.

In the Harvard Business Review, researchers James Brian Quinn, Philip Anderson and Sydney Finkelstein state boldly:


"In the post-industrial era the success of a corporation lies more in its intellectual and systems capabilities than in its physical assets. The capacity to manage human intellect – and to convert it into useful products and services – is fast becoming the critical executive skill of the age."

(Quinn, Anderson & Finkelstein, 1998: 182)

According to the authors, the key challenge is thus to define and manage the professional intellect in relation to the business one is in. Quinn, Anderson & Finkelstein go on to make another important distinction between four levels of professional intellect (1998:183-84):

1. Cognitive knowledge (know-what): the basic mastery of a discipline.
2. Advanced skills (know-how): effective execution of "book learning".
3. Systems understanding (know-why): transcends effective execution and anticipates complex problems; highly trained intuition.
4. Self-motivated creativity (care-why): intellectual leaders with the will, motivation and adaptability to renew and innovate.

Their point is that organizations which manage to nurture and retain employees with "self-motivated creativity" are much more likely to outperform their peers. Also, they add, it is essential to capture knowledge in such a way that organizational learning is accessible to the organization's members.

Using this framework and looking at the defined set of competencies underlying the Credentialed Evaluator scheme, I boldly claim that it certifies evaluators at the know-what level. This is not a bad thing. However, when looking at the scheme from the inside-out perspective of managing the professional intellect, it is not quite adequate either.

3.1.1 Managing knowledge - the case of Ramboll

Let me elaborate this point by looking at the case of Ramboll Management Consulting (Ramboll). Ramboll is Europe's largest provider of evaluation services and a consultancy specialized in public sector development.

The fact sheet below presents some key information about Ramboll.

Table 1 – Fact Sheet Ramboll Management Consulting (facts, 2009)

Ownership: Owned by Ramboll (9,000 employees), which in turn is owned by the Ramboll Foundation
Scope: Management consulting to public sector clients
Services: Evaluation, research, management, HRD, IT consulting
Operations in: Denmark, Germany, Sweden, Norway, Finland
Net project revenue: Approx. CAD 57 million
Net project revenue, evaluations: Approx. CAD 14.5 million (25% of turnover)
Full-time staff: 400
Evaluations per year: 250+, mainly in social, education, employment, health and business & industry
Consultants involved in evaluations on a regular basis: 130
Annual turnover of staff: 23%
Type of consultancy: Talent factory; the majority of new recruits come to Ramboll directly from university
Human capital: 98% with a Master's degree or above

Although Ramboll spans several service areas, I will in the following focus on the part of the company engaged in the delivery of evaluation services.

In the case of Ramboll Management Consulting, we face several knowledge management related challenges every day. As a "talent factory" we face a significant annual staff turnover in an industry where attrition is an inherent challenge. For many plying their trade as management consultants, it is the first or second stop in their career. It is usually a type of employment which leverages market value and career opportunities for the individual.

For Ramboll it poses a number of particular problems:

• Improving the time to evaluation capability for new hires;
• Ensuring a critical mass of capable evaluation consultants;
• Developing thought leaders that drive knowledge development;
• Retaining thought leaders within the ranks of RMC;
• Capturing organizational learning;
• Ensuring access to, and distribution of, the knowledge base;
• Fostering a knowledge-sharing culture;
• Improving the delivery of quality evaluations in our projects;
• Ensuring that human and financial resources are invested to create the highest Return On Investment (ROI).

In the particular context of a high-leverage management consultancy, the knowledge management strategy is conceived in the sense that knowledge needs to be managed at all levels in the delivery chain. Or, in evaluator speak: inputs, activities, outputs, and outcomes must all be managed. The figure below gives an overview of the measures that form part of the knowledge management strategy.

Figure 1 – Elements in the knowledge management of evaluation delivery

[Figure 1 content:
Input: Capability Development Program; knowledge access*; concept base; career incentives to deliver quality.
Process: Quality Assurance Tool (PEQUAT); quality and risk review.
Output: Quality Assurance Tool (PEQUAT); independent quality assurance.
Outcome: Monitoring client satisfaction; monitoring evaluation utilization**.]

In this context it is not relevant to discuss all elements of the knowledge management strategy. While professional service organizations can create conceptualizations, elaborate quality assurance processes etc., their service delivery ultimately hinges on the people who deliver the service. In evaluation terms, these are the inputs. In other words, managing human resources is but one element of knowledge management, but it is the one where the Canadian Evaluation Society's CE scheme can potentially add value. Let me therefore investigate this in further detail.



3.2 The Inputs: The Evaluation Capability Development Program

As presented earlier, the PDP has some essential components that can structure this discussion. It entails:

(i) A designation policy: a stated purpose for the PDP and its target group. [1]
(ii) Professional standards for program evaluation: a set of professional and ethical standards to be pursued by professionals within the evaluation field. [2][3]
(iii) A defined set of competencies, describing the competencies that must be mastered. [4]
(iv) An independent body administering and ascertaining the credentials: the CES Credentialing Board, headed by one of the CES vice-presidents. [5]

And, also importantly, it does not entail:

(v) A formal written/oral examination of whether CE applicants meet the standards;
(vi) Levels of evaluation competency.

Having tracked the CES PDP initiative, Ramboll drew on it as an inspiration for its own approach.

3.2.1 A designation policy

Ramboll's Capability Development Program also contains a designation policy that outlines the statement of intent, objectives and overarching criteria for evaluation professionals. However, Ramboll's designation policy had to fit into a wider institutional context of (i) supporting a business strategy, (ii) aligning with a formal career development policy, and (iii) fitting with other policies such as the policy on quality management.

3.2.2 An independent body administering credentials

In 2004 Ramboll created an internal evaluation society which, among other things, functions as the independent body administering applications for the different levels of evaluation competency (more about this below). All applications must meet a set of qualifying criteria and be endorsed by a line manager. All applications are, upon verification, approved by the Director of Evaluative Knowledge (who is a member of the senior management team in Ramboll).

3.2.3 Professional standards for program evaluation

Ramboll has no independent professional standards for evaluators, and with reason: Ramboll carries out evaluations in several different (supra)national contexts where different standards have been adopted by the client or by national evaluation societies. Having carried out a gap analysis, we concluded that these standards are not contradictory but place their emphases differently and operate at different levels of concreteness. Having no wish to deviate from these external standards, we had to adopt an evaluation quality management approach which encompasses all of them. For this purpose we developed the Program Evaluation Quality Assessment Tool (PEQUAT).

3.2.4 A defined set of competencies

In Ramboll we looked closely at the set of competencies defined by the Canadian Evaluation Society. We concluded that the competencies (at that point) were too generic to be usable for our purpose, and that the full set of competencies was likely to be acquired over time as part of a wider career development. Consequently, a more differentiated and precise skill set had to be defined for the different phases of that career development.

[1] http://www.evaluationcanada.ca/txt/20090531_ce_policy.pdf
[2] http://www.evaluationcanada.ca/site.cgi?s=6&ss=10&_lang=en
[3] http://www.evaluationcanada.ca/site.cgi?s=5&ss=4&_lang=en
[4] http://www.evaluationcanada.ca/txt/20090531_competencies_companion.pdf
[5] http://www.evaluationcanada.ca/txt/20090531_implementation_plan.pdf


3.2.5 Levels of evaluation competency

We opted to differentiate what we coin evaluation capability through different levels of capability. We created four levels of capability matching those presented by Quinn, Anderson & Finkelstein above (though altering the qualification tags a bit for communicative purposes).

Figure 2 – Evaluation Capability Ladder

[Figure 2 content, from bottom to top of the ladder:
Evaluation Practitioner – qualification: know-what; certification requirement: Evaluation Education.
Evaluation Specialist – know-how; Advanced Evaluation Education.
Evaluation Expert – know-why; Elite Evaluation Education.
Thought Leader – "know-all"; all below + publication.]

The core mechanism in the CDP is that a professional pathway for developing evaluation and consulting skills must be laid out in order to manage the human capital aspect of the knowledge management strategy. By defining different levels of expertise we seek to achieve several things at once: (i) reduce the time to competency for new hires, (ii) create an attractive career development path to retain more senior staff, (iii) manage quality better, and (iv) develop our knowledge base.

Each level in the capability ladder is defined, and a set of capabilities and affiliated criteria must be met to reach that level. Meeting these criteria becomes integral to the career development plan for all relevant consultants. The criteria are detailed in Table 2 below.



Table 2 – Definitions – the Ramboll Evaluation Capability Development Programme

Thought Leader
Definition: The Thought Leader is a vital driver in RMC's business success and the development of our knowledge base in the evaluation industry. A distinguishing characteristic of the Thought Leader is the recognition from the outside world that he/she deeply understands RMC's business model, the needs of our customers, and the broader methodological and theoretical issues of the discipline and, as such, through synthesis or innovation translates these insights into actionable services and products for the benefit of RMC.
Capability criteria: Has the ability to identify market trends and needs and translate these into services; holds an international view on RMC's services in the industry; is a driver in the methodological synthesis and innovation of RMC services in the industry, thus raising our standards; is internationally recognized by peers and clients as a leader in the industry with regard to some methodology; markets our expertise externally through publications and conference papers; carries training modules on select methods; is a driver in developing the knowledge base; ensures that quality standards are met in RMC products and services. Project Leader/Director for at least 20 evaluation projects; at least 4 peer-reviewed publications in evaluation/public administration journals; at least 4 conference papers; authored at least 2 concepts; prior appointment as Evaluation Expert; active participation in key (inter)national evaluation networks; meets quality standards in all deliverables in all recent projects where involved.
Training requirements: Elite Evaluation Education, Advanced Evaluation Education, Evaluation Education, or equivalent.

Evaluation Expert
Definition: The Evaluation Expert is instrumental in RMC's business success and the development of our knowledge base in the evaluation industry through synthesis and innovation. The Evaluation Expert is highly qualified within the evaluation domain and holds extensive knowledge of, and ability to apply, various methodologies and tools, knowing the virtues and limitations of each. The Evaluation Expert has completed an extensive advanced training program in evaluation and has honed these skills through extensive practical experience.
Capability criteria: Is a driver in the methodological synthesis and innovation of RMC services in the industry, thus raising our standards; is nationally recognized by peers and clients as a leader in the industry; markets our expertise externally through publications and conference papers; carries training modules on select methods; is a driver in developing the knowledge base; ensures that quality standards are met in RMC products and services. Project Leader/Director for at least 15 evaluation projects; at least 3 conference papers; authored at least 1 concept; prior appointment as Evaluation Specialist; active participation in key (inter)national evaluation networks; meets quality standards in deliverables where PL or PD.
Training requirements: Elite Evaluation Education, Advanced Evaluation Education, Evaluation Education, or equivalent.

Evaluation Specialist
Definition: The Evaluation Specialist is devoted to practicing within the evaluation domain and is licensed as a professional meeting specified requirements in the evaluation industry. The Evaluation Specialist has completed an extensive advanced training program in evaluation and has applied these skills in practice as a project leader. The Evaluation Specialist has the ability to select the most appropriate and timely evaluation methods and translate them into practice in the evaluation team's work.
Capability criteria: Is a reliable project leader who sees that evaluation methodologies and concepts meet specified standards; can make project management judgments about the most relevant methodology and tools in light of client needs, budgetary constraints and methodological soundness; contributes to developing the knowledge base; ensures that quality standards are met in RMC products and services through his/her own and the project team's practice. Project Leader/Director for at least 5 evaluation projects; demonstrates a solid understanding of evaluation tools and methodology and their strengths and weaknesses; applies concepts in accordance with quality standards; prior appointment as Evaluation Practitioner; active participation in the Ramboll Management Evaluation Society; meets quality standards in deliverables where PL or PD.
Training requirements: Advanced Evaluation Education, Evaluation Education, or equivalent.

Evaluation Practitioner
Definition: The Evaluation Practitioner is licensed as a professional meeting specified requirements in the evaluation industry. The Evaluation Practitioner has completed a specialized training program in evaluation and has applied these skills in practice. The Evaluation Practitioner has the ability to apply the most appropriate and timely evaluation methods and translate them into practice in their own work.
Capability criteria: Is a reliable practitioner of evaluation methodologies and concepts who meets specified standards; can apply a range of methodologies and tools in light of client needs, budgetary constraints and methodological soundness; contributes to developing the knowledge base; ensures that quality standards are met in RMC products and services through his/her own and the project team's practice. At least 800 billable hours on evaluation projects; demonstrates sound application of evaluation tools and methodology; applies concepts in accordance with quality standards; active participation in the Ramboll Management Evaluation Society.
Training requirements: Evaluation Education, or equivalent.
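To make the mechanics of the ladder concrete, the sketch below encodes the quantitative thresholds from Table 2 as a simple promotion check. It is a minimal illustration, not Ramboll's actual system: the class and function names are hypothetical, and the qualitative criteria (meeting quality standards, network participation, line manager endorsement) are deliberately left to the human review described in section 3.2.2.

from dataclasses import dataclass
from enum import IntEnum

class Level(IntEnum):
    """The rungs of the capability ladder (Figure 2)."""
    NEW_HIRE = 0
    PRACTITIONER = 1    # know-what
    SPECIALIST = 2      # know-how
    EXPERT = 3          # know-why
    THOUGHT_LEADER = 4  # "know-all"

@dataclass
class Consultant:
    """Hypothetical record of one consultant's track record."""
    level: Level
    projects_led: int            # evaluation projects as PL/PD
    billable_eval_hours: int
    conference_papers: int
    peer_reviewed_publications: int
    concepts_authored: int

def eligible_for(c: Consultant, target: Level) -> bool:
    """Check the quantitative Table 2 thresholds for the next rung."""
    if target != c.level + 1:
        return False  # prior appointment at the level below is required
    if target == Level.PRACTITIONER:
        return c.billable_eval_hours >= 800
    if target == Level.SPECIALIST:
        return c.projects_led >= 5
    if target == Level.EXPERT:
        return (c.projects_led >= 15 and c.conference_papers >= 3
                and c.concepts_authored >= 1)
    if target == Level.THOUGHT_LEADER:
        return (c.projects_led >= 20 and c.peer_reviewed_publications >= 4
                and c.conference_papers >= 4 and c.concepts_authored >= 2)
    return False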


The CDP is not only a credentialing process; it is also a way to strategically manage human resource development.

3.2.5.1 The Depth Chart - Managing HRD

This notion is depicted in the Depth Chart. The basic idea is that the company can at all times monitor its strength in depth and prioritize investments in HRD. The depth chart has, for example, made it evident that the business units working with evaluation have significantly different HRD needs which must be addressed.

Figure 3 – Evaluation Capability Depth Chart, Q1 2010

[Figure 3 content (number of consultants per level): Thought Leader 2; Evaluation Expert 16; Evaluation Specialist 35; Evaluation Practitioner 33; New Hire 44*.]

Figure 4 – Monitoring the Evaluation Depth Chart - Dummy

[Figure 4 content: dummy chart tracking the number of evaluators at each capability level (New Hire, Practitioner, Specialist, Expert, Thought Leader) per quarter from Q1 2010 to Q3 2012, on a 0-200 scale.]
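As a minimal sketch of what sits behind such a depth chart, the snippet below aggregates a staff roster into per-level head counts. The roster literals reproduce the Q1 2010 figures from Figure 3, while the function name and data layout are illustrative assumptions.

from collections import Counter

# Illustrative roster reproducing the Q1 2010 head counts in Figure 3;
# in practice each entry would come from the consultant's CDP record.
roster = (["Thought Leader"] * 2
          + ["Evaluation Expert"] * 16
          + ["Evaluation Specialist"] * 35
          + ["Evaluation Practitioner"] * 33
          + ["New Hire"] * 44)

def depth_chart(levels: list) -> Counter:
    """One quarter's snapshot: consultants per capability level."""
    return Counter(levels)

print(depth_chart(roster))
# Stacking such snapshots quarter by quarter (Q1 2010 ... Q3 2012)
# yields the monitoring view sketched in Figure 4.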

3.2.6 A formal written/oral examination

As mentioned before, the Capability Development Programme is managed by a programme secretariat that validates all applications for the different levels of capability.

As part of the Capability Development Program, Ramboll offers its evaluators a wide range of training opportunities. However, participation is only required if line management and the consultant him/herself consider the course learning outcomes meaningful.

Basically, the whole Capability Development Program has been developed through the notion of "High Impact Learning", in which theories of change for business goals are developed, investments in competence development target mastery of the core processes in the results chain, and learning outcomes are differentiated on the basis of individual skills assessments (Brinkerhoff & Apking, 2001).



To ensure training is targeted at relevant learning outcomes, testing occurs at three levels: (i) before training, to identify areas of strength and learning needs; (ii) after completing the courses aimed at mastering the required set of capabilities, through online multiple-choice tests that ascertain whether the evaluator has acquired the evaluation capabilities expected at a given level; and (iii), more importantly, the evaluator must demonstrate a track record of excellence in applying these skills in the delivery of evaluation work. The latter is done using the quality assessment tool, PEQUAT, mentioned above. The tool is derived from, among others, the professional standards of the CES, and contains more than 160 checklist items spanning the lifecycle of an evaluation (see the illustration in Table 3).

These examinations, along with several other criteria, form the basis for "making the grade" to the next capability level.

Table 3 – Example - Summation of evaluation quality (dummy)

[Table 3 content: degree of compliance (overall score and per evaluation phase: design, tendering, structuring, collecting, analysing, judging, reporting, utilising, closing), as rated by the Project Manager and the Quality Assurer.]

Finally, it must be remembered that the CDP forms but one part of a wider knowledge management strategy, in which mechanisms to stimulate knowledge capture, sharing, access, and development are in place. These are, however, outside the scope of this paper.



4. SOME REFLECTIONS ON THE FUTURE OF THE PDP

The case of Ramboll Management Consulting was used to assess three questions:

1. Outside in: Will the Credentialed Evaluator become a marketable quality differentiator in and outside Canada?
Yes, most likely in Canada, but internationally only if partnerships with other evaluation societies are established.

2. Inside out, meso: Can the CE scheme be used strategically to develop competencies among staff?
Only to some extent. The CE qualifying criteria are not differentiated, which makes the scheme more amenable as a quality marker for branding purposes than as a means of intentionally developing human resources.

3. Inside out, micro: Can the CE scheme be used to advance careers and document skills for individuals?
Yes, it can be used to document skills, but it is less likely to be useful for advancing one's career once higher levels of professional intellect are reached.

In sum, these tentative answers suggest that the next steps to be considered by the CES are to (i) work towards international partnerships to advance the CE scheme and/or (ii) further differentiate levels of competency to make the scheme more amenable and adaptable to larger organizations' internal needs and knowledge management structures. [6]

[6] A similar differentiation has been witnessed in other professions, as well as in singular tools such as Six Sigma, where one can acquire green or black belt certification.

5. REFERENCES

Bartlett, Christopher A. (1996): "McKinsey & Company: Managing Knowledge and Learning", Harvard Business School case study 9-396-357.

Brinkerhoff, Robert & Apking, Anne M. (2001): "High-Impact Learning: Strategies for Leveraging Business Results from Training", Basic Books, New York.

Canadian Evaluation Society (2009a): "CES Policy on the Credentialed Evaluator (CE) Designation". Available at: http://www.evaluationcanada.ca/txt/20090531_ce_policy.pdf

Canadian Evaluation Society (2009b): "Companion Document for Competencies for Canadian Evaluation Practice". Available at: http://www.evaluationcanada.ca/txt/20090531_competencies_companion.pdf

Canadian Evaluation Society (2009c): "Professional Designations Program Implementation Plan". Available at: http://www.evaluationcanada.ca/txt/20090531_implementation_plan.pdf

Canadian Evaluation Society (2008): "Program Evaluation Standards". Available at: http://www.evaluationcanada.ca/site.cgi?s=6&ss=10&_lang=EN

Canadian Evaluation Society (n.d.): "CES Guidelines for Ethical Conduct". Available at: http://www.evaluationcanada.ca/site.cgi?s=5&ss=4&_lang=en

European Commission (2003): "Evaluation Quality Assessment Form". Available at: http://ec.europa.eu/budget/library/documents/evaluation/guides/quality_asses_form_en.pdf

Fahey, Liam & Prusak, Laurence (1998): "The Eleven Deadliest Sins of Knowledge Management", California Management Review, Volume 40, Number 3, Spring 1998.

German Evaluation Society (2001): "Evaluation Standards (DeGEval-Standards)". Available at: http://www.degeval.de/calimero/tools/proxy.php?id=19084

Government of Canada, Treasury Board (2009): "Policy on Evaluation". Available at: http://www.tbs-sct.gc.ca/pol/doc-eng.aspx?id=15024

Government of Canada, Treasury Board (2004): "Evaluation Function in the Government of Canada". Available at: http://www.tbs-sct.gc.ca/cee/pubs/func-fonc-eng.asp

Government of Canada, Treasury Board (2001): "Policy on Evaluation" (archived). Available at: http://www.tbs-sct.gc.ca/pol/doc-eng.aspx?id=12309&section=text#cha5

Government of The Netherlands, Ministry of Foreign Affairs, Policy and Operations Evaluation Department (2004): IOB Evaluation no. 298, "Quality Assessment Grid". Available at: http://www.euforic.org/iob/docs/200504251500064486.pdf?&[email protected]&password=9999&groups=IOB

Huse, Irene & McDavid, James C. (2006): "Literature Review: Professionalization of Evaluators". Available at: http://www.evaluationcanada.ca/txt/2_literature_e.pdf

International Development Research Centre (2002): "Quality Assessment of IDRC Evaluation Reports". Available at: http://www.idrc.ca/uploads/user-S/115644981714guideline-web.pdf

Joint Committee on Standards for Educational Evaluation (2000): "The Program Evaluation Standards". Available at: http://www.eval.org/EvaluationDocuments/progeval.html

OECD DAC (2007): "Evaluation Quality Standards" (draft). Available at: http://www.oecd.org/dataoecd/30/62/36596604.pdf

Quinn, J.B., Anderson, P. & Finkelstein, S. (1998): "Managing Professional Intellect: Making the Most of the Best", in Harvard Business Review on Knowledge Management, Harvard Business School Press, Boston.

Swiss Evaluation Society (2000): "Evaluation Standards". Available at: http://www.seval.ch/en/documents/SEVAL_Standards_2000_en.pdf

United Kingdom Evaluation Society: "Guidelines for Evaluators". Available at: http://www.evaluation.org.uk/resources/guidelines.aspx

UNFPA: "Quality Assessment Criteria for Evaluations". Available at: http://www.unfpa.org/upload/lib_pub_file/714_filename_eva_assessment.pdf

Zack, Michael H. (1999): "Developing a Knowledge Strategy", California Management Review, Volume 41, Number 3, Spring 1999.


6. BIOGRAPHY

Steffen Bohni Nielsen is Director of Evaluative Knowledge at Ramboll Management Consulting, Europe's largest provider of program evaluations. Steffen has consulted to ministries, agencies, and service providers on evaluation capacity building, results-based management, and evaluation use. He has also managed several national and international program evaluations. He is the author of several articles and books on evaluative knowledge and acts as a referee for the American Journal of Evaluation, Evaluation, and the International Review of Administrative Sciences. He is a former board member of the Danish Evaluation Society and has served as the European Commission's independent national expert on Danish social policy. In his current position, he is in charge of implementing Ramboll's knowledge management strategy in the field of evaluation, including its internal program evaluation certification program. He holds a Ph.D. in social anthropology based on fieldwork in a native community in British Columbia.

7. APPENDICES

Table 4 – Overview of Professional Evaluation Standards

Standards compared across ten organizations: CES, AEA, UKES, SEVAL, DeGEval, OECD-DAC, EU COM, IDRC, NL MOFA, and UNFPA.

Utility standards: present in all ten sets of standards.
Feasibility standards: present in five of the ten.
Propriety standards: present in eight of the ten.
Accuracy standards: present in all ten sets of standards.

Figure 5 – Screen Dump PEQUAT

[Figure 5 content: a PEQUAT checklist worksheet for the analysing phase. Intended for the Project Manager, with the project team also involved; to be used when initiating the analysing phase, to ascertain that the analyses properly reflect and support the intended methodology; takes about 5 minutes to complete. Each checklist item (e.g. "Does the analysis use appropriate analytical tools for descriptive/inferential analysis?") is tied to a theme and a source standard (American Evaluation Association / Joint Committee accuracy standards A4, A7, A8 and A9; European Commission; OECD DAC), rated on a compliance scale (Acceptable (3) = 60%, Good (4) = 80%, Excellent (5) = 100%) and weighted by a maximum point score. Scores aggregate per standard (in the dummy: Analysis of Quantitative Data 63% of 30 points, Analysis of Qualitative Data 80% of 30 points, Synthesis 80% of 40 points) and into an overall score (75% of 100 points), displayed with green/red traffic-light coding against the maximum score.]