
THE IMPLEMENTATION OF DEVELOPMENTAL APPRAISAL IN

MATLOSANA AREA PROJECT OFFICE SCHOOLS

by

TSIETSI SHADRACK LETLHOO

Submitted in accordance with the requirements for the degree of

MASTER OF EDUCATION

in the subject

EDUCATIONAL MANAGEMENT

at the

UNIVERSITY OF SOUTH AFRICA

SUPERVISOR: DR. P. MAFORA

JANUARY 2011


DECLARATION

I declare that THE IMPLEMENTATION OF DEVELOPMENTAL APPRAISAL IN

MATLOSANA AREA PROJECT OFFICE SCHOOLS is my own work and that all

the sources that I have quoted have been indicated and acknowledged by means of

complete references.

T.S. Letlhoo



DEDICATION

I dedicate this Dissertation of Limited Scope to my wife NTOMBIZANELE LETLHOO,

my mother, BEFINA LETLHOO, my children, TSHOLO, BOIPELO, BOITSHOKO,

LINDIWE and LESEDI and ALL my brothers. I also dedicate it to my late father

MALEKE LETLHOO, my late two brothers METLHOLO and KGOTSO and my late

two sisters SIMOLLANG and KEBOGILE. May God sanctify their souls.


ABSTRACT

The study examines the effectiveness of the implementation of Developmental Appraisal

(DA) within the Integrated Quality Management System (IQMS) in selected secondary

schools in Matlosana, North West Province.

The methods used to gather information include a literature review and an empirical investigation based on the qualitative research approach, which involved six focus group interviews conducted at six sampled secondary schools in the Matlosana Area Project Office in the North West Province.

The review of related literature revealed the roles and responsibilities of the officials tasked with implementing DA, the processes involved, the challenges encountered, and effective strategies for implementing DA.

The empirical investigation revealed that there are challenges that hamper the effective

implementation of DA in schools. The challenges include: inadequate support from the

Area Project Office; lack of resources for educator development; inadequate time frames

for implementation; disruption of normal teaching and learning; lack of honesty on the

part of the appraisee and appraiser; and conflict. Finally, the study recommends strategies that can be used to overcome some of these challenges.

Key terms:

Developmental Appraisal, roles of School Development Teams (SDTs), perceptions of effectiveness.


LIST OF ACRONYMS

APO – Area Project Office

DA – Developmental Appraisal

DAS – Developmental Appraisal System

DSG – Development Support Group

ELRC – Education Labour Relations Council

EMD – Education Management and Development

HOD – Head of Department

IQMS – Integrated Quality Management System

NGO – Non-Governmental Organisation

OBE – Outcomes Based Education

PGP – Personal Growth Plan

PM – Performance Measurement

SADTU – South African Democratic Teachers Union

SDT – Staff Development Team

SIP – School Improvement Plan

SMT – School Management Team

UK – United Kingdom

WSE – Whole School Evaluation


TABLE OF CONTENTS

DECLARATION
DEDICATION
ABSTRACT
LIST OF ACRONYMS

CHAPTER ONE
ORIENTATION OF THE RESEARCH
1.1 Introduction
1.2 Statement of the problem
1.3 Objectives of the Research
1.4 Research Method
1.4.1 Research Design
1.4.2 Data Collection
1.4.2.1 Interview
1.4.2.2 Literature
1.4.2.3 Data Analysis
1.4.2.4 Trustworthiness
1.5 Demarcation of the Research
1.6 Clarification of Concepts
1.7 Division of Chapters
1.8 Conclusion

CHAPTER TWO
LITERATURE REVIEW
2.1 Introduction
2.2.1 Personnel/Staff Appraisal
2.2.2 Developmental Appraisal
2.3.1 Roles and Responsibilities of officials tasked with the implementation of Developmental Appraisal (DA)
2.3.1.1 The Principal
2.3.1.2 The Educator
2.3.1.3 School Management Team (SMT)
2.3.1.4 Staff Development Team (SDT)
2.3.1.5 Development Support Group (DSG)
2.3.1.6 The Area Office
2.3.2 The Processes of Implementing Developmental Appraisal
2.3.2.1 Planning
2.3.2.2 Self-evaluation by the educator
2.3.2.3 Pre-evaluation Discussion
2.3.2.4 Lesson Observation
2.3.2.5 Evaluation in Respect of Other Performance Standards
2.3.2.6 Feedback and Discussion
2.3.2.7 Resolution of Differences or Grievances
2.3.2.8 Monitoring
2.3.2.9 Second and Subsequent Year of Implementation
2.4.1 Challenges on the Implementation of DA
2.4.2 Effective Strategies of Implementing DA
2.5 Conclusion

CHAPTER THREE
RESEARCH METHODOLOGY
3.1 Introduction
3.2 Research Design
3.3 Data Collection Method
3.3.1 Focus Group Interviews
3.3.2 The Interview Schedule
3.4 Sampling
3.4.1 Site Selection
3.4.2 Participants Selection
3.5 Data Analysis
3.6 Trustworthiness
3.6.1 Truth Value
3.6.2 Applicability
3.6.3 Consistency
3.6.4 Neutrality
3.7 Ethical Considerations
3.8 Conclusion

CHAPTER FOUR
DATA ANALYSIS
4.1 Introduction
4.2 Brief Synopsis of Methodology
4.3 Data Analysis Process
4.4 Research Findings
4.4.1 The Processes of Developmental Appraisal within the IQMS
4.4.1.1 Planning
4.4.1.2 Lesson Observation
4.4.1.3 Feedback and Discussions
4.4.1.4 Resolution of Differences
4.4.1.5 Self-evaluation
4.4.1.6 Pre-evaluation Discussions
4.4.1.7 Monitoring
4.4.2 The Roles of the SDT in Implementing DA
4.4.2.1 Prepares and Monitors management plan
4.4.2.2 Develops school improvement plan
4.4.2.3 Submits all the necessary documents to APO
4.4.2.4 Ensures that all members are trained
4.4.2.5 Facilitates and gives guidance on DSG establishment
4.4.3 The Roles of the APO officials in the implementation of DA
4.4.3.1 Monitoring
4.4.3.2 Control
4.4.3.3 Educator development
4.4.4 Perceptions regarding the effectiveness of the SDT and APO officials in implementing DA
4.4.4.1 The Effectiveness of the SDT
4.4.4.2 The Effectiveness of the APO officials
4.4.5 Challenges on the Implementation of DA
4.4.5.1 Inadequate support from the APO
4.4.5.2 Lack of resources for educator development
4.4.5.3 Inadequate time frames for implementation
4.4.5.4 Disruption of normal teaching and learning
4.4.5.5 Lack of honesty on the part of appraisee and appraiser
4.4.5.6 Conflict
4.5 Summary of the findings
4.6 Conclusion

CHAPTER FIVE
SUMMARY, FINDINGS, RECOMMENDATIONS & CONCLUSION
5.1 Introduction
5.2 Summary
5.3 Summary of important findings
5.3.1 Findings from Literature
5.3.1.1 Roles and Responsibilities of officials tasked with the Implementation of DA
5.3.1.2 The Processes of Implementing DA
5.3.1.3 Challenges on the Implementation of DA
5.3.1.4 Effective strategies of Implementing DA
5.3.2 Conclusion from the empirical investigations
5.3.2.1 The processes of DA within IQMS
5.3.2.2 The roles of the SDT in implementing DA
5.3.2.3 The roles of the APO officials in implementing DA
5.3.2.4.1 Educators' perceptions regarding the effectiveness of the SDT in implementing DA
5.3.2.4.2 Educators' perceptions regarding the effectiveness of APO officials in implementing DA
5.3.2.5 Challenges on the implementation of DA
5.4 Recommendations
5.4.1 Training APO officials, SDTs and Educators in DA
5.4.2 Provision of resources
5.4.3 Proper planning for the implementation
5.4.4 Promoting honesty
5.4.5 Dealing with conflict
5.5 Limitation of the study
5.6 Further Research
5.7 Conclusion

BIBLIOGRAPHY
APPENDIX A


CHAPTER ONE

ORIENTATION OF THE RESEARCH

1.1 INTRODUCTION AND BACKGROUND

Prior to 1995 there was no generally accepted system in South Africa for evaluating educators. Inspectors or subject advisors wrote reports about educators and kept them without giving the educators feedback. The reports were, however, used when a decision was to be taken about whether an educator was to be promoted or not. This made educators unhappy, and they decided that inspectors and subject advisors should no longer be allowed to enter schools; learners were used to chase them out of schools (Gardiner, 2004:22). Educators felt that there was a need to develop an instrument which would be acceptable to all stakeholders and would enhance the development of the competency of educators and the quality of public education in South Africa (ELRC, 1998:51).

In order to resolve the conflict outlined above, the Department of Education and the educator unions entered into negotiations. The negotiations resulted in a National Teacher Appraisal Project, which was set up in 1995. The project produced a new appraisal instrument for educators, including inspectors and subject advisors (Mpolweni, 1998:55). This new instrument was called the Developmental Appraisal System (DAS). DAS was to be implemented in 1998, but this did not happen. The teacher organisations, especially the South African Democratic Teachers Union (SADTU), did not agree with the Department of Education on certain issues regarding the implementation of DAS. They argued that the instrument used to appraise educators duplicated the instrument used in performance measurement and was therefore not developmental. Negotiations were held again at the Education Labour Relations Council, and an agreement was reached to integrate the Developmental Appraisal System, the Performance Measurement System and Whole School Evaluation. The result was the introduction of the Integrated Quality Management System (IQMS) (ELRC, 2003:1).


The purpose of the Integrated Quality Management System is to identify the specific needs of educators, schools and district offices for support and development; to provide support for continued growth; to promote accountability; to monitor an institution's overall effectiveness; and to evaluate an educator's performance. The main aim of the Developmental Appraisal System within the Integrated Quality Management System is to facilitate the personal and professional development of educators in order to improve the quality of teaching practice and educational management (ELRC, 1998:3). Its purpose is to appraise individual educators in a transparent manner with a view to determining areas of strength and weakness, and to draw up programmes for individual development (ELRC, 2003:1).

1.2 STATEMENT OF THE PROBLEM

Although Developmental Appraisal, as a component of the IQMS, seeks to identify educators' personal and professional needs and to facilitate their development, it is common to hear teachers expressing their discontent with some aspects of this appraisal system. Teachers, for example, complain about the classroom visits in appraisal, which they see as unfair, inappropriate and more about accountability than development (Declerq, 2008:13). This study therefore seeks to examine teacher perceptions regarding the effectiveness of the implementation of DA in the Matlosana Area Project Office schools.

In the light of the above, the main research problem is divided into the following sub-questions:

- How are the processes of Developmental Appraisal being carried out in the Matlosana Area Project Office schools?
- What are the roles of the School Development Teams and the Area Project Office in the implementation of Developmental Appraisal in the Matlosana Area Project Office schools?
- What are the perceptions of educators regarding the effectiveness of the School Development Teams and the Area Project Office in the implementation of Developmental Appraisal in the Matlosana Area Project Office schools?
- What recommendations can be made that may serve as guidelines for the effective implementation of Developmental Appraisal?

1.3 OBJECTIVES OF THE RESEARCH

The objectives of this research are to:

- describe the processes of Developmental Appraisal implementation in the Matlosana Area Project Office schools;
- describe the roles of the School Development Teams and the Area Project Office in the implementation of Developmental Appraisal in the Matlosana Area Project Office schools;
- ascertain the views of educators regarding the effectiveness of the School Development Teams and the Area Project Office in the implementation of Developmental Appraisal; and
- make recommendations that may serve as guidelines for the effective implementation of Developmental Appraisal within the IQMS.

1.4 RESEARCH METHOD

1.4.1 RESEARCH DESIGN

According to McMillan and Schumacher (2001:172), a research design describes how a study is to be conducted. It summarises the procedures for conducting the study, including when, and under what conditions, the data will be obtained. Research methods are the ways in which one collects and analyses data.

The approach used in this study will be the qualitative research method, which means that data will be collected in a face-to-face situation by interacting with educators and principals in their natural settings. The qualitative research method values richness of data and depth of understanding (Collins, Du Plooy, Grobbelaar, Terre Blanche, Van Eeden, Van Rensburg and Wigston, 2000:134). This method will also be used because it offers opportunities for conducting exploratory and descriptive research that assumes the value of context and setting, and that searches for a deeper understanding of the participants' experiences (Best and Kahn, 1998:239). The method is therefore considered relevant for investigating educators' perceptions regarding the effectiveness of the implementation of Developmental Appraisal within the IQMS.

1.4.2 DATA COLLECTION

1.4.2.1 INTERVIEWS

Focus group interviews will be used to collect data in this research. According to Schulze, Myburgh and Poggenpoel (2005:69), focus groups are more economical than conducting numerous individual interviews; group dynamics are a synergistic factor in bringing information out; and participants have more confidence in expressing their feelings honestly within a supportive group of peers than in individual interviews. For these reasons the focus group is seen as the best method of collecting data in this study.

The participants will be drawn from six secondary schools. Random sampling will be used to select the six secondary schools from the four clusters in the Area Project Office. One principal, one head of department and three educators will be selected from each sampled school. A total of six focus groups will be established from the six selected schools. Each focus group will therefore comprise five members, namely the principal, one head of department and three educators. In this way a variety of opinions will be obtained from both management and educators on the implementation of DA (Collins et al., 2000:177). Five guiding questions will be formulated and the researcher will guide the discussions during the interviews. The interviews will be audio-taped and transcribed verbatim.

1.4.2.2 LITERATURE REVIEW

A review of the literature is done in order to provide the theoretical framework of the research. According to Schulze, Myburgh and Poggenpoel (2005:21), a literature study is a systematic, critical analysis and summary of existing literature relevant to the current research topic. It involves reading an appropriate proportion of the voluminous literature that is available. The purpose of the literature review will be to obtain more information on the processes of DA implementation and on how DA can be implemented effectively in South Africa and other countries.


1.4.2.3 DATA ANALYSIS

According to Schulze, Myburgh and Poggenpoel (2005:15), the aim of analysing and interpreting research findings is to increase the validity of the research by ensuring that errors and inaccuracies are eliminated. The focus group interviews will be tape-recorded, transcribed verbatim (word for word) and analysed. The Johnson and Christensen method will be used to analyse the data. The following steps will be followed:

- segmenting;
- coding;
- compiling a master list;
- checking for intercoder and intracoder reliability;
- enumeration;
- showing relationships among categories; and
- using diagrams to summarise relationships (Schulze, Myburgh and Poggenpoel, 2005:65).

1.4.2.4 TRUSTWORTHINESS

According to Schulze, Myburgh and Poggenpoel (2005:15), Guba's model for trustworthiness addresses ways of guarding against bias in the results of qualitative analysis. Within this model the following strategies are used to ensure trustworthiness:

- Credibility (truth value): the research will be conducted in such a way that the research phenomena are accurately described; and
- Transferability: the research design will be described in sufficient detail that other researchers may use the study to extend its findings to other studies.

1.5 DEMARCATION OF THE RESEARCH

The study will be conducted in six of the one hundred and twenty-two schools in the Matlosana Area Project Office. Principals, heads of department and educators in the six selected schools will be interviewed using focus group interviews. The research will be limited to the investigation of DA implementation within the IQMS in the Matlosana APO schools only.


1.6 CLARIFICATION OF CONCEPTS

Developmental Appraisal

Appraisal is defined by Hunt (1992:10) as a structured interview that requires communication between the organisation (appraiser) and the employee (appraisee) in order to make assessments about the future.

Developmental Appraisal, according to the ELRC (1998:9), is a process that develops both the skills and career prospects of the individual educator and leads to improvement at school or institutional level.

In this study Developmental Appraisal refers to an evaluation system to identify strengths

and weaknesses of educators in a transparent manner with the purpose of developing

educators to provide quality education and education management.

Implementation

According to Hornby (1986:426), implementation means how something is carried out,

or how to put something into effect, or to provide a plan or procedure to do something.

In this study implementation means the way in which the processes of DA as a

component of IQMS are being carried out in the Matlosana APO schools.

Matlosana Area Project Office

Matlosana refers to the Municipality in the Dr Kenneth Kaunda District of the North

West Province. The municipality covers the following towns: Klerksdorp, Orkney,

Stilfontein and Hartebeesfontein.

Area Project Office (APO) refers to an area demarcated for the purposes of managing

educational institutions in the North West Province. Each Area Project Office is divided

into schools. Each APO is headed by an APO Manager.


The Matlosana Area Project Office in this study refers to an area demarcated for

education management purposes in the Dr Kenneth Kaunda District of North West

Province. It covers the following towns and their surrounding farms: Klerksdorp, Orkney,

Stilfontein and Hartebeesfontein.

Perception

Langenbach, Vaughn and Aagaard (1994:117) define perception as the act of linking what is sensed with some past experience to give the sensation meaning.

In this study perception refers to educators’ understanding of how DA is being

implemented within the IQMS.

1.7 DIVISION OF CHAPTERS

Chapter one provides the introduction and background, problem statement, research aims

and objectives, the method that will be used, demarcation of the study and division of

chapters.

Chapter two will deal with the literature review concerning the effectiveness of DA

implementation.

Chapter three will outline the methodology used by the researcher during the data collection period. The methodology will cover the following aspects: the purpose of qualitative research; the design of the interview questions; the selection of respondents; and the focus group interviews.

Chapter four will deal with the analysis and the trustworthiness of the data collected.

Chapter five will provide the interpretation of the findings, recommendations and

conclusion.


1.8 CONCLUSION

In this chapter the introduction and background to the research, the aims of the research, the research methods, the demarcation of the research, the clarification of concepts and the division of chapters were presented.

The next chapter will examine the relevant literature on Developmental Appraisal.


CHAPTER TWO

LITERATURE REVIEW

2.1 INTRODUCTION

Schulze, Myburgh and Poggenpoel (2005:21) define a literature review as a method of acquiring information. They further define it as a systematic, critical analysis and summary of existing literature relevant to the current research topic. It involves reading an appropriate portion of the voluminous literature that is available. A literature review, according to Shank and Brown (2007:17), is used to tell us what is already known. A review of the literature serves several purposes in research. The knowledge gained from the literature aids in stating the significance of the problem, developing the research design and relating the results of the study to prior knowledge (McMillan and Schumacher, 2006:105).

Schulze, Myburgh and Poggenpoel (2005:21) and McMillan and Schumacher (2006:105) identify the following purposes of a literature review, namely to:

- define and limit the problem;
- place the study in a historical perspective;
- avoid unintentional and unnecessary replication;
- select promising methods and measures;
- relate the findings to previous knowledge and suggest further research;
- develop research hypotheses;
- develop a clear research design;
- sharpen and deepen the theoretical framework of the research;
- clarify the relationship between the proposed study and previous work on the topic;
- identify gaps in knowledge, as well as weaknesses in previous studies, so that you will be able to add to existing knowledge and introduce new ideas and perspectives;
- develop an acceptable body of knowledge on a topic and gain further insight into the topic; and
- identify variables that must be considered in research.

The focus of this chapter is a review of the literature on the implementation of Developmental Appraisal in schools in the Matlosana Area Project Office. The researcher hopes that this literature study will shed some light on the effective implementation of Developmental Appraisal in schools. It is, however, imperative to discuss staff appraisal in general, in order to gain a broader understanding of appraisal, before discussing the implementation of Developmental Appraisal in schools.

2.2.1 PERSONNEL/STAFF APPRAISAL

Fidler and Cooper (1992:29), Revil (1992:12) and Myland (1992:9) define personnel or staff appraisal as the process by which an employee and his or her superordinate meet at regular intervals to discuss performance issues, for example the progress, development needs, performance and potential of the appraisee. The employee and his or her employer then agree on what action needs to be taken by both parties to benefit both the individual employee and the organisation or institution: the employee will be able to perform well, and the organisation will get quality products. Roberts (1994:1), on the other hand, believes that personnel appraisal is a system of human resource management that is designed to enhance employee effectiveness.

Personnel or staff appraisal is also described by Poster and Poster (1991:1) as a means of promoting, through the use of certain techniques and procedures, the organisation's ability to accomplish its mission of maintaining or improving what it provides, while at the same time seeking to maintain or enhance staff satisfaction and development. They maintain that for employees in any concern to perform effectively, they must be well motivated, have a sound understanding of what is expected of them, have a sense of ownership and possess the abilities required for the tasks they are charged with. For employees to achieve this, personnel or staff appraisal should provide accurate, complete and fair evaluations of each person's performance and provide information useful to both the organisation and the individual (Fidler and Cooper, 1992:2; Murphy and Cleveland, 1995:379).

Thomas and Patten (1982:28) on the other hand argue that the information that is

provided cannot always be measured objectively by a person evaluating the employee for

the purpose of appraisal. This suggests that an effective staff appraisal method is needed

to ensure objectivity and fairness in identifying developmental needs of employees.

A related definition is provided by Poster (in Ngwenya, 2006:35), namely that appraisal is the process by which key result areas or targets are jointly planned by the appraisee and the appraiser in a collaborative manner.

Thomas and Patten (1982:28) strongly emphasise this when they suggest that the appraiser should be able to coach the appraisee on how to do better, or take action regarding people whose performance has reached a plateau or declined. Personnel or staff appraisal also serves to measure and evaluate an individual's behaviour and accomplishments, as pointed out by Devries, Morrison, Shulman and Gerlach (1981:1). This suggests that weaknesses and strengths are identified: weaknesses are addressed through development and mentoring, and strengths are improved upon through self-development. Johannsen and Page (in Ngwenya, 2006:1) make the point that this is important when measuring against the institution's goals for promotion, salary review, personnel decisions and termination of services.

The above suggests that both the employee and the employer will benefit: the employee benefits when salary and promotion are reviewed, and the employer benefits when the services of those who cannot perform are terminated.

Litheko (2001:22) further describes staff appraisal as a means of knowing what existed in the past and what exists now, and how this can be modified or changed in the future so that it has a positive impact on change, performance, productivity, professional growth and commitment. Philip (1990:22) further makes the point that appraisal involves knowing the goals and objectives of appraisal, and being aware of resources, people, materials, funds and enriched staff development opportunities. A related argument is made by Taylor (in Monyatsi, 2006:216), namely that appraisal involves letting people know what is required and expected of them, assessing how they are doing, reviewing this with them regularly and agreeing with them on what happens next.

Monyatsi (2006:217) contends that appraisal is one way in which to review the performance and potential of staff. An effective appraisal scheme therefore offers a number of potential benefits to both the individual and the organisation. These benefits are the:

- identification of an individual's strengths and weaknesses;
- identification of problems which may be restricting progress and causing inefficient work practices;
- development of a greater degree of consistency, through regular feedback on performance and discussion about potential, which encourages better performance from staff;
- provision of information for human resource planning, to assist succession planning and to determine suitability for promotion and for particular types of employment and training;
- improvement of communication, by giving staff the opportunity to talk about their ideas and expectations and how well they are progressing; and
- improvement of the quality of working life, by increasing mutual understanding between managers and their staff.

Performance appraisal should therefore be viewed as one of those processes in

organisations that aim at enhancing productivity through mutual interaction between the

supervisor and the subordinate. The feedback provided during the appraisal process is

vital in informing all those involved in the organisation about what ought to be done in

order to map the way forward (Monyatsi, 2006:17).


Mercer (2006:17) further makes the point that an appraisal system provides a useful window through which to examine both the individual's and the organisation's performance. This means that for any organisation to function effectively, the appraisal of staff is important: the individual benefits through the development of his or her weaknesses, and the organisation benefits through staff producing quality products.

In South African schools the Integrated Quality Management System (IQMS) is used to evaluate educators. The IQMS comprises three programmes: Developmental Appraisal (DA), Performance Measurement (PM) and Whole School Evaluation (WSE).

Because the focus of this study is the implementation of DA within the IQMS in schools, a discussion of what developmental appraisal is, and how it is implemented in South Africa and other countries, is necessary.

2.2.2 DEVELOPMENTAL APPRAISAL (DA)

Jones (1993:4) and Bartlett (2000:27) define developmental appraisal as a continuous and systematic process intended to help individual teachers with their professional development and career planning, and to help ensure that the in-service training and deployment of teachers match the complementary needs of teachers and schools. A related definition is provided by Revil (1992:120): a systematic, two-way process in which the resource manager and an individual member of staff come together at regular intervals to discuss and assess, openly and jointly, the member of staff's progress, development needs, performance and potential.

Both these definitions emphasise a continuous process that seeks to identify

developmental needs of an individual educator and to act on those identified needs.

Developmental appraisal therefore constitutes a professional needs analysis by the appraisee and his or her Development Support Group (DSG). The needs are identified through the use of appraisal interviews and the observation of the educator practising in or outside the classroom, with the appraiser and appraisee using the performance standards and criteria agreed upon by the educator and his or her appraisers. Thereafter, professional development occurs when the identified needs are matched with the appropriate in-service training and development of the appraisee (Hartley, 1992:1).

This development of the identified needs will then, according to Campbell (2003:356), lead to accountability which faces two ways: the individual teacher is accountable for performing his or her duties effectively, and the school management is accountable for supporting the individual teacher. Appraisal therefore becomes an essential ingredient of school development. It provides a framework to identify teachers' strengths and weaknesses, and facilitates the identification of personal professional development plans within the broader aim of school development (Declerq, 2008:11).

DA, according to Revil (1992:120), also provides an opportunity to undertake an overall review of performance, work content, load and quality; to look back on what has been achieved during a previous period; to identify weaknesses which might be remedied; and to negotiate and agree on a plan of action for the next period.

Quinlan (2005:1) further describes DA as a learning instrument designed to:

- diagnose faults;
- build strengths and correct weaknesses;
- inform teachers about how they are doing;
- set targets for teachers to aim at; and
- bring home to teachers the basic requirements of good teaching.

Bartlett (2000:28) believes that developmental appraisal is a natural and essential element in the management of any professional staff. Teachers need to be carefully and sensitively managed; they need help in remedying weaknesses, and recognition and encouragement for their strengths; and they need to have confidence in the fairness and consistency of the process.

Mercer (2006:18), on the other hand, contends that appraisal ought to improve learning or teaching. Improving the quality of the learning process should be central to the objectives of teacher appraisal. Mortimore and Mortimore, as quoted by Mercer (2006:18), argue that appraisal ought to have an impact on the quality of student learning as well as on the organisational skills, planning and teamwork of the school staff. Appraisal is therefore essential for teacher accountability.

This means that DA does not only look for faults, but also helps the educator to correct those faults and to perform better next time. Both the institution and the educator therefore benefit from this development.

Hancock and Settle (1990:20) and Bennette (1992:3) have identified the following phases of implementing DA:

Self-appraisal. This is done on a regular basis. It is good for self-development and for the encouragement of personal growth. It also places the responsibility upon the individual to establish long-term and short-term goals for development;

External appraisal. This is normally performed by teacher managers or external management officials. It is used to support teachers by assisting them in setting objectives, to encourage them to take responsibility for their own development and to be accountable for their own work, and to gain a commitment from them that will draw them away from positions of self-satisfaction or work avoidance, especially where there are tasks which are disliked; and

Group appraisal. A team which is constantly involved with the teacher conducts the group appraisal. After being appraised, the teacher must be able to answer the following questions:

Can I evaluate accurately the quality or otherwise of my work performance?

Can I match my skills appropriately to the different aspects of the job?

Do I recognise my own present and future training needs?

Am I able to select a career path appropriately?

Do I feel more motivated to do my job?


This suggests that for DA to be effective there must be commitment both from the individual and from those providing the educator with development support.

Appraisal is also implemented differently in different countries. A brief discussion of how DA is implemented, and the purpose of its implementation, in some countries is presented below.

In New Zealand and the UK, developmental appraisal, according to Tomlison (2003:220), serves purposes for the individual, both personally and professionally, with links to team performance within the school. In secondary schools, the team rather than the school is the level at which differential performance is often focused. As the focus on performance moves to the whole school and the wider education system, the emphasis shifts to accountability.

In the US, self-assessment, action research and peer coaching are used. The most ambitious challenge here is for leaders to become authentic members of learning communities in which they facilitate inquiry-based collaborative work.

In Singapore the process is clearly bureaucratic, but it appears to have been insensitive to the quality of current performance and is perceived as being based on non-transparent criteria (Tomlison, 2003:222).

Monyatsi (2006:225) describes developmental appraisal in Botswana as a process intended as a means of achieving accountability, although it has developmental tenets as well. Many teachers view the current teacher appraisal in Botswana as demoralising, even threatening. Evidently, the ideals of Botswana's teacher appraisal policy (non-disciplinary, accurate, open, based on proper training, part of continuous support and staff development) are not sufficiently met in practice, thereby endangering the relevance, accountability and quality of educational provision, according to the findings of Monyatsi (2006:225).


In Britain, according to Bartlet (2000:29), appraisal for every teacher takes place on a two-yearly cycle. By law, each appraisal cycle should consist of observation of teaching, an appraisal interview resulting in an appraisal statement with targets, and a review meeting. Thus self-appraisal, an initial appraisal meeting and other forms of information gathering were incorporated into the cycle. An appraisal system could be seen both as a vehicle for personal professional development and as a means of monitoring teachers.

Since the focus of this study is developmental appraisal in schools in the Matlosana Area Project Office of the North West Province in South Africa, a detailed presentation of how DA is implemented in South African schools is necessary. The implementation is clearly described in the Integrated Quality Management System Training Material document (ELRC, 2003:1-5). According to this document, the purpose of DA is to appraise individual educators in a transparent manner with a view to determining areas of strength and weakness, and to draw up programmes for individual development. The strengths and weaknesses are determined using the evaluation instrument.

The instrument is divided into two parts. One part (made up of four Performance Standards) is for the observation of educators in practice in the classroom, and the other part (made up of eight Performance Standards) relates to aspects for evaluation that fall outside of the classroom. The first four Performance Standards are:

The creation of a positive learning environment;

Knowledge of curriculum and learning programme;

Lesson planning, preparation and presentation; and

Learner assessment.

Each of the above Performance Standards asks a question:

Does the educator create a suitable environment for teaching and learning?

Does the educator demonstrate adequate knowledge of learning area and does she

or he use this knowledge effectively to create meaningful experiences for

learners?


Is lesson planning clear, logical and sequential, and is there evidence that an

individual lesson fits into a broader learning programme?

Is assessment used to promote teaching and learning?

This means that it is of the utmost importance for both the appraiser and the appraisee to understand these performance standards, so that they are able to answer the question asked in each performance standard accurately and correctly. This will also help the appraiser and the appraisee to diagnose the developmental needs of the appraisee.

Each performance standard also includes a number of criteria. For each of these criteria there are four descriptors, which are derived from the four rating scales (ELRC, 2003:1; Declerq, 2008:12). The descriptors describe what the educator needs to do to improve his or her performance on that particular performance standard.

The other part of the instrument is made up of the following Performance Standards:

Professional development in the field of work or career and participation in

professional bodies;

Human relations and contribution to school development;

Extra-curricular and co-curricular participation;

Administration of resources and records;

Personnel;

Decision making and accountability;

Leadership, communication and serving the governing body; and

Strategic planning, financial planning and Education Management and

Development (EMD).

Each of these Performance Standards also asks the following questions:

Does the educator participate in activities, which foster professional growth?

Does the educator demonstrate respect, interest and consideration for those with

whom she or he interacts?

Is the educator involved in extra- and co-curricular activities?


Does the educator use resources effectively and efficiently?

Does the educator manage and develop personnel in a way that the vision and

mission of the institution are accomplished?

Does the educator display sound decision-making skills and does she or he take responsibility for decisions made?

Is she or he a visionary leader who builds commitment and confidence in staff

members?

Is the educator proficient in strategic planning, financial planning and education management and development (ELRC, 2003:3)?

The following rating scales are used when the performance of the educator is rated for the purposes of development and performance measurement:

Rating 1: Unacceptable. This level of performance does not meet expectations and requires urgent intervention and support. It means that the educator needs the support of the Development Support Group to develop those areas needing development;

Rating 2: Satisfies minimum expectations. This level of performance is acceptable

and is in line with minimum expectations, but development and support are

required;

Rating 3: Good. Performance is good and meets expectations, but some areas are

still in need of development and support; and

Rating 4: Outstanding. Performance is outstanding and exceeds expectations.

Although performance is excellent, continuous self-development and

improvement are advised (ELRC, 2003:4).

It is the educator's Development Support Group (DSG), together with the educator herself or himself, who have to rate the educator accurately. This suggests that honesty and fairness need to prevail. The educator needs to be honest in order to be able to identify his or her weaknesses. The DSG is likewise supposed to be honest and fair, both for the sake of helping the educator to develop and for the performance measurement that may affect the educator's pay.


Performance standards one (1) to seven (7) apply to all post level one educators.

Performance standards one (1) to ten (10) apply to post level two educators (HODs) or

Education Specialists. Performance standards one (1) to twelve (12) apply to post level

three and four educators (Deputy Principals and Principals) (ELRC, 2003:5).

Because the focus of this study is the implementation of developmental appraisal in schools, it is important to discuss the processes involved when DA is implemented in schools. Before discussing the processes, however, it is necessary to clarify the roles and responsibilities of the people tasked with the implementation of DA within the IQMS in schools and in the Area Project Office or the District Office.

2.3.1 ROLES AND RESPONSIBILITIES OF OFFICIALS TASKED WITH THE IMPLEMENTATION OF DEVELOPMENTAL APPRAISAL (DA)

The following are the officials who are responsible for the implementation of DA within

the IQMS in schools:

2.3.1.1 The Principal

The principal has overall responsibility for ensuring that the IQMS is implemented uniformly and effectively at the school. The principal must ensure that every educator is provided with a copy of the training material for the IQMS and other relevant IQMS documentation. Together with the School Management Team (SMT) and Staff Development Team (SDT), she or he is responsible for advocacy and training at school level. After advocacy and training, the principal facilitates the establishment of the SDT in a democratic manner. She or he must ensure that all documentation sent to the Area Office is correct and delivered on time. She or he is responsible for the internal moderation of evaluation results in order to ensure fairness and consistency (ELRC, 2003:2).

This suggests that the principal should be well informed about the processes of implementing developmental appraisal in order to ensure effective implementation. The principal must also be a leader who continually monitors and encourages the SDT, as the most important structure in the implementation of DA at school level, to perform its important duties correctly and efficiently.

2.3.1.2 The Educator

The educator must undertake a self-evaluation of her or his performance. She or he must identify her or his personal support group, the Development Support Group (DSG). The educator must develop a Personal Growth Plan (PGP) and finalise it together with the DSG. She or he must co-operate with the DSG, attend in-service training and other programmes in terms of the areas identified for development, and engage in feedback and discussion with the DSG (ELRC, 2003:2).

This means that the educator must fully understand the appraisal instrument in order to evaluate himself or herself and be in a position to identify weaknesses objectively and fairly for the purpose of development. The educator must also select a DSG that will be in a position to help him or her address the identified weaknesses.

2.3.1.3 School Management Team (SMT)

SMTs inform educators of in-service training and other programmes that will be offered and make the necessary arrangements for educators to attend. They must assist with the broad planning and implementation of the IQMS. They must ensure that school self-evaluation is done in terms of the Whole School Evaluation (WSE) policy and in collaboration with the Staff Development Team (SDT) (ELRC, 2003:3).

This suggests that the School Management Team needs to play a vital role in ensuring that educators receive training on the identified weaknesses, and in providing guidance and mentoring. As the senior member of the DSG, the SMT member should also ensure that the correct procedures are followed in the implementation of DA.


2.3.1.4 Staff Development Team (SDT)

i. Composition

The Staff Development Team is made up of the principal, the Whole School Evaluation co-ordinator, democratically elected members of the School Management Team and democratically elected post level one educators. The school should decide on the size of the SDT; it is suggested that it comprise up to about six members, depending on the size of the school. In schools with only one or two educators, those educators make up the SDT, but the Area Project Office provides the support. How this support is provided is, however, not clear in the IQMS training material document.

ii. Roles and Responsibilities

Ensures that all staff members are trained on the procedures and processes of the

IQMS;

Coordinates all activities pertaining to staff development;

Prepares and monitors the management plan for the IQMS;

Facilitates and gives guidance on how DSGs have to be established;

Prepares a final schedule of DSG members;

Links Developmental Appraisal to the School Improvement Plan (SIP);

Liaises with the department in respect of high priority needs such as in-service training, short courses, skills programmes or learnerships;

Monitors effectiveness of the IQMS and reports to the relevant persons;

Ensures that all records and documentations on IQMS are maintained;

Oversees mentoring and support by the DSGs;

Develops the school improvement plan based on information gathered during

Developmental Appraisal;

Coordinates ongoing support provided during the two developmental cycles each

year;

Completes the necessary documentation for Performance Measurement (PM) for

pay or grade progression, signs off on these to ensure fairness and accuracy;

Submits all the necessary documentation (e.g. SIP) to the Area Project Office in

good time for data capturing;


Deals with the differences between appraisees and appraisers in order to resolve

the differences;

Coordinates the internal WSE processes;

Ensures that the IQMS is applied consistently; and

Liaises with the external WSE Team to coordinate and manage the cyclical

external WSE process (ELRC, 2003:3).

2.3.1.5 Development Support Group (DSG)

I. Composition

For each educator the DSG should consist of the educator's immediate senior and one other educator (peer). An educator's peer must be selected by the educator on the basis of expertise related to the prioritised needs of the educator. It is important that the peer has the confidence and trust of the educator, as he or she will have to offer constructive criticism as well as support and guidance. Only in exceptional cases, e.g. in the case of a principal, may a peer be selected from the staff of another school. In some instances it is possible for an educator to select more than one peer, based on her or his particular needs. In respect of one-educator schools, the Area Office provides the support and mentoring. Each educator may have a different DSG, while some individuals (e.g. HODs or Education Specialists) will be involved in several DSGs (for different educators). Once educators have determined who their DSGs are, this information has to be factored into the broad planning of the SDT, both to ensure that there are no clashes with HODs or Education Specialists having to evaluate different teachers at the same time, and to ensure a reasonable spread and pace of work for evaluators towards the end of the year. A member of the DSG may be changed in instances where development has already taken place and where new priorities have been identified (ELRC, 2003:5).

II. Roles and Responsibilities

The main purpose of the DSG is to provide mentoring and support;

The DSG is responsible for assisting the educator in the development and

refinement of her or his Personal Growth Plan (PGP) and to work with the


SDT to incorporate plans for development of an educator into the school

improvement plan;

The DSG is responsible for the baseline evaluation of the educator (for

Development purposes) as well as the summative evaluation at the end of the

year for Performance Measurement; and

The DSG must verify that the information provided for Performance

Measurement (PM) is accurate (ELRC, 2003:5).

This suggests that the educator needs to make sure that he or she selects a Development Support Group that will be able to carry out the roles and responsibilities mentioned above.

2.3.1.6 The Area Office

The Area Office has the following roles and responsibilities in the implementation of

developmental appraisal within the IQMS in schools:

Overall responsibility of advocacy, training and proper implementation of IQMS;

Responsibility with regard to the development and arrangement of professional

development programmes in accordance with identified needs of educators and its

own improvement plan;

The area manager has the responsibility to moderate the evaluation results of schools in her or his area in order to ensure consistency. In cases where the evaluation results do not match a school's general level of performance, or where the area manager has reason to believe that the evaluation at a particular school was either too strict or too lenient, she or he must refer the results back to the school for reconsideration;

Must ensure that the evaluation results of schools are captured and processed in

time to ensure successful implementation of salary and grade progression; and

Should ensure that the implementation process in schools is monitored on an

ongoing basis (ELRC, 2003:6).

This means that the departmental IQMS officials need to visit schools regularly to ensure that the IQMS is implemented correctly and effectively. The officials should also be in a position to provide support on the identified weaknesses of the educators.


The officials tasked with the implementation of Developmental Appraisal need to follow

certain procedures and processes. Those processes are discussed in the next section of

this chapter.

2.3.2 THE PROCESSES OF IMPLEMENTING DEVELOPMENTAL APPRAISAL

According to the ELRC (2003:6-10) document, the following processes are to be followed when developmental appraisal within the IQMS is implemented:

2.3.2.1 Planning

IQMS planning by the Staff Development Team (SDT) must incorporate all the processes

together with time frames in which they are to be completed, as well as all individuals

involved, together with each one's responsibility. Planning must take into account the school's year plan (which is drawn up by the School Management Team). The schools must factor into their broad planning the cycles of evaluation and development, e.g.

Baseline Evaluation in the first term, Summative Evaluation in the last term, first

developmental cycle from end of March to end of June and second developmental cycle

end of June to end of September. Secondary schools must ensure that educators who

teach Grade 9 or Grade 12 classes are evaluated before the external assessments or

examinations commence. By the end of February educators must be provided with a

timetable indicating when they can be evaluated. The principal calls a general staff

meeting at the beginning of the year at which educators are apprised of the IQMS

procedures and processes (ELRC, 2003:6).

The above planning suggests that each school should have a functional and effective SDT, which must plan its activities in conjunction with the SMT. The SDT must also get support from the SMT and the Area Office to carry out its duties effectively and to a high standard.

2.3.2.2 Self-Evaluation by the Educator

Immediately after the initial advocacy and training, each educator should evaluate herself or

himself using the same instrument that will be used for both Developmental Appraisal


and Performance Measurement. This enables the educator to become familiar with the

evaluation instrument. Educators familiarise themselves with the Performance Standards,

the Criteria (what they are expected to do) as well as the level of performance (how they

are expected to perform) in order to meet at least the minimum requirements for pay

progression. The self-evaluation forms part of both Developmental Appraisal (DA) and

Performance Measurement (PM) (ELRC, 2003:7). The emphasis on self-evaluation serves

the following purposes:

The educator is compelled to reflect critically on her or his performance and set

own targets and timetable for improvement;

Evaluation, through self-evaluation, becomes an ongoing process;

The educator is able to make inputs when the observation (for evaluation

purpose) takes place and this process becomes more participatory; and

The educator is able to measure progress and successes and build on these

without becoming dependent on cyclical evaluation (ELRC, 2003:7; Declerq, 2008:12).

The self-evaluation of the educator plays a vital role in the process of developmental appraisal, because the educator is the one who needs to know exactly where he or she is lacking and requires development. This also suggests that honesty on the part of the educator is needed if the educator wants to be developed in the areas needing development.

2.3.2.3 Pre-evaluation Discussion

Each DSG must have a pre-evaluation discussion with the educator concerned during

which the following must be clarified:

Whether the educator understands what is expected of her or him in terms of the

various performance standards and criteria and how she or he will be rated;

The educator is given the opportunity to clarify his or her areas of concern;

The DSG informs the educator about procedures and processes that will be

followed throughout the IQMS cycle;


The DSG explains to the educator that lesson observation involves performance

standards one to four whilst other aspects involve the remaining performance

standards;

The DSG explains to the educator that the evaluation in respect of the remaining

performance standards will be based on general ongoing observation by the DSG

and on documentary evidence and other information that the educator may

provide to the DSG;

Guidance is provided to the educator on the development of her or his Personal

Growth Plan (PGP). After the baseline evaluation further discussions on the

development of the PGP need to take place; and

The educator is also given an opportunity to raise issues that are hampering her or

his performance. This is important in the light of the contextual factors, which

may be recorded in the report and considered for possible adjustment of the marks

awarded in respect of a particular criterion (ELRC, 2003:7-8).

The pre-evaluation discussion plays an important role, since all issues of concern can be discussed and clarified at this stage to avoid misunderstandings. The DSG is also able to explain its expectations to the educator.

2.3.2.4 Lesson Observation

After identifying his or her DSG, the educator needs to be evaluated in order to establish a baseline with which subsequent evaluation(s) can be compared to determine progress. This evaluation should be done by both members of the DSG. The purpose of this evaluation by the DSG is:

to confirm (or otherwise) the educator’s perception of his or her own

performance as arrived at through the process of self-evaluation;

to enable discussion around strengths and areas in need of development and to

reach consensus on the scores for individual criteria under each of the

Performance Standards and to resolve any differences of opinion that may

exist;


to provide the opportunity for constructive engagement around what the

educator needs to do for himself or herself, what needs to be done by the school

in terms of mentoring and support (especially by the DSG) and what INSET

and other programmes need to be provided by, for example, the Area Office;

to enable the DSG and the educator (together) to develop a PGP which includes

targets and time frames for improvement. The PGP must primarily be

developed by the educator with refinements being done by the DSG; and

to provide a basis for comparison with the evaluation for PM purposes which is

carried out at the end of the year (ELRC, 2003:8).

The above purposes suggest that lesson observation plays an important role in diagnosing the educator's developmental needs. Lesson observations also feed into performance measurement for pay progression. This suggests that honesty is needed on the part of the educator, but also that tension might arise because performance is linked to pay.

2.3.2.5 Evaluation In Respect of Other Performance Standards

An educator's evaluation in respect of these performance standards is based on general ongoing observation, discussion and feedback by the DSG, the submission of documentary evidence, proof of participation and other information provided by the educator (ELRC, 2003:9).

This means that once an educator has established the DSG, the DSG must continually observe the educator to ensure that they are in a position to evaluate him or her accurately and fairly, so that he or she can develop in the identified areas of weakness.

2.3.2.6 Feedback and Discussion

The DSG must discuss their evaluation with the educator and must provide feedback.

Differences (if any) need to be resolved. Feedback on observation should focus on:

Performance and not personality;

Observations and not assumptions;


Objectivity and not subjectivity;

The specific and concrete and not the general and the abstract;

Sharing information and not giving instructions;

The individual’s needs; and

A request from the individual (ELRC, 2003:8).

This means that honesty and fairness are needed from both the educator and the

Development Support Group.

2.3.2.7 Resolution of Differences and/or Grievances

Most differences of opinion between an educator and the DSG should be resolved at that

level. Where agreement cannot be reached the matter must be referred to the SDT within

a week. If there is still no resolution within five working days, either party may request a

formal review by the grievance committee. The grievance committee will make a

recommendation to the head of the provincial department. The head of department will

evaluate the recommendation and motivation submitted by the Grievance Committee

before taking a decision which shall be made within five working days (ELRC, 2003:9).

This resolution of differences is important because the educator and the DSG need to agree on the identified weaknesses of the educator. This will enable the DSG to help the educator to address these weaknesses, to the benefit of both the educator and the school.

2.3.2.8 Monitoring

The monitoring process is an ongoing activity, which is conducted by departmental

officials, SMTs and DSGs (ELRC, 2003:9). The purpose of this monitoring is to ensure

that the DA and other IQMS processes are carried out accurately and consistently by

schools.

Declerq (2008:13) criticises this monitoring, citing problematic issues such as the awkward combination, in one system, of internal and external bureaucratic monitoring (with a standardised appraisal instrument) and professional monitoring (with peer contextual appraisal) for development and accountability, which inevitably leads to tensions; and the poor leadership capacity, at district and school level, to implement the appraisal system effectively and to manage its inherent dilemmas.

2.3.2.9 Second and Subsequent Year of Implementation

The second and subsequent implementations of the IQMS for a particular educator differ from the first implementation in the following ways:

Teachers will need to be evaluated by their DSGs only once per annum;

The summative evaluation at the end of the previous year becomes the baseline

evaluation for the next year. It is therefore necessary to do only the summative

evaluation at the end of each year (for performance measurement) and to

compare this with the summative evaluation of the previous year in order to

determine progress; and

Only new teachers entering the system for the first time will need to be evaluated

at the beginning of the year (ELRC, 2003:10).

The above suggests that the process of implementing Developmental Appraisal is a huge task involving many people, which means that it cannot easily be implemented without certain challenges being encountered. Some of the challenges that may be encountered during the implementation of developmental appraisal are therefore discussed in the next section of the chapter.

2.4.1 CHALLENGES IN THE IMPLEMENTATION OF DEVELOPMENTAL APPRAISAL

According to Declerq (2008:13), there is an assumption in the IQMS that a certain level of professional competence, openness and respect towards colleagues exists among school staff. Its form of internal peer appraisal assumes that most schools have a professional, collaborative climate and culture in which staff work and reflect together on how to improve teaching, but this is not the case. Declerq (2008:13) further argues that research done by Taylor and Vinjevold has shown how unsatisfactory the professional qualifications of many educators are, as well as their mastery of subject knowledge and pedagogical knowledge. Research by Taylor also points out that a poor culture of teaching and learning persists today in the majority of poorly functioning schools, and that South African learners' achievements are among the lowest in the world for comparable countries. According to Fleisch (2008:13), these poor results influence educators' values and attitudes, making them defensive towards any form of performance appraisal.

A related argument is advanced by Campbell (2003:357), who says there may be tension between collaborating with colleagues and performing effectively independently of colleagues, especially if teacher appraisal is focused exclusively on individual performance. Some important values, such as the commitment to enabling all pupils to realise their potential, are either not measurable or can only be measured over time scales that are too long for the purposes of appraisal.

SADTU (2002 & 2005) argues that it is unfair to hold educators accountable for effective curriculum implementation and for poor learner achievement. Both the difficult teaching conditions and recent policies, which are beyond educators' control, greatly influence learners' poor attitudes, low levels of interest and achievement. For these reasons, many educators resist this formal appraisal process (and specifically its classroom visits), which they see as unfair, inappropriate to their work circumstances, and more about accountability than development. In many low-functioning schools, the process is seen as a cumbersome, time-consuming and fruitless exercise, which does not bring any benefits and is therefore not taken seriously (Wadvalla, 2005).

Another problem is the lack of capacity for monitoring educators, which might develop with training, expertise and moderation. The system requires authoritative evaluators, capable of making data-informed professional judgements. They need to have an understanding of how to uphold and raise evaluation standards and criteria, work with techniques of observation, and develop effective diagnostic reports. Yet not many South African schools have experienced effective internal appraisers, although, according to Newman and Ringdon in Declerq (2008:14), the experience of effective internal appraisal is necessary for effective external accountability. The question remains whether the system can develop the capacity to produce knowledgeable, well-trained professional appraisers who have access to sufficient data to interpret the appraisal instrument effectively, reflect adequately on educators’ practices and areas for development, and compile meaningful Personal Growth Plans (PGPs).

There is also a problem in expecting appraisers to use one standardised instrument to

evaluate educators both for development and for reward or sanctions. An important

condition for effective developmental appraisal is that performance standards should be

contextual and negotiated with educators. The combination in one instrument of

monitoring for development and for performance management exacerbates the already

difficult power relationship between appraisees, school-based appraisers and district

appraisers. These parties have different interests and agendas in this evaluation, thus

threatening rigorous, reliable and valid evaluation (Declerq, 2008:14).

Shalem (2003:34) further argues that professional development programmes are

inadequate at providing meaningful opportunities for educators to learn. Most

programmes are top-down, departmental or NGO-driven, with little educator involvement

in their design and delivery. School support capacity rarely exists in low-functioning schools, a situation made worse by a tradition of poor collegiality and lack of respect among staff in many schools. The department or district support capacity is also stretched by the

new OBE system because the majority of provincial or district officials are themselves

not familiar with OBE and the competences, values and culture required to implement it.

According to Narsee (2006:15), this inadequate district school support is likely to remain

in the future because of a lack of human, social and organisational capital. Declerq

(2008:15) also argues that contestations are likely to arise about what educators, versus

appraisers, identify as appropriate professional development priorities and support

opportunities.

According to Metcalf (2008), the different professional development needs of educators, and the need to move them from where they are to where they have to be, require a multipronged approach to professional development. This approach recognises that support for curriculum and assessment policy implementation is not always what is most important for educators, who first need to acquire the basic subject and pedagogical content knowledge to be in control of what they teach.

Gardiner (2004:23) on the other hand argues that the interests of the education system are

given significantly more weight than the needs of educators. For example the Personal

Growth Plan of the educator is to be integrated into the School Improvement Plan, the

Whole School Evaluation Policy and the Provincial Strategic Plan. The Province will

then determine the availability of resources, which courses and workshops are offered

and what teachers themselves are supported in doing.

Another problem in the IQMS, according to Weber (2005:65), concerns the idea that the Department of Education has the responsibility of providing facilities and resources to support learning. It is, however, not clear what will be provided, how it will be provided, or who will monitor and evaluate the adequacy of the provision and the efficacy of human resource development. In schools there are also no common goals set by

leads to some staff feeling alienated and isolated (Humphreys and Thomas, 1995:135).

Bartlett (2000:31) has also identified the following challenges in the implementation of

developmental appraisal:

Lack of rigour in the whole process, which was shown by poor target setting and by the line manager not being the appraiser;

The process being too protected, not fitting into school training; and

Development plans and the two-yearly cycle being incongruous with

management planning.

According to Tomlison (2003:221), a link with pay also poses a potential problem in the implementation of Developmental Appraisal, as it undermines the professional development supported by teachers. This also has the potential to further exacerbate the tension that limits teachers’ confidence in appraisal. When judgements about the quality of teachers’ performance are used as a basis for competency procedures for some teachers, this appears to undermine the appraisal’s value for the vast majority in relation to professional development. He also argues that bureaucratic processes undermine and are insensitive to the professional development of educators.

Mercer (2006:24) further makes the point that appraisal procedures in the US encourage

safe rather than creative, flexible teaching, focusing on highly specific competencies at

the expense of a more holistic assessment of a teacher’s overall contribution to the

institution. Wragg in Mercer (2006:24), similarly, suggests that competency-based

appraisal in the UK ignores the need for imaginative and reflexive skills. Mercer (2006:24) further makes the point that the more teacher appraisal is concerned with managerial control, the more quality will be defined in terms of minimal competencies, that is, the most generic of technical skills.

The above discussion means that certain measures need to be taken in order to correct or deal with the challenges in the implementation of Developmental Appraisal. Effective strategies for implementing Developmental Appraisal are therefore discussed below.

2.4.2 EFFECTIVE STRATEGIES FOR IMPLEMENTING DEVELOPMENTAL

APPRAISAL

According to Humphreys and Thomas (1995:138) Developmental Appraisal can be

effectively implemented when the following strategies are in place:

The involvement of several trusted colleagues in discussing an aspect of their professional development allows each teacher to draw on collective wisdom rather than merely depending on self-appraisal of their work or on one appraiser, thus creating a context which enables teachers to collaborate in a focussed manner to enhance each individual’s efforts at development;

The involvement of management in all sessions ensures a relationship between an individual teacher’s professional development needs and school development planning. Teachers’ in-service training needs then arise naturally from the project rather than being prescribed in isolation by management;

The negotiated approach avoids the extremes between models of appraisal that allow teachers to choose their own focuses for appraisal and those in which the area of focus is prescribed by the head of department. This appears to allow for both the individual’s professional development and school development; and

The involvement of an outsider allows an unbiased view not clouded by previous

experience of the school.

Myland (1992:14) expounds further on this when he makes the point that performance appraisal should be seen by those involved as an opportunity for honest discussion. In this discussion, the appraiser does not judge but promotes agreement about how current performance matches up to requirements and how the future might be approached. The record of the appraisal interview should in fact be an agreed summary of the discussion, from which a description of current performance and a future action plan must be discernible.

Toch (2008:34) believes that using evaluators with backgrounds in candidates’ subjects

and grade levels strengthens the quality of appraisal.

The review of Developmental Appraisal in Britain by Bartlett (2000:31) proposed that

appraisal should be integrated with the other management processes and information

systems directed at school improvement. The appraisal process should address how well

teachers were performing and what would be needed to assist their future professional

development. Thus it should encourage, recognise and value good work whilst addressing

any weaknesses with suggestions for future action. Appraisal should be grounded in the

regular monitoring and improvement of teachers’ effectiveness in the classroom.

Transparency was to replace confidentiality. Roles should be clear, with performance

standards and success criteria stated. Targets should require teachers to focus sharply on their effectiveness in the classroom, taking account of findings from other key performance indicators in the developmental appraisal evaluation report.


Bartlett (2000:32) also suggests that the structured discussion with the individual line manager should be conducted on an annual basis as part of the developmental appraisal of educators. Declerq (2008:16) further makes the point that a teacher appraisal system

should be based on valid or realistic assumptions about the specific teaching realities and

the available professional appraisal and support capacity in the system. It should engage

with the way teachers and departmental officials perceive teachers’ work and

responsibilities and strive towards reaching some basic consensus.

Monyatsi (2006:225) similarly makes the point that for developmental appraisal to reap

benefits the school should have an effective training strategy and mount professional

programmes where purposes and procedures are fully shared and explained to all

participants. The clarity of purpose plays a crucial role in making the process more

effective, especially if it is carried out formally. He also emphasises the importance of

transparency, trust, and honesty in the implementation of developmental appraisal at

schools. This should take place through professional collaboration between the appraisee

and appraiser, with regular meetings between the parties concerned to negotiate appraisal

purposes and outcomes. From these meetings, realistic targets that are within the job

description of the teacher should be discussed and agreed upon at each stage of the

appraisal process. It is also important that the teacher’s developmental appraisal process

should recognise teachers as full partners in the process, not as raw objects to be

developed by senior staff and experts (Monyatsi, 2006:226). The system should also be flexible, designed by each school to meet its own needs (Barnard, 1999:1).

The above discussion suggests that for the appraisal to be effective, certain measures need to be put in place. It also means that contextual factors need to be taken into consideration when developmental appraisal is implemented.

2.5 CONCLUSION

This chapter began with the definition of a literature review and the purposes of reviewing literature for the study. The definition of personnel or staff appraisal and how it is implemented was also discussed to provide a broader understanding of appraisal before discussing Developmental Appraisal (DA) as it is implemented in schools. The roles and responsibilities of officials tasked with the implementation of DA, the processes involved in the implementation, and the challenges in the implementation of Developmental Appraisal were discussed before effective strategies for implementing developmental appraisal were suggested. The next chapter will describe the methodology used for data collection, analysis and interpretation.


CHAPTER THREE

RESEARCH METHODOLOGY

3.1 INTRODUCTION

In the preceding chapter on the literature review, the researcher examined literature on the implementation of Developmental Appraisal to narrow the topic into researchable questions. The literature review has provided the researcher with a critical and synthesised understanding of the topic of the study.

This chapter provides a detailed description of the research methodology that will be followed to generate empirical evidence to answer the research questions in this study. The chapter will specifically describe the research design, data collection methods, sampling, data analysis, trustworthiness and ethical considerations.

The research design is therefore outlined in the next section of this chapter.

3.2 RESEARCH DESIGN

Bogdan and Biklen (2007:54), Ary, Jacobs, Razavieh and Sorensen (2006:470) and Ary, Jacobs and Razavieh (2002:426) describe a research design as the researcher’s plan of how to proceed to gain an understanding of some group or some phenomenon in its natural setting. In agreement with this description, McMillan and Schumacher (2001:172) say that a research design describes how the study will be conducted; it summarises the procedures for conducting the study, including when, and under what conditions, the data will be collected.

The purpose of research design is presented by McMillan and Schumacher (2001:377) as

to

describe and explore;

describe and explain;

add to the literature;


build a rich description of complex situations; and

give directions for future research.

The nature of this research is descriptive and explanatory. The suitable research design for this study is therefore a qualitative research design. Qualitative research design is described by McMillan and Schumacher (2001:395) as an inquiry in which researchers collect data in face-to-face situations by interacting with selected persons in their setting. Gay, Mills and Airasian (2006:399), on the other hand, describe qualitative research as the collection, analysis and interpretation of comprehensive narrative and visual data in order to gain insights into a particular phenomenon of interest. Qualitative research is preferred in this study because it describes and analyses people’s individual and collective social actions, beliefs, thoughts and perceptions (Ary et al., 2002:25). It is also concerned with understanding social phenomena from the participants’ perspective. Understanding is acquired by analysing the many contexts of the participants’ meanings for these situations and events (McMillan and Schumacher, 2001:396).

The qualitative research design is also preferred in the study because of the following

characteristics:

It shows concern for context;

It studies real-world behaviour as it occurs naturally;

The human investigator is the primary instrument for gathering and analysing

data; and

It deals with data that are in the form of words rather than numbers and statistics.

Focus group interviews will be used as the qualitative data collection tool in this study. In the next section of this chapter, the basis for using focus group interviews is discussed in detail.


3.3 DATA COLLECTION METHODS

According to Gay, Mills and Airasian (2006:413) data collection in qualitative studies,

involves spending time in the setting under study, immersing oneself in this setting, and

collecting as much relevant information as possible.

Observations, interviews, phone calls, personal and official documents, photographs,

recordings, journals, e-mail messages and responses, and informal conversations are all

sources of qualitative data (Gay et al., 2006:413). In this study, focus group interviews will be employed as the data collection technique. The importance of using this technique is

expounded below.

3.3.1 Focus Group Interviews

Johnson and Christensen (2004:185) describe a focus group interview as a type of

interview in which a researcher leads discussion with a small group of individuals to

examine, in detail, how the group members think and feel about a topic. The focus group

interviews are much more flexible and open in form. The respondents are free to answer

in their own words and can answer either briefly or at length (Ary et al., 2006:482).

Bogdan and Biklen (2007:69) on the other hand maintain that focus group interviews are

structured to foster talk among the participants about particular issues.

The focus group interviews also have the following advantages:

They are more economical than conducting numerous individual interviews;

Group dynamics are a synergistic factor in bringing information out;
Participants have more confidence in expressing their feelings honestly within a support group of peers than in individual interviews (Schulze, Myburg and Poggenpoel, 2005:69);

They are helpful because they bring several different perspectives into contact;

and

The researcher gains insight into how participants are thinking and why they are

thinking as they do (Ary et al., 2006:482).


Ary et al. (2006:482), Johnson and Christensen (2004:185) and Ary et al. (2002:434) all

agree that the number of participants in the focus group should be six to twelve people.

They contend that the group should be small enough that everyone can take part in the

discussion, but large enough to provide diversity in perspective. In addition, Johnson and

Christensen (2004:185) maintain that the conduct of two to four focus groups as part of a

single research study is quite common because it is unwise to rely on the information

provided by a single focus group.

In this study therefore, the researcher will establish six sets of focus groups for

interviews. There will be a separate focus group for each selected school in the area.

Each focus group will therefore consist of one principal, one head of department and

three educators.

Unstructured focus group interviews will be conducted in this study. Gay et al.

(2006:419) describe unstructured focus group interviews as little more than a casual

conversation that allows the qualitative researcher to inquire into something that has

presented itself as an opportunity to learn about what is going on at the research setting.

In addition to facilitating the focus group interviews in this study, the researcher will tape-record the proceedings and make field notes with the express permission of the participants. The primary data of qualitative interviews are verbatim accounts of what transpired in the interview session. Tape-recording the interviews ensures completeness of the interaction and provides material for reliability checks, while taking notes helps with the subsequent data analysis. Neither note-taking nor tape-recording will interfere with the researcher’s full attention on the person during the focus group interview sessions, as emphasised by McMillan and Schumacher (2001:355).

The interview schedule will be discussed in the next section.


3.3.2 The Interview Schedule

According to Bogdan and Biklen (2003:70), interview schedules are used primarily to gather comparable data across sites. Interview schedules also allow for open-ended responses and are flexible enough for the observer to note and collect data on unexpected dimensions of the topic (Bogdan and Biklen, 2003:71). Therefore, for the purpose of this study, an interview schedule for principals, heads of departments and educators will be carefully constructed, with the intention of permitting more latitude than in a structured interview. The interview schedule for the different respondents is presented in Appendix A.

Another important aspect of a research design is sampling, which will be discussed in the

next section.

3.4 SAMPLING

The purposeful sampling method will be used in this study. According to McMillan and

Schumacher (2001:433), purposeful sampling is a strategy to choose small groups or

individuals likely to be knowledgeable and informative about the phenomenon of interest. In

agreement with this, Ary et al. (2002:428) contend that purposeful sampling is sufficient

to provide maximum insight and understanding of what the researcher is studying. The

researcher uses his or her experience and knowledge to select a sample of participants

that he or she believes can provide the relevant information about the topic or setting. In

purposeful sampling (judgemental sampling), the researcher specifies the characteristics

of a population of interest and tries to locate individuals who have these characteristics.

Once the group is located, the researcher asks those who meet the inclusion criteria to

participate in the research study (Johnson and Christensen, 2004:215).

3.4.1 Site Selection

In this study the researcher will select six secondary schools from the thirty-seven secondary schools in the Matlosana Area Project Office. The schools will be selected on the basis of their location, that is, whether a school is a township or a town school. The schools are selected this way because township schools and town schools do not have the same resources for educator development: while the town schools have more resources for educator development, the township schools have little or none. Therefore three schools will be selected from the townships in the Area and three from the towns in the Area.

3.4.2 Participants Selection

The following participants will be selected for the following reasons:

Six secondary school principals. In addition to their position as ex officio

members of the Staff Development Team (SDT), the researcher selected

principals because according to ELRC, Integrated Quality Management

document (2003:2), the principal has the overall responsibility to ensure that

the IQMS, of which Developmental Appraisal is part, is implemented

uniformly and effectively at the school;

Six heads of departments (HODs). Heads of departments are selected

because according to the ELRC, IQMS document (2003:3) they should

assist with the broad planning and implementation of IQMS; and

Eighteen educators. Three educators per school will be selected because

according to the ELRC, IQMS document (2003:2), educators have to attend

development programmes in terms of areas identified for development

during appraisal of their performance.

The researcher will visit the selected schools after permission is granted by the Department of Basic Education. At the schools, the researcher will explain the purpose of conducting the study and assure the educators that the findings of the study will be kept confidential. With the help of each selected school’s principal, those educators who are willing to participate will be selected.

In the next section of the chapter, the analysis of data will be presented.

3.5 DATA ANALYSIS

In this study data will be collected by means of focus group interviews from the purposefully selected participants and sites. The interviews will be tape-recorded and

followed by verbatim transcription. According to Bogdan and Biklen (2003:147), data analysis is a process of systematically searching and arranging the interview transcripts, field notes, and other materials that are accumulated in order to come up with findings. McMillan and Schumacher (2001:461) describe qualitative data analysis as an inductive process of organising the data into categories. Ary et al. (2006:490) on the other hand believe that

data analysis is the most complex and mysterious phase of qualitative research.

According to them it involves reducing and organising the data, synthesising, searching

for significant patterns and discovering what is important. The researcher organises what

he or she has seen, heard and read and tries to make sense of it in order to create

explanations, develop theories or pose new questions (Ary et al., 2006:490).

Schulze et al. (2005:15) believe that the aim of analysing and interpreting research data is

to increase the validity of research by ensuring that errors and inaccuracies are

eliminated.

Creswell (1998:148) presented five stages in data analysis as follows:

Creating and organising data;

Managing data by generating categories, themes and patterns;

Reading and memoing to form codes;

Describing, classifying and interpreting data; and

Representing and visualising (writing the report).

The researcher will employ Tesch’s approach to data analysis, as captured by Schulze et al. (2005:15), to

get a sense of the whole by reading through all transcripts. Jot down ideas as they

come to mind;

write down thoughts about the meaning of each piece of information in the

margin;

make a list of all topics. Cluster similar topics together;


take the list and return to the data. Abbreviate topics by means of codes next to

each segment of data in the transcribed interview. See if new categories and

codes emerge;

form categories by grouping topics together. Determine relationships between

categories;

make a final decision on the abbreviation of categories and codes. Alphabetise

codes;

assemble all the data for each category in one place; and

recode existing data if necessary.

3.6 TRUSTWORTHINESS

The trustworthiness of qualitative research is based on rigour to assure the required

reliability and validity (Tobin & Begley, 2004:389-390; Twycross & Shields,

2005:36). Rigour refers to integrity and competence in conducting the empirical research

by adhering to detail and accuracy to ensure the authenticity and trustworthiness of data,

findings and conclusions.

Trustworthiness, according to De Vos, Strydom, Fouche and Delport (2005:349), reveals

the truth value, applicability, consistency and neutrality of the research results as

discussed below:

3.6.1 Truth value

Truth value, according to De Vos et al. (2005:350), is the confidence that the researcher

has in the truth of the findings, based on the research design, the informants and context

in which the research was conducted. The truth value of this study will be obtained from

principals, HODs and educators in the Matlosana Area Project Office schools, who

willingly will participate and associate themselves with the aims of this study.

3.6.2 Applicability

De Vos et al. (2005:349), define the applicability of research findings as the degree to

which the findings of the study can be applied to other contexts and situations. Although the focus of this study is on schools in the Matlosana Area Project Office,

there are some generic aspects that are applicable to other school situations within the

education system because no school is an isolated entity in itself.

3.6.3 Consistency

Consistency refers to the reliability of research results if the research is conducted again

with the same participants or in a similar context (De Vos et al., 2005:350). Consistency

will be established by means of an “audit trail” in the form of field notes and tape recordings

that any external reviewer can access.

3.6.4 Neutrality

Neutrality of the study results refers to freedom from bias in the research procedures and

results. Neutrality will be ensured by the researcher who will be cautious and deliberately

not indicate or show any bias towards any aspect that will be part of the empirical section

of this study (De Vos et al., 2005:350).

The above-stated aspects related to trustworthiness of the research will be addressed by

the researcher in the following ways (De Vos et al., 2005:359):

Confidentiality will be assured; and

Questions will be asked in simple understandable language.

3.7 ETHICAL CONSIDERATIONS

Ary et al. (2002:50) maintain that when a qualitative researcher employs human beings as

subjects in research, extreme care must be taken to avoid any harm to them. This means

that the researcher has an obligation to respect the rights, needs, values, and desires of the

informants (Creswell, 1998:165).

In this study the researcher will observe the following research ethics:

Written permission will be asked from the Matlosana Area Project Office

manager, principals and school governing bodies of the selected secondary

schools to proceed with the study;


The researcher will approach the participants in person and communicate the

aims of the study to participants and assure them of confidentiality and

anonymity;

The participants will be informed of data collection devices and activities.

Permission will be asked to audiotape interviews so that the researcher can obtain

accurate information; and

The researcher will observe the rights of participants to remain anonymous and to have their information treated as confidential. These rights will be respected, specifically when no clear understanding to the contrary has been reached. In compliance with this consideration, the researcher will employ stringent measures to conceal the names of participants and those of the institutions to which they are attached. Instead, pseudonyms or code names will be used (Ary et al., 2002:504).

3.8 CONCLUSION

In this chapter the research methodology has been discussed in detail. The following aspects of the research method were explained: research design; data collection methods; sampling; data analysis; trustworthiness and ethical considerations. The next chapter will deal with data collection and analysis.


CHAPTER FOUR

DATA ANALYSIS

4.1 INTRODUCTION

The preceding chapter dealt with the research methodology and design that was used to

collect data in this study. This chapter deals with the analysis and interpretation of data

collected through the use of focus group interviews. This chapter also specifically deals with the empirical findings that answer the main research question of this study. The main question of the study is how developmental appraisal within the IQMS is being implemented in the Matlosana Area Project Office schools.

4.2 BRIEF SYNOPSIS OF METHODOLOGY

The researcher collected data by means of focus group interviews in this study. Six focus

groups, from the six purposefully selected secondary schools in the Matlosana APO were

established. Each focus group comprised one principal, one head of department (HOD)

and three educators. The identity of the participants remained withheld throughout the

study for the purposes of confidentiality and anonymity. Instead, code names were used,

as captured in section 3.7 of this study.

The researcher used an interview schedule which served as a useful guide during the focus group interviews (cf. Annexure A). All focus group interviews were conducted face-to-face. The average interview took 20 minutes, with a range from 15 minutes to 30 minutes.

A voice recorder was used to record the interviews for later use when doing the analysis

of the findings.

The researcher believes that the information collected represents what participants

experience in reality when they implement developmental appraisal in their respective

secondary schools. This is the main objective of this study: to understand the

effectiveness of developmental appraisal implementation from the educators’ own

perspectives.


In the next section of the study, the researcher focuses on the data analysis process.

4.3 DATA ANALYSIS PROCESS

Data collected by means of questionnaires, interviews, diaries or any other method mean

very little until they are analysed and evaluated. Raw data taken from questionnaires,

interview schedules, check lists and so on, need to be recorded, analysed and interpreted

(Bell, 2005:201). In this study data were collected through focus group interviews; a voice recorder was used so that the interviews could later be transcribed verbatim and the findings analysed. The information from the interviews would have meant little if it were not interpreted and analysed (Bell, 2005:201).

According to Hittleman and Simon (2002:182), categorising and interpreting involve looking for similarities and differences, groupings, patterns and items of particular significance.

This involves examining and organising notes from interviews and reducing the

information into smaller segments from which patterns and trends can be seen. In this

study the findings of the research were categorised using questions in the interview

schedule as the main categories. The findings were further sub-divided into sub-

categories according to the participants’ responses.

Qualitative researchers begin their analysis while still in the research setting and finish it

after all data have been collected (Hittleman and Simon, 2002:182). In this study notes were taken while interviewing the participants, for later analysis and interpretation.

interpretation. The voice recordings were also transcribed immediately after the interview

while still fresh in the researcher’s mind. The transcripts were read thoroughly to gain a better comprehension of what was said before the analysis was done.

4.4 RESEARCH FINDINGS

The aim of conducting interviews was to examine teachers’ experiences regarding the

implementation of developmental appraisal within the IQMS in their respective

secondary schools. The researcher, in other words, wanted to know what teachers see as

taking place in the participating schools when implementing developmental appraisal.


With regard to data analysis, the findings were organised into categories and sub-categories based on the themes derived from the interview schedule. The following is the layout derived from these categories:

The processes of Developmental Appraisal within IQMS;

The roles of Staff Development Team in implementing DA;

The role of Area Project Office officials in implementing DA;

The effectiveness of SDTs and APO officials in implementing DA; and

Challenges on the implementation of DA.

The findings discussed were presented under each subheading in subsections that were

aligned to the relevant subcategories that emerged from the focus group interview data.

4.4.1 The processes of Developmental Appraisal within the IQMS

During data collection it was discovered that the participants have knowledge of some of the processes of developmental appraisal. The ideas expressed during the interviews in the selected secondary schools about what the processes of developmental appraisal entail were similar to those captured in the review of related literature in this study, although some processes were not known.

The following processes were generated from the participants’ responses during the interviews: planning; self-evaluation; pre-

evaluation discussions; lesson observation; feedback and discussion; resolution of

differences; and monitoring. The processes are presented in detail below:

4.4.1.1 Planning

The majority of participants believe that planning plays a major role in the effective

implementation of developmental appraisal within the IQMS. At the beginning of the year, where no SDT is in existence, the staff have to elect one. This is confirmed

by Principal C when stating that: “The idea is also to elect from within the staff that is the

principal and elected members by staff to form the development team”. By development


team the participant means the Staff Development Team (SDT). Planning for the IQMS must take into account all the processes and time frames, the individuals involved and the school plan. Teacher E1 attests to this when explaining: “We have to

prepare first within the school how we are going to plan for that process of IQMS”. In

support of this HOD E responded: “The first step is where the SDT has to draw up their

plan from the information they gathered from the DSGs, the DSGs inform SDT when they

are going to sit or visit a person who is the appraisee”. The Staff Development Teams

also have to familiarise themselves with the IQMS policy and workshop other educators, especially the new ones. On this idea Teacher D1 said: “What we do is that we start off

with checking the policy, we start with that”. The plan for the year must be drawn up by the SDT together with the School Management Team (SMT). In its planning, the SDT also has to draw up a management plan that takes the school plan into account.

This was confirmed by the response of HOD C: “SDT sit down and organise a

programme for the whole year to indicate the dates for the visits”. To elaborate further

on this HOD F said: “I think the first one is the school management plan, which we apply

on all activities that must be followed in terms of implementation process”. The

management plan must also indicate when the time table for the class visits will be

drawn up. This point was made by Teacher F1 when stating that: “Before IQMS starts, the

SDT should draw the time table so that everyone should know when to go to class”.

The above are in line with what was discovered in the literature review (c.f.2.3.2.1).

4.4.1.2 Lesson Observation

Data collected revealed that the majority of participants understand the important role the

class observation plays during the implementation of developmental appraisal. This fact is confirmed by the responses of the following respondents: Principal A said: “The two

teachers come to your class, look at your file, listen to your class as how you give class

and then they assess you”. To elaborate more on this, Teacher C1 said: “From there the teacher perform. The teacher with expertise or the teacher who is knowledgeable on the subject fills in his or her space”. Teacher D1 further corroborated this when stating that: “Then we go to class visit with the teacher and give the lesson”. HOD E further


explains this when responding by saying: “The DSG inform the SDT when are they going

to sit or visit a person who is the appraisee”. In conclusion, Principal F responded:

“When we meet the teacher in the classroom, the teacher must come with the peer”.

The above responses are in line with what was discovered in the literature review

(c.f.2.3.2.4).

4.4.1.3 Feedback and Discussion

The majority of participants were aware of the need to meet after lesson observation to

discuss issues related to the lesson observation. This fact is confirmed by a number of

responses of the participants. Principal A responded: “Then we have a meeting where we

discuss the marks you gave yourself and what the other two people gave you, so we

decide on the marks and then you have to fill in a form where you comment on the things

that bother you and on things why you gave yourself a three or two or whatever”. This is

supported by the response of Teacher B1: “After observing the lesson we call that

teacher, who was visited, and we discuss everything with that particular teacher, the

scores given to the teacher”. A related response was given by Teacher C1: “Thereafter

we meet again and collude the scores. We focus on those areas that we identified for that

particular teacher”. Principal F further elaborated on this when stating that: “The DSG

and the peer together with the teacher discuss all the findings and appropriate forms are

filled”. This was supported by HOD E: “After visiting the DSG and the educator sit down

in a meeting where you would discuss what they discovered, our weaknesses and

strengths, then they will add that into the Personal Growth Plan”.

The above views are supported by literature review (c.f.2.3.2.6).

4.4.1.4 Resolution of Differences

It was also discovered that sometimes differences arise during the evaluation of an educator in developmental appraisal. This was indicated by Teacher B1 when declaring: “Let me also

mention that if a teacher is not satisfied with the marks allocated to him or her that

matter will be taken to the principal or somebody who is neutral”. In corroborating this


fact Principal B explained: “There is a process for a teacher who is not happy with the

appraisal process, it may be taken up with the SMT and the management so that we re-

evaluate”.

This view is in contrast to what the literature review indicated (c.f.2.3.2.7). According to

the ELRC (2003), most differences between the educator and the DSG should be resolved

at that level. Where agreement cannot be reached, the matter must be referred to the SDT, which is an elected body with the principal as an ex officio member, and not to the SMT as suggested by the participant.

The following processes were not known to the majority of participating schools: self-

evaluation; pre-evaluation discussions; and monitoring. This is indicated by the fact that

each one of these processes was mentioned by only one respondent across all the focus

group interviews.

4.4.1.5 Self-evaluation

Self-evaluation also plays an important role as far as the processes of developmental

appraisal are concerned. Principal A indicated this when stating: “We put all our information in a file, you assess yourself and then the other two teachers come to

your class”. This process of developmental appraisal is also indicated in the literature

review (c.f.2.3.2.2).

4.4.1.6 Pre-evaluation Discussions

This important process was also mentioned by only one respondent during the interviews.

Principal F responded: “When we meet the teacher in the classroom, the teacher must

come with the peer, they must have the meeting beforehand to discuss what is expected of

her in class”. The finding of the literature review has also revealed pre-evaluation

discussion as one of the processes of developmental appraisal (c.f.2.3.2.3).


4.4.1.7 Monitoring

The other process mentioned by only one respondent is monitoring. This process was

mentioned by HOD E when saying: “Then later they request the SDT and yourself to do

follow up on those weaknesses identified”. This, in the context of developmental appraisal,

means monitoring by the SDT to ensure that educators are developed on the identified

weaknesses.

The review of the literature has suggested monitoring as one of the important processes

of developmental appraisal (c.f.2.3.2.8).

There were other processes of developmental appraisal which were never mentioned by

respondents in any of the focus group interviews. According to the ELRC (2003:6), there are nine processes that must be followed when implementing developmental appraisal. During the interviews only seven were mentioned; the other two were not, which suggests that they were not known (c.f.2.3.2.5 and c.f.2.3.2.9).

4.4.2 The Roles of the Staff Development Team (SDT) in implementing DA

During data collection, it was realised that the majority of participants had a fair

understanding of what constitutes the roles of the Staff Development Team (SDT) with

regard to the implementation of developmental appraisal. From their respective focus

group interviews, the participants expressed similar views in most cases concerning

what they perceived as the roles of the SDT in implementing developmental appraisal.

The majority of the views expressed are also supported by the findings of the literature

review of this study (c.f.2.3.1.4).

The interviews revealed that in implementing developmental appraisal, the SDT has

various roles to fulfil which, among others, include the following: prepares and monitors

management plan; develops school improvement plan; submits all necessary documents

to APO; ensures that all members are trained; coordinates all activities of staff

development; facilitates and gives guidance on DSG establishment; links developmental


appraisal to the school improvement plan; and oversees mentoring and support by the DSG.

These roles were generated from the responses of the participants during the interviews

on their view regarding the roles of the SDT in the implementation of developmental

appraisal within the IQMS and are discussed in detail below:

4.4.2.1 Prepares and monitors management plan

The majority of the participants showed a fair understanding of this SDT role of preparing and monitoring the management plan. The responses from the

participants during the interviews confirmed this. Principal B revealed: “The SDT

obviously put the whole process in motion by working according to a specific plan”. In

support of this Principal C declared: “SDT’s most important role is to draw up the

programme of action of the IQMS for the whole school”. Teacher D further emphasised

this when saying: “We start off by initiating the programme”.

By programme the participants are referring to the management plan. To conclude, HOD A responded: “They draw a plan they take our files because each teacher has

his or her file”.

4.4.2.2 Develops school improvement plan

The interviews also revealed that some participants understand the role of developing the

school improvement plan by the SDT in the implementation of developmental appraisal.

This is indicated by the responses of the following interviewees: Principal B said: “We

look at the PGP and the school improvement plan is then drawn from the PGP”. In

support of this Principal F stated: “I think first is to draw up SIP (the School

Improvement Plan), then the school improvement plan is informed by the PGP for the

teacher are given PGPs for personal growth plan”.

The investigation conducted through the interviews revealed that not all the roles of the

SDT are known to the participants from the selected schools. This is indicated by the small number of respondents who mentioned each role and by the fact that not all the roles of the


SDT were mentioned during the interviews (c.f.2.3.1.4). The following roles of the SDT

were not known to the majority of the participants: submits all necessary documents to the APO; ensures that all members are trained; coordinates all activities of staff development; facilitates and gives guidance on DSG establishment; links developmental appraisal to the school improvement plan; and oversees mentoring and support by the DSG. This is evidenced by the fact that only one or two participants could mention each of these roles of the SDT.

4.4.2.3 Submits all necessary documents to the APO

This role of the SDT was mentioned by only one participant during the interview.

Teacher B said: “It is ongoing on different interval, making sure that required documents

until the final submission”. What the participant was trying to indicate was that the

process of DA is ongoing but at the end of the year there are documents that have to be

submitted by the SDT to the APO.

4.4.2.4 Ensures that all members are trained

The participants expressed the view that the role of the SDT is to ensure that all members

of staff are trained after the processes of developmental appraisal have been

implemented. This was evidenced by the response of HOD A: “To help develop the

young teachers or with difficulties, to help assess them to overcome those difficulties”.

This point was also emphasised by HOD E, when stating: “They usually organise

meetings where we as staff discuss challenges”.

4.4.2.5 Facilitates and gives guidance on DSG establishment

The interviews also revealed that some participants were aware of the role of the SDT,

which is to facilitate and to give guidance on the establishment of the DSG. This fact was

confirmed by the response of Principal C, when stating that: “And also lead to the

formation of DSG as well”. The principal here was explaining that after the election of

the SDT, it then facilitates the establishment of DSGs.


Some of the SDT roles captured in the literature review (c.f.2.3.1.4) were never mentioned by the participants. This suggests that those roles were not known to the participants, since in all the focus group interviews the respondents recalled the roles discussed above but never these ones.

4.4.3 The Roles of the Area Project Office (APO) officials in the implementation of

DA

During data collection, it was discovered that little was known by participants regarding the roles the APO officials play in implementing developmental appraisal. This fact was indicated by the responses of the participants: only a few of them could mention functions of the APO officials when implementing developmental appraisal. The three roles mentioned are monitoring, control of the IQMS, and educator development. These are discussed in detail below:

4.4.3.1 Monitoring

This important function of the APO officials, which ensures that the IQMS is implemented consistently, was mentioned by only two participants. HOD A mentioned: “They always

come to do monitoring”. In corroboration with this Teacher C1 declared: “They normally

visit us to come and check the process”.

4.4.3.2 Control

With regard to the role of APO officials of controlling the implementation of

developmental appraisal, some educators indicated that they have an understanding of

what this role entails. The responses of the following interviewees confirmed this: HOD A

responded: “We communicate with department to report on the progress we made on

IQMS and when forms are to be completed”. This suggests that it is the APO officials

who control the progress and give dates for the submission of required documents. In

support of this Teacher C2 said: “They normally sent us circulars, which indicate the

dates or which we should start with the implementation”. Principal D, on the other hand

mentioned that: “They make sure we have forms and booklets, we have criteria that we

got from department and circuit managers tell us what is expected of us”.


4.4.3.3 Educator Development

Only a few participants seemed to understand the role of APO officials with regard to educator development. This was revealed by the responses of only two participants during

the interviews. Principal C stated: “Learning Area Specialists will then assist by

workshops on those areas that had been identified”. In support to this Principal F

declared: “Teachers go to workshops, subject advisors come to school and during the

meetings the HODs discover those areas needing development and subject advisors are

asked to help the teachers”.

In contrast to what the empirical investigation indicated, the literature study on

developmental appraisal showed that the roles of the APO officials are more numerous than those discovered during the interviews (c.f.2.3.1.6).

4.4.4 Perceptions regarding the effectiveness of the SDT and the APO officials in

implementing DA

4.4.4.1 The effectiveness of SDTs

The findings of the empirical investigations revealed that the majority of the participants

are of the opinion that the SDTs in their respective schools are implementing

developmental appraisal effectively. This assumption is confirmed by the following

responses: Teacher B2 claimed: “We have excellent help you can come to them, they are

always there to help you with problems, whether it is a discipline problem or a problem

with a fellow staff member, they are always there to help you”. Here the teacher is in fact

referring to the School Management Team and not the SDT as the SDT deals only with

educator development and not discipline. Principal B when elaborating on effectiveness

of the SDT indicated that: “They make teachers better teachers at the end of the day”.

HOD C further explained this when saying: “The SDT assists HODs to make sure that

each teacher go to class and prepares”. Principal C further stated that: “Weaknesses are

identified and how you are going to rectify those weaknesses”. Here what is meant is that

the SDT helps the teachers to improve on the weaknesses identified during

developmental appraisal. HOD E on the other hand suggested: “I can say is effective

because when the departmental officials come here there is evidence of our files in the


office of the SDT, every year we submit the results of our performance”. To add to this

Principal F said: “We never experience grievances everything goes well every year. We

also have a master file to be scrutinised by anybody from outside”.

Only one participant believed that the SDT of their school is not effective. The response

of Teacher C1 was that: “It is not that much effective in the sense that sometimes follow

ups are not made, probably because of exorbitant amount of work that is in front of the

teachers”. By follow-ups the participant means the follow-up on the identified

weaknesses of educators needing development.

4.4.4.2 The Effectiveness of APO officials

It was discovered that some participants believe that the APO officials are not implementing developmental appraisal effectively, whereas others believe that they are. There were also some who did not want to say whether the officials are effective or not, perhaps because they do not really understand the role that the APO officials should play in implementing DA.

To indicate the perception that the APO officials are not effective, Teacher B3 said: “I

don’t know whether they are working undercover, you hardly see them”. In corroboration of this view Teacher B2 declared: “Because they don’t do IQMS, but PMDS, that is why they don’t know what is going on at schools”. Lack of training by the APO officials was also one of the things the participants complained about. Principal B complained: “I

have had no training. As to the teachers they have no contact with any official who is

involved in appraisal. The only visit is by circuit manager”. Teacher B3 stressed this

when complaining that: “There is no communication we are not getting support from

them”. HOD F is also of the opinion that they are not effective in the management of school development, stating that: “I will also make an assumption that they are not effective,

particularly when it comes to management of school development. For example the

surroundings, they normally don’t check what is said and they classify school quintile

wrong”. Here the respondent was referring to checking the area where the school is

situated so that the school can be classified correctly for funding allocations.


There were participants who, contrary to what was said above, believe that the

APO officials are effective. For example Teacher A1 declared: “We always get

documents from them”. Teacher C1 on the other hand responded: “They normally call us

to assemble where we are going to tackle problems. They come up with programmes to

develop us”.

4.4.5 Challenges on the implementation of DA

The other objective of this study was to look at the challenges that face the educators

when implementing developmental appraisal in schools. From the focus group interviews

the following emerged as challenges that hamper proper implementation of

developmental appraisal in most schools that participated in the study:

Inadequate support from the APO;

Lack of resources for educator development;

Inadequate time frame for implementation;

Disruption of normal teaching and learning;

Lack of honesty on the part of appraisees and appraisers; and

Conflict

The above challenges are discussed in detail below:

4.4.5.1 Inadequate support from the APO

Data collection revealed that inadequate support from the APO is one of the challenges

that educators are facing after being appraised. In confirmation of this view the

participants responded as follows: Teacher B2: “When there is a meeting they don’t show

up”. In support of this HOD A stated: “There are some of the APO officials that are not

involved with the subject, some subject advisors see their teachers more than the others,

they make excuses why they don’t see us”.

This challenge was also detected in the review of the related literature (c.f.2.4.1).


4.4.5.2 Lack of resources for educator development

The second challenge discovered during the empirical investigation of this study was a lack of

resources for educator development. This challenge was raised by Principal A by

responding as follows: “The other issue is the human resources to make sure that all that

is needed to develop teachers is actually there, more especially financial resources”.

The investigation of the related literature has also revealed a lack of resources as a problem

in implementing developmental appraisal (c.f.2.4.1).

4.4.5.3 Inadequate time frames for implementation

During the focus group interviews the researcher discovered that the time allocated to

implement developmental appraisal at some schools is not enough. This is indicated by the

response of HOD E: “The first one is the issue of time, it is difficult to intertwine this and

our normal practice here at school, and sometimes you find that this is difficult for the

DSGs to meet. The issue of time is a challenge”.

This challenge has also been discovered during the literature review of this study

(c.f.2.4.1).

4.4.5.4 Disruption of normal teaching and learning

The empirical investigations also indicated that normal teaching and learning are

disrupted when educators are visited in class by two members of the DSG. This was

revealed by Teacher D1: “Class visit that we have done is often at the same time and

when we have to visit the teacher we have to leave our classes. Two teachers sit in the

class to observe what is going on”. In addition to this HOD F also mentioned the issue of

examinations that are affected by these class visits when stating: “The main concern I can

raise is that normally we start late and sometimes a cycle coincide with examinations and

I see that as a major challenge”.


4.4.5.5 Lack of honesty on the part of appraisee and appraiser

While collecting information by means of interviews, the researcher also realised that the issue of honesty is a challenge. The respondents believe that educators are not

honest when scoring themselves or their colleagues during appraisal of their performance

to diagnose weaknesses. Principal B justified this when responding: “Teachers over rate

themselves”. In corroborating this Teacher B3 stated: “The whole thing is on

transparency in scoring, we score ourselves very high”. To further clarify this HOD E

responded: “I think the other challenge is the issue of honesty. You will find this

performance standard requires whether the teacher is active in extra-curricular activities

and the peer is shy to score him or her low. No honesty on the part of the evaluator and

the evaluee”. The other issue is that of selecting a suitable topic so that the educator

can excel in class. This issue was raised by Teacher C1: “Teachers select a topic that

is suitable for them instead of picking the one that is difficult”.

4.4.5.6 Conflict

During the interviews, conflict was also identified as one of the challenges encountered by educators when implementing developmental appraisal within the IQMS. The conflict that might arise, according to the response of one of the participants, concerns the scoring of one another during educator evaluation. Principal E’s response confirmed

this: “If teachers over rate themselves it can lead to conflict”.

The findings from literature also identified conflict as one of the challenges facing

educators when implementing developmental appraisal (c.f.2.4.1).

4.5 SUMMARY OF THE FINDINGS

During the data collection process it was discovered that the educators, HODs and

principals in the Matlosana Area Project Office secondary schools have knowledge of

some processes of DA, and of some roles of the SDT and APO officials in implementing

developmental appraisal. This is evidenced by some similarities between what the

findings of the interviews revealed and what was captured in the literature review. There

were however, some of the processes of DA and roles of SDT and APO officials which


were not known; this is revealed by the fact that these processes or roles were never mentioned in any of the six focus group interviews.

It was also revealed during data collection that the majority of the participants are of the

opinion that the SDTs in their respective schools are effective. With regard to the APO

officials, there were educators who believe the APO officials are not effective, whereas others believe they are effective in implementing DA.

The findings have also revealed some challenges that are facing educators in the

Matlosana APO when implementing DA. The challenges indicated are: inadequate

support from the APO; lack of resources; inadequate time frame for implementation;

disruption of normal teaching and learning; lack of honesty on the part of the appraisees

and appraisers; and conflict. There is also evidence of some similarities between the

findings of the empirical investigation and the literature review on the challenges of

implementing developmental appraisal.

4.6 CONCLUSION

In this chapter, a brief synopsis of the methodology was given and the data analysis process was described. The research findings were also discussed and verified against the findings of the literature review. Lastly, a summary of the findings was presented. Chapter five will focus on the summary, findings, recommendations and conclusion.


CHAPTER FIVE

SUMMARY, FINDINGS, RECOMMENDATIONS AND CONCLUSION

5.1 INTRODUCTION

In the preceding chapter on data analysis, the researcher interpreted and analysed the data in order to arrive at the conclusions of the research. This chapter concludes the study by giving an overview of the background to the study and a summary of the important findings from the literature and empirical investigations. Recommendations, the limitations of the study, suggestions for further research and a conclusion are also presented.

5.2 SUMMARY

It was pointed out in chapter one that before 1995 there was no acceptable evaluation

system in South Africa to evaluate educators. This resulted in conflict between teacher

unions and the department of education. A new instrument, which came to be known as the Integrated Quality Management System (IQMS), was then developed to evaluate educators. The aims of the IQMS are to: identify specific needs of

educators, schools and district offices for support and development; provide support for

continued growth; promote accountability; monitor an institution’s overall effectiveness;

and evaluate an educator’s performance. The aim of Developmental Appraisal (DA)

within the IQMS is to facilitate the personal and professional development of educators

in order to improve the quality of teaching practice and educational management. The

purpose of DA is to appraise individual educators in a transparent manner with a view to

determining the areas of strengths and weaknesses, and drawing up programmes for

individual development (ELRC, 2003:1).

Against this background the aim of the study was formulated as follows: to examine

teachers’ perceptions regarding the effectiveness of the implementation of

Developmental Appraisal in the Matlosana Area Project Office schools.


From this aim the following sub-questions were derived:

How are the processes of DA being carried out in the Matlosana APO schools?

What are the roles of the SDT and APO in the implementation of DA in the

Matlosana APO schools?

What are the perceptions of educators regarding the effectiveness of SDT and

APO in the implementation of DA in the Matlosana APO schools?

What recommendations can be made that may serve as guidelines for the

effective implementation of DA?

The research objectives were to

describe the processes of DA implementation in the Matlosana APO schools;

describe the roles of the SDT and the APO in the implementation of DA in the

Matlosana APO schools;

ascertain the views of educators regarding the effectiveness of SDT and APO in

the implementation of DA; and

make recommendations that may serve as guidelines for the effective

implementation of DA within the IQMS.

A literature review was presented in chapter two; it served to describe and explain the major facets of the study. To provide a broader understanding of appraisal, staff/personnel appraisal in general was discussed first, followed by the implementation of Developmental Appraisal in schools. To clarify DA implementation, the chapter also discussed the roles and responsibilities of officials tasked with the implementation of DA; the processes of implementing DA; challenges in the implementation of DA; and effective strategies for implementing DA.

A detailed description of the research methodology was given in chapter three. The research design, data collection methods, data analysis process and ethical considerations were explained.


A detailed description of the data analysis is presented in chapter four, which dealt specifically with a brief synopsis of the methodology, the data analysis process, the research findings, and a summary of the findings.

5.3 SUMMARY OF IMPORTANT FINDINGS

In the preceding section a summary of the entire study was given. In this section, the important findings and conclusions from the literature and the empirical investigation are discussed.

5.3.1 Findings from Literature

The literature review revealed the following findings regarding the effective implementation of DA:

5.3.1.1 Roles and responsibilities of officials tasked with the implementation of DA

In order to understand how DA is implemented in schools, the researcher deemed it necessary to discuss the roles and responsibilities of officials tasked with the implementation of DA. Briefly, the literature indicated the following officials and their responsibilities:

The Principal has the overall responsibility to ensure that IQMS is implemented

uniformly and effectively at school (c.f.2.3.1.1);

The Educator must undertake self-evaluation of her or his performance

(c.f.2.3.1.2);

The SMT informs educators of in-service training and other programmes that will be offered and makes the necessary arrangements for educators to attend (c.f.2.3.1.3);

The SDT coordinates all the activities pertaining to staff development

(c.f.2.3.1.4);

The DSG provides mentoring and support to educators (c.f.2.3.1.5); and

The APO has the overall responsibility of advocacy, training and proper

implementation of IQMS (c.f.2.3.1.6).


In implementing DA, the officials tasked with the implementation have to follow certain

processes of DA within the IQMS. These processes are discussed below:

5.3.1.2 The Processes of implementing DA

The literature indicated the following processes for implementing DA:

Planning is done by the SDT and the SMT and must incorporate all the processes

and take the school’s year plan into account (c.f.2.3.2.1);

Self-evaluation by each educator to familiarise herself or himself with the evaluation instrument and reflect critically on her or his performance (c.f.2.3.2.2);

Pre-evaluation discussion between the educator and the DSG to clarify all issues

pertaining to the educator’s evaluation (c.f.2.3.2.3);

Lesson observation is done to establish a baseline evaluation with which subsequent evaluations can be compared in order to determine progress (c.f.2.3.2.4);

Evaluation in respect of other performance standards, which is based on general ongoing observation, discussion and feedback by the DSG, submission of documentary evidence, proof of participation and other information provided by the educator (c.f.2.3.2.5);

Feedback and discussion by the DSG and the educator to resolve differences if

any (c.f.2.3.2.6);

Resolution of differences and/or grievances, which is handled by the grievance committee, although the final decision to resolve the grievance is taken by the head of department (c.f.2.3.2.7);

Monitoring, which is ongoing, must be conducted by departmental officials, SMTs and DSGs to ensure that the implementation is accurate and consistent (c.f.2.3.2.8); and

Second and subsequent year of implementation where educators are evaluated

only once a year and the evaluation of the previous year becomes the baseline

evaluation (c.f.2.3.2.9).


It was also discovered that educators encounter certain challenges when following these processes of implementing DA.

5.3.1.3 Challenges in the implementation of DA

The following challenges were discovered during the review of the literature (c.f.2.4.1):

Lack of professional competence, openness and respect towards colleagues;

Lack of a professional collaborative climate and culture where staff work and

reflect together to improve teaching in most schools;

Unsatisfactory professional qualification of many educators and mastery of the

subject knowledge and pedagogy;

Educators being defensive towards any form of performance appraisal, especially classroom visits;

Difficult teaching conditions and new policies;

Lack of capacity for educator monitoring;

The use of one standardised evaluation instrument for both development and

reward or sanction;

Not enough attention given towards the educators’ needs;

No clarity on the responsibility of the department of providing facilities and

resources to support teachers;

Lack of rigour in the whole process shown by poor target setting and the line

manager not being the appraiser;

The process not fitting into the school’s training and development plan, and the yearly cycle being incongruous with management planning; and

The link with pay, which poses a potential problem in the implementation of DA and undermines the professional development supported by educators.

These challenges indicate that the implementation of DA in schools is not as it should be, and that something has to be done to address the obstacles that hamper effective implementation. The literature review also revealed strategies that can help schools implement DA effectively.


5.3.1.4 Effective strategies of implementing DA

In order to implement DA effectively it was discovered that the following strategies can be useful (c.f.2.4.2): educators should discuss aspects of their professional development with the department of basic education; the management should be involved in teacher development; the approach to development should be negotiated with the educator; an outsider should be involved in educator evaluation; there must be honest discussions between the appraiser and appraisee; evaluators with a background in the appraisee’s subject and grade level should be used to evaluate educators; appraisal should encourage, recognise and value good work whilst addressing any weaknesses with suggestions for future action; there must be regular monitoring and improvement of teachers’ effectiveness in the classroom; appraisal should be based on valid or realistic assumptions about specific teaching realities and the available professional appraisal and support capacity in the system; the school should have an effective training strategy and mount professional programmes in which purposes and procedures are fully shared and explained to all participants; there must be transparency, trust, and honesty in the implementation of DA at school; and the implementation system should be flexible, designed by each school to meet its own needs.

If the above strategies are carried out, DA will be implemented effectively in schools, because the objective of developmental appraisal is to diagnose educators’ weaknesses in a transparent manner in order to develop them. This will happen when educators are involved in designing their development programmes. The use of an outsider may also help with the issue of fairness in the evaluation of educators. If DA is monitored regularly, schools may be able to implement it effectively, because any faults in the implementation are likely to be corrected.

5.3.2 Conclusion from the empirical investigations

The findings of the empirical investigations were organised into categories based on the

themes derived from the interview schedule. The themes are:

The processes of DA within IQMS;

The roles of the SDT in implementing DA;


The roles of the APO officials in implementing DA;

Educators’ perceptions regarding the effectiveness of SDTs and the APO officials

in implementing DA; and

Challenges in the implementation of DA.

5.3.2.1 The processes of DA within IQMS

The first objective of this study was to describe the processes of DA in the Matlosana

APO schools (c.f.1.3).

During the interviews it was discovered that the participants had some knowledge of the processes of DA, but the fact that some processes were never mentioned suggests that not all the processes are followed when implementing DA in some of the Matlosana APO schools. The processes which the participants were able to mention and describe are:

Planning (c.f.4.4.1.1);

Lesson observation (c.f.4.4.1.2);

Feedback and discussions (c.f.4.4.1.3);

Resolution of differences (c.f.4.4.1.4);

Self-evaluation (c.f.4.4.1.5);

Pre-evaluation discussions (c.f.4.4.1.6); and

Monitoring (c.f.4.4.1.7).

The processes which were not mentioned are:

Evaluation in respect of other performance standards (c.f.2.3.2.5); and

Second and subsequent year of implementation (c.f.2.3.2.9).

This suggests that these processes are not carried out at schools; for DA to be implemented effectively, all processes need to be followed.

The second objective of the study was to describe the roles of the Staff Development

Team and the Area Project Office in the implementation of Developmental Appraisal in

the Matlosana APO schools (c.f.1.3).


5.3.2.2 The roles of the SDT in implementing DA

The empirical investigation also indicated that the majority of participants from the secondary schools did not know all the roles that the SDTs have to play when implementing DA at schools in the Matlosana APO (c.f.4.4.2). The roles that the participants knew are:

prepares and monitors management plan (c.f.4.4.2.1);

develops school improvement plan (c.f.4.4.2.2);

submits all necessary documents to APO (c.f.4.4.2.3);

ensures that all members are trained (c.f.4.4.2.4); and

facilitates and gives guidance on DSG establishment (c.f.4.4.2.5).

The roles that participants did not know are (c.f.2.3.1.4)

coordinates all activities pertaining to staff development;

links Developmental Appraisal to the School Improvement Plan (SIP);

liaises with the department in respect of high priority needs;

monitors effectiveness of the IQMS and reports to the relevant persons;

ensures that all records and documentations on IQMS are maintained;

oversees mentoring and support by DSGs;

coordinates ongoing support provided during the two development cycles each year;

completes the necessary documentation for Performance Measurement (PM) for pay or grade progression, and signs off on these to ensure fairness;

deals with the differences between appraisees and appraisers in order to resolve

the differences;

coordinates the internal WSE processes;

ensures that the IQMS is applied consistently; and

liaises with the external WSE Team to coordinate and manage the cyclical

external WSE process.


5.3.2.3 The roles of the APO officials in implementing DA

The interviews also revealed that the majority of participants did not know what roles the APO officials play when helping schools to implement DA. This suggests that the APO officials are not helping schools to implement DA effectively (c.f.4.4.3). The only roles that some participants were able to remember are:

Monitoring (c.f.4.4.3.1);

Control (c.f.4.4.3.2); and

Educator development (c.f.4.4.3.3).

This suggests that these are the only roles that the officials perform when implementing DA. The roles which were not known to the participants, and which therefore appear not to be carried out during the implementation of DA, are (c.f.2.3.1.6):

Overall responsibility of advocacy, training and proper implementation of IQMS;

Responsibility with regard to the development and arrangement of professional

development programmes in accordance with identified needs of educators and its

own improvement plan;

Moderate evaluation results of schools in order to ensure consistency;

Ensure that the evaluation results of schools are captured and processed in time to

ensure successful implementation of salary or pay progression; and

Ensure that the implementation processes in schools are monitored on an ongoing

basis.

The fact that not all the roles of the SDT and APO are known by the participants suggests that the SDT and APO officials are not carrying out all the responsibilities they are supposed to carry out to ensure that the IQMS is implemented effectively in schools.

The third objective of the study was to ascertain the views of the educators regarding the

effectiveness of the SDTs and APO officials in implementing Developmental Appraisal

(c.f.1.3).


5.3.2.4.1 Educators’ perceptions regarding the effectiveness of the SDT in

implementing DA

The majority of the participants believe that the SDTs in their respective schools are

implementing DA effectively in the Matlosana APO in spite of the challenges faced by

the SDTs (c.f.4.4.4.1).

5.3.2.4.2 Educators’ perceptions regarding the effectiveness of APO officials in

implementing DA

The majority of the participants were of the opinion that the APO officials are not effective in helping schools in the Matlosana APO to implement DA (c.f.4.4.4.2).

The fourth objective of the interviews was to find out what obstacles are faced by educators in the Matlosana APO schools when implementing developmental appraisal.

5.3.2.5 Challenges in the implementation of DA

The investigation revealed that there are challenges that educators face when implementing DA in their respective schools. These include:

Inadequate support from the APO (c.f.4.4.5.1);

Lack of resources for educator development (c.f.4.4.5.2);

Inadequate time frames for implementation (c.f.4.4.5.3);

Disruption of normal teaching and learning (c.f.4.4.5.4);

Lack of honesty on the part of the appraisees and appraisers (c.f.4.4.5.5); and

Conflict (c.f.4.4.5.6).

5.4 RECOMMENDATIONS

The findings from the literature and the empirical investigations have revealed that there

are challenges when developmental appraisal within the IQMS is implemented in

schools. The following are recommendations that may be used for the effective

implementation of developmental appraisal in schools:


5.4.1 Training APO officials, SDTs and Educators on DA

It is recommended that more training on the implementation of DA should be provided for the APO officials and SDTs to empower them for the roles they have to play in helping schools implement DA effectively. The investigation revealed that their roles are not well known, which suggests that they are not fulfilling them. It is further recommended that more workshops on the processes of DA and on the roles of officials tasked with its implementation be conducted so that educators understand all the processes that need to be followed. Educators should also be able to voice their frustrations and discuss effective implementation strategies with colleagues.

5.4.2 Provision of resources

In order for educators to be trained in the weaknesses identified during the evaluation of their performance, resources such as workshop materials and venues need to be provided. Each school should be provided with resources according to its school improvement needs. It is further recommended that educators’ in-service training should arise naturally from the evaluation of their performance and not be prescribed by APO officials in isolation; it should be a negotiated responsibility.

5.4.3 Proper Planning for the implementation

It is recommended that DA within the IQMS should be integrated with the school management plan to deal with the issue of inadequate time frames for implementation. This will also help the SMT to plan, together with the SDT, programmes for teacher development that address both the identified weaknesses and other aspects of curriculum implementation.

5.4.4 Promoting honesty

All stakeholders should see appraisal as an important opportunity for honest discussions which may benefit the individual educator, the school and education in general when weaknesses are identified and improved upon. It is recommended that the appraisers should not judge, but should promote agreement on how to help the appraisee improve on the identified weaknesses. The appraiser should also be knowledgeable about the appraisee’s subject or learning area in order to develop the appraisee.

5.4.5 Dealing with conflict

To avoid conflict, there must be clarity on the purpose of the evaluation system, trust between the appraisee and the appraiser, honesty, and transparency in the process. It is also advisable that there be professional

collaboration between the appraisee and the appraiser with regular meetings to negotiate

appraisal purpose and outcome. From these meetings, realistic targets that are within the

job description of the teacher should be discussed and agreed upon at each stage of the

appraisal process. It is also important that the teacher developmental appraisal process

should recognise teachers as full partners in the process, not as raw objects to be

developed by SMTs and APO officials.

5.5 LIMITATION OF THE STUDY

The findings could have been enhanced by having homogeneous groups. This could have

enabled educators to express themselves freely in their groups as compared to when they

are with their School Management Team members (Principal and HOD) in the same

group. It was however difficult to have the different homogeneous groups (principals,

HODs and educators) in one place. Other studies may take this into consideration.

5.6 FURTHER RESEARCH

This study, which examined teachers’ perceptions regarding the effectiveness of the implementation of DA in the Matlosana APO schools, has revealed new problems that may be researched in future. With reference to the findings of this study, the researcher recommends the following for further research:

The empirical investigation revealed that the APO officials are not effective in helping schools implement DA. It is therefore necessary to investigate the role of the APO officials in the implementation of DA in schools.


A comparative study of the challenges facing educators in well-resourced town schools and less-resourced township schools, to establish whether resources have any impact on educator development.

A study of the impact of effective teacher developmental appraisal on the performance of learners in secondary schools.

5.7 CONCLUSION

The implementation of developmental appraisal within the IQMS was brought about by the fact that in South Africa there was no acceptable system for evaluating educators. Educators need to be evaluated in order to diagnose strengths and weaknesses in their performance. Teachers can then be developed in the identified areas of weakness to improve teaching and learning at schools.

The purpose of this study was to examine teachers’ perceptions regarding the

effectiveness of this evaluation instrument called Developmental Appraisal within the

IQMS in the Matlosana APO schools. The findings from the literature review revealed

that for DA to be implemented effectively, officials tasked with implementation need to

perform certain duties, and certain processes need to be followed. Contrary to the

findings from literature review, the findings of empirical investigations indicated that

officials tasked with the implementation are not doing what they are supposed to do and

the processes are not followed. This was indicated by the fact that not all the tasks of the officials, nor all the processes of implementation, are known in the Matlosana APO schools.

The above suggests that Developmental Appraisal is not implemented effectively in the Matlosana APO schools and that teachers’ performance therefore cannot be enhanced. If teachers are not performing as they should, learners will also suffer in terms of poor performance.


BIBLIOGRAPHY

Ary, D., Jacobs, L.C., Razavieh, A. & Sorensen, C. 2006. Introduction to Research in Education (7th Edition). Canada: Thomson Wadsworth.

Ary, D., Jacobs, L.C. & Razavieh, A. 2002. Introduction to Research in Education (6th Edition). Canada: Wadsworth.

Barnard, 1999. Performance applause. 42 (4): 1-2.

Bartlett, S. 2000. The development. British Journal of Educational Studies 48 (1): 24-37.

Bell, J. 2005. Doing your Research Project: A guide for first-time researchers in education, health and science (4th Edition). Berkshire: Open University Press.

Bennett, H. with Lister, M. and McManus, M. 1992. Teacher Appraisal: Survival and Beyond. London: Longman.

Best, J.W. and Kahn, J.V. 1998. Research in Education. Eighth Edition. Boston: Allyn

and Bacon.

Bogdan, R.C. & Biklen, S.K. 2007. Qualitative Research for Education: An Introduction to Theory and Methods (Fourth Edition). Boston: Pearson Education Group, Inc.

Campbell, R.J. 2003. Differential Teacher Effectiveness: towards a model for research and teacher appraisal. Oxford Review of Education 29 (3): 348-362.

Collins, K.J., Du Plooy, G.M., Grobbelaar, M.M., Terre Blanche, M.J., Van Eden, R., Van Rensburg, G.H. and Wigston, D.J. 2000. Research in Social Sciences. Only study guide for RSC201-H. Pretoria: University of South Africa.


Creswell, J.W. 1998. Qualitative Inquiry and Research Design: Choosing Among Five Traditions. Thousand Oaks, CA: Sage Publications.

De Clercq, F. 2008. Teacher quality appraisal and development: The flaws in the IQMS. Perspectives in Education 26 (1): 7-18.

Devries, D.L., Morison, A.M., Shulman, S.L. and Gerlach, M.L. 1981. Performance

Appraisal on the Line. New York: John Wiley and Sons.

De Vos, A.S., Strydom, H., Fouche, C.B. & Delport, C.S.L. 2005. Research at grass roots for the social sciences and human services professions (3rd Edition). Pretoria: Van Schaik.

Fidler, B. and Cooper, R. 1992. Staff Appraisal and Staff Management in Schools and Colleges: A Guide to Implementation. Harlow: Longman.

Fleisch, B. 2008. Primary education in crisis. Why learners underachieve in reading and

mathematics. Johannesburg: Juta.

Gay, L.R., Mills, G.E. and Airasian, P. 2006. Educational Research: Competencies for

Analysis and Applications. Upper Saddle River, New Jersey: Pearson.

Gardiner, M. 2004. Teacher appraisal: Conditions of Service. South African Journal of

Education 7 (1): 17-24.

Hancock, R. & Settle, D. 1990. Teacher Appraisal and Self-Evaluation: A Practical

Guide. Oxford: Basil Blackwell Ltd.

Hartley, D. 1992. Teacher Appraisal: A Policy Analysis. Edinburgh: Scottish Academic

Press.


Hittleman, D.R. and Simon, A.J. 2002. Interpreting Educational Research: An Introduction for Consumers of Research. Third Edition. New Jersey: Prentice Hall.

Hornby, A.S. 1986. Oxford Advanced Learner’s Current English Dictionary. Cape Town:

Pharos.

Hunt, N. 1992. How to Conduct Staff Appraisals. United Kingdom: How to Books.

Humphreys, K. & Thomas, M. 1995. Searching for markers: Collective self-appraisal. School Organisation 15 (2): 133-144.

Johnson, B. & Christensen, L. 2004. Educational Research: Quantitative, Qualitative and Mixed Approaches (2nd Edition). Boston: Pearson.

Jones, J. 1993. Appraisal and Staff Development in Schools. London: Routledge.

Langenbach, M., Vaugh, C. and Aagaard, L. 1994. An Introduction to Educational

Research. Boston: Allyn and Bacon.

Litheko, SRS. 2001. An Appraisal Model for Academic Staff Performance at Technical

Colleges in South Africa. Welkom: Vista University.

McMillan, J.H. & Schumacher, S. 2001. Research in Education: A Conceptual Introduction (5th Edition). New York: Longman.

McMillan, J.H. & Schumacher, S. 2001. Research in Education: A Conceptual Introduction (4th Edition). New York: Addison Wesley Longman, Inc.

McMillan, J.H. & Schumacher, S. 2006. Research in Education: Evidence-Based Inquiry. Boston: Pearson.


Mercer, J. 2006. Appraising higher education faculty in the Middle East: Leadership lesson from a different world. Management in Education 20 (1): 17-26.

Metcalfe, M. 2008. Why our schools do not work. Sunday Times 13/01/2008.

Monyatsi, P. 2006. Teacher appraisal in Botswana secondary schools: A critical analysis. South African Journal of Education 26 (2): 215-228.

Mpolweni, S.N. 1998. Negotiating a system of appraisal: Education practice (1): 54-57.

Murphy, K.R. & Cleveland, J.N. 1995. Understanding Performance Appraisal: Social, Organisational, and Goal-Based Perspectives. London: Sage Publications.

Myland, L. 1992. Management Skill Guide: Managing Performance Appraisal. Surrey: Croner.

Narsee, H. 2000. The common and contested meanings of education districts in South

Africa. Unpublished doctoral thesis (PhD), Pretoria: University of Pretoria.

Ngwenya, VC. 2006. Performance Appraisal as a Model of Staff Supervision in

Zimbabwean Schools with special Reference to the Makoba Circuit. Pretoria: University

of South Africa.

Phillip, T. 1990. Appraising Performance for Results. London: McGraw-Hill Book

Company.

Poster, C. and Poster, D. 1991. Teacher Appraisal: A Guide to Training. London: Routledge.

Quinlan, O. with Davidoff, S. 2005. Valuing Teachers Through Appraisal: Teacher In-service Project. Cape Town: Via Africa.


Revil, D. 1992. Working Paper on Staff Development and Appraisal.

Roberts, G.E. 1994. Maximising performance appraisal system acceptance: Perspective

from Municipal Government Personnel Administration. Public Management. IGI.

SADTU. 2002. Comments on discussion document: “Performance standards for teacher appraisal and teacher development”. February.

SADTU. 2005. Quality teachers for quality education: Training for a stronger teaching force. October.

Scheerens, J. 2000. The school effectiveness knowledge base as a guide for school improvement.

Schulze, S., Myburg, C.P. and Poggenpoel, M. 2005. Research Methodology. Study

Guide 2 for MEDEM 2-R. Pretoria: University of South Africa.

Shalem, Y. 2003. Do we have a theory of change? Calling change models to account. Perspectives in Education, 21 (1): 29-49.

Shank, G. & Brown, L. 2007. Exploring Educational Research Literacy. New York:

Routledge.

South Africa. 1998. Education Labour Relations Council (ELRC), Developmental Appraisal for Educators. Pretoria: Government Printer.

South Africa. 2003. Education Labour Relations Council (ELRC), Integrated Quality Management System. Pretoria: Government Printer.

Thomas, H. & Patten, JR. 1982. A Manager’s Guide to Performance Appraisal: Pride,

Prejudice and the Law of Equal Opportunity. London: Collier Macmillan Publishers.


Tobin, G.A. & Begley, C.M. 2004. Methodological rigour within a qualitative framework. Journal of Advanced Nursing, 48 (4): 388-396.

Toch, T. 2008. Teacher evaluation. Educational Leadership 66 (2): 32-33

Tomlinson, H. 2003. Teacher Appraisal. Educational Management and Administration 31 (2): 220-222.

Twycross, A. & Shields, L. 2005. Validity and reliability – what’s it all about? Paediatric Nursing.

Wadvalla, I. 2005. The Implementation of the IQMS in two different schools in Gauteng. Unpublished BEd Hons project. Johannesburg: University of the Witwatersrand.

Weber, E. 2005. New Control and Accountability for South African Teachers and Schools: The IQMS. Perspectives in Education, 23 (2): 63-72.


APPENDIX A

INTERVIEW SCHEDULE

Research topic: “The Implementation of Developmental Appraisal in the Matlosana Area

Project Office Schools”.

1. How are the processes of Developmental Appraisal (DA) within IQMS being carried

out in your school?

Can you mention and describe the processes of DA within the IQMS in your

school?

Which processes are being followed when implementing DA within the IQMS in

your school?

Can you explain how each process of implementing DA within IQMS is carried

out in your school?

2. What are the roles of the Staff Development Team (SDT) in implementing DA in your

school?

How does the SDT in your school implement DA?

Which functions are performed by the SDT in your school to implement DA?

Can you elaborate on the duties of the SDT in your school when implementing

DA?

3. What are the roles of the Area Project Office (APO) officials in the implementation of DA in your school?

How do the APO officials help your school in the implementation of DA?

How do you think the APO officials should help your school in the

implementation of DA?

What functions according to the IQMS document should the APO officials

perform to implement DA?


4. What are your perceptions regarding the effectiveness of the SDT and the APO

officials in implementing DA in your school?

Do you think the SDT and APO officials are implementing DA effectively in your

school?

Do you think the SDT and APO officials are carrying out their responsibilities

efficiently in your school?

Is DA being implemented in accordance with the IQMS document requirements by the SDT and APO officials in your school?

5. What are the challenges on the implementation of DA in your school?

Which problems do you encounter when implementing DA in your school?

What are your concerns regarding the effectiveness in the implementation of DA

in your school?

What do you think are the obstacles to the effective implementation of DA in your school?