
Report on the DBE NSC November 2018 Examinations

Portfolio Committee on Basic Education

12 February 2019

Dr Mafu Rakometsi


Presentation Outline

1. Umalusi mandate and regulatory framework

2. Framework for Quality Assurance of Assessment

3. The quality assurance processes undertaken in 2018

3.1 Scope of the 2018 quality assurance of assessment

3.2 Areas of good practice/compliance

3.3 Tracking of directives for compliance, 2016-2018

3.4 Areas of poor learner performance in selected subjects in 2018

4. Standardisation and Resulting

5. Conclusion

UMALUSI MANDATE AND REGULATORY FRAMEWORK


Umalusi mandate and regulatory framework

Umalusi derives its mandate for the quality assurance of assessment from:

• the National Qualifications Framework (NQF) Act No. 67 of 2008, Sections 27(h) and 27(i); and

• the General and Further Education and Training Quality Assurance Act, as amended in 2008, Section 17A, which states that:

(3) The Council must perform the external moderation of assessment of all assessment bodies and education institutions;

(4) The Council may adjust raw marks during the standardisation process; and

(5) The Council must, with the concurrence of the Director-General and after consultation with the relevant assessment body or education institution, approve the publication of the results of learners if the Council is satisfied that the assessment body or education institution has satisfied the conditions that warrant such an approval.

FRAMEWORK FOR QUALITY ASSURANCE OF ASSESSMENT

Framework for Quality Assurance of Assessment

Quality assurance of assessment is conducted to ensure that assessment leading to the award of certificates in schools, adult education centres, and technical and vocational education and training colleges is of the required standard, so that the certificates issued by Umalusi are credible.

This is achieved through:

• Moderation of examination question papers, Practical Assessment Tasks (PAT) and, in the case of Life Orientation, the Common Assessment Task

• Monitoring and moderation of School Based Assessment (SBA)

• Monitoring of the conduct, administration and management of assessment and examination processes

• Monitoring and moderation of marking

• Management of concessions and examination irregularities

• Standardisation of assessment outcomes

• Approval of the release of results

National Senior Certificate

The National Senior Certificate (NSC) examinations are administered by three assessment bodies: the Department of Basic Education (DBE), the Independent Examinations Board (IEB) and the South African Comprehensive Assessment Institute (SACAI). All three assessment bodies are quality assured and certificated by Umalusi.

Overview of candidates who wrote the NSC examinations (FT = full-time; PT = part-time):

Assessment Body   2016      2017      2018
DBE FT            664 600   622 802   619 133
DBE PT            1 237     2 689     1 150
TOTAL             665 837   625 491   620 283
IEB FT            11 101    11 518    11 587
IEB PT            479       474       628
TOTAL             11 580    11 992    12 215
SACAI PT          1 207     1 438     1 608
TOTAL             1 207     1 438     1 608

QUALITY ASSURANCE PROCESSES UNDERTAKEN IN 2018

Overview of the Quality Assurance of Assessment and Examination Processes

• Umalusi moderated and approved 116 DBE NSC November 2018 question papers, with their marking guidelines, which went through different moderation levels between February and August 2018.

• Between July and October 2018, Umalusi conducted verification and moderation of the SBA of 15 subjects that were selected from across the nine provinces.

• The state of readiness to conduct examinations was assessed across all nine (9) PEDs.

• Umalusi monitored a sample of 261 examination centres and 28 marking centres across the nine (9) PEDs.

Overview of the Quality Assurance of Assessment and Examination Processes

• Umalusi attended 122 marking guideline discussions.

• Twenty-eight (28) subjects were sampled for the verification of marking in 94 marking centres.

• The DBE presented a total of 67 subjects for the standardisation of the November 2018 National Senior Certificate (NSC) examinations.

• All irregularities that were identified were managed by the Provincial Examination Irregularities Committees (PEICs), in accordance with the Regulations pertaining to the Conduct, Administration and Management of the National Senior Certificate Examinations; challenges experienced were captured per Provincial Education Department.

MODERATION OF QUESTION PAPERS


Moderation of Question Papers

In 2018 Umalusi moderated and approved 116 question papers, with their marking guidelines, for the November 2018 examinations.

The question papers went through different levels of moderation and were approved at different levels, as indicated in the table below:

Number of moderations   November 2015   November 2016   November 2017   November 2018
One                     11,5%           22,6%           17,5%           32,8%
Two                     67,7%           68,4%           79,3%           64,6%
Three                   17,7%           8,3%            3,2%            2,6%
Four                    2,3%            0,8%            0%              0%
Five                    0,8%            0%              0%              0%

Moderation of Question Papers

Areas of Good Practice

The DBE is commended for the improvement in the percentage of question papers that were approved at first and second moderation. The analysis of the question paper moderation reports revealed that 97.4% of the November 2018 question papers met all external moderation criteria during the first and second moderations, compared to 96.8% in November 2017.

Equally commendable was the fact that 38 question papers were approved at first moderation.

Umalusi noted an improvement of more than 3% in compliance with the following criteria at first moderation, in comparison with the November 2017 examination:

Language and bias (from 56% to 59%); and

Marking guidelines (from 37% to 41%).

Moderation of Question Papers: Directives

Directive (DBE must): Reduce the number of question papers requiring more than two external moderation sessions due to non-compliance.
Nov 2016: 91% of question papers complied. Nov 2017: 96.8% of question papers complied. Nov 2018: 97.4% of question papers complied.
Progress in 2018: A gradual reduction in question papers requiring more than two moderations for approval was noted. Only three question papers went up to a third moderation in 2018.

Directive (DBE must): Retrain the examiners and internal moderators on the following aspects: technical details (TD), quality of questions (QQ) and quality of marking guidelines (MG).
Question papers that complied with the criteria: Nov 2016: TD 53%, QQ 40%, MG 35%. Nov 2017: TD 48%, QQ 29%, MG 37%. Nov 2018: TD 41%, QQ 28%, MG 41%.
Progress in 2018: Fewer than 50% of the question papers complied fully with these three criteria (technical details, quality of questions and quality of marking guidelines). In 2018, compliance with the technical details and quality of questions criteria declined further, to a state worse than in 2016 and 2017. The marking guidelines criterion, though it did not meet the 50% mark, is steadily improving.

Directive (DBE must): Improve examiners' and internal moderators' ability to set higher-order questions and to balance the distribution of cognitive levels.
Nov 2016: 59% of question papers complied with the cognitive skills requirements. Nov 2017: 60% complied. Nov 2018: 59% complied.
Progress in 2018: The compliance level is stagnating between 59% and 60%, showing neither improvement nor decline. The compliance level is still very low; more should be done to improve.

Moderation of Question Papers: Directives

Directive (DBE must): Improve on the development of marking guidelines at the time of setting the question paper, rather than relying on post-examination refining of the marking guidelines.
Nov 2017: 83% compliant. Nov 2018: 80% compliant.
Progress in 2018: A decline of 3% has been noted. Though compliance is high, there is room for improvement.

Directive (DBE must): Conduct workshops to capacitate examiners and internal moderators in the setting of question papers, placing more emphasis on the criteria with lower levels of compliance, i.e. technical details (compliance with this criterion has shown a decline since 2016), internal moderation, quality of questions and cognitive skills.
Nov 2016: partially compliant. Nov 2017: internal moderation 75% compliant. Nov 2018: internal moderation 71% compliant.
Progress in 2018: The decline of 4% noted in the internal moderation criterion indicates that question papers submitted for external moderation were in an unacceptable state and could not be approved.

MODERATION OF SBA


Scope of Moderation of SBA

Subjects moderated during July and October 2018:

Accounting; Business Studies; Consumer Studies; Economics; Geography; History; Life Orientation; Life Sciences; Mathematical Literacy; Mathematics; Mechanical Technology*; Music; Physical Sciences; SASL HL; Tourism

* Fitting and Machining; Welding and Metalwork

Moderation of SBA

Areas of Good Practice

Areas of good practice were observed in different provinces and the good practices differed from one subject to another. Some of the good practices noted are as follows:

Inclusion of the subject improvement plans in some Western Cape teacher files;

The use of district/provincial common assessment tasks in most provinces was also commendable, since it helped to maintain quality and standards;

Gauteng and Western Cape are commended for the improved quality of internal moderation in Life Sciences.


Moderation of SBA: Directives

Directive (DBE must ensure): That the quality assurance of assessment in subjects with a practical component, for both the Practical Assessment Task (PAT) and the practical examinations, is attended to.
Nov 2017: not compliant. Nov 2018: compliant.
Progress in 2018: Done. A generic guideline to standardise the moderation of PATs was developed. The guideline has been customised in six subjects (to make it subject specific); the other subjects will follow in 2019.

Directive (DBE must ensure): That adherence to subject assessment policies in all subjects in all provinces is strengthened.
Nov 2018: not compliant.
Progress in 2018: New directive; this will be observed in 2019.

Directive (DBE must): Discourage the use of recycled tasks and shadow marking in SBA through effective internal moderation processes.
Nov 2016: not compliant. Nov 2017: not compliant. Nov 2018: not compliant.
Progress in 2018: Shadow marking is still evident in some subjects.

Moderation of SBA: Directives

Directive (DBE must): Ensure that internal moderation is conducted efficiently and effectively at all levels of the system.
Nov 2018: not compliant.
Progress in 2018: New directive; this will be observed in 2019.

Directive (DBE must): Ensure that sufficient focused support is given to SASL HL SBA and the moderation thereof.
Nov 2018: not compliant.
Progress in 2018: New directive; this will be observed in 2019.

AUDIT OF APPOINTED MARKERS


Audit of Appointed Markers

An audit of appointed markers was conducted in all nine (9) PEDs and the following was observed:

• PEDs used the PAM (Personnel Administrative Measures) as the basis for the selection of markers;

• Enhanced criteria for the selection of markers were observed in some PEDs;

• In certain PEDs, the marker evaluation outcomes of the previous examination were used as one of the selection criteria.

Areas of Good Practice

Some PEDs were innovative in their marker appointment processes and used provincially determined criteria to enhance the PAM criteria.

Audit of Appointed Markers: Directives

Directive (DBE must): Ensure that marker selection panels consider recommendations made by principals and/or district officials in the appointment of markers.
Nov 2018: not compliant.
Progress in 2018: A few cases were observed in two PEDs. A new directive; it will be observed in 2019.

Directive (DBE must): Ensure that PEDs verify the qualifications of markers at all levels before appointments are made.
Nov 2016: not compliant. Nov 2017: partially compliant. Nov 2018: partially compliant.
Progress in 2018: A few discrepancies in the selection process were still observed by Umalusi during the 2018 audit. Five out of nine provinces did not adhere to the 2016 and 2017 directive of verifying qualifications before making appointments, in most of the verified subjects.

STATE OF READINESS


Monitoring the State of Readiness

The state of readiness (SOR) monitoring was conducted across all PEDs between 5 September 2018 and 2 October 2018.

Areas of Good Practice

Umalusi noted the following areas of good practice:

• Development of clear Standard Operating Procedures across quality assurance processes, as a measure to standardise operations and to improve on standards in systems used in the delivery of quality examinations and assessments.

• An increase in the monitoring of high-risk examination centres through the deployment of resident monitors across the PEDs.

• Online monitoring reports by monitors to expedite daily reporting in Mpumalanga Province.

Monitoring the State of Readiness: Directives

Directive (DBE must): Effectively implement the policy on the registration of immigrant candidates.
Nov 2016: not compliant. Nov 2017: partially compliant. Nov 2018: compliant.
Progress in 2018: PEDs strengthened the implementation of the policy, and there were only a few incidents where candidates did not submit relevant documentation.

Directive (DBE must): Ensure that all districts conduct an audit of examination centres to verify their readiness to administer examinations.
Nov 2016: not compliant. Nov 2017: partially compliant. Nov 2018: compliant.
Progress in 2018: Compared to the 2017 audit, Umalusi noted a significant improvement in the efforts of the DBE and the PEDs in auditing the examination centres, across all nine provinces.

Directive (DBE must): Ensure that security features at districts/nodal points are evaluated and improved. Storage facilities should have features such as double locking systems, alarms in working condition, and surveillance cameras.
Nov 2016: not compliant. Nov 2017: partially compliant. Nov 2018: compliant.
Progress in 2018: Umalusi noted substantial improvements in security features at the nodal points/districts and storage facilities during the state of readiness monitoring across the PEDs.

Monitoring the State of Readiness: Directives

Directive (DBE must ensure that): The North West and Free State PEDs strengthen or improve printing facilities to avoid manual handling of question papers.
Nov 2017: not compliant. Nov 2018: not compliant.
Progress in 2018: Manual packaging of question papers in the Free State and North West was still not addressed.

Directive (DBE must ensure that): Proper surveillance systems are installed at all printing facilities.
Nov 2017: not compliant. Nov 2018: partially compliant.
Progress in 2018: North West did not adequately address the norms and standards requirements for security at the printing site.

MONITORING OF THE WRITING OF EXAMINATIONS


Monitoring of the Writing Phase

Number of examination centres monitored per province:

EC: 35; FS: 24; GP: 34; KZN: 36 (includes 1 in Eswatini); LP: 34; MP: 19; NC: 20; NW: 21; WC: 38; Total: 261

Areas of Good Practice

Umalusi noted the following areas of good practice:

• Introduction of web cameras as a resource to facilitate efficiency in SASL HL examinations across centres offering the subject.

• Implementation of a three-tier monitoring approach across PEDs.

Monitoring the Writing of Examinations: Directives

Directive (DBE must): Ensure that invigilators read the examination rules to candidates, check question papers for technical errors with the candidates, and give candidates 10 minutes' reading time during which no writing or scribbling should be allowed.
Nov 2016: not compliant. Nov 2017: not compliant. Nov 2018: compliant.
Progress in 2018: This was addressed and is no longer a directive.

Directive (DBE must): Ensure that all candidates who leave the examination room during writing are accompanied by invigilators; it is advisable to keep a record of such candidates.
Nov 2016: not compliant. Nov 2017: compliant. Nov 2018: compliant.
Progress in 2018: This was addressed and is no longer a directive.

Monitoring the Writing of Examinations: Directives

Directive (DBE must): Ensure that examination centres verify candidates' relevant documentation at the entry point to avoid impersonation.
Nov 2016: not compliant. Nov 2017: not compliant. Nov 2018: partially compliant.
Progress in 2018: Forty-one of the 261 centres monitored admitted candidates without verifying their identity.

Directive (DBE must): Ensure that Chief Invigilators prepare a daily situational report and file copies of the dispatch form in the examination file for reference.
Nov 2016: not compliant. Nov 2017: not compliant. Nov 2018: partially compliant.
Progress in 2018: Approximately 25.4% of the examination centres monitored did not comply with the requirement of preparing daily situational reports and filing dispatch forms.

Monitoring the Writing of Examinations: Directives

Directive (DBE must): Ensure that the key to the facility that stores examination material is kept by the Chief Invigilator before the start of the examination session.
Nov 2016: not compliant. Nov 2017: compliant. Nov 2018: compliant.
Progress in 2018: Full compliance was observed at all the examination centres in 2018.

Directive (DBE must): Ensure that all examination sessions have a seating plan drawn up and available for verification.
Nov 2016: not compliant. Nov 2017: not compliant. Nov 2018: compliant.
Progress in 2018: Full compliance was observed at all the examination centres in 2018.

Directive (DBE must): Ensure that principals are appointed as chief invigilators as per regulation, and that a letter of delegation is issued in cases where they are not able to administer the sessions.
Nov 2016: not compliant. Nov 2017: compliant. Nov 2018: compliant.
Progress in 2018: Full compliance was observed at all the examination centres monitored.

MONITORING OF MARKING


Monitoring of Marking

Areas of Good Practice

It is pleasing to note that the monitors reported positively on their respective monitoring sessions, and the following remarks regarding areas of good practice were made:

• The use, by two PEDs, of a CD with all the necessary templates, together with a manual document kept as back-up, was well received.

• The compilation of a comprehensive marking manual, and of a manual containing most of the marking forms used at venues, was of great help.

• Some centres had back-up plans (such as the use of generators) for unforeseen load shedding occurrences.

Monitoring of Marking: Directives

Directive (DBE must): Ensure that security at the main entrances to the marking centres is tight, and that security guards are trained by the assessment body so that they are effective and efficient.
Nov 2016: not compliant. Nov 2017: not compliant. Nov 2018: compliant.
Progress in 2018: An improvement was observed. Strict access control was maintained at the main entrance of each centre.

Directive (DBE must): Ensure that all training material is delivered on time, in order to allow the determined norm time to be achieved without pressure.
Nov 2016: not compliant. Nov 2017: not compliant. Nov 2018: partially compliant.
Progress in 2018: Late delivery of marking guidelines was observed at some marking centres.

Directive (DBE must): Ensure that monitoring of marking centres by the assessment body takes place and that evidence of such visits is made available to the centre.
Nov 2016: not compliant. Nov 2017: compliant. Nov 2018: compliant.
Progress in 2018: The directive was fully adhered to.

MARKING GUIDELINE DISCUSSIONS


Marking Guideline Discussions

Area of Good Practice

The PED chief markers and internal moderators marked the authorisation scripts separately and without any discussion. Their performance declared them competent and they were duly authorised to train markers at the provincial marking centres.

Marking Guideline Discussions: Directives

Directive (DBE must ensure that): All chief markers and internal moderators come to the meetings having marked the required sample of scripts.
Nov 2016: in 41 question papers, only 2 provinces pre-marked fewer than 20 scripts. Nov 2017: in 26 question papers, only 2 provinces pre-marked fewer than 20 scripts. Nov 2018: in 21 question papers, 8 provinces pre-marked fewer than 20 scripts.
Progress in 2018: A reduction in non-compliance has been noted (from 41 question papers in 2016 to 21 question papers in 2018). The directive is partially addressed.

Directive (DBE must ensure that): Sufficient time is allowed between the writing of a question paper and its marking guideline discussion meeting. (This allows time for chief markers and internal moderators to mark the sampled number of scripts in preparation for the marking guideline discussion meetings.)
Nov 2016: an average of 7 days between examination day and marking guideline meeting. Nov 2017: an average of 9 days. Nov 2018: compliant.
Progress in 2018: In 2018 an average of 9 days between examination day and marking guideline meeting was maintained.

Marking Guideline Discussions: Directives

Directive (DBE must ensure that): All provinces offering a subject are represented at the marking guideline discussion, irrespective of whether scripts are marked in their province or not.
Nov 2016: 12 subjects involved. Nov 2017: same subjects as in 2016. Nov 2018: compliant.
Progress in 2018: Only subjects that are marked centrally did not have representatives from all nine provinces.

Directive (DBE must ensure that): The print quality of all diagrams/illustrations in all question papers is thoroughly checked before delivery to marking centres.
Nov 2017: Mathematical Literacy Paper 1 in three provinces (NC, MP & GP). Nov 2018: partially compliant.
Progress in 2018: The print quality of the time zone map for the Tourism question papers was poor in all provinces.

Directive (DBE must ensure that): Different people are appointed for different levels of the same question papers.
Nov 2016: IsiNdebele, Tshivenda and Xitsonga FAL. Nov 2017: Tshivenda and Xitsonga FAL. Nov 2018: partially compliant.
Progress in 2018: Improvement in IsiNdebele HL and FAL: meetings were organised on different dates.

Marking Guideline Discussions: Directives

Directive (DBE must ensure that): Interpreters are available throughout the SASL HL marking guideline discussion sessions.
Nov 2018: not compliant.
Progress in 2018: A new directive; it will be observed in 2019.

Directive (DBE must ensure that): Strategies are developed on how to conduct marking guideline standardisation meetings for subjects with a practical component, such as Visual Arts.
Nov 2017: not compliant. Nov 2018: compliant.
Progress in 2018: Marking guideline meetings were conducted for Visual Arts and Design.

VERIFICATION OF MARKING


Verification of Marking

Areas of Good Practice

The following areas of good practice were noted:

• The discussed and approved marking guidelines were adhered to for most subjects.

• Approval of changes and additions to marking guidelines followed due process.

• The quality of marking was found to be good. Where novice markers were appointed, they were appropriately assisted by senior markers.

• The determination of, and adherence to, a marking tolerance range for examination scripts made marking more reliable. Variances in the marks allocated were mostly within the agreed tolerance range.

• Positive reporting regarding the fairness, reliability and validity of marking was commendable.

• The pairing of inexperienced and experienced markers, and of deaf and non-deaf markers, for SASL HL was commendable.

Verification of Marking: Directives

Directive (DBE must ensure that): The quality of marking in Mathematical Literacy receives attention, especially in Gauteng and KwaZulu-Natal (Visual Arts in EC, Sesotho in GP and FS).
Nov 2016: not compliant. Nov 2017: not compliant. Nov 2018: compliant.
Progress in 2018: The moderators had no complaints.

Directive (DBE must ensure that): Marking in each subject is synchronised across PEDs, and a more effective way is found to guarantee that any approved changes to the final Umalusi-approved MG arrive timeously in the PEDs.
Nov 2016: not compliant. Nov 2017: not compliant. Nov 2018: not compliant.
Progress in 2018: Gauteng and Limpopo started marking earlier than all the other PEDs.

AREAS OF POOR LEARNER PERFORMANCE IN SELECTED SUBJECTS

Areas of poor performance

ACCOUNTING

Question 4: Question 4 (70 marks), based on the cash flow statement and its interpretation, was found to be the most challenging question, but fell within the criteria of the CAPS. The question showed some innovation from previous years, but had several very easy sub-questions that have been assessed many times in the past.

Areas of poor performance

BUSINESS STUDIES

Question 2:
Q 2.4 (9 marks): Candidates confused diversification strategies with intensive strategies.
Q 2.5 (6 marks): Candidates provided provisions of the BCEA.
Q 2.6 (11 marks): The Labour Relations Act was confused with the EEA.
Q 2.7 (10 marks): Candidates confused steps in formulating a strategy with steps in evaluating a strategy.

Question 4:
Q 4.2: Candidates answered with diversity issues and recommendations. They confused cultural rights and creative thinking with the benefits of diversity in the workplace.
Q 4.4: Candidates answered with repetition.
Q 4.5: This is a new question. Candidates confused the application of force field analysis with Kurt Lewin's change management theory.
Q 4.6.1: The "Expert" was a difficult type of personality for candidates to identify.
Q 4.6.2: The strategy proposed by candidates was negative.
Q 4.7: Candidates' answers were not in the context of the workplace.

Areas of poor performance

CIVIL TECHNOLOGY

Question 2 – Civil Services and Construction (40 marks), generic section:

• Drawings, calculations and identification of members
• Poor subject knowledge
• Lack of appropriate terminology
• Lack of appropriate skills
• Poor preparation

Areas of poor performance

COMPUTER APPLICATIONS TECHNOLOGY

Paper 1:
• Question 4, specifically 4.5 (8 marks): the use of a complicated spreadsheet function.
• Question 5, specifically 5.3 – 5.5 (14 marks): more complex database queries and reports.
• Question 7, specifically 7.2 – 7.4 (11 marks): integration of applications and a more difficult spreadsheet.

Paper 2:
• Question 4 (25 marks): systems technologies; some really simple definitions, e.g. device driver and the functions of an operating system, were not known.
• Question 8 (15 marks): theoretical underpinnings of the application software; teachers often complain about the amount of practical work in the theory paper.
• Question 10 (25 marks): questions on various topics based on an unknown scenario.

Areas of poor performance

CONSUMER STUDIES

Question 3: Dietary-related diseases and food-borne diseases in particular require attention. Examples include obesity, Q3.3 (22 marks), and listeriosis, Q3.4 (12 marks).

Question 4: Question 4.3.2 (4 marks), regarding optical illusion, was generally poorly answered. The content of this question is based in Grade 11, but applies to clothing for the world of work.

Question 6: Question 6.1 (2 marks), distribution of products, was poorly understood. Candidates were not always able to interpret and apply the information in the scenario correctly.

Areas of poor performance

DRAMATIC ARTS

Question 1 (30 marks): The essay (Q1) is by its nature a very difficult question, as candidates must construct an argument, in response to the stimulus, based on the play text and its associated theatrical movement as well as the source and the details of the text.

Question 8:
Q8.1.1 (6 marks): Candidates could not explain the difference between a poem, a monologue and prose.
Q8.2 (10 marks): An unexpectedly difficult item; candidates had to read and analyse a poem before they could decide how to stage it.
Q8.4 (12 marks): The mini essay required complex skills to relate well to a theatre movement. Many candidates described the same movement as in Q1, in spite of being warned not to.

Areas of poor performance

ECONOMICS

Paper 1:
Q2.3.1 (1 mark): Candidates could not identify the factors of production.
Q2.2.3 (2 marks), Q2.3.3 (2 marks), Q3.2.3 (2 marks) and Q3.3.3 (2 marks): Candidates failed to describe concepts.
Q2.3.5 (4 marks): Candidates could not calculate.
Q2.4 (8 marks), Q2.5 (8 marks), Q3.4 (8 marks), Q3.5 (8 marks), Q4.4 (8 marks) and Q4.5 (8 marks): Candidates provided incomplete responses or did not attempt to answer.

Paper 2:
Q2.3.3 (2 marks), Q3.2.3 (2 marks), Q3.3.3 (2 marks) and Q4.3.3 (2 marks): Candidates failed to describe concepts.
Q2.3.5 (4 marks): Candidates could not calculate.
Q3.2.5; Q2.4 (8 marks); Q2.5 (8 marks); Q3.3.4 (2 marks); Q3.3.5 (4 marks); Q3.4 (8 marks); Q3.5 (8 marks); Q4.2.4 (2 marks); Q4.2.5 (4 marks); Q4.3.4 (2 marks); Q4.3.5 (4 marks); Q4.4 (8 marks) and Q4.5 (8 marks): Candidates provided incomplete responses or did not attempt to answer.

Candidates failed to respond to middle and higher cognitive level questions, including the drawing of graphs.

Areas of poor performance

ELECTRICAL TECHNOLOGY

Digital Electronics:
• Question 4.3 showed NOR gates in the BCD encoder, which influenced the truth table. This alternative was, however, included in the adapted marking guideline.
• Question 2 (60 marks), Switching Circuits, is common to Electronics and Digital and was answered poorly. It is a question that rests heavily on practical work for the scaffolding of knowledge and understanding.
• Question 4 (55 marks): Digital and sequential devices are abstract and require studying to memorise the topic. Principles of operation rest heavily on the digital principles laid down in Grades 10 and 11.
• Question 5 (55 marks): Microcontrollers are complex in operation and require not only a solid background in Q4, but extend to design.

Electronics:
• Questions 3 (30 marks), 4 (60 marks) and 5 (60 marks) were answered poorly. All three of these questions had strong electronic content and, as answers were consistently wrong throughout, a possible explanation could be that teacher content knowledge training was insufficient. Questions 4 and 5 are particularly grounded in practical work, which scaffolds learning extensively.

Power Systems:
• Question 6 (60 marks) contains new content, Variable Speed Drives, which could account for the poor performance in that question.

Areas of poor performance

GEOGRAPHY

Paper 1:
• Q1.6.5 (6 marks): Candidates were not able to adequately respond to the changes in the shape of the cross-profile along the different courses of a river system.
• Q2.4.4 (8 marks): Candidate performance in the writing of paragraph questions demonstrated dismal levels of ability in analysis and interpretation.
• Q3.4 (15 marks): The concept of the rural-urban fringe was poorly responded to by the majority of candidates.
• Q3.5.4 (8 marks): Candidates struggled with the paragraph explaining the factors preventing South Africa from being competitive with other top beef exporters.

Paper 2:
• Question 3 (25 marks): Candidates experienced problems relating the theory to the maps. Their poor knowledge of the theory tested in the mapwork also impacted negatively on performance.

Areas of poor performance

HISTORY

Paper 1: Question 2 (50 marks); Paper 2: Questions 3 (50 marks) and 5 (50 marks):
Most candidates are not able to use evidence from sources and their own knowledge to answer the paragraph questions. They struggle to apply higher-order skills in source-based questions. The questions that test the usefulness of sources are problematic.

Areas of poor performance

LIFE SCIENCES

Paper 1: The worst-performed question was Q2.3 (11 marks), with an average performance of 20% for the sampled candidates. This question is based on an investigation; data was provided in the form of bar graphs. It required interpretation and analysis and, in addition, knowledge of the processes and concepts relating to investigations.

Paper 2: The worst-answered question was Q2.1 (13 marks), which yielded a 31% average performance for the sampled candidates. This question was based on a set of diagrams representing meiosis and required the application of knowledge; hence it was categorised as a higher-order question (level C). Seven of the 13 marks may be considered easy to moderately difficult, while six may be categorised as difficult.

Areas of poor performance

MATHEMATICS

Paper 1:
• Q10 (6 marks) (16%), on geometry (similarity) combined with optimisation, caused the most problems.
• Candidates also struggled a lot with Q11 (7 marks) (30%) and Q12 (9 marks) (29%) on basic counting principles and probability.
• The poor performance in probability is still a reason for concern. It indicates that candidates generally still have a poor grasp of probability, especially the application of counting principles.

Paper 2:
• Candidates performed the worst in Q10 (16 marks) on Euclidean geometry, which required some higher-order thinking and reasoning (32%).
• Their performance in the three questions on trigonometry (42 marks) was poor and disappointing.

Areas of poor performance

MATHEMATICAL LITERACY

Paper 1 • Definitions in context were poorly answered, owing to poor language skills or inadequate preparation. See Q1.2.3 (2 marks), Q2.1.1 (2 marks), Q2.2.1 (2 marks), Q2.3.2 (2 marks), Q3.1.3 (2 marks) and Q5.2.1 (2 marks).

• Conversions of units, time and number formats were poorly answered. See Q1.1.4 (2 marks), Q1.4.1 (2 marks), Q2.2.4 (3 marks), Q3.1.2 (5 marks), Q3.1.4 (2 marks), Q3.2.1 (2 marks), Q3.2.2 (2 marks), Q4.1.5 (4 marks), Q4.2.4 (2 marks) and Q5.1 (11 marks).

Paper 2 • Q2 (38 marks) and Q3 (39 marks) were poorly answered; measurement is usually a difficult question for many candidates, and interpretation skills were poor. See Q1.2 (13 marks), Q1.3 (4 marks) and Q3.2 (21 marks).

• Q3 was also poorly answered due to poor interpretation skills.

• The probability concept and percentage increase in Q4 presented problems to some candidates.

MECHANICAL TECHNOLOGY

Automotive • Questions 3 (14 marks), 5 (23 marks), 8 (23 marks), 9 (18 marks) and 10 (32 marks) were poorly answered, posing challenges to many candidates.

54

Areas of poor performance

PHYSICAL SCIENCES

Paper 1 • Question 1 (20 marks): Poorly answered sub-questions were 1.2, 1.4, 1.8 and 1.10 (2 marks per sub-question).

• Question 2: Question 2.3.3 (4 marks) was the only challenge in this question, because it required resolving the tension into components.

• Question 5: Candidates could not give a correct and complete definition of a non-conservative force in Question 5.1 (2 marks). As usual, Question 5.5 (5 marks), involving energy principles, proved to be beyond the capabilities of most learners. In calculating the work done by the frictional force, candidates omitted the work done by either gravity or the force F.

• Question 9 (6 marks): This question, based on a drawn graph, was the worst answered question in the paper. The equation of the graph was given in the question, but learners failed to apply their basic mathematical skills to the required physics.

Paper 2 • Question 1 (20 marks): Poorly answered sub-questions were 1.5, 1.6, 1.7 and 1.8 (2 marks per sub-question).

• Question 4 (17 marks): Learners found it difficult to identify the different types of organic reactions and to determine the ensuing products of these reactions.

55

Areas of poor performance

TECHNICAL MATHEMATICS

Paper 1 • Candidates generally performed poorly in Question 4 (24 marks) (Functions), with an average percentage of 28,74% for the sampled candidates.

Paper 2 • Candidates generally performed poorly in Question 9 (14 marks) (Proportion and Similarity), with an average percentage of 11,0%.

• Candidates also performed poorly in Question 5 (12 marks) (Trigonometric Functions) with an average of 31,0%, in Question 6 (9 marks) (Solution of Triangles) with an average of 36,0%, and in Question 8 (13 marks) (Circle Geometry) with an average of 35,0%.

56

Areas of poor performance

TECHNICAL SCIENCES

Paper 1 • Question 2 (22 marks): Candidates were not familiar with the information sheet (wrong formulae were used).

• Question 3 (25 marks): Candidates struggled with the concepts of momentum and impulse.

• Question 4 (22 marks): Candidates could not recall the definitions and laws of energy.

Paper 2 • Question 4 (15 marks): Candidates did not know the different types of addition reactions.

• Question 5 (17 marks): Candidates failed to recall the definition of electrolysis.

• Question 9 (11 marks): Many learners did not know the definition of electromagnetic waves.

57

STANDARDISATION AND RESULTING

61

What is Standardisation?

A process used to eliminate the effect of factors other than the learners’ knowledge, abilities and aptitude on their performance.

62

Why Standardise Results?

• To ensure that learners are not advantaged or disadvantaged by factors other than their knowledge of the subject, abilities and aptitude.

• To achieve comparability and consistency of the results from one year to the next.

63

Standardisation Principles

• No adjustment should exceed 10% of the historical average in either direction (upward or downward).

• If the distribution of the raw marks is below or above the historical average, the marks may be adjusted either way, subject to limitations.

• In the case of an individual candidate, the adjustment effected should not exceed half of the raw mark obtained by the candidate.

• After considering qualitative and quantitative reports, Umalusi formulates positions on each subject.

64
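The first and third principles above are simple numeric caps, which can be sketched in code. This is a hypothetical illustration of the stated limits only; the function name and example values are invented, and this is not Umalusi's actual standardisation procedure.

```python
# Hypothetical sketch of the two numeric limits described above.
# clamp_adjustment and its example values are illustrative assumptions,
# not Umalusi's actual standardisation method.

def clamp_adjustment(raw_mark: float, proposed_adjustment: float,
                     historical_average: float) -> float:
    """Cap a proposed mark adjustment by the two stated limits:
    (1) no more than 10% of the historical average, in either direction;
    (2) for an individual candidate, no more than half the raw mark."""
    limit = 0.10 * historical_average          # 10% cap, either direction
    adjustment = max(-limit, min(limit, proposed_adjustment))
    individual_cap = raw_mark / 2              # half of the candidate's raw mark
    return max(-individual_cap, min(individual_cap, adjustment))

# With a historical average of 50%, the cap is 5 marks either way:
print(clamp_adjustment(raw_mark=40, proposed_adjustment=8, historical_average=50))  # → 5.0
```

Note that the individual cap binds for weaker candidates: a candidate with a raw mark of 6 could receive at most 3 marks of adjustment, even when the subject-level cap would allow more.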

Results Standardised by Umalusi Per Qualification

Department of Higher Education and Training (DHET)
• National Certificate (Vocational)
• General Education and Training Certificate
• N1-N3

Department of Basic Education (DBE)
• National Senior Certificate
• Senior Certificate (amended)

Independent Examinations Board (IEB)
• National Senior Certificate
• General Education and Training Certificate

South African Comprehensive Assessment Institute (SACAI)
• National Senior Certificate

Benchmark Assessment Agency (BAA)
• General Education and Training Certificate

65

Standardisation Process

(1) Question papers are set by examiners

(2) Examiners submit question papers to internal moderators

(3) Internal moderators submit QPs to external moderators

(4) Question papers are approved by external moderators

(5) Monitoring exams’ state of readiness

(6) Monitoring exams

(7) Verification of marking and capturing of marks

(8) Standardisation and approval of results by the ASC

(9) Announcement of the approval of results by Council

The process spans 18 months.

66

Standardisation Decisions DBE NSC

Description                                     Nov 2016   Nov 2017   Nov 2018
Number of instructional offerings presented         58         58         67
Raw marks                                           26         38         39
Adjusted (mainly upwards)                           28         16         17
Adjusted (mainly downwards)                          4          4         11
Number of instructional offerings standardised      58         58         67

67

Conclusion

The conduct, administration and management of the National Senior Certificate is regulated, and it is imperative that all the processes undertaken are premised on the “Regulations Pertaining to the Conduct, Administration and Management of the National Senior Certificate”. Non-adherence to the Regulations and Umalusi policies may lead to instances which can compromise the credibility and integrity of the assessments and examinations.

68

Approval of the results

The evidence presented to Umalusi suggests that the conduct, administration and management of the DBE November 2018 NSC examinations was satisfactory.

The rigour with which the DBE monitored the examinations, both in the preparatory phase and during the actual conduct of the examinations, is an indication of the level of commitment to the delivery of a credible examination. Umalusi appreciates the efforts put into the whole process.

The DBE is also commended on its swift response in dealing with the different cases of alleged irregularities reported during the conduct of the examinations.

The DBE is, however, still urged to address the directives for compliance issued in the 2018 quality assurance report.

69

Approval of the results

Having studied all the evidence at hand on the management and conduct of the National Senior Certificate examinations administered by the Department of Basic Education (DBE), Umalusi is satisfied that, apart from isolated instances of irregularities, there were no systemic irregularities reported that could have compromised the overall integrity and credibility of the November 2018 NSC examination. As a result, the Executive Committee of Council approved the release of the DBE November 2018 NSC results.

70

THANK YOU

71

Contact details

012 349 1510

0800 000 889

[email protected]

[email protected]

[email protected]

72