Transcript of Chapter 11

Page 1: Chapter 11

CHAPTER 11

REVISING INSTRUCTIONAL MATERIALS

Carolyn Jenkins-Haigler

Page 2: Chapter 11

BACKGROUND

Summarizing and analyzing data obtained from formative evaluation

Revising materials

The changes that are made to the content of the materials

The changes that are related to the procedures employed in using the materials

Page 3: Chapter 11

OBJECTIVES

Describe various methods for summarizing data obtained from formative evaluation studies.

Summarize data obtained from formative evaluation studies.

Given summarized formative evaluation data, identify weaknesses in instructional materials and instructor-led instruction.

Given formative evaluation data for a set of instructional materials, identify problems in the materials and suggest revisions for the materials.

Page 4: Chapter 11

KINDS OF DATA TO ANALYZE

Learner characteristics

Entry behavior

Direct responses to the instruction

Learning time

Posttest performance

Responses to an attitude questionnaire

Comments made directly in the materials
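
For designers who keep these data electronically, the sketch below shows one possible way to hold the kinds of data listed above for a single learner. It is a minimal Python illustration; the field names and types are assumptions made for this example, not a schema prescribed by the chapter.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class LearnerRecord:
    # Illustrative container for one learner's formative evaluation data.
    learner_id: str
    characteristics: Dict[str, str] = field(default_factory=dict)     # e.g. prior coursework, reading level
    entry_behaviors: Dict[str, bool] = field(default_factory=dict)    # prerequisite skill -> demonstrated?
    item_responses: Dict[str, bool] = field(default_factory=dict)     # assessment item -> answered correctly?
    learning_time_minutes: float = 0.0                                # time spent in the instruction
    attitude_ratings: Dict[str, int] = field(default_factory=dict)    # questionnaire item -> rating
    material_comments: List[str] = field(default_factory=list)        # comments written directly in the materials

record = LearnerRecord(learner_id="learner_01", learning_time_minutes=42.5)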

Page 5: Chapter 11

ANALYZING DATA FROM ONE-TO-ONE TRIALS

The designer must look at the similarities and differences among the responses of the learners, and determine the best changes to make in the instruction.

Three Sources Of Suggestions For Changes

Learner suggestions

Learner performance

Your own reactions to the instruction

Page 6: Chapter 11

ANALYZING DATA FROM SMALL-GROUP TRIALS

The fundamental unit of analysis for all the assessments is the individual assessment item. Performance on each item must be scored as correct or incorrect.

Methods For Summarizing Data

Item-by-objective performance

Graphing learners’ performance

Descriptive fashion
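
As one way to picture the item-by-objective method listed above, here is a minimal Python sketch. The item-to-objective mapping and the 1/0 scores are invented for illustration; the chapter does not prescribe any particular tooling.

# Invented data: which objective each item measures, and each learner's 1/0 item scores.
item_to_objective = {"item1": "objective_1", "item2": "objective_1", "item3": "objective_2"}
scores = {
    "learner_A": {"item1": 1, "item2": 0, "item3": 1},
    "learner_B": {"item1": 1, "item2": 1, "item3": 0},
    "learner_C": {"item1": 0, "item2": 1, "item3": 1},
}

# Percent of learners answering each item correctly, grouped under its objective.
summary = {}
for item, objective in item_to_objective.items():
    percent_correct = 100 * sum(s[item] for s in scores.values()) / len(scores)
    summary.setdefault(objective, {})[item] = percent_correct

for objective, items in summary.items():
    print(objective, items)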

Page 7: Chapter 11

ANALYZING DATA FROM THE FIELD TRIAL

Comments can be captured in one-on-one charts where you list out comments made by each learner

Assessment scores can be shown in charts or hierarchies that represent your individual objectives

Page 8: Chapter 11

LEARNERS’ PERFORMANCE ACROSS TESTS

Derive assessment instruments based on the objectives to:

Diagnose an individual's possession of the necessary prerequisites for learning new skills

Check the results of student learning during the process of a lesson

Provide documentation of students' progress for parents or administrators

It is useful in evaluating the instructional system itself (formative/summative evaluation) and for early determination of performance measures before the development of lesson plans and instructional materials.
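
To make the across-tests idea concrete, the short sketch below compares hypothetical pretest and posttest percent-correct scores by objective. The numbers and the 80% mastery criterion are assumptions for illustration only, not values from the chapter.

# Invented percent-correct scores by objective, before and after instruction.
pretest = {"objective_1": 30, "objective_2": 55, "objective_3": 20}
posttest = {"objective_1": 85, "objective_2": 60, "objective_3": 45}

MASTERY = 80  # an arbitrary criterion used only for this sketch

for objective in pretest:
    gain = posttest[objective] - pretest[objective]
    note = "  <- candidate for revision" if posttest[objective] < MASTERY else ""
    print(f"{objective}: {pretest[objective]}% -> {posttest[objective]}% (gain {gain}){note}")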

Page 9: Chapter 11

GRAPHING LEARNERS' PERFORMANCES

The goal of continuous monitoring and charting of student performance is twofold. First, it provides you, the teacher, information about student progress on discrete, short-term objectives. It enables you to adjust your instruction to review or re-teach concepts or skills immediately, rather than waiting until you've covered several topics to find out that one or more students didn't learn a particular skill or concept. Second, it provides your students with a visual representation of their learning. Students can become more engaged in their learning by charting and graphing their own performance.
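
A charting sketch along these lines, assuming matplotlib is available; the session numbers and scores are invented to illustrate plotting one learner's progress on a short-term objective.

import matplotlib.pyplot as plt

sessions = [1, 2, 3, 4, 5]                 # hypothetical practice sessions
percent_correct = [40, 55, 65, 80, 90]     # hypothetical scores for one learner

plt.plot(sessions, percent_correct, marker="o")
plt.xlabel("Session")
plt.ylabel("Percent correct")
plt.title("One learner's progress on a short-term objective")
plt.ylim(0, 100)
plt.show()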

Page 10: Chapter 11

OTHER TYPES OF DATA

OBSERVATIONAL ASSESSMENT is the most common form of formative assessment. Teachers can circulate around the room to monitor students' progress. If students are working independently or in groups, teachers should intervene when the students do not understand the material. Teachers can also take note of students' comments and participation levels during class discussions to gauge their learning.

SELECTED RESPONSE ASSESSMENTS are any type of objective exam where there is only one correct answer for each question. Multiple choice, fill-in-the-blank, matching, and true/false questions are all types of selected response assessments. This type of assessment allows the teacher to score exams quickly and with a large degree of reliability in scoring from one exam to another.

CONSTRUCTED RESPONSE ASSESSMENTS require students to generate their own response rather than selecting a single response from several possible ones. These exams are much more subjective because there is not a single correct answer. Instead, teachers must grade either with a rubric or holistically to maintain a fair degree of reliability.

PERFORMANCE ASSESSMENTS require students to perform as a means of showing they understand class material. The types of performances can include actual performing, as in a class debate, or performance by creating, as in making a brochure or TV ad. These assessments evaluate complex cognitive processes as well as attitudes and social skills, and students often find them engaging.

PORTFOLIO ASSESSMENTS evaluate a student's progress over the course of the semester. A portfolio is more than a one-time picture of what a learner has accomplished; it includes all of a student's work in a particular area. For example, a student in an English class could have a portfolio for a research paper that includes note cards, outlines, rough drafts, revisions, and a final draft. The teacher would evaluate the portfolio as a whole, not just the final draft, to see how the student has grown.

Page 11: Chapter 11

SEQUENCE FOR EXAMINING DATA

The information on the clarity of instruction, impact on learners, and feasibility of instruction needs to be summarized and focused.

Particular aspects of the instruction found to be weak can then be reconsidered in order to plan revisions likely to improve the instruction for similar learners.

Page 12: Chapter 11

ENTRY BEHAVIORS

A step-by-step determination of what people are doing when they perform the goal and what entry behaviors are needed.

Involves identification of the context in which the skills will be learned and the context in which the skills will be used.

Page 13: Chapter 11

PRETESTS & POSTTESTS

After the students in the one-to-one trials have completed the instruction, they should review the posttest and attitude questionnaire in the same fashion.

After each item or step in the assessment, ask the learners why they made the particular responses that they did.

This will help you spot not only mistakes but also the reasons for the mistakes, which can be quite helpful during the revision process.

Page 14: Chapter 11

INSTRUCTIONAL STRATEGY

Instructional strategy is an overall plan of activities to achieve an instructional goal; it includes the sequence of intermediate objectives and the learning activities leading to the instructional goal.

Its purpose is to identify the strategy to achieve the terminal objective and to outline how instructional activities will relate to the accomplishment of the objectives.

Emphasis is given to the presentation of information, practice and feedback, and testing.

A well-designed lesson should demonstrate knowledge about the learners, the tasks reflected in the objectives, and the effectiveness of teaching strategies.

Page 15: Chapter 11

LEARNING TIME

One design interest during one-to-one evaluation is determining the amount of time required for learners to complete the instruction. This is a very rough estimate because of the interaction between the learner and the designer.

You can attempt to subtract a certain percentage of the time from the total time, but experience has indicated that such estimates can be quite inaccurate.
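
As a worked illustration of that rough adjustment (the figures here are invented, and, as noted above, the result can be quite inaccurate):

total_minutes = 60                    # observed time for the one-to-one session
assumed_interaction_share = 0.25      # assumed share spent talking with the designer
estimated_learning_time = total_minutes * (1 - assumed_interaction_share)
print(estimated_learning_time)        # 45.0 minutes, still only a rough estimate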

Page 16: Chapter 11

INSTRUCTIONAL PROCEDURE

Instructional strategy is an overall plan of activities to achieve an instructional goal; it includes the sequence of intermediate objectives and the learning activities leading to the instructional goal.

Its purpose is to identify the strategy to achieve the terminal objective and to outline how instructional activities will relate to the accomplishment of the objectives.

Emphasis is given to the presentation of information, practice and feedback, and testing.

A well-designed lesson should demonstrate knowledge about the learners, the tasks reflected in the objectives, and the effectiveness of teaching strategies.

Page 17: Chapter 11

REVISION PROCESS

Use the data, your experience, and sound learning principles as the bases for your revision.

The aim is to revise the instruction so as to make it as effective as possible for a larger number of students.

Data from the formative evaluation are summarized and interpreted to attempt to identify difficulties experienced by learners in achieving the objectives and to relate these difficulties to specific deficiencies in the materials.

Page 18: Chapter 11

REVISING SELECTED MATERIALS

1. Omit portions of the instruction.

2. Include other available materials.

3. Simply develop supplementary instruction.

Page 19: Chapter 11

SUMMARY

The final step in the design and development process (and the first step in a repeat cycle) is revising the instruction. Data from the formative evaluation are summarized and interpreted to identify difficulties experienced by learners in achieving the objectives and to relate those difficulties to specific deficiencies in the instruction. The data are also used to re-examine the validity of the instructional analysis and the assumptions about the entry behaviors and characteristics of learners. It may be necessary to reexamine statements of performance objectives and test items in light of the collected data. The instructional strategy is reviewed, and finally all of these considerations are incorporated into revisions of the instruction to make it a more effective instructional tool.

Page 20: Chapter 11

CAROLYN JENKINS-HAIGLER