FSU - Systematic Instructional Design Guide


Description

This study guide is a companion to The Systematic Design of Instruction by Walter Dick, Lou Carey, and James O. Carey. These four documents have been created as job aids (study guides) for the following four courses: EME5601, EME5603, EDP5216, EDP5217. The courses did not present the students with such documentation, and thus, I created said documentation for myself and for those students who would take these courses after me.

Transcript of FSU - Systematic Instructional Design Guide


SYSTEMATIC INSTRUCTIONAL DESIGN

This study guide is a companion to The Systematic Design of Instruction by Walter Dick, Lou Carey, and James O. Carey.

Instructional Design

Reiser & Dempsey (2007) define instructional design as a "systematic process that is employed to develop education and training programs in a consistent and reliable fashion" (p. 11).

Reference: Reiser, R.A., & Dempsey, J.V. (2007). Trends and issues in instructional design (2nd ed.). Upper Saddle River, NJ: Pearson Education, Inc.

Instructional Design Models

Instructional design models (or methods) are guidelines (Read: blueprints) that instructional designers follow when creating instruction.

- ADDIE instructional design model
- Systematic Design of Instruction Model (Dick & Carey instructional design model)
- Reeves multimedia design model

Systematic Design of Instruction Model (Dick & Carey Instructional Design Model)

Based on a systems approach for designing instruction

Based on a behaviorist perspective of learning

Instructional Triad (diagram): goals & objectives, instruction, and assessment

Define Instructional Goals (a.k.a. Front-end analysis)

Refer to figure 2.1, p. 17.
1) Conduct performance analysis
2) Conduct needs assessment / analysis
3) Conduct job analysis
4) Identify the instructional goal

a) The instructional goal is written as a goal statement.

b) The goal statement is the broad, general purpose of the instruction. It is not measurable because it does not clarify exactly what a learner must do or how a learner should perform.

A goal may be defined as a general statement of desired accomplishment. It does not specify exactly all of the components or steps or how each step will be achieved on the road to accomplishing the goal.

Example goals: (1) Students will master the procedure of setting up a multiple-camera production system. (2) Students will understand the concept of basic networking.

5) Classify the goal
Classify each goal into one of the domains (identify the type of learning outcome specified in the goal statement). The type of assessment and instructional strategy will vary depending upon the learning outcome to be taught.

Four types of learning outcomes:
i) Intellectual skill goals (cognitive domain)
(1) Concept - identify examples of concepts. E.g., identify the architectural style of buildings.
(2) Rule - apply a rule to solve a problem. E.g., compute averages.
(3) Problem solving - select and apply a variety of rules to solve problems. E.g., write a business letter.
ii) Psychomotor skill goals (psychomotor domain)
Goals that require basic motor skills and/or physical movement.
iii) Attitudinal goals (affective domain)
Goals pertaining to attitudes, appreciations, values, and emotions.
iv) Verbal information goals (cognitive domain)
Facts and knowledge. E.g., name the capitals of various countries.
Related: cognitive strategy goals (cognitive domain) - employ a learning strategy. E.g., use a memorization technique.



Conduct Goal Analysis (a.k.a. Instructional Analysis or Task Analysis)

Purpose: To identify the learning steps and skills that will be involved in reaching the goal. This is done through a task analysis, which identifies each step and the skills needed to complete that step, and an information-processing analysis, which identifies the mental operations the learner needs to employ in performing that skill. The task analysis is performed by asking "What are all of the things the student must know and/or be able to do to achieve the goal?" Task analysis is the systematic process of identifying the tasks to be trained and a detailed analysis of each of those tasks.

1) Aids you might use to help conduct the procedural analysis: consult documentation, interview experts, observe, use software.

2) A goal analysis consists of:
a) Procedural analysis
b) Subordinate skills analysis
i) Entry skills analysis
3) Conduct procedural analysis (procedural goal analysis)
Identify and sequence the major steps (tasks) required to perform (or complete) the goal.

Not all goal statements will involve steps.

4) Conduct subordinate skills analysis
a) Identify subordinate skills (a.k.a. subtasks)
i) Subordinate skills
(1) Examine each step to determine what learners must know or be able to do before they can learn to perform that step in the goal.

Type of goal or step    Type of subordinate skills analysis
Intellectual skill      Hierarchical*
Psychomotor skill       Hierarchical*
Verbal information      Cluster
Attitude                Hierarchical* and/or cluster

*Note that the hierarchical analyses can contain sequences of procedural steps.

(2) Hierarchical approach (for intellectual or psychomotor skills)
(a) Identify the subordinate skills in each step
(b) Diagram your analysis
(c) Procedural analysis
(3) Cluster analysis approach (for verbal information)
(4) Analysis techniques for attitude goals
(a) Ask "What must learners do when exhibiting this attitude?" and "Why should they exhibit this attitude?"
(b) The second part of the analysis asks "Why should the learner make a particular choice?"
5) Identify entry-level skills
a) For a hierarchy
b) For a cluster

Analyze learners and contexts (identify entry behaviors / learner characteristics)

Purpose: Having determined via the instructional analysis which steps and skills the learner must accomplish, it is now necessary to identify the knowledge and skill level that the learner possesses at the outset. Although there may be pronounced differences from learner to learner in their knowledge and skill levels, the instruction must be targeted as much as possible to the level of the learners’ needs.

1) Learner analysis
a) What information do designers need to know about their target population?
i) General characteristics
- Age, educational and ability levels, grade level
- Homogeneity
- Group characteristics
- Attitudes toward content and potential delivery system
- Academic motivation (ARCS)
- General learning preferences
- Attitudes toward the organization giving the instruction


ii) Characteristics directly related to the task analysis
- Can learners already perform the terminal objective?
- Entry skills
- Prior knowledge of the topic area
2) Context analysis

The purpose of a context analysis is to identify and describe the environmental factors that inform the design of instruction.

a) Performance context analysis (analysis of the performance context)
- Managerial or supervisory support
- Physical aspects of the site
- Social aspects of the site
- Relevance of skills to the workplace
The major outputs of this phase are 1) a description of the physical and organizational environment where the skills will be used, and 2) a list of any special factors that may facilitate or interfere with the learners' use of the new skills.

b) Learning context analysis (analysis of the learning context)
- Compatibility of the site with instructional requirements
- Adaptability of the site to simulate the workplace
- Adaptability for delivery approaches
- Learning site constraints affecting design and delivery
The major outputs are 1) a description of the extent to which the site can be used to deliver training on skills that will be required for transfer to the workplace, and 2) a list of any limitations that may have serious implications for the project.

3) Evaluation and revision of the instructional analysis

Develop Instructional Objectives (a.k.a. performance objectives, learning objectives, behavioral objectives, or simply objectives; also called instructional outcomes or learning outcomes)

Purpose: At this stage, it is necessary to translate the needs and goals into objectives that are sufficiently specific to guide the instructor in teaching and the learner in studying. In addition, these objectives form the blueprint for testing as a means of evaluating both the instruction and the learning that has occurred. Example: The student will be able to explain the role of the waterfall model in system analysis and design.

Assessment and evaluation measure whether or not learning and/or learning objectives are being met.

Two types of instructional objectives:
a) Terminal objective
(1) Only one
b) Subordinate objectives (enabling objectives or intermediate objectives)
(1) Often more than one

The instructional goal is rephrased as a terminal objective describing what students will be able to do in the learning context, and subordinate objectives describe the building-block skills (steps and subordinate skills) that students must master before the terminal objective is obtained. The terminal objective describes a specific behavior that learners engage in to demonstrate success with a goal.

The subordinate objectives may be general or specific, but they are used as a means of attaining the terminal objective. They are objectives that apply during the learning process.

Generally, each instructional unit (or lesson) has a corresponding terminal objective.

Instructional objectives are:
- Specific, in that they describe precisely what the learner is expected to do.
- Outcome-based, in that they state what the learner should be able to do after the instruction is complete. Instructional objectives focus on learning outcomes for students, not on actions by the teacher.
- Measurable, in that they describe learning outcomes that can be measured. Note that goal statements are not measurable.

Instructional objectives are written to include (the Mager model for instructional objectives):

Audience - the learner(s) the objective is written for. Usually "the learner" or "the student".


Behavior (action or performance) - Identify the behavior the learner will be performing when he/she has achieved the objective. The behavior should be specific and observable (an observable learner outcome).

Condition - Describe the relevant conditions under which the learner will be acting: the conditions under which the behavior is to be completed, including what tools or assistance are to be provided. Conditions reflect the context that will be available in the learning environment.

Standard (criterion or degree) - Describe the level of performance that is desirable, including an acceptable range of answers that are allowable as correct. Do not provide ambiguous standards such as "…to the instructor's satisfaction".
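The four Mager components can be made concrete by modeling an objective as a small data structure. Below is a minimal, hypothetical Python sketch (the class, field names, and sample objective are illustrative only, not from the guide):

```python
from dataclasses import dataclass

@dataclass
class Objective:
    """One Mager-style performance objective (hypothetical model)."""
    audience: str   # who the objective is written for
    behavior: str   # the observable action the learner will perform
    condition: str  # tools/context under which the behavior occurs
    degree: str     # the criterion for acceptable performance

    def render(self) -> str:
        # Assemble the four parts into a single objective statement.
        return f"{self.condition}, {self.audience} will {self.behavior} {self.degree}."

terminal = Objective(
    audience="the student",
    behavior="compute the average of a list of exam scores",
    condition="Given a calculator and a list of ten scores",
    degree="with 100% accuracy",
)
print(terminal.render())
```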

Differences in writing conventions: the goal statement describes the context in which the learner will ultimately use the new skill, while the terminal objective describes the conditions for performing the goal at the end of the instruction.

Objectives are to include the condition (CN), the type of learning (intellectual, behavioral, …), and the criteria (CR). See pp. 122-125.

1) Edit goal to reflect eventual performance context if necessary.

2) Write terminal objective to reflect context of learning environment.

a) In a job/task analysis, the terminal objective is developed from duties.
3) Write objectives for each step in the goal analysis for which there are no substeps shown.
a) In a job/task analysis, the enabling objectives are derived from tasks.
4) Write an objective for each grouping of substeps under a major step of the goal analysis, or write objectives for each substep.
5) Write objectives for all subordinate skills.
6) Write objectives for entry skills if some students are likely not to possess them.
7) -OR- only write objectives for those items in your task analysis that you plan to assess.
8) SEE the Instructional Objective Helper - http://www.cogsim.com/idea/forms/Inst_obj2.htm

PRODUCE LEARNER / TASK ANALYSIS REPORT

The report consists of:
1) Learner analysis
2) Task analysis

Develop assessment instruments

Purpose: To diagnose whether a learner possesses the necessary prerequisites (entry skills) for learning the new information. To check the results of student learning during the process of instruction and provide documentation of learners' progress (or improvement). To test the terminal objective and relevant subordinate objectives.

1) Identify which steps and/or subordinate skills to assess.

2) Identify which type of test to conduct.
a) Four types of criterion-referenced (objective-referenced) tests:
i) Entry skills test
ii) Pretest - tests entry skills and instructional objectives
iii) Practice or rehearsal test (formative assessment)
iv) Posttest (summative assessment) - measures learners' performance on the subordinate and terminal objectives

3) Identify what domain the objective is in.
a) Organizing your objectives according to learning domain can also aid you in selecting the most appropriate type of assessment item.
i) Verbal (a.k.a. knowledge), intellectual (concept, rule, or problem solving), attitude, psychomotor

4) Writing test items
You should write an assessment item for each objective whose accomplishment you want to measure.

a) Read the objective and determine what it wants someone to be able to do (i.e., identify the performance).

b) Draft a test item that asks students to exhibit that performance.

c) Read the objective again and note the conditions under which the performance should occur (i.e., tools and equipment provided, people present, key environmental conditions).


d) Write those conditions into your item.
e) For conditions you cannot provide, describe approximations that are as close to the objective as you can imagine.

f) If you feel you must have more than one item to test an objective, it should be because (a) the range of possible conditions is so great that one performance won’t tell you that the student can perform under the entire range of conditions, or (b) the performance could be correct by chance. Be sure that each item calls for the performance stated in the objective, under the conditions called for.

5) Criteria for writing test items:
- Goal-centered criteria
- Learner-centered criteria
- Context-centered criteria
- Assessment-centered criteria

6) Note the type of behavior stated in the objective and select the associated types of test items (p. 140).

7) Objective test item or non-objective test item:
a) Is the test item an objective test item? Completion, short answer, true/false, matching, and multiple choice.
i) Sequence items
ii) Writing directions
iii) Determine how the instrument will be scored
b) Is the test item a performance, product, or attitude item (not an objective item)? Essay, product development, live performance.
i) Writing directions
ii) Developing the instrument
(1) Develop a response format
(a) Checklist (yes/no); rating scale (0, 1, 2, 3, …); frequency count (tally)
(2) Identify the elements to be evaluated
(3) Paraphrase each element
(4) Sequence the elements in the instrument
(5) Select the type of judgment to be made by the evaluator
(6) Determine how the instrument will be scored
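As a concrete illustration of the response-format steps above, here is a small, hypothetical Python sketch of checklist scoring (the elements and function are invented for illustration): each element is judged yes/no, and the score is the count of yes judgments.

```python
# Hypothetical checklist instrument for a live-performance assessment.
# Each element is an observable behavior the evaluator judges yes/no.
ELEMENTS = [
    "Positions camera one for the wide shot",
    "White-balances all cameras",
    "Checks audio levels before recording",
]

def score_checklist(judgments: dict[str, bool]) -> int:
    """Checklist scoring: one point per element judged 'yes'."""
    return sum(1 for element in ELEMENTS if judgments.get(element, False))

observed = {
    "Positions camera one for the wide shot": True,
    "White-balances all cameras": True,
    "Checks audio levels before recording": False,
}
print(f"Score: {score_checklist(observed)}/{len(ELEMENTS)}")  # Score: 2/3
```

A rating scale or frequency count would change only the response format: replace the booleans with 0-3 ratings or tallies and sum those instead.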

8) Design evaluation
The skills, objectives, and assessments must all refer to the same skills, so careful review is required in order to ensure this congruence.

Requirements: instructional analysis diagram, performance objectives, summaries of learner characteristics and of the performance and learning contexts, and assessment instruments.

Complete a design evaluation chart (congruence assessment); use the design evaluation chart in some of the following steps:

i) Organize and present the materials to illuminate their relationship.

ii) Judge the congruence between the information and skills in the instructional goal analysis and the materials created

iii) Judge the congruence between the materials and the characteristics of the target learners

iv) Judge the congruence between the performance and learning contexts and the materials

v) Judge the clarity of all the materials
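One way to carry out the congruence assessment is to tabulate, for every skill in the goal analysis, whether a matching objective and assessment item exist. A hypothetical Python sketch (the chart contents are invented for illustration):

```python
# Hypothetical design evaluation chart: each skill from the goal analysis
# should map to at least one objective and one assessment item.
chart = {
    "1. Identify camera components": {
        "objective": "Given a studio camera, the student will label its parts...",
        "assessment": "Labeling diagram, items 1-5",
    },
    "2. Cable the camera to the switcher": {
        "objective": "Given cables and a switcher, the student will connect...",
        "assessment": None,  # gap: no assessment item written yet
    },
}

for skill, row in chart.items():
    missing = [part for part in ("objective", "assessment") if not row.get(part)]
    if missing:
        print(f"Congruence gap in '{skill}': missing {', '.join(missing)}")
```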

Select and Develop an instructional strategy (instructional method)

Purpose: To outline how instructional activities will relate to the accomplishment of the objectives.

Determine the delivery system:
a) Review the instructional analysis and identify logical clusters of objectives that will be taught in an appropriate sequence.

b) Plan the learning components that will be used in the instruction.

c) Choose the most effective student groupings for learning.

d) Specify the most effective media and materials that are within the range of cost, convenience, and practicality for the learning context.

e) Assign objectives to lessons and consolidate media selection.

f) Select or develop a delivery system that best accommodates the decisions made in steps 1 through 5.

Select the content sequencing: from left to right, subskills first (3 exceptions noted in the text).

Select clustering size - size of unit(s) of instruction.

Develop instructional materials


Except for the pretests and posttests, all learning components are included within the materials.

If instruction is intended to be independent of an instructor, then the materials will have to include all the learning components in the strategy. The instructor in this case is not expected to play a role in delivering instruction.

Develop the instructional package:
- Instructional materials
- Assessments
- Course management information

Instructional activities
The instructional activities below are not intended to serve as a model for formatting your instructional strategy, nor should they be used as a model for the specific sequence of instructional activities you design.

The instructional strategy is related to the corresponding section of the task analysis.

Both Dick & Carey and Reiser have their own instructional strategies. They have different titles, but all the steps are there, and they are in the same order.

The two [rephrased] instructional strategies both follow, with a high degree of similarity, Gagné's Nine Events of Instruction: gain attention; inform learners of objectives; stimulate recall of prior learning; present content; provide learning guidance; elicit performance (practice); provide feedback; assess performance; enhance retention and transfer to the job.

Instructional activities (components) [Dick and Carey]
a) Pre-instructional activities
i) Gain attention and motivate learners
ii) Inform learners of objectives
iii) Stimulate recall of prerequisite skills
b) Content presentation
i) Present content*
ii) Provide learning guidance*
c) Learner participation
i) Provide practice*
ii) Provide feedback*
d) Assessment (note: usually not considered part of a lesson)
i) Entry skills test
ii) Pretest
iii) Practice tests
iv) Posttest
e) Follow-through activities (incorporated throughout the lesson)
i) Provide memory aids
ii) Promote transfer of learning

*Repeated for each objective or cluster of objectives.

Instructional Activities (events) [Reiser]

a) Introductory activities
i) Motivate learners (at the beginning and throughout)
(1) ARCS Model

(a) Attention: gain learners' attention.
(b) Relevance: inform learners of the relevance of what they are learning.
(c) Confidence: provide material at the appropriate level of difficulty.
(d) Satisfaction: derived through extrinsic or intrinsic rewards.

(2) Reiser Model
(a) Relevant to learners' needs and interests
(b) Entertaining (via stories, anecdotes, graphics, pictures, etc.)
(c) Interactive (via practice problems, rhetorical questions, etc.)
(d) Success-producing (present tasks that learners can accomplish, but don't make tasks so easy that learners will get bored)
(e) Enthusiasm-provoking (use language that will get learners interested; write in the active voice; avoid dull, dry words)
(f) Rewarding (incorporate praise, encouragement, etc., at the appropriate level for the target audience)

ii) Inform learners of objectives
(1) Two objectives for each element for which an objective is written:
(a) One objective is the three-part objective for the designer
(b) The second objective is a short form to provide to the learner
(2) Do not provide a long list of objectives to the learner


(3) Emphasize the relevance of the objectives

iii) Help learners recall prerequisite skills and knowledge
(1) Tell learners what the prerequisite knowledge and skills are
(2) Describe how the prerequisites are performed; recall stories and examples from previous instruction
(3) Question them
(4) Give learners cues or a few practice problems
b) The heart of the lesson

Information → example → practice → feedback

i) Present essential information (rules, procedures, facts, etc.)
(1) Be succinct, concise, precise
(2) Highlight key information
(3) Branch if necessary (see the index for further information)
(4) Different strategies:
(a) Break each individual part down and move forward (exposition theory)
(b) Give them the big picture and then move from there

ii) Present worked examples*
(1) Properly executed (correctly completed) samples of the desired skill that are shown to learners so they can see how to perform the skill correctly
(2) Provide examples
(3) Provide non-examples

iii) Provide practice*

(1) The opportunity students are given to perform a particular behavior prior to the time they are formally assessed

iv) Provide feedback*
(1) Provide frequent feedback
(2) Feedback should be arranged so learners have difficulty seeing it before trying to respond to practice problems
(3) Knowledge of results
(4) Knowledge of correct response
(5) Corrective/instructional feedback
c) Concluding activities

i) Promote transfer (often not a separate activity; built in throughout)
(1) Provide practice opportunities that simulate "real world" conditions
(2) Describe:
(a) "Real world" conditions and constraints
(b) "Real world" applications of the skill
ii) Conclude the "lesson" (not mentioned by Dick and Carey)
(1) Restate the goal/objective of the lesson
(2) Summarize key points
(3) Congratulate learners for attaining the goal
(4) Encourage learners to apply the skills taught

*For each objective or set of objectives.

Bloom's original cognitive domain taxonomy

There are three domains in Bloom's taxonomy (cognitive, affective, psychomotor).

Cognitive domain
A learning task is a question or statement that asks a student to complete a certain task.
- A knowledge learning task asks the student to recall content in the exact form in which it was presented. It asks learners to recall, locate specific information, or remember or memorize details.
- A comprehension learning task asks the student to restate material in their own words. It asks learners to explain, demonstrate, and translate understanding.
- An application learning task asks the student to apply rules, concepts, principles, and theories in new situations. It asks the learner to use the information, use methods/concepts/theories in new situations, and interpret facts.
- An analysis learning task asks the student to break down information into parts. It asks the learner to dissect information, identify and distinguish components, and compare and contrast.
- A synthesis learning task asks the student to put together ideas into a new or unique product or plan.
- An evaluation learning task asks the student to evaluate or judge the value of a concept or object. It asks the learner to judge outcomes, dispute thoughts and ideas, and form opinions.
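The level-to-verb associations described above lend themselves to a simple lookup table, e.g., for drafting or auditing learning tasks. A hypothetical Python sketch (the helper is illustrative; the verb lists are drawn from the descriptions above):

```python
# Verbs drawn from the level descriptions above (original Bloom taxonomy).
BLOOM_VERBS = {
    "knowledge":     ["recall", "locate", "remember", "memorize"],
    "comprehension": ["restate", "explain", "demonstrate", "translate"],
    "application":   ["apply", "use", "interpret"],
    "analysis":      ["break down", "dissect", "distinguish", "compare", "contrast"],
    "synthesis":     ["put together", "plan", "design"],
    "evaluation":    ["judge", "dispute", "form opinions"],
}

def guess_level(task: str) -> str | None:
    """Return the first Bloom level whose verbs appear in the task text."""
    lowered = task.lower()
    for level, verbs in BLOOM_VERBS.items():
        if any(verb in lowered for verb in verbs):
            return level
    return None

print(guess_level("Compare and contrast two network topologies."))  # analysis
```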

Develop Instructional Materials
Purpose: To select materials and media intended to convey the activities (or events) of instruction.

Delivery system
Three factors often force compromise in the selection of media and the delivery system:
- Availability of existing instructional materials
- Production and implementation constraints
- Amount of facilitation that the instructor will provide

Components of an instructional package

Instructional materials
Written, mediated, or facilitated materials that all students will use to achieve the objectives: student workbooks, activity guides, problem scenarios, computer simulations, case studies, resource lists...

Assessments
All instructional materials should be accompanied by objective tests or by product or performance assessments. These may include a pretest and a posttest. Decide whether the assessments will be available to students or will appear only as part of the instructor's materials so they are not available to students.

Course management information
A general description of the total package, typically called an instructor's manual, provides an overview of the materials and shows how they might be incorporated into an overall learning sequence for students. It may include the tests and other information important for implementing the course. If the management system is a commercial web-based instructional management system, then it may include automated class listings, student tracking, online testing, project monitoring, a grade book, and communication tools.

Existing instructional materials
Sharable Content Object Reference Model (SCORM) is a set of e-learning standards for the interchangeability of learning objects. A learning object is what might traditionally have been called a lesson or module; it includes a cluster of content with the required learning components of an instructional strategy.

Learning-centered and technical criteria

Goal-centered criteria for evaluating materials
The instructional analysis document provides the basis for determining the acceptability of the content in various instructional materials.

Learner-centered criteria for evaluating existing materials

Your learner analysis document should provide the foundation for consideration of the appropriateness of instructional materials for your target group

Learning-centered criteria for evaluating materials

Your instructional strategy can be used to determine whether existing materials are adequate as is or whether they need to be adapted or enhanced prior to use.

Context-centered criteria for evaluating materials

Your instructional and performance context analyses can provide the foundation for judging whether existing materials can be adopted as is or adapted for your settings.

Technical criteria for evaluating materials

Materials should also be judged for their technical adequacy, according to criteria related to 1) the delivery system, 2) packaging, 3) graphic design and typography, 4) durability, 5) legibility, 6) audio and video quality, 7) interface design, navigation, and functionality, and 8) updatability.

Instructional materials and formative evaluation
- Rough draft materials
- Rapid prototyping - a series of informed, successive approximations, emphasizing the word informed because this development approach relies absolutely on information gathered during tryouts to ensure the success of the final product
- Concurrent design and development

Conduct Formative Evaluations
Purpose: To provide data for revising and improving instructional materials.

1) Expert appraisal (reviewers): Have SMEs and/or interested specialists not involved in the instructional development project review the instruction.
a) SMEs comment on the accuracy and currency of the instruction.
b) A specialist in the type of learning outcome involved may critique and enhance the instructional strategy as it relates to that learning outcome.
c) Share the draft with someone familiar with the target audience.

Three phases of formative evaluation:
2) One-to-one (clinical evaluation) - the designer works with individual learners to obtain data to revise the materials.

Purpose: To remove the most obvious errors in the instruction and to obtain initial performance indications and reactions to the content by learners. During this stage of direct interaction between the designer and individual learners, the designer works individually with three or more learners who are representative of the target population. Take care not to overgeneralize the data gathered from only one individual.
Learner selection categories:
(1) By achievement - one who is above average, one who is average, one who is below average.
(2) By attitude (optional) - a highly positive learner, a neutral learner, a negative learner.
(3) By previous experience and years on the job (optional) - 10 years on the job, 5 years on the job, less than 1 year.

Procedure
An interactive process. The designer should sit diagonally beside the learner, read (silently) along with the learner, and, at predetermined points, discuss with the learner what has been presented in the materials.
Explain to the learner that a new set of instructional materials has been designed and that you would like his or her reaction to them. State that any mistakes learners might make are probably due to deficiencies in the material, not to the learners themselves. Encourage the learners to be relaxed and to talk about the materials. Have them go through the instructional materials and also take the test(s) provided with the materials. Learners must be convinced that it is legitimate to be critical of what is presented to them. Maintain an atmosphere of acceptance and support for any negative comments from the learner. It is best here to use qualitative evaluations rather than quantitative ones.

Criteria and Data
The three main criteria, and the decisions designers will make during the evaluation, are as follows.
(i) Clarity of instruction: Is the message, or what is being presented, clear to individual target learners?
1. Intended outcome - contains appropriate information
2. Considerations:
a. Message: vocabulary level, language complexity, message complexity, introductions, elaborations, conclusions, transitions
b. Links: contexts, examples, analogies, illustrations, demonstrations, reviews, summaries
c. Procedures: sequence, segment size, transition, pace, variation

(ii) Impact on learner: What is the impact of the instruction on individual learners' attitudes and achievement of the objectives and goals?
1. Intended outcome - yields reasonable learner attitudes and achievement
2. Considerations:
a. Attitudes: utility of the information and skills (relevance), how easy/difficult the information and skills are to learn (confidence), satisfaction with the skills learned
b. Achievement: clarity of directions and items for posttests, scores on posttests

(iii) Feasibility: How feasible is the instruction given the available resources (time/context)?
1. Intended outcome - appears feasible for use with the available learners, resources, and setting
2. Considerations:
a. Learner: maturity, independence, motivation
b. Resources: time, equipment, environment

Assessments and questionnaires
After the one-to-one trials, the learners should review the posttest and attitude questionnaire (instructional evaluation survey) in the same fashion. After each question, ask the learners why they made the response that they did.

Learning time
Identify the amount of time required for learners to complete the instruction. This is only an estimate because of the interaction between the learner and the designer. Do not attempt to subtract a certain percentage, because such estimates can be quite inaccurate.

REVISE INSTRUCTIONAL MATERIAL IF NECESSARY BEFORE PROCEEDING.

3) Small-group evaluation - a group of eight to twenty learners representative of the target population study the materials on their own and are tested to collect the required data.
Purposes: 1) To determine the effectiveness of changes made following the one-to-one evaluation and to identify any remaining learning problems that learners may have. 2) To determine whether learners can use the instruction without interacting with the instructor. The outcome is refined, effective instruction.

Criteria and Data
- Learner performance scores - typical measures used to evaluate instructional effectiveness include learner performance scores on pretests and posttests.
- Attitudes about the instruction, by means of an attitude questionnaire and sometimes a follow-up interview.
- Feasibility:
  - Time required for learners to complete both the instruction and the required performance measures
  - Costs and viability of delivering the instruction in the intended format and environment
  - Attitudes of those implementing or managing the instruction

Selecting learners
Select 8-20 learners who are representative of the target population. Use sampling methodology.

Procedures
The designer explains that the materials are in the formative stage of development and that it is necessary to obtain feedback on how they may be improved. Administer the materials in the manner in which they are intended to be used when they are in final form. If a pretest is to be given, it should be given first. Only when equipment fails or a learner becomes bogged down should an instructor intervene.

Assessments and Questionnaires
i) Administer an attitude questionnaire. Questions on an attitude questionnaire:
1. Was the instruction interesting?
2. Did you understand what you were supposed to learn?
3. Were the materials directly related to the objectives?
4. Were sufficient practice exercises included?
5. Were the practice exercises relevant?
6. Did the tests really measure your knowledge of the objectives?
7. Did you receive sufficient feedback on your practice exercises?
8. Did you feel confident when answering questions on the tests?

ii) In-depth debriefings with some of the learners.

Data summary and analysis
Both quantitative information (test scores) and descriptive information (attitude questionnaires, interviews, evaluators' notes) gathered during the trial should be summarized and analyzed.

REVISE INSTRUCTIONAL MATERIAL IF NECESSARY BEFORE PROCEEDING.

4) Field trial - the number of learners is not of particular consequence; often thirty is sufficient. The emphasis is on testing the procedures required for installing the instruction in a situation as close to the "real world" as possible.
(1) Purpose: To determine whether the changes in the instruction made after the small-group stage were effective. To locate and eliminate any remaining problems. To determine whether it is administratively possible to use the instruction in its intended setting. The outcome is effective instruction that yields learner achievement and positive attitudes.

(2) Location of evaluation: The optimal location is where the final instruction will actually be given.

(3) Selecting learners: Identify a group of about thirty individuals. Individuals must be representative of the target audience.

(4) Procedure: Similar to that of the small group, except that the instruction should be administered not by the designer but by a typical instructor. Include a questionnaire. Optional: observation of the instruction and interviews with learners and the instructor.

(5) Data summary and interpretation: Same as that of the small group. Achievement data should be organized by instructional objective, and attitudinal information from both learners and instructors should be anchored to specific objectives when possible.

(6) REVISE INSTRUCTIONAL MATERIAL IF NECESSARY BEFORE PROCEEDING.

See Table 10.1, p. 260, for an example framework for designing a formative evaluation.

See p. 277 for formative evaluation activities.

Revising instructional materials

Note: Revising involves making needed changes based on the formative evaluation data.

The feedback line traces back through all stages of the design.

Two basic types of revisions to consider:
- Changes made to the content or substance of the materials to make them more accurate or more effective as a learning tool
- Changes related to the procedures employed in using the materials

Data analysis for one-to-one trials
Information available: learner characteristics and entry skills, direct responses to the instruction, learning time, posttest performance, and responses to an attitude questionnaire.

a) Describe the learners who participated in the one-to-one evaluation and indicate their performance on any entry-skill measures.

b) Bring together all the comments and suggestions about the instruction that resulted from going through it with each learner. This can be done by integrating everything on a master copy of the instruction, using a color code to link each learner to his or her particular problems. It is also possible to include comments from an SME and any alternative instructional approaches that were used with learners during the one-to-one sessions. As you go through the instruction with each learner, take notes on a single copy and use a pen of a different color for each learner.

c) Posttest data are summarized by obtaining individual item performance and then combining item scores for each objective and for a total score. It is often of interest to develop a table that indicates each student's pretest score, posttest score, and total learning time. In addition, student performance on the posttest should be summarized along with any comments for each objective. Begin with those sections that resulted in the poorest performance by learners and those that resulted in the most comments.

d) Determine, based on learner performance, whether the rubric or the test items are faulty. If flawed, then changes should be made to make them clearer or consistent with the objectives and the intent of instruction. If the items are satisfactory, and the learners performed poorly, then the instruction must be changed.

e) Three sources of suggestions for change: learner suggestions, learner performance, and your own (the designer's) reactions to the instruction.

Data analysis for small-group and field trials
The available data include: item performance on the pretest and posttest, responses to an attitude questionnaire, learning and testing time, and comments made directly in the materials.

The fundamental unit of analysis for all the assessments is the individual assessment item. Performance on each item must be scored as correct or incorrect. If an item has multiple parts, then each part should be scored and reported separately so that the information is not lost. This individual item information is required for three reasons:

Group's item-by-item performance
Item-by-objective tables: Table 11.1 & Table 11.2 (the percentage of learners who mastered each objective should increase from pretest to posttest) & Table 11.3 (learners' performance across tests, using the percentage of objectives mastered on each test). The purpose of the test-by-objective analysis is threefold: to determine the difficulty of each item for the group, to determine the difficulty of each objective for the group, and to determine the consistency with which the set of items within an objective measures learners' performance on the objective.
An item difficulty value is the percentage of learners who answer an item correctly. Item difficulty values above 80 percent reflect relatively easy items for the group, whereas lower values reflect more difficult ones. Similarly, consistently high or low values for items within each objective reflect the difficulty of the objective for the group. The consistency of item difficulty indices within an objective typically reflects the quality of the items. If the items are measuring the same skill, and if there is no inadvertent complexity or there are no clues in the items, then learners' performance on the set of items should be relatively consistent. Within small groups, differences of 10 or 20 percent are not considered large, but differences of 40 percent or more should cause concern. When inconsistent difficulty indices are observed within an objective, the items within the set should be reviewed and revised prior to reusing them to measure learner performance.
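The item-difficulty and consistency checks described above are straightforward to compute. A minimal Python sketch with invented response data (rows are learners, columns are items, grouped by objective):

```python
# 1 = correct, 0 = incorrect; items are grouped by the objective they measure.
responses = {
    "objective_1": [[1, 1], [1, 0], [1, 1], [0, 1], [1, 1]],  # items 1-2
    "objective_2": [[1, 0], [1, 0], [0, 0], [1, 1], [1, 0]],  # items 3-4
}

def item_difficulties(rows: list[list[int]]) -> list[float]:
    """Percentage of learners answering each item correctly."""
    n = len(rows)
    return [100 * sum(row[i] for row in rows) / n for i in range(len(rows[0]))]

for objective, rows in responses.items():
    diffs = item_difficulties(rows)
    spread = max(diffs) - min(diffs)
    flag = "  <- review these items" if spread >= 40 else ""
    print(objective, [f"{d:.0f}%" for d in diffs], f"spread {spread:.0f}%{flag}")
```

With this sample data, objective_1's two items both sit at 80 percent (consistent), while objective_2's items sit at 80 and 20 percent, a 60-point spread that the 40-percent rule above flags for review.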

Learners' item-by-objective performance

The item-by-objective table provides the data for creating tables to summarize learners' performance across tests.

Graphing learners' performances
Another way to display the data. A graph may show the pretest and posttest performance for each objective in the formative evaluation study. You may also want to graph the amount of time required to complete the instructional materials, as well as the amount of time required for the pretest and posttest. See Figure 11.1.
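For the kind of graph described, a short matplotlib sketch (the mastery percentages are invented for illustration):

```python
import matplotlib.pyplot as plt

# Percent of learners mastering each objective (hypothetical data).
objectives = ["Obj 1", "Obj 2", "Obj 3", "Terminal"]
pretest = [55, 40, 25, 10]
posttest = [95, 90, 80, 75]

x = range(len(objectives))
width = 0.35
plt.bar([i - width / 2 for i in x], pretest, width, label="Pretest")
plt.bar([i + width / 2 for i in x], posttest, width, label="Posttest")
plt.xticks(list(x), objectives)
plt.ylabel("% of learners mastering objective")
plt.legend()
plt.show()
```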

Other types of data

Sequence for examining data
a) Instructional analysis and entry skills - did the learners have the anticipated entry skills? If they succeeded without having the required skills, you must question whether you have identified critical entry skills.

b) Objectives, pretests, and posttests - the second step is to review the pretest and posttest data as displayed on the instructional analysis chart. Learners' pretest performance should decrease as you move upward through the hierarchy - there should be poorer learner performance on the terminal objective than on the earlier skills.

c) Examine the pretest scores to determine the extent to which individual learners, and the group as a whole, had already acquired the skills you were teaching. If they already possess most of the skills, then you will receive relatively little information about the effectiveness of the instruction or how it might be improved. If they lack these skills, you will have more confidence in the analyses that follow.

d) By comparing pretest with posttest scores objective by objective, which is the usual procedure when you examine the instructional analysis chart, you can assess learner performance on each particular objective and begin to focus on specific objectives and the related instruction that appear to need revision. You may need to revise the conditions or the criteria specified in the objectives. Recall that conditions are used to control the complexity of performance tasks, and your criteria may be too lenient or too harsh for the target group.

e) Examine the exact wording of the objective, the associated test items, and the exact student answers to the items. Before revising the instructional materials, refer to your item analysis table to see whether poor test items, rather than the materials, explain poor learner performance. All that may be needed are revised test items rather than a major revision of the instructional materials.

f) Learning components of instructional strategy and materials - the next step is to examine the instructional strategy associated with the various objectives with which learners had difficulty. Was the planned strategy actually used in the instructional materials? Are there alternative strategies that might be employed? The final step is to examine the materials themselves to evaluate the comments about problem areas made by learners, instructors, and subject matter experts.

g) Learning time - It may be necessary to revise the materials to make them fit within a particular time period.

h) Media, materials, and instructional procedures - data that relate to the implementation of the instructional materials must also be examined. Controllable vs. non-controllable. Were learners hindered by the logistics required to use the materials? Were there questions about how to proceed from one step to the next? Were there long delays in getting test scores? Some solutions may need to be incorporated into the instructor's manual to make the instructional activity more efficient.

Revision process
a) Summarize the data in a clear and accurate fashion.
b) Table 11.4 provides a template for summarizing information from a formative evaluation.

c) Any problems identified with a component

d) The changes that are being proposed based upon the problems

e) The evidence gathered illustrating the problem. The evidence might name source materials, assessments, data summaries, observations or interviews.

f) The problems may be apparent, but the appropriate changes might not be. If a comparison of several approaches has been embedded in the formative evaluation (such as implementing several versions of the instructional material with changes to formatting, steps…), then the results should indicate the type of changes to be made. Otherwise, the strategies suggested for revising instruction following the one-to-one evaluation also apply at this point - namely, use the data, expertise, and sound learning principles as the basis for revisions.

g) Avoid responding too quickly to any single piece of data.

h) In instructor-led instruction it is relevant to note that some students are unlikely to understand the concepts as rapidly as others during a given class. Identifying learners who are performing poorly and inserting appropriate activities are important components of the revision process for the instructor who is using an interactive instructional approach.

i) Do not simply assume that changes are for the better.

Summative Evaluation

Collect data and information in order to make decisions about the acquisition or continued use of (i.e., the overall effectiveness and worth of) instruction. This is accomplished through the design of evaluation studies and the collection of data to verify the effectiveness of the instruction and instructional materials with target learners. Did the intervention, including the instruction, solve the problem that led to the need for the instruction in the first place?

Two phases of summative evaluation: expert judgment phase and field trial phase

Expert judgment phase

Do the materials have the potential for meeting this organization’s needs?

Congruence analysis: Are the needs and goals of the organization congruent with those in the instruction?

Content analysis: Are the materials complete, accurate, and current?

Design analysis: Are the principles of learning, instruction, and motivation clearly evident in the materials?

Feasibility analysis: Are the materials convenient, durable, cost-effective, and satisfactory for current users?

Field trial phase

Are the materials effective with target learners in the prescribed setting?

Outcomes analysis
Impact on learners: Are the achievement and motivation levels of learners satisfactory following instruction?

Impact on job: Are learners able to transfer the information, skills, and attitudes from instructional setting to the job setting or to subsequent units of related instruction?

Impact on organization: Are learners’ changed behaviors (performance, attitudes) making positive differences in achievement of the organization’s mission and goals (e.g., reduced dropouts, resignations; improved attendance, achievement; increased productivity, grades)?

Management analysis: Are instructor and manager attitudes satisfactory?

Are recommended implementation procedures feasible?

Are costs related to time, personnel, equipment, and resources reasonable?

Comparison of formative and summative evaluations

Purpose
- Formative: locate weaknesses in the instruction in order to revise it
- Summative: document strengths and weaknesses in the instruction in order to decide whether to maintain or adopt it

Phases or stages
- Formative: one-to-one; small group; field trial
- Summative: expert judgment; field trial

Instructional development history
- Formative: systematically designed in-house and tailored to the needs of the organization
- Summative: produced in-house or elsewhere, not necessarily following a systems approach

Materials
- Formative: one set of materials or several differing sets
- Summative: one set of materials or several competing sets

Position of evaluator
- Formative: member of the design and development team
- Summative: typically an external evaluator

Outcomes
- Formative: a prescription for revising the instruction
- Summative: a report documenting the design, procedures, results, recommendations, and rationale