Evaluating and Institutionalising Organisational Development Interventions
Presented by:
Yusra Iqbal
Prateek Singh
Molla Asmare
Taufeeq Ahmad
Evaluation is concerned with providing feedback to practitioners and organization members about the progress and impact of OD interventions.
Institutionalization involves making OD interventions a permanent part of the organization's normal functioning.
Evaluating Organisational Development Interventions
Implementation and Evaluation Feedback
Measurement
Research Design
Institutionalising Interventions
Institutionalisation Framework
Organisational characteristics
Intervention Characteristics
Institutionalisation Process
Indicators of Institutionalisation
Evaluating Organisational Development Interventions
Necessity of Evaluating OD Interventions
Judge whether the intervention has been implemented as intended and whether it is having the desired results.
Implementing OD interventions costs money => rigorous assessment is necessary.
Decisions about the measurement of relevant variables and the design of the evaluation process should be made early in the OD cycle, so that evaluation is integrated with intervention decisions.
Implementation and Evaluation Feedback
Most OD interventions require significant changes in people’s behaviour and attitudes, yet interventions typically offer only broad prescriptions for how such changes are to occur.
E.g., job enrichment calls for adding variety and meaningful feedback to jobs. Implementing this can be a challenge as managers learn and experiment to translate the prescriptions into specific behaviours.
So evaluation needs to include during-implementation assessment, to see whether interventions are actually being implemented, and after-implementation assessment, to evaluate whether interventions are producing the expected results.
OD Evaluation
To guide implementation of the intervention = Implementation Feedback
To assess overall impact = Evaluation Feedback
Implementation feedback consists of two types of information:
Data about the different features of the intervention itself
Data about the immediate effects of the intervention
Data are collected repeatedly at intervals.
This information is used to gain a clearer understanding of the intervention and to plan for the next implementation step.
This cycle may proceed for several rounds.
Once implementation feedback shows that the intervention is sufficiently in place, evaluation feedback begins.
It includes outcome measures such as performance, job satisfaction, absenteeism, and turnover.
Negative results indicate that either the initial diagnosis was flawed or the choice of intervention was wrong.
Positive results indicate successful implementation of the intervention and should prompt a search for ways to institutionalize the change.
Diagnosis
Choice of intervention (from among alternative interventions)
Implementation of intervention
Implementation feedback: measures of the features of implementation and its immediate effects
Clarification of the intervention and development of a plan for the next implementation steps
Evaluation feedback: measures of long-term effects
MEASUREMENT
Select the right variables to measure
Design good measurements
Operational
Reliable
Valid
Research Design
Sources of Reliability
Rigorous Operational Definition
Provide precise guidelines for measurement: How high does a team have to score on a five-point scale to say that it is effective?
Multiple Measures
Multiple items on a survey
Multiple measures of the same variable (survey, observation, unobtrusive measure)
Standardized Instruments
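The multiple-measures idea above is often quantified with Cronbach's alpha, a standard internal-consistency estimate of reliability. A minimal sketch, using made-up survey data (the item scores and the 0.70 rule of thumb are illustrative assumptions, not from this deck):

```python
from statistics import variance

def cronbach_alpha(items):
    """Internal-consistency reliability estimate.
    items: list of k lists, each holding one survey item's scores
    for the same set of respondents."""
    k = len(items)
    sum_item_vars = sum(variance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent total score
    return (k / (k - 1)) * (1 - sum_item_vars / variance(totals))

# Three hypothetical 5-point survey items measuring "perceived discretion"
items = [
    [4, 3, 5, 2, 4, 3],
    [4, 4, 5, 2, 3, 3],
    [5, 3, 4, 2, 4, 2],
]
alpha = cronbach_alpha(items)
print(round(alpha, 2))  # values above ~0.70 are conventionally taken as acceptable
```

If alpha is low, the items are not measuring the variable consistently and the survey should be revised before it is used for evaluation feedback.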
Types of Validity
Face Validity: Does the measure “appear” to reflect the variable of interest?
Ask colleagues and clients if a proposed measure actually represents a particular variable.
Types of Validity
Content Validity: Do “experts” agree that the measure appears valid?
If experts and clients agree that the measure reflects the variable of interest then there is increased confidence in the measure’s validity
Types of Validity
Criterion or Convergent Validity: Do measures of “similar” variables correlate?
Use multiple measures of the same variable, to make preliminary assessments of the measure’s criterion or convergent validity.
If several different measures of the same variable correlate highly with each other, especially if one or more of the other measures have been validated in prior research, then there is increased confidence in the measure’s validity.
Types of Validity
Discriminant Validity: Do measures of “non-similar” variables show no association?
This exists when the proposed measure does not correlate with measures it is not supposed to correlate with.
Example: there is no good reason for daily measures of productivity to correlate with daily air temperature.
Types of Validity
Predictive Validity: Are present variables indicative of future or other variables?
This is demonstrated when the variable of interest accurately forecasts another variable over time.
Example: A measure of team cohesion can be said to be valid if it accurately predicts improvements in team performance in the future.
RESEARCH DESIGN
Longitudinal Measurement
Change is measured over time
Ideally, the data collection should start before the change program is implemented and continue for a period considered reasonable for producing expected results.
CONT..
Comparison Units
Appropriate use of “control” groups
It is always desirable to compare results in the intervention situation with those in another situation where no such change has taken place.
CONT..
Statistical Analysis
Alternative sources of variation have been controlled
Whenever possible, statistical methods should be used to rule out the possibility that the results are caused by random error or chance.
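One common way to rule out chance is a two-sample t-test comparing the intervention group with a comparison ("control") group. A minimal sketch using Welch's t statistic and hypothetical job-satisfaction scores (all numbers are assumptions for illustration):

```python
from statistics import mean, variance
from math import sqrt

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples with possibly
    unequal variances."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)
    return (mean(sample_a) - mean(sample_b)) / sqrt(va / na + vb / nb)

# Post-intervention job-satisfaction scores (hypothetical data):
intervention = [4.2, 4.5, 3.9, 4.4, 4.1, 4.6, 4.0, 4.3]
comparison   = [3.6, 3.8, 3.5, 3.9, 3.4, 3.7, 3.6, 3.8]

t = welch_t(intervention, comparison)
print(f"t = {t:.2f}")  # a large |t| suggests the difference is unlikely to be chance
```

In practice the t statistic would be converted to a p-value (e.g., with `scipy.stats.ttest_ind(..., equal_var=False)`); the point here is only that the observed difference between units is tested against random variation rather than taken at face value.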
Evaluating Different Types of Change
Alpha Change
Refers to movement along a measure that reflects stable dimensions of reality.
For example, comparative measures of perceived employee discretion might show an increase after a job enrichment program. If this increase represents alpha change, it can be assumed that the job enrichment program actually increased employee perceptions of discretion.
CONT…
Beta Change
Involves the recalibration of the intervals along some constant measure of reality. For example, before-and-after measures of perceived employee discretion can decrease after a job enrichment program. If beta change is involved, it can explain this apparent failure of the intervention to increase discretion.
CONT…
The first measure of discretion may accurately reflect the individual’s belief about the ability to move around and talk to fellow workers in the immediate work area. During implementation of the job enrichment intervention, however, the employee may learn that the ability to move around is not limited to the immediate work area. At a second measurement of discretion, the employee, using this new and recalibrated understanding, may rate the current level of discretion as lower than before.
CONT…
Gamma Change
Involves fundamentally redefining the measure as a result of an OD intervention. In essence, the framework within which a phenomenon is viewed changes.
For example, the presence of gamma change would make it difficult to compare measures of employee discretion taken before and after a job enrichment program.
CONT..
The measure taken after the intervention might use the same words, but they represent an entirely different concept. After the intervention, discretion might be defined in terms of the ability to make decisions about work rules, work schedules, and productivity levels. In sum, the job enrichment intervention changed the way discretion is perceived and how it is evaluated.
Institutionalizing Organizational Development
Interventions
Institutionalising Interventions
The institutionalisation framework shows that both organisation characteristics and intervention characteristics affect the different institutionalisation processes operating in organisations.
Organisation characteristics also affect intervention characteristics.
E.g., organisations with powerful unions may find it difficult to get support for OD interventions.
Institutionalization Framework
Organizational Characteristics
Congruence
Stability of environment
Unionization
Institutionalization Framework
Intervention Characteristics
Goal specificity
Programmability
Level of change target
Sponsor
Institutionalization Process
Socialization
Commitment
Reward allocation
Diffusion
Sensing and recalibration
Institutionalisation Framework
Indicators of Institutionalization
Knowledge
Performance
Preferences
Normative consensus
Value consensus
Conclusion
Evaluating and institutionalising interventions are the final two stages of planned change.
Evaluation of an intervention also involves decisions about its measurement and research design.
Measurement focuses on quantifying the dimensions and magnitude of change.
Research design focuses on setting up conditions for making valid assessments of an intervention's effect.
Institutionalising is concerned with making the change permanent.
Thank You!