
38th International Congress on Assessment Center Methods

Pia Ingold, Ph.D.

University of Zurich, Switzerland

Transparency of Assessment Centers: Lower Criterion-Related Validity but Greater Opportunity to Perform?


Imagine you and your team construct and implement ACs for selecting consultants. When planning the AC in detail, your team members propose two alternative solutions.


Benchmark for ACs

Criterion-related validity = How well does AC performance predict job performance?
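In statistical terms, criterion-related validity is the correlation between AC ratings and a job performance criterion (e.g., supervisor ratings). A minimal sketch of this computation in Python, using purely illustrative data (the arrays and values below are hypothetical, not from this study):

```python
import numpy as np
from scipy import stats

# Hypothetical example: overall AC ratings and supervisor-rated job performance
# for the same assessees (values are illustrative only).
ac_ratings = np.array([3.2, 4.1, 2.8, 3.9, 3.5, 4.4, 2.9, 3.7])
job_performance = np.array([5.1, 6.2, 4.8, 5.9, 5.4, 6.5, 5.0, 5.6])

# Criterion-related validity = Pearson correlation r between predictor and criterion.
r, p = stats.pearsonr(ac_ratings, job_performance)
print(f"criterion-related validity r = {r:.2f} (p = {p:.3f})")
```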


Two observations

1. Criterion-related validity has decreased over time

AT&T study, 1966 (Bray & Grant): r = .46

Meta-analyses: 1987 (Gaugler et al.): .37; 2007 (Hermelin et al.): .28; 2007 (Hardison & Sackett): .26

2. Dimension transparency has increased

Transparency: 1997 (Spychalski et al.): 29%; 2009 (Thornton & Krause): 43%

Starting point for this study on the practically relevant effects of dimension transparency


Purpose of the study

What effects does dimension transparency have…

1) on the criterion-related validity?

2) on the relation of assessees’ behavior and job performance?

3) on assessees’ opportunity to perform?

Practical implications for future construction of ACs


Dimension transparency in ACs

ACs differ with regard to dimension transparency (Krause & Gebert, 2003). Transparency forms a continuum:

nontransparent: no information about the assessed dimensions

transparent: detailed information about the assessed dimensions for assessees before each exercise


Effects of transparency on criterion-related validity

Controversial assumptions concerning the effects of transparency on criterion-related validity:

Transparency levels assessees' chances → positive effect (Dodd, 1977; Maurer et al., 2008)

vs.

Transparency lowers the similarity between the selection situation and the job situation → negative effect (Klehe et al., 2008; Kleinmann, 1997; Smith-Jentsch, 2007)


Position 1: Positive effect on CRV

A clear view for everyone of what is important for performance → levels the chances of performing well on the performance criteria


Position 2: Negative effect on CRV

On the job:

Employees usually do not know according to which criteria their performance will be evaluated in the next situation.

In the AC:

a) nontransparent AC: assessees do not know according to which criteria their performance will be evaluated

b) transparent AC: assessees know according to which criteria their performance will be evaluated


Cognitive-affective personality system theory (CAPS; Mischel & Shoda, 1995)

Behavior results from the interaction of the person and the situation: situational cues activate affective and cognitive representations, which in turn activate behavioral scripts that produce behavior.


Position 2: Negative effect on CRV

In the AC:

Nigel in the nontransparent AC: "What is important for my performance? What is crucial for achieving the best outcome?"

Tom in the transparent AC: "Assertiveness, structuring behavior, and persuasion are important for my performance."

On the job:

Nigel and Tom on the job: "What is important for my performance? What is crucial for achieving the best outcome?"


Excursion to research on ATIC

ATIC = ability to identify criteria in selection settings (cf. Kleinmann et al., 2011 for an overview)

Assessees differ with regard to their ability to identify the evaluation criteria in (nontransparent) selection settings (e.g., Kleinmann, 1993)

Assessees' ATIC predicts their selection performance and supervisors' ratings of their job performance (e.g., Jansen et al., 2013; Ingold et al., in press)

Making the evaluation criteria transparent renders this criterion-relevant ability irrelevant → lower criterion-related validity


Excursion to research on ATIC II

(Table adapted from Jansen et al. 2013)


Research on dimension transparency in ACs I

Kleinmann (1997): [Figure relating assessees' performance in a nontransparent AC to their performance in a second nontransparent AC and to their performance in a transparent AC]


Research on dimension transparency in ACs II

Smith-Jentsch et al. (2001): [Figure relating self-reported directiveness to directiveness ratings in the nontransparent condition and in the transparent condition]


Hypothesis 1

Ratings from a nontransparent AC (i.e., without detailed information for assessees about the targeted dimensions) are more criterion valid than ratings from a transparent AC (i.e., with detailed information).

[Figure: expected criterion-related validity in the transparent vs. nontransparent condition]


Life is a stage … Work is a stage. ACs are a stage.


Assessees' behavior in ACs

Assessees in ACs use impression management (IM):

Assertive impression management: self-promotion, ingratiation

Defensive impression management: excuses, justification

A positive effect on AC performance is possible (Klehe, Kleinmann, Niess, & Grazi, 2014; McFarland et al., 2003, 2005)


Assessees' behavior in ACs

Controversy exists as to how impression management affects the selection decision:

a) negative: impression management biases selection decisions (e.g., Anderson, 1991; McFarland, Ryan, & Kriska, 2002)

b) positive: impression management shown in selection settings is beneficial for the job (e.g., Ellis et al., 2002; Kleinmann & Klehe, 2011)


Impression management on the job


Transparency, self-promotion and job performance

transparent ACs: assessees can adapt their self-promotion to the assessed dimensions

nontransparent ACs: no dimension information is available to assessees that would allow such adaptation

on the job: employees do not have information to adapt their self-promotion accordingly

→ Self-promotion in nontransparent ACs is typical of self-promotion on the job; self-promotion in transparent ACs is atypical.


Hypothesis 2

Assessees’ self-promotion in the nontransparent condition will be more positively related to supervisor-rated job performance than assessees' self-promotion in the transparent condition.

[Figure: self-promotion → job performance, hypothesized to be positively related under nontransparency but not under transparency]


Fairness perceptions of assessees

Assessees' fairness perceptions are important for the implementation of selection procedures (König et al., 2010; Eurich et al., 2009)

A central component of fairness perceptions: opportunity to perform (OTP; Schleicher et al., 2006)

OTP = the perception that one has an adequate opportunity to demonstrate one's knowledge, skills, and abilities in the testing situation (Arvey & Sackett, 1993; Bauer et al., 2001; Gilliland, 1993)



Hypothesis 3

Assessees’ perceived opportunity to perform is higher in the transparent condition than in the nontransparent condition.

[Figure: expected opportunity to perform in the transparent vs. nontransparent condition]



Considerations when planning the study

Design that allows for conclusions on the effects of transparency:

manipulation of transparency

full control over the ACs (i.e., conditions identical except for the transparency manipulation)

→ simulated ACs with a between-subjects design

Sample with supervisors:

collecting criterion data from participants' supervisors → employed participants who permitted us to contact their supervisors

sample representative for ACs → newcomers with some job experience as the target group


Setting

Simulated ACs for employed graduates who are looking for a new job

one-day AC with several exercises and trained assessors

extensive feedback on performance


Assessees, assessors, & supervisors

Assessees

106 male, 91 female, mean age ≈ 29 years

worked in several sectors, e.g.:

40% in the research and education sector

9% in the banking and insurance sector

7% in the industrial sector

6% in the service sector

6% in the media and communication sector

5% in the public sector

Assessors

mean age ≈ 25 years

one-day frame-of-reference (FOR) training

Supervisors

mean age ≈ 46 years

≈ 5 years in supervisory function


Between-subjects design

Assessees take part in one of two conditions:

transparent AC: detailed information about the assessed dimensions

nontransparent AC: no information about the assessed dimensions

The ACs were identical except for the transparency manipulation.


Example information about dimensions


Procedure: Data from 3 sources

Assessees fill in measures: self-promotion, opportunity to perform, cognitive ability test, manipulation check, sociodemographic variables

Trained assessors evaluate assessees’ performance in AC exercises on a scale from 1 to 5 on every dimension.

Supervisors fill in a confidential online survey: employees' task-based performance, sociodemographic variables

All sources were blind to the two conditions.


Preliminary analysis I

A. Manipulation check: Higher sum scores in the transparent condition than in the nontransparent condition.

nontransparent condition: M = 6.77, SD = 1.40; transparent condition: M = 8.95, SD = 1.76

t(173.44) = 9.43, p < .001, d = 1.37
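For readers who want to retrace statistics of this kind: the reported values correspond to a Welch (unequal-variances) t-test and Cohen's d computed from the group summary statistics. A minimal sketch, assuming the condition sizes from Table 1 (n = 103 nontransparent, n = 94 transparent); small deviations from the reported t(173.44) = 9.43 would simply reflect these assumed ns:

```python
import numpy as np
from scipy import stats

# Summary statistics of the manipulation-check sum scores from the slide.
m_nt, sd_nt, n_nt = 6.77, 1.40, 103   # nontransparent condition (n assumed from Table 1)
m_t,  sd_t,  n_t  = 8.95, 1.76, 94    # transparent condition (n assumed from Table 1)

# Welch's t-test computed directly from summary statistics.
t_stat, p_val = stats.ttest_ind_from_stats(m_t, sd_t, n_t, m_nt, sd_nt, n_nt,
                                            equal_var=False)

# Cohen's d with a pooled standard deviation.
sd_pooled = np.sqrt(((n_nt - 1) * sd_nt**2 + (n_t - 1) * sd_t**2) / (n_nt + n_t - 2))
d = (m_t - m_nt) / sd_pooled

# Close to the reported t = 9.43 and d = 1.37; differences reflect the assumed ns.
print(f"t = {t_stat:.2f}, p = {p_val:.3g}, d = {d:.2f}")
```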


Preliminary analysis II

B. Comparable samples

The samples in the nontransparent and transparent ACs were comparable with regard to: age, sex, cognitive ability, number of working hours, work experience, job performance ratings, and how well supervisors could evaluate their employees' performance

all ts < 1.31, all ps > .19


Hypothesis 1

Ratings from a nontransparent AC (i.e., without detailed information for assessees about the targeted dimensions) are more criterion valid than ratings from a transparent AC (i.e., with detailed information).

Supported: mean validity was higher in the nontransparent condition, t(8) = 3.84, p < .01 (computation sketched after the table below).

Criterion-related validities by condition:

Dimension                  Transparent   Nontransparent
Analytical skills            .04           .29**
Organizing & planning        .06           .18†
Consideration of others      .09           .14
Persuasion                  -.01           .18†
Presentation skills          .15           .18†
Mean validity                .09           .20*

(†p < .10, *p < .05, **p < .01)
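The slide does not spell out how the t(8) comparison of the two validity columns was computed; given the degrees of freedom, one plausible reading is an independent-samples t-test on the (Fisher z-transformed) dimension-level validities, five per condition. A sketch under that assumption (it yields a t of similar magnitude to the reported 3.84, but the study's exact procedure may differ):

```python
import numpy as np
from scipy import stats

# Dimension-level criterion-related validities from the slide.
r_transparent    = np.array([0.04, 0.06, 0.09, -0.01, 0.15])
r_nontransparent = np.array([0.29, 0.18, 0.14,  0.18, 0.18])

# Fisher z-transform stabilizes the variance of correlations before comparing them.
z_t  = np.arctanh(r_transparent)
z_nt = np.arctanh(r_nontransparent)

# Independent-samples t-test across the five dimensions (df = 5 + 5 - 2 = 8).
t_stat, p_val = stats.ttest_ind(z_nt, z_t)
print(f"t(8) = {t_stat:.2f}, p = {p_val:.3f}")
```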


Hypothesis 2

Assessees’ self-promotion in the nontransparent condition will be more positively related to supervisor-rated job performance than assessees' self-promotion in the transparent condition.

Correlation of self-promotion with supervisor-rated job performance:

nontransparent condition: r = .17

transparent condition: r = -.20

Difference between conditions: Z = 2.46, p < .01
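The Z value is consistent with Fisher's z-test for the difference between two independent correlations. A minimal sketch, using the job-performance Ns from Table 1 (N = 92 nontransparent, N = 87 transparent) as the sample sizes:

```python
import numpy as np
from scipy import stats

def compare_independent_correlations(r1, n1, r2, n2):
    """Fisher z-test for the difference between two independent correlations."""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    z = (z1 - z2) / se
    p = 2 * stats.norm.sf(abs(z))   # two-tailed p-value
    return z, p

# Self-promotion x job performance: r = .17 (nontransparent, N = 92)
# vs. r = -.20 (transparent, N = 87); Ns taken from Table 1.
z, p = compare_independent_correlations(0.17, 92, -0.20, 87)
print(f"Z = {z:.2f}")   # Z ≈ 2.46, matching the reported value
```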


Hypothesis 2

Figure 2. Interaction diagram of transparency as a moderator of the relation between candidates' self-promotion in the assessment center and supervisor-rated task-based job performance.


Hypothesis 3

Assessees’ perceived opportunity to perform is higher in the transparent condition than in the nontransparent condition.

[Figure: mean opportunity to perform in the transparent and nontransparent conditions]

Not supported: no significant difference between the conditions, M = 3.30 vs. M = 3.26, t(192) = 1.02, n.s.


Discussion

Dimension transparency

1) had negative effects on the criterion-related validity of the AC, as the ratings from the nontransparent AC were more criterion valid than the ratings from the transparent AC

2) moderated the relation between self-promotion and supervisors' ratings of job performance

3) had no effect on assessees' perception of their opportunity to perform


Limitations

Data collection took place in a simulated setting and not in an operational AC, but:

representative sample

full control over AC and manipulation

authentic setting

criterion data from supervisors


Implications for practice

refrain from making dimensions transparent in selection contexts

generally pay attention to design factors in ACs

continuous evaluation of ACs is recommended

the effect of impression management on job performance depends on the situation → an interactionist perspective is necessary


Future research

more primary research with field data and varying degrees of transparency

meta-analysis on effects of transparency on criterion-related validity

linking IM research in the selection context to IM research in the work context

Pia Ingold

Thank you for your attention

E-Mail: [email protected]

and thanks to my co-authors

Martin Kleinmann

Cornelius König

Klaus Melchers

Pia Ingold, Ph.D. Work & Organizational Psychology, University of Zurich Binzmühlestrasse 14/12; 8050 Zürich, Switzerland

and the Swiss National Science Foundation



Selected references

Bray, D. W., & Grant, D. L. (1966). The assessment center in the measurement of potential for business management. Psychological Monographs: General and Applied, 80, 1-27.

Gaugler, B. B., Rosenthal, D. B., Thornton, G. C., & Bentson, C. (1987). Meta-analysis of assessment center validity. Journal of Applied Psychology, 72, 493-511. doi:10.1037/0021-9010.72.3.493

Hardison, C. M., & Sackett, P. R. (2007). Kriterienbezogene Validität des Assessment Centers: lebendig und wohlauf? [Assessment center criterion related validity: Alive and well?]. In H. Schuler (Ed.), Assessment Center zur Potenzialanalyse (pp. 192-202). Göttingen, Germany: Hogrefe.

Hermelin, E., Lievens, F., & Robertson, I. T. (2007). The validity of assessment centres for the prediction of supervisory performance ratings: A meta-analysis. International Journal of Selection and Assessment, 15, 405-411. doi:10.1111/j.1468-2389.2007.00399.x

Ingold, P. V., Kleinmann, M., König, C. J., Melchers, K. G., & Van Iddekinge, C. H. (in press). Why do situational interviews predict job performance: The role of interviewees’ ability to identify criteria. Journal of Business and Psychology. doi:10.1007/s10869-014-9368-3

Jansen, A., Melchers, K. G., Lievens, F., Kleinmann, M., Brändli, M., Fraefel, L., et al. (2013). Situation assessment as an ignored factor in the behavioral consistency paradigm underlying the validity of personnel selection procedures. Journal of Applied Psychology, 98, 326-341. doi:10.1037/a0031257

Klehe, U.-C., Kleinmann, M., Niess, C., & Grazi, J. (2014). Impression management behavior during assessment centers: Artificial behavior or much ado about nothing? Human Performance, 27, 1-24. doi:10.1080/08959285.2013.854365

Kleinmann, M. (1993). Are rating dimensions in assessment centers transparent for participants? Consequences for criterion and construct validity. Journal of Applied Psychology, 78, 988-993. doi:10.1037/0021-9010.78.6.988

Kleinmann, M., Ingold, P. V., Lievens, F., Jansen, A., Melchers, K. G., & König, C. J. (2011). A different look at why selection procedures work: The role of candidates’ ability to identify criteria. Organizational Psychology Review, 1, 128-146. doi:10.1177/2041386610387000

McFarland, L. A., Yun, G., Harold, C. M., Viera, L., & Moore, L. G. (2005). An examination of impression management use and effectiveness across assessment center exercises: The role of competency demands. Personnel Psychology, 58, 949-980. doi:10.1111/j.1744-6570.2005.00374.x

Schleicher, D. J., Venkataramani, V., Morgeson, F. P., & Campion, M. A. (2006). So you didn't get the job ... now what do you think? Examining opportunity-to-perform fairness perceptions. Personnel Psychology, 59, 559-590. doi:10.1111/j.1744-6570.2006.00047.x

Smith-Jentsch, K. A., Salas, E., & Brannick, M. T. (2001). To transfer or not to transfer? Investigating the combined effects of trainee characteristics, team leader support, and team climate. Journal of Applied Psychology, 86, 279-292. doi:10.1037/0021-9010.86.2.279

Spychalski, A. C., Quinones, M. A., Gaugler, B. B., & Pohley, K. (1997). A survey of assessment center practices in organizations in the United States. Personnel Psychology, 50, 71-90. doi:10.1111/j.1744-6570.1997.tb00901.x

Thornton, G. C., & Krause, D. E. (2009). Selection versus development assessment centers: An international survey of design, execution, and evaluation. International Journal of Human Resource Management, 20, 478. doi:10.1080/09585190802673536

Thornton, G. C., & Gibbons, A. M. (2009). Validity of assessment centers for personnel selection. Human Resource Management Review, 19, 169-187. doi:10.1016/j.hrmr.2009.02.002


Correlation table

Table 1

Descriptive Statistics and Intercorrelations of Study Variables in the Nontransparent and Transparent Condition

Variables                       1       2       3       4       5       6       7       8       9       10      11      12      13      M       SD
1. Task-based job performance   –       .08     .04     .06     .09     -.01    .15     -.20†   -.05    -.10    .09     -.13    .00     5.80    0.87
2. Overall AC performance       .24*    –       .70**   .88**   .57**   .89**   .79**   .26*    .25*    .15     .08     -.14    -.15    3.56    0.57
3. Analytical skills            .29**   .68**   –       .55**   .23*    .62**   .54**   .15     .22*    .04     .23*    -.08    -.18†   3.55    0.69
4. Organizing & planning        .18†    .91**   .54**   –       .43**   .68**   .61**   .25*    .25*    .08     .05     -.20†   -.15    3.43    0.73
5. Consideration of others      .14     .62**   .39**   .43**   –       .35**   .37**   .14     .06     .14     .05     .07     -.14    3.61    0.66
6. Persuasion                   .18†    .90**   .61**   .74**   .48**   –       .65**   .29**   .21*    .20†    .05     -.14    -.16    3.56    0.69
7. Presentation skills          .18†    .66**   .29**   .55**   .30**   .46**   –       .05     .20*    .14     .08     -.11    -.05    3.73    0.76
8. Self-promotion               .17     -.01    -.04    .04     -.09    .02     -.06    –       .32**   .01     -.02    -.16    -.08    2.57    0.77
9. Ingratiation                 -.06    .20*    .11     .14     .21*    .17†    .17†    .29**   –       .23*    -.13    -.18†   .05     3.10    0.73
10. Opportunity to perform      .13     .12     .09     .04     .24*    .12     .00     -.05    .00     –       -.07    .07     -.13    3.30    0.52
11. Cognitive ability           .16     .05     .10     .02     .09     .05     .01     .07     .18†    -.01    –       -.08    -.07    97.43   13.67
12. Age                         .11     -.02    .00     -.08    -.06    .05     .08     -.02    -.28**  -.19†   -.28**  –       .07     28.33   5.23
13. Sex                         -.04    -.22*   -.07    -.30**  -.14    -.13    -.10    -.06    -.23*   -.06    -.21*   .25*    –       1.51    0.50
M                               5.72    3.59    3.63    3.43    3.75    3.60    3.64    2.53    2.91    3.26    99.37   28.65   1.42
SD                              0.90    0.54    0.74    0.72    0.64    0.64    0.69    0.62    0.77    0.59    12.88   5.66    0.50

Note. Intercorrelations for the nontransparent condition are presented below the diagonal, and intercorrelations for the transparent condition are presented above the diagonal. For the nontransparent condition, N = 103, with the exception of job performance (N = 92) and age (N = 98). For the transparent condition, N = 94, with the exception of job performance (N = 87) and age (N = 91). Sex was coded as 1 = male, 2 = female. AC = assessment center. †p < .10, *p < .05, **p < .01, two-tailed.
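A combined table of this kind, with one condition's intercorrelations below the diagonal and the other's above it, can be assembled from two per-condition correlation matrices. A minimal sketch in pandas with illustrative variable names and random data (not the study's data):

```python
import numpy as np
import pandas as pd

def combined_correlation_table(df_lower, df_upper):
    """Intercorrelations of df_lower below the diagonal and of df_upper above it."""
    r_lower = df_lower.corr().to_numpy()
    r_upper = df_upper.corr().to_numpy()
    combined = r_lower.copy()
    iu = np.triu_indices_from(combined, k=1)   # positions above the diagonal
    combined[iu] = r_upper[iu]
    np.fill_diagonal(combined, np.nan)         # diagonal left empty, as in Table 1
    return pd.DataFrame(combined, index=df_lower.columns,
                        columns=df_lower.columns).round(2)

# Illustrative data for two conditions measured on the same variables.
rng = np.random.default_rng(0)
cols = ["job_performance", "ac_performance", "self_promotion"]
nontransparent = pd.DataFrame(rng.normal(size=(103, 3)), columns=cols)
transparent = pd.DataFrame(rng.normal(size=(94, 3)), columns=cols)

print(combined_correlation_table(nontransparent, transparent))
```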


AC exercise-dimension matrix

Exercises: Presentation 1, Presentation 2, Group discussion 1, Group discussion 2

Organizing & planning and Persuasiveness were assessed in all four exercises; Analytical skills, Consideration of others, and Presentation skills were each assessed in two of the four exercises.