Educational and Psychological Measurement, 2001, Vol. 61, No. 2, 229-248
DOI: 10.1177/0013164401612006

Troy Courville and Bruce Thompson
Use of Structure Coefficients in Published Multiple Regression Articles: β Is Not Enough

The online version of this article can be found at http://epm.sagepub.com/cgi/content/abstract/61/2/229. Published by SAGE Publications (http://www.sagepublications.com). © 2001 SAGE Publications.


USE OF STRUCTURE COEFFICIENTS IN PUBLISHED MULTIPLE REGRESSION ARTICLES: β IS NOT ENOUGH

TROY COURVILLE
Sam Houston State University

BRUCE THOMPSON
Texas A&M University

The importance of interpreting structure coefficients throughout the General Linear Model (GLM) is widely accepted. However, regression researchers too infrequently consult regression structure coefficients to augment their interpretations. The authors reviewed articles published in the Journal of Applied Psychology to determine how interpretations might have differed if standardized regression coefficients and structure coefficients (or else bivariate rs of predictors with the criterion) had been interpreted. Some dramatic misinterpretations or incomplete interpretations are summarized. It is suggested that beta weights and structure coefficients (or else bivariate rs of predictors with the criterion) ought to be interpreted when noteworthy regression results have been isolated.

Following the publication of Cohen's (1968) seminal article, "Multiple Regression as a General Data-Analytic System," social scientists began to take seriously the notion that regression is the univariate General Linear Model (GLM). Knapp (1978) subsequently extended this view by presenting canonical correlation analysis as the multivariate GLM. More recently, structural equation modeling has been represented as the most general case of the general linear model, when measurement modeling is incorporated simultaneously into substantive modeling (Bagozzi, Fornell, & Larcker, 1981). Even when only measurement modeling is conducted, the same basic variance partitioning methods used in substantive modeling are applied, albeit for a different purpose (Dawson, 1999).

Xitao Fan served as Action Editor for this manuscript.

Educational and Psychological Measurement, Vol. 61 No. 2, April 2001 229-248
© 2001 Sage Publications, Inc.


The general linear model has helped researchers come to understand that all parametric analyses (a) are correlational, (b) apply some system of weights to measured/observed variables to estimate scores on composite or synthetic variables that then become the analytic focus, and (c) yield effect size analogs of r² values (e.g., Fan, 1996, 1997; Thompson, 1991, 2000). This last realization ultimately helped lead to encouraging effect size reporting in all research (American Psychological Association [APA], 1994, p. 18) and, more recently, to the repeated suggestion by the APA Task Force on Statistical Inference that effect size reporting should occur in all research (Wilkinson & APA Task Force on Statistical Inference, 1999).

The Task Force emphasized, "Always [italics added] provide some effect-size estimate when reporting a p value" (Wilkinson & APA Task Force on Statistical Inference, 1999, p. 599). Later, the Task Force wrote,

Always [italics added] present effect sizes for primary outcomes. . . . It helps to add brief comments that place these effect sizes in a practical and theoretical context. . . . We must stress again that reporting and interpreting effect sizes in the context of previously reported effects is essential [italics added] to good research. (p. 599)

In construct or predictive validity or in substantive regression studies, once statistically significant and/or noteworthy effects are detected, the origins of detected effects must then (and only then) be explored. Of course, many researchers interpret standardized regression (sometimes called β or beta) weights for this purpose. Many such researchers deem unimportant any predictor variables with near-zero beta weights.

However, the flaws of interpreting only beta weights have been noted. Thompson and Borrello (1985) argued that structure coefficients are just as important in regression as they are in other GLM methods, such as descriptive discriminant analysis and factor analysis. Indeed, the editorial board members of Educational and Psychological Measurement subsequently cited that article as one of the most important measurement-related publications of the past 50 years (Thompson & Daniel, 1996).

The purpose of the present study was to characterize what regression researchers are actually doing in published regression research as regards the interpretation of regression results using β weights and/or structure coefficients. In particular, we wanted to summarize the interpretations offered by authors in published regression research and, where possible, conduct supplementary analyses that led to independent interpretations on our part. We wanted to determine how dramatic the differences might be for actual versus alternative interpretations in which β weights and structure coefficients (or else bivariate rs of predictors with the criterion) are interpreted as part of a more complete system of results.


Regression Interpretation Strategies

On a superficial first-glance basis, a regression interpretation focusing solely on β weights erroneously seems reasonable, given one formula for the multiple R² effect size (e.g., Thompson, 1995):

R² = β1(rYX1) + β2(rYX2) + . . . + βp(rYXp). (1)

A superficial examination of the formula erroneously intimates that a predictor variable with a near-zero beta weight does not add to the predictive efficacy of the model.
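Equation 1 can be made concrete with a short numerical sketch. The example below is our own (the data and variable names are not from the article): it confirms that, for an ordinary least squares solution on standardized variables, R² equals the sum of the β-weight/correlation products.

```python
# Illustration of Equation 1: for an OLS solution, R^2 equals the sum
# over predictors of beta_j * r(Y, X_j).
import numpy as np

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 3))
X[:, 1] += 0.5 * X[:, 0]                      # make predictors correlated
y = X @ np.array([0.6, 0.3, 0.1]) + rng.normal(size=n)

# Standardizing all variables makes the least-squares weights betas.
Zx = (X - X.mean(axis=0)) / X.std(axis=0)
zy = (y - y.mean()) / y.std()
beta, *_ = np.linalg.lstsq(Zx, zy, rcond=None)

r_yx = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(3)])
r2_direct = 1 - np.sum((zy - Zx @ beta) ** 2) / np.sum(zy ** 2)
r2_eq1 = float(np.sum(beta * r_yx))           # Equation 1
print(abs(r2_direct - r2_eq1) < 1e-9)         # True: the two agree
```

The identity is exact in-sample, not an approximation: the β vector equals the inverse predictor correlation matrix times the predictor-criterion correlations, so β'r reproduces R² algebraically.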

Of course, such a view falls apart beyond a cursory examination. For example, a predictor (e.g., Xp) may have a large absolute correlation with Y but have a zero β weight, if one or more other correlated predictors are assigned credit for that predictor's shared explanatory ability. Indeed, in some cases a predictor with a near-zero beta weight may be a very good predictor or even the single best predictor (e.g., Thompson & Borrello, 1985).
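A tiny sketch with our own hypothetical numbers shows how this happens. Using the standard two-predictor beta-weight formula, X2 below correlates .686 with Y yet receives a beta weight of exactly zero, because the nearly collinear X1 absorbs all of the shared credit.

```python
# Hypothetical two-predictor example: a predictor with a large
# correlation with Y can still receive a zero beta weight when a
# correlated predictor is credited with their shared variance.
r_x1x2 = 0.98                 # nearly collinear predictors
r_yx1 = 0.70
r_yx2 = 0.686                 # large, but equal to r_yx1 * r_x1x2

# Standardized-weight formulas for two predictors:
den = 1 - r_x1x2 ** 2
beta1 = (r_yx1 - r_yx2 * r_x1x2) / den   # comes out 0.70
beta2 = (r_yx2 - r_yx1 * r_x1x2) / den   # comes out 0.00
print(round(beta1, 3), round(beta2, 3))
```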

Furthermore, many researchers making such misinterpretations also fail to recognize that this β-weight-focused interpretation strategy is context dependent on having an exactly correctly specified model, because adding or deleting a single predictor could radically alter all the weights and thus all the interpretations resulting from them (e.g., Thompson, 1999b). In the words of Dunlap and Landis (1998), "The size of the regression weight depends on the other predictor variables included in the equation and is, therefore, prone to change across situations involving different combinations of predictors" (p. 398).

But as Pedhazur (1982) has noted, "The rub, however, is that the true model is seldom, if ever, known" (p. 229). And as Duncan (1975) has noted, "Indeed it would require no elaborate sophistry to show that we will never have the 'right' model in any absolute sense" (p. 101).

For these and other reasons, some researchers have suggested that structure coefficients (or alternatively rs of predictors with Y; see Pedhazur, 1997, pp. 899-900) must be interpreted in conjunction with the standardized weights when predictors are correlated (see Cooley & Lohnes, 1971, p. 55; Darlington, 1968; Thompson, 1997b; Thompson & Borrello, 1985; Thorndike, 1978, pp. 170-172). In the regression case, a structure coefficient (rS) is the bivariate correlation between a given predictor variable and the synthetic variable, predicted Y or Ŷ.

Two Coefficients Should Be Interpreted

However, interpreting only structure coefficients would be just as erroneous as interpreting only beta weights. A superficial examination of Equation 1 might incorrectly intimate that predictor Xp with a zero correlation with Y cannot contribute to the R² effect size, because for this variable the product term in Equation 1, βp(rYXp), equals zero regardless of what the predictor's β weight is.

Yet a variable may have a zero correlation with Y but a sizeable nonzero β weight. This does affect R² by allowing the β weights for other predictors to deviate further from zero than the boundary of their respective zero-order correlations with Y, thus making some of the other product terms within the equation larger, and thus making R² larger. This is exactly what happens in the classic "suppressor variable" case described by Horst (1966) based on pilot training data from World War II (see also Henard, 1998; Lancaster, 1999; Stevens, 1996, pp. 106-107; Woolley, 1997).

In this classic example, notwithstanding the fact that verbal ability was uncorrelated with pilot ability, using verbal ability scores in the regression equation to predict pilot ability actually served to remove the contaminating influence of verbal ability from the other predictors, which effectively increased the R² value from what it would have been if only mechanical and spatial abilities were used as predictors. As Horst (1966) noted,

To include the verbal score with a negative weight served to suppress or subtract irrelevant ability, and to discount the scores of those who did well on the test simply because of their verbal ability rather than because of abilities required for success in pilot training. (p. 355)
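The suppressor pattern can be reproduced numerically. This sketch uses our own correlations (not Horst's actual data): X2 has a zero correlation with Y, yet including it doubles R².

```python
# Hypothetical suppressor demonstration: X2 has zero correlation with
# Y, yet including it raises R^2 from .50 to 1.00.
r_yx1 = 2 ** -0.5             # .7071
r_yx2 = 0.0                   # X2 is useless as a lone predictor
r_x1x2 = -(2 ** -0.5)         # but it is correlated with X1

den = 1 - r_x1x2 ** 2
beta1 = (r_yx1 - r_yx2 * r_x1x2) / den
beta2 = (r_yx2 - r_yx1 * r_x1x2) / den

r2_x1_alone = r_yx1 ** 2                    # X1 by itself
r2_both = beta1 * r_yx1 + beta2 * r_yx2     # Equation 1
print(round(r2_x1_alone, 4), round(r2_both, 4))  # 0.5 1.0
```

Adding X2 lets β1 grow past the bound of its zero-order correlation, exactly as the surrounding text describes.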

Regression Purposes and Three Heuristic Examples

Stevens (1996) and others were careful to distinguish two distinct applications of multiple regression: prediction versus explanation (or theory testing). Huberty and Petoskey (1999) discussed the distinctions at some length. In a pure prediction case we usually have (a) a group of people for whom data on a criterion variable and several predictors are available and (b) a group of people for whom the same predictors are available, but the criterion variable is not known or has not yet occurred. We can derive a prediction equation (rule or factor) from the first group. If (and only if) (a) the people in the two groups are reasonably similar and (b) the regression equation works fairly well in the first group, then we can reasonably apply the weights from the first group's equation in the second group to make predictions of the absent criterion scores within the second group.

When our research application is purely predictive, in a sense interpretation may be irrelevant. We may very much desire an accurate prediction, but we may not care why our predictive rule works. For example, parents may care very much to know that they can obtain a very accurate prediction of the adult height of their 2-year-old children (Ŷi as a prediction of Yi for each of the ith children) using the rule Ŷi = 0.0 + (2 × Xi), where Xi is the height of the children at age 2. As parents, we may not care why this rule or equation works so well, as long as it works. Thus, interpretation may be less relevant in regression prediction applications.

But in theory testing or explanation applications, interpretation is very relevant. We want to know how useful the variables are. We may have theory suggesting that some variables should be important in one or more senses in the model, and theory that other variables should not be useful in any sense in the model. Here we may then need to examine regression beta weights, but, it will be argued, we will not (generally) want to interpret only regression beta weights.

Four cases will be distinguished here. Case 1 involves a single predictor variable. In this case, the multiplicative weight for predicting ZY using the scores on ZX1 is rX1Y (i.e., Ŷ = β(ZX1), and βX1 = rX1Y). Case 1 is actually a special case of Case 2 (uncorrelated predictors), because with only one predictor there can be no nonzero correlations among the predictors. We will therefore briefly illustrate Case 1 using Case 2, and illustrate Cases 3 and 4 as well, all using the example of multiple regression using two predictor variables.

Case 2: Uncorrelated predictors. Table 1 presents the various calculations associated with Case 2. In Case 2, the predictors are perfectly uncorrelated with each other, and the denominator in each beta calculation equals 1.0 because 1.0 – r²X1X2 = 1.0 – 0.0. Furthermore, the numerator always simplifies to equal the correlation of Y with each predictor because the right side of the numerator equals 0.0 (e.g., rYX2[rX1X2] = .7071[.0000]). Finally, in Case 2, R² equals the sum of the r² values of each predictor with Y, because

R² = β1(rYX1) + β2(rYX2) + . . . + βp(rYXp),

which in this example equals

(rYX1)(rYX1) + (rYX2)(rYX2),

or

R² = (rYX1)² + (rYX2)².

Thus, when predictors are uncorrelated, each predictor's so-called standardized regression coefficient equals each predictor's correlation with Y. Furthermore, because the structure coefficient for a given predictor equals the correlation of the predictor with Y divided by the multiple correlation R, in this case the beta weights, the structure coefficients for the predictors, and the correlations of the predictors with Y will all rank order the predictors identically, except that the structure coefficients will be scaled in a different metric (unless R = 1.0).
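Case 2 can be verified in a few lines. This sketch uses the same .7071 correlations as Table 1 (the code itself is ours, not the article's):

```python
# Case 2 sketch: with uncorrelated predictors, each beta equals the
# predictor's correlation with Y, and R^2 is the sum of squared rs.
r_yx1 = r_yx2 = 2 ** -0.5     # .7071, as in Table 1
r_x1x2 = 0.0                  # uncorrelated predictors

den = 1 - r_x1x2 ** 2                    # denominator reduces to 1.0
beta1 = (r_yx1 - r_yx2 * r_x1x2) / den   # reduces to r_yx1
beta2 = (r_yx2 - r_yx1 * r_x1x2) / den   # reduces to r_yx2
r2 = beta1 * r_yx1 + beta2 * r_yx2       # .5 + .5 = 1.0
R = r2 ** 0.5

# Structure coefficients: the predictor-Y correlations rescaled by 1/R.
rs1, rs2 = r_yx1 / R, r_yx2 / R
print(round(beta1, 4), round(r2, 4), round(rs1, 4))  # 0.7071 1.0 0.7071
```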


Table 1
Regression Analyses for Three Cases

Correlation matrices

            Case 2                      Case 3                      Case 4
        X2      Y      Ŷ           X2      Y      Ŷ           X2      Y      Ŷ
X1   .0000  .7071  .7071        .9990  .7071  .9998      –.7071  .7071  .7071
X2          .7071  .7071               .7060  .9983              .0000  .0000
Y                 1.0000                      .7072                    1.0000

β weight calculations

β1 = (rYX1 – rYX2[rX1X2]) / (1.0 – r²X1X2)
  Case 2: (.7071 – .7071[.0000]) / (1.0 – .0000²) = .7071 / 1.0 = .7071
  Case 3: (.7071 – .7060[.9990]) / (1.0 – .9990²) = .0018 / .0020 = .9068
  Case 4: (.7071 – .0000[–.7071]) / (1.0 – [–.7071]²) = .7071 / .5000 = 1.4142

β2 = (rYX2 – rYX1[rX1X2]) / (1.0 – r²X1X2)
  Case 2: (.7071 – .7071[.0000]) / (1.0 – .0000²) = .7071 / 1.0 = .7071
  Case 3: (.7060 – .7071[.9990]) / (1.0 – .9990²) = –.0004 / .0020 = –.1999
  Case 4: (.0000 – .7071[–.7071]) / (1.0 – [–.7071]²) = .5000 / .5000 = 1.0000

R² calculations: R² = β1(rYX1) + β2(rYX2)
  Case 2: .7071(.7071) + .7071(.7071) = .5000 + .5000 = 1.0000
  Case 3: .9068(.7071) + –.1999(.7060) = .6412 – .1412 = .5001
  Case 4: 1.4142(.7071) + 1.0000(.0000) = 1.0000 + .0000 = 1.0000

Note. Calculations were computed to six decimal places, but are rounded here to four decimal places. The bivariate correlation between a predictor variable and the composite variable Ŷ (where Ŷ = [β1 ZX1] + [β2 ZX2]) is the structure coefficient for that predictor variable. The bivariate correlation between Y and Ŷ is also RY·X1X2.


Case 3: Correlated predictors. When the predictor variables are correlated (i.e., are collinear or multicollinear), the beta weight for a given predictor no longer equals the correlation of that predictor with Y. Instead, the beta weight computations take into account all the pairwise correlations of the observed variables with each other.

This is done so that portions of Y variance that are redundantly explained by two or more predictors will not be multiply counted as explained. Thus, as Table 1 illustrates for Case 3, although both predictors in the heuristic example explain about half the variance in the Y scores, because the two predictors were almost perfectly correlated with each other (r = .999), together the two predictors still explain little more than half (R² = .5001) of the variance in the Y scores.

The example also illustrates the folly of interpreting beta weights as if they are correlation coefficients; this misinterpretation is unfortunately all too common. For example, X2 in the example has a beta weight of –.1999, even though rYX2 = .7060.
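The sign flip can be recomputed from the tabled correlations. This sketch (ours) uses Table 1's four-decimal Case 3 values; the article carried six decimals, so the last digits of the betas differ slightly from the tabled .9068 and –.1999.

```python
# Case 3 sketch: nearly collinear predictors (r = .999) produce a
# negative beta for X2 even though X2 correlates .706 with Y.
r_yx1, r_yx2 = 0.7071, 0.7060
r_x1x2 = 0.9990

den = 1 - r_x1x2 ** 2
beta1 = (r_yx1 - r_yx2 * r_x1x2) / den   # large and positive
beta2 = (r_yx2 - r_yx1 * r_x1x2) / den   # negative, despite r = .706
r2 = beta1 * r_yx1 + beta2 * r_yx2       # about .50
print(round(beta1, 4), round(beta2, 4), round(r2, 4))
```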

Case 4: Suppressor variables. Table 1 illustrates a dramatic example of suppressor effects. Here, X2 has a zero correlation with Y, X1 explains only 50% (.7071²) of the variance in Y, and yet together the two predictors explain 100% (R² = 1.00) of the variance in Y. Again, the folly of interpreting beta weights as correlation coefficients is demonstrated, in that (a) β1 is greater than the maximum value of r and (b) β2 = 1.00 when rYX2 = 0.0.

Structure Coefficients Within the GLM

Throughout the general linear model (GLM), structure coefficients are bivariate correlation coefficients between a given measured/observed variable and a latent/synthetic variable. For example, in multiple regression, the structure coefficient for predictor X1 is the correlation between the scores of n people on X1 with the same n people's scores on the predicted outcome variable, Ŷ (Cooley & Lohnes, 1971).

Similarly, in either exploratory or confirmatory factor analysis, the structure coefficient for measured variable X1 on Factor I is the correlation between the scores of n people on X1 with the same n people's factor scores on Factor I (Wells, 1999). And in canonical correlation analysis, for example, the structure coefficient of measured criterion variable X1 on Function I is the correlation between the scores of n people on X1 with the same n people's criterion-variable composite scores on Function I (Thompson, 1984, 2000).


Emphasis Elsewhere in the General Linear Model

Huberty (1994) has noted that

if a researcher is convinced that the use of structure rs makes sense in, say, a canonical correlation context, he or she would also advocate the use of structure rs in the contexts of multiple correlation, common factor analysis, and descriptive discriminant analysis. (p. 263)

For example, principal components analysis is actually an implicit part of canonical correlation analysis (CCA) and all the parametric methods subsumed by CCA (Thompson, 1984, pp. 11-16). Regarding exploratory component and factor analysis, Gorsuch (1983) emphasized that a "basic [italics added] matrix for interpreting the factors is the factor structure" (p. 207).

Regarding confirmatory factor analysis, Thompson (1997b) and others (e.g., Bentler & Yuan, 2000) have emphasized that when factors are correlated, the measured variables have nonzero structure coefficients even with the factors on which pattern coefficients have been fixed to be zero, and that these structure coefficients must be consulted to arrive at correct interpretations. Similarly, as regards descriptive discriminant analysis, Huberty (1994) noted that "construct definition and structure dimension [and not hit rates] constitute the focus [italics added] of a descriptive discriminant analysis" (p. 206).

Again, most researchers agree that the interpretation of structure coefficients is essential to understanding canonical results. As Meredith (1964) suggested, "If the variables within each set are moderately intercorrelated the possibility of interpreting the canonical variates by inspection of the appropriate regression weights [function coefficients] is practically nil" (p. 55). Levine (1977) was even more emphatic:

I specifically say that one has to do this [interpret structure coefficients] since I firmly believe as long as one wants information about the nature of the canonical correlation relationship, not merely the computation of the [synthetic function] scores, one must have the structure matrix. (p. 20)

And Cohen and Cohen (1983) observed that "interpretation of a given canonical variate is best undertaken by means of the structure coefficients, which are simply the (zero-order) correlations of that variate with its constituent variables (as was rYŶ in MRC)" (p. 456).

Computation of Regression Structure Coefficients

Regression structure coefficients for the regression case can be computed by estimating the Ŷ scores and then requesting the correlations of the predictor variables with these scores. However, logistically it may be easier to compute regression structure coefficients simply by dividing a given r between predictor scores and the Y scores by the multiple correlation coefficient, R. For example, the structure coefficient for predictor X1 can be computed as:

rS = rYX1/ R.
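Both routes give the same number. This sketch (our own simulated data, not from the article) computes a structure coefficient directly as the correlation of a predictor with Ŷ, and again as rYX/R:

```python
# Sketch: a regression structure coefficient computed two ways, as
# r(X_j, Y-hat) and as r(Y, X_j) / R; the two are algebraically equal.
import numpy as np

rng = np.random.default_rng(2)
n = 500
X = rng.normal(size=(n, 2))
X[:, 1] += X[:, 0]                         # correlated predictors
y = X[:, 0] + 0.2 * X[:, 1] + rng.normal(size=n)

Xi = np.column_stack([np.ones(n), X])      # design matrix with intercept
b, *_ = np.linalg.lstsq(Xi, y, rcond=None)
yhat = Xi @ b
R = np.corrcoef(y, yhat)[0, 1]             # multiple correlation

for j in range(2):
    rs_via_yhat = np.corrcoef(X[:, j], yhat)[0, 1]
    rs_via_ratio = np.corrcoef(X[:, j], y)[0, 1] / R
    print(abs(rs_via_yhat - rs_via_ratio) < 1e-9)  # True, twice
```

The equality holds because the residuals are uncorrelated with every predictor, so each predictor's covariance with Y equals its covariance with Ŷ.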

An Interpretation Alternative

Pedhazur (1997) noted that "such coefficients are simply zero-order correlations of independent variables with the dependent variable divided by a constant, namely, the multiple correlation coefficient. Hence, the zero-order correlations provide the same information" (p. 899). But Thompson and Borrello (1985) argued, "However, it must be noted that interpretation of only the bivariate correlations seems counterintuitive. It appears inconsistent to first declare interest in an omnibus system of variables taken only two at a time" (p. 208).

In a recent American Educational Research Association (AERA) invited address, Thompson (1999a) stated, "The reason that structure coefficients are called 'structure' coefficients is that these coefficients provide insight regarding what is the nature or structure of the underlying synthetic variables of the actual research focus" (p. 15). This view of structure coefficients certainly suggests the interpretive importance of these coefficients.

However, Pedhazur (1997) has argued that "because one may obtain large structure coefficients even when results [i.e., R²] are meaningless, their use in such instances may lead to misinterpretations" (p. 899). He then presented a hypothetical data set involving an R² of .00041, for which the rS for the first predictor variable was .988. Pedhazur then said, "These are impressive coefficients, particularly the first one. . . . But what is not apparent from an examination of these coefficients is that they were obtained from meaningless results" (p. 899).

This objection seems unusual. As Thompson (1997a) explained,

All analyses are part of one general linear model. . . . When interpreting results in the context of this model, researchers should generally approach the analysis hierarchically, by asking two questions:

—Do I have anything? (Researchers decide this question by looking at some combination of statistical significance tests, effect sizes . . . and replicability evidence.)

—If I have something, where do my effects originate? (Researchers often consult both the standardized weights implicit in all analyses and structure coefficients to decide this question.) (p. 31)

As Pedhazur himself acknowledged (1997, p. 899) regarding other GLM analyses, such as descriptive discriminant and canonical correlation analyses, one would only bother to examine the structure coefficients after one has determined that the results are noteworthy.

So, given this hierarchical contingency-based approach to the interpretation of all GLM results, including regression, this criticism of Pedhazur (1997) seems irrelevant. In short, in a regression in which noteworthy effects have been isolated, and only then, one ought to interpret either the standardized weights and the correlations of the predictors with Y or the standardized weights and structure coefficients.

Note that neither perspective (i.e., weights vs. structure coefficients) is inherently superior or correct. Only the use of both sets of coefficients presents the full dynamics of the data when predictors are correlated, as is commonly expected in behavioral research. For example, a near-zero weight with a large squared structure coefficient indicates that a predictor might have been useful in a prediction, but that the shared predictive power of that predictor was arbitrarily (i.e., not wrongly, just arbitrarily) assigned to another predictor. Conversely, when a predictor has a large absolute beta weight but a near-zero structure coefficient, a suppressor effect is indicated, as discussed previously (e.g., Horst, 1966).
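The two diagnostic patterns just described can be captured in a small helper. This function is entirely hypothetical; the .10 and .25 cutoffs are arbitrary illustrations, not recommendations from the article.

```python
# Hypothetical screening helper for the two patterns described above.
def flag_predictor(beta: float, rs: float) -> str:
    """Label a predictor from its beta weight and structure coefficient."""
    if abs(beta) < 0.10 and rs ** 2 > 0.25:
        return "shared predictive power credited to another predictor"
    if abs(beta) > 0.30 and abs(rs) < 0.10:
        return "possible suppressor effect"
    return "no flag"

print(flag_predictor(beta=0.02, rs=0.70))   # near-zero weight, large r_s
print(flag_predictor(beta=1.41, rs=0.00))   # large weight, zero r_s
```

In practice such flags would only prompt a closer look at the full correlation matrix, not replace it.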

Sample

The studies we examined were collected from the Journal of Applied Psychology from 1987 (Volume 77) to 1998 (Volume 84). This is an APA I/O psychology journal in which regression is used with some frequency. To be considered, the articles had to use multiple linear regression to analyze the data. The articles also had to present a correlation matrix that included all the variables used in each analysis, so that reanalyses were readily possible. Finally, the article authors had to have deemed their results sufficiently noteworthy that they interpreted their effects, so that these interpretations could be contrasted with alternative interpretations that also consulted structure coefficients. Thirty-one articles met these criteria.

Results

In the 31 articles that met the study's three inclusion criteria, there were 110 regression analyses performed. In all of these analyses, the authors interpreted only standardized weights, as opposed to either (a) standardized weights and structure coefficients or (b) standardized weights and correlations between the predictors and the outcomes. The authors did report the correlation matrices, as required for inclusion in our examination, but they did not consult either these coefficients or the structure coefficients when evaluating the import of the predictors.

Of the 110 analyses, 103 (94%) contained at least one discrepancy between the standardized weights and structure coefficients as regards the rank orderings of the predictive powers of the predictor variables. The standardized weights can be interpreted to evaluate the importance of the predictors in the context-specific setting, in which it is presumed that the model is exactly correctly specified; the weights tolerate no redundancy in credit for shared predictive power when the predictors are correlated. Structure coefficients, on the other hand, evaluate which predictors do or could produce the predicted outcome scores. When predictors are correlated, different interpretations may arise from the two perspectives, both of which have interpretive value.

In 37 (34%) of the 110 analyses, the beta weights failed to identify the single-best predictor variable. In 90 (82%) analyses, either the first, second, or third single-best predictor variable was not identified. Finally, in 77 (70%) of the analyses, the interpreted beta weights did not identify the single-worst predictor, which is extremely important if a researcher uses the beta weights to reduce the number of predictor variables applied in a model.
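The kind of rank-order discrepancy tallied here can be reproduced from a small made-up correlation matrix. The numbers below are hypothetical, chosen only so that two predictors are collinear; the rankings by |beta| and by |structure coefficient| then disagree about the second-best predictor.

```python
import numpy as np

# Made-up correlation matrix: x1 and x2 are highly collinear (r = .85).
Rxx = np.array([[1.00, 0.85, 0.20],
                [0.85, 1.00, 0.20],
                [0.20, 0.20, 1.00]])
rxy = np.array([0.60, 0.50, 0.45])   # correlations of predictors with Y

beta = np.linalg.solve(Rxx, rxy)     # standardized weights
R = np.sqrt(rxy @ beta)              # multiple correlation
rs = rxy / R                         # structure coefficients

order_beta = np.argsort(-np.abs(beta))  # importance ranking implied by |beta|
order_rs = np.argsort(-np.abs(rs))      # importance ranking implied by |r_s|
print(order_beta.tolist())  # → [0, 2, 1]: x2's shared credit went to x1
print(order_rs.tolist())    # → [0, 1, 2]: yet x2 correlates more with Y-hat
```

With these hypothetical numbers x2's beta weight is near zero (about -.07), so a weights-only reading would rank it last, while its structure coefficient ranks it second.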

Of course, the interpretive pictures painted by the weights are not intrinsically wrong, any more than the picture painted by the structure coefficients is intrinsically correct. However, beta weights are affected by the presence or the absence of any other predictor in the model, and the interpretations arising from these weights are context specific and presume that the model is exactly correctly specified. And we will never distinguish between suppressor effects and the direct predictive power of a predictor if we limit our interpretations solely to the examination of standardized weights. We turn now to some illustrative differences in interpretations arising when a fuller set of results is considered.

Illustrative Results

Space restrictions preclude complete presentation of all examples here. Suffice it to say that some gross misinterpretations or incomplete interpretations of regression results occurred in the articles that we studied.

Some examples may convey the general tenor of these problems. Because the likelihood of multicollinearity increases as the number of predictor variables increases, and with it the opportunity for discrepancies between interpretations arising from beta weights as against structure coefficients, the examples are categorized by the number of predictor variables. However, prior to turning to these examples, a common misinterpretation is briefly noted.

Misinterpretation of beta weights as measuring relationship. It must be remembered that the correlations between predictors and the Y scores and between predictors and the Ŷ scores (i.e., rs) are correlation coefficients. That is, the results are bounded by –1 to +1, measure relationship, and have signs reflecting the pattern (direct or inverse) of the relationship.

Beta weights, on the other hand, do not measure relationship. They do not have universal statistical boundaries (i.e., –1 to +1). The weights, for example, can be positive when the predictor's relationship with the criterion is negative, as occurred in the Case 2 example presented in Table 1. Or, the weights can be large and nonzero when the correlations of the predictors with the Y scores are zero, as occurred in the Case 3 example in Table 1.

Clearly, beta weights should not be interpreted as measuring relationship! A beta weight evaluates, given one unit of change (e.g., increase) in ZX1, how much Ŷ will change. For example, if X1 (and thus ZX1) is perfectly uncorrelated with Y and has a beta weight of –2.0, this means that if a person's score was higher on ZX1 by one unit, Ŷ would be lower by two units. This is important to evaluate as part of interpretation, but does not evaluate the statistical issue of correlation! Indeed, a predictor may have a zero correlation with Y but have the largest absolute value of β for that model.

Five or fewer predictors. In a study of organizational citizenship behavior, Podsakoff, Ahearne, and MacKenzie (1997) noted that the

data reported in this table indicate that both sportsmanship (standardized b = .393, p < .05) and helping behavior (standardized b = .397, p < .05) had significant positive relationships [sic] with the quantity of output and accounted for about a quarter of the variance (25.7%) in this criterion variable. The data also indicated that helping behavior was negatively related [sic] (standardized b = –.424, p < .05) to the percentage of paper produced that was rejected. . . . Civic virtue was not found to be related [sic] to either the quantity or quality of output, and sportsmanship was not related [sic] to the quality of output. (p. 266)

However, in reanalysis, the structure coefficients indicated that helping behavior and civic virtue were the best predictors of quantity of output, with both having positive relationships with quality, as opposed to negative and no relationship, respectively.

Maslyn and Fedor (1998) examined the relevance of measuring different foci in politics. The authors reported that

LMX and participant age were positively related to organizational commitment. In contrast, group-focused politics were negatively associated with organizational commitment.

Turnover intentions also were significantly predicted by the set of control variables, accounting for 33% of the variance. In this case, LMX and participant age were both negatively related [sic] to turnover intentions, whereas the group-focused perceptions of politics were not predictive of turnover intentions. (pp. 650-651)

The structure coefficients in this reanalysis indicated that LMX was the best predictor (rs = –.829). Although group focus had the most near-zero beta weight in predicting turnover intentions (reported as –.00), which the authors noted, the structure coefficients (rs = .542) indicated that this variable indeed had sizeable predictive ability.

Six to 10 predictors. In an article on executive recognition and the write-off of problem loans, Staw, Barsade, and Koput (1997) reported that

regression analyses again demonstrated that the relative turnover of top managers at T – 1 significantly predicted both adjusted provision for loan loss (B = .003, p < .01) and adjusted net loan loss at Time T (B = .0046, p < .001). The relative turnover of other senior managers showed similar effects (B = .0041, p < .001, for provisions; and B = .0023, p < .005, for write-offs). However, once again, turnover of outside board members did not predict either provision for loan loss or net loan loss (B = –.0023, ns, and B = –.0024, ns, respectively). (p. 137)

The structure coefficients suggest a different picture. Contrary to the authors' findings, for adjusted provision for loan loss and adjusted loan loss, the single-most important predictor was the relative turnover of outside board members (i.e., bank outside directors), and the relative turnover of top managers (i.e., bank presidents, chief executive officers, and chairs) was the least important predictor of all three.

Furthermore, Staw et al. (1997) observed that "turnover in banks' operating management was significantly associated with the way banks dealt with problem loans" (p. 138). However, the importance of structure coefficients is demonstrated by the finding that, from this alternate perspective, it was actually the turnover of outside management, not the turnover of operating management, that was the single-best predictor of how the banks dealt with problem loans.

Vliert, Euwema, and Huismans (1995) found that "as superiors, the sergeants treated their subordinate more effectively if they removed forcing (β = –.58, p < .01) or added process controlling (β = .46, p < .01) or accommodating (β = .21, p < .05)" (p. 276). However, the unreported structure coefficients indicated that the sergeants treated their subordinates more effectively if they added problem solving (rs2 = .743) first and foremost, whereas the third-most effective technique was to remove avoiding (rs2 = .411).

Similarly, Wanberg, Watt, and Rumsey (1996) maintained that

our multiple regression analyses found conscientiousness and job-seeking support to be significant, positive predictors of job-seeking frequency and job-seeking intention. One additional variable, gender, was also a significant predictor of job-seeking intention, with women being more likely than men to have future intentions of looking hard for work. (p. 83)

This result was reported to be at odds with previous results. However, looking at the structure coefficients, the rank of the structure coefficient for gender was seventh, not second, as the beta weights suggested.

In their findings on perceived equity, motivation, and final-offer arbitration in major league baseball, Bretz and Thomas (1992) stated that

multiple ordinary least squares regression indicated that arbitration outcome significantly predicted subsequent performance. The pre-arbitration performance variables were the most significant predictors of subsequent performance. As hypothesized, the coefficient on lost arbitration was negative and significant, suggesting that losing arbitration had detrimental effect on subsequent performance. (p. 283)

After computing the structure coefficients for this model, losing arbitration was found to have no effect on player performance (rs = –.086). That is, losing arbitration was an unnoticed suppressor variable.

In their study of the influence of structural features of local unions on members' union commitment, Mellor, Mathieu, and Swim (1994) reported that "only age (beta = –.14, p < .05) and union-family conflict (beta = –.42, p < .01) evidenced significant linear effects" (p. 206). However, the structure coefficients indicated something different. Union-family conflict was the single-most noteworthy predictor of union commitment. Furthermore, age was actually only the ninth-best single predictor of union commitment.

Eleven to 15 predictor variables. In studying the roles of job enrichment and other organization interventions on self-efficacy, Parker (1998) noted that "decision-making influence (the second measure of job enrichment) did not make a significant independent contribution to the regression equation" (p. 842). However, the structure coefficients indicated that, of the organizational variables, decision-making influence was the second-best predictor of role breadth self-efficacy.

Parker (1998) also indicated that "it is relevant to observe that both self-esteem and proactive personality were significant predictors of RBSE (β = .11, p < .01, and β = .24, p < .001, respectively), suggesting that these personality factors are associated with self-efficacy" (p. 842). However, the structure coefficients indicated that although proactivity was the second-best predictor of RBSE, self-esteem was only the sixth best.

Sixteen or more predictor variables. In explaining the results of their study of substance abuse and on-the-job behaviors, Lehman and Simpson (1992) reported that on the Psychological Withdrawal Behaviors scale, "Alcohol use, lifetime drug use, and substance use at work each had significant b weights in the full regression. Other important predictors of psychological withdrawal behaviors included age, education, self-esteem, depression, tenure with city, job involvement, job satisfaction, organizational commitment, and power" (pp. 315-316). However, surveying the structure coefficients, the ordering of the variables predicting Psychological Withdrawal Behaviors was different from the ordering associated with the beta weights. The eight most important predictor variables, according to the structure coefficients, were organizational commitment (rs = –.740), job satisfaction (rs = –.606), job involvement (rs = –.564), self-esteem (rs = –.428), drug use (rs = .428), faith in management (rs = –.402), loyalty (rs = –.351), and age (rs = –.337).

For Physical Withdrawal Behaviors, Lehman and Simpson (1992) noted that "the strongest individual predictors were positive affect, pay level, power, and lifetime drug use. Substance use at work and alcohol use also had significant b weights" (p. 316). However, looking at the structure coefficients, the strongest single predictor variables were recent drug use (rs = .551), drug use (rs = .416), gender (rs = –.409), substance use at work (rs = .409), self-esteem (rs = .372), and age (rs = .326).

For Antagonistic Work Behaviors, Lehman and Simpson (1992) noted that "the strongest individual predictors were being White, faith in management, job involvement, job satisfaction, and loyalty. Substance use at work was the only substance use variable to have a significant b weight" (p. 316). However, the structure coefficients suggested that the most important predictor variables were faith in management (rs = –.668), job satisfaction (rs = –.524), loyalty (rs = –.479), organizational commitment (rs = –.325), and substance use at work (rs = .322).

Finally, in Tannenbaum, Mathieu, Salas, and Cannon-Bowers's (1991) article on the influence of training fulfillment on the development of commitment, self-efficacy, and motivation, they indicated that training fulfillment was positively related to training motivation, but the structure coefficients indicated that training fulfillment was actually negatively related to training motivation. Similarly, they also mentioned that inspections were negatively related to physical self-efficacy, but the structure coefficients indicated that there was actually a positive relationship between these two variables.

Discussion

In most cases, regression researchers ought to interpret β weights and structure coefficients (or else bivariate correlations of predictors with the criterion) once a noteworthy omnibus effect is detected. Our study demonstrates that this is not merely some pedantic statistical concern. Researchers may misinterpret or incompletely interpret their regression results by consulting only selected aspects of their analyses.

The finding that so few regression researchers in the articles we studied consulted structure coefficients (i.e., none) is not atypical as regards the practices in other journals (Burdenski, in press). For example, Bowling (1993) reported that the Journal of Counseling Psychology published 20 articles that used multiple regression analysis between January 1990 and April 1993, but that authors of only 3 studies reported structure coefficients in their results and only a few provided a correlation matrix that would allow an ambitious researcher to derive them post facto. In this vein, Dunlap and Landis (1998) noted that although structure coefficients "are invariably computed for canonical correlations by modern statistical software, they are never reported for multiple regression analysis" (p. 398).

We emphasize again that we do not advocate ignoring the regression beta weights. However, we must remember that these weights and the interpretations arising from them are context specific. The confidence we vest in interpretations of the weights hinges on our certainty that our model is exactly correctly specified. The weights can all change dramatically with the addition or the deletion of a single predictor.

It can be useful to also consult regression structure coefficients or the correlations of the predictors with Y to obtain another perspective on dynamics within our data. This consultation may yield the insight that a predictor with a near-zero beta weight actually was the single-best predictor. Or we may discover that the predictor is a suppressor that improves the model R2 not by directly predicting Y but indirectly doing so by removing extraneous variance from other predictors.

Of course, when predictors are perfectly uncorrelated, both sets of coefficients will yield identical interpretations, because β in this case for a predictor will equal rYX, and because rs equals rYX / R, the two sets of coefficients will merely be scaled differently. But, as our review showed, in many articles predictors are often correlated, just as they often are in the reality being studied.
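The uncorrelated special case can be checked numerically. The sketch below uses simulated data (independently generated predictors, so only approximately uncorrelated in sample): the beta weights reproduce the predictor-criterion correlations, and the structure coefficients are those same correlations rescaled by 1/R.

```python
import numpy as np

# Hypothetical check (simulated data): with uncorrelated predictors,
# beta = rYX, and rs = rYX / R is just a rescaling of the same values.
rng = np.random.default_rng(2)
n = 100_000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)          # generated independently of x1
y = 0.6 * x1 + 0.3 * x2 + rng.normal(size=n)

Z = np.column_stack([x1, x2])
Z = (Z - Z.mean(axis=0)) / Z.std(axis=0)
zy = (y - y.mean()) / y.std()

Rxx = np.corrcoef(Z, rowvar=False)
rxy = Z.T @ zy / n               # correlations of predictors with Y
beta = np.linalg.solve(Rxx, rxy) # standardized weights
R = np.sqrt(rxy @ beta)          # multiple correlation
rs = rxy / R                     # structure coefficients

print(np.round(beta, 2))  # ~ [0.50, 0.25]: beta matches rYX here
print(np.round(rs, 2))    # rYX / R: same rank order, different scale
```

Because the two sets of coefficients differ only by the factor 1/R in this case, the importance rankings they imply necessarily agree; it is correlation among the predictors that makes them diverge.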

When interpreting regression results, once noteworthy effects have been detected it may be best to consult the full system of results, just as we routinely would in applications of other members of the general linear model analytic family. The two sets of coefficients (β weights and structure coefficients) provide us with a more insightful stereoscopic view of dynamics within our data. Interpreting only beta weights, once noteworthy omnibus effects have been isolated, usually will not yield sufficient understanding of all the relevant dynamics within our data. Other results may also augment interpretation (e.g., Johnson, 2000).

As Cohen and Cohen (1983) argued, "It is also important to keep in mind the zero-order correlations of Xi with Y (and hence with Ŷ)" (p. 113). Of course, Ezekiel's (1930) old admonition remains prescient:

Except insofar as the effort to reduce the variables to specific numerical statement, definitely related, forces the investigator to think more clearly and definitely about his problem, statistical analysis is not a substitute for logical analysis, clear-cut thinking, and full knowledge of the problem. (p. 351)

References

American Psychological Association. (1994). Publication manual of the American Psychological Association (4th ed.). Washington, DC: Author.

Bagozzi, R. P., Fornell, C., & Larcker, D. F. (1981). Canonical correlation analysis as a special case of a structural relations model. Multivariate Behavioral Research, 16, 437-454.

Bentler, P. M., & Yuan, K.-H. (2000). On adding a mean structure to a covariance structure model. Educational and Psychological Measurement, 60, 326-339.

Bretz, R., Jr., & Thomas, S. (1992). Perceived equity, motivation, and final-offer arbitration in major league baseball. Journal of Applied Psychology, 77, 280-287.

Bowling, J. (1993, November). The importance of structure coefficients as against beta weights: Comments with examples from the counseling psychology literature. Paper presented at the annual meeting of the Mid-South Educational Research Association, New Orleans. (ERIC Document Reproduction Service No. ED 364 606)

Burdenski, T. K., Jr. (in press). The importance of structure coefficients in multiple regression: A review with examples from published literature. In B. Thompson (Ed.), Advances in social science methodology (Vol. 6). Stamford, CT: JAI.

Cohen, J. (1968). Multiple regression as a general data-analytic system. Psychological Bulletin, 70, 426-443.

Cohen, J., & Cohen, P. (1983). Applied multiple regression/correlation analysis for the behavioral sciences. Hillsdale, NJ: Erlbaum.

Cooley, W. W., & Lohnes, P. R. (1971). Multivariate data analysis. New York: Wiley.

Darlington, R. B. (1968). Multiple regression in psychological research and practice. Psychological Bulletin, 69, 161-182.

Dawson, T. E. (1999). Relating variance partitioning in measurement analyses to the exact same process in substantive analyses. In B. Thompson (Ed.), Advances in social science methodology (Vol. 5, pp. 101-110). Stamford, CT: JAI.

Duncan, O. D. (1975). Introduction to structural equation models. New York: Academic Press.

Dunlap, W. P., & Landis, R. S. (1998). Interpretations of multiple regression borrowed from factor analysis and canonical correlation. Journal of General Psychology, 125, 397-407.

Ezekiel, M. (1930). Methods of correlational analysis. New York: Wiley.

Fan, X. (1996). Canonical correlation analysis as a general analytic model. In B. Thompson (Ed.), Advances in social science methodology (Vol. 4, pp. 71-94). Greenwich, CT: JAI.

Fan, X. (1997). Canonical correlation analysis and structural equation modeling: What do they have in common? Structural Equation Modeling, 4, 65-79.

Gorsuch, R. L. (1983). Factor analysis (2nd ed.). Hillsdale, NJ: Erlbaum.

Henard, D. H. (1998, January). Suppressor variable effects: Toward understanding an elusive data dynamic. Paper presented at the annual meeting of the Southwest Educational Research Association, Houston, TX. (ERIC Document Reproduction Service No. ED 416 215)

Horst, P. (1966). Psychological measurement and prediction. Belmont, CA: Wadsworth.

Huberty, C. J (1994). Applied discriminant analysis. New York: Wiley.

Huberty, C. J, & Petoskey, M. D. (1999). Use of multiple correlation analysis and multiple regression analysis. Journal of Vocational Education Research, 24(1), 15-43.

Johnson, J. W. (2000). A heuristic method for estimating the relative weight of predictor variables in multiple regression. Multivariate Behavioral Research, 35, 1-19.

Knapp, T. R. (1978). Canonical correlation analysis: A general parametric significance testing system. Psychological Bulletin, 85, 410-416.

Lancaster, B. P. (1999). Defining and interpreting suppressor effects: Advantages and limitations. In B. Thompson (Ed.), Advances in social science methodology (Vol. 5, pp. 139-148). Stamford, CT: JAI.

Lehman, W., & Simpson, D. (1992). Employee substance use and on-the-job behaviors. Journal of Applied Psychology, 77, 309-321.

Levine, M. S. (1977). Canonical analysis and factor comparison. Beverly Hills, CA: Sage.

Maslyn, J., & Fedor, D. (1998). Perceptions of politics: Does measuring different foci matter? Journal of Applied Psychology, 84, 645-653.

Mellor, S., Mathieu, J., & Swim, J. (1994). Cross-level analysis of the influence of local union structure on women's and men's union commitment. Journal of Applied Psychology, 79, 203-210.

Meredith, W. (1964). Canonical correlations with fallible data. Psychometrika, 29, 55-65.

Parker, S. (1998). Enhancing role breadth self-efficacy: The roles of job enrichment and other organizational interventions. Journal of Applied Psychology, 83, 835-852.

Pedhazur, E. J. (1982). Multiple regression in behavioral research: Explanation and prediction (2nd ed.). New York: Holt, Rinehart & Winston.

Pedhazur, E. J. (1997). Multiple regression in behavioral research (3rd ed.). Ft. Worth, TX: Harcourt Brace.

Podsakoff, P., Ahearne, M., & MacKenzie, S. (1997). Organizational citizenship behavior and the quantity and quality of work group performance. Journal of Applied Psychology, 82, 262-270.

Staw, B., Barsade, S., & Koput, K. (1997). Escalation at the credit window: A longitudinal study of bank executives' recognition and write-off of problem loans. Journal of Applied Psychology, 82, 130-142.

Stevens, J. (1996). Applied multivariate statistics for the social sciences (3rd ed.). Mahwah, NJ: Erlbaum.

Tannenbaum, S., Mathieu, J., Salas, E., & Cannon-Bowers, J. (1991). Meeting trainees' expectations: The influence of training fulfillment on the development of commitment, self-efficacy, and motivation. Journal of Applied Psychology, 76, 759-769.

Thompson, B. (1984). Canonical correlation analysis: Uses and interpretation. Newbury Park, CA: Sage.

Thompson, B. (1991). A primer on the logic and use of canonical correlation analysis. Measurement and Evaluation in Counseling and Development, 24, 80-95.

Thompson, B. (1995). Stepwise regression and stepwise discriminant analysis need not apply here: A guidelines editorial. Educational and Psychological Measurement, 55, 525-534.

Thompson, B. (1997a). Editorial policies regarding statistical significance tests: Further comments. Educational Researcher, 26(5), 29-32.

Thompson, B. (1997b). The importance of structure coefficients in structural equation modeling confirmatory factor analysis. Educational and Psychological Measurement, 57, 5-19.

Thompson, B. (1999a, April). Common methodology mistakes in educational research, revisited, along with a primer on both effect sizes and the bootstrap. Invited address presented at the annual meeting of the American Educational Research Association, Montreal. (ERIC Document Reproduction Service No. ED 429 110)

Thompson, B. (1999b). Five methodology errors in educational research: A pantheon of statistical significance and other faux pas. In B. Thompson (Ed.), Advances in social science methodology (Vol. 5, pp. 23-86). Stamford, CT: JAI.

Thompson, B. (2000). Canonical correlation analysis. In L. Grimm & P. Yarnold (Eds.), Reading and understanding more multivariate statistics (pp. 285-316). Washington, DC: American Psychological Association.

Thompson, B., & Borrello, G. M. (1985). The importance of structure coefficients in regression research. Educational and Psychological Measurement, 45, 203-209.

Thompson, B., & Daniel, L. G. (1996). Seminal readings on reliability and validity: A "hit parade" bibliography. Educational and Psychological Measurement, 56, 741-745.

Thorndike, R. M. (1978). Correlational procedures for research. New York: Gardner.

Vliert, E., Euwema, M., & Huismans, S. (1995). Managing conflict with a subordinate or a superior: Effectiveness of conglomerated behavior. Journal of Applied Psychology, 80, 271-281.

Wanberg, C., Watt, J., & Rumsey, D. (1996). Individuals without jobs: An empirical study of job-seeking behavior and reemployment. Journal of Applied Psychology, 81, 76-87.

Wells, R. D. (1999). Factor scores and factor structure and communality coefficients. In B. Thompson (Ed.), Advances in social science methodology (Vol. 5, pp. 123-138). Stamford, CT: JAI.

Wilkinson, L., & APA Task Force on Statistical Inference. (1999). Statistical methods in psychology journals: Guidelines and explanations. American Psychologist, 54, 594-604.

Woolley, K. K. (1997, January). How variables uncorrelated with the dependent variable can actually make excellent predictors: The important suppressor variable case. Paper presented at the annual meeting of the Southwest Educational Research Association, Austin. (ERIC Document Reproduction Service No. ED 407 420)
