Multiple Regression Analysis: Wrap-Up


Page 1: Multiple Regression Analysis: Wrap-Up


Multiple Regression Analysis: Wrap-Up

More Extensions of MRA

Page 2: Multiple Regression Analysis: Wrap-Up


Contents for Today

Probing interactions in MRA:
• Reliability and interactions
• Enter Method vs. Hierarchical
• Patterns of interactions/effects coefficients

Polynomial Regression
Interpretation issues in MRA
Model Comparisons

Page 3: Multiple Regression Analysis: Wrap-Up


Recall our continuous-variable interaction: job satisfaction as a function of Hygiene and Caring Atmosphere.

The regression of job satisfaction on hygiene has a steeper slope when people perceive an (otherwise) caring atmosphere.

Page 4: Multiple Regression Analysis: Wrap-Up


Simple slopes of satisfaction on hygiene for three levels of caring atmosphere

[Figure: "Interaction of Care x Basics on Satisfaction": satisfaction level (0-8) plotted against low and high values of basics, with separate lines for the low value, mean, and high value of care.]

Predicted satisfaction:

               Low Basic   High Basic
Low Care         3.464       5.036
Mean of Care     4.192       6.258
High Care        4.575       6.902

Page 5: Multiple Regression Analysis: Wrap-Up


Developing equations to graph, using Cohen et al.'s notation:

Y' = (A + B2·Z) + (B1 + B3·Z)X

1) Choose a high, medium, and low value for Z and solve the equation.

Example: a low value of Z [caring atmosphere] might be -1.57 (after centering):

Y' = [5.23 + 0.3721(-1.57)] + [0.5165 + 0.0472(-1.57)]X1

Y' = [4.646] + [0.442]X1

2) Next, solve for two reasonable (but extreme) values of X1.

Example: doing so for -2 and +2 for basics gives us 3.76 and 5.53.

3) Repeat for the medium and high values of Z.
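The three steps above can be sketched numerically. A minimal Python sketch using the coefficients reported on these slides (intercept A = 5.2251, B1 = 0.5165, B2 = 0.3721, B3 = 0.0472):

```python
# Simple-slope equations at chosen values of the moderator Z (Cohen et al. notation):
#   Y' = (A + B2*Z) + (B1 + B3*Z) * X
# Coefficients taken from the slides' centered regression.
A, B1, B2, B3 = 5.2251, 0.5165, 0.3721, 0.0472

def simple_slope_equation(z):
    """Return (intercept, slope) of the regression of Y on X at moderator value z."""
    return A + B2 * z, B1 + B3 * z

for label, z in [("low", -1.5735), ("mean", 0.0), ("high", 1.5735)]:
    intercept, slope = simple_slope_equation(z)
    # Step 2: predicted Y at two extreme X values (-2 and +2)
    y_lo, y_hi = intercept + slope * (-2), intercept + slope * 2
    print(f"{label:>4} Z: Y' = {intercept:.3f} + {slope:.3f}X  ->  Y'(-2)={y_lo:.2f}, Y'(+2)={y_hi:.2f}")
```

These reproduce the intercepts and slopes in the simple-slope table later in the deck.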

Page 6: Multiple Regression Analysis: Wrap-Up


More on centering…

First, some terms (two continuous variables with an interaction):

Y' = β0 + β1X + β2Z + β3XZ

Full regression equation for our example (centered):

Y' = 5.225 + .516X + .372Z + .047XZ

where X = Hygiene (cpbasic) and Z = caring atmosphere (cpcare). Assignment of X and Z is arbitrary.

What do β1 and β2 represent if β3 = 0?

What if β3 ≠ 0?

Page 7: Multiple Regression Analysis: Wrap-Up


Even more on centering

We know that centering helps us with multicollinearity issues.

Let’s examine some other properties, first turning to p. 259 of the reading…

Note the regression equation above graph A. Then above graph B.

Page 8: Multiple Regression Analysis: Wrap-Up


Why does this (the equation on slide 6) make sense? Or does it?

Our regression equation:

Y' = β0 + β1X + β2Z + β3XZ

Rearranging some terms:

Y' = β1X + β3XZ + β0 + β2Z

Then factor out X:

Y' = (β1 + β3Z)X + (β0 + β2Z)

The left-hand term (in parentheses) reflects the slope; the right-hand term (in parentheses) reflects the intercept. We then solve this equation at different values of Z.
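The factoring can be sanity-checked numerically: the original and factored forms must give identical predictions for any X and Z. A minimal sketch using the slides' coefficient values:

```python
# Check that Y' = b0 + b1*X + b2*Z + b3*X*Z equals the factored form
# (b1 + b3*Z)*X + (b0 + b2*Z) for any X and Z (coefficients from the slides).
b0, b1, b2, b3 = 5.225, 0.5165, 0.3721, 0.0472

def y_full(x, z):
    return b0 + b1 * x + b2 * z + b3 * x * z

def y_factored(x, z):
    return (b1 + b3 * z) * x + (b0 + b2 * z)

for x in (-2.0, 0.0, 1.5):
    for z in (-1.5735, 0.0, 1.5735):
        assert abs(y_full(x, z) - y_factored(x, z)) < 1e-12
print("both forms agree")
```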

Page 9: Multiple Regression Analysis: Wrap-Up


Since the regression is “symmetric”…

Our regression equation:

Y' = β0 + β1X + β2Z + β3XZ

We can rearrange the terms differently:

Y' = β2Z + β3XZ + β0 + β1X

Then factor out Z:

Y' = (β2 + β3X)Z + (β0 + β1X)

The left-hand term (in parentheses) reflects the slope; the right-hand term (in parentheses) reflects the intercept. We then solve this equation at different values of X.

Page 10: Multiple Regression Analysis: Wrap-Up


Are the simple slopes different from 0?

This may be a reasonable question; if so, solve for the simple slope:

Y' = (β1 + β3Z)X + (β0 + β2Z)

and solve for a chosen Z value (e.g., one standard deviation below the mean: -1.57):

Y' = [5.23 + 0.3721(-1.57)] + [0.5165 + 0.0472(-1.57)]X1

Y' = [4.646] + [0.442]X1

The simple slope is 0.442. Next we need to obtain an error term. The standard error is given by:

SE(B at Z) = √( SE²b1 + 2Z·cov(b1, b3) + Z²·SE²b3 )

Page 11: Multiple Regression Analysis: Wrap-Up


How to solve…

Under “Statistics,” request the covariance matrix for the regression coefficients. For our example:

Coefficient Covariances
               cpbasic        cpcare         cbasxcar
b1 cpbasic     0.00187830    -0.00053855    0.00015193
b2 cpcare     -0.00053855     0.00047930    0.00011203
b3 cbasxcar    0.00015193     0.00011203    0.00031912

Note: I reordered these as SPSS didn’t provide them in order; I also added the b1, b2, etc. labels.

SE(B at Z) = √( 0.0018783 + 2(-1.5735)(0.00015193) + (-1.5735)²(0.00031912) )

SE(B at Z) = √.002190286 = .0468

t(1038) = 0.442 / 0.0468 = 9.444
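The slide's arithmetic can be sketched in Python; the variances and covariance below are taken from the coefficient covariance matrix on this slide:

```python
import math

# Standard error of the simple slope at moderator value Z:
#   SE = sqrt( var(b1) + 2*Z*cov(b1, b3) + Z**2 * var(b3) )
var_b1  = 0.0018783    # variance of the X coefficient (cpbasic)
cov_b13 = 0.00015193   # covariance of the X and XZ coefficients
var_b3  = 0.00031912   # variance of the XZ coefficient (cbasxcar)

z = -1.5735            # one SD below the mean of (centered) care
se = math.sqrt(var_b1 + 2 * z * cov_b13 + z**2 * var_b3)

simple_slope = 0.4423
t = simple_slope / se  # df = N - k - 1 = 1042 - 3 - 1 = 1038
print(f"SE = {se:.4f}, t(1038) = {t:.3f}")
```

The same three inputs evaluated at z = 0 and z = +1.5735 give the other two standard errors in the simple-slope table.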

Page 12: Multiple Regression Analysis: Wrap-Up


Simple Slope Table

Simple regression equations (simple slopes, intercepts, test statistics, and confidence intervals):

Value of Z (z_cpcare)   Value    Slope   Intercept  SE of simple slope      t        p     95% CI Low  95% CI High
Low                    -1.5735   0.4423   4.6396         0.0468          9.4502   0.0000     0.3504      0.5341
Medium                  0.0000   0.5165   5.2251         0.0433         11.9172   0.0000     0.4314      0.6015
High                    1.5735   0.5907   5.8106         0.0561         10.5303   0.0000     0.4806      0.7008

Compare to the original regression table (95% CI for B):

              B       Std. Error   Beta        t       Sig.     Lower    Upper
(Constant)    5.2251  0.0286               182.6190   0.0000    5.1690   5.2813
x_cpbasic     0.5165  0.0433      0.3413    11.9172   0.0000    0.4314   0.6015
z_cpcare      0.3721  0.0219      0.4981    16.9960   0.0000    0.3291   0.4151
xz_cbasxcar   0.0472  0.0179      0.0650     2.6399   0.0084    0.0121   0.0822

Page 13: Multiple Regression Analysis: Wrap-Up


A visual representation of the regression plane: the centering thing again…

Page 14: Multiple Regression Analysis: Wrap-Up


Using SPSS to get simple slopes when an interaction is present

When the model contains an interaction, bX is the simple slope of Y on X at Z = 0. Knowing this, we can “trick” SPSS into computing simple-slope test statistics for us.

Uncentered Descriptive Statistics:

           Mean        Std. Deviation   N
satis      5.26        1.175            1042
pbasic     4.131144    .7766683         1042
pcare      5.621881    1.5735326        1042
basxcar    24.0355     9.20924          1042

Centered Descriptive Statistics:

           Mean      Std. Deviation   N
satis      5.26      1.175            1042
cpbasic    -.0007    .77667           1042
cpcare     .0032     1.57353          1042
cbasxcar   .8107     1.61918          1042

When Z is centered, bX is the simple slope at the mean of Z (the “middle” simple slope). So…

Page 15: Multiple Regression Analysis: Wrap-Up


Using SPSS to get simple slopes (cont’d)

If we force Z = 0 to fall one standard deviation below the mean (by adding one SD after centering), we get the simple slope for X at one standard deviation below the mean:

compute cpbasic = pbasic - 4.131875.
compute cpcare = pcare - 5.61866 + 1.573532622.
compute cbasxcar = cpbasic*cpcare.
execute.

TITLE 'regression w/interaction centered'.
REGRESSION
  /DESCRIPTIVES MEAN STDDEV CORR SIG N
  /MISSING LISTWISE
  /STATISTICS COEFF OUTS CI BCOV R ANOVA COLLIN TOL ZPP
  /CRITERIA=PIN(.05) POUT(.10)
  /NOORIGIN
  /DEPENDENT satis
  /METHOD=ENTER cpbasic cpcare cbasxcar
  /SCATTERPLOT=(*ZRESID ,*ZPRED )
  /SAVE PRED .

This code gets us…

Page 16: Multiple Regression Analysis: Wrap-Up


This…

              B       Std. Error   Beta        t       Sig.   Lower Bound  Upper Bound
(Constant)    4.6396   0.0479                96.7988  0.0000     4.5456       4.7337
cpbasic       0.4423   0.0468      0.2922     9.4502  0.0000     0.3504       0.5341
cpcare        0.3721   0.0219      0.4981    16.9960  0.0000     0.3291       0.4151
cbasxcar      0.0472   0.0179      0.0610     2.6399  0.0084     0.0121       0.0822
Dependent Variable: satis

And setting the zero-point to one standard deviation above the mean:

compute cpcare = pcare - 5.61866 - 1.573532622.

Gives us…

              B       Std. Error   Beta        t        Sig.   Lower Bound  Upper Bound
(Constant)    5.8106   0.0414                140.3721  0.0000     5.7294       5.8918
cpbasic       0.5907   0.0561      0.3903     10.5303  0.0000     0.4806       0.7008
cpcare        0.3721   0.0219      0.4981     16.9960  0.0000     0.3291       0.4151
cbasxcar      0.0472   0.0179      0.0976      2.6399  0.0084     0.0121       0.0822
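The same re-centering trick carries over to any regression software: shifting the moderator's zero point and refitting makes the X coefficient (and its standard error) the simple-slope test at the chosen level. A hypothetical sketch with simulated data in numpy (not the course dataset; coefficient values are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
x = rng.normal(0, 1, n)     # centered predictor (e.g., basics)
z = rng.normal(0, 1.5, n)   # centered moderator (e.g., care)
y = 5.0 + 0.5 * x + 0.4 * z + 0.05 * x * z + rng.normal(0, 1, n)

def fit(x, z, y):
    """OLS of y on [1, x, z, x*z]; returns the coefficient vector."""
    X = np.column_stack([np.ones_like(x), x, z, x * z])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

b_centered = fit(x, z, y)        # x coefficient = simple slope at z = 0
b_shifted  = fit(x, z + 1.5, y)  # x coefficient = simple slope at z = -1.5 (below the mean)

# The shifted fit's x coefficient equals b1 + b3*(-1.5) from the centered fit
print(b_shifted[1], b_centered[1] + b_centered[3] * (-1.5))
```

Adding a constant to z moves the point where the shifted moderator equals zero below its mean, exactly as the `compute cpcare` line does in the SPSS syntax.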

Page 17: Multiple Regression Analysis: Wrap-Up


Choice of levels of Z for simple slopes

[Figure: "Interaction of Care x Basics on Satisfaction": satisfaction level (0-8) plotted against low and high values of basics, with separate lines for the low value, mean, and high value of care.]

• +/- 1 standard deviation
• Range of values
• Meaningful cutoffs

Page 18: Multiple Regression Analysis: Wrap-Up


Wrapping up CV Interactions

The interaction term (highest order) is invariant, assuming all lower-order terms are included.

Upper limits on correlations are governed by rxx.

Crossing point of the regression lines: for Hygiene, -10.9 (centered); for Caring atmosphere, -7.9.

If your work involves complicated interaction hypotheses, examine Aiken & West (1991). Section 7.7 is not covered here, but it is a good discussion.

β weights cannot be interpreted using the method discussed here.
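The crossing-point values follow from the factored equations on the earlier slides: in Y' = (β1 + β3Z)X + (β0 + β2Z), the Z terms cancel when X = -β2/β3, so every simple regression line of Y on X passes through that X; symmetrically, the lines of Y on Z all pass through Z = -β1/β3. A quick check with the slides' coefficients:

```python
# Crossing points of the simple regression lines, from the factored forms
#   Y' = (b1 + b3*Z)*X + (b0 + b2*Z)   and   Y' = (b2 + b3*X)*Z + (b0 + b1*X).
b1, b2, b3 = 0.5165, 0.3721, 0.0472  # coefficients from the centered model

x_cross = -b2 / b3  # approx -7.9 on the centered scale
z_cross = -b1 / b3  # approx -10.9 on the centered scale
print(f"X crossing: {x_cross:.1f}, Z crossing: {z_cross:.1f}")
```

Both values fall far outside the observed range of the centered predictors, which is why the plotted simple slopes never cross within the figures.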

Page 19: Multiple Regression Analysis: Wrap-Up


Polynomial Regression (10,000 Ft)

[Figures: example polynomial curves plotted over X from -3 to 3: a 2nd-power, a 3rd-power, a 4th-power (Y' = X + X^2 + X^3 + X^4), and a 5th-power polynomial.]

Page 20: Multiple Regression Analysis: Wrap-Up


Predicting job satisfaction from IQ

Model Summary
Model    R      R Square   Adjusted R Square   Std. Error of the Estimate
1        .365   .134       .111                2.10543
Predictors: (Constant), IQ
Dependent Variable: JobSat

ANOVA
Model 1       Sum of Squares   df   Mean Square     F      Sig.
Regression     25.952           1    25.952        5.855   .020
Residual      168.448          38     4.433
Total         194.400          39
Predictors: (Constant), IQ
Dependent Variable: JobSat

Page 21: Multiple Regression Analysis: Wrap-Up


Continued

Unstandardized and standardized coefficients, with 95% confidence intervals for B:

              B        Std. Error   Beta        t       Sig.    Lower Bound  Upper Bound
(Constant)    -0.8432   3.1763               -0.2655   0.7921     -7.2734      5.5869
IQ             0.0720   0.0297      0.3654    2.4196   0.0204      0.0118      0.1322
Dependent Variable: JobSat

It’s all good; let’s inspect our standardized-predicted-by-residual graph.

Page 22: Multiple Regression Analysis: Wrap-Up


Ooops!

Page 23: Multiple Regression Analysis: Wrap-Up


Next step…

• Center X
• Square X
• Add X² to the prediction equation

compute c_IQ = IQ - 106.20.
compute IQsq = c_IQ**2.
execute.

REGRESSION
  /DESCRIPTIVES MEAN STDDEV CORR SIG N
  /MISSING LISTWISE
  /STATISTICS COEFF OUTS CI R ANOVA COLLIN TOL ZPP
  /CRITERIA=PIN(.05) POUT(.10)
  /NOORIGIN
  /DEPENDENT JobSat
  /METHOD=ENTER IQ IQsq
  /SCATTERPLOT=(*ZRESID ,*ZPRED ) .
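The center-square-enter sequence can also be sketched outside SPSS. A hypothetical numpy version on simulated data (the inverted-U coefficients below are made up for illustration, not taken from the course file):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 40
iq = rng.normal(106.2, 10, n)

# Simulated inverted-U relation between IQ and job satisfaction (illustrative only)
c_iq = iq - iq.mean()  # 1) center X
jobsat = 5 + 0.05 * c_iq - 0.02 * c_iq**2 + rng.normal(0, 0.5, n)

iqsq = c_iq ** 2                                # 2) square the centered X
X = np.column_stack([np.ones(n), c_iq, iqsq])   # 3) add X^2 to the equation
beta, *_ = np.linalg.lstsq(X, jobsat, rcond=None)

print("b0, b_linear, b_quadratic:", np.round(beta, 3))
# A negative quadratic coefficient indicates a concave (inverted-U) trend.
```

Here the centered term is entered alongside its square; centering before squaring keeps the linear and quadratic predictors from being highly collinear.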

Page 24: Multiple Regression Analysis: Wrap-Up


Results

Model Summary
Model    R      R Square   Adjusted R Square   Std. Error of the Estimate
1        .944   .891       .886                .75534
Predictors: (Constant), IQsq, IQ
Dependent Variable: JobSat

ANOVA
Model 1       Sum of Squares   df   Mean Square      F       Sig.
Regression    173.290           2    86.645        151.867   .000
Residual       21.110          37      .571
Total         194.400          39
Predictors: (Constant), IQsq, IQ
Dependent Variable: JobSat

Page 25: Multiple Regression Analysis: Wrap-Up


And both predictors are significant

              B        Std. Error   Beta         t        Sig.    Lower Bound  Upper Bound
(Constant)    -6.5099   1.1928                -5.4575    0.0000     -8.9269      -4.0930
IQ             0.1455   0.0116      0.7385    12.5294    0.0000      0.1219       0.1690
IQsq          -0.0171   0.0011     -0.9472   -16.0700    0.0000     -0.0192      -0.0149

Page 26: Multiple Regression Analysis: Wrap-Up


Interpretation Issues & Model Comparison

Linearity vs. nonlinearity:
• Are nonlinear effects well established?
• Replicability of the nonlinearity
• Degree of nonlinearity

Interpretation issues:
• Regression coefficients are context specific
• The assumption that we are testing “the” model
• β-weights vs. b-weights
• Replication
• Strength of relationship

Model comparison:
• We may sometimes wish to determine whether one model is a significantly better predictor than another (where different variables are used), e.g., which of two sets of predictors best predicts relapse?

Page 27: Multiple Regression Analysis: Wrap-Up


Strength of relationship: My test is sooo valid!

[Figure: "Example of 'Artificial' Correlation": scatterplot of outcome Y (0-10) against a predictor (0-10) with a fitted linear trend, R² = 0.4499.]