MGMT 162

Chapter 23: Bivariate Statistical Analysis: Measures of Association
© 2010 South-Western/Cengage Learning. All rights reserved. May not be scanned, copied or duplicated, or posted to a publicly accessible website, in whole or in part.
BUSINESS MARKET RESEARCH, Eighth Edition. Zikmund, Babin, Carr, Griffin.



Transcript of MGMT 162

  • LEARNING OUTCOMES. After studying this chapter, you should be able to:
    Apply and interpret simple bivariate correlations
    Interpret a correlation matrix
    Understand simple (bivariate) regression
    Understand the least-squares estimation technique
    Interpret regression output, including the tests of hypotheses tied to specific parameter coefficients

  • The Basics
    Measures of association: a number of bivariate statistical techniques used to measure the strength of a relationship between two variables.
    The chi-square (χ²) test provides information about whether two or more less-than-interval variables are interrelated.
    Correlation analysis is most appropriate for interval or ratio variables.
    Regression can accommodate either less-than-interval or interval independent variables, but the dependent variable must be continuous.
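
As a concrete illustration of the χ² test mentioned above, here is a minimal sketch using a hypothetical 2×2 cross-tabulation (the frequencies are invented for illustration, not taken from the chapter):

```python
# Hypothetical 2x2 cross-tabulation: two less-than-interval variables,
# e.g., gender (rows) by brand preference (columns).
observed = [[45, 55],
            [65, 35]]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Expected frequency under independence: (row total * column total) / n,
# then chi-square sums (observed - expected)^2 / expected over all cells.
chi_sq = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / n
        chi_sq += (obs - expected) ** 2 / expected

print(round(chi_sq, 3))
```

With (rows−1)(columns−1) = 1 degree of freedom, the resulting statistic would be compared against a χ² critical value to decide whether the two variables are interrelated.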

  • EXHIBIT 23.1 Bivariate Analysis: Common Procedures for Testing Association

  • Simple Correlation Coefficient (continued)
    Correlation coefficient: a statistical measure of the covariation, or association, between two at-least-interval variables.
    Covariance: extent to which two variables are associated systematically with each other.

  • Simple Correlation Coefficient
    Correlation coefficient (r) ranges from +1 to -1:
    Perfect positive linear relationship = +1
    Perfect negative (inverse) linear relationship = -1
    No correlation = 0
    Correlation coefficient for two variables (X, Y):
    r = Σ(Xᵢ - X̄)(Yᵢ - Ȳ) / √(Σ(Xᵢ - X̄)² · Σ(Yᵢ - Ȳ)²)
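
The standard Pearson formula, r = Σ(Xᵢ - X̄)(Yᵢ - Ȳ) / √(Σ(Xᵢ - X̄)² · Σ(Yᵢ - Ȳ)²), can be worked through directly. A minimal sketch using hypothetical data (the numbers are invented for illustration):

```python
import math

# Hypothetical data: hours of advertising (X) vs. weekly sales (Y).
x = [2, 4, 5, 7, 8]
y = [30, 45, 50, 65, 70]

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n

# Numerator: covariation of X and Y; denominator: product of the
# square roots of each variable's sum of squared deviations.
cov_xy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
ss_x = sum((xi - mean_x) ** 2 for xi in x)
ss_y = sum((yi - mean_y) ** 2 for yi in y)

r = cov_xy / math.sqrt(ss_x * ss_y)
print(round(r, 4))
```

A value this close to +1 indicates a nearly perfect positive linear relationship between the two variables.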

  • EXHIBIT 23.2 Scatter Diagram to Illustrate Correlation Patterns

  • Correlation, Covariance, and Causation
    When two variables covary, they display concomitant variation.
    This systematic covariation does not in and of itself establish causality.
    Example: roosters crow and the sun rises, but the rooster does not cause the sun to rise.

  • EXHIBIT 23.3 Correlation Analysis of Number of Hours Worked in Manufacturing Industries with Unemployment Rate

  • Coefficient of Determination
    Coefficient of determination (R²): a measure obtained by squaring the correlation coefficient; the proportion of the total variance of one variable that is accounted for by knowing the value of another variable.
    It measures the part of the total variance of Y that is accounted for by knowing the value of X.
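
Because R² is simply the square of r, it is always non-negative and discards the direction of the relationship. A minimal sketch with hypothetical correlation values:

```python
# Hypothetical correlation coefficients; squaring gives the coefficient
# of determination, so the sign of r does not matter for R-squared.
for r in (0.75, -0.75):
    r_squared = r ** 2
    print(r_squared)  # both iterations print 0.5625
```

Here either correlation would mean about 56% of the variance in Y is accounted for by knowing X.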

  • Correlation Matrix
    Correlation matrix: the standard form for reporting correlation coefficients for more than two variables.
    Statistical significance: the procedure for determining the statistical significance of a correlation coefficient is the t-test.
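
The standard t-test of a correlation coefficient uses t = r·√(n − 2) / √(1 − r²) with n − 2 degrees of freedom. A minimal sketch with a hypothetical sample (r and n are invented for illustration):

```python
import math

# Hypothetical: r = 0.60 observed in a sample of n = 27 salespeople.
r, n = 0.60, 27
df = n - 2

# t statistic for H0: the population correlation is zero.
t = r * math.sqrt(df) / math.sqrt(1 - r ** 2)
print(round(t, 3), df)
```

The computed t would then be compared with a critical t value at the chosen alpha level for 25 degrees of freedom (about 2.06 at alpha = .05, two-tailed), so a correlation of .60 in this sample would be judged statistically significant.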

  • EXHIBIT 23.4 Pearson Product-Moment Correlation Matrix for Salesperson Example

  • Regression Analysis
    Simple (bivariate) linear regression: a measure of linear association that investigates straight-line relationships between a continuous dependent variable and an independent variable that is usually continuous, but can be a categorical dummy variable.
    The regression equation: Y = α + βX
    Y = the continuous dependent variable
    X = the independent variable
    α = the Y intercept (where the regression line intercepts the Y axis)
    β = the slope coefficient (rise over run)

  • Regression Line and Slope
    [Scatter plot of Y against X (both axes spanning roughly 80-170) with the fitted regression line; figure not reproduced in transcript.]

  • The Regression Equation
    Parameter estimate choices:
    β is indicative of the strength and direction of the relationship between the independent and dependent variable.
    α (the Y intercept) is a fixed point that is considered a constant (how much Y can exist without X).
    Standardized regression coefficient (β): estimated coefficient of the strength of the relationship between the independent and dependent variables, expressed on a standardized scale where higher absolute values indicate stronger relationships (range is from -1 to +1).

  • The Regression Equation (cont'd)
    Parameter estimate choices:
    Raw regression estimates (b₁): raw regression weights have the advantage of retaining the scale metric, which is also their key disadvantage. If the purpose of the regression analysis is forecasting, then raw parameter estimates must be used; this is another way of saying the researcher is interested only in prediction.
    Standardized regression estimates (β): standardized regression estimates have the advantage of a constant scale. They should be used when the researcher is testing explanatory hypotheses.
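
The relationship between the two estimate choices can be sketched numerically: the standardized β is the raw slope b₁ rescaled by the ratio of the standard deviations of X and Y. A minimal sketch with hypothetical data:

```python
import statistics as stats

# Hypothetical data: building permits issued (X) and sales potential (Y).
x = [3, 6, 9, 12, 15]
y = [12, 20, 31, 38, 54]

n = len(x)
mean_x, mean_y = stats.mean(x), stats.mean(y)

# Raw (unstandardized) slope b1 via least squares; it carries Y's units per X unit.
b1 = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / \
     sum((xi - mean_x) ** 2 for xi in x)

# Standardized beta rescales b1 by the ratio of standard deviations,
# putting it on the -1..+1 scale regardless of measurement units.
beta = b1 * stats.stdev(x) / stats.stdev(y)
print(round(b1, 3), round(beta, 3))
```

In simple regression the standardized β equals the correlation coefficient r, which is why it is the natural choice for testing explanatory hypotheses, while b₁ keeps the units needed for forecasting.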

  • EXHIBIT 23.5 The Advantage of Standardized Regression Weights

  • EXHIBIT 23.6 Relationship of Sales Potential to Building Permits Issued

  • EXHIBIT 23.7 The Best Fit Line, or Knocking Out the Pins

  • Ordinary Least-Squares (OLS) Method of Regression Analysis
    OLS guarantees that the resulting straight line will produce the least possible total error in using X to predict Y.
    It generates a straight line that minimizes the sum of squared deviations of the actual values from the predicted regression line.
    No straight line can completely represent every dot in the scatter diagram; there will be a discrepancy between most of the actual scores (each dot) and the predicted score (Ŷ).
    OLS uses the criterion of making the least amount of total error in predicting Y from X.


  • Ordinary Least-Squares Method of Regression Analysis (OLS) (cont'd)
    Ŷᵢ = a + bXᵢ + eᵢ
    The equation means that the predicted value for any value of X (Xᵢ) is determined as a function of the estimated slope coefficient (b) times Xᵢ, plus the estimated intercept coefficient (a), plus some error (eᵢ).
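
The least-squares estimates themselves have closed forms: b = Σ(Xᵢ - X̄)(Yᵢ - Ȳ) / Σ(Xᵢ - X̄)² and a = Ȳ - bX̄. A minimal sketch computing them on hypothetical data:

```python
# Hypothetical observations of X (e.g., building permits) and Y (sales).
x = [2, 3, 5, 7, 8]
y = [10, 12, 20, 25, 33]

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n

# Least-squares estimates: the slope b minimizes the sum of squared
# residuals, and the fitted line always passes through (mean_x, mean_y).
b = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / \
    sum((xi - mean_x) ** 2 for xi in x)
a = mean_y - b * mean_x

# Predicted scores and residuals; with an intercept, OLS residuals sum to zero.
y_hat = [a + b * xi for xi in x]
residuals = [yi - yh for yi, yh in zip(y, y_hat)]
print(round(a, 3), round(b, 3))
```

Each residual is the discrepancy between an actual dot in the scatter diagram and the predicted score Ŷᵢ on the fitted line.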


  • Ordinary Least-Squares Method of Regression Analysis (OLS) (cont'd)
    Statistical significance of the regression model:
    The F-test (regression) determines whether more variability is explained by the regression than is left unexplained by it.

  • Ordinary Least-Squares Method of Regression Analysis (OLS) (cont'd)
    Statistical significance of the regression model: ANOVA table (general form for simple regression):
    Source        df       Sum of Squares    Mean Square         F
    Regression    1        SSR               MSR = SSR/1         MSR/MSE
    Residual      n - 2    SSE               MSE = SSE/(n - 2)
    Total         n - 1    SST
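
The ANOVA decomposition splits the total sum of squares into an explained (regression) part and an unexplained (residual) part, and F is the ratio of their mean squares. A minimal sketch with hypothetical data:

```python
# Hypothetical data; F compares explained to unexplained mean squares.
x = [2, 3, 5, 7, 8]
y = [10, 12, 20, 25, 33]
n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n

b = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / \
    sum((xi - mean_x) ** 2 for xi in x)
a = mean_y - b * mean_x

# SST = SSR + SSE: total variation split into explained and unexplained parts.
ss_total = sum((yi - mean_y) ** 2 for yi in y)
ss_regression = sum((a + b * xi - mean_y) ** 2 for xi in x)
ss_residual = ss_total - ss_regression

# ANOVA table quantities for simple regression: df = 1 and n - 2.
ms_regression = ss_regression / 1
ms_residual = ss_residual / (n - 2)
f_stat = ms_regression / ms_residual
print(round(f_stat, 2))
```

A large F (relative to the critical F value with 1 and n − 2 degrees of freedom) indicates the regression explains significantly more variability than it leaves unexplained.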

  • Ordinary Least-Squares Method of Regression Analysis (OLS) (cont'd)
    R²: the proportion of variance in Y that is explained by X (or vice versa); a measure obtained by squaring the correlation coefficient, i.e., the proportion of the total variance of a variable that is accounted for by knowing the value of another variable.

  • EXHIBIT 23.8 Simple Regression Results for Building Permit Example

  • EXHIBIT 23.9 OLS Regression Line

  • Simple Regression and Hypothesis Testing
    The explanatory power of regression lies in hypothesis testing; regression is often used to test relational hypotheses.
    The outcome of the hypothesis test involves two conditions that must both be satisfied:
    1. The regression weight must be in the hypothesized direction: positive relationships require a positive coefficient and negative relationships require a negative coefficient.
    2. The t-test associated with the regression weight must be significant.
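
The two conditions above can be checked directly from the regression output: the sign of b, and t = b / se(b) with n − 2 degrees of freedom. A minimal sketch with hypothetical data testing a hypothesized positive relationship:

```python
import math

# Hypothetical data testing H1: X is positively related to Y.
x = [1, 2, 4, 5, 7, 9]
y = [5, 8, 11, 13, 19, 24]
n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n

ss_x = sum((xi - mean_x) ** 2 for xi in x)
b = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / ss_x
a = mean_y - b * mean_x

# Standard error of b from the residual variance, then t = b / se_b
# with n - 2 degrees of freedom.
ss_residual = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
se_b = math.sqrt(ss_residual / (n - 2) / ss_x)
t = b / se_b

# Condition 1: the coefficient sign matches the hypothesized (positive) direction.
# Condition 2: |t| must exceed the critical value (about 2.776 at alpha = .05, 4 df).
print(b > 0, round(t, 2))
```

Here both conditions hold: the weight is positive as hypothesized, and t far exceeds the critical value, so the relational hypothesis would be supported.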

  • EXHIBIT 23A.1 Least-Squares Computation