Lecture16 Module3 Anova 1


Transcript of Lecture16 Module3 Anova 1


Design of Experiments - II

MODULE III

LECTURE - 16

MODELS

Dr. Shalabh
Department of Mathematics and Statistics
Indian Institute of Technology Kanpur


    Two-way classification with interactions

Consider the two-way classification with an equal number of observations per cell, say $K$. Let $y_{ijk}$ denote the $k$th observation in the $(i, j)$th cell, i.e., receiving the $i$th level of factor A and the $j$th level of factor B,

$$i = 1, 2, \ldots, I; \quad j = 1, 2, \ldots, J; \quad k = 1, 2, \ldots, K,$$

and let the observations be independently drawn from $N(\mu_{ij}, \sigma^2)$, so that the linear model under consideration is

$$y_{ijk} = \mu_{ij} + \varepsilon_{ijk},$$

where the $\varepsilon_{ijk}$ are identically and independently distributed following $N(0, \sigma^2)$. Thus

$$E(y_{ijk}) = \mu_{ij} = \mu_{oo} + (\mu_{io} - \mu_{oo}) + (\mu_{oj} - \mu_{oo}) + (\mu_{ij} - \mu_{io} - \mu_{oj} + \mu_{oo}) = \mu + \alpha_i + \beta_j + \gamma_{ij},$$

where a subscript $o$ denotes averaging over the corresponding index, e.g., $\mu_{io} = \frac{1}{J}\sum_{j=1}^{J} \mu_{ij}$, $\mu_{oj} = \frac{1}{I}\sum_{i=1}^{I} \mu_{ij}$ and $\mu_{oo} = \frac{1}{IJ}\sum_{i=1}^{I}\sum_{j=1}^{J} \mu_{ij}$, with

$$\mu = \mu_{oo}, \quad \alpha_i = \mu_{io} - \mu_{oo}, \quad \beta_j = \mu_{oj} - \mu_{oo}, \quad \gamma_{ij} = \mu_{ij} - \mu_{io} - \mu_{oj} + \mu_{oo},$$

$$\sum_{i=1}^{I} \alpha_i = 0, \quad \sum_{j=1}^{J} \beta_j = 0, \quad \sum_{i=1}^{I} \gamma_{ij} = 0, \quad \sum_{j=1}^{J} \gamma_{ij} = 0.$$

Assume that the design matrix $X$ is of full rank so that all the parametric functions of $\mu_{ij}$ are estimable.
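As a side illustration (not part of the slides), the following Python sketch applies this decomposition to a small, arbitrarily chosen table of cell means and checks the zero-sum constraints; the array `mu_cells` and its dimensions are hypothetical.

```python
import numpy as np

# Hypothetical 3 x 2 table of cell means mu_ij (I = 3 levels of A, J = 2 levels of B).
mu_cells = np.array([[10.0, 12.0],
                     [11.0, 15.0],
                     [ 9.0, 13.0]])

mu_io = mu_cells.mean(axis=1)          # row means    mu_{io}
mu_oj = mu_cells.mean(axis=0)          # column means mu_{oj}
mu_oo = mu_cells.mean()                # grand mean   mu_{oo}

mu    = mu_oo
alpha = mu_io - mu_oo                  # main effects of factor A
beta  = mu_oj - mu_oo                  # main effects of factor B
gamma = mu_cells - mu_io[:, None] - mu_oj[None, :] + mu_oo   # interaction effects

# Zero-sum constraints and the reconstruction mu_ij = mu + alpha_i + beta_j + gamma_ij
assert np.isclose(alpha.sum(), 0) and np.isclose(beta.sum(), 0)
assert np.allclose(gamma.sum(axis=0), 0) and np.allclose(gamma.sum(axis=1), 0)
assert np.allclose(mu + alpha[:, None] + beta[None, :] + gamma, mu_cells)
print("decomposition verified")
```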


The null hypotheses are

$$H_{0\alpha}: \alpha_1 = \alpha_2 = \cdots = \alpha_I = 0,$$

$$H_{0\beta}: \beta_1 = \beta_2 = \cdots = \beta_J = 0,$$

$$H_{0\gamma}: \text{all } \gamma_{ij} = 0 \ \text{for all } i, j.$$

The corresponding alternative hypotheses are

$$H_{1\alpha}: \text{at least one } \alpha_i \neq \alpha_j \ \text{for } i \neq j,$$

$$H_{1\beta}: \text{at least one } \beta_i \neq \beta_j \ \text{for } i \neq j,$$

$$H_{1\gamma}: \text{at least one } \gamma_{ij} \neq \gamma_{ik} \ \text{for } j \neq k.$$

Minimizing the error sum of squares

$$E = \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \mu - \alpha_i - \beta_j - \gamma_{ij})^2,$$

the normal equations are obtained as

$$\frac{\partial E}{\partial \mu} = 0, \qquad \frac{\partial E}{\partial \alpha_i} = 0 \ \text{for all } i, \qquad \frac{\partial E}{\partial \beta_j} = 0 \ \text{for all } j, \qquad \frac{\partial E}{\partial \gamma_{ij}} = 0 \ \text{for all } i \text{ and } j.$$


    The least squares estimates are obtained as

$$\hat{\mu} = \bar{y}_{ooo} = \frac{1}{IJK} \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} y_{ijk},$$

$$\hat{\alpha}_i = \bar{y}_{ioo} - \bar{y}_{ooo} = \frac{1}{JK} \sum_{j=1}^{J} \sum_{k=1}^{K} y_{ijk} - \bar{y}_{ooo},$$

$$\hat{\beta}_j = \bar{y}_{ojo} - \bar{y}_{ooo} = \frac{1}{IK} \sum_{i=1}^{I} \sum_{k=1}^{K} y_{ijk} - \bar{y}_{ooo},$$

$$\hat{\gamma}_{ij} = \bar{y}_{ijo} - \bar{y}_{ioo} - \bar{y}_{ojo} + \bar{y}_{ooo}, \quad \text{where } \bar{y}_{ijo} = \frac{1}{K} \sum_{k=1}^{K} y_{ijk}.$$

The error sum of squares is

$$SSE = \min_{\mu, \alpha_i, \beta_j, \gamma_{ij}} \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \mu - \alpha_i - \beta_j - \gamma_{ij})^2 = \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \bar{y}_{ijo})^2$$

with

$$\frac{SSE}{\sigma^2} \sim \chi^2\big(IJ(K-1)\big).$$
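These estimators are straightforward to compute. Below is a minimal Python sketch (not from the slides), assuming the observations sit in a NumPy array `y` of shape (I, J, K) indexed as y[i, j, k]; the simulated data and dimensions are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, K, sigma = 3, 4, 5, 1.0
y = 10 + rng.normal(scale=sigma, size=(I, J, K))   # placeholder data, shape (I, J, K)

y_ooo = y.mean()                        # grand mean        \bar{y}_{ooo}
y_ioo = y.mean(axis=(1, 2))             # factor-A means    \bar{y}_{ioo}
y_ojo = y.mean(axis=(0, 2))             # factor-B means    \bar{y}_{ojo}
y_ijo = y.mean(axis=2)                  # cell means        \bar{y}_{ijo}

mu_hat    = y_ooo
alpha_hat = y_ioo - y_ooo
beta_hat  = y_ojo - y_ooo
gamma_hat = y_ijo - y_ioo[:, None] - y_ojo[None, :] + y_ooo

# SSE is the residual sum of squares about the fitted cell means \bar{y}_{ijo}
SSE = ((y - y_ijo[:, :, None]) ** 2).sum()
print(f"SSE = {SSE:.3f} on IJ(K-1) = {I * J * (K - 1)} degrees of freedom")
```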


Now minimizing the error sum of squares under $H_{0\alpha}: \alpha_1 = \alpha_2 = \cdots = \alpha_I = 0$, i.e., minimizing

$$E_1 = \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \mu - \beta_j - \gamma_{ij})^2$$

with respect to $\mu$, $\beta_j$ and $\gamma_{ij}$ and solving the normal equations

$$\frac{\partial E_1}{\partial \mu} = 0, \qquad \frac{\partial E_1}{\partial \beta_j} = 0 \ \text{for all } j, \qquad \frac{\partial E_1}{\partial \gamma_{ij}} = 0 \ \text{for all } i \text{ and } j$$

yields the least squares estimators

$$\hat{\mu} = \bar{y}_{ooo}, \qquad \hat{\beta}_j = \bar{y}_{ojo} - \bar{y}_{ooo}, \qquad \hat{\gamma}_{ij} = \bar{y}_{ijo} - \bar{y}_{ioo} - \bar{y}_{ojo} + \bar{y}_{ooo}.$$

The sum of squares due to $H_{0\alpha}$ is

$$\min_{\mu, \beta_j, \gamma_{ij}} \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \mu - \beta_j - \gamma_{ij})^2 = \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \hat{\mu} - \hat{\beta}_j - \hat{\gamma}_{ij})^2 = SSE + JK \sum_{i=1}^{I} (\bar{y}_{ioo} - \bar{y}_{ooo})^2.$$


Thus the sum of squares due to deviation from $H_{0\alpha}$, or the sum of squares due to effect A, is

$$SSA = \text{Sum of squares due to } H_{0\alpha} - SSE = JK \sum_{i=1}^{I} (\bar{y}_{ioo} - \bar{y}_{ooo})^2$$

with

$$\frac{SSA}{\sigma^2} \sim \chi^2(I - 1).$$
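Continuing the same hypothetical sketch (re-using `y` and the means computed above), the identity just derived can be checked by refitting the model without the $\alpha_i$ terms:

```python
# Reduced model under H_0alpha: fitted values use only mu, beta_j and gamma_ij.
fitted_reduced = (mu_hat
                  + beta_hat[None, :, None]
                  + gamma_hat[:, :, None])          # shape (I, J, K) by broadcasting
SS_H0alpha = ((y - fitted_reduced) ** 2).sum()

SSA = J * K * ((y_ioo - y_ooo) ** 2).sum()
assert np.isclose(SS_H0alpha, SSE + SSA)            # matches SSE + JK * sum_i (ybar_ioo - ybar_ooo)^2
print(f"SSA = {SSA:.3f} on I - 1 = {I - 1} degrees of freedom")
```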

Minimizing the error sum of squares under $H_{0\beta}: \beta_1 = \beta_2 = \cdots = \beta_J = 0$, i.e., minimizing

$$E_2 = \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \mu - \alpha_i - \gamma_{ij})^2$$

and solving the normal equations

$$\frac{\partial E_2}{\partial \mu} = 0, \qquad \frac{\partial E_2}{\partial \alpha_i} = 0 \ \text{for all } i, \qquad \frac{\partial E_2}{\partial \gamma_{ij}} = 0 \ \text{for all } i \text{ and } j$$

yields the least squares estimators

$$\hat{\mu} = \bar{y}_{ooo}, \qquad \hat{\alpha}_i = \bar{y}_{ioo} - \bar{y}_{ooo}, \qquad \hat{\gamma}_{ij} = \bar{y}_{ijo} - \bar{y}_{ioo} - \bar{y}_{ojo} + \bar{y}_{ooo}.$$

The minimum error sum of squares is

$$\sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \hat{\mu} - \hat{\alpha}_i - \hat{\gamma}_{ij})^2 = SSE + IK \sum_{j=1}^{J} (\bar{y}_{ojo} - \bar{y}_{ooo})^2$$


and the sum of squares due to deviation from $H_{0\beta}$, or the sum of squares due to effect B, is

$$SSB = \text{Sum of squares due to } H_{0\beta} - SSE = IK \sum_{j=1}^{J} (\bar{y}_{ojo} - \bar{y}_{ooo})^2$$

with

$$\frac{SSB}{\sigma^2} \sim \chi^2(J - 1).$$

Next, minimizing the error sum of squares under $H_{0\gamma}: \text{all } \gamma_{ij} = 0$ for all $i, j$, i.e., minimizing

$$E_3 = \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \mu - \alpha_i - \beta_j)^2$$

with respect to $\mu$, $\alpha_i$ and $\beta_j$ and solving the normal equations

$$\frac{\partial E_3}{\partial \mu} = 0, \qquad \frac{\partial E_3}{\partial \alpha_i} = 0 \ \text{for all } i, \qquad \frac{\partial E_3}{\partial \beta_j} = 0 \ \text{for all } j$$

yields the least squares estimators

$$\hat{\mu} = \bar{y}_{ooo}, \qquad \hat{\alpha}_i = \bar{y}_{ioo} - \bar{y}_{ooo}, \qquad \hat{\beta}_j = \bar{y}_{ojo} - \bar{y}_{ooo}.$$


The sum of squares due to $H_{0\gamma}$ is

$$\min_{\mu, \alpha_i, \beta_j} \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \mu - \alpha_i - \beta_j)^2 = \sum_{i=1}^{I} \sum_{j=1}^{J} \sum_{k=1}^{K} (y_{ijk} - \hat{\mu} - \hat{\alpha}_i - \hat{\beta}_j)^2 = SSE + K \sum_{i=1}^{I} \sum_{j=1}^{J} (\bar{y}_{ijo} - \bar{y}_{ioo} - \bar{y}_{ojo} + \bar{y}_{ooo})^2.$$

Thus the sum of squares due to deviation from $H_{0\gamma}$, or the sum of squares due to the interaction effect AB, is

$$SSAB = \text{Sum of squares due to } H_{0\gamma} - SSE = K \sum_{i=1}^{I} \sum_{j=1}^{J} (\bar{y}_{ijo} - \bar{y}_{ioo} - \bar{y}_{ojo} + \bar{y}_{ooo})^2$$

with

$$\frac{SSAB}{\sigma^2} \sim \chi^2\big((I-1)(J-1)\big).$$

The total sum of squares can be partitioned as

$$TSS = SSA + SSB + SSAB + SSE,$$

where SSA, SSB, SSAB and SSE are mutually orthogonal.
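Continuing the same hypothetical numerical sketch, this partition can be verified directly:

```python
# All sums of squares from the formulas above (same array y as before).
SSB  = I * K * ((y_ojo - y_ooo) ** 2).sum()
SSAB = K * ((y_ijo - y_ioo[:, None] - y_ojo[None, :] + y_ooo) ** 2).sum()
TSS  = ((y - y_ooo) ** 2).sum()

assert np.isclose(TSS, SSA + SSB + SSAB + SSE)       # TSS = SSA + SSB + SSAB + SSE
```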

So, either using the independence of SSA, SSB, SSAB and SSE as well as their respective distributions, or using the likelihood ratio test approach, the decision rules for the null hypotheses at the $\alpha$ level of significance are based on F-statistics as follows:


$$F_1 = \frac{IJ(K-1)}{I-1} \cdot \frac{SSA}{SSE} \sim F\big(I-1,\, IJ(K-1)\big) \ \text{under } H_{0\alpha},$$

$$F_2 = \frac{IJ(K-1)}{J-1} \cdot \frac{SSB}{SSE} \sim F\big(J-1,\, IJ(K-1)\big) \ \text{under } H_{0\beta},$$

and

$$F_3 = \frac{IJ(K-1)}{(I-1)(J-1)} \cdot \frac{SSAB}{SSE} \sim F\big((I-1)(J-1),\, IJ(K-1)\big) \ \text{under } H_{0\gamma}.$$

So

Reject $H_{0\alpha}$ if $F_1 > F_{1-\alpha}\big[(I-1),\, IJ(K-1)\big]$,

Reject $H_{0\beta}$ if $F_2 > F_{1-\alpha}\big[(J-1),\, IJ(K-1)\big]$,

Reject $H_{0\gamma}$ if $F_3 > F_{1-\alpha}\big[(I-1)(J-1),\, IJ(K-1)\big]$.

If $H_{0\alpha}$ or $H_{0\beta}$ is rejected, one can use a t-test or a multiple comparison test to find which pairs of $\alpha_i$'s or $\beta_j$'s are significantly different. If $H_{0\gamma}$ is rejected, one would not usually explore it further, but theoretically a t-test or multiple comparison tests can be used.
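As a sketch of these decision rules in code (same hypothetical example as before; `scipy.stats.f.ppf` provides the F quantiles, and the significance level is called `level` to avoid clashing with the effects $\alpha_i$):

```python
from scipy import stats

df_error = I * J * (K - 1)
level = 0.05                                        # chosen significance level (illustrative)

F1 = (SSA / (I - 1)) / (SSE / df_error)
F2 = (SSB / (J - 1)) / (SSE / df_error)
F3 = (SSAB / ((I - 1) * (J - 1))) / (SSE / df_error)

for name, F, df1 in [("A", F1, I - 1), ("B", F2, J - 1), ("AB", F3, (I - 1) * (J - 1))]:
    crit = stats.f.ppf(1 - level, df1, df_error)    # F_{1-alpha}[df1, IJ(K-1)]
    print(f"effect {name}: F = {F:.3f}, critical value = {crit:.3f}, reject = {F > crit}")
```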


It can also be shown that

$$E\left(\frac{SSA}{I-1}\right) = \sigma^2 + \frac{JK}{I-1} \sum_{i=1}^{I} \alpha_i^2,$$

$$E\left(\frac{SSB}{J-1}\right) = \sigma^2 + \frac{IK}{J-1} \sum_{j=1}^{J} \beta_j^2,$$

$$E\left(\frac{SSAB}{(I-1)(J-1)}\right) = \sigma^2 + \frac{K}{(I-1)(J-1)} \sum_{i=1}^{I} \sum_{j=1}^{J} \gamma_{ij}^2,$$

$$E\left(\frac{SSE}{IJ(K-1)}\right) = \sigma^2.$$
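These expectations can be illustrated by simulation. The following self-contained sketch (with arbitrarily chosen effect sizes, not taken from the slides) averages the mean squares over repeated samples and compares them with the formulas:

```python
import numpy as np

rng = np.random.default_rng(1)
I, J, K, sigma = 3, 4, 5, 1.0
alpha = np.array([1.0, -0.5, -0.5])                       # sums to zero
beta  = np.array([0.5, 0.5, -0.5, -0.5])                  # sums to zero
gamma = np.zeros((I, J))
gamma[0, 0], gamma[0, 1] = 0.3, -0.3                      # rows and columns of gamma
gamma[1, 0], gamma[1, 1] = -0.3, 0.3                      # still sum to zero

cell_mean = 10 + alpha[:, None] + beta[None, :] + gamma   # mu + alpha_i + beta_j + gamma_ij
msa = msb = msab = mse = 0.0
n_rep = 2000
for _ in range(n_rep):
    y = cell_mean[:, :, None] + rng.normal(scale=sigma, size=(I, J, K))
    y_ooo, y_ioo = y.mean(), y.mean(axis=(1, 2))
    y_ojo, y_ijo = y.mean(axis=(0, 2)), y.mean(axis=2)
    ssa  = J * K * ((y_ioo - y_ooo) ** 2).sum()
    ssb  = I * K * ((y_ojo - y_ooo) ** 2).sum()
    ssab = K * ((y_ijo - y_ioo[:, None] - y_ojo[None, :] + y_ooo) ** 2).sum()
    sse  = ((y - y_ijo[:, :, None]) ** 2).sum()
    msa  += ssa / (I - 1) / n_rep
    msb  += ssb / (J - 1) / n_rep
    msab += ssab / ((I - 1) * (J - 1)) / n_rep
    mse  += sse / (I * J * (K - 1)) / n_rep

print("E(MSA):  simulated", round(msa, 3),  "formula", sigma**2 + J * K * (alpha**2).sum() / (I - 1))
print("E(MSB):  simulated", round(msb, 3),  "formula", sigma**2 + I * K * (beta**2).sum() / (J - 1))
print("E(MSAB): simulated", round(msab, 3), "formula", sigma**2 + K * (gamma**2).sum() / ((I - 1) * (J - 1)))
print("E(MSE):  simulated", round(mse, 3),  "formula", sigma**2)
```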

    The analysis of variance table is as follows:

Source of       Degrees of       Sum of      Mean squares                      F
variation       freedom          squares

Factor A        (I - 1)          SSA         MSA = SSA / (I - 1)               F_1 = MSA / MSE
Factor B        (J - 1)          SSB         MSB = SSB / (J - 1)               F_2 = MSB / MSE
Interaction AB  (I - 1)(J - 1)   SSAB        MSAB = SSAB / [(I - 1)(J - 1)]    F_3 = MSAB / MSE
Error           IJ(K - 1)        SSE         MSE = SSE / [IJ(K - 1)]
Total           (IJK - 1)        TSS
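Using the quantities from the earlier hypothetical sketches (SSA, SSB, SSAB, SSE, TSS and the dimensions I, J, K), the same table can be assembled in code; the layout simply mirrors the table above:

```python
df_A, df_B, df_AB, df_E = I - 1, J - 1, (I - 1) * (J - 1), I * J * (K - 1)
MSA, MSB, MSAB, MSE = SSA / df_A, SSB / df_B, SSAB / df_AB, SSE / df_E

rows = [("Factor A",       df_A,          SSA,  MSA,  MSA / MSE),
        ("Factor B",       df_B,          SSB,  MSB,  MSB / MSE),
        ("Interaction AB", df_AB,         SSAB, MSAB, MSAB / MSE),
        ("Error",          df_E,          SSE,  MSE,  None),
        ("Total",          I * J * K - 1, TSS,  None, None)]

# Print a plain-text ANOVA table: source, df, SS, MS, F.
print(f"{'Source':<16}{'df':>6}{'SS':>12}{'MS':>12}{'F':>10}")
for name, df, ss, ms, F in rows:
    ms_str = f"{ms:12.3f}" if ms is not None else " " * 12
    f_str  = f"{F:10.3f}"  if F  is not None else " " * 10
    print(f"{name:<16}{df:>6}{ss:12.3f}{ms_str}{f_str}")
```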