Hull, Darrell M., Rebecca J. Glover, and Judy A. Bolen. 2012 ...



Individual Differences in Technological Proficiency

Project Findings

A White Paper Prepared by Darrell M. Hull, Rebecca J. Glover, and Judy A. Bolen

Department of Educational Psychology University of North Texas

Special thanks to A. Alexander Beaujean of the Baylor Psychometric Laboratory for his consultation on the analysis of the results. Liesel Ritchie, for her leadership in directing project DECA, and the Research Review committee (Elaine Craft, Dennis Faber, Stephen Jurs, Frances Lawrenz, Michael Martinez, Nick Smith, and Vanessa Smith-Morest) for their helpful comments and reviews of the study. This material is based upon work supported by the National Science Foundation under Grant No. NSF/DUE 0702981. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.


Technological proficiency: Individual differences in two-year college graduates

Introduction

Sandra Scarr (1992), a developmental psychologist, conceptualized human development as individuals finding out who they are and, concurrently, becoming more uniquely themselves. Rather than being passive agents reacting to the environment, according to Scarr, each person is endowed with a unique set of abilities, penchants, and personality characteristics that interact with the environment to produce life outcomes that are more or less successful. From Scarr's framework, people self-select into a career path (technical or otherwise) for various reasons, primarily as a result of individual cognitive ability and personality characteristics.

Given the proclivity of individuals to select life paths based, at least in part, on their skills and penchants (sometimes consciously known to them and sometimes not), the DECA project titled Individual Differences in Technological Proficiency attempted to measure a meaningful constellation of cognitive ability and personality characteristics in two-year college students. The students selected for analysis were graduating from both technological and non-technological programs of study. The goal of the research was to determine whether technological and non-technological students could be distinguished by differences in their abilities and penchants, since these groups have already selected life paths with different orientations despite similarities in their environmental context. The purpose of the present study was to determine how these self-selected groups might vary and the extent of such variation. If the differences are substantial, the implication is that technological education at a two-year college is not equally attractive or attainable for all two-year college students, to the degree that students vary in these cognitive and personality characteristics. Such distinctions in highly stable characteristics might serve as markers or indicators of students for two important purposes: first, some students possessing these characteristics might have talent that could be nurtured by technological education; second, students in technological education programs who do not possess such characteristics might be supported differently in their educational pursuits, to help them retain interest or comprehend the subject matter.

The project was conceived as a study of individual differences. Perhaps the observation we most notice about each other is that we differ from one another, sometimes in subtle ways but also in ways that make us very distinct. This simple observation is known to psychologists as individual differences and has been the subject of considerable research. Individual differences are important because people perceive things differently and, consequently, behave differently. Some of these differences bear directly on how people go about work, how they select their careers, and what they find interesting in school. People with different attitudes respond differently to direction, and different personalities interact differently with bosses, coworkers, subordinates, and customers. Consequently, one worker or student might learn a task more quickly or effectively than another and is more likely to be attracted to tasks that suit him or her.

Much of the work of psychologists in individual differences research has been to categorize the infinite number of ways we differ into a limited set of taxonomic categories, often referred to as unobservable or latent constructs. These categories generally fall into two broad domains: personality and cognitive ability. There is also a conative (motivational) domain, which we did not address in this study because motivational effects can be highly situation-dependent, reflect transient states, and consequently vary with life events beyond our interest. Much of who we are as individuals, and the ways we distinguish ourselves from one another, is reflected in our personality and cognitive ability traits. Traits are relatively enduring characteristics that are to a significant degree, but not entirely, determined by our genetic makeup. One reason traits are used in individual differences research is that they tend to remain stable over time.

Educational psychologists have extended knowledge of individual differences to groups of individuals (for example, males vs. females, or different ethnic groups) in order to understand how these groups retain distinct differences after averaging tests of ability across all members of a group. It is important to point out that simply because groups differ from one another on some trait, an individual within a group may or may not share that same distinct difference or characteristic. For example, if we gave a test of football knowledge to a group of males and a group of females, we might expect to observe significant differences in the groups' average scores, but that does not mean every female in the female group scored lower on the football knowledge test than every male in the male group. However, the probability that someone within a group shares the group's distinction is greater than for someone who is not a member of that group.

In addition to the trait-based differences humans possess, a related field of psychological theorizing addresses how these differences manifest themselves over time and, at the same time, how the environments in which we find ourselves, or, more aptly stated, choose for ourselves, tend to reinforce or cancel out these differences. This is the second part of the present study and provides the rationale for identifying just two groups of individuals, technological and non-technological students, graduating from the same community colleges. Later we theorize that these two groups, who have themselves selected different work environments, are distinguished, on the whole, by some basic differences in their psychological traits.


Cognitive Ability

The most widely accepted organizational structure of cognitive abilities is a hierarchical framework known as the Cattell-Horn-Carroll (CHC) theory, depicted in Figure 1. It represents the integrated works of Raymond Cattell, John Horn, and John Carroll (Alfonso, Flanagan, & Radwan, 2005; Flanagan, McGrew, & Ortiz, 2000; McGrew, 2005; 2009; Neisser et al., 1996), and thus has the most comprehensive empirical support of any model of cognitive and academic ability.

Figure 1. Schematic representation and comparisons of Carroll's Three-Stratum, Cattell–Horn's Extended Gf–Gc, and the integrated Cattell–Horn–Carroll models of human cognitive abilities (McGrew, 2009, p. 4)

At the highest level of the hierarchy, general cognitive ability, or g as it was called to distinguish it from various meanings of the word intelligence (Spearman, 1904), is composed of individual differences in diverse cognitive abilities such as verbal ability, spatial ability, memory, and processing speed. The second-stratum abilities correlate about 0.30 on average and yield a general factor (an unrotated first principal component) that accounts for approximately 40% of the total variance, as indicated in a meta-analysis of more than 300 studies (Carroll, 1993; Jensen, 1998). Research has consistently shown that g is one of the most reliable, valid, and stable behavioral traits (Neisser et al., 1996), and that it predicts educational outcomes and level of occupation far more accurately than any other trait (Deary, Whiteman, Starr, Whalley, & Fox, 2004; Gottfredson, 1997; Schmidt & Hunter, 1998; Bouchard & McGue, 2003).
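The relationship between an average inter-test correlation of about 0.30 and a general factor accounting for roughly 40% of the total variance can be illustrated with a small sketch. The 10-test correlation matrix below is hypothetical (a uniform 0.30 off-diagonal), not the study's data:

```python
import numpy as np

# Hypothetical battery of 10 ability tests whose pairwise
# correlations all equal the reported average of 0.30.
k, r = 10, 0.30
R = np.full((k, k), r)
np.fill_diagonal(R, 1.0)

# The unrotated first principal component is the leading eigenvector
# of the correlation matrix; its eigenvalue divided by the number of
# tests gives the share of total variance attributed to g.
eigvals = np.linalg.eigvalsh(R)  # eigenvalues in ascending order
g_share = eigvals[-1] / k
print(f"variance accounted for by the general factor: {g_share:.0%}")  # 37%
```

With this idealized uniform matrix the leading eigenvalue is exactly 1 + (k - 1) x 0.30 = 3.7, so the first component accounts for 37% of the variance, close to the approximately 40% reported in the meta-analytic literature.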

Moreover, the genetic influence on general cognitive ability increases significantly from childhood to adolescence to young adulthood. In a longitudinal sample of 11,000 twins from four countries, genetically driven differences increasingly accounted for differences in general cognitive ability during the school years: 42% in childhood (age 9), 54% in adolescence (age 12), and 68% in young adulthood (age 17) (Haworth et al., 2009). The implication from this and other behavioral genetic studies (see, for example, Bouchard & McGue, 1981; Deary, Spinath, & Bates, 2006; Plomin & Spinath, 2004; Plomin, DeFries, McClearn, & McGuffin, 2008) is that as students move from childhood through adolescence and into young adulthood, they select, modify, and create their own experiences increasingly based on propensities that are determined in part by the genetically derived portion of their cognitive ability. We can therefore expect that students in college programs, most of whom are well into or beyond young adulthood, will frequently possess clear genetic proclivities to be drawn to, and to successfully complete, the program of study in which they have elected to enroll.

The Importance of Visual Processing (Gv) and Fluid Reasoning (Gf)

There is good reason to believe that the second-stratum factor of visual processing ability might identify students completing technological education programs. More than 50 years ago, a National Science Foundation advisory panel published a report entitled Scientific Careers and Vocational Development (Super & Bachrach, 1957) that characterized the personal attributes of scientists and engineers for the purposes of better identifying human capital and, ultimately, uncovering ways to nurture scientific and technical potential (Wai, Lubinski, & Benbow, 2009). The report emphasized the critical role of spatial ability, the "ability to generate, retain, retrieve, and transform well-structured visual images" (Lohman, 1994a, p. 1000), identified as relevant for learning the advanced scientific-technical material needed to develop outstanding STEM contributors. The report also stressed attributes beyond spatial ability, mathematics in particular, as well as attributes such as persistence. Since then, however, few projects or resources have been devoted specifically to the identification or development of spatial abilities in STEM education, particularly in technician education.


Today, despite abundant evidence of its educational-occupational significance (Gohm, Humphreys, & Yao, 1998; Humphreys, Lubinski, & Yao, 1993; Lohman, 1988; 1994a; 1994b; Smith, 1964; Snow, 1999), spatial ability figures no more prominently in identification/selection, guidance counseling, curriculum, and instruction than it did 50 years ago, even in STEM subject areas. Indeed, basic science indicates that students across the ability range could profit from spatial ability assessments and from educational opportunities aimed at developing spatial ability (Humphreys & Lubinski, 1996; Lohman, 2005), while millions of dollars are spent annually assessing prospective students on only the mathematical and verbal domains. Spatial ability has been shown to possess incremental validity over both Scholastic Aptitude Test (SAT) scales and comprehensive educational-occupational preference questionnaires across a 5-year interval, when predicting favorite high school course, leisure activities relevant to science, technology, engineering, and mathematics (STEM), college major, and intended occupation (Webb, Lubinski, & Benbow, 2007). Overall, spatial ability accounted for an additional 3% of the variance in predicting all of these criteria beyond not only the SAT math and verbal assessments but also two comprehensive educational-vocational preference questionnaires combined. Moreover, this predictive power is evident in spatial ability measured as early as age 13.

First defined by Horn and Cattell (1966), fluid intelligence (Gf) is a broad factor of intelligence (Carroll, 1993; 1997; Horn & Noll, 1997). It is a mental activity that "involves making meaning out of confusion; developing new insights; going beyond the given to perceive that which is not immediately obvious; forming (largely nonverbal) constructs which facilitate the handling of complex problems involving many mutually dependent variables" (Raven, Raven, & Court, 1998, p. G4). Fluid reasoning ability has been assessed through cognitively complex tasks; it is "measured in tasks requiring inductive, deductive, conjunctive, and disjunctive reasoning to arrive at understanding relations among stimuli, comprehend implications, and draw conclusions" (Horn, 1997, p. 62).

In the last four decades, the basic component processes that comprise this complex mental activity have been studied by various cognitive psychologists (Bethell-Fox et al., 1984; Carpenter et al., 1990; Embretson, 1995; 1998; Evans, 1968; Goldman & Pellegrino, 1984; Green & Kluever, 1992; Hornke & Habon, 1986; Hunt, 1974; Mulholland et al., 1980; Primi, 1995; Primi & Rosado, 1995; Rumelhart & Abrahamson, 1973; Sternberg, 1977; 1978; 1980; 1984; 1986; 1997). This research has tried to identify the cognitive processes people use to solve geometric analogy tasks, which, according to Marshalek, Lohman, and Snow (1983), are the prototype tasks for assessing Gf. Basically, these studies (a) identify the basic component processes and the strategies that organize them into a complex chain, (b) investigate the correlations between component measures and traditional psychometric measures, (c) discover complexity factors underlying the tasks, and (d) simulate problem-solving behavior using artificial intelligence.

Fluid reasoning has also been directly associated with science achievement and preference for science-related careers (Lau & Roeser, 2002). For technological education, fluid reasoning would appear particularly useful in tasks such as constructing, troubleshooting, repairing, or operating complex systems that incorporate combinations of physical forces, dynamics, solids, and fluids. Such systems often span more than one physical domain, including mechanics, acoustics, optics, thermodynamics, and electromagnetism. Proficient technicians must understand relations among the elements of such systems, comprehend the implications these elements have for one another and for the technicians' own or equipment-related manipulations, and then draw conclusions about how the systems will function. Proficiency as a technician extends to the use of a variety of tools and pieces of equipment that enable technicians to act on these complex systems. In addition to other abilities, such tasks appear to draw heavily on fluid reasoning ability.

Personality

In vocational counseling, industrial/organizational psychology, and personnel psychology, there has been general disagreement about the utility of personality contributions to career choice. Since the early 1990s, however, the validity of personality measures has improved, largely due to the consistent factor structure that has emerged as the core of many studies (Barrick & Mount, 1991), in what has become known as the Big Five personality factors. The same five-factor model of personality has been identified consistently across languages and cultures, to the point that these findings have led some researchers to suggest the model applies universally to all humans and constitutes a cross-cultural psychological law (McCrae & Costa, 1997). According to the model, most personality traits, and the behaviors associated with them, can be described in terms of five basic dimensions: Neuroticism (whose opposite pole is Emotional Stability), Extraversion, Openness to Experience, Agreeableness (whose opposite pole is Antagonism), and Conscientiousness, or Will to Achieve (Digman, 1990; Hong, Paunonen, & Slade, 2008; John & Srivastava, 1999).

The five factors have demonstrated relationships to many life outcomes, such as (a) the ability to cope with stress; (b) academic and vocational success or failure; (c) the establishment of romantic relationships, lifelong friendships, and life goals; and (d) health outcomes, such as mortality and accident incidence (Black, 2000; Block, 1993; Haan, Millsap, & Hartka, 1986; Helson & Moane, 1987; Roberts, Kuncel, Shiner, Caspi, & Goldberg, 2007; Robins, Fraley, Roberts, & Trzesniewski, 2001). Moreover, there is consensus that individual differences in personality begin to exert their influence as early as adolescence; thus, they are important to measure when investigating psychological processes in adolescents and emerging adults (Caspi, Roberts, & Shiner, 2005).

Career counselors often assume certain personality traits in students may make them more or less likely to pursue a particular major. For example, extraverted clients or students may be seen as more likely to pursue business careers; neurotic clients may be viewed as more likely to be interested in artistic pursuits. Likewise, John Holland in his writings noted that choice of occupation and, by extension, choice of educational major is an expression of personality (Holland, 1997). Also, the role of personality traits in vocational choice actions (e.g., selection of a major) is explained in social cognitive career theory (SCCT; Lent, Brown, & Hackett, 1994). That is, personality is a precursor to vocational choice actions and influences choice actions through domain-specific self-efficacy and interests.

Barrick and Mount (1991) examined 117 studies of personality for employment selection purposes containing 162 samples. Their results showed that Conscientiousness was consistently related to all job performance criteria across a wide variety of occupational groups represented in the samples (professionals, police, managers, sales, and skilled/semi-skilled workers). Extraversion was a valid predictor for occupations requiring social interaction. Also, both Openness to Experience and Extraversion were valid predictors of training proficiency. Other dimensions of personality were found to be valid predictors for some occupations. Overall, the meta-analysis illustrated the benefits of using the five factor model of personality to make meaningful relationships between the dimensions of the model and occupational suitability.

Vocational interests have been widely accepted in the individual differences literature as enduring psychological traits (Lubinski, 2000), as evidenced by their associations with personality (Barrick, Mount, & Gupta, 2003; Larson, Rottinghaus, & Borgen, 2002) and their influence on a variety of outcomes. As an individual differences phenomenon, vocational interests represent one of the most enduring and compelling areas for research (Lubinski & Dawis, 1995) and one of the most popular ways to characterize, compare, and match people with the environments they self-select (Hogan & Blake, 1996). In general, personality and interests appear to be composed of two types of motivational constructs (Mount, Barrick, Scullen, & Rounds, 2005): (a) striving for self-growth versus accomplishment strivings, and (b) interacting with people versus interacting with things. Moreover, the enduring nature of vocational interests as a psychological trait has been supported by behavioral genetic studies (Lykken, Bouchard, McGue, & Tellegen, 1993; Moloney, Bouchard, & Segal, 1991), which attribute 40%-50% of the variance in vocational interest to genetic factors.


Research Questions

The present study addressed several research questions related to students enrolled in technological education in two-year community and technical colleges.

1. Do students graduating from technological education programs possess different academic and cognitive abilities from students graduating from non-technological education programs? If so, to what extent do they differ?

2. In particular, do technological and non-technological students differ in cognitive tasks that are known to emphasize visual processing and fluid reasoning?

3. To what extent do technical and non-technical students differ in their personality traits?

4. Given the above measures, what is the profile for technical and non-technical students as they approach the end of their respective two-year programs, and, on these measures as a whole, are students in these two self-selected groups different from one another?

5. Based on measures of academic knowledge, performance, cognitive ability, and personality, how many latent classes of students exist in two-year colleges, and to what extent do technical and non-technical students map onto these empirically derived latent profiles?

Responses to these questions are not expected to yield immediately obvious interventions to support technological education. Rather, the results should explicate important variations among students who choose technological education and identify areas for further investigation, perhaps in ways that make such differences useful to developers, researchers, and practitioners of technological education interventions or practices. Indeed, this is perhaps the first study of its kind that attempts to explain the uniqueness of the technological education student population.

Methods

This research project began in 2009 by recruiting five two-year postsecondary institutions in four locations as follows:

• Indian River State College, Port St. Lucie, FL
• Texas State Technical College and McLennan Community College, Waco, TX
• Indian Hills Community College, Ottumwa, IA
• Connecticut College of Technology, Hartford, CT

Engagement of each college began with a personal visit to describe recruitment and test administration procedures and to identify a testing coordinator at each location. The exception was the colleges in Texas, which were geographically accessible to the research team. Consequently, research team members made multiple visits to the two Texas campuses to conduct data collection. An instructor at the Texas locations supported student recruitment. Two colleges were needed in Texas because Texas State Technical College (TSTC) did not offer any non-technological programs; McLennan Community College, located within 10 miles of TSTC, provides primarily non-technological programs of study.

Test materials were shipped to the assessment coordinator at each test site along with a complete protocol for recruitment and administration (see Appendix A). The protocol contained detailed instructions, the order of test administration, and test administrator scripts to ensure similar testing conditions. A program license, headsets, and test stimuli for the Computer-based Academic Assessment System (CAAS) were also provided to each test site. The test site was responsible for providing a quiet, well-lit testing area; a computer for the CAAS system; and a test administrator. All completed group-administered tests were returned to the University of North Texas for scoring, data entry, and analysis. The reaction time test was administered on site, individually, at a time convenient for the participant after the group test administration, and the data file was returned to UNT. As an incentive, each participant received a $30 stipend upon completion of the reaction time test.

Data collection began in Fall 2009. No individual participant required longer than two weeks to complete all assessments. Data collection at both Indian River State College and Indian Hills Community College was completed in one semester; at the two Texas sites it required two semesters.

By late 2010, Connecticut College of Technology (CCT) had not collected any data from students. In response, a new college location was identified to replace CCT. The College of Lake County north of Chicago volunteered to support the project as the fourth location. CLC collected data during the Spring term of 2011.

Each institution was instructed to recruit only students who were within one to two semesters of completing a two-year degree. This was done so that the investigators could be relatively certain participants would persist in their chosen program of study and likely continue beyond their education into work related to their chosen occupational area. Recruiters were also instructed to identify/recruit 50% of the students from Technological programs, defined for this study as Optics/Lasers/Photonics technology, Mechanical technology, Electricity/Electronics technology, or Robotics technology. The other 50% of the students were to be recruited from non-Technological occupational preparation programs, defined for this study as Criminal Justice, Education/Child Development, Paralegal, or Office Administration. Within the technological and non-technological participant subgroups, recruiters were requested to recruit equal numbers of males and females when possible.


All test administrators/research assistants and recruiters at the colleges completed a training program in Human Subjects Protection and submitted certification of completion of training to the Office of Research Services at the University of North Texas, which maintains compliance documentation for the UNT Institutional Review Board.

Participants

The participants forming the sample for the present study (N = 306) are described in Table 1. All participants provided signed informed consent to participate, as required by the Institutional Review Board Committee for the Protection of Human Subjects in Research at the University of North Texas.

Table 1. Participant Frequencies

                               Technical   Non-Technical
Total                             186           120
Sex
  Male                            172            63
  Female                           14            57
Location
  Florida                          51            49
  Illinois                         13             5
  Iowa                             79            22
  Texas                            43            44
Ethnicity
  White                           154            79
  African American                 11            17
  Hispanic                         15            22
  Asian                             2             1
  Other                             4             1
Age
  17-20                            80            62
  21-30                            64            41
  31-40                            26             7
  41-62                            16            10
Average Age (years)              25.6          23.8

Instrumentation

The present study utilized a battery of instruments to assess cognitive ability, math ability, and personality traits: Shipley-2 Block Pattern, Raven's Standard Progressive Matrices, Spatial Reaction Time test, Shipley-2 Vocabulary, ASSET Elementary Algebra, and the NEO-FFI. All instruments were administered in a group setting except the Spatial Reaction Time test, which was administered individually. A brief demographic survey was also administered, including questions that would permit extension of the present study to longitudinal follow-up. The entire protocol involved 2.5 hours of group test administration, including breaks, and the individually administered task required approximately 20 minutes. Group administration sessions generally included between 6 and 20 participants at one time; consequently, several sessions were offered at each location based on participant availability.

Shipley-2

The Shipley-2 is a revision and re-standardization of the Shipley Institute of Living Scale (Shipley, Gruber, Martin, & Klein, 2009). This instrument has been standardized for use with children and adults ranging in age from 7 to 89 years (Shipley et al., 2009). Two aspects of cognitive ability are assessed: crystallized ability through the vocabulary test and spatial ability (fluid) through the block pattern test. Table 2 reports the normative sample internal consistency for the block pattern and vocabulary tests compared to those for the current study.

Shipley-2 Block Pattern

The Block Pattern Test is a nonverbal, paper-and-pencil adaptation of the Kohs Block Design Test (Kohs, 1920) that assesses fluid cognitive ability. Administered individually or in groups in 10 minutes, it requires the subject to determine which piece(s) are needed to complete a displayed mosaic, supporting Zachary, Crumpton, and Spiegel's description of the instrument as a quick, economical estimate of a person's cognitive ability (Shipley et al., 2009). Psychometric properties of the Shipley Block Pattern Test have been examined in depth (Beaujean, Hull, Worrell, & Sheng, in review).

Shipley-2 Vocabulary

The Vocabulary test assesses crystallized ability, also in a paper-and-pencil format that can be individually or group administered in 10 minutes. Subjects are given 40 words and asked to select, from four displayed options, the word with the same meaning.


Table 2. Shipley-2 Normative Group Internal Consistency Compared to the Current Study

                                                          Coefficient Alpha
                                                       Vocabulary   Block Pattern
Adult Sample, N = 1203 (median)                           0.90          0.91
Adult Sample, Age 17-19, n = 278                          0.85          0.94
Adult Sample, Age 20-29, n = 223                          0.90          0.93
Current Study, n = 306 (age 17-62; average age 24.90)     0.77          0.90

Raven's Standard Progressive Matrices

The Raven's Standard Progressive Matrices test measures cognitive ability related to problem solving and spatial reasoning: in a paper-and-pencil format, the subject must select, from the given options, the piece needed to complete a diagrammatic puzzle that changes serially in two dimensions simultaneously (Raven, Raven, & Court, 2000). Subjects are given as much time as needed to complete the test, which can be used with both children and adults. The test consists of 60 problems divided into five sets of 12, with difficulty increasing within each set. The varying difficulty allows the test to discriminate cognitive ability among subjects.

Because the items are of increasing difficulty, inter-item correlation would be a misleading index of reliability: the ability to answer easier items correctly does not predict the ability to complete more difficult items accurately. The manual reports that split-half internal consistency coefficients in the literature typically exceed .90, with values ranging from 0.89 to 0.97 across more than 500 adults in the U.S. (Raven, Raven, & Court, 2000). For the current study, the test was split into even and odd items, and the resulting Pearson correlation between the two half-test total scores was 0.82 (n = 275).
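The even/odd split just described can be sketched as follows. This is an illustrative reconstruction, not the study's code; the response matrix and function name are invented.

```python
# Sketch of the even/odd split-half reliability check described above.
# `responses` is a hypothetical examinee-by-item matrix of 0/1 accuracy
# scores, with columns in administration order (60 items for the Raven's).
import numpy as np

def split_half_r(responses):
    responses = np.asarray(responses, dtype=float)
    odd = responses[:, 0::2].sum(axis=1)   # items 1, 3, 5, ...
    even = responses[:, 1::2].sum(axis=1)  # items 2, 4, 6, ...
    # Pearson correlation between the two half-test total scores
    return np.corrcoef(odd, even)[0, 1]
```

Because each half contains items from every difficulty level, this split sidesteps the problem of correlating easy items directly against hard ones.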

ASSET Elementary Algebra

The ASSET system of tests was originally developed by ACT to assist community and technical colleges in advising, course placement, and retention services for students (ACT, 2009). Subjects are allowed 25 minutes to complete the 25 items on the Elementary Algebra test. The skills necessary to successfully complete this test are usually acquired in a first-year high school algebra course. The technical manual reports an internal consistency coefficient of 0.78 (Form C2, n = 1867). The current study obtained a coefficient alpha of 0.82 (Form C2, n = 263).


NEO-FFI

The NEO-FFI is a shortened version of the NEO-PI-R, an instrument used to assess personality traits under the Five-Factor model (Hull, Beaujean, Worrell, & Verdisco, 2010). It is suitable for men and women 17 years of age or older; the English-language version was used (Costa & McCrae, 1992). For each of the 60 statements, respondents select from five Likert-type options ranging from strongly agree to strongly disagree, with no time limit. Typically the instrument can be completed in 10-15 minutes, though individuals with lower reading skills may take longer. There are 12 items per domain (Neuroticism, Extraversion, Openness to Experience, Agreeableness, and Conscientiousness). The NEO-FFI is appropriate when global personality information is sufficient. Because it is a self-report instrument, professional training is not required for administration; interpretation, however, requires professional training in psychological testing and measurement. Table 3 provides internal consistency reliabilities for the NEO-FFI as reported in Caruso's (2000) meta-analytic study of U.S. samples and in the present study.

Table 3. Internal Consistency Reported by Caruso Compared to the Current Study

                              Coefficient Alpha
Domain               Caruso (2000), Form S     Current Study
                     (20 studies)              (N = 303)
Neuroticism                0.83                    0.82
Extraversion               0.75                    0.75
Openness                   0.65                    0.74
Agreeableness              0.67                    0.70
Conscientiousness          0.80                    0.82


Spatial Reaction Time Test

Spatial Reaction Time (RT). This task was developed by the researchers based on the work of Roger Shepard (Shepard & Metzler, 1971; Cooper & Shepard, 1973). Similar to the cube comparison task, the assessment requires the mental manipulation of 3-D stimuli, as shown in Figure 2. Two stimuli are presented: (1) the original object and (2) a second object that is either a rotated view of the original or its mirror reflection. If the second object is only rotated, the participant should respond that the two objects are the same; if the objects are mirror images of each other, the participant should respond that they are different. The task measures reaction time in milliseconds on a laptop running the computer-based academic assessment system (CAAS; Royer, 1999). The CAAS presents the pre-programmed stimuli, the respondent answers verbally into a headset microphone, and the test administrator records via an attached mouse whether the response is correct. Reaction time tasks have been shown to be predictors of life outcomes (Jensen, 2006).

Analyses

Group Differences

We first examined descriptive statistics for the two groups (technological students and non-technological students). We then conducted mean-difference tests for each assessment separately. Following these, we examined the manifest profiles of the groups (Timm, 2002) and produced profile plots. Profile analysis (PA) is a multivariate technique that allows assessing not only whether the mean scores on the variables of interest are the same but also whether the profiles are parallel. First, we placed all variables on the same scale, choosing the T-score metric (mean: 50, SD: 10). For all variables except reaction time (RT), this was done by taking the percentile score and matching it to a normal curve with T-score properties. For the RT variables, the values were standardized and then transformed to the T-score scale. We conducted all of these analyses in R (2011).
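The two T-score conversions can be sketched as follows. This is a minimal sketch, assuming a standard normal quantile stands in for "matching to a normal curve"; the function names are illustrative, not from the study.

```python
# T-scores have mean 50 and SD 10.
from statistics import NormalDist, fmean, pstdev

def t_from_percentile(pct):
    """Match a percentile (0-100) to the normal curve in the T metric."""
    return 50 + 10 * NormalDist().inv_cdf(pct / 100.0)

def t_from_raw(scores):
    """Standardize raw scores (used for the RT variables), then rescale to T."""
    m, s = fmean(scores), pstdev(scores)
    return [50 + 10 * (x - m) / s for x in scores]
```

By construction, the rescaled RT scores have mean 50 and SD 10 within the sample, putting them on the same footing as the percentile-based T-scores.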

Figure 2. Representation of Shepard & Metzler (1971) 3-D stimuli.


Latent Profile Analysis

LPA is a latent variable modeling technique that is known in the literature by a variety of names, including latent class cluster analysis (Vermunt & Magidson, 2002) and finite mixture modeling (McLachlan & Peel, 2000). The goal of LPA is the same as that of cluster analysis: to identify clusters of observations that have similar values on the cluster indicators. The main difference between LPA and traditional cluster analytic techniques is that LPA is model-based, whereas hierarchical and most non-hierarchical applications of cluster analysis are not.

LPA is a type of latent variable mixture model. The term latent variable here refers to the latent categorical variable of cluster membership. This latent categorical variable has K categories, or clusters. A person's value on this variable is thought to cause his or her levels on the observed cluster indicators, which in our situation are the different measures of cognitive ability, personality, and so on. The term mixture refers to the notion that the data are not sampled from a population that can be described by a single probability distribution. Instead, the data are conceived as being sampled from a population composed of a mix of distributions, one for each cluster, with each cluster distribution characterized by its own unique set of parameters.

When latent variable mixture modeling is used with only continuous cluster indicators, it is often called LPA. When only categorical indicators are used, the technique is often called latent class analysis (LCA). The distinction is not strictly necessary, because the same model, a latent variable mixture model, is being used in both situations. Indeed, the distinction between LPA and LCA seems even less necessary when one considers that categorical and continuous cluster indicators can be used simultaneously in latent variable mixture models.

Although standard clustering techniques can also be used with both categorical and continuous cluster indicators, the use of latent variable mixture modeling for such a purpose is relatively less difficult. Mixture modeling is also advantageous because indicators on different scales do not need to be transformed prior to their input into the analysis. With traditional clustering techniques, it is recommended that variables on different scales or with widely divergent variances be standardized prior to the analysis. With latent variable mixture modeling, no such transformation is necessary.

To further illustrate the notion of LPA, consider an example where a continuous variable yi is used as a single indicator of cluster membership for person i in our sample of size N (i = 1, . . . , N). To make the example more concrete, one could consider the use of just a single cognitive ability factor (e.g., processing speed) as the continuous variable. Although the number of clusters, K, is not typically known a priori, suppose there are two different clusters of persons (K = 2) in our population. In mixture modeling this would translate into the presence of two different distributions, typically assumed to be normal, from which our data were sampled. Note that although the population distribution is assumed to be a mixture of two normal distributions in this example, the population distribution itself need not be normal.

In LPA, it is possible for a unique set of parameters to be estimated for each cluster. For instance, parameters μ1 and σ1² could be estimated for Cluster 1, and parameters μ2 and σ2² could be estimated for Cluster 2. This is the most complex model that could be estimated for this example; more parsimonious models could be specified by constraining some of the parameters to be equal across clusters. For example, one could allow the means for each distribution to remain unique but constrain the variances to be equal across clusters, or one could allow the variances to remain unique across clusters and constrain the means to be equal.

In addition to the parameters of each cluster’s distribution, LPA also provides estimates for the mixing proportion or the weight given to each cluster in the population. The model for this example can be represented using the following equation:

f(yi | θ) = π1 f1(yi | μ1, σ1²) + π2 f2(yi | μ2, σ2²),

which shows that the distribution of our cluster indicator, yi, given the model parameters θ = (π1, μ1, σ1², π2, μ2, σ2²), is a weighted mixture of two separate distributions, each characterized by a unique set of parameters. The weights in a mixture model are non-negative and must sum to one. If the weights in our example were estimated to be π1 = .60 and π2 = .40, it would imply that 60% of our population can be described by the parameters of Cluster 1 and 40% by the parameters of Cluster 2.
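A worked numeric instance of this equation, using the illustrative weights π1 = .60 and π2 = .40; the means and standard deviations are invented for the example.

```python
# Two-cluster mixture density for a single continuous indicator y.
from statistics import NormalDist

def mixture_pdf(y, weights, means, sds):
    """f(y | theta) = pi_1 f1(y | mu_1, sigma_1^2) + pi_2 f2(y | mu_2, sigma_2^2)."""
    return sum(w * NormalDist(mu, sd).pdf(y)
               for w, mu, sd in zip(weights, means, sds))

# Mixing weights are non-negative and sum to one; parameter values invented.
density = mixture_pdf(50.0, weights=[0.60, 0.40], means=[45.0, 58.0], sds=[8.0, 6.0])
```

Evaluating this density at every observed y (and maximizing the resulting likelihood) is what an LPA estimation routine does under the hood.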

When more than one continuous cluster indicator is used in LPA, the multivariate distribution of the r cluster indicators, contained in vector yi for person i, is conceived of as a weighted mixture of K different distributions, typically assumed to be multivariate normal. For instance, if the subscales associated with either the 2-, 3-, or 4-factor conceptualization of general cognitive ability were used as cluster indicators, a multivariate LPA model would need to be utilized. The multivariate representation of the previous equation with r indicators and K clusters is:

f(yi | θ) = Σk=1..K πk fk(yi | μk, Σk),

As with the univariate model, the weights in the multivariate equation are constrained to be non-negative and must sum to one. In the univariate example shown in the first equation, the distribution for each cluster was defined by only two parameters, a mean and a variance. In the multivariate case, the distribution for each cluster k is defined by a mean vector μk and covariance matrix Σk. All latent analyses were conducted using Mplus (Muthén & Muthén, 1998-2012).

Results

Descriptive statistics are given in Table 4, and the profile plot is provided in Figure 3.

Table 4. Descriptive Statistics

As stated, all descriptive statistics are provided in a transformed T-score metric (mean: 50, SD: 10).


Figure 3. Profile Plot on manifest measures.

As the variables were originally on different scales, there was no assumption that the profiles would be parallel. Thus, we tested whether the means were the same between the groups using a one-way MANOVA. The results (Wilks' Λ = .61, F(11, 262) = 15.19, p < .001) indicate that the means differed between the groups.

Given the overall between-group difference, we then examined the between-group differences for each variable separately. The results are given in Table 5. Judging by the effect size measure (Cohen, 1988), the between-group differences appear to be medium to large on all variables except the personality variables of Openness to Experience, Extraversion, and Agreeableness.

Table 5. Between Group Differences for All Measured Variables

Variable                                  t       df        p      Cohen's d
Raven's Standard Progressive Matrices    7.64    271.76   < .01      0.93
ASSET Math                              10.14    266.16   < .01      1.24
NEO-FFI Neuroticism                      2.85    231.66   < .01      0.37
NEO-FFI Extraversion                    -0.15    244.76    0.88     -0.02
NEO-FFI Openness to Experience           1.32    273.14    0.19      0.16
NEO-FFI Agreeableness                   -0.32    239.33    0.75     -0.04
NEO-FFI Conscientiousness               -3.82    238.99   < .01     -0.49
Shipley-2 Block Patterns                 7.90    195.86   < .01      1.13
Shipley-2 Vocabulary                     5.49    215.20   < .01      0.75
Spatial RT Average                       4.36    251.02   < .01      0.55
Spatial RT SD                            3.31    250.76   < .01      0.42

Note. All mean comparisons use Welch's (1947) t and df because of heterogeneous variances between groups. Effect sizes are signed so that positive values indicate the technical group scored higher.
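The comparisons in Table 5 can be sketched as follows. This assumes Welch's t with Welch-Satterthwaite degrees of freedom and a pooled-SD Cohen's d, one common convention; the paper does not spell out its exact formulas.

```python
# Welch's t (unequal-variance t test) and Cohen's d for two groups a and b.
from statistics import fmean, variance
import math

def welch_t(a, b):
    na, nb = len(a), len(b)
    se2a, se2b = variance(a) / na, variance(b) / nb   # squared standard errors
    t = (fmean(a) - fmean(b)) / math.sqrt(se2a + se2b)
    # Welch-Satterthwaite approximation to the degrees of freedom
    df = (se2a + se2b) ** 2 / (se2a ** 2 / (na - 1) + se2b ** 2 / (nb - 1))
    return t, df

def cohens_d(a, b):
    na, nb = len(a), len(b)
    pooled = math.sqrt(((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2))
    return (fmean(a) - fmean(b)) / pooled
```

Welch's df is generally fractional (e.g., 271.76 in Table 5) rather than the integer na + nb - 2 of the classical t test.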

The results show that students participating in technological programs possess, as a group, a unique constellation of abilities.

The LPA was run for models with 1-4 classes, as shown in Table 6.

Table 6. Empirically derived class structure using LPA

Classes    AIC       BIC       Entropy    Class 1 n    Class 2 n    Class 3 n
1          20,223    20,301    -          306
2          20,036    20,159    .72        105          201
3          19,947    20,114    .81        192          107          7
4          Did not converge

We identified the 2-class model as optimal. Although the 3-class model fit the data best numerically, the third class (n = 7) is not interpretable. Using the 2 classes of participants with unique latent profiles, we then examined the manifest mean structures on the measures in Table 7 (results are provided in the original metric for each instrument).
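The selection logic behind Table 6 can be sketched with the standard information-criterion formulas. The log-likelihoods and parameter counts below are placeholders, not the study's values; and, as in Table 6, the numerically best-fitting solution need not be the one retained.

```python
# AIC and BIC from a model's maximized log-likelihood; smaller is better.
import math

def aic(loglik, n_params):
    return 2 * n_params - 2 * loglik

def bic(loglik, n_params, n_obs):
    return n_params * math.log(n_obs) - 2 * loglik

# Hypothetical candidates: k classes -> (log-likelihood, free parameters).
candidates = {1: (-10100.0, 24), 2: (-9990.0, 37), 3: (-9950.0, 50)}
n = 306
best_by_bic = min(candidates, key=lambda k: bic(*candidates[k], n))
```

BIC penalizes extra parameters more heavily than AIC as N grows, which is why the two criteria can disagree; interpretability of the resulting classes is the final arbiter, as it was here.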


Table 7. Manifest mean and variance for the latent classes on each measure

                          Class 1               Class 2
Variable              Mean    Variance      Mean    Variance       d
GPA                    3.1       0.3         3.3       0.3       -0.44
ASSET                 33.1      39.4        40.9      39.4       -1.24
Neuroticism           42.2     119.5        45.4     119.5       -0.30
Openness              43.8     135.0        50.2     135.0       -0.56
Agreeableness         48.3     136.5        50.9     136.5       -0.22
Conscientiousness     54.9     102.5        53.3     102.5        0.15
Extraversion          53.4     110.2        53.9     110.2       -0.04
Vocabulary            92.8     118.5       105.9     118.5       -1.20
Block Patterns        89.3     243.6       112.2     243.6       -1.47
Raven's Matrices      30.5      42.4        39.6      42.4       -1.40
Shepard-Metzler SD     1.4       0.8         1.7       0.8       -0.31
Female                 0.4       0.2         0.2       0.1        0.59

Figure 4. Latent Class Profile Plot.

Finally, we examined the number and percentage of technicians and non-technicians within each latent profile as shown in Table 8.



Table 8. Technicians and non-Technicians within the two latent profile groups.

Class    Technicians n    Technicians %    Non-Technicians n    Non-Technicians %
1             64              27.23               242                 64.1
2            171              72.77               135                 35.8
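The percentages in Table 8 are column percentages: each class count divided by that group's column total. Using the counts from Table 8 (the source reports the non-technician column rounded to one decimal):

```python
# Column percentages for the Table 8 cross-tabulation of class by group.
tech = {1: 64, 2: 171}        # technicians by latent class
non_tech = {1: 242, 2: 135}   # non-technicians by latent class

def column_pct(counts):
    total = sum(counts.values())
    return {k: round(100 * v / total, 2) for k, v in counts.items()}

tech_pct = column_pct(tech)           # class share among technicians
non_tech_pct = column_pct(non_tech)   # class share among non-technicians
```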

Given the results in Table 8, Class 2 appears to represent the profile characteristic of students who selected a technological education program and persisted to within 1-2 semesters of completion. In essence, many students in technician education programs appear to have self-selected into career paths and share similar cognitive abilities and personalities. A similar, though less pronounced, outcome was produced for Class 1, as 64.1% of the non-technological students possess the alternative profile.

Discussion

The results indicate several distinct differences between students who are close to graduating from technological education programs and non-technological (or at least less technological) students equally close to graduation. The congruence of the manifest choice of technological versus non-technological program with the latent profiles identified provides evidence that students who participate in these areas possess different abilities and personalities from each other. Increased algebra knowledge, as well as greater fluid and visual processing cognitive abilities, is consistent with findings for students in engineering (see Kichuk & Wiesner, 1997). For the five personality factors, the picture is quite different. Technological students have less emotional stability1 (higher Neuroticism) and significantly lower scores on Conscientiousness than non-technological students.

Relatively higher scores on Neuroticism would suggest that technological education students probably prefer less social interaction. Given that social interaction is a necessary part of any work environment, technological education programs should strive to help students understand how to communicate well, the importance of teamwork, and how to adapt to change. The nature of technological work environments requires all workers not only to identify elements of job transition and formulate transition work plans, but also to understand how stress impacts job performance. In communication, it may be necessary to ensure technician education programs emphasize workplace etiquette, the ability to interpret and use body language, and the ability to follow oral and written directions.

1 While we note higher scores on Neuroticism (the opposite of Emotional Stability), scores for the technician group are very much in the normative range. Our comments here concern the relative differences in scores between the technician group and the non-technician group, and simply point out potential relative weaknesses in the technician group. As a group, they would certainly not be described as emotionally unstable. The same general idea applies to our observations related to Conscientiousness.

Lower Conscientiousness suggests a need for technological education programs to emphasize organization as it relates to problem solving and work ethic. Because of greater spatial awareness, technicians likely can identify elements of a problem earlier than others, but they need guidance in their educational programs on how to evaluate options and outcomes and how to set priorities so they can select and implement a solution. They may also need to learn how to manage and organize workloads. In terms of work ethic, educational programs should help future technicians identify established policies, time-management techniques, and the importance of initiative and willingness to learn in the workplace. For example, one characteristic of conscientiousness is maintaining a clean workspace, yet such habits are frequently mentioned by faculty and employers as lacking in technicians.

All of the above can likely be managed by faculty who structure learning environments to resemble the workplace, with consequences when students do not adhere to expectations. This means it is important for faculty to structure their learning environments (laboratories, classrooms, project activities, and so on) with clear expectations for performance, as would be the case in the workplace.

Observed higher scores for Gf and Gv in the technician group are not surprising, but they point to some important differences in learning preferences that might be found in the two latent classes. Maximizing the acquisition of information for technicians should involve hands-on laboratory work in which they are exposed to increasingly complex systems. In optics and photonics education programs, for example, proficient graduating technicians must be able to operate, and detect problems with, complex laser systems that operate under high vacuum, produce extreme heat (and consequently require water-cooling systems), depend on delicate optical element alignment, and draw an extraordinary amount of electrical current. Simply being in the room with such a system requires extensive knowledge of how all these subsystems interact just to maintain personal safety; a proficient technician must also be able to alter the pressure, water flow, optical alignment, or current supply and recognize the effects that adjusting any one has on all the other subsystems. This example highlights the critical nature of Gf for technicians. Without this basic cognitive skill, technicians risk danger to themselves or others, or potentially risk damage to the multi-million-dollar systems for which they are responsible. It is essential for technological education programs to expose technicians to highly complex systems that permit development and examination of such skills. Moreover, it is incumbent on faculty in these programs to insist that students learn to manage these multi-varied systems.


The spatial (Gv) domain represents another important ability for technological education. Several tasks performed by technicians require highly developed spatial talent. Prints and schematics are one clear example: reading a two-dimensional print and transferring its specifications, given in different views, onto a three-dimensional part requires the ability to recognize patterns, sometimes when the part is not visible. Technicians responsible for repairing devices with parts hidden from view must be able to mentally represent how the device is constructed in order to troubleshoot or replace elements within it. Mechanical technicians, especially, must be able to mentally represent parts in order to successfully render and fabricate them on a lathe or mill. Again, it is important for technological education programs to recognize that basic cognitive abilities, such as spatial visualization, are skills that make technician careers possible and satisfying for some.

Perhaps our most unusual observation is in the mathematical domain. In elementary algebra, technicians appear to possess relative strength compared to other two-year college students. Comparisons with engineers, particularly in this domain, would be helpful. We suspect that future individual-differences research might highlight this domain as the primary distinguishing trait between technicians and engineers.

Because we were able to empirically identify unique profiles that largely fit the extant program choices of participants, the present study supports Gottfredson's theory of circumscription and compromise (1981; 1983; 1985; 1996; 2002; 2005), which suggests that individuals tend to seek out environments that reinforce their abilities and vocational interests or personalities. If, as other researchers suggest, cognitive abilities and personality are substantially heritable, these trait differences may be causally related to students' selection of and persistence in technological education, and they would be important to identify early. In other words, it likely would not be detrimental to assess students early to identify potential technological education students, or at least to provide detailed information about technological education programs as early as middle school. For example, early exposure to technological apparatus in elementary and middle school, with exercises to identify students who appear to have interest and additional exposure for those who do, would be a valuable program to pursue and evaluate long term. This information is also important for practitioners who counsel students interested in pursuing technological programs, as the differences constituting the profiles of future graduates may have consequences for student persistence and satisfaction. A few students with atypical profiles may still find themselves enrolled in the last 1-2 semesters of a technological education program, and technological educators need to be aware of the special needs these students might have for different learning experiences given their abilities and interests. They may not respond to instruction in the same way; for example, such students might appreciate the opportunity to work in groups, but may not respond as well, or as quickly, in laboratory work with spatially complex or multi-system devices, such as those involving mechanical, electrical, and optical subsystems simultaneously. Additional research is needed to clarify how students with atypical profiles can successfully participate in technological education. More work should also be carried out on the long-term persistence in technological careers of students from technological programs, to better understand whether the minority of students represented in Class 1 of the LPA, in contrast with those in Class 2 (the majority of students in technological education), seek out alternative work environments and the extent to which they remain satisfied with their career choice.

Limitations

What we do not know from these findings is whether these unique profiles existed prior to entry into these programs. The present study shows that technician students and non-technician students differ in these skills as they exit the programs, but did they already differ at entry, or did the differences develop while they were there? Prior research would lead us to hypothesize that these traits largely existed before entry, as they are fairly stable across time, particularly after adolescence, and are in some cases genetically influenced. An extension of the present study, assessing students midway through their programs, at graduation, and at 2 and 5 years into their careers, would show whether these findings can be replicated and whether they predict not just differences in academic success but also long-term career success.

The present study does not address long-term success of students. Moreover, the study sample is limited to four community college sites and may not represent all community/technical colleges in the U.S.


References

ACT (2009). ASSET technical manual. Iowa City, IA: ACT.

Alfonso, V. C., Flanagan, D. P., & Radwan, S. (2005). The impact of the Cattell-Horn-Carroll theory on test development and interpretation of cognitive and academic abilities. In D. P. Flanagan & P. L. Harrison (Eds.), Contemporary intellectual assessment: Theories, tests, and issues (2nd ed., pp. 185-202). New York: Guilford Press.

Barrick, M. R., & Mount, M. K. (1991). The big five personality dimensions and job performance: A meta-analysis. Personnel Psychology, 44, 1-26.

Barrick, M. R., Mount, M. K., & Gupta, R. (2003). Meta-analysis of the relationship between the Five-Factor Model of personality and Holland's occupational types. Personnel Psychology, 56, 45-74.

Beaujean, A. A., Hull, D. M., Sheng, Y., & Worrell, F. C. (in review). Psychometric properties of the Shipley Block Design Task: A study with Jamaican young adults.

Bethell-Fox, C. E., Lohman, D. F., & Snow, R. E. (1984). Adaptive reasoning: Componential and eye movement analysis of geometric analogy performance. Intelligence, 8, 205-238.

Black, J. (2000). Personality testing and police selection: Utility of the "Big Five." New Zealand Journal of Psychology, 29, 2-9.

Block, J. (1993). Studying personality the long way. In D. C. Funder, R. D. Parke, C. Tomlinson-Keasey, & K. Widaman (Eds.), Studying lives through time: Personality and development (pp. 9-41). Washington, DC: American Psychological Association. doi:10.1037/10127-018

Bouchard, T. J., Jr., & McGue, M. (1981). Familial studies of intelligence: A review. Science, 212, 1055-1059.

Bouchard, T. J., Jr., & McGue, M. (2003). Genetic and environmental influences on human psychological differences. Journal of Neurobiology, 54, 4-45.

Carpenter, P. A., Just, M. A., & Shell, P. (1990). What one intelligence test measures: A theoretical account of the processing in the Raven Progressive Matrices test. Psychological Review, 97, 404-431.

Carroll, J. B. (1993). Human cognitive abilities. New York: Cambridge University Press.

Carroll, J. B. (1997). Psychometrics, intelligence, and public perception. Intelligence, 24, 25-52.

Caruso, J. C. (2000). Reliability generalization of the NEO personality scales. Educational and Psychological Measurement, 60, 236-254. doi:10.1177/00131640021970484

Caspi, A., Roberts, B. W., & Shiner, R. L. (2005). Personality development: Stability and change. Annual Review of Psychology, 56, 453-484. doi:10.1146/annurev.psych.55.090902.141913

Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.

Cooper, L. A., & Shepard, R. N. (1973). Chronometric studies of the rotation of mental images. In W. G. Chase (Ed.), Visual information processing. New York: Academic Press.

Costa, P. T., & McCrae, R. R. (1992). Revised NEO Personality Inventory (NEO PI-R) and NEO Five-Factor Inventory (NEO-FFI): Professional manual. Lutz, FL: Psychological Assessment Resources.

Deary, I. J., Spinath, F. M., & Bates, T. C. (2006). Genetics of intelligence. European Journal of Human Genetics, 14, 690-700.

Deary, I. J., Whiteman, M. C., Starr, J. M., Whalley, L. J., & Fox, H. C. (2004). The impact of childhood intelligence on later life: Following up the Scottish mental surveys of 1932 and 1947. Journal of Personality and Social Psychology, 86, 130-147.

Digman, J. M. (1990). Personality structure: Emergence of the five-factor model. Annual Review of Psychology, 41, 417-440.

Embretson, S. (1995). The role of working memory capacity and general control process in intelligence. Intelligence, 20, 169-189.

Embretson, S. (1998). A cognitive design system approach to generating valid tests: Application to abstract reasoning. Psychological Methods, 3, 380-396.

Evans, T. G. (1968). A program for the solution of a class of geometric-analogy intelligence-test questions. In M. Minsky (Ed.), Semantic information processing (pp. 271-353). Cambridge, MA: MIT Press.

Flanagan, D. P., McGrew, K. S., & Ortiz, S. O. (2000). The Wechsler intelligence scales and Gf-Gc theory: A contemporary approach to interpretation. Needham Heights, MA: Allyn & Bacon.

Gohm, C. L., Humphreys, L. G., & Yao, G. (1998). Underachievement among spatially gifted students. American Educational Research Journal, 35, 515-531.

Goldman, S. R., & Pellegrino, J. W. (1984). Deductions about induction: Analyses of developmental and individual differences. In R. J. Sternberg (Ed.), Advances in the psychology of human intelligence (Vol. 2, pp. 149-197). Hillsdale, NJ: Lawrence Erlbaum Associates.

Gottfredson, L. S. (1981). Circumscription and compromise: A developmental theory of occupational aspirations. Journal of Counseling Psychology (Monograph), 28(6), 545-579.

Gottfredson, L. S. (1983). Creating and criticizing theory. Journal of Vocational Behavior, 23, 203-212.

Gottfredson, L. S. (1985). The role of self-concept in vocational theory. Journal of Counseling Psychology, 32(1), 159-162.

Gottfredson, L. S. (1996). Gottfredson's theory of circumscription and compromise. In D. Brown & L. Brooks (Eds.), Career choice and development (3rd ed., pp. 179-232). San Francisco: Jossey-Bass.

Gottfredson, L. S. (1997). Why g matters: The complexity of everyday life. Intelligence, 24(1), 79-132.

Gottfredson, L. S. (2002). Gottfredson's theory of circumscription, compromise, and self-creation. In D. Brown (Ed.), Career choice and development (4th ed., pp. 85-148). San Francisco: Jossey-Bass.

Gottfredson, L. S. (2005). Using Gottfredson's theory of circumscription and compromise in career guidance and counseling. In S. D. Brown & R. W. Lent (Eds.), Career development and counseling: Putting theory and research to work (pp. 71-100). New York: Wiley.

Green, K. E., & Kluever, T. C. (1992). Components of item difficulty of Raven's matrices. Journal of General Psychology, 119, 189-199.

Haan, N., Millsap, R., & Hartka, E. (1986). As time goes by: Change and stability in personality over fifty years. Psychology and Aging, 1, 220-232. doi:10.1037/0882-7974.1.3.220

Haworth, C. M. A., Wright, M. J., Luciano, M., Martin, N. G., de Geus, E. J. C., van Beijsterveldt, C. E. M., … Plomin, R. (2009). The heritability of general cognitive ability increases linearly from childhood to young adulthood. Molecular Psychiatry. doi:10.1038/mp.2009.55

Helson, R., & Moane, G. (1987). Personality change in women from college to midlife. Journal of Personality and Social Psychology, 53, 176-186. doi:10.1037/0022-3514.53.1.176

Hoachlander, G., Sikora, A. C., & Horn, L. (2003). Community college students: Goals, academic preparation, and outcomes (NCES 2003-164).

Holland, J. L. (1997). Making vocational choices: A theory of vocational personalities and work environments (3rd ed.). Odessa, FL: Psychological Assessment Resources.

Hong, R. Y., Paunonen, S. V., & Slade, H. P. (2008). Big five personality factors and the prediction

of behavior: A multitrait-multimethod approach. Personality and Individual Differences, 45, 160-166. doi:10.1016/j.paid.2008.03.015

Horn, J. L. (1997). A basis for research on age differences in cognitive capabilities. In J. J.

McArdle, & R. W. Woodcock (Eds.), Human cognitive abilities in theory and practice. Chicago, IL: The Riverside Publishing.

Horn, J. L., & Cattell, R. B. (1966). Refinement and test of the theory of fluid and crystallized

general intelligences. Journal of Educational Psychology, 57, 253-270. doi: 10.1037/h0023816

Horn, J. L., & Noll, J. (1997). Human cognitive capabilities: Gf-Gc theory. In D. P. Flanagan, J. L.

Genshaft, & P. L. Harrison (Eds.), Contemporary intellectual assessment: Theories, tests, and issues (pp. 53–91). New York: Guilford Press.

Horn, L., Nevill, S., & Griffith, J. (2006). Profile of Undergraduates in U.S. Postsecondary

Education Institutions: 2003–04: With a Special Analysis of Community College Students (NCES 2006-184). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

Hornke, L. F., & Habon, M. W. (1986). Rule-based item bank construction and evaluation within

the linear logistic framework. Applied Psychological Measurement, 10, 369-380. Hull, D. M., Beaujean, A. A., Worrell, F. C., & Verdisco, A. E. (2010). An item-level examination of

the factorial validity of the NEO Five-Factor Inventory Scores. Educational and Psychological Measurement, 70, 1021-1041. doi: 10.1177/0013164410378091

Humphreys, L. G., & Lubinski, D. (1996). Brief history and psychological significance of assessing

spatial visualization. In C. P. Benbow & D. Lubinski (Eds.), Intellectual talent: Psychometric and social issues (pp. 116–140). Baltimore: Johns Hopkins University Press.

Humphreys, L. G., Lubinski, D., & Yao, G. (1993). Utility of predicting group membership and the

role of spatial visualization in becoming an engineer, physical scientist, or artist. Journal of Applied Psychology, 78, 250-261.

Hunt, E. (1974). Quote the Raven? Nevermore! In L. W. Gregg (Ed.), Knowledge and cognition

(pp. 129-158), Potomac, MD: Lawrence Erlbaum Associates. Jenson, A. R. (1998). The g factor: The science of mental ability. Westport, CT: Praeger.

Page 30: Hull, Darrell M., Rebecca J. Glover, and Judy A. Bolen. 2012 ...

30

Jensen, A. R. (2006). Clocking the mind: Mental chronometry and individual differences. Oxford:

Elsevier.

John, O. P., & Srivastava, S. (1999). The big five trait taxonomy: History, measurement, and theoretical perspectives. In L. A. Pervin & O. P. John (Eds.), Handbook of personality: Theory and research (2nd ed., pp. 102-138). New York, NY: Guilford Press.

Kichuk, S. L., & Wiesner, W. H. (1997). The Big Five personality factors and team performance:

Implications for selecting successful product design teams. Journal of Engineering Technology Management, 14, 195-221.

Koenig, K. A., Frey, M. C., & Detterman, D. K. (2008). ACT and general cognitive ability.

Intelligence, 36, 153-160. Kohs, S. C. (1920). The block-design tests. Journal of Experimental Psychology, 3, 357-376. doi:

10.1037/h0074466

Larson, L. M., Rottinghaus, P. J., & Borgen, F. H. (2002). Meta-analyses of Big Six interests and Big Five personality variables. Journal of Vocational Behavior, 61, 217–239.

Lau, S., Roeser, R. W., & Kupermintz, H. (2002). On cognitive abilities and motivational

processes in students’ science engagement and achievement: A multidimensional approach to achievement validation. (CSE Technical Report 570). Los Angeles, CA: National Center for Research on Evaluation, Standards, and Student Testing, UCLA.

Lent, R. W., Brown, S. D., & Hackett, G. (1994). Toward a unifying social cognitive theory of

career and academic interest, choice, and performance. Journal of Vocational Behavior, 45, 79-122.

Lohman, D. F. (1988). Spatial abilities as traits, processes, and knowledge. In R. J. Sternberg

(Ed.). Advances in the psychology of human intelligence (Vol. 4, pp. 181–248). Hillsdale, NJ: Erlbaum.

Lohman, D. F. (1994a). Spatial ability. In R. J. Sternberg (Ed.), Encyclopedia of intelligence (Vol. 2,

pp. 1000–1007). New York: Macmillan. Lohman, D. F. (1994b). Spatially gifted, verbally inconvenienced. In N. Colangelo, S. G.

Assouline, & D. L. Ambroson (Eds.), Talent development: Proceedings from the 1993 Henry B. and Jocelyn Wallace National research symposium on talent development (Vol. 2, pp. 251–264). Dayton: Ohio Psychology Press.

Lohman, D. F. (2005). The role of nonverbal ability tests in identifying academically gifted

students: An aptitude perspective. Gifted Child Quarterly, 49, 111-138.

Page 31: Hull, Darrell M., Rebecca J. Glover, and Judy A. Bolen. 2012 ...

31

Long, B. T., & Kurlaender, M. (2009). Do community colleges provide a viable pathway to a

baccalaureate degree? Educational Evaluation and Policy Analysis, 31(1), 30-53. doi 10.3102/0162373708327756

Lubinski, D. (2000). Scientific and social significance of assessing individual differences: “Sinking

shafts at a few critical points.” Annual Review of Psychology, 51, 405-444. Lubinski, D., & Dawis, R. V. (1995). Assessing individual differences in human behavior: New

methods, concepts, and findings. Palo Alto, CA: Davies-Black. Lykken, D. T., Bouchard, T. J., Jr., McGue, M., & Tellegen, A. (1993). Heritability of interests: A

twin study. Journal of Applied Psychology, 78, 649–661. Marshalek, B., Lohman, D. F., & Snow, R. E. (1983). The complexity continuum in the radex and

hierarchical models of intelligence. Intelligence, 7, 107-127. McCall, R. B. (1981). Nature-nurture and the two realms of development: A proposed

integration with respect to mental development. Child Development, 52, 1-12. McGrew, K. S. (2005). The Cattell-Horn-Carroll theory of cognitive abilities: Past, present, and

future. In D. P. Flanagan & P. L. Harrison (Eds.) Contemporary intellectual assessment: Theories, tests, and issues (2nd ed., pp. 136-181). New York: Guilford Press.

McGrew, K. S. (2009). CHC theory and the human cognitive abilities project: Standing on the

shoulders of the giants of psychometric intelligence research. Intelligence, 37, 1-10. doi:10.1016/j.intell.2008.08.004

Miele, F. (2002). Intelligence, race, and genetics: Conversations with Arthur R. Jensen.

Cambridge, MA: Westview Press. Moloney, D. P., Bouchard, T. J., & Segal, N. L. (1991). A genetic and environmental analysis of

the vocational interests of monozygotic and dizygotic twins reared apart. Journal of Vocational Behavior, 39, 76–109.

Mount, K. M., Barrick, M. R., Tippie, H. B., Scullen, S. M., & Rounds, J. B. (2005). Higher order

dimensions of the Big Five personality traits and the Big Six interest types. Personnel Psychology, 58, 447–478.

Mulholland, T. M., Pellegrino, J. W., & Glaser, R. (1980). Components of geometric analogy

solution. Cognitive Psychology, 12, 252-284. Muthén, L. K. and Muthén, B. O. (1998-2012). Mplus User’s Guide. Fifth Edition. Los Angeles,

CA: Muthén & Muthén.

Page 32: Hull, Darrell M., Rebecca J. Glover, and Judy A. Bolen. 2012 ...

32

Neisser, U., Boodoo, G., Bouchard, T. J., Boykin, A. W., Brody, N., Ceci, S. J., et al. (1996).

Intelligence: Knowns and unknowns. American Psychologist, 51, 77-101. Plomin, R., & Spinath, G. M. (2004). Intelligence: Genetics, genes, and genomics. Journal of

Personality and Social Psychology, 86, 112-129. Plomin, R., DeFries, J. C., McClearn, G. E., & McGuffin, P. (2008). Behavioral genetics (5th ed.),

New York: Worth. R Development Core Team (2011). R: A language and environment for statistical computing. R

Foundation for Statistical Computing, Vienna, Austria. ISBN 3-900051-07-0, URL http://www.R-project.org/.

Raven, J., Raven, J. C., & Court, J. H. (2000). Raven manual: Section 3 standard progressive matrices. San Antonio, TX: Pearson.

Raven, J. C., Raven, J. E., & Court, J. H. (1998). Progressive matrices. Oxford, England: Oxford Psychologists Press.

Roberts, B. W., Kuncel, N. R., Shiner, R., Caspi, A., & Goldberg, L. R. (2007). The power of

personality: The comparative validity of personality traits, socioeconomic status, and cognitive ability for predicting important life outcomes. Perspectives on Psychological Science, 2, 313-345. doi:10.1111/j.1745-6916.2007.00047.x

Robins, R. W., Fraley, R. C., Roberts, B. W., & Trzesniewski, K. H. (2001). A longitudinal study of

personality change in young adulthood. Journal of Personality, 69, 617-640. doi:10.1111/1467-6494.694157

Royer, J. (1999). The Computer-based Academic Assessment System (Windows Version).

Belchertown, MA.

Rumelhart, D. E., & Abrahamson, A. A. (1973). A model for analogical reasoning. Cognitive Psychology, 5, 1-28.

Scarr, S. (1992). Developmental theories for the 1990s: Development and individual differences.

Child Development, 63, 1-19. Scarr, S., & McCartney, K. (1983). How people make their own environments: A theory of

genotype → environment effects. Child Development, 54, 424-435. Schmidt, F. L., & Hunter, J. E. (1998). The validity and utility of selection methods in personnel

psychology: Practical and theoretical implications of 85 years of research findings. Psychological Bulletin, 124, 262–274.

Page 33: Hull, Darrell M., Rebecca J. Glover, and Judy A. Bolen. 2012 ...

33

Shepard, R. N. & Metzler, J. (1971). Mental rotation of three-dimensional objects. Science, 19, 701-703. doi: 10.1126/science.171.3972.701

Shipley, W. C., Gruber, C. P., Martin, T. A., & Klein, A. M. (2009). Shipley-2 manual. Los Angeles, CA: Western Psychological Services.

Smith, I. M. (1964). Spatial ability: Its educational and social significance. London: University of London Press.

Snow, R. E. (1999). Commentary: Expanding the breadth and depth of admissions testing. In S.

Messick (Ed.), Assessment in higher education (pp. 133–140). Hillsdale, NJ: Erlbaum. Spearman, C. (1904). General intelligence, objectively determined and measured. American

Journal of Psychology, 15, 201-292. Sternberg, R. J. (1977). A component process in analogical reasoning. Psychological Review, 84,

353-378. Sternberg, R. J. (1978). Isolating the components of intelligence. Intelligence, 2, 117-128. Sternberg, R. J. (1980). Sketch of a componential subtheory of human intelligence. Behavioral

and Brain Sciences, 3, 573-613. Sternberg, R. J. (1986). Toward a unified theory of human reasoning. Intelligence, 10, 281-314. Sternberg, R. J. (1997). The triarchic theory of intelligence. In D. P. Flanagan, J. L. Genshaft, & P.

L. Harrison (Eds.), Contemporary intellectual assessment: Theories, tests, and issues (pp. 92-104). New York: Guilford Press.

Super, D. E., & Bachrach, P. B. (1957). Scientific careers and vocational development theory.

New York: Bureau of Publications, Teachers College, Columbia University. Wai, J., Lubinski, D., & Benbow, C. P. (2009). Spatial ability for STEM domains: Aligning over 50

years of cumulative psychological knowledge solidifies its importance. Journal of Educational Psychology, 101, 817-835.

Webb, R. M., Lubinski, D., & Benbow, C. P. (2007). Spatial ability: A neglected dimension in

talent searches for intellectually precocious youth. Journal of Educational Psychology, 99, 397-420.

Welch, B. L. (1947). The generalization of "Student's" problem when several different

population variances are involved. Biometrika, 34(1-2), 28–35. doi: 10.1093/biomet/34.1-2.28


Appendix A: Site Administrator Instructions for Participant Recruitment and Assessment


NSF Grant Test Administration Instructions

NIH certification is required for anyone who recruits or administers any of these tests. The first website below is a University of North Texas site that explains UNT's policy on IRBs and NIH training, should anyone want to know. The second website links to the actual training:

http://research.unt.edu/ors/complaince/trainingirb.htm
http://phrp.nihtraining.com/users/login.php

Recruit 100 subjects nearing completion of their respective programs: 50 technical, 50 non-technical. Try for 25 males and 25 females in each group.

Group Administered Tests: Group Administration Materials Needed

Provided by test site:
• Quiet, well-lit room with table and chairs or desks
• Calculators (only four-function, scientific, or graphing calculators permitted)

Provided by UNT:
• Pens
• Pencils
• Pencil sharpener or extra pencils
• Scratch paper
• Reliable watch, stopwatch, or clock
• Individual computer test time sign-up sheet
• UNT IRB Informed Consent Form (105)
• Participant Survey (105)
• ASSET test booklets Form E2 (14)
• Pearson NCS Test Sheets (105)
• NEO FFI test booklets (110)
• Shipley Vocabulary test booklets (110)
• Shipley Block test booklets (110)
• Ravens Progressive Matrices Plus test booklets (20)
• Ravens SPM Plus Answer Sheets (110)

Group Administered Test session

1. Introduce yourself.
2. Tell the group the following:

On behalf of Dr. Darrell Hull and Dr. Rebecca Glover of the University of North Texas, thank you for participating in this study. This work is supported by a National Science Foundation grant through the University of Colorado. The study is looking for measurable ability differences between technical and non-technical students enrolled in 2-year post-high school programs. No individual's results will be identified in this study; only the aggregate data are of interest. Data will


be collected in this group session, plus one session that will be conducted individually. The group session consists of five paper-pencil tests. The individual session consists of a computer-aided test measuring response time for identifying the similarity of two cube objects that need to be mentally rotated. In appreciation for your participation you will receive $30 after completing the individual computer session.

3. In ink, have each participant read and fill out the University of North Texas Institutional Review Board Informed Consent form. A copy will be returned to the subject at the individually administered computer test. Contact information is being collected so the researchers can follow up with participants should the study be extended.

4. In ink, have each participant fill out the Participant Survey. At the top of the form the participant will generate their ID number using the following format: three initials, date of birth (mmddyy), and two-letter state abbreviation [e.g., DMH052462TX]. This ID number will be used as the identifier on all the tests to follow.
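The ID format in step 4 can be expressed as a small helper. This is an illustrative sketch only; the `participant_id` function is invented here and is not part of the study materials:

```python
from datetime import date

def participant_id(initials: str, dob: date, state: str) -> str:
    """Build a participant ID: three initials + birth date (mmddyy) + state."""
    if len(initials) != 3 or not initials.isalpha():
        raise ValueError("initials must be exactly three letters")
    if len(state) != 2 or not state.isalpha():
        raise ValueError("state must be a two-letter abbreviation")
    # %m%d%y gives the zero-padded mmddyy form used in the instructions
    return f"{initials.upper()}{dob.strftime('%m%d%y')}{state.upper()}"

print(participant_id("DMH", date(1962, 5, 24), "TX"))  # DMH052462TX
```

The example reproduces the ID given in the instructions for a participant with initials DMH, born May 24, 1962, in Texas.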

5. Administer the tests in the following order using the instructions provided in the next pages. ASSET Elementary Algebra (25 minutes), NEO FFI (typically 10-15 minutes), Shipley Vocabulary (10 minutes), Shipley Block Pattern (12 minutes), Ravens Progressive Matrices (typically completed within 45 minutes).

ASSET Elementary Algebra test

• Calculators may be used; however, all problems on the test can be solved without using calculators.
• Provide each subject with a #2 pencil, ASSET Form E2 test booklet, and Pearson NCS 100 test sheet. No marks are to be made in the test booklet.
• Have each subject put their ID number on their answer document.
• Have each subject open the test booklet to page 20.
• Read the following directions aloud to the group:

1. This is an elementary algebra test consisting of 25 questions. You will have 25 minutes to work on it. Solve each problem, choose the correct answer, and then fill in the corresponding space on your answer sheet with your pencil. You may change your answer by cleanly erasing your first choice and filling in your subsequent choice. Please avoid leaving any stray marks on the answer document as this may affect the machine scoring of your document.

2. You are permitted to use any approved calculator on this test. You may use your calculator for any problems you choose, but some problems may best be done without using a calculator. All the problems on the test can be solved without using a calculator.

3. Do your figuring on the scratch paper provided. DO NOT WRITE IN THE TEST BOOKLET.

4. Do not linger over problems that take too much time. Solve as many as you can; then return to the others in the time you have left. You will have 25 minutes to work on this test. Work quickly and carefully.

5. Any questions?
6. You may turn the page and begin.


• Give the subjects 25 minutes to complete the test. During this time watch that the subjects are working independently and that they are not marking in the test booklets.
• Announce when the time is up.
• Double check that their ID number is on the answer document.
• Collect the answer documents, test booklets, and calculators.
• Ask if the subjects would like a short break before continuing to the next test.

NEO-FFI

• Provide each subject with a NEO-FFI test booklet (Form S) and a pencil.
• Do not remove the perforated edges.
• Read the following instructions aloud to the group:

1. Write only where indicated in this booklet.
2. Carefully read all of the instructions before beginning.
3. This questionnaire contains 60 statements. Read each statement carefully. For each statement fill in the circle with the response that best represents your opinion. Make sure that your answer is in the correct box. Your answer choices are SD if you strongly disagree with the statement, D if you disagree, N if you are neutral, A if you agree, and SA if you strongly agree.

4. The form does not allow for erasures. If you need to change an answer, make an X through the incorrect response and then fill in the correct response.

5. Note that the responses are numbered in rows, so you will work across the form when filling in your responses.

6. Before you begin responding to the statements be sure to fill in your ID number on top of the left page.

7. When you are finished, close the test booklet and return it to me.
8. You may open the booklet and begin.
9. Again, remember to fill in your ID number.

Shipley-2 Vocabulary

• Provide each subject with a Shipley-2 Vocabulary test booklet.
• It can be taken in pencil.
• Read the following instructions aloud to the group:

1. Do not remove the perforated edge.
2. Remember to fill in your ID number on the front page. Do not remove the front page.
3. This task is about word meaning.
4. When you are finished, close the test booklet and return it to me.
5. You may read the instructions and begin.

• Give the group 10 minutes to complete this test.
• When they return their test booklets, make sure that their ID number is on the booklet.

Shipley-2 Block Patterns

• Provide each subject with a Shipley-2 Block Patterns Form.
• It can be taken in pencil.


• Read the following instructions aloud to the group:

1. Remember to fill in your ID number on the front page.
2. The next task shows some block patterns to figure out. It is on two pages that face each other. You can do both pages without stopping.
3. When you are finished, close the test booklet and return it to me.
4. You may read the instructions and begin.

• Give the group 12 minutes to complete this test.
• When they return their test booklets, make sure that their ID number is on the booklet.
• Ask if the group would like a short break before continuing to the last test.

Raven’s Progressive Matrices (SPM Plus Sets A-E)

• Provide each subject with a Raven's Progressive Matrices test booklet and SPM Plus Answer Sheet.
• This test can be taken in pencil or pen.
• Read the following instructions aloud to the group:

1. Do not remove the perforated edge on the answer sheet.
2. Fill in your ID number on the answer sheet.
3. Do not make any marks on the test booklet. All of your answers should be made on the answer sheet.
4. This is a test of observation and clear thinking.
5. Please open your test booklet to the first page. Look at Problem A1.
6. Now look at your answer sheet.
7. You will see that under the heading Set A there is a column of identifiers, A1 through A12. Your answers for Set A will go in this column.
8. Now look back at your test booklet.
9. In the figure A1 at the top of the page there is a piece missing. You are to select the piece from those provided on the bottom half of the page that will complete the figure on the top half of the page. Only one of these pieces is perfectly correct.
10. (Give them a moment to examine the pieces.)
11. Number 4 is the piece needed to complete figure A1. Mark it on your answer sheet by drawing a single horizontal line through number 4 next to identifier A1 in the Set A column. Do not mark in the test booklet.
12. Now turn the page and examine Figure A2.
13. (Give them a moment to examine the pieces.)
14. The right answer is number 5. Mark number 5 on your answer sheet next to A2 in the Set A column.
15. If you would like to change an answer after you have marked your answer sheet, do not try to erase your answer. Draw an X through it and put a single horizontal line through your correct selection.
16. You may skip items and return to them. When you turn in your test do not leave any blank. If you are not sure, mark your best guess.
17. When you are finished, turn in your answer sheet, test booklet, pen, and pencil. Also sign up for your individual computer test time slot before you leave.
18. Remember to fill in your ID number, and do not make any marks in the test booklet.
19. Any questions?


20. You may proceed to item A3 and work at your own pace. You will have as much time as you need.

• At the end of 20 minutes, ask those taking the test to circle the number of the problem they are working on at that time.
• Collect their test booklets and answer sheets. Check to make sure that their ID number is on the answer sheet. Also make sure that each subject signs up for their individual computer test time.
• Note: it will be useful to give the subjects a reminder phone call or email about their appointment, so be sure to get their contact information before they leave.
• All of the group administered tests will be scored at UNT. You do not need to score them before shipping.


Individually Administered Test: Individual Administration Materials Needed

Provided by test site:
• Quiet, well-lit room with table, 2 chairs, and laptop or desktop computer with speakers
• CAAS program installation (license provided by UNT; installation completed by test site)
• Shepard and Metzler task and image installation (files provided by UNT on thumb drive; files installed by test site)

Provided by UNT:
• Microphone headset
• Shepard and Metzler image file key (105)
• Receipt book

Using the CAAS System to Test Individual Response Time

Overview

Each individual will view a pair of cube images. They must examine the pair and determine whether or not the two cubes are in the same configuration. The images could be the same but displayed from a different angle, so the individual may have to mentally rotate an image to decide whether the two are the same. Each subject will view 100 pairs of images. The pairs are broken into five categories to allow for breaks, if needed, after every 20 pairs.

The subject will say "same" if the pair is the same or "different" if the pair is different into a headset with a microphone. You will hear a sound when the microphone detects their response. This is why it is important that a short microphone calibration be made for each individual before they begin testing. Have the subject speak at the same volume during the calibration as during the testing.

The administrator will indicate a correct response by clicking the left mouse button, an incorrect response by clicking the right mouse button, and an input error by clicking both the left and right mouse buttons simultaneously. A tone will sound when the item is scored to indicate whether the answer was correct, incorrect, or eliminated by the administrator due to input error. An input error may occur when the microphone picks up a sound that was not the response (hence the importance of a quiet testing area), or when the microphone does not pick up the response at all. Response time, as well as accuracy, is a measure of interest; if the microphone does not pick up the response when it is first made, the response time will be measured as longer than it actually was.

The test administrator will use a Shepard and Metzler Images File Key for each subject. Be sure to fill in their ID number. Any item that was removed due to input error should be marked on this form, along with any other irregularities that may have occurred during testing.

Should a break be needed before the end of a category, one can be taken after the microphone detects a response but BEFORE the item is scored. The test administrator should make note of the accuracy of the response on the Shepard and Metzler Images File Key form so as not to forget it by the time it needs to be entered into the system to continue. There is a two-second delay between scoring the previous item and the next pair being displayed, to give the administrator an opportunity to change the score.
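The scoring rules described above (left click = correct, right click = incorrect, both = input error excluded from analysis) can be sketched in Python. This is an illustrative model only; the `Trial`, `score_trial`, and `mean_rt` names are invented and do not reflect the internals of the CAAS software:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Trial:
    pair_id: int             # 1-100, presented in five categories of 20
    response_time_ms: float  # stimulus onset to microphone voice-key detection
    left_click: bool         # administrator marked the response correct
    right_click: bool        # administrator marked the response incorrect

def score_trial(t: Trial) -> str:
    """Map the administrator's mouse input to a trial outcome."""
    if t.left_click and t.right_click:
        return "input_error"  # both buttons: stray noise or missed response
    if t.left_click:
        return "correct"
    if t.right_click:
        return "incorrect"
    return "unscored"

def mean_rt(trials):
    """Mean response time over scored trials, excluding input errors."""
    valid = [t.response_time_ms for t in trials
             if score_trial(t) in ("correct", "incorrect")]
    return mean(valid) if valid else None

trials = [
    Trial(1, 1450.0, True, False),   # correct
    Trial(2, 2210.0, False, True),   # incorrect
    Trial(3, 980.0, True, True),     # input error, dropped from the RT mean
]
print(mean_rt(trials))  # 1830.0
```

Excluding input-error trials from the mean mirrors the rationale in the overview: a late voice-key detection would inflate the measured response time, so such items are discarded rather than analyzed.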


It is strongly suggested that the test administrator practice with the system before starting with the study participants.

Test System Instructions

1. Click on the Reading Success Lab icon.

2. Click on the Assessment module.

3. A series of five Getting Started screens will appear. Read each one and then click Next to proceed to the next screen. Of particular interest is screen number 4, which informs the test administrator how to score the response given.


4. After viewing these five screens the Assistant window will appear in the upper right corner of the monitor. Click on the Mic-Sound button.

5. After clicking on the Mic Setup button, the Microphone Setup Wizard Introduction window will appear; click OK.

6. You will view a series of informational screens. This information will become redundant for the test administrator, but it is very helpful for each of the participants, so do not skip through them. First the background noise will be measured. Click OK.


7. Compare the volume of your recording with the sample of a good recording. Click OK if your recording is acceptable, or click Re-Record to try again.

8. The subject will now be asked to record four different words. It is important that the subject speak in the same manner and volume that they will use during the test.

9. Compare the volume of your recording with the sample of a good recording. Click OK if your recording is acceptable or click Re-Record to try again.


10. An AudioInWizard Results window will appear. If you are satisfied with the results, click OK; if not, click Re-try.

11. The Assistant window will reappear. Click on the New User button.

12. Enter the subject's ID number into the first name field, select their sex, and enter their birthdate. You may ignore the grade field and the optional information box. Click OK.

13. Click OK in the Creating New User Logon box.


14. The Assistant window will reappear. Click on the Run Task/Eval button.

15. Click on the More Tasks… button.

16. Click on the More Tasks… button again.

17. Click on the Aux1 Tasks button.


18. Select task 1, Shepard_Metzler_images, and click OK. (Note: your system will only show one task for Shepard and Metzler.)

19. A series of four Points to Remember screens will be displayed.

20. An Instructions screen will be displayed.


21. Five practice screens will appear before the actual test items begin.

22. A Start Task window will appear before the first test item. Click OK when ready.

23. Five categories of 20 pairs will be displayed. A break can be taken, if needed, between the categories. Should a break be needed before the end of a category, one can be taken after the microphone detects a response but BEFORE the item is scored.


24. When the subject has completed all of the items, return a copy of the signed UNT IRB Informed Consent form to the subject, pay them $30, fill out the receipt book, have them sign the receipt, and thank them for participating.

25. After all subjects have been tested, transfer the data file to the thumb drive and return it to UNT with the other materials. Use the following path to find the data file:

My Computer → C Drive → Program Files folder → Educational Help folder → Caas System folder → Materials folder → ENU folder → scroll down to shepard_metzler_images.scr

• Copy this file to the thumb drive.

Return all items, used and unused, provided by UNT to this address:

Dr. Darrell Hull
Assistant Professor
Educational Psychology Department
Matthews Hall Rm 304
1155 Union Circle #311335
Denton, TX 76203

Deadline

Our first choice would be to have the data collected and returned to UNT by late August. If you do not have enough graduates at this time to participate, February would be the latest deadline. Please keep us advised of your progress.

Contact Information

Feel free to contact Judy Bolen via email at [email protected] or phone at XXXXXXXXXXX with your status or questions. Thank you.