2007 Annual Conference
Best Practices in Job Analysis
Speakers: Patricia Muenzen, Lee Schroeder
Moderator: Sandra Greenberg
Council on Licensure, Enforcement and Regulation
Atlanta, Georgia
The Survey Experience
• What makes people take surveys?
• What is their experience with both paper and pencil and computer-delivered surveys?
• How can you design and administer the best possible survey?
Components of the Survey Experience
• Survey content (what)
• Survey sampling plan (who)
• Survey administration mode (how)
• Survey taker experience (where, when, why)
Designing a Successful Survey
• Survey Content (what)
– Statements to be rated
– Rating scales
– Demographic data collection
– Qualitative data collection
Survey Content – Statements to be Rated
• Statements to be rated may include domains of practice, tasks/activities, and knowledge and skills
• How thorough is your development process? Who is involved? What process do they use?
Survey Content – Rating Scales
• Rating scales may include time spent/frequency, importance/criticality, proficiency at licensure
• How do you select your rating scales? What questions do you want to answer? How many scales do you use?
Survey Content – Demographic Data Collection
• Demographic questions may include years of experience, work setting, educational background, organizational size
• What do you need to know about your respondents? How might respondent background characteristics influence their survey ratings?
Survey Content – Qualitative Data Collection
• Open-ended questions might address items missing from the survey, changes in the practice of the profession, reactions to the survey
• What information do your respondents need to provide that is not available through closed questions?
Delivering a Successful Survey – Sampling Plan
• Survey sampling plan (who)
• Who do you want to take your survey? What should your sample size be? Should you stratify?
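If you do decide to stratify, the idea is simply to draw from each subgroup in proportion to its share of the population. The sketch below illustrates proportional stratified sampling; the roster, the `region` field, and the counts are all hypothetical, not from the study.

```python
import random
from collections import defaultdict

def stratified_sample(roster, stratum_key, total_n, seed=0):
    """Draw a proportional stratified sample from a roster of dicts."""
    random.seed(seed)
    strata = defaultdict(list)
    for person in roster:
        strata[person[stratum_key]].append(person)
    sample = []
    for members in strata.values():
        # Each stratum contributes in proportion to its share of the roster.
        k = round(total_n * len(members) / len(roster))
        sample.extend(random.sample(members, min(k, len(members))))
    return sample

# Hypothetical roster: 700 urban and 300 rural licensees, sample of 100
roster = ([{"id": i, "region": "urban"} for i in range(700)]
          + [{"id": i, "region": "rural"} for i in range(700, 1000)])
picked = stratified_sample(roster, "region", 100)
```

A proportional allocation like this keeps the sample's subgroup mix matched to the population; disproportionate allocation (oversampling small strata) is an alternative when small subgroups must be analyzed separately.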
Delivering a Successful Survey – Administration
• Survey administration mode (how)
– Delivery could be via a traditional paper survey (TPS) or a computer-based survey (CBS)
– Which administration mode should you select?
TPS vs. CBS
• TPS
– Probably more likely that invitation to participate is delivered
– More rating scales per page
– Better with non-technologically sophisticated audience
• CBS
– Cheaper
– Quicker
– Logistically easier to do versioning
– Inexpensive options for visual display (color, graphics)
– Qualitative questions: more responses & easier to read
– In-process data verification
Comparability of Survey Administration Modes
• An empirical investigation of two data collection modes: Traditional Paper Survey (TPS) and Computer-Based Survey (CBS)
General Survey Characteristics
• Two data collection modes were used: Traditional Paper Survey (TPS) and Computer-Based Survey (CBS)
• A total of 12,000 were sampled
– 6,000 TPS
– 6,000 CBS
• Total of 150 activities rated for frequency and priority
• Approximately 20 demographic questions
Three General Research Questions
• Is there a difference in response rates across modes?
• Are there demographic differences across the administration modes (e.g., do younger people respond more frequently to one mode over another)?
• Are there practical administration mode differences in developing test specifications?
Response Rate
• TPS
– Out of 6,000, 263 were removed due to bad addresses and limited practice
– 1,709 were returned for an adjusted return rate of 30%
• CBS
– Out of 6,000, 1,166 were removed due to bad email addresses and limited practice
– 1,115 were returned for an adjusted return rate of 21%
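The arithmetic behind an "adjusted" return rate, assuming (as the removal counts above suggest) that undeliverable and ineligible invitations are subtracted from the denominator, is a one-liner; the TPS figures from this slide are used as the worked example.

```python
def adjusted_return_rate(sampled, removed, returned):
    """Adjusted return rate: returns divided by deliverable invitations."""
    deliverable = sampled - removed
    return returned / deliverable

# TPS figures from the study: 6,000 sampled, 263 removed, 1,709 returned
tps_rate = adjusted_return_rate(6000, 263, 1709)  # 1,709 / 5,737, about 30%
```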
Response Rate
• With the aggressive mailing (5 stages, incentives, and reduced survey length), the TPS had a response rate 9 percentage points higher than the CBS in this study
• That being said, return rates at or above 20% for unsolicited surveys are typical
• The effect of spam filters is unknown
Demographic Differences
• Very few differences
• It appears that the administration mode had no effect on the respondent population
Demographic Differences
• Sample geographic location question

Geo Location   TPS      CBS
Urban          62.2 %   63.2 %
Suburban       25.5 %   27.5 %
Rural          12.2 %    9.5 %
Demographic Differences
• Do the activities represent what you do in your position?
• Slight difference in perception, although very high for both modes.
Mode   Yes
CBS    92.0%
TPS    95.8%
Rating Scales
• Mean Frequency Ratings: six-point scale (0 to 5), 150 tasks
– CBS mean = 2.24
– TPS mean = 2.18
• Mean Priority Ratings: four-point scale (1 to 4), 150 tasks
– CBS mean = 2.96
– TPS mean = 2.95
Test Specifications
• Small differences were observed in the mean ratings
• Do these differences affect decisions made on test content?
Test Specifications
• Evaluated inclusion criteria for final outline
• Created artificial task exclusion criterion (1.25 standard deviation units below the mean for each administration modality)
• Frequency:
– CBS mean = 2.24, cutpoint = .83
– TPS mean = 2.18, cutpoint = .78
• Priority:
– CBS mean = 2.96, cutpoint = 2.40
– TPS mean = 2.95, cutpoint = 2.40
• Activities above the cutpoint are included; those below are excluded from the final content outline
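A minimal sketch of this exclusion rule follows. The standard deviations were not reported on the slide, so the SD value below is implied, not given: the CBS frequency figures imply sd ≈ 1.128, since 2.24 − 1.25 × 1.128 ≈ 0.83. The task list in the usage example is hypothetical.

```python
def exclusion_cutpoint(mean, sd, k=1.25):
    """Cutpoint k standard-deviation units below the scale mean."""
    return mean - k * sd

def final_outline(tasks, ratings, cutpoint):
    """Keep tasks whose mean rating is above the cutpoint."""
    return [t for t, r in zip(tasks, ratings) if r > cutpoint]

# Implied CBS frequency SD (2.24 - 1.25 * 1.128 = 0.83)
cut = exclusion_cutpoint(2.24, 1.128)

# Hypothetical mean ratings for three tasks
kept = final_outline([1, 2, 3], [0.5, 2.0, 0.9], cut)
```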
Test Specifications – Frequency Ratings
• Over 99% classification accuracy: only 1 difference in activities excluded (task #68 on CBS and task #29 on TPS)

Activity   CBS (cutpoint .83)   Activity   TPS (cutpoint .78)
142 0.2609 142 0.3133
76 0.3411 117 0.4734
117 0.4906 100 0.4907
100 0.5049 76 0.5213
131 0.5911 138 0.5244
138 0.6093 139 0.5352
58 0.6551 140 0.5400
139 0.6608 131 0.5730
140 0.6617 94 0.5881
74 0.6787 113 0.6442
113 0.7346 58 0.7263
68 0.7445 29 0.7280
94 0.8140 74 0.7470
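The excluded sets from the two columns above can be compared directly with set operations; the symmetric difference isolates the activities the two modes classified differently (the exact accuracy percentage depends on whether the swapped pair is counted as one difference or two).

```python
# Excluded activity numbers from the frequency table above
cbs_excluded = {142, 76, 117, 100, 131, 138, 58, 139, 140, 74, 113, 68, 94}
tps_excluded = {142, 117, 100, 76, 138, 139, 140, 131, 94, 113, 58, 29, 74}

# Activities classified differently by the two modes
disagreements = cbs_excluded ^ tps_excluded  # symmetric difference: {68, 29}

# Per-activity agreement across all 150 rated activities
agreement = 1 - len(disagreements) / 150
```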
Test Specifications – Priority Ratings
• Over 99% classification accuracy

Activity   CBS (cutpoint 2.4)   Activity   TPS (cutpoint 2.4)
142 1.717 142 1.726
11 1.727 11 1.815
7 1.926 55 1.846
55 1.958 8 2.068
9 1.971 7 2.095
117 2.004 117 2.101
107 2.116 9 2.148
138 2.135 10 2.198
8 2.153 139 2.199
10 2.203 105 2.271
45 2.236 45 2.294
139 2.255 138 2.305
41 2.264 41 2.305
105 2.302 60 2.306
60 2.306 106 2.332
58 2.331 107 2.332
106 2.407 89 2.366
Results of Test Specification Analysis
• The number of misclassifications approaches random error
• Differences are within a standard error of the task exclusion cutpoint
• Because these activities fall near the standard error, they are reviewed by the committee for final inclusion
Conclusions
• Results are based on this limited sample and may not generalize
• Response rate was higher for TPS
• No differences in respondent sample (demographics)
• TPS group had a slightly more agreeable opinion of elements
• Most importantly, there were no practical differences in final test specification development
Conclusions (continued)
• Cost
– TPS: over $6.00 in postage and printing per survey (5-stage mailing), plus mailing labor. A conservative estimate would be $6.50 per unit, or in this case $39,000 (estimate excludes data entry and scanning)
– CBS: initial cost for survey setup and QC. No postage, printing, or scanning, and limited labor after initial setup. Cost is probably less than $10,000 for the administration of this type of survey
Delivering a Successful Survey – User Experience
• Survey taker experience (where, when, why)
• To enhance the user experience, the survey should be maximally:
– Accessible
– Visually appealing
– Easy to complete
– Relevant
User experience – Suggestions
• To reduce time demands, create different versions of survey
• To ensure questions are correctly targeted to respondent subgroups, use routing
• To motivate respondents, use PR campaign and participation incentives
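Routing can be as simple as a lookup from a screening answer to the question blocks that subgroup should see. The sketch below is purely illustrative; the work settings and block names are hypothetical, not from the study.

```python
# Hypothetical routing table: question blocks keyed on a screening answer
ROUTES = {
    "clinical": ["patient_care_tasks", "documentation_tasks"],
    "administrative": ["management_tasks", "documentation_tasks"],
    "educator": ["teaching_tasks"],
}

def next_blocks(work_setting):
    """Return the question blocks to show for a given work setting."""
    return ROUTES.get(work_setting, ["general_tasks"])
```

In a computer-based survey this lookup runs after the screening question, so respondents never see blocks that do not apply to them; in a paper survey the same logic becomes printed skip instructions.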
Discussion
• What best practices can you share with us regarding:
– Survey content (what)?
– Survey administration mode (how)?
– Survey sampling plan (who)?
– Survey taker experience (where, when, why)?
Speaker Contact Information
Patricia M. Muenzen
Director of Research Programs
Professional Examination Service
475 Riverside Drive, 6th Fl.
New York, NY 10115
Voice: 212-367-4273
Fax: 917-305-9852
www.proexam.org
Dr. Lee L. Schroeder
President
Schroeder Measurement Technologies, Inc.
2494 Bayshore Blvd., Suite 201, Dunedin, FL 34698
Voice: 727-738-8727
Toll Free: 800-556-0484
Fax: 727-734-9397
e-mail: [email protected]
www.smttest.com