Best Practices in Job Analysis
2007 Annual Conference, Council on Licensure, Enforcement and Regulation – Atlanta, Georgia
Speakers: Patricia Muenzen, Lee Schroeder
Moderator: Sandra Greenberg
The Survey Experience
What makes people take surveys?
What is their experience with both paper-and-pencil and computer-delivered surveys?
How can you design and administer the best possible survey?
Components of the Survey Experience
Survey content (what)
Survey sampling plan (who)
Survey administration mode (how)
Survey taker experience (where, when, why)
Designing a Successful Survey
Survey content (what)
– Statements to be rated
– Rating scales
– Demographic data collection
– Qualitative data collection
Survey Content – Statements to Be Rated
Statements to be rated may include domains of practice, tasks/activities, and knowledge and skills.
How thorough is your development process? Who is involved? What process do they use?
Survey Content – Rating Scales
Rating scales may include time spent/frequency, importance/criticality, and proficiency at licensure.
How do you select your rating scales? What questions do you want to answer? How many scales do you use?
Survey Content – Demographic Data Collection
Demographic questions may include years of experience, work setting, educational background, and organizational size.
What do you need to know about your respondents? How might respondent background characteristics influence their survey ratings?
Survey Content – Qualitative Data Collection
Open-ended questions might address items missing from the survey, changes in the practice of the profession, and reactions to the survey.
What information do your respondents need to provide that is not available through closed questions?
Delivering a Successful Survey – Sampling Plan
Survey sampling plan (who)
Who do you want to take your survey? What should your sample size be? Should you stratify?
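If stratification is on the table, a minimal sketch of proportionate stratified sampling follows, assuming the population frame is a list of records keyed by the stratification variable; the field names, frame, sample size, and seed are illustrative only, not taken from the study.

import random

def stratified_sample(frame, stratum_key, total_n, seed=2007):
    # Proportionate stratified sample: each stratum contributes to the
    # sample in proportion to its share of the population frame.
    rng = random.Random(seed)
    strata = {}
    for person in frame:
        strata.setdefault(person[stratum_key], []).append(person)
    sample = []
    for members in strata.values():
        n_stratum = round(total_n * len(members) / len(frame))
        sample.extend(rng.sample(members, min(n_stratum, len(members))))
    return sample

# Illustrative frame: licensees tagged by work setting
frame = [{"id": i, "setting": "urban" if i % 3 else "rural"} for i in range(9000)]
sample = stratified_sample(frame, "setting", total_n=6000)

Oversampling small strata (disproportionate stratification) is a common variant when subgroup estimates are needed; weights are then applied at analysis time.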
Delivering a Successful Survey – Administration
Survey administration mode (how)
– Delivery could be via a traditional paper survey (TPS) or a computer-based survey (CBS)
– Which administration mode should you select?
TPS vs. CBS
TPS
– Probably more likely that the invitation to participate is delivered
– More rating scales per page
– Better with a non-technologically sophisticated audience
CBS
– Cheaper
– Quicker
– Logistically easier to do versioning
– Inexpensive options for visual display (color, graphics)
– Qualitative questions: more responses and easier to read
– In-process data verification
Comparability of Survey Administration Modes
An empirical investigation of two data collection modes: Traditional Paper Survey (TPS) and Computer-Based Survey (CBS)
General Survey Characteristics
Two data collection modes were used: Traditional Paper Survey (TPS) and Computer-Based Survey (CBS)
A total of 12,000 individuals were sampled
– 6,000 TPS
– 6,000 CBS
A total of 150 activities were rated for frequency and priority
Approximately 20 demographic questions
Three General Research Questions
Is there a difference in response rates across modes?
Are there demographic differences across the administration modes (e.g., do younger people respond more frequently to one mode than to the other)?
Are there practical administration mode differences in developing test specifications?
Response Rate
TPS
– Out of 6,000, 263 were removed due to bad addresses and limited practice
– 1,709 were returned, for an adjusted return rate of 30%
CBS
– Out of 6,000, 1,166 were removed due to bad addresses and limited practice
– 1,115 were returned, for an adjusted return rate of 21%
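The adjusted return rate treats undeliverable and ineligible cases as outside the denominator; a minimal worked check of the TPS figure, using only the values on this slide:

# Adjusted return rate = returned / (mailed - removed); TPS values from the slide above
returned_tps, mailed_tps, removed_tps = 1709, 6000, 263
adjusted_rate_tps = returned_tps / (mailed_tps - removed_tps)  # ≈ 0.298, the 30% reported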
Response Rate
With the aggressive mailing (5-stage, with incentives and a reduced survey length), the TPS had a response rate 9 percentage points higher than the CBS in this study.
That being said, return rates at or above 20% for unsolicited surveys are typical.
The effect of spam filters is unknown.
Demographic Differences
Very few differences were found.
It appears that the administration mode had no effect on the responding population.
Demographic Differences
Sample geographic location question:
Geographic Location    TPS      CBS
Urban                  62.2%    63.2%
Suburban               25.5%    27.5%
Rural                  12.2%     9.5%
Demographic Differences
Do the activities represent what you do in your position? There was a slight difference in perception, although agreement was very high for both modes.
Mode    Yes
CBS     92.0%
TPS     95.8%
Rating Scales
Mean frequency ratings, six-point scale (0 to 5), 150 tasks:
– CBS mean = 2.24
– TPS mean = 2.18
Mean priority ratings, four-point scale (1 to 4), 150 tasks:
– CBS mean = 2.96
– TPS mean = 2.95
Test Specifications
Small differences were observed in the mean ratings.
Do these differences affect decisions made about test content?
Test Specifications
Evaluated inclusion criteria for the final outline
Created an artificial task exclusion criterion: 1.25 standard deviation units below the mean for each administration modality
Frequency:
– CBS mean = 2.24, cutpoint = 0.83
– TPS mean = 2.18, cutpoint = 0.78
Priority:
– CBS mean = 2.96, cutpoint = 2.40
– TPS mean = 2.95, cutpoint = 2.40
Activities above the cutpoint are included; those below are excluded from the final content outline.
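A minimal sketch of that exclusion rule, assuming the per-task mean ratings for a modality are available; the 1.25-SD multiplier comes from the slide, but the data, function names, and task IDs below are purely illustrative.

from statistics import mean, stdev

def exclusion_cutpoint(ratings, sd_units=1.25):
    # Cutpoint set a fixed number of SD units below the mean of the task ratings
    return mean(ratings) - sd_units * stdev(ratings)

def classify_tasks(task_means, cutpoint):
    # Tasks at or above the cutpoint are retained for the content outline; the rest are excluded
    retained = {t: m for t, m in task_means.items() if m >= cutpoint}
    excluded = {t: m for t, m in task_means.items() if m < cutpoint}
    return retained, excluded

# Illustrative data: task id -> mean frequency rating for one administration mode
task_means = {1: 3.2, 2: 0.2, 3: 2.5, 4: 1.8, 5: 2.9}
cut = exclusion_cutpoint(list(task_means.values()))
retained, excluded = classify_tasks(task_means, cut)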
Test Specifications – Frequency Ratings
Over 99% classification accuracy
Only one difference in the activities excluded (task #68 on the CBS and task #29 on the TPS)
[Side-by-side table of activities relative to the frequency cutpoints (CBS 0.83, TPS 0.78) not reproduced]
Test Specifications – Priority Ratings
Over 99% classification accuracy
[Side-by-side table of activities relative to the priority cutpoint (2.40 for both modes) not reproduced]
Results of Test Specification Analysis
The number of misclassifications approaches random error.
The differences fall within one standard error of the task exclusion cutpoint.
Because these activities fall so close to the cutpoint, they are reviewed by the committee for final inclusion.
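One way to make that standard-error check concrete, assuming the standard error in question is that of a task's mean rating across its respondents; the values below are illustrative, not study data.

from math import sqrt

def standard_error(sd, n):
    # Standard error of a mean rating based on n respondents
    return sd / sqrt(n)

# Illustrative task near the frequency cutpoint: flagged for committee review
task_mean, task_sd, n_respondents, cutpoint = 0.80, 1.10, 1100, 0.83
needs_committee_review = abs(task_mean - cutpoint) <= standard_error(task_sd, n_respondents)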
Conclusions
Findings are based on this limited sample and may not generalize.
The response rate was higher for the TPS.
No differences in the respondent samples (demographics).
The TPS group had a slightly more agreeable opinion of the elements.
Most importantly, there were no practical differences in final test specification development.
Conclusions (continued)
Cost
– TPS: Over $6.00 in postage and printing for each TPS (5-stage mailing), plus mailing labor. A conservative estimate would be $6.50 per unit, or in this case $39,000 (the estimate excludes data entry and scanning).
– CBS: Initial cost for survey setup and QC. No postage, printing, or scanning, and limited labor after the initial setup. The cost is probably less than $10,000 for the administration of this type of survey.
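As an arithmetic check of the paper-survey estimate, assuming the per-unit figure is applied to the full 6,000-person TPS sample:

per_unit_tps, n_tps = 6.50, 6000
total_tps_cost = per_unit_tps * n_tps  # 39000.0, matching the $39,000 figure above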
Delivering a Successful Survey – User Experience
Survey taker experience (where, when, why)
To enhance the user experience, the survey should be maximally:
– Accessible
– Visually appealing
– Easy to complete
– Relevant
User Experience – Suggestions
To reduce time demands, create different versions of the survey.
To ensure questions are correctly targeted to respondent subgroups, use routing (see the sketch below).
To motivate respondents, use a PR campaign and participation incentives.
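A minimal illustration of routing (skip logic), assuming each respondent record carries an earlier answer such as work setting; the field values and block names are hypothetical.

def next_block(respondent):
    # Route the respondent to the question block that matches an earlier answer
    setting = respondent.get("work_setting")
    if setting == "hospital":
        return "hospital_task_block"
    if setting == "private_practice":
        return "private_practice_task_block"
    return "general_task_block"

# Example: a hospital-based respondent skips the private-practice items entirely
next_block({"work_setting": "hospital"})  # -> "hospital_task_block"

In a computer-based survey the routing runs automatically; in a paper survey the same logic has to be expressed as printed skip instructions, which is one reason versioning is logistically easier online.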
Discussion
What best practices can you share with us regarding:
– Survey content (what)?
– Survey administration mode (how)?
– Survey sampling plan (who)?
– Survey taker experience (where, when, why)?
Speaker Contact Information
Patricia M. Muenzen
Director of Research Programs
Professional Examination Service
475 Riverside Drive, 6th Fl., New York, NY
Voice:  Fax:

Dr. Lee L. Schroeder
President
Schroeder Measurement Technologies, Inc.
Bayshore Blvd., Suite 201, Dunedin, FL
Voice:  Toll Free:  Fax: