
1 2007 Annual Conference Best Practices in Job Analysis Speakers: Patricia Muenzen Lee Schroeder Moderator: Sandra Greenberg

2 Atlanta, Georgia 2007 Annual Conference Council on Licensure, Enforcement and Regulation The Survey Experience What makes people take surveys? What is their experience with both paper and pencil and computer-delivered surveys? How can you design and administer the best possible survey?

3 Components of the Survey Experience Survey content (what) Survey sampling plan (who) Survey administration mode (how) Survey taker experience (where, when, why)

4 Designing a Successful Survey Survey content (what) –Statements to be rated –Rating scales –Demographic data collection –Qualitative data collection

5 Survey Content – Statements to be Rated Statements to be rated may include domains of practice, tasks/activities, and knowledge and skills How thorough is your development process? Who is involved? What process do they use?

6 Survey Content – Rating Scales Rating scales may include time spent/frequency, importance/criticality, proficiency at licensure How do you select your rating scales? What questions do you want to answer? How many scales do you use?

7 Survey Content – Demographic Data Collection Demographic questions may include years of experience, work setting, educational background, organizational size What do you need to know about your respondents? How might respondent background characteristics influence their survey ratings?

8 Survey Content – Qualitative Data Collection Open-ended questions might address items missing from the survey, changes in the practice of the profession, reactions to the survey What information do your respondents need to provide that is not available through closed questions?

9 Delivering a Successful Survey – Sampling Plan Survey sampling plan (who) Who do you want to take your survey? What should your sample size be? Should you stratify?
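One of the questions above, stratification, can be sketched in code. Below is a minimal proportional-allocation example in Python; the stratum names and roster sizes are hypothetical illustrations, not figures from the study:

```python
import random

def stratified_sample(strata, total_n, seed=0):
    """Allocate total_n across strata proportionally to stratum size,
    then draw a simple random sample within each stratum."""
    rng = random.Random(seed)
    pop_total = sum(len(members) for members in strata.values())
    sample = {}
    for name, members in strata.items():
        # Proportional allocation: stratum share of the population
        n = round(total_n * len(members) / pop_total)
        sample[name] = rng.sample(members, min(n, len(members)))
    return sample

# Hypothetical licensee roster split by work setting
roster = {
    "urban": list(range(0, 6000)),
    "suburban": list(range(6000, 8500)),
    "rural": list(range(8500, 10000)),
}
sample = stratified_sample(roster, total_n=1000)
```

Proportional allocation keeps each subgroup's share of the sample equal to its share of the population, so small strata (here, rural) are not drowned out by chance.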

10 Delivering a Successful Survey – Administration Survey administration mode (how) –Delivery could be via a traditional paper survey (TPS) or a computer-based survey (CBS) –Which administration mode should you select?

11 TPS v CBS
TPS:
–Probably more likely that the invitation to participate is delivered
–More rating scales per page
–Better with a non-technologically sophisticated audience
CBS:
–Cheaper
–Quicker
–Logistically easier to do versioning
–Inexpensive options for visual display (color, graphics)
–Qualitative questions: more responses & easier to read
–In-process data verification

12 Comparability of Survey Administration Modes An empirical investigation of two data collection modes (Traditional Paper Survey (TPS) and Computer-Based Survey (CBS))

13 General Survey Characteristics Two data collection modes were used (Traditional Paper Survey (TPS) and Computer-Based Survey (CBS)) A total of 12,000 were sampled –6,000 TPS –6,000 CBS Total of 150 activities rated for frequency and priority Approximately 20 demographic questions

14 Three General Research Questions Is there a difference in response rates across modes? Are there demographic differences across the administration modes (e.g., do younger people respond more frequently to one mode over another)? Are there practical administration mode differences in developing test specifications?

15 Response Rate TPS –Out of 6,000, 263 were removed due to bad addresses and limited practice –1,709 were returned for an adjusted return rate of 30% CBS –Out of 6,000, 1,166 were removed due to bad email addresses and limited practice –1,115 were returned for an adjusted return rate of 21%
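The adjusted return rates above come from dividing returned surveys by deliverable invitations. A small Python sketch of that arithmetic, using the counts from this slide (the reported percentages are rounded; the TPS figure reproduces the reported 30%):

```python
def adjusted_return_rate(sampled, removed, returned):
    """Return rate after removing undeliverable or ineligible invitations
    (bad addresses, limited practice) from the denominator."""
    deliverable = sampled - removed
    return returned / deliverable

tps = adjusted_return_rate(sampled=6000, removed=263, returned=1709)
cbs = adjusted_return_rate(sampled=6000, removed=1166, returned=1115)
print(f"TPS adjusted return rate: {tps:.0%}")  # ~30%
```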

16 Response Rate With the aggressive mailing (5 stages, incentives, and reduced survey length), the TPS had a response rate 9 percentage points higher than the CBS in this study That said, return rates at or above 20% for unsolicited surveys are typical The effect of spam filters is unknown

17 Demographic Differences Very few differences It appears the administration mode had no effect on the respondent population

18 Demographic Differences Sample geographic location question:

Geo Location   TPS      CBS
Urban          62.2 %   63.2 %
Suburban       25.5 %   27.5 %
Rural          12.2 %    9.5 %

19 Demographic Differences Do the activities represent what you do in your position? Slight difference in perception, although very high for both modes.

Mode   Yes
CBS    92.0%
TPS    95.8%

20 Rating Scales Mean Frequency Ratings, six-point scale (0 to 5), 150 tasks –CBS mean = 2.24 –TPS mean = 2.18 Mean Priority Ratings, four-point scale (1 to 4), 150 tasks –CBS mean = 2.96 –TPS mean = 2.95

21 Test Specifications Small differences were observed in the mean ratings Do these differences affect decisions made on test content?

22 Test Specifications Evaluated inclusion criteria for final outline Created artificial task exclusion criterion (1.25 standard deviation units below the mean for each administration modality) Frequency: –CBS mean = 2.24, cutpoint = 0.83 –TPS mean = 2.18, cutpoint = 0.78 Priority: –CBS mean = 2.96, cutpoint = 2.40 –TPS mean = 2.95, cutpoint = 2.40 Activities above the cutpoint are included; those below are excluded from the final content outline
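The exclusion criterion above (mean minus 1.25 standard deviation units) is easy to express directly. A sketch in Python; only the 1.25-SD rule comes from the slide, and the example ratings are made up for illustration:

```python
from statistics import mean, pstdev

def exclusion_cutpoint(task_means, sd_units=1.25):
    """Cutpoint = overall mean rating minus sd_units standard deviations."""
    return mean(task_means) - sd_units * pstdev(task_means)

def excluded_tasks(ratings_by_task, cutpoint):
    """Tasks whose mean rating falls below the cutpoint are candidates
    for exclusion from the content outline."""
    return {task for task, r in ratings_by_task.items() if r < cutpoint}

# Hypothetical per-task mean frequency ratings on the 0-5 scale
ratings = {1: 3.1, 2: 2.8, 3: 0.4, 4: 2.2, 5: 1.9, 6: 0.6}
cut = exclusion_cutpoint(list(ratings.values()))
dropped = excluded_tasks(ratings, cut)  # task 3 falls below the cutpoint
```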

23 Test Specifications – Frequency ratings: over 99% classification accuracy Only 1 difference in activities excluded (task #68 on CBS and task #29 on TPS)

Activity  CBS (cutpoint .83)   Activity  TPS (cutpoint .78)
142       0.2609               142       0.3133
76        0.3411               117       0.4734
117       0.4906               100       0.4907
100       0.5049               76        0.5213
131       0.5911               138       0.5244
138       0.6093               139       0.5352
58        0.6551               140       0.5400
139       0.6608               131       0.5730
140       0.6617               94        0.5881
74        0.6787               113       0.6442
113       0.7346               58        0.7263
68        0.7445               29        0.7280
94        0.8140               74        0.7470
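The single disagreement reported on this slide can be checked directly from the excluded-activity lists. A Python sketch using the sets of activity numbers from the table above:

```python
# Activities excluded under the frequency cutpoint in each mode (from the slide)
cbs_excluded = {142, 76, 117, 100, 131, 138, 58, 139, 140, 74, 113, 68, 94}
tps_excluded = {142, 117, 100, 76, 138, 139, 140, 131, 94, 113, 58, 29, 74}

cbs_only = cbs_excluded - tps_excluded  # excluded only under CBS
tps_only = tps_excluded - cbs_excluded  # excluded only under TPS
print(cbs_only, tps_only)  # {68} {29}
```

Out of 150 activities, the two modes disagree on one substitution (68 vs. 29), which is what the slide summarizes as over 99% classification accuracy.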

24 Test Specifications – Priority ratings: over 99% classification accuracy

Activity  CBS (cutpoint 2.40)   Activity  TPS (cutpoint 2.40)
142       1.717                 142       1.726
11        1.727                 11        1.815
7         1.926                 55        1.846
55        1.958                 8         2.068
9         1.971                 7         2.095
117       2.004                 117       2.101
107       2.116                 9         2.148
138       2.135                 10        2.198
8         2.153                 139       2.199
10        2.203                 105       2.271
45        2.236                 45        2.294
139       2.255                 138       2.305
41        2.264                 41        2.305
105       2.302                 60        2.306
60        2.306                 106       2.332
58        2.331                 107       2.332
106       2.407                 89        2.366

25 Results of Test Specification Analysis Number of misclassifications approaches random error Differences are within a standard error of the task exclusion cutpoint Because these activities fall within a standard error of the cutpoint, they are reviewed by the committee for final inclusion

26 Conclusions These conclusions are based on this limited sample and may not generalize Response rate was higher for TPS No differences in respondent sample (demographics) TPS group had a slightly more agreeable opinion of the elements Most importantly, there were no practical differences in final test specification development

27 Conclusions continued Cost –TPS: Over $6.00 in postage and printing for each TPS (5-stage mailing), plus mailing labor. A conservative estimate would be $6.50 per unit, or in this case $39,000 (estimate excludes data entry and scanning) –CBS: Initial cost for survey setup and QC. No postage, printing, or scanning, and limited labor after initial setup. Cost is probably less than $10,000 for the administration of this type of survey
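The cost comparison above reduces to simple arithmetic. A Python sketch; the $6.50 per-unit figure and the roughly $10,000 CBS figure are the slide's own estimates, not independent data:

```python
def tps_cost(n_mailed, unit_cost=6.50):
    """Postage/printing estimate per mailed paper survey
    (excludes data entry and scanning, per the slide)."""
    return n_mailed * unit_cost

def cbs_cost(setup_and_qc=10_000):
    """Roughly fixed: setup and QC, with little marginal cost per respondent."""
    return setup_and_qc

print(tps_cost(6000))  # 39000.0
```

Because CBS cost is mostly fixed while TPS cost scales with the mailing list, the gap widens as the sample grows.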

28 Delivering a Successful Survey – User Experience Survey taker experience (where, when, why) To enhance user experience, survey should be maximally: –Accessible –Visually appealing –Easy to complete –Relevant

29 User Experience – Suggestions To reduce time demands, create different versions of the survey To ensure questions are correctly targeted to respondent subgroups, use routing To motivate respondents, use a PR campaign and participation incentives

30 Discussion What best practices can you share with us regarding: –Survey content (what)? –Survey administration mode (how)? –Survey sampling plan (who)? –Survey taker experience (where, when, why)?

31 Speaker Contact Information
Patricia M. Muenzen
Director of Research Programs
Professional Examination Service
475 Riverside Drive, 6th Fl., New York, NY 10115
Voice: 212-367-4273 Fax: 917-305-9852
pat@proexam.org www.proexam.org

Dr. Lee L. Schroeder
President
Schroeder Measurement Technologies, Inc.
2494 Bayshore Blvd., Suite 201, Dunedin, FL 34698
Voice: 727-738-8727 Toll Free: 800-556-0484 Fax: 727-734-9397
e-mail: lschroeder@smttest.com www.smttest.com

