2007 Annual Conference: Best Practices in Job Analysis. Speakers: Patricia Muenzen, Lee Schroeder. Moderator: Sandra Greenberg.

Similar presentations
A Systems Approach To Training

Knowledge Dietary Managers Association 1 PART II - DMA Certification Exam Blueprint and Exam Development-
1 © 2009 University of Wisconsin-Extension, Cooperative Extension, Program Development and Evaluation Response Rate in Surveys Key resource: Dillman, D.A.,
2007 Annual Conference Survey Says! Collecting Feedback From Customers James Collins.
EMR 6500: Survey Research Dr. Chris L. S. Coryn Kristin A. Hobson Spring 2013.
Survey Methodology Nonresponse EPID 626 Lecture 6.
Self-Administered Surveys: Mail Survey Methods ChihChien Chen Lauren Teffeau Week 10.
Differential Item Functioning of the English- and Spanish-Administered HINTS Psychological Distress Scale Chih-Hung Chang, Ph.D. Feinberg School of Medicine.
2.06 Understand data-collection methods to evaluate their appropriateness for the research problem/issue.
Brian A. Harris-Kojetin, Ph.D. Statistical and Science Policy
Association for Institutional Research Annual Forum May 2013 Angie L. Miller, Ph.D. Amber D. Lambert, Ph.D. Center for Postsecondary Research, Indiana.
Advanced Topics in Standard Setting. Methodology Implementation Validity of standard setting.
STATISTICS FOR MANAGERS LECTURE 2: SURVEY DESIGN.
2007 Annual Conference “ Vital Implementation Issues in Translation” Erika Irby Schroeder Measurement Technologies, Inc.
Presented at the 2006 CLEAR Annual Conference September Alexandria, Virginia Something from Nothing: Limitations of Diagnostic Information in a CAT.
Examining the use of administrative data for annual business statistics Joanna Woods, Ria Sanderson, Tracy Jones, Daniel Lewis.
© 2003 Prentice-Hall, Inc.Chap 1-1 Business Statistics: A First Course (3 rd Edition) Chapter 1 Introduction and Data Collection.
Chapter 1 The Where, Why, and How of Data Collection
© 2002 Prentice-Hall, Inc.Chap 1-1 Statistics for Managers using Microsoft Excel 3 rd Edition Chapter 1 Introduction and Data Collection.
© 2005 The McGraw-Hill Companies, Inc., All Rights Reserved. Chapter 8 Using Survey Research.
Chapter 1 The Where, Why, and How of Data Collection
Georgia Modification Research Study Spring 2006 Sharron Hunt Melissa Fincher.
7-1 Copyright ©2011 Pearson Education, Inc. publishing as Prentice Hall Chapter 7 Sampling and Sampling Distributions Statistics for Managers using Microsoft.
Basic Business Statistics (8th Edition)
Copyright ©2005 Brooks/Cole, a division of Thomson Learning, Inc. How to Get a Good Sample Chapter 4.
FINAL REPORT: OUTLINE & OVERVIEW OF SURVEY ERRORS
Chapter 5 Copyright © Allyn & Bacon 2008 This multimedia product and its contents are protected under copyright law. The following are prohibited by law:
2007 Annual Conference Competence and Conduct William D. Hogan Applied Measurement Professionals, Inc.
Robert delMas (Univ. of Minnesota, USA) Ann Ooms (Kingston College, UK) Joan Garfield (Univ. of Minnesota, USA) Beth Chance (Cal Poly State Univ., USA)
Lecture 30 sampling and field work
Assessing the Heritage Planning Process: the Views of Citizens Assessing the Heritage Planning Process: the Views of Citizens Dr. Michael MacMillan Department.
EMR 6500: Survey Research Dr. Chris L. S. Coryn Spring 2012.
McCann Associates Presented by: Michael Childs, Barbara Dyer, Ira Taylor, Bruce Nugyen Chad Warner & Joe Koury, McCann Associates.
Conducting a Job Analysis to Establish the Examination Content Domain Patricia M. Muenzen Associate Director of Research Programs Professional Examination.
Chapter 10 Surveys McGraw-Hill/Irwin Copyright © 2011 by The McGraw-Hill Companies, Inc. All Rights Reserved.
Survey Research and Other Ways of Asking Questions
Benjamin L. Messer Washington State University PAPOR Mini-Conference, Berkeley, CA June 24, 2011 Item Nonresponse in Mail, Web, and Mixed Mode Surveys:
Business Statistics: A Decision-Making Approach, 6e © 2005 Prentice-Hall, Inc. Chap 1-1 Business Statistics: A Decision-Making Approach 6 th Edition Chapter.
Questionnaires and Interviews
RESEARCH A systematic quest for undiscovered truth A way of thinking
Lesli Scott Ashley Bowers Sue Ellen Hansen Robin Tepper Jacob Survey Research Center, University of Michigan Third International Conference on Establishment.
SAMPLING.
Eurostat Overall design. Presented by Eva Elvers Statistics Sweden.
Research Methodology Lecture No :14 (Sampling Design)
Learning Objectives Copyright © 2004 John Wiley & Sons, Inc. Basic Sampling Issues CHAPTER Ten.
Copyright ©2011 Pearson Education 7-1 Chapter 7 Sampling and Sampling Distributions Statistics for Managers using Microsoft Excel 6 th Global Edition.
3.6 – Questioning & Bias (Text Section 2.4 & 2.5).
AIR 2000 On-line vs. Paper-and-Pencil Surveying of Students’ Willingness to Re-enroll: A Case Study Phil Handwerk, Cristi Carson, and Karen Blackwell University.
Sampling “Sampling is the process of choosing sample which is a group of people, items and objects. That are taken from population for measurement and.
European Conference on Quality in Official Statistics Session 26: Quality Issues in Census « Rome, 10 July 2008 « Quality Assurance and Control Programme.
Statistics for Managers Using Microsoft Excel, 4e © 2004 Prentice-Hall, Inc. Chap 1-1 Statistics for Managers Using Microsoft ® Excel 4 th Edition Chapter.
Survey Methodology Lilian Ma November 6, Three aspects 1. How questions were designed 2. How data was collected 3. How samples were drawn Probability.
Universal Access to Assessments. Project Overview Four Implementing States: New Hampshire, Vermont, Rhode Island, and Maine Eight Partner States: Connecticut,
European Conference on Quality in Official Statistics 8-11 July 2008 Mr. Hing-Wang Fung Census and Statistics Department Hong Kong, China (
Educational Research: Competencies for Analysis and Application, 9 th edition. Gay, Mills, & Airasian © 2009 Pearson Education, Inc. All rights reserved.
McMillan Educational Research: Fundamentals for the Consumer, 6e © 2012 Pearson Education, Inc. All rights reserved. Educational Research: Fundamentals.
Questionnaires Questions can be closed or open Closed questions are easier to analyze, and may be done by computer Can be administered to large populations.
A Course In Business Statistics, 4th © 2006 Prentice-Hall, Inc. Chap 1-1 A Course In Business Statistics 4 th Edition Chapter 1 The Where, Why, and How.
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc. Chap 7-1 Chapter 7 Sampling and Sampling Distributions Basic Business Statistics 11 th Edition.
Basic Business Statistics
Descriptive Research & Questionnaire Design. Descriptive Research Survey versus Observation  Survey Primary data collection method based on communication.
AHIMA’s Commission on Certification for Health Informatics and Information Management (CCHIIM) Test Development Process Jo Santos, RHIA Senior Manager,
Using Surveys to Design and Evaluate Watershed Education and Outreach Day 5 Methodologies for Implementing Mailed Surveys Alternatives to Mailed Surveys.
Important statistical terms Population: a set which includes all measurements of interest to the researcher (The collection of all responses, measurements,
COMMUNICATION ENGLISH III September 27/28 th 2012.
McGraw-Hill/Irwin Business Research Methods, 10eCopyright © 2008 by The McGraw-Hill Companies, Inc. All Rights Reserved. Chapter 9 Surveys.
Chapter Ten Basic Sampling Issues Chapter Ten.
Chapter 1 The Where, Why, and How of Data Collection
The Where, Why, and How of Data Collection
Chapter 1 The Where, Why, and How of Data Collection
Presentation transcript:

Council on Licensure, Enforcement and Regulation 2007 Annual Conference, Atlanta, Georgia. Best Practices in Job Analysis. Speakers: Patricia Muenzen, Lee Schroeder. Moderator: Sandra Greenberg

The Survey Experience
What makes people take surveys? What is their experience with both paper-and-pencil and computer-delivered surveys? How can you design and administer the best possible survey?

Components of the Survey Experience
Survey content (what)
Survey sampling plan (who)
Survey administration mode (how)
Survey taker experience (where, when, why)

Designing a Successful Survey
Survey content (what):
– Statements to be rated
– Rating scales
– Demographic data collection
– Qualitative data collection

Survey Content – Statements to be Rated
Statements to be rated may include domains of practice, tasks/activities, and knowledge and skills.
How thorough is your development process? Who is involved? What process do they use?

Survey Content – Rating Scales
Rating scales may include time spent/frequency, importance/criticality, and proficiency at licensure.
How do you select your rating scales? What questions do you want to answer? How many scales do you use?

Survey Content – Demographic Data Collection
Demographic questions may include years of experience, work setting, educational background, and organizational size.
What do you need to know about your respondents? How might respondent background characteristics influence their survey ratings?

Survey Content – Qualitative Data Collection
Open-ended questions might address items missing from the survey, changes in the practice of the profession, and reactions to the survey.
What information do your respondents need to provide that is not available through closed questions?

Delivering a Successful Survey – Sampling Plan
Survey sampling plan (who): Who do you want to take your survey? What should your sample size be? Should you stratify?
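
One way to make the stratification decision concrete is to prototype the draw. Below is a minimal sketch of proportional stratified sampling in Python; the roster, the work-setting strata, and the function names are hypothetical illustrations, not the presenters' procedure.

```python
import random

def stratified_sample(population, stratum_key, total_n, seed=42):
    """Draw a proportional stratified sample from a list of records."""
    rng = random.Random(seed)
    strata = {}
    for record in population:
        strata.setdefault(record[stratum_key], []).append(record)
    sample = []
    for members in strata.values():
        # Allocate draws in proportion to the stratum's share of the population.
        n = round(total_n * len(members) / len(population))
        sample.extend(rng.sample(members, min(n, len(members))))
    return sample

# Hypothetical roster of 12,000 licensees stratified by work setting.
roster = [{"id": i, "setting": random.choice(["urban", "suburban", "rural"])}
          for i in range(12_000)]
invitees = stratified_sample(roster, "setting", total_n=6_000)
print(len(invitees))  # roughly 6,000, split proportionally across settings
```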

Delivering a Successful Survey – Administration
Survey administration mode (how):
– Delivery could be via a traditional paper survey (TPS) or a computer-based survey (CBS)
– Which administration mode should you select?

TPS vs. CBS
TPS:
– Probably more likely that the invitation to participate is delivered
– More rating scales per page
– Better with less technologically sophisticated audiences
CBS:
– Cheaper
– Quicker
– Logistically easier to do versioning
– Inexpensive options for visual display (color, graphics)
– Qualitative questions: more responses and easier to read
– In-process data verification

Comparability of Survey Administration Modes
An empirical investigation of two data collection modes: the traditional paper survey (TPS) and the computer-based survey (CBS).

General Survey Characteristics
Two data collection modes were used: traditional paper survey (TPS) and computer-based survey (CBS).
A total of 12,000 individuals were sampled:
– 6,000 TPS
– 6,000 CBS
A total of 150 activities were rated for frequency and priority.
Approximately 20 demographic questions were asked.

Three General Research Questions
Is there a difference in response rates across modes?
Are there demographic differences across the administration modes (e.g., do younger people respond more frequently to one mode than the other)?
Are there practical administration mode differences in developing test specifications?

Response Rate
TPS:
– Out of 6,000, 263 were removed due to bad addresses and limited practice
– 1,709 were returned, for an adjusted return rate of 30%
CBS:
– Out of 6,000, 1,166 were removed due to bad addresses and limited practice
– 1,115 were returned, for an adjusted return rate of 21%
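
As a check on the arithmetic, the adjusted return rate appears to be computed against the deliverable sample (surveys mailed minus removals); a minimal sketch:

```python
def adjusted_return_rate(mailed, removed, returned):
    """Return rate against the deliverable sample (mailed minus removals)."""
    return returned / (mailed - removed)

print(f"TPS: {adjusted_return_rate(6000, 263, 1709):.1%}")   # 29.8%, the reported 30%
print(f"CBS: {adjusted_return_rate(6000, 1166, 1115):.1%}")  # 23.1%; the slide reports 21%,
# so the CBS removals or returns may have been counted somewhat differently
```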

Response Rate
With the aggressive mailing (5 stages, incentives, and reduced survey length), the TPS had a response rate 9 percentage points higher than the CBS in this study.
That said, return rates at or above 20% for unsolicited surveys are typical.
The effect of spam filters is unknown.

Demographic Differences
Very few differences were found.
It appears that the administration mode had no effect on the respondent population.

Demographic Differences
Sample geographic location question:

Geo Location    TPS      CBS
Urban           62.2 %   63.2 %
Suburban        25.5 %   27.5 %
Rural           12.2 %    9.5 %

Demographic Differences
Do the activities represent what you do in your position? There was a slight difference in perception, although agreement was very high for both modes.

Mode    Yes
CBS     92.0 %
TPS     95.8 %
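
One way to test whether demographic distributions like these differ by mode is a chi-square test on the response counts. The sketch below assumes SciPy is available and uses counts back-calculated from the reported percentages, so it is an approximation, not the presenters' analysis.

```python
from scipy.stats import chi2_contingency

# Approximate counts reconstructed from the reported percentages.
#             urban  suburban  rural
tps_counts = [1063,       436,   209]   # 62.2%, 25.5%, 12.2% of 1,709 TPS returns
cbs_counts = [ 705,       307,   106]   # 63.2%, 27.5%,  9.5% of 1,115 CBS returns

chi2, p, dof, _ = chi2_contingency([tps_counts, cbs_counts])
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```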

Rating Scales
Mean frequency ratings, six-point scale (0 to 5), 150 tasks:
– CBS mean = 2.24
– TPS mean = 2.18
Mean priority ratings, four-point scale (1 to 4), 150 tasks:
– CBS mean = 2.96
– TPS mean = 2.95

Test Specifications
Small differences were observed in the mean ratings.
Do these differences affect decisions made about test content?

Test Specifications
Evaluated inclusion criteria for the final outline.
Created an artificial task exclusion criterion: 1.25 standard deviation units below the mean for each administration modality.
Frequency:
– CBS mean = 2.24, cutpoint = .83
– TPS mean = 2.18, cutpoint = .78
Priority:
– CBS mean = 2.96, cutpoint = 2.40
– TPS mean = 2.95, cutpoint = 2.40
Activities above the cutpoint are included; those below are excluded from the final content outline.
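
The exclusion rule is simple to state in code: the cutpoint sits 1.25 standard deviations below the mean task rating for each mode. A minimal sketch with a handful of hypothetical task means (the study itself used 150 tasks):

```python
import statistics

def exclusion_cutpoint(task_means, k=1.25):
    """Cutpoint k standard deviations below the mean of the task mean ratings."""
    return statistics.mean(task_means) - k * statistics.pstdev(task_means)

def included_tasks(ratings_by_task, cutpoint):
    """Tasks at or above the cutpoint stay in the final content outline."""
    return [task for task, m in ratings_by_task.items() if m >= cutpoint]

# Hypothetical per-task mean frequency ratings.
ratings_by_task = {"task_1": 2.9, "task_2": 0.6, "task_3": 1.4, "task_4": 3.8}
cut = exclusion_cutpoint(list(ratings_by_task.values()))
print(f"cutpoint = {cut:.2f}; included: {included_tasks(ratings_by_task, cut)}")
```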

Test Specifications – Frequency Ratings
Over 99% classification accuracy.
Only one difference in the activities excluded (task #68 on CBS and task #29 on TPS).
(The slide's table of activity frequency ratings by mode, against the CBS cutpoint of .83, is not reproduced.)

Test Specifications – Priority Ratings
Over 99% classification accuracy.
(The slide's table of activity priority ratings by mode, against the 2.4 cutpoint, is not reproduced.)

Results of Test Specification Analysis
The number of misclassifications approaches random error.
The differences are within a standard error of the task exclusion cutpoint.
Because they fall near the standard error of the cutpoint, these activities are reviewed by the committee for final inclusion.
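
A practical way to implement the committee-review rule described here is to flag any task whose mean rating falls within one standard error of the exclusion cutpoint. The sketch below is a hypothetical illustration, not the presenters' code:

```python
import math
import statistics

def needs_committee_review(ratings, cutpoint):
    """Flag a task whose mean rating lies within one SE of the cutpoint."""
    mean = statistics.mean(ratings)
    se = statistics.stdev(ratings) / math.sqrt(len(ratings))
    return abs(mean - cutpoint) <= se

# Hypothetical frequency ratings for one task from ~1,700 respondents.
ratings = [2, 3, 1, 2, 2, 3, 4, 2, 1, 3] * 170
print(needs_committee_review(ratings, cutpoint=2.40))  # False: mean 2.30 is several SEs away
```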

Conclusions
The findings are based on this limited sample and may not generalize.
The response rate was higher for the TPS.
There were no differences in the respondent samples (demographics).
The TPS group had a slightly more agreeable opinion of the survey elements.
Most importantly, there were no practical differences in final test specification development.

Conclusions (continued)
Cost:
– TPS: over $6.00 in postage and printing for each survey (5-stage mailing), plus mailing labor. A conservative estimate would be $6.50 per unit, or in this case $39,000 (the estimate excludes data entry and scanning).
– CBS: initial cost for survey setup and QC; no postage, printing, or scanning, and limited labor after initial setup. The cost is probably less than $10,000 for the administration of this type of survey.
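
A quick check of the paper-survey figure under the slide's $6.50-per-unit assumption:

```python
units = 6_000      # TPS surveys mailed
per_unit = 6.50    # conservative postage + printing estimate per unit
print(f"${units * per_unit:,.0f}")  # $39,000, excluding data entry and scanning
```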

Delivering a Successful Survey – User Experience
Survey taker experience (where, when, why).
To enhance the user experience, the survey should be maximally:
– Accessible
– Visually appealing
– Easy to complete
– Relevant

User Experience – Suggestions
To reduce time demands, create different versions of the survey.
To ensure questions are correctly targeted to respondent subgroups, use routing (a sketch follows below).
To motivate respondents, use a PR campaign and participation incentives.
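
Routing of the kind suggested above can be expressed as a simple rule table. The sketch below is hypothetical; the subgroup names and task blocks are invented for illustration.

```python
# Hypothetical routing table: a demographic answer selects the task blocks shown.
ROUTES = {
    "clinical":  ["task_block_A", "task_block_C"],
    "education": ["task_block_B"],
    "research":  ["task_block_B", "task_block_C"],
}

def route_respondent(work_setting):
    """Return the survey sections targeted at this respondent subgroup."""
    # Fall back to the full survey when the setting is unrecognized.
    return ROUTES.get(work_setting, ["task_block_A", "task_block_B", "task_block_C"])

print(route_respondent("clinical"))  # ['task_block_A', 'task_block_C']
```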

Discussion
What best practices can you share with us regarding:
– Survey content (what)?
– Survey administration mode (how)?
– Survey sampling plan (who)?
– Survey taker experience (where, when, why)?

Speaker Contact Information
Patricia M. Muenzen, Director of Research Programs, Professional Examination Service, 475 Riverside Drive, 6th Fl., New York, NY. Voice: Fax:
Dr. Lee L. Schroeder, President, Schroeder Measurement Technologies, Inc., Bayshore Blvd., Suite 201, Dunedin, FL. Voice: Toll Free: Fax: