Evaluation design options; data quality assurance (validity/credibility, reliability/dependability); instrument development and validation; data collection strategies. Brad Cousins, University of Ottawa, October 2010.



Evaluation design options
Data quality assurance
– validity/credibility
– reliability/dependability
Instrument development and validation
Data collection strategies

Comparison groups?
– Yes, no, hybrid
– Black box, grey box, glass box
Data collected over time?
– Yes, no, hybrid
Mixed methods
– Quantitative, qualitative, simultaneous, sequential

One-shot, post only
– X O1
Comparative, post only
– X O1
–   O2
Randomized controlled trial
– R X O1
– R   O2
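
In this notation, R denotes random assignment of participants to groups, X the intervention, and O an observation. The random-assignment step can be sketched in Python as follows (the participant IDs and fixed seed are illustrative assumptions, not part of the original slides):

```python
import random

def randomize(participants, seed=42):
    """Randomly split participants into treatment and control groups (the 'R' step).

    A fixed seed is used here only so the example is reproducible.
    """
    rng = random.Random(seed)
    shuffled = participants[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# Hypothetical pool of 20 participant IDs
treatment, control = randomize(list(range(1, 21)))
print(len(treatment), len(control))  # 10 10
```

Because assignment is random rather than self-selected, observed differences between O1 and O2 can more credibly be attributed to the intervention.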

Time series design
– O1 O2 O3 O4 X O5 O6 O7 O8
Pre-post comparative group design
– O1 X O3
– O2   O4
Delayed treatment group design
– O1 X O3   O5
– O2   O4 X O6

Major concepts: VALIDITY/CREDIBILITY
Key points
– Degrees on a continuum
– Describes the results or inferences, NOT the instrument
– Depends on the instrument and the process
– Involves evidence and judgment
Internal validity/credibility
– Attribution: how confident can we be that the observed effects are attributable to the intervention?

Actual but non-program-related changes in participants
– Maturation
– History
Apparent changes dependent on who was observed
– Selection
– Attrition
– Regression
Changes related to methods of obtaining observations
– Testing
– Instrumentation

General principles
– Build on existing instruments and resources
– Ensure validity: face, content, construct
– Ensure reliability (eliminate ambiguity)
– Consider task demands
– Obtrusive vs. unobtrusive measures
– Use a conceptual framework as a guide
– Solicit demographic information at the end
– Pilot test
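
The slides do not prescribe a particular reliability statistic; one common choice for multi-item instruments is Cronbach's alpha (internal consistency), sketched here from the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The example data are hypothetical.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per item,
    one entry per respondent). Uses the sample-variance form of the formula."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Each respondent's total score across all items
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Two identical items are perfectly consistent, so alpha is 1.0
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]]))  # 1.0
```

In practice a pilot test (the last principle above) supplies the response data on which such a reliability check is run.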

Scales: nominal, ordinal, interval
Selected response
– Multiple choice (tests)
– Fixed option:
  Check all that apply
  Check ONE option only
– Likert-type rating scales
  Frequency (observation): N R S F A
  Agreement (opinion): SD D A SA
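
Likert-type agreement responses are typically coded as ordinal integers before analysis. A minimal sketch, assuming the conventional 1-4 mapping for the SD/D/A/SA scale shown above (the mapping and sample responses are illustrative):

```python
# Conventional ordinal coding for the agreement scale:
# strongly disagree (SD) = 1, disagree (D) = 2, agree (A) = 3, strongly agree (SA) = 4
AGREEMENT = {"SD": 1, "D": 2, "A": 3, "SA": 4}

def code_responses(responses):
    """Translate raw scale labels into ordinal codes."""
    return [AGREEMENT[r] for r in responses]

def item_mean(responses):
    """Mean coded score for one item across respondents."""
    codes = code_responses(responses)
    return sum(codes) / len(codes)

print(item_mean(["SA", "A", "A", "D"]))  # 3.0
```

Note that because the scale is ordinal rather than interval, treating such means as precise quantities is a judgment call the analyst should make explicitly.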

Selected response (continued)
– Rank-ordered preferences (avoid)
– Paired comparison
Constructed response
– Open-ended comments
  Structured
  Unstructured
– If 'other' (specify)

Data collection formats
– Hardcopy, data-entry format
– Hardcopy, scannable format
– Internet format
Over-specify instructions
Judicious use of bold/italics and font variation
Response options on the right-hand side
Stapling: booklet > upper left > left margin
Judicious determination of length (8 pages maximum)

Review of purpose/expectations
Spacing of questions to permit response recording
Questions vs. prompts
Use of quantification

Ethics
– Ethics review board procedures/protocols
– Letters of informed consent
  Purpose
  How/why selected
  Demands / right of refusal
  Confidential vs. anonymous
  Contact information
– Issues and tensions

Interview tips
– Small talk to set the tone
– Audio recording: obtain permission
– Develop shorthand or symbolic field-note skills
– Permit some wandering, but keep on track
– Minimize redundancy

Quantitative, for representation
– Proportionate to population
– Random
Qualitative, to maximize variation
– Purposive sampling: based on prior knowledge of case(s)
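
Proportionate random sampling can be sketched as drawing from each stratum in proportion to its share of the population. The strata, sizes, and seed below are hypothetical, chosen only to illustrate the allocation arithmetic:

```python
import random

def proportionate_sample(strata, n, seed=1):
    """Draw a random sample of about n units, allocated to each stratum
    in proportion to that stratum's share of the total population."""
    rng = random.Random(seed)
    total = sum(len(members) for members in strata.values())
    sample = {}
    for name, members in strata.items():
        k = round(n * len(members) / total)  # stratum's proportionate share of n
        sample[name] = rng.sample(members, k)
    return sample

# Hypothetical population: 600 urban and 400 rural units
strata = {"urban": list(range(600)), "rural": list(range(600, 1000))}
s = proportionate_sample(strata, 50)
print(len(s["urban"]), len(s["rural"]))  # 30 20
```

One design caveat: because each stratum's allocation is rounded independently, the realized sample size can differ from n by a unit or two; exact-size schemes distribute the rounding remainders deliberately.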

Colton, D., & Covert, R. W. (2007). Designing and constructing instruments for social research and evaluation. San Francisco: John Wiley and Sons, Inc.
Creswell, J. W., & Miller, D. L. (2000). Determining validity in qualitative inquiry. Theory Into Practice, 39(3).
Fraenkel, J. R., & Wallen, N. E. (2003). How to design and evaluate research in education. New York: McGraw-Hill.
McMillan, J. H. (2004). Educational research (4th ed.). Toronto: Pearson, Allyn and Bacon.
Shultz, K. S., & Whitney, D. J. (2005). Measurement theory in action: Case studies and exercises. Thousand Oaks, CA: SAGE Publications.