FOUNDATIONS OF QUALITY RESEARCH DESIGN: RELIABILITY & VALIDITY
Winnie Mucherah, Ball State University


Literature review
- Systematic identification, location, and analysis of documents containing information related to the research problem
- Reviews are used to guide practice and/or to guide research
- Types: narrative reviews, topic reviews, theoretical reviews, meta-analyses
(Mills, Airasian, & Gay, 2012)

Types of reviews: narrative/traditional reviews
- Most often conducted when writing dissertations and theses in the social sciences
- Also used in the introductory paragraphs of a typical research article
- Provide a brief narrative about previous research on a subject to set the context for the current research topic

Topic reviews
- Introductory and investigatory reviews, conducted when working on a topic for the first time
- Often include introductory works, e.g., encyclopedia entries and textbooks
- Criteria for good topic reviews:
  - Recency (based on up-to-date sources)
  - Importance (built on important sources; quality of the journal, impact factor)
  - Breadth (sources discuss the topic broadly)

Theoretical reviews
- Not usually featured in lists of review types, but an important subtype
- A version of the traditional/narrative review
- Its specific purpose is to synthesize established theories by focusing on points of agreement, and/or to generate new theories by focusing on gaps

Meta-analytic reviews: systematic reviews / research syntheses
- "Systematic review" is used frequently to refer to evidence-based practical applications
- "Research synthesis" often refers to research that is not necessarily tied to practical applications
- Similar in that the researcher states in advance the procedures for finding, selecting, coding, and analyzing the data
- The data enable you to calculate effect sizes

Effect size
- Aptly named: a measure of the size of an effect
- Specifically, it is a standardized measure, often stated in standard deviation units
- Because it is standardized, it can be used to compare and combine results across studies, which is the whole point of meta-analysis
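The "standardized measure in standard deviation units" above can be sketched as Cohen's d, one common effect size: the mean difference between two groups divided by their pooled standard deviation. The two groups of scores below are invented purely for illustration.

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Standardized mean difference: (mean_a - mean_b) / pooled SD."""
    na, nb = len(group_a), len(group_b)
    va, vb = stdev(group_a) ** 2, stdev(group_b) ** 2
    # Pooled standard deviation weights each group's variance by its df
    pooled_sd = (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / pooled_sd

# Hypothetical treatment and control scores
treatment = [85, 88, 90, 92, 95]
control = [80, 82, 85, 87, 86]
d = cohens_d(treatment, control)  # positive: treatment mean is higher
```

Because d is in standard-deviation units rather than raw score units, d values from studies that used different tests can be compared and combined, which is what a meta-analysis does.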

Quantitative v. qualitative
Quantitative research
- Numerical data (e.g., surveys and tests)
- Research plan includes an introduction, method section, data analysis description, and results
Qualitative research
- Comprehensive, narrative, and visual data (e.g., interviews and naturalistic observations)
- Research plan must be responsive to the context and setting under study
A mixed-method design is ideal (Mills, Airasian, & Gay, 2012)

Correlational v. experimental
Correlational research
- Collecting data to determine whether a relation exists between two or more quantifiable variables
- Measured by a correlation coefficient (r)
- Strength of the relationship ranges from 0 to 1 in absolute value; the relationship can be positive or negative (inverse)
- Correlation is not causation
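The correlation coefficient r described above can be computed directly from its definition (covariance divided by the product of the standard deviations). The hours/scores data below are made up to illustrate a strong positive relationship.

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation: covariance scaled by both standard deviations."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical study hours and test scores
hours = [1, 2, 3, 4, 5]
scores = [52, 58, 61, 70, 74]
r = pearson_r(hours, scores)  # close to +1: strong positive relationship
```

A strong r here says only that hours and scores move together; without manipulation and random assignment, it cannot show that studying caused the higher scores.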

Experimental research
- Random assignment to groups
- Involves an independent variable (IV) and a dependent variable (DV)
- At least one independent variable is manipulated
- The effect on one or more dependent variable(s) is observed

Reliability
- The consistency or agreement among measures; consistency of data collection
- Reliable results are more likely to be repeatable if you conduct the study again (given a sample size large enough to produce the necessary precision)
- Reliability coefficients generally range from 1.0 (a perfectly reliable measure) to 0 (completely inconsistent from one rater/test/observation to the next)
- Cronbach's alpha (α) estimates internal consistency
(Rumsey, 2005)

Measuring reliability: Cronbach's alpha (α)
- Used when you want to know whether the items in your scale or index are measuring aspects of the same thing
- The "scale if item deleted" feature helps identify items that could be removed or analyzed individually (IRT)
- .70 is usually considered the minimum acceptable level; higher levels are needed when results are used for high-stakes decisions
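Cronbach's alpha can be computed by hand from the item variances and the variance of the total scores: α = k/(k−1) × (1 − Σ item variances / variance of totals). A small sketch using only Python's standard library; the three-item scale and five respondents' answers below are invented for illustration.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """items: one list of scores per scale item (same respondents in order).

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(items)
    item_vars = sum(pvariance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return (k / (k - 1)) * (1 - item_vars / pvariance(totals))

# Hypothetical 3-item Likert scale answered by 5 respondents
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 2, 4, 3],
]
alpha = cronbach_alpha(items)  # above the conventional .70 threshold here
```

When items "hang together" (respondents who score high on one item score high on the others), the total-score variance swamps the summed item variances and alpha approaches 1; uncorrelated items push it toward 0.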

Types of reliability
- Inter-rater reliability: the consistency of two or more raters
- Test-retest reliability: the consistency of the same test over time, or consistency of results on repeated tests
- Internal reliability: the consistency of multiple questions probing aspects of the same concept
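Inter-rater reliability from the list above is often quantified with Cohen's kappa, which corrects raw percent agreement for the agreement two raters would reach by chance. The two raters and their yes/no codings below are hypothetical.

```python
def cohen_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same cases."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    categories = set(rater_a) | set(rater_b)
    # Expected chance agreement: product of each rater's marginal proportions
    expected = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

# Two hypothetical raters coding the same 8 observations
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes"]
kappa = cohen_kappa(a, b)  # lower than the raw 75% agreement rate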

Validity
- A central issue at all stages of a research project
- The chief concern is whether the study is set up so that you can reach justifiable conclusions about your topic
Internal validity
- Addresses the question: do my conclusions apply to my sample?
- The degree to which differences on a measure are attributable to the manipulation of the independent variable
- Highest in true experimental studies
(Mills, Airasian, & Gay, 2012)

External validity
- The degree to which results are generalizable, and to a certain extent replicable, in other settings
- Addresses the question: do my conclusions apply to anyone else? Can you generalize beyond the participants in the experiment?
- The answer depends on the quality and appropriateness of your sample
Related types of validity
- Construct validity: are concepts measured in ways that enable us to study what we aim to study?
- Content validity: is the measure thorough and representative of the thing being measured?

Sampling procedures
- Population: the collection of all individuals of interest
- Sample: the subset of the population we actually measure
- Parameter: a numeric characterization of the population that is of interest to us
- Statistic: a numeric characterization of the sample that serves as an estimate of the parameter
- Since we cannot access the whole population, we cannot compute the parameter directly; instead we take a sample we can obtain and compute a numeric measurement from it, i.e., a statistic
(Coladarci & Cobb, 2014)
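The population/sample/parameter/statistic chain above can be sketched with Python's standard library. The population values, sample size, and random seed below are arbitrary assumptions for illustration; in real research the full population is exactly what you cannot enumerate.

```python
import random
from statistics import mean

random.seed(42)

# Pretend population: 10,000 test scores drawn around a true mean of 100
population = [random.gauss(100, 15) for _ in range(10_000)]
parameter = mean(population)          # usually unknowable in practice

# Simple random sample: the subset we can actually afford to measure
sample = random.sample(population, 200)
statistic = mean(sample)              # our estimate of the parameter
```

The gap between `statistic` and `parameter` is sampling error; larger or better-designed samples shrink it, which is why external validity "depends on the quality and appropriateness of your sample."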

Contextualizing your research
- Refine the substantive question and develop a plan for collecting relevant data
- When using existing or new measures, use factor analysis (FA)
- FA helps you judge the reliability and validity of your measurements of latent variables, and thus how to analyze and interpret them
- FA is built on the correlations and associations among items
- Its purpose is to improve the measurement of latent variables, i.e., constructs that cannot be directly observed
(Coladarci & Cobb, 2014)

Latent variables
- Latent variables can only be studied indirectly, through indicators (observed variables)
- In a multi-item measure of traits, the items are the indicators; the clusters of items identified by the FA help you identify the factors, or latent variables, which are the constructs or concepts you seek in your research
- Example: 15 questions about a controversial issue might measure efficacy, social tolerance, or attitudes

Types of factor analysis
- Exploratory factor analysis (EFA): used when researchers are looking for interesting patterns among variables
- Confirmatory factor analysis (CFA): used when researchers have theories about the patterns they want to test
- The two are often linked: it is common to conduct them in sequence, first EFA to refine theories, then CFA to test them
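A minimal EFA-flavored sketch, assuming NumPy is available: two hypothetical latent factors generate six observed items (three items each, plus noise), and the eigenvalues of the item correlation matrix, via the common Kaiser criterion (retain factors with eigenvalue > 1), recover how many factors underlie the data. All data here are simulated; a real EFA would also rotate and interpret loadings.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Two hypothetical latent factors (unobservable constructs)
f1, f2 = rng.normal(size=n), rng.normal(size=n)

# Six observed items: the first three indicate f1, the last three f2
items = np.column_stack(
    [f1 + rng.normal(scale=0.5, size=n) for _ in range(3)]
    + [f2 + rng.normal(scale=0.5, size=n) for _ in range(3)]
)

corr = np.corrcoef(items, rowvar=False)       # 6 x 6 item correlation matrix
eigenvalues = np.linalg.eigvalsh(corr)[::-1]  # sorted largest first
n_factors = int((eigenvalues > 1).sum())      # Kaiser criterion
```

Items driven by the same factor correlate strongly with each other and weakly with the other cluster, so the correlation matrix has two dominant eigenvalues; the eigenvalue pattern is the "interesting pattern among variables" an EFA looks for.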

Conclusion
Substantive Question ---> Statistical Question ---> Statistical Conclusion ---> Substantive Conclusion
A substantive conclusion is a context-based conclusion.

References
Coladarci, T., Cobb, C. D., Minium, E. W., & Clarke, R. C. (2014). Fundamentals of statistical reasoning in education.
Mills, G. E., Airasian, P., & Gay, L. R. (2012). Educational research: Competencies for analysis and applications (10th ed.).
Rumsey, D. (2005). Statistics workbook for Dummies.