Assessment Procedures for Counselors and Helping Professionals, 7e © 2010 Pearson Education, Inc. All rights reserved. Chapter 6 Validity.


Definitions
Validity: the degree to which the claims and decisions made on the basis of assessment results are sound, meaningful, and useful for the intended purpose of those results.
Validity also refers to the adequacy and appropriateness of the uses of assessment results.
Construct: a scientifically developed concept, idea, or hypothesis used to describe or explain behavior.

Important Points
Not all test results are valid for all individuals at all times. If a test is given to someone whose age or background differs from the normative sample, the decisions made from that information are invalid.

Threats to Validity
Construct underrepresentation: the test is too narrow and fails to include important dimensions of the construct (e.g., measuring only articulation when evaluating oral expression).
Construct-irrelevant variance: the test is too broad and includes dimensions beyond the construct (e.g., a listening comprehension test that lets examinees rely on reading cues).

Threats to Validity, Cont.
Within-factor threats: items are poorly written, there are too few items to measure the construct, or answers can be supplied without knowing the construct (e.g., examinees can use "test-taking strategies" to do well).
Administration and scoring procedure errors (e.g., not following standardized procedures, not using objective scoring).

Threats to Validity, Cont.
Test-taker characteristics: variables within the test taker, other than the construct in question, that affect results (e.g., test anxiety, second-language issues, emotional problems, poor rapport).
Inappropriate test group: using a test on individuals it was not designed or normed for (e.g., giving a test designed for 2nd graders to a 4th grader).

Validity and Reliability
Reliability is a necessary but insufficient condition for validity: if scores differ each time you give the test, you are not measuring the same thing each time, so the scores cannot support valid interpretations.
Validity is not required for reliability: you can consistently measure the wrong thing.
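The "reliable but not valid" case can be illustrated with a quick simulation. The Python sketch below is an illustration of this idea, not material from the textbook; the variable names and generated data are hypothetical. It builds a test that consistently taps the wrong trait, so its test-retest correlation is high while its correlation with the intended criterion is near zero.

```python
# Hypothetical illustration: a measure can be reliable (stable scores)
# yet invalid (unrelated to the construct it claims to assess).
import numpy as np

rng = np.random.default_rng(0)
n = 200

true_construct = rng.normal(size=n)   # what we intend to measure
wrong_trait = rng.normal(size=n)      # what the test actually taps

# The test taps the wrong trait with little error, so two administrations
# (or two forms) agree closely -> high reliability.
test_form_a = wrong_trait + rng.normal(scale=0.2, size=n)
test_form_b = wrong_trait + rng.normal(scale=0.2, size=n)

criterion = true_construct + rng.normal(scale=0.3, size=n)

reliability = np.corrcoef(test_form_a, test_form_b)[0, 1]
validity = np.corrcoef(test_form_a, criterion)[0, 1]

print(f"test-retest (reliability) r = {reliability:.2f}")   # high
print(f"test-criterion (validity) r = {validity:.2f}")      # near zero
```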

Sources of Validity Evidence
Content: evidence based on the representativeness of test content (test items) and response processes relative to the content domain. Procedures for assessing validity: table of specifications; expert judges; analysis of response processes.
Criterion-Related: evidence based on the relationship between test scores and external variables. Procedures: correlation between test scores and scores on a similar test; correlation between test scores and some other data about the same concept.
Construct: evidence based on the appropriateness of inferences drawn from test scores as they relate to a particular construct. Procedures: evidence of homogeneity; convergent and discriminant validity evidence; group differentiation studies; factor analysis.
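For criterion-related evidence, the validity coefficient is ordinarily the Pearson correlation between test scores and scores on the external criterion. A minimal sketch follows, assuming small made-up arrays of test and criterion scores (the data and variable names are hypothetical, purely for illustration):

```python
# Criterion-related validity coefficient: correlation between test
# scores and an external criterion (hypothetical example data).
import numpy as np

test_scores = np.array([55, 62, 48, 71, 66, 59, 80, 45, 68, 74])        # e.g., admissions test
criterion_scores = np.array([2.4, 2.9, 2.1, 3.5, 3.0, 2.6, 3.8,
                             2.0, 3.1, 3.4])                             # e.g., first-year GPA

validity_coefficient = np.corrcoef(test_scores, criterion_scores)[0, 1]
print(f"criterion-related validity coefficient r = {validity_coefficient:.2f}")
```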

Validity Coefficient
>.50 — Very High
High
Moderate/Acceptable
<.20 — Low/Unacceptable
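A small sketch of how such a coefficient might be screened in practice, using only the two thresholds the table states explicitly; since the boundary between the High and Moderate/Acceptable bands is not given above, values in between are reported as a single range (my own simplification, not from the text):

```python
# Interpret a validity coefficient using only the thresholds stated on
# the slide; the intermediate band boundaries are not specified there.
def interpret_validity(r: float) -> str:
    if r > 0.50:
        return "very high"
    if r < 0.20:
        return "low / unacceptable"
    return "high to moderate/acceptable (between .20 and .50)"

print(interpret_validity(0.62))  # very high
print(interpret_validity(0.35))  # high to moderate/acceptable
print(interpret_validity(0.10))  # low / unacceptable
```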

Consequences of Testing in the Validation Process
If decisions made from test data are based on invalid results, then the decisions themselves are invalid.
Test score bias and unfair use of test scores can greatly influence the validity of test score use.