Slide 1: Development of Valid and Reliable Case Studies for Teaching, Diagnostic Reasoning, and Other Purposes
Margaret Lunney, RN, PhD
Professor, College of Staten Island, The City University of New York

Slide 2: What Are Case Studies?
- Simulations of patients' stories: written, videotaped, or computer-based
- Designed for specific purpose(s)
- Variety of designs: length, content, and style (linear or branching)

Slide 3: When Are Case Studies Used?
- Practice: e.g., orientation of new nurses, discussion of complex cases
- Education: e.g., teaching aids, testing
- Research: e.g., measurement of accuracy, evaluation of knowledge

Slide 4: Advantages of Simulated Patients' Stories
- Standardized
- Represent important, usual, familiar, and challenging situations
- Restrict the complexity of practice to a manageable level
- Users "get involved"
- Reasonable cost

Slide 5: Why Is It Difficult to Develop Good Case Studies?
- Case studies are tools
- Quality = validity and reliability
- Development takes time, energy, and commitment

Slide 6: Case Studies as Tools
- Goal: strengthen the link between knowledge and application
- How: operationalize complex, abstract concepts through "real" patients' stories, e.g., Hope, Caregiver Stress, Ineffective Self-Health Management

Slide 7: Case Studies as Tools to Measure Nursing Concepts
- Nursing concepts: nursing diagnoses
- Capture key elements of a real patient situation
- Use principles of measurement
- Instrumentation: the method used to measure concepts

Slide 8: Case Studies as Tools (cont.)
- Measurement: definition (see Waltz, Strickland, & Lenz, 2005)
- Importance of conceptual frameworks
- Challenges of measurement

Slide 9: Case Studies as Tools (cont.)
- The dilemma and challenge: patients' stories are complex
  - Overlapping variables
  - Biases in interpretation
  - Use of heuristics
  - Many types of thinking
- Goal: reduce ambiguous, abstract ideas to concrete behavioral indicators

Slide 10: Measurement Frameworks
- Norm-referenced: evaluate performance relative to the performance of others
- Criterion-referenced: evaluate performance against predetermined standards
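
To make the contrast concrete, here is a minimal sketch; the cohort scores and the 80-point cut score are invented for illustration and do not come from the presentation. The same raw score is interpreted relative to a cohort under the norm-referenced framework and against a fixed standard under the criterion-referenced framework:

```python
# Illustrative only: cohort scores and the 80-point cut score are invented.

def percentile_rank(score, cohort):
    """Norm-referenced: where a score falls relative to other test takers."""
    below = sum(1 for s in cohort if s < score)
    return 100.0 * below / len(cohort)

def meets_standard(score, cut_score):
    """Criterion-referenced: whether a score meets a predetermined standard."""
    return score >= cut_score

cohort = [62, 70, 74, 78, 81, 85, 88, 90, 93, 97]
score = 78

print(f"Percentile rank (norm-referenced): {percentile_rank(score, cohort):.0f}")
print(f"Meets 80-point cut (criterion-referenced): {meets_standard(score, 80)}")
```

Note that the two frameworks can disagree: a score of 78 here outranks a third of the cohort yet still fails the predetermined standard.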

Slide 11: Case Studies as Criterion-Referenced Tools: Reliability
- Definition: consistency of scoring
- Because the range of variability is reduced, use nonparametric procedures
- Approaches: test-retest, parallel forms, interrater and intrarater agreement
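
As a minimal sketch of how interrater agreement might be quantified, here are two common indices, simple percent agreement and the chance-corrected Cohen's kappa; the ratings are hypothetical, and the presentation does not prescribe a particular statistic:

```python
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Proportion of items on which two raters assign the same category."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected interrater agreement (Cohen's kappa)."""
    n = len(rater_a)
    p_o = percent_agreement(rater_a, rater_b)
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    # Expected agreement by chance, from each rater's marginal proportions.
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings: two raters scoring 10 responses as correct (1) / incorrect (0).
a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
b = [1, 1, 0, 1, 1, 1, 0, 0, 1, 1]
print(f"Percent agreement: {percent_agreement(a, b):.2f}")  # 0.80
print(f"Cohen's kappa:     {cohens_kappa(a, b):.2f}")       # 0.52
```

Percent agreement is intuitive but can look high by chance alone when one category dominates, which is why a chance-corrected index such as kappa is often reported alongside it. The same indices serve intrarater agreement by comparing one rater's scores on two occasions.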

Slide 12: Case Studies as Criterion-Referenced Tools: Validity
- Definition: the tool measures what was intended; systematic error is reduced
- Types: content validity, criterion-related validity, construct validity

Slide 13: Case Studies as Criterion-Referenced Tools: Tasks
- Precisely specify target behaviors
- Identify standards for each target behavior
- Discriminate who did and did not acquire the target behavior
- Compare subjects' performance to the standards

Slide 14: Case Studies as Tools (cont.)
Types of written simulations:
- Linear
- Branching: free branching, modified free branching, or forced branching

Slide 15: Linear Technique
- Subjects follow the same sequence
- One or more sections; if more than one, students receive specific instructions
- Sections are appropriate for all subjects
- One section does not influence other sections
- Each section samples inquiries or actions

Slide 16: Guidelines for Case Study Development: Linear Format
1. Identify the overall purposes:
   - What is the general topic?
   - What kind of problem solving will be represented?
   - How important are the responses, e.g., learning process, grade, research data?

Slide 17: Guidelines (cont.)
2. Specify objectives, e.g., to:
   - Measure accuracy of diagnosis
   - Illustrate the relation of cues to inferences
   - Facilitate planning
   - Orient new nurses
   - Teach sequential decision making

Slide 18: Guidelines (cont.)
3. Decide on the complexity:
   - Length
   - Number of diagnoses, interventions, and/or outcomes
   - Amounts of high-, moderate-, and low-relevance data
   - Types of data

Slide 19: Guidelines (cont.)
4. Obtain literature sources and nurse experts to support validity:
   - Select authoritative sources: actual cases, literature
   - Decide the importance of each source
   - Identify expert judges, e.g., colleagues
   - Use well-known experts for research tools

Slide 20: Guidelines (cont.)
5. Formulate case studies, directions, and a scoring manual:
   - Directions must be explicit, comprehensive, and clearly written
   - The scoring manual assigns scores to possible answers

Slide 21: A) Create a Blueprint
- Choose a general problem area:
  - What should this exercise teach or test?
  - What kinds of problems need to be solved?
  - What kinds of knowledge need to be used?
- Define the objectives to be sampled
- Select concrete problem situation(s)
- Outline and diagram the exercise

Slide 22: B) Prepare Specifications
- Method of administration
- Proportion of content for each objective
- Style of writing the case(s)
- Restrictions, e.g., time
- Scoring procedures

Slide 23: C) Construct Case Studies
- Develop a pool of data/items to match the objectives
- Review to determine content validity and appropriateness
- Edit, delete, and change as indicated
- Assemble the case(s)

Slide 24: D) Set Standards or Cut Scores
- What types of answers are "correct"?
- What answers/scores are acceptable or unacceptable?
- How are various answers classified, and how many points count toward a grade?
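
As an illustration of how a scoring manual and a cut score might work together, here is a minimal sketch; the answers, point values, and cut score are all hypothetical, not taken from the presentation:

```python
# Hypothetical scoring manual for one case-study item: maps each anticipated
# answer to points, then classifies a total score against a predetermined
# cut score. All answers, point values, and the cut score are invented.

SCORING_MANUAL = {
    "acute pain": 2,   # fully accurate diagnosis for the (hypothetical) case
    "pain": 1,         # acceptable but less specific
    "anxiety": 0,      # not supported by the case data
}

CUT_SCORE = 0.8  # proportion of available points required to meet the standard

def score_responses(responses, manual=SCORING_MANUAL):
    """Total the points for a subject's answers; unanticipated answers score 0."""
    return sum(manual.get(r.lower().strip(), 0) for r in responses)

def classify(total, max_points, cut=CUT_SCORE):
    return "meets standard" if total / max_points >= cut else "does not meet standard"

total = score_responses(["Acute Pain"])
print(total, classify(total, max_points=2))  # -> 2 meets standard
```

Making the manual explicit in this way is what lets two raters (or a program) classify every anticipated answer the same way, tying the cut score back to the reliability goals on slide 11.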

Slide 25: Guidelines (cont.)
6. Obtain content validity ratings from experts:
   - Select a content validity method: item-objective congruence, interrater agreement, or average congruency percentage
   - Send materials with explicit instructions
   - Be prepared to make changes
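
For the average congruency percentage, each expert rates every item as congruent or not congruent with its stated objective; each expert's percentage is computed and then averaged across experts. A sketch under hypothetical ratings (the 90% benchmark is the threshold commonly cited in the measurement literature, e.g., Waltz, Strickland, & Lenz, not a figure from these slides):

```python
# Minimal sketch of the average congruency percentage (hypothetical ratings).
# Each expert rates every item as congruent (1) or not congruent (0) with
# its objective.

def average_congruency_percentage(ratings_by_expert):
    per_expert = [100.0 * sum(r) / len(r) for r in ratings_by_expert]
    return sum(per_expert) / len(per_expert)

ratings = [
    [1, 1, 1, 0, 1, 1, 1, 1, 1, 1],  # expert 1: 90% of items congruent
    [1, 1, 1, 1, 1, 1, 1, 0, 1, 1],  # expert 2: 90%
    [1, 1, 1, 1, 1, 1, 1, 1, 1, 1],  # expert 3: 100%
]
acp = average_congruency_percentage(ratings)
print(f"Average congruency percentage: {acp:.1f}%")  # ~93.3%; >= 90% often cited as acceptable
```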

Slide 26: Guidelines (cont.)
7. Evaluate with a pilot test:
   - Consider demographics, e.g., experience and knowledge
   - Evaluate directions, restrictions, scoring methods, and interrater reliability

Slide 27: Conclusion
- Case studies are fun to use
- The effort is "worth it"
- New case study feature in the International Journal of Nursing Terminologies and Classifications (IJNTC); send submissions to Margaret Lunney
- Suggested website: