
Outcome measures: Let's choose one!

What is the deal with outcome measures? It's more than a PHAB COZI COAT to be worn on a SADL.

Outcome measures
Also called hearing aid outcome, self-assessment, or self-report measures. The purpose is determining patient-perceived benefit in order to:
- Know how to adjust the hearing aids
- Change to another hearing aid
- Determine what counseling is needed
- Validate

What is Validation?
- Validation of my feelings
- Validation: the act of validating; finding or testing the truth of something; the cognitive process of establishing proof
- Confirmation that something (application, experiment, equipment) consistently fulfills the requirements for a specific use
- Making or declaring valid; proof; confirmation

VALIDATION of HEARING AIDS
- Confirm that the hearing aid is providing benefit
- We give patients hearing aids, and we want to be sure they're helping

Is anyone validating?
- The majority of dispensers do not administer self-assessment outcome measures (Lindley, 2006)
- A report from an AuD class study said that 80% of practitioners use outcome measures
- Subjective outcomes seem to have become the "gold standard" (Mendel, 2009)

Informal Survey of 41 Offices in NY and NJ

Investigation of 41 offices
- Do you use any formal standardized outcome measures? If so, which one(s)?
- If not, do you have your own that you've developed?
- Or, do you use a more "informal" interview method?

Embarrassed

Are they being used?
- <5% (2 out of 41) use outcome measures
- >95% DO NOT!

Typical responses
- I know we should…
- We've talked about it at meetings
- I want to…
- We used to but…
- Use them with difficult cases
- Use real ear
- Use pre and post testing
- Use intake questionnaire
- Use data logging or diaries

Verification: 17% (7 out of 41) use some type of verification as validation

Validation vs. Verification
- Verification is measurement to see whether the gain/output is matching the proposed targets
- Real-ear measurement
- Aided vs. unaided testing: aided speech discrimination, aided speech in noise
- All of these are VERIFICATION

Other findings:
- 4 of 41 offices developed their own measure; one of those 4 developed one that looks at ease of use
- 2 people in the survey were involved in developing measures and are NOT using them
(The percentages quoted in the last few slides are sketched in the example below.)
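For readers who want to check the figures, here is a minimal sketch of the survey arithmetic, written for this transcript in Python; the counts are the ones reported on the slides above.

```python
# Counts from the informal survey of 41 offices reported on the slides above.
total_offices = 41
counts = {
    "Use formal outcome measures": 2,
    "Use verification as validation": 7,
    "Developed their own measure": 4,
}

for label, n in counts.items():
    print(f"{label}: {n}/{total_offices} = {n / total_offices:.1%}")

# Output:
#   Use formal outcome measures: 2/41 = 4.9%
#   Use verification as validation: 7/41 = 17.1%
#   Developed their own measure: 4/41 = 9.8%
```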

Validation
- Are the matched targets appropriate?
- Treatment effectiveness
- Treatment efficiency
- Treatment effect
(Weinstein, 1997)

ICF
The ICF is the World Health Organization's International Classification of Functioning, Disability and Health. Its three main areas are:
- Structure and function (relating to the actual hearing impairment)
- Activities and activity limitations (previously referred to as disability)
- Participation and participation restrictions (previously referred to as handicap)
(WHO-DAS)

What is validation of hearing aids looking at?
- Reduction of handicap
- Acceptance
- Benefit
- Satisfaction

Acceptance
- If they provide benefit
- If they are satisfied
- If they reduce handicap
- If they can physically use them
- Data logging

Three different types of measures
- Outcome measures
- Pre-fitting measures
- Satisfaction measures

Satisfaction Measures
- SADL (Satisfaction with Amplification in Daily Life), by Cox and Alexander: 15 items in 4 areas, with 7 possible ratings
- MarkeTrak, by Kochkin: examines 5 areas, with multiple questions under each and 5 possible ratings
(A rough scoring sketch follows below.)
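As a rough illustration of how a satisfaction questionnaire of this shape is scored, the sketch below averages 7-point item ratings into per-area scores. It is only a sketch: the four area names are the ones commonly cited for the SADL, but the item-to-area assignment is invented for illustration and is not the published scoring key.

```python
from statistics import mean

# Illustrative scoring of a 15-item, 7-point satisfaction questionnaire (SADL-like).
# The item-to-area assignment below is hypothetical, not the published SADL key,
# and reverse-keyed items, if any, are ignored.
AREAS = {
    "Positive Effect":   [1, 2, 3, 4, 5, 6],
    "Service & Cost":    [7, 8, 9],
    "Negative Features": [10, 11, 12],
    "Personal Image":    [13, 14, 15],
}

def score(responses: dict[int, int]) -> dict[str, float]:
    """responses maps item number (1-15) to a rating from 1 (low) to 7 (high)."""
    scores = {area: mean(responses[i] for i in items) for area, items in AREAS.items()}
    scores["Global"] = mean(responses.values())
    return scores

# Example: a generally satisfied patient.
ratings = [6, 5, 6, 7, 5, 6, 4, 5, 5, 3, 4, 2, 5, 6, 5]
patient = {i: r for i, r in enumerate(ratings, start=1)}
print(score(patient))
```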

What are we left with?
- Pre-fitting measures
- Outcome measures
- Often accomplished with one tool

Importance of patient perception
- First looked at in 1947 (Davis and Silverman)
- Aided speech testing doesn't work
- One of the first printed assessments appeared in 1964 (High, Fairbanks, and Glorig)

Why look at self-report measures?
- Healthcare is customer driven
- Real-world performance cannot be simulated in the office
- Using evidence-based assessment
- Need to somehow justify the use of technology like directional microphone advancements
- For insurance purposes

Other reasons to consider
- New graduates
- High-frequency hearing loss
- Counseling and realistic expectations
- Reduced return rates

Reduction of Returns
- Study by Peterson and Bell (2004): a 5-year study with a return rate of 15.2% (includes all returns and previous vs. experienced users)
- NO mention of the use of formal standardized outcome measures

Objective vs. Subjective measures
- Subjective: formal questionnaire or interview
- Objective: formal questionnaire with multiple choice or a rating system

Choosing the right tool for you
Prioritize your goals. Your goals might be to:
- Evaluate the benefit of the hearing aid fitting
- Diagnose fitting problems
- Predict fitting success
- Compare the fitting to similar fittings
- Compare different hearing aids
- Address the patient's real-life concerns
(Cox, 2005)

Specifying Essential Features
Based upon your specifications:
- Goals
- Population
- Setting
(Cox, 2005)

Limit your choices
- Narrow the field to 4 to 6 possible measures
- Obtain a copy of each
- Review the items and instructions
(Cox, 2005)

Appreciating the Fundamentals (looking at the features of the measurement you're considering)
- Learning to administer the test
- The patient's burden
- Scoring the test
- Is the test valid?
- Is the test sensitive enough?
(Cox, 2005)

Choose the best compromise
- Nothing is a perfect fit
- Learn about the questionnaire
- Become familiar with it
- Decide whether or not it is for you after 20 to 30 uses
(Cox, 2005)

Readily Available
- COSI: in Phonak software
- APHAB: in Noah

Susan's Quick Guide
- Review choices
- Eliminate
- Narrow down choices
- Get copies
- Review the test
- Administer
- Choose

APHAB: Abbreviated Profile of Hearing Aid Benefit (Cox)
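A brief sketch of the idea behind an APHAB-style benefit score may help here: each listening situation is rated both unaided and aided as the percentage of time problems occur, and benefit is the unaided problem score minus the aided one. The letter-to-percentage mapping and the six-item subscale below follow commonly published APHAB conventions, but treat the details as illustrative rather than as the official scoring key.

```python
from statistics import mean

# Illustrative APHAB-style benefit calculation (not the official scoring key).
# Each item is answered twice, unaided and aided, on a 7-point letter scale that
# maps to "percent of the time I have problems". Reverse-keyed items are ignored.
LETTER_TO_PERCENT = {"A": 99, "B": 87, "C": 75, "D": 50, "E": 25, "F": 12, "G": 1}

def subscale_score(letters: list[str]) -> float:
    """Mean problem percentage for one subscale (e.g. six Background Noise items)."""
    return mean(LETTER_TO_PERCENT[x] for x in letters)

def benefit(unaided: list[str], aided: list[str]) -> float:
    """Positive benefit means fewer problems with the hearing aid than without."""
    return subscale_score(unaided) - subscale_score(aided)

# Example: a Background Noise-style subscale, six items rated unaided then aided.
unaided_bn = ["A", "B", "A", "C", "B", "B"]
aided_bn   = ["D", "E", "D", "E", "C", "E"]
print(f"Unaided: {subscale_score(unaided_bn):.0f}% problems")  # Unaided: 89% problems
print(f"Aided:   {subscale_score(aided_bn):.0f}% problems")    # Aided:   42% problems
print(f"Benefit: {benefit(unaided_bn, aided_bn):.0f} points")  # Benefit: 47 points
```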

HHIE: Hearing Handicap Inventory for the Elderly (Weinstein)

COSI: Client Oriented Scale of Improvement (Dillon)
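COSI works differently from a fixed questionnaire: the patient nominates their own listening situations at the initial visit, and each one is later rated for degree of change and final ability. Below is a minimal sketch of what a COSI-style record might look like; the category labels are the commonly used ones, but treat the exact wording and structure as illustrative rather than the official form.

```python
from dataclasses import dataclass

# Illustrative COSI-style record: patient-nominated goals rated after the fitting.
DEGREE_OF_CHANGE = ["Worse", "No difference", "Slightly better", "Better", "Much better"]
FINAL_ABILITY = ["Hardly ever", "Occasionally", "Half the time", "Most of the time", "Almost always"]

@dataclass
class CosiGoal:
    situation: str       # listening situation nominated by the patient up front
    change: str          # degree of change reported after the fitting
    final_ability: str   # how often the patient now hears adequately in that situation

    def __post_init__(self):
        # Guard against typos in the category labels.
        assert self.change in DEGREE_OF_CHANGE
        assert self.final_ability in FINAL_ABILITY

goals = [
    CosiGoal("Conversation with spouse at the dinner table", "Much better", "Almost always"),
    CosiGoal("Hearing the television at a comfortable volume", "Better", "Most of the time"),
    CosiGoal("Following conversation in a noisy restaurant", "Slightly better", "Half the time"),
]

for g in goals:
    print(f"{g.situation}: {g.change} / {g.final_ability}")
```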

What tests may not address
- Personality
- Cognitive ability

Suggestions
- Evaluate your return rate
- Try using an outcome measure: open-ended in an interview, or closed-ended administered by the dispenser
- Promote communication
- Begin using outcome measures
- Re-evaluate your return rate
- Evaluate the time difference

Thank you