Measurement, Data and Information for Residential Aged Care

Overview Section One: An introduction to measurement, data and information
- Measurement, data and information defined
- Why measure?
- Some background to measurement practices

Measurement, Data and Information Defined
- Measurement: to ascertain the extent, dimensions or quality of something
- Data: known or available facts or figures
- Information: knowledge communicated or received concerning some fact or circumstance

Turning Data Into Useful Information and Knowledge

Why Measure?
Benefits derived from collecting, analysing and reporting data and information:
- Provides the home with the ability to demonstrate its current level of performance to all stakeholders and against stakeholder needs
- Enables the home to gather information about the needs of stakeholders in an ongoing way
- Provides information for strategic and improvement planning
- Enables the home to monitor how well its plans are being implemented

The Link between Data and Information, Self-Assessment and Continuous Improvement
As a regular and cyclical process, rigorous measurement and analysis of results:
- Enables the staff and management of the home to understand their current performance, for example through results of resident satisfaction, resident outcomes, clinical indicators, audit results and other survey results
- Ensures staff and management of the home have up-to-date knowledge of their current level of achievement against the Accreditation Standards
- As part of self-assessment or routine performance review, assists the home to identify areas where improvements could be made

The Link between Data and Information, Self-Assessment and Continuous Improvement
- Is an input into planning processes, enabling decisions about improvement, priorities, goals and resourcing
- Also informs the home's management of the results achieved through improvement projects previously implemented
- Ensures that staff and management of the home have up-to-date evidence available to support their accreditation application and evidence provided during a support contact

Data and Information as Part of Self-Assessment

Background to Measurement Practices
The CSIRO found the following problems with the collection and use of information; this package aims to address these problems:
- Little relationship existed between the strategic intents of the organisation and what was actually measured and reported
- Information usually focused only on financial performance
- Information did not support assessment and management of performance
- Information was largely operational
- Ambiguity about what was being measured
- Lack of capability to provide the appropriate analysis
- Lack of understanding of variation and statistics

Overview Section Two: Deciding what to measure
- Levels of measurement
- Start the thinking about measurement with what the home strives to achieve
- Processes drive achievement – understanding processes leads to other appropriate measures
- Measurement and the expected outcomes of the Accreditation Standards
- Where measurement associated with improvements fits in

Levels of Measurement
It is important to be clear about which measures are important and for what reason. For instance, measures may be used to:
- Understand how the home performs overall in its major areas of care and service
- Enable staff to manage a particular job or process, such as care planning or the catering services
- Track the success of a quality improvement project

Levels of Measurement
The following steps simplify the approach to measurement:
- Define what is important to the residents, other stakeholders and the home
- Identify what the residential aged care home wants to achieve
- Look at the major processes of the home and how they are undertaken
- Decide what information will assist staff and managers to assess performance at the various levels

Measurement – What the Home Strives to Achieve
It is important to begin with a review of what overall achievements the home aims for. Information about critical areas for achievement could be found through:
- Review of the home's strategic plans and major objectives
- Review of the mission and vision statements
- Workshops and discussion between the managers and the board

Resident Focus
Resident satisfaction and the satisfaction of other stakeholders are generally important high level measures for a home. Good management practice also focuses on the needs, expectations and satisfaction of other key stakeholders such as:
- Family members, carers and/or resident representatives
- Staff
- Visiting professionals
- The owners of the home
- Local and professional communities
- Other agencies

Understanding Processes, Outputs and Outcomes Results are what we aim for… Systems and processes help us achieve them!

When a Problem Arises

Measurement from a Process Perspective
There are three sorts of measures that may be relevant to the residential aged care home's major processes:
- Measurement of outcome
- Measurement of output
- Measurement within the process itself

Measurement of Outcome
Outcome measures provide answers to the following probing questions:
- "You did this… and so…?"
- "And the result of this effort was…?"
- "Did this benefit the residents? …the staff?"
They provide information about the ultimate results achieved.

Measurement Around Output
Output measures are a step removed from measures of outcome and can sometimes be seen as predictors of outcome. That is, if the results of output measures are positive, the results of the outcome measures are also likely to be positive.

Measurement Inside the Processes
Examples of commonly used in-process measures include staff compliance rates (with procedures and/or protocols), for example:
- compliance with residents' needs assessment procedures
- compliance with medication procedures
- compliance with cooking procedures and times
- compliance with occupational health and safety procedures such as hand washing and universal precautions
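
To make the idea of an in-process compliance rate concrete, here is a minimal sketch in Python; the audit results and the hand-washing example are hypothetical, not data from this presentation:

    # Hypothetical hand-washing audit: True = procedure followed, False = not followed.
    hand_washing_audit = [True, True, False, True, True, True, False, True, True, True]

    def compliance_rate(observations):
        """Percentage of audited observations that complied with the procedure."""
        if not observations:
            raise ValueError("no observations recorded")
        return 100.0 * sum(observations) / len(observations)

    print(f"Hand-washing compliance: {compliance_rate(hand_washing_audit):.0f}%")  # 80%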

Measurement and the Expected Outcomes of the Accreditation Standards
The issue of effectiveness is the link between processes and outcome measures. Where the home has robust information around outcomes, effectiveness may be relatively easy to assess. Without this data, the home will need other ways of demonstrating that care and services are effective.

Where Measurement Associated With Improvement Activities Fits In
Measurement can be used at various levels and to suit various purposes:
- Routine high level measures of performance for management and staff
- Routine measurement to enable the management of processes
- Targeted measurement for quality improvement projects

Where Measurement Associated With Improvement Activities Fits In
Each set of measures suits a particular purpose and as such:
- The various sets of measures should be considered differently and appropriately
- A comprehensive measurement system would most likely include measures to enable review of each of the three aspects of management described above
- It is important to distinguish between the measures being used

Measurement and Quality Improvement
The development of appropriate measures and targets for each quality improvement project enables the home to readily assess whether its improvement efforts are successful and, if measured over a defined period, will also show whether improvements have been sustained. Without such measurement and analysis it is difficult for a home to demonstrate the effectiveness of its improvements and the benefits achieved for residents and others.
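
As a sketch only, with invented monthly figures and an invented target, a quality improvement measure can be compared against its target month by month to show whether an improvement has been achieved and then sustained:

    # Hypothetical monthly falls per 1,000 occupied bed days for a QI project.
    monthly_falls_rate = {
        "Jan": 8.2, "Feb": 7.9, "Mar": 8.4,   # baseline months before the project
        "Apr": 6.1, "May": 5.8, "Jun": 5.9,   # months after the improvement was implemented
    }
    target = 6.5  # hypothetical target agreed for the project

    for month, rate in monthly_falls_rate.items():
        status = "meets target" if rate <= target else "above target"
        print(f"{month}: {rate:.1f} falls per 1,000 bed days ({status})")

    # The improvement is sustained if every post-implementation month meets the target.
    sustained = all(monthly_falls_rate[m] <= target for m in ("Apr", "May", "Jun"))
    print("Improvement sustained:", sustained)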

Overview Section Three: Considerations in data collection
- Aspects of effective data management
- Aspects of effective data collection
- Sampling
- Types of data and data collection tools
- Planning for routine data collection

Aspects of Effective Data Management
The appropriate collection of data, covered in this section, is one of the important aspects of a data management system that ensures the data and information do in fact provide value to the users. Effective data management also includes the presentation, analysis and interpretation of the data and information; these are covered in the next section.

Aspects of Effective Data Collection
Major aspects of data collection include:
- Clarity about the relevance and importance of the measure
- The definition of how the measure will be made or taken
- Clarity about who will collect the measure, where and when
- Definition of how the results will be reported – in what format and to whom
- Clarity about who has responsibility to respond to the results of the measurement

The Relevance and Importance of the Measure
If the purpose of any measure is not clear, or the results are difficult to understand or act on… why measure it? Try asking:
- What major area of care and/or service delivery does this measure relate to? Is it clear how it relates?
- Will it effectively tell us about the area of care and/or service delivery we are interested in?
- Are there other measures that would provide more accurate information about the area under study?
- Are there additional measures that would provide more complete information about the area under study?

Definition of How the Measure Will Be Made or Taken
The definition could include details of:
- Any tools required for the measurement
- The appropriate level of detail required in the measurement
- Instructions about timing
- Any associated information that should be collected at the same time
- How to record the measurement
- How to store the measurement once recorded

Who Will Perform the Measurement?
Some measurements could be taken easily by any staff member; others may require specialised skills and knowledge. The detail can include:
- The designated staff responsible for collecting the measurement
- Where they will source the measurement
- Detail around the choice of subject
- Detail around timing and sampling

Reporting of the Measurement
Clarity about reporting could include details of:
- How often the results are to be reported
- Who they are reported to, such as the department manager, the quality committee, the director of nursing, the management committee, or other forums
- In what format they will be reported, for instance as:
  - raw data
  - averages, ranges
  - tables
  - run charts
  - other formats as required

Identifying Who Has Responsibility to Respond to the Results
The person responsible will vary according to the particular measures being evaluated. For instance:
- The Chief Executive Officer, Director of Nursing, Care Manager or other senior member of staff is likely to be responsible for review of, and response to, the residential aged care home's outcome measures and other high level data
- Service managers such as unit managers, catering managers and a support services manager may be responsible for the review of, and response to, process measures used to monitor their individual activities

Sampling
There are several methods of sampling, four of which are described:
- Incidental sampling: the person conducting the measurement chooses whoever they wish. It may result in bias. Though easy, it is not recommended.
- Stratified sampling: involves identifying sub-groups within the population and choosing an appropriate number from each category.

Sampling
- Random sampling: minimises the introduction of bias and provides a sound basis for generalising the results found to the whole population. It is considered one of the best types of sampling.
- Systematic sampling: involves choosing a subject at regular intervals from the whole population of subjects, for instance every fifth or tenth subject from a list. It is not truly random but approaches this.
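
The random, systematic and stratified methods described above can be illustrated with Python's standard library; the resident list, wing names and sample sizes are hypothetical:

    import random

    residents = [f"Resident {i:02d}" for i in range(1, 51)]   # hypothetical population of 50

    # Random sampling: every subject has an equal chance of being chosen.
    random_sample = random.sample(residents, 10)

    # Systematic sampling: every fifth subject, starting from a random point in the list.
    start = random.randrange(5)
    systematic_sample = residents[start::5]

    # Stratified sampling: choose proportionally (about 20%) from each sub-group.
    wings = {"North": residents[:20], "South": residents[20:35], "East": residents[35:]}
    stratified_sample = []
    for wing, members in wings.items():
        stratified_sample.extend(random.sample(members, max(1, round(len(members) * 0.2))))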

What is a Good Approach to Sampling?
To decide the best overall approach for any particular study, audit or other assessment, consider:
- Are there particular sub-groups that are important to this measurement? If so, note this.
- Attempt to choose the subjects (residents, staff, files) as randomly as possible, choosing numbers from any relevant sub-groups proportionally.
- If sub-groups are not important in a particular study, attempt to choose files, residents or other subjects at random.

Sample Size
- Attempt to review approximately 10-20% of cases for any audit, review or assessment.
- Be flexible in the approach. For instance, if the audit or review shows unusual findings or mixed results, you may want to sample more subjects to develop a clearer view and understanding of the findings.
- When sampling from a sub-group, attempt wherever possible to have a minimum number (say five) in each individual sub-group.
- An inclusive approach, where all subjects are in the study, is entirely appropriate where the home has considered the importance of the assessment and plans the appropriate resources for the data collection.
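
A minimal sketch of the rules of thumb above (roughly 10-20% of cases, with a suggested minimum of five per sub-group); the sub-group sizes are invented for illustration:

    def suggested_sample_size(group_size, proportion=0.15, minimum=5):
        """Roughly 10-20% of cases, but at least `minimum`, and never more than the group itself."""
        return min(group_size, max(minimum, round(group_size * proportion)))

    # Hypothetical sub-groups: resident files held in each wing.
    for wing, files in {"North": 60, "South": 25, "Respite": 8}.items():
        print(f"{wing}: review {suggested_sample_size(files)} of {files} files")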

Types of Data and Data Collection Tools
- Quantitative data: can be counted; the results can be described and analysed numerically. Examples of data collection tools to gather quantitative data include audits, check sheets, tick sheets and count sheets.
- Qualitative data: is generally concerned with individuals and their opinions, thoughts, feelings, experiences and other feedback. Examples of qualitative data collection tools are surveys, questionnaires, one-to-one interviews and focus groups.

Audits
An audit can be defined as "a planned, independent, and documented assessment to determine whether agreed-upon requirements are being met" (Arter, D.R., 1994).
Some considerations before developing any audit:
- Be clear about the objectives of the audit
- Ensure that this element of care and/or service is important and warrants its own auditing approach – never audit for the sake of auditing
- Be clear about exactly what you will do with the results

Collecting Data on Opinions
Steps in developing a questionnaire:
- Define what it is you want to ask about
- Draft the tool (such as the survey, questionnaire or interview process)
- Test the tool
- Refine the tool based on the findings from the test
- Conduct the process and review
It is clear that this process follows the simple Plan-Do-Check-Act cycle common in any quality improvement exercise.

Planning for Routine Data Collection
Most residential aged care homes collect a range of data using a range of tools. It is a useful exercise to list the major data collection activities and schedule them. Preparing such a schedule provides a number of advantages, such as:
- Communicating to staff, managers and others the range of measurement activities
- Providing a timetable for staff and managers for their particular audits and activities
- Providing a platform for review of the measurement activities
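
One simple way to hold such a schedule is as structured data that can be printed or reviewed; the activities, frequencies and owners below are hypothetical examples only:

    # Hypothetical routine data collection schedule for a residential aged care home.
    measurement_schedule = [
        {"activity": "Medication audit",              "frequency": "Monthly",   "owner": "Care Manager"},
        {"activity": "Resident satisfaction survey",  "frequency": "Annually",  "owner": "Director of Nursing"},
        {"activity": "Hand-washing compliance audit", "frequency": "Quarterly", "owner": "Infection Control Lead"},
        {"activity": "Catering temperature checks",   "frequency": "Daily",     "owner": "Catering Manager"},
    ]

    for item in measurement_schedule:
        print(f"{item['activity']:<32} {item['frequency']:<10} {item['owner']}")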

Reviewing the Data that is Collected and Reported
Some useful questions to review measurement activities. For each audit, survey or other data collection exercise ask:
- How does this measurement activity fit with the information needs of the residential aged care home? How important is it?
- How well does the data and information meet the needs of the user?
- What do the results show?

Overview Section Four: Presentation and analysis of data
- Aspects of effective data management
- Effective presentation and analysis of data
- Presentation and analysis of quantitative data
- Case study and exercise

Aspects of Effective Data Management
This section addresses the final cornerstone of an effective data management system – the way that data are presented and analysed, turning data into useful information and knowledge.

Effective Presentation and Analysis of Data
The choice of format for the presentation of data relies on two important considerations: the type of data and the audience.
- Quantitative data: tables, bar charts and histograms, run charts, control charts, pie charts, Pareto charts, scatter diagrams
- Qualitative data: flowcharts, cause and effect diagrams

Effective Presentation and Analysis of Data
Whichever format is chosen, it is important that data and information are presented as clearly and accurately as possible, with attention to standard considerations for data presentation such as:
- A clear title for the data and information set
- Clear dates and other identifying information
- The origin of the data clearly identified on the data and information set (for instance "this data was collected from the north wing of XYZ Nursing Home")
- Details of who collected the data
- Consideration of what features are required to enable the user to appropriately analyse the information

Presentation and Analysis of Quantitative Data
Single data points for results (such as a solitary figure in a table in a monthly report) rarely provide meaningful information to the user. The ability to analyse results is strengthened when comparative information is also provided along with the current results. Whatever presentation format is chosen, the chart or table should be clean and uncluttered. The baseline against which to compare current results (say the median and range) can be calculated from the previous 12-18 months of data if these results appear to have been relatively stable.
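
As a sketch, the median-and-range baseline described above can be calculated from the previous 12 months of results and compared with the current month; the figures here are invented:

    import statistics

    # Hypothetical monthly results (e.g. number of resident complaints) for the previous 12 months.
    previous_12_months = [4, 6, 5, 7, 5, 4, 6, 5, 6, 7, 5, 6]
    current_month = 9

    baseline_median = statistics.median(previous_12_months)
    baseline_low, baseline_high = min(previous_12_months), max(previous_12_months)

    print(f"Baseline median: {baseline_median}, range: {baseline_low}-{baseline_high}")
    if not (baseline_low <= current_month <= baseline_high):
        print(f"Current result ({current_month}) falls outside the baseline range and warrants review.")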

Presentation and Analysis of Quantitative Data
If significant improvements are achieved over time, and sustained, then the baseline will need to be recalculated to accurately represent the current system and its results. A new standard has been set. If comparing your residential aged care home's results with those of others, ensure that the details of data collection and presentation are the same. This simple point lies at the root of much dissatisfaction with many data comparison programs. It may be more valuable to focus on refining and developing the home's own internal measurement system to best meet its own needs before embarking on external comparative programs.

Analysis of Quantitative Data: Important Note
Where results for a particular month are in line with what has been previously achieved, this indicates the home's performance is stable. That is, with all else being equal and nothing changing, the results can be expected to continue. It is quite another question whether the results are acceptable. The use of an acceptable range or a target strengthens the ability to determine if results achieved are acceptable.

Presentation and Analysis of Qualitative Data
Qualitative data includes such things as staff ideas, resident ideas and suggestions, and the process knowledge and understanding of staff and managers. Presentation formats that are useful for qualitative data include:
- Flow charts
- Cause and effect diagrams
Note: whilst surveys and questionnaires collect qualitative data (such as opinions and perceptions), the analysis of survey and questionnaire results commonly involves assigning a numeric value to the responses. As such, numerical results for survey responses are presented and analysed as for quantitative data.
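
As the note above suggests, survey responses are usually given numeric values before being analysed like quantitative data. A minimal sketch, assuming a hypothetical five-point satisfaction scale and invented responses:

    import statistics

    # Hypothetical mapping from response wording to a numeric score.
    likert_scale = {"Very dissatisfied": 1, "Dissatisfied": 2, "Neutral": 3,
                    "Satisfied": 4, "Very satisfied": 5}

    # Hypothetical responses to "How satisfied are you with the meals?"
    responses = ["Satisfied", "Very satisfied", "Neutral", "Satisfied",
                 "Dissatisfied", "Satisfied", "Very satisfied"]

    scores = [likert_scale[r] for r in responses]
    print(f"Responses received: {len(scores)}")
    print(f"Average score: {statistics.mean(scores):.1f} out of 5")
    print(f"Satisfied or better: {100 * sum(s >= 4 for s in scores) / len(scores):.0f}%")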