Janet Maher November 1, 2011. Curriculum Development: design and monitor a collaborative process for achieving consensus on core competencies.

Presentation transcript:

Janet Maher November 1, 2011

• Curriculum Development: design and monitor a collaborative process for achieving consensus on core competencies
• Participant/Learner satisfaction with the learning experience
• Participant/Learner outcomes linked to the learning experience

• To ensure that study protocols meet the highest possible standards with regard to:
  ◦ Quality of evidence
  ◦ Format (engagement, ease of use)
  ◦ Practicality (usefulness in an applied setting)
  ◦ Feasibility (ease of implementation)
  ◦ Maintenance (time and cost required to maintain knowledge)

• Background
  ◦ Historically, a disjuncture between quantitative and qualitative research over the rigour and trustworthiness of analysis and, by implication, of the data used
• Methodological debates developed 'parallel' standards for qualitative and quantitative research
• Since 2000: a move back to reviewing both types in terms of validity and reliability and the strategies for achieving them

This presentation focuses primarily on quality of evidence and relies on two main sources, both posted on igloo: Morse et al. (2002) in nursing and Golafshani (2003) in education.

• Refers to
  ◦ Reproducibility of an outcome
• Tested by
  ◦ Whether the study protocol sufficiently specifies a procedure so that others can use it to achieve consistent and stable results with other, similar populations
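
Purely as an illustrative sketch (the instrument, administrations, and scores below are hypothetical, not from the slides), this kind of reproducibility can be checked by correlating results from two administrations of the same instrument carried out under the same protocol:

    from statistics import correlation  # available in Python 3.10+

    # Hypothetical scores from two administrations of the same instrument,
    # run by different people following the same written protocol.
    first_administration = [72, 65, 88, 91, 54, 77, 69, 83]
    second_administration = [70, 68, 85, 93, 57, 75, 71, 80]

    # Test-retest consistency: a high Pearson correlation suggests the
    # protocol yields stable results when repeated.
    r = correlation(first_administration, second_administration)
    print(f"Test-retest correlation: {r:.2f}")

A coefficient near 1.0 would support the claim that the protocol is specified tightly enough to be reproduced; what counts as "high enough" is a judgment call for the team.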

• Refers to
  ◦ The extent to which a given test accurately represents the features of the phenomena it is intended to describe, explain or theorize (Hammersley, 1992)
• Tested by
  ◦ Agreement between two or more efforts to measure the same thing using different indicators or measures
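
As a minimal sketch of that kind of agreement check (the measures, categories, and judgments below are hypothetical), two different indicators of the same competency can be compared with a chance-corrected agreement statistic such as Cohen's kappa:

    # Hypothetical pass/fail judgments of the same competency from two
    # different measures: a written test and an observed performance task.
    written_test = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail"]
    observation = ["pass", "pass", "fail", "fail", "fail", "pass", "pass", "pass"]

    n = len(written_test)
    observed = sum(a == b for a, b in zip(written_test, observation)) / n

    # Chance agreement: product of each measure's marginal proportions,
    # summed over the categories, as in Cohen's kappa.
    categories = set(written_test) | set(observation)
    chance = sum((written_test.count(c) / n) * (observation.count(c) / n) for c in categories)

    kappa = (observed - chance) / (1 - chance)
    print(f"Observed agreement: {observed:.2f}, Cohen's kappa: {kappa:.2f}")

Substantial agreement across independent indicators is evidence that they capture the same underlying construct; low agreement flags a validity problem with one or both measures.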

• The main objective is to persuade peers that the results can be generalized
  ◦ The protocol should summarize criteria, including interest in typical or critical cases, in a fashion similar to quantitative strategies, including:
    ▪ Sampling strategy
    ▪ Review at several points during data collection by different coders/analysts, not just at the end (a minimal agreement check is sketched after this list)
    ▪ Triangulation: use and document different strategies to validate
    ▪ Verification strategies: audit trail, member check
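
One way to make that ongoing review concrete (the coders, codes, and review points below are hypothetical) is to track simple inter-coder agreement at each checkpoint during data collection, so that drift is caught early rather than discovered at the end:

    # Hypothetical codes applied by two analysts to the same excerpts at
    # three review points spread across data collection.
    review_points = {
        "week 1": (["barrier", "facilitator", "barrier", "other"],
                   ["barrier", "facilitator", "barrier", "barrier"]),
        "week 2": (["facilitator", "other", "other", "barrier"],
                   ["facilitator", "other", "barrier", "barrier"]),
        "week 3": (["barrier", "barrier", "facilitator", "other"],
                   ["barrier", "barrier", "facilitator", "other"]),
    }

    for point, (coder_a, coder_b) in review_points.items():
        agreement = sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)
        print(f"{point}: {agreement:.0%} agreement")

Falling or persistently low agreement at a checkpoint is a prompt to revisit the coding frame with the whole team, and the record of these checks doubles as an audit trail for the verification strategies listed above.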

I suggest we revisit the process of the past 10 days or so of collaborating on the design of our brand:
• Sampling strategy: the whole core team, with shared objectives and a range of expertise/competencies, some of it overlapping
• Review at several points during data collection by different coders/analysts, not just at the end: moderator(s) intervene and take account of contrary information
• Triangulation: not done formally, but it could be done based on the materials on our forums
• Verification strategies: not done formally, but documentation is available; it would be good if it were all in one place

• To summarize, consider the advantages of a mixed-methods approach
• Look at the other criteria for study tools:
  ◦ Format (engagement, ease of use)
  ◦ Practicality (usefulness in an applied setting)
  ◦ Feasibility (ease of implementation)
  ◦ Maintenance (time and cost required to maintain knowledge)

• Do you agree with the primary focus on quality of evidence?
• Can we commit to careful documentation of the process through the igloo community space(s)?
• Can you help me find better search terms around polling and distance education?
• Thanks, Janet