Meredith Davison, Mike Roscoe

Presentation transcript:

Program Directors 101 Pando™ Workshop: Program Evaluation. Meredith Davison and Mike Roscoe

Objectives
Differentiate between student outcomes assessment and program evaluation
Describe the key components of an effective assessment or evaluation plan
Identify instruments for efficient data collection
Develop and implement an effective program evaluation plan
Define the components of a needs assessment
Develop and implement an effective needs assessment
Define the elements of an ongoing self-study process
Define and implement an effective ongoing self-study process
Discuss data-driven program and curriculum changes

Program Evaluation
Program evaluation is CRITICAL to the success of PA programs.
An effective plan starts with a well-developed strategy for data collection, analysis, and action.
Evaluation (assessment) is one of the more difficult concepts program directors and faculty face.

Three Components
The three major components of any assessment plan are: 1) data collection, 2) data analysis, and 3) an action plan.
Assessment is a DYNAMIC process, NOT a static event or a separate, distinct process.

Discussions about assessment plans often start with data collection… BUT, you have to know where you are going before you build a road

Triangulation
Information from more than one source allows for greater perspective and more meaningful analysis, which leads to a better action plan.
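
For illustration, a minimal Python sketch of triangulating one program outcome across several data sources; the source names, scores, and benchmark below are hypothetical, not drawn from this workshop.

```python
# Hypothetical triangulation of one program outcome across three data sources,
# each rescaled to 0-100. All names and values are illustrative.
sources = {
    "summative exam": 82.0,
    "preceptor evaluations": 74.0,
    "graduate self-assessment": 90.0,
}
benchmark = 80.0

met = {name: score >= benchmark for name, score in sources.items()}

if all(met.values()):
    print("All sources agree: benchmark met.")
elif not any(met.values()):
    print("All sources agree: benchmark not met.")
else:
    # Disagreement between sources is itself a finding worth analyzing.
    unmet = [name for name, ok in met.items() if not ok]
    print("Sources disagree; review:", ", ".join(unmet))
```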

Information for your assessment plan must be organic to YOUR program: mission statement, goals, competencies, and outcomes.

Program Evaluation
Answers the question: Is everything that happens working to the extent "we" would like it to work? Who is "we"?
Framework: relevance, measurability, improvability.

Evaluation Purposes
Program improvement
Data sharing with partners
Reporting and accountability

Definitions
Program-specific items: mission statement, goals, objectives (programmatic)
Student-specific items: competencies, learning outcomes, instructional objectives

Essential Elements/Tools
Timeline assessment map(s): goals/program items; outcomes/student items; who/what/when/person responsible
"Problem" document (needs assessment/self-study)
Dynamic and accessible to faculty/committee/program director
Strategy to report and disseminate results

Example assessment map (columns: program area including person responsible; data source and methods; timelines for data collection; critical indicators; date assessed: met/not met)
Program area: Mission, Philosophy, Goals, and Expected Outcomes
Row 1: Philosophy, principles, and values of the program (Program Director). Data source and methods: written curriculum plan. Timeline: spring and summer faculty retreats. Critical indicator: philosophy and values of the program are congruent with the University's mission and philosophy; indicator: evidence of contemporary health policy related to disparities, equity, and access embedded throughout the curriculum.
Row 2: Philosophy and values are consistent with professional core values (Program Faculty). Data source and methods: program philosophy, goals, and mission; University and school mission. Timeline: January external advisory committee meeting. Critical indicator: graduate outcomes reflect the philosophy, principles, and values of the program; indicator: 5-year rates of graduates employed in underserved settings and/or community-based practice.
Row 3: Philosophy, principles, and values of the program are consistent with the needs and values of the local professional community. Data source and methods: student outcomes; advisory committee. Critical indicator: number of faculty active in local professional organizations, community boards, and DC Board of Health committees.
Recommend having an assessment timeline document and a "problem" document… PAEA Copyright 2016
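
As an illustration only, one row of such an assessment map could be captured in a simple data structure; the Python sketch below uses field names that mirror the table's columns and is an assumption, not a PAEA-specified schema (requires Python 3.10+).

```python
# A minimal sketch of one assessment-map row; field names mirror the example
# table's columns and are assumptions, not a prescribed schema.
from dataclasses import dataclass

@dataclass
class AssessmentMapRow:
    program_area: str
    person_responsible: str
    data_sources: list[str]
    timeline: str
    critical_indicator: str
    date_assessed: str | None = None   # filled in when the row is reviewed
    met: bool | None = None            # Met? / Not met?

row = AssessmentMapRow(
    program_area="Philosophy, principles, and values of the program",
    person_responsible="Program Director",
    data_sources=["Written curriculum plan"],
    timeline="Spring and summer faculty retreats",
    critical_indicator="Philosophy and values are congruent with the "
                       "University's mission and philosophy",
)
print(row)
```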

Example of what a timeline for goals assessment might look like (colors indicate collection, analysis, and action).
Example of what a "problem" document might look like.

Developing an Assessment Plan (10 steps)
Step 1: Determine WHAT is to be measured, based on mission and goals (think broad). What are some examples?
Step 2: Determine HOW to measure the items selected.
Step 3: Determine whether measures will be direct or indirect. What is the difference? Examples?

Step 4: Determine the data to be collected for each instrument. This is specific; think objectives/outcomes. Examples?
Step 5: Determine benchmarks. What is the reason for each threshold?

Step 6: Determine/describe how the data will be analyzed. Examples? (See the sketch after Step 10.)
Step 7: Determine where data will be kept and how results will be disseminated.
Step 8: Describe/determine how changes (action) will occur: who is responsible, how changes are recorded, and how they are assessed.

Step 9: Check the plan. Does the plan assist us in achieving our mission and goals?
Step 10: What are the barriers to implementing the action plan? Have stakeholders been considered? Culture, politics, buy-in, impact.
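
To make Steps 5 and 6 concrete, here is a minimal Python sketch that checks collected measures against benchmarks and flags what needs an action plan; the measures and thresholds are made-up examples, not ARC-PA or PAEA standards.

```python
# Hypothetical benchmarks and collected values; all numbers are illustrative.
# "(lower is better)" marks measures where staying at or below the threshold is the goal.
benchmarks = {
    "first-time certification exam pass rate": 0.90,
    "graduate survey response rate": 0.60,
    "attrition rate (lower is better)": 0.10,
}
collected = {
    "first-time certification exam pass rate": 0.93,
    "graduate survey response rate": 0.48,
    "attrition rate (lower is better)": 0.07,
}

action_items = []
for measure, threshold in benchmarks.items():
    value = collected[measure]
    lower_is_better = "lower is better" in measure
    met = value <= threshold if lower_is_better else value >= threshold
    print(f"{measure}: {value:.2f} vs. benchmark {threshold:.2f} ({'met' if met else 'NOT met'})")
    if not met:
        action_items.append(measure)

print("Needs an action plan:", action_items or "none")
```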

Common Mistakes
Reporting numbers = analysis
Meeting goals = good outcomes (goals may be flawed)
Outcomes assessment = program/process evaluation
Side effects (impacts) = negative outcome
Failure to triangulate (use more than one data source)
Failure to evaluate the evaluation measures

Failure to evaluate the process
Failure to do formative & summative evaluation
Statistical significance ≠ practical significance
Summary without recommendations
Bias towards favorable (or unfavorable) findings

Not defining benchmarks (leaving them arbitrary)
Lack of transparency
Not including stakeholders
Analysis without a follow-up action plan

Helpful Hints
Start with what you have.
Set realistic expectations.
Sampling is OK for many assessments.
Stagger assessments for a realistic workload.
Set up tracking mechanisms for longitudinal analysis.
Write definitions (e.g., remediation vs. retesting).
Use results to make evaluation relevant.

Helpful Hints
Make decisions that are data driven and mission driven.
Consider external issues.
Triangulate; use 360-degree evaluation.
Look at trends: the overall pattern of change in an indicator over time… this is the unit of measure!
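
As a small illustration of looking at trends, the Python sketch below fits a line to a few years of a made-up indicator; the values are invented, and linear_regression requires Python 3.10+.

```python
# Trend in a hypothetical indicator over five years; values are made up.
from statistics import linear_regression  # Python 3.10+

years = [2018, 2019, 2020, 2021, 2022]
pass_rates = [0.88, 0.90, 0.86, 0.92, 0.94]

slope, intercept = linear_regression(years, pass_rates)
direction = "improving" if slope > 0 else "flat or declining"
print(f"Average change per year: {slope:+.4f} ({direction})")
```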

Helpful Hints
Assessment does not have to be complex.
Make assessment "organic"; it is part of the learning process, not just summative.
Find your help: your institution, PAEA, the community.

Acknowledgements
Portions of this lecture were adapted from information by Robert Philpot and Ruth Ballweg.