Taking Student Success to Scale: High Impact Practices in the States

Presentation transcript:

Taking Student Success to Scale: High Impact Practices in the States

Assessment as Culture Management & Professional Development
Jerry Daday, Executive Director, Center for Innovative Teaching & Learning, Western Kentucky University

Criteria for High-Quality HIPs
Expectations set at appropriately high levels
Intentional: clear Essential Learning Outcomes (ELOs) and a structured experience
Significant investment of time and effort
Preparation, orientation, and training
Interaction with faculty and peers
Experience with diversity
Frequent and constructive feedback
Periodic and structured opportunities for reflection
Relevance through real-world applications (i.e., hands-on experience)
Public demonstrations of competence
(Kuh & O'Donnell, 2013)

Outcomes
Transactional outcomes: retention, persistence, GPA
Essential Learning Outcomes: AAC&U LEAP Essential Learning Outcomes (VALUE rubrics used for assessment)
Deep learning
Applied learning
Community/relationship building

HIPs Inventory
Last spring, we conducted an inventory of HIPs used in 53 of our majors at WKU:
88% of these majors offer a writing-intensive course
77% offer a capstone experience
66% offer collaborative/project-based assignments
64% offer a common intellectual experience
54% offer global learning
43% offer undergraduate research & creative activities

HIPs Inventory
On average, departments reported that students receive 5 HIPs within each major:
1 major reported 10 HIPs
2 majors reported 9 HIPs
3 majors reported 8 HIPs
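A quick way to sanity-check an average like this is to recompute it from the raw inventory. A minimal sketch in Python, assuming a hypothetical mapping from each major to the number of HIPs it reported (the major names and counts are illustrative, not WKU's actual data):

```python
# Recompute the average number of HIPs per major from an inventory.
# The majors and counts below are illustrative placeholders.
hips_per_major = {
    "History": 10,       # 1 major reported 10 HIPs
    "Biology": 9,        # 2 majors reported 9 HIPs
    "Chemistry": 9,
    "Sociology": 8,      # 3 majors reported 8 HIPs
    "Psychology": 8,
    "English": 8,
    # ... the remaining majors from the 53-major inventory ...
}

average = sum(hips_per_major.values()) / len(hips_per_major)
print(f"Average HIPs per major: {average:.1f}")
```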

Validity
Are we measuring what we think we are measuring? Before we can accurately and validly measure a concept (like a HIP), we need to engage in:
Conceptualization: the process of defining a concept
Operationalization: the process of specifying measures/indicators of a concept
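One way to make the conceptualization/operationalization distinction concrete is to write it down as a data structure: the concept gets a definition, and each measure becomes an observable indicator. A minimal sketch, with an illustrative (not official) definition and indicators:

```python
from dataclasses import dataclass, field

@dataclass
class Concept:
    """A concept paired with the indicators used to measure it."""
    name: str
    definition: str                                        # conceptualization
    indicators: list[str] = field(default_factory=list)    # operationalization

# Hypothetical example; the definition and indicators are illustrative.
undergrad_research = Concept(
    name="Undergraduate Research",
    definition=("Students work with faculty on original inquiry, from "
                "question formulation through public presentation."),
    indicators=[
        "Student presents or co-authors a research product",
        "Faculty mentor meets with the student at least weekly",
        "Project spans a full semester or more",
    ],
)
```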

Value of Taxonomies
Explicitly define HIPs
Ensure fidelity of the HIP by expressing its purpose/intent, its Essential Learning Outcomes (ELOs), and the specific attributes that define the HIP
Guide professional development of faculty & staff: a PD tool for improvement, and validation of an activity's impact for tenure & promotion and annual review
Precursor to assessment efforts: parameters must be defined before meaningful assessment of student learning can take place

Value of Taxonomies
Taxonomies may be used to guide the development of:
Individual courses
Programs within a department or university (First-Year Experience, Living-Learning Communities, Undergraduate Research, Service Learning, Internships & Study Abroad)
University-wide initiatives

Example: CSU Service Learning (Slide 1)

Example: CSU Service Learning (Slide 2)

Intermission

HIP: What Works and How Do We Know?
Jennifer Merriman, PhD, Executive Director, Research
August 25, 2017

Why should we care about program evaluation?
If it is not working, why keep doing it? $$$$

Overview
HIP Theory of Action
Heterogeneity of Programs
Revised Theory of Action
Claims
Formative and Summative Evaluation
Fidelity of Implementation
Developing an evaluation to help determine if your program is HIGH IMPACT

Theory of Action
HIP → Improved Student Outcomes
The causal logic of a program's impact on outcomes

Heterogeneity of Practices
What do we mean by HIP? How is it operationalized? Documentation of the specific practices yields more accurate evaluation of impact.

Variation in Which Practices Are Offered
[Table: checkmarks showing which of the ten HIPs each of Institutions A, B, and C offers: First-Year Seminars and Experiences, Common Intellectual Experiences, Learning Communities, Writing-Intensive Courses, Collaborative Assignments and Projects, Undergraduate Research, Diversity/Global Learning, Service Learning, Internships, and Capstone Courses and Projects]

Variation in Practice Operationalization
[Table: how Institutions A, B, and C each operationalize Collaborative Assignments and Projects: as study groups, team-based assignments, or cooperative projects; meeting twice a year, once a month, or weekly; professor-led or student-led]

Theory of Action
HIP → Improved Student Outcomes
Which practices? Implemented in what ways? For which students? In what context? Which outcomes?
Why and how will these practices influence student outcomes? How does implementation affect outcomes? Are outcomes seen for all students? In all contexts?

Revised Theory of Action
Specificity and context matter.
Context: an urban institution serving many first-generation students
Collaborative Assignments and Projects (all students): team-based assignments, meeting weekly throughout the semester, student-led
Internships (policy students only)
Undergraduate Research (STEM students only)
Student outcomes: engagement, GPA, retention, other?
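A revised theory of action like this can be written down as a structured spec that the eventual evaluation is checked against. A hypothetical sketch mirroring the slide's example (the field names are illustrative, not a standard schema):

```python
# A revised theory of action as a structured spec. Values mirror the
# slide's example; the field names are illustrative assumptions.
theory_of_action = {
    "context": "urban institution serving many first-generation students",
    "practices": {
        "Collaborative Assignments and Projects": {
            "implementation": ["team-based assignments",
                               "meet weekly throughout semester",
                               "student-led"],
            "target_students": "all students",
        },
        "Internships": {"target_students": "policy students only"},
        "Undergraduate Research": {"target_students": "STEM students only"},
    },
    "expected_outcomes": ["engagement", "GPA", "retention"],
}
```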

Black Box vs. Specific Practices
Which practices actually drive outcomes? A black box approach might mask effectiveness, OR heterogeneous effects might yield null effects on average.
[Diagram: Collaborative Assignments and Projects (team-based assignments, meeting weekly throughout the semester, student-led) for all students improve GPA; Internships (policy students only) and Undergraduate Research (STEM students only) show no outcome effects]
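The masking problem is easy to see with a toy calculation: if a practice helps one subgroup and does nothing for (or harms) another, the pooled average can land near zero. A hypothetical sketch with invented effect sizes:

```python
# Toy illustration: subgroup effects can cancel in a pooled average.
# Group sizes and effect sizes are invented for illustration.
subgroups = {
    "policy students": {"n": 100, "effect": +0.40},  # e.g., GPA gain
    "STEM students":   {"n": 100, "effect": -0.40},  # e.g., GPA loss
}

pooled = (sum(g["n"] * g["effect"] for g in subgroups.values())
          / sum(g["n"] for g in subgroups.values()))
# Prints +0.00: the null average masks two real, opposite effects.
print(f"Pooled effect: {pooled:+.2f}")
```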

Example Claims
Too vague: Students who participate in HIPs are more successful.
Specific: Traditionally underserved minority students who successfully complete an undergraduate research project in a STEM field at 4-year public IHEs are 8 times more likely to go on to graduate school.
Figure out the kinds of claims you want to make BEFORE you begin developing your evaluation. The desired claims will drive evaluation design and data requirements.
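A claim this specific also pins down exactly what the evaluation must measure: the graduate-school rate for completers versus non-completers within the named population. A hypothetical sketch of how such a "times more likely" ratio is computed (the counts are invented):

```python
# "X times more likely" as a ratio of rates. Counts are invented.
completers     = {"grad_school": 40, "total": 100}
non_completers = {"grad_school": 5,  "total": 100}

ratio = ((completers["grad_school"] / completers["total"])
         / (non_completers["grad_school"] / non_completers["total"]))
print(f"Completers are {ratio:.0f}x more likely to attend graduate school.")
```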

Formative vs. Summative Evaluation

Formative
Definition: evaluates a program during development in order to make early improvements; helps to refine or improve the program.
Uses: when starting a new program; to assist in the early phases of program development.
Example questions: How well is the program being delivered? What strategies can we use to improve this program?

Summative
Definition: provides information on program effectiveness; conducted after the completion of the program design.
Uses: to help decide whether to continue or end a program; to help determine whether a program should be expanded to other locations.
Example questions: Should this program continue to be funded? Should we expand these services to all other after-school programs in the community?

Fidelity of Implementation
The extent to which the delivery of an intervention adheres to the program model as intended by the developers of the intervention. If null program effects are found, it may be that the program is not effective, OR that the program was not implemented with fidelity.
Five dimensions (a scoring sketch follows this list):
1) Adherence: the extent to which program components are delivered as prescribed by the model. Adherence indicators can include program content, methods, and activities.
2) Exposure (i.e., dosage): the amount of program delivered in relation to the amount prescribed by the program model. Exposure can include the number of sessions or contacts, attendance, and the frequency and duration of sessions.
3) Quality of delivery: the manner in which a program is delivered. Aspects of delivery quality can include provider preparedness, use of relevant examples, enthusiasm, interaction style, respectfulness, confidence, and ability to respond to questions and communicate clearly.
4) Participant responsiveness: the manner in which participants react to or engage in a program. Aspects of participant responsiveness can include participants' level of interest in the program; perceptions about the relevance and usefulness of a program; and their level of engagement, enthusiasm, and willingness to engage in discussion or activities.
5) Program differentiation: the degree to which the critical components of a program are distinguishable from each other and from other programs.
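These five dimensions lend themselves to a simple checklist-style rubric: rate each dimension, then flag programs whose delivery drifted from the model before interpreting a null result. A minimal sketch; the ratings, equal weighting, and 0.8 threshold are assumptions, not part of any standard instrument:

```python
# Rate implementation fidelity on each dimension (0.0 to 1.0).
# Ratings, equal weights, and the 0.8 threshold are illustrative assumptions.
fidelity_ratings = {
    "adherence": 0.9,
    "exposure": 0.7,              # e.g., 7 of 10 prescribed sessions delivered
    "quality_of_delivery": 0.85,
    "participant_responsiveness": 0.8,
    "program_differentiation": 0.9,
}

overall = sum(fidelity_ratings.values()) / len(fidelity_ratings)
if overall < 0.8:
    print(f"Low fidelity ({overall:.2f}): a null result may reflect "
          f"implementation, not the program model.")
else:
    print(f"Fidelity acceptable ({overall:.2f}).")
```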

Intermission

Action Steps: What can I do next?
1) Describe your HIP programs carefully: Which programs do we offer? Who participates in the programs? How is each program operationalized? What are the expected outcomes for each program, and why do we think we will see that outcome? This sets up the Theory of Action.
2) Decide what claims you want to make from the evaluation. Different stakeholders have different claims they want to make internally and externally; vet the claims with all pertinent groups.
3) Determine whether the program is still in development, continuously changing, or fully developed. This will determine whether to undertake formative or summative evaluation.
4) Assess data availability: How will we standardize our metrics? Are all data available for the evaluation? When will the data be available? When do we need to report findings?
5) Identify the counterfactual: When you talk about program outcomes, you need to think carefully about the comparison group. Improved student success compared to what? (A sketch of this comparison follows below.)
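The counterfactual question comes down to naming a comparison group and computing the same outcome for both groups. A hypothetical sketch comparing retention rates (the counts are invented):

```python
# Compare an outcome for participants against a comparison group.
# Retention counts are invented for illustration.
participants = {"retained": 170, "total": 200}  # HIP participants
comparison   = {"retained": 150, "total": 200}  # similar non-participants

p_rate = participants["retained"] / participants["total"]
c_rate = comparison["retained"] / comparison["total"]
print(f"Retention: {p_rate:.0%} vs {c_rate:.0%} "
      f"(difference: {p_rate - c_rate:+.0%})")
```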

Key Questions
Taxonomies can be used for faculty professional development to ensure HIPs are implemented with fidelity.
Taxonomies can also be used to set up program evaluation. For example: Is there a difference in outcomes based on these degrees of implementation? Does a "low intensity" HIP have the same impact as a "high intensity" HIP?
What should we track for outcomes? What do you wish to measure? (Transactional outcomes? Essential learning? Deep or applied learning?)
What about the 8 criteria of a HIP (Kuh & O'Donnell, 2013)? Should these be what we measure as attributes across all HIPs? Or should attributes vary across HIPs?

Taking Student Success to Scale: High Impact Practices in the States

National Model and Laboratory for Student Success http://ts3.nashonline.org/high-impact-practices/