Making Sense of the “Eval”: RFP language, lingo and assessment strategies
Matt Rearick, Ph.D., Research & Evaluation, The RLI Group; Associate Professor, Human Performance, Roanoke College

Overview
① Cultural Shift in Assessment and Evaluation
② Grant Requirements and Lingo
③ Logic Model to Evaluation Design (and a little more lingo)
   a.) design variations
   b.) tricks of the trade: scalability of program assessment
④ Give it a whirl …

Cultural shift …
① Data-driven, value-added … post-program effects(?)
② Development, scale-up … State & National Models
③ Emphasis on Programs to Emphasis on Assessment of Programs (grant world and beyond)
④ Fewer funds, more requests (typical supply and demand)
⑤ Data is EVERYWHERE now (ugh!)
Can you do what you say you are going to do, can you do it well, and can you “prove” it?

Funder’s Needs and Our Needs

Grant Requirements and Lingo … Step 1: Know your granting agency
1. Does the agency have an outcome that is key to its current mission?
2. What standard of evaluation is expected?
3. Identify key vocabulary: project objectives, evaluation, and outcomes.
4. Plan an integrated, seamless evaluation.
5. If possible, model the proposed program first (pilot?)

Enter the RFP (and more lingo) … Step 2: Dissect the Program Description (NSF, Research Experiences for Undergraduates)
Research experience is one of the most effective avenues for attracting students to and retaining them in science and engineering, and for preparing them for careers in these fields. The REU program, through both Sites and Supplements, aims to provide appropriate and valuable educational experiences for undergraduate students through participation in research. REU projects involve students in meaningful ways in ongoing programs or in research projects specifically designed for the REU program. REU projects feature high-quality interaction of students with faculty and/or other research mentors and access to appropriate facilities and professional development opportunities.

More RFP lingo … Step 3: Dissect the Program Evaluation (NSF, Research Experiences for Undergraduates)
Describe the plan to measure qualitatively and quantitatively the success of the project in achieving its goals, particularly the degree to which students have learned and their perspectives on science, engineering, or education research related to these disciplines have been expanded. Evaluation may involve periodic measures throughout the project to ensure that it is progressing satisfactorily according to the project plan, and may involve pre-project and post-project measures aimed at determining the degree of student learning that has been achieved. In addition, it is highly desirable to have a structured means of tracking participating students beyond graduation, with the aim of gauging the degree to which the REU Site experience has been a lasting influence in the students’ career path. Proposers may wish to consult The 2010 User-Friendly Handbook for Project Evaluation for guidance on the elements in a good evaluation plan. Although not required, REU Site PIs may wish to engage specialists in education research in planning and implementing the project evaluation.

Funder’s Goals and Our Goals

Logic Model to Evaluation Design

Deep Breath … Logic Model to Evaluation Design

Design & Assessment Strategies
A. Random Assignment, Field-“Experiment” Design
   - Intervention and control groups (ethical and practical considerations/limitations)
   - Willingness to take the long view
B. Quasi-Experimental Design
   - Intervention and control groups: self-selected or matched (composition, predispositions, and experiences are relatively close)
   - More immediate view and easier to sell, but less confidence in program effects
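For the random-assignment option above, the mechanics can be as simple as shuffling a participant roster and splitting it in two. A minimal Python sketch, assuming a hypothetical roster of 20 students (nothing here comes from the slides themselves):

```python
# Minimal sketch: random assignment to intervention and control groups.
import random

participants = [f"student_{i:02d}" for i in range(1, 21)]  # invented roster

random.seed(42)               # fixed seed so the assignment can be documented/reproduced
random.shuffle(participants)  # put the roster in random order

midpoint = len(participants) // 2
intervention = participants[:midpoint]   # program group
control = participants[midpoint:]        # no-program comparison

print("Intervention:", intervention)
print("Control:     ", control)
```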

Design & Assessment Strategies
C. Units of Analysis
   - Individuals vs. groups vs. communities/schools/centers, etc. (Sample size? Effect sizes?)
D. Assessment Approach (Think MIXED METHODS!)
   Qualitative
   - Survey (short answer)
   - Interviews (individual), structured vs. semi-structured
   - Focus groups
   - Observation (rubric)
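Because item C raises sample and effect sizes, a small worked example may help: with the modest group sizes typical of program evaluation, reporting a standardized effect size is a common complement to significance tests. A sketch using invented score lists, not data from any real program:

```python
# Hypothetical post-program scores for two groups; Cohen's d uses a pooled
# standard deviation, so both sample sizes enter the calculation.
from statistics import mean, stdev

intervention_scores = [78, 85, 82, 90, 88, 76, 84]
control_scores = [72, 75, 80, 70, 78, 74, 73]

def cohens_d(a, b):
    """Standardized mean difference (Cohen's d) with a pooled SD."""
    n1, n2 = len(a), len(b)
    pooled_var = ((n1 - 1) * stdev(a) ** 2 + (n2 - 1) * stdev(b) ** 2) / (n1 + n2 - 2)
    return (mean(a) - mean(b)) / pooled_var ** 0.5

print(f"Cohen's d = {cohens_d(intervention_scores, control_scores):.2f}")
```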

Design & Assessment Strategies
D. Assessment Approach, cont. (Think MIXED METHODS!)
   Quantitative
   - Measurements, instruments …
   - Valid and reliable
E. Data Collection Strategies
   - Single measurement, often post-intervention
   - Pre-post measurement
   - Periodic measurement (which could include pre-post too)
   - Time series, better estimates of program effects
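To make the pre-post collection strategy concrete, here is a small gain-score summary. The participant IDs and scores are invented; a real evaluation would pull them from the program's validated instrument.

```python
# Invented pre/post scores keyed by participant ID.
pre = {"A": 61, "B": 55, "C": 70, "D": 48, "E": 66}
post = {"A": 74, "B": 63, "C": 75, "D": 60, "E": 71}

gains = {pid: post[pid] - pre[pid] for pid in pre}   # per-participant change
mean_gain = sum(gains.values()) / len(gains)

print("Individual gains:", gains)
print(f"Mean pre-to-post gain: {mean_gain:.1f} points")
```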

Design & Assessment Strategies
F. Data Analysis Strategies
   - Difference on the original measurement scale
     a.) intervention vs. control
   - Comparison with test norms, normative population
     a.) absolute vs. relative change
   - Proportion over a success threshold
     a.) diagnostic
     b.) arbitrary
   - Comparison of effects with similar programs
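Two of the strategies above, comparison with a normative value and the proportion of participants over a success threshold, take only a few lines. The scores, the norm, and the threshold below are all hypothetical placeholders, not values from any instrument named in the slides.

```python
# Hypothetical post-program scores; NORM and THRESHOLD stand in for a
# published instrument norm and a locally chosen success criterion.
post_scores = [74, 63, 75, 60, 71, 82, 79, 68]

NORM = 75        # e.g., a normative mean reported for the instrument
THRESHOLD = 70   # success cut-off (diagnostic or, admittedly, arbitrary)

mean_score = sum(post_scores) / len(post_scores)
pct_over = sum(score >= THRESHOLD for score in post_scores) / len(post_scores)

print(f"Mean vs. norm: {mean_score:.1f} vs. {NORM} (difference {mean_score - NORM:+.1f})")
print(f"Proportion at or above threshold: {pct_over:.0%}")
```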

Assessment Strategies: “Tricks of the trade” (before you call in the cavalry …)
1. What is the (your) organization already doing in assessment?
   a.) Think past, present, and future
2. Process vs. product assessment
   a.) Continuous improvement model (formative/summative)
3. What instruments and tools can you find on your own?
   a.) Expertise issues
4. NO KITCHEN SINKS ALLOWED! Be judicious and realistic.
   a.) Capacity issues (you can’t do it all, and won’t be able to)

Assessment Strategies (call in the cavalry)
1. Evaluation Design
   a.) Program and evaluation intimately linked
   b.) Put a team together early! Provides confidence and avoids pitfalls
2. Statistics
   a.) Confidence in your descriptive findings
   b.) NVivo, SPSS, etc. expertise
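As an illustration of “confidence in your descriptive findings,” an item-level summary (n, mean, SD) is often the first deliverable before any NVivo or SPSS work. The survey items and Likert ratings below are invented for the sketch.

```python
# Invented 1-5 ratings for three survey items; prints a simple descriptive table.
from statistics import mean, stdev

survey = {
    "I can design a research question": [4, 5, 3, 4, 5, 4],
    "I can interpret my results": [3, 4, 3, 3, 4, 4],
    "I intend to pursue a STEM career": [5, 5, 4, 5, 3, 5],
}

for item, ratings in survey.items():
    print(f"{item:35s} n={len(ratings)}  M={mean(ratings):.2f}  SD={stdev(ratings):.2f}")
```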

Always keep in mind… A solid program with continuous improvement and measurable findings can (and often does) lead to the NEXT GRANT

Give it a whirl …
- RFP lingo … language you can use to develop the proposal and the evaluation
- Mini logic model: Activity (Strategy), Output(s), Outcome(s)
- Design? (Random assignment? Quasi-experimental?)
- Think mixed methods
- Data collection …
- Data analysis …
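One low-tech way to work the exercise is to jot the mini logic model down as plain data so the activity, output, and outcome chain stays explicit. The entries below are placeholders for illustration, not a recommended model for any particular program.

```python
# Placeholder logic-model entries; swap in your own activity, outputs, and outcomes.
mini_logic_model = {
    "activity": ["10-week summer research experience with faculty mentors"],
    "outputs": ["12 undergraduates complete projects", "12 poster presentations"],
    "outcomes": [
        "increased research self-efficacy (pre-post survey)",
        "more participants applying to STEM graduate programs (post-program tracking)",
    ],
}

for component, entries in mini_logic_model.items():
    print(component.upper())
    for entry in entries:
        print("  -", entry)
```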