Margaret A. Johnson, PhD Candidate; Wanda Casillas, PhD Candidate; Jen Brown Urban, PhD; William Trochim, PhD. Using Systems Thinking in Evaluation Capacity Building: The Systems Evaluation Protocol (SEP).

Presentation transcript:

Using Systems Thinking in Evaluation Capacity Building: The Systems Evaluation Protocol (SEP). Margaret A. Johnson, PhD Candidate; Wanda Casillas, PhD Candidate; Jen Brown Urban, PhD; William Trochim, PhD.

What do you think? You can't do evaluation capacity building without doing systems thinking.

An even bigger claim… All thinking is systems thinking.

This might be easier to see… Systems thinking is one element of evaluative thinking.

So what is systems thinking? It's causal loop analysis! No, it's social network analysis! No, it's system dynamics…

Central ST themes in evaluation
Williams on common patterns in ST:
- perspectives
- boundaries
- entangled systems
Patton on systems framework premises in systems dynamics modeling:
- the whole is greater than the sum of its parts; parts are interdependent
- focus is on interconnected relationships
- systems are composed of subsystems; context matters
- system boundaries are necessary but arbitrary
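The premises above can feel abstract, so here is a toy illustration of the system-dynamics strand of systems thinking: a single stock driven by a reinforcing feedback loop and checked by a balancing loop. This is a minimal sketch with made-up names and parameter values, not a model from the SEP materials.

```python
# Hypothetical system-dynamics sketch: evaluation "capacity" as a stock.
# Reinforcing loop: more capacity -> more successful evaluations -> more capacity.
# Balancing loop: growth slows as capacity approaches a finite ceiling
# (staff time), so the interconnected loops, not any single part,
# determine the behavior of the whole.

def simulate(steps=20, capacity=1.0, ceiling=10.0, rate=0.4):
    """Euler-step the stock; growth shrinks as capacity nears the ceiling."""
    history = [capacity]
    for _ in range(steps):
        inflow = rate * capacity * (1 - capacity / ceiling)  # both loops at once
        capacity += inflow
        history.append(capacity)
    return history

history = simulate()
print(round(history[-1], 2))  # the stock levels off near the ceiling
```

The point of the sketch is the Patton premise that parts are interdependent: neither loop alone predicts the S-shaped trajectory that the two produce together.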

The Big Picture
"Systems evaluation considers the complex factors inherent within the larger structure or system within which the program is embedded. The goal is to accomplish high-quality evaluation with integration across organizational levels and structures." ~William Trochim, Principal Investigator

A conceptual framework
"We mine the systems literature for its heuristic value, as opposed to using systems methodologies (e.g., system dynamics, network analysis) in evaluation work." ~Dr. Jen Brown Urban, Co-Principal Investigator

The Facilitated Systems Evaluation Protocol

SEP as a multi-level intervention
Program level: programs may be simple and linear, though not always
Cohort level: facilitation of the SEP (group learning) by the Cornell team is complicated, with many moving parts requiring specialization and coordination
System level: the emerging network of participants (past, present, and incoming cohorts) within their larger program systems is complex and unpredictable, and not centrally controlled

Streams of ST meeting in the SEP
General Systems Theory: part-whole relationships, local and global scale
Ecological theory: static and dynamic processes, boundaries
Evolutionary theory: ontogeny and phylogeny
System dynamics: causal pathways, feedback
Network theory: multiple perspectives
Complexity theory: simple rules and emergence

Zeroing in on the program level
The protocol process, generally
Specific steps in the planning protocol:
- Lifecycle analysis and alignment
- Stakeholder analysis
- Boundary analysis
- Causal pathway modeling

Walking the steps of the Protocol
Simple rules lead to complex results
Feedback, iteration, and learning

Simple rules, feedback, iteration
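The "simple rules lead to complex results" claim has a classic one-line demonstration: the logistic map, where each output is fed back in as the next input. This is a generic illustration from complexity theory, not an example from the SEP itself.

```python
# Illustration of "simple rules, feedback, iteration": the logistic map.
# The entire rule is one line, yet at r = 4 its iterates are chaotic,
# and two nearly identical starting points soon diverge completely.

def logistic_map(x, r=4.0, steps=50):
    trajectory = [x]
    for _ in range(steps):
        x = r * x * (1 - x)   # the whole "simple rule"; output feeds back in
        trajectory.append(x)
    return trajectory

a = logistic_map(0.2)        # one starting point...
b = logistic_map(0.2000001)  # ...and a near-twin, differing by 1e-7
print(abs(a[-1] - b[-1]))    # the tiny difference has been amplified
```

The analogy to the Protocol: a short list of repeated, simple steps, applied with feedback at each iteration, can produce rich and program-specific results.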

Lifecycle analysis and alignment
Static and dynamic processes
Ontogeny
Phylogeny
Co-evolution and symbiosis

Co-evolution and symbiosis

Program lifecycle stages: Initiation, Development, Maturity, Dissemination.
Evaluation lifecycle phases: Phase I (Process & Response), Phase II (Change), Phase III (Comparison & Control), Phase IV.

Each evaluation sub-phase pairs a guiding question about the program with an appropriate class of evaluation work:

Phase IA (Is the program in its initial implementation(s)?): process assessment and post-only evaluation of participant reactions and satisfaction.
Phase IB (Is the program in revision or reimplementation?): post-only assessment of outcomes, implementation assessment, outcome measurement development, and assessment of internal consistency (reliability).
Phase IIA (Is the program being implemented consistently?): unmatched pretest and posttest of outcomes, qualitative assessment of change, and assessment of reliability and validity of measurement.
Phase IIB (Does the program have formal written procedures/protocols?): matched pretest and posttest of outcomes; verification of the reliability and validity of change; human subjects review.
Phase IIIA (Is the program associated with change in outcomes?): controls and comparisons (control groups, control variables, or statistical controls).
Phase IIIB (Does the program have evidence of effectiveness?): controlled experiments or quasi-experiments (randomized experiments; regression-discontinuity) for assessing program effectiveness.
Phase IVA (Is the effective program being implemented in multiple sites?): multi-site analysis of integrated large data sets over multiple waves of program implementation.
Phase IVB (Is the evidence-based program being widely distributed?): formal assessment across multiple program implementations that enables general assertions about the program in a wide variety of contexts (e.g., meta-analysis).
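The lifecycle pairings above are essentially a lookup from sub-phase to guiding question and evaluation design. A planning tool could encode them as a small table; the sketch below does so in Python. The pairings are my reading of the slide's table (not an official CORE artifact), and the wording is abbreviated.

```python
# Hypothetical lookup table for the evaluation lifecycle: each sub-phase
# maps to (guiding question, abbreviated class of evaluation work).

LIFECYCLE = {
    "IA":   ("Is the program in its initial implementation(s)?",
             "process assessment; post-only reactions and satisfaction"),
    "IB":   ("Is the program in revision or reimplementation?",
             "post-only outcomes; measurement development; reliability"),
    "IIA":  ("Is the program being implemented consistently?",
             "unmatched pre/post of outcomes; qualitative assessment of change"),
    "IIB":  ("Does the program have formal written procedures/protocols?",
             "matched pre/post; validity of change; human subjects review"),
    "IIIA": ("Is the program associated with change in outcomes?",
             "controls and comparisons (groups, variables, statistical)"),
    "IIIB": ("Does the program have evidence of effectiveness?",
             "controlled or quasi-experiments (e.g., regression-discontinuity)"),
    "IVA":  ("Is the effective program implemented in multiple sites?",
             "multi-site analysis over multiple waves of implementation"),
    "IVB":  ("Is the evidence-based program being widely distributed?",
             "formal assessment across implementations (e.g., meta-analysis)"),
}

def suggest_design(subphase):
    """Return the guiding question and suggested design for a sub-phase."""
    question, design = LIFECYCLE[subphase]
    return f"{subphase}: {question} -> {design}"

print(suggest_design("IIB"))
```

The value of encoding the table this way is the alignment idea itself: the design an evaluation should use is a function of where the program sits in its lifecycle, not a one-size-fits-all choice.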

Stakeholder analysis
Part-whole relationships
Local and global scale
Multiple perspectives

Stakeholder map template: [Program name here] at the center, surrounded by sample stakeholders: Participants, Families, Program staff, Other program colleagues, Organizational Leader, Board of Directors/Advisors, CCE Administration, Curriculum developer, Research scientists, Cornell, NSF, Statewide system(s), National system(s), Community Organizations, Local Agency, Local Government, Collaborators, Local funder, Non-Local funder(s), Local employers, Local suppliers, Insurance supplier, Regulatory group, Industry Groups, Competitor Programs, Volunteers, Future participants, Future business leaders. (Note on slide: the above items are samples; replace them with ones that fit your program. If desired, highlight "key" stakeholders with a darker color font.)
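A stakeholder map like the template above can also be read as a network, which connects this step to the network-theory strand mentioned earlier. Below is a minimal sketch: the edges (who routinely exchanges information with whom) are invented for illustration, using names from the slide's sample list.

```python
# Hypothetical sketch: treat the stakeholder map as a network and count
# ties per stakeholder. High-degree stakeholders are candidates for
# "key" status, i.e., perspectives the evaluation should be sure to include.

from collections import Counter

# Invented edges for illustration only.
edges = [
    ("Program staff", "Participants"),
    ("Program staff", "Families"),
    ("Program staff", "Organizational Leader"),
    ("Organizational Leader", "Board of Directors/Advisors"),
    ("Organizational Leader", "Local funder"),
    ("Curriculum developer", "Program staff"),
    ("Participants", "Families"),
]

degree = Counter()
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

print(degree.most_common(2))  # the most-connected stakeholders
```

Note the SEP itself mines network theory for heuristics (multiple perspectives) rather than requiring formal network analysis; this sketch only shows how the heuristic could be made concrete.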

Boundary analysis
Boundaries

DEFINING THE BOUNDARY
IN: increasing kids' science awareness
OUT: world peace
?: increasing the number of young people in science careers

Causal pathway modeling
Causal pathways

From columnar logic model…

…to pathway model
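The shift from a columnar logic model to a pathway model is a shift from lists of activities and outcomes to explicit causal links between them. A pathway model is then just a directed graph, and "which pathways lead from this activity to that outcome?" becomes a graph traversal. The sketch below uses invented node names echoing the boundary example (science awareness, science careers), not nodes from an actual SEP model.

```python
# Hypothetical pathway model as a directed graph: each key is a node,
# each value lists the nodes it is claimed to cause.

model = {
    "hands-on workshops": ["science awareness"],
    "mentor matching": ["science awareness", "sense of belonging"],
    "science awareness": ["interest in science careers"],
    "sense of belonging": ["interest in science careers"],
    "interest in science careers": ["pursuit of science degrees"],
}

def pathways(graph, start, goal, path=None):
    """Enumerate every causal pathway from start to goal (graph is acyclic)."""
    path = (path or []) + [start]
    if start == goal:
        return [path]
    found = []
    for nxt in graph.get(start, []):
        found.extend(pathways(graph, nxt, goal, path))
    return found

for p in pathways(model, "mentor matching", "pursuit of science degrees"):
    print(" -> ".join(p))
```

Enumerating pathways this way makes the evaluation-planning payoff visible: each distinct pathway is a distinct causal claim that measurement can target.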

Systems thinking and quality
Criteria for plan quality and their systems thinking roots:
1) Consistency with a high-quality program model: part-whole relationships; local-global scale; causal pathways
2) Fitness of evaluation plan elements to the program and program context: static and dynamic processes; ontogeny and phylogeny; symbiosis and co-evolution; feedback
3) Alignment of evidence framework: static and dynamic processes; symbiosis and co-evolution
4) Reflecting judgments well-grounded in awareness of tradeoffs: part-whole relationships; local and global scale; boundaries; ontogeny and phylogeny; multiple perspectives

Thank you! This presentation is based on work by the research and facilitation team at the Cornell Office for Research on Evaluation, led by Dr. William Trochim. For more information on the Systems Evaluation Protocol, see