
1 Using Systems Thinking in Evaluation Capacity Building: the Systems Evaluation Protocol (SEP). Margaret A. Johnson, PhD Candidate; Wanda Casillas, PhD Candidate; Jen Brown Urban, PhD; William Trochim, PhD

2 What do you think? You can’t do evaluation capacity building without doing systems thinking.

3 An even bigger claim… All thinking is systems thinking.

4 This might be easier to see… Systems thinking is one element of evaluative thinking.

5 So what is systems thinking? It’s causal loop analysis! No, it’s social network analysis! No, it’s system dynamics…

6 Central ST themes in evaluation
Williams on common patterns in ST:
- perspectives
- boundaries
- entangled systems
Patton on systems framework premises in system dynamics modeling:
- the whole is greater than the sum of its parts; parts are interdependent
- focus is on interconnected relationships
- systems are composed of subsystems; context matters
- system boundaries are necessary but arbitrary

7 The Big Picture “Systems evaluation considers the complex factors inherent within the larger structure or system within which the program is embedded. The goal is to accomplish high-quality evaluation with integration across organizational levels and structures.” ~William Trochim, Principal Investigator

8 A conceptual framework “We mine the systems literature for its heuristic value, as opposed to using systems methodologies (i.e., system dynamics, network analysis) in evaluation work.” ~Dr. Jen Brown Urban, Co-Principal Investigator

9 The Facilitated Systems Evaluation Protocol

10 SEP as a multi-level intervention
Program level: programs may be simple and linear, though not always.
Cohort level: facilitation of the SEP (group learning) by the Cornell team is complicated, with many moving parts requiring specialization and coordination.
System level: the emerging network of participants (past, present, and incoming cohorts) within their larger program systems is complex and unpredictable, not centrally controlled.

11 Streams of ST meeting in the SEP
General Systems Theory: part-whole relationships, local and global scale
Ecological theory: static and dynamic processes, boundaries
Evolutionary theory: ontogeny and phylogeny
System dynamics: causal pathways, feedback
Network theory: multiple perspectives
Complexity theory: simple rules and emergence

12 Zeroing in on the program level
Protocol process, generally
Specific steps in the planning protocol:
- Lifecycle analysis and alignment
- Stakeholder analysis
- Boundary analysis
- Causal pathway modeling

13 Walking the steps of the Protocol: simple rules lead to complex results; feedback, iteration and learning

14 Simple rules, feedback, iteration

15 Lifecycle analysis and alignment: static and dynamic processes; ontogeny; phylogeny; co-evolution and symbiosis

16 Co-evolution and symbiosis: the program lifecycle (Initiation, Development, Maturity, Dissemination) aligns with the evaluation lifecycle (Phase I: Process & Response; Phase II: Change; Phase III: Comparison & Control; Phase IV). Evaluation activities by phase:
Phase IA: Process assessment and post-only evaluation of participant reactions and satisfaction.
Phase IB: Post-only assessment of outcomes, implementation assessment, outcome measurement development, and assessment of internal consistency (reliability).
Phase IIA: Unmatched pretest and posttest of outcomes, qualitative assessment of change, and assessment of reliability and validity of measurement.
Phase IIB: Matched pretest and posttest of outcomes; verification of reliability and validity of change; human subjects review.
Phase IIIA: Controls and comparisons (control groups, control variables, or statistical controls).
Phase IIIB: Controlled experiments or quasi-experiments (randomized experiment; regression-discontinuity) for assessing program effectiveness.
Phase IVA: Multi-site analysis of integrated large data sets over multiple waves of program implementation.
Phase IVB: Formal assessment across multiple program implementations that enables general assertions about the program in a wide variety of contexts (e.g., meta-analysis).
Questions used to locate the program in its lifecycle: Is the program in initial implementation(s)? Is the program in revision or reimplementation? Is the program being implemented consistently? Does the program have formal written procedures/protocol? Is the program associated with change in outcomes? Is the effective program being implemented in multiple sites? Does the program have evidence of effectiveness? Is the evidence-based program being widely distributed?

17 Stakeholder analysis: part-whole relationships; local and global scale; multiple perspectives

18 Sample stakeholder map: the program (“Program name here”) sits at the center, surrounded by sample stakeholders to be replaced with ones that fit your program; key stakeholders can be highlighted. Sample stakeholders: participants, future participants, families, program staff, other program colleagues, organizational leader, Board of Directors/Advisors, CCE administration, Cornell, curriculum developer, research scientists, volunteers, competitor programs, collaborators, community organizations, local agency, local government, local employers, local suppliers, insurance supplier, regulatory group, industry groups, local funder, non-local funder(s), future business leaders, statewide system(s), national system(s), NSF.

19 Boundary analysis: boundaries

20 Defining the boundary
IN: increasing kids’ science awareness
OUT: world peace
?: increasing the number of young people in science careers

21 Causal pathway modeling: causal pathways

22 From columnar logic model…

23 …to pathway model
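
To make the shift from a columnar logic model to a pathway model concrete, here is a minimal illustrative sketch in Python. It assumes a hypothetical program with invented node names: the pathway model is treated simply as a directed graph in which nodes are activities, outputs, and outcomes, and edges are the hypothesized causal links.

# Illustrative only: a hypothetical pathway (logic) model as a directed graph.
# Node names are invented; edges encode hypothesized causal links from
# activities through outputs to short- and longer-term outcomes.
from collections import defaultdict, deque

pathway = {
    "after-school science workshops": ["youth attend hands-on sessions"],
    "youth attend hands-on sessions": ["increased science awareness"],
    "increased science awareness": ["interest in science courses"],
    "interest in science courses": ["more young people pursue science careers"],
}

def downstream_outcomes(model, start):
    """Return every node reachable from start, i.e., its hypothesized effects."""
    graph = defaultdict(list, model)
    seen, queue = set(), deque(graph[start])
    while queue:
        node = queue.popleft()
        if node not in seen:
            seen.add(node)
            queue.extend(graph[node])
    return seen

if __name__ == "__main__":
    # Trace the causal pathway from a single activity to all of its outcomes.
    for outcome in sorted(downstream_outcomes(pathway, "after-school science workshops")):
        print(outcome)

Unlike the columnar form, where each column is a flat list, the graph form makes the specific activity-to-outcome pathways explicit, which makes it easier to decide which pathways an evaluation should focus on.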

24 Systems thinking and quality: criteria for plan quality and their systems thinking roots
1) Consistency with a high-quality program model: part-whole relationships; local-global scale; causal pathways
2) Fitness of evaluation plan elements to the program and program context: static and dynamic processes; ontogeny and phylogeny; symbiosis and co-evolution; feedback
3) Alignment of evidence framework: static and dynamic processes; symbiosis and co-evolution
4) Reflecting judgments well-grounded in awareness of tradeoffs: part-whole relationships; local and global scale; boundaries; ontogeny and phylogeny; multiple perspectives

25 Thank you! This presentation is based on work by the research and facilitation team at the Cornell Office for Research on Evaluation, led by Dr. William Trochim. For more information on the Systems Evaluation Protocol, see HTTP://CORE.HUMAN.CORNELL.EDU/AEA_CONFERENCE.CFM

