Coming from Behind: Developing a Logic Model and an Evaluation Strategy Late in a Center's Life Cycle
Judah Leblang, Project Evaluator
Program Evaluation and Research Group (PERG) at Lesley University
American Evaluation Association, November 2, 2011
Evaluation of CELEST
PERG served as the primary evaluator for CELEST, the Center of Excellence for Learning in Education, Science, and Technology, in 2009-10. The Center, based at Boston University and also involving researchers from Brandeis, Harvard, and MIT, was preparing for its critical review after receiving only provisional funding (an 18-month extension) in 2009.
What is CELEST?
CELEST's core scientific objective is to understand how the whole brain learns, i.e., how it adapts as an integrated system to enable intelligent autonomous behavior. CELEST's core technology objective is to develop novel brain-inspired technologies by implementing key insights gained from experiments and modeling: from bench to models to applications. (CELEST website)
CELEST: Educational Goals
CELEST is creating a new paradigm for educating graduate and undergraduate students in systems neuroscience by connecting biological knowledge about brains to an understanding of intelligent behavior through neural and computational models. Accordingly, CELEST graduate students cross-train in experimental and modeling techniques in ways that will make them valuable members of the cross-disciplinary teams of the future of systems neuroscience, because they will be fluent in more than one "language." (CELEST website)
CELEST Evaluation Priorities 2009-10
– Need for a comprehensive evaluation plan and a credible evaluation
– Need for a logic model and clarity around key goals
– Demonstration of "value added" or "centerness," as required by NSF
– Integration of an external evaluator into the project planning process and data collection
– Need to demonstrate a change in leadership and restructuring of the project
– Desire for the external evaluator to oversee all aspects of the evaluation and respond to NSF/SLC requests
NSF/SLC Evaluation Priorities 2009-10
– Rapid development of a new, more comprehensive evaluation plan integrated into the CELEST project SIP
– Data collection and analysis by January 2010
– Examination of CELEST's functioning as a center vs. as a cluster of independent researchers
– Measurement of "centerness," to be compared with data from other SLC projects
– Development of the Year 6 Evaluation Report as the basis for the site visit and critical review
CELEST Evaluation: Phase 1 (Sept 2009-January 2010)
Documenting the value of CELEST as a center:
– Reviewed CELEST archival documents
– Identified project strengths and examples of "centerness"
– Conducted focus groups and interviews with graduate students and postdocs
– Analyzed data and wrote the evaluation report
– Presented findings to the CELEST board and site visit team (March 2010)
Evaluation methods used:
– Two focus groups: interviewed 7 postdocs and 6 graduate students
– Student and faculty survey data: surveyed 28 of 34 graduate students, 16 of 17 postdocs, and 15 of 16 faculty (response rates sketched below)
– Interviews with the PI and Co-PIs; 3 interviews with the diversity team
– Review of the project website and archival documents
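For context, the survey coverage above works out to the response rates below. This is a minimal sketch using only the counts reported on this slide; the calculation is an added illustration, not part of the CELEST evaluation materials.

```python
# Response rates implied by the Phase 1 survey counts reported above
# (responded, total) per group, as listed on the slide.
survey_counts = {
    "graduate students": (28, 34),
    "postdocs": (16, 17),
    "faculty": (15, 16),
}

for group, (responded, total) in survey_counts.items():
    print(f"{group}: {responded}/{total} responded ({responded / total:.0%})")
```

By these figures, each group was surveyed at roughly 80-95 percent coverage.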
Evaluation Phase 2: Fall 2009-ongoing
Primary areas of focus:
"Centerness": value of CELEST as a center
– Mechanisms to share resources and ideas about neuroscience and related technology within the center, with partners, and with a broader audience of researchers, educators, and the general public
– Document outgrowths of CELEST and the benefits of collaborations
Education, outreach, and partnerships
– Education, research, and career development
– New undergraduate courses, with a focus on CN 210: reviewed 2009 course materials and conferred with faculty and teaching fellows
– Evaluate the effectiveness of the 2010 course
– CELEST-related research experiences and internships
– Partnerships with other institutions, industry, and professional societies
Diversity
– Documented CELEST activities promoting inclusiveness and diversity: WISE, SACNAS conferences, summer internships, and other activities
Evaluation Questions
Data collection was driven by the following evaluation questions, framed in consultation with CELEST staff and developed to satisfy SLC requirements:
1. What makes CELEST a center?
2. What does CELEST do that is different from the research, educational, and other activities that would be carried out by a group of individual investigators?
3. What value [to the participating institutions, to students, to the field] does CELEST add, and how?
Evaluation Questions, cont'd
4. How, and how effectively, does CELEST:
a) Facilitate interdisciplinary research collaborations among scientists and across institutions?
b) Support the education of a new interdisciplinary generation of scientists?
c) Increase the number of underrepresented group members, including women, people of various ethnic backgrounds, and people with disabilities, entering the field?
d) Foster partnerships for the exchange of ideas with industry, other schools, and professional societies?
e) Provide avenues for sharing resources and ideas within the center, with partners, and with the larger community?
CELEST: Data Collection Methodologies (by evaluation area)
Value of CELEST as a center
– Interviews: CELEST PI/Co-PIs
– Focus groups: postdocs; grad students
– Observations and meetings: CELEST board meetings; SLC seminars
– Surveys: researchers; postdocs; grad students
– Project artifacts: site visit and EASRB reports; web site; table of cross-project collaborations
Diversity activities
– Interviews: diversity co-chairs; diversity consultant
– Observations and meetings: board meetings
– Surveys: researchers
– Project artifacts: web site; project documents; e-mails
Education-related initiatives
– Interviews: PI/Co-PIs
– Focus groups: postdocs; grad students
– Observations and meetings: board meetings; CELEST student retreat; SLC seminar
– Surveys: postdocs; grad students
– Project artifacts: web site; e-mails; course materials; table of cross-project collaborations
CELEST Draft Logic Model, Fall 2009
Logic model columns:
– Resources (includes NSF funding support)
– Activities
– Outputs & measures
– Short-term outcomes (current year)
– Intermediate outcomes (18 months)
– Potential impact
CELEST Draft Logic Model, Fall 2009, cont'd (same column structure; see the sketch below)
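For readers who prefer to see the logic model as a structure rather than a table, the columns above can be sketched as a simple record type. The field names follow the draft model's columns; the example values are hypothetical placeholders drawn loosely from the goals described earlier, not the actual contents of CELEST's draft logic model.

```python
# A generic logic-model row following the draft model's columns.
# Example values are hypothetical placeholders, not CELEST's actual entries.
from dataclasses import dataclass
from typing import List

@dataclass
class LogicModelRow:
    resources: List[str]              # includes NSF funding support
    activities: List[str]
    outputs_and_measures: List[str]
    short_term_outcomes: List[str]    # current year
    intermediate_outcomes: List[str]  # 18 months
    potential_impact: List[str]

example_row = LogicModelRow(
    resources=["NSF SLC funding", "faculty and student time at four institutions"],
    activities=["cross-institution research projects", "graduate cross-training"],
    outputs_and_measures=["joint publications", "course enrollments"],
    short_term_outcomes=["increased cross-project collaboration"],
    intermediate_outcomes=["students fluent in both modeling and experimental work"],
    potential_impact=["a new paradigm for systems neuroscience education"],
)
```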
Key Findings: CELEST and Centerness
– More collaboration across departments at BU
– More collaboration between BU and partner institutions (Harvard, MIT, Brandeis)
– More collaboration across fields and areas of specialization
– New funding structure: project-based rather than investigator-based
– New projects established as outgrowths of CELEST (SYNAPSE/DARPA project)
– CELEST research as "transformational"
– Provides the opportunity to do integrated research using multiple approaches (modeling and experimental techniques)
CELEST: Key Challenges
– Uncertainty about funding and the future of the Center
– Challenges related to the transition to a new management structure
– Some resources still committed to projects from the previous configuration
– Ongoing need to create and implement new systems
Coming from Behind: Lessons Learned for Evaluation
– The importance of a clear delineation of the evaluator's role and responsibilities, both within the project and between project staff and NSF
– The need for a flexible but detailed logic model to serve as a visual representation of key project objectives and the means of achieving them
– The value of ongoing data collection by project staff, and the need for leadership's "buy-in" to the program evaluation process, ideally from the project's inception
– The need for clear, consistent guidelines from the funding agency, and provision of sufficient time and resources for a comprehensive evaluation to take place