Coming from Behind: Developing a Logic Model and an Evaluation Strategy Late in a Center's Life Cycle
Judah Leblang, Project Evaluator
Program Evaluation and Research Group (PERG) at Lesley University
American Evaluation Association, November 2, 2011

Evaluation of CELEST
PERG served as the primary evaluator for CELEST, the Center of Excellence for Learning in Education, Science, and Technology. The Center, based at Boston University and also involving researchers from Brandeis, Harvard, and MIT, was preparing for its critical review after receiving only provisional funding, an 18-month extension.

What is CELEST?
CELEST's core scientific objective is to understand how the whole brain learns; i.e., how it adapts as an integrated system to enable intelligent autonomous behavior.
CELEST's core technology objective is to develop novel brain-inspired technologies by implementing key insights gained from experiments and modeling: from bench to models to applications. (CELEST website)

CELEST: Educational Goals
CELEST is creating a new paradigm for educating graduate and undergraduate students in systems neuroscience by connecting biological knowledge about brains to an understanding of intelligent behavior through neural and computational models. Accordingly, CELEST graduate students cross-train in experimental and modeling techniques in ways that will make them valuable members of the cross-disciplinary teams of the future of systems neuroscience, because they will be fluent in more than one "language." (CELEST website)

CELEST Evaluation Priorities
– Need for a comprehensive evaluation plan and a credible evaluation
– Need for a logic model and clarity around key goals
– Demonstration of "value added," or "centerness," as required by NSF
– Integration of an external evaluator into the project planning process and data collection
– Need to demonstrate change in leadership and restructuring of the project
– Desire for the external evaluator to oversee all aspects of the evaluation and respond to NSF/SLC requests

NSF/SLC Evaluation Priorities
– Rapid development of a new, more comprehensive evaluation plan integrated into the CELEST project SIP
– Data collection and analysis by January 2010
– Examination of CELEST's functioning as a center vs. as a cluster of independent researchers
– Measurement of "centerness," to be compared with data from other SLC projects
– Development of the Year 6 Evaluation Report as the basis for the site visit and critical review

CELEST Evaluation: Phase 1 (Sept 2009–January 2010)
Documenting the value of CELEST as a center:
– Reviewed CELEST archival documents
– Identified project strengths and examples of "centerness"
– Conducted focus groups and interviews with graduate students and post-docs
– Analyzed data and wrote the evaluation report
– Presented findings to the CELEST board and site visit team (March 2010)
Evaluation methods used:
– Two focus groups: interviewed 7 post-docs and 6 graduate students
– Surveys: 28 of 34 graduate students, 16 of 17 post-docs, and 15 of 16 faculty
– Interviews with the PI and Co-PIs; 3 interviews with the diversity team
– Review of the project website and archival documents
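Those survey counts imply unusually high coverage; a minimal Python sketch (using only the counts reported on this slide, with illustrative variable names) makes the response-rate arithmetic explicit:

```python
# Response rates from the Phase 1 survey counts reported above
# (counts are from the slide; names are illustrative).
survey_counts = {
    "graduate students": (28, 34),
    "post-docs": (16, 17),
    "faculty": (15, 16),
}

for group, (responded, total) in survey_counts.items():
    print(f"{group}: {responded}/{total} = {responded / total:.0%}")

# Prints:
# graduate students: 28/34 = 82%
# post-docs: 16/17 = 94%
# faculty: 15/16 = 94%
```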

Evaluation Phase 2: Fall 2010, ongoing
Primary areas of focus:
"Centerness": the value of CELEST as a center
– Mechanisms to share resources and ideas about neuroscience and related technology within the center, with partners, and with broader audiences (researchers, educators, and the general public)
– Document outgrowths of CELEST and the benefits of its collaborations
Education, outreach, and partnerships
– Education, research, and career development
– New undergraduate courses, with a focus on CN 210: reviewed 2009 course materials and conferred with faculty and teaching fellows
– Evaluate the effectiveness of the 2010 course
– CELEST-related research experiences and internships
– Partnerships with other institutions, industry, and professional societies
Diversity
– Documented CELEST activities promoting inclusiveness and diversity: WISE, SACNAS conferences, summer internships, and other activities

Evaluation Questions
Data collection was driven by the following evaluation questions, framed in consultation with CELEST staff and developed to satisfy SLC requirements:
1. What makes CELEST a center?
2. What does CELEST do that is different from the research, educational, and other activities that would be carried out by a group of individual investigators?
3. What value [to the participating institutions, to students, to the field] does CELEST add, and how?

Evaluation Questions, cont'd
4. How, and how effectively, does CELEST:
a) Facilitate interdisciplinary research collaborations among scientists and across institutions?
b) Support the education of a new interdisciplinary generation of scientists?
c) Increase the number of underrepresented group members, including women, people of various ethnic backgrounds, and people with disabilities, entering the field?
d) Foster partnerships for the exchange of ideas with industry, other schools, and professional societies?
e) Provide avenues for sharing resources and ideas within the center, with partners, and with the larger community?

CELEST: Data Collection Methodologies
Value of CELEST as a center
– Interviews: CELEST PI and Co-PIs
– Focus groups: postdocs; grad students
– Observations and meetings: CELEST board meetings; SLC seminars
– Surveys: researchers; postdocs; grad students
– Project artifacts: site visit and EASRB reports; website; table of cross-project collaborations
Diversity activities
– Interviews: diversity co-chairs; diversity consultant
– Observations and meetings: board meetings
– Surveys: researchers
– Project artifacts: website; project documents
Education-related initiatives
– Interviews: PI and Co-PIs
– Focus groups: postdocs; grad students
– Observations and meetings: board meetings; CELEST student retreat; SLC seminar
– Surveys: postdocs; grad students
– Project artifacts: website; course materials; table of cross-project collaborations
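A crosswalk like this is easy to make queryable. Below is a Python sketch that encodes the matrix above as a nested dictionary; the structure mirrors the slide, but the variable name, key strings, and the example query are my own, not part of the deck:

```python
# The evaluation-area-by-method matrix above, encoded as a nested dict
# mapping evaluation area -> method -> list of data sources.
methods_matrix = {
    "Value of CELEST as a center": {
        "Interviews": ["CELEST PI and Co-PIs"],
        "Focus groups": ["Postdocs", "Grad students"],
        "Observations and meetings": ["CELEST board meetings", "SLC seminars"],
        "Surveys": ["Researchers", "Postdocs", "Grad students"],
        "Project artifacts": ["Site visit and EASRB reports", "Website",
                              "Table of cross-project collaborations"],
    },
    "Diversity activities": {
        "Interviews": ["Diversity co-chairs", "Diversity consultant"],
        "Observations and meetings": ["Board meetings"],
        "Surveys": ["Researchers"],
        "Project artifacts": ["Website", "Project documents"],
    },
    "Education-related initiatives": {
        "Interviews": ["PI and Co-PIs"],
        "Focus groups": ["Postdocs", "Grad students"],
        "Observations and meetings": ["Board meetings", "CELEST student retreat",
                                      "SLC seminar"],
        "Surveys": ["Postdocs", "Grad students"],
        "Project artifacts": ["Website", "Course materials",
                              "Table of cross-project collaborations"],
    },
}

# Example query: list every data source used for one evaluation area.
for method, sources in methods_matrix["Diversity activities"].items():
    print(f"{method}: {', '.join(sources)}")
```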

CELEST Draft Logic Model, Fall 2009 (two slides)
[Only the column headings of the draft logic model survive in the transcript: Resources (includes NSF funding support), Activities, Outputs & measures, Short-term outcomes (current year), Intermediate outcomes (18 months), Potential impact.]
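As a minimal sketch of the structure those column headings imply, one strand of such a logic model could be represented as a typed record. The class and field names below are assumptions derived from the headings; the deck does not reproduce CELEST's actual entries:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModelRow:
    """One strand of a logic model, with fields named after the
    column headings on the slide. Contents are left empty because
    the deck does not reproduce CELEST's actual entries."""
    resources: List[str] = field(default_factory=list)              # incl. NSF funding support
    activities: List[str] = field(default_factory=list)
    outputs_and_measures: List[str] = field(default_factory=list)
    short_term_outcomes: List[str] = field(default_factory=list)    # current year
    intermediate_outcomes: List[str] = field(default_factory=list)  # 18 months
    potential_impact: List[str] = field(default_factory=list)

# Hypothetical entry, for illustration only:
row = LogicModelRow(
    resources=["NSF SLC funding", "Partner-institution labs"],
    activities=["Cross-institution research collaborations"],
)
print(row)
```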

Key Findings: CELEST and Centerness
– More collaboration across departments at BU
– More collaboration between BU and partner institutions (Harvard, MIT, Brandeis)
– More collaboration across fields and areas of specialization
– New funding structure: project-based rather than investigator-based
– New projects established as outgrowths of CELEST (SYNAPSE/DARPA project)
– CELEST research described as "transformational"
– Opportunity to do integrated research using multiple approaches (modeling and experimental techniques)

CELEST: Key Challenges
– Uncertainty about funding and the future of the Center
– Challenges related to the transition to a new management structure
– Some resources still committed to projects from the previous configuration
– Ongoing need to create and implement new systems

Coming from Behind: Lessons Learned for Evaluation
– The importance of a clear delineation of the evaluator's role and responsibilities, both within the project and between project staff and NSF
– The need for a flexible but detailed logic model to serve as a visual representation of key project objectives and the means of achieving them
– The value of ongoing data collection by project staff, and the need for leadership's buy-in to the program evaluation process, ideally from the project's inception
– The need for clear, consistent guidelines from the funding agency, and for sufficient time and resources for a comprehensive evaluation