Relationships between Involvement and Use in the Context of Multi-site Evaluation American Evaluation Association Conference November 12, 2009.

Beyond Evaluation Use
– Four-year NSF grant to study the relationships between involvement in program evaluation and use/influence
– Research team (2 co-PIs and 8 graduate students) based at the University of Minnesota
– Context of four NSF-funded multi-site programs
– Involvement and use by users who were not directly intended (unintended users)

Framework for Involvement
Cousins and Whitmore's (1998) Systematic Collaborative Inquiry
– Control of the Evaluation
– Stakeholder Selection
– Depth of Participation
Burke's (1998) Key Decision Points
– Evaluation Stages
– Activities
– Levels of Control

Framework for Use
Instrumental (use for action): the use of knowledge for making decisions
Conceptual or Enlightenment (use for understanding): the use of knowledge to better understand a program or policy
Political, Persuasive, or Symbolic (use for justification): the use of knowledge to support a decision someone has already made or to persuade others to hold a specific opinion

Framework for Use and Influence
Evaluation use: the purposeful application of evaluation processes, findings, or knowledge to produce an effect
Influence ON evaluation: the capacity of an individual to produce effects on an evaluation by direct or indirect means
Influence OF evaluation (from Kirkhart, 2000): the capacity or power of evaluation to produce effects on others by intangible or indirect means

More Recent Developments
Kirkhart (2000)
– Evaluation influence = the capacity of persons or things to produce effects on others by intangible or indirect means
– Maps influence along three dimensions: source, intention, and time
Mark & Henry (2003); Henry & Mark (2004)
– Intangible influence on individuals, programs, and communities
– A focus only on direct use of evaluation results or processes is not adequate

“Beyond Evaluation Use” NSF Programs and Years of Evaluations
– Local Systemic Change through Teacher Enhancement (LSC): 1995 – present
– Advanced Technological Education (ATE)
– Collaboratives for Excellence in Teacher Preparation (CETP)
– Building Evaluation Capacity of STEM Projects: Math Science Partnership Research Evaluation and Technical Assistance Project (MSP-RETA): 2002 – present

Four Programs and Their Evaluations
ATE: Advanced Technological Education. Mainly community college level projects to enhance the work force; the evaluation included site visits and a yearly survey.
LSC: Local Systemic Change. Professional development for STEM in K-12 school districts; the evaluation included observations, interviews, and surveys.

Four Programs and Their Evaluations (cont.)
CETP: Collaboratives for Excellence in Teacher Preparation. Projects to improve STEM teacher education; the evaluation included surveys and observations.
MSP-RETA: Math Science Partnership Research Evaluation and Technical Assistance. Evaluation technical assistance included national meetings and the provision of consultants.

Methods
– Surveys of project PIs and evaluators in the four programs (645 respondents, 46%)
– Document review
– Interviews with key informant project personnel (29)
– Citation analysis (246 documents, 376 citations)
– Survey of NSF PIs (191 respondents, 54.7%)
– In-depth analytic case studies
(A small illustrative sketch of these counts follows below.)
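To make the survey and citation-analysis tallies above concrete, here is a minimal sketch of how such response rates and citation counts could be computed. The data frames, column names, and values are hypothetical placeholders for illustration only, not the study's actual records.

```python
import pandas as pd

# Hypothetical survey roster: one row per person invited, with a response flag.
survey = pd.DataFrame({
    "role": ["PI", "evaluator", "PI", "evaluator", "PI"],
    "responded": [True, False, True, True, False],
})

# Response rate = respondents / invitees (cf. the 46% and 54.7% figures above).
response_rate = survey["responded"].mean()
print(f"Respondents: {survey['responded'].sum()}  response rate: {response_rate:.1%}")

# Hypothetical citation records: one row per citation found in a reviewed document.
citations = pd.DataFrame({
    "document_id": ["d1", "d1", "d2", "d3"],
    "cited_work": ["LSC evaluation report", "ATE survey", "CETP report", "LSC evaluation report"],
})
print("Documents reviewed:", citations["document_id"].nunique())
print("Citations counted:", len(citations))  # cf. 246 documents, 376 citations
```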

Results
Perception of Evaluation Quality
– Ability to conduct high quality evaluation
– Be recognized as capable
Interface with NSF
– Evaluators as brokers and negotiators
– NSF leveraging involvement and use
– Importance of dissemination
Life Cycles
– Program
– Projects
– Individuals

Results
Project Control
– Complete choice
– Required involvement
– Balance affects use
Community and Networking
– Outreach
– Development of a community of practice
– Mutual respect
– Skill sharing
– Process use

Results
Tensions
– Where best to spend time and money
– Balancing local and national evaluation
– Balancing project and evaluation goals
Uniqueness
– Complex context
– Individual responses

Implications
– Participants were differentially affected by the depth and breadth of their involvement in evaluation activities.
– Neither breadth nor depth was consistently predictive of perceived level of involvement.
– The lack of consistency in perceived involvement and use makes measuring involvement challenging.
– Any investigation is likely to be substantially affected by the nature of the evaluation and the characteristics of the individual.

Limitations
– Only four instances of large, multi-site NSF evaluations were studied, so generalizations to other settings are not possible, although potentialities can be suggested.
– The case studies themselves are based on self-report data along with some archival records.
– The numbers of people surveyed and interviewed are small but appear to be at least representative of the groups included.
– The instruments used for data gathering were developed as part of the project and therefore might not be valid as measures of involvement and use in other contexts.

Future Research
– Research on the causal nature of the relationship between involvement and evaluation use
– The themes presented here provide fruitful areas for further investigation
– The cross-case analysis provides a strong baseline for more positivistic research
– Examine the issues raised here through quantitative path-analytic procedures (a minimal illustrative path model is sketched below)
– Develop strong theories about the relationship between involvement and use that could form the basis for hypothesis formulation
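As a purely illustrative sketch of the path-analytic direction mentioned above, the snippet below estimates a simple recursive path model (involvement -> process use -> use of findings) as a pair of OLS regressions. All variable names and data values are assumptions invented for this example, not the study's measures; dedicated SEM software would additionally provide fit indices and formal tests of indirect effects.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical scale scores; variables and values are illustrative only.
df = pd.DataFrame({
    "involvement_depth":   [1, 2, 3, 4, 2, 5, 3, 4, 1, 5],
    "involvement_breadth": [2, 1, 4, 3, 2, 5, 4, 3, 2, 4],
    "process_use":         [1, 2, 3, 3, 2, 5, 3, 4, 2, 5],
    "findings_use":        [2, 2, 3, 4, 2, 4, 3, 4, 1, 5],
})

# Path 1: depth and breadth of involvement -> process use.
m1 = smf.ols("process_use ~ involvement_depth + involvement_breadth", data=df).fit()

# Path 2: process use and involvement -> use of evaluation findings.
m2 = smf.ols("findings_use ~ process_use + involvement_depth + involvement_breadth",
             data=df).fit()

print(m1.params)  # direct paths into process use
print(m2.params)  # direct paths into findings use
# The indirect effect of involvement on findings use is the product of the
# corresponding path coefficients (e.g., depth -> process use -> findings use).
```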

Note
This material is based upon work supported by the National Science Foundation under Grant No. REC. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

For Further Information
Online:
Research Team:
– Dr. Frances Lawrenz
– Dr. Jean A. King
– Dr. Stacie Toal
– Kelli Johnson
– Denise Roseland