1
Eleventh hour evaluation
Guiding the process when the train has left the station
2
The Project
State-funded professional development of high school STEM teachers
Funding provided in one-year cycles for up to three years
Grant awarded to a consortium of university and rural school system partners and coordinated by university personnel
Program design consisted of a website hosting content modules, four one-day spring workshops, and two four-day summer workshops held at the university
3
APPLYING AEA’S 1ST PRINCIPLE - SYSTEMATIC INQUIRY - TO EVALUATION WHEN...
Evaluators are asked to step in at the 11th hour
Evaluators are expected to adhere to an evaluation plan constructed by the grant writer
Evaluators have limited time or opportunity to engage [in meaningful ways] with stakeholders
Evaluators are tasked with developing valid and reliable data collection processes and instruments without the guidance of clearly defined goals and research questions
4
Our approach in cycle 1
We gathered information about the project through documents and conversations with the project team: the coordinator and two university faculty
We constructed evaluation instruments to gather data from participants in the spring and summer workshops
We reported results within 7 days of receipt of data
We relied on our practical knowledge to make decisions
As described by Kundin (2010), "practitioners rely more on their assumptions, expertise, values and judgment to respond to an evaluation situation" than on evaluation theory
Our practical knowledge grew out of our experiences evaluating U.S. Department of Education-funded K-12 professional development and our experiences as teacher educators
5
Concerns about the collaboration
Limited response to completed reports
Project team provided no information about the summer workshops
We initiated a conversation to understand their intended outcomes for the summer workshops
Based on the conversation and materials provided, we developed draft evaluation tools
6
The Tipping Point
The conversation occurred on May 23
Summer workshop materials were sent to us on June 8 & 11
We sent draft evaluation tools for their review on June 13
7
On the evening of June 14, just days prior to the start of the summer workshop, we received an email from the Project Coordinator:
"The most recent set of materials has caused some frustration on the part of our team in its thoroughness. I would like to make you aware of this and attempt to come to a resolution so future communications will be more smooth. Below are a few of the concerns being expressed. Please offer some insight on solutions to any/all of these as you can. Thank you.
1. The daily assessment questions were not specific enough to gain any real insight not already gathered elsewhere.
2. A plentiful amount of information was given during the phone conference with regard to what was desired to be assessed and not reflected in the documents sent.
3. The section headers were incomplete or inaccurate.
4. The materials arrived on the last working day prior to the start of the workshops. This is much too late.
5. If additional information was needed to draft a more thorough assessment piece, a call or email should have been made last week or at best early this week to allow time to complete.
Thank you for your time and I hope we can begin to work more cohesively moving forward."
8
Preparing for cycle 2
If we continued with the project, we needed to revise our approach based on identifying problems and solutions.
Our process
Working backward to map a timeline of events, beginning with the stakeholders' initial contact and ending with confirmation of their desire to have us continue with the project in cycle 2
Gathering data from communications, face-to-face meetings and documents
Reflecting on the data to identify and prioritize problems
Hypothesizing solutions to problems
Implementing our action steps/strategies
Revising our strategies as needed
9
Problems we identified
Faulty assumptions made by evaluators
We assumed that...
Because the project aimed to provide professional development to high school teachers, the university faculty had knowledge of secondary education
Because the project was coordinated by the university, the project team was knowledgeable about evaluation
Because the project had multiple partners, the approach to evaluation would be collaborative and provide formative and summative findings
Because the project was funded, the project team was guided by clearly defined goals
Faulty assumptions made by project team
The team assumed that...
Evaluation processes should be determined by the team so that findings are predictable
Evaluation of the project doesn't require clearly defined goals
Formative evaluation would point to a need for revisions in the professional development they had no time to accommodate
10
IDENTIFIED PROBLEMS (CONT’D)
Timing
Eleventh hour hire resulted in...
Too little time to meet to exchange information and clarify expectations and processes prior to implementation of Cycle 1 professional development
Rushed communications via email and phone conferences in which both parties assumed the other was clear about needs, timing and deliverables
Last minute sharing of critical materials and information necessary to construct data collection instruments
Communication
The evaluation tasks defined in our service agreement did not align with what the project team said they wanted after receiving the data collection instruments we had constructed
Use of email and phone conferences proved inadequate in framing the 'what, how and why' of the evaluation
There was a disconnect in understanding key evaluation terms, including outcomes and goals
11
Hypothesized solutions
Identify... the questions we need to ask
Clarify... expectations, roles and processes
Collaborate to...
Construct a logic model
Revise the evaluation plan
Develop data collection measures
12
ACTION STEPS WE’VE TAKEN
We have communicated to the project team our commitment to work collaboratively in cycle 2, and the team has agreed to this approach.
We are collaborating on the design of the evaluation plan for cycle 2 and have scheduled a meeting to work out details.
We are initiating communications to ensure we are "in the loop" on all aspects of the project that may impact the evaluation.
We are being proactive in requesting project documents and information about meetings relating to cycle 2.
We are communicating our ongoing support of the project.
13
Conclusion
Ongoing steps...
Conduct monthly assessment of steps 1-5
Communicate/meet frequently with the project team
Summarize all meetings in writing and clarify any misunderstandings
Provide draft evaluation instruments well in advance of use to ensure time for review and revision
Our Goal: Effective evaluation of this project is developed and carried out as a collaborative venture by the entire project team for cycle 2 leading into cycle 3.
14
Reference
Kundin, D. (2010). A conceptual framework for how evaluators make everyday practice decisions. American Journal of Evaluation, 31(3), 347-362.
15
Karen Kortecamp, Ph.D.
Kathleen Anderson Steeves, Ph.D. (Ret.)
Department of Curriculum & Pedagogy
The George Washington University