Nick L. Smith Syracuse University


Emergent, Investigative Evaluation: Theory, Development, and Use in Evaluation Practice
Nick L. Smith, Syracuse University
Presentation at the American Evaluation Association annual meeting, Anaheim, CA, November 2011.

Focus of Presentation
- Overview of the emergent, investigative evaluation approach.
- Brief look at one currently ongoing case example.

Emergent, Investigative Evaluation (EIE)
Emergent / Flexible / Investigative
- Emergent – the design constantly adapts.
- Flexible – responsive to changing contexts, client needs and interests, and new information.
- Investigative – focus on discovery.
Preordinate / Fixed / Confirmatory
- Preordinate – design established at the outset.
- Fixed – conditions controlled to ensure design integrity.
- Confirmatory – focus on proof and justification.

EIE – Related Variations
- Design-Based Research
- Educational Design Research
- Formative Assessment
- Developmental Evaluation

EIE – Conditions of Use
- When context matters and keeps changing.
- When the evaluand is unknown, dynamic, or developmental.
- When uniqueness matters.
- When questions and issues of concern are fluid.

EIE – Design → Design Process
Since the design is responsively adaptive and changing, attention focuses less on the specific steady-state design and more on the process by which changes are made to the design – the Design Process.

EIE – Design Process → Evaluator Roles
Since the Design Process reflects constant change, the evaluator and stakeholders focus more on the evaluator's role in adapting the design than on a stable design. Greater attention to evaluator roles is seen in client-responsive evaluation approaches (e.g., participatory, collaborative, responsive, empowerment, transformative) than in fixed approaches (e.g., experimental).

Clarifying Evaluator Roles
- What are the possible specific types of evaluator–client relationships?
- How might such relationships be characterized in order to create, maintain, and modify them as needed?
- General characterizations are vague and difficult to operationalize: evaluator as teacher, as judge, as researcher; evaluator as supportive, responsive, independent.
- Specific role or relationship protocols are needed: posture, activities, and resources.

Sample EIE Evaluator Role: Evaluator as Critical Observer
Posture:
- Formative Role
- Internal Purpose
- External Focus
- Emergent Findings

Formative Role
The evaluator performs a formative, developmental role, monitoring and reviewing project work to assist project staff in improving activities and products. Summative judgments of quality are not warranted given the evaluator's limited access and resources.

Internal Purpose
The purpose of the evaluator's assistance is to support project staff in monitoring and improving the direction and quality of their efforts. Information is provided for internal staff use, and generally not for external audiences. The evaluator provides a review and advising function, not an external evaluation of process or products.

External Focus
Evaluation assistance is for internal use, but the focus of observations is external, in order to pose questions and issues of interest and importance to outside audiences. The intent is to assist the project in maintaining external accountability: to work with the project while maintaining an outside perspective.

Emergent Findings
- Issues are identified and dealt with according to their urgency and importance; flexible responsiveness is valued over prespecification of topics.
- Contributions of the evaluator become part of the fabric of the project itself.
- The strongest form of accountability is evidence that the best possible decisions were made as the work unfolded.
- Project records of issues, decisions, and subsequent results from evaluator input are evidence that the project made thoughtful, critical assessments on key issues throughout its work.

Sample EIE Evaluator Role: Evaluator as Critical Observer
Activities: The evaluator's participation reflects an open, responsive process.
- Evaluator review of materials, reports, and data.
- Monthly conference calls with project staff: observations shared by the evaluator, real-time questions to the evaluator during the calls, and staff recording of conference-call observations and decisions.
- Periodic review and updating of observations, decisions, and subsequent actions.
- As appropriate and needed, the evaluator reviews and comments, raises questions, identifies assumptions, asks for clarifications, questions decisions, and suggests alternatives.
- The evaluator seeks to illuminate and question the logic and reasoning supporting ongoing project decisions.

Sample EIE Evaluator Role: Evaluator as Critical Observer
Resources:
- 2–4 hours a month
- Emails
- Monthly conference calls
- One annual face-to-face meeting

Example: External Evaluation of SRI ATE Community College Partnership Models and Instructional Impacts
- A research project to study the development and maintenance of industry/community college partnerships and their subsequent instructional impact in training technologists.
- The external evaluator employs the Evaluator as Critical Observer role.

Case Example: SRI CC Partnerships
- Evaluation of the research process and findings.
- Overview: 2 hours a month.
- Monthly conference calls usually include a prior review of materials, a presentation of progress to date, a consideration of specific problems or concerns that have arisen since the last call, and a discussion of general conceptual, methodological, and practical issues related to the ongoing research.

Sample Conceptual, Methodological, and Practical Issues About Ongoing Research
1. What are the tradeoffs in emphasizing investigative explanation versus conclusive generalization?
2. What are the most appropriate types of generalization of research claims: sampling generalization? statistical generalization? causal-mechanism generalization? instance generalization?
3. What is the most useful and accurate understanding of partnership/instruction relationships: static? dynamic? evolving?
4. What types of portrayals best capture and reflect relationships between partnerships and instruction: linear logic models? recursive dynamic patterns? complex configural representations?
5. What are the most informative levels of analysis, given that partnerships and instruction interact at the level of classroom, curriculum, center, community, region, industry, etc.?
6. How desirable is it that the research design be emergent and fluid rather than preordinate and fixed, given the observed continual changes in industry needs, local economic context, community college collaborative arrangements and structure, etc.?

Changes in Research Strategy – 18 Months
- Preordinate to a more emergent research design
- Survey to case studies
- Generalization to explanation
- Fixed logic model to a more fluid representation
In response to:
- Shifting context
- Evolving evaluand
- Increasing understanding
The changes in research strategy could have been different had conditions so warranted.

Assessment of Evaluator as Critical Observer
Requirements:
- Peer-to-peer relationships among experienced participants.
- Mutual trust and respect among researchers and evaluator that enable difficult questions to be asked without judgment and answered without defensiveness.
- A minimalist investment of evaluation resources.
Benefits:
- Provides fresh eyes; grounds discussion of research design issues in terms of what is actually happening in the field.
- Provides additional assistance in discerning emerging issues possibly overlooked when focusing on task completion.
- The evaluation approach is free to adapt as the study unfolds and the needs of the researchers change.

EIE Approach – Future Variations
Yarnall, L., & Smith, N. L. The evaluation theory–practice interface in 2036.
Smith, N. L., Brandon, P. R., Hwalek, M., Kistler, S. J., Labin, S. N., Rugh, J., Thomas, V., & Yarnall, L. (2011). Looking ahead: The future of evaluation. American Journal of Evaluation, 32(4), 565–599.