1 Emergent, Investigative Evaluation: Theory, Development, and Use in Evaluation Practice
Nick L. Smith, Syracuse University. Presentation at the American Evaluation Association annual meeting, Anaheim, CA, November 2011.

2 Focus of Presentation
Overview of the emergent, investigative evaluation approach.
Brief look at one ongoing case example.

3 Emergent, Investigative Evaluation – EIE
Emergent / Flexible / Investigative
Emergent – the design is constantly adapted.
Flexible – responsive to changing contexts, client needs and interests, and new information.
Investigative – focused on discovery.
Preordinate / Fixed / Confirmatory
Preordinate – the design is established at the outset.
Fixed – conditions are controlled to ensure design integrity.
Confirmatory – focused on proof and justification.

4 EIE – Related Variations
Design-Based Research
Educational Design Research
Formative Assessment
Developmental Evaluation

5 EIE – Conditions of Use
When context matters and keeps changing.
When the evaluand is unknown, dynamic, or developmental.
When uniqueness matters.
When questions and issues of concern are fluid.

6 EIE – Design → Design Process
Since the design is responsively adaptive and continually changing, attention focuses less on any specific steady-state design and more on the process by which changes are made to the design – the Design Process.

7 EIE – Design Process → Evaluator Roles
Since the Design Process reflects constant change, the evaluator and stakeholders focus more on the evaluator's role in adapting the design than on a stable design. Greater attention to evaluator roles is seen in client-responsive evaluation approaches (e.g., participatory, collaborative, responsive, empowerment, transformative) than in fixed approaches (e.g., experimental).

8 Clarifying Evaluator Roles
What are the possible specific types of evaluator-client relationships?
How might such relationships be characterized in order to create, maintain, and modify them as needed?
General characterizations are vague and difficult to operationalize: evaluator as teacher, as judge, as researcher; evaluator as supportive, responsive, independent.
Specific role or relationship protocols are needed, specifying posture, activities, and resources.

9 Sample EIE Evaluator Role: Evaluator as Critical Observer
Posture:
Formative Role
Internal Purpose
External Focus
Emergent Findings

10 Formative Role
The evaluator performs a formative, developmental role.
Monitors and reviews project work to assist project staff in improving activities and products.
Summative judgments of quality are not warranted, given the limited access and resources.

11 Internal Purpose
The purpose of the evaluator's assistance is to support project staff in monitoring and improving the direction and quality of their efforts.
Information is provided for staff internal use, and generally not for external audiences.
The evaluator provides a review and advising function, not an external evaluation of process or products.

12 External Focus
Evaluation assistance is for internal use, but the focus of observations is external, in order to pose questions and issues of interest and importance to outside audiences.
The intent is to assist the project in maintaining external accountability: to work with the project while maintaining an outside perspective.

13 Emergent Findings
Issues are identified and dealt with according to their urgency and importance.
Flexible responsiveness is valued over prespecification of topics.
Contributions of the evaluator become part of the fabric of the project itself.
The strongest form of accountability is evidence that the best possible decisions were made as the work unfolded.
Project records of issues, decisions, and subsequent results from evaluator input are evidence that the project made thoughtful, critical assessments of key issues throughout its work.

14 Sample EIE Evaluator Role: Evaluator as Critical Observer
Activities: The evaluator's participation reflects an open, responsive process.
Evaluator review of materials, reports, and data.
Monthly conference calls with project staff: observations shared by the evaluator, real-time questions to the evaluator during the calls, and staff recording of conference-call observations and decisions.
Periodic review and updating of observations, decisions, and subsequent actions.
As appropriate and needed, the evaluator reviews and comments, raises questions, identifies assumptions, asks for clarifications, questions decisions, and suggests alternatives.
The evaluator seeks to illuminate and question the logic and reasoning supporting ongoing project decisions.

15 Sample EIE Evaluator Role: Evaluator as Critical Observer
Resources:
2-4 hours a month
Emails
Once-monthly conference calls
1 annual face-to-face meeting

16 Example: External Evaluation of SRI ATE Community College Partnership Models and Instructional Impacts
A research project to study the development and maintenance of industry/community college partnerships and their subsequent instructional impact in training technologists.
The external evaluator employs the Evaluator as Critical Observer role.

17 Case Example: SRI CC Partnerships
Evaluation of research process and findings.
Overview – 2 hours a month.
Monthly conference calls usually include a prior review of material, a presentation of progress to date, a consideration of specific problems or concerns that have arisen since the last call, and a discussion of general conceptual, methodological, and practical issues related to the ongoing research.

18 Sample Conceptual, Methodological, and Practical Issues About Ongoing Research
1. Tradeoffs in emphasizing investigative explanation versus conclusive generalization?
2. Most appropriate types of generalization of research claims: sampling generalization? statistical generalization? causal-mechanism generalization? instance generalization?
3. Most useful and accurate understanding of partnership/instruction relationships: static? dynamic? evolving?
4. Types of portrayals that best capture and reflect relationships between partnerships and instruction: linear logic models? recursive dynamic patterns? complex configural representations?
5. Most informative levels of analysis, given that partnerships and instruction interact at the level of classroom, curriculum, center, community, region, industry, etc.?
6. Desirability of a research design that is emergent and fluid rather than preordinate and fixed, given the observed continual changes in industry needs, local economic context, community college collaborative arrangements and structure, etc.?

19 Changes in Research Strategy – 18 Months
Preordinate to a more emergent research design
Survey to case studies
Generalization to explanation
Fixed logic model to a more fluid representation

In response to:
Shifting context
Evolving evaluand
Increasing understanding

Changes in research strategy could have been different had conditions so warranted.

20 Assessment of Evaluator as Critical Observer
Requirements:
Requires peer-to-peer relationships among experienced participants.
Requires mutual trust and respect between researchers and evaluator, enabling difficult questions to be asked without judgment and answered without defensiveness.
Requires only a minimalist investment in evaluation resources.

Benefits:
Provides fresh eyes; grounds discussion of research design issues in what is actually happening in the field.
Provides additional assistance in discerning emerging issues possibly overlooked when focusing on task completion.
The evaluation approach is free to adapt as the study unfolds and the needs of the researchers change.

21 EIE Approach – Future Variations
Yarnall, L., & Smith, N. L. The evaluation theory-practice interface in 2036.
Smith, N. L., Brandon, P. R., Hwalek, M., Kistler, S. J., Labin, S. N., Rugh, J., Thomas, V., & Yarnall, L. (2011). Looking ahead: The future of evaluation. American Journal of Evaluation, 32(4).

