Evaluation for 1st Year Grantees
Shelly Potts, Ph.D.
Arizona State University
shelly.potts@asu.edu
Developing an Evaluation Plan
• Key components: What? How? When? Who? $? What does it mean? Who/what to tell? How used?
• Selecting evaluation tools & methods: What to consider? Types?
• Assessing outcomes: Did we accomplish goals? How do we know?
• Recommendations
Comprehensive Evaluation Plan
Goals
• General aims or purposes
• Broad, long-range intended outcomes
• Written in broad, global language
• Used for planning and policy making
• Include such statements as “gain an understanding of,” “become aware of,” or “acquire the ability to”
Objectives
• Brief, clear statements that describe intended results
• Focused on specific types of behaviors participants are expected to demonstrate
• Must be attainable, measurable, and feasible
• Include an action verb & a statement of ability
Sources of Data
• Entering Students
• Current Students
• Exiting Seniors
• Faculty
• Staff
• Alumni
• Employers of Graduates
Types of Evaluation Measures
• Direct
  – Standardized, local, or course-embedded tests (see the pre/post sketch after this list)
  – Review of student products (paper, exhibit, project, design, program)
  – Structured evaluation of performance (clinical/lab, simulation, presentation)
  – Portfolio assessment
  – Review of project documents
• Indirect
  – Surveys
  – Interviews
  – Focus groups
  – Student evaluations
  – Institutional data (grades, enrollment, participation, retention/persistence)
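Where pre/post testing serves as a direct measure, Hake's normalized gain is one common way to summarize improvement: g = (post - pre) / (max - pre), the fraction of the available improvement actually achieved. A minimal sketch, assuming paired scores on a 0–100 scale; the function name and all data below are hypothetical, not from the presentation.

```python
# Sketch: summarizing a direct measure (paired pre/post tests) with
# Hake's normalized gain, g = (post - pre) / (max_score - pre).
# The function name and all scores below are hypothetical.

def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Fraction of the available improvement a participant achieved."""
    if pre >= max_score:  # no room to improve; avoid division by zero
        return 0.0
    return (post - pre) / (max_score - pre)

# Paired (pre, post) scores for five hypothetical participants
scores = [(45, 70), (60, 85), (55, 55), (80, 92), (30, 65)]
gains = [normalized_gain(pre, post) for pre, post in scores]
print(f"mean normalized gain: {sum(gains) / len(gains):.2f}")  # 0.44
```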
Selecting an Evaluation Method
• Match the method with the goal or objective
• Check whether an existing assessment fits before building a new one
• Reliability and validity (see the sketch after this list)
• Assess available resources
• Timing
• Utility of the data
• Triangulation of data sources
• Direct vs. indirect measures
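As one concrete way to check the reliability of a locally developed survey, Cronbach's alpha estimates internal consistency from item-level responses. A minimal sketch, assuming 5-point Likert items with one row per respondent; the data and the 0.70 rule of thumb are illustrative, not from the slides.

```python
# Sketch: estimating internal-consistency reliability (Cronbach's alpha)
# for a locally developed survey. The responses are hypothetical 5-point
# Likert ratings: one row per respondent, one column per item.

def cronbach_alpha(rows: list[list[float]]) -> float:
    k = len(rows[0])  # number of items

    def variance(values: list[float]) -> float:
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

    item_vars = [variance([row[i] for row in rows]) for i in range(k)]
    total_var = variance([sum(row) for row in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

responses = [
    [4, 5, 4, 4],
    [3, 4, 3, 3],
    [5, 5, 4, 5],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
]
print(f"alpha = {cronbach_alpha(responses):.2f}")  # >= 0.70 is a common rule of thumb
```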
Matching Methods to Objectives
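The matching itself can be recorded as a simple objectives-to-methods matrix and sanity-checked for triangulation: each objective should be covered by at least two methods, including at least one direct measure. Everything in the sketch below (objective names, method labels, assignments) is hypothetical.

```python
# Sketch: recording an objectives-to-methods matrix and checking that
# each objective is triangulated: covered by at least two methods,
# including at least one direct measure. All objective names, method
# labels, and assignments here are hypothetical.

DIRECT_MEASURES = {"course-embedded test", "portfolio review",
                   "performance evaluation"}

plan = {
    "apply research methods":  ["course-embedded test", "survey"],
    "communicate results":     ["performance evaluation", "portfolio review"],
    "persist in STEM program": ["institutional retention data", "focus group"],
}

for objective, methods in plan.items():
    if len(methods) < 2 or not any(m in DIRECT_MEASURES for m in methods):
        print(f"revisit '{objective}': needs >= 2 methods incl. a direct measure")
```

Run as-is, the check flags the retention objective, which is covered only by indirect evidence.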
Why use multiple methods?
• Ensure continuity of evaluation
• Increase confidence in findings
Why use mixed methods?
• KNOWLEDGE GAIN
  – Richest, most comprehensive description of participants, processes, & outcomes
  – Strengthen validity of findings and reduce methodological bias
• CREDIBILITY
  – Convey the “what,” “how,” & “why” of experience & outcomes
  – Capture the breadth and depth of participants’ experiences
• UTILITY
  – Most complete, useful administrative resource
  – Meet competing needs of multiple stakeholders
  – Assess diverse project objectives
Learning Outcomes Assessment (an example)
Scoring Criteria
• Structure the report into sections that reflect stages in the research process
• Present the research question clearly & describe the strategy
• Discuss the literature & provide support for the hypothesis
• Describe how the research was conducted
• Summarize how data were collected and their statistical treatment
• Evaluate and interpret implications of the data with respect to the original hypothesis
• Present ideas and arguments clearly and logically, using an appropriate balance of text and visuals
• Cite references in the appropriate format
• Use English syntax and technical terms appropriately
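Criteria like these become a consistent instrument once each carries an explicit scale and weight. The sketch below scores a report against the slide's criteria as a weighted 0–4 rubric; the weights, scale, and sample ratings are hypothetical, not the presenter's actual instrument.

```python
# Sketch: applying the report-scoring criteria as a weighted rubric.
# The criterion labels paraphrase the slide; the weights, the 0-4
# scale, and the sample ratings are hypothetical.

RUBRIC = {  # criterion -> weight
    "structure reflects research process": 1,
    "research question & strategy":        2,
    "literature & hypothesis support":     2,
    "methods description":                 2,
    "data collection & statistics":        2,
    "interpretation vs. hypothesis":       3,
    "clarity of text and visuals":         1,
    "citation format":                     1,
    "syntax & technical terms":            1,
}

def score_report(ratings: dict[str, int]) -> float:
    """Weighted mean of per-criterion ratings on the 0-4 scale."""
    weighted = sum(RUBRIC[c] * ratings[c] for c in RUBRIC)
    return weighted / sum(RUBRIC.values())

sample = {c: 3 for c in RUBRIC}  # a rater giving uniform 3s
sample["citation format"] = 2    # except a weaker citation rating
print(f"report score: {score_report(sample):.2f} / 4")  # 2.93
```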
Evaluation Recommendations
• Develop a comprehensive (not complicated) plan
• Define project objectives clearly and completely
• Align methods with project objectives
• Utilize mixed and multiple methods
• Utilize direct and indirect methods
• Focus on the utility of evaluation data
• Start early
• Enlist an independent evaluator
• Evaluate your evaluation plan and procedures