
1 Evaluation of Education Development Projects
CCLI PI Meeting, August 15, 2008

2 Caution
The information in these slides represents the opinions of the individual program directors and not an official NSF position.

3 Session Goals
The session will:
- Increase your understanding of evaluation
- Enable you to collaborate more effectively with evaluation experts in preparing and performing effective project evaluation plans
It will NOT make you an evaluation expert.

4 Perspective
- Ideas here are applicable to your project whether in Week 5 or Year 5
- Consider your evaluation in light of what you discover here
- Make adjustments to try to get the best evaluation data possible

5 Definitions
Assessment
- Measurement of something (e.g., students’ ability to solve problems)
- Usually does not carry a value judgment
Assessment instrument (tool)
- The method or device for measuring (e.g., a test, survey, or portfolio)
Evaluation (e-VALUE-ation)
- A judgment of something based upon results from one or more assessments
- Frequently compared to an expected outcome

6 Evaluation and Assessment
Evaluation (assessment) is used in many ways:
- Individual’s performance (grading)
- Program’s effectiveness (ABET; regional accreditation)
- Project’s progress & accomplishments (monitoring & validating)
This session addresses project evaluation
- May involve evaluating individual and group performance, but in the context of the project
Project evaluation
- Formative: monitoring progress
- Summative: characterizing final accomplishments

7 Evaluation and Project Goals & Outcomes
Evaluation starts with carefully defined project goals & expected (measurable) outcomes.
Goals & expected outcomes related to:
- Project management outcomes
  - Initiating or completing an activity
  - Finishing a “product”
- Expected student outcomes
  - Modifying a learning outcome
  - Modifying attitudes or perceptions
Workshop focuses on student outcomes.

8 Evaluation and Project Goals/Outcomes/Questions

9 Developing Student Behavior Goals & Outcomes
Start with one or more overarching statements of project intention
- Each statement is a GOAL: What is your overall ambition? What do you hope to achieve?
Convert each goal into one or more specific expected measurable results
- Each result is an EXPECTED OUTCOME: How will achieving your “intention” be reflected in student behavior?

10 Goals – Objectives – Outcomes – Questions
Converting goals to expected outcomes may involve intermediate steps
- Intermediate steps may be called objectives
- Objectives are more specific and more measurable than goals, but less specific and less measurable than outcomes
Expected outcomes lead to questions
- These form the basis of the evaluation
- The evaluation process collects and interprets data to answer the evaluation questions
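To make the goal-to-question chain concrete, here is a minimal Python sketch; it is not from the slides, and the goal, objective, outcome, and question text are hypothetical, adapted from the statics exercise that follows:

    # Hypothetical illustration of the goal -> objective -> outcome -> question chain.
    from dataclasses import dataclass, field

    @dataclass
    class Outcome:
        statement: str                                  # specific, measurable expected result
        questions: list = field(default_factory=list)   # evaluation questions it generates

    @dataclass
    class Goal:
        statement: str                                  # overarching project intention
        objectives: list = field(default_factory=list)  # intermediate steps
        outcomes: list = field(default_factory=list)

    goal = Goal(
        statement="Increase students' understanding of the concepts in statics",
        objectives=["Students use the 3D manipulation modules in every lecture unit"],
        outcomes=[Outcome(
            statement="Students will be better able to solve conceptual problems "
                      "that require no formulas or calculations",
            questions=["Did students' ability to solve conceptual problems increase "
                       "because of the 3D rendering and animation software?"],
        )],
    )

    for outcome in goal.outcomes:
        print(outcome.statement, "->", outcome.questions)

Writing the chain down explicitly makes it easy to check that every goal eventually produces at least one answerable evaluation question.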

11 Exercise: Identification of Goals and Expected Outcomes
Read the abstract (note: the goal statement has been removed).
Suggest two plausible goals:
- One focused on a change in learning
- One focused on a change in some other aspect of student behavior
Use a student focus, not an instructor focus.

12 Abstract
The goal of the project is … The project is developing computer-based instructional modules for statics and mechanics of materials. The project uses 3D rendering and animation software, in which the user manipulates virtual 3D objects in much the same manner as they would physical objects. Tools being developed enable instructors to realistically include external forces and internal reactions on 3D objects as topics are being explained during lectures. Exercises are being developed for students to be able to communicate with peers and instructors through real-time voice and text interactions. The project is being evaluated by … The project is being disseminated through … The broader impacts of the project are …
For a sample “organic chemistry” version, substitute “organic chemistry” for “statics and mechanics of materials”, “interactions” for “external forces and internal reactions”, and “molecules” for “objects”.
Focus on the student perspective, not the instructor perspective.

13 PD’s Response: Goals
Goals may focus on:
- Cognitive changes
  - Knowledge-driven
  - Skill-driven
- Affective (or attitudinal) changes
- Success changes
- Diversity changes

14 PD’s Response: Goals on Cognitive Change
GOAL: Improve understanding or skills
In the context of the course:
- Describe verbally the effect of external forces on a solid object
- Solve textbook problems
In application beyond the course:
- Solve out-of-context problems
- Visualize 3-D problems
- Communicate technical problems orally

15 PD’s Response: Goals on Attitudinal Changes
GOAL: Improve
- Interest in the course
- Attitude about the profession, curriculum, or department
- Self-confidence
- Intellectual development

16 PD’s Response: Goals on Success Changes
GOAL: Improve
- Recruitment rates
- Retention or persistence rates
- Graduation rates
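As a minimal sketch of how a “success changes” goal becomes a number, the following Python computes a retention rate from cohort records; the records and field names are invented for illustration:

    # Hypothetical cohort records; real data would come from the registrar.
    cohort = [
        {"id": 1, "enrolled_fall": True, "returned_spring": True},
        {"id": 2, "enrolled_fall": True, "returned_spring": False},
        {"id": 3, "enrolled_fall": True, "returned_spring": True},
    ]

    enrolled = [s for s in cohort if s["enrolled_fall"]]
    retained = [s for s in enrolled if s["returned_spring"]]
    print(f"Retention rate: {len(retained) / len(enrolled):.0%}")  # 67%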

17 PD’s Response: Goals on Diversity
“Broaden the participation of underrepresented groups”
GOAL: To change a target group’s
- Cognitive abilities (understanding or skills)
- Attitudes
- Success rates

18 Exercise: Transforming Goals into Expected Outcomes
Write one expected measurable outcome for each of the following goals:
1. Increase the students’ understanding of the concepts in statics
2. Improve the students’ attitude about engineering as a career

19 PD’s Response: Expected Outcomes
Conceptual understanding:
- Students will be better able to solve conceptual problems that don’t require the use of formulas or calculations
- Students will be better able to solve out-of-context problems
Attitudinal:
- Students will be more likely to describe engineering as an exciting career
- The percentage of students who transfer out of engineering after the statics course will decrease

20 Exercise: Transforming Expected Outcomes into Questions
Write a question for each of these expected measurable outcomes:
1. Students will be better able to solve conceptual problems that do not require the use of formulas or calculations
2. In informal discussions, students will be more likely to describe engineering as an exciting career

21 PD’s Response: Questions on Conceptual Understanding
- Did the students’ ability to solve conceptual problems increase?
- Did the students’ ability to solve conceptual problems increase because of the use of the 3D rendering and animation software?

22 PD’s Response: Questions on Attitudes
- Did the students’ discussions indicate more excitement about engineering as a career?
- Did the students’ discussions indicate more excitement about engineering as a career because of the use of the 3D rendering and animation software?

23 Reflection
What is the most surprising idea you heard in this session?

24 Evaluation Tools

25 FLAG (Field-Tested Learning Assessment Guide)
- A primer of assessment & evaluation
- Classroom Assessment Techniques (CATs), qualitative and quantitative
- Searchable & downloadable tools
- Resources in assessment
http://www.flaguide.org

26 FLAG Classroom Assessment Techniques (CATs)
- Attitudinal surveys
- Concept tests
- Concept mapping
- Conceptual diagnostic tests
- Interviews
- Performance assessments
- Portfolios
- Scoring rubrics
- Weekly reports
http://www.flaguide.org

27 SALG (Student Assessment of Learning Gains)
- Assesses the perceived degree of “gain” students made in specific aspects of the class
- Spotlights course elements that best support student learning and those needing improvement
- Web-based instrument requiring 10-15 minutes to use
- Easily modified by the instructor
- Provides instant statistical analysis of results
- Facilitates formative evaluation throughout the course
http://www.salgsite.org
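The SALG site performs its own statistical analysis; purely for illustration, this sketch aggregates SALG-style “gain” ratings (1 = no gain, 5 = great gain) by course element, with hypothetical element names and responses:

    from statistics import mean

    # Each dict is one student's perceived gain per course element (hypothetical).
    responses = [
        {"3D modules": 4, "lectures": 3, "peer discussion": 5},
        {"3D modules": 5, "lectures": 2, "peer discussion": 4},
        {"3D modules": 4, "lectures": 3, "peer discussion": 3},
    ]

    for element in responses[0]:
        ratings = [r[element] for r in responses]
        print(f"{element}: mean perceived gain {mean(ratings):.2f}")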

28 Concept Inventories (CIs)

29 Introduction to CIs
A tool that measures conceptual understanding
A series of multiple-choice questions:
- Each question involves a single concept
- Formulas, calculations, or problem solving are not required
- Possible answers include “distractors”: common errors that reflect common misconceptions
The Force Concept Inventory (FCI) is the prototype.
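Because each question targets a single concept, CI results are naturally reported concept by concept rather than as one total score. A minimal scoring sketch, with a hypothetical answer key and question-to-concept map:

    # Hypothetical answer key and concept map for a 3-question inventory.
    answer_key = {"Q1": "B", "Q2": "D", "Q3": "A"}
    concept_of = {"Q1": "Concept #1", "Q2": "Concept #1", "Q3": "Concept #2"}

    def score_by_concept(student_answers):
        correct, total = {}, {}
        for q, key in answer_key.items():
            c = concept_of[q]
            total[c] = total.get(c, 0) + 1
            if student_answers.get(q) == key:
                correct[c] = correct.get(c, 0) + 1
        return {c: correct.get(c, 0) / total[c] for c in total}

    print(score_by_concept({"Q1": "B", "Q2": "A", "Q3": "A"}))
    # {'Concept #1': 0.5, 'Concept #2': 1.0}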

30 Exercise: Evaluating a CI Tool
Suppose you were considering an existing CI for use in your project’s evaluation.
What questions would you consider in deciding whether the tool is appropriate?

31 PD’s Response: Evaluating a CI Tool
Nature of the tool
- Is the tool relevant to what was taught?
- Is the tool competency based?
- Is the tool conceptual or procedural?
Prior validation of the tool
- Has the tool been tested?
- Is there information on reliability and validity?
- Has it been compared to other tools?
- Is it sensitive? Does it discriminate between novices and experts?
Experience of others with the tool
- Has the tool been used by others besides the developer? At other sites? With other populations?
- Is there normative data?
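Two of the “prior validation” checks can be run on your own pilot data. This sketch computes Cronbach’s alpha (one common internal-consistency reliability measure) and a rough discrimination index (each item’s correlation with the total score); the 0/1 item scores are invented:

    import numpy as np

    # Rows are students, columns are items; 1 = correct (hypothetical pilot data).
    scores = np.array([
        [1, 1, 1, 1],
        [1, 1, 1, 0],
        [1, 0, 1, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 0],
    ])

    def cronbach_alpha(items):
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    def item_total_correlation(items):
        totals = items.sum(axis=1)
        return np.array([np.corrcoef(items[:, j], totals)[0, 1]
                         for j in range(items.shape[1])])

    print(f"alpha = {cronbach_alpha(scores):.2f}")        # ~0.75 for this toy data
    print("discrimination:", item_total_correlation(scores).round(2))

Published inventories report these statistics from much larger samples; running them on a small pilot mainly flags items that are badly broken.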

32 Decision Factors for Other Tools
Would these questions be different for another tool?

33 Interpreting Evaluation Data

34 Hypothetical Concept Inventory Data
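The slide’s chart is not reproduced here, so the numbers below are invented class-average pre/post percentages per concept, matching the story in the next two slides (Concept #1 flat, Concept #2 up). The normalized gain <g> = (post - pre) / (100 - pre) is a standard way to report concept-inventory change (Hake, 1998):

    # Hypothetical class-average scores, in percent correct.
    pre  = {"Concept #1": 45.0, "Concept #2": 40.0}
    post = {"Concept #1": 47.0, "Concept #2": 68.0}

    for concept in pre:
        g = (post[concept] - pre[concept]) / (100.0 - pre[concept])
        print(f"{concept}: pre {pre[concept]:.0f}%, post {post[concept]:.0f}%, "
              f"normalized gain {g:.2f}")
    # Concept #1: gain 0.04 (essentially no change); Concept #2: gain 0.47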

35 Exercise: Alternate Explanations for Change
The data suggest that understanding of Concept #2 increased.
One interpretation is that the intervention caused the change.
List some alternative explanations:
- Confounding factors
- Other factors that could explain the change

36 PD’s Response: Alternate Explanations for Change
- Students learned the concept out of class (e.g., in another course, or in study groups with students not in the course)
- Students answered with what the instructor wanted rather than what they believed or “knew”
- An external event (a big test in the previous period, or a “bad-hair day”) distorted pre- or post-test data
- The instrument was unreliable
- Other changes in the course, not the intervention, caused the improvement
- The student groups were not representative

37 Exercise: Alternate Explanations for Lack of Change
The data suggest that understanding of Concept #1 did not increase.
One interpretation is that the intervention did cause a change, but the change was masked by other factors.
List some confounding factors that could have masked a real change.

38 PD’s Response: Alternate Explanations for Lack of Effect
- An external event (a big test in the previous period, or a “bad-hair day”) distorted pre- or post-test data
- The instrument was unreliable
- Implementation of the intervention was poor
- The population was too small (see the sketch below)
- One or both student groups were not representative
- Formats differed on the pre- and post-tests
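One way to probe the “population too small” explanation is a quick significance-and-power check. A sketch with hypothetical matched pre/post scores, using scipy’s paired t-test and statsmodels’ power solver:

    import numpy as np
    from scipy import stats
    from statsmodels.stats.power import TTestPower

    # Hypothetical matched pre/post scores for the same 8 students.
    pre  = np.array([52., 48., 61., 45., 58., 50., 47., 55.])
    post = np.array([54., 47., 63., 46., 59., 49., 50., 56.])

    res = stats.ttest_rel(post, pre)
    diff = post - pre
    d = diff.mean() / diff.std(ddof=1)     # Cohen's d for paired data

    n_needed = TTestPower().solve_power(effect_size=d, power=0.8, alpha=0.05)
    print(f"t = {res.statistic:.2f}, p = {res.pvalue:.3f}, d = {d:.2f}")
    print(f"students needed for 80% power: {np.ceil(n_needed):.0f}")

With these numbers the improvement (d of about 0.7) does not reach p < 0.05 at n = 8, while the power calculation asks for more than twice as many students: a real effect can easily be masked in a small class.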

39 Reflection
What is the most surprising idea you heard in this session so far?

40 Evaluation Plan

41 Exercise: Evaluation Plan
Read the evaluation plan and suggest improvements.

42 Exercise: Evaluation Plan
The results of the project will be evaluated in several ways. First, students will be surveyed at the end of the semester on the content, organization, continuity of the topics, and the level of difficulty. Since the course can be taken as dual enrollment by local high school students, it is expected that the course will attract more students to the college. Due to the urban setting of Grant College, a majority of these students will be working part time, and they will be asked about the appropriateness of the content and its relevancy in their jobs.
Second, the professors teaching the subsequent advanced bioinformatics lecture courses will be asked to judge the students’ ability to apply bioinformatics. While the projects in these upper-level classes focus on different application areas, the problems frequently involve concepts initially learned in the new bioinformatics laboratory. The current advanced bioinformatics instructors have taught the course for several years and are qualified to compare future students’ abilities with those of their previous students.

43 PD’s Responses: Evaluation Plan
- Tie the evaluation to expected outcomes
- Include measures of student learning
- Capture the demographics of the population
- Use an “external” evaluator for objectivity
- Describe processes for both formative and summative evaluation
- Consider including a beta test at one or more other sites
- Include an impact statement

44 References
NSF’s User-Friendly Handbook for Project Evaluation
- http://www.nsf.gov/pubs/2002/nsf02057/start.htm
Field-Tested Learning Assessment Guide (FLAG)
- http://www.flaguide.org
Online Evaluation Resource Library (OERL)
- http://oerl.sri.com/
CCLI Evaluation Planning Webinar
- http://oerl.sri.com/ccli_resources.html
SALG (Student Assessment of Learning Gains)
- http://www.salgsite.org

45 Questions?
