Evaluation of Education Development Projects: CCLI PI Meeting, August 15, 2008

Presentation transcript:

1 Evaluation of Education Development Projects CCLI PI Meeting August 15, 2008

2 Caution The information in these slides represents the opinions of the individual program directors and not an official NSF position.

3 Session Goals
The session will:
- Increase your understanding of evaluation
- Enable you to collaborate more effectively with evaluation experts in preparing and carrying out effective project evaluation plans
It will NOT make you an evaluation expert

4 Perspective
- The ideas here are applicable to your project whether it is in Week 5 or Year 5
- Consider your evaluation in light of what you discover here
- Make adjustments to try to get the best evaluation data possible

5 Definitions
Assessment
- Measurement of something (e.g., students' ability to solve problems)
- Usually does not carry a value judgment
Assessment instrument (tool)
- The method or device for measuring (e.g., a test, survey, or portfolio)
Evaluation (e-VALUE-ation)
- A judgment of something based upon results from one or more assessments
- Frequently compared to an expected outcome

6 Evaluation and Assessment
Evaluation (assessment) is used in many ways
- Individual's performance (grading)
- Program's effectiveness (ABET; regional accreditation)
- Project's progress & accomplishments (monitoring & validating)
This session addresses project evaluation
- May involve evaluating individual and group performance, but in the context of the project
Project evaluation
- Formative – monitoring progress
- Summative – characterizing final accomplishments

7 Evaluation and Project Goals & Outcomes
- Evaluation starts with carefully defined project goals & expected (measurable) outcomes
- Goals & expected outcomes relate to:
  - Project management outcomes
    - Initiating or completing an activity
    - Finishing a "product"
  - Expected student outcomes
    - Modifying a learning outcome
    - Modifying attitudes or perceptions
- Workshop focuses on student outcomes

8 Evaluation and Project Goals/Outcomes/Questions

9 Developing Student Behavior Goals & Outcomes
- Start with one or more overarching statements of project intention
  - Each statement is a GOAL: What is your overall ambition? What do you hope to achieve?
- Convert each goal into one or more specific expected measurable results
  - Each result is an EXPECTED OUTCOME: How will achieving your "intention" be reflected in student behavior?

10 Goals – Objectives – Outcomes – Questions
- Converting goals to expected outcomes may involve intermediate steps
  - Intermediate steps may be called objectives
  - More specific, more measurable than goals
  - Less specific, less measurable than outcomes
- Expected outcomes lead to questions
  - These form the basis of the evaluation
  - The evaluation process collects and interprets data to answer the evaluation questions

11 Exercise: Identification of Goals and Expected Outcomes
- Read the abstract
  - Note: the goal statement has been removed
- Suggest two plausible goals
  - One focused on a change in learning
  - One focused on a change in some other aspect of student behavior
- Use a student focus, not an instructor focus

12 Abstract
The goal of the project is …… The project is developing computer-based instructional modules for statics and mechanics of materials. The project uses 3D rendering and animation software, in which the user manipulates virtual 3D objects in much the same manner as they would physical objects. Tools being developed enable instructors to realistically include external forces and internal reactions on 3D objects as topics are being explained during lectures. Exercises are being developed for students to be able to communicate with peers and instructors through real-time voice and text interactions. The project is being evaluated by … The project is being disseminated through … The broader impacts of the project are …
For a sample in organic chemistry, substitute "organic chemistry" for "statics and mechanics of materials," "interactions" for "external forces and internal reactions," and "molecules" for "objects."
Focus on the student perspective, not the instructor perspective.

13 PD's Response: Goals
Goals may focus on
- Cognitive changes
  - Knowledge-driven
  - Skill-driven
- Affective (or attitudinal) changes
- Success changes
- Diversity changes

14 PD's Response: Goals - Cognitive Change
GOAL: Improve understanding or skills
- In the context of the course
  - Describe verbally the effect of external forces on a solid object
  - Solve textbook problems
- In application beyond the course
  - Solve out-of-context problems
  - Visualize 3-D problems
  - Communicate technical problems orally

15 PD's Response: Goals on Attitudinal Changes
GOAL: Improve
- Interest in the course
- Attitude about
  - Profession
  - Curriculum
  - Department
- Self-confidence
- Intellectual development

16 PD's Response: Goals on Success Changes
GOALS: Improve
- Recruitment rates
- Retention or persistence rates
- Graduation rates

17 PD's Response: Goals on Diversity
"Broaden the participation of underrepresented groups"
GOAL: To change a target group's
- Cognitive abilities (understanding or skills)
- Attitudes
- Success rates

18 Exercise: Transforming Goals into Expected Outcomes
Write one expected measurable outcome for each of the following goals:
1. Increase the students' understanding of the concepts in statics
2. Improve the students' attitude about engineering as a career

19 PD's Response: Expected Outcomes
Conceptual understanding
- Students will be better able to solve conceptual problems that don't require the use of formulas or calculations
- Students will be better able to solve out-of-context problems
Attitudinal
- Students will be more likely to describe engineering as an exciting career
- The percentage of students who transfer out of engineering after the statics course will decrease

20 Exercise: Transforming Expected Outcomes into Questions
Write a question for each of these expected measurable outcomes:
1. Students will be better able to solve conceptual problems that do not require the use of formulas or calculations
2. In informal discussions, students will be more likely to describe engineering as an exciting career

21 PD's Response: Questions
Conceptual understanding
- Did the students' ability to solve conceptual problems increase?
- Did the students' ability to solve conceptual problems increase because of the use of the 3D rendering and animation software?

22 PD's Response: Questions
Attitudinal
- Did the students' discussions indicate more excitement about engineering as a career?
- Did the students' discussions indicate more excitement about engineering as a career because of the use of the 3D rendering and animation software?

23 Reflection What is the most surprising idea you heard in this session?

24 Evaluation Tools

25 FLAG (Field-Tested Learning Assessment Guide)
- A Primer of Assessment & Evaluation
- Classroom Assessment Techniques (CATs)
  - Qualitative
  - Quantitative
- Searchable & downloadable tools
- Resources in assessment

26 FLAG Classroom Assessment Techniques (CATs)
- Attitudinal Survey
- Concept Tests
- Concept Mapping
- Conceptual Diagnostic Tests
- Interviews
- Performance Assessment
- Portfolio
- Scoring Rubrics
- Weekly Reports

27 SALG (Student Assessment of Learning Gains)
- Assesses the perceived degree of "gain" students made in specific aspects of the class
- Spotlights course elements that best support student learning and those needing improvement
- Web-based instrument requiring only minutes to use
- Easily modified by the instructor
- Provides instant statistical analysis of results
- Facilitates formative evaluation throughout the course

28 Concept Inventories (CIs)

29 Introduction to CIs
- A tool that measures conceptual understanding
- Series of multiple-choice questions
  - Questions involve a single concept
  - Formulas, calculations, or problem solving not required
  - Possible answers include "distractors"
    - Common errors
    - Reflect common "misconceptions"
- The Force Concept Inventory (FCI) is the prototype
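To make the distractor idea concrete, here is a minimal sketch (not from the slides) of how responses to a single CI item might be scored so that each distractor choice is tallied against the misconception it is meant to reflect. The item content, answer key, and misconception labels are invented for illustration.

```python
from collections import Counter

# Hypothetical CI item: a single concept, one correct answer, and
# distractors that each encode a common misconception (all invented).
ITEM = {
    "concept": "Newton's third law",
    "correct": "B",
    "misconceptions": {  # distractor letter -> misconception it reflects
        "A": "the larger mass exerts the larger force",
        "C": "only the moving object exerts a force",
        "D": "the forces cancel, so neither object is affected",
    },
}

def score_item(responses):
    """Return the fraction correct and a tally of the misconceptions
    implied by the distractors students chose."""
    n_correct = sum(1 for r in responses if r == ITEM["correct"])
    misconception_counts = Counter(
        ITEM["misconceptions"][r] for r in responses if r in ITEM["misconceptions"]
    )
    return n_correct / len(responses), misconception_counts

# Ten hypothetical student answers to this one item
fraction, counts = score_item(list("BABCBDACBB"))
print(f"fraction correct: {fraction:.2f}")
for misconception, n in counts.most_common():
    print(f"{n} student(s) chose a distractor reflecting: {misconception}")
```

Tallying distractor choices this way is what lets a CI do more than report a score: the pattern of wrong answers points to which misconceptions an intervention should target.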

30 Exercise: Evaluating a CI Tool
- Suppose you were considering an existing CI for use in your project's evaluation
- What questions would you consider in deciding whether the tool is appropriate?

31 PD's Response: Evaluating a CI Tool
Nature of the tool
- Is the tool relevant to what was taught?
- Is the tool competency based?
- Is the tool conceptual or procedural?
Prior validation of the tool
- Has the tool been tested?
- Is there information on reliability and validity?
- Has it been compared to other tools?
- Is it sensitive? Does it discriminate between novices and experts?
Experience of others with the tool
- Has the tool been used by others besides the developer? At other sites? With other populations?
- Is there normative data?
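As one concrete example of a reliability check, the internal consistency of a dichotomously scored inventory is often summarized with KR-20 (a special case of Cronbach's alpha). The sketch below is illustrative only; the response matrix is made up, and published reliability data for an existing tool should be preferred when it is available.

```python
import numpy as np

def kr20(scores):
    """KR-20 internal-consistency estimate for dichotomous (0/1) item
    scores; rows are students, columns are items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                         # number of items
    p = scores.mean(axis=0)                     # proportion correct per item
    q = 1.0 - p
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of students' total scores
    return (k / (k - 1)) * (1.0 - p.dot(q) / total_var)

# Made-up 6-student x 5-item score matrix (1 = correct, 0 = incorrect)
responses = [
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [0, 0, 0, 0, 1],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 0, 0],
    [1, 1, 1, 1, 0],
]
print(f"KR-20 = {kr20(responses):.2f}")  # values closer to 1 indicate higher consistency
```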

32 Decision Factors for Other Tools
- Would these questions be different for another tool?

33 Interpreting Evaluation Data

34 Hypothetical Concept Inventory Data
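The chart on this slide is not reproduced in the transcript. As a rough stand-in for the kind of comparison it illustrated, the sketch below uses invented pre/post percent-correct values to compute the per-concept change and the normalized gain commonly reported for concept inventories.

```python
# Invented pre/post percent-correct values per concept; the slide's actual
# numbers were not captured in the transcript.
pre_post = {
    "Concept 1": (45, 48),
    "Concept 2": (40, 72),
    "Concept 3": (55, 60),
}

for concept, (pre, post) in pre_post.items():
    change = post - pre
    # Normalized (Hake) gain: fraction of the possible improvement achieved
    gain = change / (100 - pre)
    print(f"{concept}: pre {pre}%, post {post}%, change {change:+d} pts, "
          f"normalized gain {gain:.2f}")
```

A jump like the one shown here for Concept 2 is exactly what the next two exercises ask you to interrogate: the change is present in the data, but the intervention is only one of several possible explanations.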

35 Exercise: Alternate Explanations for Change
- The data suggest that the understanding of Concept #2 increased
- One interpretation is that the intervention caused the change
- List some alternative explanations
  - Confounding factors
  - Other factors that could explain the change

36 PD's Response: Alternate Explanations for Change
- Students learned the concept out of class (e.g., in another course or in study groups with students not in the course)
- Students answered with what the instructor wanted rather than what they believed or "knew"
- An external event (a big test in the previous period or a "bad-hair day") distorted pre- or post-test data
- The instrument was unreliable
- Other changes in the course, and not the intervention, caused the improvement
- The student groups were not representative

37 Exercise: Alternate Explanations for Lack of Change
- The data suggest that the understanding of Concept #1 did not increase
- One interpretation is that the intervention did cause a change but the change was masked by other factors
- List some confounding factors that could have masked a real change

38 PD's Response: Alternate Explanations for Lack of Effect
- An external event (a big test in the previous period or a "bad-hair day") distorted pre- or post-test data
- The instrument was unreliable
- Implementation of the intervention was poor
- The population was too small
- One or both student groups were not representative
- Formats were different on the pre- and post-tests

39 Reflection
- What is the most surprising idea you heard in this session so far?

40 Evaluation Plan

41 Exercise: Evaluation Plan
- Read the evaluation plan and suggest improvements

42 Exercise - Evaluation Plan
The results of the project will be evaluated in several ways. First, students will be surveyed at the end of the semester on the content, organization, continuity of the topics, and the level of difficulty. Since the course can be taken as dual enrollment by local high school students, it is expected that the course will attract more students to the college. Due to the urban setting of Grant College, a majority of these students will be working part time, and they will be asked about the appropriateness of the content and its relevancy to their jobs.
Second, the professors teaching the subsequent advanced bioinformatics lecture courses will be asked to judge the students' ability to apply bioinformatics. While the projects in these upper-level classes focus on different application areas, the problems frequently involve concepts initially learned in the new bioinformatics laboratory. The current advanced bioinformatics instructors have taught the course for several years and are qualified to compare the future students' abilities with those of their previous students.

43 PD's Responses: Evaluation Plan
- Tie evaluation to expected outcomes
- Include measures of student learning
- Include capturing the demographics of the population
- Use an "external" evaluator for objectivity
- Describe processes for formative evaluation and summative evaluation
- Consider including a beta test at one or more other sites
- Include an impact statement

44 References
- NSF's User-Friendly Handbook for Project Evaluation
- Field-Tested Learning Assessment Guide (FLAG)
- Online Evaluation Resource Library (OERL)
- CCLI Evaluation Planning Webinar
- SALG (Student Assessment of Learning Gains)

45 Questions?
