Project Evaluation
Barb Anderegg, Connie Della-Piana, Russ Pimmel
Division of Undergraduate Education, National Science Foundation
FIE Annual Conference, October 19, 2005

Workshop Goals and Outcomes
Goal: The workshop will enable you to collaborate with evaluation experts in preparing effective project evaluation plans

Workshop Goals and Outcomes
Outcomes: Participants should be able to:
–Describe several types of evaluation tools
–Discuss their advantages and limitations
–Define questions about the appropriateness of a tool in a given situation
–Outline a first draft of an evaluation plan
–Work with an evaluator to develop a complete evaluation plan

Framework for the Workshop
Learning situations involve prior knowledge
–Some knowledge correct
–Some knowledge incorrect (i.e., misconceptions)
Learning is
–Connecting new knowledge to prior knowledge
–Correcting misconceptions
Learning requires
–Recalling prior knowledge – actively
–Altering prior knowledge

Active-Cooperative Learning
Learning activities must encourage learners to:
–Recall prior knowledge – actively, explicitly
–Connect new concepts to existing ones
–Challenge and alter misconceptions
The think-share-report-learn (TSRL) process addresses these steps

Workshop Format
“Working” workshop
–Short presentations (mini-lectures)
–Group exercises
Exercise format
–Think → Share → Report → Learn (TSRL)
Limited time – expect to feel rushed
–Intend to identify issues and suggest ideas
–Get you started
–No closure – no “answers” – no “formulas”

Workshop Outline
The workshop will address the following topics:
–Project goals, objectives, and outcomes
–Tools for evaluating learning outcomes
–Tools for evaluating non-learning behavioral outcomes
–Project evaluation plans
–Working with an evaluator

Evaluation and Assessment
Evaluation and assessment have many meanings
–Evaluating an individual's performance (i.e., grading)
–Evaluating a program's effectiveness (i.e., ABET assessment)
–Evaluating a project's progress and implementation (i.e., formative evaluation as part of project management)
–Evaluating a project's success (i.e., summative evaluation)
This workshop is concerned with project evaluation
–May involve evaluating individual and group performance, but in the context of the project

Project Goals, Objectives, and Outcomes

Goals, Objectives, and Outcomes
Goal – Broad, overarching statement of intention or ambition
–A goal typically leads to several objectives
Objective – Specific statement of intention
–Measurable
–More focused and specific than a goal
–An objective may lead to one or more outcomes
Outcome – Statement of expected result
–Measurable, with criteria for success

Project Goals, Objectives, and Outcomes
Evaluation starts with carefully defined project goals, objectives, and outcomes
Goals, objectives, and outcomes take many forms
–Some related to project management: completing products (e.g., laboratory write-ups), reaching milestones in a process (e.g., presenting a paper at a conference), implementing activities
–Some related to student behavior: achieving a learning outcome, modifying an attitude or a perception

Identification of Goals and Objectives: Exercise (1)
Read the abstract and come up with several goals for the project. Include both learning and non-learning goals.

Abstract
The goal of the project is …… The project is developing computer-based instructional modules for statics and mechanics of materials. The project uses 3D rendering and animation software, in which the user manipulates virtual 3D objects in much the same manner as they would physical objects. Tools being developed enable instructors to realistically include external forces and internal reactions on 3D objects as topics are being explained during lectures. Exercises are being developed for students to be able to communicate with peers and instructors through real-time voice and text interactions. Educational research includes detailed exploration of learning and instructional design variables to provide insights into basic cognitive and educational issues that underlie learning using innovative software for a diverse population of students. The material is being beta tested at multiple institutions, including community colleges. The project is being evaluated by …….. The project is being disseminated through ……….

PD's Response
Increase student understanding of statics and strength of materials through animated, web-based 3D objects
Increase student ability to visualize external forces and internal reactions in web-based 3D objects
Improve student communication through real-time voice and text interactions
Broaden participation of underrepresented groups
Understand how the use of learner-centered instruction affects students with different learning styles

Identification of Goals and Objectives: Exercise (2)
Write several objectives for this goal: “Broaden participation of underrepresented groups”

PD's Response
Create lab exercises with clear social context to possibly increase female students' interest
Provide better retention of minority students
–Tutoring, mentoring, etc.
Increase African American students' participation in laboratory groups

Tools for Evaluating Learning Outcomes

Surveys (forced choice or open-ended responses)
Interviews and focus groups
Conversational analysis
Observations
Ethnographies
Meta-analysis
Olds et al., JEE 94:13, 2005

Comparison of Two Tools
Survey
–Efficient
–Accuracy depends on subjects' honesty
–Difficult to develop a reliable and valid survey
–Low response rate threatens reliability and validity
Observations
–Useful for observable behavior
–Useful for capturing behavior that subjects are unlikely to report
–Interrater reliability must be established
–Time and labor intensive
Olds et al., JEE 94:13, 2005
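The reliability issues named on this slide (a reliable and valid survey, establishing interrater reliability) can be checked quantitatively. The following minimal sketch is not from the original slides and uses made-up data: it computes Cronbach's alpha for the internal consistency of a set of survey items and Cohen's kappa for agreement between two observers coding the same events.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Internal-consistency reliability for an (n_respondents x k_items) score matrix."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1)          # variance of each survey item
    total_var = x.sum(axis=1).var(ddof=1)      # variance of respondents' total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

def cohen_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters coding the same observations."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    p_obs = np.mean(a == b)                                                    # observed agreement
    p_exp = sum(np.mean(a == c) * np.mean(b == c) for c in np.union1d(a, b))  # chance agreement
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical data: 5 respondents x 4 Likert items, and 2 raters coding 8 observations.
survey = [[4, 5, 4, 5], [3, 4, 3, 3], [5, 5, 4, 5], [2, 3, 2, 2], [4, 4, 5, 4]]
rater1 = ["on-task", "off-task", "on-task", "on-task", "off-task", "on-task", "on-task", "off-task"]
rater2 = ["on-task", "off-task", "on-task", "off-task", "off-task", "on-task", "on-task", "off-task"]

print("Cronbach's alpha:", round(float(cronbach_alpha(survey)), 2))
print("Cohen's kappa:   ", round(float(cohen_kappa(rater1, rater2)), 2))
```

Values near 1 indicate strong internal consistency or agreement; a kappa near 0 means the raters agree no more often than chance would predict.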

Concept inventories (CIs)

Introduction to CIs
Force Concept Inventory (FCI) (The Physics Teacher 30:141, 1992)
Diagnostic test
Measures students' conceptual understanding
–Strongly influenced by their beliefs, intuition, and “common sense”
–Many misconceptions come from prior knowledge and are difficult to change

[Example question from the Thermodynamics Concept Inventory: Midkiff, Litzinger, and Evans]

Introduction to CIs
Series of multiple-choice questions
–Items involve a single concept
–Do not require use of formulas, calculations, or solving problems
–Answers include “distractors” reflecting common misconceptions
Use of the FCI has changed how physics is taught (Mazur, Optics and Photonics News 3:38, 1992)

Other Concept Inventories
Concept inventories have been developed for:
–Chemistry
–Statistics
–Strength of materials
–Thermodynamics
–Heat transfer
–Fluid mechanics
–Circuits
–Signals and systems
–Electromagnetic waves
Richardson, in Invention and Impact, AAAS, 2004

Tools for Evaluating CIs: Exercise (3)
What questions would you use to decide if a specific CI was appropriate for your evaluation?

PD's Response
Is it competency based?
Is the tool relevant to what was taught? Is it conceptual or procedural?
What is the tool's development stage? Has the tool been tested? Is it reliable? Validated?
Has it been compared to other measures? Other tools?
Is it sensitive? Does it discriminate between novice and expert, or between pre and post?
Has it been used by others besides the developer? At other sites? With other populations?
Is there normative data?

Concept Inventories: Exercise (4)
Consider the hypothetical CI data.
–What observations can you make from this data?
–What inferences would you draw from those observations?

Hypothetical CI Data
[Response matrix table not recoverable from the transcript]
Pre and post scores for 50 students (S1 to S50) on 20 questions (Q1 to Q20). All 50 students answered every question. A blank indicates an incorrect answer and a “c” indicates a correct answer. “T” indicates the total number of correct answers.

Hypothetical CI Data
[Same response matrix as the previous slide]
Use the concept inventory method to interpret question 3. The data suggests:
a. This was an easy concept
b. This was a difficult concept
c. The intervention had an effect
d. The intervention did not have an effect
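The slide's table did not survive the transcript, so as an illustration only (not part of the original workshop materials), the sketch below shows one common way to summarize such a pre/post matrix: the per-question fraction of correct answers before and after instruction, plus Hake's normalized gain as a class-level summary. The matrices here are randomly generated placeholders standing in for the real student responses.

```python
import numpy as np

# Placeholder matrices standing in for the slide's table: rows = 50 students,
# columns = 20 CI questions; 1 = correct ("c" in the slide), 0 = incorrect (blank).
rng = np.random.default_rng(0)
pre = rng.integers(0, 2, size=(50, 20))
post = rng.integers(0, 2, size=(50, 20))

# Per-question difficulty: fraction of students answering correctly, pre vs. post.
pre_frac, post_frac = pre.mean(axis=0), post.mean(axis=0)
for q, (p0, p1) in enumerate(zip(pre_frac, post_frac), start=1):
    print(f"Q{q:2d}: pre {p0:.0%} correct, post {p1:.0%} correct, change {p1 - p0:+.0%}")

# Hake's normalized gain per student: (post% - pre%) / (100% - pre%).
# A conventional CI summary (not necessarily the presenters' method); it is
# undefined for students with a perfect pretest score, hence the mask.
pre_pct, post_pct = 100 * pre.mean(axis=1), 100 * post.mean(axis=1)
ok = pre_pct < 100
gain = (post_pct[ok] - pre_pct[ok]) / (100 - pre_pct[ok])
print("class mean normalized gain:", round(float(gain.mean()), 2))
```

Read this way, a question with high pre and post fractions corresponds to the “easy concept” option in the exercise, while a low pre fraction followed by a high post fraction is what an effective intervention would produce.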

Tools for Evaluating Learning: Exercise (5)
These observations could be attributed to changes in learning or some other factor. What are some alternative explanations?

PD's Response
Students learned the concept out of class, such as in dorm study sessions.
The students answered what the instructor wanted rather than what they knew.
This may not be a representative group of students.
The students had a bad hair day or didn't care about performance.
The instrument was not sensitive.
Implementation of the intervention was poor.

Tools for Evaluating Non-learning Outcomes

Improved Non-Learning Outcomes
Attitudes toward learning
Perceptions about the profession, the department, working in teams
Motivation for learning
Self-efficacy, confidence
Learning styles
Intellectual development
Ethical behavior

Tools for Evaluating Non-Learning Outcomes
Perception of engineering
Beliefs about own abilities
Communication capabilities
Ability to engage in design activities
Adaptive expertise
Intellectual development
Turns et al., JEE 94:27, 2005

Evaluating Student Characteristics
Learning styles
–Students preferentially focus on different types of information and tend to operate on perceived information in different ways
–Tools: Myers-Briggs Type Indicator (MBTI), Learning Style Indicator (LSI), Index of Learning Styles (ILS)
Felder et al., JEE 94:57, 2005

Tools for Evaluating Student Characteristics
Approaches to learning
–Students use surface (memorize), deep (focus on understanding), or strategic (multiple) approaches
–Tool: Lancaster Approach to Studying Questionnaire (LASQ)
Felder et al., JEE 94:57, 2005

Tools for Evaluating Student Characteristics
Levels of intellectual development
–Students see knowledge, beliefs, and authorities in different ways (e.g., certain or contextual)
–Tools: Measure of Intellectual Development (MID), Measure of Epistemological Reflection (MER), Learning Environment Preferences (LEP)
Felder et al., JEE 94:57, 2005

Assessment of Attitude - Example
Besterfield-Sacre et al. evaluated students' attitudes
Developed a survey – select one of five predefined answers
–Questions about their perception of their abilities
Confidence in their skills in chemistry, communications, engineering, etc.
–Questions about their perception of engineering as a profession
Impressions about engineering as a precise science, as a lucrative profession, etc.
JEE 86:37, 1997

Assessment of Attitude - Example
Survey validated in pilot studies using alternate approaches:
–Item analysis
–Verbal protocol elicitation
–Factor analysis
Used the survey tool to compare students who stayed in engineering to those who left

Evaluation Plan

Evaluation Plan: Exercise (6)
Every project (proposal) needs an evaluation plan. How would you go about developing an evaluation plan?

Getting a Start on Project Evaluation:
What is working?
For whom is it working?
Why is it working?
Under what conditions is it working?

Getting a Start on Project Evaluation
The project: goals, objectives, activities, expectations, measures/methods
[Blank planning grid: columns Goals | Objectives | Activities | Outputs/Outcomes | Measures; rows (a), (b), (c)]
What do I want to know about my project?
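For illustration only (this example is not from the slides, and its specifics are hypothetical), one completed row of such a grid for the statics-modules project described in the earlier abstract might read:
–Goal: Increase student understanding of statics and strength of materials through animated, web-based 3D objects
–Objective: After using the modules, students can identify external forces and internal reactions on 3D objects
–Activity: In-lecture demonstrations and homework exercises built on the modules
–Output/Outcome: Improved post-test performance on force- and reaction-related concept questions
–Measure/Method: A pre/post concept inventory administered in the beta-test sections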

Finding an Evaluator
Departments of education, educational psychology, psychology, administration, sociology, science education, engineering education
Campus teaching and learning center
Colleagues and researchers
Professional organizations
Independent consultants
NSF workshops/outreach
NSF projects and project PIs

Consult other sources
–User Friendly Handbook for Project Evaluation
–Existing tools
–Science education literature

What are our purposes for the evaluation?
What are the benefits for each party?
Who is responsible for what?
What are the resources?
What is known about the project and evaluation?
What can we add to the knowledge base?
How can we work together?
How do we end the relationship?

Summary
–Become knowledgeable
–Draw on your experience; talk to colleagues
–Stress usefulness
–State the purpose
–Develop a map of the project
–Find an evaluator
–Develop questions
–Document activities
–Focus on implementation and outcomes
–List expected benefits
–Consider possible unanticipated benefits
–Consider possible unintended negative consequences
–Develop a team orientation
–Assess the relationship

… What evaluation can do is provide reasonably reliable, reasonably valid information about the merits and results of a particular program or project operating in particular circumstances …

Presentation available at: