Connie Della-Piana, Susan Finger, and Sue Fitzgerald Division of Undergraduate Education National Science Foundation CCLI/TUES PI Meeting January 27, 2011.

The following information represents the opinions of the individual program officers and is not an official NSF position.

 What are the 2-3 most important pieces of advice for a colleague on dealing with evaluation in an engineering-education-focused proposal (i.e., a TUES proposal)?
 ◦ Write down the advice/ideas you would offer
 ◦ No discussion
 ◦ Put them aside and save for later

 Effective learning activities:
 ◦ Recall prior knowledge -- actively, explicitly
 ◦ Connect new concepts to existing ones
 ◦ Challenge and alter misconceptions
 ◦ Reflect on new knowledge
 Active & collaborative processes:
 ◦ Think individually
 ◦ Share with a neighbor
 ◦ Report to group
 ◦ Learn from Program Directors’ responses

The session will enable you to collaborate with evaluation experts in preparing effective project evaluation plans. It will not make you an evaluation expert.

After the session, participants should be able to:
 Discuss the importance of goals, outcomes, and questions in the evaluation process
 ◦ Cognitive and affective outcomes
 Be aware of several types of evaluation tools
 ◦ Advantages, limitations, and appropriateness
 Discuss data interpretation issues
 ◦ Variability, alternate explanations
 Develop an evaluation plan with an evaluator
 ◦ Outline a first draft of an evaluation plan

 Evaluation and assessment have many meanings… one definition:
 ◦ Assessment -- gathering evidence
 ◦ Evaluation -- interpreting data and making value judgments
 Examples of assessment and evaluation:
 ◦ Individual’s performance (grading)
 ◦ Program’s effectiveness (ABET and regional accreditation)
 ◦ Project’s progress and success (monitoring and validating)

 This session addresses project evaluation
 ◦ May involve evaluating individual and group performance, but in the context of the project
 Project evaluation
 ◦ Formative -- monitoring progress to improve the approach
 ◦ Summative -- characterizing final accomplishments

 Effective evaluation starts with carefully defined project goals and expected outcomes
 Goals and expected outcomes related to:
 ◦ Project management -- initiating or completing an activity, finishing a “product”
 ◦ Student behavior -- modifying a learning outcome, modifying an attitude or a perception

 Goals provide overarching statements of project intention
 ◦ What is your overall ambition? What do you hope to achieve?
 Expected outcomes identify specific observable results for each goal
 ◦ How will achieving your “intention” reflect changes in student behavior? How will it change their learning and their attitudes?

 Goals → Expected outcomes → Evaluation questions  Questions form the basis of the evaluation process  Evaluation process collects and interprets data to answer evaluation questions

 Read the following abstract... then write expected measurable outcomes for these two goals:
 1. Improve the students’ understanding of the fundamentals in course material (cognitive)
 2. Improve the students’ self-confidence in solving engineering problems (affective)
Individually identify several guidelines
Share these with a neighbor or two
Report to the group

The goal of the project is…
The project is developing computer-based instructional modules for statics and mechanics of materials. The project uses 3D rendering and animation software in which the user manipulates virtual 3D objects in much the same manner as they would physical objects. Tools being developed enable instructors to realistically include external forces and internal reactions on 3D objects as topics are explained during lectures. Exercises are being developed that let students communicate with peers and instructors through real-time voice and text interactions.
The project is being evaluated by…
The project is being disseminated through…
The broader impacts of the project are…
Goals:
1. Improve the students’ understanding of the fundamentals in course material
2. Improve the students’ self-confidence in solving engineering problems

(Outcomes)

Understanding of the fundamentals -- Students will be better able to:
 Describe all parameters, variables, and elemental relationships
 Describe the governing laws
 Describe the effects of changing some feature in a simple problem
 ◦ Changes in the frictional force on a block when the angle of an inclined plane changes (a worked version follows below)
 ◦ Changes in the forces in a simple three-element truss when the connecting angles change
Self-Confidence
 Students will do more of the homework
 Students will have less test anxiety
 Students will express more confidence in their solutions
 Students will be more willing to discuss their solutions
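As an aside, not part of the original slide, the inclined-plane item can be made concrete with the standard statics relationships. The equations below are a worked illustration for a block of mass m held in equilibrium on a plane inclined at angle θ:

```latex
% Worked illustration (not from the slides): friction on an inclined plane.
% Block of mass m in static equilibrium on an incline at angle \theta.
\begin{align}
  N &= mg\cos\theta                    && \text{normal force, decreases as } \theta \text{ grows} \\
  f &= mg\sin\theta                    && \text{friction required for equilibrium, increases with } \theta \\
  f &\le \mu_s N = \mu_s mg\cos\theta  && \text{static limit; the block slips once } \tan\theta > \mu_s
\end{align}
```

A student who "understands the fundamentals" in the sense of this outcome should be able to explain the trend in words, not just compute it.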

 Write an evaluation question for each of these expected measurable outcomes:
Understanding of the fundamentals
 ◦ Students will be better able to describe the effects of changing some feature in a simple problem
Self-Confidence
 ◦ Students will express more confidence in their solutions
Individually identify a question for each outcome
Report to the group

(Evaluation Questions)

Understanding of the fundamentals
 ◦ Are students better able to describe the effects of changing some feature in a simple problem?
 ◦ Are students better able to describe the effects of changing some feature in a simple problem as a result of the intervention?
Self-Confidence
 ◦ Do students express more confidence in their solutions?
 ◦ Do students express more confidence in their solutions as a result of the intervention?
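The slides do not say how such questions would be answered. As one illustrative possibility, an assumption rather than the presenters' method, the self-confidence question could be examined by comparing each student's pre- and post-intervention confidence ratings with a paired nonparametric test. A minimal sketch, assuming Likert-scale ratings have already been collected:

```python
# Illustrative sketch only -- the presentation does not prescribe this analysis.
# Assumes each student rated their confidence (1-5) before and after the intervention.
from scipy import stats

pre_confidence  = [2, 3, 3, 2, 4, 3, 2, 3, 4, 2]   # hypothetical pre-intervention ratings
post_confidence = [3, 4, 3, 3, 4, 4, 3, 4, 5, 3]   # hypothetical post-intervention ratings

# Wilcoxon signed-rank test: a common choice for paired ordinal (Likert) data.
statistic, p_value = stats.wilcoxon(pre_confidence, post_confidence)
print(f"Wilcoxon statistic = {statistic:.1f}, p = {p_value:.3f}")
```

Even a small p-value here would only show a pre/post difference, not that the intervention caused it, which is exactly the interpretation issue discussed later in the session.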

 Quantitative methods What happened?  Qualitative methods Why?  Mixed methods

 Surveys -- forced-choice or open-ended responses
 Observations -- actually monitor and evaluate behavior
 Interviews -- structured (fixed questions) or in-depth (free flowing)
 Concept inventories -- multiple-choice questions to measure conceptual understanding
 Rubrics for analyzing student products -- guides for scoring student reports, tests, etc.
 Focus groups -- similar to interviews but with group interaction
Olds et al., JEE 94:13, 2005; NSF’s Evaluation Handbook

Surveys
 Efficient
 Accuracy depends on subjects’ honesty
 Difficult to develop a reliable and valid survey
 Low response rate threatens reliability, validity, & interpretation
Observations
 Time & labor intensive
 Inter-rater reliability must be established (see the sketch below)
 Capture behavior that subjects are unlikely to report
 Useful for observable behavior
Olds et al., JEE 94:13, 2005; NSF’s Evaluation Handbook
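The slide calls for establishing inter-rater reliability for observations but does not name a statistic; Cohen's kappa is one commonly used measure. The sketch below is a hypothetical illustration, assuming two observers have independently coded the same ten classroom episodes:

```python
# Hypothetical illustration: agreement between two observers' codes.
# Cohen's kappa is one common inter-rater reliability statistic (not named in the slides).
from sklearn.metrics import cohen_kappa_score

rater_a = ["lecture", "group", "group", "lecture", "question",
           "group", "lecture", "question", "group", "lecture"]
rater_b = ["lecture", "group", "lecture", "lecture", "question",
           "group", "lecture", "group", "group", "lecture"]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.2f}")  # values near 1 indicate strong agreement
```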

 Use interviews to answer these questions:
 ◦ What does the program look and feel like?
 ◦ What do stakeholders know about the project?
 ◦ What are stakeholders’ and participants’ expectations?
 ◦ What features are most salient?
 ◦ What changes do participants perceive in themselves?
The 2002 User Friendly Handbook for Project Evaluation, NSF publication REC

 Originated in physics -- the Force Concept Inventory (FCI)
 Several are being developed in engineering fields
 Series of multiple-choice questions
 ◦ Questions involve a single concept; formulas, calculations, or problem-solving skills are not required
 ◦ Possible answers include distractors based on common errors and misconceptions
 Developing a CI is involved:
 ◦ Identify misconceptions and distractors
 ◦ Develop, test, and refine questions
 ◦ Establish validity and reliability of the tool (see the sketch below)
 ◦ Language is a major issue
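The slides list "establish validity and reliability of the tool" as a development step without specifying a procedure. One common internal-consistency check, offered here purely as an illustration and not prescribed by the presentation, is Cronbach's alpha computed from a binary-scored response matrix:

```python
# Illustrative sketch (not prescribed by the slides): Cronbach's alpha
# for a concept inventory scored 1 = correct, 0 = incorrect.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Rows are students, columns are items."""
    n_items = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1)          # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)     # variance of students' total scores
    return (n_items / (n_items - 1)) * (1 - item_var.sum() / total_var)

# Hypothetical data: 6 students x 4 items.
scores = np.array([[1, 1, 0, 1],
                   [1, 0, 0, 1],
                   [0, 0, 0, 0],
                   [1, 1, 1, 1],
                   [1, 1, 0, 0],
                   [0, 1, 0, 1]])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```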

 Pittsburgh Freshman Engineering Survey
 ◦ Questions about perception
   Confidence in their skills in chemistry, communications, engineering, etc.
   Impressions about engineering as a precise science, as a lucrative profession, etc.
 Validated using alternate approaches:
 ◦ Item analysis (see the sketch below)
 ◦ Verbal protocol elicitation
 ◦ Factor analysis
 Compared students who stayed in engineering to those who left
Besterfield-Sacre et al., JEE 86:37, 1997
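Item analysis is listed as one of the validation approaches. The sketch below shows what a classical item analysis might look like (item difficulty plus point-biserial discrimination against the rest of the instrument); it is a generic illustration with invented data, not the procedure actually used by Besterfield-Sacre et al.:

```python
# Generic illustration with invented data -- not the published validation procedure.
import numpy as np
from scipy import stats

# Rows = students, columns = items, 1 = correct/agree.
responses = np.array([[1, 1, 0, 1],
                      [1, 0, 0, 1],
                      [0, 0, 0, 0],
                      [1, 1, 1, 1],
                      [1, 1, 0, 0],
                      [0, 1, 0, 1]])

totals = responses.sum(axis=1)
for item in range(responses.shape[1]):
    difficulty = responses[:, item].mean()        # proportion answering correctly
    rest_score = totals - responses[:, item]      # total score excluding this item
    discrimination, _ = stats.pointbiserialr(responses[:, item], rest_score)
    print(f"Item {item + 1}: difficulty = {difficulty:.2f}, "
          f"discrimination = {discrimination:.2f}")
```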

 Levels of Intellectual Development
 ◦ Students see knowledge, beliefs, and authority in different ways
 ◦ “Knowledge is absolute” versus “Knowledge is contextual”
 Tools
 ◦ Measure of Intellectual Development (MID)
 ◦ Measure of Epistemological Reflection (MER)
 ◦ Learning Environment Preferences (LEP)
Felder et al., JEE 94:57, 2005

 Suppose you were considering an existing tool (e.g., a concept inventory) for use in your project’s evaluation  What questions would you consider in deciding if the tool is appropriate? Individually identify several questions Share these with a neighbor or two Report to the group

(Outcomes)

 Nature of the tool Is the tool relevant to what was taught? Is the tool competency based? Is the tool conceptual or procedural?  Prior validation of the tool Has the tool been tested? Is it sensitive? Is there information on reliability and validity? Has it been compared to other tools? Does it discriminate between a novice and an expert?  Experience of others with the tool Has the tool been used by others besides the developer? At other sites? With other populations? Is there normative data?

Question/   Number of Students                        Percent with Correct Answer
Concept     Non-Intervention     Intervention         Non-Intervention     Intervention
            Group                Group                Group                Group
…           …                    …                    …                    …
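The counts in the table above are elided. The sketch below shows, with invented numbers, one common way to compare the two groups' correct/incorrect counts on a single question (a chi-square test of independence); the presentation does not prescribe this or any other test.

```python
# Invented counts -- the presentation's data are not reproduced here.
from scipy.stats import chi2_contingency

# Rows: non-intervention group, intervention group
# Columns: number answering correctly, number answering incorrectly
table = [[45, 55],    # non-intervention: 45 of 100 correct
         [60, 40]]    # intervention: 60 of 100 correct

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
# A small p-value indicates the groups differ, but it does not rule out the
# confounding factors discussed on the next slides.
```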

 The data suggest that the understanding of Concept #2 increased  One interpretation is that the intervention caused the change  List some alternative explanations ◦ Confounding factors ◦ Other factors that could explain the change Individually identify several explanations Share these with a neighbor or two Report to the group

(Change in Data Interpretation)

 Students learned concept out of class (e.g., in another course or in study groups with students not in the course)  Students answered with what the instructor wanted rather than what they believed or “knew”  An external event distorted pretest data  Instrument was unreliable  Other changes in course and not the intervention caused improvement  Characteristics of groups were not similar  An external event influenced responses on posttest

 Data suggest that the understanding of the concept tested by Q1 did not improve  One interpretation is that the intervention did cause a change that was masked by other factors  Think about alternative explanations  How would these alternative explanations (confounding factors) differ from the previous list? Individually identify several explanations Share these with a neighbor or two Report to the group

(Lack of Change Interpretation)

 Sloppy administration of the instrument  Non-random drop-out between pre- and post-test  Faulty analysis  Lack of motivation of respondents  Item not well constructed  Mixed findings in the relevant literature

 List the topics that need to be addressed in the evaluation plan Individually identify all the topics Share these with a neighbor or two Report to the group

(Evaluation Plan Topics)

 Name & qualifications of the evaluation expert  Get the evaluator involved early in the proposal development phase  Explicit relationship between intervention and outcomes  Goals, outcomes, and evaluation questions  Instruments for evaluating each outcome  Protocols defining when and how data will be collected  Analysis & interpretation procedures  Confounding factors & approaches to minimize impact  Formative evaluation techniques for monitoring and improving the project as it evolves  Summative evaluation techniques for characterizing the accomplishments of the completed project

 NSF’s User Friendly Handbook for Project Evaluation  Online Evaluation Resource Library (OERL)  Field-Tested Learning Assessment Guide (FLAG)  Student Assessment of Their Learning Gains (SALG)  Evaluating & Improving Undergraduate Teaching in STEM National Research Council (2003)

 What are the three most important pieces of advice for a colleague on dealing with evaluation in an engineering-education-focused proposal (i.e., a TUES proposal)?
 ◦ Write down your advice/ideas
 ◦ No discussion

Review your reflective statements
 How have they changed? What have you learned?
Individually review your statements
Share these with a neighbor or two
Report to the group

(Final Reflection)