Project Evaluation
Don Millard and John Yu (March 27, 2012); Guy-Alain Amoussou and Lou Everett

Presentation transcript:

Project Evaluation
Don Millard and John Yu (March 27, 2012)
Guy-Alain Amoussou and Lou Everett (March 28, 2012)
Handout

- Most of the information presented in this workshop represents the presenters' opinions and not an official NSF position.
- Local facilitators will provide the link to the workshop slides at the completion of the webinar.
- Participants may ask questions by "raising their virtual hand" during a question session. We will call on selected sites and enable their microphones so that the questions can be asked.
- Responses will be collected from a few sites at the end of each exercise. At the start of the exercise, we will identify these sites in the Chat Box and then call on them one at a time to provide their responses.

Learning must build on prior knowledge:
- Some prior knowledge is correct
- Some is incorrect (misconceptions)

Learning is:
- Connecting new knowledge to prior knowledge
- Correcting misconceptions

Learning requires engagement:
- Actively recalling prior knowledge
- Sharing new knowledge
- Forming a new understanding

Effective learning activities:
- Recall prior knowledge actively and explicitly
- Connect new concepts to existing ones
- Challenge and alter misconceptions

Active and collaborative processes:
- Think individually
- Share with a partner
- Report to local and virtual groups
- Learn from program directors' responses

Local facilitators will:
- Coordinate the local activities
- Watch the time
  - Allow for the think, share, and report phases
  - Reconvene on time; a one-minute warning will be given
  - At the one-minute warning, check the Chat Box to see if your site will be asked for a response
- Ensure the individual think phase is devoted to thinking, not talking
- Coordinate questions from local participants and report local responses to the exercises

The session will enable you to collaborate more effectively with evaluation experts in preparing credible and comprehensive project evaluation plans; it will not make you an evaluation expert.

After the session, participants should be able to:
- Discuss the importance of goals, outcomes, and questions in the evaluation process
  - Cognitive and affective outcomes
- Describe several types of evaluation tools
  - Advantages, limitations, and appropriateness
- Discuss data interpretation issues
  - Variability, alternative explanations
- Develop an evaluation plan in collaboration with an evaluator
  - Outline a first draft of an evaluation plan

The terms evaluation and assessment have many meanings. One definition:
- Assessment is gathering evidence
- Evaluation is interpreting data and making value judgments

Examples of evaluation and assessment:
- An individual's performance (grading)
- A program's effectiveness (ABET and regional accreditation)
- A project's progress and success (monitoring and validating)

This session addresses project evaluation, which may involve evaluating individual and group performance, but in the context of the project.

Project evaluation:
- Formative: monitoring progress to improve the approach
- Summative: characterizing and documenting final accomplishments
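The assessment/evaluation distinction above can be made concrete with a small sketch. This example is not from the workshop; the function names, scores, and thresholds are purely illustrative. Assessment gathers the evidence (quiz scores, pass rates); evaluation interprets that evidence and makes a value judgment against a target.

```python
# Illustrative sketch (hypothetical names and thresholds):
# assessment gathers evidence; evaluation interprets it and judges.

def assess(quiz_scores, passing_score=70):
    """Assessment: gather evidence -- here, the fraction of students
    who reached a passing score on an in-class quiz."""
    passed = [s for s in quiz_scores if s >= passing_score]
    return len(passed) / len(quiz_scores)

def evaluate(pass_rate, target=0.70):
    """Evaluation: interpret the evidence and make a value judgment --
    did the class achieve the learning outcome?"""
    return "outcome achieved" if pass_rate >= target else "outcome not achieved"

scores = [55, 80, 90, 65, 75, 85, 40, 95]
rate = assess(scores)      # 5 of 8 students passed -> 0.625
print(evaluate(rate))      # prints "outcome not achieved"
```

Used formatively, a result like this would prompt a mid-course change in approach; used summatively, it would document the final level of accomplishment.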

Think about your favorite course. What types of in-class activities could be called:
- Assessment versus evaluation?
- Formative versus summative evaluation?

Exercise (~6 min total):
- Think individually: ~2 min
- Share with a partner: ~2 min
- Report in the local group: ~2 min

Watch the time and reconvene after 6 min. Use the THINK time to think, not to discuss. Selected local facilitators will report to the virtual group. At the one-minute warning, check the Chat Box to see if your site will be asked for a response.