• Understanding of the fundamentals
◦ By the end of the class, 70% of the students will be able to:
  ▪ Correctly draw free-body diagrams of 2D truss structures
  ▪ Correctly write Newton's laws when given a FBD
  ▪ Describe the effects on member forces when one angle in a 2D truss is changed
• Self-confidence
◦ By the end of the semester:
  ▪ 30% of the class volunteers to present the solution to any homework problem on the board
  ▪ Self-reported test anxiety drops to 50% of its initial level
  ▪ 80% will say the class was easier than they expected it would be
  ▪ 50% report that they are excited about taking the follow-on course
(Handout 4)
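As an illustration of the second outcome above (not part of the original handout), the equations a student would write from the FBD of a single pin joint are sketched below; the member forces F_1, F_2, angles theta_1, theta_2, and load P are assumed symbols, not taken from Handout 4.

```latex
% Illustrative sketch only: static equilibrium at one pin joint of a 2D truss,
% with member forces F_1, F_2 at angles \theta_1, \theta_2 from the x-axis and
% a downward applied load P at the joint.
\[
\sum F_x = F_1\cos\theta_1 + F_2\cos\theta_2 = 0,
\qquad
\sum F_y = F_1\sin\theta_1 + F_2\sin\theta_2 - P = 0 .
\]
```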

• Understanding of the fundamentals
◦ Are the students better able to describe the effects of changing some variable in a simple problem?
◦ Are the students better able to describe the effects of changing some variable in a simple problem as a result of the intervention?
• Self-confidence
◦ Do the students express more confidence in their solutions?
◦ Do the students express more confidence in their solutions as a result of the intervention?
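The difference between each pair of questions above is attribution: claiming a change happened "as a result of the intervention" requires comparing against students who did not receive it, not just a pre/post change in one section. A minimal sketch of that comparison, assuming SciPy is available and using invented gain scores:

```python
# Hypothetical sketch: compare pre-to-post gains in an intervention section
# against a comparison section before claiming the change is "a result of
# the intervention". All numbers below are invented for illustration.
from scipy import stats

intervention_gains = [12, 18, 9, 15, 20, 11, 14]   # post minus pre, per student
comparison_gains   = [5, 8, 10, 4, 7, 9, 6]

result = stats.ttest_ind(intervention_gains, comparison_gains, equal_var=False)
print(f"Welch t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```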

BREAK 15 min

Tools for Evaluating Learning Outcomes

• Surveys
◦ Forced-choice or open-ended responses
• Concept inventories
◦ Multiple-choice questions to measure conceptual understanding
• Rubrics for analyzing student products
◦ Guides for scoring student reports, tests, etc.
• Interviews
◦ Structured (fixed questions) or in-depth (free-flowing)
• Focus groups
◦ Like interviews, but with group interaction
• Observations
◦ Direct monitoring and evaluation of behavior

Olds et al., JEE 94:13, 2005; NSF's Evaluation Handbook
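As one concrete reading of the "rubrics" entry above, a scoring guide can be written down as weighted criteria rated on a fixed scale; the criteria, weights, and ratings below are hypothetical, not taken from Olds et al. or the NSF handbook.

```python
# Hypothetical rubric for scoring a student report: each criterion is rated
# on a 0-4 scale and weighted; the report score is the weighted sum.
rubric_weights = {
    "free-body diagram": 0.4,
    "equilibrium equations": 0.4,
    "interpretation of results": 0.2,
}

def score_report(weights, ratings):
    """Combine per-criterion ratings (0-4) into a single weighted score."""
    return sum(weights[criterion] * ratings[criterion] for criterion in weights)

ratings = {"free-body diagram": 3,
           "equilibrium equations": 4,
           "interpretation of results": 2}
print(f"{score_report(rubric_weights, ratings):.2f}")   # -> 3.20
```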

Surveys
• Efficient
• Accuracy depends on subjects' honesty
• Reliable and valid surveys are difficult to develop
• Low response rates threaten reliability, validity, and interpretation

Observations
• Time- and labor-intensive
• Inter-rater reliability must be established
• Capture behavior that subjects are unlikely to report
• Useful for observable behavior

Olds et al., JEE 94:13, 2005
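A minimal sketch of the inter-rater reliability point for observation data, assuming two raters assign one categorical code per observed episode; the codes and data are invented, and Cohen's kappa is just one common choice of agreement statistic.

```python
# Hypothetical sketch: Cohen's kappa for two observers coding the same
# classroom episodes with one categorical code each. Data are invented.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters on categorical codes."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

a = ["on-task", "on-task", "off-task", "on-task", "question",
     "on-task", "off-task", "question", "on-task", "on-task"]
b = ["on-task", "off-task", "off-task", "on-task", "question",
     "on-task", "off-task", "on-task", "on-task", "on-task"]
print(f"kappa = {cohens_kappa(a, b):.2f}")   # values near 1 mean strong agreement
```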

• Use interviews to answer these questions:
◦ What does the program look and feel like?
◦ What do stakeholders know about the project?
◦ What are stakeholders' and participants' expectations?
◦ What features are most salient?
◦ What changes do participants perceive in themselves?

The 2002 User-Friendly Handbook for Project Evaluation, NSF publication REC

• Originated in physics with the Force Concept Inventory (FCI)
• Several are being developed in engineering fields
• A series of multiple-choice questions
◦ Each question involves a single concept; formulas, calculations, or problem-solving skills are not required
◦ Possible answers include distractors based on common errors and misconceptions
• Developing a CI is involved
◦ Identify misconceptions and distractors
◦ Develop, test, and refine questions
◦ Establish the validity and reliability of the tool
◦ Language is a major issue
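A minimal sketch, not from the slides, of how concept-inventory results are often summarized once a valid and reliable instrument exists: the class-average normalized gain <g> = (post - pre) / (100 - pre), the statistic commonly reported with the FCI. The percentage scores below are made up.

```python
# Hypothetical pre/post concept-inventory percent-correct scores for one class.
pre_scores = [40, 55, 30, 60, 45]
post_scores = [70, 75, 55, 80, 65]

def normalized_gain(pre_pct, post_pct):
    """Hake's class-average normalized gain from average percent-correct scores."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

pre_avg = sum(pre_scores) / len(pre_scores)
post_avg = sum(post_scores) / len(post_scores)
print(f"class-average normalized gain <g> = {normalized_gain(pre_avg, post_avg):.2f}")
```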

• Pittsburgh Freshman Engineering Survey
◦ Questions about perception
  ▪ Confidence in their skills in chemistry, communications, engineering, etc.
  ▪ Impressions of engineering as a precise science, as a lucrative profession, etc.
• Validated using alternate approaches:
◦ Item analysis
◦ Verbal protocol elicitation
◦ Factor analysis
• Compared results for students who stayed in engineering with those who left

Besterfield-Sacre et al., JEE 86:37
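The item analysis step above can be illustrated with classical difficulty and discrimination indices; this is a generic sketch with an invented response matrix, not the analysis actually reported by Besterfield-Sacre et al.

```python
# Hypothetical item analysis: per-item difficulty (fraction answering correctly)
# and discrimination (upper-group minus lower-group fraction correct).
import numpy as np

responses = np.array([        # rows = students, columns = items; 1 = correct
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
])

totals = responses.sum(axis=1)                 # each student's total score
order = np.argsort(totals)
n_group = max(1, len(totals) // 3)             # bottom and top thirds
lower, upper = order[:n_group], order[-n_group:]

difficulty = responses.mean(axis=0)
discrimination = responses[upper].mean(axis=0) - responses[lower].mean(axis=0)

for i, (p, d) in enumerate(zip(difficulty, discrimination), start=1):
    print(f"item {i}: difficulty = {p:.2f}, discrimination = {d:.2f}")
```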

• Levels of intellectual development
◦ Students see knowledge, beliefs, and authority in different ways
  ▪ "Knowledge is absolute" versus "knowledge is contextual"
• Tools
◦ Measure of Intellectual Development (MID)
◦ Measure of Epistemological Reflection (MER)
◦ Learning Environment Preferences (LEP)

Felder et al., JEE 94:57, 2005

• Suppose you were considering an existing tool (e.g., a concept inventory) for use in your project's evaluation of learning outcomes
• What questions would you consider in deciding whether the tool is appropriate?
• Long exercise (about 6 min)
◦ Think individually (~2 min)
◦ Share with a partner (~2 min)
◦ Report in local group (~2 min)
• Watch the time and reconvene after 6 min
• Use THINK time to think; no discussion
• Selected local facilitators report to the virtual group