A Proposed Measurement Procedure for Indicators of Graduate Attributes
M.F. Lightstone, Mechanical Engineering


Background/Language

– CEAB has defined 12 Graduate Attributes.
  – Example of an Attribute: "A knowledge base for engineering"
  – McMaster has added a 13th attribute on sustainability.
– Each attribute has a number of 'indicators' associated with it.
  – Example of an indicator for the attribute 'knowledge base for engineering': "Competence in engineering fundamentals"
– We need to measure 'indicators'.

Structure of Attributes/Indicators (we are measuring learning outcomes that relate to the 'indicators')

Graduate Attribute #1:
– Indicator 1.1
– Indicator 1.2
Graduate Attribute #2:
– Indicator 2.1
– Indicator 2.2
– Indicator 2.3
…
Graduate Attribute #13:
– Indicator 13.1
– Indicator 13.2

Measurement Goals

1. To determine the extent to which our students are attaining specific learning outcomes associated with the 'indicator' that we are measuring.
2. To use this information to improve our program in subsequent years.

Need to satisfy the "continuous improvement" requirement of CEAB.

High-level description of measurement procedure

– We want to get a sense of how well our students are attaining the learning outcomes associated with the indicator and determine what they are struggling with.
– We will be using tests, assignments, presentations, and reports in our measurement process.
– We will use rubrics to measure the student learning outcomes.
– Four levels of grading will be used:
  1. Does not meet expectations
  2. Marginal
  3. Meets expectations
  4. Exceeds expectations

What is a learning outcome?

Simply stated, a learning outcome is:
1. What faculty members want students to know at the end of the course, AND
2. What faculty members want students to be able to do at the end of the course.

Reference: Pauline Smiley, Fleming College, Symposium on Learning Outcomes Assessment, Toronto, 2012.

Characteristics of Learning Outcomes

1. They specify an action by the students that is observable.
2. They specify an action by the students that is measurable.
3. They specify an action that is done by the students (rather than by the faculty members).

Reference: Pauline Smiley, Fleming College, Symposium on Learning Outcomes Assessment, Toronto, 2012.

Writing learning outcome statements

Must include verbs!
– Example of a bad learning outcome statement:
  – "Differential equations"
– Example of a good learning outcome statement:
  – "A demonstrated ability to solve a linear homogeneous differential equation with associated boundary conditions."
  – Note: verbs are included and detail is given.

Aside: Bloom's Taxonomy

– The taxonomy is: Remember, Understand, Apply, Analyze, Evaluate, Create.
– It provides useful words for developing learning outcome statements.
– How does Bloom's fit into our process?
  – It tends to confuse us.
  – Don't get caught up in using Bloom's to determine your learning outcomes.
  – Upshot: Don't worry about Bloom's!
  – Just think of it as providing useful verbs!

Verbs associated with each level of Bloom's Taxonomy:

REMEMBER: Count, Define, Describe, Draw, Identify, Label, List, Match, Name, Outline, Point, Quote, Read, Recall, Recite, Recognize, Record, Repeat, Reproduce, Select, State, Write

UNDERSTAND: Associate, Compute, Convert, Defend, Discuss, Distinguish, Estimate, Explain, Extend, Extrapolate, Generalize, Give examples, Infer, Paraphrase, Predict, Rewrite, Summarize

APPLY: Add, Apply, Calculate, Change, Classify, Complete, Compute, Demonstrate, Discover, Divide, Examine, Graph, Interpolate, Manipulate, Modify, Operate, Prepare, Produce, Show, Solve, Subtract, Translate, Use

ANALYZE: Analyze, Arrange, Breakdown, Combine, Design, Detect, Develop, Diagram, Differentiate, Discriminate, Illustrate, Infer, Outline, Point out, Relate, Select, Separate, Subdivide, Utilize

EVALUATE: Appraise, Assess, Compare, Conclude, Contrast, Criticize, Critique, Determine, Grade, Interpret, Judge, Justify, Measure, Rank, Rate, Support, Test

CREATE: Categorize, Combine, Compile, Compose, Create, Derive, Design, Devise, Generate, Group, Integrate, Modify, Order, Organize, Plan, Prescribe, Propose, Rearrange, Reconstruct, Relate, Reorganize, Revise, Rewrite, Summarize, Transform, Specify

Overall Measurement Procedure

1. Decide on which student work will be used for measurement (i.e., tests, exams, …).
2. Develop a rubric to describe desired student learning outcomes (more detail on this to come).
3. While you are marking, keep track of how the student did by ticking the appropriate box (a minimal sketch of one way to record this follows the list).
4. Analyze results to provide information for continuous improvement (i.e., identify learning outcomes that the students are struggling with).
5. Document measurement results.
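The slides do not prescribe any tooling for steps 2 and 3. Purely as an illustration, the sketch below shows one hypothetical way a rubric topic and its tick counts could be represented if someone wanted to keep the tallies electronically; the class, field, and level names are assumptions, not part of the proposed procedure.

```python
# A minimal sketch (not part of the original procedure) of one rubric topic and
# its four performance levels, with a method for recording ticks while marking.
from dataclasses import dataclass, field

LEVELS = ("Below expectations", "Marginal", "Meets expectations", "Exceeds expectations")

@dataclass
class RubricTopic:
    title: str            # e.g. "Heat and momentum transfer analogy"
    exam_question: str    # e.g. "Question 4 of final exam"
    descriptors: dict     # level name -> text describing that level
    ticks: dict = field(default_factory=lambda: {level: 0 for level in LEVELS})

    def tick(self, level: str) -> None:
        """Record one student judged to be at the given level for this topic."""
        self.ticks[level] += 1

# Hypothetical usage while marking one exam question:
topic1 = RubricTopic(
    title="Heat and momentum transfer analogy",
    exam_question="Question 4 of final exam",
    descriptors={level: "..." for level in LEVELS},
)
topic1.tick("Meets expectations")
print(topic1.ticks)
```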

Example: Measurement of the Indicator "Competence in Specialized Engineering Knowledge" using MECH ENG 4S03 (Incompressible Flows)

1. Student work used for measurement: final exam.
2. Development of rubric (example to follow):
   – Think about what you wanted the students to learn.
   – Link those learning outcomes to the exam questions.
   – Decide on what the students needed to be able to do to demonstrate that they "met expectations".
   – Then define learning outcomes for "exceeds expectations", "marginal", and "does not meet expectations".

Example Rubric – MECH ENG 4S03 (Incompressible Flow)
(Columns: Topic and exam questions used; Below Expectations; Marginal; Meets Expectations; Exceeds Expectations)

Topic #1: Heat and momentum transfer analogy (Question 4 of final exam)
– Below Expectations: Does not understand the concept of the analogy.
– Marginal: Able to use the correlations. Understands that there is an analogy, but cannot explain the math behind it.
– Meets Expectations: Can explain the mathematical basis of the analogy. Can determine the appropriate correlation to solve for heat transfer or drag.
– Exceeds Expectations: "Meets expectations" plus: can explain why the analogy does not hold if there is a pressure gradient.
Comments on Topic #1 performance:

Example Rubric – MECH ENG 4S03 (Incompressible Flow), continued

Topic #2: Boundary layers (Question 3 of final exam)
– Below Expectations: Cannot use correlations correctly. Unable to explain separation.
– Marginal: Can draw velocity profile. Can calculate local shear and total drag. Doesn't understand separation.
– Meets Expectations: Can draw boundary layer velocity profile. Can calculate local shear and total drag. Can say whether the flow will separate or not.
– Exceeds Expectations: "Meets expectations" plus: can explain (based on the physics in the near-wall region) why separation cannot occur for a favourable pressure gradient.
Comments on Topic #2 performance:

Example Rubric – MECH ENG 4S03 (Incompressible Flow), continued

– Keep adding rows until you have covered the topics you wish to measure.
– Remember to leave blank space for 'ticks' when measuring.
– Remember to leave an area to write in comments while you are marking.
– If all the elements of the exam or test pertain to the indicator being measured (i.e., 'Competence in specialized engineering knowledge' as in this example), then add a row with the overall mark distribution on the exam or test (a small sketch of the grade-to-level mapping follows below):

Overall exam performance:
– % that did not meet expectations (i.e., grade of less than 48% on exam)
– % that were marginal (grade of roughly 48% to 59%)
– % that met expectations (grade of roughly 60% to 79%)
– % that exceeded expectations (top students: 80% and above)
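The grade cut-offs in the overall-performance row above (roughly 48%, 60%, and 80%) amount to a simple mapping from exam grade to level. The following is a minimal sketch of that mapping; the function name and the example grades are illustrative only, not actual course data.

```python
# Map an overall exam grade onto the four levels using the approximate cut-offs
# stated in the rubric row above (below 48% / 48-59% / 60-79% / 80% and above).
def level_from_grade(grade_percent: float) -> str:
    if grade_percent < 48:
        return "Does not meet expectations"
    elif grade_percent < 60:
        return "Marginal"
    elif grade_percent < 80:
        return "Meets expectations"
    else:
        return "Exceeds expectations"

# Example: turn a list of made-up final-exam grades into the percentage
# distribution reported in the last row of the rubric.
grades = [42, 55, 63, 71, 78, 84, 91]
counts = {}
for g in grades:
    level = level_from_grade(g)
    counts[level] = counts.get(level, 0) + 1
distribution = {level: round(100 * n / len(grades), 1) for level, n in counts.items()}
print(distribution)
```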

Measurement Logistics

As you mark a question that is on the rubric, tick off the appropriate box. Add comments as appropriate. (A small sketch of turning the tick counts into a distribution follows below.)

Topic #1: Heat and momentum transfer analogy (Question 4 of final exam)
– Below Expectations: Does not understand the concept of the analogy.  Ticks: ///
– Marginal: Able to use the correlations. Understands that there is an analogy, but cannot explain the math behind it.  Ticks: ////
– Meets Expectations: Can explain the mathematical basis of the analogy. Can choose the appropriate correlation to solve for heat transfer or drag.  Ticks: //// //// ////
– Exceeds Expectations: "Meets expectations" plus: can explain why the analogy does not hold if there is a pressure gradient.  Ticks: ///
Comments on Topic #1 performance:
– Students were generally good at using the correct correlation.
– Some had trouble explaining the mathematics behind the analogy (need to spend more lecture time on that next year).
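As a sketch of what tallying the ticks could look like, the snippet below converts tick counts for one topic into a percentage distribution. The counts are hypothetical, not the actual MECH ENG 4S03 results.

```python
# Convert hypothetical tick counts for one rubric topic into percentages.
from collections import Counter

ticks_topic1 = Counter({
    "Below expectations": 3,
    "Marginal": 4,
    "Meets expectations": 14,
    "Exceeds expectations": 3,
})

total = sum(ticks_topic1.values())
distribution_topic1 = {level: round(100 * n / total, 1) for level, n in ticks_topic1.items()}
print(distribution_topic1)
# e.g. {'Below expectations': 12.5, 'Marginal': 16.7, 'Meets expectations': 58.3, 'Exceeds expectations': 12.5}
```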

So what do we do with this data?

[Slide shows a plot (from Brian Frank, Queen's) comparing the overall exam distribution with the distributions for Topic #1, Topic #2, and Topic #3, with a reference line at 30% "marginal" or below.]

– Topic #1: many are exceeding expectations (could reduce lecture time on it).
– Topic #3: more than 30% of the class is marginal or below (students are struggling with this; see the sketch below).
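One way to automate the check suggested by the plot (and by the question about software raised later) is to flag any topic whose "marginal or below" fraction exceeds 30%. The sketch below does this for made-up distributions; the topic names and percentages are illustrative only.

```python
# Flag topics where more than 30% of the class is "marginal" or below,
# using hypothetical per-topic percentage distributions.
THRESHOLD = 30.0  # percent of class at "marginal" or below

topic_distributions = {
    "Topic #1": {"Below expectations": 5, "Marginal": 10, "Meets expectations": 45, "Exceeds expectations": 40},
    "Topic #2": {"Below expectations": 10, "Marginal": 15, "Meets expectations": 55, "Exceeds expectations": 20},
    "Topic #3": {"Below expectations": 20, "Marginal": 25, "Meets expectations": 45, "Exceeds expectations": 10},
}

for topic, dist in topic_distributions.items():
    marginal_or_below = dist["Below expectations"] + dist["Marginal"]
    if marginal_or_below > THRESHOLD:
        print(f"{topic}: {marginal_or_below}% marginal or below -> flag for continuous improvement")
```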

Documentation of Measurement Results

Need to write a short document summarizing results. It should include:
– Rubric used for measurement
– Corresponding exam or test
– Distributions for each learning outcome area
– Identified areas for continuous improvement
– Sample exam papers with performance in each area (below expectations, marginal, …)
– Suggestions for how to improve the measurement procedure (if any)

Need to identify a database where these documents can be kept.

Follow-up

– Incorporate areas identified as needing improvement into your lectures the next time you teach this course.
– Keep track of the changes made to your course, since we will likely need to incorporate them into the next CEAB report.
– Subsequent measurement of the same learning outcomes should hopefully show improvement in those areas where we found the students were struggling.

Questions for discussion at May 25, 2012 Workshop

– How many rows do we need in our rubric?
– How many exams do we need to use in our measurement? (We are looking for statistical quantities, not assessing individual student performance. Can we measure every 2nd exam?)
– How precise does the language in the rubric need to be? (e.g., "A demonstrated understanding of …")
– Can we use software to help us automatically analyze the data, write the report, and so on? (so that we can avoid using the hand-written 'ticks')

Notes from Meeting

– Process flow diagram that shows feedback.
– Include details on the deadline for changes to courses.
– Communication of measurement results to downstream courses (this also helps with communicating learning outcomes between courses).
– Can use measurements from early in the term as a way to guide continuous improvement within the same term for that course.