Gerald Kruse and David Drews Juniata College Huntingdon, PA

Outline of Status Report/Mentoring Session
- History
- Experimental Design
- Data
- Tentative Plans
- Comments/Suggestions

MA 103, Quantitative Methods, aka "QM"
"Mathematics 103 prepares students to be quantitatively literate citizens in today's world. By learning to think critically about quantitative issues, students will be able to make responsible decisions in their daily lives. Problems are analyzed and solved using numerical, graphical, statistical, and algebraic reasoning. Technology is used to help visualize data and facilitate calculations, as well as to present quantitative output and verbal arguments."

Collegiate Learning Assessment (CLA) Performance Task Goals
- "Critical Thinking, Analytic Reasoning, and Problem Solving"
- "Written Communication"

MA 103, Quantitative Methods at Juniata College
- Juniata has a Quantitative Skills Requirement
- Non-majors course: MA 103, Quantitative Methods
- Pre and Post Assessment (Skills and Attitudes)
  - 55-minute exam given on the first and last day of the semester
  - Fall 2009: transition from math skills to a CLA performance task
- Three Projects during the semester
  - began using CLA performance tasks in Spring
  - one class period to present and start, then due over one week later
  - students appreciate the open-ended assignment

Assessment Schedule, Fall 2009

Higher-Order Skills Assessed in Rubric
- Evaluating Evidence
- Analysis / Synthesis / Conclusion
- Presenting / "creating" evidence
- Acknowledging alternatives to THEIR conclusion
- Completeness
- Mechanics/Persuasiveness

Evaluating Evidence Category on Rubric
Criterion: Identifying relevant evidence; evaluating evidence credibility, reliability, and relevance
- Not Attempted (0)
- Emerging (1, 2): Mentions one or two documents, with:
  - no or wrong evaluation on both (1)
  - cursory-to-OK evaluation on Document C, flawed on the other (2)
- Developing (3, 4): Mentions two documents (one must be C), with:
  - cursory-to-OK evaluation on both (3)
  - good evaluation on both (4)
- Mastering (5, 6): Evaluation of C is good, and evaluates two other documents with:
  - acceptable evaluations (5)
  - good evaluations (6)

Rubric Reliability
[Table: for each rubric dimension (Evaluation, Conclusions, Create, Alternatives, Completeness, Mechanics), the percentage of paired scores in exact agreement ("spot on") and within +/- 1 point, reported separately for Question 1 and Question 2.]
Overall: spot on = 63.6%, within +/- 1 = 91.3%

Rubric Reliability, continued
- Question 1 total score: r = .713, p =
- Question 2 total score: r = .828, p =
- Total score: r = .873, p = 0.000
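
As a minimal illustration of how the two kinds of reliability figures above could be computed, the following Python sketch assumes two raters' scores on the same set of responses are stored in hypothetical lists rater1 and rater2 (these are not the study's data, and this is not necessarily the authors' actual analysis):

```python
# Minimal sketch (hypothetical data): inter-rater agreement and correlation
# for two raters scoring the same responses on a 0-6 rubric dimension.
from scipy.stats import pearsonr

rater1 = [4, 5, 2, 6, 3, 4, 1, 5]  # hypothetical scores from rater 1
rater2 = [4, 4, 2, 6, 3, 5, 1, 5]  # hypothetical scores from rater 2

n = len(rater1)
spot_on = sum(a == b for a, b in zip(rater1, rater2)) / n              # exact agreement
within_one = sum(abs(a - b) <= 1 for a, b in zip(rater1, rater2)) / n  # within +/- 1

r, p = pearsonr(rater1, rater2)  # Pearson correlation between the two raters' scores

print(f"spot on: {spot_on:.1%}, within +/- 1: {within_one:.1%}")
print(f"r = {r:.3f}, p = {p:.3f}")
```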

Question 1 Mean by group and pre/post

Question 2 Mean by group and pre/post

Total Score Mean by group and pre/post

Possible changes for Spring 2011
- Explore the connection between the critical thinking rubric and assessments/grades used for the in-class tasks
- Provide more critical thinking feedback to the students
- More attention in class on the module analyzing sources (some low scores on the pre/post assessment were a result of students accepting flawed documents, done the day before fall break in 2009…)
- Your ideas here…

Analysis/Synthesis/Conclusion Category on Rubric
Criterion: Identifying relevant evidence; evaluating evidence credibility, reliability, and relevance
- Not Attempted (0)
- Emerging (1, 2): Incorrectly implies, or states directly, agreement that "banning aspartame would improve the health of the state's citizens," with:
  - no evidence (1)
  - evidence (2)
- Developing (3, 4): Implies, or directly states, disagreement with Sauer, noting the inconsistency of the claim with the data in Document C, but the reason given is:
  - inaccurate, unclear, or incomplete (3)
  - good (4)
- Mastering (5, 6):
  - Says C doesn't support the claim, is clear about the reason, and uses F reasonably well (5)
  - Satisfactorily uses conditional probability when discussing the relationship between headaches and aspartame usage (6)
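
The Mastering (6) level above asks students to reason with conditional probability rather than raw counts. As an illustrative sketch only, with hypothetical numbers rather than the figures from the task's Document C, the relevant comparison is:

```latex
% Illustrative only: hypothetical counts, not the data from Document C.
% Compare the headache rate among aspartame users with the rate among non-users,
% rather than citing the raw number of aspartame users who report headaches.
\[
P(\text{headache} \mid \text{aspartame}) = \frac{35}{500} = 0.07
\qquad
P(\text{headache} \mid \text{no aspartame}) = \frac{70}{1000} = 0.07
\]
% Here the two conditional probabilities are equal, so a large raw count of
% aspartame users with headaches would not, by itself, support the claim.
```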

Performance Task Scenario for Pre- and Post-Assessment
- Sen. Nathan Dulce is running for re-election vs. Pat Sauer.
- A bill has been proposed to ban aspartame, an artificial sweetener, from being added to any soft drink or food product; Dulce opposes the ban, Sauer approves of it.
- Pat Sauer made two arguments during a recent TV interview:
  (1) There is a strong correlation between the number of people who consume aspartame and headaches, so "banning aspartame would improve the health of the state's citizens."
  (2) "Aspartame should be banned and replaced with sucralose." Pat Sauer supported this argument by referring to a news release.

Questions on Pre/Post Assessment

Question #1: Pat Sauer claims that "banning aspartame would improve the health of the state's citizens" (Document E contains the chart he presented, and it is based on data from the tables in Document C). What are the strengths and/or limitations of Pat Sauer's position on this matter? Based on the evidence, what conclusion should be drawn about Pat Sauer's claim? Why? What specific information in the documents led you to this conclusion?

Question #2: Pat Sauer claims that "aspartame should be banned and replaced with sucralose." What are the strengths and/or limitations of Pat Sauer's position on this matter? Based on the evidence, what conclusion should be drawn about Pat Sauer's claim? Why? Is there a better solution, and if so, what are its strengths and/or limitations? Be sure to cite the information in the documents as well as any other factors you considered (such as the quality of the research conducted on aspartame) that led you to this conclusion.

Critical thinking involves evaluating/making good arguments
Making good arguments involves...
- Clearly stating a conclusion
- Evaluating and selecting evidence
- Creating links between evidence and conclusion
We can then consider quantitative reasoning as critical thinking involving numbers/data…