Gerald Kruse and David Drews
Juniata College, Huntingdon, PA
kruse@juniata.edu, drews@juniata.edu
MA 103, Quantitative Methods, aka “QM”
“Mathematics 103 prepares students to be quantitatively literate citizens in today's world. By learning to think critically about quantitative issues, students will be able to make responsible decisions in their daily lives... as well as to present quantitative output and verbal arguments.”
Three projects during the semester:
- began using CLA performance tasks in Spring 2009
- authentic and open-ended
Pre- and post-assessment (skills and attitudes):
- 55-minute exam given on the first and last class of the semester
- Fall 2009: transitioned from math skills to a CLA performance task
“The Collegiate Learning Assessment (CLA) provides one possible example of an assessment that fits a situated notion of QR.”
Richard Shavelson, “Reflections on Quantitative Reasoning: An Assessment Perspective,” in B. L. Madison & L. A. Steen (Eds.), Calculation vs. Context: Quantitative Literacy and Its Implications for Teacher Education. MAA.
Being quantitatively literate is being “able to think and reason quantitatively when the situation so demands”
Higher-Order Skills Assessed
- Evaluating evidence
- Analysis / synthesis / conclusion
- Presenting / “creating” evidence
- Acknowledging alternatives to THEIR conclusion
- Completeness
Experimental Design
Fall 2009, Version 1.0
- Explicit scoring guidelines (based on the rubric) were established.
- The scoring guidelines gave “good” reliability.
- Encouraging, but not statistically significant, results indicated that students in the section with performance-task-based projects showed more improvement in critical thinking skills.
- Use the results to prepare for the next round of assessment in Spring 2011.
Modifications for V2.0: “How was the student experience different in Spring 2011 vs. Fall 2009?”
The content of the course remained the same from Fall 2009 to Spring 2011, as did 95% of the “classroom experience,” but the course was reframed with a focus on quantitative reasoning (QR) and critical thinking (CT):
- syllabus
- assignments
- opportunities during lecture, “salt and pepper”
The pre/post assessment was modified:
- better linkage with specific learning outcomes
- more open-ended scenario
- names
- one prompt
Scoring Guidelines
Modifications for V2.0, continued
Present the idea of a rubric:
- objective assessment, “trust”
- familiarize students with its elements
- create one for “chips”
Improved feedback for projects completed during the semester:
- students used the scoring guidelines to score their own work
- compared this to my scoring
- general trends discussed with the entire class
- scheduled time to meet for specific feedback
Each of the three projects emphasized different quantitative content (what we were doing at the time) as well as different categories in the rubric.
Results for V2.0

Total Score
                 Pre     Post    % change   % of possible improvement
my section       9.46    13.25   40.06      32.84
other section    8.2     9.24    12.68      8.13

Evaluating Evidence
                 Pre     Post    % change   % of possible improvement
my section       1.68    2.5     48.81      18.98
other section    0.96    1.24    29.17      5.56
Analysis, Synthesis, Conclusion
                 Pre     Post    % change   % of possible improvement
my section       2.86    4       39.86      36.31
other section    2.6     3.28    26.15      20.00

Presenting, Creating Evidence
                 Pre     Post    % change   % of possible improvement
my section       0.38    1       163.16     23.66
other section    0.16    0.6     275.00     15.49
Acknowledging Alternatives to Their Conclusion
                 Pre     Post    % change   % of possible improvement
my section       0.71    0.96    35.21      10.92
other section    0.8     0.44    -45.00     -16.36

Completeness
                 Pre     Post    % change   % of possible improvement
my section       1.89    2.28    20.63      35.14
other section    1.84    1.84    0.00       0.00
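The two derived columns above can be recomputed from the Pre and Post means. A minimal sketch, assuming “% of possible improvement” means the gain divided by the remaining headroom, (post - pre) / (max - pre); that reading reproduces the reported figures with category maxima of 6, 6, 3, 3, and 3 (21 total):

    # Reproduce the derived columns in the V2.0 results tables.
    # Assumption: "% of possible improvement" = (post - pre) / (max - pre),
    # the gain as a share of the remaining headroom. The category maxima
    # (6, 6, 3, 3, 3; total 21) are inferred from the rubric scales.

    def percent_change(pre, post):
        return 100.0 * (post - pre) / pre

    def percent_of_possible(pre, post, max_score):
        return 100.0 * (post - pre) / (max_score - pre)

    # "my section", Total Score (max = 6 + 6 + 3 + 3 + 3 = 21):
    print(round(percent_change(9.46, 13.25), 2))           # 40.06
    print(round(percent_of_possible(9.46, 13.25, 21), 2))  # 32.84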
Rubric Reliability

Dimension          % spot on   % off by 1   % within +/- 1
Evaluation         56.7        40.0         96.7
Anal/Synth/Concl   58.6        34.5         93.1
Create             73.3        26.7         100.0
Alternatives       51.7        37.9         98.7
Completeness       62.1        34.5         96.6
Total Score        39.2        35.7         75.0
Rubric Reliability: Correlation of Total Scores
Pearson correlation of D and J = 0.927, p-value = 0.000
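For reference, the agreement percentages and the correlation can be computed from the two raters' paired scores. A minimal sketch; the score lists below are hypothetical placeholders, not the study data:

    # Inter-rater reliability for two raters scoring the same papers.
    # The paired scores below are hypothetical; substitute the actual data.
    rater_d = [4, 5, 3, 6, 2, 4, 5]
    rater_j = [4, 4, 3, 6, 3, 4, 6]

    n = len(rater_d)
    diffs = [abs(a - b) for a, b in zip(rater_d, rater_j)]
    pct_spot_on = 100.0 * sum(d == 0 for d in diffs) / n     # exact agreement
    pct_off_by_one = 100.0 * sum(d == 1 for d in diffs) / n  # off by exactly 1
    pct_within_one = pct_spot_on + pct_off_by_one            # within +/- 1

    # Pearson correlation of the two raters' scores.
    mean_d, mean_j = sum(rater_d) / n, sum(rater_j) / n
    cov = sum((a - mean_d) * (b - mean_j) for a, b in zip(rater_d, rater_j))
    var_d = sum((a - mean_d) ** 2 for a in rater_d)
    var_j = sum((b - mean_j) ** 2 for b in rater_j)
    r = cov / (var_d * var_j) ** 0.5

    print(pct_spot_on, pct_off_by_one, pct_within_one, round(r, 3))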
Critical thinking involves evaluating/making good arguments.
Making good arguments involves:
- Clearly stating a conclusion
- Evaluating and selecting evidence
- Creating links between evidence and conclusion
We can then consider quantitative reasoning as critical thinking involving numbers/data…
MA 103, Quantitative Methods at Juniata College
- Juniata has a Quantitative Skills requirement: Q = QM + QS.
- MA 103, Quantitative Methods, is offered for students who don’t fulfill the “Q” in their POE.
Evaluating Evidence Category on Rubric
Identifying relevant evidence; evaluating evidence credibility, reliability, and relevance.
- Not Attempted (0)
- Emerging (1, 2): Mentions one or two documents, with no or a wrong evaluation of both (1), or a cursory-to-OK evaluation of document C and a flawed one of the other (2).
- Developing (3, 4): Mentions two documents (one must be C), with cursory-to-OK evaluations of both (3) or good evaluations of both (4).
- Mastering (5, 6): The evaluation of C is good, and two other documents are evaluated with acceptable evaluations (5) or good evaluations (6).
Analysis/Synthesis/Conclusion Category on Rubric
- Not Attempted (0)
- Emerging (1, 2): Incorrectly implies, or states directly, agreement that “banning aspartame would improve the health of the state’s citizens,” with no evidence (1) or with evidence (2).
- Developing (3, 4): Implies, or directly states, disagreement with Sauer, noting the inconsistency of the claim with the data in document C, but the reason given is inaccurate, unclear, or incomplete (3), or good (4).
- Mastering (5, 6): States that document C doesn’t support the claim, is clear about the reason, and uses document F reasonably well (5); satisfactorily uses conditional probability when discussing the relationship between headaches and aspartame usage (6).
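The conditional-probability point at the Mastering level is the crux of the task: a raw count of aspartame users who report headaches means little until P(headache | aspartame) is compared with P(headache | no aspartame). A minimal sketch with hypothetical counts (not the figures from document C):

    # Conditional probability check of the headache/aspartame claim.
    # The 2x2 counts below are hypothetical, for illustration only.
    headache_aspartame = 85        # headache, consumes aspartame
    no_headache_aspartame = 915    # no headache, consumes aspartame
    headache_none = 30             # headache, no aspartame
    no_headache_none = 270         # no headache, no aspartame

    p_h_given_aspartame = headache_aspartame / (headache_aspartame + no_headache_aspartame)
    p_h_given_none = headache_none / (headache_none + no_headache_none)

    # Many aspartame users report headaches in absolute terms (85 vs. 30),
    # yet the conditional rates run the other way, so data like these would
    # not support the claim that banning aspartame improves health.
    print(f"P(headache | aspartame)    = {p_h_given_aspartame:.3f}")  # 0.085
    print(f"P(headache | no aspartame) = {p_h_given_none:.3f}")       # 0.100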
Performance Task Scenario for Pre- and Post-Assessment
Sen. Nathan Dulce is running for re-election against Pat Sauer. A proposed bill would ban aspartame, an artificial sweetener, from being added to any soft drink or food product; Dulce opposes the bill, Sauer supports it. Pat Sauer made two arguments during a recent TV interview:
(1) There is a strong correlation between the number of people who consume aspartame and headaches, so “banning aspartame would improve the health of the state’s citizens.”
(2) “Aspartame should be banned and replaced with sucralose.” Sauer supported this argument by referring to a news release.