1
Adventures in CPS
Aaron Bruck
Towns Group Meeting
September 25, 2007
2
Goals for the summer
► Complete CPS projects (2 different projects):
  - Quantification and categorization of CPS questions
  - Linking CPS results to results on exams
► Look into new ideas for research (and possibly an OP):
  - Scientific Literacy
  - Assessment tools
3
What is CPS?
► Students use “clickers” to respond to instructor-generated questions
► These responses are stored in an online database
► Students receive grades based on the number of questions answered correctly
4
Categorization
► We decided to categorize the questions in the following ways:
  - Solo vs. Buddy
  - Definition vs. Algorithmic vs. Conceptual
  - Using Bloom’s Taxonomy
  - Using the Smith/Nakhleh/Bretz framework¹
► We also compared our analyses with those of Mazur² to make sure we were looking for the right things.

1. Smith, K. C.; Nakhleh, M. B.; Bretz, S. L. An Expanded Framework for Analyzing General Chemistry Exams. Journal of Chemical Education, in press.
2. Fagen, A. P.; Crouch, C. H.; Mazur, E. (2002). Peer Instruction: Results from a Range of Classrooms. The Physics Teacher, 40, 206-209.
5
Categorization, cont.
► Here are the results from one of the sections (the others followed a similar trend):

Bloom's Taxonomy     # of questions   # Definition   # Algorithmic   # Conceptual
Knowledge (1)        44               40             3               1
Comprehension (2)    44               14             11              19
Application (3)      27               4              21              2
Analysis (4)         3                0              1               2
Synthesis (5)        0                0              0               0
Evaluation (6)       0                0              0               0
Total                118              58             36              24

Bloom's Taxonomy     # of questions   # Solo   # Buddy
Knowledge (1)        44               41       3
Comprehension (2)    44               34       10
Application (3)      27               18       9
Analysis (4)         3                1        2
Synthesis (5)        0                0        0
Evaluation (6)       0                0        0
Total                118              94       24
6
More categorization
► The same questions categorized with the Smith/Nakhleh framework:

Bloom's Taxonomy     # questions   # Definition   # A-MaMi   # A-MaD   # A-MiS   # A-Mu   # C-E   # C-P   # C-I   # C-O
Knowledge (1)        44            42             0          0         1         0        0       1       0       0
Comprehension (2)    44            15             1          0         7         0        1       16      4       0
Application (3)      27            4              7          4         6         3        0       1       2       0
Analysis (4)         3             0              0          0         0         1        0       2       0       0
Synthesis (5)        0             0              0          0         0         0        0       0       0       0
Evaluation (6)       0             0              0          0         0         0        0       0       0       0
Total                118           61             8          4         14        4        1       20      6       0
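A minimal sketch of how such a tally can be built with pandas, assuming each CPS question was coded with a Bloom level and a Smith/Nakhleh category; the column names and sample rows below are hypothetical, not the study's data:

```python
import pandas as pd

# Hypothetical coding sheet: one row per CPS question. The column names
# ("bloom", "smith") and the example rows are illustrative only.
questions = pd.DataFrame({
    "bloom": ["Knowledge (1)", "Comprehension (2)", "Application (3)", "Knowledge (1)"],
    "smith": ["Definition", "C-P", "A-MaMi", "Definition"],
})

# Cross-tabulate Bloom level against Smith/Nakhleh category,
# with row and column totals as in the table above.
table = pd.crosstab(questions["bloom"], questions["smith"],
                    margins=True, margins_name="Total")
print(table)
```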
7
Results by Category
► A two-tailed t-test (solo/buddy) and one-way ANOVAs (all others) were performed to test for statistical differences in the data
► Analyses showed no significant differences between any of the categories and how the students performed on the questions
► The only exception was the solo/buddy comparison for one professor

               Solo vs. Buddy     Question Type (D/A/C)   Bloom's Taxonomy   Smith/Nakhleh/Bretz Framework
               t         p        F        p              F        p         F        p
Professor A    -3.189    0.002*   0.730    0.485          0.307    0.820     0.285    0.942
Professor B    0.049     0.962    1.301    0.277          1.102    0.352     1.102    0.429
Professor C    -0.579    0.564    1.001    0.371          2.456    0.067     1.923    0.064
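A minimal sketch of how these tests can be run with scipy; the score lists below are placeholders, not the study's data:

```python
from scipy import stats

# Placeholder per-question percent-correct scores, grouped by question style.
solo_scores  = [52.0, 61.5, 48.0, 70.2, 55.3]
buddy_scores = [68.1, 72.4, 66.0, 80.5]

# Two-tailed independent-samples t-test (solo vs. buddy).
t, p = stats.ttest_ind(solo_scores, buddy_scores)
print(f"t = {t:.3f}, p = {p:.3f}")

# One-way ANOVA across the three D/A/C question types.
definition  = [55.0, 60.2, 58.1]
algorithmic = [49.7, 52.3, 61.0]
conceptual  = [63.4, 57.8, 54.9]
F, p = stats.f_oneway(definition, algorithmic, conceptual)
print(f"F = {F:.3f}, p = {p:.3f}")
```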
8
Solo/Buddy Analysis
► Prompted by the unusual results, we further investigated the solo/buddy analysis
► We also looked at pairs of solo/buddy questions asked one after the other:

Solo/Buddy    N   Mean      Std. Deviation   Std. Error Mean
1 (solo)      8   45.5875   19.09786         6.75211
2 (buddy)     8   70.2525   17.42225         6.15970

t-test results: t = -2.699, p = 0.017 (significant difference)
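The reported statistics can be rechecked directly from the table above; a minimal sketch using scipy's summary-statistics t-test:

```python
from scipy import stats

# Recompute the two-sample t-test from the reported summary statistics.
t, p = stats.ttest_ind_from_stats(
    mean1=45.5875, std1=19.09786, nobs1=8,   # group 1 (solo)
    mean2=70.2525, std2=17.42225, nobs2=8,   # group 2 (buddy)
)
print(f"t = {t:.3f}, p = {p:.3f}")  # t ≈ -2.699, p ≈ 0.017
```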
9
That’s great, but…
► We found a significant difference between solo and buddy questions…but is it worth anything?
► Our next step was to see whether this apparent difference in performance by question style translated into better scores on the exams.
10
Exam Analysis
► We compared exam questions with questions asked in class using CPS.
► Surprisingly, we found very few exam questions that corresponded, directly or indirectly, to CPS questions.
► Each exam was analyzed individually before pooling all of the data to look for any effects (see the sketch below).
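A minimal sketch of one way this matching and pooling could be organized with pandas; the column names, tags, and sample values are hypothetical:

```python
import pandas as pd

# Hypothetical records: each exam question tagged with the style of the
# CPS question it corresponds to ("solo", "buddy", or "neither").
exam_items = pd.DataFrame({
    "exam":        [1, 1, 2, 2, 3, 4],
    "cps_style":   ["solo", "neither", "buddy", "solo", "neither", "buddy"],
    "pct_correct": [68.1, 57.5, 62.2, 56.6, 54.6, 50.3],
})

# Per-exam breakdown first, then pooled across all exams.
per_exam = exam_items.groupby(["exam", "cps_style"])["pct_correct"].mean()
pooled   = exam_items.groupby("cps_style")["pct_correct"].mean()
print(per_exam, pooled, sep="\n\n")
```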
11
Exam Analysis

          # Solo   # Buddy   # Neither
Exam 1    17       0         40
Exam 2    21       9         31
Exam 3    10       10        36
Exam 4    29       10        64
Totals    77       29        171

Question Effects     F value   p value
Exam 1               3.508     0.066
Exam 2               2.162     0.124
Exam 3               2.718     0.075
Final Exam           2.793     0.066
Pooled Exams         1.632     0.197
Per instructor:
Professor A          1.032     0.361
Professor B          0.341     0.712
Professor C          1.468     0.236

          % correct Solo   % correct Buddy   % correct Neither
Exam 1    68.119           n/a               57.5222
Exam 2    56.5675          62.1900           66.1966
Exam 3    68.1138          67.9493           54.5532
Exam 4    66.1699          50.3368           60.3920
Totals    64.2338          60.0887           59.5438

All analyses showed no significant differences at the p = 0.05 significance level.
12
Instructor Effects
► We also ran an analysis to check for any instructor effects that could have skewed the data.
► Results showed no significant differences at the p = 0.05 level:

Instructor Effects   F value   p value
Exam 1               0.540     0.586
Exam 2               0.484     0.619
Exam 3               0.108     0.898
Final Exam           1.255     0.289
Pooled Exams         0.987     0.374
13
Is CPS better than nothing?
► A final analysis compared exam questions that corresponded to CPS questions with those that did not.
► Unfortunately, no significant differences were found, though the average score was higher for the CPS-linked questions.
14
CPS vs. Nothing Results

Results of ANOVA (percent correct):

                 Sum of Squares   df    Mean Square   F       Sig.
Between Groups   827.451          1     827.451       2.271   .133
Within Groups    100204.068       275   364.378
Total            101031.520       276

Descriptive statistics (percent correct):

Group                 N     Mean      Std. Deviation   Std. Error
1 (CPS question)      106   63.0998   17.25487         1.67594
2 (no CPS question)   171   59.5438   20.13811         1.54000
Total                 277   60.9046   19.13260         1.14957
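A minimal sketch that re-derives F and its significance from the sums of squares reported above:

```python
from scipy import stats

# Recompute the one-way ANOVA summary from the reported sums of squares.
ss_between, df_between = 827.451, 1
ss_within,  df_within  = 100204.068, 275

ms_between = ss_between / df_between      # 827.451
ms_within  = ss_within / df_within        # ≈ 364.378

F = ms_between / ms_within                # ≈ 2.271
p = stats.f.sf(F, df_between, df_within)  # ≈ 0.133
print(f"F = {F:.3f}, p = {p:.3f}")
```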
15
Conclusions
► CPS is an effective lecture tool that engages students interactively with course content
► Most CPS questions are low-level questions in terms of Bloom’s Taxonomy and the other categorization schemes
► Students seem to learn content through peer interaction when using CPS, though this does not necessarily translate into success on exams
16
What else did I do?
► Research questions:
  - In the event that I need to do a project other than the NSDL project, what avenues are available?
  - Could any of these ideas turn into a possible OP in the following months?
► Ideas of interest:
  - Scientific Literacy
    - What is the value of a textbook?
    - Could other materials help?
  - Assessment
    - Immediate feedback assessment technique (IFAT): could it work in chemistry?