Adventures in CPS Aaron Bruck Towns Group Meeting September 25, 2007
Goals for the summer
► Complete CPS projects (2 different projects)
  - Quantification and categorization of CPS questions
  - Linking CPS results to results on exams
► Look into new ideas for research (and possibly an OP)
  - Scientific Literacy
  - Assessment tools
What is CPS?
► Students use “clickers” to respond to instructor-generated questions
► These responses are stored in an online database
► Students receive grades based on the number of questions answered correctly
Categorization
► We decided to categorize the questions in the following ways:
  - Solo vs. Buddy
  - Definition vs. Algorithmic vs. Conceptual
  - Using Bloom’s Taxonomy
  - Using the Smith/Nakhleh/Bretz framework¹
► We also compared our analyses with those of Mazur² to make sure we were looking for the right things.

1. Smith, K. C.; Nakhleh, M. B.; Bretz, S. L. An Expanded Framework for Analyzing General Chemistry Exams. Journal of Chemical Education, in press.
2. Fagen, A. P.; Crouch, C. H.; Mazur, E. (2002) Peer Instruction: Results from a Range of Classrooms. The Physics Teacher, 40,
Categorization, cont.
► Here are the results from one of the sections (others followed a similar trend):

Bloom's Taxonomy    # of questions   # Definition   # Algorithmic   # Conceptual
Knowledge (1)       —                —              —               —
Comprehension (2)   —                —              —               —
Application (3)     —                —              —               —
Analysis (4)        3                0              1               2
Synthesis (5)       0                0              0               0
Evaluation (6)      0                0              0               0
Total               —                —              —               —

Bloom's Taxonomy    # of questions   # solo   # buddy
Knowledge (1)       44               41       3
Comprehension (2)   —                —        —
Application (3)     27               18       9
Analysis (4)        3                1        2
Synthesis (5)       0                0        0
Evaluation (6)      0                0        0
Total               —                —        —
More categorization
Smith/Nakhleh Framework

Bloom's Taxonomy    # questions   # Definition   # A-MaMi   # A-MaD   # A-MiS   # A-Mu   # C-E   # C-P   # C-I   # C-O
Knowledge (1)       —             —              —          —         —         —        —       —       —       —
Comprehension (2)   —             —              —          —         —         —        —       —       —       —
Application (3)     —             —              —          —         —         —        —       —       —       —
Analysis (4)        —             —              —          —         —         —        —       —       —       —
Synthesis (5)       —             —              —          —         —         —        —       —       —       —
Evaluation (6)      —             —              —          —         —         —        —       —       —       —
Total               —             —              —          —         —         —        —       —       —       —
Results by Category
► A two-tailed t-test (solo/buddy) and one-way ANOVAs (all other categories) were performed to test for statistical differences in the data
► The analyses showed no significant differences between any of the categories in how the students performed on the questions
► The only exception was the solo/buddy comparison for one professor

                 Solo vs. Buddy   Question Type (D/A/C)   Bloom’s Taxonomy   Smith/Nakhleh/Bretz Framework
                 t       p        F       p               F       p          F       p
Professor A *    —       —        —       —               —       —          —       —
Professor B      —       —        —       —               —       —          —       —
Professor C      —       —        —       —               —       —          —       —
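The two tests above can be sketched with SciPy. The percent-correct arrays below are hypothetical placeholders — the per-question data from the study are not reproduced on this slide — so only the shape of the analysis carries over, not the numbers.

```python
# Sketch of the statistical tests described above, using SciPy.
# All score arrays are hypothetical placeholders, not study data.
from scipy import stats

# Percent correct on solo vs. buddy questions (hypothetical)
solo = [62.0, 71.5, 58.3, 80.1, 66.7, 74.2]
buddy = [78.4, 85.0, 72.9, 88.3, 81.6, 79.5]

# Two-tailed independent-samples t-test (solo vs. buddy)
t_stat, t_p = stats.ttest_ind(solo, buddy)

# One-way ANOVA across the three question types (hypothetical groups)
definition = [70.2, 65.8, 72.4, 68.9]
algorithmic = [66.1, 74.3, 69.7, 71.0]
conceptual = [63.5, 67.2, 70.8, 64.9]
f_stat, f_p = stats.f_oneway(definition, algorithmic, conceptual)

print(f"t = {t_stat:.2f}, p = {t_p:.3f}")
print(f"F = {f_stat:.2f}, p = {f_p:.3f}")
```

The same `f_oneway` call extends to the Bloom and Smith/Nakhleh/Bretz groupings by passing one array per category.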
Solo/Buddy Analysis
► Prompted by the unusual results, we investigated the solo/buddy comparison further
► We also looked at pairs of solo/buddy questions asked one after the other:

Solo/Buddy   N   Mean   Std. Deviation   Std. Error Mean
Solo         —   —      —                —
Buddy        —   —      —                —

T-test results: t = —, p = 0.017 (significant difference)
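Because these questions were asked in back-to-back solo/buddy pairs, a paired t-test is the natural sketch of the comparison. Whether the original analysis was paired or independent is not stated on the slide, so treat the pairing — and all the numbers below — as assumptions for illustration.

```python
# Sketch of a paired t-test on back-to-back solo/buddy question pairs.
# The pairing is an assumption about the analysis, and the
# percent-correct values are hypothetical, not study data.
from scipy import stats

# Each index i is one solo/buddy pair asked consecutively in lecture
solo_pct = [55.0, 62.3, 48.9, 70.4, 66.1, 59.8]
buddy_pct = [68.2, 75.0, 61.7, 82.9, 74.5, 71.3]

t_stat, p_value = stats.ttest_rel(solo_pct, buddy_pct)
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
```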
That’s great, but…
► We found a significant difference between solo and buddy questions…but is it worth anything?
► Our next step was to see whether this apparent performance difference by question style translated into better scores on the exams.
Exam Analysis
► We compared exam questions with questions asked in class using CPS.
► Surprisingly, we found very few questions on the exams that directly or indirectly corresponded to CPS questions.
► Each exam was analyzed individually before the data were pooled to look for overall effects.
Exam Analysis

             # solo   # buddy   # neither
Exam 1       —        —         —
Exam 2       —        —         —
Exam 3       —        —         —
Final Exam   —        —         —
Totals       —        —         —

Question Effects   F value   p value
Exam 1             —         —
Exam 2             —         —
Exam 3             —         —
Final Exam         —         —
Pooled Exams       —         —

Per instructor:
Professor A   —   —
Professor B   —   —
Professor C   —   —

             % correct Solo   % correct Buddy   % correct Neither
Exam 1       n/a              —                 —
Exam 2       —                —                 —
Exam 3       —                —                 —
Final Exam   —                —                 —
Totals       —                —                 —

All analyses showed no significant differences at the p = 0.05 significance level.
Instructor Effects
► We also ran an analysis to check for any instructor effects that could have skewed the data.
► Results showed no significant differences at the p = 0.05 level:

Instructor Effects   F value   p value
Exam 1               —         —
Exam 2               —         —
Exam 3               —         —
Final Exam           —         —
Pooled Exams         —         —
Is CPS better than nothing?
► A final analysis compared exam questions that corresponded to CPS questions with those that did not.
► Unfortunately, no significant differences were found, though the average score was higher for the CPS-related questions.
CPS vs. Nothing Results

Results of ANOVA:
Percent Correct   Sum of Squares   df   Mean Square   F   Sig.
Between Groups    —                —    —             —   —
Within Groups     —                —    —
Total             —                —

Descriptive Statistics:
Percent Correct   N   Mean   Std. Deviation   Std. Error
—                 —   —      —                —
—                 —   —      —                —
Total             —   —      —                —
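The columns of an ANOVA table like the one above are related by standard identities (SS partitions into between + within, MS = SS/df, F = MS-between / MS-within). A minimal hand computation, on hypothetical percent-correct data standing in for the study's two groups:

```python
# How the ANOVA-table columns relate, computed by hand on
# hypothetical percent-correct data (not the study's values).
import statistics

cps = [74.0, 81.5, 69.2, 77.8]      # exam questions tied to CPS (hypothetical)
non_cps = [70.1, 66.4, 73.0, 68.5]  # exam questions with no CPS analogue

groups = [cps, non_cps]
all_scores = cps + non_cps
grand_mean = statistics.mean(all_scores)

# Between-groups SS: spread of the group means around the grand mean
ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2 for g in groups)
# Within-groups SS: spread of scores around their own group's mean
ss_within = sum((x - statistics.mean(g)) ** 2 for g in groups for x in g)

df_between = len(groups) - 1
df_within = len(all_scores) - len(groups)

ms_between = ss_between / df_between
ms_within = ss_within / df_within
f_value = ms_between / ms_within

print(f"SS between = {ss_between:.2f}, df = {df_between}, MS = {ms_between:.2f}")
print(f"SS within  = {ss_within:.2f}, df = {df_within}, MS = {ms_within:.2f}")
print(f"F = {f_value:.2f}")
```

The "Total" row of the table is simply the two SS values (and df values) summed, which the partition identity guarantees.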
Conclusions
► CPS is an effective lecture tool that engages students interactively with the content
► Most CPS questions are low-level in terms of Bloom’s Taxonomy and the other categorization schemes
► Students seem to learn content through interaction with their peers when using CPS, though this does not necessarily translate into success on exams
What else did I do?
► Research Questions
  - In the event that I need to do a project other than the NSDL project, what avenues are available?
  - Could any of these ideas turn into a possible OP in the following months?
► Ideas of interest
  - Scientific Literacy
    ► What is the value of a textbook?
    ► Could other materials help?
  - Assessment
    ► Immediate feedback assessment technique (IFAT): could it work in chemistry?