Adventures in CPS Aaron Bruck Towns Group Meeting September 25, 2007.

Goals for the summer
► Complete CPS projects (2 different projects)
 ► Quantification and categorization of CPS questions
 ► Linking CPS results to results on exams
► Look into new ideas for research (and possibly an OP)
  Scientific Literacy
  Assessment tools

What is CPS?
► Students use “clickers” to respond to instructor-generated questions
► The responses are stored in an online database
► Students receive grades based on the number of questions answered correctly

Categorization
► We decided to categorize the questions in the following ways:
  Solo vs. Buddy
  Definition vs. Algorithmic vs. Conceptual
  Using Bloom’s Taxonomy
  Using the Smith/Nakhleh/Bretz Framework 1
► We also compared our analyses with those of Mazur 2 to make sure we were looking for the right things.

1 Smith, K. C.; Nakhleh, M. B.; Bretz, S. L. An Expanded Framework for Analyzing General Chemistry Exams. Journal of Chemical Education, in press.
2 Fagen, A. P.; Crouch, C. H.; Mazur, E. (2002). Peer Instruction: Results from a Range of Classrooms. The Physics Teacher, 40,

Categorization, cont.
► Here are the results from one of the sections (others followed a similar trend):

[Table: number of questions at each Bloom’s Taxonomy level (Knowledge through Evaluation), broken down by Definition / Algorithmic / Conceptual; no questions fell at the Synthesis or Evaluation levels.]
[Table: number of questions at each Bloom’s Taxonomy level, broken down by solo vs. buddy; again, no Synthesis or Evaluation questions.]

More categorization

[Table: question counts cross-tabulated by Bloom’s Taxonomy level and Smith/Nakhleh framework category (Definition; A-MaMi; A-MaD; A-MiS; A-Mu; C-E; C-P; C-I; C-O).]
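The tallies behind categorization tables like these can be sketched as a simple cross-tabulation. The question tags below are hypothetical examples for illustration, not the study’s actual data:

```python
from collections import Counter

# Sketch of the cross-tabulation behind the categorization tables.
# Each CPS question is tagged with a Bloom's Taxonomy level and a
# question type; the tags below are made-up examples, not study data.
questions = [
    ("Knowledge", "Definition"),
    ("Knowledge", "Definition"),
    ("Comprehension", "Conceptual"),
    ("Application", "Algorithmic"),
    ("Application", "Algorithmic"),
    ("Analysis", "Conceptual"),
]

# Joint counts (Bloom level x question type) and marginal counts per level.
cross_tab = Counter(questions)
per_level = Counter(level for level, _ in questions)

for (level, qtype), n in sorted(cross_tab.items()):
    print(f"{level:14s} {qtype:12s} {n}")
```

The same two counters generalize to any pair of categorization schemes (e.g. Bloom level × Smith/Nakhleh category) by changing what goes into each tuple.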

Results by Category
► A two-tailed t-test (solo/buddy) and one-way ANOVAs (all others) were performed to test for statistical differences in the data
► The analyses showed no significant differences between any of the categories in how the students performed on the questions
► The only exception was the solo/buddy comparison for one professor

[Table: t and p values (solo vs. buddy) and F and p values (question type, Bloom’s Taxonomy, Smith/Nakhleh/Bretz framework) for Professors A, B, and C; only Professor A’s solo/buddy result is flagged as significant.]

Solo/Buddy Analysis
► Prompted by the unusual result, we investigated the solo/buddy comparison further
► We also looked at pairs of solo/buddy questions asked one after the other:

[Table: N, mean, standard deviation, and standard error of the mean for solo and buddy question scores.]

T-test results: p = 0.017 (significant difference)
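A paired t statistic of the kind used to compare matched solo/buddy question pairs can be computed directly; the p-value then comes from the t distribution with n − 1 degrees of freedom. The score lists below are illustrative placeholders, not the study’s data:

```python
from statistics import mean, stdev

def paired_t(solo, buddy):
    """Paired t statistic for percent-correct scores from matched
    solo/buddy question pairs (same question, two response styles)."""
    diffs = [b - s for s, b in zip(solo, buddy)]
    n = len(diffs)
    # t = mean difference / standard error of the differences
    return mean(diffs) / (stdev(diffs) / n ** 0.5)

# Hypothetical percent-correct scores for five matched question pairs.
solo_scores = [62, 55, 70, 48, 66]
buddy_scores = [71, 60, 78, 55, 70]
t = paired_t(solo_scores, buddy_scores)
```

A positive t here means students scored higher on the buddy version of each pair; with real data a library routine such as a paired-samples t-test would also return the two-tailed p-value.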

That’s great, but…
► We found a significant difference between solo and buddy questions… but is it worth anything?
► Our next step was to see whether this apparent difference in performance by question style translated into better scores on the exams.

Exam Analysis
► We compared exam questions with questions asked in class using CPS.
► Surprisingly, we found very few exam questions that corresponded directly or indirectly to CPS questions.
► Each exam was analyzed individually before pooling all of the data to determine any effects.

Exam Analysis, cont.

[Table: number of exam questions corresponding to solo, buddy, or neither type of CPS question, for each exam and in total.]
[Table: question-effect F and p values for Exams 1–3, the final exam, and the pooled exams, reported per instructor (Professors A, B, and C).]
[Table: percent correct on solo-linked, buddy-linked, and unlinked questions for each exam and in total.]

All analyses showed no significant differences at the p = 0.05 significance level.

Instructor Effects
► We also ran an analysis to check for any instructor effects that could have skewed the data.
► Results showed no significant differences at the p = 0.05 level:

[Table: instructor-effect F and p values for Exams 1–3, the final exam, and the pooled exams.]

Is CPS better than nothing?
► A final analysis compared exam questions that corresponded to CPS questions with those that did not.
► Unfortunately, no significant differences were found, though the average score was higher for the CPS-linked questions.

CPS vs. Nothing Results

[ANOVA table: sum of squares, df, mean square, F, and significance for between-groups, within-groups, and total variation in percent correct.]
[Descriptive statistics: N, mean, standard deviation, and standard error of percent correct for each group and in total.]
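The structure of a one-way ANOVA table like this one (sums of squares, degrees of freedom, mean squares, F) can be reproduced with a short computation. The percent-correct samples below are illustrative, not the study’s data:

```python
def one_way_anova(groups):
    """Return (F, df_between, df_within) for a one-way ANOVA
    over a list of numeric samples (one list per group)."""
    all_vals = [x for g in groups for x in g]
    n, k = len(all_vals), len(groups)
    grand_mean = sum(all_vals) / n
    group_means = [sum(g) / len(g) for g in groups]
    # Between-groups sum of squares: squared offsets of each group
    # mean from the grand mean, weighted by group size.
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, group_means))
    # Within-groups sum of squares: squared deviations from group means.
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, group_means) for x in g)
    df_b, df_w = k - 1, n - k
    f_stat = (ss_between / df_b) / (ss_within / df_w)
    return f_stat, df_b, df_w

# Hypothetical percent-correct values: CPS-linked vs. unlinked questions.
cps_linked = [72, 68, 75, 70]
unlinked = [65, 70, 66, 69]
f, df_b, df_w = one_way_anova([cps_linked, unlinked])
```

The significance column of the table would come from the F distribution with (df_between, df_within) degrees of freedom; with two groups, F reduces to the square of the independent-samples t statistic.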

Conclusions
► CPS is an effective lecture tool that engages students interactively with the course content
► Most CPS questions are low-level questions in terms of Bloom’s Taxonomy and the other categorization schemes
► Students seem to learn content through interaction with their peers when using CPS, though this does not necessarily translate into success on exams

What else did I do?
► Research Questions
  In the event that I need to do a project other than the NSDL project, what avenues are available?
  Could any of these ideas turn into a possible OP in the following months?
► Ideas of interest
  Scientific Literacy
 ► What is the value of a textbook?
 ► Could other materials help?
  Assessment
 ► Immediate feedback assessment technique (IFAT)
 ► Could it work in chemistry?